 Hey, welcome. Thanks for coming to our session. We're going to be doing a talk on Internet censorship and what governments are doing around the world. And we're going to stage it as a conversation. So the idea is the three or four of us, with different perspectives, are having a conversation. I'll try to be the moderator and create some runway for the three panelists. And the problem with these is, you sit around the speaker room getting ready, and you're having such a great conversation that you have to constantly tell yourself, stop, save that for the stage. No, no, no, stop. So we think we've got some fun things to talk about, or at least we think it's fun. To kick it off, I want to start on my right with Roger, and everybody is going to introduce themselves, and then we'll just jump right on in. So, all right. Thank you. Okay. Hi, everybody. I'm Roger Dingledine from Tor. I founded Tor long ago. I wrote the original Tor code, and now I wear all sorts of hats. I've been paying more attention to the anti-censorship side of things, the anonymity research side of things, and the policy side of things, talking to governments and trying to help them understand that end-to-end encryption is important and things like that, and that's how I find myself on a policy panel. Hey, I'm Chris Painter. I've been doing cyber stuff for about 33 years now. I started as a federal prosecutor. I didn't start life as a federal prosecutor, but I was a federal prosecutor for a while doing cybercrime cases. Then I went to the mothership at Main Justice in D.C. Then I was at the White House creating the cyber directorate, and then I was the world's first cyber diplomat at the State Department. And now, among other things, I run a foundation that's devoted to cybersecurity capacity building. And Jeff and I share a love for a movie called Colossus: The Forbin Project, which you probably haven't seen. 
It's one of the best movies out there. A little kitschy, but great. That's the original movie where the intelligent computer takes over the world to protect humanity from itself. And all those stories kind of come from that. It's based on a series of three books, but Colossus was 1970, 14 years before The Terminator. That's right. I'm Joel Todoroff. I'm a Fed. I've been a Fed for some time now, working on technology issues really all around government. Right now, I'm working for the Office of the National Cyber Director in the White House, and some of the projects that I work on relate to these questions of internet censorship and countering the misuse of authoritarian technologies. There is one caveat I do have to provide. They're letting me speak, but they said I have to put in a plug for ONCD and the work we're doing, and to tell you all to go to the website and look at the documents we've published. So please, don't stop me. You actually have some requests for comment, right? We do. It's not just look at the documents. Yes, there are areas where you can actively contribute. We'll talk more on this topic later, but one is on open source security; there's a request for information out. So if people are interested in that topic, you can look there. One is on workforce, the cybersecurity workforce. So if you're interested in that, please do give them a look. There's really a lot on there. Okay, so let's kick it off. Let's get this party started. You said something, and I'll jump on you first. You said censorship and other authoritarian technologies. What are authoritarian technologies? Do we know? I mean, everything is dual use. So is it really just the way in which the commercial off-the-shelf technology is configured, like the way the router is filtering? Or can we actually make that distinction? I feel like we're starting with a really hard question, and I love it. 
So I don't think I have a great answer, but I have a thought. And the thought is, the general sense we come in with is that technology starts as being value neutral. That we imbue it with values by creating it, by operating it, by owning it. And so a technology or a protocol is not itself going to be authoritarian or democratic. And I think there's a sense in which that is certainly true, right? Maybe that's DNS blocking or IP blocking or something like that. You can certainly imagine a lot of entirely legitimate uses. And you can also see instances where an authoritarian regime would misuse the same technologies, the same protocols, to do something like target ethnic minorities, or surveil journalists, or prevent people from accessing LGBT information on the internet, right? Those would definitely be problematic. And we can say, okay, the values that we're ascribing to this are those from the owners and operators. But different than, say, commercial command and control for ransomware, or NSO, you know, the things that have made it there. There seems to be a threshold that gets you on a sanctions list, or on a Commerce list, you know, a Wassenaar Arrangement list, right? So there are some. Yeah, so you can definitely do things that are really problematic with a technology. And I know Chris has some specific thoughts on this. But one of the things that I'll also flag that I think is interesting is this question of our default state of saying technology is value neutral. One of the things the Biden administration is looking at, and you'll see a resurgence in, is this focus on things like standards bodies, which gets to this idea that the value neutrality of technology is a really good first pass. But then you have to start asking, is that the case always? 
Or are there things where, really early in the design and implementation, you might build something in a way that it's predisposed to be used by an authoritarian regime, in a way that it's predisposed to enable an anti-democratic or anti-liberal use of the technology? And how do we think about ensuring that our values are put into the tech stack, are put into protocols, are put into standards from the beginning? And that, I think, is certainly an area of ongoing interest and an area of ongoing effort. I mean, I think that's right. I think that the technology is largely value neutral. And we've always said that. It's been a talking point forever. And I think it's largely true. There is some technology, some things, that are clearly designed for purposes that subvert civil liberties, or that enable unreviewed and undemocratic surveillance of folks. But the problem is, whenever we've talked in the past about, oh, we need to regulate this, oh, we need to make this illegal, that's really problematic because of the dual use issue that Jeff talked about. And to give you a good example, we tried this years ago now with Wassenaar. The Wassenaar Arrangement is an export control regime that's used for lots of different things. But we tried this with software that could be used to either launch attacks or to surveil people and deprive them of liberties. And the U.S. was a big mover of this, and we did it. But the problem was it really didn't work very well, because it also inadvertently covered security software. So this is a tension which I think is very difficult, but I do agree with you. I think there are some ways you can architect the tools, the technology, that are clearly meant in a bad way. But at the same time, it's a pretty slippery slope. So figuring out where that line is, I think, is very difficult. And I'll disagree, so that we get a good panel here. So I keep hearing that technology is inherently neutral. 
It's what you do with it that matters. But you have to look at the architecture. Some technology is inherently decentralized, and that empowers a broad group of people. Some technology is inherently centralized, and that ends up centralizing power. So you might start off thinking the technology is inherently neutral, but no, technology is inherently political. And you need to think through what people are going to use it for. Because if you're not thinking it through, then you're going to end up reinforcing the existing power structures. And if you want to build something that changes the power structures, that's political. And you need to think that through with the architecture. Wait, I have to challenge that a little bit. I would agree with you that it's political, but I would also say it's inherently, now, commercial. And so the commercial pressures are to centralize for efficiency. And the market forces now are essentially creating pools of technology that are like an attractive nuisance. If you only have to provide one subpoena to Gmail and you get half of the email accounts on the planet, boy, that's really convenient. But if you had to supply subpoenas to 50,000 email providers, that's a pain in the ass. And so maybe it's not only political; it's a side effect. Or the political side sees the market force and says, wow, this is great for us. Yeah, that centralization is super scary. And the centralization is a choice made by people when they're building their architecture, to centralize the power or decentralize the power. And it's not just one email provider; there are not enough CDNs, not enough undersea cables between continents, not enough app stores. Yeah, I mean, look, I think decentralization is one area. But when you get beyond that, I think it's tough to draw that line again. You're right, but then how do you actually get to that next mile? And look, technology is policy too. 
I remember having a conversation at the State Department with the Chinese, with the Cyberspace Administration of China. And for some reason we were talking about public-private partnerships, which seems like a pretty good thing, right, normally? And they said, oh, you're absolutely right. We need public-private partnerships, because we can't monitor the internet alone. We need their help. Not what I had in mind, but... Yeah, just to pick up on that quickly, riffing off some work that my colleagues at the State Department have done: if you look at censorship around the world right now, more than two-thirds of the world's population is having their human rights on the internet infringed by these repressive regimes surveilling them or censoring them on the internet. And those regimes are spending billions of dollars to build out ecosystems that enable surveillance and censorship at scale, in real time, right? There is a clear push for... I think I would use the idea of centralization versus decentralization a little differently. The idea of moving away from a free and global internet to essentially a series of isolated networks of censorship and surveillance that an individual repressive regime would want to run, often against their own citizens. And they would want to centralize in the sense that they want the authority, but not centralized in the sense that there would be global interconnectivity, where they would have to reach out to a different central location to engage in their sort of mass surveillance. And that's been going on for a while, because, you know, Hillary Clinton, when she was the Secretary of State, gave a speech launching the Freedom Online Coalition, a group of a lot of countries that championed freedom online, and she talked about digital bubbles, authoritarian bubbles around the world. And that's just been accelerating, I think, since then. And again, it's hard to stop that from happening. 
And we've gone from the promise of the internet as the greatest thing in the world, the greatest democratizing force in the world, to: it's also the greatest way that repressive regimes, and not just repressive regimes, we'll get into this later, can do things that are not great, including monitoring and controlling their citizenry in ways they haven't been able to before. Just one other thought on the centralization. The thing that really bothers me is, Roger set me up to say this, we have no plan B for certificate authorities. So if something goes wrong with TLS or the certificate authority structure, there's no, like, we'll use DNSSEC, and we'll push out DANE TLSA records, and then the web will keep going, right? There's no agility. We're all in, and the browsers walked away. All the browsers have now removed support for DANE and TLSA certificates. The only things using it now are mail servers. So if the certificate authority system collapses, or is under attack, or is subpoenaed, or, you know, Let's Encrypt blows up, and that's 90-something percent of all certificates, if Let's Encrypt goes down, we'll still be able to exchange mail, because those mail servers use DANE, but nobody else will be able to browse the web. That really worries me, that we have no backup plan. Governments know this, and governments that want to suppress know it. I just saw a takedown against a domain name for, I think, a white supremacist website. The registrar, in Greece, pulled the TLS certificate for it. That's just going to be the start. I'm happy they did that, but I'm just saying that soon we'll be seeing subpoenas, or we'll be seeing takedowns, on TLS certificates, and since browsers are now HTTPS by default, that is the gateway, right? 
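[Editor's note: for readers unfamiliar with DANE, the TLSA records mentioned here pin a certificate or public key in DNS instead of relying on a certificate authority. A minimal sketch of what a "3 1 1" record carries; the hostname and key bytes are hypothetical stand-ins:]

```python
import hashlib

def tlsa_3_1_1_data(spki_der):
    """TLSA with usage=3 (DANE-EE), selector=1 (SubjectPublicKeyInfo),
    matching=1 (SHA-256): the record data is just the key's SHA-256 digest."""
    return hashlib.sha256(spki_der).hexdigest()

def tlsa_record(host, port, proto, spki_der):
    # Hypothetical zone-file line for an SMTP server's certificate key, e.g.:
    # _25._tcp.mail.example.com. IN TLSA 3 1 1 <sha256-of-public-key>
    return f"_{port}._{proto}.{host}. IN TLSA 3 1 1 {tlsa_3_1_1_data(spki_der)}"

# `fake_key` stands in for a real DER-encoded SubjectPublicKeyInfo blob.
fake_key = b"not-a-real-public-key"
print(tlsa_record("mail.example.com", 25, "tcp", fake_key))
```

A client that validates this way needs DNSSEC to trust the record, which is exactly the deployment step the panel says browsers abandoned; mail servers are the main remaining users.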
And we don't have these conversations, and we build these technologies such that now we have no agility, so don't be surprised, you know, when the world you describe happens. And so when I look at my Rolodex of who do I call, what's the backup plan, I get Roger. I don't get five Rogers, I get one Roger. There is one option, kind of, and it's Tor, and it's not perfect, but we don't have a thriving ecosystem of five Tors, you know. That worries me. You know, I want you to talk now about your unbounded adversaries. Yeah, so part of the reason why I am thinking about the policy side of things, and it's fun to have the balance here, because we've got a policy person saying, yeah, yeah, we've got the policy stuff under control, but if only you could build better tech for us, then we'll be able to keep people safe. And here I am on the technology side, saying, yeah, we've got tools and they work pretty well, and millions of people are using them, but if governments are unbounded, if they're breaking their own laws, if they're attacking everything at will, if they're putting billions of dollars into harming people, we can't solve this with technology alone. If, like in China, in the Uyghur region, they end up sending a person to live in the house of each of the families there, Tor is not going to be what they need. This is a social problem, a political problem, a policy problem. So, yeah, the reason why I'm thinking broader than just the technology is, increasingly we're seeing governments that do more than technical attacks, and we need to tackle that at every layer. This is your censorship-implies-surveillance model. 
Yes. So we used to think of this... Tor started as a way to let you browse the web more safely, meaning communications metadata protection, so people watching can't learn what you're doing. And then it morphed into a censorship resistance thing, and the key thing to realize is, they're actually the same thing. When a government is censoring your internet connection, they start off by surveilling you, to learn who you are and what you're doing, to decide whether to let you do it. So it's actually two sides of the same coin. If you're only thinking about censorship, you should be thinking about surveillance. If you're thinking about surveillance, the next step to think about is censorship. These are both two sides of the same issue, and that leads me to also think about end-to-end encryption and other attacks at the policy layer that are related to both censorship and surveillance. You know, I think the issue is, there's policy for liberal democracies, and there's policy for repressive regimes, and it's much harder to change the policies of repressive regimes. But there has been some good work that's been done. But what if they're buying the technology designed for a liberal democracy? Well, that's what I was about to say. So I think there's been some good work that's been done. For instance, the White House issued this anti-commercial-spyware executive order, got 10 countries initially, and I think it's up to maybe 15 or so now, or maybe more than that, countries that have signed on to this. Now, they're all the good people, or what we think of as more of the good people. But there are things you can do to exert pressure on the more repressive countries. Maybe not China or Russia as easily, but a lot of countries are sort of in the middle, where you can have an impact, and you have to use the full suite of policy tools you have. 
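[Editor's note: one concrete illustration of censorship implying surveillance is the SNI field of a TLS ClientHello, which crosses the wire in cleartext, so a censor can read the destination hostname of every connection before deciding whether to allow it. A sketch using only the Python standard library; the hostname is hypothetical. It generates a real ClientHello in memory and extracts the SNI the way a passive observer would:]

```python
import ssl

def make_client_hello(hostname):
    """Drive an in-memory TLS handshake just far enough to capture the
    ClientHello bytes that would be sent over the network."""
    ctx = ssl.create_default_context()
    incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
    tls = ctx.wrap_bio(incoming, outgoing, server_hostname=hostname)
    try:
        tls.do_handshake()
    except ssl.SSLWantReadError:
        pass  # expected: no server is answering, but the ClientHello is queued
    return outgoing.read()

def extract_sni(record):
    """Read the cleartext SNI from a TLS ClientHello record, as a
    passive DPI box would."""
    if not record or record[0] != 0x16:  # 0x16 = TLS handshake record
        return None
    pos = 5 + 4                          # record header + handshake header
    pos += 2 + 32                        # legacy_version + random
    pos += 1 + record[pos]               # session_id
    pos += 2 + int.from_bytes(record[pos:pos + 2], "big")  # cipher_suites
    pos += 1 + record[pos]               # compression_methods
    end = pos + 2 + int.from_bytes(record[pos:pos + 2], "big")
    pos += 2
    while pos + 4 <= end:                # walk the extension list
        ext_type = int.from_bytes(record[pos:pos + 2], "big")
        ext_len = int.from_bytes(record[pos + 2:pos + 4], "big")
        pos += 4
        if ext_type == 0:                # extension 0 = server_name (SNI)
            name_len = int.from_bytes(record[pos + 3:pos + 5], "big")
            return record[pos + 5:pos + 5 + name_len].decode()
        pos += ext_len
    return None

hello = make_client_hello("blocked-site.example")
print(extract_sni(hello))  # the hostname, visible before any decryption
```

This is why SNI-based blocking is so cheap for censors, and why tricks like domain fronting and encrypted ClientHello target exactly this field.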
And that's not just cyber stuff. That's trade and other things you can do. And I think there's been much more awareness and movement toward that recently. I think the other problem, though, is we think that the world is binary. There are the bad guys and the good guys. Well, even some of the good guys are starting to come down this path a little bit, because they're worried about terrorism. They're worried about misinformation and disinformation. By the way, my suggestion to Jeff earlier was, next year, the Misinformation Village should have its sign out in front of a different room. They did that when the Soviets invaded Prague; they changed all the street signs. So it would be good misinformation. So, you know, I think the problem is you have to target the repressive regimes, but you also have to think about the liberal democracies and where they're going and what paths they're taking on these issues. And that involves the gnarly debate around encryption and all these other things. But where can we make sure that we have the kind of transparency, rule of law, and that they're actually following the laws, as was said too? And that's not just an issue for the cyber community. That's an issue for the entire policy community and all the tools that we might have that might change behavior. Yeah, I largely agree with what everyone was saying. I think one of the things we're looking at in the Office of the National Cyber Director is that question of what's the backup plan, right? Because, to some of these points, Tor is an excellent tool in certain instances. But if a government sends some people with guns to the one cable landing site in a country and says, turn it off, Tor doesn't help you. So the question is, what do you use when the censorship model is a man coming into the room with a weapon, right? Or a complete internet shutdown. 
And there will be things that we can do with technology. There will be things we can do with policy. And for us to have efficacious solutions, I think we have to work them together. That's part of why we're here. I hope that's part of why we're talking about it. That's part of why there's going to be a workshop later today, to hear from you and to exchange ideas. But I also think we need to think of this almost in network-stack terms, layer by layer. To say, if someone is going after layer one to engage in censorship or surveillance, how do we ensure that there's security at that layer? If people are targeting another layer of the stack, how do we ensure there's appropriate security at each of those layers? So, I guess we didn't talk about this in the ready room, but maybe just talk about domain fronting. Maybe first why it's useful, but then why people are not supporting it now, right? Vendors are not. Google, I think, is stopping it. A lot of cloud providers don't allow domain fronting anymore, and that's what Signal Messenger really relied on for some protections. So we're not a repressive regime so much, but again, market forces are making it harder. I'm looking at you. Okay. Yeah. So domain fronting is one of the tricks that we have at the technical level for getting around censorship. And the way that it works is you find a cloud provider like Google or Azure or Fastly or Cloudflare, and you make a TLS connection, and on the outside of the TLS connection, in the SNI field, you list a domain that they're not going to block. And then once you've done the TLS handshake, inside the encrypted channel you send a Host header for the domain that you'd actually like to go to. And technically, this is violating some sort of spec somewhere in IETF land, but it worked in most places until a few years ago, when Telegram was using it to get around blocking from Russia. And then Russia said, hey, Amazon, could you quit that? And Amazon said, why? 
Yes, Russia, we will turn that off. Thank you, Russia. And so Amazon stopped it. They convinced Google to stop it. They convinced Microsoft to stop it, but Microsoft said they were going to stop it and then didn't, which is pretty cool of them. But they're eventually going to. So some CDNs allow it, but many of them don't these days. And it's one of the better working tricks, though you have to pay cloud prices for the bandwidth you send across it. So some circumvention tools like Tor use domain fronting not for all of the traffic, but just for the signaling, in order to have a reliable channel to set things up, and then you fall back to some other channel. But yes, it's a trick that is closing down as the Western huge companies change their minds about letting it work. Right. And so no fight in the UN necessary, right? That's how much we really care about these technologies and the rights that they can support, right? The IETF is not racing to formalize it into a thing to support, for the human rights aspect. It's just disappointing. Although, having participated in a lot of UN negotiations, getting consensus is near impossible, because all those repressive regimes are there. Right. And just for people's awareness. What do I mean? It's not even brought up. We're not even seeing a letter from the White House saying, hey, Google, don't worry about it, we've got your back, right? Or, I don't know, 50 state attorneys general saying, don't worry about it, we'll tell Russia to go stuff themselves. No. There's nothing at the state or local, tribal, territorial level. I mean, there's nothing. And I don't know if it's because these are kind of technical things and they just don't reach the awareness of policymakers, or is it that they actually thought it through and said, no, that's not worth a fight? I mean, I suspect it's part of that. I suspect that the policymakers just don't understand the technology and what it's doing. 
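[Editor's note: to make the domain fronting mechanics concrete, here is a sketch with hypothetical domain names. The censor-visible SNI names an innocuous domain on the CDN, while the Host header, which travels only inside the encrypted channel, names the real destination. As discussed above, most large CDNs now reject this mismatch:]

```python
import socket
import ssl

FRONT = "allowed-domain.example"    # hypothetical: unblocked, hosted on the CDN
TARGET = "blocked-service.example"  # hypothetical: blocked, hosted on the same CDN

def build_fronted_request(front, target, path="/"):
    """Return (sni, request_bytes): `sni` is all a network censor sees;
    the Host header is only visible to the CDN, inside TLS."""
    request = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {target}\r\n"       # the real destination, hidden by encryption
        f"Connection: close\r\n\r\n"
    ).encode()
    return front, request

def fetch_fronted(front, target, path="/"):
    # DNS lookup and the TLS SNI field both use the front domain only.
    sni, request = build_fronted_request(front, target, path)
    ctx = ssl.create_default_context()
    with socket.create_connection((front, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=sni) as tls:
            tls.sendall(request)
            return tls.recv(65536)

print(build_fronted_request(FRONT, TARGET)[1].decode())
```

A CDN that still permits fronting routes on the inner Host header; one that has disabled it compares the SNI against the Host header and refuses the mismatch, which is exactly the change the providers above made.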
I suspect part of it, too, is when you go to the companies, look, they're there for profits, right? So it's much easier. But Tor is paying for it. No, no, but it's much easier now for them to say no to Russia than it is to China. And so we have to kind of overcome that. We have to figure out how to disincentivize that conduct, or incentivize the good conduct. And part of it is what governments can do, including our government, but if it's just our government, that's not going to do it either. And just again, for people's awareness, there are a lot of discussions in the UN now on all these different issues. One is they're now negotiating a cybercrime treaty, which is pretty far along in the negotiations. And these issues, the procedural issues of cybercrime and how you get evidence, some of these issues are being covered, not in this technical detail, which is part of the problem, I think. But this may be a way to push the more repressive regimes, to have more things that allow them to do stuff or not. And so I think the West is very concerned about enabling repressive regimes to get more access, but at the same time it's trying to get law enforcement access. And that's a difficult issue. So this community, I think, should follow those kinds of things and make sure that you're at least aware of what's going on. Just like you should comment on the documents that we talked about. One thing that I'll add is, in March, there was a big event in D.C., the Summit for Democracy, where countries around the world came together; there was one before. This year, one of the focuses included countering the authoritarian misuse of technology. And there was a call to action to the private sector that went out. That did not talk about any one specific protocol. They talked about things at a slightly higher layer of abstraction, for a lot of reasons. But certainly there's an administration interest in exactly this. 
And you are seeing press coming from the White House saying, we need to figure out ways to counter this. There was specific discussion of things like VPNs, and ensuring VPN availability, and that you could connect securely and safely to things like VPNs, along with some other things explicitly identified, and then otherwise an invitation to engage, primarily with our colleagues at the State Department, on how you could do this and how you could work collaboratively with the government to ensure that people did have the freedom to access, securely and safely, the real internet. Do you think the government then could use sort of the power of the purse to set priorities? So, like, your cloud provider, I don't know, is in alignment with Russia censorship proposal one, two, three. So we're not buying from you until you are no longer in alignment with authoritarian censorship model one, two, three. And the cloud provider wants to have a seal on their home page that says, look, we comply; we're not censorship-friendly. You know what I mean? It doesn't always have to be a policy. It can be a choice. It can be a labeling regime. It can be a way of the government saying, hey, we only buy from, you know, green energy providers. I don't feel like I have a fully established answer, but I'll offer a microcosm of this that I think is pertinent, and it does circle back to that executive order about commercial spyware: certain commercial spyware entities were taking actions that were viewed as problematic, really divergent from our interests, enabling things like the surveillance of journalists or dissidents, and the administration pushed out this executive order saying, hey, we're no longer going to be using these technologies that pose a national security risk, that pose these sort of broader risks to us. And I think that is perhaps an example of this power of the purse, of saying, we're no longer playing with these entities. 
And it was followed up, more discreetly, with some entity listings. Those are where the Department of Commerce labels certain entities, making it very hard for them to do business; it ends up being sort of export restrictions, and it really just gums up a business. The consolidated entity list, or whatever they call it now? It used to be you had to watch two or three different lists to see if you were committing a felony. Now there's a combined list, I think. I think that's right. Yeah. But your idea is an interesting one. There are two questions about this sort of labeling thing. The federal government has just recently, and we've been talking about this for 30 years, or maybe 20 years, moved toward this idea of maybe a Good Housekeeping seal of approval for cybersecurity and stuff. Singapore's done that. It's happened in other parts of the world. The EU has an act. And the U.S. is moving forward, and there's been legislation on this. So that may be a way to deal with it. But I wonder how much that's going to actually affect the market. Are people, outside of a smaller group of people, going to value software more because it says we're human rights compliant? Hopefully they would. I would hope they would. But it's more about signaling the values of the government. And look, no authoritarian cloud provider has that seal. What a shocker, right? You've got to draw some bright lines between team rule of law and team authoritarian. Because if you don't, they're just going to be blended together. They're all sitting in the same room at the IETF, right? They're all sitting in the same room, wherever. They're all wrestling over the standards. Yeah, so we need a way to say, look, here is how we are different. I think labeling is an important step here. But more and more, we're seeing a shift in how the arms race goes. 
So it used to be that Western companies sell deep packet inspection tools to repressive regimes, and they use them to harm their citizens. And so it would be a technology arms race: they look for this header, we change it to that, and back and forth. But here are two examples that we've seen recently where how the arms race goes is shifting. One of them is in Turkmenistan over the past few months. The censorship ministry in Turkmenistan is basically blocklisting the whole internet. They block Fastly. They block Cloudflare. They block Akamai. They block most hosting providers. And so we've been running some Tor bridges on residential IP addresses, and it kind of works, sort of. But it turns out that they're taking bribes to unblock certain people in Turkmenistan. So if you're a company or whatever, you pay them $1,000 US, and now you get real internet. That is not something I can solve at a technical level. Another challenge that we're seeing in China recently is that it's not just policy. China has a policy to censor the internet, sure. But they also have accidental economic byproducts of how they're growing. So there you are in China, and you want to use a website inside China; everybody wants to do that. So they buy more bandwidth. They make the pipes bigger. They make the bandwidth inside China really good. But nobody's adding pipes from China to the rest of the world. And that means that if you're in China trying to get to Google, even if they didn't explicitly block it, you've got 15% packet loss, because too many people are trying to use too-small pipes. And for economic reasons, they're not bothering to add that bandwidth. So that's a case where it doesn't matter what code I write. There's not enough bandwidth for people to do what they're trying to do. And that goes back to the economic challenge. There was sort of a comment, or maybe we can look at it just briefly from the other lens. Who are the people that are being censored? 
Because when we say that, we don't necessarily mean bad guys. Are these people just exploring the internet, and the regimes are afraid they're going to learn about equal rights or something? Like, what is the danger, in these conversations you're having, that justifies this level of intervention and billions of dollars spent? What is the fear? I mean, there are lots of vulnerable groups. And depending on the country, they could be religious groups. They're often political groups that are being targeted. Back to what you said about the State Department: part of the State Department, the Bureau of Democracy, Human Rights, and Labor, has for a long time funded a lot of projects, including yours, among others, to try to get around censorship. And it is kind of country specific. A lot of countries don't have the capability or the bandwidth to censor everything or to look at everything. China may be a little different. So they target a lot of these groups who they view as threats, often to their own political stability. And those are the groups that are most at risk. But not only those. It could be LGBT people. There's a whole wide range of groups, depending on the country, depending on the authoritarian regime you're in, that are targeted. And you know, people don't justify it that way. They justify it on the basis of security. It's the security of the state. That's the overarching justification. One of the things we thought was a big advance at the State Department when I was there, and it's continued in the new bureau they created, is to integrate human rights with security. A lot of times, when we go to other countries and talk about cybersecurity, they say, oh, that's great. That's a way for us to control our citizens. No, no, no. You have to think about the human rights aspect. And you have to look at them together rather than in silos. 
So, yeah, and again, it goes back to the point of, well, how do you change that? You can change it by using the tools you have, economic and other tools, to try to change that country's view, because basically the regime that's in power likes to stay in power. And so it's hard to tell them don't do that, because they're going to do it anyway. So how can you exert pressure? Some of it's commercial pressure; some of it's not just governments but companies too, as we've seen with Russia. Yeah, so in our experience, it isn't that the censorship ministry sits down, makes categories, and goes to find websites that match the categories. It's more that the job of the censorship ministry is to never get noticed by the authoritarian leaders in charge. Every time something goes wrong, they're embarrassing the head dude, and they need to react and start blocking the things that are embarrassing at that time. So it grows organically: maybe some particular activity is in the news, and then they have to shut that down. And this same pattern happens not only in China, where they have to guess what they should try to block next, but it also ends up happening in Denmark and Australia and Sweden and so on, where they don't have a censorship ministry, but whatever they call their online security group ends up deciding that they need to block Pirate Bay, or wherever it starts, and then it bleeds into blocking more things, reacting to whatever is politically embarrassing at the time. But there's also the multiplying effect of not just the state doing it, but the state creating policies that are so vague that the companies, the ISPs, and the others (and this is what China does) don't know where the line is, so they over-enforce. They go further than they need to, and it's even worse, and the state is fine with that, because to them over-enforcement is better than under-enforcement.
So then, going forward, I guess in your sort of utopian vision, give me a: where are we at in five years? Where are we at in ten years? If this split among rule-of-law, undecided, and authoritarian internets continues to sort of metastasize, do we have, what, three buckets of the internet? What's your model of where this leaves us? Because what I don't see is any force saying everybody's going to have to use the exact same technology with no censorship built in. That's not a thing. So that means censorship will continue to evolve, and so it's important that we have policies so that, instead of the technology being neutral, the technology, at least the technology the federal government is going to buy, has a bias toward good, whatever good values are: it's transparent, it's accountable. So maybe on one side we have to shift the bias toward good, because we know the bad is going to continue. When you look in your crystal ball, tell me what you see. Do you want to start? Yeah, so I don't have as much optimism as Jeff hopes I do. I'll bring you a bit more pessimism. So the Iran thing that I've been watching: Iran's goal is to build a halal internet. They want to isolate themselves; they want to cut off all the outside internet connections. And to do that, they need their own Facebook, their own Google, their own Gmail, their own everything. And they're not big enough as a country, as a software ecosystem, to have that. But it gets worse than that, because for a while they would block Gmail, and everybody in Iran would say, screw you, you block Gmail, I don't like you, stop blocking. And then a previous administration in the US called up Google and said, hey, can you sanction Iran? Can you turn off all the Gmail accounts for people who speak Farsi? Anybody coming from Iran, just shut them off.
And that means that when people in Iran tried to get to their Gmail, Google was the one that blocked them. And that was a sanction, but the result was that it became easier for Iran to end up with its isolated halal internet, because there's nobody complaining anymore. You blame Google for turning off your Gmail account; you don't blame your own government anymore. And we're seeing the same thing in Russia, where various telecommunication providers in the West are saying, we're going to sanction Russia. We're turning off their telephone connections. We're turning off their internet connections. That'll show them; they'll have to stop the war now. And the result is that they're letting Russia isolate itself in the same way. So are we going to end up with one amazing internet that everybody's on? Or are we going to end up with each of these countries on its own? China's doing great at not building the bandwidth for people to get to the rest of the internet. Russia is going to try to isolate itself. Iran is excited to do so; they haven't succeeded yet. That's where we're going unless we figure out something, at not only the technical level, to fix it. So I'm a recovering lawyer, so I'm usually a glass half full sort of person. But I think there are some good signs and bad signs. On the bad side, I agree with everything you said. If you look at the Freedom House reports every year on the state of freedom on the internet, it's gotten progressively worse and continues to get worse. So that's not a great sign. And I do think there are more and more countries turning to this. There's also a lot of political churn, where more right-wing parties are winning, and they're more predisposed to this kind of activity. So that's not good either. And I do think you're seeing this kind of balkanization, a word I wasn't allowed to use at State, so I had to say splintering at the time.
You're seeing this happen, and I think it's going to accelerate, too. The positive things, though: I think there's more awareness of this trend now. I think there's more desire and ability and actual political will to use tools that we wouldn't use before to try to counteract this. And that's true with cyber threats, but also with these threats as well. I'm really heartened by the executive order on commercial spyware. I'm heartened by the fact that a lot of countries signed onto that, and that a lot of those countries said we're going to work together to promote this. When I go to these negotiations at the UN on the Cybercrime Treaty, I'm heartened that there are lots of countries banding together and saying human rights has to be a core part of this treaty. Now, there are other countries that say no, they don't want the human rights language, but it's good to have these countries doing this, because if they're strong enough and pushing back, you're not going to get that kind of repressive language into an instrument that's global. But I acknowledge the challenges, and I wish I could say it was going to get better. I think it's got a long way to go before we get there. But I do think there's more policy awareness than there ever has been, and that will help. It's definitely going to be hard; there's no doubting that. But I think it's important that we start with the vision for where we want to go and that we aggressively try to move towards it. And I think you're seeing some of that.
The executive order, the National Cybersecurity Strategy: if you just look at the introduction from President Biden, it talks about this idea of an affirmative vision where everyone has access to this open, free, secure internet. You're seeing, from the Summit for Democracy to all of these interactions in the United States and with a cluster of allies around the world, a push towards saying, hold on, we are realizing this is a big problem. We're seeing the trend line is in the wrong direction, and we need to do something as a government, in partnership with the people making the technology, to stop it and reverse it. So, OK, I want to go back to something you said. One of the problems is, if you try to restrict those countries, you're absolutely right: they develop their own capability and become more isolated. And that's true with technology generally. We've seen this with chips and other things, where China is now developing this stuff because they can't get it. So you have to figure out where that balance is and how you can influence it. Yeah, so you mentioned something about, oh, I'm totally forgetting it. Oh, right: the companies are essentially saying, we'll just disconnect Russia, we want to make life more miserable for them. And partially that's because there's no coordinated message from the government saying, OK, IT providers, please help us block. They're doing it a lot of times because they think it's the right thing. And so it seems like maybe there's a role for government to say, no, we need very specific targeting against sanctioned individuals, not all of Rostelecom. We need to do it on these specific netblocks; not the education ministry, but the arms ministry. So does it just mean you're going to have to get more precise, because the alternative is a giant blunt block?
Yes, I know this isn't a perfectly satisfactory answer, but I think this is hitting on a really important topic: how the government messages when we might want something blocked, or when we might want something enabled even if that's counterintuitive to a business. And so later this afternoon, here at DEF CON in the policy space, there's a workshop. The State Department is going to be there along with ONCD, along with Roger and some others, and we're going to be able to talk about that in more detail when there aren't cameras. So I would encourage those of you who are interested in this particular topic to come down, and we'll have more conversations about it. It is certainly an area that the government is actively thinking about and actively trying to work through some of the nuances on, but where at this point it's not something for talking to a camera. And that workshop is this afternoon at 2 o'clock in the Policy at DEF CON area. Okay, we're rolling into the final four or five minutes, so I'm going to start on my right with Roger for any final thoughts, anything you want to leave the audience with. Yeah, so we gave you a bunch of pessimism today, but there's also optimism, because here we are, all working on the problem from the technical and the policy side, and here all of you are, learning about this. Please share this with your friends. After this session, I'm going to answer whatever Tor questions you may have over at the Tor booth in the vendor area. And as a last thought, EFF launched a campaign a few days ago to try to get more Tor relays at universities in particular. It doesn't have to be an exit relay; it could be a non-exit relay. So if you're connected to a university, please consider running a relay for EFF's campaign. And my positive thought is that, like I said, there's more awareness of this. And I really think that this debate needs all of you.
I mean, it needs people to be weighing in to raise these issues, because a lot of times policymakers think, oh, that's too technical, I don't know what the implication is. So explaining what the real implications of this are and what it's going to mean is important, because it flags it as an issue. And that's the beginning of any kind of real policy debate. But you can only be in that conversation if you're in the room. So it's really incumbent on the people having that conversation to put out these RFIs, right? Give us feedback. And I was going to say that. It doesn't necessarily mean you being in the room, but the people who are having these conversations, and increasingly they are, talk about this multi-stakeholder model, which means not just government people; they should be talking to other stakeholders, including this community. And they're trying to develop mechanisms for that. Some countries are better than others; the US is trying to do a lot more of it. So try to take advantage of those. And frankly, a lot of the US people are here at DEF CON for exactly this purpose. So make sure that there is that connection, and that you're at least aware of it. And then you can raise the alarm bells in your own circles, too, and I think that helps. I think for me, one of the big points that I'd like to get across is the idea that security and human rights are actually two sides of the same coin, not opposite sides of a scale. If you are architecting a system, building something out, thinking about the security is not just going to be good for your users here; it can potentially enable people around the world to circumvent a censorship regime. Build and architect your system with, in the back of your mind, the idea that an authoritarian may be looking at it, may be trying to use it for a malicious purpose.
But you might be able to build and architect in a way that enables someone to reach out and report government corruption, that enables someone to reach out and talk about human rights abuses going on in their country. That is good, I think, for business, and also good for all of us, for the world, for human rights. And that is something we can do as we ensure that our products are secure from the get-go, that encryption is strong, that the protocols don't have a "mass surveillance, click here" button. Those are things we need to think about from a technological perspective. And I know you all are in that space. We're in the policy space; you all are actually building. And building with that idea of human rights in mind, I think, is really important. All right, thank you so much for being at our panel on internet censorship, and hope to see you around DEF CON. Thank you. Thank you.