All right. Whoa. Goodness. That was much louder than I expected. Good morning. Good morning. Good morning, everyone. And welcome to this morning's event here at New America: When Does Content Moderation Become Censorship? Policing the Web After Charlottesville. My name is Kevin Bankston, and I'm the director of the Open Technology Institute here at New America. We're the technology policy and development wing of New America, and we're focused on ensuring that everyone has access to an internet that is both open and secure. I'm excited this morning to have a fireside chat, minus the fire, with Matthew Prince, the CEO of Cloudflare, which is the most important internet company that many of you haven't necessarily heard of. It does the important but quiet work of DDoS mitigation, defending against DDoS attacks, and in the process serves millions of websites, carrying about, what, 10% of all web traffic. They often provide the service free to NGOs that are in need of defense against repressive governments and other attackers, as part of their Project Galileo, which OTI is a partner in. And they've helped better defend everyone's privacy online by turning on HTTPS encryption for all the websites they serve, basically upping the security of 10% of web traffic in one fell swoop. We're going to be talking about a variety of those things later, but I expect that the real focus of today's discussion is Cloudflare's approach, and the approach of the internet sector in general, to taking down or blocking or censoring or moderating the content of their users. Cloudflare and Matthew personally made a big splash this August with a controversial decision, when Cloudflare kicked the white supremacist website The Daily Stormer off of their service in the wake of the Charlottesville marches, and that's sort of the key prompt for today's discussion.
But before we get deeply into that, I just wanted to welcome Matthew and thank him for coming this morning, and give him an opportunity to talk a little bit more in his own words about Cloudflare's role in the internet ecosystem, what they do and where it falls in the stack of companies that make your internet experience possible, which I think will be important context for later. So without further ado, welcome, Matthew, and thanks for coming. Thanks for having me. It's great to be here. I didn't know I could wear jeans, or I definitely would have. I didn't wear a tie, though, which was one of the specific things I was told I had to do in D.C. Thank you, I appreciate that. So DDoS is one of the things we're known for, but what we at Cloudflare see as our mission is to help build a better internet, and we're only one of many companies that have that same mission. What we mean by that is that internet technologies were really designed and invented 45 years ago, before we had any true idea of how critically important the internet was going to be. And if we could all go back and do it again, there are a lot of things we would have done differently. We would have added security by default, right? Everything should be encrypted. It just should. It's not just for those people who can pay for it; it should just be encrypted by default. You should have the best performance possible. It shouldn't be that only if you're Facebook or Google can you get content to any point on Earth instantly; anyone with a good idea should be able to do that and make content available to the world. It should be reliable, right? Being able to route around various problems that are affecting the internet, to avoid things like denial of service, but really just to have a level of reliability that is beyond what the traditional internet has been able to do.
And it should be incredibly efficient at doing that, making sure that you can get information to any point on Earth as inexpensively as possible. And solving those problems means tackling big, hairy, thorny issues, and those are the sorts of things that we just really love to do. So today we have nearly 10 million customers. They range from tiny individual websites, small businesses, NGOs, someone's personal blog, to some of the largest companies in the world and major financial institutions. In this town, the State Department uses Cloudflare, the FBI uses Cloudflare; the Internal Revenue Service of Pakistan uses Cloudflare, the central bank of Brazil uses Cloudflare. We see about 10% of all internet requests flowing through our network. The way that our network works is that we actually run data centers all around the world where we put our equipment, and we act kind of as a virtual Cisco in the sky, where we can do routing, load balancing, performance optimization, security, all the things that you used to have to buy physical hardware for. And so we have facilities in very unexotic places, like the one that's closest to us right now in Ashburn, Virginia, all the way to a facility we just turned up in Djibouti, and not just because it's the punchline of many jokes. And we continue to build out, trying to be as close as possible to where the people who are using the internet are. And so, maybe slightly inspired by Chairman Pai dropping so many cultural references yesterday in his remarks, I'll tell you about one of our customers, which is Taylor Swift. Let's say you're a Taylor Swift fan and you're sitting here in D.C. and you want to know when the next Taylor Swift concert is. If you go to the Taylor Swift website, you would hit our nearest facility, which is in Ashburn, Virginia.
We would then do analysis on your request and say, are you actually a Kanye West fan trying to DDoS Taylor Swift or hack Taylor Swift or whatever it is, in which case we would be able to stop you right there in Ashburn, Virginia, and keep you from doing harm. On the other hand, if you're really into Taylor Swift and want to make sure that you get the latest information on whatever people who are really into Taylor Swift want to know, then we'll put you on a just incredibly optimized path where you can get that content as fast as possible. And we make that available to anyone that's out there, from customers that pay us nothing at a free level all the way up to really, really, really large organizations. We don't host that content, so we're not the definitive place that content rests. We're not quite an ISP; we're not actually trenching through streets and running fiber optic cable. We're something that's somewhere in between: this pass-through that exists out there, and we can do a lot of things around performance, security, reliability, and data and insight, while being something that's not really the host, not really the platform, not really the network, but, again, a really important spot in the internet infrastructure, able to provide a whole series of services that in the past were really reserved just for the internet giants. And one of the recipients of those services, up until August, was the Daily Stormer website, which is essentially a news organ of the white supremacist movement. I'd love to talk about your decision to pull service from them. There was a blog post, which I suggest you all read, but I found even more interesting the letter to your staff, which I'd like to read from a bit.
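The request flow described a moment ago, inspect each request at the nearest data center, drop attack traffic, forward the rest along an optimized path, can be sketched roughly like this. All names, signals, and thresholds here are invented for illustration; this is not Cloudflare's actual logic, just the general shape of an edge-filtering reverse proxy.

```python
# Toy sketch of edge filtering at a reverse proxy: decide at the nearest
# data center whether a request looks like attack traffic (block it there)
# or legitimate traffic (forward it toward the origin server).
from dataclasses import dataclass

@dataclass
class Request:
    client_ip: str
    path: str
    requests_last_minute: int  # hypothetical rate signal for this client

def handle_at_edge(req: Request, rate_limit: int = 100) -> str:
    """Return 'blocked' or 'forwarded' based on a simple rate heuristic."""
    if req.requests_last_minute > rate_limit:
        # Looks like flood traffic (e.g. a DDoS participant): stop it at
        # the edge, before it ever reaches the origin server.
        return "blocked"
    # Otherwise send it toward the origin over an optimized route.
    return "forwarded"

print(handle_at_edge(Request("203.0.113.7", "/tour-dates", 3)))      # forwarded
print(handle_at_edge(Request("198.51.100.9", "/tour-dates", 5000)))  # blocked
```

The point of the sketch is only that the filtering decision happens at the edge facility closest to the visitor, which is why the attack traffic never has to reach the customer's own servers.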
And for some context of how this all went down, the letter to the staff was the blog post at first, before it went through legal and editing and everything else. And I'm happy to walk you through exactly what the sequence of events was, but basically, we were getting questions from our internal staff, and Michelle, my co-founder, said, you've got to email everyone. And so I copied and pasted the first five unedited paragraphs and sent that out, and our CFO called me and said, Matthew, I really don't think this was very smart, and, he's very German, he said, these things inevitably leak. And I said, Thomas, we're a tiny little organization, we've got 500 people, we've never had anything leak. And then, of course, it leaked. So anyway, this is what I sound like when I sit down and I'm not edited. Well, I'm also going to read it, simply because I love legitimate excuses to use profanity in our public events. My rationale for making this decision was simple: the people behind the Daily Stormer are assholes, and I'd had enough. Let me be clear: this was an arbitrary decision. It was different than what I talked with our senior team about yesterday. I woke up this morning in a bad mood and decided to kick them off the internet. It was a decision I could make because I'm the CEO of a major internet infrastructure company. Having made that decision, we now need to talk about why it is so dangerous. Literally, I woke up in a bad mood and decided someone shouldn't be allowed on the internet. No one should have that power. I tend to agree, and I'm concerned that this sets a bad precedent, which was a concern that you shared in the blog post, because you had never before made this kind of content decision. And you wrote: it's powerful to be able to say you've never done something, and after today, make no mistake, it will be a little bit harder for us to argue against a government somewhere pressuring us into taking down a site they don't like.
So what was up with that, Matthew? Tell me more about the decision, both why you thought it was appropriate at the time and why you have a broader concern about it as a precedent or a policy. Yeah, so the people behind the Daily Stormer are really difficult people to advocate for. It was just horrible content. And, like, yesterday 30,000 new customers signed up for Cloudflare, and we don't screen the content; they just sign up. So at some point, and I actually don't know when, the Daily Stormer had signed up. I had never even heard of the Daily Stormer until really some work by the Southern Poverty Law Center and some other organizations that had looked into ways that these guys were actually abusing people who had reported abuse. If memory serves, about three months before we actually decided to kick them off, we started to get reports about what would happen when people would submit abuse reports to us. Again, we act as this shield that sits out in front of things, and so it's hard to figure out where something is actually hosted if we exist in front of it. And so we see ourselves not only as a proxy for the traffic which is passing to us, but for any abuse complaints that pass through us. And we truly hadn't ever considered that people could just be really evil. So you would receive complaints about a particular site, and you would forward those complaints. So if someone said, hey, Daily Stormer, people are harassing me, that would get forwarded to the Daily Stormer with their contact info. And then, and in retrospect this seems obvious, but we just hadn't contemplated the level of evil that exists in the world, the Daily Stormer people would dedicate the next week to harassing those people. And while we generally spend our time dealing with relatively technically sophisticated organizations that kind of understand what Cloudflare is, the vast majority of the world doesn't.
So they just didn't have this expectation of what was going on. And so that was the point at which these guys came onto my radar screen and when I first heard about them. And it was really concerning to us, because in that way we were actually facilitating harm. We reworked our abuse process and took this into account, and it got better based on that feedback. But that was sort of strike one against these guys. And we're a private company, right? And so it would have been very easy for us in this case to say, Section 14G of our Terms of Service gives us the right to terminate these guys for any reason, right? Which is what Google did, which is what GoDaddy did, which is what most other organizations... Just to be clear, Google and GoDaddy in their roles as domain name registrars? Correct. That is what they had done in those cases earlier in the week with the Daily Stormer. What happened the weekend before we actually made the decision to pull the plug was, you had what happened in Charlottesville. The Daily Stormer posted a really offensive piece of content about how, basically, the young woman who was murdered in Charlottesville deserved it. And it was gross. And then there was just an enormous public outcry. And that public outcry caused, over the weekend, the organizations that had registered the domain DailyStormer.com, starting with GoDaddy and then with Google, to pull service from them. And there was just an online pressure which was building to say, hey, this is ugly content and it doesn't deserve to be online. And we were feeling that pressure. People were saying, you should terminate service too, you should terminate service too. And we'd always thought that that was a very dangerous precedent for us to set. We actually publish a transparency report that has a series of bullet points of things we have never done. And one of those things was, we've never taken anyone offline due to political pressure.
And we do that in part because that's sort of our promise, not just to our users but to the internet in general: if we're ever forced to do that, then this bullet point goes away, and that's sort of your early warning that there may be something that you should worry about. We had thought that, as this kind of pass-through network, the right role for us was to be neutral. We'd argued about that internally, but we hadn't really had that debate publicly, and over the course of our history, about every six months, something really controversial comes up. The first example of it that I can remember was in 2011, when LulzSec, the hacking group, signed up for us. We had no idea what LulzSec was. The world didn't have any idea what it was. And the next thing we know, we're getting calls from reporters saying, why are you supporting the people that hacked PBS and Sony Pictures and later the CIA and a whole bunch of other things? And we were blindsided. We were a tiny little organization. We had 15 people, and suddenly we're in the middle of this, and it made us start to try to think through this. And it's been terrorist content at various times, and people who are cruel to animals at other times, and in this case it was neo-Nazi content. And so while we'd had those discussions internally, and we would get calls from reporters all the time, I don't think that we'd done a very good job articulating what the right policy was globally. And it's tough. There's a former congressman named J.C. Watts who said the problem with politics and the press is, when you're explaining, you're losing. And we thought it was really important for us to be neutral, and we had pretty good arguments, we thought, for why that was the case, but we were sort of explaining that. And we were losing.
And it's really tough, in 140 characters on Twitter, when people are like, take down the Nazis, take down the Nazis, take down the Nazis, to say, well, actually, there are these bigger concerns, and let's have a conversation around that. And so on Monday a number of providers pulled service and basically said, that violates our terms of service, and found reasons to terminate them. And we started to feel a lot of pressure. So I woke up and sent emails to a number of civil society organizations, civil libertarian groups that really had been supportive of us in helping us think through what the right policy was, and I said, hey guys, we need a little bit of air cover here, because we're getting a lot of pressure to take things down, and I've seen this story before. Day one, it's sort of the Twitter mob: they get angry and put pressure on us and they think we're going to do what they're asking us to do. Day two, they start to get frustrated that we haven't done that. Day three, they start to plot about how they can put more pressure on us. And day four, we start to get calls from our customers that pay us millions of dollars, and those customers start to say, hey, why are you supporting Nazis? And again, we're a commercial organization, and even though we think the right policy is neutrality, if you've got a number of customers who are like, listen, if you continue to do this, we are going to pull our business from you, that puts a lot of pressure on you to change your policy.
If we're really honest about what causes Facebook and Twitter to control content, and it's hard to see this from where we're sitting here in D.C., it has much less to do with regulators up on the Hill and much more to do with people sitting in Procter & Gamble's office saying, I don't want ads for Axe body spray, I guess that's not P&G, that's Unilever, to be next to terrorist content. That drives a lot of decisions, and again, that drives decisions in our case. And so I wrote to the civil society organizations and said, hey, would you mind putting out something saying it's really important for a deep infrastructure company like Cloudflare to be content neutral? And the responses came back like, hey, we really appreciate your policy, but this is sort of a hot-button issue. I don't think we got that inquiry, because I think we would have loved to have said that. No, you will next time. And I think that that's easy to say now, and I think a lot of the organizations said, listen, we were thinking about it, but the reality is, Monday I need that; Tuesday it's building; Wednesday it's building; by Thursday I know I'm going to be getting calls from customers saying, do something about this. And we felt incredibly isolated and alone. So on the Tuesday night beforehand, we'd actually had all of our legal interns over to my house for dinner, and we'd stayed up late talking and drinking wine, and my fiancée at the time, wife now, congratulations, thank you, said, hey, you know, I'm really trying to think about this from a bunch of different perspectives, and I see how this is a complicated issue. And I was like, you too? And then on Twitter there were a number of CEOs who I really liked and respected who were starting to say, you know, Matthew, do the right thing, take this content down. And I remember I went to sleep, and I was sleeping on one side of the bed and she was on the other side of the bed, and I was just stewing. And I woke up really early and called one of the CEOs, who was a friend in some
other respect, and I thought, I'm going to explain why it's the right thing for us to be neutral. And I couldn't get a word in edgewise. He was like, listen, we have an opportunity here to do the right thing, to stamp this out. I'm sorry, who was this? A prominent CEO who, again, made an impassioned argument. He said, on my platform I'm not going to allow this stuff; you shouldn't either. And I couldn't even get to the argument. And I remember calling Doug Kramer, who's our general counsel, and my co-founder Michelle, who's our COO and the person who runs our trust and safety team, and I said, guys, we're just losing. We're losing this argument, and we've got to change the conversation. And we had a discussion, because we knew what the next chapter was going to be, and we knew that we were going to be forced at some point to take action. And the conclusion that we came to was: we had been trying to explain our policy to the rest of the world, and what we needed to do was get the rest of the world to explain to us why this was really dangerous. And so we made a determination, and that determination was: we're going to terminate this one customer, but then we're going to talk about exactly why. We're not going to hide behind our terms of service. We're not going to say they weren't even Nazis and we're not talking about this anymore. We're going to say: we did this; at the end of the day it was an arbitrary decision; and by the way, whenever any company makes these decisions, it comes down to a small set of individuals making what is ultimately an arbitrary decision, because if you don't believe there are a whole bunch of neo-Nazi sites still using GoDaddy as a registrar and Google as a DNS provider and everything else, then you're just not paying attention, but the crowd isn't screaming about them right now; and this is really dangerous. And we'll take every meeting and every press phone call, and let's talk about this, because we've been talking about this inside the four walls of Cloudflare for
six years, but we had made a mistake by not having this conversation more universally. And so the reason I'm here today is not that I know exactly what the policy should be; it's that we're sort of on a listening tour, trying to say: okay, here's where Cloudflare sits, here's where we operate, here are the challenges we have, and here's the bad stuff that's out there. 10% of the internet uses us. That means 10% of the really horrible, awful, ugly things on the internet use us, and also 10% of the really amazing, wonderful things on the internet use us. And what should we do? How do we come up with a policy which makes sense for us as a company that employs 500 really smart technical people who really believe in the mission of doing the right thing and helping build a better internet; for our customers, 10 million customers, 30,000 more every day, that are for the most part trying to publish amazing things and provide amazing services, and sometimes are doing awful, ugly things as well; and then for the internet as a whole? 2.8 billion people pass through our network every single month, basically the equivalent of Facebook's traffic. 99.999% of them have never heard of us, and when we're doing our job right, the internet should just be a little bit faster for them and a little bit more secure. So then what is the right role for a company like us? I wish I could say, this is exactly the right policy. What I think, really, is that this is the right time for us to be talking about that policy, and doing it broadly. And frankly, other companies should be doing that as well, and not simply saying, look at 14G, we have the right to do this. And certainly we appreciate the willingness to have that conversation, even if we might second-guess the ultimate decision itself, which is very worrisome as a precedent from our perspective. But in terms of a listening tour, this is mostly intended to be a Q&A, but I do have a few thoughts about the action you all took and
why it concerns us, so I might talk for a few minutes here. You mentioned in your blog post that one of your staff half-jokingly asked, is this the day the internet dies? Which is perhaps a really dramatic way of thinking about it. But if we step back: this was clearly a very intensely difficult and personal decision on a micro level, but stepping back to a macro level, we have a First Amendment that's premised on this idea, ideally, of a marketplace of ideas that decides what is good and what is bad, and when there is bad speech, the best answer to that is good speech, or speech that counters it. And you've said this yourself a number of times. It's sort of the fundamental idea behind our First Amendment and behind the human right to free expression, such that, speaking generally, only the most heinous First Amendment-unprotected speech, like, say, child porn or direct threats of violence, is off limits. And if you look at the internet, the internet has been an incredible engine for that free speech and free expression for the past 25 years, with few limits. The ISPs have mostly stayed out of trying to block or moderate content, with a few exceptions, see throttling Netflix, or scanning for matches of particular child porn images, and in fact they've actively avoided getting dragged into fights over scanning for copyright infringement, for example. Meanwhile, with the help of CDA 230, Section 230 of the Communications Decency Act, the law that basically shields publishing platforms from most liability for their users' content, in part to allow them to moderate that content without engendering new liability, it's through that law, in part, that we've seen over the past 25 years an explosion of platforms scaling into the billions, which would not have been possible if these companies were legally obliged to try to moderate every piece of illegal content off of the platform. It simply would not work; it would not scale. And, speaking generally, the
platforms were relatively limited in their moderating of content, mostly targeting illegal stuff or things that would actually upset their customers. And most of the criticism of that practice was focused on them taking down too much: hey, Facebook, why are these breastfeeding photos coming down? Why did you take down the photo of the girl covered in napalm from Vietnam? Taking down violent content that was newsworthy, things like that. And the primary response to bad speech online was other people arguing with it or shouting it down. But in the past five years, and especially the last year, that has completely changed, on a lot of vectors. The ISPs at this point are pushing hard for the right to make decisions about what content flows over their networks, at what speeds, whether they can block it or throttle it or discriminate against it, and the FCC chairman is enabling that. The platforms are under enormous pressure from all sides to more aggressively take down a wide range of content: not only the neo-Nazis, but also terrorist content, ISIS content, harassing content, often political ad content, very often speech that is objectionable but protected by the First Amendment and the human right to free expression. And at this point the companies are, due to that pressure, falling all over themselves trying to take down more, often in automated ways based on AI algorithms that are also leading to a lot of collateral damage, in terms of unobjectionable content, or content trying to counter the objectionable content, getting taken down. Meanwhile, CDA 230 is under sustained attack on the Hill, as policymakers, well-intentioned I think, are trying to address the issue of child trafficking online, but in the process are threatening to upend this important policy balance that has made the internet flourish. Meanwhile, people like Disney are supporting that in the hopes that once that chink in the armor comes through, then they'll be able to upend the
compromise we came up with in the Digital Millennium Copyright Act about how to handle copyright liability online. You have Germany passing a law requiring platforms to take down objectionable or clearly illegal content within 24 hours. You have Amber Rudd, the Home Secretary of the UK, who was on this stage two weeks ago, basically saying the companies should be waving a magic AI wand to make sure that they detect and take down, within two hours, every piece of terrorist content. You have the companies having to hire thousands of human moderators to try to deal with these pressures, in a way that no startup could ever have the resources to do that level of content moderation. So for all you people who are complaining: be careful what you wish for, because if you require them to have armies of moderators, you're basically locking them in as our platforms, because no one could ever compete. And then, meanwhile, and somewhat contradictorily, you have some of the same people who are arguing for more takedowns of content also arguing for greater neutrality in regard to content on the platforms, whether that's Steve Bannon saying they shouldn't be imposing their political biases by taking down neo-Nazis, or Al Franken arguing that the platforms should be neutral in a manner that prevents them from abusing the power of the platform in an anti-competitive way. Either way, it's still policymakers talking about wanting to tell the companies how to prioritize content, what to keep up, what to take down. And in that context, Cloudflare, you, Matthew, essentially a provider of critical infrastructure for the internet, voluntarily, arbitrarily demonstrated not only your technical capability but your willingness to take down something you found objectionable, basically strengthening the hand of anyone who has been arguing that those with the power to do so should do so, and perhaps should be made to do so. And so although I really appreciate
that your decision was a prompt for this conversation, my fear is that we actually have been having this conversation along many threads, and this decision pushed them all in the wrong direction. That's my fear, and so I'm wondering how you feel about that. And then we can turn to a sort of more positive, what next: what should a reasonable approach to these questions look like? This was why I, and I think others in civil society, were concerned: we're afraid that this became fodder for those arguing that more people should be doing more of what you just did, in a way that ultimately is not only bad for free speech online but potentially bad for actually countering neo-Nazis, because it's hard to argue against things that you can't actually hold up and say, this sucks, while it also plays into their narrative of being a discriminated-against minority, no matter how ridiculous that is. And so I think there's a question about whether this is actually effective in the first place as a way of countering this trend in our culture. So that's where we're concerned, and I think a number of others in civil society were concerned, while giving you the credit that certainly there were other providers who did similar things that did not also say, and this is really worrisome. So we do appreciate that. And then, just a point of process: we're going to break this up a bit and allow questions after this segment, and then move on to another segment and then allow questions. So, you sound like me four years ago. And if you go back, there's a blog post, which is still one of the most-read blog posts that we've ever posted, from when we were thinking through LulzSec, that's titled Cloudflare and Free Speech. And I'm the son of a journalist, and I believe in the First Amendment. I'm almost a free speech absolutist, and I believe in a marketplace of ideas. I have sat with engineers in China and had them say, it must be so great
for you to have Google, and I say, oh, is that because you want to look up Falun Gong? And they say, no, no, no, it's because I need to find the script in Python that can pull this amount of data out of the database, and I just don't have the tools to be able to get that. So, I mean, I believe economically that access to information leads to winners. I believe, as with everything that you said, that the best way to fight neo-Nazis is to expose their content and laugh at them, make fun of them. The Streisand effect is perfectly alive here: if you look at Google searches for Daily Stormer, this particular site, before the incident and after the incident, the baseline was one and we're now at three. So there are three times more searches for this content after these guys got taken off the internet than there were before. But again, I come back to J.C. Watts: when you're explaining, you're losing. And while I am a big believer, and actually this town is a big believer, in the First Amendment and freedom of expression, and I think over time that will largely win, with some erosion around the edges in places that you pointed out, that I'm concerned about too, it doesn't play well in Germany. It just doesn't. If you stand up in front of the German privacy commissioner and say, well, we have a First Amendment, and freedom of expression is a right and part of our tradition, she will say, well, we've had a very different tradition and a very different set of experiences, and that's cute for you, but it isn't part of what we're doing. And in our case, 90% of Cloudflare's customers are outside the United States; 60% of our revenue is outside the United States. And so we inherently have to be making these arguments not just here but in Brussels and Beijing and other places where it's much harder. And so what I think was interesting about this, and where I would push back a little bit, is that we needed to get past the First Amendment, freedom of expression argument. And I would
encourage you, as you're making this argument, to think about what other frameworks make more sense. I'll tell you the one I've been playing with in my head, which has been part of these conversations. While the First Amendment is sacrosanct here, it is a minority opinion around the rest of the world. Due process, on the other hand, is almost universal. If you are in Germany and you say due process is important, they say yes, it is. If you are in Brazil and you say due process is important, they say yes, it is. If you're in Turkey, they say yes, it is. In Russia, they say yes, it is. In Beijing, in China, they say yes, it is. Now, we may not like all of those processes, but it turns out that a government's political legitimacy depends on having sets of processes that exist and then following them; otherwise it loses legitimacy as a government. So due process is almost a precondition to having a stable government. Then what are the foundational principles of due process? For sure transparency: to be transparent, people have to know what rules apply to them. Consistency: if I break the law in one way, and someone else breaks the law in the same way, there should be generally consistent application of that law. And accountability: for the people who enforce the laws, you should be able to understand who they are and how the laws are being enforced. If you take those three principles and go around the world, whether it's a democracy or a dictatorship or otherwise, it's remarkable how universal they are. And now, if you use that lens to ask yourself who or what or when or why content should be regulated online, I think it leads to conclusions that are much more palatable and much more sustainable. So can Cloudflare ever be transparent, accountable, and consistent? 2.8 billion people use our network. 99.999% of them have never heard of us.
This is a very tech-savvy room, and I challenge you to tell me which car-hailing service, when you push the button to hire the car, is passing through Cloudflare's network and which one is not. Good luck; there's no way you can know that. I could hire skywriters to explain what Cloudflare's policy is, and I still can't explain what Cloudflare does to my dad. So if that's the case, how am I ever going to be clear? And if I can't be clear, if you don't even know that I'm there, being the one making those decisions, can I ever be transparent about what my policy is? And if I can't, can I ever follow principles of due process? I think there's a challenge there, in the sense that due process is typically a right expected of governments. And that is one reason why I think it's intellectually and ethically and legally defensible to say it's not our job to make decisions about content. If you, the state, want to enforce a law to make us do something with regard to the content, which necessarily comes along with due process if you're actually a functioning state, we will do that; but otherwise we will not. It's when the companies voluntarily take on this role as the private speech police that we have to start talking about weird things like what due process the companies owe, which we could avoid if the companies weren't privatizing this state function. Yeah, but again, you're... And that's also a response to Germany as well. You can say to Germany, we will comply with lawful orders, but that's different from us voluntarily addressing this speech. I agree, but, with respect, what we need is a framework for when there is pressure on these companies. Again, we are a commercial entity. Almost all of the internet is... Absolutely.
And so when Procter & Gamble says, either figure this out or we're pulling our ads, or when a big Cloudflare customer says, either kick the neo-Nazis off or we're going to stop using you, there needs to be a real social contract where people understand something. Think about it in a pre-internet era. If the phone company listens in on your conversation and decides you're being a racist, and so they unplug your phone, forget common carriers, pre all of that, that was understood to be unacceptable. And we understood that as a society because it just felt wrong. If you try to go to some content and your browser says, oh, that's objectionable content, you shouldn't go to it, that feels wrong to us. And yet your browser makes that discrimination all the time, because sometimes it says, oh, this is malware, it could infect you, it could hurt you, and it stops you. There is nothing that prevents them from saying, oh, this is neo-Nazi content that could hurt you. And yet we have a social contract around what that is. What I think we need to do, though, is really have a conversation about where an appropriate place for editorial judgment is, and where it is not. And I'll tell you the one piece of data that makes me, while deeply concerned, incredibly happy that we provoked this conversation, and why I think the due process argument is a much more sustainable argument globally. Several German newspapers wrote editorials that said: neo-Nazis are bad, but Cloudflare shouldn't be making that determination. At which point I said, yes, because that's the right answer. Deep infrastructure companies probably shouldn't be making that determination. But we first have to have a conversation about why Cloudflare is different than Facebook. Well, I think that's a fruitful avenue of conversation.
I think one of the big differences is that if you're looking at a content platform like Facebook or Twitter, first, they literally are the publisher of the content. The user is pressing post, but their name is on it, it's on their platform, they are publishing that content. That's why they needed a law to help shield them against publisher-based liability. Second, they actually are trying to manage a community of people, such that if there is harmful content on their platform, it impacts everybody else on that platform. If you're talking about an ISP, a domain name registrar, or Cloudflare, neither of those justifications applies or makes sense. I understand that, and you understand that, and most of the people in this room understand that, and even most people listening on the livestream understand that. But 99.99% of everyone else in the world doesn't. And so the question is how you start to make people think about those conversations. When you're explaining, you're losing, and we can all preach to the choir, but what we really need to do is get people to think deeply about what the internet technology stack is without their eyes glazing over. We're all a bunch of geeks; we go home for Thanksgiving dinner, and people are like, why is Facebook helping the Russians throw the election? Your weird uncle Ted is asking you questions about that. We've got to get weird uncle Ted to say, hey, actually, I don't think that Matthew Prince guy should be making determinations about what I see and don't see online; I've never even heard of this Cloudflare before. If we can convince weird uncle Ted or German uncle Fritz or Chinese uncle Lee to say, actually, while we may have a different tradition of where content should be regulated, there should at least be transparency about it. You talked about Facebook as managing a community; it's so much simpler than that, I think.
There is no more fundamental editorial function than ranking things. Top 10 restaurants in D.C.: that is an editorial decision. What does Facebook do all day long? It ranks things: here's what you see. They are the modern newspaper, and while they push hard to say "we are not a media company," of course they are. The search engine? Exactly the same. And the miracle, this is the miracle, is that there is not a Fox News search engine. Think about it for a second. News Corp has been fighting this war in Europe, trying to say our content should be paid for. If I were running News Corp, I would launch the Fox News search engine: big American eagle, search box, only American stuff, fair and balanced. It would immediately have 5% of all U.S. search traffic; that's a $10 billion company overnight, instantly. The fact that it hasn't happened just shows the genius of Google and how they have threaded the needle of saying, it's just an algorithm. I write algorithms. I have political biases, and they creep into them over time. I think Google has done an incredible job, and you can point at exceptions around the edges, of being apolitical, but that is incredibly, incredibly hard. On the other hand, think about Twitter. Is Twitter an editor? Take classic Twitter, not modern Twitter, and it actually is much more just a technology platform. I am the editor: I pick what I follow, and their job is to make sure they get it all to me. It's not filtered or ordered or sorted. So if you want to understand the existential crisis Twitter is going through right now, what they are really going through is making the determination: are we going to be a technology company, or are we going to be a media company?
Everyone says, oh, that should be easy, they should be moderating things, and it makes total sense, except that it is fundamentally a giant shift in who they are. They are tiptoeing into it, and that's why it looks like they're flailing around. They were the free speech wing of the free speech party: we're not going to moderate anything except for illegal content. Totally, but that's because Facebook is a media company making editorial decisions. Twitter was not; they increasingly are, and as a result, the transition they're going through is going to feel awkward. It's so much easier in our world. We are getting bytes from point A to point B, and we are not making editorial decisions. Well, mostly; we are a little bit. If you're trying to launch a SQL injection attack, if you're Kanye West trying to launch a SQL injection attack to say Beyoncé should have won the award over Taylor Swift, we're going to filter that. But I'm not sure how much editorial or political decision there is in that, though you have extreme people objecting even to that. And then even further down the line, if you're an ISP, where do you go? That, again, is a conversation this room gets implicitly. What we need to make sure is that the people up on the Hill understand it, that the people in Brussels understand it, that the people in Beijing understand it. This decision ate me up inside, because I'm skeptical of slippery slope arguments, but sometimes they're right. And in the months that have gone by since, we've had over 7,000 requests to take down different websites spanning the political spectrum.
We had requests about some cooking blog. Someone said, you took down the neo-Nazis, you should take down this cooking blog, and we spent days trying to figure out what was offensive about it, to the point people were saying, maybe we should make the recipes and see if they just taste awful. People were literally saying, you did this, you must do this. We're being sued by a pornographer in a copyright case, and I came back from my honeymoon and the very first thing I had to do was get deposed on: if you can take down neo-Nazis, why can't you take down the copyrighted content? Right? So the slippery slope is real, and it's there. From our perspective, what we were trying to do, and what I worried about a lot, and I appreciate that you and others worry about it, and I am incredibly thankful that more people are now talking about it, is: how do we understand when it is the right time and the wrong time, the right place and the wrong place? How do we say that all tech companies are not the same? And how do we start to create a framework that doesn't just make sense here? Because frankly, I'm relatively unworried about internet regulation in this country; I am enormously worried about it in Europe and a lot of the rest of the world. So actually, I think we're going to restructure and go straight to questions after a few more minutes. Okay. One of the things I think is important in explaining the issue to uncles, or aunts for that matter, in terms of the slippery slope is this: it has gotten really politicized, but if you create the tools or the pressure to take down extremist content, it will apply across the political spectrum. And in fact, we are seeing that now.
A good example is Google, where they tweaked the algorithm to deprioritize more extremist content in reaction to the issues around neo-Nazis, and that also started deprioritizing things from more leftist news organs like AlterNet. And you have many of the same left-wing people who were saying, take down the Nazis, then complaining: wait, do it in a more targeted way that privileges our political perspective, not theirs. Or you see Twitter or Facebook taking down far-right extremist content, but then also taking down left-wing content that I would not consider extremist, like Black Lives Matter, but that others might. So it's a good object lesson: if you force them to deal with extremist content on one side, that is ultimately going to get applied to your side's extremes as well. And do we really want that? Do we really want these private actors forcing all of our discourse into some moderate middle? So Julius Genachowski, former chairman of the FCC, and I were chatting some time ago, and I said, gosh, it just seems like we're in this time of political craziness, with all these extreme views. I fondly remember the time of my childhood, when everything seemed so much more stable. And he said, no, no, no. Maybe the aberration was 1940 to 1992. Because if you go back to the Adams election, the second election in the United States, it was ugly, right? And early-1900s pamphlets about who's sleeping with whom; just ugly. Then what happened? You had a new technology, actually several in a row, but let's talk about television in particular, that was so powerful and so wealth-creating for a really small set of companies that controlled it that they looked out and said, okay, we've got this.
We're making a ton of money; what are our risks? And the number one risk, whenever you've got a new technology that is incredibly wealth-creating, comes down to regulatory risk. So they said, how do we moderate regulatory risk? Okay, I know: equal time. We'll give both sides equal time. How can they fight about that? Straight down the middle; every one of our news anchors will be from Kansas. Literally. And so you've got this male, white, right-down-the-middle, elevated, more high-minded content. We might all look back on that as what felt like a more civilized time, and yet there were a whole bunch of opinions and attitudes and faces that weren't seen during that time. Now think about that in terms of the value being created by the new technologies of our time. You're Facebook, and you look out and ask, who's going to beat us? What's our biggest risk? And by far the number one answer is regulatory. So then you think, what am I going to do to respond to that? And the answer is: just get rid of the extremes and try to make Facebook look like the Peter Jennings news hour. Seriously. And we need to think through whether that's right, because that feels like the inevitable outcome, given the value of what these platforms perform. And again, it's not clear. When I put my pessimist hat on, I see two dystopian futures. One dystopian future is that the existing tech giants get locked into place and create a very sanitized version of the future. Facebook's ultimate goal is to recreate AOL. That's their business model: how do we get everything on it, own the content, sanitize it, make it a happy place where everyone feels good about looking at things?
That is their goal, and that is a dystopian future. The alternative dystopian future is that Fox News launches a search engine, and then so does the Daily Stormer, and so does everyone else, and we all end up in our filter bubbles, looking at increasingly extremist content and not communicating with each other. Those are both really terrible outcomes. But I think we're going to get both of them; I fear we will. You might. But again, I can stand on the roof and yell about freedom of expression and how important it is, and I understand that plays well to this audience, but it doesn't play well to a lot of the rest of the world. If Facebook is making editorial decisions, at least I know they exist. That's like the New York Times: there's a masthead, and you know they're doing it. If we start to do it, it's like the printing press operator who goes in late at night and says, I don't like this story, here's some whiteout, and knocks it out, and you have no idea that we did that. So if we're going to draw lines somewhere, it's a no-brainer that we should be drawing lines to say Cloudflare shouldn't be picking winners and losers and making editorial decisions. And of course, what we did with the Daily Stormer is just the wrong policy. But I need your help in explaining this to the rest of the world and asking: do you really want that Matthew Prince guy deciding what content you can see? Because if we don't create that social contract, if we don't get Der Spiegel to write that editorial, then we are going to get forced down that path, no matter how much I believe in the First Amendment.
Well, certainly one baseline easy thing is transparency, something we've been pushing for with the companies and haven't quite gotten yet, except in weird spurts where they feel they need to demonstrate they're doing things: transparency about their processes for takedowns, the basic numbers in terms of how much is coming down and in what categories. We now have a very mature field of transparency reporting around government requests for data and government-requested takedowns, but in terms of voluntary, terms-of-service-based takedowns, we have nada. So we really want to see that. I have one last question, then we'll take it to the group. We're talking about, say, Facebook shaping the content on its platform. You mentioned former FCC Chairman Genachowski. Chairman Pai, just yesterday, who turns out to be a rabid Cloudflare blog reader... Yes. ...cited you for the proposition that maybe the platforms pose a greater threat to net neutrality than the ISPs. Well, it's not your position. It's an odd argument that Cloudflare, a company of 500 people with a valuation several orders of magnitude smaller than Comcast's, is an equal threat to Comcast. But I'm flattered. So that's one question. I also think that if the problem is a group of individuals with too much power online, the solution is not to create another group of individuals with too much power online. That doesn't feel like the right solution. At least it's a level playing field. Well, again, where Chairman Pai and I actually see a lot of things very similarly is that we want policies that encourage innovation and investment and new entrants into the market, and I think he's really genuine in wanting that.
And I think that's really important, and it's the right lens to look through. I was pretty critical, I just have a thing about criticizing FCC chairmen, I guess, but I was pretty critical of Chairman Wheeler, the previous FCC chairman, when they used Title II to enforce network neutrality rules online. What I wrote at the time was that it's kind of like being in the car with your dad, and you're the kids in the backseat acting up, and he says, you keep acting up, I'm going to pull this car over. That's Title II. And the reality was, and we can again point to exceptions around the edges, that for the most part in this country, ISPs were relatively neutral, for various reasons. Netflix, I think, is a terrible exception, because hard cases make bad law, and Netflix at its peak on a Sunday night is responsible for almost 40% of all bandwidth. That is a technical problem, a really hard, difficult one to solve. But I think it is also correct that there is a real risk from terminating-access monopolies, because no matter what Chairman Pai says, I certainly don't have a lot of choices in who my ISP is at my home in San Francisco. And if you look at households around the country, one of their largest expenses every month is to whoever their telecommunications provider is: television, phone, internet. So there is a monopoly in whoever I select as an internet provider for content to get to me. At some level, if you want to reach my eyeballs, there has to be a relationship there, either directly or through a third party. So there is real risk, and we should worry about that risk.
But I was worried when Chairman Wheeler said, listen, you're not acting up yet, but I'm still going to pull the car over. That was taking the extreme action before you had to. And what I worried about was that as soon as you formally set down "here are the laws you have to follow," that just gives smart lawyers the ability to think through how to find the exceptions around those rules. So my preference would have been: don't regulate the internet under Title II, but keep that weapon as a threat. Publish a set of principles and say, we're going to watch really, really carefully, and if at any point you're abusing your role as a terminating-access monopoly, know that I can pull this car over at any minute. I think that would have been the right policy. That is not what happened, so now we have a new policy, and now we have a new FCC chairman, and now we're going to change the policy again. If you're going to build a tall building, you need a really stable foundation, and if the policy changes with every single administration, what I worry about is that it fundamentally discourages innovation, discourages investment, discourages new entrants, and we end up with exactly the opposite outcome of what Chairman Pai is reasonably trying to create, and what Chairman Wheeler was reasonably trying to create. So I think there is real risk here, and while I wasn't a super fan of using Title II, now that it's been used, I'm not a super fan of undoing it, because I feel like we're just going to yo-yo back and forth, and that is the worst thing we could possibly do for overall innovation. For what it's worth, I was a super fan of Title II, but we don't need to argue about that here today. Maybe after the panel. Because we want to move on to questions: do people have questions?
Yes, and there will be a mic coming to you; just wait for the mic. Thank you. Mr. Prince, David Green with NBCUniversal. You talked a little bit about the content protection stuff. Among your customers are some of the most notorious pirate websites in the world, like Solarmovie, Movie4k, etc. And these sites, as you know, aren't really a free speech issue. These sites are making money by giving other people's property away, hurting not only the U.S. economy but the creatives who are trying to make independent movies as well as the blockbusters. And some of their business model is known to infect their visitors with malware. So my question to you is: why do these companies deserve to be your customers? When are you going to say, these guys are assholes, I'm going to kick them off, or let's do the right thing and take that content down? Hello, slippery slope. Indeed. So the first thing is, by and large, neither neo-Nazis nor pirate sites pay us anything, so it's not a financial issue for us. And if we fired them as customers, the content wouldn't go away. Oftentimes, when you talk to law enforcement agencies and ask, would you like us to kick this off or keep it on, for lots and lots of reasons you can imagine, they would actually prefer the content be on our network rather than off of it. We work with NBCUniversal and lots of different content providers to ensure that the rights of rights holders are protected. And true north for us is that Cloudflare should not make the role of law enforcement, whether private or government, any harder. So we have things like the trusted reporter program with NBC, which allows them to notify us, and us to get information to hosts about taking that content down, ensuring that there is compliance by the hosting providers under their DMCA obligations.
But we also shouldn't be a place that makes the job of law enforcement any easier. Because as more and more of the internet sits behind fewer and fewer providers, if those providers become a choke point, then that becomes a point of control. And so while I really appreciate and respect the rights of rights holders, and I think it is important for us to ensure we are not interfering with those rights holders being able to enforce those rights, the mere existence of Cloudflare should not automatically make it massively easier for people trying to enforce copyright, or people trying to police, in Germany, hate speech, or in China, dissident speech. I would encourage you to think about this: if we put in place things that said NBCUniversal can pick and choose which customers can and can't sign up for Cloudflare, how do I then not put those same provisions in place for the Russian government or the Turkish government or the Chinese government? So I think there is a right role for law enforcement, including private law enforcement, and I think true north for us should remain that Cloudflare shouldn't be making that role any harder, but also no easier. That's the right north star for us. And in places where there are legitimate law enforcement issues that need to be taken into account, we should be working with legitimate organizations like NBC and others to ensure that their rights are maintained and we're not standing in the way or making anything worse. But at the same time, I think it's really important that we also don't make it any easier. Thank you. Tara Maller, fellow with the New America security program and also with the Counter Extremism Project. In the name of disclosure, we've written some letters about terrorist content on your platform post-Charlottesville, and I was wondering if I could push you. You said a framework needs to be developed.
There already are terms of service, and there already are laws on the books about certain things, for example terrorist activity and terrorist financing. And so I'm just wondering: in some of these cases, terrorists are using platforms for safe haven; they could be linking to terrorist financing, which violates OFAC policies. If I were to buy a t-shirt from ISIS, I'm breaking the law. But if you're a company providing them with a place to do their business or to link out to crowdfunding... I guess I'm just trying to understand what framework you think would make your job easier in these decisions. It seems like there are quite a few frameworks. If it's not going to come from regulation from Capitol Hill, are you just saying you need more advertiser pressure? Your terms of service already prohibit a lot of this content; it seems to be more of an enforcement issue with the framework that's already there. Yeah, so we're not an advertising-supported company, so advertiser pressure doesn't affect us one way or another. What I think Kevin alluded to is that there are laws, and we respect those laws. If somebody shows that something using our network is illegal, then it won't be using our network anymore. And we respect those laws in all of the different regions in which we operate around the world, which means there is certain content that is illegal in China that is not illegal in the United States or elsewhere, and we need to enforce that. With respect to terrorist content, or alleged terrorist content, when we have a question like that, we are not political scientists or terrorism experts. So the question is, where do you look? One thing we could do is look to Twitter, or to private organizations or people who say, go take this down. And again, that would be expedient for us, because it would avoid controversy.
But when we've actually looked into that: we had a group of people on Reddit who were just insistent that we had to take down this ISIS site. Then we asked the ISIS experts at the State Department, and they said, no, no, please don't take that down. Those are the Kurds. I don't know; I'm a computer scientist, not a political scientist. So when we see those questions, we go to government and ask, what would you like us to do? And the answer isn't consistent. Because for a platform like us, if we kick something off, it doesn't make the content go away, and it does make it more difficult for legitimate law enforcement agencies at some point to execute legitimate court orders and search warrants. So we sometimes get very different answers depending on whom we ask within governments. I'd love just a consistent answer, and I can comply with any consistent answer. But when one branch of government says, no, no, please leave that on, and another says, no, no, please take that off, what am I supposed to do? If I just want to avoid controversy, I know the easy answer: I don't make any money off of it, kick it off. But again, I worry: were the Daily Stormer folks terrorists? I mean, kind of. So where do we draw the line? What I would love is consistent rules enforced by organizations that have political authority. What do those look like? Well, in this country, it's courts and law enforcement and legislatures and those sorts of organizations. It's not random people on Twitter, or, with respect, the think tank of the month. Hi, Liz Basuti. I'm the general counsel of Public Interest Registry; we run .org, the .org top-level domain. Thanks for a fascinating discussion.
Just to follow up on your point, though, I think the last question, and to some extent the question before it, really raised the larger issue of who gets to determine what's legal and what's not legal, right? And so we talked a little bit, or you did earlier, about due process. And in that context, what I heard was due process with a small D and small P, which is really, how do we consistently apply our terms of use, right? But there's this larger issue of capital D, capital P due process. And again, who gets to decide? We have people come to us all the time and say this content is illegal. And my question is, how do I know that? Well, it's very obvious. Well, that's great. Then you would have no problem going to court and convincing a court to issue an order that I would then be obliged and happy to abide by. And so again, it's a larger discussion. We, like almost every other company, have terms of use. And we say we reserve the right to take things down, you know, for illegal and fraudulent actions. And people sort of use that as a hook to say, well, then why don't you do it? But again, the response is, show me that it's illegal, show me that it's fraudulent, right? And in our view, we are not the proper arbiter of content on the internet, with very, very limited exceptions, child pornography. I could run the gamut, but I'm not on the panel today. But again, it raises a larger issue of who gets to determine what's legal and what's illegal, which brings you back to due process, a system of laws, and respect for the rule of law. Yeah. And so I then come back to, let's talk about, you know, what's the framework. So .org, P-I-R, like, again, it's a technically heavy audience, but a whole bunch of you don't have any true understanding of how domains get registered. Because there are registrars and there are registries and then there are TLDs and all this. I mean, it's just this crazy sort of Byzantine system.
And what it is definitely not is transparent. And if it's not transparent, is that the right place for you to be making what are effectively editorial decisions? You know, and again, we're trying to figure this out, so if people have better suggestions, you know, let us know. But the other piece that I'm trying to figure out how to interpret and how to put in place is, you know, there will be different rules in different places around the world. And you've got to figure out ways that you can respect what is effectively the sovereignty of each of those different jurisdictions. China for quite some time has made an argument that says we have a sovereign right to regulate our network. And again, me four years ago would have been like, no, no, no, the internet should be free and open. I think I'm more of a political realist today, which is, actually, you know what, that makes sense. China does have a sovereign right to regulate China. But that inherently means that Thailand has a sovereign right to regulate Thailand and Canada has a sovereign right to regulate Canada and France has a sovereign right to regulate France. And if China's rules extended beyond China's borders, they would be infringing on the sovereign rights of those other countries. And so when you have one country's laws or one organization's laws that start to have global impact, again, that gets us towards the risk of the one dystopian future where we all become kind of, you know, the Peter Jennings hour. And so regulating at the PIR TLD level to me seems incredibly dangerous because it is inherently global, right? Whereas regulating at the Cloudflare level seems slightly less dangerous because we can at least say, okay, one rule applies in the United States and a different rule applies in China and a different rule applies in Canada.
And so as we're thinking about what the framework is, and I don't pretend to know exactly what it is, thinking about how we ensure that we don't sort of drop to the lowest common denominator is, I think, incredibly important. And, you know, you think you've got issues, and you do, but just wait until the Twitter mob realizes that they can start to put pressure on ICANN or organizations like it, organizations that could do really systemic global harm and that aren't set up to have, you know, the political will to withstand the pressure that can be exerted on them. And that's what worries me. I don't think we can go much longer, we're already over time. And so I'm afraid we're going to have to close.