And thanks, everybody, for having us here to talk about some of our favorite subjects: online speech and privacy. I'm Cathy Gellis, half of this duo, and I'm going to be going first, because if we're going to be talking about potential changes to Section 230, which is what the EARN IT Act does in part, we should have a good idea of what Section 230 is and how it works, and really understand it before we break it and find out that maybe that was a mistake. It will be a mistake; stay tuned, we will explain why. By way of introduction, I am a solo cyberlawyer in the San Francisco Bay Area. I like to work on online speech issues, including Section 230 issues. I am a lawyer; Riana is a lawyer. We are not your lawyers. If we were your lawyers, we would have all signed letters saying so, and there have been no letters, just to make that clear. But we do know what we're talking about. So this presentation has two parts: I'm going first with the crash course, and then I will be joined by Riana to talk more.

Hi everyone, we are back again with more technical difficulties, but have no fear, I've got everyone back. I don't know if anyone's ever done OBS over a virtual desktop. Apparently, a tip for the other folks learning from this stream: if you are logged into a Zoom in a virtual machine using OBS, and somebody else logs into that virtual machine, Zoom disconnects the user that is connected to Zoom on that virtual machine. So we're back; we've learned this new thing about technology. I will give it back over to Cathy, who's going to talk about the overarching principles.

Okay, thanks, Whitney. I'm not quite sure where we got cut off, but Riana and I are both lawyers, not your lawyers, but it does mean we know what we're talking about. The first bit we're going to do is a crash course in Section 230, because, speaking of learning curves, stuff breaks, but it would be really stupid to break it on purpose because we didn't understand this really great and important law.
Yet we're changing it with proposals like the EARN IT Act and various other things floating through the halls of our government. So, getting into this crash course, here are a couple of things to keep in mind, some overarching principles. First, Section 230 is a really hot topic these days. We're talking about it; everybody's complaining about it. Bear in mind that much of what people are complaining about, because people are saying and doing bad stuff on the internet, is really a byproduct of the fact that we've got a First Amendment. Freedom of speech tends to protect an awful lot of stuff that we are upset about, and if we were going to try to change the law to change that, it means we'd have to get rid of the First Amendment, and that's probably not a really good idea. It also means that if we're unhappy about something, the correct policy choice is not to change Section 230, because it's not actually Section 230's fault.

A second overarching point to bear in mind is that litigation can be cripplingly expensive, even when the position you've got as a defendant is the better one. The fact that you might win the lawsuit doesn't necessarily help; lawsuits are expensive and potentially bankrupting. What Section 230 is designed to do is make sure that platforms can stay in business without feeling like they're going to be financially obliterated because one of their users says or does something stupid and somebody tries to hold them responsible for it. If you think about the tremendous volume of user-generated content that even the smallest platforms end up facilitating, it's enormous. Even if they just had to call up a lawyer to find out "do we really need to respond to this?", that gets expensive; that meter ticks. It's not realistic.
What Section 230 is designed to do is make sure that a platform cannot be bled dry financially by getting called into court by legal threats, empty ones or actual ones. The overall goal of Section 230, and we're going to come back to this a lot, is to ensure that we get the most good stuff online and the least bad. There are a lot of really cool things that people say or do over the internet, and we want to make sure the internet remains as available as possible to get the most of that. If platforms had to fear liability, they would not be in a position to facilitate all that good stuff; it would end up much more limited, or maybe not be possible at all. At the same time, there's a lot of crap on the internet, and we would like to minimize the crap. So the other point of Section 230 is to put platforms in a position where they can minimize the crap without worrying about getting into trouble if they do the best they can on that front. So in order to optimize for the most good and the least bad, we have Section 230. It doesn't mean we'll ever hit the optimum on either front, but it gets us as close as we can reasonably get.

And that gets to the fourth point: effective regulation doesn't need to be punitive. Carrots can work better than sticks, and Section 230 is best understood as all carrot, no stick. Basically, we want the platforms to be there to get us the most good stuff and the least bad stuff, and we're going to incentivize them to do the best they can on both fronts. The reason they're going to be able to do the best they can is because they're not scared: not scared of being punished financially through liability, and so on. That's the magic of Section 230, making it possible for the platforms and policy to all be aligned to do the best that we can. That's why we have the internet today. So let's take a look at our next slide and talk about how we got Section 230.
So if we get in our time machine and go back to 1995, a couple of things were happening. The first was that the internet, which had existed for quite some time, was starting to enter the public consciousness. And as it hit the public consciousness, Congress became aware of it and started to become concerned: oh my gosh, there's porn on the internet, oh no, this is terrible. Some of it, probably not so great; some of it, maybe a little overreaction. But in any case, they were very concerned about the amount of porn on the internet and didn't want any of it there.

Meanwhile, there was also a lawsuit that came up in a New York state court involving the service Prodigy (the Stratton Oakmont v. Prodigy case). At the time there were a number of dial-up bulletin-board-type services, some somewhat large and commercial: Prodigy, CompuServe, eventually AOL. They were small in the sense of not being as big as today's platforms, but they were relatively big. Prodigy had been around for at least 10 years at that point, and they handled an awful lot of user-generated content. A user had arguably defamed the plaintiff in this lawsuit, but the plaintiff decided to sue Prodigy, basically under the theory of "how dare you provide a system that allowed a user to defame me." They filed this lawsuit, and a judge in that New York state court said, well, that sounds good, go ahead with your lawsuit; sure, Prodigy should be in trouble for allowing this to happen. And a couple of members of Congress looked at that case with alarm, Ron Wyden and Chris Cox, both then in the House, and they said, this is not going to be good. The internet is coming; all these cool services are going to be unable to facilitate user-generated speech, nor are they going to be in a position to take down the terrible stuff. So again, we have a crisis if this rule becomes the law: we won't get the most good stuff.
We won't be able to minimize the bad stuff. This is a problem. So these two ideas came together and formed the Communications Decency Act of 1996. Then there was a lawsuit challenging most of it, because the getting-rid-of-porn parts happened to infringe on the First Amendment quite severely, and in a case called Reno v. ACLU, the US Supreme Court said no, those parts go away. The Court made clear that the First Amendment applies to online speech just as much as it applies to offline speech. So those parts of the CDA went away, but Section 230 survived, and it did its job, clearing the way to make sure that people continued to innovate internet services. And here we are.

So we're now looking at the statute. Don't worry if you're not a lawyer; you will survive, this is okay. It's a good statute, pretty readable by normal human beings, but I will give you a scenic tour of some of the parts. On this particular page, they put some preamble text at the beginning of the statute, and it basically boils down to: Congress finds that the internet is pretty awesome. They go through two pages' worth of awesomeness, and then they get to their policy, which is that the policy of the United States is to keep the internet awesome. So this sets out what Congress is trying to accomplish. It articulates the "most good stuff" idea, but it also puts in some language about making sure there's leverage to get rid of the bad stuff. Here we are; that's what I just said. Okay.

This is the most magnificent part of Section 230. It is 26 words. Professor Jeff Kosseff wrote a book called The Twenty-Six Words That Created the Internet. I've tried to read the provision out loud at conferences before; it sounds terrible when read aloud, but it is fantastic.
If I'm to summarize it in more understandable language, it basically means that whoever creates the problematic content is totally responsible for the problematic content, but the service they used to express it is not. If we look at the bits that I highlighted: the bit at the bottom, the "information content provider," that's who created the content. And the other bit, the "interactive computer service," that's the platform. Interactive computer services can be a lot of different things, but it's basically the platform that facilitated a third party's expression. Let's go on.

Okay, this is the part of the statute with the definitions, and it basically says what I just told you: the information content provider is the person who created the content, and the interactive computer service is the party that facilitated it. And that first part basically said the facilitator is not liable, but the people who created the content are. Now, there's some question about who gets counted as an interactive computer service. We think a lot about social media; that's the easiest thing to get our heads around. But an email provider is facilitating other people's expression; so is a bulletin board service like Prodigy; so are comments sections on blogs and online newspapers. Whoever posted the article is responsible for the content of the article, but the comments the community provides on the article, that's a textbook case of Section 230. And a lot of courts have looked at this and said it can also apply to individuals. If you have a Facebook post that takes comments, and somebody posts a comment, you're not responsible for that comment if something is wrong with it.
I think when people talk about Section 230 they tend to think of it in terms of, well, Google, Twitter, Facebook, and however you feel about those companies tends to color your view. But understand that it's not about them. Yes, it's true, it was a law that allowed these innovative companies to come into existence, but it's also what allows any company to come into existence, and it enables an awful lot of online behavior that ordinary people do themselves.

There is a companion piece in Section 230. It hasn't carried as much weight as the (c)(1) part, but remember that whole idea: we want the platforms to be able to do two things, keep the most good stuff and remove the most bad stuff. With the carrot approach, we've said, okay, here's the carrot: you're not liable if something wrong gets posted on your systems. That's a carrot because it means the companies can say, I'm fine, I don't have to worry if I'm not able to keep something bad off the internet. And that means they're safe to be as expansive as possible and allow the most good stuff. If they had to fear punishment, then they would scrutinize a lot of content and take down a lot of content that probably would have been just fine and completely lawful. So it's a carrot-based approach to make the platforms feel safe leaving content up.

But we also wanted to make them feel safe taking content down, because that's the other part: police the bad stuff. If companies had to fear that they could be punished or held liable for taking content down, they wouldn't bother at all; it would just be too risky. But we want them to be partners in helping maintain internet hygiene, so we've given them a carrot in this department too, to go ahead. So, here we are, next. The other part of the statute is subsection (e), where they threw in a couple of other things that end up being really important.
This particular clause is not really important. They thought back then that there would be a collision with another law that was on the books, so they basically wrote this text to say: no collision, no issue, let's move on. And so indeed, let's move on.

This next one is an important clause, because if we stop and think about it, the internet is inherently interstate. It's very difficult to put something on the internet that stays in any one locality; if you put something on the internet, it's available everywhere throughout the country, even throughout the world. So it would kind of ruin this whole policy if we let each individual state pass a law that could somehow impact it. If companies are feeling safe in Iowa, thinking "we're not going to be liable here," it really doesn't help them if Illinois passes a law that says no, no, if it happens in Illinois you're in trouble. So this is essentially a preemption provision. Congress put this in to say, look, it's the federal government's issue, we've got it, the states sit this one out. And that's important, because internet policy is complex enough. It really would be overwhelming if all 50 states plus territories and local jurisdictions could have their say.

But they did put in exceptions. Section 230 applies to almost everything that could be legally wrong with user content, but there are a few exceptions, and one of the big ones they baked in originally is for when the thing wrong with the content is that it potentially infringed an intellectual property right. The easiest one to get our heads around: if it might have infringed a copyright, then Section 230 doesn't apply, no protection for the platform. They're going to be on the hook for liability along with the user. Now, it turns out that shortly thereafter Congress passed another law, the Digital Millennium Copyright Act, which does give platforms some protection.
But the reason I've put up this slide is that this is a tiny piece of just one part of the Digital Millennium Copyright Act, Section 512 alone. It isn't particularly readable; it's not even particularly understandable; and companies go bankrupt trying to litigate it. There was a terrible situation: if you think it's a problem that YouTube is so big, you should know that there used to be a competitor to YouTube that got sued for copyright infringement, allegedly in its users' content, and it went bankrupt in the process of litigating it, to the point where, by the time the Ninth Circuit Court of Appeals actually said there was no infringement, they were not liable, they should have been just fine, the company was gone. We lost a competitor, which we're now missing, because it actually might be nice to have more competition online. So when we think about messing with these liability protections, or making them complicated or conditional, this is what happens: we tend to obliterate them, and that's not what we want.

There was also an exception from the very beginning that said there is no effect on federal criminal law. That means that if the thing wrong with the content is that it violates a federal criminal law, Section 230 also doesn't apply. And for a while that was fine. It basically meant things like child sexual abuse material: that's bad, it tends to violate federal criminal law, and so we imposed on the platforms that they needed to do something about it. That was one ask of platforms, and it's difficult, and there are issues tied up in how you do it effectively, but basically it was a single ask, our most important ask of what we could ask of a platform, and at least they could focus their resources on that and not be dragged into all the other things they might have to look for. But then things changed recently.
A couple of years ago, Congress passed the first significant change to Section 230 since it was enacted. You might have heard it called SESTA, but by the time it got passed it was referred to as FOSTA. FOSTA basically said, you know, there's another type of bad content; we don't want sex trafficking either. And that's true, that's bad. So they poked a big hole in Section 230, and one of the things to notice just looking at the slide is that they poked a very complicated hole in Section 230. It's hard to read and understand, and you can see those hyperlinks pointing you to other areas of the US Code where they've put other criteria. The result is that it left platforms really unsure about what sort of content is okay for them to facilitate and what sort is not. And so they've started taking down, or making themselves unavailable for, content that would be totally lawful. A very famous example: Craigslist took down entire sections of classified ads, like dating ads. Those had been around for ages; couples have met that way; there's nothing inherently wrong with an ad looking for dates and matchmaking. But because this is the sort of adult content that could tip into that territory, Craigslist quite reasonably got scared: they didn't really know, so they thought, "we might be held responsible for it, and that's just too big a chance to take."

The other thing this law did is it made it somewhat unsafe, actually more than somewhat, it made it unsafe for platforms to take stuff down. Because, and we'll talk about this when we discuss other attempts to reform Section 230, these proposals tend to play with the idea that if you knew it was there, then you're on the hook for it. So if you didn't know it was there, it might be okay. But then why should a platform take the time and energy to look for what might be there, when that's just tempting trouble for themselves?
So all of a sudden the platform is not a partner in internet hygiene, and that's not constructive; it's not going to help us get the most good stuff or the least bad stuff. So remember these thoughts: how cripplingly expensive it is to be sued even when the lawsuit is without merit; that we want the most good stuff and the least bad stuff; and that carrots are more effective than sticks in helping us get that. When you test some of these other proposed laws against those notions, I think you'll find that most of them fail.

So with that, we're about to bring up Riana, and one of the things we'll touch on is that there is one proposal floating out there which is okay: the SAFE SEX Workers Study Act, which would examine FOSTA's secondary effects on sex workers. The thing is, FOSTA kind of got rammed through. People said, "I think we have a problem," but instead of really understanding the dynamics of that problem, they went ahead and changed the law around it, without really understanding whether that was a good solution, an effective solution, or one that would actually make the problem worse. And there's a lot of evidence that it has actually made the problem they were trying to fix worse. There's a body count attached to FOSTA, sad to say. So this is a proposal to basically figure out what the effects are before we do anything else to the law. But the rest is not okay: we're seeing executive orders trying to challenge Section 230, and guidance documents that don't really understand it at all. The PACT Act is the most plausible, in that it seems to at least understand the statute, but it still creates problems. It seems like every week Senator Hawley is producing a bill to change it. And then there is the EARN IT Act, which is sort of the bane of our existence, because it does more than change Section 230. I'm going to bring up Riana now.
We're going to close the slides, and hopefully things don't break. Did I succeed? Okay. So, do we have Riana? Yay. Okay, we're back to a single person. Great.

Well, I was going to suggest: we've gotten a few questions coming in over the Discord, thanks everybody. Maybe we could try knocking out a few of these, Cathy, before we get talking a bit more about EARN IT. Okay, excellent.

Let's see. I'm seeing questions about moderation by bot, and whether that is somehow bad, or whether 230 speaks to it. Basically, 230 has a deliberately very light touch. It can apply to a whole bunch of systems in a whole bunch of contexts, and it's not very specific, which makes it very flexible; it was written in the mid-'90s, and the internet has obviously changed quite a bit since. It doesn't directly speak to AI, but if we stop and think about what AI is: AI is, in theory, the computerized outsourcing of the brains of the people at the company. They have made decisions about how they want things moderated, and they deploy their bots to do it. The fact that you have technology pressing the button instead of a human being pressing the button doesn't change anything. The core thing about Section 230 is that the intermediary is not responsible for the wrongfulness of content that somebody else created, and the bots don't change that; they're still connected to the editorial discretion of the platform, so 230 is fine. It's not an end run around it. The central equation is who created the content and who made it wrongful, and that's still core Section 230 stuff, the way it was in the 1990s.

And then I see the question about who gets to determine what counts as "bad stuff." I like to use that general term just to help people get their heads around it, but basically the platform decides, because remember the example I brought up, where you have a Facebook post and somebody posts a comment on it.
If they come and insult your kid, you want to be able to delete that comment, whereas if they come and say your kid is awesome, you want to be able to leave that comment up. That is the editorial discretion that the First Amendment protects. And if you extend that from your own personal Facebook post, it applies to Facebook as a whole: they get to decide, because it essentially offends the First Amendment to have a government entity force anybody to carry speech. There are minor exceptions to this if we want to quibble, but they are very minor and they tend to apply to things that are not the internet. That's something that does get talked about a lot, where people say, wait a minute, who gets to decide this? A private platform is a private platform, whether it's a large corporation, a lot of people who got together, or a single person with their own post; it's the same equation.

And the name of the YouTube competitor that got sued out of existence: the caption on the lawsuit ended up changing, but it was Veoh Networks. That was the name of the video service, Veoh Networks; it has ceased to be, it is no longer a platform.

So now my question is to Riana: what is wrong with the EARN IT Act? Well, Cathy, everything is wrong with the EARN IT Act. Okay, end of stream, we're done here, we can all go home now, thanks everybody, good night. No, to be serious: the EARN IT Act is, I think, one of the most serious challenges and threats to online privacy, security, and freedom of speech that we have seen come across the transom in Congress for quite some time. As Cathy mentioned, there's a bumper crop of anti-230 bills that aim to narrow or even repeal Section 230 entirely, and sadly it's also not the only anti-encryption bill out there right now, as I'll discuss in a moment. Cathy touched upon SESTA/FOSTA, which created a carve-out from the 230 immunity with regard to sex trafficking offenses.
In its current instantiation, the EARN IT Act bill is kind of similar, except instead of sex trafficking, it's child sexual abuse material, which I'll call CSAM for short. This material is a pernicious and difficult problem on the internet; it has been around ever since the days when people could first transmit a file from one person to another through some of the earliest services you can think of, many of which I am too young to have used. It's something platforms have struggled with, and they've had to go through evolutions in their approaches to try to cut down on it and prevent it. What's important to understand about CSAM is that it is illegal on the federal level, on the state level, across the entire planet. It's not legal anywhere. As such, it is already subject to federal criminal statutes that impose duties on providers and platforms to report it and remove it when they learn about it on their services. And that is important, because it is what we would call an actual knowledge requirement in the law: platforms are only required to take action about CSAM if they find out about it. That could be because a user reports it, or because they have an automated scanning service that looks at attachments in email, or files uploaded to a cloud storage account, and tries to match them against a database of hashes of known CSAM images. If they do find something like this, they are required to report it to the National Center for Missing and Exploited Children, or NCMEC for short, which is a quasi-NGO; really, it has at this point been deemed to be an agent or arm of the state, and it handles these reports of CSAM. You might also know NCMEC from the "Have You Seen Me?" images on the side of milk cartons, but I think at present they're probably mostly associated with their really difficult role of having to process the reports that come in from providers.
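As a technical aside, the hash-matching approach described above can be sketched very roughly. This is a minimal illustration with a made-up hash database: real systems use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, whereas the plain cryptographic hash used here for simplicity only catches byte-for-byte identical files.

```python
import hashlib

# Hypothetical database of hex digests of known prohibited images.
# (This entry is just the SHA-256 of the bytes b"test", for illustration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_image(data: bytes) -> bool:
    """Check an upload against the database of known hashes.

    A match would trigger the provider's reporting duty; a miss means
    the provider has no actual knowledge from this scan.
    """
    return file_hash(data) in KNOWN_HASHES

print(matches_known_image(b"test"))            # True: digest is in the set
print(matches_known_image(b"something else"))  # False: no match
```

The point of the sketch is the "actual knowledge" mechanic: scanning only flags exact entries already in the database, which is why providers who scan report what they find but cannot, by this method alone, know about anything not yet catalogued.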
And providers do report CSAM that they learn about on their services, millions of times a year. There are also federal requirements saying the Department of Justice is supposed to provide funding, conduct studies, and do reporting with respect to all of these incoming reports, with respect to trying to fight and minimize the abuse of children and other crimes against children online. They've been falling down on the job: they have not filed their federally mandated reports in all but, I believe, one of the years they've been required to over the last decade or so. And I think one of the motivations for the EARN IT Act bill is to try to point the finger away from DOJ's own shortcomings, and away from how overwhelmed NCMEC is, because NCMEC does not have the resources or the personnel it needs to take on this task, and to blame the platforms instead. And the way they've chosen to do that is by going after the whipping boy du jour, which is Section 230. Right now there's a lot of popular discontent with Section 230, much of it ill understood, so thank you to Cathy for trying to explain a bit better why Section 230 is a positive and important force online. Going after Section 230 enables the DOJ, I think, both to direct attention away from its own shortcomings and to promote an agenda it has long had for greater surveillance of online speech, and in particular to go after encryption, which is the area I focus on in my role at Stanford. As for curtailing immunity under Section 230 for CSAM: there's nothing there that DOJ cannot already do. Cathy discussed how one of the limitations on 230 immunity is that it already does not bar enforcement of federal criminal law; you even saw, where she noted it, that child sexual abuse material is explicitly called out in that particular clause of 230.
It is already the case that if platforms are not complying with those aforementioned reporting duties when they learn about CSAM on their services, nothing at all is stopping the Department of Justice from penalizing those platforms. What Section 230 currently does is bar state AGs, state attorneys general, from bringing criminal charges against platforms, because Section 230 bars state criminal law claims, and it bars lawsuits by individual people with a private right of action; those two are barred by the Section 230 immunity. So we're in a situation where providers are subject to statutorily mandated duties with respect to CSAM, and by all the evidence they are complying with them, but there's a sense that they should be doing more. The fact that they are reporting millions of pieces of CSAM a year to NCMEC is taken not as evidence that they are finding and reporting things, but that they aren't doing enough, which is a bit curious.

So, the point of the EARN IT Act bill: in its current instantiation, since its introduction earlier this year, the name kind of doesn't make sense anymore, because originally what the bill would have done would have been to make the immunity under Section 230 contingent upon compliance by providers with some to-be-determined best practices. If you followed those best practices, if you jumped through the hoops, which would be set by an unelected commission headed by the Attorney General, then you could maybe continue to be eligible for Section 230 immunity with respect to CSAM. Otherwise, you might just be thrown to the wolves. That link between following the best practices set by the commission and continued immunity is taken out in the current version of EARN IT. Current EARN IT is much more like SESTA/FOSTA, in that it simply carves out the immunity, full stop.
It still has this commission, this 19-member commission, which is still charged with coming up with best practices for fighting CSAM online and protecting child safety online. However, those are now just non-binding, voluntary recommendations that are not tied to any particular carrot or any particular stick under the bill. There's nothing left to earn; there's no more immunity left to earn. And the ultimate upshot of all of this is that, because the removal of the 230 immunity bar opens up potential liability under what could be 50 or more state and territorial laws dealing with CSAM, there's a lot less certainty for platforms about what their duties are. Right now they know what they have to do, as specified under the federal reporting law. But under EARN IT, there could be a variety of state laws that open them up to liability for CSAM, and to private civil lawsuits, and to state AGs going after those platforms, under what could be much lower bars. Going back to what I mentioned about the actual knowledge requirement under the federal reporting statute: that's a fairly protective standard for platforms, it's not that low a bar. Whereas we know, from a survey that a group called NetChoice has done of state CSAM laws, that some states' CSAM laws impose liability on those who trade or host or transmit CSAM under a much lower standard, such as negligence or recklessness. So platforms could potentially be held liable for CSAM on their services if they negligently or recklessly host it, transmit it, allow it to be promoted or presented or distributed, et cetera, et cetera. You might be asking yourself, wait a second, doesn't that just describe the ability to transmit files? Why, yes, it does. All of the things that we do online, such as talking to each other and sending files to each other, this is what the internet is for, can also be used in ways that lead to the exploitation of children or that involve the transmission of CSAM online.
So by potentially exposing platforms to liability for negligently or recklessly transmitting CSAM, they're really potentially exposing platforms to liability for hosting user-contributed content at all, for the ability to transmit files, for the ability to speak online and speak to other people, some of whom may or may not be over the age of 18. The chilling effect of this bill's potential cannot be overstated. We already saw with regard to SESTA/FOSTA how it led to the shutdown of large swaths of completely legal speech on the internet. You may recall that Craigslist immediately shut down its entire personals section. They also shut down their section that allowed for therapeutic services, so massage therapists were suddenly out of that part of their livelihood that came from finding clients on Craigslist, for fear that somewhere in those personal ads or somewhere in those massage therapy ads there might be somebody who was being trafficked, and that would expose the platform, Craigslist, to liability under SESTA/FOSTA. Here, similarly, we can predict that there would be a similarly large chilling effect on the ability of people to speak to each other online. Because again, what if any particular file might sneak through? What if some communication between two users on a service might be exploitative, or enticement between an adult and a minor? These are all things that platforms are going to be wrestling with in the event the EARN IT Act passes, because they now may be exposed to liability under 50 or more state laws. So the changes that have been made to the EARN IT Act don't really fix the core problems with the bill. When we think about how this bill relates in particular to encryption, the changes that were made between the early version of the bill and the current version were intended to placate concerns somewhat. It doesn't fix it. 
There is an amendment offered by Senator Leahy that was intended to protect encryption, to basically restore immunity from suits over the use of encryption, for deploying end-to-end encryption, for providing device encryption, and so forth. It's not really enough, though, in my opinion, both because it does not protect other forms of providing user security, and because it would still allow lawsuits to arise if somebody, a state AG determined to bring a criminal charge, or a private plaintiff, were determined to bring a lawsuit. All they would have to do would be to come up with some reason for suing, let's say, WhatsApp, let's say Signal, that wasn't directly about the encryption. Any other pretext that they could find, or any other security measure that was in use, even, say, data minimization practices, for example: if that would potentially get in the way of fighting CSAM, maybe that's something that you use as the basis for trying to make a platform liable. And on the other hand, I don't think that the way the Leahy amendment that is supposed to protect encryption is structured would necessarily mean that platforms would be safe from being pressured or induced into introducing other forms of user surveillance that don't touch the encryption. One of the moves that we have seen in the encryption debate over the last few years has been that, as we've pushed back against proposals to weaken encryption, to insert mandatory backdoors into encryption, we have instead seen law enforcement proposals come out that say, well, how about this other idea? It doesn't involve breaking the encryption at all. And those include things like the so-called ghost user proposal put forward by GCHQ, the spy agency in the UK, which would silently add a user, a law enforcement user, to an otherwise end-to-end encrypted conversation, while suppressing notifications to the people who are present in the conversation that this silent ghost law enforcement user has been added. 
It doesn't touch the encryption, right? But it's still a way of undermining the privacy guarantees and security guarantees that you expect from using an end-to-end encrypted app. That's just an example of how the Leahy amendment doesn't really do enough to protect encryption and to keep providers from being disincentivized from continuing to offer end-to-end encryption and other security measures to their users. So, I know I get a little het up about these things. It's hard to restrain; I can talk about this all day. But before I wrap up and we go on to maybe having a bit more of an actual fireside back-and-forth, I want to mention, as I noted, that Section 230 bills aren't the only threat to our encryption and privacy and online speech rights that are out there. There's another companion bill that was introduced by Senator Graham, the same sponsor of the EARN IT Act. He introduced it just a few days before releasing the amended version of the EARN IT Act a few weeks ago. It's called the Lawful Access to Encrypted Data Act, or, I like to say, the L is silent and it's the Awful Access to Encrypted Data Act. And that is just a flat-out backdoor bill. It is something that we had been waiting for, hoping wouldn't come out; dreading, I think, is the word. It is really the full frontal nuclear assault on your ability to just be able to use encryption and move on with your life, the ability we have gotten used to in recent years as encryption by default has become the norm in the devices that we use and in the apps and services that we use. 
So what is so dangerous about the LAED Act, or the "laid" act if you want to sound a little bit naughty about it, is that it would require basically any sort of larger provider, anybody who sold a million devices in the United States in a given year, or who has had a million monthly active users in a given month dating back to 2016, so even retroactively, to proactively re-engineer their services to be able to comply with warrants or other legal process they receive from law enforcement to decrypt information that was encrypted. They'd have to proactively redesign to build a backdoor. And if you are a smaller provider that's not part of that million-plus club, say a very small handset maker, or the provider of an end-to-end encrypted app or other encrypted service that does not have a million users, you're still covered by this bill, because then the Attorney General would be able to serve a notice on you telling you to re-engineer your service and add that backdoor into your service, even though you would not be forced to re-engineer it proactively. 
So, this bill was introduced, as I said, by the same sponsor right before the EARN IT amendments were made, clearly intended to make EARN IT look more reasonable by comparison, and make it look less like an attack on encryption by comparison, because LAED is an overt rather than covert attack on encryption. But the lesser of two evils is still evil. The EARN IT Act is bad, and the LAED Act is worse. It's not as though members of Congress have to pick between one or the other. They could just pick neither and, oh, I don't know, focus on making sure that millions of Americans don't become fucking homeless in the next few weeks, and do something to fix the epidemic that is ravaging this country. Instead, lawmakers have been deciding to focus on what really matters, which is that Google and Facebook are apparently annoying some people. Whatever. Go and focus on what really matters, Congress, and start leaving the internet alone. Leave my right to encrypt alone. So please contact your Congress members. I wrote it down here; I'm making this very small so that nobody can just, you know, screenshot me and say something stupid. NoEarnItAct.org, that's where you need to go. Call your Congress members, call your senators in particular, because the EARN IT Act bill has advanced fairly far: it's already gotten out of the committee in which it was introduced, passed unanimously out of that committee. And there have been efforts just this week by Senator Graham, its lead sponsor, to try and force it through by what's called unanimous consent, so that it would pass out of the Senate without all of those members of Congress actually having to stand up there on the floor of the Senate and be seen on C-SPAN voting in favor of this terrible, destructive bill. So please contact your senators, contact your House reps too, and make sure that while you're at it, you tell them to oppose the Lawful Access to Encrypted Data Act as well. 
That bill hasn't gotten as far in the Senate; it hasn't gotten out of committee yet. But it has already had a companion bill introduced in the House by one of the same people who was behind the SESTA/FOSTA precursors, and it is also a threat to online speech. So make sure to tell your senators, tell your House reps: oppose both of these bills. Now I can stop declaiming, I think, and we can move on. Well, we're not going to completely move on; we're going to tease out some of the reasons why this is bad, because the same way we need to understand how 230 works, I think we need to understand what went wrong and what goes wrong here. So one of the things that's come up, and this is not a small thing, is that basic thing we talked about: what should Congress want from internet providers? We want them to help us get the most good stuff and the least bad stuff. One of the things you talked about is that this bill does the exact opposite. You talked about how it will make platforms completely afraid to remain available to intermediate good content, because anything could set off the liability threat, especially now that you're including all those different states. So remember that one of the things I pointed out was that there was a preemption clause in Section 230, because it would be a mess if you had 50 different state attorneys general with 50 different opinions about how things should be, essentially getting to decide what internet policy was. And so you point out that the EARN IT Act essentially does away with that preemption and now puts an awful lot of power in the hands of individual state attorneys general to essentially censor the internet. And it'll essentially become a least-common-denominator situation: it may be great that 49 states are like, yeah, yeah, whatever, be reasonable, it's fine. 
But if one state is like, I'm coming to get you if you do something that offends what I've declared to be the policy so that I can get reelected next term, that's going to be a problem for basically the rest of the United States. Your internet experience will get shaped by that single state. It's the least-common-denominator thing: if one state, you know, wants to have something that's irrational but it's on the books, they've got the power now to do it, because the Section 230 protection goes away. And when that protection goes away, the choices that platforms make to have the most good stuff and to minimize the bad stuff go out the window. We've also talked about how it messes up platforms being partners in getting rid of the bad stuff, because right now, through that carve-out for federal criminal law, we've told them: go ahead, we want you to get rid of the child porn. Platforms are making an effort to do it, but now you've sort of created a weird incentive: if you look for it, now you might be in trouble for having found it. So why would a platform now want to look for it? That seems like it would just be begging for trouble. They'd just encrypt everything so that they could see nothing. And now we've messed up your encryption. And, yeah, so let's talk a little bit about the politics, because one of the reasons why we wanted to do this talk is that Section 230 has become this very hot topic that everybody is bandying about, because it's become the easy scapegoat for anything that's wrong online, and anything that's wrong online that's affecting the rest of our world, which, you know, is difficult these days for a variety of reasons. So you have people who have differing agendas, but we're all kind of unhappy these days and we've now found something to blame. And so you end up with these weird bedfellows. 
I mean, the EARN IT Act is sponsored in part by Senator Graham, who is a Republican, but it's also equally pushed by Senator Blumenthal, who is a Democrat. How on earth did we get a Republican and a Democrat in this incredibly polarized world to agree that messing up Section 230 with this law is a good idea? Why does this have appeal to both parties at the same time? And does that maybe suggest for each camp that they should rethink it? Because if the thing they want is making the other side happy, maybe that should cause some concern about what it is they actually want. The easy answer there is that this is a bill that's supposed to be about child safety, but it's not actually going to help protect children on the internet. Because if major platforms like Facebook or like Twitter now know that they are exposed to liability unless they get a lot more intrusive, in terms of the amount of information that they collect from users, in terms of the amount of surveillance they do of their users, then everybody who is trading in this material on these major platforms is just going to be forced off. And they're going to be using Tor hidden services, and they're going to be on other sites that don't comply with federal CSAM reporting laws, because those sites exist in order to host this kind of material, don't qualify for Section 230 because they are themselves engaged in violating the law, and don't care. So it's not going to achieve the stated goal of helping protect children. And by the way, this bill does not provide the sorts of services for housing stability, for removing children from abusive homes, all the stuff that you could try to do to prevent abuse from happening in the first place. It's just meant to be another think-of-the-children bill. 
And so it doesn't achieve the goals that it sets out to achieve, but by being out there as a bill that's purporting to protect child safety, it becomes much more, you know, camera-friendly in terms of getting bipartisan buy-in from each side of the aisle. And it makes it a lot harder for members of Congress to vote against it, because who wants to be seen as voting against something that helps the children? Do you hate the children? You must hate the children. Well, that prompts two questions. One: this isn't actually going to help them, it's going to hurt them, correct? Right. Yeah. Okay, so first of all, it's not going to accomplish what it set out to do, just like SESTA/FOSTA didn't accomplish what it set out to do. I think the big problem is, yeah, who wants to vote against keeping children safe? If it's got that banner, that bumper sticker, think of the children, vote for this, it's a very difficult thing politically to say no to. What that means is that it becomes a really convenient cover for people who have other agendas, because they've now made the poison pill that the other side can't say no to. And I mean, that was happening even with SESTA/FOSTA, where people who had agendas for why they wanted to get Section 230 off the books were able to push that. Do you have any thoughts about what some of these agendas are? I absolutely agree, and you know, there's an organization of sex workers and allies who have been really vocally working against the EARN IT Act, as they did against SESTA/FOSTA, called Hacking//Hustling. 
And one of the slogans that they use when they are describing both the impact of SESTA/FOSTA and the likely impact of EARN IT is: don't punish people because you want to punish platforms. Not you and I "we," but, you know, the general idea: don't punish people because you want to punish platforms. If the idea is somehow that Google or Facebook or whoever is too big for their britches, or they're making too much money (our tax code allows them to make it), or they're somehow acting above the law (the law permits them to do everything that they're doing; that's what Section 230 does), then, you know, you pass a bill that ends up making it more dangerous for sex workers to be alive, to exist, to work. And that will make it harder, potentially, to prosecute CSAM traders if this bill passes. I agree with you that there is an ulterior agenda in each of these cases. And one thing that I thought was really problematic, in particular about the use of the EARN IT bill as what was clearly just a Trojan horse for trying to ban encryption, or at least strongly discourage it, was that we're talking about such a wide range of services that would be covered, when really we're talking about a couple of different things. The thing that makes people angry about the way that the internet kind of sucks is the stuff that's not encrypted. It's the comments that you see on Twitter, the comments you see on YouTube. It's why we have emoji-only chat in our Twitch stream this weekend. You know, it's the things that are happening in unencrypted spaces, I think, that largely motivate people to be just kind of sick and tired of what the internet feels like these days. And yet the thing that is being attacked by these efforts to sneak in an encryption backdoor mandate is encrypted private communications. It's private spaces. 
So it's trying to take this popular anger at something that doesn't have to do with the private communications that we have in one-on-one spaces, and sort of misdirect it. Again, this is all a big game of misdirection by the DOJ, which I'm convinced largely wrote both of the bills I'm talking about, especially the second one, to try and push that agenda. And you know, I do believe overall that it would be very difficult... we talk in the encryption debate sometimes about finding some sort of middle-ground proposal, or some sort of compromise. But it seems like the only compromise, the only middle ground, is only ever demanded from the people who have the right, and now the ability, to protect themselves and to exercise their rights. It's never law enforcement offering to give ground. I'm very dubious. I mean, I've been called an absolutist, but the reason I'm an absolutist is that I'm really dubious that there's a way to have a compromise. I mean, I pulled up the slide about the Digital Millennium Copyright Act, which in theory is a type of conditional safe harbor: if you do X, Y, and Z, okay, then you're not in trouble for your third-party content. But it's a mess. It's very difficult to figure out the conditions; it's very difficult to meet them all. It's not necessarily realistic. The point I didn't make there is how much censorship of lawful speech happens because we've made the platform protection so conditional, where people send garbage takedown notices that the platforms really have no choice but to comply with, even though the copyright claims behind them are garbage, sometimes because they're not even really copyright claims at all. But a platform handling voluminous content, with potentially expensive liability, doesn't have the ability to say no to them. I mean, sometimes they do, but it's rare, because it can't happen all that often; it's just not practical. I don't see how we can. 
This is kind of an all-or-nothing thing, and that's why I go back to: do you want the most good stuff and least bad stuff? I think there's really only one way to achieve it, and that's by aligning incentives. We have more tools in our regulatory tool belt; regulation doesn't have to be penalty-based, and somehow I think we're not good at that. I think we don't understand how to regulate effectively by making parties partners, where acting consistent with their own interests happens to get them to do what we want. The reason the internet sprouted and did so well in 20 years is because that's how we regulated it; that's how we made it work, and that's how we went from basically no internet to an awesome internet. I mean, it's got wrinkles, but you can connect the whole world; we can connect an infinite amount of knowledge and ideas and make connections. By and large it is incredibly awesome. Maybe part of the problem we have is that, you know, if you've got 7 billion people connected to each other, some of them suck. Some of them are not good people; some of them say terrible things. And when these terrible things happen, we become hyper-aware of them, because now we can be aware of them, because the internet is providing us with that information, and they end up with this outsized effect, like, oh my gosh, I can't believe what that person did. And we're upset about it, and we look at what existed that allowed the bad thing to happen. Well, that was the internet, and the internet got to exist because of Section 230, so therefore 230 is bad, let's get rid of it. And it's really hard to sort of call time-out to reasonable people and say, hang on, maybe that's not the best way to think about it. Although that's why we're doing presentations like this: to kind of walk through the thinking so that people understand that, no, this is really important and really valuable. 
And maybe the things that are going wrong are things that we should target our regulation to. If we're upset about something, let's carefully define the problem and come up with an appropriate solution that helps fix the thing that's wrong, without just taking away the whole thing that's created the internet, good things and bad things, because then we won't have the internet at all. Right. Human terribleness has been around since well before there was an internet; people have been terrible for as long as they have been people. But nevertheless, I like your point about how we tend to focus on the bad parts without thinking about the vast majority of perfectly fine content. The same is true of encryption. You know, I feel that the focus on the use of encryption to commit crimes or to hide evidence of crimes gets blown way out of proportion compared to how rare such crime really is, when you consider that more than one out of seven humans on this planet uses WhatsApp. Those billion-plus people aren't all criminals, right? Most of the usage, most of the people using encrypted apps, most of the people using encrypted devices, which at this point is basically everybody, just by default, most of that is not for crime. And so we lose sight of placing these things into context and thinking more big-picture about what the cost-benefit analysis is of trying to weaken security and increase surveillance and reduce free speech rights for everybody. I want to make sure we get to some of the questions that have been coming in. We have maybe about five or ten minutes left to take questions. I know we've run over a little bit, but it's the end of the day; there are no rules. At least until Congress gets at it. But okay, so some of these questions are sort of technical 230 questions, and I can almost globally say yes, 230 totally applies, although... 
Yeah, well, actually, the most interesting one is one that I've been putting off, and since I'm a speaker I can say that. Somebody asked a question about 230 and Amazon. One of the things that's starting to happen is that some internet services have started to look more like actions than speech, because, like, people sell things on Amazon, and is that a different sort of activity? And I've argued in briefs before courts that it basically still boils down to speech. If you're on Airbnb, what you're really saying is, I have an apartment to rent. Maybe that's the only thing you're really saying on Airbnb, because it's a very hyper-focused platform. But again, that whole slide about 230: it's kind of general in its application, and it applies to a whole bunch of things. It can apply to Amazon when you regard that a third-party seller is really a seller saying, I have a thing to sell. And when we start imposing liability attached to the sale, that ripples through and affects every platform that takes every type of expression, because there's no real contingency about whether 230 applies to a broad-based platform that takes all sorts of speech or a narrowly focused platform that takes very specific, I'm-only-selling-this-kind-of-thing speech. But that's a lot of the 230 questions. On the EARN IT side, people are asking, and maybe this is the way to phrase the question: if EARN IT passes, what happens? Does Signal go offshore? Does something go offshore? Does the internet die? I mean, well, you know, the internet didn't completely die with SESTA/FOSTA, except it really took a very severe hit, and so did an awful lot of the users who used it; the body count of sex workers is significant and real. So let me ask you: what happens if EARN IT passes? If it passes, I think we will immediately see even more shutdowns of portions of services, or even maybe entire services, just as we saw immediately the moment SESTA/FOSTA was signed into law. 
Also, a lot of litigation, including over this encryption safe harbor that Senator Leahy tried to insert. It basically creates kind of a license to litigate, because it is narrowing what should be a broad immunity, and it already often takes quite a lot of litigation just to get to the point of saying no, Section 230 bars this lawsuit, and kicking it out of court. I think we'll see a lot of litigation over whether a lawsuit is trying to hold a platform accountable for CSAM on its service on the basis of encryption or some other thing. If it's encryption, then the lawsuit is barred. If it's some other pretextual reason that the plaintiff or the state AG can come up with, then it's not going to be barred. So we'll see platforms drawn into quite a lot of litigation. And that's for the ones that dare to continue providing any kind of functionality for user-generated content at all. I think we may see small platforms start to shut down. It's true that Signal has threatened to go offshore and move to another jurisdiction. You know, this is one of the limitations of this bill: I don't know how it could possibly purport to touch open source tools. Some encryption tools out there are either open source or housed overseas, outside of US jurisdiction. So these tools would still be available, and those who want to use encrypted tools for bad acts would flock to using them. There's another question in the stream about that. But the people who are just everyday average users probably won't take the time to seek out services that have not been intentionally weakened to try and avoid the specter of liability that EARN IT would raise. In the event that that is in fact what platforms do, I think we'll see a difference in the response. 
And as you were talking about earlier, that difference is between the large, well-resourced platforms that have the deep pockets needed to face this kind of litigation, that can make the calculation that they would rather protect their users and keep these kinds of security protections in place and just deal with, you know, paying for the stuff that they don't catch, and everyone else. So we might see smaller services either shutting down or, as mentioned, potentially offshoring. So we'll see. But all the uncertainty that this opens up is just another reason to keep fighting against this bill, because everything about the world is uncertain right now; I would like to know that my ability to just use my phone, to use WhatsApp or Signal, is still certain. So I think this boils down to, you know, understanding how the law works and the basic mechanics of it, how 230 works, and then definitely understanding how this statute messes with that. It also boils down to, I asked the question about what the political motivation is behind some of these things, and to the extent that there's genuine displeasure with some of the larger internet companies, they'll probably be able to survive. But if we don't like Google, what we really need is a new Google, and this is not going to help us get a new Google. This is going to help us double down and entrench the Google we have, with everything we don't like about it. So it doesn't actually solve any of our problems. And I think the thing is, the internet might limp along, but to the extent that the internet facilitates human speech among 7 billion of us, that's a great thing. And when we take that away, I think we will miss it greatly. 
And so I think our final thought is: we need to be a lot more careful. We can't do legislation based on convenient bumper stickers. We really have to understand very carefully the impacts of how law works, how human beings work, and how we start driving behavior when we start imposing punishments, or even just uncertainty, on people who are expressing themselves or facilitating that expression. Any other final thought? Yeah, I mean, just to say again: please take action. Please call your senators; take action against both of these bills. There's a question about whether this would be bad for the US economy. I think the other bill, the LAED bill, is a greater threat to the economy, on the basis of a similar bill having been passed in Australia a couple of years ago that has really helped to screw up their tech sector. So I think we would see, yes, a negative impact from the LAED Act, but also from EARN IT, because of the less vibrant array of services that we'd be able to see online. So, anyway, I could rant about this all day. My favorite thing. I mean, I want people to understand it. Right now there's an awful lot of people with an awful lot of agendas saying Section 230 bad, and I think the general populace is like, I don't understand, yeah, that sounds good to me. I want everybody who's listening to be somebody who says, aha, maybe that's not good, maybe I should really rethink that. I think if people really understood what is being threatened by this legislation, they would not be for it. But, you know, law is kind of a scary and complex system, and you leave it to other people to do that thinking for you. What we're trying to do is make it so that you can understand the mechanics of how this works and draw an opinion, and I think that opinion will be that this is not good. It's just that it's not always intuitive to understand this dynamic: how does the law work, and why do things respond to it? 
So hopefully we can demystify it, and there are an awful lot of advocates out there, me, Rihanna, EFF, other organizations, who, you know, are happy to answer questions and help people understand what's really at stake with any of these bills. I think that actually brings us to an end. Thank you so much, village, for having us, and thank you, Rihanna, for doing this with me. I think this was great, and hopefully everybody watching thinks so too. Thank you so very much to both of you. It was a pleasure to wrap up our very first day with you both. And so thank you, thank you again for speaking at the village. Folks, this is the end of the day for us. We have survived day one of virtual Crypto and Privacy Village, which frankly took more years off of my life than maybe all of 2020. No, I'm just kidding. But I will say, like an influencer, because I've never really streamed: like and subscribe below. We have various different avenues through which you can communicate with us, reach out to us, engage with us, since we can't be with all of you in person this year. Thank you so much to all of our speakers. Thank you. Thank you so much again. And if I haven't said this enough, thank you so much to everyone else who makes the Crypto and Privacy Village possible. There are a ton of folks who spend a lot of time, who have done so much work for the Crypto and Privacy Village this year to make it happen as we shifted gears from going live to going virtual. I could list all of their names now, but I'll save that for another time when I get to host and see you all again. So I am going to transition, and hopefully not break OBS, because I'm doing both, so let's see if we can do that. Let's see if we can go to something that is a little more lo-fi. Thank you, everyone, and I'll see you soon.