Thank you, everyone, and welcome to the latest edition of the Free Speech Project, online. This is a collaboration of Future Tense (which is itself a collaboration of Slate, Arizona State University, and New America) and the Tech, Law & Security Program at American University Washington College of Law. We have a fantastic discussion slated for you. We have two extraordinary panelists, David Kaye and Kate Klonick. I will introduce them in a second, and just to give you the lay of the land: we are going to speak amongst ourselves for about 40 to 45 minutes, and then we will answer your questions. So please send us your questions along the way, and I promise we will get to them all, or as many of them as we possibly can.

Kate Klonick is a professor at St. John's University. She is an extraordinary writer and thinker on these issues. She has published many articles, including in the Harvard Law Review and the New Yorker, and she has another law journal article forthcoming. She's just a really brilliant thinker on these issues, so we're really lucky to have her with us today.

And David Kaye as well, another truly brilliant thinker, writer, and leader on these issues: a professor of law at the University of California, Irvine, and the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. I highly recommend his 2019 book, Speech Police: The Global Struggle to Govern the Internet, which is a fantastic, clearly written, and concise laying out of the issues, the struggles, and the choices that we have to make.

So I'm going to jump right in. We hardly have a shortage of things to talk about here; there is so much in the news right now that it's hard to know exactly where to begin. But let's start with the relatively recent decision by Facebook to take down the Trump ad that included Nazi symbolism. What do we make of that? Was it the right decision?
Was it the wrong decision? How do you rate the way Facebook is handling these highly controversial issues? David, I'll start with you.

Great. Jen, thank you so much for that super kind introduction. It's great to see you, and it's great to be on a panel with Kate. Hi, Kate, nice to see you again. So, for me, I think that if we get into a mode of constantly either criticizing or supporting specific actions of the platforms, we might not be in the right place, because we might come out differently on specific outcomes. In this most recent case, the use of basically a Nazi symbol, one used to identify political prisoners in concentration camps during the Holocaust, and assigning that to Antifa, which is what I understand the ad to involve, we could pretty clearly see that as problematic on many, many levels. But there are always going to be issues where we're going to have some difference, and I think the fundamental problem is less about specific outcomes and more about two things. One, what is the guiding principle, or set of principles, that they're using to make those decisions?
And two, connected to that, how transparent are they about those decisions and about those rules? I think this has actually been one of the fundamental problems for the platforms over the years: we have very little idea about how those rules are made. Kate has a better idea than anybody, I think, and yet when it comes to specific cases and how those are decided, we have very little insight. To my mind, that's more of a problem than looking at very specific kinds of outcomes.

Right. So, if I can jump in, one of the critiques of Facebook's approach in general with respect to political ads is that they are applying a different standard to political ads than they are to the rest of the content on their platform. Kate, what do you make of that? And again, we can look at the specifics or we can look at the policies more broadly.

Yeah, thank you again for having me, Jen; this is great. And yes, I also love talking with David whenever I can get the chance, and I endorse his book, which is one of my favorites. I think David is absolutely right: if we just get into the weeds of every single decision, we're going to feel like goldfish floating around a bowl, forgetting where we were three seconds beforehand. There just needs to be a move toward creating transparency and consistency and some level of accountability between users and the policies that guide them. And you're exactly right: one of the things that really bothers me about the Facebook political ad situation is that they first decided, very openly, that they weren't going to police political ads. That's where they drew the line. Then they decided to change that, and then they decided to change it again for Donald Trump. I've written about this, but it's one of the things that bothers me the most: they're not even particularly consistent about how they enforce these rules, or against whom. So we see, over and over again, different types of figures getting treated differently depending on the speech they're producing and the speech that's being recapitulated online. I also want to point out that I have recently been in talks with some of my sources at Facebook, and there are definitely subsets of individuals, public figures, who have their speech handled differently than you or I would, Jen. I just don't think that's how we want to run the internet: according to who a bunch of people in Menlo Park think is worth keeping up or taking down.

Oh, we lost Jen. But there she is. But that's really, I think, the thing that strikes me as what we have to do going forward on these policies: create levels of accountability, or, a better term for it, participation from users in the rules that govern them. Just let them have some say. Because right now it's like a benevolent dictatorship, in a lot of ways, with Mark Zuckerberg.
Can I just quickly take off on that? Because I think this point is really important, and Kate, in her, to use the word that should be used, seminal piece on platform governance, coined the phrase for it: "the new governors." These are platforms that have so much power, almost a rule-making function. And then there's the opacity: the fact, and I think this really is interesting news, that they have different approaches to different particular figures. Then you layer on top of that the story that Ben Smith wrote about in the New York Times the other day, about kind of secret meetings. Of course corporate actors are always meeting with political figures, but it has a different valence when we're talking about a company that manages so much of our political debate and has such influence. So I think Kate's points are really on target here. It looks like a big mess when you have this inconsistency layered with this kind of influence that seems unseemly.

So I want to get back to both of those points, and particularly, Kate, to the idea that people should have a say over this, and talk a little more about what that means and what that could possibly look like. But I just want to push back for one second before we go there, on the political ads and public figure question. As we all know, Twitter has taken a very different approach than Facebook.
Even Twitter, which, for one, has banned all political ads (and even that creates a whole host of questions about what counts as a political ad, which we can get into in a moment), has, with respect to President Trump at least, taken a different approach in terms of fact-checking and putting limitations on the retweeting, in particular, of posts that were glorifying violence. That said, they weren't treating him exactly the same as everyone else: they kept the posts up. And to me that seems like the right decision. There should be some distinction between political figures and non-political figures, because as a public, as part of our democratic discourse, there is value in us knowing what those political figures say. But at the same time, they shouldn't get away with just violating the rules that apply to the rest of us.

Yeah. So, a couple of interesting things I've recently learned that I think have been underreported. It looks to us like all of this use of labeling on Trump's speech is something Twitter just pulled out of a hat in the last couple of weeks. But actually, since the Nancy Pelosi video, the one where her speech was slowed down so she sounded like she was slurring her words, which was about a year ago, late May of 2019, they've spent the last year surveying over 6,000 people, stakeholders and others, trying to figure out what people would have preferred rather than simply removing the Nancy Pelosi video.
I also just want to flag really quickly That it's so it's like you when we talk about nancy polosi's speech or donald trump's speech Uh, or the public figure nature or something there is one set if you were the speaker Uh, and a public figure and another set if you're a target and spoken of and a public figure And we're blurring all of those together So like the fact that donald trump's tweets something and doesn't necessarily post it on facebook means that like Is it facebook's job to take down the news stories that depict whatever whatever it was that he said on twitter? um These are kind of just these Commingling of issues that are really unclear. But anyways to get back to twitter So they researched all the stuff and what people told them And like and me and you're reflecting this gen is that they liked interstitials They didn't want it taken down They just wanted a little more context or the ability to go off and be told to go get more context, right? And so that's what the labels kind of Kind of get at and I think that they're much better as you stated Um, the other thing about the poll the last thing i'll say and then turn it over david. It's just the twitter The political ad ban on twitter is so deceptive because they're their revenue from political ads is like One one hundredth of the revenue that facebook uses for political It costs jack dorsi nothing to make that like to make that statement And he did it like three hours before facebook's earning reports meaning at the height of the political ad controversy What people don't see is kind of that human aspect of like boys with toys running around in silicon valley doing this kind of thing And I just think that that's that's it's a that's a very important Knowledge to know why you know to know how to question what some of these things are that we're getting when we do have transparency Yeah, I I totally agree with that. 
I mean, I also think that one of the interesting things about the interstitials, and really the whole range of options the platforms have, is that in the public discussion the framing is often a binary one: leave up, take down, something like that. And we know there are many different approaches: ones that are algorithmic, ones related to what can and can't be shared, interstitials. So I think a helpful part of the conversation over the last six weeks or so has been to highlight for people that we're not just talking about a binary; there are different options.

One thing I would say, though, is that the tweets that were given these interstitials, these labels, by Twitter were on the edge, I would say, and they were very problematic. But you can imagine other public figures, maybe not necessarily political leaders, but civic leaders or others around the world, who are actually inciting violence, and in those particular cases there might be a really good case for a takedown, for actually deleting the content and not making it available to anyone. Those are going to be the extreme cases, but that option still has to be available to the companies, and it shouldn't be based solely on whether someone is a public figure or not, precisely because it's the public figure who might be in a position to cause more harm. That's why the public figure aspect has been a little bit twisted, I think: it's not really about the person, it's about the harm they might be causing.

I'll plus-one that.
I just think that's exactly correct. And unfortunately, a lot of these public figures and world leaders, if they are calling for harm or violence, are going to be able to amplify their message without Twitter's help, without Facebook's help. They can get it out there, right? It'll be all over the news, and it is big news when you're doing that type of thing. But there's nothing to say they have a right to that amplification. I can't remember who said it, whether it was Rebecca MacKinnon or danah boyd, but someone very early on said: you have a right to speech; you don't have a right to be amplified. And I think that's a nice thing to remind yourself of when you're thinking about all of the different mechanisms by which we decide to adjudicate this type of speech.

Great. So I think even just that discussion highlighted how hard these decisions are and how contextualized they need to be. These are really tough decisions that require a lot of judgment calls: what is someone saying, how will it be perceived, what's the likely reaction, what's the relative value of allowing their speech to be out there versus the costs of taking it down. And so, Kate, you said let's give all of us more voice in how these really hard, contextualized decisions are made. How do we do that?

Yeah, so people forget this. Oh, sorry. Go ahead, Jen.
I thought you were done.

You know, I've been doing these Zoom meetings for three months and I still forget to unmute myself. The only thing I've gotten really good at is wearing sweatpants all the time. So, I think that's a really great question. I don't think the answer is a referendum, a direct democratic form. In fact, what we've seen with Brexit, and with various propositions out of California, and all types of other things, is that referendums are maybe not the best idea, not to be super Madisonian about it, especially when you have such divergent views across the globe. And in 2009, if anyone remembers, Facebook did try kind of a referendum project, direct voting on its community standards, and 0.32 percent of the Facebook population showed up to vote, about 10,000 people out of 200 million Facebook users at the time. So it was seen as a colossal failure. I think that going forward, what you need to start doing is having things like the Facebook Oversight Board make more standardized means for users to get redress on speech that's taken down, or to push back on speech that is kept up that they disagree with being kept up, and to create better user participation mechanisms. I don't know exactly what those look like, but user participation, participatory democracy, is the model I'm speaking to, rather than a direct democracy type of idea.

Yeah, I think those tools can be innovated, and they should be. And one way to think about it is this: we often think of the platforms in a particular way, and I think this is certainly true of me.
We often think of them in our own space, like: what's the debate in the United States right now, and how are the platforms influencing it? And something like the Oversight Board might actually be helpful in integrating the views of a community, whatever that might be, the broader public, into those decisions, and also in highlighting what the sources of those decisions should be: what standards are you drawing from? But one of the things the platforms do pretty poorly is integrating public and community perceptions of the rules and their implementation around the world. They're much more in tune with the U.S., because they're all in Menlo Park or San Francisco or San Bruno; they're marinating in American culture and ideas all the time. But what about the 85 percent of Facebook's user base that is outside the United States? How do we integrate their views, the views of the public or the users in those places, so that the rules actually make sense to people? I think the companies have basically failed at that, and part of the reason is that there's nothing like a case law system. We can imagine vast inconsistencies and inequities in decision-making. I don't know that that's true, but we don't know either way, because there's no public set of decisions. There are hypotheticals, there are Hard Questions blog posts and things like that, but we really don't have the opportunity for review. Nothing wrong with the Hard Questions blog; I love it.
It's interesting, and it gives us insight into particular problems, but it doesn't give us insight into the whole range of cases that people, especially outside the U.S., would really benefit from and could then integrate into the decision-making. And so I don't get that.

By the way, David, I wonder if you have insight into this, or you, Jen: why the eight to nine percent of Facebook's user base that is in the U.S., versus the roughly 90 percent that's everywhere else and is the only part that's growing? The people who are joining Facebook, and who are increasingly dependent on it through zero-rating programs, are all in the Global South. So staying so focused on the U.S. doesn't seem to make sense from a bottom-line standpoint, right? I just don't know what that is. Maybe Americans are richer and spend more money and the ads are worth more. I don't know, but that's the only thing I can think of.

Yeah, I don't know what the answer to that is either, and it also feeds into the substantive question of standards. Their standards, the Community Guidelines, the Community Standards, the Twitter Rules, all of those, are company standards, and, putting the political ads aside, there's been a lot of convergence. Maybe on the margins in some areas there's some differentiation, but there's been a lot of convergence around terrorism, extremism, hate speech, and those kinds of things. But if they actually rooted those standards in a vocabulary that people around the world understand, I think it would be a benefit to the companies. It's not a constraint. I agree, I don't get it either.

So, David, you've written about this a lot. What about the risk?
I mean, I think one of the arguable benefits of this kind of, what you might call, myopia is that most of the companies, and Kate, you've written about this as well, operate from a free-speech-promoting perspective. They don't always achieve that goal, and sometimes that goal conflicts with a whole host of other values, and there are all kinds of reasons why just labeling free speech as the ultimate goal doesn't actually work. But as both of you have written about, and Kate, you and I wrote about this a little bit together, there are real risks of other countries, ones with much more repressive standards and rules, coming in and shaping speech across the internet via these companies that operate across state lines. So how does a company pursue what I think is the very valuable goal of incorporating other perspectives and other contexts without running afoul of real concerns about censorship and repression?

Yeah, it's a great question. Kate, you want to go first? It's really good.

No, go ahead.

The only thing I would say, and we could talk about this for hours, is that the last couple of weeks, especially the last week, have been really, really interesting from the perspective of seeing other countries, particularly in Europe, deal with speech questions. The French Constitutional Council struck down the French hate speech law that had just been adopted a couple of weeks before. They basically said it's unconstitutional, and if you look at the decision, it's in its own kind of constitutional language, but it's basically making human rights arguments about why pressure on the platforms can't be applied in that particular way: vague rules, and so on.
Super interesting. There was also a decision by the European Court of Human Rights, just today, that struck down some Russian website-blocking measures. And there have been other cases over the last couple of weeks as well, which I think is interesting and should also be a signal that there actually are places around the world that value freedom of expression, but implement those values and rules in a different way. It's not that it's a better or worse way than ours. But one thing you see in the rulemaking there is that these courts are saying: look, freedom of expression is a fundamental value, in these cases in the European space, and in order to protect it, and to ensure that protection doesn't interfere with other rights, the rules need to look like this, that, and the other. It's actually pretty sophisticated. And I think Americans would do well to look at that and say: whether as companies adopting their own rules, or as the U.S. thinking about what our own regulatory strategy might look like, we can actually learn quite a bit from this recent European case law. Super interesting stuff happening.

Yeah, I was so excited by the French court decision, and Jen, I'd actually love to hear you talk about it, because you're probably more familiar with it than I am, but I thought that was great. I will just say, really quickly, that I think the free speech policy you're speaking of, the way the companies decided to set their standards, came about because it matched the moment, early in internet history, at Google and YouTube and Facebook, in roughly the first decade, 2000 to 2010. But we still see it now: with Facebook fighting back in the Austrian defamation case that we wrote about, Jen, you see a pushback from the companies against nation-states that is very free speech oriented, let's put it that way. And as long as they could apply that same free speech, keep-it-up type of policy to users, there was a comfortable, easy consistency. I think that between the Boston Marathon bombing, at least at Facebook, and the Napalm Girl incident, so roughly 2013 to 2016, you had a slow disintegration into trying to draw lines around newsworthiness and other things, and to take down more speech. There had been a monolithic approach of basically: people are going to be more upset if we keep speech up; they probably won't be as mad if we take speech down. That flipped around the Boston Marathon bombing; it was completely blown out of the water, and it changed the entire way the companies have dealt with things going forward. But you see it in the various iterations, again, of the human side: who's working at the companies and when, and what policies they put forward at what point, much like a supreme court, right? The Roberts Court versus the Marshall Court. We forget that that's exactly what's going on.

So let's talk about courts. You've both mentioned the Oversight Board, which Mark Zuckerberg at one point compared to the Supreme Court, a Supreme Court of Facebook. And so, Kate, before we get into it, you've obviously studied this more than just about anybody. Do you mind giving all the listeners a very brief lay of the land, in case not everyone is as familiar with it as you clearly are?

Yeah. It's a dubious distinction: no one has spent nearly as much and gotten paid less than me following Facebook around. But the Oversight Board is really the product, the 2018 brainchild, originally of Noah Feldman; it was then adopted by Mark Zuckerberg, who put it into action. But it was something people had promoted as an idea for the decade before that. People had long been talking about having an appeals process, a court of appeals. Rebecca MacKinnon talked about it.
Danielle Citron talked about technological due process. There has long been discussion around creating more due process in these systems. And so in 2018, Mark Zuckerberg basically decided that he was going to make a firm commitment to building out an independent oversight board that would review speech that users flagged, content that was removed or kept up. Users could appeal the decision Facebook made, whether to remove that content or to keep it up, to this board. And that sounds like a fairly straightforward thing: you're going to have this court of appeals, like a supreme court. But it actually ended up being, and this is what I spent the last year researching while I was there, an incredibly difficult set of questions. How do you draw the jurisdictional lines of what Facebook is actually going to do with the Oversight Board's decisions? What will the board look like: more like a representative body or more like a court? How long will members' terms be? How will they select cases? How will they hear cases? Will there be anonymous panels? Will they write decisions? All of these things.
So right now there are about 20 people on the Oversight Board. They started May 6th or 7th; that was the official launch, so to speak, naming all the members. They are slated to ramp up and be hearing cases from individuals by September, and they have said they will hear cases first from users who have appeals to be heard. But within the board's jurisdiction, Facebook can also go and ask it for policy decisions that it wants. And so there's tons to talk about here. Why did Facebook do this? Why did it create an external, independent board? Is it independent? Is it going to be effective? Does it have any power? Is it a Potemkin village? Is it really a court? There's lots to discuss, but I've written that, at the very least, it is one small step toward a participatory system for users. It's a very small step, but it's more than we have right now, and so, as an advocate for online speech, I'll take it. And I'm happy to answer any questions on that.

Yeah, I think Kate and I come out in a similar place. When they first announced the first 20 panelists, there was some very serious criticism, and I think we're both in the same place: it's a positive step, or at least a step that isn't harmful, in my view, and let's see what it can actually do. But in terms of what it can actually do, I do think we need to be realistic. It's pretty limited. It's going to be constrained by the fact that there are a couple of billion users and only 20 panelists, with a relatively small secretariat to staff it.
So they're not going to be able to deal with all of the different kinds of appeals that people have. It's going to be a policy-level cut; I would think of it that way. It's almost like the Hard Questions blog we were talking about: it's going to be like a hard-questions appellate body, in a way, and it will be just as accessible, maybe even more accessible, to Facebook to ask the questions as it will be to users. So I think we need to be realistic about it. And I also think, and I guess this is more my hope than anything else, that over time the board will actually push the company to change the standards it has, and there will be debates. There will undoubtedly be a kind of Marbury moment, where Facebook says to the board: you're exceeding your ability to review our decisions. And when that happens, I think that's where it's going to be most interesting, right? That's going to be the moment, depending on what the issue is, where we're going to see: is this board for real? Is it a real constraint on Facebook, or is it something on the margins, helping the company make particular decisions?
I don't think we know that yet, and until it happens I would reserve judgment and just look at what the board is actually going to be doing. It'll be very interesting to see whether it has that kind of impact on the company.

So let me ask about a critique that's often leveled against the board, which is that the board is reviewing particular decisions based on the standards and policies that Facebook itself has put forward. And my understanding is that the board's opinions with respect to particular cases are binding, but its broader recommendations with respect to the underlying policies and standards are advisory. If that's the case, and given how few cases, relative to the demand, this board is going to hear, is this really going to move the needle? I get that it may be better than nothing, but I guess I'm coming at this with a bit more skepticism than either of you has expressed so far.

No, I think the skepticism is completely warranted. Like I said, it's even smaller, honestly, Jen; it's even smaller jurisdictionally than you're describing, because the bindingness of the decision applies to that one specific piece of content, not to identical content reposted by anyone else. It is only good for, say, if David has a picture of his dog up and it's removed, and he wants it back up, and I post a picture of his dog: they will only reinstate David's picture, not mine.
And so that is almost nothing. But the advisory capacity, and a lot of people have written about this, including Evelyn Douek at Harvard, who is brilliant on this and has covered it pretty much as deeply as I have, though without being as much inside, is that this is really a form of weak-form review. By allowing the board to have any power and to exist at all, you can't really keep the genie in the bottle, and eventually, as David describes, you will slowly have more and more feedback that is publicly announced and thus creates immense public-relations pressure on Facebook. I should also say that Facebook has to respond to all of the policy advice: they have 30 days to respond every time the board gives them any type of advisory opinion, and that's also really important. I think those two things are, like I said, more than we've ever had before. You still have to consider that it's a private company, and the board's capacity to, in a way, shame the company into taking certain decisions or applying certain rules is something we don't know quite yet. But in a way, it's not totally clear to me that Mark Zuckerberg knew what he got himself into. Because once you create this body, and again, we don't know the impact, but once you create the body and once it starts making decisions, if those decisions are pushing the envelope on issues like human rights standards, or even algorithmic and other kinds of issues that the company might be uncomfortable about, if the board goes in that direction and really pushes forward and the company resists, that creates a whole new, whether we call it public relations or
something else, problem; that'll be a very serious problem for Facebook. Now, I don't know if that'll happen. It might not; it could be that the board is pretty modest. But when you look at the people who are on the board, at least the eight to ten of them I know, that's not their M.O. usually. They will push. It'll be interesting to see how successful they are.

And how public. That was also a huge question as this was all being formed over the last year: how serious would the people be? Would it be a blue-ribbon panel of people who weren't actually going to have time to devote to these issues? All of these things. And I agree with you: I think the M.O. of most of the people who got selected to be on the board is one that's going to be pushing for a Marbury-type decision sooner rather than later.

Yeah. So here's a question from an audience member: is it a good thing that the Oversight Board will hear cases anonymously?

Yeah, I saw this question. This is a great question, and it was heavily debated. The reason they're hearing cases in five-person panels anonymously is security, because they're concerned, and I think maybe rightfully so, that certain board members would be targeted if the specific cases they were hearing were public. That said, it's going to be kind of interesting, because they are allowing dissents. So you're going to have an anonymous dissent and then a per curiam opinion, which will just be interesting. What do you think, David?

I'd rather it not be anonymous.
Look, one of the restrictions on the board that we haven't totally addressed is that it's not going to be dealing with government legal demands. So, for example, if Turkey makes a demand on Facebook to take down content, my understanding is that that's not part of it; it's certainly not Facebook's intention that the Oversight Board could adjudicate those. Those legal demands are handled somewhere else. And those are the kinds of cases that I think might raise actual security issues for people, not necessarily the kinds we're going to see. So my inclination is to say I'd rather have more information. Just as with courts around the world, outside of military tribunals or security panels that occasionally do their justice anonymously, which is almost always problematic, I would rather it be more publicly accessible and for people to debate it in that way. It's a lot harder when your judge or your panelist is anonymous to the outside. But I also understand there are some risks to doing that.

Yeah. So I want to broaden the lens a little bit. David, very early on in the conversation you talked about the concern that all these decisions are made behind closed doors, with no precedent and no publicity. We've talked about how this might be a small step for a single company on the margins. How can we think about designing systems and oversight that would work across companies and really do that in a meaningful way?
Just from where I come from, these are such hard choices, because if you put that decision in the hands of any sort of governmental entity, that raises a whole host of legitimate concerns, I think. These seem like really, really hard design questions, and I know both of you have thought about this a lot, so I'd love your thoughts.

Yeah, so maybe two responses. One, just on the Oversight Board generally and the idea of it being self-regulation: that should not be a replacement for other kinds of oversight. The self-regulation of an oversight board, from my perspective, should be seen as a company implementing its responsibilities under the United Nations Guiding Principles on Business and Human Rights, to ensure it is not having a negative effect on the rights of its users or on the public. Every company should have something like this, and for that reason it should be guided by human rights standards. And that's not limited to the tech sector; there are a lot of multinational companies that have very grave impacts on human rights, and we could have a much longer discussion about that. So I think that's one thing: all companies should be doing something like this.

As far as cross-industry oversight, the organization Article 19 and a few others have proposed, and I've been working on, this idea of a social media council. The lingo comes from press councils, which are common in the Commonwealth and in other places around the world. You don't want the government evaluating, for example, what a media outlet publishes or doesn't, but having some kind of civil-society-and-company body to evaluate real hard questions around content is appropriate. And I think there could be a cross-industry approach, similar to the one Article 19 is suggesting, that at least
says: here are human rights standards, and we have user-generated content having a significant impact on human rights, sometimes promoting them, sometimes interfering with them. If you have this as a cross-industry, multi-stakeholder approach, you could actually have some ways to increase transparency, increase public participation, and so forth. I think the ideas around this are pretty rich and sophisticated, but they've hardly been discussed beyond a narrow group at the moment, despite the real efforts of Article 19 and Stanford and some others to make that happen.

Yeah, I agree with that. That's totally what I'm thinking as well.

So we've made it this far without mentioning Section 230, which is shocking. But I don't think we can have this conversation without talking about the possibility, the effect, and the potential need for Section 230 reform as a means of putting external forms of accountability on the platforms, to do things the platforms are not doing sufficiently on their own, whether that's not sufficiently taking down harmful speech or not sufficiently responding to malicious foreign interference. A whole host of harms are being shoehorned into "let's reform 230 and that will help solve the problem." So does 230 need to be reformed? Will that help? Is that the kind of oversight and accountability that is needed in this particular moment?

So I've been hosting a daily webinar.
It's 90 minutes: five different discussions, with five different panels of people with different perspectives on Section 230, about this very question. I will just broadly say that there is certainly some room for smart Section 230 reform. I think it could be brought in so that it's not just applying to federal criminal law; that's certainly one of the lowest-hanging fruits. But it is, weirdly, such a misunderstood law as to what it can and can't do, or will and won't do. One of the biggest examples I gave today in the seminar was that I think there is a strong probability that if you struck down Section 230, courts would hold that platforms have their own First Amendment rights that protect them from causes of action on communication torts, rights that let them decide what goes up and what comes down, and that that's totally on them. The difficulty, and what I think would be so stupid about getting rid of 230, is that we just don't have much First Amendment law around this, precisely because of Section 230. So you're going to have this entire slow iteration through the courts, working out exactly how the First Amendment creates pockets of immunity from communication torts, and it just feels like reinventing the wheel when we don't need to. And that's just one layer of it; there's the entire FCC administrative layer.
There are other types of things, too. But the other point, to David's point about speaking outside the United States, is that Section 230 is a United States law. It applies within the borders of the U.S.; Section 230 doesn't exist in other jurisdictions, and the platforms have managed to thrive and get along just fine. So that's something to keep in mind when you have the "sky is falling, Section 230 is going to go away" view on one side, and on the other, people who think the only thing standing between us and a better internet is getting rid of Section 230. I just don't think either of those perspectives has it right.

Yeah, I won't say much about 230, but I think we are being myopic if we focus only on 230 and only get our learning about intermediary liability from 230. In Europe they have the e-Commerce Directive, which has some variation to it but basically immunizes the companies from all sorts of liability, much as Section 230 does. And in Europe right now, the European Commission is considering a massive revamping of internet law across Europe. I don't know if I'd say disruptive, but it's going to make a big impact in Europe, maybe to a similar extent as the GDPR has on privacy in the European space. And that means it's going to have a big impact on U.S. users as well, because these companies operate at scale. If they're forced to make some pretty significant changes in Europe, they're probably going to make them at scale and implement them across the board. So I actually think we should be educating ourselves on what the Digital Services Act in Europe is going to mean for internet regulation around the world, in the United States and elsewhere, and I think we're not even really
having that discussion. But it's going to have a big impact on us. And as we move forward and think about Section 230, I think we should also be thinking about non-content-focused forms of regulation: transparency measures, disclosures. We haven't even scratched the surface of what's possible without getting into undermining First Amendment rights in the U.S.

What should we know about that law, David, the European one, the DSA?

Yeah. I don't know yet, seriously. But what I think is interesting is that so far, in the drafts coming out of Europe, at least the current leadership of the European Commission does not want to undermine fundamental principles of intermediary liability. In part that's because, as much harm as they might see the platforms causing, they also see that the platforms have so much power over public space that could be for the good. I think some of the people at the Commission really do believe that, and so they're looking for a way to protect that kind of public space that is good for public debate while also dealing with the harm. I think that's appropriate. It's going to be super hard to do, but I think we should all be engaging with it, or at least observing what's happening, because, given how dysfunctional we are in Washington, it's likely that Brussels is going to make a move well before we're in a position to do anything.

And I would just say that that's this interesting type of governance that is happening from outside U.S.
borders. It is literally the Brussels effect, or the California effect, or whatever you want to call it: the idea that because one jurisdiction has a large market share, whether it's Europe or California or Texas, you decide to cater to that largest market, and then, if you're a company, you extrapolate your policy from that rather than making separate little pockets. Rhode Island doesn't get to set policy for the rest of the world, and neither does Lithuania all by itself. These are iterative things, and I just wanted to point out that we're seeing this play out over and over again in these private spaces.

I completely agree with that. I also think it's slightly complicated by the fact that there is the possibility and the reality of geo-blocking on the platforms, which does allow them to provide slightly different user experiences in different contexts, and that is one way companies have mediated, and continue to mediate, different norms and approaches across borders.

I want to turn to another audience question. We've talked a lot about political ads and about user-generated content, but this is a question about foreign interference. Are the social media companies doing enough to deal with foreign interference, particularly in our elections? And what can we do to hold them more accountable if we think they're not doing enough?
Yeah, so I don't know if I could say that they're doing enough. They're doing better; there's no question that they're doing better than they once were. But I guess I'm afraid of a couple of things. One is the whack-a-mole issue. The problem is that there are a few governments that are extremely sophisticated, and it's not just governments; there are also domestic actors that are extremely sophisticated and attuned to how to game the platforms. And I think that exists regardless of the level of transparency of the companies. So it's a constant game, and a pretty high-level game, of dealing with bad actors on the platforms, and whether they're always succeeding is hard to say. There's still a very high level of disinformation on the platforms. There's certainly better transparency than there used to be, but there's not full transparency in terms of what they're doing and what they're seeing. And as far as disinformation and government manipulation outside the United States, it's still black-boxed in a way. The Russia-U.S. issues are more open to us than in the past, but what, say, Twitter or Facebook are doing with respect to Saudi manipulation, which is pretty significant, is very hard to know.
So I think there's still a very significant transparency problem there, at the least.

I'll just add that there has been a brain drain over the last four years of people working in cybersecurity in government, with the best people leaving to go work for the platforms, because the platforms pay more and have had this urgency of creating an answer to the problem. So, and I agree with David, I don't know about the language "doing enough," but they're probably doing more, for better or worse, than most governments can do right now, and they have the reach and the data to do it. I think that they are well motivated to shut this down. For better or worse, I do not think that Facebook and Twitter want another 2016 election. They do not want another Brexit. I just don't know that they can. I don't think they can avoid being regulated if their conduct leads clearly to one of those events and they slept on the job, or there's some type of negligence at the heart of why one of those events happens again. So that's kind of my main takeaway.

I'm curious, maybe before we move on, because, Kate, your point here made me think of this issue.
I'm wondering what you and Jen think about this. It seems like over the last several weeks we've almost had a red-state platform and a blue-state platform. Facebook is being perceived as friendly to Trump, or at least more willing to accept Trump ads and the like, and maybe handling him with kid gloves; there's the story about the private dinner with Peter Thiel involved and all that. And then you have Twitter, which is labeling and so forth. I'm wondering how you see that implicating the regulatory environment. Let's say we have a Biden administration in January. Is it going to have an impact on what Washington does, or is it all going to be a wash?

I mean, you have Biden coming out in favor of regulating the platforms, or getting rid of Section 230, along with Josh Hawley. I actually don't know, honestly. I think that is fascinating to me: that you have Josh Hawley calling for abolishing 230 and you have Joe Biden calling for abolishing 230. As I've said about 230, it has this weird way of making people from so far on the right and so far on the left meet each other in this decision to take down this law. I don't know what's going to happen.
I think a lot of it is going to depend on how the press plays out the election and how the press covers the platforms. I don't know how Facebook wins if it keeps having to make these calls, and this in particular is one of the reasons I am hopeful about the Oversight Board: I think Facebook is well motivated to use it, because the Oversight Board takes these crappy decisions, where no one is happy at the end of the day about what speech stays up or comes down, and makes them someone else's problem, and that someone else is the Oversight Board. So I think that's kind of where this will go, but I don't know what Jen thinks.

I would just say that I think the whole notion of Washington attacking a platform for its either conservative or liberal bias is a low point, and we saw that in the underpinnings of Trump's executive order on these issues, on 230 and more. I would hope, and certainly expect, that a Biden administration would have a different tack on at least that particular issue.

But I will just end here. Again, there are too many amazing questions that I would like to get to, but we have run out of time, and I know our panelists have to jump off. I would like to thank both Kate and David for a really fantastic conversation, and I would like to thank our audience members. I would also like to inform and remind our audience members who don't yet know that there is going to be a fantastic conversation next week, same time, same place, with Anne-Marie Slaughter, the CEO of New America, and LinkedIn co-founder Reid Hoffman, on "2020: How Will It Change American Tech?" I encourage all of you to join for that. And please join me in thanking our panelists, and in thanking Future Tense, Slate, ASU, and New America for helping host this along with our program, the Tech Law and Security Program at American University. Thanks.