Good afternoon. Hello, we're going to go ahead and get started. Thank you all for coming to join this event that we're calling "Who's Afraid of Online Speech?" My name is Andres Martinez. I'm the editorial director of Future Tense and a professor of practice at the Cronkite School at Arizona State University. Future Tense, for those of you who have not been with us before, is a project of New America, Arizona State University, and Slate magazine. We explore technology's impact on society. We do this on Slate, where you can read our articles and content every day. You can follow us on Twitter at Future Tense Now. You can listen to our weekly podcast that Slate launched last fall, called If Then. And of course, you can attend our frequent events. We do a lot of programs here in D.C., bringing people together to look at provocative questions like today's. Last week we had one of our movie nights at E Street: we screened World War Z with scientists from Argonne National Lab. And next Thursday, a week from Thursday, we're having an event here entitled "What Our Democracy Needs to Know," which will be a look at how to continue developing our nation's public knowledge infrastructure, as we're calling it, at a time of such hyper-partisanship, and a time when expertise seems to be a bit devalued in our political moment.

A bit of housekeeping as we get into today's event: please remember to turn off your phones. I'll do that as soon as I stop talking, because I forgot to do it. And also, please be mindful that because we are livestreaming this event, during any Q&As, wait for a microphone if you have been chosen to ask a question, and identify yourself. That's always nice as well.

So our subject today could not be more important or timelier. Free speech is one of the cornerstones of American democracy, and in my mind it's the one trait that makes our democracy most distinctively American.
I think people who have lived elsewhere or traveled elsewhere become very quickly aware that our bedrock belief in unfettered free speech is a very American notion. It's a very American value, and it's also a value that, from its inception, the internet embraced. The internet was always seen as a vehicle for the propagation of that American value of free speech. And there are a lot of people in this room who know far better than I all of the policy decisions and choices that were made at the time the internet started to spread, to ensure that this was going to be a space where free speech was going to be celebrated.

We're having this conversation today to flesh out some of the tensions, two decades on, that surround those initial choices made at the inception of the internet, and some of what I might characterize as anxieties around what had been our universal consensus about free speech, this American value: some understandable concerns about the power of hateful and deceptive speech online. I feel like we're in a moment in our culture and in our politics where a lot of smart, well-meaning people are asking us to revisit some of those original choices. And I think most of us are having a hard time grappling with the tensions and trade-offs: wanting to cling to this belief that free speech is always the right answer, and yet having to navigate what are troubling instances of undesirable speech online.

So more than anything, I'm eager to learn from the amazing speakers that we have assembled for you here today. To help us sort through some of these tensions and trade-offs that I alluded to, I'm going to ask April Glaser to come up and give some framing remarks for why we're here today. April is our staff writer at Future Tense on
Slate, and she's also the co-host of the aforementioned fantastic If Then podcast. So, April, thank you.

Hi, everybody. I'm a little shorter. Thank you all so much for coming in the middle of the day today. I am April, and I do write for Slate and also podcast for Slate. We're going to talk about free speech on the internet today, and I'll start by just saying: we talk online, we read online, we share online, and so it does stand to reason that free speech must be protected online. Free speech, after all, is completely foundational to how our democracy works. But communicating on the internet isn't always the same as communicating in person, or in print, or over the radio, or on television. Anyone can swiftly respond to what someone else says; in fact, thousands of individuals can respond at once. And they can be mean and cutting and bigoted, and they can chase people out of the conversation altogether, sometimes forever.

Then, on the flip side, communities that form online to voice dissent or support can also be a powerful force for good, as we've seen. A petition can collect thousands of signatures and call attention to an otherwise obscure political problem. A tweet or a Facebook post about a local injustice may go viral and intersect with millions of people from far-flung corners of the globe, which can awaken true concern and empathy and increase our understanding and awareness of the need for political reform elsewhere.

But being able to say whatever you want online also means being able to say things that aren't true, and to say those things at scale. That means fake news can go viral and confuse people who are honestly trying to get a grip on what's happening in the world and in their communities. Tens of thousands of bots can give the false impression of a groundswell of grassroots support. It means accounts can pretend to be American activist organizations, puppeted by foreign agents intending to sway our election and rile opposing sides on the brink of America's
polarized politics.

Then there's the problem that some people use the internet to organize in ways that are harmful and even hateful. White nationalists and neo-Nazis organized the Unite the Right rally in August primarily online. The event was promoted and discussed on message boards. Like-minded supporters were members of Facebook groups. Key figures in the hate-filled movement were, until recently, verified on Twitter, a status that promoted their views on the platform and that's typically reserved for people who have some level of fame. But these people, like Richard Spencer or Jason Kessler, for example, were famous because of their racism.

Twitter and Facebook and YouTube, and YouTube has really long been a favorite platform of bigoted video bloggers, are all private companies. They have the right to kick people off and remove content and really set their own rules, and they do. That means if Twitter had had a strongly enforced rule against racism and hate speech and revenge porn from the start, fewer people might have been harassed off the platform. And if YouTube had a rule banning hatred against religious groups that was actually enforced, perhaps there'd be far fewer easy-to-find places where people can become indoctrinated in hate. If Facebook had reckoned with the fact that it was very obviously becoming a serious, essential source of news and information for people looking to stay informed and participate meaningfully in their political lives, then maybe the fake news problem would never have gotten this bad.

It was only as these platforms and internet companies matured that they grew into some of the most powerful companies in the world. They did so largely without much enforcement of their internal rules, and certainly without many external regulations, which they worked very hard to avoid. And instead of just becoming another destination online, these platforms kind of became the internet entirely. The walled gardens subsumed the commons, and now it feels
like we really don't know how to escape.

To be sure, the platforms didn't get this big just organically. They didn't simply grow wildly with no one watching. There were some laws that helped. The Communications Decency Act, which was critically important in fostering the innovation that led the internet to become what it is today, also removed most all liability, or legal responsibility, that internet companies had for what users said on their platforms. And lawmakers, as well as digital rights advocates concerned with internet surveillance, were for good reason often focused more on government surveillance reform than on regulating corporate data collection for years. And though we find ourselves concerned with automated bots now, for many years internet users just kind of accepted the noxious accounts as trash background noise that we could tolerate, like trash on the sidewalk. And companies used automation to scale as well. It was Google's automation, after all, that suggested that journalists investigating its tools buy ads about how Jewish and black people ruin neighborhoods.

And now, when private companies do start to take action to expel hate from their platforms, there really aren't many other places for people who have those hateful views to go. And so those people who peddle hate speech can say that these companies don't respect their free speech, and they claim that they're being silenced, and maybe they are being silenced. Whatever the answer to fixing the mess, it's unlikely to be a simple one. And it's not entirely clear that the U.S. has the regulatory scaffolding to properly deal with these internet platforms. Their self-regulation seems to not be working so well, at least at the moment.

And so I'll end by saying that I'm really beyond honored to have the chance to open this conversation today. I've spent basically my whole life thinking about the centrality of technology and the internet to all
processes of social justice and political change. And it really means the world to me to join these scholars and journalists and really some incredible elected officials to discuss what our options are in terms of regulation and reform, as well as innovative approaches to how we can move forward. Thank you so much.

So I'd now like to welcome the moderator of the next two panels, Cecilia Kang, who is the national technology correspondent for the New York Times. Before we get started, I wanted to give some opening remarks about Senator Amy Klobuchar. Many things can be said about her; among them, she is the first woman to be elected to represent Minnesota in the United States Senate. She's the chair of the Senate Democratic Steering Committee and ranking member of the Rules Committee. She's also a co-sponsor of the Honest Ads Act, a bill that she will be telling us more about today. Senator Klobuchar, welcome.

Well, thank you so much, everyone. It's great to be here, and thanks, Cecilia. We have a lot going on in Minnesota right now: like 5,000 people stood out last night for a Prince tribute concert in five-degree weather. That is a true fact. But there is a lot going on on the Hill, as you know, and I thought it'd be great to take a break to talk about something that's going to affect everyone's lives, and that is the state of our democracy if we don't do anything about the change in the way campaigns are spending money. And I could go on about Citizens United and all of that, but I'm going to focus today on what happened in the last election, and what will happen in the next election if we don't do anything about it.
So my interest in free and fair elections comes from a few things. One, my dad was a reporter and a columnist for the Minneapolis paper when I was growing up, and he literally went, because of journalism, from a hardscrabble mining town to interviewing everyone from Ginger Rogers to Ronald Reagan. And he actually was the one who called the 1960 election for John F. Kennedy when he was working for the AP, because he knew how northern Minnesota would vote. And the guy in New York, when they called in the results from the AP in Minnesota, said two words to my dad: "Be right." And they were. And 13 minutes later, the New York Times called it based on Minnesota. So accuracy and honesty in journalism, and in what's out there, matter a lot to me.

I also was a prosecutor for eight years and saw how, if you don't have laws that are as sophisticated as the system you're dealing with, or the people who break them, especially in the white-collar area, then you lose your sense of justice in a system. And that was one of the principles I worked with all the time when I was there.

And the third thing is that I run for office, or I wouldn't be up here. But I ran for office coming from a position of not having a lot of money. In my big race for county attorney, which is like the DA, and then for the U.S. Senate, I was significantly outspent by my opponents. And so when you run and you literally have no one that you know in Washington, D.C. that you can call to raise money, and no one returns your calls because they can't pronounce your name, you end up, and this is a true fact, I got out my old Rolodex and once raised seventeen thousand dollars from ex-boyfriends. That's true. As my husband pointed out, it is not an expanding base. But when you are in that position, you never forget it, right?
And you think very much about what it means to have a democracy where people can unfairly influence it, either because of dark money, because of big money, because of unfairness you wouldn't have had a chance to get into it to begin with. I was fortunate enough to get my start in Minnesota, where Paul Wellstone was the senator and you had people running with small donations, but also a system set up on the state basis that made it much more fair for how you could run.

So we get to where we are right now, which is what's happening with the elections, and the first thing we need to do. Amid everything swirling around us, including the decision by the administration not to enact the sanctions that passed the Senate 98 to 2, you've got election infrastructure bills out there. Senator Lankford and I have one that we think would be really, really helpful: 400 million dollars to the states for upgrading their election equipment, and we have a pay-for from unused grant money. And you've got, as you know, 21 states that were attempted to be hacked into in the last election, including access to Illinois voter data. We've got what was going on with Russia's troll factory. We've got $100,000, documented by Facebook, spent in rubles in the 2016 election. And so I believe strongly, with 280 days to go before the next election, that we can't just admire this problem anymore; we have to take action. The first way, as I said, is to strengthen the infrastructure going forward.

But the second way is to look at what's going to happen with paid political ads. So 1.4 billion dollars was spent on online ads in 2016. 1.4 billion dollars, and guess what: no requirements for disclaimers, no requirements for disclosures. And you've got print, radio, and TV stations all over the country, not just networks but little TV stations in Mankato, Minnesota, that are required to have disclaimers on the ads when they're submitted by candidates, and on issue ads of national legislative importance.
That's the statutory definition, okay? They do that. It's not always easy, but they always do it. But you have a situation where major media companies aren't required to do that. Some of them, which I appreciate, have voluntarily said they will do it, but you don't have any kind of consistency. And you certainly don't have anything going on with the issue ads, which were something like 90 percent of what the Russians were purchasing. And so the answer we've gotten back is, oh, well, it'd be really hard to figure that out. And I think of how huge these companies are, and then I compare them to the Willmar, Minnesota radio station I was on at 8 a.m. this morning. They have to follow those rules. They have to keep ads on file, whatever they're about, whether they're about reproductive rights or about energy. They've got to keep those ads on file, and they have to have disclaimers on them, right? So there's no reason you can't apply that in the national context with these companies like Facebook and Twitter and Google.

We've had major hearings about this; you've probably heard about them. But we came to the conclusion that either the FEC has to act, which would be helpful, or we should pass something just like the law we have in place for other media. And that's why I introduced the bill as the lead, with Senator McCain as my lead Republican and Senator Warner as the other co-lead, called the Honest Ads Act. It has a nice name to it. So that's what we're trying to get done, in addition to this election infrastructure, and everything people are focused on with figuring out what happened in 2016, and with the Mueller investigation and everything else. That's really important. But I'm telling you, we are just, what did I just say, 280 days away? And we cannot just sit there and not do anything about it and let it happen to us again, because Russia and other countries are emboldened when they know we do nothing in the face of clear evidence.
Thank you.

And I noticed you're wearing purple. Is that an ode to Prince? Oh, that's right. But I will say that we had a really resounding defeat to Philadelphia, and now, in very Minnesota fashion, we have to be the host to all their fans who were so mean to us. But we have dog sledding, we have a ski slope in the middle of our city, and we have an ice fishing hole that's been set up on the roof, where people are going, and it is five degrees, and we're really excited about it. So that's what's happening.

Thank you so much, Senator. I want to also introduce our other speaker for this panel, Dan Gillmor. He is the director and co-founder of the News Co/Lab at Arizona State University. He's also a professor of practice at the Walter Cronkite School of Journalism and Mass Communication at Arizona State University. He's the author of Mediactive, is that how you pronounce it?, and We the Media: Grassroots Journalism by the People, for the People. And way, way long ago, Dan and I worked together at the San Jose Mercury News, so it's nice to see him in Washington.

So I guess I'll take a seat. It's an honor to be here. Thank you for having me. I live in the Bay Area most of the time, a place where winter is optional, which is a lot nicer than some places, I think. But being from a place where winter was not optional, I think I prefer the current one. I wanted to first of all say that I think requiring transparency in political ads is a great idea. I also can't wait to see the legislative process: how this will be done, who will enforce it, what the unintended consequences are. I expect that hearings, you still do hearings occasionally, right? We've had major hearings on this. It's just hard, when a lot of the companies don't want this to pass, to get it done. That is our problem. Yes, very direct.
I'm hoping that we end up with something that will prove to be a useful tool for the people who cast the ballots, the people who make the decisions. And I wanted to use that as a way into what I wanted to talk about briefly today, and that is: you're going to hear a ton of stuff about the supply of information online, and God knows we need to improve the supply of information that we have. What I want to focus on here, and maybe I'm the only one who will be doing that, is demand. I don't think we spend enough time in our world, in our culture, on the demand side of information, and this is, to me, a very crucial and important part. While we do need to upgrade supply, on the demand side we have this requirement, really, because we need to be operating on a common set of facts that are based in reality. Not alternative facts, but actual ones. And if we don't do that, if we don't have that process, and if it doesn't at some level start with the people who need to have these facts, we may never get a solution to these issues. We have to upgrade ourselves. This is not just about upgrading journalism or making companies do better with processing information; we have to do this for ourselves. And we have to help people find and understand and act on and create useful information and news, and share it with integrity, and do all these things. Again,
these are demand issues.

What I'm talking about goes by lots of names, including media literacy, news literacy, and other things. But really, it's rooted in something very old-fashioned called civics, and crucially, in critical thinking, which we don't do enough of in our society. These are skills that we have to embed in everybody, starting at a young age, and then reinforce throughout our lives. And we have to do it ourselves, and we need a lot of help to do it.

The way we can do it best, I believe, and the work I've focused on, is to try and make this scale. And I use that in the way tech people use "scale," which is to make it big, everywhere, and do it in ways that leverage institutions throughout our society. I think there are three possible ways. One is education and community systems like libraries. There are only a few states that have even begun to make this a priority, not many, and the federal government has done practically zip on this, which is unfortunate. The second way to make it scale is the news media themselves, who have not done it, but who can do it in lots of different ways, by being leaders in their communities in achieving it. And then third, the tech platforms define the word scale. We have to get them to help us do it. So far, none of these have done a big thing.

So the thing I'm working on that Cecilia mentioned is a lab that tries to do this with different people, to help them do more of it. I will bring my remarks to an end here, but I want to ask all of you to help us make these literacies, this critical thinking, scale in big ways, because I don't think we're going to make progress on the bigger issue if we don't make progress on this.

Maybe you can hear me, okay. I should note, Dan, you noted, I think earlier in a call, that your lab does receive funding from some companies, right? Do you want to talk about that for just one second? We have funding, our initial funding.
We got some from Facebook, which people who know me and my work will find surprising. We have some from the News Integrity Initiative at the City University of New York, from the Rita Allen Foundation, and from the Democracy Fund. And we're building pilot programs, one in some newsrooms of the McClatchy company in three cities, and others. We're trying to work with everybody who can play in this area.

Okay, great. So, Senator Klobuchar, I wanted to start off with you. The demand side is interesting. Definitely nobody would dispute, and I'm sure you would not dispute, that people need to be responsible for how they interpret what they see online, and that they have to teach their children, et cetera. But you were just saying earlier that the platforms themselves are lobbying against your bill; they're lobbying against the FEC taking action. What do you think is the responsibility of the platforms? They've agreed already to voluntarily disclose the actors behind their political ads. Is that enough? You're shaking your head.
No. Well, first of all, there's a lot of stuff going on that wouldn't be covered by just the paid political ads, you know: the bots, the trolls, the thousand trolls in a factory that are just getting stuff out. But the political ads are a piece of this, and they are important, not just because of Russia but also because of all of the shenanigans that go on in campaigns that you want to be able to police. And so I would hope, given all the money that these companies have made off of the internet, basically, and they are brilliant companies and we want them to exist, but I would hope that they would come to believe that a bill that simply puts them on the same footing as these other media companies is not that radical of a thought. And yes, they volunteered to do stuff, but it's all over the board: one is just doing candidates, one is doing something else, and then you have tons that aren't doing anything. If you look at our bill, we're just putting the same rules in place for everyone, which the FEC could also potentially do, though it would be better if it came from Congress. So, I like the idea of educating people.
I will say that in other countries, I spent last New Year's Eve with John McCain and Lindsey Graham in Ukraine, on the front line, in a blizzard, right across from the Russian troops, and we went to Lithuania, Latvia, and Estonia. When you are in those countries, you hear the stories of how their citizens have learned over time what this fake news is. They've seen this cyber war going on: when they try to move a statue in Estonia, they get their internet cut off; or they take in three members of parliament from Crimea who are in exile to a festival, and all their parliamentarians get hacked. So their citizens are more used to what this is. They actually see it in elections, and I think that's going to start happening in America, so that upgrading of our awareness is going to be important. But having rules in place matters because of the money coming in: we can't have free elections if we take this whole growing area of advertisement and basically exempt it from the rules that we apply to everyone else. It makes a mockery of our election laws.

Do you think, realistically, 280 days away, as you've mentioned, what are the chances of your bill at this date? What is the status? And what scares you the most about this next election, looking back at what happened in 2016?

Well, what scares me is more of the same, on steroids. They tried to hack into 21 states' election software and voter files, and if they accomplish it, and if we don't have backup paper ballots, which we don't have in 10 of the states, it's going to put our democracy on its head.
So that's what scares me the most. And the money, the spending, is just going to make for unfair results, and illegal results, which we should also be concerned about. So the chances of the bill passing this year are small, but the more you push on it, maybe it gets the FEC to act, maybe more people come on board. You can't just look at change and say, well, we don't have a chance of doing it this year because either Mitch McConnell doesn't want to or the companies don't want us to. You just have to keep pushing, because if we hadn't pushed, we wouldn't even have some of the voluntary actions that we're seeing now.

Dan, as a longtime journalist and now a professor of journalism, how are you viewing regulation and the discussion in Washington about these technology platforms? There's the Honest Ads Act that deals with political ads; there's some other legislation being considered that deals with Section 230, which is the safe harbor. But at the heart of this is competition, the issue that there are a few big platforms that have become essentially our media conduits.

We're certainly at a time when this concentration, the size of certain enterprises, is unprecedented, and it seems to me that competition law has not kept up with technology. I'm not smart enough to know exactly how to make it catch up, but I think we have to work really hard on that, because giant, fundamentally unaccountable enterprises have enormous power, not just in this country but in others. So that worries me a great deal.

I have two bills there on the antitrust subcommittee, which I'm the ranking member on, and that's a discussion for another day, but they would try to account for this change, looking at monopsonies and some of the types of monopolies that we're seeing now. Again, they won't pass this year, but we have to start pushing the envelope on this, because we're seeing a change in the way we do antitrust enforcement. And I was
interested that the administration actually came out against one of these deals, and so we'll see what happens going forward.

I'm getting the two-minute mark already. Wow. I think we want to open up to questions. So why don't you just say your name and identify maybe where you work.

Sure. My name is Maurice Turner, New America cybersecurity fellow. Thank you all for having this discussion; I really appreciate it. When it comes to elections, what's more important, or what should the priority be: focusing on actually increasing the cybersecurity at the state and local level, or focusing on auditing the results after they come in to make sure that they're accurate?

Well, I think you always want to prevent a break-in first, as opposed to going after it afterward, but I think both things are important. I was looking at a quote that I didn't use in my remarks, from 1923: Joseph Stalin. He was then the general secretary of the Soviet Communist Party, and he was asked about a vote in the central committee of his party. He said he was unconcerned about the vote. After all, he explained, who voted was completely unimportant; what was extraordinarily important was who would count the votes, and how. So I think it's ironic that 95 years later, we're looking at Russian involvement in two ways. One is they're trying to influence who votes, when you look at some of the suppression and the stuff that came out, the ads that basically told people, oh, you can text your vote.
We showed those in the committee hearing, right, a completely illegal act, telling people they can text their vote, that they don't have to actually go vote, to try to suppress the vote. And then secondly, who counts the vote, which gets to your question: you need secure systems, election infrastructure, that can count that vote. And as much as we've talked about these bills not passing, there are a number of Republicans, especially on that second piece, Lankford and others, Lindsey Graham, that want to move on funding election infrastructure.

I think, do we have time for one more question? No, we do not have time for one more question. So I'm going to thank our panelists for coming: Senator Amy Klobuchar, who is going to go back to the Hill and has a very long day today, because there's a little thing called the State of the Union address tonight, and Dan for coming. I'm on the escort committee. Who are you bringing? No, the escort committee means you go in and escort the president. Oh, that's right. What are you going to say to him? I think, I don't really, there may not be that much opportunity. If you could have, like, 15 seconds? Yes, I think I would say that we need to pass the DREAM Act. Yes, that's what I'll say. Okay, thank you so much. And thank you so much, Dan.

Okay, moving right along. I just want to make sure with Tonia that we have our next, okay, I'm assuming that our next guest is here. Oh, thumbs up, yes. I'd now like to welcome Representative Ted Lieu to the stage, who will give some brief opening remarks for our next discussion, which should follow and dovetail nicely from our last one, which felt a little bit rushed, but we did get some good content. Representative Lieu was elected in 2014 to represent California's 33rd Congressional District in the United States House of Representatives.
Since taking office, he has made a name for himself as a strong leader on issues around the environment, cybersecurity, and veterans. He's a member of both the House Judiciary and Foreign Affairs committees, so he can talk a lot about these issues, about foreign interference as well as competition policy. Welcome, Representative Lieu. And there he is.

Good afternoon. Sorry, I'm just finishing up the great lunch that they gave me here. I want to thank all of you for coming. I'm very excited to be on this panel, and I look forward to the discussion. I am a recovering computer science major, and so I've been watching the expansion of the internet, and issues related to technology and its intersection with speech, with great interest. I'm also a recovering lawyer, and I always remembered, from law school, being very concerned about the First Amendment, and that Justice Louis Brandeis wrote that the remedy to be applied to misleading speech or false speech and so on is more speech, not enforced silence. So the question is: does that somehow change because now we have an internet? And my view is no, I don't think it should change. I actually think it's the opposite. So I have introduced, along with Representatives Engel and Royce, the Cyber Diplomacy Act, which creates an office within the State Department to promote an open and secure internet in other countries, to have more speech happen in other countries so that the people know what is going on. And to me, I would love to be able to stop Russian propaganda and to stop bots and so on, but I don't really see how government actually could do that, and if we did, I think it'd be awfully dangerous.

So, something to think about is what the Russians did in the 2016 election.
It's not illegal to let people know your opinion. What was illegal is stealing emails or information from a campaign and then putting it out; you can't do that, there are laws against it. And if you conspire with the Russians, that's a violation of federal law as well. But right now, for example, there's nothing keeping the Kremlin from printing 10,000 flyers that say, let's say, "Donald Trump is the most awesome president ever," and handing them out to people on the street, without sourcing them, without saying who paid for them. They can do that. So why would it be any different if they can do that on Twitter or Facebook? It's just the scale that we're concerned about, because I'm not sure there's any principled distinction between what they could do right now with printed paper versus electronic media.

I think there's also a learning process. Many Americans, including me, were caught off guard in 2016 by what the Russians did. But now people know what they did, and I think people are a little more aware: okay, this tweet I'm reading, maybe it's a Russian bot, or maybe this is not really who the person says he or she is. There's more awareness, and there might be more people who look at Facebook posts and tweets with more skepticism. That's a learning process the American people have to go through, and I think it's good. I think it's also caused this interesting spike in interest in publications like The New York Times, The Washington Post, MSNBC, and so on, because people know: maybe I can trust this Twitter or Facebook post, maybe I can't, but I know that if I read something from The New York Times, it's going to have at least two sources for whatever they're saying, and it's going to be reviewed by an editor, by copy editors, by other people. There are a lot of eyes
that would have looked at it; there would have been some vetting. When they put it out, I have a general sense that it's likely to be true, and if not, they're going to correct it. I think there's now more interest in these organizations that have certain levels of integrity attached to them, and I think that's generally a good thing. So you're seeing a response from the American public to the internet, to what they've seen in terms of an explosion of content that people are trying to figure out. We're in a very interesting time. I still believe that Section 230 of the Communications Decency Act is vitally important. That's what I think allowed the internet to flourish: not letting all these internet companies be sued over the speech or other conduct of third parties. I think we need to continue to maintain that principle. But I am also happy to hear your thoughts; I could be moved on different issues. Those are my initial thoughts, and I really look forward to this panel. Thank you all for inviting me.

I'd now like to welcome our other panelists to the stage. Jennifer Daskal is an associate professor of law at the Washington College of Law at American University. Hi, Jennifer. Kate Klonick is the Future Tense fellow at New America. She is a Ph.D. candidate at Yale Law School and a resident fellow at the Information Society Project at Yale Law School. Thank you. Thanks for joining us, folks.

Congressman, maybe we can start with you. You made some remarks about the importance of Section 230, for example. Can you talk a little about some legislation that is actually up for review right now: the Honest Ads Act that's in the Senate for consideration, your view on political ad disclosures, as well as the anti-sex-trafficking bills that could affect Section 230 to some degree? Sure.
Oh, that's pretty loud. Okay. So I support the Honest Ads legislation. It follows the same principle of Justice Brandeis: we should counter misleading and false speech with more speech. It would require disclosure of political ads. When you watch a federal TV ad, for example, at some point in the ad the candidate is required to say, "This ad was paid for by Joe Smith," or Mary Roberts, and so on. You don't see that with internet ads. You could have an internet ad on Facebook or Twitter or another platform, and the public has no idea who paid for it. So I think it would be helpful if we provided that disclosure, and in my opinion that doesn't violate Section 230 in any way.

Online sex trafficking is a little more difficult for me. I've always been a very strong supporter of making sure we reduce sex trafficking. If you just do an internet search on this, it is a massive problem, and you might think, oh, it happens in other countries; it's happening right here in Washington, D.C., and in Los Angeles, all over America. So I have a very strong motivation to do anything I can to mitigate it. But it does run into free speech problems; there's a tension there. I don't know that I've thought through exactly how I would vote on those bills, but it's an interesting issue.

Jennifer, at the same time that there's discussion about free speech in the U.S., there has been much more movement globally for nations to control data locally. Can you talk a little about the trends you're seeing and how that affects free speech globally? Sure, thank you. I think we're at an interesting moment in time, where you see nation-states all around the world taking steps to try to limit content online in a variety of different ways. Just some examples.
Germany has a new hate speech law that went into effect in October. It's pretty strong: if companies fail to take down speech that violates the law within 24 hours, they're subject to pretty hefty fines. Another example is the right to be forgotten. It was approved by the European Court of Justice in a case several years ago and is now codified in a European regulation known as the General Data Protection Regulation, which goes into effect in May of this year. It basically requires search engines, primarily Google, since it's the search engine of choice in most of the EU, to take down embarrassing or irrelevant information if requested to do so by the data subject, even if that information is accurate and even in the absence of a finding of prejudice.

One of the really important and interesting questions about these pieces of legislation is territorial reach. Obviously one of the greatest benefits of the internet is that it's open and interconnected, but that also raises real conflicts when you have the European Union, for example, saying the right to be forgotten is a fundamental privacy right. Then there's a question as to how far it extends. In that case in particular, Google responded by initially taking the allegedly infringing material down only from the European domains: google.fr, which you go to automatically if you're in France, google.es if you're in Spain. Various members of the EU said that didn't go far enough. The French data protection agency ultimately sued Google and is now demanding that Google take the allegedly infringing material off all of its sites, basically trying to export this right around the world, saying that from their perspective, if there's a fundamental privacy right,
it's not actually adequately protected if somebody can access the material from the U.S. or another state. Whereas Google said: we will respect your view of the appropriate balance between privacy and free speech in Europe, but you can't export that around the world. That case is now back before the European Court of Justice, and I think it will have really important and interesting implications for speech and privacy going forward.

Just to spool out the implications, Kate: can a company like Google or Facebook actually implement GDPR requirements just for a certain segment of the global population? Will it spill over to all users? In other words, is there a way of isolating certain laws to a nation-state? Yeah, these companies have actually been doing that for a long time. While they have global speech policies for what they allow on their sites, and they enforce those policies, they do have to be in compliance with the states in which they're operating, and one of the ways they do that is through IP addresses. They will effectively block or prevent you: for example, in Germany or France you can't access any sale of Nazi memorabilia on eBay, but you can in the U.S. That's all done through IP blocking and IP addresses. So that's absolutely something that's possible.

And Jennifer brings up a tremendously important point that I don't think people realize: there's a battle happening right now between nation-states and these giant transnational companies that have organized the internet for everyone, not just for the U.S. And there are questions of democratic accountability, because they are basically speech engines, and we have changed our norms around what we expect to be able to say, who we expect to be able to say it to, and how far we think we can project and publish it.
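The mechanism Klonick describes, serving or withholding the same content depending on the country inferred from the requester's IP address, can be sketched in a few lines. This is a minimal illustration only: the IP prefixes, country rules, and function names are hypothetical stand-ins, not any platform's actual policy engine (real systems use commercial GeoIP databases rather than hand-written prefix tables).

```python
# Hypothetical table mapping IP prefixes to countries. Real systems
# resolve location with GeoIP databases, not prefix matching.
IP_PREFIX_TO_COUNTRY = {
    "81.": "FR",
    "91.": "DE",
    "152.": "US",
}

# Hypothetical per-country rules: listing categories that must be hidden.
BLOCKED_CATEGORIES = {
    "FR": {"nazi_memorabilia"},
    "DE": {"nazi_memorabilia"},
    "US": set(),
}

def country_for_ip(ip: str) -> str:
    """Guess the requester's country from the IP address."""
    for prefix, country in IP_PREFIX_TO_COUNTRY.items():
        if ip.startswith(prefix):
            return country
    return "US"  # default market in this sketch

def is_visible(listing_category: str, requester_ip: str) -> bool:
    """Apply the rules of whichever country the request comes from."""
    country = country_for_ip(requester_ip)
    return listing_category not in BLOCKED_CATEGORIES.get(country, set())

# The same listing gets different answers by geography:
print(is_visible("nazi_memorabilia", "91.23.4.5"))   # German IP: hidden
print(is_visible("nazi_memorabilia", "152.3.4.5"))   # US IP: shown
```

The point of the sketch is that the listing itself never changes; only the per-jurisdiction visibility rule applied at request time does, which is why the eBay example can differ between Germany and the U.S. while the underlying catalog is global.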
One of the interesting things happening out of the right-to-be-forgotten litigation is that you basically have the EU dictating new privacy terms and new speech norms for the entire world, and I don't know that that's any better than Google or Facebook doing it. That said, you do see the platforms exercising an enormous amount of power over speech globally. I was trying to count the different blog posts from Facebook on their changes to the News Feed; I think there have been at least three in the last three weeks. It's hard to keep track. They're trying to get people to share more from their communities; they're trying to promote more local news. All of this is behind the opaque curtain of their algorithms.

Congressman, I'd love to hear from you. We've talked about nation-states and their local policies spilling out to the world; in some ways it feels like the platforms themselves are operating like nation-states. What is your view on competition right now in the tech industry and among these platforms? That's a great question, and I think it's helpful to go back again to the 2016 election. The Kremlin used Facebook in a very effective way, as did the Trump campaign. What they would do is launch all these messages to different people. Some would be normal messages, some might be crazy, some might be really over the edge. Then they would see the response, and they could figure out, okay, these certain messages resonated in this particular place, say, Wisconsin, and then they would go all in and start talking about that issue. They used Facebook as a tool to do that. Democrats have now figured out what they did, and now we're going to do that too. So there was a response.
People learn; they see what happened. And to me that's really the only way we can do this, instead of trying to regulate Facebook or have government mandate that they do X, Y, or Z. I don't even know how you would begin to do that. So my view is that we should have a very light touch when it comes to any sort of regulation of tech companies, especially in this area of free speech, although I do support disclosure of paid political advertising. Mm-hmm.

Yeah, I enjoyed the congressman's anecdote earlier about the power of the Kremlin: nothing is stopping the Kremlin from printing a bunch of flyers and dropping them out of a plane over a U.S. city, except maybe littering laws, to say whatever they want on an issue. That kind of thing could always happen. I think there is a lot to be done by these platforms in terms of recognizing malicious or bad-intentioned material, what we would term fake news. And I think a lot of that, and a lot of people disagree with me about this, you can learn from the behavior of how these things are posted, the behavior of the bots or the users posting them. It's very, very difficult to regulate this, because there isn't the kind of friction we're used to in the real world, where you have to go to a radio station and place an ad and have a real name and a real organization and a 501(c)(3) and all of these various things. That just doesn't exist in online speech, and I don't know that it should; you'd block a lot of people from speaking if you created those types of regulations.

Yesterday there was an interesting, frankly kind of weird, report that came out about the White House considering a 5G network that would be government-run, and I thought about you,
Jennifer, in preparation for the panel, because I'm sure you see practices like that all over the world. The argument my colleagues and I have heard internally within the administration is that they're scared to death that China is going to beat us on AI, and that having the best networks, government-controlled as in China, will enable that for China. I'd love to hear your thoughts not just on competition, economic growth, and security, but also on the speech implications of having a government-run network.

So, one, I just think it's kind of ironic, if you think about the party in control, that it's the one calling for the nationalization of our infrastructure. But I think it's a potentially quite dangerous development and something that ought to be resisted for a whole host of reasons. The most obvious one is the speech issue, and a real concern there. Your question to the representative before, about whether these companies are basically serving as governments: I think there's a lot of truth to that analogy, in that in a lot of ways these major tech companies are sometimes more powerful than governments, because they can mediate disputes across state lines and territorial lines in ways that governments can't or don't. But I'm not sure the answer to that is to take over elements like the 5G network. There have got to be new mechanisms of accountability and new mechanisms of control, but I don't think this particular one is the right one.

And Congressman, if regulation may not be the best approach, what is the best approach to protecting individuals?
When it comes to that, I think education is very helpful. I also think that over time, people are adjusting to an explosion of information that they're getting in a way they never had before, on social media and the internet. At some point people are going to say: there's all this information out there, but there's only a certain number of sources I trust. And I think people gravitate toward those, whatever they may be. So I actually think this might hopefully start dying down. If you're on Twitter and you see these random people saying things, whether they're bots or not, you just think, okay, random people saying things. But then you see a tweet that links, for example, to a New York Times article, and what makes that tweet more trustworthy is that it's actually linking to a New York Times article that you then read. I think people will start to filter and figure out that they're going to go to certain sources where they know there's some level of credibility attached to the information they're getting.

I certainly hope so. I'd love to hear your reaction to that, Kate. Do you think that people have this, I don't want to say self-educating capacity, but are they at the point where they're able to easily identify what's verifiable and what's not, what's potentially more trustworthy information and what isn't?
Yeah, I think this is something that has happened over and over again. If you actually look at history, there have always been questions of fake news, and sometimes it was in the guise of journalism: yellow journalism, for example. There have been pressures that changed where people get their news, and I completely agree with the notion that there has been an awareness, a turn toward direct sourcing, a kind of public education and a shift in norms in the last couple of months. So I do see that happening. We've also recently seen a number of studies showing that, maybe luckily, fake news did not have the impact people initially thought it had. But I do think that the fear that was generated, the emotional response we had at the outset to the fear of having been duped or tampered with, was maybe a useful one, because we're coming out of it with a new notion of how we look at news and at the information we read online.

How are we on time? Representative, you're pretty active on Twitter. I'd like to hear your thoughts on how they're handling fake accounts; there was a big, pretty astonishing story recently in The New York Times on fake accounts and what they mean for what's real and what's not on the internet, as well as their handling of the president's account. Right. So on the president's account, I actually support allowing him to do what he does. I don't think it's Twitter or the platform that people find troubling. I think what they find troubling is what's going on in his mind, and you see that reflected in the tweets. We're looking at what the president is thinking, and for two-thirds of Americans,
it's deeply alarming. I think it's important for us to keep getting that, if the president so chooses to keep letting us see how he thinks. Twitter does do this thing where they verify individuals with a little check mark, and unless Twitter totally messes up, you'll know, one, that it's a real person, and two, that it's the person they purport to be. I think that's been helpful. I'll see a series of tweets by people who don't have check marks; it doesn't mean I discount them. But if I see someone with a check mark, at least I'll know this person is no longer anonymous and is who they purport to be, and there's again a level of credibility attached to the source of that tweet, whether or not you agree with it.

I fully agree with the point about the president's tweets, and I think that in all of this space there are real concerns, but we need to be careful about the rush to deal with them through regulation. There was a very interesting exchange in a hearing about two months ago about dealing with fake news, where a number of questions were posed to the general counsels of major tech companies about why they were allowing people to buy ads in rubles: can't you just ban buying ads in rubles?
And the response was: well, we could, but that wouldn't be particularly effective, because then they could just buy the ads in dollars or some other currency. But underlying the question was this assumption that we don't want Russians to buy ads. And I think that's something to be wary of: the idea that we're not just going to have debates about what is and isn't accepted speech, but about who is and isn't an accepted speaker, based on where they're from rather than what they're saying. And so here it comes abroad. Exactly. So there are obviously real, legitimate, really important concerns here, but we need to be very careful about how we handle them.

Yeah, that's a great point. It also made me think about an interesting conversation I had with a person at Google whose job was to deal with ad policy. There are technical issues related to this. At The New York Times, before they put an ad out, someone looks at it and at some level approves it. But with these social media platforms there are so many ads that there's no way a human being could look at all of them. So let's say government came in and said, okay, we're going to ban white power ads, ads related to white power, because we don't like that. What if someone's selling a white power outlet? Right. I mean, there are technical issues.
How do you figure out that this ad is about a white power T-shirt versus an outlet that happens to be white? There are some very interesting technical issues with how you could even regulate these ads, because there aren't enough people to look at every single one before it goes up. And all these companies have said they're going to hire more individuals. I think that's a real acknowledgment that tech can't fix everything, that there has to be human involvement.

We're going to open up the floor to questions; just raise your hand. The tan sweater back there. If you could say who you are and where you go to school or where you work. Hi, my name is Ashita. I'm a fellow here at the Blockchain Trust Accelerator at New America. My question is for Kate. It was a great discussion, thank you so much. One point that stuck out to me was when you said you weren't sure why the EU making these laws to protect privacy was any better than Google or Facebook doing it. But isn't that their responsibility, when they're elected by the people of their countries to do that? Could you just clarify that? Yeah, but I didn't elect anyone in the EU. I'm sorry? I didn't elect anyone in the EU. I didn't vote for a single person in the EU. And this was my point, and I'm glad you asked. My point is that there's democratic accountability in governments; that's one of the great things about governments. But I don't have any kind of democratic say in the EU, yet the EU makes policy that it decides to enforce on everyone in the world. Does that make sense?
That was what I was referring to. And I do think that, whether it's through pocketbooks, or through the market, or through enforcing social norms, or whatever it is, I have more democratic accountability, or more paths to accountability, at some of these companies than I certainly do in the European Union as an American citizen sitting in Washington, D.C.

Hi, Ed Black with CCIA. Maybe following on what was said, I'd throw out the phrase "outsourcing censorship." There's so much data on the internet, and people from many, many points of view think something is evil or bad and shouldn't be said. So there are a lot of people who want to control the communication of information they don't like. What I think we're seeing is governments wanting to do that, for some good reasons, and in authoritarian regimes for some bad reasons. But in order to avoid the kind of accountability you talk about, they want the companies to do it: we'll take away liability protections; companies, you do it, you become responsible, you make people unhappy, you develop the censorship tools, and we're going to force you to do it. And if we don't like the way you do it, we can sanction you. But we're actually relieving ourselves of direct responsibility for enforcing censorship. I think that's the phenomenon we're seeing around the world: governments really don't want to be responsible for it, and yet they want to enforce censorship through intermediaries.

Ed, is there a question in there? Well, that was rhetorical for me to ask, because I should note, Ed, you represent the industry, so you should say that you represent these companies. Oh, absolutely. Okay.
So, next. Thank you. My name is Soraya Chemaly. I write extensively about these issues, and I'm also a civil society advocate with the Women's Media Center. I'm wondering if any of the panelists would care to comment on algorithmic accountability as it pertains to any of these issues, whether it's fake news, the regulation of speech transnationally, or ad placements. Underlying everything, I think, is the question of how algorithmic accountability, or the lack of it, contributes to profitability. Maybe regulation through the government is not the best way to deal with that, but I'm wondering what your thoughts are about the best ways to deal with it.

Thanks for that question. Many of these platforms, perhaps all of them, are driven by profit; they're for-profit companies. I can have a super-duper amazing post on Facebook, and only a limited number of people will see it; but if I put money in, I can jack it up, and a lot more people will see it. That's not, to me, any different from rich people and organizations placing TV ads now: they can do it, and poor folks can't. A lot of it is that the more money you put into a particular ad on a social media platform, the more people will in fact see it. My sense is that these algorithms are very much driven by profit. It's not clear to me how they work when there is no money being put in; it's a black box to me. I'd personally like to know more about those algorithms. But I do think profit makes a big difference, and I don't think that's really any different from any other platform we have now.

Thanks. Hi, and thank you all for a great panel. Kevin Bankston, and I work here at New America's Open Technology Institute. I perceive a couple of tensions in the current debate about content online, and I'm wondering if y'all perceive them as well,
and if so, how to resolve them. One is that at the same time we're seeing, I think, legitimate concern about these platforms being gatekeepers of speech online, having a great deal of power over public discourse in a way that's replacing the public square, we're also seeing pressure, often from the same people raising that concern, for these platforms to exercise that unaccountable power even more and take down more content, which seems to be a contradiction. Meanwhile, we're seeing many people concerned about the concentration of economic power in these platforms, saying that they are essentially monopolies, while at the same time pushing these companies to hire literally tens of thousands of human beings to do even more manual content review, in a way that no emerging competitor could ever match, such that we're basically locking them in as our platform speech masters. Do you agree that there's a tension there? And if so, do you have any great ideas on how to resolve it in the political discourse over this very difficult topic?

Yes. No. Yeah, it's like, thank you for the issue-spotter for my first internet law final. I think that's a nice summary of all of the things that are going on: the collateral censorship that was mentioned before; the fact that algorithms are not transparent; the fact that maybe they're not playing as big a role as humans in the actual taking down of content, but they are playing a big role in what we see; and figuring out where that tension lies. I think these are all exactly right.
And we have put a lot of pressure on tech companies to resolve this, and they don't have the democratic accountability I mentioned. Well, it's indirect: all of the ways I mentioned before are really market forces. We don't vote for these people; we didn't decide to make them what they are; and, as you said, we're about to lock them in to being these types of companies. So I think you've hit on all of the tensions that are going to keep unfolding over the next couple of years. Although, hiring a couple thousand people is not much for companies with market-cap valuations of crazy amounts; that's my little editorial aside. Thank you so much, Congressman Ted Lieu, Jennifer Daskal from the Washington College of Law, and Kate Klonick from New America and Yale Law School. Thank you for this great discussion.

Is my microphone on? Yes, okay. So our last panel is a great group of individuals who have not only done a lot of critical thinking about the platforms we've been talking about throughout the afternoon, like Facebook and Twitter, and hopefully we'll also be talking a bit about YouTube and Google, or Alphabet, but a few have even worked on these products. With me I have Andrew McLaughlin, not directly next to me. He's the co-founder and partner of Higher Ground Labs; he's also the executive director of the Center for Innovative Thinking at Yale and a Future Tense fellow at New America. A mouthful. I also have Caroline Sinders. She's a product analyst at the Wikimedia Foundation; she's been thinking about bots and automation and online speech for quite some time.
And Whitney Phillips will be joining us here in a moment. She's an assistant professor of literary studies and writing at Mercer University, the author of the book This Is Why We Can't Have Nice Things, and the co-author of The Ambivalent Internet, which should be out soon. And we have Dipayan Ghosh, who is a public interest technology fellow at New America.

So we will get started. The theme of this particular panel is: what can these platforms do to fix all of the problems that we've seen? If I were to summarize this year a little, I'd say it's been a kind of great social media cleanup. Since the 2016 election there's been a scurry of activity among these platforms and politicians and users to figure out what exactly we can do with these tools we depend on to get our information, to communicate with our loved ones and our community members, and to discuss the news healthily, as well as what these platforms can do, and what the government can do, to make these platforms do the right thing. Because they've been trying to clean up their own mess somewhat unsuccessfully, or they've been experimenting with it, we could say, and it's been a bit of a rush.

So I want to start with the idea that these platforms are just incredibly big. These are some of the biggest companies in the world, worth hundreds of billions of dollars, and I'm wondering if they might actually be too big to fix on their own, because they've certainly been trying to implement a lot of fixes. I'm going to start with Andrew, who's been thinking quite a bit about more local solutions, or smaller ideas on this. I wonder what you think about the question of scale. Are they too big to regulate? Is that a fair question?
Well, I don't think any company is too big to regulate. I think that a properly functioning democratic society, with rule of law and so forth, ought to be able to set the rules of the game for the companies that operate within it. The will of the people, expressed democratically, should be more powerful than any company. The problem, of course, is that it's a complicated world. The companies operate internationally; there are many different governments, and kinds of governments, that have equities in their operation; they increasingly have people in more and more jurisdictions. So as a practical matter it can become awfully difficult for any one government to express its will over the company as a whole. But at least as Americans sitting here, and as American companies, we ought to have some degree of confidence.

What I think is interesting is that they are indeed super huge, and it's an old, tired story that technical innovation leaps ahead of the ability of governments to regulate. That's true in many different sectors, and it's been especially true with the internet. Just to be very crisp about what I think is the new thing that has been causing us big headaches in the way people experience the internet in the last decade: it's the social graph, and the power of the large operators of social graphs to direct attention through a variety of signals baked together into something like the Facebook news feed. That's new. Fifteen years ago, if you were an avid content consumer on the internet, you would probably have used an RSS reader like Google Reader. Posts would have gone up on publications and blogs.
You would have consumed them in some order; you would have controlled whom you followed. Twitter, by the way, continues to operate largely like that, although they've been trying to move into a more curated sort of experience; largely it's relentlessly reverse-chronological tweets from people you follow. But if you flip over to something like Facebook, which has become the dominant director of attention around the world, it operates in a different way. The Facebook news feed is a challenge. It's a great benefit, because it allows you to have lots of friends and see an amalgam of things your friends are posting in one experience. But the thing that's weird about it, or challenging about it, is that your friends, if you're an average Facebook user, are posting way too many things for you to ever see in a day. The way to see this, by the way, is that Facebook actually gives you the ability to flip on a reverse-chronological view of your feed. You'll find that it's very different from the one you're used to seeing: there are many more posts that you're never seeing, or from people you don't ever think you see things from. The reason is that, just to take round numbers, let's say I've got 500 friends on Facebook. That means there are probably roughly a thousand things I could see a day, and even if I'm a heavy scroller, I probably won't see more than 50. So it's that action of going from a huge universe of things I could see, just from my social graph, to the few things that I will see: that's where the power of the company is expressing itself. And the reason it's interesting from a legal perspective is that we've basically been operating on the basis of a binary in terms of our regulatory perspective, which is: either you're a neutral pipe, and over here we leave you be, we don't hold you liable, because, for the same reason that if two people use the phone lines to plan a
crime, you don't indict the phone company, we've said that if you're just neutrally carrying content, then we don't hold you liable. Or you're an editorial publisher, in which case we hold you responsible as though you were the author of everything you publish, and if you look at libel and defamation law, that's the way we've always treated it. The thing is, the news feed is right in the middle. It's a chooser, but it's not a writer. They're not editing the posts; they're selecting them. So, to me, what's really interesting when you think about the scale of these companies, to go back to where you started, is that the news feed is just an effort to come up with a way to give you a digestible, high-quality experience despite the massive scale of the platform.

Just to end on an editorial note about where I think the really interesting work is going to be done: quality of execution on enforcing content policies is important, and every one of the platforms has tried to step up. Mark Zuckerberg has said they're doubling the number of people that review posts; they are tightening up their policies; they're rethinking how they define hate speech. I actually think, and I'm biased because I come from the industry and still feel myself allied with it, that they're responding impressively. Their response times are going down; their attention to the kinds of bullying and harassment and so forth has at least been getting better, maybe except for Twitter. But I think where the really interesting innovation is going to happen is on the product side. These companies have got to figure out how to deliver better experiences around journalism, better experiences around local news. Well, you see it in Facebook already: by having acquired Instagram,
they're doing a better photo experience as a standalone, aside from the core product. So that's where I think the companies maybe will get smaller, but they'll be trying, I hope, a lot of experiments on the product side to break down the massive agglomerated aggregation of product that is currently at the heart of what they offer.

A lot of the initiatives to hone the product have put the onus on users to do a bit of curation themselves, right? Asking people to rank the news sources that they want, or to report when there's a disputed story. They're actually asking people to do a bit of that work for them, in a way. Caroline, I'm curious: with Wikimedia, which is a platform that is constantly battling harassment and has a lot of active contributors, and also somehow manages to pull a lot of truth out in the midst of all of it, which is really impressive, a lot of the onus is on users there. What are your thoughts on Facebook doing that? It seems like they're very different products, very different outcomes.

For sure, and thank you for asking. I also want to build on your point about products. I am a design researcher who works on a product-driven team called the Anti-Harassment Tools team. Part of our initiative is to build different kinds of tools for our volunteers, our users. Our users are called editors. Wikipedia is, well, there are over 280 different-language wikis; we have a variety of large projects.
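Andrew's feed arithmetic earlier, roughly a thousand candidate posts a day from a 500-friend graph, of which a user sees about fifty, is the crux of his "chooser, not writer" point, and it can be sketched in a few lines of code. This is an illustration only: the numbers and the single `engagement_score` ranking signal are invented, and this is not Facebook's actual algorithm.

```python
# Two ways to surface the same pool of candidate posts: the old
# reverse-chronological model versus an opaque ranked feed.
# All numbers and the scoring signal are invented for illustration.
import random
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int           # seconds since some epoch
    engagement_score: float  # hypothetical ranking signal

def reverse_chronological(posts, limit=50):
    """The RSS-reader model: newest first, no reordering."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)[:limit]

def ranked_feed(posts, limit=50):
    """The news-feed model: a score decides the few things you see."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)[:limit]

# ~1,000 candidate posts a day from a 500-friend graph; 50 shown either way.
random.seed(0)
candidates = [
    Post(author=f"friend{i % 500}", timestamp=i,
         engagement_score=random.random())
    for i in range(1000)
]
chrono = reverse_chronological(candidates)
ranked = ranked_feed(candidates)
```

The "power of the company" in this framing lives entirely in `ranked_feed`: both functions choose 50 of 1,000, but only one choice is under the user's control.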
They're all part of the Wikimedia world. Wikimedia is a broader term referring to a wide variety of different products, from Wikidata to Wikimedia Commons, where you upload photos, to all of the different-language wikis, and it's all maintained by volunteers. One of the things I love about working at the Foundation is thinking about what user agency looks like when you're designing with your community and for your community. As a participant on Facebook, I'm not a volunteer, I'm not an editor: I'm a user. So I'm a consumer, but I don't have any buy-in or agency in how Facebook determines or decides to build the different products that are supposed to benefit me. I work on a team that falls under a specific part of the Wikimedia Foundation called Community Tech, where we build things that are actively requested by our community and for our community. We then also state publicly how long different projects will take to build, and these projects are voted on. We also engage in radical transparency, so you can follow along with all of the research that I do at the Foundation; everything we're building is open to public discourse and debate.

The reason I bring this up is that when we start to think about what it means to try to mitigate spaces of harassment, or even things like misinformation or fake news, which I like to refer to as emotional malware (I think "fake news" can be a bit disingenuous), there are a lot of different sides. If we look at what is considered news, Wikimedia has really amazing standards for that, because we're looking at what is considered a citation, and how things are actively cited when you build an article. That's a whole different kind of space, when you start to think about how you're looking at the verifiability of an article. That's a whole separate thing, right?
We have a very strict rule set, and then a community that self-checks and self-regulates based off that rule set. There are probably ways to disrupt that, but what we're dealing with is a community that has actively come together and agreed to enforce it, and then from there has spent over 15 years abiding by those standards. So we're dealing with a completely different kind of community, with different community norms than, say, Facebook's. But also, that buy-in is across the entire platform. It was designed to be this way; it's the ethos of the platform itself. So with Facebook: is newsworthiness, or I guess truthiness, a part of Facebook? Is that how Facebook was designed? What are the origins of Facebook? It was designed to be a space for college students to interact, and it slowly proliferated out into the general public, becoming a large geopolitical, geospatial, geolocational space.

Then, going back to online harassment: let's think about who's determining what the anti-harassment tools are, what the anti-harassment measures are. Who are those product designers? Can we request things from them? What do they look like? Do they reflect the people in this room? Do they reflect people of a certain age, demographic, location, race, or gender? Who are the designers and technologists determining what the different kinds of harassment are and how we solve them? The reason I bring this up is to correlate back to my work: people can intervene and talk about the things that I'm building. I don't get to decide what's built; I'm engaging in participatory design with an entire community to determine what the community's needs are. So when you start to think about how tools exist, where product design sits inside a space of misinformation or anti-harassment, or any kind of harassment mitigation,
I do think there is a space for product design. I think that will be the forefront of this. But we have to have more community buy-in on what those tools look like and who gets to make them.

Right. And I know Wikimedia is working really hard to increase the diversity of its editor community, as well as to decrease the harassment, so that more people will feel welcome being as participatory as you've described. Whitney, one thing I'm curious about with you: I know in your work you've dealt a lot with how communities are harassed off platforms, and how they then don't get to participate, don't get to be a part of the political conversation that is so vibrant and so vital to voting and to understanding what's going on, whether locally or nationally or around the world. I'm curious what your thoughts are on how platforms so far have dealt with such diverse communities coming into one place. People experience these platforms differently depending on where they're coming from; it seems like certain people get a better shot using these services than others, and only now are the platforms starting to think more about how different communities are affected.

Yeah. So one of the things about free speech, or discourses surrounding free speech, is that when people use that term they're not always, or even very often, referring to the constitutional sense; they're talking about it more colloquially. And when people talk about free speech, it often is this idea that everyone, or usually "I," should be allowed to say exactly what I want, and no one should tell me what to do. And that impulse to protect the loudest kinds of speech, to not censor, in other words, is baked into so many online spaces, and in some ways I think it can be connected to a sort of
early hacker ethic around "information wants to be free": this idea that we just cannot censor, we cannot take things away, things need to proliferate, that's what happens on the internet. And that kind of ethos, whether explicitly acknowledged or more implicit in how people communicate, creates this sense that free speech, protecting speech, means allowing for the worst kinds of speech. And while that makes sense on the surface, and sort of aligns with how a lot of people tend to think about free speech, it actually has a silencing mechanism that often goes unacknowledged or is underplayed, in the sense that if you have a community where the loudest, most harassing, most obnoxious, most bigoted, most dehumanizing people are allowed to have the floor, the result tends not to be more speech from more people. It tends to be less speech, from precisely the groups of people that you would want to have communicating on that platform. So by really vigorously defending this idea of "I dislike what you have to say, but I'll defend with my life your right to say it," as is often trotted out, again, while that makes a lot of sense and is a nice idea in theory, it tends not to actually support very diverse expression. And so when you have spaces like Reddit, which has probably been, over the last few years, the most rigorous in its interpretation of free speech as "we need to let everyone speak," that just does not create a space that's conducive to diverse speech: women's speech, the speech of people of color, queer people's speech, trans people's speech, all the groups that might be marginalized or subjected to a kind of symbolic or actual violence. That speech is not protected by this idea that we have to give the bigots the floor. What happens when you give bigots the floor is that nobody else is able to say
a single word. And that is not actually in line with a very robust understanding of what free speech is. So I think that when we talk about free speech, it's important to think about the difference between a more robust sense, which is about empowering the most diverse groups of people to express themselves as diversely as possible, versus the sense of free speech that says let's protect the neo-Nazis, because they need to have the floor. I don't think that they do. And each platform is going to be a little bit different in how it understands free speech, what it values, whose speech it values, and who it's most worried about being fair to, and that tends to be reflected in the kinds of discourses that proliferate on the site. Reddit is different from Twitter, which is different from Facebook; all of these spaces are different. But to get to the core of where free speech fits into the conversation, you have to think about whose voices are really being valued and protected over others.

I think that gets to some of the ideological reticence behind these companies really taking an active stance on fixing their platforms and making them work for as many people as possible, and in a super useful way. Dipayan, you had a paper recently about how part of the manipulation that's occurred over the past year that we've learned about, whether it's from Russia or just people in the United States trying to confuse us, is that these platforms are actually working as they're designed to work. And I'm curious now: it seems like there's so much inertia already. They're already making all this money, they're going full steam ahead, and they're not going to stop making money. Is it possible at this point to step back and reassess and rejigger it? Or is that going to be a really hard train to slow down?
I think, long story short, it is going to take time, and it's going to take a lot of consideration, but you hit the nail on the head: this is really the way that the digital advertising ecosystem works. Over the years these companies have developed into massive internet platforms. I mean, Facebook passed two billion users several months ago. The way the whole industry is set up to work is that internet platforms try to bring people onto their platforms and create the most innovative services so that they can maximize their ad space, and meanwhile advertisers come onto these platforms to try to persuade people to buy their product or buy their idea. And obviously, the way this whole ecosystem is set up, the interests of the internet platform and those of the advertiser are absolutely aligned. They want the user's engagement on the platform to be really high. They want that person scrolling through maybe not just 50 Facebook posts a day,
ideally a thousand, who knows, as many posts as possible, and spending an hour on the platform a day. So the goals of the platform and the advertiser are absolutely aligned, and that's just the way the ecosystem works, not just with the three companies that testified on the Hill, I guess two weeks ago and a few months ago, but also in the broader ecosystem, as companies like Verizon and Comcast and AT&T get into the business of internet advertising as well. That's all fine and well; that's the spine of the internet, the backbone of it, and that's just how things work these days. Where we run into some problems, of course, is when that advertiser has malicious intent. Perhaps I'm not a fair judge of whether or not a Russian disinformation agent has malicious intent, but my personal view is that perhaps something should be done, as Senator Klobuchar was saying earlier today. But I think that what we really need to try to dissect is, now that we have this ecosystem and it works in this way: who are the malicious actors spreading content that maybe is rubbing people the wrong way? But I also think, more broadly, now that we understand that these platforms and the advertisers on them are aligned, and that fundamental alignment, though it's for different motivations: for the internet platform, it's revenue maximization; for the advertiser,
it's persuasion of the internet user. Those goals are aligned, and perhaps sometimes we need to try to segregate that and break it down a little bit further. But another... Oh, sorry, go ahead.

Well, I was just going to say, very briefly, that another point I'd like to bring up is that this is a much broader ecosystem. Even though Congress and the public are launching this deep inquiry into three or four companies, which they should, there is a broader ecosystem at play here, including technologies that have really defined the internet over the past three decades. And what I'm talking about, really, is behavioral data tracking, and certainly online ad buying and the targeting of audiences, but it also extends to fundamental internet and web technologies for persuasion today, including search engine optimization. And as artificial intelligence becomes increasingly integrated across the web, throughout all these technologies, it could supercharge some of the broader social tensions that we're witnessing.

Right. It seems like... I'm sorry, did you have a comment? I was going to say, it seems like there's maybe one of three things that could happen, right? Either the platforms can regulate themselves and figure this out, although if I were to name an industry, and I mentioned this in our conversation yesterday, where self-regulation did not work, I think one of the first things that would come to mind is the digital advertising industry; in fact, that's probably the best example. Or the government can regulate them, which we heard about from Senator Klobuchar, who's trying to do that. Or people could just leave, right? And maybe some combination of those might happen, but certainly we see trust declining. I'm curious if you have a thought on that.
Yeah, so actually that's a perfect setup for what I was going to mention, which is one way to cut through the thicket of many diverse kinds of platforms, lots of jurisdictional headaches, and so forth, and a general sense that government should be doing something. Footnote: the wrong thing to do is what's been happening in Germany, which is highly prescriptive rules that say the platforms have to take manifestly illegal speech down within 24 hours of a report. It's producing massive over-blocking. Anyway, one way to cut through the thicket is for government to focus on ends and not means, and I'll give you a couple of examples of how this can work, just briefly. In the area of political transparency, there's the bill that Senator Klobuchar and some others introduced. If the goal is political transparency, and it's running aground because the companies say, well, look, tweets are so short, images are tiny, mobile screens are this big, how many pixels can we really dedicate to that, then the ends of transparency can sometimes also be fulfilled with a real-time API that provides the information in an open, machine-readable format, and provides the knowledge that people would want to have about who's spending how much money on what, even if it doesn't immediately solve the problem of who put this thing in front of you. There's a similar problem with promoted posts in news feeds, which are things that don't even look like ads; they're just posts, but you pay for greater distribution. Depending on the UI, you might be able to easily put an attribution or a clickable link there; maybe not, depending on the device and size and so forth. But anyway, if you focus on the ends and then let the companies figure out the means, it may be a way for government to feel like it's vindicating public policy goals and moving things forward.

One other note that I just wanted to make, and Caroline
did such a nice job of setting up the product designer's perspective on this: one of the things that is so interesting to me is the interplay between platform features and culture. So, just leaving aside regulation: platforms that often feel sort of the same and have many similar characteristics end up being very different as a cultural matter. Reddit lives on a message board; Tumblr kind of feels like Blogger, except with a bit of a dashboard experience on the inside. I worked at Google, where YouTube's comments section was, and still is, a complete catastrophe. I've spent about a year at Tumblr, about a year at Medium. One of the things that's so interesting to me is the way that very clever product hacks that work well in one culture don't work elsewhere. One example: Medium has this very nice and useful feature, which is that anybody who writes a post can basically sever any response to it from the chain of responses. It's not censorship; the response still lives on the feed of the person who wrote it, but if you hate it, you can just say, I don't want it to appear when people are pulling up my post. It's a level of control that is pretty powerful and seems to help disincent people from writing stupid, trolley things. And, interestingly, Facebook has exactly the same feature, exactly the same capability, but nobody uses it, from what I can tell, and I don't really understand why that is. But anyway, the point is, regulators should be thinking about what ends are to be achieved, and let the companies figure out the means. That may be a way to move forward.

Caroline has some UX thoughts.
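The Medium feature described above, severing a response from a post's reply chain without deleting it, can be modeled with a single flag. This is a hypothetical sketch, not Medium's actual implementation; all names and types are invented for illustration.

```python
# Sketch of a detach-a-response model: the post's author can hide a
# reply from the post's reply chain, but the reply still lives on its
# own author's feed. Hypothetical names; not Medium's real data model.
from dataclasses import dataclass, field

@dataclass
class Response:
    author: str
    text: str
    detached: bool = False  # set True when the post's author severs it

@dataclass
class Post:
    author: str
    responses: list = field(default_factory=list)

    def visible_responses(self):
        # What readers of the post see: detached replies are filtered out.
        return [r for r in self.responses if not r.detached]

def author_feed(user, posts):
    # The reply's author still sees their own writing everywhere.
    return [r for p in posts for r in p.responses if r.author == user]

post = Post(author="alice")
post.responses.append(Response("bob", "thoughtful reply"))
troll = Response("carol", "trolley nonsense")
post.responses.append(troll)

troll.detached = True  # alice severs it; nothing is deleted
```

The design point is that detachment is scoped to the post's reply chain rather than to the response itself, which is why it reads as a control feature rather than censorship.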
Yeah. Well, there's this thing called dark patterns: a series of bad UI choices grouped together specifically to dissuade users, or to hide choices a user can make, and what you're describing is effectively that. How many of you frequently check your security or your privacy settings on Facebook? When they first rolled that out, they also changed it every couple of months; they had a little notification: check this out. But if you actually start to really dig in, to get to more nuanced privacy settings, you go in about three or four clicks, and you start to lose people after about the second click. The second they have to go to a different page, you lose, I forget the percentage, but it's a large percentage, something like 25 to 50 percent of users will not dig their way in. So if you're hiding the most valuable information three to four pages deep, you're losing people.

I did a whole project, actually, researching what happens when you die, what happens to your profile on different platforms. Facebook actually has a setting called a legacy account, where you can choose someone on Facebook to take control of your account when you die, and you can outline whether you want your profile to stay up, whether you want it to be deleted, or whether you want it to turn into a "remembering" page; that's what they call the legacy. But they never notify anyone of this. There was no sort of announcement of the change, and I was talking to the product designer who rolled it out, and she was like, well, it's a bit morbid, but we're hoping that through people writing articles, like the one you're writing, more and more people will hear about this feature. And I was like, oh, that's fantastic, yes.
I'm writing in English, for some online blog for the Gawker Media Group, of course. So many people are going to find this; all of your billion users are definitely going to find this article I'm writing about this feature that exists. But the feature is hidden about four pages in, and then it's a very small, very tiny box, and there's no language that would let you know what this sort of setting is. So I suggest that all of you, when you leave here, go as deep as you can into your own Facebook and look at all of the presets; look at everything that exists.

Another dark-pattern UI: if you've ever done Facebook Live, for example, or if you've ever done a post where you've changed your privacy settings. Let's say you're making a Facebook Live video and you want it to be private. It changes all of your privacy settings, historically, for all of your other posts, as well as future posts. So if you have more nuanced privacy settings, like specific groups that you have things tagged to, or you only allow friends-of-friends to see things,
this will change all of your privacy and security settings. And these little patterns, while they sound like innocuous things for a practitioner to get upset about, I think are incredibly profound when you think about the choices we make inside these platforms, when we spend so much of our time on them, when you are living your life with the digital and the offline so intertwined. These things are designed to be confusing, or maybe they're just poor design, but regardless, there's something slightly adversarial about them, and there's something that begs the question of why we aren't asking for more, in a way, when we think about the kinds of content we're putting out there. And then, from an online-harassment perspective, the idea that your privacy settings could be changed so sweepingly, without any kind of further notification, does actually endanger users. It changes the way in which you had articulated how you want to be seen and interacted with, and that can be incredibly dangerous.

Yeah. So I want to bring the conversation back to the issue of regulation, and also to piggyback on that point about how things get so intertwined. When we're thinking about, well, how do you regulate problematic speech, the speech that crosses over into explicitly dehumanizing, clearly bad speech, because there's a lot of gray-area speech that a lot of us wouldn't like but maybe doesn't cross that threshold, part of the problem in trying to think about, okay, well, what do we do, or what should platforms do,
is the fact that so many different industries are intertwined in this conversation. If you're talking about problematic speech on social media, you also have to talk about the news that is being put into the ecosystem, the kinds of stories people are responding to. You also have to think about certain politicians who are feeding bad or problematic information into the media ecosystem. You have to think about folkloric behavioral patterns: people who come to these spaces with inherent biases against certain groups, and who are then enacting their lived experiences on these platforms. So the idea that the platforms themselves would be able to singularly address and solve these problems, when all of these problems are so overdetermined... A metaphor that I've been thinking a lot about lately is that of coastal redwood trees. What's interesting about groves of coastal redwoods is that their roots are so densely intertwined underground that a single tree can essentially poison all of the other trees if some sort of rot gets introduced into the system. It goes everywhere: one tree gets sick, they all get sick. Trees can deliver nutrients to other trees; they essentially help raise baby trees. It's actually very beautiful. But what that means is that if you have one problem in one sector, it becomes a problem for all of these sectors, and that's really what we're talking about here. There are solutions on these platforms that are critical to think about, but what about what happens in the news? And what about media literacy, not just in terms of figuring out what to do about fake news, but what do you do about prejudice? How do you solve racism and violent misogyny? Those are all questions, too, that are part of the broader question of how we figure out what to do when speech gets out of control on these platforms.
Even to your point, Whitney: who gets to decide, on those platforms, what counts as prejudice? Do we all agree with Mark Zuckerberg's personal political stances? What does user buy-in look like in determining what those standards are? Do we as users get to be involved in that conversation at all?

So we've really, I think, defined a lot of gray areas, right? And I think that's one of the major problems with discussing this: yes, people say really harmful things, and that's hate speech, but then there's also news that can be really harmful. And how do you say that something that's super conservative, that might forward a bigoted view, is not allowed on the platform, but something that's super liberal is allowed? These are two different political things, and they feed into each other in different ways. It's complicated, for sure, even if we have opinions about it. It seems, though, that we don't have the regulatory... I mentioned in my opening remarks that maybe we don't even have the regulatory scaffolding to deal with this in the United States. Or maybe we do. Who would regulate this, and what would that even look like? I mean, disclosure on political ads makes sense, but, Dipayan,
I wonder if you have any thoughts?

Well, you know, I think broadly that we have a regulatory system in this country that has been developed over years and decades, and that addresses traditional media. And as we all know, traditional media is being subsumed by the internet, by new media. Maybe I shouldn't use the word subsumed, but as we all know, many eyeballs, especially among young people, are on mobile phone screens and browsers rather than the TV. And what that means is that we've got this regulatory regime and this structure — whether we're talking specifically about the FCC or the FTC or the FEC — these big regulatory regimes that address traditional media, and they really don't address the internet. They don't address the evolution of media over the past 20 to 25 years. Maybe they do in certain ways, but it's a catch-up game for government. So I think, broadly, regulation is coming. Regulation is coming around the world. The biggest internet companies in the world see it; they deal with it every day, whether in Latin America or Asia or Europe. And I think it's only a matter of time before public sentiment in this country wells up to the point that folks in the Senate really pick up on these ideas.

Now — if they don't succumb to the heavy lobbying from Facebook and Google, that is.

Well, I mean, I think the Obama administration tried different things. There was political gridlock for a lot of different reasons; it wasn't really just industry lobbying. A lot of these issues are very thorny, especially when we get into hate speech or political disinformation, where there's just no clear path forward right now.
So, well, here's a contrary view — I'm sorry, I misread your facial expression; I didn't mean to interrupt you. I sometimes do that, and it's terrible. So here's an alternate view. What's fascinating about this question is that what you're really asking is: does the United States have the regulatory scaffolding to do anything? Does the administrative state actually function anymore? And one of the things that's interesting, if you come from a kind of traditional internet background, is that you look at agencies like the FCC, the FTC, the FEC as wholly subverted, captured, broken institutions — except for those rare moments when you get a really great leader and an alignment of political forces. So what's interesting to me is that the right regulatory body is not an agency doing a rulemaking; it's the democratically elected institutions of the country functioning properly. That's Congress doing a lawmaking process through hearings, developing legislation, passing it, and the president signing it. What you see in the meantime are some very interesting examples of law enforcement. A good example is the attorney general in New York using existing laws and tort statutes to enforce proper, common-sense policies. He has been going after companies using impersonation as the basis of the prosecution. There is actually a tort of impersonation in some states; there is even a criminal law against impersonation. And a lot of the bad stuff that happens online can be shoehorned into an interpretation of impersonation: people holding themselves out as other people to do their trollish, nasty, evil thing. So what I would love to see, instead of talking about which new regulatory agencies should be stood up and which ones can be stretched to try to cover the internet, is a focus on substantive limitations — for example, privacy protections that
are set out in the proper way, in federal law — which is to say, focusing primarily on the ends, and then letting law enforcement, whether private civil litigation or in some rare cases criminal prosecution, be our mechanism for going after bad behavior. That way, maybe we don't have to rethink the administrative state. That's one idea.

I do think, though, that it's not a completely foreign concept to have public-interest laws over our media in the U.S. We have had laws in the past that ensured equal time, or that people from all the different candidates' campaigns were able to speak and communicate with potential constituents, with the voters. And the spirit of those laws — a lot of them came from the idea that broadcasters were using the public airwaves and were therefore beholden to act in the public interest. The internet doesn't have that same type of public-interest obligation, but the spirit is that in order to have a functioning democracy, we have to have some sort of workable, fair media. I'm curious, though — Whitney, or Caroline, since you're the public interest fellow — is it possible to have an initial public-interest framework that sets the stage, that says this is what healthy behavior looks like from the get-go, and then to try to create laws around that?

It's funny that you say that, because that's what we're doing right now at Wikimedia. I think Wikimedia is a really under-discussed platform in terms of just the number of people who use it and the amount of participation that goes into it.
It's really fascinating. It's a fascinating corner of the internet, because it's part of the older internet — it is 17 years old, and a lot of its infrastructure and design hasn't necessarily changed, I would say for better, though some would say maybe for worse. It doesn't look the way we're used to a social network looking; it doesn't look the way we're used to platforms looking. It's a very specific kind of platform, and it's grown, I would say, in a sustainable way. I think a lot of what's maybe missing from this conversation is what sustainability looks like, and how we think about what sustainable platforms are. I mean, how many of you used Friendster or MySpace? Right — I did. Those things are gone now. Blogger and Blogspot look a bit different. Xanga doesn't exist anymore. What about LiveJournal — will it be around in a certain amount of time? These are spaces and places where people were living parts of their lives, and also putting data online — data they were choosing to put out there. But one of the things I've been thinking a lot about, as an anti-harassment researcher, is: how do you create public discourse around taxonomies of harassment and conflict? How do you create anonymized use cases to educate — in our case — our volunteers? How do we help our volunteers
mitigate the different forms of conflict and harassment that exist inside all the different language wikis? Because all the different language wikis have their own elected administrators and officials who actually help mitigate forms of harassment. When there's an extreme case of abuse, that is often when the Foundation can help out and provide support, but for the most part a lot of this is self-regulating, with rules the community determines and picks, which are then shaped by the different policies the Wikimedia Foundation itself has. And most of this policy is also enacted, sort of through trial and error, on the English-language wiki. The reason I bring this up is that we're dealing with these — for lack of a better word — federated networks that influence one another, and our users also meet up offline a lot, at actual meetups, to engage with each other. And one of the reasons I bring that up, touching on your point, is that it's fascinating to look at what different forms of harassment a group can acknowledge as harassment. What are the definable, definitive elements of online harassment? Doxing, for example, is the release of someone's private documents. Three years ago, major platforms didn't have anti-doxing measures in their terms of service; now we're starting to see that. I think there are times where you could actually argue doxing in a court of law. But it's also important to acknowledge that there aren't any landmark online harassment cases that have been won inside the U.S. legal system. With Gamergate, for example, there wasn't enough evidence for many of the victims to successfully take their abusers to court.
There is a recent report that the FBI had a mass of information and failed to prosecute four abusers inside Gamergate who had admitted to their crimes. We're in this really fascinating time period where we're defining different forms of online harassment, and the extremes of abuse that exist inside that whole scope. We're making those definitions now, but we're not actually seeing these cases taken off the internet and into a court of law — for a variety of reasons we could spend hours and hours talking about. But what I see is trying to think about not just the legal scaffolding of the U.S. government, but scaffolding in terms of the policies of the different companies users exist on. What are their definitions of harassment? Do they enforce them? Are they open and transparent about when codes of conduct or terms of service are broken? Is there any kind of remediation inside of that? Is there a scaling of responses to varying kinds of extremes?
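Those scaffolding questions — definitions, enforcement, transparency, remediation, scaled responses — can be imagined as a simple audit checklist applied platform by platform. Here is a purely illustrative sketch; the field names and the sample entry are hypothetical, not drawn from the panel or from any real platform's policies.

```python
from dataclasses import dataclass

# Hypothetical audit record mirroring the policy-scaffolding questions above.
# All field names and the example entry are illustrative only.
@dataclass
class PolicyAudit:
    platform: str
    defines_harassment: bool      # is their definition of harassment public and clear?
    enforces_consistently: bool   # do they actually enforce it?
    discloses_violations: bool    # transparent when terms of service are broken?
    offers_remediation: bool      # any remediation process for those affected?
    scales_responses: bool        # graduated responses to varying extremes of abuse?

    def gaps(self) -> list:
        """Return the names of the checks this platform fails."""
        checks = {
            "defines_harassment": self.defines_harassment,
            "enforces_consistently": self.enforces_consistently,
            "discloses_violations": self.discloses_violations,
            "offers_remediation": self.offers_remediation,
            "scales_responses": self.scales_responses,
        }
        return [name for name, ok in checks.items() if not ok]

example = PolicyAudit("example-platform", True, False, False, True, False)
print(example.gaps())
```

The point of a structure like this is not the code itself but the comparability it suggests: the same five questions asked of every platform make the gaps in each one legible.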
There are all these unanswered questions. And I think we need to start seeing that kind of casework — policy cases enacted inside communities, brought forth by the policy defined by, let's say, Facebook, and then a case around what is or isn't harassment that maybe leads to someone having their Facebook account revoked. We need to see some things like this before, I think, we even start to get into actual legislative scaffolding.

Interesting. You know, one thought experiment that's coming to mind — and I don't know whether there need to be certain rules of the road inside the companies before there are rules imposed by the government; it's a winding question — but what if, once a social media platform reached, say, 100,000 or 200,000 users, certain public-interest obligations started to click into place? That doesn't sound too out of line to me. But anyway — Whitney, you had some thoughts on what public-interest obligations might look like.

Well, and also on this question of how you define harassment. I think it's critical to define harassment, because then you can build terms-of-service agreements around that, and people can understand what the expectations on a platform are. You can make it really clear — people have a hard time following rules when they don't know what those rules are. And if you're assuming that most or many people who use the platform are operating in good faith, they would want to follow the rules; they just don't know what those rules are. So I think it's really critical to define harassment clearly. But I've recently been thinking — and I'm not arguing against that point,
I'm adding to it. So, I don't know how many of you saw the recent Pew internet research report about people's attitudes toward harassment and what counted. What they did is they surveyed a number of people and had them read different accounts of harassing behaviors, and those people then decided whether or not they thought it was harassment. An overwhelming number of people thought certain kinds of behaviors — antagonistic behaviors, hateful behaviors — qualified. There's a consensus view of what harassment is in a sort of general sense: you're attacking somebody's religion or their race or their gender, and most people are like, yeah, okay, that qualifies. But what really struck me, and what makes me question some of these conversations around defining harassment, is that in every single case — this was not discussed explicitly; it was implicit — the primary criterion for whether or not something was harassment was intent. Intent to harm. And so if you have a definition of harassment that hinges on somebody's specific, active desire to harm another person, on one hand,
yeah, that covers the really extreme cases of harassment. But in so many cases, people are being harmed not because someone intended to harm them, but because the speech was thoughtless, or ignorant, or failed to take into account that the others being interacted with online are people, with feelings and experiences and things that would be upsetting to them. So if we make harassment something that you intend to do, what that does, in a strange way, is create a threshold: anything that doesn't hit that point gets pathologized, because it's not "really" harassment, so you shouldn't be getting upset about it. And it also means people are less likely to reflect on their own behaviors that could be harmful when they're not trying to harm someone else. So much harm falls into that gray area of "I wasn't trying to hurt you, but you got hurt," and that is the part of the conversation that just doesn't get talked about as much.

Which is even more difficult to deal with in a regulatory sense, because it doesn't reach the threshold — it might be sort of in the eye of the beholder. And that's why, although it's so important to have a clear sense of what harassment is: what does that definition leave out? Does it still leave a lot of toxic behavior that's just not being engaged with, and how could you engage with it?
I mean, that's why I think having examples matters — like, we have case law, right? We can look back at how previous cases were tried. But we don't have any of that on platforms. We don't have a good example of "you're unintentionally being a jerk": here's you doing your drive-by harassment, and you thought you were being funny, and it turns out you hurt someone's feelings. We don't have little cases we can point to and say, look, this interaction falls under the unintentional — like, if we're going to do a matrix, "unintentional but still a jerk: you fit right here." And I'm trying to make a joke because we're talking about such a heavy subject, and I like to make jokes when I can. But it's also important to think about — I actually did this for Whitney's work for Fusion. I made a taxonomy of trolling: a matrix that ran from casual to extreme on one axis and from absurd to dangerous on the other, and then started to isolate different cases of trolling I had seen on the internet. And I used this as a litmus test for how I would logically argue to members of Gamergate that "I want to rape you to death" is not a joke and not a form of trolling — it sits in a much more abusive category. Being able to point to specific events, whether hypothetical or real, is actually incredibly helpful for that.
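The two-axis matrix described here — casual-to-extreme on one axis, absurd-to-dangerous on the other — can be sketched as a tiny data structure. This is a hypothetical illustration of the shape of such a taxonomy; the scale labels, example incidents, and the simple "litmus test" rule are assumptions for the sketch, not the actual Fusion taxonomy.

```python
from dataclasses import dataclass

# Illustrative two-axis taxonomy, loosely following the matrix described on the panel.
# Scale values, example incidents, and the is_abuse rule are hypothetical.
INTENSITY = ["casual", "moderate", "extreme"]   # how severe the behavior is
TONE = ["absurd", "ambiguous", "dangerous"]     # how threatening it reads

@dataclass
class Incident:
    description: str
    intensity: str  # one of INTENSITY
    tone: str       # one of TONE

    def cell(self) -> tuple:
        """Locate this incident as (row, column) in the taxonomy matrix."""
        return (INTENSITY.index(self.intensity), TONE.index(self.tone))

    def is_abuse(self) -> bool:
        """A simple litmus test: extreme + dangerous is abuse, not 'just trolling'."""
        return self.intensity == "extreme" and self.tone == "dangerous"

joke = Incident("absurdist meme reply", "casual", "absurd")
threat = Incident("explicit rape threat", "extreme", "dangerous")

print(joke.cell(), joke.is_abuse())      # (0, 0) False
print(threat.cell(), threat.is_abuse())  # (2, 2) True
```

Even a toy structure like this makes the panelist's point concrete: placing incidents in cells turns "is this a joke or abuse?" from a shouting match into an argument about where on two explicit axes a behavior sits.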
So on this taxonomy you can see different things — different LulzSec engagements, if you will; certain things Donald Trump had done as a presidential candidate, all of which would now sit much higher on the taxonomy, much more serious, because he's now president — but also just different kinds of behaviors, isolated so you can say: even though the intention here is one thing, it sits in this much more serious space. And I think that's one of the things that's missing: more discourse from the platforms, from maybe their policy teams, saying, here are examples — this right here is unintentional harassment; this is intentional harassment; this is abusive conflict — thinking of harassment and abuse as different variations, one more extreme than the other, and really trying to break apart actual individualized events and share that knowledge, under the guise of education, with platform participants.

One of the problems, though, is that a lot of these events — the way this stuff plays out — is not litigated in public. People don't know what happens when there is harassment. It might happen in one community and be dealt with on the platform in some way, and we don't know how it was dealt with, so we don't have a ledger to refer back to that could help build a definition, which could maybe work to shape some sort of future policy. So there's a real transparency issue. Dipayan, finally, I'm curious what your thoughts are on this conversation. It seems like we lack a lot of critical definitions that could go into more constructive policies. Yeah — I mean, sorry.
These are obviously really thorny issues, and I see where both of you are coming from. There is a lot of egregious content shared online, and it ranges from hate speech to all sorts of different types of content, including, maybe, political disinformation, and in other universes besides. And the question is: what can companies do about it? What can the government do about it? And how can we keep people safe and secure and maintain their personal security? It's hard. I mean, one idea that's been raised is that companies like Facebook could try to develop some sort of algorithm to detect this kind of content and even take it down. But I think this gets right at the issue of free speech — and I know you have strong views on it — and it also raises a lot of other complications. We know there are technological and economic constraints for internet platforms as they think about these issues. And I know we have to conclude shortly, but I just wanted to raise that these are really difficult issues to suss out, and the perspective of internet platforms has to be heard as we try to stretch these policies in a global way, so that nobody's left behind. I hope we're able to make some progress before the midterms.

So — questions. We have a few minutes for questions. The woman there — five minutes for questions.
I find it very dangerous to free speech that you want to use the government to determine what's "healthy" behavior because somebody's feelings might be hurt. I know of cases of people being thrown off of Twitter for being conservatives; somebody was thrown off of Facebook for saying there are two genders, because someone's feelings were hurt. And I don't want the government inflicting itself on my ability to speak. I've heard that Twitter is actually suffering pretty badly now — they're losing people — and there's an alternative organization now called Gab, at gab.ai, and their whole platform is that they don't censor anyone online.

We have written and reported extensively on Gab, and it does exist. Companies can regulate, and the government can regulate, and right now the conversation is mostly about companies. Do you have any comments on the idea of people creating alternative social networks?

I love Mastodon, because there are no Nazis there. But it is important to point out that Facebook and Twitter are private companies. If they decided none of us could say the word "the," they could do that. I think that's one thing that's maybe missing from this conversation. I will caveat that Wikimedia is a nonprofit, not a private company — important to note those differences. So there is radical transparency in how content is looked at, created, verified, and removed inside any kind of wiki project, and we miss that transparency when it comes to private companies, because they just don't have to disclose it.

I just want to mention — having worked at Facebook — it's of course a public company, with shareholders, and I don't know whether somebody like Mark Zuckerberg, or the executive team, can just unilaterally say that a particular word is no longer allowed on the platform. I was being hyperbolic. Sure — sure. But anyway, yes.
I just wanted to make that quick point. It is interesting to think about the limits of their power, which we often imagine as "they can really do whatever they want." I meant it more as: they can set and determine their own rules and policy. Yeah. So — Whitney?

Yeah. So the point about censorship — or intervention, or moderation, however you want to frame it — the idea that there would be moderation or censorship on the grounds of hurt feelings is a concern many people raise, and it does raise the question of at what point it becomes important to intervene when someone is being harmed. I would maintain that having your feelings hurt might exist on some point of the spectrum, but it depends on whose feelings you're talking about, in response to what kind of behavior. So it gets into very tricky territory around relativism: you don't want to impose a singular standard of how people respond, or should respond, to content in the world. But I do think there is a pretty clear line when something becomes dehumanizing and violent. When you cross that point — when you're delegitimizing someone's life or body — that becomes a much bigger conversation than "oh, my feelings got hurt because my friend didn't respond to my Facebook message." So I think it's important not to pathologize the kinds of events that can take place online that damage someone's reputation and threaten their overall sense of safety. When we're talking about when to intervene, it's not hurt feelings in the sense of, again,
"I didn't get a response." It's "I have 10,000 strangers saying they want me to get raped." That becomes a very different kind of conversation — one that does warrant intervention, in my opinion, because we're talking about the delegitimization of lives and bodies, and dehumanizing behavior.

Thank you — that was very smart. One more question; we have time for — yes, sir.

So: I'm frustrated, because I want to see the analysis move beyond "we want free speech, but we don't want hate speech." I want to see whether you're advocating technical tools or policy tools to move this forward. Back when we were having the pornography discussion in the '90s, the big settlement was that we were going to have technical tools to prevent children from getting access to pornography. That didn't work, and we never revisited the situation. So we could propose technical tools to prevent people from seeing the 10,000 anonymous people who want to do them harm — and is that a good thing? Are you concerned about other people getting exposed to that kind of hate speech? I'd really like to see more lawyers on the panel, and to see whether there are tests that could be used to make decisions about moving this argument forward, because it doesn't seem to be getting anywhere right now.

More lawyers — or just different ones? Am I not enough here? Not doing it for you? Look: as long as the First Amendment is the First Amendment, nobody here has to worry about the government telling these platforms what to do. It's just not going to happen.
The Supreme Court's interpretations are very clear. Not a problem, not a possibility. That's part of why advocates of net neutrality are so into net neutrality: because we don't want AT&T, Comcast, and Verizon having the ability to do that kind of censorship either. We want there to be a fully free marketplace, which is the whole point of net neutrality. And within the marketplace, though, we clearly have a hugely broken system where there are inadequate alternatives for people. If you want to sit in a cesspool of neo-Nazi right-wing craziness, there's a place for you — yay. If you want to be on Twitter, with its rules, awesome. Same thing with Facebook. I think the sense we have right now comes from the dominance of a couple of platforms, because of a variety of factors: some are economics, some are product innovation, some are the kind of virtuous circle of ad dollars fueling more infrastructure, fueling more users, fueling more ad dollars, and so forth. So there are complicated market dynamics at work, and I'm not saying this is easy. But as technologists, what we ought to be doing is trying to build new stuff that provides new community habitats, where people can find the experience that makes the most sense for them. There should be a lot more places on the internet that are a ton safer — that are a ton more attentive to harassment and so forth, and that take a much more aggressive line. That doesn't mean everybody should do that, and I don't think every platform will; there's no way that's going to happen. But it would be great if there were more places out in the world serving more needs in more ways. That, to me, seems like the path forward.

I have a bunch of thoughts — I'll try to limit them. You know, I'm not sure the conversation should be solely a panel of technologists, or solely a panel of lawyers.
I think there are a lot of different nuanced takes on this. A big part of it will be a mixture of technical solutions combined with new policy solutions, and it will not be one-size-fits-all. What works for Facebook will not work for Twitter; what works for Wikimedia will not work for other platforms — though it can be used as a standard or a barometer, and we can learn from each other's successes and failures. It's going to be a balance of product designers, developers, and policymakers working together to think about: how do you take policy and reflect it into design? How do you reflect it into technology? How do you reflect it into tools that participants and users can actually engage with and use?

And that's a wrap. Thank you all so much — it was a great conversation; I learned a lot. Thanks to Celia, our previous moderator, and thanks to all of you for coming.