Good afternoon. It's a wonderful, great pleasure for me to be introducing Sarah to you. There have been two students in my time at Berkman who became co-teachers with me. The first was Jonathan; the second is Sarah. Sarah was my research assistant back when I was teaching Internet and Society, just at the time when we were conceptualizing the idea of Socratic dialogic structure as a foundation to be scaled, as a way of building democratic values into the net. And Sarah undertook to teach Internet and Society with me. At the same time she was doing this, she was also studying with Catharine MacKinnon in what, as far as I could absorb from Sarah, was a wonderful class in which Sarah opened up new avenues of radical feminism. I listened and felt that she taught me, and then saw her teach my students. If there's anyone I would like to see back in our midst as a change agent, it is Sarah. She speaks to you today about The Internet of Garbage, about abuse. But she comes out of an understanding of it that isn't limited to the net. It's just expressed on the net, and her attention is on how to draw it back. So with great pleasure, I introduce to you Sarah Jeong.

Thank you, Charlie, and thank you everyone for having me here. Is that good? All right. So, roadmap: I'll first do a little bit of background about the topic, then a bit about myself, and then I will be talking about the theory of garbage. Really brief background: online harassment exists. It actually exists. It ranges from mean tweets to people calling the police and getting them to send a SWAT team to your house. Not every instance of online harassment rises to a level of interpersonal violence that requires state intervention, but sometimes it does.
Nonetheless, it exists in this big mixed-up bucket of all kinds of things, and it's staggering, it's pervasive, it's a huge problem on the Internet. And furthermore, it's gendered. The US National Violence Against Women Survey says that 60% of cyberstalking victims are women, but the National Center for Victims of Crime says it's more like 70%. Working to Halt Online Abuse looked at 3,000 reports of cyber harassment between 2000 and 2011 and found that 72.5% of victims were female. The Bureau of Justice Statistics says 74% of individuals stalked, online or offline, are female. At the NYPD Computer Investigation and Technology Unit, the majority of cyber harassment victims are female. And I think this great study in 2006 from the University of Maryland really illustrates what exactly is going on here. Researchers put silent bots, completely silent user accounts, on IRC; some of them had very female-gendered names, others had male-gendered names, sort of like Alice and Bob. The bots with the female names received about 25 times more malicious private messages than the ones with the male names. These were silent. They weren't saying anything. They were just sitting there; they were just on the Internet. I'm going to be talking a lot about gender today, but that has a lot to do with the studies that are available, as well as the activist focus on this issue. You see a lot of the policy recommendations and a lot of the discourse focused around gender, but it's really important to know that this problem is exacerbated on every single axis of social oppression. I would particularly like to see more focus on how it affects race, because there is a huge problem there and the studies are largely silent as to it.
Sexual orientation is also huge, gender identity is also huge, and something that we have almost no data on is how this problem affects sex workers on the web. They're on the web, they're people, sex workers are people, and they get disproportionately more abuse than anyone else who doesn't publicly identify as a sex worker. I think this is a huge problem with our society and it's worth looking into. And it's something I say at the Berkman Center for Internet and Society for a good reason: this inequality in participation on the Internet has a measurable cost to society. 91% of Wikipedia editors are male, and, surprise, it's harder to find articles about women on Wikipedia than articles about men. There is a measurable impact when women are harassed and silenced, when they feel as though they're online in a culture of toxicity that excludes them. So like I said earlier, harassment is a really big bucket. Almost every woman experiences this constant low-level buzz of harassment. It's not that big a deal, but it's really annoying, right? It happens every day, or close to every day; it's just sort of there. Why are these people bugging me? It might not be very violent, it might not be very upsetting, but it's there. And then you have the targeted campaigns, and these are the things that make the news. This is Anita Sarkeesian having to cancel a talk because of a bomb threat. This is South by Southwest cancelling two panels because of threats. This is Gamergate, which by the way is actually an example of domestic violence rather than anonymous strangers taking someone down. It's an instance in which an ex-partner decided to go after a woman very violently. And that's something we shouldn't forget. We like to focus on stranger danger, we like to focus on scary anonymous people, but really the scariest people among us are the people we know.
So here are some tweets sent to Caroline Criado Perez, a journalist in Britain who wanted to put Jane Austen on British currency. She got tweets like this: fuck off and die you worthless piece of crap; go kill yourself; rape is the last of your worries; shut up bitch. And two people were actually jailed over this eventually. Here are some comments posted on Anita Sarkeesian's videos. I hate ovaries with a brain big enough to post videos. Fun aside: she completely forgot to mention that every guy in video games has these stereotypes too. Do you see us parading about it? No honey, it's a video game. Tits or get the fuck out. Yeah, I can't wait for the day we get to play Ugly Feminist Hand Planet the game. That would sell millions of units. As you can see, not all of this is straight-up crass, and not all of these are threats. But it's not nice to be reading this day in and day out. I'm not saying you should get imprisoned for not being nice to people on the internet, but this is a bit much. There's a lot going on here. And I took these from a screen grab, one comment after another after another; I wasn't going out of my way to find the nasty ones. So here's one to balance things out, as a palate cleanser: I am okay with this and I believe everyone is too. Case dismissed. All right, great. That was the most positive thing I could find on that screen grab. So these things were happening, they were just getting started, while I was still at Harvard, and the discourse was shifting. I had come to HLS at a time when the frame was still very focused around free speech good, oppressive governments bad. We must protect the internet from these threats of interference, of censorship. And suddenly the discourse was changing, and we were increasingly seeing these examples of women getting basically threatened off the internet.
And this wasn't new, but suddenly there was more and more attention to it. And I think that's in large part because of Twitter. It's because Twitter is open and because we can see it happen in real time ourselves. It is no longer a subjective experience. It is open, objective, and available to the public for everyone to see. And it's horrifying. And the discourse started out as: this is free speech versus women. And this was pretty upsetting to me. I was coming from a place where I believed in a free and open internet. I did not think that many of the proposed regulations coming out of these episodes were good. I didn't like the idea of people getting sent to prison over tweets, even though I looked at the tweets and they were very upsetting to me. And as Charlie mentioned, I was here, working with Charlie on the Internet and Society class. I was also at the Journal of Law and Gender. I was very involved in feminist issues and activism. And I felt like I was being told that my position was inconsistent, because it's free speech versus women: pick one side. No one wants to be on the side where, well, I guess women get to lose. The more I thought about it, though: there's a thing in First Amendment jurisprudence known as the heckler's veto. The heckler's veto is this problem where, if you yell at someone enough, the speech gets shut down, because it's upsetting everyone way too much. It's fighting words now, because it's causing people to fight against it; or it's causing all of this violence to erupt, so we're going to have to shut the whole thing down. And that's how I viewed harassment. Harassment was a heckler's veto. It wasn't the case that harassment was the speech that was under threat.
It was the speech of women that was under threat, because women were the ones who were being told to leave these social platforms, who were leaving these platforms, who were ceasing to speak, who were being silenced. And when we're talking about free speech for women, we're just talking about free speech. Now, you may say this is all well and good, but little more than a cute rhetorical trick to justify the suppression of speech. And I assure you that it is not. I have learned many cute rhetorical tricks from this institution, where many of the masters of cute rhetorical tricks reside. But there's actually more to this, and I am going to explain. It goes back to the history of the internet and some facts about internet infrastructure that are largely overlooked today. This is what I call the theory of garbage. The theory of garbage is very simple: the internet is mostly garbage. But there's more to it than that. When I'm talking about garbage, I'm talking about things like spam. I'm talking about things like malware. And I want to include harassment. These are just things that we have come to accept aren't great. And there is certainly a contingent, and I have to qualify before I even begin going into this: look, spam is speech. Malware is speech; code is speech. Harassment is speech. Nonetheless, this all belongs in a bucket where we know that something has to be done about it. It doesn't mean we have to put people in prison. It doesn't mean everything has to be deleted. But there's an issue here. These things make the internet unusable. And I started thinking about this because I was having dinner with some older attorneys who had been working with internet law for a long time. And they were like, oh man, remember when people were really upset about server-side filtering? Spam filters. And I said, that's before my time, what happened there? They spun out a story that I thought was incredible. I will leave out names.
But basically around this era in the early 2000s, you had people who are typically the free speech wing of the free speech party. These are the kinds of people who make the ACLU look like a bunch of Europeans. These are people working in the tech industry who really believe nothing should be censored on the internet. But when it came to spam, they were just like: burn it with fire. Rain nuclear bombs on spammers. This is the worst thing that has ever happened to the internet. And how dare you tell me that spam filters might be bad. So spam, of course, gets its name from the Monty Python sketch where two people are trying to talk and these Vikings keep interrupting with spam, spam, spam, glorious spam, spammity spam. And you see it probably get attached to unwanted messages, according to Finn Brunton, through this trick people would do on IRC where, just for fun, they would type spam, hit enter, and then hit the up arrow, which repeats the message: up arrow, enter, up arrow, enter, up arrow, enter. And so the entire chat room would flood with spam, spam, spam, spam, making it impossible to talk. So you get this name, and eventually you're starting to get a lot of unwanted messages through various systems. This exists even before email is really email. This exists before the web is quite the web. This exists on Usenet. This exists on things like CTSS at MIT in 1971. It's just unwanted mailings. And obviously this is a very subjective definition, right? Keep that in mind as we go on to talk about harassment, which is obviously very subjectively defined as well. And eventually people create anti-spam technology. They create filters.
They start out with very naive filters, regular expression filters, which are just: oh, well, let's just filter out every single email that says Viagra. By the way, because of this, there was a woman in Italy named Olivia Gridina whose emails all disappeared into a black hole for a few months. So regular expression filtering is not great. And around the early 2000s the EFF put out this position paper on spam filtering, and like any good position paper in a time of great outrage and controversy, you can sort of read what the controversy actually is, because they're carefully navigating around it. The language is pretty incredible. They're basically like: look, we know spam is bad, but hey, people are losing moveon.org emails in spam filters. And moveon.org, that's political speech. A lot of the moveon.org emails that were getting lost in spam filters were action alerts: you have to call your congressman now, we have one hour to call your congressman about this bill. And if that gets even delayed for a bit in a spam filter, that's political speech being silenced. That has actual implications for our democracy. And what the EFF was recommending at that time was: server-side spam filters, filters on the mail servers where people themselves don't have control over how to tweak the filter, those are problematic, those are bad, those are a threat to free speech. Client-side spam filters: good. Nowadays, we all use Gmail, and that's server side, although it takes into account our own preferences, so it's kind of client side, kind of server side. That distinction is now largely defunct. But the issue is still live. So spam is speech. It's always been speech. Finn Brunton says that the first instance of spam was in 1971 on the Compatible Time-Sharing System at MIT, which was this network.
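To make the false-positive problem concrete, here's a minimal sketch of a naive regular-expression filter of that era. This is my own illustration, not any real filter's code, and the banned words are hypothetical; the point is that a substring match swallows legitimate mail whenever the banned string hides inside an innocent word.

```python
import re

# Naive server-side filtering: drop any message whose body matches a
# banned substring, case-insensitively.
BANNED = [re.compile(p, re.IGNORECASE) for p in ["viagra", "cialis"]]

def is_spam(body: str) -> bool:
    """Return True if any banned pattern appears anywhere in the body."""
    return any(p.search(body) for p in BANNED)

# The intended catch:
assert is_spam("Buy VIAGRA now!!!")
# The false positive: "cialis" is a substring of "specialist", so this
# perfectly legitimate email silently disappears.
assert is_spam("Our specialist will call you tomorrow.")
# Ordinary mail passes through.
assert not is_spam("Lunch at noon?")
```

The same failure mode reappears later in the talk with client-side harassment filters: the filtered string matches far more text than the filter's author intended.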
And this one engineer, or I guess he was a sysadmin, got access to the entire mailing list, the entire list of people. And he was like, all right, I think I really need to send this message out to everyone. It was an anti-war message. He wanted to tell people that it was their moral obligation to oppose the Vietnam War. He was mailing out to a bunch of people who were taking DOD funds at the time, so his thinking was: this isn't an irrelevant audience, this is a very relevant audience, and we need to take a principled moral stand. And so the very first spam message was: there is no way to peace; peace is the way. And then a few decades later, as people are discussing CAN-SPAM and other anti-spam regulation, you have the spammers themselves going: spam is speech, it's what America was built on, small business owners have a right to direct marketing. But eventually CAN-SPAM passes, and spam actually ends up becoming largely the enterprise of criminal groups in Russia. It's kind of an odd twist, but most of the spam that we're getting over the internet now is basically run by the mafia, because the small business owners who have a right to direct marketing decided not to do this anymore once it was illegal. And so what we have now is, you know, the Viagra emails; they're barely readable. They're sent by botnets: computers controlled by viruses, their cycles eaten up sending out all of these mass mailings. And it's kind of a terrible situation. In fact, spam is a huge infrastructural menace. In 2013, there was a Gartner report that said that 69% of all email is spam. And this was a huge reduction; a few years earlier it had been 89%. And they were like, no, this great reduction doesn't have much to do with anti-spam technology; it's actually the spammers moving on to the social networks, because that's where people are.
That's the right market. I mean, I'm sure some of it has to do with anti-spam technology, but it's actually quite dire. The email security market is estimated to be about $7 billion. American firms and consumers experience $20 billion of loss annually because of spam. And what's really sad is that spammers only gross about $200 million total for that $20 billion loss. It's not great. Spam is actually really bad for the internet. The internet is at least 69% garbage; I can tell you that with some accuracy. But at the same time, spam filters are still a free speech issue. Things are still getting caught in there. We don't think about spam filters that much anymore. We just don't think about spam as a problem anymore, because we all use Gmail and Gmail's anti-spam is pretty good. Now and then we'll get something in our spam folder that shouldn't have gone in there, and we're like, ah, this is the worst, I was supposed to reply to that last week and now something is wrong. But that happens very infrequently. We don't think about spam anymore. We think of spam as a solved problem. But it's not, not at all. It's an ongoing free speech issue. Things are constantly getting stuck in spam filters that shouldn't be. And right now, after a recent tweak, Gmail is filtering out mail coming from self-hosted servers. That's a huge problem too, because it's basically privileging people who use services like Gmail over people who are trying to keep the web decentralized by running their own self-hosted servers, for various political reasons or because they just have a lot of free time on their hands. But that's important. This is about the decentralization of the web turning into the centralization of the web. This is about Google having rather undue control over people's communications with each other. And spam definitions, in the end, are quite subjective.
Even though we have a sense of what spam is, just like we have a sense of what harassment is. We have a sense that someone typing "you bitch" at someone they don't know is harassment. Nonetheless, we can't just put up a filter for "you bitch" and impose that on the internet. In the end, harassment is very subjectively defined, and spam is also subjectively defined. And of course there's the old adage, one man's trash is another man's treasure, and this subjectivity is very problematic. We don't want to be filtering or blocking or deleting things that another person argues are actually good. But look, we still have a municipal waste system. We function as a society by taking out some of the garbage. We can upcycle some stuff, we can recycle some stuff, and some stuff just shouldn't be thrown away. But in order for things to function, we have to make sure that email isn't visibly 69 percent spam, even if it actually is 69 percent spam. I go through this history because harassment is spam. When you're looking at the constant low buzz, the targeted campaigns, these news articles about, oh, she received a flood of tweets, and they will even say how many tweets per hour she was receiving: what you're looking at is a targeted, personal, distributed denial of service. And in fact the common advice in these situations is: just turn off your phone for a while, get off the internet. Basically, social media no longer functions for you. You no longer have this avenue of communication. Your machines have in a way been compromised by too much information, because people are screaming at you too much. And this is something that clicked for me when I heard an interview with Mikki Kendall. This is something that had been brewing in my head for a bit, but I think she puts it really well. She receives a lot of abuse.
She is a black woman and she is public and she is out there on the internet, which means that she receives tons of abuse. And one day she and some other friends decided to switch out their Twitter avatars to white men. And what they saw was that the abuse they were receiving suddenly diminished. In some cases the same people were interacting with them, but suddenly those interactions were reasonable; they actually wanted to talk about the issues instead of just being jerks. And she said, one of the things that's really nice is not waking up to see 62 comments calling me everything but a child of God. These are people who are getting spammed. And they're getting spammed strategically. This is anti-speech. Online harassment is strategic. Spamming, flaming, DDoS; anti-spam, moderation, anti-DDoS. These are struggles over the form, manifestation, and social norms of the internet itself. These are people fighting out what the definition of garbage is, because they have differing definitions of garbage and of what belongs on the internet and what doesn't, and they're using these tools to hash that out. And for misogynists, women are garbage, and they have to be driven off the internet because they don't belong there, any more than the spammers do. So what we're looking at is that anti-spam technology, the code around it, is political. It's not neutral in the slightest. When it comes down to it, the first page of Google search results can only have so many things, right? So even if you're not blocking everything, some of that stuff is getting lost. When you're looking at Facebook algorithms and how things get sorted, some of those communications are getting lost and some of them are getting promoted.
Or look at Twitter blocks and how they work. This is something a fair number of people thought about a few years ago, when Twitter changed how blocks worked and made it so that people could still follow you and interact with you even after you blocked them. Before, you couldn't do that. And there was this huge outcry, because this would exacerbate online abuse. And this is interesting, because that's just a tweak to the code, right? This is just a user interface thing. Changes to user interface are not neutral. They will hurt some people and help some people. Sometimes they're very, very political. And you actually see this use of code to deal with harassment on the rise right now, rising almost in parallel with the history of anti-spam. You have regular expression filtering, which by the way is still terrible, right? I actually have a friend who uses a client so that Gamergate never appears in her timeline. But because people never type out Gamergate, she had to also censor GG, which means that if I type out something like arg, that also gets censored and she never receives it. So that's the early side. And then we also have client-side blocking mechanisms. We have things like Block Together, which is a subscribable block list service. We have things like GG Auto Blocker, which automatically calculates whether someone is likely to be a member of Gamergate and puts them on a block list, so that you can subscribe to that block list and block that person. You have The Block Bot. You have these very early, almost vigilante forms of filtering. And these closely resemble how spam filtering worked in the really early days, before email became hugely centralized under Google. So this is, I think, a really promising place to go. And where we are right now in the policy discussions is not so much infrastructure and code.
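The subscribable block list idea can be sketched in a few lines. This is my own illustration in the spirit of tools like Block Together, not any real tool's API; the class and user names are hypothetical. The core design choice is that the list is maintained once and shared, and each subscriber's client takes the union of everything they subscribe to at filtering time.

```python
class BlockList:
    """A published, shareable list of blocked account names."""
    def __init__(self, name: str):
        self.name = name
        self.blocked: set[str] = set()

class Client:
    """A user's client that filters based on subscribed block lists."""
    def __init__(self):
        self.subscriptions: list[BlockList] = []

    def subscribe(self, blocklist: BlockList) -> None:
        self.subscriptions.append(blocklist)

    def is_blocked(self, user: str) -> bool:
        # A user is hidden if ANY subscribed list blocks them.
        return any(user in bl.blocked for bl in self.subscriptions)

# One person curates the list once...
shared = BlockList("shared-harassment-list")
shared.blocked.update({"harasser1", "harasser2"})

# ...and every subscriber benefits without curating it themselves.
me = Client()
me.subscribe(shared)
assert me.is_blocked("harasser1")
assert not me.is_blocked("bystander")
```

Note how this is purely client-side self-help: nothing is deleted from the platform, and anyone who distrusts a list can simply unsubscribe, which is exactly the property that distinguishes it from server-side deletion.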
It's actually deletion, censorship, and in some cases imprisonment. And one way to think about this is that code is frequently an ante-hoc solution. You're setting up the conditions under which people speak to each other. You are trying to prevent harassment before it happens: by making better user interface decisions, by making sure that your algorithms don't privilege one group over another, or just by making sure that people have options when they're getting harassed. Deletion is post-hoc. After something happens, you hide it. You sweep it under the rug. And unfortunately there's a lot of focus on deletion. It's filtered not just into the policy debates but into the legal debates. When we think of legal solutions to the Internet of Garbage, we think in terms of deletion. And as a scholar of copyright law, I think this has a lot to do with the very privileged place that copyright law has in our society. Copyright law is the easiest way to get something deleted on the Internet. It's the easiest way to fight back against someone on the Internet, because copyright law is overpowered. It's not at all a balanced law. And we are looking right now at this problem, well, not quite a new problem, but a very different one, a problem that we haven't quite thought of in the terms that we're thinking about now, through the lens that the RIAA has created. We're thinking about notice. We're thinking about takedown. We're thinking about going and finding people. We're thinking about subpoenaing ISPs. And this isn't useful. I'm going to run through a very unusual Ninth Circuit case to explain why it isn't useful, otherwise known as: how to break copyright law. Garcia v. Google, in the Ninth Circuit. This is Cindy Garcia. She was recruited to be in a movie called The Desert Warrior. As far as she knew, it was a perfectly nice film.
It had something to do with the Prophet Muhammad, but it was a fine film. Later she found out that her parts had all been dubbed over and that it had been released as Innocence of Muslims. This did not go well for Cindy Garcia. Her name got put out there. Her personal information got put out there. There was a fatwa on her. These are some of the things that were said to her, and these were read out loud in oral arguments at the en banc hearing. So there was an initial decision, and then that was appealed back up again to the Ninth Circuit; there's Garcia 1, and this is Garcia 2. This is quoting from the hearing at Garcia 2: Are you mad, you dirty bitch? I kill you. Stop the film. Otherwise I kill you. Hey you bitch, why you make the movie Innocence of Muslim? Delete this movie. Otherwise I am the Mafia Don. I kill whoever have hand in insulting my Prophet. O enemy of Allah, if you are insulting Muhammad Prophet's life, suffer forever. Never let you live it freely, sore and painful. Wait for my reply. And this was interesting. Counsel was reading these quotes into the record at this hearing, and it was strange, because this is a copyright case. Almost everyone who was there were copyright lawyers, really interested in the weird copyright angle, including myself. And immediately they were like, why is she reading these threats out loud? What does this have to do with anything? And in fact the judges agreed. Immediately you get the sort of interruption that you never want to hear if you are arguing a case in front of the Ninth Circuit: Counsel, how do these threats go to the preliminary injunction standard? So Garcia was a very problematic case. In Garcia 1 you got a judgment in favor of Cindy Lee Garcia, and basically the Ninth Circuit ruled, yes, she actually does have a copyright in her performance in Innocence of Muslims. Now, Lindsay Lohan does not have a copyright in Mean Girls.
Leo DiCaprio does not have a copyright in Titanic. That's not how copyright law works. You act in a movie, and then you get paid, and then someone else owns the copyright on the movie. Usually there's a series of very complicated contracts to govern this, but it's largely assumed that even if you are making a home video without any contracts whatsoever, the actors are not the ones who have the copyright in the work. And so this decision basically broke copyright law. It broke the fixation requirement. It created a new form of authorship. It created a new subject matter for copyright. It messed around with the definition of performance in the statute. It potentially had a really weird effect on the movie industry: one of the questions asked during oral argument was, could any person who appeared in the battle scenes of Lord of the Rings claim rights in the work? Feasibly, under this ruling. It broke section 512, which is the notice and takedown section of the DMCA. And it also broke CDA section 230, which would otherwise have immunized Google from civil suits. And this is interesting: all of these things happened basically because two out of three judges felt really sorry for Garcia. This is a really bizarre and unusual decision. It has no place in law. It is one of the most bizarre things that I have ever read, actually. And all for what? It was to take this movie off of YouTube. And what would that have done? We think in these terms of deletion and it gets us nowhere, because deletion isn't victory. It's not liberation. It's not freedom from fear. It's just deletion. And you see this in a rather unfortunately callous comment by one of the judges in the en banc hearing: Is there anyone in the world who doesn't know your client is associated with this video? Oh, maybe in a cave someplace. And those are the people we worry about. We weren't able to give Garcia any real measure of safety through this.
And it means that this focus on deletion is not that productive. Obviously I did spend a fair bit of time talking about filtering, but I was talking about filtering from the perspective of: how can women be on social media, and beyond social media, without getting driven off immediately? Deletion is a tool; it's not the end goal. Deletion is a tool to give people space on the internet. And I say deletion, but really I would prefer to use the word filtering, because, you know, people can yell at me all they want; I just don't want to have to hear it. And so I think a more productive direction is for us to shift away from these post-hoc, reactive measures and look at things like a compassionate user interface. An interface that understands that these things happen, and, like, maybe after the fifth time you type something furiously at someone, the interface says, hey, are you sure you want to send that message? Something that just checks you. Not necessarily something that silences you; something that checks you. Restorative justice, I think, is also a very promising frontier. Riot Games is a games company that publishes League of Legends, which is known for quite a lot of harassment, really uncouth behavior between players. They recently instituted a series of internal reforms, an ongoing experiment to try to make things a little more civil inside the game. They set up this thing called the tribunal system, where players actually get judged by basically a jury of their peers. When they get flagged with a report, they get put through the tribunal system, and the tribunal system is checked, by the way, by the staff, so the staff act almost as, I guess, an appellate court or a judge. And the tribunal will say, no, your behavior was out of line, or, no, it's okay, that report was no good.
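The "are you sure you want to send that?" check described above can be sketched as a small piece of interface logic. This is purely my own illustration of the idea, not any real platform's code, and the thresholds (five messages in a minute) are hypothetical: if a user fires off a burst of messages at the same person in a short window, the compose box pauses to ask for confirmation instead of sending silently.

```python
import time

WINDOW_SECONDS = 60   # assumed: how far back we look
BURST_LIMIT = 5       # assumed: sends that trigger the prompt

class ComposeBox:
    """Tracks recent sends per recipient and decides when to ask
    'are you sure?' before sending yet another message."""
    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.sent: dict[str, list[float]] = {}  # recipient -> timestamps

    def record_send(self, recipient: str) -> None:
        self.sent.setdefault(recipient, []).append(self.clock())

    def should_confirm(self, recipient: str) -> bool:
        # Keep only sends inside the window, then check the burst limit.
        now = self.clock()
        recent = [t for t in self.sent.get(recipient, [])
                  if now - t < WINDOW_SECONDS]
        self.sent[recipient] = recent
        return len(recent) >= BURST_LIMIT

# Simulated clock so the behavior is deterministic.
t = [0.0]
box = ComposeBox(clock=lambda: t[0])
for _ in range(4):
    box.record_send("alice")
assert not box.should_confirm("alice")   # four sends: no prompt yet
box.record_send("alice")
assert box.should_confirm("alice")       # fifth send in a minute: prompt
t[0] = 120.0
assert not box.should_confirm("alice")   # window expired: back to normal
```

The design point matches the talk: nothing is blocked or deleted, the check is ante-hoc, and it interrupts the sender rather than silencing them.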
And it turns out that the players being judged by other players often didn't realize their behavior was wrong or unacceptable to the community, and they will apologize. They will even voluntarily put themselves in restricted chat mode, which means they only get a certain number of chats, so they now have to choose between talking to their teammates or trash talking. So it puts their focus back on the game, it regulates their behavior, and it's something they opt into. I think this kind of thing is really important and very promising. One thing I do want to caution you about: this doesn't necessarily mean that all harassers are somehow just misguided, or just immature children who need to learn better. It could be that these are all incurable sociopaths, but that doesn't matter; they're trying to do better anyway, because now that the eyes of society are on them, they want to change. They do not want to be perceived as someone worthy of censure. And so what I've done here in laying out the theory of garbage is actually not that interesting: I've basically applied Larry Lessig to online harassment. There are Larry Lessig's four forces: market, law, norms, architecture. On the law side you have things like the DMCA, cyberbullying laws, revenge porn laws, and actually social media policies. These are not laws per se, but they are becoming the new law. What Facebook decides is acceptable on its platform is an extralegal determination, but it's one that looks much more like a legal determination than anything else. Then you have norms. That's just moderation. Just telling people, hey, that's not okay. That's the tribunal system. It's people coming in and going, why would you say that? This is not the place for something like that. These are the things that keep people in check.
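The "are you sure you want to send that message?" nudge mentioned earlier is easy to prototype. Here is a minimal sketch; the `SendCheck` class name and the five-messages-per-minute threshold are invented for illustration and do not reflect any real platform's implementation:

```python
import time
from collections import defaultdict, deque

# Hypothetical thresholds, chosen only for the example.
WINDOW_SECONDS = 60
BURST_THRESHOLD = 5

class SendCheck:
    """Tracks rapid-fire messages per recipient and flags when a
    confirmation prompt ('are you sure you want to send that?')
    should be shown before delivery. Illustrative sketch only."""

    def __init__(self, window=WINDOW_SECONDS, threshold=BURST_THRESHOLD):
        self.window = window
        self.threshold = threshold
        self._recent = defaultdict(deque)  # recipient -> send timestamps

    def should_prompt(self, recipient, now=None):
        now = time.monotonic() if now is None else now
        timestamps = self._recent[recipient]
        # Drop sends that fell outside the sliding window.
        while timestamps and now - timestamps[0] > self.window:
            timestamps.popleft()
        timestamps.append(now)
        # Nudge, don't block, once the burst threshold is reached.
        return len(timestamps) >= self.threshold
```

The design point is in the return value: the client surfaces a confirmation dialog, but nothing is blocked or deleted, which keeps this on the "checks you, not silences you" side of the line.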
These are the things that make people feel as though just yelling at someone is not allowed, even if no one has expressly said that to them. And then there's architecture. That's interface design. It's algorithmic self-help, like these blocking tools. This is the code, the code that is never neutral. And this is where I'm hoping we get some respite from the barrage of online abuse: if we can just design the architecture of the internet in a compassionate, egalitarian way, we can get someplace. We can get someplace much better from here. And if anyone figures out the market angle to this, just let me know, because I never did. So when we're looking at online harassment, we can learn from the history of spam, because spam is also speech. Spam is also garbage. And so we carry on; we can move forward from there. We don't have to stop at "but what about my free speech?" The conversation cannot end there. Speech is important, but we have work to do anyway. So in conclusion: the internet is mostly garbage. Garbage is a speech issue, but we still take out the garbage. Code is never neutral. We should have thoughtful design for the marketplace of ideas, and that will make for a better future. Thank you.

So we have about 15 or 20 minutes for questions.

Sarah, I love it right up until the end, and then at the end I lose it. It's like somehow you're going to try to create a norm on the open net, but it doesn't work that way. When we try to do it in the real world, you find that you have to start with an affinity group; this group in this room, we are a group. Now, if we were going to talk about something and do it in a way that was not mutually insulting, not the kind of exchange that breaks up and turns into Gamergate, we could approach that problem. And in fact, when we've tried to do this, we have used our internet question tool to do it, by starting, for example, on the gender issue.
Starting in a group like this, mixed male and female, by anonymously feeding into the question tool what your fears would be about actually talking with opposite numbers about delicate issues of diversity, and then anonymously feeding in what your hopes would be if you could do that. And then structuring a discussion that's got a real issue to talk about, and setting people to talk. And lo and behold, with the aspirational view and the fears in front of them, people are considerate as they approach it and can do well with it. But if you try to do that on the open net, you just get creamed. So my question is, as you're trying to use architecture: if you're going to limit yourself to the architecture of the open net, I don't think you've got a winner. But if you can somehow combine the architecture of real-world environments with real-world affinity groups and then scale to the net, now you could establish some kind of structure.

So I think the open net is actually just not as open as we often say it is in our rhetoric. It's enclaves. It's enclaves of people, clusters of communities. It looks more like a cluster of grapes than it does a globe. There are some places where the internet really sucks, and there are some places where the internet is really great. A really hard problem is when those worlds cross over, in what is known as a forum raid, where people from one of these worlds go into the other and just disregard the norms there. That's a really hard problem. But beyond that, I think there is absolutely a way to set norms on the internet, without ever seeing people face to face. People make friends on the internet and never meet them in real life, ever. They form these communities. They are truly people on the internet; they are truly citizens and residents of the net and of the web. And they speak to each other. They form norms and they form societies. And I think this is absolutely possible.
There are a lot of hard problems here. But when you speak to people who do community moderation, for instance, you can really hear them talking about a craft of setting a level of discourse, of this emotional care labor, where you can make people comport themselves better or worse. You can make things worse very easily. It's much harder to keep things civil and useful, to keep people engaged, to encourage people to speak, and also to get that one person who speaks way too much to pipe down a little. This is work. And I think it's entirely possible to do it just on the internet.

I guess this is a related question. About 15 years ago, Cass Sunstein wrote a book called Republic.com. He was concerned not just with harassment but with extremist speech, and he talked about how the internet made possible ideological echo chambers that seemed to encourage this kind of extremist speech. The reason I bring it up is that you're throwing spam, malware, and harassment into the garbage pail together, and I understand everything's the same if you ignore all the differences. But for Sunstein, free speech aside, there was a kind of communications disorder that this isolation created. When you look at it from that point of view, the remedies are like what you've just mentioned, but it also gets you away a little bit from this question of whether you have to take sides between free speech and harassment, because that's not really what you're talking about. So do you have no faith in the ability to address communications disorders this way?

Well, it is a bit of an infrastructure problem. You have communications disorders when you're talking about things like Stormfront. Stormfront is Stormfront; it exists, and people go to Stormfront to be white supremacists. But then you're looking at things like enclaves created on Facebook, which is a neutral platform.
And because of the way, say, algorithms work, or the way people are able to find each other on Facebook, you have a communications disorder there. In that case it does become a bit of an infrastructural problem, in that people create these echo chambers through infrastructure that's supposed to be neutral. I'm not entirely sure this means people should be hit with a daily dose of something that goes against their political beliefs. But yeah, some of it's related, some of it's not. And one more thing: when I say harassment belongs in this big bucket, I do mean that in the same sense that spam belongs there, because when it comes down to it, spam is a meaningless word. We do know what spam is, but when it comes down to it, spam is just an unwanted mailing, and that means nothing.

I'm probably one of the newer people to this space, but I gathered from your presentation that there seems to be a shift from punishing content to regulating users. I was wondering if the heckler's veto is a direct analog to that, and also if inherent in the market is a kind of aversion to regulating users and a preference for punishing content, like with Twitter.

So I think, especially in the U.S., it's a lot easier to go after the content, and that has a lot to do with the way copyright law has structured the web. Basically, there are avenues to pursue content, and people at companies who are supposed to deal with these requests, so there's a bureaucracy in place for that. And in the U.S. we're really leery of the idea of actually going after the users. Not so much in other countries. This was a pretty U.S.-centric presentation, I'm afraid. The concerns are very different in other countries, and of course free speech is a very U.S.-centric topic, especially when you're talking about it in the U.S., so I'd have to say that it totally differs depending on where you are.

Hi. Fascinating talk.
And as the son of a man who used to build sewage treatment plants, I appreciate the trash and garbage framework, because there's always trash to take out. I'm also somebody who lived the spam wars: "I'll install the server-side filter when you pry it from my cold dead hands." And I eventually did install server-side filters for a university in the area, without them having to pry my cold dead fingers, because, to use your frame here, what we were doing was victim blaming. Well, you just can't manage your mailbox. You just need to delete that. You need to just not read that. And it's exactly the same sort of thing people are saying to women about harassment: just don't pay attention, turn off your phone, whatever. Out of spam came a $7 billion email security industry, so I think you actually do have your market solution there. You just need to find a way of turning this into a profit opportunity for somebody. And I mean that seriously: why isn't this seen as a commercial opportunity for the various platforms, a way to actually make money? And is that just because women's experiences are completely devalued?

It might be that. It might be that women earn 70 cents to a man's dollar. It may also be, and this was a snide thing I said in my book, that if harassment seems like a very feminine problem, then spam seems like a very masculine one. It's Viagra, penis enlargement, mail-order brides. So spam is a problem that is very in your face when you're a man, and harassment is not so much in your face when you're a man. And guess who buys shares in the email security market, and who actually works in the email security market: all of that labor and capital, from top to bottom, it's all men. So I don't know, there are structural issues there that go way beyond the scope of this talk.
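The server-side spam filters the questioner describes share one key property with the filtering discussed in the talk: they quarantine rather than delete. Here is a minimal sketch; the `ServerSideFilter` class, the pattern list, and the threshold are all invented for illustration, not any real product's design:

```python
import re

class ServerSideFilter:
    """Scores incoming messages against user-configured patterns and
    quarantines matches instead of deleting them. Illustrative only."""

    def __init__(self, blocked_patterns, threshold=1):
        self.patterns = [re.compile(p, re.IGNORECASE) for p in blocked_patterns]
        self.threshold = threshold
        self.quarantine = []  # held for later review, never deleted

    def deliver(self, message, inbox):
        # Count how many configured patterns the message matches.
        score = sum(1 for p in self.patterns if p.search(message))
        if score >= self.threshold:
            self.quarantine.append(message)  # recipient never has to see it
        else:
            inbox.append(message)
```

Filtering here is not deletion: quarantined messages still exist and can be reviewed; the recipient just doesn't have to see them by default, which is the distinction the talk draws between giving people space and erasing speech.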
Just going to bounce back on your remark about this presentation being very US-centric. From my point of view, with a European law background, I'm fascinated by the power that the words "free speech" and "censorship" have in the US, and in particular how we all agree that there's something wrong with hate speech and all this garbage on the internet, but we absolutely refuse to touch speech. We even have to break copyright in order to solve these problems, and we always do it in roundabout ways instead of trying to address the actual problem that's right there. Maybe two examples that go in this direction. When some stakeholders try to get out of their responsibility when it comes to speech, they just say that it's censorship, as if that were argument enough in itself. You can look at Google when they had to implement the right to be forgotten: they just say, oh, it's censorship, as if that were the entirety of the debate. Also, when we were negotiating the ITRs at the ITU, most of the issues were not about internet regulation but about telecommunications, but then the free speech, free internet issues rose up, because the US didn't want to move on some of the other positions, and everything got swept up into free speech issues, and nothing got discussed in the end. Do you think we can move towards a clearer discussion of free speech issues, or will we have to avoid the elephant in the room for a long time?
So I think this is just a big difference between how Americans and Europeans approach the issue. Canadians maybe subscribe to the American view to some extent, but not quite, so let's just say Americans. To people outside the US, our fixation on free speech appears to be a sort of collective insanity. That said, I subscribe to the collective insanity; I think it's great. I think free speech is great. I'm not sure if this is just one of those intractable things, because when it comes down to it, some things are just closer to being inherent goods than other things. We talk about equality; we talk about freedom in general. To some people, maybe equality is not that good, and why do we care about it so much? But to other people it's not even worth the debate; that's the sort of thing maybe best left to first-year philosophy classes. I've given some thought to this, and I'm not really sure where I come out on it. I also find the perspectives of countries outside of Europe and outside of America very interesting, because you have a very different threat model there, especially under oppressive regimes. So I think it's largely a circumstance of how you perceive your own government and your own society, and in the States, our fixation with free speech says something very interesting about our society. But that's neither here nor there.

There's a traditional liberal response that the way to deal with bad speech is to counter it with better speech. Why doesn't that work here?

Because it just turns into a shouting match. I framed this as a usability issue; I don't think it worked before the internet either. In practice I do engage in counter-speech, and I think counter-speech is good. It's a tactic, certainly, but it can't be the only tactic. It's one tool in the toolbox, and there is a big toolbox. I don't think counter-speech was ever the whole answer, because when it comes down to it, you can only hear so many decibels, and you can only see so many messages at a time. We've always been in a place where not everyone gets to speak; or rather, everyone gets to speak, but not everyone gets to be heard. This is just something we don't talk about enough, I think.

Hi. So you mentioned this idea of the architecture and the design of the internet. As a female user of the internet, and a developer, I feel like psychologically I've had to lend myself to a certain way of thinking in order to participate in the design of the infrastructure. In a world where a lot of this architecture is designed by men, do you think equal representation in the technology sector might be part of the solution, a way of designing interfaces, and uses of the internet generally, that take female psychology into account? I also study cities, and it's quite similar: there are a number of studies exploring how the urban environment, the built environment, is also a product of gendered psychology, and showing that more representation of women in the design sector creates environments that women feel they can participate in more. So I'm curious as to your comments on that.

One hundred percent. More female engineers, more female CEOs, more female venture capitalists, more women on boards of directors. It goes all the way to the top; from top to bottom, everything is implicitly organized around how men see the world, and not just men, how white men see the world. This is a problem. This is why so many things suck, and it should absolutely be addressed, for more reasons than just that we're not able to expand this market out because it turns out this product
doesn't understand a user base beyond white men, who are not exactly a majority in the world. And that's just the start: eventually you're going to tap out your tiny minority user base, and you're going to have to work on a national level, on a global level. You're going to have to reach out to everyone, because inclusivity isn't just a social good; it's financially productive.

Hello, I'm Matt. I was just curious what you thought about the role that Section 230 and safe harbor immunity have played. I know you said you didn't really think about a market solution, but if we repealed 230, and Twitter were under threat of litigation for inflicting what they are upon the world, that would certainly shift market dynamics.

If we repealed 230, the internet as it exists would no longer exist, so let's not do that. 230: good. People who hate 230: stupid. There are some issues with 230, I think, and reasonable minds can differ on whether very narrow exceptions to 230 are warranted, but 230 as a whole is good.

So I think we have time for one more question here.

Oh no, it's not going to be a very good question, I'm sorry, but I'll try anyway. Thank you for a very thought-provoking talk. You make, I think, the case for a multifaceted approach: in some respects trying to address the behavior underlying harassment, as in the case of the tribunals at Riot Games, improving architecture, improving user-side filters. But you also seem to go out of your way to take deletion off the table. Is that what you intend? You say deletion is not the solution; of course by itself it's not the solution, but I think there are arguments that deletion should be more rapid and should be easier, recognizing there are costs in that.

I don't care for that. I think there's too much of a focus on deletion. Obviously, some things have got to go. I don't think there's any point in providing, for instance, client-side filters for child porn; I don't think we should do that. Deletion is in many instances the answer; it is actually the answer. I do think, however, that we tend to focus too much on deletion, and that the law especially is very much a blunt tool in this. The law focuses on deletion and imprisonment and sanctions. And this is one of those things where, when engineers are faced with a hard problem, they put up their hands and say, well, that's for the law to decide, and when lawyers are faced with hard problems, they put up their hands and say, well, that's for the tech industry to decide. And yeah, I'm doing a bit of that, where I'm saying I think it's much more agile to work with the architecture and to look for fast solutions to this problem rather than to go after it with this blunt tool.