Senior vice president and chief program officer for the Knight Foundation. Great, thank you, Robin. We're almost, almost to the end of the day, I think, as Sarah Bartlett said at the beginning of her question in the last panel. So it was really nice to have a happy panel; now for a less happy panel. Unfortunately, as I mentioned last night, I think it's safe to say at this moment that our love affair with the internet has been at least tempered by the growing realization that, like any population-scale system, the internet combines great potential for good and flourishing with perhaps equally great potential for harm. I think it's also safe to say that the rise of the social and mobile internet as an increasingly dominant entry point for information, for discussion, and for commerce has fundamentally reshaped the stakes of an informed democracy. Which is to say that if we're going to be an informed democracy, then we're going to be an informed digital democracy; if "informed" and "digital" can't go together, we are without hope. But at Knight we're hopeful, and one of the key steps we've taken to address this new reality is to make a substantial investment in fundamental, basic knowledge about how the information system is changing and the implications for democracy. Alberto Ibargüen alluded to this work this morning, but to put a number on it: over the past year we've made approximately $50 million in investments over the course of five years, both to establish new centers of research and to accelerate existing efforts. The five new centers that this funding founded are at the University of Washington, New York University, the George Washington University, Carnegie Mellon University, and the University of North Carolina. There's a complete listing in your materials.
It's been led by John Sands, who's sitting right in the front of this room. So today what we're going to do is dive into one feature of this new environment that has been one of the most covered and one of the most disturbing. It also, I think, epitomizes why an investment in basic knowledge is so critical. The rise and growing profusion of hateful and extreme content on the internet is at the same time both alarming and incredibly difficult to describe. The amount of knowledge we have about this content, where it comes from, the extent of it, its impact on us: that knowledge is growing, thanks to the folks on this stage, but it's still relatively minuscule compared to what we understand about other population-scale challenges. So to guide us through this topic, we're going to do this a little differently than some of the other sessions; we're going to have a series of short talks. First we'll hear from Talal Ansari. He's a reporter for The Wall Street Journal who has been covering hate speech and the relationship between online and offline manifestations of hate. Second, we will hear from Joan Donovan, who is the director of the Technology and Social Change Research Project at Harvard's Shorenstein Center and has focused in particular on how different movements use technology and how that should inform our response. We'll then hear from Safiya Noble, who is a professor at the University of California, Los Angeles, and who studies the relationship between how technology is designed at the algorithmic level and the real-world impacts of that design on vulnerable populations. And then lastly we'll hear from Ethan Zuckerman, who is professor and director at the Center for Civic Media at MIT. He has lately been exploring what a positive future for the web might look like. So again, we'll hear from each in turn, and then we'll have a brief discussion. Hey, thanks for having me. Yeah, this is also a happy panel.
I'm sure. At home, at least. So it's my pleasure being here. It's my second time here; it's great meeting such intelligent people doing good work. So I'm here with my fellow speakers in academia, and I hope to provide a slightly different perspective; I believe I'm the only reporter in this group. A little background: I was assigned a side beat covering hate crimes and discrimination in the US. It was 2016 at the time. I wrote dozens of stories over those years, ranging from mosque bombings to targeted assaults, murders, and other such incidents. That first year I found myself covering the shooting death of an imam, a Muslim religious leader, in Queens. As I live-tweeted scenes from his funeral, my phone began to light up with a lot of notifications. I began getting hateful tweets directed at me and at minorities at large, and it began to just snowball from there. Could I get the next slide? So what you see here is one of my earliest encounters with hateful content on the internet. It was shocking at first, but over time, as I continued to report on these issues, it became normal to encounter images and words like this. So as I planned for this session, I looked back on all my years of reporting, and I just kept coming back to this tweet, and I think it can sort of impart what I see as some of the issues related to hateful content online. So, you know, if you look at it, it represents the challenges that are faced by social media companies like Facebook and Twitter, as well as by ordinary citizens. It's hateful. It's crass. It's alarming, sure. But what makes this so damaging, in my opinion, is that it's public. It's visible to anyone online. In this case, I believe,
we're talking about Twitter. It's the digital equivalent of a poster thrown up along a wall on a busy street. And even though it's public, this sort of content, for one reason or another, has proven difficult for social media companies to tackle. Just like their challenges with abating disinformation and lies online, hateful content is hard to monitor or catch, or, ideally, squash before it's even viewed, let alone shared thousands of times over. So questions start arising: How does one police this sort of content? How is this content not given a stage? Could I get the next slide? The second issue I learned, exemplified by this tweet, is the problem of anonymity. Anonymity is what allows much of the hateful content to exist online. The person who tweeted this, a Pepe the Frog account, is by and large anonymous. It allows them to tweet with, and I'll be kind here, such venom. Part of the reason they feel safe in doing so is precisely because no one knows who they are. Which leads to my second point: with this anonymity comes a lack of accountability. The lack of accountability for a person or group spreading this hate online is, at its core, what allows the toxic ecosystem to thrive in the way it does today. If this person gets banned, so what? It's a fake profile; they'll just create another. Nothing else happens. Are there any real-life ramifications? Absolutely not. And if you look at some of the most hateful, darkest areas of the internet, you'll find anonymity everywhere.
In fact, you'll find platforms that base themselves solely on the premise of anonymity, and obviously here I'm referring to anonymous message boards like 4chan, 8chan, and the like. Time and time again, in my reporting and the reporting of so many other great journalists, sites like these are often the genesis of much of this hateful rhetoric. These places also happen to be the genesis of the disinformation campaigns that you see going on, or the targeting of certain reporters or certain news outlets. That leads to more questions: What are the potential outcomes of this hateful content online? What can it lead to? Perhaps the most disturbing aspect of online hate to me is the real-world outcomes, events that transcend the virtual. This can be in the form of school bullying, hateful incidents on campus (which are on the rise), the targeting of religious institutions, or, in its worst form, mass shootings. In the mass shooting in El Paso, Texas, last year, the suspect is believed to have posted a manifesto full of hatred toward immigrants right there on one of those message boards. And in the New Zealand mosque shootings, the suspect not only live-streamed those killings, but he also posted his manifesto on a similar message board. Look at my last slide, please. This is an article I wrote after those attacks in New Zealand. That shooter's manifesto was filled with hate, but it was also riddled with memes and references that were half serious. There were inside jokes in there, seventy-some pages,
I believe, meant to amuse people who live in this alternate media ecosystem, like 4chan and 8chan. But the manifesto also revealed itself to have a deeper intent. It was designed to unload a great deal of misinformation; it was literally packed with all sorts of erroneous claims that he knew people within his ecosystem would understand to be jokes, but that a layperson, or a reporter covering this for the first time, or a breaking-news reporter, would believe to be true. So how do we tackle this, and how do we go on from here? More questions. I'm a journalist; I'm asking questions, not answering them. Should news outlets report on hateful content, or is that just giving shooters the attention they want? How do social media companies or internet service providers deal with the issue? If they're trying to handle it, how do they do it better? Does the government have a role to play, and if so, how much? Where is the line between privacy and preventing the next mass shooting? It's not my job to theorize on these solutions. I only know that, at the very least, news outlets and reporters, at both the local and national level, need to be cognizant of how to report on incidents like this, and also how not to report on incidents like this. I believe that this can only happen through an understanding of the dynamics of modern-day hate speech on the internet. People in the news media must have a basic understanding of that world. And to be honest, hateful content online is at the nexus of so many other issues, whether it be bullying, as I mentioned, disinformation, politics, outside state actors manipulating our elections, the demonization of races, religions, or people of certain sexual orientations, and so on; so much of it is propagated there. And that's why, as we head into the next election, and as more people and their lives are lived online in ever-increasing ways, large and small, these questions are of the utmost importance. And that's why I look forward to hearing my colleagues' research on this matter,
as I know they've done a great deal of work on this precise issue. Thank you. Hey everybody, Joan Donovan. I'm working up at Harvard now at the Shorenstein Center, and previously I was working at Data & Society. So last year you actually heard from danah boyd about strategic amplification and different ways that we can think through the issue of hate speech online, as well as other problems that we see propagated both by algorithms and by groups of people who are very interested in putting negative things into the ether. And I want to thank Talal for setting me up; there's a lot of good things that he said that I don't have to cover now, which is great. And then after the panel, join me in the hot tub, because we're going to need to relax; it's going to be a tense one. But I think that there's a role for everyone in this room to play, and I hope that we can have that discussion and dialogue about the little pieces each of us can do to help. I don't think we're ever going to get rid of the problem of white supremacy online, but I think we can mitigate, quarantine, and lessen the impacts through some strategic work. So two years ago, in a room very much like this, Patricia Hill Collins, who's a Black feminist professor, posed a difficult question to sociologists at our annual meeting. She said: Do we care as much as they do? She had just given a very long presentation on the history of white supremacist organizing, both pre- and post-internet, and the violence in Charlottesville. And the room was silent. I'm now ready, I think, to give an answer to that question, and I want to say that the work we've undertaken over the last five years is because I care more about racial equity than any other topic. Ignoring racial equity in our work creates many failures to address the inequalities and harms that trouble all of our political and social institutions right now, even in journalism. This morning
I attended a snap panel down on the second floor where many were talking about how they don't feel like their stories are being told: Black women, people of color, LGBTQ folks. We struggle to be seen. Sometimes living on the margins feels like a home, but usually it's that posing our critiques in the way that we do makes a lot of people very uncomfortable, because there are some who benefit from keeping things the way they are. I fight against that every day, and our research is a way to say that things as they are are not working for most of us. I think there are ways we can change that, but it only starts with admitting that you care more than they do. And then you get organized, right? That's part two. So my team at Shorenstein has a mission; it's very clear to us. We want to force accountability for tech companies who build and deploy products without understanding how their profit comes at a huge cost to most of us. We study white supremacists because we understand that this is a group of people who have spent the last hundred years manipulating media for their own gain, and I'll be damned if I spend the next hundred years watching them do more of the same, especially as platforms have become the primary distributor of content (content, whatever that is) and news. These manipulators have capitalized on platforms' lack of content-moderation policies and under-enforcement of their terms of service. So our research team studies white supremacists along four dimensions. First, we look at polarization on wedge issues, where white supremacists piggyback their talking points into mainstream conversations, usually via social media. For example, the issue of opioid addiction has touched many of us, but white supremacists use it as a talking point to highlight white pain and to build the "white lives matter" movement. This movement holds rallies in small, often rural towns, where they can recruit people in moments of intense insecurity and fear. The second dimension is that we track the
tactics that they use to hoax and manipulate algorithms and platforms. Because they can't show up as who they are, they always have to cloak and mask themselves in different ways. This is actually a very old tactic: in the '90s, white supremacists used online communication technologies to recruit new people and to hone their messaging, usually using bulletin boards, simple message boards. And we see, again, simple message boards and anonymity as another perpetual problem. But how many of you know that MartinLutherKing.org, for nearly 20 years, was run by a group of white supremacists? Right, very few of us know that. And Jessie Daniels, who's a sociologist, has spent a decade tracking white supremacist movements on the internet and the ways they've moved online to build their movement. Similarly, my team looks at the ways that platforms have added amplification power for little to no cost, and we're indebted, of course, to Safiya's work for framing the inequalities created by algorithms as a top issue for scholars of critical internet studies. The third dimension is that we know white supremacists attack journalists and researchers because they need to silence us. Our work is effective and essential, and it's also dangerous and very scary; most of our loved ones ask us to quit, and that is something we all have to face. The harassment campaigns by white supremacists work because social media platforms make our personal social networks completely public. Currently, it's not hard for those who want to dox us or get us fired to contact our bosses, our family, our colleagues, because that information is neatly displayed in a sidebar, as Talal has shown you. But we need to flip the script on free speech. We need to understand that content moderation is different from an issue of censorship, because what about my free speech?
Right now, my speech is restricted by the threat of violent attacks against myself and my loved ones. And I'm mindful that speech doesn't happen in a social vacuum. Speech is never free from consequence; it's always relational. But some people hold the power to shape that speech, and we need to put them into the accountability matrix. Fourth, organized white supremacy has gained power in high places over the last few years, and this has left many of us feeling beaten hollow. The inaction of platform companies, their refusal to admit that white nationalists were using their technology to recruit and raise resources, felt like a tacit endorsement of these groups. Only after murders, the burning of churches and community centers, and countless articles written by journalists and researchers did platform companies begin to take action against some of the more noxious offenders. But I stand here to remind you that Richard Spencer, the architect of the alt-right, still has a presence on YouTube and on Twitter today. Finally, the use of platforms to archive and distribute the manifestos of mass murderers must end. The killers in Christchurch, and last week in Germany, used violence to call attention to their ideas, and this tactic only works because platforms are designed for profit and not for community security. So as I close, I want you to think about how the future of our society, our democracy, depends on reliable and stable communication. It must be predictable, and transparency and accountability for technology companies must be at the forefront of our journalism and research. That being said, we must think about the internet like we think about democracy. These are not products that can work without participation; like democracy, the internet and platforms are social processes, and as such they will never be finished. Social processes, though, require rules of engagement and consequences for abuse. To be sure, we can't design our way out of this.
We need to see platforms for what they are. They're amplification technologies that tilt in favor of those already powerful and well resourced, the people who are willing to pay to influence our social and political institutions. Platforms are not neutral; they have consequences for most of us, and we hope that our research can play a pivotal role in how policy is shaped going forward as we think about redesigning our communication environments online. And, you know, sadly, gone are the days when social media will hoist the next social movement. I have grieved this, and it's time to move on. I'm confident that democracy will survive social media, but platforms cannot survive without us. And so we need to think about what powers we have, what networks we have, and whether we're putting them at risk by putting them on blast through social media. And we must remember that our words are our weapons, and they are incredibly powerful. Thank you. Thank you so much for the opportunity to share a little bit about my work and to be part of such a distinguished panel. My name is Safiya Noble. I'm a professor at UCLA. I wrote a book called Algorithms of Oppression: How Search Engines Reinforce Racism. When I started on that book about ten years ago, as a dissertation that became a book, it was very, very difficult to get even four people to sit on a dissertation committee and acknowledge openly that algorithms and technology could in fact inherently discriminate. In fact, the ideas we're talking about today have become mainstream over the last decade. Now we hear lots of conversations about biased algorithms and how AI and algorithms can discriminate. But I promise you that just ten years ago, when we talked about computer code and computer programming and their incredible importance in fomenting social inequality and discrimination, people would say to me, "Safiya, that's impossible, because computer code is just math."
I would argue that that's a bit like saying human beings are just cells, we're just mitochondria. It's really insufficient to talk about programming and code and platforms at the level of thinking of them as simply mathematical formulations. They are in fact inherently programmed with all kinds of bias, but they're also deployed and weaponized in a variety of different ways, which of course Joan and others on the panel will talk about today. So I want to share with you some of the findings that informed that book, Algorithms of Oppression, which really started quite simply by doing keyword searches on a variety of racialized and gendered identities. I'm a Black woman; at one point in my life I was a Black girl. I have a daughter; I have a host of nieces. I was curious about the way in which people were relating to platforms like search engines as kind of the new public library. This is of course what was happening when I was going back to graduate school, around 2009. Everyone was enamored. Many of us were on the old internet, let's say Web 1.0, which, I'm not even going to talk about it being full of pop-up ads; instead,
I'm going to say that it was, you know, a very disorganized kind of experience. And so many people were thrilled to see things like search engines come along. And I was curious about this because, at the time I was entering graduate school, I was just leaving a 15-year career in advertising and marketing, where we in ad agencies were deeply invested in trying to manipulate the kinds of results that showed up on the first page of search. This was before we started calling that phenomenon SEO, or search engine optimization. It was just: how do we make sure that content about our clients shows up on the first page? Because most people, if you look at the information-retrieval research, do not go past the first page of results, whether in a search engine or in a library database search. So what happens on the first page of search is incredibly important, just like what happens, let's say, in the first couple of minutes of your news feed in a social media platform is very important. And the type of content that I found as I was doing these keyword searches on terms like "black girls," "Latina girls," "Asian girls" was almost exclusively pornography. And the question is: how can it be that black girls and Latina girls and Asian girls are synonymous with pornography? You don't have to add the words "porn" or "sex"; that's just how we were encoded into the platforms. And of course part of that is because the commodification of women, and particularly of women and girls of color, is incredibly big business in the United States. And so these are the kinds of things that really gave life to the book. And ultimately, as I was writing the book and collecting a lot of data about all of the different ways in which people of color and vulnerable people are misrepresented in platforms,
it became apparent that we are in a crisis, in the sense that we cannot control the ways in which we're represented or misrepresented in these platforms. In fact, those who have the most money are the winners in these platforms, because these are advertising platforms, and they're organized to return profit and to circulate what is often the most titillating, egregious kind of content, because that in fact is the content that goes viral. It's the kind of content that gets clicks. It's the kind of content that puts us at peril. The other thing that I talk about in this book is, of course, the case of Dylann Roof. Dylann Roof, if you don't know, opened fire on unsuspecting African-American worshippers in Charleston, South Carolina, in the summer of 2015, killing nine African-Americans at Emanuel AME Church. Since that time, of course, we've had many more violent mass murders motivated by race and religion, but at the time it was one of the worst mass murders we'd seen in the United States. And Dylann Roof, in his own words, articulated that he was in fact searching for information about Trayvon Martin and George Zimmerman and was trying to make sense of the news reporting that was happening: why, and who was this Trayvon Martin? And so you think about the role that
we ascribe to search engines in particular, but certainly also to other kinds of platforms that are increasingly responsible for sharing out information or fact-checking or providing credible resources. I think that we've given over an incredible amount of power to these platforms, and of course this is what we're here to talk about today. But one of the things I want to underscore is that it's really been women, women of color, LGBTQI, feminist scholars and journalists who have put ourselves on the line and taken an incredible number of body blows just to normalize this conversation and bring it into the mainstream. And most of us have done this with very, very little financial support or resources or investment in our work. And so if there were another plea that I might make, beyond just becoming educated, it's to think about what it costs us when we keep investing in the same kind of techno-utopian dreams that come out of the major silicon corridors around the United States, not just Silicon Valley, but a number of, you know, private universities and other kinds of institutions and think tanks that are able to amass incredible resources, after 30 years of selling us this kind of celebratory, emancipatory rhetoric about the possibilities of digital technologies and platforms and the internet, while the rest of us have been in harm's way and quite frankly paid the price for these alleged liberatory possibilities of the internet. And so I would say it's important. I think about the work that Joan is doing, despite the fact that it's at Harvard, and the work that we're doing at UCLA, and what it means at this particular moment, as we're grappling with all of these kinds of issues, that we are seeing a massive divestment from the public institutions that could serve as the counterweight. You know, I often talk to policymakers and regulators, and I say it's insufficient to just think about regulating the big tech
sector and not also contend with the fact that we are divesting from public education, divesting from public universities, divesting from public media. We cannot create successful, powerful, democratic counterweights as this sector grows at our expense. And of course, being in the University of California system, where we should be flush with resources, our system is literally dying on the vine relative to other kinds of private universities. We have Silicon Valley and Silicon Beach in California, but when those companies don't pay taxes, they actually bankrupt our public institutions, and I think this is something we want to contend with today. The last thing I'll say, because I'm out of time, is that we need massive paradigm shifting. We can't just keep investing in the same old rhetorics and toolmaking and technodeterministic ideas, where we're somehow going to perfect the algorithms or perfect the bias out of AI. There was a time in this country when we thought our whole economy was dependent upon things like big tobacco or big cotton, and we couldn't imagine reorganizing our economy, even though those economies were predicated upon the enslavement and human trafficking of African bodies and the occupation of Indigenous lands. I think it's important right now for us to think about how we could have a paradigm shift with respect to big tech, like we've had with respect to big tobacco, and to think about all of the ways in which communities are facing incredible harm through the rise of these new predictive technologies. And I think we have a great opportunity before us in the next decade.
Thank you. Hey everybody, my name is Ethan Zuckerman. I'm really thrilled to share the stage with so many scholars I really admire and have learned from, and journalists I've gotten a great deal out of reading. And I'm thrilled to be with you because I actually want to make a plea for your help on this set of problems that we are collectively working on. So let's just talk briefly about the set of problems. Social media right now is not working well for us as citizens in a democracy. We have serious concerns that it's increasing polarization. We have serious concerns that it is spreading mis- and disinformation. We have very real concerns that it is pushing people toward extremism. And we have real concerns that the algorithms underlying these systems are deeply unfair, in ways that are very, very hard to counterbalance. And the first thing I want to say on this is: we shouldn't be surprised. These tools that we've built were not necessarily intended to be a particularly healthy digital public sphere. They were intended to be tools that capture our attention, grab it, and repackage it for advertisers, and, in the process, if they can squeeze as much personal data out of us as possible, that would be great. They've only become our digital public sphere because they got really, really big, really, really fast, and in the process they've become deeply unhealthy spaces in which to have civic dialogues. They're built around surveillance. They're at such scales that they become almost impossible to govern, and frankly, the governance of them is treated as a cost, not as something that you would actually want to take seriously. It's something that the companies try to spend as little on as possible so they can simply get on to selling more ads. The solutions that are being put on the table, for the most part, suck.
They're terrible. They either involve asking the government to step in and regulate speech, which is not a good idea in general and is really, really hard in the United States, or asking these incredibly powerful companies to regulate themselves and to regulate speech, which is also not going to work very well. The set of suggestions that actually aren't so bad are around transparency, but transparency, in the grand scheme of things, is pretty weak tea. It's necessary, but not sufficient. We should have much more information about how these systems work, but even armed with that information, it's very hard to know how we make these better. What we are facing is a classic situation of market failure. We have let a largely unregulated market take over our public sphere, and that's a really poor idea. Now, the good news is that we know something about how you correct market failures. There was a massive market failure in television in the United States in the 1960s, and you had Newton Minow, the new FCC chairman under JFK, stand up in front of the National Association of Broadcasters and say: if you watch your own programming, what you will see is a vast wasteland. I've been thinking about that phrase a lot lately, as someone who studies social media and who, frankly, has had a hand in building social media. But Minow's response to the vast wasteland was not to try to ban Gilligan's Island or I Love Lucy. It was to try to build a whole new set of architectures and infrastructures on which a better future could be built: a future where we got better children's programming, better civic programming, better local news. And that meant doing things like investing in satellite systems to allow public broadcasters to share information back and forth. It meant the Corporation for Public Broadcasting, NPR, PBS. It meant not just throwing ideas up there and hoping they would work out. It meant really deep investments, like the Children's Television Workshop, which spent two years working with academics on what
television might be able to be for preschoolers before it ended up creating Sesame Street. The failure we are dealing with is a failure of imagination. My friend Kara Swisher was up here before; she talked about this idea that there are really only two tech companies that matter, Facebook and Google, because they've eaten the rest of them. And in the ad market, that's absolutely true. As a result, we have not had much very interesting new thinking for the last 10 years, because what the market incentivizes is building something that's going to get bought by one of those companies as soon as possible. So we need a different way. Paul Romer, an economist who won the Nobel Prize recently, pretty bright guy, has proposed a tax on surveillance advertising, a big tax on surveillance advertising. He believes that advertising that tracks us, that sucks out our personal preferences, is corrosive to us living in a democracy. I agree with that, but I want to do something with that tax. I want to put a significant tax on Google, Facebook, and others who are engaged in surveillance advertising, and I want to use that money to create the PBS of social media. I want to build something focused on digital public infrastructure. What are the tools that we collectively need so we can start creating an internet that's actually good for us as communities? What would it mean if, instead of going onto these social networks that support more than a billion people and outsource all of their community decisions to very poorly paid and poorly treated people in the Philippines, we were actually building tools alongside the communities that we all individually work with and serve? Why can't my town of 3,000 people have a social network that supports us when we have a town meeting?
We could join it for four days before that meeting and get off of it a day afterwards. These things are technically possible, but they are not particularly profitable, and what they need is for us to let go of this notion that some genius in Silicon Valley is going to come up with something new and clever that is simultaneously going to solve our civic problems as well as make someone another trillion dollars. That's not going to happen. Instead of looking at this space and abandoning it and burning it to the ground, despite the fact that that feels really appealing at times, we need to lean into it in a very serious way.

And that's why I'm giving this talk to this group of people. Community foundations, local newspapers, the sort of people who come to a conference like this, are the people that need to be thinking about what a different possible future might look like. Imagine for a moment that civic media, that social media, that digital media wasn't awful, wasn't corrosive, wasn't terrible. Imagine that it was the heart of rebirth and of rebuilding our communities. What would that look like, and how can we build it?

There are some amazing people who are in the early stages of trying to do this work. Sir Tim Berners-Lee is with us in the audience; his new project is one of dozens of projects out there trying to rethink from the ground up how we might build an internet that is actually good for us. And I am asking you, I am begging you, to get involved with this conversation early on. Do not be satisfied just with figuring out how we put some lightweight regulation on these tools. Take on the really hard question of what we could imagine these tools doing for us in society and in democracies. And let's figure out a way to start getting these companies that have caused so much harm to put taxpayer money on the table, and let us start figuring out how we build digital public infrastructure and actually build some alternatives. Thank you.
Thank you, thank you all for those remarks. Some people are guiltily tweeting them out, I'm sure. After that, what I want to do is ask a few questions myself, and then we'll open it up to a couple of questions from the audience.

I want to start with you, Talal. To the extent that we're going to have to live with this reality for some period of time, even as we move toward solutions: you made a comment in your remarks about the right way and the wrong way to report on this. Many of your counterparts, these researchers, talk to reporters every day. For folks who really are in the media and thinking about this, or even in communities where these issues are coming up, what's the right way, and what's the wrong way, to talk about this phenomenon when it's happening in your community?

Well, there is no one-size-fits-all here; it depends on each case. In the case of the New Zealand attacks, if a reporter didn't know that the guy was claiming some imaginary anime character as a hero, they would actually report on that, and it would make print. But if they're aware of what he's even referring to, they won't. So it's basic digital literacy, and I think it affects not only national reporters but local reporters, especially those that haven't grown up on the internet, quote-unquote. I don't know what form or shape it would take, but just familiarizing yourself, reading the work of other people: these sorts of things would go a long way toward how to, and how not to, report on these issues.

So I want to pose a question to both of you. Talal had, on one of his slides, a listing of some of the attributes that enable hateful content.
So, the visibility quotient, anonymity, I can't remember what the other two were. Ability? Go ahead. Accountability, a sort of lack of accountability. You know, there are certainly some in Silicon Valley who would argue that this is exactly the power, the liberatory power, of the internet. They might say it with a kind of messianic devotion, but I think there are others who would say, look, you don't have #MeToo without some anonymity; you don't have #MeToo without the ability to garner public visibility. That would be an example of reporting where news organizations actually changed some of their standards of corroboration in response to an online social movement. So how do you think about that challenge? Are we deluding ourselves when we think of these as strengths of the internet? Or is it about preserving the positive power of those attributes without drowning in the oppressive side?

Yeah, take the example of the tax. I would love to put a tax on everything and pay for public institutions, and really do college the right way. But the minute you tax surveillance advertising, you get into a situation where you've sanctioned it, just like big tobacco, right?
So we're back at the beginning. And part of surveillance advertising, the actual way things work, is that anonymity is no longer possible. You have to make a significant investment in remaining anonymous to platform companies in order to remain anonymous. And if you remain anonymous, you don't get the lift and the bounce that you need to do networked harassment campaigns, because you're worried about who's going to be connected to you, and whether being connected to you reveals something in some other network. The way in which we de-anonymize accounts at this stage: you don't just look on Twitter for evidence on Twitter; you start to look around, and you look on other platforms.

I have an article in the Journal of Design and Science, in a special issue that Ethan put together, called "On the Internet, Nobody Knows You're a Bot," which is a riff on that old adage, "On the internet, nobody knows you're a dog." It's a cute little New Yorker cartoon with a dog on the computer, right? It's great, but it doesn't hold true anymore. And I think we have to look at at least the last twenty years of social media and realize that, yes, they have built surveillance technologies that are really good at shielding people that they don't want to hold accountable for using the tools that they've built. This is not hacking; this is very low-tech. It's using the features, and it's bringing massive swarms of people to basically shut up journalists who are troubling the way in which they want to spread disinformation, or the way in which they want to attack journalists or professors.

And I think that accountability has to start with platform companies being much more transparent about what they know and what they don't know, and what they're acting on and what they're not acting on. So we see right now in the disinformation space a lot of focus on creating this idea that disinformation happens out there.
It's framed as a foreign-operative problem. They're not looking at the domestic space, because they'd have to challenge free speech. But what we know about disinformation is that it is as much a domestic problem as it is a problem of intrusion. So there's a lot to unpack there, but no, anonymity is not guaranteed, and if people know what the rules are around anonymity, they're much more likely to think differently about their social media presence.

So, Safiya, you, on this kind of positive potential versus oppressive potential of the same attributes?

Well, I think part of the challenge is that we, the big "we," are wrapped up in this idea that the platform is somehow neutral and can be deployed for good or for bad, and I think that's a false start. The truth is that the platforms are designed both for massive extraction of information about us and to return as much profit as possible to shareholders. And one of the reasons we're caught up in the idea of not particularly holding the platforms accountable is because the platforms have invested deeply in the notion that they are not media companies, that they're just kind of the dumb pipe, so to speak. This is where we get into Section 230, the part of the Communications Decency Act that governs the internet, and in this sense the framing again puts the onus back on the public: how does the public use the internet, either for good or for bad, for #MeToo or for white supremacy?
And I think what critical internet scholars are doing, and certainly what we're trying to do at the UCLA Center for Critical Internet Inquiry, is destabilize these ideas about platforms being agnostic, about users simply being galvanized or information simply being weaponized, when in fact the platforms play a really integral role. Ethan mentioned the content moderators in the Philippines. It's because of the work of scholars like Sarah Roberts, who wrote the first academic study discovering that there were people all over the world who were moderating content, taking content down, and working within all kinds of nation-state rules of engagement around content. That work alone destabilizes the idea that the internet is a free-speech zone. What we know is that they are doing brand and reputation management work, and in some countries your brand is not tarnished when you let white supremacists rule on it or let Gamergate go down in it; in fact, it might bolster your brand. So these are values questions that we really have to hold up in a different light and talk about, rather than again letting the platforms off the hook.

So I'm going to ask Ethan a question, and then we'll open it up, so think about what you might ask. Just remember, questions are interrogatory, not declarative. Just a helpful pro tip. So, Ethan, I have a paradigm question for you about some of the concrete things that you advocated. I think someone could leave this discussion still wondering: from a public policy perspective, do we face an eradication problem or a resilience problem? Someone could look at Minow and say it's unclear whether he himself was sure whether this was an eradication or a resilience problem. From a paradigm perspective, how do you think about it?
So I'm clearly advocating resilience and not eradication. As much as I would love to press the button and delete all white supremacists and all hateful voices from online spaces, I don't see good ways to make that happen. What I actually think we want to think about is how we attack existing platforms that trend toward radicalization. It's very easy right now to fall into a data void, as danah boyd talked about here, where people have been so effective in creating data to lead you toward their agenda that it ends up dominating the search engines. It's very, very easy to find yourself on YouTube, not necessarily just following the recommendations, although that can be one piece of it, but following these very careful paths that try to take you from seeking mental help for depression, to self-betterment, which rapidly turns into men's rights, which has a way of then turning into white supremacy. So we need to look at the ways that platforms are already being exploited.

For me, what I'm pushing for in the long run is a participatory internet that is much, much more heterogeneous, that has many, many more platforms, where the vast majority of them are actually governed by the people who use them. It's much closer to a model that looks like Reddit than one that looks like Facebook. People who don't know Reddit well are probably wincing at hearing that. The interesting thing about Reddit, which we're doing a lot of research on, is that it is not a vast cesspool.
It's actually an enormously complex community, much of which works really, really well, with a small number of cesspools. The resilient future that I am hoping for is one with fewer cesspools, cesspools that are less powerful at taking over and infecting these enormous networks that we have little control over. So, just to be really clear, I don't have a good answer to the Minow question. My answer is very much one of trying to get far less centralized and far more distributed, and then working individually with those communities when they continue to be deeply problematic, like 8kun or Kiwi Farms or some of the truly awful ones out there.

Thank you. We have time for two questions. One back here.

Hi, I'm Chastity Pratt. I'm a Nieman Fellow at Harvard, and I'm launching a media co-op to report on school funding, and I have been attacked by hateful content online. So I'm wondering if Joan and the rest of the panel could give some tips to the media and the media funders here about how we can protect our content from these online predators who really just want to disrupt the work that we're putting together.

Yeah. From my perspective, it's that small number of cesspools that are allowed to persist; in the way in which they are marginalized on the rest of the net, they become highly motivated and mobilized to do damage to other communities. So you might have a small group of people. You don't think you should be a target; you're not doing anything so crazy that you would end up being one. But they come across your content, and then they start to strategize. They actually start to think with the tools of where you are: can I get into their comment stream? Can I get into their DMs?
They're getting some media attention; can I pose a counter-narrative, a counter-story, and plant some other story that will then push our agenda? And you can imagine, with school spending, that there are a lot of ideas about who actually is using school funds. I've of course read a ton about school resources being used for ESL, English as a second language; of course that brings all the racists right in. So that's not an answer to your question, but it is getting at what I think we need to think more about, which is holding these platform companies accountable.

There are all these places on the net where you can do content moderation. Even at the level of the government: the government has, like, an on-off switch. They really can't do anything, but they can do one thing; they can literally shut down the power grid to turn things off, and we've seen that happen in other places. At the level of the individual, though, it's really hard, because they haven't built tools that we can use. There were a few instances where people had mass block lists that you could upload into your social media, but that didn't stop others from seeing the bad stuff around your content. So I think, as we work on accountability and transparency in this space, we actually have to work on a toolkit of refusal, one that says: no, we don't want other people seeing this content; or, we want to be open to some communities, but when we see evidence of a swarm, we want you, the social media company, to help us quarantine and bracket that. Because what do most journalists do? You shut down your account for a couple of days. That's what you do, and that's a terrible thing to do if you've just launched a really good investigation and you're trying to make an impact and to listen to your audience, and the only recourse you have is just to shut it down.
I don't think that's useful, but right now we're in a problem space where the tools just aren't there for what we need.

We have time for one question, over here. Yes, sir.

I'm the publisher of an independent publication in Long Beach, California, and we have a problem in our city: in certain neighborhoods, our Nextdoor pages are particularly grotesque. There are two issues, both the hate language and also misinformation. We've taken an approach of trying to proactively combat and engage to correct where possible, although some people in our newsroom, some of our leaders, say just don't ever engage. But this is a resource-intensive practice for us. An example: there was a rumor recently that a homeless shelter was going to be built in your neighborhood park, and 15,000 comments later we finally got control of it. It overwhelmed two of our city council offices; it overwhelmed a city council meeting for a while. My question is: is there any best practice or counsel about whether to do this kind of direct, proactive engagement to correct things like this? Any resources, any tips? Because it's exhausting, and it is resource-intensive for us.

I would just offer that it's true that education and educating are incredibly resource-intensive. In fact, I remind my students, even though they're enamored with things like search engines, I ask them: why did they come to UCLA, then, if all knowledge can be known in a search? And then everyone gets reoriented. So I think those investments have to be made, because those are also democracy-building projects.
I mean, direct engagement: we know from the learning sciences, for example, those of us who work in education, that learning is iterative. It requires human beings to go back and forth, to engage with ideas and material and research in particular, and that is how learning happens. It really isn't just a static push, and then you digest, and now you're educated.

So I guess I would say one of the things we think about a lot is slow information movements versus fast information movements, looking at other ways that people have come to understand this. The local farmers market and the slow food movement are actually a paradigm we can relate to when we put them up against mass multinational corporate fast food, right? So too with our information environments and our knowledge environments. And being very clear: one of the worst things that I think exists in the field of talking about information today is the flattening of knowledge and information and propaganda by calling it all "content." Content is not just content, and we need much more sophisticated, nuanced ways of understanding knowledge, evidence, research. Even flattening it all and calling it "data" isn't particularly helpful. I think that work is really important, and maybe there are ways to partner with CSU Long Beach, the community colleges, and others who are invested in deepening the knowledge: advocacy organizations, community organizations.

Ethan, you want to get in? Let me give a model that's worked remarkably well in Mexico around political disinformation. There was a project set up to try to debunk rumors on WhatsApp, and it was set up by a journalism site, a news site called Animal Político. It ended up recruiting 99 other newspapers in Mexico, and what ended up happening was, if you saw potential disinformation on WhatsApp, which is a very hard network to monitor.
It's encrypted; we don't have a good way to come in and look at it. But you could post that content to the newspaper, and the newspaper would run a fact-check, and a group of a hundred newspapers ended up fact-checking an enormous amount of information during the election. Then people would inject those fact-checks back into the WhatsApp threads, and it did remarkably well.

What's happening with Nextdoor is this sort of disappointing model that we've seen happen again and again and again, which is that it's essentially an extractive product. It invites people to weaponize their fear, mostly of Black and brown people, and use that to create an endless stream of content, which, by the way, can be micro-targeted to them based on their geography. It is not the sort of community that you would really want to create as a healthy community site. But I offer this to the room: what is a community site you would want to create, if that's the example of what we want to avoid, and the example of what happens when we leave this task to the market? What can we actually imagine creating in Long Beach? Maybe your paper is a major actor in creating that, along with the community college, along with other parts of the community. How do we look for something better at the same time that we're fighting something worse?
That's the balance that I think we have to take on. So I had three takeaways from this conversation. The first is confirmation bias, which is a very patriotic emotion. The late-afternoon joke really did not land there. But I think the questions that we're getting affirm this, which is your point, Safiya, that knowledge is really power here. The folks who are studying these issues are the equivalent of the frontline public health workers of a hundred years ago, and in every community we've got a university or a community college where I guarantee you there is an engineering faculty and a social science faculty that would love to have these conversations, and certainly a student body that is interested in living in this world and in having these conversations. So I think a real takeaway here is to invest in that. It is a civilizational achievement that we have this kind of knowledge base in the world.

The second takeaway is that we should embrace the fact that the way social media is now is radically contingent. We do not have to accept the social media environment we have as inevitable, and we should just ask the question of how it could look different. What I take from all of you is that you each, in your reporting and in your research, have asked the question of why this exists: what are the forces causing it to exist, and what's the harm?
And I think the third is that solutions can actually start in community. We're not powerless. People can take action to create online communities that are more positive, or at the very least are exemplars of what we should have, even if they have to go back to forms of social media that feel less modern. And secondly, in a moment of heightened public anxiety, and therefore of policymaker response, we should be clear about the kinds of communities that we want, in addition to the specific rule changes that we might want to see. So I think those are incredibly positive takeaways for a group that really is on the front lines of the effects of this. So please join me in thanking our panel for their contributions.

I'm duty-bound to announce that the festivities are on the 19th floor of the hotel, and there is now nothing keeping you from them. So thank you all for joining us today. It was, I think, a fantastic day of conversation.