Hey everybody, Joan Donovan. I'm working up at Harvard now at the Shorenstein Center, and previously I was working at Data & Society. So last year you actually heard from danah boyd about strategic amplification and different ways that we can think through the issue of hate speech online, as well as other problems that we see propagated both by algorithms and by groups of people who are very interested in putting negative things into the ether. And I want to thank Talal for setting me up. There's a lot of good things that he said that I don't have to cover now, which is great. And then after the panel, join me in the hot tub, because we're going to need to relax. It's going to be a tense one, but I think that there's a role for everyone in this room to play. And I hope that we can have that discussion and dialogue about the little pieces that each of us can do to help. I don't think we're ever going to get rid of the problem of white supremacy online, but I think we can mitigate, quarantine, and lessen its impacts through some strategic work. So two years ago, in a room very much like this, Patricia Hill Collins, who's a Black feminist professor, posed a difficult question to sociologists at our annual meeting. She said, do we care as much as they do? She had just given a very long presentation on the history of white supremacist organizing, both pre- and post-internet, and the violence in Charlottesville. And the room was silent. I'm now ready, I think, to give an answer to that question. And I want to say that the work we've undertaken over the last five years is because I care more about racial equity than any other topic. Ignoring racial equity in our work leads us to fail to address the inequalities and harms that trouble all of our political and social institutions right now. 
Even in journalism: this morning I attended a SNAP panel down on the second floor where many were talking about how they don't feel like their stories are being told. Black women, people of color, LGBTQ folks, we struggle to be seen sometimes because living on the margins feels like home. But usually it's that posing our critiques in the way that we do makes a lot of people very uncomfortable, because there are some of us who benefit from keeping things the way they are. And what I fight against every day in our research is to say: things as they are are not working for most of us. I think there are ways we can change that, but it only starts with admitting that you care more than they do. And then you get organized, right? That's part two. So my team at Shorenstein has a mission, and it's very clear to us. We want to force accountability for tech companies who build and deploy products without understanding how their profit comes at a huge cost to most of us. We study white supremacists because we understand that this is a group of people who have spent the last hundred years manipulating media for their own gain, and I'll be damned if I spend the next hundred years watching them do more of the same, especially as platforms have become the primary distributor of content, including news. These manipulators have capitalized on the lack of content moderation policies and the under-enforcement of terms of service. So our research team studies white supremacists along four dimensions. First, we look at polarization on wedge issues, where white supremacists piggyback their talking points into mainstream conversations, usually via social media. For example, the issue of opiate addiction has touched many of us, but white supremacists use it as a talking point to highlight white pain and to build the White Lives Matter movement. 
This movement holds rallies in small, often rural towns where they can recruit people in moments of intense insecurity and fear. The second dimension: we track the tactics that they use to hoax and manipulate algorithms and platforms, because they can't show up as who they are. They always have to cloak and mask themselves in different ways. This is actually a very old tactic. In the 90s, white supremacists used online communication technologies to recruit new people and to hone their messaging, usually via bulletin boards, simple message boards. And we see again that simple message boards and anonymity are another perpetual problem. But how many of you know that MartinLutherKing.org was, for nearly 20 years, run by a group of white supremacists? Right? Very few of us know that. Jessie Daniels, who's a sociologist, has spent a decade tracking white supremacist movements on the internet, the way that they've moved online to build their movement. Similarly, my team looks at the ways that platforms have added amplification power for little to no cost. And we're indebted, of course, to Safiya's work for framing the inequalities created by algorithms as a top issue for scholars of critical internet studies. The third dimension is that we know white supremacists attack journalists and researchers because they need to silence us. Our work is effective and essential. It's also dangerous and very scary. Most of our loved ones ask us to quit. And that is something that we all have to face. The harassment campaigns by white supremacists work because social media platforms make our personal social networks completely public. Currently, it's not hard for those who want to dox us or get us fired to contact our bosses, our family, our colleagues, because that information is neatly displayed in a sidebar, as Talal showed you. But we need to flip the script on free speech. 
We need to understand that content moderation is different from censorship. Because what about my free speech? Right now, my speech is restricted by the threat of violent attacks against myself and my loved ones. And I'm mindful that speech doesn't happen in a social vacuum. Speech is never free from consequence. It's always relational. But some people hold the power to shape that speech, and we need to put them into the accountability matrix. Fourth, organized white supremacy has gained power in high places over the last few years, and this has left many of us feeling beaten and hollow. The inaction of platform companies, their refusal to admit that white nationalists were using their technology to recruit and raise resources, felt like a tacit endorsement of these groups. Only after murders, the burning of churches and community centers, and countless articles written by journalists and researchers did platform companies begin to take action against some of the more noxious offenders. But I stand here to remind you that Richard Spencer, the architect of the alt-right, still has a presence on YouTube and on Twitter today. Finally, the use of platforms to archive and distribute the manifestos of mass murderers must end. The killers in Christchurch and, last week, in Germany used violence to call attention to their ideas. And this tactic only works because platforms are designed for profit and not for community security. So as I close, I want you to think about how the future of our society, our democracy, depends on reliable and stable communication. It must be predictable. And transparency and accountability for technology companies must be at the forefront of our journalism and research. That being said, we must think about the internet the way we think about democracy. These are not products that can work without participation. Like democracy, the internet and platforms are social processes, and as such, they will never be finished. 
Social processes, though, require rules of engagement and consequences for abuse. To be sure, we can't design our way out of this. We need to see platforms for what they are: amplification technologies that tilt in favor of those already powerful and well resourced. These are the people who are willing to pay to influence our social and political institutions. Platforms are not neutral, and they have consequences for most of us. We hope that our research can play a pivotal role in how policy is shaped going forward, as we think about redesigning our communication environments online. And sadly, gone are the days when social media will host the next social movement. I have grieved this, and it's time to move on. I'm confident that democracy will survive social media, but platforms cannot survive without us. So we need to think about what powers we have, what networks we have, and whether we are putting them at risk by putting them on blast through social media. And we must remember that our words are our weapons, and they are incredibly powerful. Thank you.