My name is Mehlaqa Samdani. I am the director of Critical Connections, an organization here in the Valley that puts together events on issues related to American Muslims and other targeted communities. If you'd like more information about our work, please visit our website, CriticalConnections.org. We are very pleased to be partnering with the Karuna Center for Peacebuilding, an international peacebuilding organization right here in Amherst, and you can hear more about their work from Ginny Morrison, who will be moderating our event today. The Karuna Center and Critical Connections have been partnering on a series called The Many Dimensions of White Identity: Politics, Power, and Prejudice. What we're trying to explore through the events in this series is the history, prevalence, and resurgence of white supremacist groups, and also movements rooted in white identity and the ways they have manifested themselves in violent movements, in recent elections, in voting patterns, et cetera. Our event today is the second in the series, and we will be focusing on how white supremacist groups have used the internet and social media platforms to perpetuate their ideology, and the challenges inherent in curbing online hate and harassment. To help us understand this phenomenon, we are extremely fortunate to be joined by one of the foremost experts on this topic: Dr. Whitney Phillips, who is joining us all the way from Syracuse, New York. I'm going to read from her bio, and this is an abbreviated version; she has a very extensive resume. Dr. Phillips is an assistant professor in the Department of Communication and Rhetorical Studies at Syracuse University, where she teaches classes in media literacy and online ethics; online discourse and controversy; folklore and digital culture; and lore surrounding monster narratives, urban legends, hoaxes, and crime. Dr.
Phillips's research explores antagonism and identity-based harassment online, political memes and other forms of ambivalent civic participation, and digital ethics, including journalistic ethics and the ethics of everyday social media use. She is the author of This Is Why We Can't Have Nice Things: Mapping the Relationship Between Online Trolling and Mainstream Culture. She is also the author of the three-part ethnographic report The Oxygen of Amplification: Better Practices for Reporting on Extremists, Antagonists, and Manipulators Online. She has written numerous articles and book chapters on a range of media, folklore, and digital culture topics, most recently fake news narratives and the role social and memetic media played during and after the 2016 US presidential election. Additionally, she has published dozens of popular press pieces in outlets like The New York Times, The Atlantic, and Slate, and she is regularly featured as an expert commentator in national and global news outlets. She holds a PhD in English with a digital culture focus from the University of Oregon, an MFA in creative writing from Emerson College, and a BA in philosophy from Humboldt State University. Please join me in welcoming Dr. Whitney Phillips. I'll just describe the format of the event for those of you who have not attended our events before. For the next 45 to 50 minutes, Ginny Morrison is going to ask Dr. Phillips a series of questions and moderate their conversation. After that, we're going to break out into small groups, and the idea is for everybody to relate the content they've heard to their own experiences with the online culture we all live in. Then we'll reconvene after about 20 minutes for the larger Q&A session. So that's how the evening is going to unfold. And with that, I'm going to turn it over to Ginny. Thank you so much. Let me add my welcome to all of you.
For those not familiar with the Karuna Center: in addition to co-hosting these kinds of community conversations, and some dialogue on climate change and environmental justice in the DC area, we also work around the world to prevent violence, or to help communities recover from it. And we're celebrating our 25th anniversary this year. So we're excited to be here with you tonight, and with Dr. Phillips. [A show-of-hands exchange with the audience, partly inaudible, followed by microphone adjustments.] Thank you for joining us, and let me definitely welcome Dr. Phillips; it's very good to have you. There are a number of different areas that your work has touched on, so we'll do a little survey of at least some of the high points. Diving in: the conversations we're convening often focus on white power movements. So can you talk about how the internet provides a safe space for white supremacists? So this question goes back a ways. The immediate answer is that online spaces allow people to find like-minded fellowship with others, and to share ideas and have those ideas reinforced. The internet also allows social justice activists and other groups to do the same kinds of things, so that in some ways, in some very limited ways, online spaces are, I don't want to say neutral exactly, but they're ambivalent. They allow different kinds of people to organize in different ways.
Online, the way that social media platforms are set up, the way they were designed to work, they were always designed to privilege and even incentivize the most emotionally resonant content, which often translates to inflammatory content, offensive content. And so, not exactly by happenstance, because I don't think it was wholly an accident, social media platforms have really created places that reward certain kinds of violent ideation and have provided a safe haven for those who partake of violent ideologies, because social media platforms have chosen again and again and again to side with and protect abusers and bigots, as opposed to making those spaces safe for the people whom they target. So it's both: the internet is a wide-open space and many people have benefited. Lots of marginalized communities have really benefited; Facebook has been great for many groups. But Facebook also is designed to privilege certain kinds of behavior, and it just so happens, although like I said, I don't think it's totally coincidental, that certain groups benefit. And so white nationalist and white supremacist groups have really been able to find a niche on these platforms that they couldn't have found if the policies of those platforms had been less amenable to the kinds of things they were doing. And so you mentioned Facebook, which makes me wonder if you find certain platforms more amenable, more accessible? The large ones all suffer from the same issue, which is that they were created primarily by white men with a libertarian ethos, in that they wanted to minimize the amount of restriction that existed on the sites and maximize the spread of information. And this butts up against classically liberal ideas about the marketplace of ideas: that the best corrective to bad speech is more speech.
So you create a space where you're not moderating, or you're moderating in a very minimal kind of way, and ultimately the best ideas will float to the top; that's freedom, that's freedom in a nutshell. And so you have these spaces designed with the explicit intent: we are not going to infringe on people's free speech, even though they're not really talking about a constitutional issue, because of course the First Amendment is not the purview of Facebook; they're a private company. But because all the cakes were baked out of the same basic batter, you encounter the same basic kinds of issues now, years and years after the platforms were established. The cake is already on the table, and the ingredients that were put into the batter essentially predetermined what the outcome was going to be. And had there been more, I mean, and this is something the higher-ups at Twitter, for example, have gone on the record about: they say that if there had been more women, if there had been more people of color working at these companies in the very beginning, the platforms likely would have been designed differently, and more restrictions would have been built into the structure of the sites. They would have foregrounded concerns about harassment. They would have thought about what might happen as a result of coordinated disinformation campaigns. But because the people designing those sites were occupying subject positions that were least likely to be harmed by those things, they didn't need to build in those safety restrictions. That was just not something they were thinking about. They weren't bad people; they weren't themselves white nationalists, although, bell curve being what it is, I'm sure a few people were sympathetic in that regard. But the overarching critique is not, oh, they started out with this malice in their heart.
They started out from an ideological position that constructed the sites in a particular way that ended up benefiting harmful speech, to the detriment of a more diverse kind of expression, so that women and people of color have been silenced and marginalized and harmed on those platforms because of the way the platforms were designed to work from the very outset. And silenced because they're being harassed, or in a different way? Silenced because they're being, I mean, so this is the analogy that I like to bring up. How many folks in the audience are teachers of one kind or another? So I'll speak from my experience as an instructor. When you have a classroom and there are two or three really antagonistic students, students who put other people in the class down, who are very vocal about their feelings, that tends, in my experience, not to contribute to the most amount of speech possible. Usually that shuts everybody else up, because they don't want to end up in the crosshairs; they don't want to be attacked. And that's how it works online: yes, they are protecting the free speech of the extreme fringes, but the consequence of that is to minimize the overall amount of speech, because people don't feel comfortable there, or they don't want to reveal too much about themselves because they're afraid of reprisal. And then if there is reprisal, they know the companies aren't going to do anything to protect them. So the vigorousness with which the platforms defend free speech is ironic, because in practical application it has the opposite of its intended effect. But they're not thinking about those effects, again, because these are bodies who are not the target. I mean, it's very easy to talk about protecting free speech when your body is not subject to its violence. And that's what happens across all of the large platforms. And so you're reminding me about your writings about white privilege in this context.
And I'm wondering if there's more to say about that. So whiteness, whiteness broadly, yes. I'm finishing up a book right now, and the book is titled You Are Here: A Field Guide for Navigating Polluted Information. It's about how we make sense of a landscape so inundated with mis- and disinformation that you can't easily distinguish what's true from what's trash. It's very detrimental to democracy regardless of what political affiliation you hold; it's not great for democracy right now. And in that book, what my co-author and I are doing is trying to trace: how did we get here? Why did we end up here? Part of it is thinking about the ideological reasons that social media platforms were designed the way they were, so that's talking about the subject position of the creators of these sites. But it also is a conversation about the whiteness inherent to early meme culture. So I'm going to go into a little bit of that history. If something isn't clear, if there's a concept you've not heard and you want me to back up and explain, please just put your hand up and let me know. Meme culture these days subsumes a ton of stuff. Basically, if you've ever shared or seen a funny picture with text at the top or the bottom, that qualifies as meme culture. It emerged as a thing that people referred to in the early-to-mid 2000s on a website called 4chan, which was a simple image board. It allowed people to post anonymously whatever it is they wanted, and the site had dozens of subject-specific boards. But the most famous, the most infamous, board was the /b/ board, "b" as in "boy." That was the random board, and that was where the term troll, in an internet sense, first emerged. Well, the term troll had existed, but it didn't have a self-identifying flavor yet.
A troll was something you accused someone else of being when you thought they were annoying, when they were interfering with the proceedings of an online conversation. In the early-to-mid 2000s on 4chan, trolls started to adopt that as an identity. And while they were doing this, they were creating just an unbelievable amount of content, which they embraced as meme culture. The term meme had existed before that too, but this particular space at this particular moment caught on with a particular subset of the population: younger white men, technology savvy. My brother was one; that was why I started researching this. So basically video game types. I would have gotten in a lot of trouble in 2008 if I had said that, but retrospectively, it was video game types. And so they suddenly had this place where they could congregate. The structure of 4chan is inherent to the development of what we understand now as meme culture, because 4chan was this very not particularly sophisticated website; it could only hold so much. The owner and creator of the site, who was 15 at the time he created it, didn't have enough server space. And so the way it worked is that content would fall off the site; it wasn't archived. What that did is it forced people, if you liked a piece of content you saw, you'd better save it to your computer and remix it and then post it again. And the most popular ideas and images floated to the top, basically; it kind of functioned as a sort of organic upvoting process. And what emerged from that, because so many tens of thousands of people participated, and this was wild, because this is still the early-to-mid 2000s when I started researching it, I could tell based on the way they communicated in my classroom who participated on these sites. The subculture was that specific and that widespread.
I often described it as the most influential culture that people didn't realize they knew a lot about, because all of these young white men were walking around sharing this language. If you couldn't speak it, you didn't know it was happening. But it was happening, so I always knew; I always knew who the troll was in my room. So you have all these tens of thousands of young men from a similar sort of middle-class background, and it originated in the US and traded primarily in American popular culture, although also some Japanese stuff, anime; primarily it was a US-centered thing. So they all come at it from similar experiences in life. And there were so many of them constantly doing this, and the site could only hold so much, that basically the only responses that could succeed were the ones that were terrible, that were poorly made, that were haphazard, because if you spent 20 minutes making a really nice Photoshop, your moment would have passed, the thread would have fallen off, and you wouldn't be able to upload it in time. So people had to produce really, basically, crappy content just to keep up with the pace of the site. What that created was an aesthetic that ultimately outlasted the technological restrictions of the site. Troll culture became synonymous with a certain kind of barbed, ironic, arm's-length bigotry, where a hilarious joke would be Photoshopping swastikas over someone's face. That was the humor, that was the sense of humor that got established. It was easy to do, it didn't require a lot of thought; it was basically shock images, remixed often with pornography, to hilarious effect for the people who were participating. So it already had some problematic elements. And because these young people were well positioned, they were sort of part of mainstream popular culture.
Their interest in the site, their interest in these memes, and their ability to hold that kind of content at arm's length contributed to that content being absorbed into popular culture in the United States. I started this research, I dove into the dissertation, in 2008. By 2010, 2011, you'd walk through Target and you'd see t-shirts with trolling images on them. And when that first started happening, those few of us, mostly women, by the way, who were doing this work from the very beginning, were like, what? Because it was the most offensive site you ever could have imagined in your life. They got to be offensive, they got to just have fun. The term they used was lulz, which was laughter at the expense of someone else; it was a lulzy aesthetic. You got to laugh at swastikas, ha ha, because they had no frame of reference for the harm; they just liked making people mad, and that was funny. So this became ingrained in how millennials, essentially, which is my grouping, I'm an old millennial, thank you very much, how people communicated. And so this internalization of a kind of, I get to make Hitler jokes and I get to make jokes about Black people and I get to make misogynist jokes because I'm just trolling, it's just the internet. That was normalized as a normal way of moving through the world, because there were no consequences, and that was what I found in my research. A few of them were proto white supremacists, that was clear, and a few of them had some certainly sociopathic tendencies, that's also clear, but most of them were just privileged young white guys who got to have fun and then got to turn their computer off when they chose to walk away from it, and there were no consequences beyond that. So, fast forward a couple years. This becomes fundamental to American popular culture.
You would see it on TV. I remember you would watch the Colbert Report, back when that was on, and their writers had been steeped in this culture, so you would see trolling jokes subtly included in this pop cultural content. 30 Rock, the Colbert Report, the Daily Show, all of these shows had writers who spent time on 4chan. So it was just part of the air that we breathed, even though most people had no idea that that's what they were breathing in. So meme culture evolves with that. Meme culture starts to be adopted and monetized and sanitized for family-friendly consumption. And so the funny pictures on Facebook that you encounter now have their origins in an aesthetic practice that was established on 4chan, and the format itself, an image juxtaposed with a certain kind of text, that's been established for a decade now. And that in itself, you're like, okay, people are enjoying all these memes, and they don't realize that this is actually the after-effect of all these basically white young men who never had to think about consequences, and, ew, it's baked into the cake, and you're like, that's gross. But it gets worse, because all of the journalists who ultimately worked in the technology sections of their papers or were dealing with popular culture, they were younger millennials than me, just a little above Gen Z, so they were born around 1987. They all spent their high school years spending time on 4chan. And they already were positioned, they already had this sort of upward mobility, to end up in these high-profile publications. So they were privileged already, because they had access to the gatekeeping apparatus, but they also were white, and most of them were male. Not all of them, but most of them.
So they brought to their viewing practices in these news outlets an affinity for precisely the kind of arm's-length irony that had been such a normal part of their childhoods. They start seeing all this swastika stuff online in 2015. They start seeing all this weird Pepe the Frog and Trump stuff, and they all thought it was a joke, because that's what all of their jokes had always looked like. For 10 years at that point, they had been steeped in this landscape where, due to their whiteness, they got to laugh at swastikas. Those jokes got to be funny for them; there were no consequences, nobody ever got hurt, because it wasn't real to them. And many of them, and I ultimately interviewed a number of them, looked back and think, I just thought that racism was funny, because we had obviously solved it. Like, anti-Semitism isn't a real thing; that was back a long time ago. And so therefore there isn't any concern with laughing at a swastika; it's not a real thing, it doesn't correspond to any reality. So because they were able to affect that positionality toward this content, as the alt-right storm starts gathering, they thought it was a funny joke. And so they spent the first year or so, year or two, of the campaign basically pointing and laughing at all of this stuff that was bubbling up from these online spaces. Other, older, more established journalists then wrote articles about all of the articles that the younger journalists were writing. And through that process, the alt-right was able to crystallize into a coherent, or coherent-looking, movement. Even though it was always fractured, always contentious, white supremacists and white nationalists don't play well together, typically. But it was given the look of something that was coherent, and so then it could grow. And so that's how this whiteness that was already problematic in the early-to-mid 2000s played out.
It manifested in all of these really unpredictable ways, with these very unpredictable effects, that ended up becoming central to how candidate Trump was framed in the beginning: as a hilarious joke. And then 2016 happened, and then came 2017. So I started doing this research, like I said, in 2008. Up until 2017, I would tell reporters, the problem is that by reporting on this you are just incentivizing trolls; all they want you to do is write these articles; you're making their lives easier; stop, stop. It was the same thing over and over, especially in response to mass shootings, because up until that point, every time there was a mass shooting, trolls, unfortunately, would, because it was hilarious, descend on the story and say, falsely, that the shooter had posted to 4chan beforehand, and would play all kinds of the same games, again and again. Reporters would come to me for comment, because my name was indexed in Google searches alongside all of this, and I would say, hey, here are five articles from two years ago when we talked about this before. It's the same story, stop doing it. And they were very resistant to that, and many of them, again, were white dudes, because that's who primarily works in these outlets. They were like, we're objective; we just have to call attention to it, and everyone will know it's ridiculous, oh no, it's a terrible joke, and then everything will be fine. So that was the pervasive attitude, which also stems from a kind of whiteness: this idea that you get to be an outside observer, that you get to occupy a view from nowhere.
Then in 2017 Charlottesville happened, and suddenly many of them started to sing a different tune. Many of them realized, for the very first time, that all of this stuff they thought had been a joke, for three years at that point, turns out they're actual white supremacists. One person who I interviewed told me that she had been seeing all this swastika stuff and Trump stuff and Hitler stuff, and she always assumed it was a joke, like it had always been a joke, and didn't really think twice about it at all. And then Charlottesville happens, and so she decides, okay, I've heard people talk about this white supremacist website called The Daily Stormer; I'm actually going to go there now. She went there. Every single thing she had been seeing in these jokey spaces was what she was seeing there. She had no idea, and it devastated her. And so the reporters that I worked with, and again, I'm working with these reporters this whole time, more and more they needed therapy, and so we would have these conversations, and by the end of it I would be sweating so much because I was taking on their sadness. They were really shocked. They had no idea that they had missed this, because their view was so myopic. Everything got to be a joke, everything got to be consequence-free, everything got to be just trolling, because, why not? It was easy for them to hold it at arm's length. So Charlottesville was the turning point, and in talking to all these reporters, a reporter I had been working with said, I wish someone would put together a best practices guide, because we clearly don't know how to do this. We've gotten it wrong up until now, and you can't unring the bell: now the alt-right is here, and now there are branding opportunities, and when you have a brand, more people are able to join. And so that was why I was like, oh, I could do that.
So then I interviewed many of the people who I'd worked with before, and that's how I know about their subjective experiences of being horrified by the things that they didn't see, because they didn't have to see it, because their whiteness meant they didn't have to. It wasn't required of them, and they were devastated by that realization; it simply hadn't occurred to them. So it's a circuitous kind of way, and actually in the book, the way we describe this is the circuitous way that whiteness factors into many of the things we deal with now. Obviously, when you're talking about white supremacists, whiteness plays a role there, but the unexamined whiteness of many other people helped facilitate the rise of those white supremacists. They couldn't have done it without that journalistic amplification and, essentially, the stamp of approval that they got. So in the book, in the chapter where we talk about this movement, the unifying metaphor is that of the redwoods' root system, because in the redwoods, the way the mycorrhizal networks connect the roots underground, it's very difficult to distinguish biologically where one tree ends and another begins. It's so densely interconnected. Trees share resources; they talk to each other in their tree-y way. And so if you introduce poison into one side of the forest, it's going to suffuse out in all kinds of ways you can't see, because it's underground, but if the trees start getting sick over here, the trees over there are going to start getting sick too. And that's what happened to us as a country. We didn't see the poison that was being filtered through, because of unexamined whiteness. That's the untold story of the rise of white supremacy. The white supremacists themselves are obvious. They're clear. We can see them. They're vocal, right?
It's the other stuff that's more difficult to identify, and much more difficult to respond to, because that gets to identity issues that many white people either don't see or aren't willing to see. And so that's why I took a deep breath before I answered this question, because whiteness is central to this discussion, just not necessarily in the ways that we might think it would be. Thank you. And so you talked about Charlottesville being one manifestation in which those ideas and actions spill over into real life, if we want to call it that, and you've written about the Christchurch massacre similarly. Is there more to say about how the meme culture online seeps into life and affects people's actions? Yeah, I mean, the line between online and off these days, there is none. There isn't a line between online and off. And it's not just going in the direction of stuff that happens online then having a real-world consequence. You know, the word that typically gets used is that when you become radicalized through these online spaces, because of algorithms, because of filter bubbles, all of that, then you're more likely to enact violence offline. And that's absolutely true; the relationship goes in that direction, yep. But it also goes in the other direction as well. So, algorithms, let me give a little bit of a primer. When you search for something on Google, you're not going to a library. You're not necessarily accessing factual information. You're certainly not accessing the best information possible. The assumption, what Google would like us all to believe, is that when you type something into Google, the results you get are trustworthy, that they've been vetted somehow. But Google is not actually a search company; it's an advertising company.
And so what you end up seeing when you search is dependent on a whole host of factors that we don't have access to. It's the same when you search for something on YouTube, or when you search for something on Facebook. We are fed content in a way that makes us feel like we are taking ownership, that we're doing our homework, that we're getting the whole story. But what the algorithms do, essentially, is a decision-making process: what have you done in the past, or what have the people who used that computer before you done in the past? And based on those previous choices, the previous things you've clicked, the previous words you've employed, that then shapes your search results. So if you go to Google with bias, Google will give you results that reinforce what it thinks you want to know. You think you've done your homework, because you put the search in and you think, okay, I've done it. No: Google is giving you a menu that has your name on it, believing that that is what you need from it. That's what it does; that's what algorithms do. And so when a person who already has bias baked into their cake brings that to Google, Google reinforces it. But Google doesn't create the process of radicalization. People initiate that with their entrenched beliefs about their world and their role within that world. Social media platforms just take it and make it worse, because those companies have a financial incentive in keeping you clicking. The more you click, the longer you stay on the site, the more they can monetize your eyeballs. So they want to give you the things that you want to see. And if it just so happens that what you want to see is white supremacist content, they will be more than happy to oblige you.
So the classic example of this is Dylann Roof, who committed mass murder in Charleston. In the way he describes his own process of radicalization, he tells the story thusly: he had just seen the Trayvon Martin case, and he was curious to learn more about what he believed was black-on-white crime. So when he put that phrase into Google, to search for black on white crime, he capitalized the W in White, and he didn't capitalize the B in black. And that in itself tells you his hierarchical thinking. Google said, I know who you are, I know what you need to see, and then it started feeding him increasingly extremist content. And he places it at that moment: he did that search, that search opened his eyes, and he got to embark on his enlightenment. And so it's important to think about the ways that online organization translates into offline action, but offline beliefs also filter up into the kinds of actions people are presented with as options. We wouldn't have the problem of radicalization online if we didn't have a problem of bias offline. Those two things work together. And so that's why, when we think, okay, what do we do about all these problems: it's a technological problem in some ways. Algorithms are definitely a problem; they incentivize extremist content. You start searching for one conspiracy theory on YouTube, and you end up in a place you don't want to go. I mean, you can sit down and start doing experiments and you'll see where it'll take you, and it'll take you to scary places real fast. That's a problem. Especially because among more and more young people, and this is a phenomenon that happens on the right more than on the left, there's increasing distrust of mainstream institutions. If you are a person who culturally is encouraged to disbelieve the things you read in newspapers, where are you going to get your information? You're going to go to YouTube.
And you're gonna go to Google, and you're gonna do your homework, because you don't trust the official narrative. So the more likely you are to mistrust what the New York Times says, the more likely you are to go down those algorithmic rabbit holes, because the New York Times narrative is never gonna be good enough for you. And that happens in a different way on the right than on the left, because on the right, since the 1950s, folks, certainly evangelical Christians, have mistrusted secular media. In the 1950s, evangelicals, who have always been extremely technologically savvy, very good at adopting new technologies, really ingenious in that regard, started building their own alternative media networks, with radio stations, with television stations, all of these networks that existed outside the mainstream because the mainstream wasn't working for them. And they had theological reasons for why that was. But what that did is entrench the idea that you can't trust what happens within the mainstream, so you've gotta do your own homework, you've gotta look for it yourself. There's not that same culture on the left, and that has nothing to do with who's smarter or who's better at critical thinking; that isn't it at all. It's how much faith you're willing to put in the information you read in a newspaper. For people who think, if the New York Times says it, I'm pretty good on it, maybe I'm concerned about some bias, but I don't dispute the facts, those are not the people who are gonna be searching for information on YouTube. They're not the people who become victims of these algorithms; it just doesn't happen. So you have this polarization that's entrenched through these algorithms, and that's a problem, because regardless of where you are on the political spectrum, we really need to be able to have meaningful conversations with each other.
We really need to be able to disagree in reasonable ways and then continue having a nice dinner. That's critical to a functioning democracy: people disagreeing reasonably. One of the greatest tragedies that has happened to this democracy is that the left and the right have been siloed away from each other in these ways. It is heartbreaking. It's terrible that it happened. And algorithms are a fundamental part of why that has occurred. People are also a reason, because people become self-selecting in where they're looking for information. Again, nothing to do with who's smart, nothing to do with who's good, nothing to do with any of that. It's where are you going for your information, who's gonna give it to you, and what happens afterwards? So we have to figure out what to do about algorithms, but we also have to be mindful that we are bringing our meat bodies into this space, and then the strange things start to happen. And you're also describing that it's the values we bring into it. Your point is something I hadn't recognized before: not only the polarization between left and right, but that this is maybe one of the real distinctions between the right you're describing and the traditional right, because Republicans are the party of institutions. There are traditional conservatives, and to me that helps explain some of that schism. Oh yeah. Look, you can never take an entire group and say everybody in that category behaves in a certain way. The more distrustful you are of the media, the more likely you are to go to these alternative channels. And there are lots of mainline conservatives who don't go in that direction, because they're not searching there, because they're okay with what they're reading, maybe in the National Review or the Wall Street Journal, and they're not pushed any further than those.
They're center-right or right outlets, but they don't fall into the category of the fully alternative far-right media ecosystem. Those who are part of that are taken to different places, and so yeah, that explains why there is such a growing disparity between those on the far right of MAGA and folks who are traditional, Reagan-esque conservatives. And that's something I have to be careful about. When I teach media literacy and explain how this works, most students don't know how Google works, but most adults don't know how Google works either, so this isn't a function of the kids not knowing how the technologies work. The technology companies don't want us to know how their technologies work, because we would be horrified if we did. But I have to be really careful about describing the process by which this happens without demonizing the people who fall victim to these processes, and acknowledging that there's variation on the right: some people get pushed really far into conspiracy theory town, and other people don't go that way. It really hinges on what you bring to the algorithm, and then what the algorithm gives you as its gift. Sure, and one of the gifts I'm hearing is that if I really distrust one type of institution and I'm fed more that reinforces that, then it's easier for me to distrust other kinds of institutions. Yep. And so then in my classroom I hear students say, you know, I don't read any of the news, you can't trust any of it, so I get my news from YouTube. And I'm like, no, just read your texts, it's fine, don't go there, please. So it becomes tricky in the classroom. And it's of course not just young people, but loss of faith in institutions is a fundamental part of this conversation, and so then we have to think about how we ameliorate some of this.
It's about restoring trust, but for some groups that's a very tall order, because it goes down to the core of someone's identity that those institutions are not for them. You cannot throw enough facts at someone who believes the New York Times is fake news to suddenly make them say, no, they're cool, they're good, I'll read them. It doesn't work that way; that's not how human cognition works. Exactly, and yet there are some things that you would suggest ordinary social media users do in response to this? Good or bad? I mean, the good ones. Yeah, let's go with what kinds of ways people can avoid getting sucked into this. Okay, so my favorite metaphor, and I am a person who experiences theory metaphorically, it's easier to talk about trees and stuff, it's a good entry point to algorithms. My favorite metaphor here is the biomass pyramid. The biomass pyramid represents the relative weight and number of one class of organism in an ecosystem compared to the other organisms within that same ecosystem. So there are fewer bears than there are foxes, and fewer foxes than there are bunnies, and fewer bunnies than there are worms and fungi. And it's a helpful way to think about online harm and the role we all play in contributing to harms that we don't necessarily mean. The reason that we are, I think, restricted in how we think about the spread of polluted information, or the role we play in it, is because we tend to fix intentionality as our criterion of badness. If you don't set out to do something bad, if you don't mean to harm anybody, then you're good. It's really the apex predators, they're the bad ones, they set out to eat you. They wanna eat you, they're gonna eat you, and they're gonna make a mess as they do it. And as long as you're not an apex predator, you're in the clear.
But the fact of the matter is that there are so many more worms and fungi and other occupants of the lower strata of the biomass pyramid that apex predators are actually fundamentally dependent on all of the animals in the lower strata. They cannot survive in an ecosystem without the secondary and tertiary strata beneath them. And so when we think about how harm works online, it's very easy to point at the apex predators, because they do a different kind of damage. A violent white supremacist is dangerous in a way that an everyday person isn't; that's just a fact. But there are so many more everyday people, with a lot of influence over what the overall ecosystem is like. And so when you shift the camera down to that lower strata, which most people fall into because they're not actively trying to cause harm, they're commenting on stories or retweeting things because they wanna call attention to something that's unjust, or they wanna say something rude about Donald Trump's latest tweet. That generates a lot of energy that feeds all of the animals in the strata above. And so one of the ways we can minimize the amount of problematic information that spreads is to remember that we are all individual people. Maybe some of us in here are journalists; maybe some of us have large social followings. But most of us don't. And just because we only have 300 people on our Facebook feeds, or whatever the number might be, we still have the ability to affect the ecosystem in either positive or negative ways. And small shifts, I'm not gonna share this, I'm not gonna oxygenate this particular instance of racism, I'm not gonna contribute to this narrative, in the individual instance might not seem like anything at all. But the cumulative effect sets the tone for every other animal that comes after. And so I think it matters to foreground that we all have a role to play, that we all occupy a position within this ecosystem.
Yeah, we need to talk about the apex predators, of course, because they're dangerous, but we are here too. And we matter too. And our actions can cumulatively make a difference. And actually, and I don't like violent metaphors generally, they can starve the apex predators, because the apex predators can't do what they do without our help. And so it's a way of reframing the conversation to be a fundamentally empowering one. Yes, there is risk in everything we do and say; we can inadvertently contribute to all kinds of harms. But because that power is in our hands, it means we can choose something different. And that is where I hope the conversation about everyday ethics goes. It's not just about finding ways to punish and deplatform neo-Nazis. We've got to figure out what we can do, not once Facebook gets its act together, which it never will, and not once the government figures out how to regulate social media, which I don't know if it ever will either. What can we do tonight? That's where we are. We have to start making those changes now, because we are in trouble. So much more we want to discuss, and we want to include you all. So let's pause there with the slogan: get out there and be a fun guy. And we'll start another kind of conversation now, and then we'll bring you back with Dr. Phillips so we can learn more. Just to pick up on where you left off: I sort of felt the oxygen go out of the room by the time you ended, but your words were, I'm hopeful. Yeah. What makes you hopeful? That's a great question. Maybe hopeful isn't quite the word, but I feel resolved. We're faced with a choice: we can either try really hard and maybe we'll fail, or we cannot try and we will fail. And so I think there are opportunities for all of us individually to make some changes in how we interact with each other and what we're thinking about online.
And again, individually we can only make so much difference, but cumulatively we can start changing what is normalized. I was saying as we were talking up here, part of what needs to happen is that right now, when we talk about what happens online, it typically is a conversation about an ethics of rights: my free speech, and not just my free speech but my right to be heard, my space, all of this stuff that's about ownership and individuality and what you should be able to do. And that makes sense, because that's baked into the liberal tradition in the United States. But what needs to happen is we need to shift our focus and understand that what we're missing is an ethics of responsibility. What I do impacts you, maybe not directly, but certainly indirectly; it goes through the root system. And if we resolve to think differently about the kinds of effects we all can have, and the ways that we are responsible for each other, that we can help each other, maybe especially people with whom we disagree, that becomes a different orientation to the world. And I believe there's a way to describe this to people, to explain why it is that your fate is tethered to mine, and once we're able to arrive at that mutual understanding, then frankly, healing can begin. That is gonna take a lot of individual work, and that takes a lot of resolve, and that takes a lot of heartbreak, and it takes a lot of anxiety, but I think it's possible. More than that, I think not doing it is worse. And so, is my life full of levity at the moment? No, I feel terrible all the time. There's not a single day, and I'm being very serious, there's not a single day that goes by working on this that I don't at some point start crying. I am very worried about our ability to recover from this.
And maybe we've crossed the line already; maybe that's a possibility. But maybe we haven't. And so how do we go to bed at night, able to sleep? It's trying to see other people as other people. And not letting social media companies and governmental policy stack the deck against us so high that we lose the ability to really connect with each other in a way that's meaningful. So it's a complicated sort of optimism or hopefulness. It's a belief that we can, and that we should at the very least try. Just to add briefly: we have a project right now in Nigeria, in an area where communities very quickly get whipped up into violence, in minutes. And one piece of that is working on the kinds of things you just said: helping the people around us that we connect with already, our family, the people who listen to us, to restore these ethics. So I don't just do it myself; I talk to the people who will listen to me. And the other piece of it is monitoring our social media for when it starts to go off the rails, and feeling a responsibility to say something about that. To say, this is escalation, let's bring it down. That's so cool. A question and then a comment. You made it a point to talk about white male gaming culture. Now that about 40% of gamers are women, or at least female children, has there been a change in the overall climate, for argument's sake, of that meme culture? That is such a fascinating question. So one of the things that happened that contributed to these spaces changing, because again, what the trolls were doing in 2008, 2009, it wasn't great, and it was born out of this deep privilege, but it wasn't ideologically crystallized in the way that it ultimately became. One of the catalyzing moments in that process happened in 2014. There was a preceding event, but that would take two weeks to explain. Basically it was an event known as Gamergate. Is anyone familiar? Okay. No? Okay.
Sorry, two more. So Gamergate was, now I'm worried. Okay, all right, I'm gonna do my best. Gamergate became a very, very complicated story. At the time it was tethered to, no, I'm not even gonna get into it. What it was, essentially, was a reactionary response to the emergence of more women and people of color within the video games industry. It was couched in a lot of other narratives at the time, but that's looking back retrospectively. There were suddenly more women in this space, and more people of color, who were claiming space within an industry that had forever been relegated to white dudes. It was never the case that only white dudes played, but that was the assumption about what the demographic was like. And when that demographic started to shift in public ways, a lot of people didn't like it. A lot of white dudes. A lot, well, sure, yes. And so this is where 4chan enters back into the conversation. 4chan was a place where a lot of the hate and harassment and networked disinformation was spread about these women and people of color in the video game industry, and anybody who stood up for them. These women were being targeted by groups of presumably white dudes who just didn't like the encroachment of diversity within their space, and 4chan was a hotbed; that was where a lot of this organization took place. So 4chan, which had this reputation for being the worst of the worst of the worst online, was actually, according to many users, far too restrictive of speech. The owner and creator of 4chan ultimately got really irritated about all the Gamergate stuff on the site, and he was concerned that many of these participants were posting personal information; they were doxing, dropping documents on the people they were targeting. So he basically banned Gamergate from 4chan. Then the people on 4chan got real mad about that, because to them, the ban was fascism.
And then they moved to a website that was worse. Well, it already existed, but they migrated over to 8chan, which then became the space most associated with white supremacists. All of the recent mass shootings, the manifestos have been posted to 8chan. But back to Gamergate and women modifying the culture. Oh yes, yes, of course. So basically this moment, where there was greater visibility for women and people of color, created a backlash that entrenched people who were ideologically committed to keeping women and people of color out. So while the industry itself diversified in important ways, there was another group of people who became more crystallized in a reactionary mindset. And from that point forward, that was the mindset that became the brand identity of 4chan and 8chan. So now more people knew that that was a place they could go; that's where their fellows would be. And then when journalists started reporting on the story, it became basically a welcome sign for people who, it wasn't just that they were engaging, they weren't just telling these jokes because they were privileged; these were people who identified with reactionary perspectives. And now they had a place to go. Now they had a name. It was out of that that all the pro-Trump stuff started to grow, because of the energies, the infrastructures essentially, that had been established. And during that same time, for many of the people who were participants on 4chan, including my brother, one of three things happened. One is that they left, because they started getting freaked out, because the space became crystallized in its bigotry in new ways. This wasn't just a source of lulz, this was an identity, so they left. Others were attracted to the site because it was being advertised; thanks, journalists.
And then the third group were those who stayed put and slowly radicalized over time. They started out doing the funny swastika jokes, but then they ultimately internalized white supremacy. So to the extent that, yes, you have had critical changes within the video game industry, it came at a price, and that price was really entrenching reactionary perspectives in those who did not like that encroachment. So yes, lots and lots of young women play video games, as they always have, and lots and lots of people of color, as they always have. But that moment became in some ways a Rubicon for these energies. So, yay. But then also, oh no. That summarizes my work over the last 10 years, I think. That's a great question. And you had a comment too? The comment is that although the premise is that there's been an acute rise in white supremacy, I think it's just, is that a comment? Does that even finish the sentence? Okay, yeah, it's finished. I believe it's just a question of people having been given permission to crawl out from under the rocks. Okay, so now we'll go to Sarah. So I see such a lot of hate when it comes to any of this stuff. But the other day I was listening to somebody who used to work for Cambridge Analytica, who wrote this book, and it scared the bejesus out of me. The way that professionals are able to infiltrate different sites, to take information about people who hold a certain belief and convince white men that they're the victims, that they're gonna go extinct, that they're being persecuted. And even though Cambridge Analytica is gone, there are plenty of countries that are gonna send other Cambridge Analyticas to replace them. And the impact of that on our privacy and on the election is really terrifying to me. It's really, to me, a form of brainwashing. I wondered if you could comment on that.
Yeah, thanks for that question. That's what's known as surveillance capitalism. Basically, every time you go to a website, it is collecting information on you. Facebook, even if you don't have a Facebook account, has basically a shadow file where you exist as a person. This is really what makes Facebook so valuable as a company. It's not just the immediate advertisements they sell; it's that people who buy ads on Facebook have access to, basically, their files on you. And that data can be used in all kinds of ways. Think about it: if your entire life, your entire self, is quantified and captured in a folder of information, anybody with access to that information would be able to seed things into your feed, or feed you advertisements that would be not just responsive to your previous behaviors but predictive of your future behaviors. It would set in motion behaviors that you could undertake. Cambridge Analytica was an early harbinger of that. The problem is that no one is very transparent about how this works. The companies are not transparent about what they do with your data, so we don't quite know the full effects of where this is going to take us. But of course, when it comes to political campaigns, this is why the Trump campaign wanted to work with Cambridge Analytica: they wanted to micro-target advertisements that would compel people to click on something, because they had a profile on you and knew what would get your goat. And if they have a way to figure out what will get our goats, it raises not just privacy concerns; it raises questions about free will. Nothing less than that. So we are at the precipice of fully understanding the sweeping consequences of this. And this is not going to sound very hopeful at all, but 2016 will be child's play compared to what's coming.
It's already started; we are in the middle of it. And so, with these things that most folks don't quite know how they work, or that they work, or what data is being collected on them, how do you know you are actually clicking on something because you're choosing to click on it? What if it's the algorithm? It really raises all of these deep existential questions that we don't have answers for, because we let these technologies, these companies, in because they provided a convenient service. We want to see our nieces and nephews, for sure. They were helpful enough to us that we allowed them to become ubiquitous in our lives before we fully understood what they were doing and the impact it would have on our democracy. So trying to dig out of that hole before we even know how deep the hole goes is a tall order. So, sorry. Shell, and then we'd like to have some students as well. So, I used to be a free speech absolutist, and the more we get into the weeds with this stuff, the less I am. At the same time, my Facebook feed is kind of an open town hall meeting; people don't have to agree with me to post, but they do have to stay civil. I've only had to throw out two people; the more recent one called someone a raghead and refused to take the comment down. So, goodbye. But it's a very sticky dialectic. I work under the assumption that we haven't had privacy for 40 years anyway; I used to give the FBI agents I thought were listening to my phone in the 1970s my recipe for chocolate mousse. But I wonder, is there a way to instill the skills of sorting through this, the media literacy, the analysis of which messages are actually real? My younger child took a class at PVPA in which one assignment was to follow a story in three media outlets with three very different biases. So Raphael was following one story on Democracy Now from the left, the New York Times from the center, and Fox News from the right. I think that should be required.
How can we get something like that into the culture? Yes, I mean, understanding how different narratives are being framed is critical. At the same time, I'm very wary of both-sides impulses, and you're describing a kind of tri-sides thing, where it's left, center, right. The fact is that some things are facts and some things aren't. And so acting as if propaganda, something that's not true, is equivalently worth time, attention, and analysis as something that is demonstrably true doesn't actually do anybody a service. Like I said, in my own media literacy classes, once we've established some of the ground rules of how information travels and what propaganda looks like and all of that, then we'll look at differing narratives and analyze the kinds of rhetorical tools those narratives are using. But I would never send students on a mission to learn about impeachment by having them read Fox News and also read the New York Times, because they're not equal. And that's a very tricky thing to frame in a classroom, because on one hand you wanna be very inclusive, but I can't in good conscience tell students to get information from a place that isn't true. I can't. Just like I wouldn't put climate denial on the same stage as climate science. I couldn't do it. So I understand the impulse, and I think that figuring out where people are coming from, from different ideological perspectives, is absolutely critical, but that's not taking into account the capitalist undertones of these endeavors: Fox News is a business that makes money off of certain kinds of spin. So it's not true information from the perspective of the right versus equally true information from the perspective of the left. It's deeper than just what the bias is, what the editorial perspective is.
I think the purpose of that exercise was to be able to tell fact from fiction by looking at both fact and fiction covering the same story. Okay, so it wasn't framed as if they were equally the same, and there was analysis in the classroom. Yeah, and that's, I think, really important. It does get tricky, and you probably saw I immediately turned red, because that's just a tricky question. The way that I approach media literacy is less about individual texts, or even individual outlets, and more about mapping the overall media ecosystem. So I talked about the biomass pyramid and I talked about the redwood root system, and the book that we're publishing actually has the redwood root system at the bottom, land cultivation all around, and hurricanes above. Each of those metaphors is an entry point into understanding how information flows and how you fit within it. By triangulating yourself alongside the roots, and all the farming that's happening, and the storms that rage overhead, you can see it's not just whether or not a claim Fox News made is true. It's: how does Fox News fit into the overall media ecosystem? How does it influence the New York Times, and how does the New York Times influence it? And then how do individual citizens' responses to any of those stories influence the kinds of future stories that get written? So I think it needs to be more multidimensional and more holistic, as opposed to looking at individual texts to analyze and debunk. Because as it turns out, fact-checking and debunking are really not effective strategies when it comes to media literacy, because of something called the boomerang effect, which is one of the scariest things you'll ever hear.
And that is: if you think a source of information is biased, and that source tells you something, you will be convinced of the opposite. So if the New York Times says QAnon's not a thing, everybody, but you think the New York Times is fake news, that's gonna be a good reason for you to start doing your own research about QAnon, because you don't trust them. So these efforts to approach certain stories as texts to analyze, trust me, I'm an educator, critical thinking thumbs up, but they can have unintended consequences when you're not thinking about how all of our different systems interact with each other and how we interact with them. It's exhausting to do; you can't do it in a four-minute answer. I can be loud. Oh, you can record it, so we'd rather you have the mic. Okay. Thank you. I'll just stand. Hi there. I'm an honors student at UMass doing my thesis on this topic, on radicalization and deradicalization. I was wondering, I have a couple of questions, I don't know how much time we have. First of all, what do you have to say about the advent of Discord, and how all these conversations that were happening on public platforms like 4chan and 8chan are now happening in closed spaces, with the same people bouncing ideas off each other? It makes it much more difficult to figure out what you're looking at. One of the biggest dangers in reporting, certainly, is that it's often very hard to know what you're seeing. Something can look like one thing online, and that one thing would or should generate a certain kind of response; but if it's actually something else, then that response can do more harm than good. And platforms like Discord, or Telegram is another one, or other private or semi-private spaces, mean that people can do backstage coordinating. The participants on 4chan and 8chan relied on this in the end.
8chan is now gone; after the El Paso shooting, when the white supremacist manifesto was posted, the people who were providing server space said, sorry, you have to go. It reemerged through the dark web, though, so it's harder to access. Was that a question? No, I just said, the good old dark web. Yeah, the good old dark web. So it's still accessible, but there's just an additional barrier to entry. But these spaces originally seeded stories so that journalists would descend on them. They knew to go to 4chan, they knew to go to 8chan, so participants could manipulate journalists by leaving them crumbs, and the journalists would report it out, and everyone would laugh. Here you have additional layers of potential subterfuge, because you have people not just posting content but coordinating at the same time in spaces that journalists don't have access to. So it's even harder to know what you're dealing with and how you should best respond, or where something even started. There are problems with explainer articles that just give you a greatest hits of all the most offensive memes, but at the same time, on public platforms, you can track content; you can figure out where it's coming from and what it ends up doing. Now it's harder to do that, because these individuals have needed to become more diffuse, because their stomping grounds have been stamped out. You see the same kind of trouble in Brazil. The number one social platform in Brazil is WhatsApp, if anybody is familiar with it. So they have a hell of a time figuring out what to do about disinformation campaigns, because you can't track where something starts. You don't have access to the group, and therefore you can't see where it begins, how it spreads, and how you might redirect it. Even if debunking or fact-checking were appropriate, it wouldn't necessarily be possible when you're talking about these kinds of spaces.
So it creates a lot of complications that exacerbate the existing complications of these more public-facing sites. It makes the job of a disinformation researcher that much harder, and these groups have that many more places to hide. Are there other questions? Yeah, there are, so we'll come back to you. You can email me if you have questions. Oh, but if we have time, maybe ask in public. What is your email? I just want to go to people who had already raised their hands first. You can find it on the Syracuse University website, under Dr. Whitney Phillips, and you can come up afterward too. Yeah, please, I invite all of your emails. I'm very interested in your description of how the apex predators, intentional and overt white supremacists and others who incite violence, are dependent on the behaviors of those lower in the pyramid, and you've alluded to an ethic of responsible use that might starve those apex predators. I'm wondering if you could give some hints about what an ethic of responsible use looks like for ordinary citizens? Yeah, that's a great question. So one classic example of an apex predator would be the President of the United States. And I'm not being facetious: when Donald Trump tweets something, it only carries so far on its own. He only says the things he says one time. They get repeated over and over and over, ad nauseam, because all the journalists cover it, because individual citizens respond, because they retweet it. Algorithms don't care whether you're responding to something out of disgust or out of support. Simply engaging with content is gonna float it. If there's suddenly a mass response to any content, the algorithms register that it's happening, and then, usually on the right side of the page, they're gonna give you trending topics.
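The sentiment-blind engagement counting described here can be sketched in a few lines of Python. This is a toy model, not any platform's actual ranking code; the post IDs and interaction types are made up for illustration. The point it demonstrates is the one above: the counter ignores why you engaged, so an outraged reply boosts a post exactly as much as a supportive retweet.

```python
# Toy illustration of engagement-based trending (not a real platform algorithm).
# Key property: the ranker is sentiment-blind. Disgust and support both count.

from collections import Counter

def trending(interactions, top_n=3):
    """Rank posts purely by interaction volume.

    `interactions` is a list of (post_id, interaction_type) pairs.
    The interaction_type ("retweet", "angry_reply", "like", ...) is
    deliberately ignored: only the raw count of engagements matters.
    """
    scores = Counter(post_id for post_id, _type in interactions)
    return [post for post, _count in scores.most_common(top_n)]

interactions = [
    ("provocative_tweet", "angry_reply"),   # engaging out of disgust...
    ("provocative_tweet", "dunk"),          # ...or to mock it...
    ("provocative_tweet", "retweet"),       # ...or to support it: same signal
    ("cat_photo", "like"),
]

print(trending(interactions))  # the most-engaged post ranks first, outrage and all
```

Because every engagement increments the same score, responding to content "against" it still floats it toward the trending list, which is the mechanism the answer goes on to describe.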
And so then when something is trending, more people are gonna click it, and then it becomes an even bigger story. Without all of those mechanisms, Donald Trump could never have become President Donald Trump. He was always reliant on the signal boosting of mainstream news outlets. And mainstream news outlets at the time kind of knew it. They just thought he was hilarious, and they knew it would be good for clicks. So it was a symbiotic relationship, until suddenly it wasn't. But in terms of what people can do, the first thing to remember is that we are never outside the stories that we're commenting on. Whether we're journalists or not, and this is a particular ethos within journalism, there's never a view from nowhere. We're never a casual observer of content. We are part of a story the second we engage with that story. How does it work? It just is how it works, and that's something that doesn't get discussed, because the assumption is, this is what Donald Trump did, this has nothing to do with me, I'm just saying that this is absurd, or whatever your reaction to him would be. An ethics of responsibility would take seriously the idea that we are all always right in the middle of it. So, the hurricane metaphor, the way that it works in the book, is that a hurricane is formed through an amalgamation of all kinds of overlapping variables. It's the wind speed. It's the temperature of the water. It's the axis of the earth. It's all of this different stuff that results in the formation and movement of a hurricane. You would never, in a million years, point to one gust of wind and say, that's a hurricane. But that's what we tend to do with online stories. We talk about things as being separate from everything else, including ourselves, as if we're not part of the story. We're always part of the story.
And it might be a small part, because maybe we're a fungus, but we're still part of that strata. We're still part of that ecosystem. And understanding that, recognizing that what I choose to do, how I choose to respond to Donald Trump, has some effect, even if I can't see it, even if I don't know exactly what that effect is, just recognizing that we're part of it changes your relationship to that content. Because then suddenly it's not a text. Suddenly it's an ecosystem that you are a part of. And reframing our online activities in that way requires a greater emotional burden to carry, because we don't have the luxury of just saying, this has nothing to do with me. Everything has to do with the things you do. But it normalizes a sense that what I do connects to you and what you do connects to me. And then hopefully that can change the kinds of choices that people feel compelled to make, when they realize that they're part of the story. Part of the story. That's where we are in the story right now. What if we decide to leave the story, is that possible? Oh, to opt out? Well, in this case, I mean, if I decide to use social media for good, then it's okay, right? That's what we say. Or we don't use social media. So just go with that thread. What are the choices, really? And, you know, I'll just stop there. Yeah, we're in a position, all of us, where many of our professional lives are contingent on utilizing social media in one way or another. We've been put in a position where fully opting out is not possible. Some people have more of an ability to step away, but other people, small business owners or people who are involved in an activist community, they can't just leave. And if it were possible to totally disconnect, then I would advocate for that. But many people's circumstances require that they engage on these platforms.
And so then the question is, how do you do it more thoughtfully? And that's where the question of best practices comes along. And I would say, thinking about the most important things to take away, if I were to give you one thing to take into your outside life, it's understanding how our tools set us up to fail. We can be the best people in the world, we can be so well-intentioned, our politics can be strong, and yet we can still contribute to the problem. That's why I like using the term polluted information instead of disinformation or misinformation, because, like physical pollution, we all pollute. There's no way for us not to pollute as human beings. So we have to minimize what it is we're putting into the world. But awareness of what you're throwing out on your curb might curb the impulse to just let it all fly. So that's why I like that particular term. But in terms of what we need to know, it's that point of, how have we been set up for this to be the outcome? The tools that we use really put us in an impossible situation, and it happens in two primary ways that interfere with our ability to make thoughtful, ethical choices. Not because we're bad, but because this is how our spaces are set up. The first are the affordances of digital media writ large, and they go together as a pair: modifiability and modularity. That's basically the ability to cut things up into little pieces and modify them, do funny remixes, add text, all the kinds of things that are creative and great, or also terrible and racist, either way, but all this stuff that allows us to be creative and playful with our online content. To take something big and then make something smaller out of it. You take a whole film and you take out your favorite reaction shot, your favorite image, your favorite whatever. So those are the first two.
The second two are archivability and accessibility. So you have made this creative content, and you have the ability to search for it and access it later. And other people can access it too, because you can do a Google keyword search and then get certain funny pictures. Those affordances are at the basis of all of the creativity that happens online. They are also fundamentally connected to the process of decontextualization: we use our tools in a way that cuts up context and leaves just the text. A text is very difficult to respond to ethically, because often you have no idea whose toes you're even stepping on. You don't know where content originated. You don't know whether the person featured in the reaction shot consented to their image being used. You don't know what happened to them. You don't know very much. And so you can't make sound, thoughtful, ethical choices, because there's so much that's unknown. So those affordances set us up, not to be bad people, but to not even be able to conduct an ethical calculus. That's the first set. The second set of affordances has to do with our platforms themselves: the ability to share things, which is what they were designed for us to do. Retweeting, reblogging, reposting, commenting, anything that allows content to spread from one person to another. First of all, that's how we all connect, which can be positive. But it does two things in addition to that. The first thing it triggers is what's known as Poe's Law, which is an internet axiom not related to Edgar Allan Poe. In the early 2000s, there was a poster on a creationist forum who went by the name Nathan Poe, probably not his real name, who noted that it was often impossible to distinguish a sincere flat-earth creationist from a satirical one. Just by looking at something online, you can't tell if someone is joking or being serious. So you have this axiom.
And this is especially true when you're not talking about people you know in real life. You know that your cousin has a dry sense of humor, or you kind of know how to interpret something that a friend of yours says online. The further removed you are from knowing a person in an embodied space, the fewer meta-communicative signals you have. You don't know how to interpret what they're saying. You just have their words, which can be very confusing. Because then you don't know: is it a real threat? Is it a joke? Are they a white supremacist? Are they mocking white supremacists? What am I looking at? Then, in addition to Poe's Law, as an added complication, you have context collapse. So in this room, I know I'm being filmed, so the audience will expand in ways that I can't predict, and that's actually what context collapse is. But in the room itself, I can count you. I can see you. If any of you suddenly got very upset, I would know that it happened. I can read the audience. I can read you. Online, you often have no idea how many people are paying attention at any given time. You don't know where your content is gonna go once you post it. It may have been intended for a group of just five of your closest friends who know your sense of humor, but that content could travel to totally new contexts, to people who don't know how to read your sense of humor, who take you very seriously, even though you were making what would be an obvious joke to you and your friends. So Poe's Law plus context collapse mean that you just can't tell where something started, who it was meaningful for, what they were hoping to accomplish, and therefore you have very little ability to know what can or should be done in response. And then you add the decontextualization of being able to slice and dice your online content, and you're just at a loss to know what's appropriate. How should you react?
And so the number one thing in thinking about what you do online: you must take the things you don't know as seriously as the things you do know. Because we are in a culture that values empiricism, we believe that seeing something online, that observation, is confirmation, and it isn't. You very rarely know for sure what it is you're looking at. And we need to be better at assessing those unknowns and asking ourselves, is there potential for harms that I am not anticipating? And if you can't answer the single question, who will this sharing benefit, if you don't actually know the answer to that question, then you should think very differently about what you're doing. Because if there's a chance that what you share is gonna benefit white supremacists, then maybe don't share it. And if you have no way of knowing, no way of verifying, whether it's a joke, that doesn't mean you can't respond or that you shouldn't respond. It means taking the time to stop and reflect and think: what might this do? And does my inability to know what this might do change my mind about how emphatic I am about sharing it at this moment? That's the number one thing. We never think about the stuff we don't know. We only focus on the stuff we can see. And the stuff we can see is going to be misleading in one way or another, because the deck has been stacked against us from so many different directions. I'm so sorry, but we've reached the end of our time. Please join me in thanking Dr. Phillips. You can check our website, CriticalConnections.org, for our upcoming events, and KarunaCenter.org as well. And thank you to Jeremy.