All right, so I'm not sure how many of you have heard about the Facebook Papers, but essentially the Facebook Papers are thousands and thousands of pages of internal company documents that multiple news outlets are currently picking through to confirm what we kind of already knew about Facebook: that it's a terrible, profit-driven company that is turning boomers into racist conspiracy theorists. And, you know, nothing here is that shocking to me, but it does give us some more insight, specifically into how the algorithm causes brain rot in a lot of people. And I think that's interesting because I've seen firsthand the way that Facebook manipulates people. Individuals who I've known for years, who were always apolitical and never really took a stand either way, have become radicalized. They believe conspiracy theories that are batshit fucking insane. And these are people I really respected, people I got along with because we had similar views when they did express their beliefs: atheists, former coworkers I've kept in contact with. It's sad to see, but it very clearly is the impact of Facebook's algorithm.

So CNN reporter Donie O'Sullivan explained one document that laid out how Facebook employees conducted a test to see the impact their algorithm had on someone who wasn't overtly political. They created a fake account and tried to see how quickly the algorithm would recommend extremist content. What happens with the algorithm isn't going to surprise you, but the speed with which this person gets recommended radical content, that's what's really horrifying, because the implications of it are nauseating. But nonetheless, I'll let you watch the video, and then we'll talk about it and look at some more revelations from the Facebook Papers when we come back.

How does your Facebook feed become so politically polarized? In the summer of 2019, Facebook ran an experiment to find out. It created a fake account for a 41-year-old mom living in North Carolina. They called her Carol Smith. Carol started off by liking a few popular conservative Facebook pages, like Fox News, Donald Trump and Melania Trump. But quickly, Facebook began dragging her down a rabbit hole of misinformation. After only two days, two days, Facebook recommended Carol follow a QAnon page. And a few days later, it suggested she follow another. This experiment was never meant to be made public, but details about it were included in documents leaked by Facebook whistleblower Frances Haugen, who says the company is not doing enough to crack down on conspiracy theories and online hate.

And they know that algorithmic-based ranking, engagement-based ranking, keeps you on their sites longer. You have longer sessions, you show up more often, and that makes them more money.

By week three of the experiment, Carol's feed had become, quote, a constant flow of misleading and polarizing content, according to the Facebook employee who was running the account. A lot of us spend way too much time on social media. And when we try to cut back on those apps, companies like Facebook will often send us a push notification to lure us back. That's exactly what happened during this experiment as well. The Facebook employee noted how they were traveling for a conference in the second week of running Carol's account and were checking Facebook a little bit less.
And so Facebook began sending push notifications. One notification was actually to a Facebook post claiming Barack Obama was born in Kenya. This was in 2019, years after that ludicrous conspiracy theory had been widely debunked.

What it does is amplify the messages that it knows will drive engagement. And it just turns out we humans get most riled up by lies and hate and all sorts of misinformation.

After running the experiment for four weeks, the Facebook employee recommended the platform stop promoting pages that are clearly linked to conspiracy theories like QAnon. But it still took the company more than a year to ban QAnon entirely from its platform, doing so only a few weeks before the 2020 election.

All right, so I'm not surprised that that's the way the algorithm functioned. Not at all, right? But what struck me was how quickly somebody was recommended extremist QAnon content: two days after they liked the Fox News, Donald Trump and Melania Trump pages. The reason that's so horrifying is that it shows you how easy it is to get sucked into that far-right rabbit hole. It shows you why so many people have become radicalized, why there are so many more reactionaries now than there were, say, 10 years ago. And you have to extrapolate to really figure out how bad this is. So assume somebody is not necessarily political, but they've been on this platform for five years, 10 years. Maybe they don't like Fox News or Donald Trump, but something one of their friends shares comes across their timeline that is somewhat political. Maybe they don't even view it as political; maybe it's like an anti-welfare meme. I've seen the one with the two refrigerator pictures, where one side is a fridge that's full, and it says this is the refrigerator of welfare recipients, and the other is an empty fridge, and it says this is the refrigerator of somebody who's working 40 hours a week, or something to that effect. So imagine you're someone who's apolitical, and you like what is seemingly an innocuous thing on Facebook. Well, the algorithm sees that you liked this thing, so it recommends you another right-leaning thing, and progressively, with time, you're believing anti-vax conspiracy theories when you never really had a penchant for conspiratorial thinking. You never asked for this; it just sold you one thing, and then another thing, and with time, you're a psychopath. This is why so many boomers are reactionaries. It's because of Facebook.

And as Lawrence Lessig said there, it amplifies the messages that it knows will drive engagement. And so you kind of know now why left-wing content doesn't thrive on the platform. It's because we're educating people, and when left-wing content gets you fired up or angry, you kind of just feel doomer and want to check out, whereas right-wingers feel hatred and want to engage more with the content and consume more of it. So it goes to show you, with the way this algorithm functions, why right-wing content performs so well on the platform. This is why so many people in this country are batshit fucking insane. It's thanks to Facebook. And it's not just Facebook, I'll be clear; I don't want to be too reductionist. But a lot of the problems plaguing society, anti-vax conspiracy theories, far-right QAnon stuff, are propagated largely due to the prevalence of Facebook.
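To make that mechanism a little more concrete, here's a minimal sketch of what engagement-based ranking looks like in principle. To be clear, this is not Facebook's actual code; the post fields and scoring weights are hypothetical, purely to illustrate the point Haugen is making, which is that nothing in an engagement objective measures whether content is true.

```python
# Hypothetical sketch of engagement-based ranking; not Facebook's actual system.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float    # model's estimate of click-through
    predicted_comments: float  # comments signal strong reactions
    predicted_shares: float    # shares spread the post to new feeds

def engagement_score(post: Post) -> float:
    # Weights are made up for illustration. Note that nothing in this
    # objective measures accuracy, only attention.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_comments
            + 5.0 * post.predicted_shares)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by predicted engagement, highest first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = [
    Post("Local bake sale this weekend", 0.20, 0.05, 0.01),
    Post("Outrageous claim designed to enrage you", 0.60, 0.40, 0.30),
]
for post in rank_feed(feed):
    print(round(engagement_score(post), 2), post.text)
```

Run it and the enraging post tops the feed, not because anyone decided it should, but because the objective only counts attention.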
Now, there's more information included in the Facebook Papers, and there are so many revelations that I can't possibly get through all of them. But some things stood out to me that kind of go to show you how nefarious this company is: they're aware of the impact they have on society, but they're not taking the action necessary to address the negative things happening because of them. So first of all, Facebook chose to censor anti-government posts at the behest of the Vietnamese government, because, you know, profits over freedom of speech. We also learned that Apple actually threatened to ban Facebook from its App Store unless they removed human trafficking pages from the platform; they apparently struggled to get human trafficking and human slave trading under control. I mean, Jesus Christ. They were also concerned about a decline in younger users, noting that young people see Facebook content as boring, misleading and negative. So, I mean, you have some reason to be hopeful for the future. They were testing ways to, quote, rebalance their news feed after getting a reputation as a platform where political content is, quote, low quality, untrustworthy and divisive. And they actually considered a plan to promote civic health, but it kind of just went nowhere. But, I mean, at least they were thinking about it. So credit where it's due, kind of.

Now there are two things that I want to read, and they relate to COVID misinformation, specifically vaccine hesitancy, and the role they think they played in January 6th. So first, this is about vaccines, from The Verge: Facebook has taken a lot of criticism for its handling of COVID misinformation, including from President Biden, who accused the platform of killing people by letting anti-vax sentiment run amok. But the leaks show just how chaotic the effort was inside the company. One document, dated March 2021, shows an employee raising the alarm about how unprepared the platform was. "Vaccine hesitancy in comments is rampant," the memo reads. "Our ability to detect vaccine-hesitant comments is bad in English and basically nonexistent elsewhere. We need policy guidelines specifically aimed at vaccine hesitancy in comments." Comments "are a significant portion of misinformation on Facebook," says another employee in an internal comment, "and are almost always a complete blind spot for us in terms of enforcement and transparency right now." The document makes clear that Facebook already had a COVID-19 lockdown defense project dedicated to the platform dynamics created by the pandemic, including a work stream dedicated entirely to vaccine hesitancy. That team had also created significant automated flagging systems for misinformation, but according to the files, those simply weren't being used to downrank anti-vaccine comments. As of the March memo, there were no plans to develop moderation infrastructure like labeling guidelines and classifier systems to identify anti-vaccine statements in comments.

So let me try to contextualize this for you, because as someone who's a content creator on a different platform (I'm also on Facebook, but primarily a creator elsewhere), I have a little bit of insight. They struggled to derank anti-vax comments. Deranking means that rather than pushing vaccine misinformation, if the algorithm detected it, it would rank that content lower so fewer people would see it. Roughly, it would look something like the sketch below.
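Here's a minimal, hypothetical sketch of the difference between ranking comments purely by engagement and deranking the ones a classifier flags. The toy keyword "classifier" and the penalty factor are stand-ins I made up; the leaks only tell us Facebook had automated flagging systems that weren't being applied to comments.

```python
# Hypothetical sketch of comment deranking; not Facebook's actual system.
# A classifier flags likely misinformation, and flagged comments take a
# heavy score penalty instead of being ranked purely by popularity.

def misinfo_probability(text: str) -> float:
    # Stand-in for a real trained classifier. Here, a toy keyword check,
    # purely for illustration.
    red_flags = ["microchip", "do your own research", "they don't want you to know"]
    return 0.9 if any(flag in text.lower() for flag in red_flags) else 0.1

def comment_score(text: str, likes: int, derank: bool) -> float:
    score = float(likes)  # engagement-only ranking: popularity is the score
    if derank and misinfo_probability(text) > 0.5:
        score *= 0.01  # flagged comments sink instead of topping the thread
    return score

comments = [
    ("The vaccine contains a microchip", 4200),
    ("Got my second dose yesterday, sore arm but fine", 310),
]

# Without deranking, the misinformation comment tops the thread because it
# drew the most likes; with deranking, it drops below ordinary comments.
for derank in (False, True):
    ranked = sorted(comments, key=lambda c: comment_score(c[0], c[1], derank),
                    reverse=True)
    print("derank =", derank, "->", [text for text, _ in ranked])
```

That second ordering is what Facebook's own flagging systems could have produced for comments but, per the memo, weren't being used for.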
And deranking is essentially what's happening on YouTube with leftist news outlets. We, myself, David Doel, Secular Talk, anyone who is an online outlet that talks about the news, are deprioritized in the algorithm because we are not "authoritative news sources." Now, that doesn't necessarily mean we're less accurate than other news sites; it just means we're not advertiser-friendly to the extent that CNN is, for example. But the issue is that even when the algorithm props up certain entities, those entities might be bad actors. Fox News pushes vaccine misinformation that I have to debunk, yet they're viewed as an authoritative news source.

So what's interesting is that Facebook was basing everything on popularity and what would get more engagement. Rather than being deranked, these anti-vax comments were actually getting bolstered, because they drew so much attention. And that's truly horrifying when you think through the implications. This is why so many people are vaccine hesitant: they see a comment at the very top of the responses to an article about vaccines, it has thousands and thousands of likes, and they think, well, I don't trust the media, and this person has thousands of likes on their comment, so they must be saying something I don't know about. That comment could be completely factually incorrect, but because it had the most engagement, the algorithm propped it up. So it's interesting to see how on YouTube I get deprioritized in the algorithm, and you get recommended John Oliver or MSNBC after you watch one of my videos, while on Facebook it's the worst actors imaginable who get propped up by the algorithm. And it really shows you the impact that has, because Facebook is a toxic hellhole.

Now, moving on, when it comes to January 6th and the role they played, I found this fascinating. Facebook discussed developing extreme "break glass" measures to limit misinformation, calls to violence and other material that could disrupt the 2020 presidential election. But when former President Donald Trump and his supporters tried to stop successor Joe Biden from being declared president on January 6th of 2021, Facebook employees complained that these measures were implemented too late or were stymied by technical and bureaucratic hang-ups. Reports at Politico and The New York Times outlined Facebook's struggle to handle users delegitimizing the election. Internally, critics said Facebook didn't have a sufficient game plan for harmful, non-violating narratives that toed the line between misinformation and content Facebook wants to preserve as free speech. And some plans, like a change that would have prevented groups from changing their names to terms like "stop the steal," apparently got held up by technical problems (there's a rough sketch of that kind of filter below).

So when I read this, I think they realized what the problem is with their algorithm and the way it promotes misinformation and hate for purposes of profit, because all of this drives clicks and engagement. They were trying to find ways to grapple with the harm they knew their algorithm would cause, when it comes to the pandemic, when it comes to January 6th, and every time they tried, it was just kind of a lost cause.
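For a sense of scale, the rename block that reportedly stalled is, at least conceptually, a very small piece of code. The term list and matching below are my own illustrative guesses; the reporting only says "terms like stop the steal" and that the change was held up.

```python
# Hypothetical sketch of a group-rename filter; the leaks say a change like
# this was planned but stalled, not how it would have been implemented.
import re

# Illustrative stand-in list; the reporting only names "stop the steal".
BLOCKED_PATTERNS = [re.compile(r"stop\s+the\s+steal", re.IGNORECASE)]

def rename_allowed(proposed_name: str) -> bool:
    # Reject the rename if the proposed name matches any blocked pattern.
    return not any(p.search(proposed_name) for p in BLOCKED_PATTERNS)

print(rename_allowed("Raleigh Gardening Club"))     # True
print(rename_allowed("STOP THE STEAL - Official"))  # False
```

The point isn't that moderation is trivial at Facebook's scale (it isn't); it's that even measures this conceptually simple reportedly didn't ship in time.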
Now, that's not to say that on a social media website everything is going to be peachy keen. Of course there's going to be hate; of course there's going to be misinformation. But when you have an algorithm that is purely motivated by profit and promotes that misinformation, that's where the issue comes in. So it's really horrifying to think through how quickly people are radicalized, and how Facebook is aware of the issues its website causes for society at large and yet is incompetent and unwilling to fix them.

Look, there are, I shouldn't say easy solutions, but straightforward solutions that would stop this. Facebook has to be broken up; that's number one. And number two, what's left of Facebook after you break it up has to be regulated. And until that happens, until we have legislation that reins in Facebook and the harm it's causing, I think that as much as we can, we should try to stop using the platform. Now, I know it's difficult, because some of you have businesses on Facebook. And I think that leftist content creators and people who share a lot of political content probably should remain on Facebook, just to be a counter to the right-wing misinformation being peddled, even if it kind of is a lost cause; I mean, it's something to counter what's out there. But if you can delete your Facebook, if you don't use it to keep in touch with relatives, you really should.

I mean, one thing we've learned from the Facebook Papers is that younger people are already off of Facebook. We don't get a lot of views from Facebook, and it's a pretty large page, if you go to The Humanist Report page on Facebook; most of our views come from YouTube. So young people already know. It's just a matter of convincing boomers to either leave the platform or use it responsibly. And a lot of the people who are using Facebook are maybe new to technology and computers in general, so they don't know what to look out for; it's overwhelming. This is why they're so easily duped, so susceptible to radicalization: they don't know what they're doing, and they don't know what to look for. So it's tough.

But one thing I think everyone can agree on, right or left, is that Facebook's impact on society has been overwhelmingly negative. And if Facebook just went away, if we could somehow erase its existence from our memories, society would be better off. Not all of our problems would be corrected, obviously. But Facebook existing is bad for humanity, I think, just objectively speaking.