I'm not sure how many of you noticed, but on Monday, Facebook and its family of apps were down for hours, causing Mark Zuckerberg to lose $6 billion and leading to the company's stock tanking. For some additional details, we go to The New York Times, which explains: Facebook's apps, which include Facebook, Instagram, WhatsApp, Messenger, and Oculus, began displaying error messages around 11:40 a.m. Eastern time. Within minutes, users reported, Facebook had disappeared from the internet. The outage lasted over five hours before some apps slowly flickered back to life, though the company cautioned that the services would take time to stabilize. Technology outages are not uncommon, but to have so many apps from the world's largest social media company go dark at the same time was highly unusual. Facebook's last significant outage was in 2019, when a technical error affected its sites for 24 hours, a reminder that a snafu can cripple even the most powerful internet companies.

But it's over now, and when the platform came back, Facebook's CTO, Mike Schroepfer, blamed the outage on, quote, "networking issues." But if I'm being 100% transparent here, I really wish that Facebook had never come back. If we could delete Facebook from existence, the world would literally be a better place, I think. And not just delete it from existence; we need one of those memory-erasing flash pens from Men in Black so we can make all of humanity forget it ever existed. That's how damaging Facebook is to the world. Not only is it a cesspool of misinformation, but on top of that, Facebook is profiting off of hate. And that's not hyperbole, and it's not speculation; it's confirmed by a whistleblower who explained, in a 60 Minutes interview, how the platform's algorithm literally prioritizes hateful content above all else, because that's what keeps people engaged and keeps them on the platform.
Now, that revelation isn't necessarily surprising; in fact, it's pretty obvious. But it runs counter to what Facebook's CEO, Mark Zuckerberg, has said. Everything the whistleblower is alleging here, Mark Zuckerberg claimed is not actually the case. So take a look at this video of Mark Zuckerberg juxtaposed with the whistleblower's 60 Minutes interview, courtesy of The Recount. It shows you that this man was lying to everyone about his platform and the way it operates.

Question: "Do you believe your product can be addictive?"

Zuckerberg: "We certainly do not design the product in that way."

Whistleblower: "Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money."

Zuckerberg: "The research that we've seen is that using social apps to connect with other people can have positive mental health benefits and well-being benefits."

Whistleblower: "Facebook's own research says as these young women begin to consume this eating disorder content, they get more and more depressed, and it actually makes them use the app more."

Zuckerberg: "The way we design our algorithms is to encourage meaningful social interactions."

Whistleblower: "Its own research is showing that content that is hateful, that is divisive, that is polarizing gets engagement or reaction."

Zuckerberg: "But election interference remains an ongoing threat, so we continue to improve as part of our ongoing commitment to supporting the civic process."

Whistleblower: "They basically said, oh good, we made it through the election. There weren't riots. We can get rid of civic integrity now."

Zuckerberg: "We strengthened our enforcement against malicious and conspiracy networks like QAnon to prevent them from using our platforms to organize violence or civil unrest."

Whistleblower: "But after the election, Facebook was used by some to organize the January 6th insurrection."
Now, for some additional context: apparently the platform got worse as of 2018, when Facebook changed its algorithm. That's when it really began to prioritize hateful content, because that's what gets people coming back. And just for a moment, before we go to what the whistleblower says here, think to yourself about how these social media platforms keep you engaged when you're arguing with someone. Even if you yourself aren't racist, if you confront someone who said something insensitive, you wait for that person to respond, and then you respond back. It's almost addictive, in some weird way. The algorithm picked up on those things, and they knew that's what keeps you on the platform. That's why they prioritize hate and they prioritize misinformation, which is why we see so much anti-vax misinformation. They prioritize whatever keeps people coming back to the platform, even if it's detrimental to humanity, even if it's detrimental to public health, even if it makes society worse off. The whistleblower explains this:

Whistleblower: "You have your phone. You might see only 100 pieces of content if you sit and scroll for five minutes. But Facebook has thousands of options it could show you. The algorithm picks from those options based on the kind of content you've engaged with the most in the past. And one of the consequences of how Facebook is picking out that content today is it is optimizing for content that gets engagement or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions."

Interviewer: "Misinformation, angry content is enticing to people and keeps them on the platform."

Whistleblower: "Yes. Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money."
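To make the mechanism the whistleblower describes concrete, here is a minimal toy sketch of engagement-optimized ranking. This is purely illustrative and not Facebook's actual system: the signals, weights, and function names are all invented for the example. The point is structural: from thousands of candidate posts, the feed surfaces the handful predicted to provoke the most reaction.

```python
# Toy model of engagement-based feed ranking (hypothetical; the
# weights and signals are invented for illustration).

from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    predicted_clicks: float    # predicted likelihood of a click
    predicted_comments: float  # predicted likelihood of a comment/reply
    predicted_shares: float    # predicted likelihood of a share

def engagement_score(post: Post) -> float:
    # Hypothetical weights: active reactions that keep users on the
    # platform (comments, shares) count far more than passive clicks,
    # so content that provokes argument rises to the top.
    return (1.0 * post.predicted_clicks
            + 5.0 * post.predicted_comments
            + 3.0 * post.predicted_shares)

def rank_feed(candidates: list[Post], n: int = 100) -> list[Post]:
    # From thousands of candidates, show only the top-scoring n.
    return sorted(candidates, key=engagement_score, reverse=True)[:n]
```

Under a scheme like this, a post that is merely clicked on loses to a post that sparks replies, which is the dynamic the interview describes: whatever provokes reaction wins the ranking, regardless of what the content is.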
So Facebook profits off of making people more hateful and more misinformed. It's just a truly terrible platform. And again, I want to reiterate that I was disappointed to learn that Facebook came back. Now, you might think I'm a hypocrite, because I am on Facebook; we post Humanist Report videos to Facebook. I've actually thought about leaving, but I decided against it, because with how much misinformation is on Facebook, and with how popular sites like The Daily Wire and figures like Ben Shapiro are on there, I think any additional voice trying to debunk these conspiracy theories about the vaccines and debunk right-wing misinformation is important. I think that Facebook overall is a net negative for society, but so long as it remains a very popular platform, I do think it's important for leftists such as myself to occupy that space and try to combat it at least a little bit. Maybe somebody shares one of my videos about COVID-19 with their anti-vax uncle and it gets through to him. I don't think the success rate is that high, but if it helps even a little bit, then I think it's worthwhile. Still, ultimately, I think the platform is horrible. That's not to say no good has come out of it: a lot of people have organized on Facebook, and it's a way for people with disabilities to communicate with others. So it's not all bad, but the way they designed it is specifically to bring out the worst in people and to misinform people, and that's where the issue comes in. That's why Facebook needs to be broken up. If I had my way, it would be nationalized. These social media companies have to be regulated; right now they're basically unregulated, and that's why they're doing things like this. Of course a multi-billion-dollar multinational corporation is going to prioritize profits over people. That's not surprising.
We're seeing it across all kinds of industries when it comes to climate change. So this shouldn't be a shocker, but it should catalyze action from lawmakers: they should force Facebook to change its algorithm or, at a minimum, to be more transparent about it. I don't know what else to say about this, but Facebook is awful, and if you have the option, you should definitely delete your Facebook account. For those of us who are trying to produce content that counters right-wing misinformation, I think it's important to remain on there for now, but I don't know how much longer it'll be worthwhile. Facebook is truly just awful.