Good morning, John, it's Thursday. So, remarkably, we have mostly forgotten about this by now, but there was a day this week when a lot of people had the thought: Facebook has been down for a really long time now. Maybe it won't come back. That was accompanied by this tweet getting 400,000 likes and retweets: someone deleted large sections of the routing. That doesn't mean that Facebook is just down; from the looks of it, that means that Facebook is gone. Several people then wrote blog posts about how Facebook might be gone, based on a tweet from someone who is not qualified to comment on it, because that would be the most interesting piece of news that day, and I'm sure it got a lot of impressions. Which I guess is the goal, though it does rely on a complete misunderstanding of how websites work. So maybe that's lesson number one: while something is happening is the worst time to learn about it, but it is also the time when we most want to know about it, and that is the root of one of many different destructive mechanisms that happen on social media. It has nothing to do with what platform you're on. It has to do with the barriers to distribution and discovery being low enough to ice skate over. This all came on the back of Frances Haugen, a former Facebook employee, releasing a huge trove of documents in which Facebook did research on itself and discovered that it was, you know, not great. Research that it then promptly ignored and kind of hid. So Facebook's no-good, very bad week gave us a chance to think: what if it didn't exist? But actually a lot of people really do depend on Facebook to maintain their businesses, their social lives, their families, their connections with people in other countries who they love. I'm just saying maybe don't go to the people who are gleefully celebrating to be the ones who have the best insight on this topic. But I do think it's important to engage in a thoughtful way with the existence of Facebook, the impact of Facebook.
Hence this very long educational video, which makes it exempt from the four-minute Vlogbrothers rule. Now, I have over the years been vocally critical of Facebook for a number of different reasons, so you might expect me to just, like, launch out with how bad and evil it is. Oh, it's a fun little social network that undermines the foundations of our democracy and lets me know when my dad is done building the deck. Why would those two things be put together? I don't know. Justin Timberlake was bringing sexy back 15 years ago, when I was a full-fledged adult and Facebook was only available to high school and college students. Now it is the sixth most valuable corporation in the world and possibly the most influential corporation ever. Like, yes, there have been newspaper monopolies that controlled the news. There have been energy monopolies that controlled energy. There have been telephone monopolies that controlled long-distance communication. But Facebook might know more about you than you do. Like, their internal research has shown that they can affect people's moods. They can decide whether to inform or misinform you. They can help a community thrive. They can make a community stop existing. And they can do that not just with, like, 60,000 people, like the size of Missoula. They do it with billions, billions of people. You start thinking about this too hard and you realize that Mark Zuckerberg might be the most powerful person to exist, which is, like, not a super comforting thought. Facebook has a lot of power. Like, a coalition of European politicians on both sides of whatever they have, I guess an aisle (do they have an aisle?), wrote to Facebook and said: your platform is forcing us into more extreme positions than we want to have, and maybe even than our constituents want us to have, because in order to get elected, we have to play the attention and engagement game that you have set up for us.
Now you might say, well, those politicians should have some spine and do what they want to do rather than what, like, the mass of people on Facebook want them to do. But let's all accept together that a politician's incentive is to get elected. And if they don't engage in the attention game and someone else does, they don't get elected, and then that person is in power. Look, it's easy to not be sympathetic toward politicians who do that, but I am also very aware that there are probably a lot of people who would be very good public servants who are not ever going to consider taking on that line of work because of the environment that the social internet has created. These platforms make life miserable for every politician who is not a garbage human. We are at the very beginning of this. It always feels, in any given moment (I'm old enough now to recognize this), like where we are is the destination. We have arrived at this moment, and that is where we were going. And then that turns out to be not true. This moment ends up being another step on the journey to the other place where we were going, which is the new present that we will experience in the future. So I'm just trying to remember that we are at the very, very, very beginning of this. If you ask someone what the most disruptive technologies in history are, they are probably going to tell you about some weapon, which I get, but I think the most disruptive human technologies are always communications technologies, ways that people transmit information from person to person. Stories and books and plays and trade and radio and television. Radio was fantastic for the Nazis. The Catholic Church was fractured by the printing press. I used to think that I irrationally believed that the social internet was as big of a deal as the printing press. I now believe that that is 100% a rational belief. In fact, it feels like a common belief.
And so if we are at the beginning of this, and it is a huge revolution in the main human thing, which is human-to-human communication, then it shouldn't be a surprise that we're bad at this. And we are bad at this. I'm sorry if you work at one of these companies, but we need to accept this. Facebook and Twitter and TikTok and YouTube are bad at this, and you, watching this, are bad at it, and I, making this, am bad at it. Of course we're bad at it. It's revolutionary and it's a baby. We have no systems for how to deal with many-to-many decentralized communication. It's brand new. We don't know what we're doing. So, as with apparently everything, we have to figure it out. And that means that some, not all, but some thoughtful people need to engage with it in a serious way. And if you're watching this, maybe you're one of those people. And actually, I was a little bit impressed by the congressional hearing: the testimony that Frances Haugen, the Facebook whistleblower, gave, and the questions that were asked, seemed to me to show a much deeper understanding of how all of this works than I had previously seen. It means maybe we're starting to have useful conversations about how to live in a society that contains social media. And look, it is very easy to say Facebook sucks. I say it all the time. But it is much harder to imagine what one would do to make Facebook better. Because guess what? Apologies to the random bloggers who thought that Facebook was gonna not exist anymore. It's gonna keep existing. If Mark Zuckerberg wanted to just pull the plug and be like, well, what a great experiment, we're not gonna do it anymore, that would be a hundo-P baller move. But it does seem unlikely. I feel like Mark Zuckerberg has a lot of incentives to believe that Facebook does more good than harm. And so he's probably gonna believe that whether or not it's true, which is actually, for clarity, not something I know.
Honestly, I'm not sure many people could even answer the question: what is Facebook? Let's try and build a mental model of what Facebook actually is. I'm not saying it's going to be perfect, but I am saying that it's probably gonna be better than whatever jumble of hot takes you currently have associated with your Facebook neuron. Facebook, simplified: there are three centers of power. There are the people who consume the content, the people who create the content, and the people who host and promote the content. That third one is Facebook, this finger. I'm not flipping off Facebook. All three of these groups have different goals, but it turns out that their incentives align really well, which is how Facebook became worth a trillion dollars. So let's go over the incentives. Let's try to understand why these people are all doing what they're doing. Starting with the people who create the content. They are trying to make their numbers go up, whether that is for status or for money. And trust me when I say watching the numbers go up can feel really good. And once you start to get them, it can take a whole lot of time and work to start divorcing your self-worth from your internet numbers. Facebook needs to incentivize the creation of content on its platform because without that, there's nothing there for people to do. And so they provide a lot of different feedback mechanisms that make people feel good about the stuff that they make, that make people feel like they are acquiring status and, very occasionally, that they are getting revenue. Next, let's talk about the people who consume the content, which is to say everyone. And let us all accept that we, as people who consume content, have incentives for what content we consume. We like cute cats. We like good jokes.
There are a number of different incentive cycles here, but the ones that we tend to talk about, and need to talk about most, are the ones that are destructive to society or to the people who are consuming the content. And the number one emotion everyone talks about when talking about this is outrage. And I would like us to stop talking about outrage, because I don't know what it is. Instead, I would like to talk about two other things. One is fear. People, when they are afraid, experience a lot of emotion, and they experience a lot of desire to share that information and try and take on that thing that they are afraid of. Which, like, we knew about that before social media. Like, news was always aware of fear. But the second thing that social media is very good at, that we often call outrage, is actually a feeling of righteous superiority. I feel really good about myself because I know that I am better than whatever other human I have just been exposed to or told about. Feeling like a good person who is specifically good because they are better than someone else: super seductive. Because it means you don't have to do anything to be good. You get to feel better about yourself just by looking at how awful other people are. But it is not new. Like, I don't wanna point fingers or anything, but it turns out that righteous superiority was a pretty big deal for Martin Luther. And also, people are often correct in their righteousness and in their superiority. Like, hello, vegans. And finally, the last player in the game: the platform itself. Their incentive is, or at least was, very clear. Keep people on the platform, learn more about them so you can sell more advertisements at a higher price. And so there's a positive feedback loop where platforms will promote content that makes people feel righteously superior. Those people will consume that content. They will feel righteously superior.
More people will create that content because it will do better, because the platform will promote it more. And there are also external positive feedbacks into the loop, because if you have a group of people feeling righteously superior, other people find out about them, and then they make themselves feel righteously superior to that other group because they don't wanna feel inferior. I don't wanna oversimplify this, but, like, if you run this positive feedback loop for about 10 years, maybe you end up with everyone spun up like a bunch of Beyblades thrown into a fishbowl full of fragile institutions. There are also other feedback loops, which is why there are cute cats and good jokes on all of these platforms. But hopefully this gives us a workable mental model of what Facebook is, and we can move to the next section of the video. Salute. So, yeah: we've got these three power centers, and two of them are extremely dispersed and difficult to control. Like, no one is able to control what people are posting and consuming on social media platforms. But the third one is centralized, and it's making $85 billion a year. So that makes it a more attractive target for regulators, both in terms of optics and in terms of being something they can actually control. You might have even seen or heard Facebook kind of begging to be regulated, which sounds strange. Here, for example, is the end of a statement that Facebook made in response to Frances Haugen sharing all this information about the platform. Despite all this (all this being the fact that they don't agree with anything she said), we agree on one thing: it's time to begin to create standard rules for the internet. It's been 25 years since the rules for the internet have been updated. And instead of expecting the industry to make societal decisions that belong to legislators, it's time for Congress to act. This is a good point. Like, it's a good talking point. It's also just a good point.
But let me explain why you're gonna be hearing a lot of it, and why it might not be as good as it sounds. This is Facebook saying: okay, I see that you think that we've created a lot of problems. How about you fix them? Are we gonna give you advice on how to fix them? Nah. Are we gonna give you insight that will allow you to understand the platform well enough to create good solutions? Also, nah. They're basically saying: okay, I see what you're doing. You're saying that we're the big bad and the responsibility's on us. We're saying, what about you? We think you're the big bad. The responsibility should be on you. And maybe it should, right? Like, should it be Facebook's responsibility to impose things that control society, or should that be the job of legislators? But this outlines a really good and interesting point, which is that Facebook is actually, with regards to this stuff (unlike basically everything else, where they're doing fine), kind of in a terrible position. So a company as big and powerful as Facebook can kind of have one of two goals. It can either be out there for profit, or it can be out there for profit and. And what that "and" is gets really complicated when you are as powerful as Facebook. And I think maybe this conversation kind of terrifies them, because it outlines a reality that they would like us to not notice. They are not just a place where content goes. They decide what people see. They make millions of decisions per second about the world that people experience. And it is very easy to make those decisions when you have a particular goal, that goal being: how do we keep people on the site longer and learn more about them so we can sell them more effective advertisements? And then you can start to put on some little limits. Like, if this is gonna incite violence, if it's hate speech, we're gonna take that off. But that's just about the law and, like, what the right thing to do is.
But if you get beyond that and you start to say, okay, well, what sort of counts as hate speech, what counts as incitement to violence? And raising the barriers higher and higher and being like, ah, that's on the line, but we're gonna call it incitement to violence: that's gonna constrict the speech of some people more than others. Particularly, it's gonna constrict the speech of members of certain political parties more than others. And of course they can legally do that. They're a private company, until they can't anymore. You're already seeing this: like, right-wing politicians getting up there and saying that people are being censored on social media, in a way that of course makes their constituents feel both afraid and righteously superior. What a win! And many of them are threatening a whole separate set of regulations that would prevent these social media platforms from policing themselves, which, for clarity, would be a disaster. So Facebook is saying: if you want us to do something, tell us what to do, because we're afraid that if we do more than the bare minimum, you're gonna regulate away our ability to do anything at all. So until you make that call, we'd rather just do what is normal and expected of a corporation, which is to focus on profit. Now, there are also other reasons why Facebook is focusing on this conversation about regulation. One is that it would regulate all internet companies, not just Facebook: the rules of the internet, not the rules of Facebook. And regulations are often actually a competitive advantage for existing incumbent large businesses over potential competitors, because the regulations increase the barrier to entry into that industry. I'm not saying I'm anti-regulation on these companies. I'm very pro-regulation on these companies. I'm just not so pro-regulation on these companies that I will ignore reality.
But the last and maybe most important reason why Facebook is calling for regulation on itself is that then they get to go after the actual individual pieces of legislation, the individual suggestions, which they will of course do. Like, yes, please regulate us, but then any individual regulation they will fight tooth and nail. And yes, it is a lot easier for a legislator to just, like, yell about how much Facebook sucks than to actually write legislation that would make it better. Especially when regulation is complicated, whereas shouting that Facebook sucks can make them and their constituents feel, wait for it, righteously superior. None of this regulation would be easy, because yes, legally, Facebook can control its product in whatever way it wants to, but let's accept that it's not just a private corporation in that way anymore. Like, we don't want it to be able to control society in whatever way it wants. And that is why they're sort of very hesitant to do that, because it's terrifying. Facebook is a place where communities thrive, where families connect, where businesses grow. In a very real sense, it is kind of both corporation and government. Like, if you spend a lot of time on Facebook, a lot of your life there, they are in charge of that part of your life. That makes them a kind of government. A fact that they would like everyone to keep not realizing for as long as possible, because it has implications. You'll never know if you don't go, you'll never shine if you don't glow. Oh my God, I've been recording for 50 minutes now. What the hell did I do to myself? I do have more that I need to say here. I've implied this already, but I'm gonna say it out loud right now. The real danger and power and weakness of Facebook and YouTube and TikTok and Twitter is not that they host content. It's that they decide what content gets new eyeballs on it. Hosting a piece of content is the building block that the entire internet was built on.
Promoting content is an active editorial decision that is being made by the company. Now, that decision might be being made by a computer program that no one understands, that only has inputs from the users of the platform. It is still a decision that the company makes. And so when a hateful Facebook group gets promoted, when there's an anti-vax talking point trending in the sidebar of Twitter, and when watching a YouTube video about history takes you into the alt-right pipeline, those are decisions that are made by platforms full of thoughtful people who did not think at all about the fact that that was gonna be a thing that happened. This is a problem with every social media platform that uses algorithms. Like, you can scroll all day on TikTok and it will just be full of people celebrating the diversity of humankind and their own remarkable human body and mind, and you will never be exposed to the garbage, terrible alt-right pipeline inside of TikTok, because TikTok is really good at knowing who you are. What I'm saying is, when people say that Facebook should be held responsible for the content that's on its platform, I agree, but in a pretty limited way. Like, they should not host hate speech, they should not host illegal content. But I believe, in a much less limited way, the following sentence: Facebook should be held responsible for the content that it promotes on its platform. I think that that is something that every single person who works at a social media company needs to square with. And it's not an easy thing, because it is very difficult to create that joyous, wonderful experience that you might have on an algorithmic feed without indulging other people's righteous superiority and their fear in a way that continues to spin us up, create division, and destroy fragile institutions.
In general, an algorithmic content platform's response to this situation is: okay, let's create artificial intelligences that can identify hate speech and incitement to violence, and then we will take down those individual things, but we're gonna leave up this remarkable self-reinforcing feedback loop that is so fantastic at keeping people engaged on our platform. Which is a little like saying, we're gonna, like, follow behind the dune buggy full of flamethrowers and put out the fires that it's creating. Yeah, guys, you gotta focus on the dune buggy with the flamethrowers. And that is a conversation that I actually feel like is starting to happen. I just wanna tell you how I'm feeling. Now, I feel like it's necessary to say: there are people who are smart, and who I respect, who say that when you talk about regulating social media companies, you're not talking about regulating media, you're talking about regulating society. And we don't want the government to regulate society. That's scary. My response to that argument is that Facebook is already regulating society. It is just doing it with a very narrow, short-term focus, which is: how do we increase the amount that we know about people and the amount of time they spend on our platform so that we can make more money advertising to them? Double rainbow all the way across the sky. I wish at the end of this video that I had, like, good advice for how to move forward. Like, I think that we're not gonna educate people out of responding to fear and indulging in righteous superiority. But there are some signs that maybe, like, immunities do kick in, and some people, without thinking about this actively, realize that that kind of content seems, like, overplayed. But we can't say Facebook is only bad because humans make bad decisions, because, like, humans are humans and we're not gonna get out of that.
It might be that these platforms start to get their algorithms to recognize when they are engaging in the promotion of societally or personally destructive content. I encourage all computer scientists to work on that problem. I think that we need to say to ourselves: it would be very hard for Facebook to make a profit in a world that has been completely ripped apart by Facebook. And as for what legislators should do, I don't know. I'm not, thank goodness, I don't have your job. But I will say I'm very much not in favor of making it so that platforms are ultimately responsible for, like, anything that gets posted on their platform. That would basically make it so that none of these things could exist at all. But I am in favor of saying that they are responsible for what they promote on their platforms, because that is their decision. I don't know if the world would be better without this many-to-many decentralized communication system that we have developed. I know my life would be very different without it. But I will reiterate something I said earlier, which is that we are at the very, very beginning of this. Of course we are bad at it. We're gonna look back in 20 or 30 years at the way things are right now, and we're gonna think about how quaint and clunky and just hilariously bad it was. I don't know how we're gonna move forward from here, but I know that we will. John, I'll see you on Tuesday. All of our lovely Pizzamas stuff is available only during Pizzamas, so for the next week or so, and then it will never be available again. None of it will ever be available again. You can go to Pizzamas.com right now to check it out. All of the profits go to support our community's work developing better healthcare systems in Sierra Leone.