Well guys, 2021 is off to an amazing start. After months of lockdowns followed by a spotlight on police abuse that led to a summer of legitimate protests and illegitimate, but I'm told mostly peaceful riots, we wrapped up the year with an extremely contentious presidential election that brought America's political division to a new peak. And because I'm obviously the luckiest person on the planet, I got to experience that partisan animosity all over again in December thanks to Georgia's runoff election. I approve this message. This message. I approve this message. Ugh. As if that weren't enough, President Trump's complaints of election fraud and the outrage over the final outcome brought a massive protest to Washington, D.C., that ended with a group of people who have been labeled insurrectionists breaking into the Capitol building while Congress was in session. And just like at some of the other riots last year, several people got killed in the chaos. It all just feels like escalating nonsense that can only be fully described by words I can't say on this series without getting age restricted. But speaking of restricting certain types of expression, all of this brings me to the subject of today's episode: social media censorship. This is an incredibly complex, often misunderstood, but extremely important subject. So no matter who you are and where you're coming from, prepare to have some of your beliefs challenged on this episode of Out of Frame. As I'm sure you're aware, after the Capitol break-in, much of the news media blamed Donald Trump for inciting the violence in the first place. Now legally, that's a pretty bold claim that's rightfully very difficult to prove. But whether or not that argument actually holds up in court, the political media was on fire for the next week, blaming Trump until he was eventually impeached. Again. 
I'll leave you all to debate about that in the comments, because what I really want to talk about in this video is what happened next. After years of threatening, Twitter finally banned Trump from their platform. Facebook did the same thing. Then Google and Apple dropped the competing social media app Parler from their app stores, and Amazon Web Services kicked them off their servers, claiming that they violated their terms of service by allowing violent threats and hate speech to proliferate on their network. To be fair, Parler's CEO claims that they actually got shut down to stop Trump from joining, so we'll see about all that. Either way, all the major social networks started engaging in a massive purge of individuals, groups, and pages that had anything to do with Trump, the Stop the Steal protests, or in some cases Republican politics in general. Predictably, all these actions enraged a large percentage of the 74.2 million people who just voted for Trump and who now have even more reason to believe that big tech is using its immense power over human communication to manipulate the political direction of the country. For someone who deeply cares about free speech, the last few weeks have been shocking to watch in a bunch of different ways, but it's all seriously reinforced a point I've made many times to anyone who will listen. You can't censor your way into less extremism. Lord knows it's been tried. For years, we've seen growing hostility from people who believe they have a right to silence any ideas they consider too dangerous to be heard. It's become a moral crusade. They're doing it for the greater good, after all. And sure enough, over the last decade, this kind of cancel culture has attacked hundreds of targets with an incredible mix of arrogance and ineptitude. 
They've bravely promoted the escalation of violence against fascists, while simultaneously using that label to describe just about anyone who doesn't blindly support the latest ideological fad in the woke orthodoxy. It's a confusing world where professors who rejected racial segregation on college campuses, actors who took roles outside their personal experience, artists and musicians who are influenced by other cultures, popular comedians who tell jokes that don't land years or decades after they were spoken, and even respected academics whose work might lead to uncomfortable conclusions all get tossed into the same category as society's worst bigots. But that's exactly what's happened. People who aren't extremists or bigots have been shouted down at events, cut off from social media, doxed and harassed, physically assaulted, and become the subjects of coordinated de-platforming campaigns in the name of purging wrong-think from the world. In some cases, the social media companies themselves have actively participated in these kinds of witch hunts. And yet, strangely, none of this has done anything to de-escalate tension across our society. In fact, the more this illiberal minority tightened its control over what ideas people are allowed to openly discuss, the more polarized and detached from each other we've all become. And with serious political violence becoming more and more common on both sides of this divide, we need to take a moment to learn how we got here and how to fix what's gone wrong. It's hard to say exactly how far back we need to go here, but I do think it's important to recognize that a lot of the shifts in our culture have been a product of demographic and economic changes to the makeup of our society. Thanks to widespread free trade, coupled with innovations in mass communication, cheaper energy and new transportation technology, we all experience a much more international existence than anyone's ever had before. 
Overall, this has been a huge win for humanity. Global poverty and hunger dropped significantly. Access to health care and education shot up. Lifespans improved, child labor went down. We have more and more affordable goods and services, faster innovation, and vastly more wealth. But this also resulted in some major political and cultural changes, not all of which have been good, nor have their costs been felt equally by all groups of people. Especially in the United States, the cultural split between people living in urban and rural areas exploded. So where once a large percentage of Americans could relate to a blue-collar, duck hunting conservative like Phil Robertson, by 2013 he was part of an ever-shrinking minority. Meanwhile, over this same period, people who had no cultural standing before started to gain power and influence. They were also going to college. And while that should mostly be understood as a positive development, it opened the door to some things that were not so good. In particular, cancel culture. I've done several videos that touch on this subject already, but I feel like I've often had to gloss over the role postmodernism and Marxist critical theory have played in the coddling of the American mind. Starting in the mid-20th century with philosophers like Jacques Derrida and Michel Foucault, postmodernism's rejection of many core Enlightenment principles began a decades-long dismantling of concepts like objective truth and common morality. Now, before you dismiss this idea entirely, postmodernism's rejection of overly rigid rules did make some important contributions to aesthetics by encouraging artists to borrow techniques from multiple cultures, time periods, and traditions to create some really cool stuff. So it's not all bad. But applying the same deconstruction of grand narratives to science, mathematics, physics, economics, politics, and perhaps especially ethics is a disaster. 
Of course, postmodernism has an important ally in this story, and its name is critical theory. Around the same time that postmodernism was taking root in academia, Frankfurt School Marxists like Herbert Marcuse, Max Horkheimer, and Erich Fromm were building a philosophy of history, sociology, and media that viewed everything through the lens of group identity, ideology, and political power. History is just an ongoing battle between oppressive exploiters, mostly capitalists, and the oppressed. As far as I can tell, their purpose as academics is almost exclusively to be political activists, teaching students to hate the ideas and narratives of people supposedly in power, including, and I'm not making this up, work ethic, logical reasoning, and the scientific method. Truth itself is little more than a social construct, a function of power, and even the meaning of words can be defined and redefined at the whim of those who want to reshape society. By the 1970s, these ideas had thoroughly spread into the humanities and teacher education. By the 1980s and 90s, they were significantly influencing entertainment and popular culture. By the 2010s, a whole generation of young adults had grown up with these ideas embedded in every aspect of their lives, and then they signed up for Twitter and went to work for social media networks. I'm obviously condensing this story a lot, but it's all just set up for the culture war we find ourselves in today. Now, to be clear, there have always been elements of human society that want to shut down speech and prevent unpopular viewpoints from being expressed. Even copyright law was originally established as a way of ensuring that the king had control over what people who owned printing presses were allowed to publish. There's nothing new about that. It took Western society millennia before we saw the first real limits on state power. 
It took another several hundred years before those limits started preventing the regulation of speech and a few hundred more years before those limits were actually even close to equally enforced. Most countries don't have any legal protections for free speech at all, and almost none have anything close to our First Amendment. And that has had to be vigorously defended in the courts in order to expand the protection of speech to more and more groups of people. It's just that since the Enlightenment, this aspect of society has been steadily improving. Kings and lords have had less and less power to control what people say and do. The Church lost its authority to dictate what was and was not acceptable in society. Publishing and media have opened up, and people have been able to express themselves in all sorts of new and interesting ways. We no longer have strictly enforced censorship codes in our movies and TV shows. Even public libraries are less restrictive of the kind of content they allow on their shelves. Legally speaking, in the U.S. at least, free speech has probably never been better defended from the government, but in the culture, completely different story. As some aspects of our society opened up, other parts, especially among younger people, closed down. Historically, censorship was usually a product of the most important members of society protecting a pre-existing power structure, but the people primarily driving modern cancel culture aren't cranky politicians, tycoons, and powerful members of the clergy. They're kids throwing temper tantrums because it's easier to shout down talks and de-platform disagreeable people than to engage challenging subjects with reason and persuasion. It's also worth noting that the modern mob is no longer geographically limited. 
Thanks to social media, what once upon a time may have been a handful of angry villagers with pitchforks and torches chasing after a heretic is now a global network of harassment built around algorithms that amplify outrage, and most people don't really even understand how it works. Fortunately, I'm here to help. Terms like artificial intelligence and machine learning conjure up this image of something that's emotionless and incorruptible. It's robotic and inhuman. Algorithms are just math, and as we all know, math never lies. It has no biases. Right? Sadly, no. As we discussed in the episodes on The Social Dilemma, every piece of content that social media algorithms recommend or ignore, demonetize, age-restrict, flag as inappropriate, or outright ban is filtered by automated systems and keyword recognition. This all seems very clean on the surface, except that those systems are created by human beings, who each have their own biases and political viewpoints that unavoidably get baked into the way these algorithms are written. Remember what Cathy O'Neil said? Algorithms are opinions embedded in code. Whoever chooses which words and phrases are filtered and how the systems react whenever they appear controls the flow of information for most of modern society. That is a tremendous amount of power. More than even the most despotic rulers could have dreamed of a century ago. Even if no one means to do harm, it's just not possible to keep the developers' own values from influencing the decisions they make on behalf of billions of social media users. And especially in Silicon Valley, these values aren't a random or representative sample of the average beliefs of people around the country, let alone the world. We know this from demographic and political opinion surveys, from regional voting records, and most obviously from the things the leaders and decision makers at these companies say in their own blogs, tweets, and interviews. 
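To make that keyword-recognition point concrete, here's a minimal, purely hypothetical sketch in Python. The `blocklist` terms, the `moderate` function, and the two-strike threshold are all my own invented example, not any real platform's rules; the point is just that whoever writes the list and picks the thresholds is encoding an opinion in code.

```python
# Toy sketch of keyword-based content moderation. The blocklist below is a
# made-up example, not any real platform's list; it exists to show how the
# list author's judgments directly decide which posts get through.

def moderate(post: str, blocklist: set[str]) -> str:
    """Return an action for a post based on a simple keyword filter."""
    # Normalize: strip trailing punctuation, lowercase, split into words.
    words = {w.strip(".,!?;:").lower() for w in post.split()}
    hits = words & blocklist
    if len(hits) >= 2:
        return "remove"    # multiple flagged terms: take the post down
    if hits:
        return "restrict"  # one flagged term: limit the post's reach
    return "allow"

# Whoever chooses these hypothetical terms controls the flow of information.
blocklist = {"tyranny", "abolish", "destruction"}

print(moderate("We had a lovely picnic.", blocklist))                    # allow
print(moderate("Abolish the HOA!", blocklist))                           # restrict
print(moderate("Tyranny invites the destruction of trust.", blocklist))  # remove
```

Notice that nothing in the code is malicious; the bias lives entirely in the word list and the thresholds, which is exactly why a filter like this can end up flagging ordinary political speech, or an 18th-century founding document full of words like "tyranny" and "abolish", as dangerous content.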
It's foolish to argue that these biases don't affect the flow of information on social media. For example, in a Project Veritas sting video from 2018, a messaging engineer from Twitter, Pranay Singh, unwittingly explained exactly how his own biases could shape essential parts of the ecosystem. But I mean, plenty of people do talk like that. It's just that if you exist in an echo chamber that has almost no meaningful diversity of thought, you may never meet someone who does. And if you're in charge of defining which types of words and phrases get restricted and which don't for a huge network like Twitter, you're going to end up silencing millions of people simply because you don't understand how they communicate. It's especially easy to cause major problems if you start by assuming all those people are bots and shut down their accounts. In your mind, they're not even human. The fact is, most people don't really consider how their biases drive their choices, and that's the real danger of these kinds of algorithms. They've always reflected the opinions and beliefs of the people who created them. That's why, if you understand how this works, it's not so hard to see how we ended up with the Declaration of Independence being flagged as hate speech. All it takes is for people working at Twitter to assign negative values to the kinds of language that it contains. We do have the challenge of some monocultural thinking. I have said publicly that, yes, we will have more of a liberal bias within our company, I said this to CNN, but that doesn't mean that we put that in our rules. But hold on. Because what I'm getting at is that at some point in time, things have to get down to a human being looking at and reviewing cases. If you guys are so left-wing in your staff and the area that you live in, and all these things, things are almost naturally going to lean left. Unfortunately, all these systems are extremely opaque and have never been made available to the public. 
And they're pretty much the exact opposite of due process. Every social network blocks content and removes users on the idea of guilty until proven innocent, where the accuser is also the judge. So we don't actually know what specific words, phrases, and imagery are boosted by the different social platforms, which are restricted or why. And we can't trust the companies themselves to be impartial or fair. What's more, we do know that there's a lot of pressure from the employees, some executives, and many of the top users of these platforms to explicitly silence people they don't like. Over the last few years, companies like Facebook have been in full-on crisis mode, reeling from privacy scandals like Cambridge Analytica, accusations of Russian election hacking, and getting blamed for rising political violence around the world. They're being accused by their own friends of unwittingly helping Trump get elected. So they've reacted to these issues by downgrading the reach of pages and groups, bringing on new fact checkers to put warnings on posts that they labeled as fake news, and outright limiting people from running what they define as political ads. Speaking as someone whose content deals with difficult subjects, I can tell you that this has all been an incredible mess. They also started cracking down on more and more people for community standards violations, including tons of folks who don't actually fit the descriptions of the kinds of bad people you see on the news. And again, nearly a decade of aggressive cancel culture has not demonstrably reduced extremism. If anything, it's only ever made it worse. But I'm sure the question on everybody's minds now is, what do we do about it? Well, it's complicated. In spite of what some of you might believe, this is an issue of censorship. But in spite of what others among you might believe, it's not a legal issue. It's actually kind of shocking how badly wrong most people get this. Wrong! 
The First Amendment prohibits the government from writing laws that infringe people's right to speak freely. It does not prohibit private individuals from limiting the speech they allow in their homes and businesses. Social media companies have, and should have, every legal right to define the limits of speech on their platforms. In fact, they need to set some rules if they want to create an environment free from harassment, threats, and obscene content. Those rules can actually support a culture of free expression. But that's not the same thing as stifling different points of view because the people in charge don't understand or agree with them. Asserting the moral authority to dictate what millions of social media users are allowed to say and hear violates the most basic essence of free speech as a concept. And that concept predates, and indeed is the entire basis for, First Amendment law. So yes, it's possible for private companies to engage in censorship, and quite often that's exactly what's happening here. But unfortunately, there are a few other myths we need to bust as well. Firstly, we need to talk about Section 230 of the Communications Decency Act. It just doesn't do what you think it does. It's not a magic shield for social media companies to escape liability, and repealing it won't reduce their incentive to control people's speech. Here's what the law actually says: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." There's nothing about this that's unique to the big Silicon Valley social media companies, and there is no legal distinction between publisher and platform. It just means that you're not responsible for my content and I'm not responsible for yours. Anyone who allows third parties to post comments or publish videos on their website is protected by Section 230. 
Without that protection, it would be a horrible idea to ever allow anyone to post anything on a site they didn't own. What's more, the bill explicitly encourages companies to create filtering tools to limit illegal content and help their users control their experience. The one tricky part of the law is that these filtering tools have to be created in good faith. We may need courts to decide exactly what that means. But no matter what, repealing Section 230 would create a huge incentive for every interactive service provider to control speech and expression even more than they do now in order to avoid a flood of libel and slander lawsuits. This isn't the right answer. Secondly, I can't believe I have to say this, but just because a company is publicly traded doesn't mean it's owned by the government. It means that anyone can own a share in the company by buying its stocks. On a related note, a company that gets government contracts or subsidies isn't owned by the government either. I've spent much of my life criticizing corporate welfare, but anyone arguing that the government should have more control in those situations is moving in entirely the wrong direction. That's not how it works. That's not how any of this works. Lastly, using antitrust laws to break up social media companies is a non-starter, both because it would destroy the things that are actually good about them and because none of these companies are monopolies to begin with. Google competes with Bing, Yahoo, DuckDuckGo, and a host of other search engines. YouTube competes with Vimeo, Twitch, BitChute, LBRY, and a zillion other video sharing sites. Facebook and Twitter compete with each other, not to mention TikTok, MeWe, Minds, Gab, and so on. Amazon Web Services has several other major competitors and a thousand smaller ones, not to mention the fact that it's possible to host website data on your own servers. 
The only way it's possible to claim that these companies have monopolies is if you redefine the term to mean that Facebook is the only company in the world that does exactly what Facebook does. It would be like saying McDonald's has a monopoly because they're the only company that makes the Big Mac, even though there are clearly a dozen other hamburger restaurants and a hundred other fast casual chains all competing for the same set of customers. That's absurd. Yes, all of these companies are big. Yes, they're very successful. That's because they produced goods and services people actually like to use. We all made them what they are by giving them our time, money, and attention. And you know what? This insight is also the solution. Censorship, no matter how well intended, is inherently arrogant and comes with some incredibly damaging side effects. After the great purge of 2021, the behemoth social media networks saw a serious exodus of users and stockholders fed up with their policies. Facebook and Twitter alone lost a combined $51 billion in market value. After building an industry and dominating the internet for the last 15 years, these companies are opening the door for their competitors to take over by repeatedly violating their users' trust. New companies are already rising to the challenge, but they only stand a chance in a world that doesn't tie them down in red tape. So we need to resist calls for heavy-handed regulation because they're both unnecessary and counterproductive. Just as social media executives can't predict the fallout and unintended consequences from their policies, politicians have no idea how to predict and plan the perfect future of communication on the internet. Worse still, multi-billion dollar companies like Google, Facebook, and Twitter, which have entire divisions devoted to influencing public policy, will always have vastly more sway over the way these kinds of laws get written than any entrepreneurial startup. 
This is a well-known phenomenon in political economy called regulatory capture, and it's one of many reasons that giving the state more power over the internet is never going to create the results people hope for. We don't need more laws. What we need is for more people to see what's going on, develop a lot more genuine tolerance for different points of view, and support a culture of free speech. Here's what this means. Don't participate in cancel culture. Do use your voice to challenge censorship wherever you can. Demand transparency about the rules on social media. Spend more time on the platforms that don't automatically cave to the mob. Share videos like this one to help your friends understand why these issues matter, and if you can, cultivate your inner entrepreneur and help build the next generation of social media networks that take due process and free expression seriously. Censorship on social media is a very real problem, but we can solve it by doubling down on our best values and abandoning our worst. The future of our society depends on it. Hey everybody, thanks for watching this episode of Out of Frame. I honestly hope that I can get back to talking about movies soon, but this is just not an issue I can ignore. I hope you'll check out all the links in the description and jump into the conversation in the comments. If you like what we're doing here, please like this video and subscribe to the channel. Also, check out our Behind the Scenes podcast. It lets us spend more time talking about stuff like this every Friday, and I think you'll enjoy it. And if you really love this video, consider supporting Out of Frame on Patreon or SubscribeStar. I want to thank all of our subscribers, but especially our associate producers. So, to Connor McGowan, Dallin Case, Hemantana, Richard Lawrence, Matt Tabor, and Vega Starlight, thank you. 
Normally, this is the part of the video where I'd tell you to follow us on the major social networks, but given what we just talked about, I'd like to encourage you to join our Discord server instead. It's not only a way for us to diversify the ways we stay in touch with all of you, it's also a great space to practice what we preach by hosting wide-ranging, interesting conversations online. Join me there, and I'll see you next time.