YouTube's outgoing CEO, Susan Wojcicki, spoke on this very topic at the World Economic Forum in Davos in 2022. I want to play a clip from that so everyone can get a sense of how YouTube is thinking about this from the top, and then get your reaction to the proposals being laid out there.

We're investing a huge amount to make sure that we're fighting misinformation, and there are a number of different ways that we look at this. The first would be from a policy standpoint: we would look at content that we would consider violative of our policies. If you look at COVID, for example, we came up with ten different policies that we said would be violative. An example would be saying that COVID came from something other than a virus. And we did see people attacking 5G equipment, for example, because they thought it was causing COVID. The second would be raising up authoritative information. So if you are dealing with a sensitive subject like news, health, or science, we are going to make sure that what we're recommending is coming from a trusted, well-known publisher that can be relied on. If there's borderline content that technically meets our policy but is lower quality, that's content that we basically will not recommend to our users. Our users could still access it. And then lastly, we're just really careful about what we monetize. We always want to make sure there's no incentive. So for example, with regard to climate change, we don't monetize any kind of climate change denial material, so there's no incentive to keep publishing material that propagates something that is generally understood to be inaccurate information.
So: taking down really blatant disinformation, like the claim that 5G towers are causing COVID-19; de-ranking certain sources; up-ranking authoritative sources; and then demonetizing controversial debates that she believes are spreading misinformation. Is that an agenda you think is a positive one for YouTube to be pursuing?

I think it would be great to see them do that.

So they're not doing it?

No, no, they're not doing it.

But here's my reaction to that, because I do think it's totally understandable and appropriate for a company like YouTube not to want videos about 5G conspiracies running rampant on its platform.

I don't agree with that, though. That's not true. They're actually incentivized to have a lot of material like that. In fact, Google came under fire over the last few weeks over how it's spending its ad dollars, and not spending them in the areas it told its advertisers it was spending them in. So no, I would argue that while philosophically they may not like the idea that they're monetizing this type of content, they are absolutely monetizing this type of content.

But they have taken down that sort of stuff.

Some of it. You can find all that stuff. All of it is there on YouTube, all of it.

The flip side to all this is that I've seen what happens when misinformation becomes something of a buzzword, and I've seen what happens when the net is cast too wide. Even at Reason TV, we got a strike on a video I produced. It was about biohackers attempting to create a kind of knockoff COVID vaccine, and it was called medical misinformation. I happened to know for a fact that it was not medical misinformation; it was me reporting on something somebody was trying to do. But it got caught, and it was six or seven months after it went up that it got caught in this dragnet.
And so that's the kind of impossible thing about regulating misinformation at scale, whether you're a government or a private company: how do you make those calls?

I don't think that's true, though. I don't think "impossible" is the right word. I think it's early days, and oftentimes, because these companies aren't incentivized to police themselves, they do it very badly. They do it with a blunt hammer as opposed to a scalpel. And it is blurry, because for sure, Zach, there are people who create music on YouTube who get their own content flagged. Really, even beyond the example you just gave, we are constantly being accused of infringing our own copyright.

Exactly, that happens all the time.

Now, we would need five hours for this conversation. But I will tell you, and you know how I feel about this stuff, I made the Napster movie, that using copyright and IP law in this area to try to rectify this stuff is generally disastrous. So I'm not saying, let's just come down on everyone with harsher copyright law; it will generally hurt the folks you're trying to help and help the folks you're trying to get protection from. It's just the way those laws are constructed. So I'm not sitting here saying this stuff is easy, but I will tell you that it's a heck of a lot simpler than people are making it out to be, and you have to start somewhere. So yes, a lot of the tools right now are very blunt, but they will definitely get less blunt as we go along. We certainly don't want to do nothing.

Well, can I ask about that? Because in a way it seems like you want to bring back, or maybe, realistically, it has probably never gone away, this idea that there's a guardian class that must control what people are exposed to because they really can't handle things, can't sort things out for themselves.
And this goes back to when the novel was introduced, when radio was introduced, movies, comic books, et cetera, rock music in particular. And I think about your documentary on Zappa. That era is redolent with the idea that certain types of material should not be permitted because they will have a negative effect on people who can't understand or critique them. Is there a contradiction between your earlier work and your emphasis here on people being allowed to express themselves?

No, because I think there's an enormous, enormous spread there. If you take disinformation, being information that is specifically created to cause harm, those are basic safeguards. I don't think that enters free speech territory. If someone is being called upon to go kill trans people, and someone goes and kills a trans person, no, I think, you know.

Well, there are fighting words and true threats that have been worked through the courts, right?

Yeah, and there are a lot of studies showing the impact of fighting words on actual fighting. And I think that is where the arguments for everything being fair game are not valid. So when Steven Crowder says we need a civil war because, my God, Trump is being investigated by the FBI, that crosses a line, versus somebody saying, you know what, we need a total reordering of society based on recent political events. That would be...

Sure. I mean, it's also being done intentionally, and funded by dark money interests that are trying to create these harms and this violence. As we saw with the Trump indictment from yesterday, the intention with the January 6th insurrection was to kill the enemy; it was actually to commit that violence. More people were radicalized to go to the January 6th insurrection by YouTube than by any other internet-based platform.
That is a study with actual numbers attached to it. So you've got to start somewhere.

Yeah, I'm not convinced of that. I mean, January 6th was organized in WhatsApp groups, Facebook groups, private encrypted communications among the people who showed up there.

But the ideology itself was proliferated by YouTube. The Bellingcat study is very comprehensive.

Yeah. Near the end of the film, you start introducing some possible policy prescriptions. You bring up that famous, bedrock internet law, Section 230 of the Communications Decency Act, which essentially says that platforms are not liable for user-generated content on their platforms, whether or not they moderate; getting involved with moderation does not mean you're suddenly liable for every single thing that someone posts on your platform. This has been attacked by both Democrats and Republicans, and your film seems to call for some sort of reform around Section 230. What is it that you would like to see changed about that fundamental law of the internet?

Actually, it doesn't. The film doesn't do that. In fact, I would argue that our general take, not that my take is that important, because we're really just trying to show all sides of this debate, is that Section 230 is an incredibly difficult thing to reform, and it is really what provides the safeguards for the internet, period. Tampering with it in a blunt way would, as I said before, cause harm to the very people many would be looking to protect, and it would hyper-empower the monopolies. There's a reason why certain factions on the extremist sides of political thinking want to abolish 230: it would essentially allow for an enormous amount of censorship.
And it would erase enormous amounts of content from the internet that is providing very important views and giving voice to people who need it and would not have it without 230. So my general feeling on all of this, 230, content moderation, all of it, just to be very clear, is that none of it is easy or simple. There really isn't a clearly defined way forward to, say, police the internet or regulate the internet. There isn't one. My point is that the impact of these platforms on society is inarguably huge; that some of these harms are politically motivated and funded, or just ideologically motivated and funded, and that should be called out; and that we shouldn't, as citizens, sit on the sidelines, throw up our hands, and say, well, this stuff's too complicated, we shouldn't do anything. We should start to look at what we can do, however that looks. I'm not convinced that's by reforming 230, personally. But I'm also not the head of ethics at Berkeley, like Hany Farid, who does believe there's a way. You should ask Hany, right?

Right. So how do you look at the revelations of things like the Twitter Files? Reason got a cache of documents from Facebook where, and this is not quite new, it became apparent that there was a huge amount of government censorship, without calling it that: people in both the Trump administration and the Biden administration, and we can certainly assume the George W. Bush and Obama administrations as well, saying to platforms, hey, deprioritize that, kill this, don't push this. Does that change the complexity of the task at hand? The idea that there's the private sector and the public sector, public speech and private speech, is kind of out the window now.

I think that what it does, which is good, is it shows the specificity of it, which is something we're talking about in the film: that there are ideologies at play. There are political players at play.
It isn't just the town square. There are people with funded agendas who are using this golly-gee-whiz parasocial influencer persona to say very specific things, or to attempt behind the scenes to curtail the saying of very specific things. So I actually think it's important that we understand that politics and certain players are involved in trying to shape, or curtail, the language that's out there. Look, these are the beginnings of conversations that we need to have. This is by no means the end of anything.

That was an excerpt from our conversation with Alex Winter, the director of the new documentary The YouTube Effect. Tell us what you think in the comments. If you want to see another excerpt, go here; if you want to see the whole conversation, go here. And make sure to come back every Thursday at 1 p.m. Eastern Time to check out a new live stream from Reason TV.