My question involves deplatforming and freedom of speech. Have you heard of a fellow called Mike Masnick? No? Okay. I hadn't heard of him until a few days ago, when I saw, I think it was on YouTube, an interview he did with Nicholas B. from Reason magazine. Now, Mike Masnick, over 20 years ago, created a blog called Techdirt, and he's the founder and CEO of the Copia Institute. His interests are technology, particularly as it relates to freedom of speech, privacy, and the business and legal issues confronting technology. Okay. Happily, he seems to be on the right track. I don't know what he knows about Objectivism, but what I heard was good. When he uses the word censorship, he limits it to government, so that's refreshing. And he's very much in support of Section 230. In fact, he advocates expanding it to provide better protection for intellectual property rights than the Digital Millennium Copyright Act does. He seems really on the right track. Now here's where I get to my question. I've heard you advocate that platforms formally publish their standards and their criteria, so that users would know, "Oh, I'd better not do this." Now, he took a different position. He said that the practices the different platforms follow, the decisions they make when they moderate their content, are very much context-specific, and he thought it was impossible to come up with rules that wouldn't have to keep changing based on the specifics. And I thought that sounded legitimate and sensible, but I also thought it might be possible to state their rules in terms of principles that would not have to be changed, rather than making them very specific: not "you may not say this," but "you may not say anything that would, in our sole judgment, cause such and such."
Now, he's actually taken this further, so part A of my question is that I'd like your opinion of that. Part B: he wrote an article in 2019 for the Knight First Amendment Institute at Columbia University titled "Protocols, Not Platforms: A Technological Approach to Free Speech." I haven't read it yet, and I guess you haven't either, because you weren't familiar with him. I can send you a link to the article; I'll email it to you, because I'd be interested in hearing what you think. It sounds like he's taking it another step, trying to find a solution when solutions are tough. I mean, every good person agrees that the government has no role to play, but that doesn't mean it's easy. So I'll shut up now and let you comment.

So I think the important thing to note here is that whatever the solution is, it's very difficult to get to. And I have a lot of sympathy for Facebook and for Twitter and for others who are trying to figure out what the balance is. Now, I think they're biased, and I think they ignore their bias because they live in a bubble where they don't know that not everybody thinks like they do. But no matter what philosophical perspective you have, trying to write up a code that is objective is going to be hard, and I don't have the solution for what it is. It would have to be a series of principles that could then be adapted to various contexts. So there's no question that a lot of what you say, its meaning is contextual, and whether it violates the terms of service is contextual. That is all, I think, absolutely right. But I don't think that precludes being able to have a set of principles that can guide you. It's important for a content creator to know: what is crossing the line with these guys? What can I say? What can't I say? What can I do? What can't I do? You know, we talk about the importance of having objective laws.
And I think the same thing is true here, though less so. Of course, laws are more important, because the government can actually use force against you. But just from a business perspective, it's crucial, in order to keep your clientele, to have clear rules, clear guidelines, clear principles that allow you to determine what's allowed. But it's hard. So I was on a call the other day; it was a private call, so I'm not going to tell you exactly who it was, but two people on the call were responsible for content curation at two of the largest social media platforms in the world. And I'd say both of them were probably mildly left of center. They weren't rabid progressives; one of them said she used to be a Republican and isn't anymore, but she's not a wacko either. And the thing that came across from having just a discussion with them is that they are really struggling with how to do this and how to do it right: what constitutes incitement (put aside the legal definition), what constitutes, for their purposes, the kind of speech they don't want on their platform. And they are struggling; they don't know. I think that's true of Zuckerberg. I mean, I don't know if you saw the internal memos at Twitter around whether to take Trump off the platform or not. There was massive disagreement about whether to do it. The CEO, I forget his name, didn't want to take Trump off; he wanted to keep Trump on. And his people argued that he should be taken off. There's a whole exchange, and then he does a whole thread on Twitter to try to explain why they did it, why he accepted his staff overruling him, in a sense. These are difficult decisions from a business perspective, difficult decisions to make and to try to stay consistent on. I don't think they necessarily do a good job at it, but I appreciate the difficulty they face in doing it.
And I think Republicans, some Objectivists, and many libertarians are way too dismissive: "Oh, they should just allow all speech on." I remember when Jordan Peterson was trying to create his own platform, about a year ago, before he got sick; he was trying to create a competitor to Facebook. And I heard this from Dave Rubin, I think, and Dave was telling me that the real challenge they were facing was not technical. It wasn't even marketing. The real challenge they were facing was how to create terms of service that would allow a maximum of speech, but wouldn't allow everybody. They don't want everybody; they don't want all speech, because some speech would make the platform undesirable to be on. Pornographic speech, for example: people just wouldn't want to be on. So how do you set the guidelines? How do you set the limits? It's not an easy problem, and people are way too quick to criticize, to make fun of, and to trivialize what's involved here. Parler's struggles are part of their, you know, difficulty in figuring out exactly what is and isn't appropriate to have on their platform, and I think they would admit that some of the stuff that ended up on the platform shouldn't have been there. And then the question is: how do you police that? So this is a very complicated issue. One of the beauties of markets is that the different players do it differently, and over time, and maybe it takes a decade, the market kind of figures out, motivates the right set, the set of standards that actually is stable and workable. Part of that is different companies having different standards, and the push and pull and all of that that happens in a marketplace. Now, the problem we face today is that government is involved, and that's what distorts and perverts the whole thing. If just markets were working on this problem, we would solve it. Given that the government's involved, that's where it gets challenging. Okay, thanks, Roy.
What we need today, what I called a new intellectual, would be any man or woman who is willing to think, meaning any man or woman who knows that man's life must be guided by reason, by the intellect, not by feelings, wishes, whims, or mystic revelations. Any man or woman who values his life, and who does not want to give in to today's cult of despair, cynicism, and impotence, and does not intend to give up the world to the Dark Ages and to the rule of the brutes.

All right, before we go on, a reminder: please like the show. We've got 163 live listeners right now and 30 likes. That should be at least 100; I think at least 100 of you actually like the show. Maybe there are, like, 60 Matthews out there who hate it. But at least the people who are liking it, you know, I want to see a thumbs up. There you go, start liking it. I want to see that go to 100. All it takes is a click, wherever you're watching this. And you know, the likes matter. It's not an issue of my ego; it's an issue of the algorithm. The more you like something, the more the algorithm likes it. And if you don't like the show, give it a thumbs down; let's see your actual views reflected in the likes. But if you like it, don't just sit there, help get the show promoted. Of course, you should also share, and you can support the show at yaronbrookshow.com/support, on Patreon or SubscribeStar or Locals, and show your support for the work, for the value you're hopefully receiving from this. And of course, don't forget, if you're not a subscriber, even if you just come here to troll, or even if you're here, like Matthew, to defend Marx, you should subscribe, because that way you'll know when to show up. You'll know what shows are on and when they're on, and you'll get notified, right? So, yes: like, share, subscribe, support. Like, share, subscribe, support. There you go. Easy.
Do one or all of those, please.