The printing press was incredibly revolutionary in all sorts of ways. Nobody talks about that 50-to-75-year period when the printing press first came about, and it was just madness and revolution and insanity, and all sorts of crazy things happened. It took a while for society to come to terms with the fact that, okay, the monopoly that there used to be on printing is now gone. And I think we're going through something of the same thing right now.

Following last week's attack on the U.S. Capitol by a pro-Trump mob, Twitter permanently banned the president of the United States from its platform, and Facebook and YouTube suspended his accounts. Meanwhile, Parler, which marketed itself as a more open alternative to Twitter, was booted from the Apple and Google app stores, and Amazon Web Services evicted the company from its cloud computing platform. Is the great deplatforming of 2021 a threat to free speech? Or should we think of Twitter as a Christian bakery and Trump as a gay wedding cake, as one user of that platform quipped, meaning that nobody should be able to force a private company to do business with someone it disagrees with?

Enter Mike Masnick, the 46-year-old entrepreneur and analyst behind the influential website Techdirt and the digital think tank the Copia Institute. Where others are constantly talking about how to restrict and regulate the internet and tech giants to conform to one ideological vision or another, Masnick champions protocols and practices that he thinks would lead to a more decentralized internet and culture, including expanding Section 230, widespread encryption, and tools that give end users, rather than political and cultural commissars, even more power to control what we say and see online.
Reason spoke with Masnick about what current debates over social media get drastically wrong, how free speech is simultaneously empowered and imperiled by politicians here and abroad, and why a more decentralized internet is not just possible but preferable to what we have now.

Mike Masnick, thanks for talking to Reason.

Yeah, thanks for having me.

All right, we're going to talk about Section 230. We're going to talk about deplatforming, replatforming, unplatforming, and eventually we're going to get to your fantastic 2019 essay for the Knight Foundation, "Protocols, Not Platforms," which is about getting to a decentralized, or at least a more decentralized, internet. But first, let me start off with something that you wrote recently. And I say recently; last week seems like a year ago, and 2010 seems like a thousand years ago. You wrote last week, I think on Friday, when Trump had at this point been kicked off of Facebook and of Twitter: "Trump is perhaps the perfect example of why demanding clear rules on social media and how they moderate is stupid." What do you mean by that?

Yeah, I mean, there's a lot of context that goes into that, and it's part of a larger discussion. But I think almost everybody who is coming at this debate with solutions that say "this is right, this is wrong," or anything along those lines, generally hasn't thought these issues through and hasn't realized not that these decisions are difficult, but that they are impossible. And this is something that I've tried to drive home through various conversations that I've had: people who have to make moderation decisions have a bunch of choices in front of them, including doing nothing, and all those choices are bad; all those choices have consequences. And so there is no right answer.
And I think that a lot of people, and this is across the political spectrum, in all sorts of different ways and with all sorts of different backgrounds, think that there is a correct way to do it. It's like, if we get the algorithm, we won't say the formula, but the algorithm correct, we just have to insert it into the internet, and then everything will be well.

It goes beyond that, right? There's certainly a lot of that, and there is this sort of techno-solutionism idea that, yes, there will be an algorithm and it will fix things. But then there are also the people who say platforms shouldn't do anything, just be completely hands-off, and that that will be the proper answer. And very few people seem willing to recognize that every one of these solutions has consequences, and many of those consequences are things that the vast majority of people would consider to be negative and harmful and problematic. I spend a lot of time talking to the trust and safety teams at the big companies and at the small companies, and I have some marginal amount of experience with just managing the community at Techdirt itself. What you begin to realize is that everything is constantly in flux. The idea of setting up very clear rules becomes impossible really quickly, because as soon as you have a rule, someone's going to figure out a loophole, or how to break the rule in a way that looks like they're not breaking it. And so I was trying to get that across with the statement regarding Trump, because so much of the reality is that the context, beyond just the content of what someone is saying, the context in which it is happening, is really important.
And Trump is a perfect example of that, because there are all sorts of things in the world, and nobody has a rule in place that says, well, how do you deal with the president after a bunch of his supporters have stormed the Capitol?

In that same piece, though, you wrote that Trump is not being limited. And Reason is a libertarian outfit, we've been this way since 1968, so I'm going to insist on the tedious definition that censorship per se can only be done by the government. Twitter is a private company. It has an absolute right to be as bad as it wants to be in creating terms of service and either enforcing them or not, all of that kind of stuff. But when you say Trump is not being limited, you're not talking in kind of Jesuitical and legalistic terms.

I think what I said was that he's not being censored. And I was using it in that same sense as you were: he has many outlets for speech, and in fact probably more outlets than almost anyone else in the world. Right? I mean, he can say what he wants to say. He can go on TV at the drop of a hat, and most networks and cable news will broadcast it, and clips of that will spread onto Twitter within seconds. He could put up a page on Whitehouse.gov with all of his tweets, and people would repeat them on Twitter. He has many, many methods to get his voice out there. He has 74 million people who, I mean, it's a fraudulent election, so we can't be certain they voted for him, but who appear to have voted for him. Right. So he is not being censored.

What about for smaller voices and smaller outlets? Does Trump's banning have any impact, do you think, on people further down the food chain?

I don't think that there's a direct impact.
I think the larger concern that you're raising is something that is worth thinking about and talking about, in terms of how much access, and who gets access, to these platforms, and how that plays out. And that's a lot of the other stuff that I've written about. But I don't think that the Trump situation in particular has any direct impact. Again, the context of the decisions around Trump is very specific; that context is limited to one person in the world, and it is Donald Trump. And so I don't see how it changes anything in a manner that has a wider impact beyond, like, don't get millions of people to storm the Capitol.

Or even dozens or hundreds, right? Well, more broadly, and I guess in the biggest picture possible: is this a good time for free speech? Because on the one hand, it seems as if every tech or social media platform is tightening the screws. Twitter got rid of, by their count, something like 70,000 QAnon-adjacent or direct accounts in this purge. You see footage of antifa in Portland blockading Powell's, the greatest bookstore left in America, which was shut down because people were saying they shouldn't be selling a particular book online. So on the one hand, there seems to be this paroxysm of contraction of speech, or of outlets where you can just say whatever the hell you want. On the other hand, we're still living in a world that was unimaginable even 10 years ago, much less 20 or 30 years ago. Is this a good time for free speech?

Yeah, I mean, I think it is undeniable that this is probably the best time there's ever been for free speech, even recognizing all of that. Right.
I mean, you talk about these limitations on these platforms, but most people didn't have any of this 10, 15, 20 years ago; none of this existed. Go back 30 or 40 years, when the internet was in its infancy, and people had no outlets whatsoever for speech, whereas now they have tremendous outlets for speech. The fact that a few of those platforms, and certainly some of the most popular ones, the ones that people tend to rely on the most, have pushed back or cut back on some segment of their accounts, I don't think is an attack on or harmful to free speech on its own.

Are companies like Facebook and Twitter increasingly erratic and arbitrary in the way that they are enforcing whatever restrictions or guardrails they're putting up?

Yeah, I mean, I don't think that's a fair statement. I think it's one that a lot of people will make. And from the outside, and I'm one of the people on the outside, I'm certainly not on the inside of these companies, I think it may appear that way. I think the reality is often a lot more complicated. And again, it gets back to what I said with Trump, which is that, to some extent, the rules have to keep changing because the situations keep changing and the context keeps changing. And so they appear arbitrary in the sense that the context around these decisions keeps changing and no two examples are exactly the same. People like to point out that so-and-so got banned but so-and-so didn't, or that so-and-so said this and somebody else said this, which they say is exactly the same, and one person got in trouble and the other person didn't. But usually the context between all of those is very different.
And who is saying it, whether they're in a position of power or not, whether or not they're standing in front of people storming the Capitol, is different than somebody random saying something. And so there's a lot of context there that may make it look arbitrary if you ignore the context. But in reality, I don't think it is entirely arbitrary.

What about Amazon Web Services delisting Parler, the, I don't know what we want to call it, kind of alternative social media site or platform that was big with conservatives? It had a big surge in subscriptions, supposedly, in the fall, because it seemed like Twitter and Facebook were getting more and more ideologically stringent. AWS said, you know what, we're not hosting you anymore. In some of your writing, you've said it's one thing when a platform or an app like Twitter or Facebook does something, but when you're talking about the hosting side of things, that's a different type of question. What is the distinction there?

Yeah, the way I've been trying to distinguish it and think about it is that there is a difference between what I'd call edge providers, which are the consumer-facing services that you and I use every day, and what I refer to as infrastructure levels, and that goes many layers deep, beyond the stuff that we see every day. And I think that there are a lot more questions that need to be thought about when it comes to how the infrastructure-level players do these kinds of things. Again, this may depend on a whole bunch of other factors, including what level of the infrastructure we're talking about, because AWS, which is hosting cloud services, is one thing; a domain registrar is another; your own internet access is another. There are all these different levels.
And from a straight-up perspective, does Amazon have the right to kick off any customer? Absolutely. That's part of the way this works. They're a business; they can refuse service to anyone. But it becomes a bigger question from a few different perspectives, and one that I really do want to think about is: when we see things like this, where will the government get involved at some point? Because the government keeps trying to get involved in these content moderation debates. And I worry a lot that they see something like AWS pulling down Parler and say, aha, there's a ratchet, there's a tool that we can use. In some very, very narrow cases, the government can force certain speech that is not protected to come down, because it is not protected. But when they go to something like Amazon, and not that they have yet, but if they do, and say this needs to be taken down, then you're talking about prior restraint, taking down a ton of protected speech. And there was some of that. In fact, Representative Ro Khanna, on Twitter, on Friday, spoke out and said Amazon should take down Parler.

And even if you think that Amazon should, and even if you think that they can legally, which they can, the fact that a sitting congressman suggested that, I think, is a reason to sign up for Parler and petition Amazon to keep it up.

Absolutely. I mean, this is one of the things that I find particularly maddening: the way all of the positions that almost everybody takes just seem to be completely situational.
So you have conservatives who a couple of years ago were talking about how no baker should ever have to even think about cracking an egg, they would abort eggs if the eggs were going to end up in a gay wedding cake, no way. And now they're saying a private company has to be forced to keep hosting stuff that it doesn't want. And the reverse is exactly the same. It's just crazy-making.

And just to stay on this infrastructure question: how does this kind of stuff apply to, say, certain types of email service providers? I remember a couple of years ago, when people like Edward Snowden and Chelsea Manning were the issues of the day, and email services said, we're going to encrypt everything, and we refuse to have anything to do with that, and the government comes a-knocking and says, give us the keys to this, give us the keys to that. It seemed like the conversation was very different then, and everybody, at least on the broadly defined left and the libertarian, you know, whatever we are, was consistent or unified in saying the government should never be allowed to do that. Now, if we're talking about hosting services, it's almost like you have a moral duty to tell Amazon, this person who might have a cloud account with you, or Google Drive, my ex-girlfriend has some hateful things about me on there, so you should cancel them. Can you just kind of walk through: how do you get to some clarity on this type of stuff?

Yeah, I mean, it's tricky. The answer is that there isn't necessarily a good answer.
I mean, I think that, again, private businesses in most of these cases do have the right to decide who they want to associate with. They have to make a calculation: is it worth associating with certain people or certain activities that could be harmful for their business? And so I think it's entirely acceptable and reasonable for them to make these decisions. And to some extent, the people who are asking for this, and I will make it very clear I'm talking about people outside of elected officials, who I think are in a different category, ordinary people who are speaking out and saying, hey, Amazon, you need to do this, or hey, Google, you need to do this, that's their free speech right as well. It's part of the debate and part of the conversation, whether or not we like it. And so I think that's fine. I think it's important to think through these things. And I think it would be nice to have a wider, more thoughtful, less reactive discussion and debate about the pros and cons of these different approaches, and honestly, for the companies themselves, about how they should be thinking through this. And again, not to say these are the rules, because the situations as they play out will never fit into those rules, or very rarely, and when they do, those are the easy cases. The cases that make the news, and that are important, are the ones that don't fit into those rules. But having a set of principles, how do we think through this, how do we make these determinations, I think becomes important. And it'd be really useful to have that conversation.

Yeah, I agree completely. And detaching them from the next three weeks of legislative battles, or perceived legislative battles, in D.C. would be helpful. What about the fact that, say, for a company like Amazon, their web services are one thing,
but they're also operating in a political marketplace? They're operating in a million political marketplaces. Are we reaching a point where the analysis of how a tech giant like Amazon or Facebook acts becomes: well, Amazon is thinking about the next tax break or the next tax policy that is beneficial to their other business? Are they making trades, like, we're going to give you this on this end because of that? I'm thinking back to when Comcast picked up NBC, and suddenly you have a cable provider that's involved in telephony plus broadcast television, which is actually regulated by the government. And you know that at some level they're thinking, I don't know which part of the Comcast ecosystem was Czechoslovakia, but, like, we can give this up to secure peace in our time.

Yeah, I mean, it's tough to tell, right? With any big company, there's a whole bunch of different factors at play. There are a lot of people who believe, when it comes to Facebook and Twitter, that it's Jack Dorsey and Mark Zuckerberg sitting there saying, you're blocked, you're blocked, you're allowed. That's not how it works, right?

You can see Dorsey stroking that beard.

That's right.

You know, that weird House of David baseball team beard. Yeah. But you're saying these are unbelievably messy empires.
And I think it was something, I can't remember whether it was you or somebody else at Techdirt, saying that there's a lot of reason to believe that it's kind of mid-level, junior members of a team who are really calling the shots here, and that the top brass don't know what the hell's going on.

Yeah. And it's interesting: in most cases, obviously, the biggest cases, I'm sure the top brass are involved. The decision to take Donald Trump's account down was not made by a low-level person.

Or Jack Dorsey's desire to be at, like, Burmese meditation retreats while turning a blind eye to the Rohingya problems. That's probably in the C-suite, right?

Right, there are issues there. But yeah, a lot of the choices are made lower down, and you have conflicting opinions. And again, these are situations where there are no right answers. We did this experiment a few years ago, and I've written about this, at a content moderation conference attended by content moderation and trust and safety experts, where we gave them eight case studies and had them vote on what they would do in each situation. There were about 100 people who participated, and they all disagreed with each other. And these are the experts, right? And yeah, we chose cases that maybe were a little bit challenging. But the fact is that within these companies you have that same disagreement, and you have different people pushing on different things. There's one trust and safety executive I know who says he tries to think of it as conducting a trial within the company, at least when it comes to the big decisions. He sets it up so that one person advocates for keeping this account, and another advocates for shutting it down.
And let's have this sort of courtroom setup, where we can actually advocate on each side and then come to a decision, because sooner or later you're going to make a decision. And that always takes me back to the court system itself. That's an example of these kinds of debates, about the law and about some of these issues, playing out at a legal level. You see decisions made that get overturned on appeal, or decisions made that lots of people disagree with. But for the most part, we don't treat them the same way that we treat content moderation decisions. And maybe that's because they're sort of in a black box, and you don't really see how the deliberations are going, and there's not necessarily the transparency there. But these things are really, really difficult, and there are reasonable arguments for all different positions. And yet people automatically jump to the idea that there's some sort of nefarious plot behind all of these decisions.

Do you worry, though, when you talk about the court system, which by design is adversarial: it seems as if more people are saying that within corporate culture, whether it's Wall Street or Silicon Valley or Hollywood, there is a growing sensibility that we're not being adversarial here, that some or all of the workforce, whether it's woke or reactionary, says, we know what is right, and we're going to enforce that formally, as opposed to having that kind of robust internal discussion. Do you think that's mostly a rhetorical battle? Or do you think there is something shifting?
Again, I was going through a bunch of Techdirt stuff, a fantastic site that I highly recommend to anybody who's interested in the internet, broadly speaking, and what used to be called digital culture. You've written that the Silicon Valley people are no longer classical liberal or libertarian in their commitments, it seems. Do you think something is changing in the broad-based culture of Silicon Valley, toward a more focused, univocal expression of beliefs within a system?

I don't think that's true. I mean, I can see where people get that, but I hope not. These are large companies, for the most part, and they have a variety of different people with a variety of different backgrounds, opinions, perspectives. And if you actually get a chance to see how some of the debates play out, there was a really good, and I'm going to advertise somebody else's podcast here, but Radiolab a few years ago did a whole episode about Facebook and their content moderation efforts that I thought was the most realistic version I've seen of anyone discussing how difficult these decisions are and how you have this back-and-forth, pushing on different opinions. And I think that a lot of the trust and safety professionals are extraordinarily thoughtful about these things, are not as ideologically driven as most people seem to think they are, and are very, very cognizant of both the power and the consequences of the decisions that they make. And I know that people don't believe that. And again, a lot of that is because this is all happening behind closed doors. But there is a lot of careful thought that goes into these decisions. Even if we disagree with them, I think it's wrong to assume that they're made either ideologically or just totally arbitrarily.
Do you think that transparency is a big, important factor that needs to be expanded? Or is that just something people say now, and then when they see how decisions are actually made, they're going to say, no, it's wrong, because really what people want are outcomes? They don't give a shit about the process.

It's a mixture, right? I'm always generally supportive of more transparency, because I think that's a good thing to work on. But it's also one of these things where there are trade-offs as well. In fact, I was telling somebody recently that on Techdirt, we have some user voting systems that do the moderation, and the only people who've ever demanded transparency about how that works are the trolls who are trying to figure out how to game the system. Right? So you have this issue where transparency as a general principle is always good. But what do you need to be transparent about, and why? And when you're going to have people whose main goal is to use that transparency against you, I can see why companies are reticent to be super transparent.

Yeah. And, you know, I'm a big fan of both Michel Foucault and Friedrich Hayek, and in different ways they both talked about what a nightmare that would be. I mean, another word for transparency is surveillance. And I just want a little bit of time to myself; you don't have a right to know why I think the way I do. And I think that gets at the distinction you keep coming back to in your work: there is a difference between government power and corporate power, even if corporations apparently have monopolistic control of this or that. How they maintain power is a little bit different. Can we talk about Section 230 of the Communications Decency Act?
Yes, we must, because this is going to be in the news for at least another 24 to 48 hours.

Section 230 is something that gets talked about a lot. People on the right, like Josh Hawley and Ted Cruz, conservative Republican senators, are against it. Donald Trump was against it to the point where he was going to hold up authorizing money for his beloved Pentagon because he wanted Section 230 to be thrown out completely. And Joe Biden, less than a year ago, also said, I am going to get rid of Section 230; it's terrible; it's got to be gotten rid of. What is Section 230? And why does it drive people more insane than, like, W.A.S.P. songs from the 1980s, the way Cyndi Lauper and Prince songs drove the Parents Music Resource Center nuts?

Yeah, and I know that you guys have covered Section 230 quite a lot at Reason. The basics of it is that it says a website is not liable for third-party content that somehow violates or infringes on the law. So if somebody posts something that is defamatory, which is the best example, the website is not liable for that, with some exceptions. And also, related to that, those websites have effectively the freedom to moderate as they see fit. Basically, if they are making moderation choices, the fact that they chose to leave up some kinds of content does not make them liable for that content. So it gives them freedom to choose and to experiment and to change how they moderate over time.

Right. And this is also, in Jeff Kosseff's book title, "the twenty-six words that created the internet." This essentially allows for social media as we know it. A service like Yelp would not exist if Yelp were actually responsible for every review that somebody posted on the site. Same thing with Facebook, same thing with Twitter, et cetera.
So why is there so much, well, I guess first: why are people against Section 230? What is Joe Biden's case against 230, and what is Ted Cruz's?

I think almost everybody who is against Section 230 either doesn't understand it or is pretending not to understand it. Because the more you understand 230, generally speaking, the more you're going to see that it is not the problem you think you're dealing with. You have people like Hawley and Ted Cruz arguing on the side that 230 is why there is all this moderation, and they feel that people who are ideologically aligned with them are being unfairly treated online. And this gets back to the baking-a-cake example to some extent, where they flip their position.

Yeah, and they insist that the likes of Twitter and Facebook are either a publisher, in which case they're responsible for everything, or a platform, in which case they're like the phone company: they can't tell people not to make phone calls or not to talk about certain things. And that's wrong, right?

I mean, it's spectacularly wrong. The law doesn't say anything like that. It's 26 words long, and it doesn't mention that. In fact, the law is actually saying the exact opposite. The law was put in place, and deliberately so, to encourage platforms to do moderation. It was in response to a case against Prodigy, where the company wanted to set up family-friendly places, and so they wanted to be able to moderate their forums, and they got sued over content that was left up. That was what inspired 230. And it was put in place on a bipartisan basis, by a Republican, Chris Cox, and a Democrat, Ron Wyden, who believed that having family-friendly spaces online is a good value, which, you know, seemed to be sort of a traditional Republican value to some extent also.
And allowing the companies to experiment and figure out how they wanted to moderate was good policy. And it's also, I mean, I know you guys have written a ton about this, I will just leave it to listeners to Google Stratton Oakmont and Section 230. There's a whole Wolf of Wall Street tie-in, which is like a Philip K. Dick novel. It's too insane that somehow that would happen. And also the broader legislative history: the Communications Decency Act of 1996 was a terrible piece of legislation that passed virtually unanimously. And the one thing that came out of it, the one, I don't know, phoenix that rose from the ashes after the Supreme Court overruled virtually all of it, was Section 230, which is now seen as the lever of evil for everything that's bad about the internet. So what is Joe Biden's case against Section 230? This is sort of the opposite. And it's always funny that everybody hates it for totally opposite reasons. Biden seems to believe, as a bunch of people do, incorrectly again, and this is just as incorrect as Hawley and Cruz's belief, that because of 230, the platforms have no incentive to moderate, and therefore they turn a blind eye to, you know, false information, fake news, whatever you want to call it. So they don't take responsibility for this. The thinking is that if you take away Section 230, it will create incentives for the platforms to be more proactive and to take responsibility for content. And it's not even clear that Joe Biden will be able to get the internet on his Jitterbug phone, right? Well, and, you know, regardless, it's not a partisan thing. And I want to say it's actually like an ageist thing.
But if the past couple of years of hearings, where they drag Silicon Valley tech giants' CEOs to Washington, have proven anything, it's that the people making the laws about all of this stuff have no fucking clue what's going on. And, you know, it's not even totally ageist, because Ron Wyden is a bit older, and he wrote the law, and he still gets it. He totally understands this stuff, and he spends a lot of time correcting everybody else. And I wish more people would listen to him, because he seems to be the only senator who fully understands the law. There are a couple of people, you know, some Democrats, some Republicans, who seem to understand the underlying factors here, and plenty who don't. So I wouldn't even say it's an ageist thing. I think it's just, you know, who wants to actually understand how this works and who doesn't. And just one quick point that I do want to get back to, because it comes up a lot. I mean, you mentioned, how is it that both sides now believe false things about the law? Which is kind of crazy when you think about it. And I think it might even have something to do with the title of Jeff's book, and that's not just Jeff's fault, because he didn't come up with that quote, the 26 words that created the internet. I forget who exactly that quote came from, but it became sort of commonly used. And I think people took that and, through this weird sort of mental transference, assumed that if Section 230 is the 26 words that created the internet, it is also responsible for all of the bad things on the internet. And therefore, if we get rid of 230, it'll somehow fix all the bad stuff on the internet. And that is a complete and total misunderstanding of 230, and also of the First Amendment, and how the internet works, and how human beings work, and how societies work. So what is your suggested way to address Section 230?
And again, at Techdirt, you guys have published a number of articles. And you are a publisher, not a platform. So you actually, yes. Well, yeah, I guess so, right? Because you have comments. So yeah. But, you know, you've talked about how, actually, for most people, when you think about it, their problem isn't with Section 230, their problem is with the First Amendment, because they are driven insane by speech or expression that they disagree with. And they don't even understand that, actually, if you're saying something defamatory about somebody, you can be sued for that. It's just that you can't necessarily sue Twitter. And under certain circumstances, you can sue the platform, publisher, whatever you want to call it. So what is your fix? Because you have also talked about how Section 230 maybe needs to be revised or updated. How would you improve it? You know, there are a few different things there. And I should be clear: when I say that the problem is not 230, it's the First Amendment, again, that goes to both sides that are getting this wrong, right? You have the people who are really upset about speech that they don't like, speech that is perfectly protected, and they want that to disappear. And then you have, on the other side, the people who are saying that Twitter shouldn't be allowed to moderate, or Facebook shouldn't be allowed to moderate, and that's an attack on the First Amendment rights of those companies to not be compelled to host speech that they don't want. So there are First Amendment problems on both sides. In terms of the law, 230 itself, I actually think that 230 doesn't really need to be changed. I've had this question asked, and I want to write something on it soon, actually: how would you reform the law?
And I think there are a whole bunch of other laws that we should be looking at first. If your concerns are about how much power certain companies have, there are other laws that are there. If you're concerned about types of content online and how they're dealt with, there are many other laws that are much more important to look at than 230. If anything, and nobody agrees with this, so I'm all alone here, I would expand 230. I would roll back FOSTA, which was the only major reform that's been done to 230, and which I think has been really harmful. And could you spell out what FOSTA slash SESTA actually stands for? Yeah, I forget what the full acronyms are. But they were laws... you know, Section 230 has been attacked from all different angles for many years, and this was the first attack that stuck. And it was really targeted at one site, which was Backpage. And again, Reason has done most of the best reporting on the truth behind the Backpage situation, which was a classified ad site that was accused of being used for sex trafficking. And, you know, prostitution, or sex work, and sex trafficking often get lumped together, which is unfair and wrong. So there were all these efforts to take down Backpage, and it was believed that, because Backpage had won one particular case on 230 grounds, 230 was the problem. So they created this law that basically created an exemption for sex trafficking content, but very broadly defined and in a very risky way. And in fact, when that law passed, what happened was a bunch of dating sites, just normal, legitimate dating sites, shut down. Even Craigslist shut down their personals page, because they felt the liability that came along with FOSTA was too big. And in fact, now we have seen some lawsuits.
And honestly, almost every lawsuit that has come about since FOSTA passed, almost three years ago, has been ridiculous. I mean, Salesforce got sued because Backpage used Salesforce as their CRM. And so the argument is that, under FOSTA, Salesforce is now responsible for the fact that somebody did sex trafficking on Backpage. And there have been all these stories now about how it's made it harder for law enforcement to actually track down sex traffickers, because they don't have the information that they had. And in fact, it's come out, and again, Reason's reporting presented all this evidence, that the DOJ worked closely with Backpage and relied on Backpage's cooperation to help bust sex trafficking rings. And there's been all this evidence showing that it's put sex workers at much higher risk, not having Backpage or other forums where they can be safe and present themselves in safe ways. So I would roll that back. And then the other area that Section 230 doesn't cover is intellectual property law. For that, we have the DMCA, which I think is a much worse solution, for copyright in particular, where it has created real censorship and real problematic attacks on free speech that nobody seems to want to talk about in all these debates. Can you explain how the Digital Millennium Copyright Act... again, I feel like we're blowing the dust off of the Federalist Papers or something, and it's not that old. But how does the DMCA stifle free speech? Sure. So the DMCA, and they're often lumped together, DMCA Section 512 in this case and CDA 230, because they're both sort of intermediary liability laws. But the copyright one is very different. For that, a website first needs to register with the Copyright Office.
And there's a whole ridiculousness there that I won't get into, where they'll throw out your registration every three years if you don't renew, and you have to pay money, and it's a big mess, but you have to register. And then, if you've registered, if somebody sends a takedown notice to your registered agent saying that there is infringing material, you have to take it down, or you can become liable and they can sue you. If you take it down, then you are protected. But that raises, to me, very serious First Amendment questions, because you now have the law basically creating tremendous incentives for you to take down content. And what happens in reality, and there's tons of evidence of this, is that people send completely bogus DMCA takedown notices, knowing that just the risk of liability is going to get the content taken down. So that becomes the use of state power to threaten a company into taking down content that may be protected speech. And it is used all the time to take down protected speech. I think that's a huge First Amendment problem. And I think that removing Section 230 actually gets you more and more towards that. There will probably be some kind of notice-and-takedown provision for other kinds of content, and so you'll get that same thing, where tons of protected content is taken down just on the threat of potential litigation. So, you know, in an ideal world, I would extend 230 to cover copyright information also, to say that you shouldn't have to take content down just on notice. You can, if you believe it's infringing and you believe there's a risk there. But the structure of the DMCA, I think, is very problematic.
So again, let's go back to this question. On a very profound level that we should never lose sight of, this is an unbelievable period of free speech in human history, because pretty much anybody anywhere can say whatever they want, whether it's on a street corner or on some corner of the internet. But there does seem to be this ongoing attempt to constipate discussion, both in cultural ways, saying that to even ask that question is to reveal yourself as racist or homophobic or this or that, and also legally, to kind of restructure what is permissible. Are you optimistic? Is there a kind of culture of free speech that we need to reinvigorate? Or is this the culture of free speech? Yeah, I mean, I think there's a mix. And honestly, some of what we're living through now is the societal reaction to more people being able to speak. It's one of these societal changes that we have to go through. And, you know, I don't know if you know Clay Shirky; you've probably come across his writings at times. He wrote this thing many years ago now, it's got to be 10, 15 years ago, where he talked about the arrival of the printing press and how it was incredibly revolutionary in all sorts of ways. And people talk about before the printing press and after the printing press, and all of the new things that it created, obviously kicking off the Reformation and all sorts of other things. But he says nobody talks about that 50-to-75-year period when the printing press first came about, and it was just madness and revolution and insanity.
And people were not happy with the printing press, and the fact that not just the church could publish the Bible anymore, and all sorts of crazy things happened. And it took a while for society to come to terms with the fact that, okay, the monopoly that there used to be on printing is now gone. And I think we're going through something of the same thing right now, in an even more profound way, in that anyone can speak. But that leads to some potential issues. And the question is, how do we deal with it as a society? And I find that the rush on both sides misses the point, whether it's the people who say the law has to deal with this in some way, which is happening, and certainly on the European side of things that's very much the focus, they're trying to legislate everything on this front, or the other people who say, well, the tech companies have to solve it. I think that's missing the point too. Part of it is just that society has to come to terms with this, and society has to understand this stuff. We have a lot of people who are behaving badly, and I get that that's worrisome, and that's a problem. But that's a societal issue for us to deal with. How do we educate people better? How do we teach people not to believe false information? How do we teach people to understand these things? And how do we teach people... well, we teach them to agree completely with me. Or, you know, with you, or whoever. Well, where do you come from? And I ask, you know, this is the David Copperfield moment, not the magician but the Dickens character, the biographical moment in this conversation. From your LinkedIn profile, I know that you went to Cornell in the mid-to-late 90s. You have a BA and an MA. But, you know, how old are you?
Where did you... you know, why are you doing what you're doing? Why am I wired this way? Yeah, I don't know if I have a good answer to that. I mean, I grew up... I got my first computer in 1980, when I was six years old. And do you remember what kind it was? I got an Atari 800, which people don't remember, but it was Atari's entrance into the PC space. It had a keyboard and a disk drive, and we actually had a cassette recorder, so you could load programs from cassette as well. I would get the 120-minute-long cassette tapes, which gave you the most time, but then whenever you loaded or saved something, you could walk around the block and come back. And so, you know, I got really into computers and just loved technology and all the things that it allowed. And then I went to college at Cornell and was still just sort of a hobbyist, very interested in technology and the culture around technology, and was reading all different things. That must have been very exciting, though, to be your age and be in computers during the 90s, because all of that, you know, everything came to kind of fruition, it seemed. Yeah, yeah. And it was really interesting. I really had two professors who I think sort of set me on this path. One was Alan McAdams, who was a really interesting guy, an economist, who had been in the Nixon White House of all things, and had been a government expert in the IBM antitrust trial in the 80s. And he had this whole idea about technology and innovation. And again, this is the mid-90s. He was really focused on open source software. He thought that Microsoft and its proprietary software were a problem, and that the world would be better off with open source technology, more sharing of information, and more widely accessible information.
And so, you know, I worked very closely with him and learned a lot about where innovation could go with more open sharing of information. And just to give you an idea of how far back this is: in the mid-90s, when everybody had dial-up, this was the very beginning of DSL technology; if you had high speed, it was a T1 line that cost you $2,000 a month or whatever. And he was arguing at the time, and you can go back and find some of the articles, he passed away about a decade ago, but he had written these articles saying that we should have fiber to the home, to every home in America. And that should be treated as hugely important, however it gets done, whether through private companies doing it or whether the government has to come in and subsidize it, because the impact of having real fiber access to everyone's home would be completely revolutionary. It was really interesting to experience the 90s and to begin to think about how information plays into economics, which was kind of a big thing. Up to that point, I'd always learned the idea of economics as the study of resource allocation and scarcity, right? And information effectively turns that on its head, because you no longer have the same sort of scarcity. Information is the type of thing, as Thomas Jefferson famously said about lighting the candle: if I have a candle and light yours, now we both have light; we haven't diminished anything, we've increased it. And that creates all these weird economic incentives that, for most of history, nobody has studied. It was really only starting in about the late 80s to early 90s that economists began to really think about the impact of non-scarcity, of abundant information, on economics.
And to me, that set me off on recognizing the power of information, the power of information sharing and of technology to create innovation and growth. And from there, I've just gotten really focused on all sorts of different discussions around that. But at its core, it's an economic concept: the economics of abundance. And your main gig is... I mean, explain: is Techdirt your main gig, or is it Floor64, your company, which provides information, right, to clients? Yeah, so it's a little bit complex. I mean, it's all sort of associated. Floor64 is the overall corporate name, but it has Techdirt as the blog, and we do a bunch of different things there. And then related to that, we have this thing called the Copia Institute, which is sort of a think tank that tries to do more interesting things around the type of stuff that we write about at Techdirt. And so we do events, we do research. One thing that we've been doing a lot of in the last... Corporate espionage, I assume, right? That's another word for it. But that's not what we're doing. We did actually get asked to do that in the early days, and it's like, that's really not what we're doing. But, you know, we've done research reports, either public research reports or internal ones for some folks. But one thing we've done lately, which has been really interesting, is actually building what we call games, recognizing that when people are doing something different, a game or something of that nature, it takes them out of their preconceived notions and allows them to think of things in a different way. You know, I talked earlier about getting people to actually have to answer how they would handle certain content moderation decisions. And that's a perfect example of that.
When you put people into the shoes of the actual people who are making those decisions, they begin to see, you know, begin to see like, oh, these are really challenging. So like, we did one on like election disinformation recently and like how the election, you know, how different factions might use disinformation to try and, you know, drive an outcome regarding the election, which has suddenly become certainly pretty relevant. Do you find with, you know, the advent of mass information, you know, it really, it really does change everything. And, you know, thinking about the 90s, I'm 57 and I started, I bought my first new car in 1996, and I used rudimentary information that was online to, I think, get a better deal. But 10 years later, I actually was buying cars, always cheap, shitty cars, but completely online. And I would just go and pick it up. And it, you know, it's such a case study of like how having more information as a consumer, like you can just, you get a better price. And also the car dealer, like they're not going to waste time sweating you for stuff when they can sell you a car for, you know, in 15 minutes over, you know, for a decent price over online instead of like, you know, they don't have to work as hard, just so different. But then there's this question of like, you know, kind of everybody has, almost everybody has fiber optic or the equivalent coming into their house or into their hands or their phone. And they use that information very differently. And it doesn't seem, I mean, I guess what I'm asking is like, is some of the techno optimism or techno utopianism of the 90s where information will set us free. It doesn't quite do that. It just kind of, it makes us all able to live how we want, but it kind of complicates things more than it ever settles anything, doesn't it? Yeah. I mean, again, like, you know, I see it in a few different ways. And one of those is that, you know, there's a societal issue, right? 
I mean, everyone has access to this stuff. It is a question of how they use it. And people have different frameworks and thought processes. And again, the flood of information is something that we haven't really learned to deal with. The abundance of information is something that, as a society, we haven't learned to deal with, and that's created problems. You know, some people are able to process it, and some people are able to judge credibility. But that's a skill, and it's a skill that hasn't really been taught. Hopefully it's now starting to be taught, and people are starting to understand it, but it's still the really, really early days. And so over time, I still remain optimistic that we'll get to the right place. And I'm often put into that camp of techno-optimists or techno-utopians, and to some extent I don't think that's fair, because I've spent my whole life basically arguing about the things we need to ward off. I'm not a utopian or optimist in the sense that I think this stuff is inevitable, because I think you sort of have to fight for it. If you want the technology to work the way I think it should over time, then I think that's worth fighting for. But I recognize that it's not just the technology, it's the societal questions as well. And, you know, the literacy question: having people be able to understand this, and giving people the tools to better process things and to understand information flows, and what's credible, and what to take seriously and what not to. You know, and I rush to say that I like the fact that there's more information and that people can kind of create their own bubbles, because you're living in a bubble one way or the other, and if it can actually be reflective of your subjective desires, that's great.
I like also, and I'm pulling this out of what you said: I always talk about how we need better media literacy, because there's so much more media, but I think actually we need information literacy, which takes it to a whole other level. And I agree with you that we always think we're at the end of history, but in fact, and this is a really bad metaphor, we're at the dot-matrix stage of the internet as society, and stuff like that. We've got a long way to go. To that end, in 2019 you wrote a fantastic essay for, I think it was the Knight Foundation, a journalism group or a media literacy group, called Protocols, Not Platforms. Can you sketch out what your basic thesis there was, and is that still governing your broad vision, not of the specifics, but of how we should be thinking about things like the internet? Yeah. And just to clarify really quickly, it's actually the Knight Institute at Columbia, which is partly funded by the Knight Foundation; the Knight Institute at Columbia is a First Amendment free speech institute that does litigation and writing, stuff like that. And the Knight Foundation is named after Bobby Knight of Indiana, so they just like to beat people and things. Yes, with a chair. Yes. Yeah. So the Protocols, Not Platforms paper, which came out of some of the other writing that I had done on Techdirt previously, and then the Knight Institute really asked me to dig deep and write a whole paper on it, was an attempt to look at all these things that we've really been discussing today, honestly, and say: if we get beyond the legal questions, because everybody always jumps to the legal questions, how do we pass a law to do this or that, and instead look at the technology side, is there a way that technology can help drive the good part of this forward?
And is there a technological approach that is better than jumping to a legal approach? And so the idea here is that if we rethink like a lot of the problems, and the paper goes into a lot of detail, so I'm not going to describe the whole thing, but like, you know, if we look at a lot of the concerns that people have, and this is again on both sides, and some of this is certainly legitimate, is what they're really worrying about is how much power these companies have to make these decisions or not to make these decisions, right? And that's really the complaint that you have just a few of these companies, and they're very big. And, you know, so that leads some people down the path of saying like antitrust is the answer. And because of the nature of information flows and network effects, I don't think antitrust solves anything. You don't break up Facebook, like you broke up the telephone company into baby bells, you can't have like a West Coast Facebook and the East Coast Facebook, that just, that doesn't work. So, you know, what I was trying to do was say like, is there a technological approach that would be better for everybody, including the companies themselves? And it is, you know, thinking about, you know, social media or other websites as a protocol. And the example I use is sort of the simplest way to understand about it. And this is not, it's not a perfect analogy, but the easiest to understand is to compare it to email, right? Email is built on a bunch of standards, SMTP, IMAP, there's a couple other things in there. And so anyone can set up, you can set up your own email server if you want, or anyone can set up their own email service. And then you can communicate with anybody else's email, you're not locked in. So if you use Gmail, you're not locked into that, if you use Outlook, you know, Microsoft's email, you're not locked into that, you can set up your own, and you know, you can communicate. 
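The email analogy can be made concrete in a few lines of Python. This is just an illustrative sketch, not anything from the paper; the addresses and server names below are made up. The point is that the message format is an open standard (RFC 5322), so software from any provider, or a server you run yourself, can produce and consume the exact same messages:

```python
# A minimal sketch of the "protocols, not platforms" idea using email.
# Because email is defined by open standards (RFC 5322 for the message
# format, SMTP for delivery, IMAP for retrieval), any client or server
# can produce and consume the same messages; no single company owns
# the format. All addresses here are made-up placeholders.

from email.message import EmailMessage

def build_message(sender: str, recipient: str,
                  subject: str, body: str) -> EmailMessage:
    """Build a standards-compliant RFC 5322 message."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

msg = build_message(
    "alice@selfhosted.example",   # could come from a self-hosted server...
    "bob@gmail.com",              # ...and still reach a big provider
    "Interoperability",
    "Any SMTP server can deliver this; any IMAP client can read it.",
)

# The serialized wire format is plain text that every implementation,
# from Gmail to a weekend hobby server, understands.
wire_format = msg.as_string()
print(wire_format)
```

To actually deliver it, any server speaking SMTP would do, e.g. `smtplib.SMTP("mail.selfhosted.example").send_message(msg)` (a hypothetical host). That is the property the paper wants for social media: switching providers changes who carries your messages, not who you can talk to.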
And that also means that if you want to switch, it's very easy to switch; you don't lose access to all of your contacts. And different providers can come in and offer different services. Gmail is now certainly the biggest and most commonly used email provider. They jumped into the space and offered something better, which was a much better interface for email and a huge amount of storage at the time. But others have jumped in. So now ProtonMail is really popular among the sort of privacy set, because they're much more privacy protective. And yet you can use any of these, and you can communicate with other people. And in fact, you can do other things with it. You can have a Gmail address but not actually use Gmail's system; you can pump it into your own client, like Mozilla's email client, I forget what it's called... Thunderbird. Thunderbird, yeah, yeah. Now I'm getting all nostalgic. But there are all different things that you can do. And so there are some really nice things there, if you think about it that way, in that it enables more competition, even if you have a large player. Google is still a large player, and they still have probably the majority of folks who are using email services. But if you're uncomfortable with them, it's easy to leave without losing access to everybody that you talk to. And in fact, you can very easily continue to communicate with them, you can move all your email out, you can do all these different things. And to some extent, I actually think that also makes the companies themselves act better. There becomes a sort of competitive element there. Because if Google realizes they're losing lots of people... and early on there were all these concerns that Google was snooping on Gmail. Well, the price that you paid for Gmail, in the original iteration, was that a bot would read keywords, right?
It would sweep for keywords and then post ads. But they moved away from that. People forget that, but they moved away from it because they realized it was problematic and they were losing people over it. And so it's a market-based, technology-based approach to actually allowing for real competition. So if you could build a social media protocol the same way, Facebook could have their own implementation, Twitter could have their own implementation, somebody else could have their own implementation. And if you felt that you didn't trust Mark Zuckerberg, or you didn't like the way he was doing things, then you could move elsewhere. And you could still communicate with the people who did want to stay on Facebook, but you didn't have to give in to their specific rules. So, I mean, this is kind of a way of either maintaining or reinvigorating the notion of a decentralized internet, right? Which is what the internet was kind of sold as: a collection of computers, or printing presses, or however you want to conceive of it. I guess one of the questions I have is... the paper, Protocols, Not Platforms, there'll be a link to it somewhere on the podcast page. It's really fascinating and thoughtful. But we do like our walled gardens, don't we? And this is one of the things you talk about: the protocols were there, and in many ways the internet, and certainly the web, is kind of built on top of them. But we kind of like the walled garden, and people made fun of America Online because it was this incredibly well-rendered mall online that you could go to. And then it had to, you know, let people pass through to the internet, and then it was kind of over. But now Facebook is essentially a walled garden.
And one of the things we like about it, or even the Google suite of services, is that when a platform is under a single kind of commanding intelligence, everything tends to work better. People love Apple products, I think, less because they think they're good than because they know it's a pain in the ass if you use one thing that's not Apple; Apple makes it really hard. Amazon, yeah, yeah. So why would a company like Apple, Amazon, or Facebook ever give up their walled gardens, which in some cases have billions of people who are happy to hang out there? Yeah, it's tricky, but there are a couple of things. One, the other benefit of the walled garden, which you didn't even mention, is that you can often extract extra rents that way. It locks people in, so from a profit perspective there's clear value there. But part of my argument is that, honestly, this whole debate we've been talking about all this time is becoming more and more costly for these companies, right? The fact that they are constantly being raked over the coals, the fact that they're constantly being called before Congress to have Democrats yell at them about not doing enough and Republicans yell at them about doing too much. At some point, that becomes incredibly costly. Facebook has hired what, like 30,000 content moderators at this point? I was going to say that's how we're getting out of the pandemic. Everybody, you don't have to leave your house, but you have to do content moderation on your neighbor. It's like the Stasi of the internet or something. That's a joke, right?
The future of work is just that we're all going to be moderating each other's content. But that becomes increasingly costly. And at some point, there's reason to believe that companies might say, you know what, a better approach is to go back to these earlier principles of the internet, where we push that power to the ends and it isn't all centralized. And that doesn't mean each of us has to be making all of our own decisions. It shouldn't be that every day I wake up and say, do I want to hear Donald Trump today? We can build trusted sources, trusted communities. I can say, I want to see the people that Reason thinks are important, or I want to see who EFF thinks is important, or whoever else. We can set up different things and different rules. And so for the companies, I think it gets to the point where they recognize, maybe we're better off not having this power, because it becomes a liability in and of itself. And already Twitter has indicated that they're interested. In fact, they cited my paper in saying that they're now exploring this idea of trying to build a more decentralized Twitter and, in effect, take themselves out of the walled garden business. That's an ongoing process; I think they're going to have some announcements about it soon. But it's interesting that companies recognize that. The other thing that may happen, and I don't know that this is Twitter's thinking on it, is that a lot of companies who are not Facebook may recognize that the only way to defeat Facebook is to embrace this kind of approach, which is the kind of thing that is much harder for Facebook itself to adopt.
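The "trusted sources, trusted communities" idea sketched above is moderation pushed to the edges: instead of one platform-wide policy, a user composes third-party curation lists. A toy Python sketch of what that might look like; all names and list contents here are hypothetical illustrations, not a real API:

```python
# Sketch of edge-based moderation via subscribable "trust lists".
# A user's feed is filtered by the curators they have opted into,
# not by a single central policy. Everything below is hypothetical.

def build_feed(posts, subscribed_lists):
    """Keep only posts whose author appears on at least one trusted list."""
    trusted = set().union(*subscribed_lists)
    return [p for p in posts if p["author"] in trusted]

# Two hypothetical curation lists the user has subscribed to.
reason_picks = {"alice", "carol"}
eff_picks = {"carol", "dan"}

posts = [
    {"author": "alice", "text": "protocols, not platforms"},
    {"author": "mallory", "text": "spam"},
    {"author": "dan", "text": "encryption news"},
]

feed = build_feed(posts, [reason_picks, eff_picks])
print([p["author"] for p in feed])  # -> ['alice', 'dan']
```

The design point is that unsubscribing from one curator, or adding another, changes only that user's view; no central operator has to make an impossible one-size-fits-all moderation call.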
And then you may have sort of two internets, one of which is Facebook, and one of which is the more open, protocols-based approach that lots of companies embrace. I think that's a real possibility too. To close out, could you predict, say, the next two years, before the midterm elections? There seems to be a renewed interest in antitrust, particularly against tech giants. This is something that Elizabeth Warren talked a lot about and Joe Biden seemed kind of open to. You see the weird mirror version among people like Ted Cruz and Josh Hawley, who are also talking about possibly trying to turn certain types of platforms into public utilities, essentially. How do you think this is going to play out over the next couple of years? Are we going to see more and more attempts by the government to dictate, I don't want to say tech policy, but the way platforms actually are? Mark Zuckerberg, I think I lost the note, but in July he said something like, we don't want to be arbiters of the truth, we want to be a place where people can come and scream at each other for 20 hours a day. He's kind of changed his tune lately, as have other people. And I think a real deficiency in conventional libertarian thinking is to only think about government as the source of power, and government is always there, always bad. What we're in now is a world, it's kind of like a Neal Stephenson novel, where there are governments, quasi-governments, corporate governments, marauders; there's a lot of power circulating. Are you excited or exhausted by the thought of covering tech for another six weeks, much less two years? Well, I'm definitely exhausted.
There's no way not to be exhausted at this point. There's stuff that's going to happen. I think the antitrust cases are interesting. Frankly, reading through the details, I think they're somewhat weak. And based on the way antitrust law works, maybe they'll get a win, but a win would be a fine or a useless breakup. Even if the antitrust cases succeed, and I'd give them maybe a 40 percent chance of succeeding, it's possible... Oh, but if WhatsApp is broken off of Facebook, the world will be totally different. Right. So to me, I don't think that approach is going to work. And I don't think any of the attempts to regulate, reform, or repeal 230, even if they're successful, are going to have the impact that people want. That's a big part of my problem, and a big part of why I'm exhausted. I feel like very few people are thinking this through. For a lot of people, it's punitive, right? They don't like Facebook. They don't like Twitter. They don't like Mark Zuckerberg. They don't like Jack Dorsey. They don't like Jeff Bezos. So it's about causing pain for these companies, and they think that's going to magically fix stuff. And that's not true. In fact, I keep trying to point out to people: look, Facebook was the first company that said, we're okay with reforming 230. We're okay with that because they will be, right? It's an anti-competitive move on their part. It will limit the amount of competition that they face. In fact, Sheryl Sandberg, just yesterday as we're talking, gave this talk where she was like, we're the only company that can stop the QAnon crazy people from communicating.
So basically, give us all the power. It's like, shut down the Parlers and the other sites and just let Facebook handle it. That's much more scary to me than any of these other discussions. And so do you think that'll happen? Or do you think that also is an unlikely outcome? I think the reform discussions for 230 are going to continue, and I think Facebook will be a part of them and will try to drive the conversation in a way that they're happy with and that is harmful to competitors. That's almost definitely going to happen. The one thing that I hope happens this time, and I think is already happening, is that a lot of what I would call second-tier companies, not the Facebooks and Googles of the world, are recognizing that Facebook is not on their side in this debate and that they need to be present, having these discussions and arguing why Section 230 is actually really good for competition. We did a study at the Copia think tank a couple of years ago that looked at data on whether 230 and laws like it help create more competition. We compared 230 against the DMCA, we compared the U.S. against Europe, and we looked at a few countries where the law changed or a big court case changed the interpretation of the law. And the more 230-like your law was, the higher the investment into internet startups; when you moved away from that, investment decreased. So 230 is a method for getting more competition. And I fear that by pulling back from it, we're actually going to limit competition and limit a more market-oriented approach to this.
Do you worry at all about European laws and traditions and conventions? The internet is still, weirdly, a kind of U.S.-centric operation. Is the way Europe is going a model to follow or something to run away from? Europe's a real risk. I think Europe is getting to the point where we're much more likely to see a fragmented and fractured internet than before. We already have some of that, with China having walled off its own internet. But Europe, the EU, I should be specific, is getting to the point where they've passed a bunch of laws recently, and have a few others that they're working on, all of which I think are extremely dangerous and very worrisome. Even when they're done in good faith, for what they believe are good reasons, they're very short-sighted about how the technology works and how the internet itself works, and they're going to create massive unintended consequences. We've already seen some elements of this with the right to be forgotten, which has been a ridiculous tool for outright blocking of speech. GDPR and their privacy rules have, again, locked in the big companies, given them tremendous benefit, and cleared the field of smaller competitors. We have the new copyright directive coming into play, which is going to be hugely problematic for a whole bunch of companies in the EU. It's getting to the point where I think some internet companies are going to say it's just not worth it to do business in the EU at all anymore. And we might get a very separated, fragmented internet, which I think would be a real loss. Yes, it certainly would be. A final topic, and I apologize for keeping you on as long as I have, but I'm enjoying myself, and I hope you are. That's fine. Now, what about encryption?
Because, again, Joe Biden, who has been in office essentially since the second Washington administration, has a long history here, and he's talking about a domestic terrorism law. Going back to the '90s, one of the big issues that had kind of faded was whether or not encryption should be legal. And obviously, we know the intelligence services all hate end-to-end encryption and things like that. That is being dragged into a lot of conversations now, because the Boogaloo boys and whoever are using encryption. And simultaneously, they can't be allowed to meet in public places like Facebook, because they will radicalize people, but then we need to make sure they're not meeting anywhere, et cetera. It gets very complicated. But do you worry about encryption becoming a battle again? Yeah, and in fact, I think that's a big concern. It's connected to all of this, right? With everything that's happened with Parler and other platforms lately, people have said that downloads of Signal and Telegram, both encrypted messaging services, are at an all-time high because people are moving there. So the encryption debate is about to get very loud again. And I want to push back on one point: you said the intelligence services all hate end-to-end encryption, and I don't think that's true. It's the law enforcement folks who really, really hate encryption. Within the intelligence community, you have a whole bunch of people who are like, we rely on end-to-end encryption. So you've had intelligence officials pushing back; the attack on end-to-end encryption usually comes from law enforcement. And the fact is, they're also being incredibly misleading, right? What they really want is just easier access.
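One reason the law enforcement framing is misleading: end-to-end encryption protects message content, but routing metadata, who talked to whom and when, typically stays visible to the services in the middle. A toy Python sketch of that distinction; the addresses and the "ciphertext" blob are placeholders, not real cryptography:

```python
import base64
from email.message import EmailMessage

# Toy illustration: the body is an opaque encrypted blob, but the
# envelope metadata remains readable by any server that relays it.
# Addresses and the ciphertext are placeholders, not real data.
ciphertext = base64.b64encode(b"<encrypted with the recipient's key>").decode()

msg = EmailMessage()
msg["From"] = "alice@example.com"                 # visible to the provider
msg["To"] = "bob@example.com"                     # visible to the provider
msg["Date"] = "Mon, 11 Jan 2021 12:00:00 -0000"  # visible timing
msg.set_content(ciphertext)                       # only this part is protected

# What an intermediary can still observe without breaking the crypto:
visible = {k: msg[k] for k in ("From", "To", "Date")}
print(visible)
```

Traffic analysis on exactly this kind of metadata is one reason investigators retain substantial visibility even when content is encrypted, which is the point Masnick makes next.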
But today we have more ability to speak than ever before, and law enforcement has more ability to find out information on any sort of criminal activity than they ever have before. And even people who are using encryption tend to be really bad at it; criminals are not the smartest. Yeah, that's the thing. I hate Arthur Conan Doyle for convincing all of us that criminals are smart, when they're a couple of grades below Professor Moriarty, for sure. Yes. And using encryption in a way that is as protective as people seem to think it is, is very difficult. There are people who can do it, but they are not your everyday criminals. In fact, I know some of the people who created the original PGP encryption for email who don't use PGP encryption for email, because it's too difficult. It's not that easy to use this stuff, and it's certainly not easy to use well. And in all of the examples that we have, you're still giving off so much information. So yes, I do think there will be a fight about it, and it's worth watching, and we should be concerned about how that fight plays out. I think there are real reasons why law enforcement should be paying attention to where people are going with these encrypted messages. But the excuse that we have to get rid of encryption in some way because bad people are going to do bad stuff with it, I don't think that holds any weight. On an absolutely final note, but a positive one too: you wrote sometime semi-recently, imagine how much worse the pandemic would be without the internet. Coming into a year ago, people hated tech, they hated the internet on some level, and obviously that still kind of holds.
But it's also true that the past year, as bad as it's been, would have been so much worse without the internet. Say something good about it. I mean, it's stunning how quickly people forget how much the internet has made this pandemic bearable and survivable for a huge segment of the population. That's not to leave out that there are a ton of people who have been decimated and just destroyed by the pandemic and the inability to work, or other situations. But the fact is that there's so much stuff that could not have been done away from other people before that is now possible because of the internet. The ability to have this conversation on Zoom, obviously. I had used Zoom before the pandemic, but now it's almost every day. I'd never even heard of Zoom before the pandemic. And the fact that so many companies, Silicon Valley was the first to shut down and send all its workers home, realized, we can do this, we can all work remotely, and it works okay. The fact that everybody has the ability to do that, the ability to shop online. So many people like my parents, who are up there in age, have discovered how to shop online and realized it's much safer to use Costco delivery or Instacart or whatever. That's amazing. And I think that because it's here and because we're used to it, we've sort of forgotten how amazing it is and how much it's really helped. If this pandemic had happened this way even 10 years ago, I think the disaster level would have been much worse, just because of the lack of the internet infrastructure we have today that makes all of this possible. Well, that's a great note to end on. I want to thank Mike Masnick of Techdirt for talking to The Reason Interview.
Mike, thanks so much. Thanks for having me.