Okay, everybody, let's go ahead and take our seats, please. It's one o'clock, time to start the talk on data privacy. You know, one o'clock right after lunch on Friday, people are starting to run out of steam, but I'm so glad to be able to hang out with you guys this week, and we've had a lot of great conversations. And I can sense the energy level is still really high, even though we're in the afternoon of almost the last day of the conference. You know, as our lives become more digitized, as we interact with our digital devices on a more and more regular basis, as we see different bits of information about ourselves tracked and recorded and distributed on all kinds of media and platforms, many of us are increasingly concerned about privacy. Right? Who has control of my data? How is my data, or how are my data, being used? There's some dispute among grammarians about whether data should be singular or plural. Are they being used in ways that are harmful to me? Do I have some kind of property right in my data? Can I request that my data be used in a way that's different from the way it's currently being used? I'm gonna focus mainly today on private sector use of data, by search engines and social media platforms and so on. The subject of how the state uses my data is a whole different subject and would require a lecture of its own. So we'll focus mainly on the economics and politics of data privacy within the private commercial sector. Lots of people are concerned about this, right? And so we see legislation and executive orders and all kinds of litigation, many attempts to try to use the state to constrain or to shape how our data are used by private companies. 
Even just this summer, later in July, there will be an antitrust hearing with the CEOs of the big tech platforms in the US Congress, where among other things, congresspeople will be suggesting or demanding some kind of increased protections for consumer privacy. A lot of you are probably familiar with the way the Europeans do it, the GDPR as it's called. We'll talk a little bit more about that later. But if you're like me, you've noticed even just in the last two to three years, the number of clicks you have to go through to get to read a webpage, the number of places you have to indicate your consent to cookies and so forth, contracts that of course you're not reading, but you're just clicking, yes, I accept, I accept, just show me the stupid page. That has gone up, and a lot of that has to do with legislation, especially outside the US, though US companies have embraced it as well. And I'm often surprised how often I have friends who live outside the US, and I'll send them a link to an article, not from a controversial place like Mises.org, right? Just an article from a local newspaper or something. Hey, check this out. And they'll tell me, oh, I can't read it. It's blocked in the EU. That website is blocked. The Opelika-Auburn News or something, not because it has nefarious content, but because it doesn't conform with the data privacy restrictions that are imposed by the European Union. So websites end up blocked in Europe if they don't conform to the rules. That's the kind of thing we want to talk about. You may not know this, but every January 28th is Data Privacy Day, according to Wikipedia. We care so much about data privacy, we even have a day for it. So plan on celebrating next January. Let me start by just helping us think about privacy in general. What is privacy? What forms does privacy take? What does it mean to violate somebody's privacy? I mean, obviously this is something that predates the digital era, right? 
Celebrities are very concerned about their privacy. That's why they wear a disguise when they go out, you know, to get coffee in Hollywood or whatever, and the paparazzi are trying to get a picture of some celebrity without their makeup or whatever. We don't like people peeping in our windows, peeping Toms or whatever, seeing things they're not supposed to see. And you may lock your doors at night. You may have a safe or a briefcase or a suitcase with a lock on it, right? So the idea that people want to prevent certain kinds of information from falling into the wrong hands is of course something that long predates the digital era. At the same time, privacy is kind of an abstraction. So can we say that privacy is an economic good in Carl Menger's sense? Can we buy and sell privacy? Is privacy something that is parceled out and priced on the market? Well, you've seen this before from some of the other professors this week. Remember, Carl Menger gives a very precise set of criteria that must be met for something to be an economic good. There must be a human need that can be satisfied by that good. The thing, the good or service in question, must have such properties as to render it capable of being brought into a causal connection with the satisfaction of this need. You have to have human knowledge of this causal connection, and command of the thing sufficient to direct it to the satisfaction of the need. And you have to have all of these, right? Do people want and need privacy? Yeah, a lot of people want privacy. We need privacy in some spheres of our lives. Is there some way that we can exercise some control? Is there something we can do that has a causal connection to the amount of privacy we have? Yeah, I mean, I can put window shades on the window and that will limit the ability of the peeping Tom to look into my house. 
And do I have command of the thing sufficient to direct it to the satisfaction of the need? Well, again, window shades: I can go purchase window shades at Home Depot, I can install them, and I can control the degree to which someone can look in through the window. So what is the thing we're talking about that does seem to meet Menger's criteria for being an economic good? The window shade, not privacy, right? I'm not buying and selling units of privacy, but I can buy and sell units of a window shade. Remember, Menger's analysis points to a requirement for economic goods and services to come in discrete marginal units. You can have one of them or two of them or three of them, and you can buy them and sell them. What does it mean to have one privacy or two privacies or three privacies? Yeah, that's kind of tricky. That's why we often say the same thing about things like the environment, having a clean and nice environment without pollution. Do I like going out into the forest and walking through the trees and enjoying it? Yeah, of course I do, but is the environment an economic good? No, because the environment is kind of an abstraction. You can't buy and sell units of the environment. There's no price for one environment or two environments. Or love: is love a nice thing, is love desirable, is love beneficial? Do we want love? Yes, but love is not an economic good in Menger's sense, right? You don't buy and sell love. Now, as somebody asked us in the Q&A the other day, awkwardly, you can buy and sell certain goods and services that may be connected in some way to this, but you cannot buy love. So in other words, privacy is not really an economic good. I think it's analogous to information. 
So back in the 1980s and 1990s, there was a surge of interest among economists in the economics of information. Of course, it's not that economists hadn't been thinking about information for a long time; Hayek had his famous articles in the 30s and 40s about knowledge. But you started to see people like Carl Shapiro and Hal Varian, who wrote a very interesting book called Information Rules in the late 90s, I believe 98 or 99, about how we apply economic analysis to information markets and information goods. A more recent book by Josh Gans called Information Wants to Be Shared is also worth looking at in this genre. But their point was, when we analyze information using economics, what we're analyzing is specific goods and services that they called information goods. So a book is an information good, right? Because the reason that you buy the book, or at least that most people buy the book, is not because they're gonna build a house with it like a brick, or because they wanna hit somebody over the head with it, though you could use a book as a weapon. I mean, imagine throwing George Reisman's Capitalism at an intruder. Most people buy the book because they want the information contained therein. When I buy a movie to stream, or go into the theater, whenever we can do that again, right, I want the experience of getting that entertainment, or maybe it's a documentary. If I'm watching a Tom Woods YouTube video, I want the knowledge of Tom Woods's genius, right? It's not the physical experience of sitting there and clicking that I want. It's the information. Mobile phone cell towers, you could think of those as information goods too, if you like. Or hiring a consultant: if I hire a consultant or I pay for an online course, I'm hiring a person who's gonna provide a service for me, and that service is giving me information, okay? 
So there are lots of goods and services that embody information and which are bought and sold on the market, but information itself is not an economic good. That's the point. You don't buy and sell information. You buy and sell books, lectures, consulting services, cell phones and towers and so forth, okay? So I think we can use this same kind of framework to think about privacy. Celebrities wear dark glasses when they go out in public so you won't recognize them. Of course, that just draws attention to them, right? Baseball cap and dark glasses, because they don't wanna be seen. They wanna protect their privacy. Sunglasses, in that context, are a good that gives you more privacy, presumably. You're wearing a disguise. Putting locks on your windows and doors, shades and so forth, putting a tall fence around your property so people can't see into your backyard: those are goods and services. The thing that you ultimately want is more privacy, but again, you don't buy and sell privacy. You buy and sell these goods. Maybe software too, you know, encryption software. Think of that as a privacy good. Depending on your browser, Chrome or whatever will block certain cookies and prevent pop-ups from coming up, and you can limit the amount of data that gets transmitted to third parties, sometimes with extra software. Think of those as privacy goods. Again, you can buy and sell a piece of software, right? Something like BleachBit, I don't know if you remember, is something you run on your hard drive that basically shreds it. It was made famous because this is what Hillary Clinton's team used on the infamous email server that was in the basement of her house, right? They ran BleachBit on it before they turned it over to the authorities. Who knows why, right? Privacy goods and services can be analyzed just like any other economic goods, okay? 
We can use supply and demand analysis, we can talk about the law of returns, we can talk about diminishing marginal utility, we can analyze technical change and regulation, taxes and so forth. In other words, my point is, the fact that a good or service embodies information or facilitates privacy does not make it different from any other good or service in terms of how we would analyze it using economics. We don't need a special new kind of privacy economics to analyze these kinds of goods, right? I can understand the price, quantity, and market characteristics of window shades and the window-shade industry just like I would the hamburger industry or the clothing industry or anything else, okay? So like any other good or service, there are markets for privacy goods and privacy services, right? If you really want a lot of privacy, there are ways that the market can provide that for you. There are entrepreneurs who are making window shades and offering them for sale, and in terms of software and hardware, there are all kinds of goods and services that provide greater levels of privacy, either through encryption or disclosing less information to third parties or whatever. And entrepreneurs offer goods and services that include a bundle of features, some of which relate to privacy, others of which relate to convenience and so forth, right? I mean, if this were an audience of normies and I asked, how many of you have some kind of hard encryption on your emails, probably the answer would be none, but in this crowd, I'm not so sure, right? I mean, if you are really concerned about the feds or anybody else snooping on your text messages, instead of doing regular texting, you can use Telegram or you can use Signal. 
I think Telegram is the one that Antifa uses when they organize their rallies, and I don't know, probably some Nazi groups use Signal or something like that, but most people don't, quite frankly, because it's not as convenient, right? You can only communicate with somebody else who has the same thing on their system, and people think you're weird if you use it, right? But there's nothing preventing you from doing that, and there's nothing preventing an entrepreneur from offering some kind of communications technology that provides greater levels of privacy protection, okay? And if that succeeds on the marketplace, that means people are willing to give up something else. Maybe they pay a higher price, or maybe they give up some other feature, in order to have more security, more privacy, and so forth. You can have your whole network traffic encrypted with Tor or one of these other kinds of peer-to-peer encryption systems. How many of you use DuckDuckGo as your default search engine? Again, in this crowd, about half the hands. Most normies have never heard of it. But in fact, you could go a little bit farther and say that if we look at what Austrians call demonstrated preference, let's look at what we observe in the market: what do most people actually seem to prefer? I mean, with the exception of you guys, right? I'm not making a praxeological statement here, just my own conjecture based on my interpretation of the empirical evidence: I think most people are not willing to give up many other features to get more privacy, because DuckDuckGo has a very small market share. Almost everybody else just uses Google, even though they know that Google is tracking all kinds of stuff about you. Why? Because they're lazy, because they don't want to set DuckDuckGo as their default search engine, or they think the search results are not as good. 
They think, rightly or wrongly, that Google gives them better and more accurate search results. If they really cared a lot about being tracked by Google, more people would say, yeah, I'm willing to live with the fact that the search results are not as good on DuckDuckGo, because I don't want to be tracked. But probably most people don't. That's why most people don't use encrypted texting and encrypted emails and so forth. I mean, again, that could change. I think WhatsApp actually uses a higher level of encryption than regular texting and even Facebook Messenger and so forth. So there actually are a lot of messaging options out there, and people are choosing based on a number of different criteria, privacy being one of them, but it doesn't look like privacy wins out for most people. I mean, if you really were super, super concerned about privacy with messaging, you would not use an electronic device at all. You would write your messages on a piece of paper with a pen, and if you wanted to transmit one to somebody at a long distance, you would roll it up into a little roll and put it in one of those things on the back of a carrier pigeon, right? Or you'd use smoke signals, or I don't know what you would do. You'd do a cloak-and-dagger kind of thing where you have it written down and hidden, I don't know, inside a stuffed animal, and then you leave the stuffed animal for somebody under a tree in the middle of the night, and they, you know, whatever. You could do all those things. Most people don't, because it's a pain, it's not convenient, and they don't care about it that much. In fact, since March, since the COVID pandemic hit, aside from testing and social distancing and masking and so forth, you've heard a lot about contact tracing, right? 
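As an aside, the privacy-conscious design most of these contact tracing apps use, decentralized exposure notification, can be sketched in a few lines. Everything here is a toy: the class and function names are made up for illustration, and real systems use rotating cryptographic keys and Bluetooth broadcasts rather than in-memory sets. But the core idea, that phones swap random tokens and matching happens locally on your device, looks roughly like this:

```python
import secrets

def new_token() -> str:
    # Each phone periodically generates a fresh random identifier
    # that cannot be linked back to its owner.
    return secrets.token_hex(16)

class Phone:
    def __init__(self):
        self.my_tokens = []   # tokens this phone has broadcast
        self.heard = set()    # tokens received from nearby phones

    def broadcast(self) -> str:
        t = new_token()
        self.my_tokens.append(t)
        return t

    def hear(self, token: str):
        self.heard.add(token)

    def check_exposure(self, published: set) -> bool:
        # Matching happens locally; no central server learns who met whom.
        return bool(self.heard & published)

# Two phones meet; a third never does.
a, b, c = Phone(), Phone(), Phone()
b.hear(a.broadcast())
a.hear(b.broadcast())

# 'a' tests positive and publishes its tokens.
published = set(a.my_tokens)
print(b.check_exposure(published))  # True: b was near a
print(c.check_exposure(published))  # False: c never met a
```

The privacy trade-off people are weighing is exactly the one in the last two lines: you learn whether you were exposed, but only by letting your phone participate in the token exchange at all.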
So at the moment, there is no US nationwide mandatory contact tracing system, although some smaller countries are experimenting with this. But in fact, even today, here in the US and probably wherever all of you live, you can download a contact tracing app if you want. There's a whole bunch of them out there. Apparently not many of them are being downloaded, so the public health people are worried that there's not enough use of voluntary contact tracing apps to have a big impact on the spread of COVID, okay? But what does that suggest? Well, there are a lot of possible interpretations. People don't think COVID's a big deal, or they don't think contact tracing would really help, or they think it is a big deal and contact tracing would help, but I'm not giving up my privacy for that. I mean, we think that the big tech platforms and your mobile phone company, Google and so forth, and the NSA, right? They can trace you anyway, but we don't want to make it easier for them, and most of us are not willing to give up our information to a third party. Actually, Apple has some built-in contact tracing options, but you have to enable it. I think Samsung has as well, and very few people have enabled it, which suggests that maybe even in the face of a pandemic, and a proposed intervention to mitigate the pandemic, most people are still sufficiently concerned about their privacy that they don't want to install it, okay? So my point is, you can have more privacy or less privacy depending on which kinds of privacy goods and services you consume, and your willingness to consume them depends on the overall benefits and costs of those relative to other goods and services that provide other features with more or less privacy. Again, just on the empirical side, how do things actually work today? 
I mean, one interesting argument that I think does not get made as much as it should is that, despite Big Brother, despite the NSA spying on us, and despite Google knowing where we are and our mobile operator being able to locate us, in many ways we actually have more privacy today than previous generations did. Think of all the things that you can do from the privacy of your own home. You can order stuff on Amazon and it comes to your house in a brown box, and your neighbors don't know what's in that box, okay? Your neighbors don't know what you're streaming, they don't know what music you're listening to, they don't know who you're texting with. I mean, yeah, maybe Google knows, and maybe Netflix knows, and Donald Trump knows, and all kinds of other people know, but your neighbors don't, and your employer probably doesn't, and your friends at school and your grandmother don't. In the old days, it was much harder to keep those things private, especially if you lived in a small community. People see you going around. If you wanted to go buy some suspicious thing, you had to go buy it in person, and anybody who saw you walking into that store knew what you were buying. That's not the case anymore. If you don't believe me: Woody Allen is kind of a controversial figure nowadays, but you might know that Woody Allen's early movies are quite different from his later movies. They're more kind of slapstick comedies, absurdist comedies. My favorite of his movies is Bananas, which is a movie where he's this sort of weird guy in New York who ends up becoming the dictator of a fictional Latin American country. But there's a hilarious scene early on where he tries to go and buy a dirty magazine at a store, like a convenience store or something, and he's browsing the magazines, there's a bunch of people around, and there's this old lady kind of looking at him suspiciously. 
He's like, oh, I think I'm gonna get a copy of Time magazine, I'm gonna get a copy of National Review, here's the New Republic, and I'll take one of these. He slides it under the stack, and then he goes to try to check out, and I'll just let you watch the YouTube clip to see what happens, but it's pretty funny. So in fact, we do have quite a bit of privacy, and we have options to consume even more privacy if we really want, okay. Also, it's easy to say, gosh, I hate it that Google knows so much about me. I hate how much Amazon knows about me, and I hate how much Netflix knows about me, and I hate how much AT&T or Verizon or whoever knows about me. But again, keep in mind, why do those tech platforms want to know so much about you? What is it they want to do to you? Right, I mean, if it's the NSA, maybe they wanna shoot a missile at you, but what the private tech platforms wanna do is sell you stuff, right? They wanna get you to buy stuff. They wanna make sure you are aware of things that you might wanna buy, that are available for sale, that you otherwise wouldn't know about. Another, more technical way to say it is that these kinds of platforms that collect a lot of information on sellers and on buyers allow for a much more efficient matching of buyers and sellers than would otherwise be the case. I mean, I remember it used to creep me out. Amazon has been selling books for, what, 20 years, but I remember the first time I realized that when I go to the homepage of amazon.com on my computer, maybe you guys don't do that anymore, you're just doing it on your mobile, the things that are advertised on the front page are customized for me. When I first realized that, I thought, man, that's really creepy. Hi, Peter, we think you might like this. Like, well, you don't know me as well as you think you do, and no way I'd want that. 
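The simplest version of that matching, the "customers who bought this also bought that" list, is just co-occurrence counting over past orders. Here's a minimal sketch with made-up order data; real recommenders add weighting, filtering, and vastly more data, but the core idea fits in a few lines:

```python
from collections import Counter

# Each order is the set of items one customer bought together (toy data).
orders = [
    {"Human Action", "Socialism"},
    {"Human Action", "Man, Economy, and State"},
    {"Socialism", "Man, Economy, and State"},
    {"Human Action", "Socialism", "Liberalism"},
]

def also_bought(item: str, orders) -> list:
    # Count how often other items appear in orders containing `item`,
    # then rank them by that co-purchase count.
    counts = Counter()
    for order in orders:
        if item in order:
            counts.update(order - {item})
    return [title for title, _ in counts.most_common()]

# "Socialism" co-occurs with "Human Action" most often, so it ranks first.
print(also_bought("Human Action", orders))
```

The seller never needs to know anything about you personally for this to work; the ranking comes entirely from aggregated purchase patterns.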
But of course, that allows me to find things that maybe I actually would wanna buy, that in the old way of doing business there would be no way to find. Millions and millions of books; what are the chances I would find a book that I really like? And I can go to "people who bought X also bought Y" and browse through those things, and quite often I'll find stuff I'm interested in that I otherwise wouldn't be. I mean, occasionally it's pretty funny. I'll be on there, and it says, you may also be interested in, and it's got one of my own books. And I think, I'm totally not interested in that. No, not at all. So look, another place where it's very easy to see this is dating sites. I don't know how many of you have tried an online dating platform, but I guess I'm old enough that I was already happily married before dating sites came online. But back in my day, how do you meet potential marriage partners or whatever? Well, you go to the singles bar, or a friend introduces you to somebody, or you meet somebody at work or whatever. I actually met my wife at a Mises Institute event, but that's neither here nor there. It's not easy to meet people, and they say, I haven't tried it, but they say it is easier to meet people now on dating sites, and maybe it doesn't always work out well. But the whole idea of matching: you fill out a profile and they fill out a profile and you upload your picture, and I guess a lot of people lie and make stuff up, but in principle the algorithm can match you up based on your likes. By the way, you probably know this, but the fastest growing social media platform today is TikTok, by far. TikTok has exploded in terms of market share. And one of the things about TikTok is, it's mainly a weird platform. It's much more restricted and limited. 
I don't know if you've used it, but compared to Twitter or Facebook or Instagram or whatever, all you can do is upload goofy little short videos. It also doesn't have the concept of a friends list, so whatever you post is available to anybody in the world. And if you scroll through a feed, what you're seeing is what the algorithm thinks you'll be interested in, based on, I guess, what you looked at before and how much time you spent looking at it and whether you liked it or whatever. So it's a totally different way of trying to customize the content to the user, so that you'll like it and want to stay on there for a long time and spend hours looking at these goofy videos, so they can also make money from advertising, right? But the technology to match potential buyers to goods and services that the seller thinks they would be interested in, you know, it can be annoying and it can feel creepy, but it provides a lot of advantages in terms of facilitating market transactions. Also think about the stuff that we really don't like, like credit rating and credit scoring. I don't know if you've ever applied for a loan, and they ask you to type in your social security number and your date of birth, and you get really nervous, right? And oh, if I ever don't pay a debt, or, as they used to say, bounce a check, I don't know what the modern equivalent of bouncing a check is, it hurts my credit score. We think that's terrible, and we don't like that these mysterious agencies are collecting information on our creditworthiness. But in fact, that allows a lot of people to get loans who otherwise wouldn't, right? In the absence of credit rating agencies and a credit scoring service, a stranger walks into my bank and says, I want a loan to buy a car or a house. I say, I don't know you. You don't live in my neighborhood. You don't go to my church. You're not in my community group. 
I'm sorry. Unless you've got a whole bunch of collateral, you're out. Okay, but now a stranger comes into the bank, I can just look up their credit score, and I can decide whether I want to make them a loan or not. I use other criteria as well, but credit scoring has actually facilitated a lot of credit transactions that make everybody better off, make the buyer better off and the seller better off. Same thing with medical privacy. Probably it won't be long before we all carry, on our phone or in a chip inside our heads or whatever, a record of our complete medical history. And it does seem kind of creepy in a sci-fi movie way. But imagine you have an emergency and you go to the emergency room and they can quickly scan something and get your complete medical history. Maybe you get better treatment. Maybe we could save lives that way. Now, do we want to look at a balance or a trade-off between saving lives and maintaining some degree of medical privacy? Absolutely, right? But these are things that entrepreneurs can work out in a market system. A lot of people are concerned that if sellers have a lot of information about me, they're gonna use it not to offer me cool stuff that I might want, but to try to take advantage of me somehow and charge me higher prices, or, if you've studied so-called price discrimination, that sellers are gonna use this to price discriminate. I mean, theoretically they could, but empirically there's not much evidence that that kind of information is useful to sellers for price discrimination. There's an interesting survey article in the Journal of Economic Literature from four years ago that you can look at. Okay, let me speak a little bit about regulation: some of the rules that actually are in place, and some rules that have been proposed. 
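The credit-scoring logic above can be made concrete in a few lines. The threshold numbers here are purely illustrative assumptions, not anything a real lender uses; the point is just that a portable reputation number substitutes for the personal knowledge, or the collateral, the banker would otherwise demand:

```python
def loan_decision(credit_score: int, collateral: float, loan_amount: float) -> bool:
    # A stylized rule: a good enough score substitutes for the personal
    # knowledge (or collateral) a lender would otherwise demand.
    # The 700 cutoff and 50% collateral ratio are invented for illustration.
    if credit_score >= 700:
        return True                       # reputation alone suffices
    if collateral >= 0.5 * loan_amount:
        return True                       # otherwise, pledge collateral
    return False

# A stranger with a solid score gets the car loan; without the score
# and without collateral, the lender says no.
print(loan_decision(credit_score=720, collateral=0.0, loan_amount=20_000))  # True
print(loan_decision(credit_score=600, collateral=0.0, loan_amount=20_000))  # False
```

Sharing that one number with a stranger is a privacy cost, but it is exactly what lets the transaction happen at all.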
So the kinds of privacy regulations that people talk about, the potential restrictions on private market information sharing that have been proposed, include things like mandatory privacy protection. So there are proposals to make it illegal for Google to give any information to certain third parties without explicit consent, okay? You might remember that one of the issues during the 2016 election was that Facebook gave some data to a private consultant who was working for the Trump campaign, and apparently it violated the terms of service between that consultant and Facebook, but it wasn't discovered at the time. So very strict rules about what information can be shared are one kind of proposal. A second type is a rule that would establish a kind of mandatory data portability, right? That's the idea that the law says you own your data. So if you've been on Twitter for a long time, and you decide, like some people are doing, I hate Twitter, I'm switching to Parler or Gab or some other social media platform, it should be possible for me to go into my profile on Twitter, click somewhere, and extract a file that consists of all the posts I've ever made and all my likes and comments or whatever. I should be able to download that data to my device, and Twitter has to erase it. Then I can upload it to some other social media platform, and now I have the same experience on that platform as I did on the original one, okay? There are also proposals to impose harsher fines and penalties on private actors who violate contractual terms, right? You've probably heard that in the last several years there have been a lot of big data breaches; every month or so we hear about some data breach. 
Even just a couple of days ago there was some kind of hack on Twitter you might have read about, where the blue checks, the verified Twitter users, couldn't tweet, couldn't access their accounts or change their passwords or their profiles. Apparently there's some kind of hacking scam where hackers are trying to use verified profiles to engage in whatever kind of nefarious activity. So people say Twitter should pay a huge fine, or people should go to jail or whatever, for violating user privacy. Or sometimes you hear about a store; this happened to Target, the big retailer, a couple of years ago, where some hackers stole a bunch of people's credit card data or whatever. You know, right there we could ask ourselves just a kind of procedural question. I mean, if I download the Target app onto my device, there are terms of service and a license that I clicked through. If that specifies that Target will not reveal my credit card numbers to Ukrainian hackers and they do, well then I would potentially have a civil breach of contract case against Target. So one response to a lot of this is to say, well look, if all we're talking about is contractual violations, the user has a contract with a provider that says you won't give my data to X and you did anyway, well then I can sue you for breach of contract, and maybe a bunch of us could get together and file a class action suit or something along those lines. It's not obvious that we need legislation. We don't necessarily need positive law, as Judge Napolitano would say. We can use common law or natural law remedies for these kinds of things. I'm gonna come back to that point later because I actually think it's important. I mean, if you think of these kinds of restrictions, these proposed regulations, all of them could potentially provide some benefits to users but could also impose some significant costs. What do I mean? 
Well, again, you know, imagine a rule that says my data has got to be portable. Twitter, Facebook, Instagram, TikTok, whoever, they've gotta have some way to let me extract all my data and import it to another platform. I mean, how is that actually gonna work from a technical point of view? You'd need some kind of universal database format that all kinds of different social media platforms are compatible with. You know, what would the columns be in that data set? What would the fields be? Because different platforms have different forms of interaction. If you think about it too hard, it's not even clear what that would mean. How would you implement such a thing? Because, you know, my data on Twitter, for example, is a record of my interactions with people on Twitter. It's what I posted, what I commented on, who commented on my stuff. There's no way I could extract that and put it on another platform and have it work or be meaningful. So if data portability were mandatory, what that would likely mean is that social media platforms, and any other kind of entity that uses data on customers, would have to limit what they collect and organize it in some form that makes it super portable. And that means they probably wouldn't collect most of the stuff they currently collect that makes their product work the way it works. So you'd have platforms and e-commerce sites that don't work very well. You couldn't get personalized recommendations, or you couldn't have an algorithmic feed like on TikTok, or even Facebook, which does that too. Or you wouldn't be allowed to comment on and like other people's stuff, because that wouldn't be portable. So you'd probably have a lower quality product. I mean, again, if people really care about privacy, some entrepreneur can offer a strictly limited kind of platform with complete data portability.
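To make the portability problem concrete, here's a minimal sketch in Python. Everything in it is hypothetical: the field names, the IDs, and the export structure are invented for illustration and don't reflect any real platform's schema. The point is simply that self-authored text travels easily, while the interaction data only means something on the platform where it was created.

```python
import json

# A hypothetical "portable" export, loosely in the spirit of the
# takeout/archive files some platforms offer. Every field name here is
# invented for illustration; no real platform uses this schema.
export = {
    "profile": {"handle": "@alice", "display_name": "Alice"},
    "posts": [
        {
            "id": "post-1001",
            "text": "Hello world",
            # These references only resolve inside the origin platform:
            # the IDs point at accounts the destination has never seen.
            "liked_by": ["user-42", "user-99"],
            "replies": [{"author": "user-42", "text": "Welcome!"}],
        }
    ],
}

# The export itself serializes fine...
blob = json.dumps(export)

# ...and the self-authored text survives a transfer:
portable_text = [p["text"] for p in export["posts"]]

# But the interaction graph does not: on a platform with a different
# user base and different features, these are dangling references.
dangling_ids = {uid for p in export["posts"] for uid in p["liked_by"]}
```

The design problem a universal format would face is visible even in this toy: any column you add for one platform's interaction type (likes, retweets, duets, reactions) either has no meaning elsewhere or points at users and posts that don't exist there.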
You can have it whenever you want it, but it probably wouldn't be very good. And the fact that there isn't one like that on the market now suggests that not very many people really would want that. If you ban cookies and third-party ads, that's another big one people complain about, how in the heck are these platforms gonna make money? Okay, we're already grumbling now about streaming. Well, streaming isn't that expensive, but I pay for Netflix and Amazon and Hulu and Disney Plus and Peacock, which is NBC's new streaming service, and HBO Max or whatever. You know, there's so many different streaming services. If I subscribed to all of them, who knows how much I'd be paying per month. That's why people mostly just cheat and share their friends' login credentials, right? At least until the services manage to crack down on that somehow. I mean, imagine that all your social media platforms were like that. How much do you pay to use Instagram? Zero. How much do you pay to use Twitter or Facebook? Zero. How much do you pay to do a Google search? Zero. Imagine that it were not possible for companies like Google to use their current business model, where revenues are generated from advertisements. Would it even be profitable to operate a search engine? Maybe there'd be some kind of micropayments technology where you pay a fraction of a penny, or you pay in Zimbabwe dollars or something, whenever you're doing a search, but I mean, it's just a huge complicated mess. Nobody likes ads. I saw a cartoon the other day: Satan in hell with the flames, a guy coming in, and Satan says, oh, you're the one who invented the double ads on YouTube? We love you down here, okay? Nobody likes that, but would you rather have to pay for everything and not have any ads? I don't know, maybe you would. The other thing, you know, I already talked about the GDPR, this EU rule.
Not only is it extremely annoying, but there's a lot of evidence now about the effects of these kinds of rules. I mean, what sorts of tech companies are most likely to be able to comply with complicated government mandates about privacy? The little startups? No, the big guys. And in fact, the large dominant social media platforms have increased their market shares in Europe following the passage of these kinds of data privacy rules. That's why you often see, you know, when Mark Zuckerberg goes to the Hill, he's not making an articulate case for the free market, right? He's saying, yeah, we're worried about privacy at Facebook too. We think the government should impose stricter privacy regulations. That's, you know, the argument of so-called raising rivals' costs. They're trying to make it harder for smaller, more innovative, newer firms to compete by asking the government to impose regulations that Facebook can more easily meet than a smaller competitor. Okay, I want to say a few words before we wrap up about the most fun topic to argue about these days in this realm, and that is the infamous Section 230 of the Communications Decency Act. Now I'm gonna get a little bit into the weeds here, but you'll see there's a method to my madness, for those of you who have not gotten into this yet. I mean, everybody knows, right? What does Trump call the major media companies? Fake news. Trump says Twitter is fake news, right? People on the left and right, but increasingly now on the right, complain that Twitter and Facebook and Google and so forth engage in censorship. I mean, it's not technically censorship, right? Because censorship typically refers to government restrictions. But they say Twitter will ban or shadow ban conservatives. You know, people started talking about this when Alex Jones got banned from all the platforms, and you can find your own favorite example. Yeah, so this is where a lot of people are leaving, right?
People are leaving Twitter. In fact, if you search the hashtag twexit, which ironically you can find on Twitter, right? You can get accounts of all the people who are saying, I'm deleting my Twitter account and, like Ted Cruz, I'm going to Parler or some other platform, because Twitter discriminates against conservatives, probably libertarians too, right? And that Google does the same thing, that their algorithm is built so that if you're searching not for something to buy but for something you wanna learn about, say the US Civil War, it doesn't put Tom DiLorenzo's books or Tom Woods's books near the top; it artificially puts them at the bottom for purely ideological reasons. And maybe you saw that infamous thing after the 2016 election, right? There was some sort of leaked video of a Google company meeting, where you had all the executives and engineers at Google literally crying about the results of the 2016 election, and what are we gonna do about it as a company, and what can we do to reverse that? You know, I mean, it's pretty obvious that the people in the tech industry are not totally unbiased, right? So what do you do about it? Well, part of the reason that this is such a complicated kind of legal issue is because of a very important piece of legislation passed in 1996 in the US context called the Communications Decency Act. So the Communications Decency Act was congressional legislation that was passed in the early days of the commercial internet. And the concern at that time was about how people were using internet services. And in those days, you've probably heard about these things, your parents have told you about them, the old system of dial-up internet, like AOL and CompuServe and Prodigy, where you had a thing connected to your phone line. You know, you've heard that sound, the modem dialing sound, like a fax machine.
And then you'd get, you know, super slow internet, and then your mom would pick up the phone and it would mess up your connection or whatever. The concern was that people were using these services for criminal activity, in particular child pornography and sex trafficking, human trafficking and so forth. So Congress passed the Communications Decency Act to give government officials more tools to crack down on people who were using these technologies for illegal ends. But there was an important carve-out. The issue was, well, if I go onto one of these services, like America Online or Prodigy, another famous one, and I post something illegal and harm someone, can the victim sue the platform? Can they sue AOL for allowing me to post something harmful, or can they just come after me? And there were actually some pretty famous cases along these lines. There were a couple of cases where courts ruled that the service provider was not liable because they didn't really have control over the content; they were just a neutral platform where people could post things. But there was a case against Prodigy where the court ruled that Prodigy was liable for something that one of its users had posted in a chat room. So there was a lot of uncertainty about how this would work, and these internet companies were asking the government for some clear and consistent rules so they would know what to do, how to build out their networks. So the Communications Decency Act included a now infamous part, Section 230, which drew on a distinction in the common law between publishers and common carriers. Okay, so say you live in a dorm or an apartment and there's a bulletin board where people can post little flyers or whatever, upcoming events, stuff for sale.
If all the apartment complex or the dorm does is provide that physical bulletin board, but they don't exercise any control whatsoever over the content, then they're treated by the law as, again, just a neutral provider of that space. And if somebody puts up something inappropriate, it's not the fault of the owner of the bulletin board, because anybody could post something and we don't curate, we don't screen the flyers, we don't edit the flyers, we just let anybody put up anything they want, so we're not responsible for what gets up there. As opposed to a publisher, right? So if there's an article in the New York Times that defames you, Walter Block, right? You could sue the New York Times for defamation, right, as Walter has done, because the New York Times chooses what to publish. They could have chosen not to publish that defamatory statement about you, because their articles are edited and curated, right? As opposed to, say, the comments section of NewYorkTimes.com; if that's unmoderated, where anybody can comment, then they can say, well, for that we're just a common carrier, right? Likewise, you can't sue Verizon or AT&T if somebody makes a threatening phone call to you, because they don't have control over what people say on the phone. They just provide the connection. So what Section 230 did is it said, well, look, we want the platforms to go on there and get rid of child pornography. We want them to monitor drug dealers and child traffickers or whatever, and we want them to take that stuff down. But if they're worried that by doing so that makes them a publisher, not a common carrier, and therefore potentially subject to being sued for not taking something down that hurt somebody, then they're gonna be afraid to intervene, right? They're not gonna curate, they're not gonna moderate, they're not gonna take bad stuff down.
If they feel like once they do that, that's gonna expose them to liability, either for taking down something they shouldn't have or failing to take down something they should have, then they just won't have bulletin boards or comment sections, because they'll be too scared, and we want those to flourish. So we'll have a special carve-out in the law, and we'll say that online media platforms can engage in content moderation and yet we will treat them as if they were a common carrier. We will allow them to moderate content if they want, and we'll give them a blanket exemption from any civil lawsuit over the content that they allow to be on their site. Now, defenders of Section 230 say, well gosh, if it weren't for that, the internet probably never would have taken off, right? The only reason the internet grew and thrived and turned into what it is today is because of Section 230 protection. There's a book on this by Jeff Kosseff called The Twenty-Six Words That Created the Internet; the twenty-six words are the twenty-six words of Section 230 of the Communications Decency Act. He says, without that, the internet would have just died, right? So therefore it's a good thing that we give this blanket liability protection. Okay, is that a good idea? Well, Trump has recently proposed, and a lot of social conservatives are proposing, revoking Section 230 or rewriting Section 230 to say, okay, if Twitter wants to ban conservatives, if Twitter is gonna intervene and decide what gets on there and what does not, then it should be treated as a publisher and should be subject to liability. And they think that this will encourage tech platforms to be content-neutral, not to be ideologically biased. It's kind of like the old net neutrality debates, or like, you know, the electric company is not supposed to shut off your electricity because you're running a libertarian website, right? They're supposed to be neutral with regard to how you use the electricity.
Well, according to Trump and Josh Hawley and people like that, Twitter and Facebook and Instagram should be required by law to be ideologically neutral. Now, there's the counter to that that you see from the Electronic Frontier Foundation and some other, even some kind of Washington libertarian groups: no, no, that's wrong. You know, these are private companies and they can do whatever they want. They can moderate or not moderate. You know, they can't censor, because only the government can censor; that's true. I wrote a little thing on Mises.org not too long ago, trying to tease this out. And I said, well, I guess my position is sort of closer to the EFF position, but with some caveats, right? I mean, so yeah, certainly true, Twitter is a private company and Twitter should be able to do what it wants, right? Yet at the same time, it's very weird to have federal legislation that basically throws out the natural law or common law notion of liability for speech and so forth. Right, so in fact, you really don't need legislation like Section 230, or the rest of the CDA either. We should just let this be handled by the common law, by the court system, right? Let social media platforms do whatever they want, but let them be subject to liability if they violate the contract or the terms of service, if they engage in some behavior that violates rights, violates the NAP, right? You know, as to the argument, whenever I say this to my libertarian friends, my sort of Washington D.C. libertarian friends, they say, oh, but if we did that, then it would just be too costly to run a social media platform; they would all just go bankrupt. To which I say, well, maybe. I mean, we'd have to run that experiment to see if that's really true, but if it hurts internet companies, so what?
I mean, the purpose of the law is not to benefit a particular industry or to make a certain product more profitable than it otherwise would be. It's to protect rights, and we should have the same legal framework that we would have in any context, regardless of what impact it has on companies. And of course, it's ridiculous for the government to try to regulate content neutrality. That just doesn't make any sense. I would remove the GDPR and repeal the CDA and repeal all government restrictions or regulations related to privacy and liability. Let this be handled by the courts, by the common law, as it would be anyway. Okay, so I'm already over my time, but I also want to close by saying, from a sort of political philosophical point of view, I think the idea of owning your data is fundamentally flawed. I don't think you can own your data, because data are not ownable. Okay, data are not property. Because if you think about it, you say, well, I want to own my data on Twitter, what does that mean? My data on Twitter is the electronic record of a bunch of interactions between myself and Twitter and other people who liked and commented on my stuff and on whose stuff I liked and commented. I didn't create it. It was co-created by the actions of all of these different parties. Nobody owns it. It's not an ownable thing. You don't own your data, so let's stop talking about how I get to keep my data. It doesn't make sense. And it's kind of the same thing with privacy. This is a very Blockean kind of point. Remember, it's like what Walter says about reputation. He says, you don't own your reputation, because my reputation is what Leith thinks of me and what Lucas thinks of me and what Karis thinks of me. My reputation is beliefs in your minds. So of course I can't own your minds, right? Likewise, you know, I can respect someone's privacy, or I can try to look through their window and not respect their privacy, but I can't steal their privacy.
I can't take your privacy because privacy is not a thing that can be taken or stolen, okay? Sorry for running over and thank you.