I want to talk to you this afternoon about data, data privacy, and data platforms. This talk is mostly about economics, but I'll throw in some political economy and related issues as well. This is not one of the topics that was much discussed in Carl Menger's day, but I think some basic Austrian principles can generate a lot of insight and help us to think more clearly about these complicated issues that are so much in the public eye today. Within the last year, year and a half, and stepping up in the last few months, there has been increasing attention on questions about privacy in the digital age. What are the big social media platforms, Google, Amazon, et cetera, doing with our data? Are they infringing upon our privacy? Are they making money by selling our data in ways that we didn't consent to? This is a bipartisan issue in the US and, of course, in Europe and elsewhere in the world as well. Europe has been, at least in the last few years, a little bit more strict than the US in terms of data privacy "protection," which I'll put in scare quotes. But you can expect, within the next several months, to see additional action in the US. There will be hearings, there have probably been some hearings already, and there will be more coming from all kinds of directions in the political landscape. The President, of course, is constantly going on about how his tweets are being downgraded and so forth, so this is a big deal. I don't know if any of you are aware that January 28th is now known as International Data Privacy Day. Did you do anything to celebrate on January 28th? I'm not really sure what you're supposed to do, but it must be a big deal because it has its own Wikipedia entry. So let's start with a few very basic concepts that I think will help us to frame this kind of conversation. First of all, what is privacy? And can we have such a thing as a market for privacy?
Well, first question: is privacy a thing, or rather, is privacy an economic good? Can you buy a unit of privacy? Can you sell a unit of privacy? What is the price of some quantity of privacy? I would argue that privacy, per se, is not an economic good. Is privacy important? Yes. Do I like my privacy? Yes. I also think love is important in the world, but love is not an economic good either. Perhaps the best analog to privacy is information. Now, there's been a lot of discussion in economics and related fields over the last few years about how to study information. How do we apply economic analysis to information? A very good book on this is one by Carl Shapiro and Hal Varian called Information Rules. It's a neoclassical economic analysis of information that raises some really important issues. A more recent book by Josh Gans, called Information Wants to Be Shared, talks more about the networking aspects of information sharing. But one of the points that Shapiro and Varian and these other authors make is that information, per se, is not the thing that is bought and sold in markets. Rather, there are information goods and information services that are bought and sold, the things in which information is embodied. Okay, go back to Menger's notion of an economic good. An economic good is something that is scarce, something that can be separated into discrete units, valued on the margin, bought and sold in markets, and so forth. So for example, a book is an information good. Why? Because there is information in a book. You can read a book and obtain the information. Maybe we don't all obtain the same information from reading the same book. And you can buy and sell books. You can do it downstairs in the bookstore. A movie is an information good. So is some kind of communications infrastructure, like a wire or a satellite transmitter or a receiving unit.
Certain kinds of labor could be described as information goods. Hiring a consultant, right? For those of you students getting ready to take the GRE or something, if you sign up for one of those GRE or LSAT or MCAT training courses, that is an information good. In other words, the thing that you buy and sell is not the information itself, but some discrete economic good in which the information is embodied. And I think the same kind of distinction makes sense for privacy. You know, if you're a Hollywood celebrity, when you go out, to avoid the paparazzi you put on a baseball cap and your dark glasses and, I don't know, ratty clothes, and you kind of huddle over so people won't see your face. But of course, the paparazzi always find you, and there you are on, I don't know, what's the modern equivalent of the National Enquirer? TMZ, yeah, TMZ or something, right? So something that covers my face is a good. I can buy sunglasses. I can buy a hat designed to prevent people from knowing who I am, from seeing me. A disguise, a fake nose and mustache, is a privacy good. These shades on the window are privacy goods; they prevent people outside, who didn't pay for Mises U, from peeking in the window and seeing all the great stuff we're doing in here. I guess they don't have access to YouTube or something. Fences, even buying a bunch of land: rich people like to buy a huge property and then put their house in the middle, surrounded by trees where nobody can see in. Okay, those are economic goods, according to Menger's definition. What is the human want that is satisfied by the purchase and consumption of these economic goods? Protecting your privacy, okay? And of course there are digital versions of these: an encryption program, something you install on your computer to block cookies or to scrub your hard drive. BleachBit became famous a few years ago.
That was the software that Hillary Clinton apparently used to wipe clean the server in her basement that had all the incriminating emails and so forth. To the extent that you buy these things from some marketplace or some store, those are privacy goods. So my point is, privacy, like information, is not some kind of spooky, weird, nebulous thing for which we need a whole new economic theory. We can just use conventional Austrian analysis of markets and prices and so forth to analyze information goods or privacy goods just like we can anything else. Again, the thing you buy and sell is not privacy; you buy and sell privacy goods. Given that, entrepreneurs offer us various information and communication goods and services that include varying levels of privacy protection. Okay, so suppose you are really paranoid that the government is reading your emails. I mean, I believe that the NSA can get all of my communications, but they don't bother; they're pretty incompetent and I'm not all that important, right? But if I really cared about it, there are all kinds of specific products that I could purchase, and probably pay a premium for, or sacrifice some other desirable attribute, if I wanted more privacy. Instead of using regular email and texting and so forth, I can use Telegram or various kinds of Tor-encrypted clients. People complain all the time about Google, about how much information Google has about us. Okay, fine: you can use DuckDuckGo, you can use any number of other search providers that do not track you, that do not put cookies on your machine, and so forth.
Well, first of all, I should say that in a free market, we would expect that when you are considering which kinds of communications tools or information goods, or any economic goods, you want to purchase, you would think about the various attributes of that good or service compared to the alternatives: the quality, the ease of use, the value it will add to my business, and so forth. But you would also think of the level of privacy protection as one of those attributes, right? And of course, most communications tools and information tools do disclose something to you about how they handle privacy. The Mises Institute, for example: if you go to Mises.org, there is a privacy statement, which I'm sure you have all read and studied in great detail, right? It describes and discloses the Institute's formal policy on data retention, what information it gets about you, and so forth. And if you read this thing and say, no, no, no, I don't want them knowing my IP address when I go on Mises.org, guess what? There's an easy solution to that. Don't go on Mises.org, okay? But most of us do, because we feel like the benefits we get from going to Mises.org vastly outweigh the risk that, you know, Jeff Deist is back there looking at your screen or whatever. I should say as an aside: if you just think about the Austrian concept of demonstrated preference, just looking at the data, demonstrated preference suggests that much as we all talk and complain and whine about privacy and how we are worried about big tech spying on us, yada, yada, most of us are really not willing to give up almost anything for a little bit of extra privacy. Few people use those super-encrypted tools.
You know, if I want to send a message to Lew Rockwell and I want to be sure that nobody is going to read it, okay, I can handwrite it using some private code that Lew and I have developed, put it on a little piece of paper, and strap it to the back of a trained carrier pigeon, which will fly to Lew Rockwell's office and deposit the little thing. But why would I do that? That is way harder, way less convenient, and has all kinds of other drawbacks compared to just calling him, even though somebody might be listening in, or sending him a regular encrypted email or whatever. If you don't want people to read your electronic diary, well, you could just write it on paper and put it in a safe or something, or try to memorize it, whatever. I just saw a thing from a friend of mine, Alec Stapp, two days ago, pointing me to a New York Times article on DuckDuckGo which indicates that hardly anybody uses it because the search results aren't as good, okay? Much as we hate Google, it's just more convenient; it works better. The point is, there's not any sort of so-called market failure here. If people really valued privacy over other attributes like the accuracy and quality and usefulness of the search results, then we would expect there to be greater use of products like DuckDuckGo, but in fact, hardly anybody uses those, okay? Some other empirical facts are interesting here. I like the way Bryan Caplan put it in a recent EconLib article. In fact, much as we complain, we actually have a lot of privacy today compared to most other periods in human history. Say I want to buy some weird, embarrassing product; I don't know, say I want to read some obscure Keynesian tract because I'm secretly really into Keynes and I don't want you guys to know about it, right?
In the old days, I'd have to go to a bookstore or the library and look over my shoulder. I guess I'd have the sunglasses and the fake nose on, et cetera. Am I worried that when I order on Amazon, Jeff Bezos knows? Yeah, I guess so, but lots of people who otherwise would know, don't. I can buy some embarrassing product or watch some video that I would not want everyone to know I'm watching from the comfort of my own home, in a way that was not possible in any previous period of human history. So if you put it in proper perspective, in many ways we actually have more privacy now than we did in the pre-Internet, pre-Amazon, pre-Google era. And in fact, if you want to put this in technical economics jargon, there are huge efficiency gains from matching buyers and sellers. Uber, Airbnb, eBay, Alibaba, Rover: any number of these sharing platforms allow us to soak up excess capacity by letting people borrow fixed resources that would otherwise be idle, and so forth. Meeting people online, right? Dating sites provide a great platform for people to interact and match up. I saw a chart recently, I forgot to screenshot it for you, looking at people in different age groups, or maybe in different periods of time: where did you meet your spouse? In a bar, at work, through mutual friends, whatever. And the category "met online" is really trending steeply upward. Back when I was young, if you said you met somebody online, that was weird, okay? Now it's totally normal. Lots of people meet their spouses online. It's perfectly fine. Also keep in mind, the evidence suggests there are huge cost savings from synthesizing and organizing our information and keeping it online, savings that more than outweigh what most of us are worried about in terms of loss of privacy.
Okay, I don't know if you guys use any kind of note-taking app, Evernote or OneNote or something like that, to organize your school notes or whatever. When I was in college, everybody had all these pieces of paper. You'd lose your paper notes; they'd get out of order or whatever. If you have your notes in electronic form, on Dropbox or something: oh well, they're not secure, somebody can hack into Dropbox and get my class notes from Peter Klein's privacy lecture. Yeah, that's a risk, but it's greatly outweighed by the convenience and security of having all of your electronic information in one place. There's a lot of concern about digital medical records, right? People say, well, gosh, it would be so much better if I could have all my medical information on a little chip, on a USB drive, on my phone, whatever, for when I go to the doctor. I had some recent health issues where I had to go to lots of different doctors, and even though they were all part of the same network of hospitals, every time I went to see a new specialist I had to spend like fifteen minutes filling out these stupid forms: my address, my date of birth, my insurance information. None of that had changed, yet there was no way for me just to click or tap or stick a USB drive in something and have all my information there. Why? Because people are paranoid that your digital medical information is not secure. Well, yeah, that's a risk, but it's also super convenient to have it all in one place. Also, public policy people have worried that companies will use our digital information to engage in so-called price discrimination: charging different prices to different users for the same thing based on your ability to pay, your willingness to pay, and so forth. There's really very little evidence that any online vendors are doing that.
Of course, when you go to Amazon, what shows up on your home page as "here's some stuff you might want to buy" depends on your personal information, right? But there's no evidence that consumers are somehow being taken advantage of and charged different prices based on willingness to pay. Okay, so what about regulation? What about attempts by the government to protect our privacy, to increase our privacy, and so forth? Well, the kinds of things that the U.S. Congress is now discussing, and that European and other regulators are discussing, some of which have been implemented, include, for example, rules that mandate privacy protection. There are proposed laws that would say, okay, Twitter, Facebook, Instagram, whatever: it is illegal for them to share any of your information with a third-party advertising firm or with third-party merchants, people who want to sell you stuff, et cetera, okay? Other rules that have been proposed would impose mandatory data portability. In other words, if I get tired of, I don't know, Instagram, and I want to switch to some other platform that does the same kinds of things Instagram does, then Instagram, by law, would have to let me download all of my data in some kind of universal database format. Then when I joined some other platform, I could upload it and it would seamlessly give me a history on that other platform. Just thinking about that, the technical aspects alone are kind of challenging, right? But there are proposed laws that say all user data should be stored in some universal, portable format so that you're not locked into any one platform. Some people maybe don't want to delete their Facebook account, even though they're mad at Facebook, because they don't want to lose all that great stuff they have from before, all their previous photos and all that.
If you could just switch to some other platform and preserve all that, that'd be great. So the argument goes. Also, some very recent proposed legislation wants the FTC or a similar agency to impose very stiff fines on platforms for any data breaches, like the Cambridge Analytica scandal at Facebook. Lots of online companies and brick-and-mortar firms have had data breaches; you've heard about these. Sometimes you get a call or an email: oh, if you swiped your credit card at JC Penney within the last year, you need to check your credit card statement because there might be fraudulent charges on there. One thing that's interesting is that the market is quite harsh in penalizing data breaches, especially for financial firms. Wells Fargo had a big fraud and data breach case a few years ago, and the share price of Wells Fargo went way, way down. And of course, if we find out that a merchant with whom we are dealing is not protecting our credit card information or whatever, we're much less likely to use that merchant in the future. But also, just from a regulatory and legal perspective, the issue here is: do we need Congress or an executive branch agency to legislate some kind of penalties for data breaches, or can we rely on something like the common law? In other words, when I upload my data to my bank, JP Morgan Chase or something, there's a click-through license, right? There's some contract. I agree to abide by certain terms, and the bank agrees that my information will be kept secure and so forth. If there's a data breach, well, that's just like if you took your gold to the gold warehouse, so you're carrying around little paper tickets instead of gold bullion or gold coin, and somebody breaks into the warehouse and steals your stuff from your deposit box. Okay, well, that's just like any other kind of theft.
It's like if I store my furniture at a self-storage place and it gets broken into, and somebody steals my furniture. We don't need a regulation, we don't need legislation or some kind of executive action, to say that if Peter's stuff gets stolen from the self-storage place, the government is going to fine that self-storage company. No: I would have a legal action against the firm for breach of contract, right? We signed a contract that you would keep the door locked; you left it unlocked; my stuff got stolen; you've got to reimburse me for my stuff. We already use those kinds of remedies to deal with property rights violations and breaches of contract. This would be just a breach of contract like any other, okay? Also, it's worth realizing that these kinds of proposals have pretty substantial costs as well as prospective benefits, okay? It's possible that these kinds of stricter rules about privacy protection might make using services more valuable. I'm more likely to use some online service or social media platform if I know that the government is forcing them to take better care of my data, to protect my data. But as I've already suggested, the market can sort this out, in the sense that if you and I value stronger protection for our information, then it should be in the interest of some entrepreneur to provide a verifiably safer method of storing my information and then charge a premium for that, okay? So it's not clear why you would need government intervention to realize those benefits. But what people haven't talked about that much is that there are also a lot of costs associated with these kinds of remedies. Look, a lot of us are uncomfortable with the fact that Google has so much information about us.
Now, you worry about the state: I worry about the NSA knowing everything about me, I worry about the FBI listening in on my phone calls and reading my emails without a warrant, and all the other kinds of black agencies that I don't even know about, right? Why am I worried about them? What do I think they could do to me? Well, they could kill me, right? Or put me in jail, or do something really horrible to me. What is the worst thing that Google can do to me with all that information Google has? Why does Google want that information? So that it can give it to advertisers who want to advertise stuff that I might want to buy. Okay, that's the horrible thing that Google is doing to me: it's making it easier for people to try to sell me stuff. Now, I don't like getting sales calls in the middle of the night any more than you do, or junk mail, or spam email, but in the grand scheme of things, that's not all that terrible. People want to sell me stuff, okay? Why is it significant that that's the kind of business model firms like Google use? Well, as I think I mentioned in my monopoly lecture, think of all those fantastic services you get from companies like Google: Gmail, Google Maps, Search, obviously, and all the extra applications and tools provided by firms like Google. How much do we pay for most of those? Zero, right? We don't pay a dime, because the business model, of course, is that you give away the services for free to the end user, that's us, and then you make money by selling advertising targeted at those users. It's not really a new business model. It's the business model that commercial radio operators figured out and perfected in the 1920s and 1930s. Broadcast radio and broadcast TV are free if you've got the equipment; you just have to listen to ads. That's how it gets paid for.
If it's illegal for Google and Twitter and Facebook and so forth to use our information, to feed it into algorithms that will target advertisements to us, well, we're not going to get that stuff for free anymore. They've got to come up with another business model. It might involve paying service fees or user fees, okay? It might involve different kinds of advertising, maybe more intrusive, more unpleasant kinds of advertising. Are we willing to give up the low cost and convenience of using these services in exchange for some other kind of business model that maybe offers better privacy protection, but for which we have to pay or suffer some other inconvenience? Again, it's like the example with DuckDuckGo. I suspect that most people would not consider it an improvement if you had to pay for every Google search because Google had no other way to make money, okay? A good example of how many of these mandatory data protection rules and laws tend to work is given by the European GDPR. How many Europeans do we have here? If you've been in Europe, or if you've visited any European websites, you might have noticed you have to click through a lot more annoying pop-ups to give your consent to lots of different things. I consent to the cookies, I consent to this, I consent to that. Just show me the stupid article; I'm tired of consenting, right? A lot of that is GDPR rules. So the GDPR is an EU-level set of policies that imposes stricter legal requirements, partly on disclosure, right? Companies have to be more transparent to us about how they're using our information, and there are greater penalties and so forth for potential misuse of that information. What has been the effect of the GDPR on the tech sector in Europe?
Oh, by the way, here's another example that, in my mind, makes it ridiculous. I'm involved with an academic society that holds a big annual conference with several thousand people, and I was involved in putting the program together. People who want to present their papers submit them in a computer system, and those papers are assigned to anonymous reviewers who read them and provide comments and ratings, which get fed back into another system, and then you select the papers that will be on the program. I had to manage a lot of that. At one point I asked the back-office people, look, I need to contact some of these reviewers who haven't turned in their reviews on time, or I need to contact authors about these issues. Can I just get a list of everybody who submitted something or everybody who agreed to review? Can you just give me a spreadsheet with their names and email addresses so I can easily and conveniently contact the people I need to follow up with? Nope. The GDPR made it illegal for the back office of this organization to send me a spreadsheet with the names and email addresses of all the submitters. Why? Because when people submitted their papers or signed up to be anonymous reviewers, apparently there was no box where they could consent to have their email addresses given to Peter Klein, or whoever was doing the Peter Klein function. Everybody I was involved with in this conversation agreed that it was completely ridiculous, but that was the nature of the GDPR, right? So remember, legislation, or government intervention through the executive, is a very blunt tool, and when these rules are imposed, you really have no idea how they will actually work out in practice. But the legislators and executive officials don't care. They just impose the whole thing on you. There have been a number of studies I've seen in recent months on the effects of the GDPR.
I think it's been in place for about two years, and one effect was fairly easy to anticipate, especially if you were in my lecture this morning on big business. What kinds of firms would you expect to benefit from the kinds of privacy rules and restrictions imposed by the GDPR: small, struggling startups that barely have enough cash to pay salaries each month, or big, powerful, deep-pocketed tech platforms? The latter, right? There are a lot of studies showing that the primary beneficiaries of these stricter privacy rules are Google, Facebook, Amazon, Twitter, the big platforms and the big providers, because they can much more easily comply with the GDPR than smaller, newer firms, okay? It's the argument I discussed this morning: raising rivals' costs. These kinds of privacy restrictions impose a differential burden on small firms. So who do you think is speaking to Congress in Washington and giving either explicit or maybe implicit support for a GDPR kind of thing in the US? Google, Facebook, Twitter, and so forth. Let me digress for a moment to talk about something I've been thinking about a lot, and that is the infamous Section 230 of the Communications Decency Act. You might be thinking, what the heck is he talking about? It turns out to play a very important role in some contemporary debates. I haven't quite made up my mind on the analysis here, but I've got some preliminary thoughts that I'll share with you. The issue here is recent concerns about so-called content neutrality. Conservatives and libertarians are usually pretty upset with Facebook, Twitter, Instagram, I guess Google, and other social media platforms for displaying a kind of political bias, right? You've heard about all of the figures that have been de-platformed, banned explicitly, or shadow-banned. Poor Alex Jones, right? Infowars was banned from Facebook, banned from Twitter, banned from YouTube, its app thrown out of the App Store, and so forth.
I don't know if any of you have used any of the sort of social media platforms that don't regulate content, like Gab, the social media network which in the mainstream press is called the Nazi network, but which calls itself the free speech network because it has a policy that no content is monitored or regulated at all. I think only direct personal threats to harm a specific individual can be removed. Other than that, you can post stuff that's pro-communism, pro-fascism, whatever; the whole idea is that it's a completely unmoderated platform. The way you moderate is, if somebody says something that offends you or bothers you, I forget what they call it, you click mute or block and you never see that person again, okay? Gab had an iOS app and an Android app; those were quickly banned by Apple and Google, respectively, from their app stores. Then PayPal and the other big payment systems blocked Gab from accessing any of their services to collect money from premium users and so forth. So, as I'm sure you guys have heard, there are a lot of complaints that because of political bias, websites and other content that is not politically correct or whatever can easily be pushed off of social media platforms, okay? Like I think I mentioned before, even the president complains that Google is shadow-banning him, okay? He says that if you search for Trump news, all you get is the fake news media. It's all rigged, all caps. It is rigged, and this stuff is bad. Fake CNN is getting... anyway, you guys have heard these kinds of concerns, right? So there are several proposals in Congress. Josh Hawley, who's sort of a rising star in the Republican Party, a senator from Missouri, has a proposal that would essentially make it illegal for Facebook and Twitter to discriminate based on political content, okay? I mean, why would you worry about this kind of stuff at all?
I mean, would you really think that Jack Dorsey, the CEO of Twitter, a guy who looks like that, could possibly be biased against libertarians and conservatives and traditionalists? No, that would never happen. The root of all this is a very important piece of legislation in the early history of the internet: the Communications Decency Act, which was passed by the US Congress in 1996. The CDA was a response to concerns that began to arise in the early 90s, when the precursors to today's social media platforms and internet sites began to emerge: Prodigy, America Online, CompuServe. Most of you weren't born then, but maybe you've read about these things in the history books. There used to be dial-up internet, and the way people got access to information and so forth was not on websites like today, but by dialing into these proprietary communities in which you could share photos, have chat rooms, and visit things that were like web pages. Around this time, there was a big push by people in Congress and by activists who were concerned about harmful content on these networks, these pre-internet or early internet kinds of networks, electronic bulletin boards and so forth. In particular, child pornography, or pornography in general. Tipper Gore, the wife of Al Gore, the senator who became vice president under Bill Clinton, led a sort of public relations campaign claiming that there was all kinds of bad stuff on the internet, that we need to protect children from the internet, and so forth. So the Communications Decency Act was passed to hold hosting companies potentially liable for harmful, indecent content posted on their networks. Now, Prodigy, AOL, CompuServe and so forth had argued, well, we really don't have the capability to police all the information that's on our networks. We're not like a newspaper.
So if you look at the Wall Street Journal or the New York Times, with the possible exception of the classified ads, all the content in those newspapers was reviewed and edited; there's some sort of curation involved. So if something libelous or slanderous or obscene or pornographic is in the New York Times, someone who claims to have been harmed by it can sue the publisher of the New York Times, even if the New York Times itself didn't write the content. Some journalist wrote it, maybe some freelance reporter wrote it, but you published it in your curated newspaper, therefore you're responsible. Well, the platforms argued, we're not really like that; we're just like a bulletin board where anybody can post something. We don't really have control over it, so you can't really hold us liable. So the telecom companies managed to get an additional piece put into the Communications Decency Act, called Section 230, that essentially gave online computer platforms an exemption from the standard common law liability rules. The standard common law distinction is between what you might call, and what the law typically did call, a publisher and a so-called common carrier. So if I own a physical bulletin board or a chalkboard and I put it in downtown Auburn at Toomer's Corner and I say, this is the free speech chalkboard, come and write what you want, I just provide the chalkboard and some chalk and then I walk away, then I'm not choosing who gets to write on it. Anybody can write whatever they want; all I'm doing is providing the infrastructure on which the writing can take place, but I don't have any control over the content. So it's kind of like if somebody calls you and threatens you in a phone call, you can't sue the phone company, right?
You can take action against the person who threatened you or harassed you, but under traditional telecommunications law, for something like the telephone network where anybody can pick up the phone and call anybody, the phone company was not deciding whether or not you could make a call based on what you were going to say or who you were, right? So the platforms said, we're just a neutral platform, we're not responsible for content, as opposed to a publisher like the New York Times or Mises.org or Arlington House Books, which choose what to publish. So the idea was that a publisher can be held liable in court for material that is obscene or defamatory or whatever, because the publisher is controlling the content, whereas a common carrier cannot be held liable for harmful content, because the carrier is not exercising any control over that content. Section 230 of the CDA basically wiped out that distinction and said, hey, Prodigy, AOL, CompuServe and so forth, feel free to go in there and moderate all you want. Again, at the time the concern was not getting rid of Alex Jones; the concern was getting rid of pornography, or threats, or whatever kind of inappropriate or obscene material might be out there. Go ahead and be as aggressive as you want in removing that stuff. You're not responsible if you miss one and something is still there and hurts somebody. The law will treat you like a common carrier, but we still want you to go in and moderate and get rid of bad stuff. Okay, of course the issue now is not so much about direct personal threats. Facebook has these so-called community standards, which are so vaguely written that it's impossible to know what does or does not violate them, and I'm sure you all have plenty of examples of content that was blocked or banned, or users who were kicked off, for saying things that might be out of the mainstream but don't seem to satisfy any objective criteria of being threatening or harmful.
Of course, we've got the whole idea nowadays on college campuses that speech is equivalent to violence, you know that. So if you say mean words that offend me, that's basically the same as violence, so I'm justified in bashing you over the head in my Antifa garb or whatever. People ask me, am I personally worried about stuff I put on social media? Well, as long as Tom Woods is still on Twitter, as long as Lew Rockwell is out there, I feel like I'm safe, but you never know. Anyway, critics of Section 230, sort of libertarian critics, have said Section 230 is basically like a subsidy. I don't think subsidy is really the technically correct term, since it's not cash, but it's like a form of protectionism. It gives online publishers special rights that other kinds of publishers do not enjoy, and therefore Section 230 enabled the growth and the rise of the big tech firms that everybody's now complaining about; maybe in a pure free market, if we didn't have something like Section 230, the internet would have evolved in a different way. Again, it's a little bit ambiguous, but I think this is something that people in our circles really need to be looking at very closely. My view has been, whatever concerns people have about who should be held liable in a particular case (is it Twitter's fault if somebody says something on Twitter that is harmful to me? Were they technically capable of removing that content and fell down on the job? Did they fail to remove that content because they're out to get me?), these are questions that can be decided by the courts, through the common law, rather than through sort of top-down legislation. So here I'm appealing to Hayek's distinction between what Hayek calls nomos and thesis, right?
Nomos is the bottom-up, market-oriented, common law approach to these sorts of problems, and thesis is the top-down legislative, statutory remedy, and I much prefer the bottom-up to the top-down. Okay, so last thing: every time somebody says, well, the big tech platforms are bad because they're stealing your data, they're taking your data, I always stop for a moment. Do you own your data? My answer is no, you don't own your data, if we define that properly. Of course there's a whole libertarian literature on intellectual property: are ideas property? Many of us argue, like I said before, that a book is an economic good, and if somebody takes your book, that's theft. An idea is not an economic good, and if somebody copies your idea, that is not theft, because you can't own ideas. You can own books, and you can have some data on a hard drive; you own the hard drive, but you don't own information per se. You own your computer, and you have some legal rights that you've specified by contract about storage services or data transmission services. I have a thing that I've signed from Dropbox; my contract with Dropbox is that they can't delete my data without telling me, or whatever. But that's just a contractual stipulation. I say I don't own my data because, what is my data? What is my data on Facebook? Well, it's my reputation; it's what other people think about me. Ah, okay, you may have seen Walter Block's arguments about things like libel and defamation. Walter Block says that in a libertarian society you would not be able to sue somebody for saying a bad thing about you, because that's not a violation of your person or property. Likewise, do I own my reputation? I hope that I have a decent reputation in our circles and in the community where I live, but no, I don't own my reputation, because my reputation is what other people think about me. It's ideas that are in other people's heads.
Likewise, if I'm sitting in my bedroom at home, doing whatever, and somebody comes and opens the blinds and peeps through my window, we would consider that a violation of privacy, because for what I do in the privacy of my own home, I have a reasonable expectation, especially if I have the blinds closed, that other people will not see in. If I'm walking down the street here in Auburn, Alabama, wearing a Mises hat, and somebody says, hey, guess what everybody, there's Peter Klein with a Mises hat, that's not a violation of my privacy, because I'm walking down the street and I don't have any reasonable expectation of privacy in a public place. If I write something on Twitter, or I post something on Facebook, and you copy it, or you tell somebody else about it, or you record my activity on Twitter and then sell it to an advertiser, is that like peeping through my bedroom window? I would say no; that's like observing me walking down the street, right? For stuff I post on social media, I do not have a reasonable expectation that no one else will see it, draw any conclusions from it, or take any action based on it. Okay, so the stuff that Facebook's advertisers know about me is not property that I own; that's information that is embodied in goods and services that they own. So you can steal somebody's stuff, and philosophically you can respect someone's privacy or not, but you can't steal someone's privacy, so we should stop talking about platforms as if that's the kind of thing that they do. Okay, thanks very much. Thank you.