OK, good afternoon, everybody. I want to talk to you a little bit this afternoon about data privacy. The original title is the Economics of Data Privacy, but I want to go a little beyond economic analysis per se and talk about some of the broader political, policy, and even conceptual issues related to privacy. It's kind of funny that within less than a decade we've gone from being really excited about social media, about the fact that our devices can interact with us in a lot of valuable ways and that things are really customized to us, so that when you go to Amazon, you get a home page with reading recommendations that are specific to you. That's great, but now people are getting a little uncomfortable with the amount of information that your devices know about you. I have an Android phone. I don't know if this is also true for Apple, but for those of you with Android, a lot of times you'll just be walking down the street and you'll get a notification from Google: I'm walking by a McDonald's, and it asks, how did you enjoy your experience at McDonald's today? Just this morning I was looking at my phone and it said, people around you are talking about the Mises Institute. Would you like to comment on the Mises Institute? Wait a minute, what's going on here? Now we're a little creeped out by that kind of stuff. And again, within a very short time, we went from being really enthusiastic about this technology to suddenly having some concerns and some doubts. I started thinking about doing a lecture on this topic back in the spring, when Mark Zuckerberg was appearing before Congress to talk about whether Facebook had allowed Russian bots to steer the election toward Trump, but also, more generally, whether Facebook was doing enough to protect its users' data privacy.
You may remember the details of this scandal, or at least this Washington, DC scandal. For years, part of Facebook's business model, of course, has been that it compiles profiles of users based on their likes and their browsing habits, what pages they visit, their comments, and so forth. Then it packages that information and uses it to target advertisements to individuals. That's the main way Facebook raises revenue, but there are also third-party app developers who can get access to your data. I don't know if you ever wonder why some programmer took the time to create the What Disney Princess Are You quiz. It wasn't just for fun. It's done so that when you authorize that app to access your profile, so you can find out what Disney princess you are (I wonder what David Gordon got), you're letting the app developer have access to some information from your Facebook profile, which the developer can then use for marketing purposes, sell to third parties, and so forth. So there was this issue that a researcher had developed an app that extracted some user information and then gave it to Cambridge Analytica, a political consulting firm, possibly in violation of Facebook's own internal rules about transferring this information. This firm consulted with the Trump campaign and so forth. There's no evidence that any of this had an impact on the election, but it made people very nervous. It made politicians nervous, of course: the threat that someone could interfere in the electoral process and the democratic process. And so Zuckerberg was hauled before Congress to explain what procedures Facebook is taking to protect people's privacy. So we still like our apps, we still like our personalized devices, but we're now very nervous about who knows what about us. And again, for the purposes of this lecture, I'm only talking about private commercial use of information.
I'm not talking about the National Security Agency, not talking about what the government knows or doesn't know about you, which is also, of course, an extremely important topic, but one that goes beyond the scope of my lecture for today. I don't know about you, but if I can't remember my password to something, I just kind of shout it out and hope the NSA will help: could you please tell me my password? Because they know everything. I don't know if you noticed, but earlier this month, around the 4th of July, was the Plane Bae episode. I don't know how many of you were paying attention to this; I only found out about it later. Apparently, over the July 4th weekend, there was this woman, Rosey something-or-other, an aspiring screenwriter. She was on a flight, I think from New York to Dallas, and she asked to switch seats with another passenger so she could sit next to her boyfriend. That's her on the right, with the boyfriend. The young woman she switched seats with sat in the seat in front of her, next to an attractive young man, and the two of them struck up a conversation, and the screenwriter started live-tweeting it. So she live-tweeted this romance between two strangers on a plane, and her tweets went viral and got hundreds of thousands of retweets, and celebrities, everybody, was talking about it. The guy, that's him right there, is apparently a retired former professional athlete, I think a soccer player, and kind of a wannabe celebrity himself. He became known as #PlaneBae, and she became known as #PrettyPlaneGirl. There were a whole bunch of articles about this; it was on the Today show; it was a big deal. And then the young woman, who is right there, complained that she had been, I guess you'd call it, doxxed.
So various internet sleuths, although the person live-tweeting the romance had tried to cover up everybody's faces, dug around, found this person's Instagram account and so forth, and revealed her identity. Apparently she got a lot of harassing calls, texts, whatever. She claims she had to close all her social media accounts, and I don't know if she lost her job or whatever, but she claimed she was being harassed online because of this incident. And she says there's been a gross violation of my privacy: I'm sitting on a plane, talking to the guy next to me; it's none of anybody's business what we're doing, and now I've been harmed because somebody tweeted about this and it went viral. Apparently she's threatening litigation, presumably, I don't know, against Twitter or against the person who did the tweeting. And now all these celebrities, there's good old Monica Lewinsky, all these celebrities who retweeted and wrote about the on-the-plane romance are apologizing for doxxing this person. So what privacy rights do you have on a plane? Is it okay for another passenger to take your picture? Is it okay for somebody to tweet about you without your consent? From a legal point of view, these were not issues that were dealt with in the common law, where they didn't have social media, right? So the courts are working out the boundaries of this. But let's try to do some analysis and see what we can say from the point of view of Austrian economics about what privacy is, how privacy protection might work in a free market, and how various policy proposals about privacy would work, what sort of impact they would have. So let's just start by talking about privacy itself. Is privacy a thing? Or, to put it in Mengerian terms, is privacy an economic good?
Well, there's not a lot of literature on this, but there is some analogous literature going back about twenty years now on so-called information goods. There's an influential book by Carl Shapiro and Hal Varian, published in 1998, I think, called Information Rules. It's a very good mainstream economic analysis of markets for information. So information goods are online directories, profiles about you, how-to guides, YouTube videos, and so forth: things that embody bits of information. A more recent book, also pretty good, by Joshua Gans, called Information Wants to Be Shared, is an updated analysis of how some markets for information goods work. The problem is that information is not an economic good in Carl Menger's sense of what it means to be an economic good. Remember, for Austrian economists, an economic good or service is something that can be parceled into discrete units, can be bought and sold, can be exchanged on markets, can have a price. There has to be something that can be owned, that can be exchanged, that can be given away, and so forth. Information per se is not an economic good. However, there are goods and services that you can buy and sell on the market that are related to information or that embody information, like a book. Books are economic goods. There's a bunch of them down there in the bookstore, and Chad, sorry, Brandon, or Drew, will happily take your money in exchange for these books. So books are economic goods; movies are economic goods; movie rentals, movie purchases, tickets to a movie theater. Networks are economic goods; a wire is an economic good; a satellite dish is an economic good; a server is an economic good. And there are information-related services that can be bought and sold on the market just like other kinds of labor services, like consulting.
When a company hires a consultant, what are they hiring? Well, it's not physical labor; they're hiring the knowledge of the consultant, or they're paying the consultant to do some research, obtain some new knowledge, and pass it on to the client. If you hire an attorney or a stockbroker, you're hiring that person to gain access to a certain kind of knowledge. You take a class, sign up for an online course, or enroll at a university; you're exchanging dollars for what? In an abstract way, you're buying knowledge, but if you enroll at a university, technically speaking, you're not buying knowledge. In exchange for your tuition dollars, you get the legal right to walk into a certain room, sit there and listen to what somebody is saying, take an examination, and get some kind of certification. There's a specific, discrete bundle of goods and services, legal rights and responsibilities, that you obtain in exchange for your money. Why are you doing it? Well, to get smart; I want knowledge. Yes, colloquially we can say, oh, I'm buying knowledge, but technically speaking, it's not knowledge per se that is traded. It's specific knowledge-related goods and services. Likewise, privacy is not an economic good. You cannot buy a unit of privacy on the market. You don't own privacy. You can't compute the marginal utility of privacy. Privacy is not a thing in itself. Now, there are what we might call, by analogy to information goods, privacy goods: the sunglasses that Hollywood celebrities wear when they go out so no one will recognize them, though of course they merely draw attention to them, right? Okay, we can call that a privacy good if we want. If I go out and get a fake mustache and a fake beard and a wig so that I look entirely different, I look like, I don't know, Bob Murphy with that weird beard, and I get one of those fake bald caps or whatever.
Yeah, okay, so the fake beard is an economic good, just like a bottle of water or an automobile or anything. Window shades, if you have these shades at your house, call them a privacy good if you want. A fence that people can't see over. Or how about just buying a lot of land, so I can build my house way back from the public road, so no one can see my house, or into my house, or what's behind my house? You could say, oh, I'm buying this land just because I want a buffer between myself and my neighbor, but you can call that a privacy good if you want. In digital space there are lots of goods and services that are economic goods, that are exchanged in markets, and that are related to privacy. Encryption software is an economic good. So is an add-on that blocks cookies in your browser. Has anybody heard of BleachBit? This is a program that wipes your hard drive completely clean, so that even a forensic computer expert cannot extract any information from it. It became famous a year and a half ago because it's the software that was used on Hillary Clinton's private email server before it was turned over to the FBI or whatever. So yes, you can buy stuff that will protect your privacy. And my point is, we would analyze those things the same way we would analyze a bushel of wheat or a chair or a computer or a car or any good or service. We don't need a special kind of economics, a different body of economic theory, to explain the price and quantity and market characteristics of BleachBit. I don't know how much BleachBit costs; you can probably get a demo version for free and pay more for the premium version. In the old days it would come on a disk; now you just get the legal right to download it. We can ask what the equilibrium price is in the short run, what Mises would call the plain-state-of-rest price.
We can imagine what would happen to that price in the long run. We can ask what would happen to the owners of the BleachBit servers in the evenly rotating economy, and so forth. We just theorize about it the same way we would about any economic good or service. So there's nothing unique about privacy in this respect. So what do we actually see on the market? Well, because consumers place some value on privacy as an abstract concept, we often see goods and services on the market that are bundled in a certain way so that they include varying degrees of privacy protection. Normally when you buy a house, there might be trees in front that block the windows, or it might come with a fenced backyard. You could say, oh, I'm getting the privacy of the backyard that can't be seen from the street as part of the house; or maybe you add that on later, once you've bought the house. But if you think about the electronic space, you can buy devices with various levels of encryption. You can buy encrypted software. There are all kinds of ways you can protect your privacy, depending on which apps you use. Don't send regular text messages: WhatsApp is supposedly encrypted; Apple Messages is encrypted. Would they give you up to the feds if the feds are coming for you? Maybe they would. But those of you who are really into it, and of course at a libertarian gathering there are always a lot of people who are really into this stuff, you can use Tor or whatever. There are all kinds of, probably even illegal, ways you can encrypt your information before you send out a chat to your friend about what movie you're going to tonight. If you don't like the fact that Google is tracking you and that Google knows so much about you, well, don't use Google as your search engine. Don't use Chrome as your browser.
Use DuckDuckGo as your search engine, because this is a search engine that doesn't track your searches and so forth. If you don't like Facebook, there's MeWe. There have been a lot of other attempts to create social networks with stricter privacy protections than Facebook. A lot of it, of course, depends on trust between consumers and providers. If you believe that Facebook would not follow through with its terms, would not protect you in the way it has claimed to protect you, then regardless of what it says on Facebook's privacy protection page, you can choose to use a different product that you believe will protect your privacy more. So the point is, you're not compelled to use social media apps of any particular type that do or do not protect your privacy in any particular way. This is something that gets worked out on the market just like anything else. And of course, it might be that people don't care about this at all. I think in the early days of social media, most people didn't think about it and didn't care about it. Now it's a big deal; people care about it more. So entrepreneurs have an incentive to be more explicit about their privacy protection and to follow through on whatever claims they make. Almost every website you go to now has some information about privacy. In fact, look at this crazy outfit: even it has an explicit statement about how it will or will not use your personal data. And people who care about these things can look very carefully. If you're not comfortable with the privacy policy of this particular organization, or you don't believe that this organization will follow through on its claimed privacy protections, you can go to some other, horrible website instead of this wonderful one. Now, just as an empirical fact, Rothbardian demonstrated preference suggests, to me anyway, that on the margin consumers don't care about privacy as much as they say they do.
People are not really willing to pay much on the margin for additional privacy protection. Now, there's been some reaction by incumbents to increasing privacy concerns: again, making their policies more transparent, and some chat apps have increased their level of default encryption and so forth. But there are lots of options for people to load up on extra privacy, or to choose bundled services that include higher levels of privacy but maybe lower-quality versions of some other attribute, and people don't seem to be switching. Besides MeWe, there was another social network called Ello that was popular a few years ago, a completely non-ad-supported site that really emphasized privacy, and it failed. I suppose MeWe will probably flop too. Back in the spring, I was seeing #DeleteFacebook all over the place. Oh, we hate that Zuckerberg guy; he sold us out to the Russians; we're going to show him and delete our Facebook profiles. And not that many people did, okay? A lot of people, especially conservatives and some libertarians, are really upset at Facebook, sorry, at Twitter, because the founder of Twitter, Jack what's-his-name, Jack Dorsey, is very openly progressive, I think he's a Bernie-type supporter, and there are a lot of claims that Twitter deliberately removed the blue check marks from right-wing accounts, and that they're hiding things: they don't use just a regular linear feed, it's an algorithmic feed, and the claim is that they're demoting the tweets of people with politically incorrect views and so forth. If that's all true, hey, there's a simple solution: delete, okay? Delete your Twitter, don't use Twitter, use something else. Not that many people seem to be switching.
Look, if you really, really care about your privacy, if you don't want the NSA to have any access to your encrypted data stream because you think Facebook built in a back door and gave the keys to the feds, well, you could use a carrier pigeon, right? If I want to give some information to Timothy back there, I can write it on a little piece of paper, stick it in a capsule, put it on a pigeon, and send the pigeon to Timothy. I guess the pigeon could still be captured by, like, a CIA drone or something, but it'd be harder to get that into the NSA database. You're keeping a personal journal and you're worried that it's on Google Docs, and Google's going to take it and give it to the Russians? Well, you could put it on a piece of paper. For a while this is what they called the hipster PDA, back when PDA meant personal digital assistant. If you really want to be a hipster, you keep all your personal information on pieces of paper held together with a clip, and you write with a pen. Now, my point is, if you send out surveys asking how much do you care about privacy, are you worried that various internet firms are not protecting your privacy strongly enough, people will say, oh, I'm very worried about it. But in their actions, they demonstrate that they don't seem to be that worried about it. Or rather, on the margin, they're not willing to sacrifice the benefits of being on a social media platform that all their friends are on in exchange for greater privacy protection, okay? Now, one caveat to this: some people say, well, when you install a new app or any piece of software on your phone or tablet, and you open it for the first time, there are all these pages of legalese: accept, click yes, go on. How many of those have you actually read? None, right?
Nobody reads these software licenses, these so-called click-through licenses. Some people have argued, well, if people really did read all this stuff, they would be very uncomfortable with what these companies claim they have the right to do with your data, et cetera. And so it's not that people willingly consent to giving up personal information to social media or whatever; it's that they don't know, they don't understand, that Facebook has the right to do this. Now, Facebook has always had the right to construct a profile of you based on your habits on Facebook and use that to target advertisements to you. That's the whole point. That's how it works; that's the business model. And it's very explicit in the click-through license that that's what Facebook is going to do. But if you think nobody reads those, well, then maybe those shouldn't count. I don't think that's a very good argument, because there's been some litigation in recent years about whether consumers should be bound by conditions they agreed to when they clicked okay to install an app. And the courts have ruled, and again, this is not all that new; it's a common-law question: what does it mean to consent to a contractual relationship? There has to be a certain level of awareness on the part of the person who is consenting, and there are certain conditions under which courts have ruled for centuries, no, you're really not bound by the terms of this contract, because you were incapable of understanding the terms you agreed to. So, in other words, it's not the case that these licenses de facto allow firms to do things that make people very uncomfortable. Courts are typically applying a kind of reasonableness standard and not holding consumers to every single statement in the click-through license. So it's not that people are being bound by things they didn't want to be bound by; you're really not being bound by them anyway.
I might just add a few curious things about what we see in the data about how privacy goods are priced and how people seem to behave in the new era. Well, one thing that seems obvious to me theoretically, and seems to be borne out by a lot of mainstream empirical studies, is not surprising: there are huge efficiency gains from people's willingness to share certain kinds of information. On Airbnb or whatever, you're revealing some information about your preferences and your priorities and so forth, as is the person renting. To be able to find a match in lots of different kinds of markets, that matching process works much better when the matching algorithm has a certain level of detail about the people being matched. Dating sites, right? Dating sites went from being kind of weird to being totally mainstream in just a few years, and yes, if you sign up for a dating site, you don't want the site giving all your information to just anybody; you don't want them giving your information to MSNBC or something so they can put it on TV. But of course, there would be no point in signing up if you kept everything completely private. In fact, in the mainstream literature on sharing platforms, there's some concern that these platforms work so well, they're so efficient, that they lead to outcomes that from certain policy points of view may be considered undesirable. For example, there's more racial segregation on some of these matching platforms. And if your starting point is that public policy should eliminate, or at least discourage, any kind of racial or gender or ethnic or age separation, then you might be concerned that, yes, these things are working so well that they're supporting people's preferences that are bad: people only want to associate with other people like them, and we should prevent that.
But these algorithms are so good at finding a match that's a good match for you that now there's more segregation and discrimination. Again, even if you think that's a public policy concern, it's further evidence that these matching platforms do a really good job of putting buyers and sellers, or different kinds of partners, together. There are also quite a few studies on efficiencies from better storage and exchange of certain kinds of information, like medical records. In most countries, and certainly in the US, we don't yet have a generally agreed-upon standard for how medical information will be recorded; there's no single database format. And it's actually kind of annoying. I don't know if any of you have moved to a new town lately, but anytime you go to a doctor in the US and it's your first time with that particular doctor, what do they do? They hand you a clipboard. It's always an old-fashioned paper clipboard, and you have to fill out a survey about yourself: all your personal information, what kinds of illnesses you've had, what kinds of illnesses people in your family have had. And it's kind of annoying, because you wish you just had that on your phone and could just click something so that the new doctor would get your health profile. We've not yet developed a system like that, but there are some hospitals and some private health associations that share information like that internally, and it's great. It works much better than the old-fashioned way. But of course there's a concern. People say, well, I'm worried somebody could hack my medical history. I don't want people to know that I had some health problem a few years ago; I don't want potential employers to know. So obviously there are both technological and legal questions to work out about how to allow patients to decide how much of that information they want to reveal to their health providers.
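To make the matching point from a moment ago concrete, here's a minimal sketch in Python. Everything in it, the profiles, the attributes, the scoring rule, is invented for illustration; the idea is just that the more attributes users disclose, the more accurately a platform can rank candidate matches.

```python
# Toy illustration (invented data): matching works better when the algorithm
# sees more attributes of the people being matched.
def match_score(a, b):
    """Number of attributes on which two profiles agree."""
    return sum(1 for k in a if k in b and a[k] == b[k])

seeker = {"city": "Auburn", "music": "jazz", "hiking": True}
candidates = {
    "pat": {"city": "Auburn", "music": "jazz", "hiking": True},
    "sam": {"city": "Boston", "music": "jazz", "hiking": True},
}

# With full disclosure, pat (3 shared attributes) outranks sam (2).
best = max(candidates, key=lambda name: match_score(seeker, candidates[name]))

# If the seeker discloses only "music", the platform can no longer
# distinguish the two candidates at all.
sparse = {"music": seeker["music"]}
tie = match_score(sparse, candidates["pat"]) == match_score(sparse, candidates["sam"])
```

That last comparison is the efficiency claim in miniature: keeping everything private doesn't make the platform neutral, it makes it useless.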
By the way, I don't know about you guys, but when I have to fill out those forms at the doctor's office, they ask for date of birth and often for a social security number, and I usually leave the social security number blank. Sometimes I don't even fill in the date of birth, and occasionally they'll give me a hard time about it, but usually they won't. If you know anything about identity theft, once somebody has your birth date and your social security number, it's pretty easy for them to spoof your identity or engage in one of those phishing, P-H-I-S-H-I-N-G, kinds of scams. By the way, that's the whole thing with the Russian hacking of the election. Maybe some of you can correct me on the technical terminology, but I always thought the word hacking was reserved for physical intrusions into machines or clever network tricks for getting into somebody's computer. But the way they got into the Democratic National Committee's computers was just an old-fashioned phishing scam. Somebody sent John Podesta, one of Hillary Clinton's top campaign deputies, one of those emails, not quite from the Prince of Nigeria, I can't remember exactly what it was: oh, your Google password has been compromised, click here to reset it. He just clicked and typed in his Gmail password, and of course it was a fake Gmail password-reset page or whatever it was. Now they had access to his Gmail, and that's how they got all of those emails that were then sent to WikiLeaks. It wasn't some cloak-and-dagger, super-sophisticated hacking scheme. It was just an old-fashioned phishing scam, and he fell for it, just like that. Same thing with credit histories, right? Do you guys know what your credit scores are? A lot of people don't want to know, because they're worried about their credit score.
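The Podesta episode turned on a reset link whose page merely looked like Google's. A minimal sketch of the kind of check that defeats that trick, with a hypothetical helper function and made-up URLs:

```python
from urllib.parse import urlparse

def is_trusted_link(url, trusted_domains):
    """True only if the link's host is a trusted domain or a subdomain of one.
    Phishing links typically fail this check, because a lookalike host such as
    'accounts-google.com.example' is not actually under google.com."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in trusted_domains)

# A real Google reset link passes; a lookalike fails, however convincing
# the page behind it looks.
real = is_trusted_link("https://accounts.google.com/reset", ["google.com"])
fake = is_trusted_link("https://accounts-google.com.example/reset", ["google.com"])
```

The point is that the "hack" exploited no software flaw at all, only the victim's failure to run something like this check in his head.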
But the fact that we have credit scores, and agencies that track and pay attention to credit scores, actually makes capital markets work much more efficiently. Having electronic records and summary scores that capture people's credit histories makes mortgage markets and other markets work much better, notwithstanding all the problems in mortgage markets that were in the wonderful movie, and the wonderful talk explaining the movie last night. I'm trying to throw in a compliment here, okay? One thing that's somewhat surprising: you might think, okay, Amazon knows a lot about you, so when you go to amazon.com you see personalized recommendations of stuff that Amazon thinks you might want to buy, or eBay or whatever. You might think, if they know so much about you, won't they charge different prices to different consumers based on what they expect each consumer's willingness to pay to be, like the old price discrimination models of the textbooks? You would think that online vendors have a lot of information about you that they could use for price discrimination, but apparently very few of them are doing it. There's no evidence that different people are being charged different prices based on information the seller has. There's a recent Journal of Economic Literature piece that summarizes a lot of this research if you want to learn more about it. Okay, let's talk a little bit about the regulatory aspects. Should the government regulate privacy in some fashion, and how might different kinds of privacy regulations work? Well, there's some analysis going back several decades about markets for information in general. Is it better, from an overall point of view, if people have more privacy, meaning they disclose less information to other people? Or if people have less privacy, meaning they disclose more information to other people?
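As an aside on the price discrimination point above: what the textbook model would look like, if a vendor actually used its data this way, can be sketched in a few lines. The numbers and the surplus-capture rule here are invented for illustration.

```python
# Textbook-style personalized pricing (invented numbers): a seller that could
# observe each buyer's willingness to pay would quote each a different price,
# capturing part of the surplus above cost.
def personalized_price(willingness_to_pay, cost, capture=0.8):
    return cost + capture * max(willingness_to_pay - cost, 0.0)

cost = 10.0
buyers = {"alice": 50.0, "bob": 20.0}  # estimated willingness to pay
quotes = {name: personalized_price(wtp, cost) for name, wtp in buyers.items()}
# Identical good, different prices: alice is quoted 42.0, bob 18.0.
```

The empirical point in the text is that, despite holding the data to do exactly this, online vendors by and large do not.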
Actually, Richard Posner, back in the 80s, wrote a piece, not about digital privacy but about privacy in general, essentially arguing that privacy is socially harmful. As a society, we're better off when people know as much as possible about other people. Why? Because privacy can be used strategically and opportunistically. Have you ever put together a resume to apply for a job or for grad school? Do you put everything on that resume that you've ever done? No, of course not. You only put the good stuff, right? If there's any bad stuff, you leave it out, unless you're applying for a subprime mortgage; then you include it. The point is, if people can choose what about themselves to disclose and what to keep private, they can do so in a non-transparent, opportunistic way to potentially mislead other people. You might deceive an employer into hiring you, or deceive a bank into giving you a loan. And Posner argued, no, everybody should be forced to disclose everything about themselves, and then we wouldn't have to deal with these kinds of problems. George Stigler had a piece that made a similar argument. In the last twenty years there's been a lot of interest among mainstream economists in markets for disclosure from the point of view of asymmetric information theory, signaling theory, screening theory, and so forth. And there aren't really any clear conclusions that come out of that literature. There may be some cases in which making people disclose more provides some social benefit, and other cases where privacy can actually be beneficial; it depends on the circumstances. What about the kinds of regulations that are increasingly popular in Europe, and now in the US, concerning what they call data portability? One kind of potential regulation would be something like the following. Okay, it's fine for Facebook to collect your data.
But any information that Facebook has about you, about your Facebooking habits and your likes and what you comment on and how long you stay on different pages and what you click on, all of that needs to be encapsulated in a database with a transparent structure. So that if you decide, oh, I want to quit Facebook and move to one of these alternative sites like MeWe, you should be able to download a file that contains everything Facebook knows about you and could use to monetize its relationship with you, take that file, and upload it to your MeWe account, and MeWe would now know exactly the same things about you that Facebook knows. From a technical point of view it's not really clear how you would do that, because Facebook doesn't just have a list of every website you've ever clicked on; that would be millions and millions of links. It has complicated and proprietary algorithms for mining that data to come up with some bit of information it can use to sell you an ad. So it's not even obvious how you could do that, but the claim is, look, it's my data. It's not Facebook's data, it's my data. So if I want to leave, I should be able to take that data somewhere else. If I don't like the apartment I'm living in, I can take my couch and my chair and my suitcase full of clothes to another apartment. Why shouldn't my data be treated in the same way? Well, aside from the technical difficulties, let's assume those can be overcome somehow. The major problem is that forcing providers to store the data in a way that makes it portable, and giving the user the ability to take that data, delete it from Facebook, and send it somewhere else, basically makes it harder for Facebook to operate, because Facebook's revenue model is based on the idea that it has this kind of proprietary, almost tacit knowledge about you.
It's not literally tacit, because it's an algorithm, but it's difficult to communicate to third parties. So Facebook would have to find some other way of staying in the black if it were not allowed to use and monetize these data. What would that be? Well, more kinds of advertising. The truth is, my personal view is that on platforms like Facebook, Twitter, and Instagram, I don't find the advertising all that intrusive. On Facebook it's mostly off to the side. Twitter has some promoted tweets. I don't know what social media apps you guys use the most. Snapchat has some ads that I don't think are all that annoying, but you can imagine a model in which they're really annoying, like YouTube, right? YouTube is super annoying because almost every time you want to watch a 10-second video clip, you've got to sit through a 30-second ad beforehand, unless you pay for YouTube Red or whatever it's called, which I haven't done. I use Google News to quickly scan what's going on, but it annoys the heck out of me that Google News does not let you choose your feeds. It doesn't let you choose what sources you want. And my Google News page is dominated by links to the New York Times and Washington Post, both of which are paywalled and neither of which I have a subscription to. There's nothing in the Washington Post that would justify me paying even $2 a month or whatever they want. So I hate that every time I click on these links I get to a paywalled site. Okay, imagine all social media apps were like that, that every time you click on something on Twitter or Facebook you have to sit through a 15- or 30-second ad. That would drive people away from the platform, but that's a much more intrusive, annoying, aggressive kind of advertising. And it might be necessary.
The reason that advertising on those social media platforms today is relatively harmless and not that annoying is that the ads are very profitable, because they're targeted to you. Even if there's a smaller chance that you will see and click on the ad, the fact that the ad is for something the algorithm thinks you would like makes it valuable to the company paying for the advertisement. Therefore they're willing to pay a high price for that ad, and therefore the platform can be profitable with that kind of light advertising. But platforms would have to run much more in-your-face ads to make up the difference if they were not allowed to use your data in this fashion, or maybe they'd just charge you a fee. How much would you pay per month to be on Twitter or Instagram or whatever your favorite platform is? You might pay something, but most of us would prefer not to pay, even if it means they have some of our personal information. You could make a kind of Coase theorem argument and say, well, it really doesn't matter what the legal standard for data portability or privacy protection is; the market can just sort that out. Even if the law says you're not entitled to own your own data, if people are willing to pay something for the right to portable data, then it will be in the interest of some entrepreneurs to supply that. The EU has actually been much more aggressive than the US in imposing uniform privacy restrictions. And in fact, there's some research suggesting that privacy protection is actually better in the US than in the EU, because we don't have a national standard; we have sort of decentralized, competing standards for privacy protection. That's probably not all that surprising. Okay, the last point here is to talk about this from a legal, political, and even philosophical perspective. Can information be owned? Can you own information?
Well, this has of course been a big discussion topic within the libertarian literature for many years. As you know, Rothbard originally argued that patents were illegitimate grants of monopoly privilege by the state, but that copyrights, trademarks, and other forms of intellectual property protection that protect the creative expression of an idea were legitimate and probably would be recognized by a libertarian court. Others have argued that that's not the case. But think about it this way. If we're talking about Facebook information or whatever, what do you own? What kinds of things are ownable? Well, you own your computer, your laptop or your tablet or your phone or whatever device you're using to access that social media platform; at least in principle, you can own that device. If you pay for an ISP to come to your house, you have the legal right to get so many megabits of information per month, whatever you've paid for. If you paid for Dropbox Premium, you have the right to whatever was agreed upon in that contract. But you don't own data. You can own physical media that store and exchange data, but data per se is just information, right? A certain series of ones and zeros. You can't own a series of ones and zeros. So in other words, a lot of this whole discussion of data portability is confused, because it's often framed as, well, people have a right to protect their data, people own their data and they should be able to do with their data what they want. No, I think that's not correct. That doesn't make any sense. You don't own your data, because data is information, and it may be information that other people have about you. To put it more specifically in the context of privacy, and then we'll wrap it up: what is privacy? Well, privacy deals with other people's knowledge about you.
If I did not wear my sunglasses and my fake beard, you'd know that it was Peter Klein in, I don't know, some disreputable bar or whatever. If I had put on my Bob Murphy disguise, you'd think it was Bob Murphy, which of course you would expect him to be in a place like that, and my privacy would be protected. But my privacy is embodied in what you know about me. Do you know, or not know, that I was in that place at that time? Well, I can't own your knowledge. I can't own what's in your head. I can't buy and sell something in your head. I wasn't at Walter Block's Ask Me Anything, but Walter Block often talks about things like blackmail. Did he talk about that at all? Walter argues that blackmail is sort of legitimate and that your reputation is not something you can own. Your reputation is what other people think about you. And if someone threatens to say something bad about you that will harm your reputation, well, that's unfortunate, but you don't have a legal claim against that, because you don't have the legal right to prevent other people from knowing certain things about you. Now, if somebody broke into your house and stole your diary and broadcast it, yeah, that's a violation of your rights, because they broke into your house, not because other people now know something about you that you would prefer they not know. That is not per se a rights violation, okay? So in the context of privacy, from a moral point of view, you can respect someone's privacy or you can violate someone's privacy. Clearly the woman on the plane who was tweeting about this other woman violated that other woman's privacy. And from a moral point of view, from an ethical point of view, we might have a problem with that. But you can't really steal somebody's privacy, because privacy is not a thing that you own and that can be stolen, okay?
So in having these kinds of discussions, I think it's important to stick to the basics of what is or is not an economic good, what you can own and what you cannot own, and any kind of government policy that violates those basics is going to run into problems. So thanks a lot. Thank you.