OK, we're going to start. I'm going to blather for long enough for Roger to have one more bite before he speaks. Welcome first, so I don't forget. You can notice in the back that this event is being filmed. So we're being a little bit more transparent about the data collection than other platforms might be. My name is Larry Lessig. My friend Ron Suskind and I are going to be engaged in a conversation with Roger McNamee, who has been a legend in Silicon Valley for as long as I've known anything about the place and has now become quite a central figure in the debate about social media platforms, in particular Facebook. And we're here to talk about his book, Zucked, which I take it is a spinoff of the name Zuckerberg, but maybe not.

So this is an extraordinarily important issue. I'm reminded of the film Independence Day. I don't know if many people saw this film Independence Day. These aliens come and they start setting up shop all around the earth. And everybody's convinced they're here for good reasons, and we're celebrating how wonderful it is that they've come to give us joy. And then at a certain moment, people realize they're not there for good. And it's almost impossible to imagine how you resist them. And of course, it takes a Hollywood movie to actually succeed in resisting them. And my fear is that's where we are, but we don't have the Hollywood movie techniques to resist this great invasion. Because the effect of this platform, not just Facebook, but the set of platforms, which Shoshana Zuboff's really fantastic book, The Age of Surveillance Capitalism, evinces in a powerful, powerful way, the effect of this platform is to radically change the nature of our existence as people. And part of it, and I think Shoshana doesn't emphasize this part enough, part of it feels great. Like, I found Shoshana's book because Amazon said to me, you seem to really like Roger's book. You should read Shoshana's book. That's fantastic. I love that part.
And part of it could clearly be terrible if, for example, Facebook determined that I was slowing down in my typing and then decided that I had a neurological condition and called my insurance company and said, you need to walk away from this guy quick. That would be really terrible. The category that's most troubling to me is the category where Facebook is giving us what we want, but our getting what we want destroys the capacity for democratic government to actually function, as it puts us into increasingly isolated and polarized spaces where we can't even understand what each other are talking about anymore. So Roger has done an extraordinary service in this book, not just in telling the story of Facebook, but actually translating and putting into writing the insights of a really brilliant technologist, Tristan Harris, who's become a real advocate with Roger in the fight to get people to recognize this. But Tristan has been out there in the field not really writing a lot, not really putting it in a coherent way. And Roger has really done that for him too. So we now have a vision of the problem. And again, I just feel lost once I understand the scope of the problem, because I don't begin to understand exactly how we're going to address it. So I gave you a chance for at least half of that to be eaten, and you slowed down. So I failed. Now you lose. Now the insurance company's told my doctor I have a neurological condition. OK, sorry.

I'm Roger and I'm a technology addict. I'm Roger. You're going to move your mic? I'm a technology addict. You're supposed to say hi, Roger. Now you're good. Hi, Roger. OK, so now you're good. So I want you to understand, I've spent 36 years in Silicon Valley. I got there at the tail end of the Apollo period, when it was all men wearing short-sleeve white shirts, ties, plastic pocket protectors, and heavy-framed glasses, just like Apollo 13. I was there for the whole personal computer industry.
I was inside Kleiner Perkins in the 90s for the whole beginning of the internet. And in 2006, I got an email from a guy at Facebook saying, would you please come and take a meeting with my boss. In 2006, Mark Zuckerberg was 22. Facebook was two years old. It was still just high school students and college students. It was before News Feed. I took this meeting in my office. Now keep in mind, in those days I had a firm called Elevation. My business partners were Bono from U2, Fred Anderson from Apple, John Riccitiello from Electronic Arts. We had one conference room that was set up for video games. Giant speakers, totally soundproofed. Every video game literally ever made was on this emulator in there, and we had every platform. So in this dead, silent room, it's me and Mark, and he's 22. He comes in, I say, dude, before you start, I gotta tell you some context for this meeting from my perspective, because once you start talking, you'll assume that everything I say back to you is influenced by what you said, so I need to get some things on the table. I said, if it hasn't already happened, either Microsoft or Yahoo is gonna offer a billion dollars for your company. Now, context: they had nine million in revenue to that point, from basically pizza delivery display ads. So a billion dollars was a huge number. So they're gonna offer a billion dollars. Everybody you know is gonna tell you to take the money. I'm here to tell you that I think, because you have authenticated identity and control of privacy, if you take that model all the way up, you're gonna be bigger than Google is today and far more important. So if you believe your vision, you need to stick with it. You need to tell everybody you don't wanna sell the company, and you need to recognize that nobody, not Steve Jobs, not anybody, had the perfect idea twice at the perfect time. Remember, Steve Jobs the second time around took 10 years to get Pixar to work. So I laid this heavy trip on him. What followed?
It takes two minutes to do that little rap. I'm expecting some kind of reaction. We're in this totally dead room. Think how uncomfortable that 10 seconds just was, right? At the one-minute mark, he still hasn't said anything. I'm going, wow, he's really showing great respect. He's really thinking about what I said. At the two-minute mark, I'm getting really uncomfortable. At three minutes, my fingernails are digging trenches in the cushion. At four minutes, I'm literally ready to scream. And sometime between four and five minutes, he finally relaxes, decides he trusts me, and says, I can't believe this, but the reason I'm here is that the thing you just said has just happened, and I need to know what to do. He didn't wanna sell the company, but he didn't want to overrule everybody in his life, including his parents. I helped him understand how to position this thing so that people would understand. And he went home and killed the deal. The whole meeting took half an hour. It began a three-year period where I was one of the people who advised him on the business. My relationship with Mark could not have been better. He was nothing like the person you see in The Social Network, although that person is real. But I never experienced any of that. He was incredibly mature. He was incredibly thoughtful. That attentiveness, that ability to listen and process, was there all the time. He took advice. If you're a mentor, the thing that you wanna see is your opinion being valued. You want people to respect it. Take notes out there at this part. No, but he really used me super well. I mean, and my focus was pretty narrow. Only business stuff. We didn't socialize. I was 50, he was 22. We were not gonna socialize. But what we did focus on was, like, he had to swap out his management team, because the whole team wanted to sell out to Yahoo. And so those were not the right people to carry the company through to that vision. So he needed a new team.
And it happened that the person who had introduced me to Bono was the chief of staff to the secretary of the treasury in the Clinton administration, a woman named Sheryl Sandberg. And Sheryl Sandberg decided in 2007 that she was gonna leave Google. And she was gonna go to the Washington Post, and I went, are you high? I mean, the Washington Post? You know you're at Google, right? You know the reason the Washington Post is dying is because Google is making it irrelevant. And you're gonna go to work at the Washington Post? That's insane. You should go to work at Facebook. And she goes, Roger, he's 23. It'll never work. I said, no, no. His mother's a physician. He's got nothing but sisters. He's gonna be that rare guy who can work with a woman. I then go to Mark. I say, Mark, you ought to hire Sheryl. And he goes, she works at Google. I'm going, yeah. And he goes, Google's nothing like us. And I'm going, and what is? He thinks about that for a minute. And he goes, okay. So anyway, they get together and they get along really well. And that kind of was the final thing that I did for Mark. And in 2009, I realized that they'd entered a different point in their life. The things I'm good at weren't as relevant. So I stepped back. I'm sitting on the sidelines, cheering them on. I've ended my investment career, because Silicon Valley's culture has gone to hell. I mean, companies like Spotify and Uber and Zynga are the best of breed out there. And they all have business models that are essentially deeply predatory to somebody. And I'm going, I just can't do this anymore. I can't manage other people's money if I'm not willing to invest in the best that Silicon Valley has to offer. And so it's time for me to leave. So I blissfully retired. I'm on vacation with my wife. It's January of 2016. And I see these misogynistic memes on Facebook, which I'm totally addicted to. And they're coming from groups associated with the Bernie Sanders campaign, like Bay Area for Bernie.
Day one, I see one meme from one friend. Day two, different meme, four friends. Day three, another meme, eight friends. And I'm going, I've been running my band's Facebook page forever. Nobody goes one, four, eight on consecutive days if they're not spending money to get people to join those groups. And so I'm going, that's really weird. Then two months later, Facebook expels a group that was using the ad tools to basically gather data about people who expressed an interest in Black Lives Matter. And they were selling it to police departments. And I'm going, that is really evil. Three months later, June, United Kingdom, Brexit referendum. An eight-point swing on the day of the referendum, from four points in favor of Remain to four points in favor of Leave. The difference between the two campaigns was that Remain's campaign was totally emotionally neutral. We have the best deal in the EU. Let's not mess with it. Leave's campaign is, those evil, awful foreigners, those immigrants are destroying our culture. They're gonna eat your dog for dinner, right? I mean, I'm not joking about this. That was the sort of stuff they were doing. And I'm thinking to myself, oh my God, what if Facebook, because of its natural virality, gives an advantage in elections to the more inflammatory campaign? That's just really bad for democracy. Then in October, I get news that the Department of Housing and Urban Development has cited Facebook because the ad tools allow people in real estate to discriminate in violation of the Fair Housing Act. At that point, I write an op-ed for the Recode blog and send it to Mark and Sheryl before publication, saying, I think there's something about the business model and algorithms of Facebook that is allowing bad actors, evil people, to harm innocent people. I think Facebook is the victim.
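The one-four-eight pattern Roger describes at the top of this story can be checked mechanically. Here is a minimal sketch, with an invented function name and threshold (nothing Facebook actually exposes), of flagging a group whose daily new-member counts keep multiplying rather than growing organically:

```python
# Sketch: flag a group's growth as suspicious when daily new-member
# counts grow roughly geometrically (e.g. 1, 4, 8 on consecutive days).
# The min_ratio threshold is an illustrative assumption.

def looks_amplified(daily_new_members, min_ratio=2.0):
    """Return True if each day's count is at least min_ratio times
    the previous day's, i.e. sustained exponential-looking growth."""
    pairs = zip(daily_new_members, daily_new_members[1:])
    return all(curr >= min_ratio * prev for prev, curr in pairs)

print(looks_amplified([1, 4, 8]))   # the pattern Roger saw: suspicious
print(looks_amplified([5, 6, 5]))   # steady, organic-looking growth
```

The point of the toy threshold is Roger's intuition: organic pages accumulate followers at a roughly steady rate, while paid amplification produces consecutive-day doubling.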
And I go to my friends on the 30th of October, nine days before the election, and I just talk about the four things I described there, plus the fact that Peter Thiel wants to get rid of women's right to vote. I'm going, you can't do this. You can't be hostile to the civil rights of the people who use your platform. I mean, and you can't enable bad guys to harm innocent people. That's just wrong. They treat it like a public relations problem. Their goal is to get me to shut up. They hand me off to Dan Rose, who works with them, whose job is literally to deal with problems like me. Now, remember, I'd been a mentor to both of these people. A different pair of people might have reacted differently to this input. Because all I was saying is, you've got to get on top of it. Then I talked to Dan a bunch of times. The election happens, and then I blow up. Because in between, we've had the news that the Russians are trying to interfere. And I go, Dan, the Russians tipped the election using Facebook. You have really got to take this seriously. You've got to treat it like Johnson & Johnson treated the Tylenol poisoning in Chicago. You have to do everything in your power to protect the people who use your product. He says, Roger, we're cool. We're a platform. We're not a media company. The law says so. Long pause. Then me: Dan, you are in a trust business. If the people decide you're responsible for harming innocent people, you're dead. Get a grip. We spent three months on this issue. He never budges. I finally give up. And having been arguably the greatest cheerleader for Facebook for the prior dozen years, I realize I need to become an activist. I come out of retirement. Every single day since February of 2017, I've been out trying to build awareness of this problem, to learn what actually happened, and to try to fix it. That book is the first part of that journey. It's me as Jimmy Stewart in Rear Window. No, I'm not joking. I see what looks like a crime scene.
I do not understand it, because it's total cognitive dissonance. It doesn't conform to my deeply held beliefs about this company. And keep in mind, I confess to failures of analysis. There were signals that I missed that I should have picked up. OK? But I missed them. But I use my journey of intellectual discovery to explain to you everything you need to know, not because it's the whole story, but because it's enough to prepare you to handle what's coming now. Zuboff's book is ungodly important. And there are elements of it that are polemical. But it doesn't matter. I think she has the economics dead right. And I think she's got the behaviors of the people right enough, OK? What's going on here is really simple. And Facebook is what I saw first. But Google is the much greater problem. What this is, is that there was this free good in the world, which is data about human intent, about behavior, that nobody knew how to economically exploit. It took rapidly improving generations of machine learning and artificial intelligence to make a market out of it. And Google in 2003 figured that out and set about to make that happen. And they also set about covering their tracks. They describe it as digital exhaust, like it has no value. The core point here is that a person who buys a car will typically do 15 or 20 steps before buying the car, two-thirds of which have nothing to do with buying a car, but every one of which has predictive value. So if you can see those 15 steps, you then find anybody who's done the first 12. And you can say the odds of them buying a car are very high. At 13 steps, it's higher. At 14, it's essentially 100%. And if you have artificial intelligence and you have recommendation engines, you can actually steer the outcome, causing people to act so as to make your behavioral prediction model accurate by making it self-fulfilling. This is not a casual exercise. They started with news. They're working their way through video now. They're going to do automobiles.
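The funnel logic Roger describes can be sketched as a toy model. The function name, the step counts, and the probability curve below are my own illustration, not Google's actual system; the only claim carried over from the talk is that completing more of the known precursor steps pushes the predicted purchase probability toward certainty:

```python
# Toy sketch of the behavioral-prediction funnel described above.
# The quadratic curve is an invented stand-in for a learned model.

def purchase_probability(steps_completed: int, total_steps: int = 15) -> float:
    """Crude model: predicted probability of buying a car rises sharply
    as a user completes more of the known precursor steps."""
    if steps_completed >= total_steps - 1:
        return 1.0  # "at 14, it's essentially 100%"
    return min(1.0, (steps_completed / total_steps) ** 2)

# A user who has done the first 12 of 15 steps is already a strong
# prospect; at 14 the model treats the purchase as certain.
print(purchase_probability(12))
print(purchase_probability(14))
```

The "steering" point then follows: once a model like this exists, a recommendation engine can surface exactly the content that nudges a user through the remaining steps, making the prediction self-fulfilling.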
They'll do energy at the same time as automobiles. They're going to run the table on the economy, because only two companies control essentially all this data. And we are forced to go along. And you guys can be the people who solve this problem. I have enormous hope. We're up against a big enemy. But guess what? There are 100,000 people who work for Google, 32,000 who work for Facebook. The rest of us are all on Team Human. And here's what we've got to do. We've got to ask questions that nobody's asked ever. Like, why is it legal to sell financial transaction data, like credit cards? Why is that legal? Why is it legal to sell personal health information gathered outside the health system? Why is it legal to sell geolocation data? With those three things, you can have essentially a perfect-resolution picture of any human being. You don't have to be on Facebook. You don't have to use Google. They have that of every single human in every market in which they operate. Why is it legal to even gather data on minors? I would like to litigate on every one of these issues. I would like to pass laws. I would like to make all these things illegal. And then I'd like to have a reasonable conversation about which usages are OK and which are not. I think starting from this notion where everything is, per se, legal is a bad place to start. It puts the advantage in the wrong place. It's not enough to have the General Data Protection Regulation, because that's about the data you put into the system. The real value is in the things you do before you transact and the things you do after you transact, not in the transaction itself. It's not enough to have a dividend for what you put in, because the real damage, the part that's worth 100 to 1,000 times more, is what your data reveals about people like me. So I have bad news and good news. The bad news is we're all in the same place, and they know it. So you're now associated with me. And there's a whole bunch of people who are really unhappy with me right now.
So you're very brave, and I want to thank you for that. But the good news is the state of California has a proposal to extend its groundbreaking privacy law to give a right of civil action to recover damages, to pierce the requirement in every terms of service that forces arbitration for all conflict. The right of civil action is an opportunity to change the financial incentives of this industry. I would like to personally support anybody who would like to engage in those civil actions. I believe we need to have criminal investigations by state AGs, because this is conscious theft. This is conscious manipulation of people's lives. I want you to understand, the people at Facebook and Google are not evil. But their educations are very narrow. Their experience is incredibly narrow. And they've had terrible advice from the people who are legally required to provide them with good counsel: their boards of directors, which have just gone along with all of this. Their parents have not given them good advice. Their friends haven't either, and they haven't listened. They have not sought out critics. They've tried to suppress them. And that's too bad. But you know what? There are only 132,000 of them. The rest of us can work together. And in 2018, a beautiful thing happened. The Trump campaign successfully suppressed millions of votes in 2016 using micro-targeting and disinformation aimed at suburban white women, people of color, and idealistic young people. In 2018, there was dramatically more disinformation in the market than there was in 2016. And yet suburban white women, people of color, and idealistic young people turned out in record numbers. The evolution is already happening. We've had at least five successful labor actions by teachers in the last nine months. The air traffic controllers had a minor sickout and, in a matter of hours, ended a government shutdown. Collective action is working. My job is real simple.
I have a biography that makes me credible in some places, and that's useful for this goal. You guys all have biographies that will be useful also. I hope at least a few of you will join in this thing, because this is a lifetime opportunity. This is an opportunity that's like the civil rights movement. This is an opportunity like breaking up the trusts. Whole lives are going to be made in this mission. And I would welcome each and every one of you to join in this with me. Thank you. Bravo.

So we have some seats down here in the front. If you guys in the back want to come relocate. Ron, do you have some wisdom to add? Let me just jump in to get this circus started. Thank you, Roger. That was quite an impassioned offering. I'll disagree with you cinematically a little bit. I get Rear Window and Jimmy Stewart. I get that. But I would say more George Bailey from It's a Wonderful Life crossed with Mr. Smith Goes to Washington. Because you're actually a bunch of Jimmy Stewart characters here. Your life is evolving at breakneck speed. I used to deal, of course, with a lot of whistleblowers in Washington. And you're kind of a whistleblower now. And the lives and fortunes of whistleblowers are very varied. I'll just say this, having had experience with many of them. Some of them say, well, I did my job, my deed. And now I want to go back to the life that existed before. And that almost never works. But there's a natural impulse to do that, certainly. You know: I'm done. I did my service to truth, especially in regard to informed consent, which, of course, doesn't live unless it's nourished by truth. Often truth that puts people in the crosshairs of institutions who say, the last thing I want is the truth, certain kinds of truth, to be out there. Yet some of that community will take the next steps to embrace this role, which then unfolds into various other chapters, each one of which is a chapter of conflict. You're standing up against vast institutions here.
The incentives beneath them drive human behavior, as you've talked about so much, and frankly, in your book. And of course, we know so much about the lives we're living and how our incentives are being tapped and shaping our behavior. You're up against armies of self-interest here. So I guess my question to you now is, as you stand here, the book is, what, about three weeks out? Not quite three weeks. How do you feel, first off, now that you're out, really out? That's the first question. And how are you different? What have you learned that most surprised you since your outing, your self-generated outing? So the book is actually phase three of the outing, right? So phase one was confronting the problem itself and then going to Mark and Sheryl. Phase two was going public with it in August of 2017, after I'd gone to see Senator Warner and tried to persuade him that while it was not the mandate of the Senate Intelligence Committee to look into protecting the 2018 and 2020 elections on social media, that was the only committee of Congress where the two sides were talking to each other, so it was the only hope for solving the problem. And having persuaded him to do that, we then went public and started to write op-eds and do some television to raise awareness through the hearings that took place in the fall of 2017, which came about basically as the result of Senator Warner listening to Tristan's idea of getting the executives to testify and deciding that was a good idea. Again, remember, they're an intelligence committee, and we were having a thing about social media. So that was a big leap for Senator Warner. And thank God he made it. And we thought after those hearings, we were done. We'd done our whistleblowing job. And Representative Adam Schiff took me aside and goes, no, no, no, no, no, this is the beginning, not the end. Now that you've made yourselves iconic around the problem, you have to help us to the next level.
So we then spent the ensuing year and a half, okay, going around and just talking to people and finding out what the hell was going on. Because remember, at the beginning, I see elections. Tristan Harris, who was my colleague that Larry talked about, he'd been a design ethicist at Google, he understood the psychological tricks, all of the persuasive technology. So we're looking at this as a public health and election problem. We haven't tipped to privacy, we haven't tipped to the issues of the economy, and God knows we hadn't tipped to the master plan at all. In fact, Zuboff really is the one who tied all that together for me, because you'll see in the book, I hypothesize a lot of what she talks about, but she has 10 years of actually studying it, she's got data, okay? I mean, it's like, you know, I'm an analyst, I'm good at using Occam's razor against available information, but I'm usually about a millimeter deep, okay? And so, you know, and by the way, I readily confess this. What I will tell you is that we're now in phase three with the book, which is trying to take this thing to the mass market, not about building awareness, but about pivoting to solutions. And I've learned some interesting things. Somebody I knew really, really, really well and have had a great relationship with for a long time, we're not like close buddies or anything, but we had a great professional relationship: Bill Gates came out publicly and condemned me for doing this, which really shocked me, because I'd sent him the book three months ago in galley form, and I sent him the final book a month ago. I encouraged him to read it because I wanted him to be part of the conversation. I didn't expect him to take a call from Mark and say things about me that I don't say in the book. But you know what? That'll happen.
And I've had two major opportunities to speak at places that you would know by name that are really identified with public speaking around technology, where my invitations were withdrawn because of political pressure brought to bear by Google and Facebook. You know what? That's the price you pay for doing this stuff. If you're an investor, one of your advantages is, if you're gonna be really successful, you have to be okay with being different. You gotta be okay with people disagreeing with you, and you gotta be okay with people making fun of you. I, for whatever reason, am better at that than the average bear. And I was successful enough that I have no fear about my career. And my point is, I'm like the Three Stooges thing where they're in the army and they're looking for volunteers, and everybody but Curly steps back, and Curly's the volunteer. I get it, okay? So I'm gonna see this thing through. And what have I learned? Well, I've learned that the problems are much more pervasive than I thought. That Google's really the mastermind of it. And Google's genius was that they made a mistake in their 2003 patent, because they actually gave away the game, but they've covered their tracks brilliantly ever since. And they have used campaign contributions and lobbying unbelievably effectively. And you gotta admire them, okay? I mean, the idea truly is genius, but it's not socially appropriate. I mean, we're each entitled to self-determination. They wanna take that away. And I don't believe that's their right. They're not the first business people with that idea. They're just the first ones since the robber barons who had enough juice to pull it off. And they're pretty deep into it, but we can stop it. In fact, we're already beginning. And I'm really confident that Congress will come around. I love what the Europeans are doing. I love what the state of California is doing. I love Maura Healey in Massachusetts.
I think we've got real opportunities to use the AGs in the states to do incredible things. They're absolutely on top of it. And again, there's so much to learn, right? Staying on top of it requires continuous learning. And that's the advantage of my having been a professional analyst. I always know that I'm wrong and that I have an incomplete view, so I'm always looking for new things. And in this thing, that's been a huge advantage. I mean, as you'll see in the book, I start as a complete idiot. I mean, really, I look back and go, my God, I was stupid. How did I miss this shit, right? But I did. And it's okay to admit that. It's actually really healthy, right? And then, as you learn stuff, you just assimilate it, right? You start with hypotheses. I have no conclusions. I just have hypotheses. But they're working out.

Well, you've talked about us being addicted. And that's, I think, a metaphor that we're talking about. No, no, I'm addicted. These guys are all, now, here's the question. I mean, addicted to this. When do you check your phone in the morning? Is it before you pee or while you're peeing, right? Because there's... There's a wall peeing here? Are you aware? No, of course. But everybody's either before or while, okay? There's only two choices, right? And it's like, how bad do you have to pee? That's the thing that really determines it, because everybody's gonna check it in the first three or four minutes they're awake. Right, so we're all addicted to one degree or another. And the key thing is, are we conscious of what's going on? Do we understand that the three big use cases of AI are taking away white-collar jobs, telling us what to think with filter bubbles, and telling us what to buy and enjoy with recommendation engines? And you ask yourself, oh, that's just really convenient. And I go, well, wait a minute. The things that identify us are our job, the things we believe, and the things we enjoy and buy.
And we're gonna delegate all that to a computer? Hmm, let's have a debate about that, okay? Right, you can draw your own conclusions. Maybe you're cool with that. I'm personally not cool with that, right? And my point is, I just think we ought to have debates about all these topics, right? If I lose, so be it.

People are coming to you with business models, how to help change this. If this is heroin, are folks coming to you with methadone, with some way to wean off, that still gives people a little bit of a fix, because it's hard to go cold turkey off of it? I don't think that's the problem anymore, okay? I think addiction, we started out thinking addiction was the problem. Now the problem is this business model where, at the margin, they don't need you using these products anymore. They have so much behavioral data. And remember, with products like Amazon Alexa, all of the smart devices going around your home, right? They no longer need you on your smartphone. They're gonna be able to do surveillance wherever you are, okay? And with AI, they're gonna be able to implement the behavioral modification under the guise of this very authoritative, neutral piece of technology. How many Alexas do we have here? How many people have Alexas? Show of hands. So, one of the great, hilarious moments of this: in the last week, I've done two podcasts with people who had Alexas in the studio where we're doing it. And both times, when I get to this part of the conversation, I just say the word Alexa, not hey, Alexa, just Alexa. She comes on and says, I'm sorry, I don't know what you're asking. And my point is made for me without saying anything, right? These things are listening all the time. And I believe Amazon when they say, we only record when you say hey, Alexa. But I also know that they will eventually push the edge of the envelope. They can't help it. Some greedy product manager will not be able to resist the temptation.
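The filter-bubble dynamic raised above, recommendation engines telling us what to think and what to buy, can be illustrated with a minimal toy ranker. Everything here is invented for illustration (the topics, the scoring rule, the function name); the point it demonstrates is that a feed ranked purely by past engagement keeps surfacing more of what the user already clicked:

```python
# Minimal filter-bubble sketch: ranking items by how often the user
# already engaged with that topic narrows exposure over time.
# All data and the scoring rule are invented for illustration.

from collections import Counter

def rank_feed(items, click_history):
    """Score each item by how often its topic appears in the user's
    click history, so past clicks dominate future recommendations."""
    topic_counts = Counter(click_history)
    return sorted(items, key=lambda item: topic_counts[item], reverse=True)

feed = ["politics", "sports", "science", "politics", "cooking"]
history = ["politics", "politics", "sports"]
print(rank_feed(feed, history))  # politics items float to the top
```

Run the loop a few times, feeding each round's clicks back into the history, and topics the user never clicked sink permanently: that is the isolation-and-polarization mechanism Larry describes at the start of the conversation, in five lines.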
They will figure out the device is in somebody's office and trade on inside information. They'll figure out it's your bedroom. They'll use it for political gain, right? You can see what's going on here, right? I mean, there was a family that got hacked two weeks ago on a Google Home security device and told there were inbound nukes coming. Now, you only have to have a fourth-grade education to hack Android, but that's all you need, right? I mean, literally kids under the age of 10 can hack Android.

So we've got a mic, basically to make sure that the question is recorded for the recording. We've got a lot of people. I would like to start with the ladies, if you don't mind. I find it's healthier if we do that. So, ladies, which of the ladies has a question first? Okay, right? Hi, Joan Donovan. I work over at the Shorenstein Center now, but I used to work at Data & Society, and I'm very familiar with your research. Thank you. Thank you. And the crew of people that you draw research from. And so there's a burgeoning field here of disinformation, media manipulation researchers that have been really trying to get at this industry policy question. And I think some of the work that you've been doing, especially, there was an op-ed you released last year with eight points. Do you think at this stage, what do you think are the top maybe two priorities that we should work on? Maybe something that only industry can handle and something that we need government to take seriously? Thanks. So I don't have any solutions that industry can handle at this point, because we're two and a half years into this and industry, with the exception of Apple and to a lesser extent IBM, the industry has been a wall on this.
I believe the top priority in the short run is to begin a legislative program that makes the sharing or selling of financial transaction data, health data, location data, and any data on children illegal, and makes it criminal. We need that incentive change. I think the California effort to create rights of civil action is really important, and that's the first thing we're gonna get. I mean, if we can get that through California, then because of the way the state is currently structured, we can potentially get it in a matter of months, and that will really change things. I think the European efforts, particularly the German effort to eliminate whole classes of profiling on products like Facebook, are a really good idea. I think the change in copyright across all of Europe that will, if it survives, make YouTube's business model non-functional is super important. I'm actually a fan of the fiduciary rule, notwithstanding the obvious issue that it's not enough, because the only purpose of the fiduciary rule is, again, to create a right of either civil or criminal action, right? It's piercing the terms of service. That's the critical thing you gotta do here. And so those are the things I wanna do first. The thing I'm personally working on is antitrust, and I think antitrust is super important to create space for a competitive universe to develop. Right now these guys are blocking all the sunlight and choking off all the capital. Again, it would have been super helpful if Zuboff had shared some of her ideas sooner, because we would have gotten there faster, but I had arrived at the same place and found a way to use the Chicago School to pierce Facebook and Google's barter system, okay?
Remember, they're trading services for personal data, and they're only compensating you for the personal portion of it; they're not compensating anybody for the behavioral surplus, and that represents price inflation. We're working with people here on the legal side and with people at Yale on the economic side, and, knock wood, we'll make some progress on that this year. Right over here. Hi, my name is Kira. Thank you for coming to talk with us. I wanted to ask about the complicated confluence between government regulation on one side and government interest in the sort of data that's generated on the other. You talked briefly about Facebook selling data to police forces and other instances. It wasn't Facebook, it was an advertiser. Okay, but facilitating that sort of exchange. There's a pull in different directions between the government's desire to create privacy and its desire to harness those tools. This is such a brilliant question, and I don't wanna pretend I have an answer here. What I've noticed is that Microsoft is currently having an employee pushback relative to a Defense Department contract. Amazon has had pushback relative to drones and relative to facial recognition on the same grounds, and Google has had some pushback. Ironically, the employees of these companies have no problem with Myanmar. They have no problem with messing with all the users. They just have a personal issue with some of the military uses of this. I don't know how to solve the conflict you're describing, and I believe it exists. In fact, I am told by people who used to be in that world that the status quo is considered very desirable, now that the agencies have seen not just the Russian stuff but what Bannon did, the use of micro-targeting to suppress votes, and that that is something they're excited about using around the world. I don't know if that's true or not. I've just been told that that hypothesis may be valid. That's a tough tension, right?
But again, I would simply observe there are more of us than there are of them, and we have voting power, and we have an election coming in 2020, and we need to make this a huge issue. And I will tell you, there are people. In California, my Congresswoman, Anna Eshoo, has Google in her district. The adjacent district, which is Jackie Speier's, has YouTube and Facebook. They are both standing up to those companies right now, very brave. I do not see our two senators in California doing that, one of whom is running for president right now, and I can't let her run for president until she stands up and says she recognizes that the interests of the country lie in at least having these debates, because we're not there yet. We're in Larry's area of expertise here. What about taking on companies this powerful in terms of elected representatives? Larry, this is your specialty. What odds is Roger facing? Yeah, it's exactly zero. No, that's not gonna happen. But are there any presidential candidates who have? I know there are, but I mean, who are the ones who have? Because it's interesting to compare this debate now with network neutrality in 2008, which became a really sexy topic for Obama to take on and show that he knew something, and it was aligned with the interests of Silicon Valley. But it's gonna be very difficult for these Democrats to take on an issue that, once you recognize surveillance capitalism is the new capitalism, these people are gonna be opposed to. So who's been good about it? To be clear, only two companies today benefit completely from surveillance capitalism, right? But Microsoft is eager to become the third. That's right. No, that's right. And Microsoft, I would actually argue, is clearly a massive performer at the AI level, and they are using LinkedIn to the best of their ability to do the same thing. So it's the future for everybody. They just weren't a driver of it.
They're not a thought leader in it, but they're a willing participant. So name names; who's identified? So, Warren is fantastic on this issue, okay? And the reason is that her core issue, which is corruption in the financial services market, is literally the exact same intellectual problem as this one. The day that we met Warren in July of 2017, she had asked for a meeting with us on this issue. I mean, nobody was talking about it then. So I think if you had to pick one person that I am most confident of taking this to the goal line, it would be Warren. Klobuchar has been fantastic in the hearings. I don't know her, so I can't be certain where she is on this, and she has other issues, but Klobuchar is really good. And of the people running, they're the only ones who I think have shown any leadership in this space. There are plenty of other people showing leadership. Nancy Pelosi has been fantastic. The Bay Area delegation has not been uniform, but they've all been trending in the correct direction. And the reason is really simple: Google and Facebook can have very fine businesses without harming humanity, right? The truth is, America was much better off busting the trusts in the early part of the 20th century. Growth exploded, and the companies that were broken up all prospered. And in tech, antitrust has been uniformly great, not just for the industry, but also for the people who were the targets. Well, while we had antitrust; the last case was Microsoft. I'm not arguing with you. I'm just saying going back to antitrust is not a crazy idea, which is why Republicans are open to that notion. This is a right-versus-wrong issue, not right-versus-left. And I'm working really hard. If you go back, it's really hard to find a photograph of me with my hair this short, right? Because normally it's shoulder length, because I play in a rock and roll band, and I fundamentally have a hippie value system.
But I have a message to sell to a community that might not be receptive to a guy with long hair. And so I have modified it, okay? Another thing that I've learned. There's a school of thought that's called hipster antitrust, and he's one of the big leaders in it. So, more questions over here. We have one. Are we allowed to talk to men now? Yeah. Okay, right. So we have a man down there. Yeah, no, but I'm right about this thing. I've noticed this: if you let the men go first, they just dominate forever. If you let the women go first, you have much better conversations. Even the men behave better. I'm with Patient Privacy Rights. 5G is an interesting sort of legal and platform evolution, because it eliminates the ability to run a firewall relative to the things in your house. You are now creating a situation that I think is qualitatively different. And we just saw in the Times, for instance, mayors going against the telecoms in terms of infrastructure for getting to the internet. So never mind the platforms like Google and Facebook that you're talking about. What do you have to say about this 5G thing and the way Washington is gonna deal with it? Oh, I mean, I have no idea how they're gonna deal with it, because think about it this way. From 1956, which is the first AT&T antitrust case, the consent decree, until 2003, the technology industry, particularly Silicon Valley and Route 128, produced an endless string of life-improving products, right? Steve Jobs used to describe computers as bicycles for the mind. Technology was about empowerment. Google's model, then adopted by Facebook and now by Microsoft, is an extraction model, right? They say in advertising, you're not the customer, you're the product. In Google and Facebook's model, you're the fuel. And in that model, we're in big trouble, right?
And the problem with 5G, as currently envisioned, is that it is architecturally, and philosophically, indistinguishable from that earlier world of optimism where we had nothing to fear. And the irony is that the hardware is primarily made by Chinese companies currently on a list of sanctioned businesses by our Defense Department and our intelligence agencies. So you have three layers of risk. You have the surveillance risk of this pervasive thing where you have no opt-out and no privacy. You have the fact that the hardware is made by people who are potentially, from a strategic security point of view, at odds with the country's interests. And thirdly, everything sits on top of the Android operating system, which, again, you need no more than a fourth-grade education to hack, right? So there are some issues with 5G, and we're definitely not prepared to take it on. And I don't know, this is the place where my optimism fades and Larry's pragmatism probably prevails. We're probably gonna let this get too far before we realize how big a problem it is. I mean, a third of you already have Alexas. And some of you have them within earshot of your bedroom. For what? To tell you the weather and a playlist. And turn the lights on. And turn the lights on. Things you can actually do with almost no friction. No, that's really hard. Yeah, it's hard for you, Larry, but... But that's why you're a professor. These people are students. They're gonna go out into society and be productive. Harsh. Harsh. Straight up here. Good. Hi, thank you so much for your talk. You had mentioned that Facebook and Google are not necessarily evil; they just don't know better, or they have narrow knowledge. So I wanted to know what type of system allows people with narrow knowledge to attain so much power, and what your thoughts are on that. Maybe there's a system in place that we should probably address as well. Well, let's think about how the United States views education, right?
For 40 years, we undermined public education because one party viewed it as a waste of money, and politically unhelpful to have an educated population. Then you've got today's insane focus on STEM, to the exclusion of everything else. And I'm sitting here going, look, here's a pro tip: one of the first use cases that's really gonna make a lot of money for AI is programming, right? Teaching everybody to code is less useful than teaching them Latin, right? You're basically teaching them a dead language, but it'll be a dead language that doesn't have any good literature written in it, right? And so I think the STEM thing is spectacularly misguided. Teaching people to test is misguided. You are all survivors of that system, right? Which is amazing, because the system was completely organized against your interests. And the challenge is, can we teach people critical thinking skills, right? Facebook and Instagram are designed to prevent critical thinking. YouTube and Google are designed to undermine critical thinking, right? The first two just make critical thinking irrelevant by keeping everything in a rapidly moving newsfeed. But the other two, right? The level of nonsense in them is so high that it literally subverts it. I mean, I had a person come to a talk I did in LA last week whose eighth-grade daughter had come home and told her that Pearl Harbor could not have happened. She asked the daughter, well, why not? Oh, because the Japanese didn't have a plane that could fly from Japan to Pearl Harbor. Where did you hear this? I saw it on YouTube. And she couldn't talk her out of it, right? And then you've got obviously all the stuff with the recommendation engines, right? If you are worried that you might have a body image problem, your first, second, and third recommendations on YouTube will be things that aggravate some sort of eating disorder, right?
I mean, there's a lot of money in that on all these platforms, and it really undermines things, because somehow people don't have a value system that says, I'm responsible for other people. We have this rampant, really weird form of libertarianism that says it's not about your freedom to choose; it's about your freedom to disregard the rights of everybody else, right? And I find that inconsistent with my own value system, and I'm just hoping to go out and be a spokesperson for a different model, but I don't wanna pretend I've got the solution. How long does this take? We have lots of educators here at Harvard. We have lots of educators in this room, in the second-to-last row on that side, who can help you get a better answer. So we're down to four minutes. Katherine Steiner-Adair, right back there. Go say hello to her afterwards. She's written a book on this topic, okay? And Harvard is full of people who've really thought about this deeply. Remember, I'm only a millimeter deep, okay? My job is to raise awareness and get people focused on solutions proposed by people who actually know what they're doing. Other questions? Thank you so much for being here today. My question is, I remember a time before Facebook started changing or choosing what you see on your news feed, when everything just came in chronological order and people would post at dinner time because that's when they knew everyone was checking. Before these recommender algorithms, before Facebook actually started becoming kind of like a curator of news instead of just a platform that pushes things out to you like an internet service provider. Do you think that was the time when everything changed? Because for 10 years we had Facebook and we never saw any of this until very recently. So I think it actually changed at Google first. Google started down this path in 2003, and it was basically about this notion of using what they call digital exhaust, using the things other than the search query itself. Originally, before 2003, all Google data gathering was about improving the quality of the search results. But there was a guy inside who noticed that there was predictive power in the metadata, the data about data related to your searches. And so they filed a patent on that, and that was the actual beginning of it. Facebook was responding to that, and they responded to it pretty much when Sheryl came over and other Google people came over and brought that insight with them. And I don't think it was the technologies created at Google, which is why it took Facebook relatively longer to catch up. But on Facebook, you could see it sooner, right? Keep in mind, it wasn't until 2011 that Eli Pariser wrote about filter bubbles, right? I mean, even for really smart people paying attention, it took a while to see what was going on. These guys covered their tracks really, really well. And in Google's case, they continue to do that, right? I mean, Facebook stumbles over itself every single day. Google is generally speaking more careful, but even they make screw-ups, right? I mean, they put out a quote-unquote research app that Apple had to ban from the App Store, right? So we're still learning, right? You may figure this out way before I do. If you do, please tell me. And I'm really serious: if you guys are interested in this, let me know. Let me help. I mean, this is a really hard problem. It requires people who wanna change the world, but this is one of those moments when we can actually do that. So I think you have books outside. I do; if you wanna get one, feel free. Are you gonna sign books for people? I'd be happy to. Okay, so Roger will be out there to sign books. And if people would like to follow up, I'd really like to hear from you, okay? Because here's the thing: you don't need my approval. If you see something to do, just do it, okay? Because I have no idea what I'm doing, okay?
I'm making it up every day, and you may give me the critical idea that becomes the next big thing; I'd love to hear that. But the other thing is, you may just do it yourself, in which case, let me make sure you get the resources to pull it off, okay? Thank you all for coming out. Thanks. Thank you.