Well, hello and welcome to people both in real life and, fittingly, in the virtual world, to our 20th anniversary symposium. We've actually planned this since 1998, and we're pleased now to be able to deliver on a paper Larry wrote, Larry Lessig, that is, that at the time tried, I think, to impose some structure over the creative chaos of what was maybe a field we would call cyberlaw. Larry Lessig tried to offer up some theory about that and to get things started in a productive way. We're going to learn about the intentions and the project he had embarked on then, and take a look, with the benefit of hindsight, now that so much has been written down, at whether there is, or at least has been, a field of cyberlaw, and where that has taken us. We welcome your questions and comments, and we'll have a discussion among our three parties here as well.

I also want to start by thanking Ben Gal Sharon for having the idea. He was the one on the alarm clock saying, it's been 20 years, isn't it time for a panel? And we agreed that it was, so thank you, Ben, for that.

And we should introduce the folks that we have here to discuss things. Laura DeNardis, maybe I can introduce you, since you are looming over the whole project. Laura was meant to be here today, but the winds and rain got the better of her flight out of DCA. So just tell us a little bit about yourself, and perhaps when you first encountered the forces of cyberspace.

Can you hear me? Okay, I will start with a quick audio check. Yes, we can. Wonderful. Well, thank you very much. It's wonderful to be here.
My name is Laura DeNardis, and I'm a professor at American University, where I also serve as faculty director of the Internet Governance Lab. I first discovered the forces of cyberspace when I was a graduate student at Cornell in 1988, when the Morris worm hit and eventually got traced back to Cornell, got traced back to my building, and got traced back to my floor. A fellow graduate student, Robert Morris, had launched it.

It was his mistake to call it the Morris worm. The Morris worm, yeah. I became very interested in that, of course. I first became exposed to Professor Lessig's work right after Code and Other Laws of Cyberspace came out, and it really helped me to understand what I was doing at the time. I was a practicing engineer; I was doing work designing parts of internet infrastructure, especially for large Fortune 500 companies, but I also did security clearance work for the Department of Defense. And I always knew that what I was doing, in terms of designing architecture, was also a form of governance, and the book really helped me to understand that. So that was my first exposure to that work.

Uh-huh. Wonderful. Professor Okediji, Ruth Okediji, tell us a little bit about yourself, and, if you like, how you came upon the forces of cyberspace.

So my name is Ruth Okediji, and I teach intellectual property law, copyright in particular, and trade and a bunch of other international things, and I'm a faculty member here at Harvard Law School. I first encountered... well, first, let me say, I was standing in front of the Lewis building, for those of you who are familiar with the campus, and I used to work at Langdell, and somebody said, have you seen this paper? I mean,
This is like where you were when Kennedy was shot. I remember the exact spot, and I remember seeing Professor Lessig's name and going, who is this Lessig person? And I sat on that bench right outside Lewis and read the paper. Wow. And I will never forget both my sense of sheer terror that we were launching something that we had no idea where it would lead us, and then the sense of skepticism, like, well, how does he know he's right? And that began my own sort of work and research into both thinking about the internet as a form of governance and thinking about social interaction on the net, and the way in which law could mediate that, or might not mediate that. And it's been 20 years, and Larry was right, and I'm still researching.

Wonderful. Wow. All right. Well, Larry, that tees up a nice introduction opportunity for you, both to tell us a little bit about who you are and, for the many people in person and out on the internet who are now hearing about this transformative paper, what was it saying?

Well, first, something I never do: I want to start by thanking Jonathan and the people who have pulled this together. I'm really grateful to mark this paper, because in a certain sense it feels like the issues that this paper was struggling with have just come back to life. And so it's really interesting to see Primavera here, who's just written an extraordinary book about the law and blockchain. It feels like this struggle all over again, and so I'm happy to have this moment to reflect on it and see what we've learned, and maybe what we've not learned.

So the basic impetus for the paper was frustration with Frank Easterbrook. You mean the esteemed judge on the Seventh Circuit? Yeah, who evokes frustration in most people who encounter him, because he's a very strong-willed, very brilliant judge. And he gave a presentation at a conference we had organized at Chicago, where I was before I came here, where he basically said there is no law of cyberspace.
There's nothing interesting about this; it's like the law of the horse. There's contracts in cyberspace and torts in cyberspace and pornography in cyberspace. You know, there are all these things to regulate, but there's nothing that holds it together. And it seemed to me he was missing something pretty important. What he was missing was actually a way of seeing what the objective of legal regulation in general was, or what the problem of legal regulation in general was. And that was to recognize that regulation is always about a choice among different modalities of regulation.

So we have, in a well-developed society, different ways of trying to bring about what we're trying to bring about. We could use the very crude form of just regulating through law: the law tells you what you have to do. Or you could regulate by trying to use norms, social norms, to bring about what you're trying to do. (Or you could be the government, you could be anybody, but let's start with the government.) Or you can use the market: you can tax or subsidize to create the behaviors you're trying to create. But what the internet brought out was the way in which you could use the very physical environment, which of course in cyberspace is just the technology, the code of cyberspace, but in real space is just the architecture of the space. And depending on the choices you make in all four of those modalities, you bring about behavior differently, or you enable different kinds of behavior, or you disable certain kinds of behavior.
So in this room right here: had they chosen to make the windows reach all the way down to the ground, the architecture of this room would have encouraged or enabled people to stare outside the window rather than focusing on what was going on at the front. From the standpoint of pedagogical control, that would be a bad design choice. But we can see the way in which the design choice enabled or disabled certain kinds of behavior that we might decide we were eager to enable or disable. And that's exactly the kind of reflection that I think every regulator needs to be thinking about, especially because the government plays a role in that trade-off. The government decides which of these modalities it might try to deploy, and what it ideally should be doing is thinking about what is, in some sense, the cheapest way to bring about the mix of modalities that gets us what we want.

And does that mean you are the right-hand person to a regulator, for the purposes of this paper? You're like, hey, regulator, you're wasting some of your time, perhaps, on law. And you know a little bit about markets and taxes and subsidies; you know a little bit about norms, like, I don't know, Smokey the Bear says help prevent forest fires, that kind of thing. But boy, you're missing out: you could be regulating the code, if you just do it wisely. Was that kind of the disposition you had at the time you were writing?
Well, I mean, that's kind of the story you would tell if ever there were a regulator interested in listening to me, of which I have found not many in the 20 years. But, I mean, that would be the conversation to have. And the important reason to have that conversation is to recognize that the contingency of, at least, the modality we're thinking of, the contingency of the architecture of cyberspace, can enable or disable the capacity to regulate.

So way back in the day: 1998 was a time when we were still engaging with the brilliant John Perry Barlow, who two years before had made his declaration of the independence of cyberspace, and had made it sound like cyberspace in its nature was a place that could never be regulated, and that in some sense life was going to move into cyberspace and be free of the government. And part of the motivation, in responding to Barlow in this way, and in Code, was to suggest: well, you know, the regulability of that place is a function of the architecture of that place. So yes, the architecture of the internet as it originally was made it really hard to regulate. But you might imagine the technology evolving to persistently watch everything you're doing and enable simple traceability back to the identity of the person who's actually doing it. And all of these evolutions in the architecture would increase the regulability of the space. And so then we need to decide: do we like that? Do we want that?
And of course the fear, the fear that I described in Code more extensively than in this paper, was that there seemed to be an interest of both government and private markets in increasing the regulability of the space, the ability to track and to trace what people were doing: markets and commerce for the purpose of selling things, governments for the purpose of controlling people. And if that's what we wanted, great, we were going to get it. But if it's not what we wanted, then we should think about what interventions might be appropriate. As you put it, the kinds of things a commercial search engine or other enterprise might do to figure out where you are, so they can ship you a pizza instantly without you having to ask, are the kinds of things that then could be adopted by government.

Laura, maybe I can ask you: as you were thinking about the pipes that the internet comprises, metaphorically (thinking of Ted Stevens, the metaphorical pipes), to what extent is a framework like the one Larry was talking about informing that? And how much, do you think, now versus maybe then, is it important for all citizenry to somehow be present at the kinds of meetings that, let's be honest, most people would find as boring as can be, when we're talking about an internet protocol or an ICANN domain name council or something? How important is it for the citizenry at large to be there, given that the choices made there affect the regulability of the space, and therefore, ultimately, the behavior of citizens online?

There was an example just in the last 24 hours of how citizens can have an effect on the rights that are instantiated in platforms, when Chinese citizens who are LGBT community members pushed back against censorship on Weibo that came from the government initially. So I'm always one who believes that there is a way for citizens to intervene. It gets a little bit more complicated. How did they push back?
Maybe you should tell that story real quick. I only have a media-level awareness of it, because the story just dropped today. But, you know, there are two different models, or views, of the role of governments in internet governance right now, and the one coming out of China is really about tightening controls more than they ever have been, under the guise of cybersecurity. One of the things that has been happening, similar to in Russia and other countries, is cracking down on certain kinds of information flows, and one of them was content related to LGBT community members. So Weibo actually turned back and reversed their decision. That's the extent of what I know about it, but I think it's an example of how citizens can get involved.

But your question, Jonathan, is a very important one, because many of the decisions have nothing to do with content-level platform scenarios like that one. One of the great contributions of this article, and of subsequent books that Professor Lessig has written, is this idea of the arrangements, and I've adopted this myself in my own work, both as an engineer and as a scholar of STS. The way I describe it in my own work is that arrangements of technical architecture are arrangements of power. This often has nothing to do with governments whatsoever. For example, the pipes: there are protocols, there are routing infrastructures, there are interconnection arrangements that connect to civil rights, there are decisions about the technological affordances in platforms.

Just to give one or two examples that are very brief: the World Wide Web Consortium designs accessibility for the disabled into their protocols. This is a perfect example of how the architecture is also determining public interest issues. Getting a little bit more technical now, there are some protocols...
In fact, while I was reading this article, I was working on my doctoral dissertation on IPv6, and that's something that perhaps only this crowd can appreciate. It's a single protocol that has enormous public interest implications. One of the decisions that had to be made was whether to design hardware identifiers into internet addresses, which would basically create an identity system connecting people to their laptops, and therefore potentially to them. There are many, many other examples of this, certainly in social media platforms.

Can citizens get involved in those? Absolutely. There is an effort right now in the Internet Engineering Task Force where, although it's hard to have the technical expertise necessary to get involved, there is a surge of graduate students and young professionals and people from industry not necessarily involved in these protocols going to the meetings and having an impact, and I think they really do have an impact behind the scenes.

So you would advocate, maybe, for citizens to somehow get an "everything you need to know about a technical organization like W3C or IETF but didn't even know how to ask," to get up to speed, and then to turn up at the meeting and express a view? That's kind of like the town hall equivalent for governance of code.

That is the town hall equivalent, but there is a major caveat that I'd like to suggest, and that's that some of the newer areas of cyberspace are embracing proprietary standards that come out of closed organizations. There really is a resurgence of the proprietary protocol. This is certainly the case with the Internet of Things, where we have proprietary enclosure in a way that we never really had before. We also have it even in social media platforms, and a lack of interoperability between various kinds of platforms. And in some of the standards organizations behind that, even if it's not completely proprietary, there are organizations that don't let people in the
room, or that don't publish their standards for people to inspect, which is a very important part of accountability. So that is a caveat: it's not possible to go to all of the meetings, because not all of them are open.

Yes, and that might... oh, sorry. It's actually more than a caveat, right? I mean, this is the point of your work, right? Because if you imagine the internet as it starts, the consequence of its design is to create something like an innovation commons. This is what network neutrality is eager to protect. Since there's nobody who controls what content or what applications you get to run on the internet, what that means is that the right to innovate on the network is held in common, and everybody can innovate, and if somebody likes your product, they'll take it; if they don't, it'll disappear.

But as the space of innovation moves into these proprietary spaces, or these islands of innovation, as you get to innovate in the Google platform, you innovate in the Twitter platform, you innovate in the Facebook platform, then, as someone might put it, the generativity of those innovations is contingent upon the platform's permission to allow you to innovate in that particular way. And this is an example of how the technical change and the legal ownership work together to change the basic opportunity to innovate. Because now you've got to fear that, gee, I'll produce this really cool app on Facebook, and it'll get a whole bunch of people excited, and then Facebook will pull the rug out from underneath me and I won't be able to do it anymore. So I'm not going to waste my time in that space the way I might if I didn't believe that that underlying feature was there. And so the point of the modalities is just to think about how these different components work together, and we've got to evaluate: what is the environment? What is the freedom?
What is the affordance that this mix might make possible? But I think it's actually even worse than that. I mean, yes, I think I'm much more troubled by another step in what Larry has just shared, and that Laura has articulated as well, and that is: it's not just the proprietary spaces. Those are bad enough. But I think one of the things that the article set out so starkly, at least in my eyes, 20 years ago, was that the choice of how we design freedom was muted by the mystification of the technology.

So when Laura talks about the importance of citizens showing up at this sort of equivalent town hall: it's important to show up even if you don't understand the technology, in my view, in part because there's some sense of accountability when there is presence. When there's no presence, then in fact the idea that those who want to go into their proprietary spaces can do so with no accountability becomes the baseline. We all begin to expect what we are told, or how we're told the technology has to work, in part because we're not there to ask questions at the beginning and to participate in the choices, or at least to weigh in on the kinds of underlying values that each of these modalities represents. And so the accountability, the transparency, gets pushed down as a second-order priority, because what's really important is getting people on the app or getting people on the platform.

And so I think even 20 years ago, Larry, I was struck, in the paper, by how difficult it would be, given collective action problems, for all sorts of different reasons, to persuade a sufficient number of the population that these technical questions were in fact questions about constitutions and about law and about liberty.

And, Ruth, part of maybe what's lurking here is a long-standing distinction, and then confusion, about the public and the private. The kinds of meetings that Laura was referring to, like those of the Internet Engineering Task Force or the World Wide Web
Consortium: they're not public in the sense of government meetings, but they're public in the sense of a cultural, and legal (in the sense of no intellectual property claims made), understanding of a shared resource, and a choice to govern it in some quasi-transparent or open way. At these IETF meetings they call for a hum in the room to see if there's consensus, and anybody can kind of turn up for that. The World Wide Web Consortium has members, and they vote in a certain way, and membership has some porosity to it.

But almost by historical accident, there are other platforms and technologies with just as much influence on our affordances. Laura kind of adverted to them a moment ago: proprietary platforms. And Larry was talking about these islands of innovation, like Facebook and Google. Do you think the way to think of those going forward is: all right, well, they're private, we're not going to, like, nationalize them or internationalize them, we just need to regulate them from the outside, at the margins? That's kind of what appears to be contemplated for Facebook right now: it's going to hearings; will it be regulated? Or are there some platforms, even if they're private, so
pervasive that they call for the kind of accountability, transparency, and participation that we normally think of as reserved to public governance systems?

Yeah, I mean, I think I would perhaps suggest that I don't take solace in the idea that Facebook will be regulated, if the point of that is that we move internal controls from a private company to the government. That doesn't strike me as the kind of push in Larry's article about the regulability of cyberspace, nor does it solve, I think, Laura's point about these islands, these closed meetings. What I would expect is a regime, maybe call it consumer protection, call it public policy, call it the health code: that there are these islands that eventually become significant enough, that notions of public welfare, notions of how norms are even set, are so vital, that it's important for the regulation to be both internal and external. In other words, we have consumer protection laws that the government can enforce and that private citizens can enforce. I don't see why we have to have a binary between either the private or the public.

It's not clear to me that one of the advantages we shouldn't exploit in cyberspace is the possibility that, as users, when you set your privacy controls, and as governments, when you think about where data is stored and what your privacy laws will be, that kind of voice from users is as vital a source of governance as the formal governance that we might see coming from governments. And my sense is that if we're going to be serious about the idea of liberty, we're going to be serious about the idea that regulation is not just law; it is in fact code and law and norms and markets. There has to be a way, and a mechanism, in which the voice of the users and the voice of the public gets channeled into the very architecture of the system.

I mean, think about our disability laws, just to use the hypothetical that Larry raised in his book, or even this classroom.
Years and years ago, it was all right not to have access, or accessibility, built into the engineer's design of the building. The idea that people with disabilities, physical disabilities, should have equal access to a physical space is an idea that began to grow as people began to demand it, and as people began to recognize the legitimacy of it and the necessity of it. That then, of course, translates eventually into something that we call hard law. But it stemmed from the expectation that equality was not just about putting words on paper; there was the sense that the architecture of the space required us to do something different for those who could not access it on the same terms.

But the advantage in that case, and that's a really great example, is that ordinary people could have imagination enough to recognize that you needed a ramp, and that people in wheelchairs could use a ramp to get into a classroom. That's not terribly difficult; once you've seen it once, you can begin. And so the capacity to think critically about how we build and what we require is common; we all can do it. But in the context of cyberspace, one of the things that continues to frustrate me about this domain is that we don't yet demand of our regulators, or, for example, of our law students, that they have enough understanding of the potential of this space to actually critically evaluate the various trade-offs that are possible.

So, for example, think about the privacy versus security debate. After 9/11, the standard political way to present this was as a trade-off: we either have privacy or we have security. As if, okay, given we've got terrorists, we're just going to have to give up privacy, because we need security. And of course, anybody who knows anything about the technical infrastructure knows that's an incredibly crude framing of a much more subtle set of choices that could be made. There are ways to architect the identity layer,
there are ways to build so that you could preserve traceability and still have privacy, or at least pseudonymous privacy. And the regulator needs to be able to imagine these alternatives when deciding which to demand. That requires an understanding of these alternatives, which, in some sense, doesn't necessarily require knowing how to code, but at least knowing enough to see the options, so that you could have the imagination. The equivalent of saying, let's build a ramp into the classroom: imagination here would be the equivalent of saying, let's require that the technology enable this type of capacity, so that we could do this and not that.

But just one point. I think, Larry, that, maybe it's ignorance, but the incapacity of the regulator to imagine those options is either, if you were thinking more kindly about the regulators, that they are themselves awestruck by the technology, such that they don't think they can access it sufficiently to imagine what the alternatives are; or it could be a deeper problem, where these private islands are sufficiently powerful, wield enough influence over the regulator, that they're saying: we're the experts, we're the technocrats, we know what is best for everyone and what is happening. And I think you hinted at this, certainly in The Laws of Cyberspace; we see it more in your later work, where we're seeing this shift
From citizens exercising liberty choices to experts who are, in essence, saying to elected officials: well, we know what's best, and you have to trust us. And that, I think, is a much more dangerous, subtle shift in the governance of cyberspace, and certainly much more subtle, much more serious, than I thought could happen 20 years ago.

Now, Larry, you drew first blood in this panel by using the word blockchain. You just kind of got ahead of the inevitable invocation of it. It might be an interesting latter-day example, though, as a kind of test of your theory, then and now. One conception of blockchain, especially a distributed blockchain, is the idea of trying to build a technology that ties all of our hands, that gets a momentum all its own, in which it's really hard for anybody to intervene in how it works. A key feature of it is its existence independent of any party trying to control it, including and especially government. And that represents trying to build a technology that cannot be shaped, that is like a force of nature. And I'm curious how you think about that. Is it, like: yes, you all have leveled up, makers of the blockchain, you figured out how to use code to advance your values and to make it unchangeable? Or is it: what the hell were you thinking, people who made blockchain? Now there will be contracts put out on people's lives, and everybody will be like, don't blame me, the atom split itself. Really, I'm curious where on that spectrum, if that's the right spectrum, your own thinking lies, for a technology as transformative and as apparently pecuniarily valuable as that one.

Well, you know, Primavera invited me to a conference, I guess in 2015, in Australia, and it felt like 1998 all over again, or 1996 all over again, in a good way. Well, it was Australia, so it was Australia; it's hard to be...
Yeah, it's got to be good. But there are many people who are excited about the blockchain in the kind of John Perry Barlow way: they imagine that it, just in its nature, would make it impossible for governments to muck things up, like it was a libertarian dream. And a lot of people seem to be attracted to it as if there were no bad to that, like this is unavoidable, like this is what it is. And of course it raises all the questions which, again, as I reread this paper (I'm kind of scared to read this paper, but I reread it for this panel), were all right in the middle of that paper.

So when you think about copyright law, and you imagine copyright law being enforced through technology, we recognize copyright law has certain limits built into it, like fair use or limited terms. And you ask the question: well, will the coders build fair use, or will they build limited terms? And the answer is: why would they? And if they have no reason to do it naturally, and you have a public policy that says we ought to be doing it, then that's a reason to be intervening, to assure that these values are preserved. But in the context of the blockchain, that's a really hard choice. Now, I do really, genuinely recommend Primavera's book, because it is a really subtle, sophisticated effort to do this law-and-technology analysis on the blockchain, to raise these questions in exactly that context. But it seems infinitely more difficult, given the power of this machine, than it was in the context of the internet, because the internet was pretty fragile and malleable in a way.

Now, you just used copyright, consumer-facing copyright, as an example of code able to go far beyond the parameters that the law might choose to protect that which a creator has made. It'd be great...
And this is really for any and all of the three of you. I'm curious about your reflections, with 20 years behind us now. If I'm right, as an approximation, what many of us who read your paper and were moved by it were thinking was: copyright was the big blinking red light. There are all these new technologies that we think can let us share stuff really easily, but the same technologies of digital rights, or, as Richard Stallman would say, restrictions, management will lock it up tighter than ever. Is it fair to say that, in the intervening time, for a consumer, there is far more access to stuff than ever before, including access by which to critique and make derivative works? Now, there's still some light DRM around the world, and, peeling that off, the plight of libraries trying to save this same stuff for humanity forever, that's a really bad situation. But put those aside: for a consumer in this room, or watching online on YouTube, and able to kind of enjoy this stuff, were our fears overblown? Or was it like Y2K: precisely because we were terrified and raised the alarm, we saved humanity?

I don't think we saved humanity, in part, I think, because, again, I don't know how many regulators Larry has actually tried to talk to. (As many as you.) There is this mystique, and I think one of the things that we often forget, particularly for those who are lawyers or law students in the room, is that law is itself a form of code and has its own mystique. And so when lawyers get up and make the argument that the internet is the place where pirates thrive, that feeds very easily into: it's the place where terrorists hide. I didn't mean for that to rhyme, but the point is, it sells, and it sells easily.
It's pithy. And once you add that to the power of the market: we have created franchises that have made this nation extremely wealthy in the context of global trade. The idea that that would be threatened by the internet, followed by the threat of terrorist attacks, and the fear of someone having greater technological abilities that might match or counter what we do, created an environment in which the government, in many ways, said we need stronger protections.

I see the rhetorical, and I mean genuine rhetorical, linkage between a moral panic around copyright and a moral panic around, say, security and terrorism, and the way in which privacy may really be... I think if we were to poll the room, my guess is we'd find people at least as, if not more, nervous about privacy now as they were in 1998. But just on the IP front: do you think it's right that we're not in the copyright dystopia, from the point of view of a consumer or a maker, that we thought might come about?

Yeah, you know, JZ, I actually think that we're worse off than we think we are. I mean, the reality is that most of us in this room who are creating and sharing are creating and sharing in the shadows, or in an environment in which nobody really cares about my article, or Larry's, or who's going to make the next buck off of something that you've written. Most of us are sharing in what I call an innocuous space: there's no big market player, per se, that is interested in what you and I and most of the people in this room are doing. But if you were to touch one of these islands, the purveyors of these islands of creativity, the story would change very drastically. And I think that's the binary that concerns me. We are seeing what those of us who are in the IP space have always known: that there are these elite sort of bastions of creativity where, if you were to touch a Disney, or touch a McDonald's, if this were a trademark case, the response that you get, and the kind of reaction about
what you can and cannot do is far different than if somebody circulated Larry's article 50 million times around the world. For which the real, I mean, the real third rail is the Olympics. Yeah. So, but here's a great example of this. At the very beginning of 2000, I guess, I was friends with Sean Parker, who, you know, at that time had helped start Napster. He would go on to Facebook and become quite incredibly successful, but he also went on to Spotify. Okay, so when Sean Parker was in the early 2000s, his view was that copyright law was wildly outdated, needed to be updated for the internet. It was crazy. It made it impossible to do all sorts of obvious things, and it was a disaster. I was on a panel with Sean Parker about four years ago, and his position then was: copyright law was perfect, shouldn't be touched. There was nothing wrong with copyright law. And the reason was, his company Spotify had managed to negotiate the incredibly complicated set of structures, so that now copyright law was their asset, because it made it impossible for anybody else to step in and do the very same thing. So when you think about the alternative, what the future could have been: one is the Spotify future, where of course we do have access to unbelievable amounts of music, and I love it, and it's relatively cheap, and that's fine. But compare it to Terry Fisher's vision of what copyright law could have been, the ability to share and have some monitoring, but in a much more decentralized way that would have enabled much more competition and much more innovation around business models. I do think that it's worse than it seems to us, because from the standpoint of the people who care for innovation and the opportunity for this kind of creativity, it's not a great place, even if for the consumer there's lots of things we can do without fear. I got it. We're already 40 minutes in; we have 20 minutes left.
I'm eager to get comment from folks in the room. Yes, sir. Feel free to use the button in front of you with a UN-style microphone, into which you can then speak. And feel free to, I'm not yet, I'm not, everybody gets a shot at everything. So let's go to that question, and Mara will no doubt be able to work in her point. Yes, sir. Yes. That's okay. Hi, I'm Zack, and thank you for this wonderful talk. I'm just wondering, my question is more about the price mechanism in the framework. So we mainly discuss architecture and law and norms. For example, when Dr. DeNardis mentioned the LGBTQ incident recently on Weibo, from my perspective, actually, I think it's more related to the regulatory arbitrage between different platforms. For example, China has the largest online LGBTQ dating app, which has very strict regulation on all kinds of content, like obscene and violent content, and on live streamers. But on Weibo there is still a lot of that obscene content. So this time their major target was that kind of content, to reduce this arbitrage. And actually, today they gave a response saying, we no longer target LGBTQ, we only target the obscene. It was actually very stupid of them. And so the LGBTQ community was super angry at them, and they posted like 200 million posts under the same hashtag, like: we are homosexuals, what are you going to do with us? And then, I think, the sort of revolution Laura was talking about. Yes. Yes.
That's the revolution, it just happened this week. And today they said, we're no longer cleaning these contents, we're only cleaning obscene and violent contents. So from my perspective, it's because of the arbitrage between the profits from different platforms that generated certain blocks. And what the Chinese government and certain platforms are trying to do is clean up the arbitrage, to level it. This is what I'm doing with Professor Tushnet: we're trying to analyze how the advertising ecosystems on different platforms, from the market perspective, influence this whole system, from your framework's perspective. So I would love to know how we can put more emphasis and discussion on the market part of the system. Well, I do think that the market part becomes a central organizing regulator. So again, the Spotify example is that, you know, to the extent the price of Spotify is really cheap, it's easy to opt into full compliance, for most people. And yet the cost of competing with Spotify is incredibly high, because the regulatory infrastructure is so complicated. Those are two price mechanisms that create the market that Spotify likes and the potential Spotifys don't like. And it's a combination. You know, price is always a function of property rules and contract rules. So it's a combination of property rules and contract rules that get exploited like that. But I think that, for most of us,
that's the way in which it gets expressed: just, what does it cost to do the right thing, and so we have the opportunity to do it. Laura, did you want to respond? I'd be very happy to weigh in. And I'll tie it with a few of the comments that came before as well. It's very important to think about some of these issues as not being about content. Some of the most powerful things are beneath content. And if you think about it that way, then even in the area of intellectual property rights, which we were just discussing, it's not going very well. It's true that we can exchange a lot of information, but if you go just underneath that layer of content, we have more architecture-embedded intellectual property rights than we have had since the proprietary online systems of the 1990s. So it's embedded with trade secrecy around algorithms, very difficult there. There are patents embedded in standards in a way that never existed in the early days of the internet. We also have a new area of copyright, which is very troubling, and that's where, just to give an example, someone like a Wyoming farmer who has a John Deere tractor can potentially be violating copyright if he does something to repair his own tractor. Now, that also ties into this idea that we can't think about this as just islands anymore, because everything is tied together in the back end. So, I was very interested in seeing what happened in 1998 before this panel, and the first thing I came to was the stone-washed jeans, but also, the Drudge Report had just come out with the Bill Clinton sex scandal, and AOL was billing itself as the first billion-dollar interactive online provider. And that's true, but they were an island of automation. If you look at the way platforms now, tying it together with the recent comment, if you look at how platforms are operating,
they're not islands anymore. They are islands in terms of interoperability, moving us away from permissionless innovation, but they are tied together in the back end by these third parties that collect the data about us, that aggregate the data. And it's not just among private platforms, but it's also with governments. It's what Jack Balkin calls the Soylent Green: the big data is us. So if you look at all of these issues, it's very important to take it below content, to what's happening underneath. And that's where a lot of the power is: the power to do things like censor LGBT people, the power to restrict people based on these architecture-embedded intellectual property rights, and also the power to monetize us through big data that's aggravated, aggregated rather, with companies that we have never even signed a terms of service with. If we keep this distinction going, it's a venerable one, between proprietary and open, and the respective benefits and costs of each, I think that seems like a really good illustration. I do find myself wondering, one of the things that's changed in the past 20 years, as more of us have gone online and put more of our lives online, is an appreciation, for example, of the real costs and dangers of online abuse and harassment. And it may be that there's a rough correlation between, say, open permissionless innovation systems that permit all sorts of speech that can't be censored or controlled, part of what might be celebrated, but that also then might permit much more unaccountable harassment of people, which in turn prevents them from daring to present themselves online. How do we think about the fact that one of the advantages of a handful of platforms, that yes, maybe right now have algorithms that are quite unknown or trade secret, is that government can still come in and say, look, Facebook, you've got to clean up your act with respect to abuse. Here are the standards that we expect of you, lest it be actionable by the person abused. You can see doing
something like that. Isn't it true, then, that having those focal points for intervention allows for some regulation of the space, in a way that reflects the political will in a place that has democracy and the rule of law, rather than just putting our eggs in the basket of lowercase-a anarchy, well, like blockchain, where we can't affect how things go, and it is sort of just a free-for-all? The ownership affects regulability, absolutely. I mean, this is one of the points, one of the puzzles we used to talk about when we taught the Microsoft case, way back in the days of the Microsoft case, which is also 20 years ago. And one puzzle was: why was the government so interested in breaking up Microsoft, given that that would make it harder for the government to regulate? Because if they had just Microsoft, then there would be one focal point through which to achieve all the things it was trying to achieve. I think you see that here precisely. But I think one of the problems is that, you know, think about online news, like fake news and online news. You know, in the old days there were concentrated entities called editors, people who made a decision about whether something got out there or something didn't get out there. We called those people censors 20 years ago, right?
And we were against censors, and we celebrated the idea of a platform developing where there would be no censors. Well, we've seen what that's produced, right? A world with no quote-unquote censors, slash editors, is a world where the content that gets out there is something that doesn't have this layer of credibility or confidence attached to it. And it arguably creates a world where we're all more skeptical and vulnerable to being steered in the way that we saw in the last election. But, you know, that's a consequence of this interaction. And I guess the objective that I was trying to push 20 years ago was that we needed to become more sophisticated about thinking about this interaction, because that would be the kind of change that would, you know, flip us. Other questions or interventions? Way in the back. Yes, sir. Hello, everyone. Hi. Hi. My name is Alan Holder. Thank you for holding this panel. It's great. My question is not just for Professor Lessig; I would like everyone to weigh in. I believe time is a theme that's been running through this panel. Obviously, we're talking about the 20th anniversary of this paper, and also the story that Professor Lessig told about Sean Parker and how his view of copyright law has changed, given the places that he's been and the positions that he's been in. Professor Lessig, in your paper, as I was reading it, you wrote that the single greatest error of theorists of cyberspace is the error of naturalism as applied to cyberspace: the error of thinking that the architecture as we have it is an architecture that we will always have. And you said that in the context of architectures that enable freedom versus control. But the sentence stuck with me, because it's something that I've thought about a lot, especially every time the government makes a ham-handed attempt at regulating the web or behavior on the web, like SESTA and FOSTA recently, or every time they try to, let's say,
question a very powerful tech CEO, such as what we saw last week. And so that makes me want to ask you, 20 years later, and with everything that's happened, and with everything that could happen now, in the place that we are, that we can see could happen: how are we committing this error of naturalism today? And how do we avoid it, as would-be regulators and as members of this community of would-be advisors to regulators? How do we guard against it? Well, I think that one of the things we take for granted now, and I think Ruth was adverting to this, is the utter incapacity of government to make any sensible judgment about policy in this space. Not just unlike other spaces; no, not just in this space, but in this space. You know, and that's a function of, like, when Ruth talks about the fear of the government stepping in and trying to do something, it's really just doing what the corporates will want it to do. And maybe the corporates are right, maybe the corporates are wrong, but the point is, the capacity to have an independent, critical judgment on what should be happening here is deeply emaciated. We don't have such a capacity, right? So, and I think we just take this for granted, and it leads us to solutions that are always suboptimal, always worse than what they would be if we had that capacity.
So in the context of privacy, you know, even the Europeans are basically pushing an infrastructure that, in the end, is just empowering individual choice. And it puts up some hurdles with those choices, but it won't be hard for the technology companies to help you get over those hurdles, to basically opt into the world of giving up your private data that they want. And the idea of the government having rules, like, here's what you can do with data, you know, the way we have consumer protection rules, here's what you do with toasters, or here's the conditions toasters live under, seems crazy, because we don't imagine the government having sufficient intelligence to make those kinds of judgments. So we will be left, because we have that kind of naturalism about the incapacity of our collective governance right now, we will be left with essentially what we are right now, after going through this big hurdle of getting people to say, yes, I do want this world where I'm giving up all of my data, rather than getting to a place where we're actually deciding what kinds of data we should be enabling people to give up and what kinds of data we should be blocking. And I think what troubles me quite significantly is that we have this 9/11 mentality when it comes to policy. We only seem to appreciate the danger, and I'm talking about policymakers, who should be thinking about the future, and we only seem to appreciate the danger when it is at our doorstep, when it's already happened. And then there's this reaction of, well, let's fix this problem now, rather than identifying that there's a systemic problem, that there's a design problem, and that avoiding something in the future requires us to be thinking about it today, not tomorrow when the problem occurs.
So that's that 9/11 mentality that I think is something we need to begin to ask ourselves about: how do we create a culture in which the debate and the questions about options and modalities are questions that we can take up dispassionately, with the sorts of experts that weigh in, but ultimately we make a decision. And it might be the wrong decision at the moment, but at least we've started somewhere and we can build upon it. And we don't have that space when all of a sudden there's a crisis and we're trying to respond to that crisis. So I think that's really important. There's a verse that I love that says, you know, without a vision the people cast off restraint; they perish, right? You know, what's the vision for what cyberspace should look like 20 years from now? So that's, I think, a really important question. The other thing is, in our discussion about intellectual property, it's striking that we now have blockchain, a huge reliance on contract law. But if you look at the law of contracts: very little innovation when it comes to technology. The rules and norms about how you contract, what makes a contract valid, can you sign away all of these rights just because you clicked, even though everybody knows you can't read it? There's been no serious attempt at innovation in the other legal regimes that might create the sets of constraints around architecture that you might otherwise find. And so it's not just that we are missing the mark in the area of cyberspace; it's that the regimes that surround cyberspace also have not received the attention that they ought to receive. And so we are, I think, in a moment in which what we think of as the public welfare in some ways has been compromised, because the seduction of being on the platform, the tentacles of the platform, have already sunk deep enough that uprooting people out of their respective cyberspaces is now going to be a challenge for a regulator.
You've got to think about how you do that. I don't know if you wanted to jump in, or, okay, well, we're almost at time. Maybe we should bring it in for a landing. And I hope, of course, people can stick around; there's a reception afterwards for people lucky enough to be in real space. But Larry, let me ask you: what would you tell 20-years-ago Larry, non-goatee Larry, which means good Larry, you're evil Larry. What would you tell good Larry that, with the benefit of hindsight, he's missing? Or is this article quite possibly prescient enough, and hedging enough in its claims, that it reads just as well now as it did then? Well, I think that what was missing is this practice of recognizing the trade-offs through the experience of making them. And I don't know that an article could produce that sensibility about the trade-offs, but, to connect it more directly to the kind of experiences that demonstrate those trade-offs. So the one I was just talking about, for example: the infrastructure for news, which is a market, and it's concentrated or not concentrated, and it enables people to access and speak, or doesn't enable people to access and speak. And what would those interactions be, and what could we expect would be produced by them? You know, we just did not see, or I didn't see, I don't know, maybe you saw, just didn't write about it and tell us about it. Ouch. I mean, you could have had secrets back then.
I don't know. But we did not see, I think, the way in which this infrastructure would disable a capacity for collective understanding, especially in the political context. And I think part of that is just not having the mechanism developed enough for this trade-off among these modalities. And then the second part of it is, sometimes when you think about the way code regulates, you think the only solution is to fix the code. So I see this a lot in the network neutrality debates. You see people are convinced network neutrality is an important thing, we have to preserve network neutrality, and so they go to the FCC and they basically say, we want you to be regulating code in the following ways to make sure that there's network neutrality. But it might be that code is just exactly too complicated a thing to regulate to preserve network neutrality. Instead, you ought to be regulating contracts. You ought to be saying to network providers, you're not allowed to have a contract with Disney that says you will provide Disney content on your network. So that it's that simple place where we're effecting the regulation, rather than the complex space of trying to ask, is this quality-of-service regime really just a way to allow you to discriminate? And so I think it's just the facility with this interaction that was not there enough, that I wish we could develop now. You know, one of the distractions of my life right now is I'm playing in a space where they're developing this virtual persistent AI world, the largest persistent virtual AI world. And they're embedding in it the choice for each of these communities to select the form of governance. And you are single-handedly designing that. I'm not designing, I mean, I'm talking to them. They're doing the building and the coding, but they're designing it. But the point is, the Oppenheimer-esque maneuver of, I just gave them the layout. But they'll do two things here.
They'll select forms of governance, but more interestingly, they'll choose whether the rules will be imposed through code, or through markets, or through norms, or through just rules. And so we'll have, you know, a million different communities that make these trade-offs, and we'll have a fire hose of data about what these trade-offs produce in these different communities, so that we begin to have some empirical basis to test and to think about how these things evolve. And that's the thing that felt missing. Well, I think we all had a sense back then that we were on the threshold of something big. One way to look at that is just: humanity, through applied technology, had empowered itself on average, that there was power now, chance or affordance, where previously there had been none. And it immediately gave rise to the question of where it will concentrate, who it will benefit, who it will disadvantage, and who makes choices about it. And oftentimes, and maybe I should just speak for me, the refuge we find in these complicated questions is to say, well, we should decide this collectively. This is how polities make decisions about big allocative questions, of taxation, or of who gets water, or anything else that's really big. It's just tough in a time where, as was pointed out, there isn't a ton of faith in institutions to function in ways that either work at all, or work in ways that reflect some sense of what a polity wants. And when you have a polity that now appears to be global, and maybe a desire not to localize the rules as you just described, there are real puzzles about how this power will be exercised, and in whose favor. And I guess those have only gotten more complicated, and almost dire, over time. I hope we'll all mark our calendars 20 years hence. Let's get back together. Either that, or we will already have all been joined in the singularity, and can just devote some cycles to thinking about where it has gone, in stop-motion
photography, from now to then. But in the meantime, please join me in thanking Ruth and Laura and Larry for these reflections. All right, the food is free; talk about a market failure.