All right. It's my pleasure to introduce Jonathan Zittrain for the lecture he's shortly to give to you. I speak to you proudly as the founder of the Berkman Klein Center, and I introduce Jonathan to you proudly as my co-founder, the co-founder of the Berkman Klein Center. I go way back with Jonathan, to a time when his mother invited me to do a Socratic seminar with her group in Pittsburgh. And Jonathan, how old were you then? 15. There's a proud picture at the end of it, which I'm sorry I don't have here, but there we are. Then Jonathan was my student in one of the most remarkable years of my teaching experience. He was with me in an intense winter workshop class, compressed into three weeks, five days a week, three hours a day, nine to noon, and then after. That winter we were hit with a blizzard that shut the state down, so the class was effectively encased in its space, and Jonathan and I had just an amazing time with that. It was a group of 160 students broken into eight subunits of 20, each given a Mac Quadra, which was at that time like the hottest thing to play with in a workshop environment, with the last day culminating in us bringing them all into the class and building a network. We were on the net. We made the net. Then Jonathan, after he graduated, clerking in the District of Columbia and wondering what to do next, accepted my invitation to come, though I had no money, and do what we've done. He shared my office. He lived on my third floor, and he did the job. Wow. Thank you, Jonathan. Thank you, Charlie. I show this picture because I feel that somehow the connection between the two of us goes back to that time in Philadelphia. What was most outstanding, the number one credit for Jonathan, is that he was the sysop of sysops at CompuServe. Does that mean anything?
Well, it had exactly this quality of moderation, which is at the center of the Socratic search for truth, and that, for the two of us, I believe, is at the core of the Barlow spirit, which I think infuses our Berkman Center. I hope so. And I invite you to participate in it and warmly welcome you to it. Now, this goes back, if I can just be quick about it. This is Lessig's view of cyberspace. Lessig was the first Berkman professor. Lessig's view is this surrounding environment of forces that's networking and closing in on the pathetic red dot at the center. Barlow's space is that red dot. John. Thank you, Charlie, for that completely generous and embarrassing introduction, embarrassing in the sense of old tales from many years ago, and for the form you're really describing of mentorship: being willing to sort of ignore the rules, not just break them but flat out ignore them, in order to get something done. It took a while before the school noticed there was a second occupant in Charlie's office. I had brought a fish tank with me, which made it hard to run under the radar. But it's certainly the kind of welcome and generosity and mentorship that I have hoped personally in the many years since to pay forward, and to be able to pay forward to our community, and to see that kind of generosity within it. Understanding the phenomenon of the space that Charlie was just describing, he calls it Barlow's space, others call it the internet, requires apprenticeship. There's still not a simple book or manual you can read to get it. And one of the things I most hope our center can offer you, particularly if you're a matriculated student, however far you've wandered over to the north edge of the law school campus, almost to terra incognita in Cambridge, is that kind of ability to learn what you really can't learn so easily anywhere else.
So I thought in some opening remarks I'd give you my own sense of that trajectory, from the time in the late 90s that Charlie was referring to all the way through to now: our own sense as academics, as people at a university, as citizens, as users of this stuff, about what mattered and what didn't, and what our role can and should be, and perhaps is ethically called to be, in these times. And when I say in these times: just about a year ago, I did a welcome for the Berkman Klein Center and featured this as the opening slide, a can containing a vague sense of unease, which I offered pervaded a lot of people's thinking about life generally and about the internet more specifically. I wanna update this for 2018, and I think it's fair to say that's where we are. I wanna give you a preview of 2019. This will be next year's talk, which I'm already working on, so here it is. And it calls us to answer, if it turns out you feel this way: what can we do about it? Is there still time, as we're sipping our coffee, to think about interventions that could make the environment around us a better place, not just one that is in fact no longer burning down? So that's a very different framing than it might have been at a welcoming kind of talk for an internet-related center in 2001. Here, and I don't know, should we try to adjust the light so we don't have that can shining on the deck? There's a panel over there, Kerry's gonna take care of that. Yeah, did you hit the notes button or something? I think just hit notes and wonderful things should happen. It will be plunged into darkness, which is also fitting for what I'm talking about. Okay, so there's Federal Telecommunications Law, second edition. By 2001, it was already second edition time. That tells you something. And this book was over 1,500 pages attempting to parse for the alert student the various rules and applications, at least in the American context, of these new phenomena that we were loosely calling the internet, all 3.6 pounds of it.
And an average customer review of five stars. The book was amazing, I could not put it down. The interesting and comprehensive subject matter was magnificently crafted and very thought provoking, a real page turner. I don't know about that. I think I'd rather trust the empirical data of its ranking 2.3 million in books. But the fact was, this reflected a sense of: this is fine. There are some issues, it's new, it's true, but come on, let's just work it through and we'll apply the law. And that was, I think, somewhat the prevailing sense at the time. Maybe not so much anymore. Now, Charlie already referred to Larry Lessig's work done around this time, 1999, 2000, 2001, in which he was trying to lend a broader conceptual framework to the issues in the space. And in fact, this is what Charlie was displaying on his laptop, saying that in the middle, there we are, and there are all sorts of things that shape and influence our behavior. The law tells us what we can and can't do, and if you break the law, you may pay consequences. Norms also tell us what we can and can't do, or what we do at our peril; what we happen to say could result in reputational issues, in losing friends or gaining friends. That is a force that affects our behavior. So too with the market: the price of something might dictate its accessibility and how many people use it and what they use it for. And finally, says Larry, architecture, the built space, built by humans but feeling like nature, influences us as well. And that, he says, is code. And that's how he got the bumper sticker, code is law, to tell us that it's just as powerful and possibly more so, because we don't notice its operation. We accept it more readily than we do a law that we can actually choose how, or how much, we want to obey. And the other step Larry took was to say that law especially can influence each of these other factors.
So if you're in the position of a lawmaker and you're feeling like, gosh, this technology is out of control, I don't know what to do about it, you can try to regulate users and uses directly, but you can also try to make rules and incentives that will affect that built environment. And we've seen over the many years since this theory came out various attempts to do that. Part of an apprenticeship at a place like this is to learn the toolkit: how you too can influence what's going on if you can get near the levers of power. And part of it as well is, how do we think about that? Is it a good thing or a bad thing that what previously felt left to the vicissitudes of nature is now much more controllable, the more that we have running code intermediating between us and everything else? And it isn't just people near the levers of power who can affect that built environment, in part because the digital environment we encountered by the year 2000 or 2001 was one that certainly appeared, by anecdote and possibly more thoroughly, to be one that anybody who knew how to code could build out, changing what we all can experience. Here's a PC circa, I don't know, 1996 or so. The giveaway of course is the 66 light. People remember the 66 light, and if you pressed it it might go to 1.2 or something. It was reflecting how quickly the hamsters inside were running on their wheels. And you're like, why not always have it run at the higher speed? Well, there were reasons. You don't want Prince of Persia to run too quickly. I should also say: they actually have hamster-powered paper shredders now. The hamster runs on the wheel, it shreds the paper, and then the hamster can live in the paper afterwards. Which just shows you that reuse is a viable alternative to recycling. Anyway, the machine will run any EXE that you hand it. That box with the 66 on it didn't even come from the factory usable. You would turn on an old PC and be greeted by a blinking cursor.
Like, yeah, well, what do you want to do today? 10 PRINT HELLO, 20 GOTO 10, I don't know. But anybody could write software and run it themselves or share it with anybody else. And that fact gave rise to what at the time was an off-the-shelf software movement that the maker of the PC had nothing to say about. They could do their own software too and bundle it, but the whole point of it was an architecture that allowed anybody to write code, sell it, or share it. And there was a parallel movement in networking that opened things up to the equivalent of anybody can write a .exe. Originally there had been a monopoly telephone company, AT&T; that was your network, and if you didn't like it, consider moving. And that was that. And there were parallels as we turned towards what we knew would be an information-based set of affordances, like the CompuServe that Charlie mentioned: CompuServe would be a service that you'd pay for, and it would compete with The Source and Delphi and Prodigy, and you'd pick one and pay by the minute. And what appeared behind those oversized buttons was whatever CompuServe would program, the way some network executive would program what's gonna be on Fox or CBS, and that person's biases and preferences, and what they judge yours to be, would be the way the environment got built. This got blown out of the water by a very different architecture. Now, we could spend a whole bunch of time talking about the hourglass architecture of the internet, encapsulated circa 2001, key year 2001. I don't wanna say too much about it, except to say that it had, and Professor Benkler has been great in this time at limning this, different layers, and the magic of the layers was that nobody was said to own any of them, protocol-wise. You could build on copper or you could build on radio, and whoever wants to bring connectivity to the table can use that connectivity for internet connectivity. You don't need a license to connect to the internet.
Now, maybe you need one for the radio, but if that's a problem, okay, maybe you should run some wires. Whatever it takes to build it, it can be built. And in the middle was not intellectual property; it was internet protocol, a free and open protocol that would figure out how, once you're connected, you can have data route in a way that works, and then build any application on top of it, like email or the World Wide Web. And anybody was free to build an application, and people would either adopt it or they wouldn't, which is an incredible way to conceive of it. There's no main menu. I can't give you the CompuServe main menu equivalent for the internet. There is none. It's like a blinking cursor. You connect to the internet. It's like, okay, you're on the internet now. Figure out what protocol you wanna use, what app, and where do you wanna go today? A very different conception of things, one that allowed there to be an explosion of innovation, of sites. Here are three of the original founders of internet protocol, Jon Postel, Steve Crocker and Vint Cerf, showing in their 25th-anniversary-of-the-internet 1995 retrospective for Newsweek, remember Newsweek?, that you can build a network out of just about anything. Here, though, it goes from Jon's mouth to Vint's mouth and Vint's ear to Steve's ear, which is not a functioning network, which I hope is an inside joke but is also a portent of doom: the makers of the network didn't quite know what they were doing. They got together in the Internet Engineering Task Force, unincorporated. We reject kings, presidents and voting; we believe in rough consensus and running code was their motto, if they had a motto; they're not sure they do. This was a goofy organization. Still is. If you want to join it: no, no, it's not a membership organization. No cards, no dues, no secret handshakes. Just a large open international community of network designers here to make data move better.
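That permissionless, layered design can be made concrete with a small sketch. This is an illustrative toy, not any real protocol: the "SHOUT" protocol, the server function, and the port handling are all invented here, just to show that anyone with a socket can define a brand-new application on top of TCP/IP without asking a network operator for a menu slot.

```python
# A toy illustration of the internet's open application layer: invent a
# protocol ("SHOUT": the server uppercases whatever you send) and run it.
# Nobody has to approve the protocol for it to work end to end.
import socket
import threading

def shout_server(listener):
    """Serve one connection of the made-up SHOUT protocol."""
    conn, _addr = listener.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data.upper())  # the entire "protocol"

# Bind to an ephemeral port on localhost; the OS picks the port number.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=shout_server, args=(server,), daemon=True).start()

# Any client that speaks the made-up protocol can now use it.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello, internet")
reply = client.recv(1024)
client.close()
server.close()
print(reply.decode())
```

The same pattern, a listening socket plus an agreed-upon exchange, is all email, the web, and Napster-style services amounted to at the transport level; adoption, not permission, decided which ones mattered.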
So it's not only an unconventional style of building protocols that allow anybody to connect from anywhere. It's an unconventional style of deciding what the protocols will be. It's a weird form of governance that's very different from the usual way in which things get done. This is a more recent page about the IETF; this was not that long ago. It's a little bit of a sop to, like, 2014 web style, and there's still nothing to sign. It's still no membership, but by participating you do automatically accept the IETF's rules, including the rules on intellectual property. The funny thing is, the rule is that you give it up to the world. So it's an interesting kind of way of using clickwrap to make sure that things are free. They also say that if the internet had a mascot, it would be the bumblebee. That's one of Scott Bradner's mottos, I don't know if he's here. Because of its fur-to-wingspan ratio, the bumblebee is far too large to be able to fly. And yet miraculously the bee somehow flies. But there was doubt: as late as 1992, IBM was saying you couldn't possibly build a corporate network using internet protocol. You've got to use Token Ring. That's the way to go, and pay us a big pile of money. And many internet engineers would say the jury is still out as to whether the internet will work, but the last 20 years have been promising and there's still some more work to do. In the meantime, around 2006, thanks to massive government funding, we finally figured out how bees fly. It turns out they flap their wings very quickly. So it's that kind of environment, though: machines that will run any code you hand them, from anywhere, which is weird, and networks that allow connectivity by anybody to anybody already on the network, and now you're on the network too.
There's no central switch to approve you and plug you in. And that lets Shawn Fanning, across the river here in Boston, as a sophomore at Northeastern, be like, hmm, what if file transfer weren't for all files but just for MP3 files? And what if I had a directory of them? I'll call it Napster. And like overnight, the music industry saw it as a problem. And many of my fellow travelers kind of had what Barlow expressed in his essay around this time, a grim joy dancing on the grave of copyright, because it was screwing over artists anyway and the internet is here to bring power to the people. That's vintage Barlow. Barlow was, I should be clear, not Pollyanna-ish about it, but there was a sense of possibility in the era of disruption, of staid, ossified, even broken systems like that of the record industry at the time, and they didn't know what was coming. I mean, copyright had been ridiculously elaborated, assuming through lawyers more than tech the ability to finally control what would happen, and that was a justified expectation until the internet sort of blew it up. This is just showing some of the limitations on performance rights. For those who haven't taken copyright: should I sing a song right now in front of this many people? It would be a public performance. It would have to be licensed up. If Harvard hasn't paid its license to the composers, I would be breaking the law, and if you cheered me on, you might be contributorily infringing as well. So I'd want you to be stony and silent while I did my protest. Unless, though, there is something written into law. Congress went to the trouble of saying it wouldn't infringe to publicly perform, if it's a non-dramatic musical work, by a governmental body or a non-profit agricultural or horticultural organization in the course of an annual agricultural or horticultural fair or exhibition conducted by such body or organization. Question: in its first year, is it annual?
Or do you have to wait until the second year, and see if you throw one, to know? I don't know, but that's the kind of stuff that telecommunications law, in its third edition, I'm told, answers. So that's a real conflict between what the law was anticipating and what, in effect, one guy like Shawn Fanning could challenge it with. And of course there were downsides too, such as cybersecurity issues. When you have computers and networks so eminently reprogrammable, and without any certification of who is who, that leads to the possibility of problems and violations. This is the Heartbleed bug from several years ago. It was embedded in a library called OpenSSL, and it was a vulnerability, just a bug, in the implementation of that library. That library, it turns out, was incorporated into all sorts of things, like browsers and servers, and everybody was using it, including proprietary firms. They would just embed it because it was good at what it did, except it had this big problem. And Bruce Schneier rated this an 11 on a scale of one to ten of cybersecurity worry. And it turns out the person behind the bug was Robin Seggelmann, a graduate student in Germany. And he said, I was working on improving it. I submitted numerous bug fixes, added new features. And in one of the new features, unfortunately, I missed validating a variable containing a length. That's, shrug, what are you gonna do? It's like, somebody should fire him. He doesn't work for anybody. He's a graduate student. He was doing this to pay forward a good sense that he got from being helped in his own projects. The free and open source movement was one that's just like, yeah, just contribute it. And if there's a problem, yeah, I'll get on it, I'll fix it. But it's just strange when you think that the normal market forces that you might say could produce good software aren't at all at work here. And this had become foundational to so much of the software in both business and consumer use.
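The class of bug Seggelmann described, a missed validation of a length variable, can be sketched in miniature. This is an illustrative reconstruction of the bug class, not the actual OpenSSL code: the function names and the simulated "memory" buffer are invented for the example. The server echoes back as many bytes as the request claims to contain, trusting the attacker-supplied length instead of checking it against the actual payload.

```python
# Illustrative sketch of a Heartbleed-style over-read: a heartbeat reply is
# built from a claimed length, and adjacent "memory" leaks when that claim
# exceeds the real payload size.
SECRET_NEIGHBOR = b"user=alice;password=hunter2;"  # data that happens to sit nearby

def heartbeat_buggy(memory, payload_start, payload, claimed_len):
    # BUG: no check that claimed_len <= len(payload), so the reply can run
    # past the payload into whatever sits next to it in memory.
    return bytes(memory[payload_start:payload_start + claimed_len])

def heartbeat_fixed(memory, payload_start, payload, claimed_len):
    if claimed_len > len(payload):  # the one missing validation
        return b""                  # silently drop malformed requests
    return bytes(memory[payload_start:payload_start + claimed_len])

# Simulate a process's memory: the request payload sits next to secret data.
payload = b"bird"
memory = bytearray(payload + SECRET_NEIGHBOR)

leak = heartbeat_buggy(memory, 0, payload, 32)  # attacker claims 32 bytes
print(leak)                                     # includes the neighboring secret
print(heartbeat_fixed(memory, 0, payload, 32))  # over-long claim rejected
```

In C, where the real bug lived, the over-read happens against raw process memory; the Python buffer here just makes the same logic visible, and shows how small the missing check was.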
Of course, fast forward several years to the present, and we see that cyber vulnerability and attack are just a fact of life. I mean, this is something somebody caught at a counter at a restaurant. And this is where it gets real: chicken wings and cheesy crust out of stock due to a recent cyber attack which has affected imports. What are you gonna do? This is a problem that has yet to be solved. And within our center we have a number of people working on it, and working on it from a point of understanding of all the idiosyncrasies of the technology, and of the relationships among the people and institutions that built this extraordinary network, rather than seeing it through one frame only, like it's cyber war, or it's something proprietary and we just have to demand software liability. I mean, all of those might be interesting ways of looking at it, but each is just one of several. Notice that that sign was talking about imports. I want to note that we are, I think, very much aware, as parochial as any given place in a network can be, and we're a place in a network, that this is an international phenomenon. It's one with all sorts of texture to it around the world. And my colleague Urs Gasser, with many others in this room, started a global network of internet and society research centers, which is providing multiple perspectives and case studies from around the world on how this stuff works. Now, in the meantime, you have places like the International Telecommunication Union, an arm of the United Nations, saying there should be some structure to this stuff, and they have proposed over the years a number of more hierarchical, and I mean that in a non-valenced way, ways to think about how network protocol development should be done and what the resulting network should look like. This is one such example: a focus group on next-generation networks.
You had to be a member, the documents were behind a locked firewall, and they finally unveiled the new internet hourglass, this sort of pile of spaghetti, which would be a very smart network, one with a sense of: oh wait, this file is copyrighted? The network doesn't want to copy it, sorry. Now, in approximately 2005, when this was suggested, I think a lot of internet hands snickered at it. Like, hey, good luck with that. That's not gonna work, and sure enough it didn't. But the prospect of having entities in the middle that could monitor and know what's forbidden and what's not, for better or worse, that possibility has very much come about by 2018, whether it's infrastructural like this or through a handful of apps that now, with the benefit of hindsight, we've seen become the apps we're excited to use. It offers new avenues for law to tell architecture what to do. Now, there was still, I think, over the number of years that our center has existed, excitement about just the weirdness of the internet and the ways in which anybody can contribute. And we have, as part of our credo, wanted to build on the internet as much as study it and write papers about it or talk to policy makers about it. One such early venture was Creative Commons. This is one of the first iterations of the CC website, and this is vintage, not retro. It was circa 2001 or 2002. This is when we were like, it should be a big digital library of Alexandria: so upload your public domain work to our servers today. And then the Harvard lawyers were like, no, no, no, no, no. And it became, okay, maybe it's not one big library. It's a distributed database called the internet, and we'll just have licenses that tag the stuff, and it'll be automatically sorted because the tags will let Google and other search engines know where stuff is.
And when you look at the proliferation of these different licenses, each of which allows people to say how they want their work shared, in ways that could have legal or cultural bindingness to them: the number of works under these licenses is, I think, estimated at 1.4 billion today. That's an example of a really interesting organization that started as an idea on this campus and has taken on a life of its own, and there are many others also represented there. Now, that's still thinking about intellectual property and sharing and building. Of course, even from the start, there's been worry about privacy. This is a 1996 cover of Time magazine worrying about digital privacy. I don't know if that's a separate item on spanking and Sean Penn, or if it's about spanking Sean Penn, but Sean Penn has been around for a very long time, I guess, is the lesson. But speaking of a very long time, this is a 1966 Life magazine cover almost giving rise to a copyright infringement by Time. How much does Time steal from Life? I always want to know. Profound statement. But it's about insidious invasions of privacy. And of course there's 2010, by Wired magazine: yes, yes, invasion of privacy, NSA, be afraid, be very afraid. Our own reaction to this kind of stuff as a center has not just been to pump the bellows of a vague sense of unease into escalating panic, although there are times when that is called for, but rather to sit back, be analytic, talk to the people and the companies and the governments involved, and try to get a handle on what is going on. And there are people in this room, and projects that you may have learned about last night in our open house, and could still learn about at our website or through visiting the center, that are really meant to unpack what's going on, particularly with respect to government surveillance and what might be done about it, with people with very different points of view around a table, talking in ways that each is learning from the other.
That's what we aspire to in a realm like that: to really dig into the documents. I mean, this is a public law, the FISA Amendments Act in the United States, Section 702. This was voluntarily released. This is not a leak. This came through FOIA, or was released through the course of a lawsuit about one particular aspect of the law, and you can see the government is very forthcoming about what they're doing when it's released; there are just a few redactions on the page as you go. And this is something John DeLong, former Harvard Law student and Director of Compliance, among other things, for the National Security Agency, and now a fellow here, is fond of pointing out: okay, if you think there are constitutional violations that can happen when citizen data is handled, and there absolutely are, what is our tolerance level for it? Like, how many ants are allowed in the peanut butter? Is it zero? Do you shut down the whole factory every time an ant is found? Surely not. So what is the right number? Is it zero? Then shut it down. The NSA like a work site: this many days since an inadvertent constitutional violation. And if it's like, oh, we had one today: shut it down, we'll be back on Thursday. Or is there some other way of thinking about remedies and incentives? These are all the sorts of questions we're asking. And of course there's also still the possibility, in an open network with anybody able to build an app, still the case even with iOS and the Android marketplace and such: if you do end-to-end encryption, what happens to what governments and others may have been relying upon for surveillance and control in the middle? The Apple iPhone and San Bernardino is a good example of that. And there's lots more that we have been working through as a group, again with quite different points of view. When I think of privacy these days, though, I don't just think of government intrusions onto privacy.
I think of yet another guy who was in the Boston area and built something that people were like, let's trust this. And I worry about the future of privacy from corporate or peer-to-peer intrusions, and the ability of decent algorithms to anticipate just that moment when you're in need of a payday loan, when you might be feeling a little depressed, and exactly how to send you, and only you, this extremely persuasive ad, because it uses a 256-bit secure app to deposit between $100 and $1,000 in your bank. Get your money now. That's the kind of thing that shows privacy to have developed, or to be recognized, as not just about what information others learn about you, but about what people do with it, supposedly maybe for your benefit, or maybe not. And then from that kind of intrusion comes the intrusion of others, whether structured or unstructured. Here's somebody on Twitter saying to someone else: block all you want, but Twitter has a capacity for unlimited screen names, and you've got a target on your back. That's the kind of stuff that many people in this room and around the world have encountered as they participated in social media. And there's a whole new set of questions, very rich ones, about just how people get organized online, when it's good and when it's bad that they're together, and what they can do to one another. And of course, what are the responsibilities of these networks to do something about it? And by networks I mean these applications that are sometimes so popular that they can supplant a sense of the internet itself. There are a number of projects at the Berkman Klein Center looking at this, and I think it's fair to say they don't look at it through one lens. There are purist libertarians among us. There are decidedly non-purist libertarians among us. And everything in between, as we try to sort out: if we were advising Twitter, what would we do? And in the meantime, too, we see indirect, just like Lessig's second slide, indirect ways that code gets shaped.
This is Microsoft's Tay bot, which started off being able to have a conversation with you through Twitter as if it were, quote, a teenager. And it learned through interactions how to evolve its behavior, causing one person to observe that it went from humans are super cool to full Nazi in less than 24 hours, and I'm not at all concerned about the future of AI. And sure enough, in the course of one day in 2016, Tay starts with: I'm stoked to meet you, humans are super cool. And by the time 4chan and Reddit were done interacting with it, there was a transitional period: chill, I'm a nice person, I just hate everybody. And then it ended up with: I hate feminists and they should all die and burn in hell. At which point Microsoft has a problem, that its bot was expressing this view. Once again giving us a Kantian moment, where we normally think Kant is the one who said ought implies can. If you tell somebody they ought to do something, it has to be that they can do it, because otherwise you're just expecting too much of them. Well, as these platforms get more powerful, and as AI helps them scale, so that it's not such a sea of comments and activity that you couldn't possibly expect them to monitor it, we now ask the inverted, what I call regulatory, question of: when does can imply ought? If you're able to pick something up and do something about it, when you refrain, are you taking responsibility for that thing? Because you could have intervened. When does can imply ought? I think it's one of the big questions of our time. And I know that because on the streets of Davos, during the annual meeting of the World Economic Forum last year, Bono told us: because we can, we must, with a very thoughtful bear reinforcing that message. And a little bit further down the street was crypto HQ. If you enter, you learn about the blockchain? I don't know, it's a little scary to go there.
So who knows if the streets of Davos are a great place to pick up aphorisms, but that's the kind of question we're dealing with. And even just recently, this was the notice that Alex Jones got, saying your account has violated the rules and we're temporarily limiting you. This was before they kicked Alex Jones and some of his associated accounts off entirely. And I could take, I think, a comparatively anonymous or pseudonymous poll of the room, IETF style, by hum. If you think, as best you can tell, Twitter made the right decision by kicking off Alex Jones: one, two, three. All right, that's called rough consensus. If you thought that Twitter, let me make this more generous, maybe acted too quickly, that it's not their job to do that for any reason, including ones that don't have to do with supporting whatever Alex Jones says, how many people think Twitter should have let his account stand? One, two, three. Okay, you see, the deep rumble in the throat of a few people who didn't wanna be caught humming. That says something too about social norms and their way of shaping behavior. We could do an anonymous poll as well. But this is something that Twitter was surely in a position to do. They didn't want, structurally, to have to do it, and now they're starting to do it more and more, and they're not the only ones. Here's Facebook's rules, which are hugely elaborated, almost to the tune of 1,516 pages and much more than a second edition, but it wasn't a publicly available book until it leaked to the Guardian through an employee at Facebook: their rules for how to handle stuff on Facebook. Someone shoot Trump: not allowed under Facebook's rules. This is literally their slide, a Facebook training slide. If it's discovered, or pointed out to them, it will be deleted. Kick a person with red hair: at the time of this presentation for training, that was allowed under Twitter rules.
"To snap a bitch's neck, make sure to apply all your pressure to the middle of her throat": allowed under Facebook's rules at the time. And "Let's beat up fat kids": also allowed under the rules, and put onto this slide by the Facebook trainer because that might be counterintuitive. And then there's a huge explanation — this is just one slide of it — of all of the stuff that would make a constitutional law professor excited about why they've drawn the lines where they have. When this became public, by the way, public reaction was negative, and they have since tightened the rules to prohibit some of the stuff that was previously allowed. And by their own account, 10,000 people are working in shifts to try to moderate comments that are reported — much less ones that just happen to exist — to figure out what to do. And from today's New York Times, yet another example of Facebook being accused of allowing bias — that's the can-implies-ought — allowing bias in job ads, because you can specify gender in some ways, or age, when you post an ad, which means it gets exposed to some people and not to others. What is their responsibility there? There have been a number of people thinking about this, working with the companies, excoriating the companies — sometimes the same people — trying to figure out, in this world, where the lines should be and how they'd be drawn. There are some theorists among us asking whether some kind of fiduciary duty could help, where at least you should be honest about what level of security or privacy you're offering. This is like: kudos to you, site, for being upfront about your security. And when you are interacting with people all the time, pushing notifications out, be honest with them.
Some of you are getting flood notices in the room just now. This is actually from North Carolina: the second alert that appeared on this person's phone was this — "it's a great time to explore your local parks." I don't think so. But this is the kind of thing where you might ask what the responsibilities are, respectively, to engage with people in a way that respects their interests. That, of course, is what happens when the system is very good at knowing you and still pushes stuff at you even if it's not good for you. There's also the question of what happens, given machine learning tools in particular, when correlations can be very tight but, by hypothesis, completely irrelevant. It's just that if you search long enough, you'll find really tight correlations — here, hanging, strangulation and suffocation suicides with the number of lawyers in North Carolina, found by our alum Tyler Vigen, who has an entire blog of these things. I'm not sure I want a machine learning system to go "Eureka! Now I know what to advise people." And it's these kinds of mistakes — "Alexa, remind me to feed the baby." "Yes: defeat the baby, at two o'clock." — that give us some cause for concern. It's also these kinds of systems where we don't know why something is compelling. Why would this make you want insurance from this company? But somehow it just works, and it's a weird form of Promethean knowledge that we are now capable of generating. David Weinberger has a wonderful paper about this: it's knowledge, but we don't know where it comes from. There's no theory behind it; it's just "trust me, it works," and that is a new set of dangers and, I confess, opportunities. Now, how to look at it from the point of view of a university — what is our role here?
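The search-long-enough point is easy to demonstrate. This is a minimal sketch using synthetic random data — not Vigen's actual datasets — showing that scanning enough unrelated series will reliably turn up a tight but meaningless match:

```python
import random

random.seed(0)  # reproducible run

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One "target" series and 1,000 unrelated random series,
# 10 yearly data points each -- pure noise by construction.
target = [random.random() for _ in range(10)]
candidates = [[random.random() for _ in range(10)] for _ in range(1000)]

# Scan for the best match, exactly as a credulous data-mining
# pass over thousands of public statistics would.
best = max(abs(pearson(series, target)) for series in candidates)
print(f"tightest correlation found: {best:.2f}")  # strong, yet meaningless
```

With this many candidate series, the best correlation found against pure noise is almost always quite high — which is why a machine learning system saying "Eureka" deserves skepticism.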
That used to be a question that answered itself, and we didn't have to take it up, because the universities were where all the books were, and the books weren't copied. So if you wanted to know about something, you had to fight your way physically into Widener Library and get the book — which might be a good reason to stay here for four or three or two years at a time, conveniently near the library. That is, in part, the theory of residentially inhabiting an academic environment, if you're not talking finishing school and networking, which, to be sure, is no doubt valuable as well. And that was the theory behind a particle accelerator: what private citizen would be crazy enough to dig a massive tunnel underground and pour tons of money into it? Okay, so now there are such private citizens — Elon Musk was like, sign me up. That's the kind of thing that previously was university- and government-built, university- and government-shared, under principles of academic knowledge sharing. More and more, especially in digital terms, what we want to know is moving from the academic realm — which was a huge contributor to the development of the internet — into the proprietary and corporate realms. This is our colleague Matt Welsh in the Harvard CS department in 2010. This is our house organ, the Harvard Gazette, celebrating his achieving tenure at Harvard. That's great. And in November of 2010, his own blog entry: "Why I'm Leaving Harvard." One simple reason: "I love the work I'm doing at Google. I get to hack all day, working on problems that are orders of magnitude larger and more interesting than I can work on at any university." Wah, wah — for those of us left behind, we must soldier on, waiting for tidbits from Matt at Google as to what's interesting. Plus which, it's "worth more than having 'Prof.' in front of my name, or a big office" — I don't know where he'd been sitting — "or even permanent employment."
It's "realizing the dream I've had." And sure enough, you look at something like DeepMind, bought by Google for 400 million pounds a few years ago. They have, I think, around 600 postdocs at this point. Six hundred postdocs working on AI. How many do we have here? How many do we have at Stanford? How many do we have around the world in university environments? This is a huge shift of mind power away from the academic model and what it implied, and into the proprietary one. And I say this understanding that the academic model itself is a little rough around the edges these days. Here's the report in Nature talking about how the publishers Springer and IEEE are removing more than 120 papers from their subscription services — we're paying for this stuff — after it was discovered that the works were computer-generated nonsense. They weren't just bad scholarship. I think it fair to say, philosophically, epistemologically speaking, they were non-scholarship. They were non-words. They were word salad. And more than 120 of them got published through these proprietary publishers and supposed peer review. They had been written by this: SCIgen, the automatic CS paper generator, available online for free from some researchers at MIT who were tired of being importuned to submit their papers to spammy conferences all the time, and who would just generate a paper like this and send it in to see if it would be accepted — and quite frequently it would be. I gave it a try, and sure enough, I was able to pad my CV quite quickly; if anybody's looking for a methodology for the improvement of rasterization, you've come to the right place. That's the kind of thing. Just in case there's a mistake, I misspelled my name so that I would never be claimed to have actually invented this.
And I love the response by Springer when this was revealed. Their response was, first: we publish over 2,200 journals and 8,400 books annually, so it's going to take some time to figure out if any of them are complete gibberish. Could they read the blurbs? Or are the blurbs gibberish? "No, no — it was a page-turner, I could not put it down. The language was elegantly crafted." And I love this too: we're going to find in our procedures what the weakness was that could allow something like this to happen, and in the meantime we're using detection programs and manpower to sift through our publications to see if there are more SCIgen papers. Okay: this is not fine. This is not fine. But the reaction of academia — I don't know how many people heard this story — has just been "this is fine." Like, all right, let's not let this happen again; let's have less gibberish. That's the kind of thing that makes you ask: when people who have passed through these institutions grow up, become worth $90 billion, and come back for a kind of valediction, what is our role in relating to them and speaking to them — to the extent that within this room, and on this campus, and across the river, there are people now who might be building the next thing that Mark built? Should they be thinking now about the implications of it, rather than just, hey, we'll build it, we'll make it popular? These are the kinds of questions that have all of us, across the campus, through an effort we're calling Techtopia, thinking about ethics in a way that we did not before. And it's always been there: where you stand really does depend upon where you sit and what role you've taken on. Here's a paper from 1998 — 1998 — talking about search engines. And look what it anticipated: the predominant business model for commercial search engines is advertising, which doesn't always correspond to providing quality search; they'll be inherently biased towards the advertisers. And since it's difficult even for experts to evaluate them, right?
Talking about these opaque boxes: bias here is particularly insidious. How much more of an issue is this today, as we use machine learning? Less blatant bias is likely to be tolerated, et cetera, et cetera. So: "we believe the issue of advertising causes enough mixed incentives that it is crucial to have a competitive search engine that is transparent and in the academic realm." In 1998, foreseeing the problems that would arise. Who wrote this paper? Yep: Larry Page and Sergey Brin. This was the paper that announced Google to the world. The only thing that would be more chef's-kiss to me would be if it had been published in an IEEE or Springer publication and maybe cleaned out as nonsense — foreseen, and then set aside by the prevailing attitudes in Silicon Valley. And these issues don't just remain with us; they're so much bigger than they were. And there are hints still of the "anybody can build it, it's an open network" kind of thing, and of asking who should be participating. Facebook is now deciding it has to wade into judging what's fake and what's not among what gets shared on its platform, after well-publicized instances of completely fake news — by newspapers that do not exist — getting more shares than stories from the Boston Globe during the US presidential campaign season. All right, well, they don't want to have to decide, so they're just going to get a committee of other institutions to decide. So here's ThinkProgress, a left-leaning kind of organization, not happy that the right-leaning Weekly Standard was contributing to fact-checking — and in fact the Standard then ended up fact-checking a ThinkProgress article, which got pushed down in the rankings as a result. Is there any role for academia and libraries here? Or does it just tarnish us to get into the fray and try to decide what's right and what's not? Here's some guerrilla research that was done by just a couple of people who put a sniffer on their iPhones and said, gosh, all of these popular iOS apps are sending location data to data monetization firms.
And I love how the report of this has comments, and then all of the different share buttons, and even more up there for you to read. This is how deeply immersed we've gotten. And just before anybody else mentions blockchain, I just want to say: blockchain. Talk about a great example, right? Ten years ago, Wikipedia would go here; we've traded it in for Bitcoin. And — I don't know, Patrick Murck is here — we have folks who are looking at Bitcoin and blockchain with a wonderfully, healthily skeptical eye, including people who own plenty of Bitcoin. So how skeptical can they be? These are the kinds of things that we realize implicate open versus closed, collective hallucination, civic technologies and proprietary ones. And as we see more and more — this just from last week, "hurricane victims edge closer to automated insurance with Ethereum" — this is the kind of thing where I'm personally like, you've got to be kidding. Really? But for which there's somebody breathlessly enthusiastic about how blockchain is going to make it so that State Farm can't deny your claim anymore. And you know what? Let's hear them out, and then tell them what we really think. At the end of the day, the ideals of the schools and of a university like ours are those of what you might call a learned profession. Here's one definition: one of the three professions traditionally believed to require advanced learning and high principles — high principles meaning an ethical dimension to what they're studying. Those original three professions were divinity, law and medicine, because each of these practitioners is seen to have access to a higher power and is mediating your connection to these complex respective systems, and you want them to have some loyalty to you — and maybe even to some sense of the public interest. It's not just about how many people can I sign up, how many patients or clients or parishioners can I process. That's the ideal, at least, that we would ask of them.
After the turn of the 19th century, a fourth learned profession — because it was so powerful and we counted on it so much — was added. That's right: surveying was seen as the fourth learned profession. I think it is time we realize that there are current professions with at least equivalent access to things that can influence our lives, whom we find ourselves trusting. We might or might not be in privity with them, just as there might be a radiologist three countries away reading our scans. Do they owe us a duty the same way that the doctor we're meeting with does? These kinds of questions are ones that I think have to be confronted, and quite urgently, where we are. I still see signs of the old promise of generativity. This, just from a couple of days ago: the New York Times decided to remove bylines from its homepage — so, aha, a browser extension that adds them back. You just click on this thing, and as long as it's not the next Heartbleed, that sounds pretty good to me. And here's Fuzzify.me, yet one more venture in the realm: it'll assemble a running list of the Facebook ads you've been exposed to, so now you can see the bigger picture of how Facebook sells you to advertisers and how you've been categorized, and it will hyper-accelerate the ability to flush the cache of what Facebook has on you — since Facebook offers that — and let you clean it. Is this something that we could build? Possibly. We'd have a big conversation about IRBs, about what to do with the data — and before we were out of the box, these folks just built it. There's some nice synthesis here of ignoring the rules, breaking the rules, forging ahead, dealing with stuff — something we want to inspire in our circles, while at the same time realizing that even researching this stuff carries its own implications, and wanting to be mindful and serious about that. This calls to mind Arthur C. Clarke's third law: any sufficiently advanced technology is indistinguishable from magic.
He was cribbing a little from Leigh Brackett, who put it more bluntly: "Witchcraft to the ignorant; simple science to the learned." Technology is shiny. It's attractive. We don't care about what's underneath, a lot of the time, by the very way that we're wired. And it means you end up with basically just a small number of nerds in one corner who happen to be interested in it, who know how it works, who may not be bound by it as much, or who are even in the process of changing how it works; and you have Luddites in the other corner who say it doesn't affect them because they just read old-fashioned books and don't have a cell phone — there's a great cybersecurity expert kind of in this zone, Dan Geer; I try to stay in touch with him, and it's really hard. And in the middle is the rest of us. Our goal as a center, I think, in the public interest, is to dwell in this space and to figure out what counts as good values in this space: to what extent can people here not just be led by the nose? And to the extent they're not interested in learning about it — they'd just rather not be screwed over — what would the right incentives and systems be to prevent that? These are the questions across the huge constellation of projects that, in the last 15 minutes we have, I think we'll have a chance to broach. This is the common thread among them: people coming together in the public interest to understand better what's happening, to ask what impact it's having on the world and on people, to make moral judgments and debate about whether this is fine or not — and if it isn't, or if it isn't as fine as it can be, what to do about it and how. That's what we stand for, and it's something in which we desperately need company, because otherwise we stand parochially, and that's not what we want to do.
That's why we hope that you will join us: if you're considering it, get a sense of the different projects we have, and see where you can contribute. And I hope we'll have a chance in the next couple of minutes to hear from some of the folks who are working on different things as well. But in that sense, just let me thank you for coming — and thank you again, Charlie, for your introduction and for setting in motion what really has been, in the words of John Perry Barlow's former band, a long, strange trip. Thank you. So, we know that some of you have somewhere to be at one o'clock. We won't take offense if you take off, but for those of you who have just a few more minutes, I'd love to open it up more broadly — and I can just start, like, cold-calling people. I don't know — Jenny Korn, are you here? You had actually reached out and were ready to tell us about one of the working groups you run at BKC and what you're up to. I don't know if there's something you want to share right now. Across the street at BKC, we have a group that takes a look at Black youth and digital equity. We have another speaker coming up; she'll be talking about Asian American activism online. And we have an email list that you can subscribe to from the BKC main site. If you have any questions, you can just contact me, Jenny Korn. Thank you, Jenny. I've been helpfully informed by Carrie that I neglected to mention what of course we now take as an article of faith: this is being recorded, preserved, and in fact webcast live. So if you have some reluctance about going on the record by speaking up, thankfully I've just been informed of that before you did. So hello, everybody out there in internet land. With that known — Jenny, just one more quick bite at the apple here. You've been in BKC circles now for a number of months, and I'm curious, at the risk of putting you on the spot:
Anything that you either were sure of that got unmoored, that got less certain — or something that was less certain that got nailed down — as a result of activities around here? I mean, when I came in, I knew that I wanted to have a focus on racial justice, and that's not always an area of interest for everybody, to be real. And so I was concerned when I got here about whether or not I would find a critical mass of folk, and I am grateful that I have found plenty of people who are interested in talking about race and racism and making this world a better place. Even yesterday at the BKC open house — which I thought was a great idea to have — lots of folk came by to say, yes, add me to your email list. So I think that one of the beautiful parts of this community is that if you put an idea out there, more often than not at least three or four people will come back to you and say, yeah, let's talk about that. Great. In fact, we have a huge list of working groups and such. I don't know if there's anybody who wants to say anything about them — I saw a hand up over here as well. We should wait for a microphone to come find you, so that you can be broadcast to the world. Hey, JZ, thank you very much — compelling talk, as always. Alvin Solehi, I'm a returning affiliate this year at the Berkman Center. Quick question for you: obviously, we now have a lot of these big tech companies coming to Capitol Hill to testify, and I'm curious, from your vantage point, having been in and seen all of these things for decades now, at what point do you think it is or isn't appropriate for governments to get involved from a regulatory perspective? Well, I don't think there's an easy "it's time" — a give-them-a-five-minute-head-start kind of thing. I think that implicitly, for a lot of the people thinking about this, especially from law professordom, their sense was that regulation is structurally difficult. The field is fast-moving.
The decision makers in the regulatory apparatus are older; they're not native to the technology. Is there some way, in the words of Barlow, we can make our own rules, so that we don't need the weary giants of flesh and steel to make them for us? But it turns out that's not a binary choice — so much of this environment, especially if you think back to the layers, is shaped and constructed in a regulatory or subsidy kind of way. Decisions about net neutrality are going to have implications all the way up the hourglass. Decisions about Facebook — especially as it and Google and others, no coincidence, have gotten into infrastructure: Google Fiber, zero rating — have implications down the hourglass. So if you instead see it as a question of what shape government regulation or influence will take, understanding that there's no way not to influence it, that leads to a much more complicated set of questions, for which abstention — or how to approximate abstention — is still, I think, an arguable position. But it's one that really has to be laid out, rather than just assuming that if you don't see any laws passing a particular parliament, it must be that government isn't intervening. And I don't know if, in answer to that, I should call on Kendra Albert, who has seen things both from the perspective of petitioning government — working on a process with the Library of Congress about what kind of, quote unquote, hacking should be permitted in the public interest; that's an interesting regulatory intervention, an exception to a law that otherwise prohibits that hacking — and from the point of view of the innards of a company. A company like Cloudflare, whose founder not that long ago decided it wasn't going to host the Daily Stormer anymore. It was just: it's out of here, I woke up this morning and that's my decision, and, you know, so sue me, you're not going to win. That is, in its way, a form of market regulation, and it's something also to be contended with. Kendra?
Sure. I haven't been cold-called since law school, so this is great. I guess I would share Jonathan's instinct that the absence of regulation doesn't mean an absence of actual intervention into the markets. And even though tech companies may be getting more involved in Washington, either voluntarily or involuntarily, I think there has been a long history of some folks on the West Coast thinking that they just don't need to participate — which I think was probably the wrong answer — and some folks thinking that by participating they can gain an advantage over their competitors. So I think the story is always complicated. And I often think — we've actually seen this with the EU copyright directive stuff, although that's not my area of expertise — one problem is that regulators primarily hear from big companies, so that's who they think is the only interested set of parties in a particular regulatory fight. Whereas often, for all the reasons Jonathan has talked about, there may be really small players who have a real stake in the outcomes of regulatory intervention who aren't getting heard from. So I hope that those folks will also succeed in having their voices heard. Mm-hmm. I'm from the Shorenstein Center at the Kennedy School and the MIT Media Lab. And I have a history of being involved in many of these things — I spent some years in prison in Iran because of them, basically because of my blog. Anyway, my question is more general now. With the over-privatization of all aspects of government duties, and the erosion of any public space, any public speech, any public media — increasing especially in this country — where do you think there would be anything left of public interest, when the space for anything public is eroding and disappearing?
So the best example for me now is that the CIA apparently closed a section that dealt with disinformation after the collapse of the Soviet Union. What strikes me is that the same duty, the same task, is now being outsourced to private foundations and to universities — to think about how to protect elections and democracy. This used to be the government's duty. So when you see even the security of the most basic principles of a state — which is democracy — being privatized and outsourced, how would that public interest manifest itself? Well, I sympathize with your sense of escalating panic over the diminution of the public sphere. I also realize: do we even now have an ideal of what the public sphere would look like? If we could wave a wand and see that society, and the companies within it, and the networks they build and use, were structured a certain way — what would that be? I find that a really interesting question, and one I cannot instantly answer. If you'd asked me 15 years ago, I would have been more confident that I could answer it, and it would have had something to do with first in, first out: people post stuff, it comes out, and people could write their own filters to decide what kind of stuff from that firehose they want to see. But that's not realistically even the ideal world we would live in, with the folks in the middle of that dotted-line diagram of technophilia. And as soon as you're not saying it's first in, first out — I mean, the Hyde Park Speakers' Corner was never going to scale — what does that look like? And who would be serving, for those who aren't willing to roll their own, to discriminate between what they should see and what they shouldn't, much less between what's real and what's fake? I feel like there are hints of a way forward. It's not a challenge that should give rise to paralysis, to a shrug of "eh, what are you gonna do?"
But I'm at least acknowledging that it's not only — which it is — a movement away from government-mediated public spaces that, in many jurisdictions, had special protections for speech that would be unpopular, or otherwise driven out if it were just a market-based kind of thing. It's not just that. It's a movement from the organic to the controllable — from that which was just left to chance. You know: it's a marketplace of ideas, we'll see what comes out, nobody ahead of time can say who's going to win. More and more, it's really hard to believe that. And in a time when we have so little collective trust in any institution — so little trust that we're like, "Bitcoin, that's what we need. Why? Because it's math." — that's a really hard time to be thinking about whom we would trust to construct and manage a public sphere among us. So I both share your worries and I kind of raise them, in the sense that we're just in a strange time as humanity. Well, it's surely yes, and surely even more than that. I mean, this is where people who are really into UX design would have a lot to say. People who are thinking about completely different ways to store information, to search it, to display it, would have a lot to say, because that affects the "can" element of the inverse Kantian equation of can-implies-ought. And do we want to change that "can"? Put things that are controllable now back into the realm of the uncontrollable? Even in the realm of the uncontrollable, you can see that determined states, with the resources of a state, can end up shaping that space in ways that look organic — which is exactly what you were describing as an abdication by authorities, and a turnover, to maybe-unready academic and other public nonprofit institutions, of the job of trying to give us a sense of the landscape we're on. I mean, it's an exciting time to be studying this stuff — and one in which nobody has figured it out. I read a lot of papers; I'm a co-editor of a cyberlaw abstracting journal.
I see some papers come through. Nobody has figured this out yet, which means great opportunities for written work, or third-year papers, if any of you can solve this. So, with that modest challenge in front of you, I see we're at time. I just — Can I just say a word? Yeah, sure. Charlie is going to bring us in for a landing. Visit our website, visit our house at 23 Everett, and I hope you'll get to know the many, many things about the center we didn't even get to talk about today. Charlie, for final thoughts. I take you back to Lessig's picture: the surrounding forces with the dot in the middle. Lessig called that the pathetic dot. He meant that dot to represent us. So my perspective on this is not from the government's perspective — on Lessig's map, that's up in the north part of it; that's the law part of it. The law is outside of us. We here are the residential university that Jonathan spoke about. We here are a community. We here are a network — a network with a boundary around us. And if we are to realize something positive out of this fearsome picture that Lessig paints, of forces tightening around us in a way that leaves Jonathan without any answer, the only place for us to look for the freedom of mind that Barlow speaks about is in. We need to intensify our network — not spend our energy figuring out how Facebook can solve its network problem. That's fine if that's what you want to work on. But if you want to work on where the future of this university is going, you want to ask: who's got the interest in pursuing truth? It's not Facebook. It's not the CIA. It's only the university. So somehow we have to figure out how we, as the pathetic red dot in the middle of those forces, become strong.