For both lectures, I've titled them Gaining Power and Losing Control, already using words that carry a lot of meaning, and different meanings in different fields. I want to note that we're at the occasion of a real shift in thinking. This is Wired magazine, circa 1997, embarking upon the long boom: 25 years of prosperity, freedom, and a better environment for the whole world. Well, okay, that's what it said on the tin, but that wasn't exactly what was inside. Several decades later, in 2018, you see even Wired has come around to say there are some issues, and we need to talk frankly about them. So if I were characterizing 2018 and our consciousness about technology, I would say it maps quite nicely to a vague sense of unease. By the time we get to 2019, we might call it escalating panic. And it's a new year, welcome to 2020, aka mortal terror. I'd like to try to map what has us so concerned, because a lot of the promise of 1997 has paid off. A lot of what we thought the technology could do, including binding everybody together with suffusing networks to which anybody can contribute, has come true. And yet we're still so concerned. I think a big part of what's occasioning the challenges in front of us right now, and our sense of unease, is that the technology has worked. It's done the job of converting what was formerly fortuity, something you can't predict. It used to be anyone's guess what would happen; even if you're very powerful and have big banks of computers, you couldn't control what's going on in the world, because it's a complicated world out there. More and more, that is less and less true. And figuring out the allocations of power, as humanity as a whole, in the economic aggregate sense, gains more power through technology.
While each of us, including the people in this auditorium, in our respective occupations and positions as citizens, may be feeling less empowered, somehow more embattled: that is the tension I want to explore today. A good example of that tension is automated vehicles, for which a baseline case and maybe three variants are worth going into. The baseline case: you used to have to drive yourself; soon, and for some already, there's your own chauffeur. A robot driver will take you somewhere, and the big issue in automated vehicles is just to make sure they're at least as safe as a human driver. That turns out to be a standard perhaps not as high as one would think. But then let's do some variants. Here's one: a police officer thinks that you're up to no good and needs to bring you to account, and so wants a warrant for your arrest, goes to a judge, and the doors lock on the car, automatically identifying where you are. Then the car drives you to the nearest police station to drop you off, as if you were a package. A great example of a movement from the roulette wheel of fortuity, of both detection and control, to: oh, well, now we can do that, and who wouldn't, if you could find the person you're looking to arrest and you've already got the legal process lined up? Why wouldn't you do it? A shift, though, from no control to control, and then an allocation of that control from the car to somewhere else. Oh yes, and you're not very happy if you're the person in the car. Another case: how about a city that's about to face a hurricane, as Houston did with Hurricane Harvey? They did not order evacuations then, because they were worried that in the chaotic rush to evacuate, people would face the hurricane still out on roads that would be clogged.
But imagine if these were automated vehicles: they could, in coordinated order, pull up in front of your house, your own car, and say, I'm leaving in 15 minutes, the trunk is open, you'd better put your stuff in, and then I'm going, with or without you. That would occasion a very orderly, and mandatory, exit from the city. Not a bad use of this kind of technology for public safety, but surely a reallocation of a new form of control from what previously, from the societal point of view, was fortuity. And finally, case three, what I would call sponsored rides. The normal ride: you tell the chauffeur where you want to go, and the robot takes you there. But now let's do, and you heard it here first, an Uber for Facebook, in which the car is free. You don't have to pay for the ride; it's not your car; it's just free rides anywhere, except it's a sponsored model. It will take you to McDonald's first and wait a certain amount of time. You have a chance to get something at the drive-through while you're sitting there, and then you go on to your destination. So is that empowering for you? Is it empowering for me? I don't know; I haven't thought about it as much as I should. But that's what awaits us now that the technology lets us reallocate power this way. So the first of these two lectures today is what I call Between Suffocation and Abdication: three areas of digital governance. The examples I'll use are less kinetic than the most kinetic thing in common use, like an automated vehicle, but I think they still connect quite nicely to this shift from fortuity to control. Let me quickly explain what I mean, to tee this up, by between suffocation and abdication, and the three areas that very nicely illustrate it.
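To make the reallocation concrete: the sponsored-ride scenario is trivially easy to express in code, which is part of what makes it worth pausing on. Here is a minimal, purely hypothetical sketch (every name here is invented for illustration, not any real platform's API) of a route planner that silently prepends a sponsor's stop to the rider's requested trip:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Stop:
    name: str
    dwell_minutes: int = 0  # how long the car waits at this stop

@dataclass
class SponsoredRide:
    rider_destination: str
    sponsor_stops: List[Stop] = field(default_factory=list)

    def route(self) -> List[Stop]:
        # The rider chose only the final destination; the platform,
        # not the rider, decides every stop that comes before it.
        return self.sponsor_stops + [Stop(self.rider_destination)]

ride = SponsoredRide(
    rider_destination="Home",
    sponsor_stops=[Stop("McDonald's drive-through", dwell_minutes=10)],
)
print([(s.name, s.dwell_minutes) for s in ride.route()])
```

The point of the sketch is how little code separates your chauffeur from their chauffeur: control over the itinerary is a one-line allocation decision.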
1995: the big worry among people in my neck of the woods thinking about technology was, let's make sure that with all these new things we can do, including surfing a worldwide web courtesy of Sir Tim, being able to download and upload all sorts of things, we don't get treated like sheep. We're worried that the government will surveil us, will control us; we've got to be resilient against that threat. That's why I call the era starting in 1995 the rights era, and here I mean rights in a very narrow sense. There are lots of meanings of rights; I mean it in the Americanized, prototypical sense of: I'm just here, in my automated vehicle or whatever it is, leave me alone, let me have my freedoms. That distinct sense of rights. By 2010, a parallel and not particularly compatible set of thinking has arisen. The very platforms that were enabling the things we wanted, to be left alone, to not be treated like sheep, now the concern is: wait a minute, these platforms are leaving things alone. They are abdicating. They are see no evil, hear no evil, speak no evil, and that is not good, because there's a lot of evil that they are facilitating and that they could stop. I call this the era of public health. It's very different from the rights era, and it focuses on the ways in which technology is allowing new forms of harm to come about, but may contain within it the seeds of amelioration, if only we are bold enough to compel or persuade those intermediaries, those platforms that are empowering us, also to impose limits and control in the name of public health. And then I'm going to suggest that looking ahead, what we want, what we ought to have, is in fact an era that I call the process era. It's really going to be hard to reconcile the first two eras, so there ought to be ways to facilitate agreements among the people affected, across jurisdictional boundaries, to formalize them in some way, effectively writing them down, and then to say cheers and feel good about what
that process generates, even if you don't agree with a given result. I call that the process era, and it is, right around now, if we are lucky, just starting up. In lecture two today I'm going to talk about something else: the way in which machine learning in particular, but a lot of technologies in general, are accelerating this movement, again, from fortuity to control and knowledge. But there I'm going to talk about a very specific kind of knowledge, insights without explanations, and why correct answers with no explanation offered, or obviously reverse-engineered, could be a problem. So, all right, let's go back to lecture one, since that's what we're in. For this I want to open with Lasse Gustavson, as a way of just polling the audience here today. This gentleman is Swedish. He was a firefighter, in his first week on the job, when a propane tank exploded near him as part of his work, and he was burned. As part of celebrating his birthday, he posted this picture of himself on Facebook, and Facebook took it down: it violated their terms of service. He complained, and he noticed that a number of other people who had literally been burned had also had their photos taken down for violations of the terms of service. The initial response by Facebook was to say: sorry, your appeal goes nowhere, it stays down, and if you keep trying to post it, we're going to suspend your account. It causes me to ask: how many of you have ever thought, in a case like this or one you could think of, why should Facebook or anything like it be intervening in content that is utterly legal? There is no hint of illegality here, even according to Facebook, and yet they're saying they should take it down. Who makes them the judge? Why should they be the judge? I'm curious how many people have ever found themselves thinking that thought. I see some hesitant hands, and then, as people see other hands up, more hands coming up. Okay, certainly a significant number of us feel that way. By the way, after the second posting and
another round of outrage on social media, he got an update on his phone saying the warning was a mistake, and now the photo can stay up. Okay, survey question number two. This is a Facebook ad from Earthley; Earthley is a company that sells soaps, salves, tinctures, and more, and this paid ad by Earthley is a guide to pertussis. If you read this guide to pertussis, it includes this: did you know aluminum in some vaccines is tied to neurological damage, autism, learning disabilities, and more? Okay, that's not true; at least, unless there are some postmodernists among us, as best I can say, that's not true on the state of scientific knowledge. It is also true that ads like that are fueling people not getting vaccines, and possibly, although to establish this scientifically is tricky, contributing to a higher incidence of diseases that had formerly been conquered in certain areas. Certainly a recent study of pediatricians says 63% of them report that misinformation from social media is a major barrier to persuading parents to have their kids vaccinated; another 27% say it's somewhat of a barrier. So here's the poll question: have you ever found yourself saying, why is Facebook earning money, cashing checks, from the provision of information causing physical harm in the world? Okay, and you can see that 2010 comes up sooner than 1995. The first was a 1995 rights question; this is a public health question, and there's a lot of debate over what role, say, Facebook should play, either in leaving us alone or in intervening to make sure harm doesn't happen. It really amounts to two issues of digital governance in 2020. Here they are. Number one: we don't know what we want. Within each of us, there may well be a raising of the hand twice; I would raise my hand for each of those hypotheticals, and then try to figure out how I distinguish when I want Facebook in and when I don't, but I'm not sure. Here's issue two: we don't trust anyone to give it to us. The idea that somehow Facebook's customer service
team is going to have a deep sense of scientific truth and validity and this and that, and make wise decisions as they process each thing they have to decide about, at scale, seems fanciful. So if we solve these two problems by the end of the lectures, we're in good shape. All right, let's talk about the rights era a little more. Again, it's a very American sensibility, tracking the Americanization of the initial expansion of the internet and the major applications built on top of it worldwide. Here, of course, is the First Amendment: don't let us be treated as sheep; we are citizens; we should speak; government, stay out of the way. That was applied in a digital context in 1996 by the, so famous now it's almost a caricature within the field, the Declaration, sorry, A Declaration of the Independence of Cyberspace, by John Perry Barlow, who waxes eloquent about the need for independence from all the governments of the world, those weary giants of flesh and steel, with no moral right to rule us; we'll figure everything out. Who is us? We're the digital netizens. A lot of the examples over the ensuing years track very nicely to that sensibility. Here's one from China, where MSN, Microsoft, had gotten a blog service established there called Spaces, and anybody could make a blog. My colleague Rebecca MacKinnon tried to sign up for a blog, and when it asked for a title, she typed, this is the title: I love freedom of speech, human rights, and democracy. It gave an error; if you translate the error, it says: you must enter a title, it cannot contain profanity, please type a different title. This is classic bait for Barlow to say, yes, that's why the rights are so important. Closer to home, in the United States, in North America: one of the first editions of the Kindle contained a third party's rendition of George Orwell's 1984. That third party thought it was in the public domain, so happily just typed it up and sold it for 99 cents a copy, and then realized it was in fact free in Canada but
under copyright in the United States. Amazon had a panic that they had had a hand in distributing copyrighted material without permission, reached into every Kindle, and deleted 1984 off the Kindles that had purchased it. You could not think of a better book for this to happen to. You don't have 1984; you never had 1984; there's no such book as 1984. That's the kind of thing that, for the rights era, just feels like: yes, I get it, this is what we have to be alert to. And just to be vendor neutral, here's War and Peace on the Barnes and Noble Nook. As you're reading it, you hit passages like: the sulphur splinters Nookd by the tinder. What is going on in this copy of War and Peace? It turns out that every place the word kindled appears in the original book, it's been replaced by Nookd, as part of the worst product placement ever, but a good statement of the rights approach that says: don't intervene in my book, please; I want to read it the way the author intended it. And a final example on this front: this is Microsoft's Kinect. Why not have an all-seeing eye in your living room at all times, for video games and such? Here's a patent Microsoft filed related to it. It includes a consumer detector, that's the term, consumer detector, and the idea is that you could use it to sell movies at different rates depending on how many consumers were in the physical room to watch the movie. Here again, it's total rights era stuff: it's treating people, users, consumers, like sheep. But it's also a little bit like the car going to McDonald's; it might make for cheaper content if there's only one of you, and in thinking that through, the rights framework offers some way to think about it. Also notable: for copyrighted material, the rights era tended to favor the consumer of content over the rights holder of content, again a very distinct flavor of rights that I'm talking about in describing this era.
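The consumer-detector idea reduces to a very small pricing function, which is worth seeing plainly. This is a hypothetical sketch of the mechanism the patent gestures at; the rates are entirely invented, since the patent describes a mechanism, not prices:

```python
def per_viewer_price(base_price: float, viewers_detected: int) -> float:
    """Price a one-time movie rental by the number of people the
    living-room camera counts. The first viewer pays the base rate;
    each additional detected viewer adds half the base rate again.
    (Invented rates, for illustration only.)"""
    if viewers_detected < 1:
        raise ValueError("need at least one detected viewer")
    return base_price * (1 + 0.5 * (viewers_detected - 1))

print(per_viewer_price(4.00, 1))  # a lone viewer pays the base rate
print(per_viewer_price(4.00, 3))  # a roomful pays more
```

Notice that the function itself is utterly mundane; what the rights era objects to is the sensor feeding it its second argument.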
Now, Barlow wasn't just saying it would be immoral for governments to treat us like sheep and that we need a bulwark against it. He also, interestingly, said: nor do you possess any methods of enforcement we have true reason to fear. You can't really get us in cyberspace, so we're not sheep; we're doing just fine. That's also recognized by his colleague John Gilmore, saying the internet treats censorship as damage and routes around it. And in early internet law cases, including in America, there was a recognition by courts of the difficulty of asking any intermediary, in this case AOL, to intervene at scale. They say: you just can't do it; it's staggering; there's so much going on, including possibly so much bad stuff, but unless you want to just call the whole thing off, we've got to tolerate it. That spirit found itself in strong places by 2005. For this example, this is a search for the word Jew on Google, and the second hit is Jew Watch News, which includes a section on Zionist occupied governments. Jew Watch News was a neo-Nazi site; if you look at it more closely, it's proudly neo-Nazi. A lot of people wrote to Google and said, why is that the second hit on the word Jew, and Google said, that's just how it works. In fact, Google bought itself an ad to appear at the top, called offensive search results: we're disturbed about these results as well; please click to learn more. If you click to learn more, it says, yes, this was a very unexpected result. Read: it's a roulette wheel; what are you going to do; we're surprised too. So surprised that we have pre-written this entire essay about it, because it's so surprising. The beliefs and preferences of those who work at Google, as well as the opinions of the public, do not determine or impact our search results. Just saying: look, it's all science, it's computers, and we don't touch them; if we did, that would be unfair. We're really sorry you were upset; thanks for telling us; if only we could do something, Google. And, you know, the
Anti-Defamation League, linked here, which is an organization expressly chartered to combat anti-Semitism, put out its own press release supporting Google in this, saying it's not intentional, it's not a conscious choice by Google, but the result of an automated system of ranking, as if that system had been dropped off by a ship from Mars. So that was the heart of the rights era, still going strong, and I can tell you that at the time I supported this as well. It seemed like opening Pandora's box to ask Google to somehow get in and start hand-tweaking search results. That's an era before the conversion had become more complete from the roulette wheel to something more controllable. So what brought us to the public health era? What got us, around 2010, a different way of thinking about governance in this space? Well, one thing is what I, using the terminology of Randall Munroe here, call the digital evolution from tool to friend. A search engine is basically a tool. You look at it like a research tool: you type in something, you get results. Here on Bing I asked, should I vaccinate my child? Four out of the five responses on Bing were no, you should not; one, from the American Centers for Disease Control, says you should. Does Microsoft bear any responsibility for that? Under the Google theory we just looked at, the answer would be no: it's all automated; it's giving you a window on the web, and if you don't like what you see, blame the web, don't blame the person who offered you the window. I think in 2020 that may not be as much our sensibility about things. But there's another phenomenon: we're not just talking about search engines anymore, the so-called organic results; we're talking about the knowledge graph. This is an effort, in this case by Bing, Microsoft, to assemble, like a Frankenstein monster, in real time, some statement about appendicitis, meant to be more authoritative than what you see over here. Now, if this said appendicitis is caused by an imbalance of the four bodily humors and you
require leeching in order to get better, and here are some sponsored links suggesting leechers below, with their Yelp ratings or something, I don't know, how many people would be willing to say, Microsoft, knock it off, in that instance? All right, a lot of hands going up. And that's not just because our sensibility has changed in a vacuum from 2005, when I think you would have thought, let it ride, on the Google search before, to now. It's also because these intermediaries are not just tools. They're your friends. They're trying to give you actionable advice that you can use. Another great example of this is my colleague Avishai Margalit. If you look him up, his knowledge graph from Google says that until 2011 he was the Kennan professor at the Institute for Advanced Study, which is weird, because according to Google he died in 1962. Now, Margalit writes to me and says, hi, I'm not dead, always a good opener to a message, how can I fix this? And I said, did you try the feedback link? Yes, I did. Okay, talk to a Google engineer, that kind of thing. But it seems like there's more responsibility on Google here, about information it has processed, in a way that goes beyond falsehoods simply appearing out on the web, where it's like, what are you going to do? Well, when you look at our digital assistants and concierges, that is all the more present, when they're only offering you oracular answers. This is the original version of Siri, your humble personal assistant, and there are all sorts of counterparts to it: Cortana, Google Assistant, Alexa, et cetera, et cetera. And here's a question somebody asked the Google Assistant, the physical box, right towards the end of President Obama's term in 2016: is Obama planning a coup?
According to Secrets of the Fed: according to details exposed in Western Center for Journalism's exclusive video, not only could Obama be in bed with the communist Chinese, but Obama may in fact be planning a communist coup d'etat at the end of his term in 2016. Oh well, asked and answered. I asked a Google engineer about this, and the reply was, oh gosh, that's awful, it's pronounced coup d'etat. It's like, no, you're missing the point. And this is an equal opportunity problem. Hey Google, are Republicans fascists? According to debate.org: yes, Republicans equal Nazis. Well, now, regardless of your views on any of these questions, which I don't mean to presume, the idea that these weird knowledge towers we welcome into our houses, just sort of in a casual way, are throwing off answers like that is just so emblematic of this move from tool to friend. And therefore the rights era answer, which says get out of the way and give me answers, well, they're the ones giving you the answers. There's no getting out of the way; you're in the way. In thinking that through, the rights architecture doesn't have a ready answer, which is why we see a move to public health. In fact, now, for Alexa, so that you don't end up getting recommended a leeching, they're saying that the National Health Service in the UK is going to dispense medical advice in answer to questions. The first thought is, that seems strange, and the other thought might be, good for them: people are going to be asking these questions anyway; maybe they could be answered by a more trusted source, and credited there. So what I'm identifying, I think, is an inverse of the Kantian bromide that ought implies can. If you tell somebody they ought to do something, it has to be that they can do it. Fair enough; it's weird to say that you must do something if there's no way you can do it. But I want to ask, from a regulatory and ethical perspective, when does can imply ought? When is it that, once you can do something, now you're responsible should
you not do anything? That's the problem, the issue of abdication, raised because no longer is it, well, it's just a bunch of bits, how could these intermediaries possibly control it, the thought of the rights era. We can see that happening in Bono's encapsulation of it, because we can, we must, on the streets of Davos at the annual meeting of the World Economic Forum. So there's a new power that is finally prompting us to ask that question, especially in the area of social media. So let me talk about that. Facebook originally was just, let's see what our friends are up to. More and more, over the years, it became, let's learn about the world: I'm just going to go to Facebook and see what's going on out there. And that's where, by 2018, you have a UN Human Rights Council talking about the ways in which Facebook has played a significant role, and one that is underexplored, because they have not been forthcoming with data, in possible ethnic cleansing in Myanmar. This is the UN speaking, which is usually quite reserved in its judgments. Facebook's own response to that particular report was: connecting the world isn't always going to be a good thing; we do lose some sleep over this. That was a notable admission from Facebook, which previously had basically leaned on rights era argument and terminology to say, we're connecting people; sometimes freedom has its cost. This is saying, well, we're losing some sleep over it. Sure enough, over the years Facebook elaborated a terms of service, at first in secret, about what they wouldn't allow, irrespective of whether it was illegal: let's say it isn't illegal, but we're still not going to let it on the platform. At one point the Guardian picked up leaked slides from Facebook explaining some of their rationales. These are slides from Facebook for the purpose of training their own content moderators, of which there are thousands, reviewing content. On the area of credible violence, these are examples: someone shoot Trump, as a sentence,
not allowed; if that gets posted on Facebook, it's going down. How about, kick a person with red hair? In this example, again from their slide, that stays up, because having red hair is not a protected class. To snap his neck, make sure to apply all your pressure to the middle of her throat: that stays up, because it's not telling you that you should do anything; it's just saying, if you wanted to do it, this is how you would do it. Again, a very important nuance that they wanted to make sure their content moderators knew about, so they would not be over-censoring material. Let's beat up fat kids: also allowed. Okay. Shortly after this leaked, Facebook revised its content policy to put X's next to all of these, and they then started to release an extremely elaborate set of rules, rules that, if they were issued by a government, would consume a First Amendment class in America with reviewing them. Because it's not a government, no class is consumed with reviewing them, even though they have far more power in shaping speech than government rules about pamphleteering, which can occupy an entire week of an American constitutional law curriculum. Mark himself, Mark Zuckerberg, talked about the ways in which they had noticed at Facebook that as postings got closer and closer to crossing the red policy line that Facebook had set, they got more popular. People really liked sharing stuff that was edgy, and would probably keep sharing it if it weren't taken down. And he said, why don't we just change it so that as it approaches the line, Facebook makes it less and less viral? Might be a good idea, but a great example of just how refined the control is that Facebook is now taking up as a mantle, even over organic posts. And indeed, more recently they said they deleted nearly a billion posts, mostly spam. What were they?
I don't know; they're just letting you know they've been out there deleting hundreds of millions of posts behind closed doors. Maybe one of these would be a candidate. This is from the 2016 American presidential election: FBI agents suspected in leaks found dead. And this is fake news, I can confidently say, in the very narrow sense: it's from the Denver Guardian. If you live in Denver, Colorado, you cannot subscribe to the Denver Guardian. It does not exist. It is not a newspaper, and yet it represents itself as one. If you click through, there is a Potemkin site that looks like the Denver Guardian, and there's about one other article, and that's the entire operation. And here's a depressing statistic. During the latter part of the presidential election, these are the most shared stories from each of the major publications. My hometown Boston Globe's most shared story on Facebook was shared just over 100,000 times; the Denver Guardian, nearly 600,000 times, attributable to that article. Now, is that a problem from the rights era?
Probably not. From a public health point of view, it may well be, and they could, if they wanted, intervene. Now, the ad ecosystem is worth dwelling on for a moment too; that was organic content, so let's talk about ads for a minute. This is the targeting window, very evocative military language, for somebody wanting to place an ad on Facebook. ProPublica went in and said they wanted to target an ad at anybody whose field of study is Jew Hater, and out of 2 billion people, it turns out there are 2,274, to the person, to the account, who describe themselves that way. Facebook then says, wow, through a little bit of machine learning, we can offer you some other suggestions, given that that's who you want to reach with your ad, including How To Burn Jews and German Schutzstaffel, and then at the bottom: your audience selection is great; you can reach 108,000 people. Now, this is from tool to friend. This is Facebook as an unconscious entity; again, think about the ADL answer to the previous invocation of anti-Semitism. They're not going out to recruit neo-Nazis, but the software is so flexible that once it knows what you're doing, it just rolls right along. That is a movement from tool to friend that might ethically call for more responsibilities than if it were just, hey, it's a service, bad stuff gets said on it. Indeed, among the different employers it offered as suggestions, alongside Nazis, nationalism, and German Schutzstaffel, were refugee camp and Italy NYC, which is a restaurant, Italian-themed: a weird resurrection of the World War II Axis inside Facebook, very strange. And the way they do that is by being able to learn a lot about the people visiting Facebook. This was their Valentine's Day post in 2014, pointing out that up to 100 days before a relationship begins, they can project which two Facebook members are going to end up declaring that they are in a relationship, months before it happens, possibly offering an ancillary service to would-be in-laws of an early warning system
and a chance to head off the impending collision. And sure enough, you always look to patents to see the highest aspirations of a company, without worrying about plausibility. Here's one from Facebook: predicting life changes of members of a social networking system, which includes predicting death, and not just your death, but possibly the death of a person or pet associated with the user. Now, this might just be for the purpose of being able, at the right moment, after a waiting period, to advertise them a new pet or something, but it shows how far the needle of control has swung to these platforms, for which, now that they can do it, we might ask whether they ought. Now, the platforms themselves have seen the specter of disapprobation by the public, in a public health sense, and of regulation, and have said: gosh, maybe we should dial down the can a little bit. Because if we do, Kant will help us: if can implies ought, then when we can't, you can't say that we should. And we've seen efforts along these lines in recent years. Twitter, just a few weeks ago, said, maybe there should be a decentralized Twitter, with less control over what happens; that's the point, in part, of a decentralized Twitter. And 2008 me would have been supporting this, and in fact I find myself supporting it now, but I am also conscious that it bears the cost of much less control, for better or for worse, if you're concerned about what's happening on Twitter. Mark himself, last year, issued a privacy-focused vision for social networking, which was end-to-end encryption, which again, many of us, including myself, are in favor of. But let's be clear: part of the benefit to Facebook of implementing that is that there can be much less demand upon it to take responsibility for what's going on. And of course, famously, four years ago, the dispute with Apple over whether the American government, with a warrant, can get into a locked iPhone: Apple fighting that, going so far as to say, no, no, we're not going to let this
happen, and to make sure that the architecture of later editions of the iPhone, the 5s and above, would have a special encrypted system in there, the Secure Enclave, so that even if they're served with a warrant and compelled, there's nothing privileged they can do to get in. We see this debate echoed even today, both in America and around the world; Prime Minister Cameron was very much on Attorney General Barr's side at the time, saying we need to be able to get into these phones. This is a debate about where to set the needle on can, and the thing is, we know you could make it so that you could. So when you make it so that you can't, and then say you can't, really you could. That's hard to say as a bumper sticker, but in fact I think it parses to both a grammatical and a true statement. So the rights versus public health debate is what I'm calling the debate between suffocation and abdication: rights as a solution if you're feeling suffocated by the intermediary, and public health as a solution if you feel that they are abdicating, that they should take a utilitarian public health view to prevent their products, which may be creating new harms, from occasioning those harms. And don't joke with us, platforms: we know you can, and if you can't, we know you could tweak yourselves so that you can. Now we are only now meeting in earnest the question about ought: when should they, knowing that they can? And I think so far the answer is just the tension. Here's somebody just last week on Twitter, reporting from Twitter, that they're going to have a new feature that lets you control who can post replies to you. You could make it so that only people you follow can reply to you. Very rights oriented, so that you control your discourse. Somebody then responds with a public health concern: oh great, so we're in the middle of a tornado outbreak, somebody posts a picture of a different tornado, says it's happening now, it's got a ton of retweets, and they shut off replies so nobody can correct them and blunt the impact. They're
raising a public health concern to a rights oriented intervention Twitter again can design itself however at once how do we answer this dilemma and the first amendment I think of this as a counterpart to the first amendment article 19 of the international covenant on civil and political rights from 1966 basically a first amendment style everybody has the right to freedom of expression but then if you read a little further it says ah yes but there are restrictions including public health fair enough again I feel torn you can get me thinking this way or this way depending on the fact pattern how do we reconcile them and the answer so far has been I don't know now the normal way to solve this would be take it to the Supreme Court of the jurisdiction in question have the legislature have government be the referee on these hard questions but we don't know what we want these questions are really hard and even if we did we don't trust anybody to give it to us trust in government empirically speaking including governments that embrace the rule of law is at an all time low so we're kind of stuck now one solution would be at least on the first part what do we want what should we want a massive subsidization of philosophers around the world so finally after several thousand years settle the issue of the balance between rights and public health and I want to go on record in this lecture as totally supporting a massive subsidy for philosophers they come comparatively cheap and it's a good thing I'm not sure if I'm doing the PowerPoint deck to justify that subsidy that I can promise instant results it's gonna take a while so what do we do instead well I say we need a process era and the process era look process has always been with us again governments decide stuff when civil society the public at large turns to them to settle a dispute with public implications but I want to identify maybe a specific form characteristics of process that might use the original qualities of the 
distributed network that many of us were so excited about in 1995 to try to assist in a trust-building exercise, to provide answers to it. This is especially important because, when you look at government coming in to balance rights and public health, as in Europe the right to be forgotten attempts to do, they issue a statement much like Article 19, balancing: you know, we need to say, is this still relevant, this thing that this person who doesn't like his own search result wants to have deleted? On the other hand, what about the public's right to know? It needs to be resolved, but there are too many such cases coming for the European courts to hear them, so, Google, you need to settle it; and it literally throws it over to Google to answer.

So what should the process look like? Well, one thing might be transparency, because every time I've described, as I did earlier, these sort of black boxes that oracularly utter answers, I mean, right, this is very Kubrickian, Clarkian, the monolith of 2001, and I couldn't help but be struck by the similarity. You know, it's like, is this what they were going for? And it's like, yes, thank you for informing me, but, you know, come on, I'm a human, I want to know what's behind stuff. Thinking about modes of transparency: when Facebook takes down 865 million posts, that ought to be recorded somewhere, somewhere not at Facebook, so that people can study it. Maybe not the entire world, but bona fide researchers, others, on a non-discriminatory basis. Make it kind of like the locked, restricted area of a library, where you can, even if you don't have the key, see what, you know, topically is there, and then have a process to get that key, orchestrated potentially by the great libraries of the world. And you have that sort of thing in German culture and law: going back decades, there's been this idea of the Giftschrank, the poison cupboard for censored works, and here you sort works into separate areas of libraries; if it's slippery or politically questionable, that's where they're guarded and published, for a limited time, for research purposes. Many academies in Germany have this so-called remota. And you could see implementing a counterpart to it for the private entities that have data, held only by themselves, that bears on what kind of job they're doing when they are meddling, data that the public, through its academic representatives in particular, ought to have a place to see.

We've seen among the companies enough awareness of the ethical implications of what they're doing, particularly around AI, that they're saying, okay, let's think of an internal process to make ourselves, our workers, and our publics feel better about things. So we've seen the rise of the chief ethics officer, somebody who's like, okay, this, in the hierarchical chart of the company, is where all the hard problems go. So somebody has to ask the chief ethics officer; don't like the answer, hire another, better chief ethics officer, which in turn will lead to subsidies of philosophers, to produce programs at universities for certification in chief ethics officering. And maybe it'll work, I don't know. This seems to sort of punt the problem, but at least it's admitting that we have a problem, including one of: isn't CEO already taken as a title? Like, which CEO would you like to ask this? I asked the ethics officer, not the executive, darn. So anyway, that's going on, and Microsoft has announced a special committee within Microsoft, the AI and Ethics in Engineering and Research committee, to again recognize the gravity of the decisions they are reaching, the understanding that they are having impact outside the company, in units that should be measured other than: is it making the company money, and if it's making us money, is it getting us in trouble that will cost us more money later, in which case maybe we don't do it? It's a different calculus this kind of structure invites, and in that sense I support it as a sense of an initial step. But it's still internal to the company. If these are really problems of such weightiness that they implicate public social issues, maybe there should be people from outside the company involved. That's what Google attempted last year when they created an external advisory board to monitor for unethical AI use. A week later, they dissolved the board, because there was dispute over who should be serving on the board and whether those were good people to weigh in on these decisions. I take this as a good sign, that people are taking this seriously enough to worry who's on the board. And remember, in the absence of such a board, it's just something inside the black box of the company that triggers the decision; this formalizes it, and again creates some externalization of what's going on.

And here I gotta say, Facebook has ended up taking the lead, in part, I think, because Mark Zuckerberg so single-handedly runs the company that he woke up one morning and was like, I want an outside oversight board, not advisory: when it makes a decision about a content takedown from which an appeal has been taken, we should be forced to follow it. Which is truly a disclaiming of the power, the growing power, that something like Facebook has. And Mark has been quite up front about this; I shouldn't have this much power, says Mark. And again, I think this is a very intriguing kind of thing. Will it work? I don't know. It's gonna have 40 people on it. It's the kind of thing where it's like, a retired Bolivian judge got paid $200,000 a year to be on this board and to take up a very difficult question of vaccine information, should it stay or should it go? Well, I don't know, they probably know more about this than I do in balancing stuff; I will defer to their expertise. That could be legitimacy-building. But there may be ways to go even further. One way of going further is suggested by this 1998 article, which, talking about search engines and their commercial incentives, said there's just no way to do this right; there's no amount of pushing around the relationships within and outside the company; we simply think that the for-profit
nature, in this case advertising, causes enough mixed incentives that we must have a competitive search engine that is transparent and in the academic realm. Who wrote this 1998 paper? Larry Page and Sergey Brin. That's true, although they organized their names alphabetically: Sergey Brin and Larry Page wrote that paper. This was the paper that introduced Google to the world, and then, I think, they quite literally got bought. That's usually not a nice thing to say, but I think for them it turned out very well. And that sentiment, though, I think was quite prescient in 1998, saying, you know, maybe you can't patch it. But we can't wait on a... I mean, let me be clear: am I calling for a nationalization of Facebook? Again, remember, we don't trust anybody to give it to us; I don't think nationalizing something is gonna help. But maybe there are other options that draw in the public at large, so that when there's a thorny question that involves ethics in some way...

Consider this very hypothetical Uber ride, in which somebody's going from Stone Avenue to the San Diego airport. The initial quote is 69 US dollars. Here is exactly the same itinerary at the same time asking for $79. Can anybody at this distance spot what is making the price change happen, hypothetically? Yes, battery life. That's a really good idea, because if the battery is low, throw on another 10 bucks; who's gonna take time to, like, check Lyft or some other competitor? You might as well do it now. Again, that could be reputationally dodgy; on the other hand, these things are so dynamic that, without a bunch of, like, really good testing, we're not gonna know why we get the price we get in any case. Is that good or bad? I would like to put something like that out to the public, the way that my colleague Iyad Rahwan does with this Moral Machine site, thinking about the ethics of automated vehicles. Here a vehicle has lost its brakes: should it keep going and hit these cats, or should it swerve, brakeless, and hit these people? This seems like an easy one, but that's the first level, and once you defeat the boss on this level it gets harder. And asking people, and even noting differences from one region, from one culture, to another: this has turned out to gather millions and millions of people, not bots, coming to the site and actually taking these tests and reflecting on it. I could certainly see asking that question about battery life, and a bunch of others that are probably harder than battery life, as a way of giving a company a basis on which to say: this is why we do what we do, and we've done it in consultation with people.

And in fact, I'd like to take the consultation with people even further. So this is an ad, a real ad, that appeared on Facebook from the North Dakota Democrats during the 2018 election season: attention hunters, voting in North Dakota could cost you your out-of-state hunting licenses, so maybe better not to vote. Interesting effort at voter suppression that's kind of self-targeting, no pun intended, to those who might be voting Republican, right-wing anyway, because if you're not a hunter, you wouldn't be troubled by this, even if you believed it. And what should Facebook do about this?
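Before taking up that question, a brief aside on the Uber hypothetical above: here is a minimal sketch, in Python, of what a battery-aware quote heuristic could look like. Everything in it is invented for illustration; the function name, the threshold, and the flat $10 surcharge are assumptions, and nothing here reflects Uber's actual pricing code.

```python
def quoted_fare(base_fare: float, battery_level: float,
                low_battery_threshold: float = 0.15,
                surcharge: float = 10.0) -> float:
    """Return a quote, adding a flat surcharge when the rider's phone
    battery is low -- the speculation being that a rider whose phone is
    about to die won't take time to comparison-shop with a competitor."""
    if battery_level < low_battery_threshold:
        return base_fare + surcharge
    return base_fare

# The $69 vs. $79 hypothetical from the lecture:
print(quoted_fare(69.0, battery_level=0.80))  # healthy battery: 69.0
print(quoted_fare(69.0, battery_level=0.05))  # low battery: 79.0
```

The point of writing it out is how small and invisible such a rule would be: a two-line conditional buried in a dynamic pricing system that, as noted above, no outside tester could easily distinguish from ordinary demand fluctuation.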
Facebook's current answer is total rights-era stuff: you don't want us treating you like sheep; this is political, this is very sensitive; there's gonna be debate over almost any form of ground truth that seems true. And so Facebook says they're getting out, to much disapprobation in the press and among academics about the way in which that represents abdication, because, again, we live more in the public health era than we do in the rights era these days. But I say, let's have process intervene. So what if we took this ad and, instead of doing nothing with it except showing it as requested, we send it to an American, because it's for an American election, public high school class, chosen at random, having prearranged to participate in the program, possibly funded with trust money from Facebook that Facebook can no longer touch, an endowment. And with the help, and grade, of their teacher, they actually talk about this ad; they apply standards of truth to it and come up with a decision, maybe with the help of a librarian local to the school as well. And then they write up their decision, and maybe their decision is: no, this ad shouldn't stand. At which point the ad goes away, and Mark, like with his independent review board, says, don't blame me, blame the anonymous high school somewhere in the heartland, or possibly the coasts, that decided this was not a good ad.

And I recognize this is a crazy idea. It is crazy, and I'm trying to figure out why it shouldn't happen. If you have ideas, great to hear them; I have a few, I know. But this is an example of a process intervention that, again, at the very least, would treat content like this with the gravity it deserves, instead of saying, it's too sensitive, I'm gonna let it go. Say, no, no, truth matters, but we're gonna have to figure out what truth is, with a process that actually has its own degree of fortuity to it, a kind of fortuity through randomness, because what is this class gonna do? That is, in a Rawlsian sense, I have a veil of ignorance: I have no idea what this class is gonna do, which means I don't know, structurally, if I am an interest consistently right or left, whether to support or oppose this proposal. It might start to get into, well, which classes, again, will get sent which questions. The fact that different classes will have different opinions on what might be millions of ads (and millions of ads are scalable; there's enough classes that millions of ads is not a problem for human review here), that they have different views, okay, that's actually a feature, because it means stuff that reasonable people will, by definition, disagree on will get treated a little bit differently, but stuff clearly over the line will consistently have trouble finding its place on Facebook.

So where do we stand on the process ideas? I've already talked about transparency, in the form of the Giftschrank and getting libraries involved. I've talked about binding shifts of control outside the firm, kind of as Facebook is pioneering with its independent review board. Rechartering juries for representation and scale: that's both an example of a hijacking of the Moral Machine project to certain questions and companies, and this ad-juries idea that I just broached. And the idea of: just start trying a lot of stuff. Don't try to develop one perfect thing that then won't work, and say, well, we tried process, but it didn't work; try lots of processes in this time, taking advantage of the uncertainties that are here. And finally, new loyalties for digital architects. Let me describe what I mean by that, and for that I come to the concept of the learned profession. The learned professions: as said here, three professions traditionally held to require advanced learning and high principles. Why do they require high principles? Because of the power that that advanced learning gives the professional. And the first three learned professions are divinity, law, and medicine, each with a respective form of power that they come into through their training, for which they then have duties to
their parishioners, their clients, their patients, that go beyond the commercial, and they have duties to society at large. It's a very Article 19, balancing kind of thing, as to when confidentiality in a patient should be respected and when the possibly contagious nature of a disease requires a report to the public health authorities. But there's a guild, there's a set of principles specific to that category of problems, that develops over decades, to provide a sense of wisdom and, most important in these eras, because we don't trust anybody to give it to us, of trust and legitimacy in the eyes of the people who are affected by these decisions. Now, around the turn of the 19th century, a fourth learned profession was added: surveying. And that's about it. So I think it's probably time to think about more learned professions, so that the data scientists, the engineers, the people in the engineering areas of a Facebook, of a Google, even of a two-person startup, because sometimes those things get very big, have a sense of compass independent of just: what does my employer think? And these professional models could let that happen.

So, I remember, 1985, something like the Sony Betamax: you might not be able to set the clock, but you could push something in and play a program or record a program. How it works? Not clear. It calls to mind the person who wrote 2001, Arthur C. Clarke, and his third law: any sufficiently advanced technology is indistinguishable from magic. He was borrowing a bit from Leigh Brackett, who years earlier put it more bluntly: witchcraft to the ignorant, simple science to the learned. And there is a division between those two, represented by that turning of the dial of can, much higher for these intermediaries that I've discussed. And those intermediaries aren't just the social media companies that I've dwelt on for the duration of this lecture; they translate to a Tesla or Toyota or Bentley as the intermediaries building those automated vehicles. What we can do here in the purely digital realm, while it still remains that way as much as possible, could translate to those kinetic realms as well, where governance is going to be that much more clearly important because of the physical implications.

In a sort of oddly represented histogram of people, I see only one small corner as kind of the nerds, who aren't bound by much of this, because, like Barlow's original implication, they say: I can build whatever I want, I can hack the rest, I live free and clear and trust only myself and my friends; my friends, I'm not so sure about. On the other corners are those that I call the Luddites, people who say, so long as there's a library with a book called 1984 in it, I don't care what they're doing in that nook; enjoy your possibly adulterated goods. Maybe, but, you know, you live in a world that has non-Luddites near you; you need to worry about what's going on with them, at least if you're going to shake hands with them and possibly catch pertussis from them. And finally, in the middle, is the rest of us. And these are the sheep whom we want to protect, in the narrow rights sensibility I've described, against intervention through the technological pathways that we've invited ourselves, as we bring these knowledge towers and connections into our homes, to preserve our rights, our autonomy, our sense of boundary. But also, protecting the entire flock from a public health standpoint: both are in our interest ultimately. And figuring out how to make us more participatory, honestly, it was the question in 1995, it was a question in 2005; the fact that this is hard and we haven't answered it is, to me, a call to arms to do so today. Because if we don't, we just ride the status quo and follow the dotted line to where it's going, and my naive, sort of optimistic ramblings about what we can do here become actually more of a necessity.

Barlow ends his declaration of the independence of cyberspace saying, may our civilization of the mind be more humane and fair than the world your governments have made before. Very simple dichotomy: there's these governments, and who knows who's running them, and then there's us, the people I talked to in my chat room. Well, that's maybe outdated, but this is a charge to us, to figure out: what would that civilization look like, and how can we draw in our children, our students, ourselves, and ultimately the authorities? Because, remember, who is running the governments is largely dependent on what kinds of ads will be running on various platforms and what kind of targeting will happen. These are not contained problems; it's a complex system that will affect the kind of government we'll get. This charge from 1996 seems to me quite vital today. Thank you very much.
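As a closing illustration, the randomized ad-jury process floated in this lecture can be sketched in a few lines of code. This is a toy under invented assumptions: a hypothetical registry of participating classes, a random draw per ad (the Rawlsian veil, since no interest knows in advance who will judge), and a simple majority vote. None of the names or rules below come from any real program.

```python
import random

def empanel_jury(classes, seed=None):
    """Pick one participating class at random: the veil of ignorance,
    since neither advertisers nor platforms know who will judge."""
    rng = random.Random(seed)
    return rng.choice(classes)

def adjudicate(votes):
    """Majority vote of the class: True means the ad may stand."""
    keeps = sum(1 for v in votes if v == "keep")
    return keeps > len(votes) / 2

# Hypothetical registry of classes that prearranged to participate.
classes = ["Class A, heartland", "Class B, the coasts", "Class C, heartland"]

jury = empanel_jury(classes, seed=42)
votes = ["remove", "remove", "keep", "remove"]  # a hypothetical deliberation
verdict = "ad stands" if adjudicate(votes) else "ad removed"
print(jury, "->", verdict)
```

The design property the lecture emphasizes falls out of the draw: contested ads will get different treatment from different juries, while clearly-over-the-line material should lose consistently, whichever class is empaneled.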