Hello. We're going to get started here tonight. My name is Seth Mnookin. I'm the director of the Communications Forum, and I have a couple of quick announcements before we start. First, Communications Forums are held three times a semester, six times a year. If you would like to be informed of future events, there's a sign-up sheet over there. Put down your name and email, and we promise we will only send you news about our six events a year. We have pretty good ones: we had Sarah Vowell earlier this year, John Hodgman last semester, and now these three, and we have some great stuff planned for next semester already. Also, tonight's forum is being filmed by C-SPAN. During the question part of the forum, please go up to one of the microphones and state your name along with your question. Another reason we ask you to state your name is that we do a write-up of all of the forums afterwards, which you will be able to read a couple of days after the event on our website, comforum.mit.edu. The last announcement is that this event tonight is co-sponsored by Radius, another group here at MIT.

I am thrilled to be able to introduce these three. It is a different three than we initially thought would be here, because Jeff Howe called me up at a little past five because his daughter was sick, and as the father of two young kids myself, I said, please stay home. Fortunately, Chris Couch, who writes a lot about technology, works with the Communications Forum, and is a brilliant journalist in her own right, has agreed to fill in as moderator. But let me introduce everyone. Noam Cohen is the author of the new book The Know-It-Alls, and that will also be a hashtag tonight and, I think, moving forward. Is that right? Know-It-Alls? Got to get on that, yep. Noam and I worked together a decade and a half ago, and I've known him ever since. He is a great guy and a brilliant journalist.
He covered the influence of the internet on the larger culture for the New York Times, where he wrote the Link by Link column beginning in 2007. His first book, The Know-It-Alls: The Rise of Silicon Valley as a Political Powerhouse and Social Wrecking Ball, is an intellectual history of Silicon Valley that critically examines how its disruptive culture and ideology belittle civility, empathy, and even democracy. It was published in October 2017, and it is available for purchase right here. In addition to supporting open discussion, we also support both bookstores and authors, so please, by all means, buy the book. It's a great book; Chris and I both read it and loved it.

To Noam's left is Sarah M. Watson. Sarah is a technology critic who writes and speaks about emerging issues at the intersection of technology, culture, and society. Her work has appeared in the Atlantic, Wired, the Washington Post, Slate, and Motherboard. She's an affiliate of the Berkman Klein Center for Internet and Society at Harvard University and the author of the Tow Center for Digital Journalism's report on the current state of technology coverage. And to Noam's right, to my left now, is Chris Couch. Chris is a science journalist whom I have had the absolute pleasure of working with for several years now. She's also the coordinator of the Communications Forum. Her own work explores the intersections of technology and psychology, and her bylines have appeared in NOVA Next, MIT Technology Review, Fast Company's Co.Exist, Science Friday, and Wired. We have also, for your convenience, put all of their Twitter handles on the board: @noamcohen, @smwatt, and @couchcs. And without further ado, I will turn it over to Chris. Oh, actually, with further ado, sorry about that: in addition to Noam's book, we have on sale here a book that Jeff co-wrote with Joi Ito, the head of the MIT Media Lab, called Whiplash, which is also a great book.
So both of those are available immediately afterwards, and Noam will be here signing. So now, without further ado.

All right. Thank you. Thank you all for being here. We're so excited about this panel; I feel like it addresses a lot of really important issues. I, too, would encourage you to buy a book. It's a great book. First, I want to kick off this panel by talking about the central argument of the book, and correct me if I'm wrong here: it's really that the disruption and individualism that are endemic to Silicon Valley have, in a lot of ways, eroded humanity. Is that fair to say?

I was thinking about the premise of this get-together: have they lost their humanity? Of course, the glib answer is that they have. But the deeper point is that every person has humanity. So what are we really talking about that's happening? I approach some of this from the computer science aspect; the book goes through a lot of the history of computer science. Thinking of people as machines and machines as people is one of the crucial mistakes, or paths that we're on, that I think is scary. You're denying the humanity of your fellow people when you think of them so individualistically, as little data points. In the introduction I talk about a well-known anecdote about Google's first design director, who was asked to create a design for Gmail, and he suggested a color. Instead of using the color he wanted, they A/B tested 41 different shades of blue, and the one that people used the most was the one they were going to choose. He eventually resigned over these kinds of issues, saying that to be a design director, at least then at Google, was an oxymoron: to have a human vision for what they were doing when they were just going to test it.
And they don't apologize for that, because basically they say: the shade of blue we picked is the most popular one, and it's led to $200 million in additional revenue. So I think it's that breakdown of seeing people as data points. They're not apologetic about it, but I think it leads to bad outcomes, if that makes sense.

Can you speak a little bit to what those outcomes are? For people who are not familiar with the intricacies of Silicon Valley, can you tell me how that plays out?

Well, what I was trying to argue is that they are taking this fringe philosophy or ideology called libertarianism and making it seem very normal and mainstream. And what do I mean by libertarianism? I mean resisting regulation: the idea that we should regulate taxis or hotels, or, thinking of all the different companies, that we should regulate what children see on video, if not TV; that we should regulate who can pay for political ads, whether they live in America, whether they have to declare what they're doing. So that's one part of the ideology: that distaste for regulation and distrust of government, which I think is really poisonous to our society. I think the extreme idea of free speech is another. I know all these issues are complicated, and we'll talk about whether I'm coming on too strong. I wrote a piece that was on newyorker.com this week about an issue with the bulletin boards at Stanford, and whether, even back in the '80s, there should be any limits on free speech. There was a joke group that told really virulently racist and sexist jokes. Stanford tried to limit that, and there was such a fierce pushback from the computer science department, the kind of people who are now running Silicon Valley, that it was reversed. And to me, having limits on free speech is vital to having a community that is cohesive.
And so that's another dangerous aspect of that libertarian ideology. I think some of it is done in good faith, but I think it's having horrible consequences. I wrote this book before the 2016 election; I was certainly working on it, thinking about it, but I think a lot of what came out of that election bears out these points. The fact that these big companies, Google, Facebook, and Twitter, are so blasé about the idea that a foreign country could try to influence our election, or that there should be disclosure of who's advertising, or whether these powerful tools for targeting people should be usable by anybody to stir up anger and resentment: to me, it shows this disconnect. They're not seeing themselves as custodians of the power they have, but instead are exploiting it for profit, or maybe for their kind of utopian visions. For Mark Zuckerberg, I think, it's a dream of connecting the world that he believes supersedes a lot of other concerns. So those are sort of the effects.

I would just add one thing. People ask me, when I've been talking about the book, what prompted me to write it, because clearly it wasn't the 2016 election. One thing that was a turning point for me, and you can look at me as a hypocrite, because I'm still using Gmail, was the idea that Gmail would have a computer read your email, which at least it used to do in the past, in order to place ads against it. My mom actually passed away from cancer during that time, and I remember thinking how I never mentioned the word cancer in an email, because I just didn't want, hey, have you thought of radiation treatments? That notion that someone would be a custodian of my information, in a service they're giving me for free, and would still feel they had a right to try to commercialize it, was, I think, a kind of crystallizing moment for me, if I think back on it.
So a lot of Sarah's work deals with technology criticism, not just of technology itself but also of the culture surrounding it. I'm wondering, from your perspective, do you agree with the premise, or how do you feel about the premise, of the culture of the technology world having an impact on humanity and empathy and civility?

I absolutely agree with Noam's overall premise, and I think a lot of people have started to unpack the implications of the way that technology is built, but also the assumptions and ideologies that are acted out in the technology itself. Looking at the individuals who are leading these companies and coming up with these designs, their assumptions and their ideologies really do matter. The biggest thing for me: I like to think about this in terms of optimization. Most Silicon Valley leaders and companies are designed around questions of optimization, whether it's the design itself, getting you the information as fast as possible, connecting people as efficiently as possible, or, for Amazon, connecting you to all of the world's material goods. Those are questions of efficiency and of optimizing for profit, and those are taken for granted as the right terms of optimization. Trying to unpack those assumptions is a really productive starting point: okay, what if spending more time on Facebook weren't the optimization model? What if it were a quality experience on Facebook? What would that look like? How would that change the experience? How would that change the design of the platform, and what would that change about Facebook's role in our lives? That's the crux of a lot of the questions that I continue to ask about technology and society.
I think the trick of using the terminology of the industry, optimization, and the way they're thinking about problems and problem-solving, is that it's actually a productive way of sharing language. We haven't necessarily agreed to the terms of optimization; they're coming at it from a market perspective, as if that's the natural way for things to evolve. But we as a society can start to question whether those are the terms we've agreed to or not.

One of the things that I thought was really well done in the book: it addresses how many of the issues that we associate with Silicon Valley now and with the larger technology world, issues of privacy, of commercialization, of users being assessed in ways they may not agree to. At least some of the companies that are major giants currently, Google being the one that sticks out in my mind, really started with an ethos that was entirely against all of those things. Can you speak a little bit to, I mean, how did we get here?

I want to pick up on that. I think the way Sarah put all that was really spot on. I was reading the report that she wrote, in which she classifies critics, and I could see myself in it, and I think what she's talking about is very practical ways of trying to get to a better place. In this book I was looking at the history, as you're asking, and also trying to ask bigger-picture questions. Think about the efficiency argument. It's not in the book, but I was almost going to make it the beginning of the book: this idea of gleaners. In the Bible there's an instruction that when you have a field, you should give it just one pass in harvesting it.
You shouldn't go back a second time and efficiently get every little fruit and kernel you missed, partly because there's an ecosystem of people, travelers and the poor, who live off of the gleaning. That is a metaphor where you could say: the efficient thing would be, I have a farm, I need to get all the content out of it, that's what I do, I'm a farmer. Or you could say: well, you're part of a society, and actually the efficient thing is to let some scraps be there for other people, because they efficiently use them. Again, it's using their language and thinking about the bigger picture, the world they're trying to create. An editor recently sent me a tweet where someone pointed out that Mark Zuckerberg was saying how much he cared about the Russian meddling in the election, that the company was going to spend all this money to hire people, and the reason he mentioned it on an investment call was that it was so important they were prepared to lose money over it. Of course, the natural comment this person made was: well, you're basically saying you make money from the current bad situation. So again, it's these efficiencies in the way they're set up that are really troubling.

About what you're asking about, that history: I was trying to look back. I didn't really know these answers; I was learning how we got here. That was definitely the question I was trying to ask. In the end, you'll see a kind of thesis of the book: the computer science half, the hacker mentality, accounts for some of these extreme ideological ideas about free speech and for the kind of lack of diversity and hostility to outsiders, and then I kind of credit, or blame, Stanford for a lot of the profit-seeking. So for me, the Stanford and Google case
was really enlightening, because you can go back and read the original papers that Sergey Brin and Larry Page wrote when they were developing what was truly an incredible invention: the Google search engine, or PageRank. I think everyone agrees that, even if they were standing on the shoulders of others, they really created something that took this chaotic thing called the early web and made it coherent. It's the reason Google became so popular; it's an amazing invention. But as they were describing this amazing invention, they also explained why it needed to be advertising-free and needed to stay in the academic world: it needed to be a place that was transparent, where there shouldn't be these black boxes. Now, of course, we've just come to accept the idea that the Google algorithm and the Facebook algorithm are secret things that no one should know the workings of, and, as I'm sure Sarah could talk about, that they're constantly tinkering with them in this mysterious way. Brin and Page were arguing that's very bad for science and very bad for trusting the system, because there isn't any scrutiny of it. So they wrote this paper explaining all that. And at least the way I see the story going is this: they were serious academics, their parents were academics, they were getting PhDs off of this incredible idea they had, and then basically it ended up using so much bandwidth at Stanford that they were told, you have to start figuring out how to pay for this. Now, to me it's a fair question to ask: couldn't Stanford have said, this is a great invention, we're going to pay for it? Of course we'd pay for it; we pay for a nuclear reactor in our building, because it's very important for our study of society and science. But instead they were told, you'd better figure out a way to do this, and they
were immediately connected, through the Stanford network, with an investor, so early that they weren't even incorporated yet. The story told in these books is about how a Stanford graduate wrote them a check made out to Google Inc., and they said, there is no Google Inc. He said, there will be; take the check, you'll need it soon. A month later there was a Google Inc., they deposited that hundred-thousand-dollar check, and the rest is history. So I do feel like, maybe it's a little corny, there's a corruption narrative, call it selling out, whatever, for them and for Facebook as well. They really had some idealism. They were in awe of the power of computers, and they weren't necessarily trying to become billionaires; that wasn't what was making them tick. In the book you'll see there are other characters, like Peter Thiel and Jeff Bezos, who were bankers, and figuring out a way to make money from the internet really was what they were trying to do. But these idealistic hackers, I feel, were led astray a bit. That's my view, anyway.

Sarah, you've written extensively about how coverage of the technology world has changed over time. Can you weigh in a little bit on how the media has evolved as the tech world has evolved?
Sure. In the research report I did for the Tow Center for Digital Journalism at Columbia, one of the things I was trying to look at was coverage from that early, almost breathless excitement about the Silicon Valley moment, the dot-com boom and all of the energy that went into covering it, before the Amazon era and then later the Google and Facebook era. It started from a very business-oriented coverage model, or from a kind of tech-blogger model, and that breathless coverage moved into something a little more concerned, as the technology started to intersect with politics and people and society, shifting the narrative about what matters about technology and why it is changing and affecting our lives. That shift happens at a couple of different points. Around 2007 the iPhone comes out, and all of a sudden we have dramatically changed our day-to-day relationship with computers, a computer in our pocket, basically, but that was still in the gadget-excitement phase. Then we have the 2013-ish moment, the Snowden moment, and that's where everyone comes to terms with the fact that technology has both good and bad uses. That, to me, is the larger discourse moment where everyone, specifically journalists and publications, is willing to acknowledge that, yes, we need to think about this too, and hold power to account, and so on. What's really interesting right now, and I wanted to touch on this a little: at this very moment your book came out; there's Franklin Foer's book, World Without Mind, which is a little more about these companies controlling our access to knowledge and information; there's also The Four by Scott Galloway, which is a lot more about the kind of
monopolistic approach of these companies, so a little more on the market side of things. And you have Tim Wu writing, in The Attention Merchants, about these companies' monopoly over our information and our attention. So I think it's an interesting moment right now, in part because all of these books were being written before the crisis hit. It's fascinating: the writing has been on the wall for a long time, and publishers seem to have acknowledged that there was a market for these books. I like to think of a lot of this at a kind of meta level: where's the audience, who is this for?

On your narrative that 2013 was an important moment: I wrote a piece earlier than that about a young German politician who petitioned to get all the data about all the tracking that was done on him, and it ran on the front page of the Times. But, as you point out in that paper you wrote, the breakthrough was hard. You can write an article saying, isn't it weird that they're keeping all that data about you, but until something like Snowden really gets attention, for a lot of different reasons, it takes a while for people to see it. So I think that probably was a big trigger.

Yeah. Well, we were talking a little earlier about the problem of access, and I'd be curious to hear your take on this. For journalists to have access to these companies, they have to stay on their good side to a degree, and that is especially true if you're the business-tech journalist covering the story. So as more journalists from different beats and different walks of life are also coming to terms with technology's impact on society, the narrative starts to change, right?

Yeah, I know, I think that's right,
because what you're pointing out is also that maybe one of the benefits of leaving that gadget phase is that access is less important. In 2007 there was a sense that getting access to the first gadget was hugely important, but now we're beyond even the gadget phase; the ramifications have become as important, and a well-told story explaining them is vital. I didn't seek out a lot of access. I knew that what I was interested in wasn't really what they wanted to talk about. There's an incredible site called the Zuckerberg Files, where a professor at the University of Wisconsin-Milwaukee, partly to make a point about privacy, since they know so much about us, has systematically collected everything Mark Zuckerberg has ever said publicly. You have to request access from him, but it has, usually, video and streaming of it, and also text, everything he's said since he was 19, and I read a lot of that, almost all of it. Likewise, Marc Andreessen, until he deleted all of his tweets, had tweeted a hundred thousand times, so there was more than enough about him. And Peter Thiel has written two books; one he talks about a lot more than the other, but there was an early book he wrote that I thought was really revealing about his worldview. So they all had a lot of documentation, and I didn't feel it was vital to have an interview with everyone. I did try; the interviews were often meant to be very stenographic, and not that revealing. So yes, I think there is an appreciation for the deeper, more critical journalism you're talking about. I think that's great, I think it's what we need, but it has to be supported by
the institutions, the publications, being willing to stick their necks out and put that forward. I think specifically of the Amazon workplace-environment example; the reaction from Bezos to that piece was just, what are you talking about, we're fine.

Just to fill everybody in: the New York Times ran a pretty large piece on the inner workings of Amazon, and it spanned from very low-level employees up to middle management, not all the way to the top. It detailed the brutal working conditions of that place, and it ended up getting lots and lots of attention.

Yeah, and I felt the response that Bezos gave was a classic libertarian response. Bezos, obviously, is the owner of the Washington Post and is considered, some would call him, a libertarian, and I think he embodies a lot of what I'm saying in the book. His response was very clear: it can't be true, because these are people who could go to another company, and if they weren't being treated right they just wouldn't work here; therefore, because they do work here, they are being treated right. You hear the same logic a lot in Silicon Valley: there can't be gender or sexual discrimination, because there would be arbitrage opportunities; a company would hire all the great women programmers and would be the best company; therefore it can't exist, the market would correct for it. And that is another theme of the book: this detachment from reality. When you ask about their intentions, there's something very seductive about the idea that the internet can erase all the past, so we don't have to deal with the history and the current racism and sexism; it's a new world. So to me there was a really interesting anecdote, I don't know
if it's in the book, about GitHub. This company GitHub was really proud of a carpet they had, a big symbol at GitHub, and it had "meritocracy" on it, and they were like, we're so proud, we run a meritocracy. And then women were walking by and saying, this is really offensive. What's offensive about meritocracy? It's great, the best rise up. They didn't understand that by having that as their slogan, they were in essence saying: what we have now is fair, and if you aren't represented here, that's because you didn't make it, you didn't cut it. I think the head of GitHub eventually took the carpet away, but it was such a re-education for him, because they really believed the world had been remade, because it's a digital world, and that none of the legacy problems that are obviously current matter.

Well, I wanted to dwell on that Bezos response, because it so clearly articulates this complete disregard for the physical world. They're in Seattle; for them to get another Silicon Valley job, they would actually have to uproot their families and their lives and everything else. It's not as fungible as he's making it out to be.

Yeah, and the question is whether it's on purpose or not. If he's a smart person and he's saying that, does he really not understand it? That's what I always got hung up on.

When we talk about things like biases in technology, both in terms of biases within the culture of technology, the underrepresentation of a number of groups of people, as well as biases exhibited by the products themselves, whether that's computer vision systems that have a harder time detecting certain types of skin, or the article fairly recently on women having a hard time getting prosthetics
that fit because they're largely designed by men. So many of these problems stem from having a dominant group in power; Silicon Valley is so very much dominated by white men. From your perspectives, do you see those types of issues changing over time, or right now? Now we're talking about it, there are books coming out, there is a lot more media coverage of these types of things. Is that landscape beginning to shift, or do you still see us as having a really long way to go?

Well, I would say that for it to change, the book is fundamentally arguing that these companies are anti-democratic, that they are against democracy. How do we correct in a society, to have wheelchair access, and fill in every blank? Ideally we have a democracy where everyone gets to express their opinion, it's not entirely true in practice, but ideally, and that is how you represent people. I was struck by Amy Klobuchar at a hearing the Senate had, where they had the top lawyers, the general counsels, for Google, Facebook, and Twitter, and she was explaining to them, very patiently: we live in a representative democracy, so it's really important that we control our elections; you understand that; that's how we do it. And I feel that is a message they don't have: that this is fundamentally how you assure fairness. We have a democracy; we vote. I remember hearing it said that maybe if there had been a Japanese American representative, the internment in America of Japanese Americans wouldn't have happened. You need to have some political way of correcting these things. So I do think, fundamentally, as long as they're going to argue that they
can self-regulate and that they are above the government, it won't change. I'm kind of optimistic that there could be some sort of a wave election that will present a new path, but I don't think it can be done by themselves. The idea of self-regulation won't work, I don't think.

Self-regulation comes up with, like, 60 percent of the people that I interview. This is another problem that stems from pure self-regulation within the technology world; the number of issues endemic to that is just astronomical.

Don't you think it's fair to say it's hard for anyone to regulate themselves? If I were given free rein to, you know, be in charge of everything, I'm sure I wouldn't be quite as fair. You need checks. I do fundamentally believe in democracy, and I think it's a little scary, because, and Peter Thiel is a major figure here, one point of the book was that Peter Thiel is not an outlier. He's often described as a fringe character, but he's really expressing the main thought there, which is that democracy is bad: when democracy happens, you have not-smart people running the world, and that's not good. And I really believe they believe that. I remember Max Levchin, who was the co-founder of PayPal with Peter Thiel, sort of saying he believes in regulation, but he doesn't really like politicians. They just think it's not efficient, I don't know. So I do think it gets down to our democracy, and it is really important; that's why I wrote the book. Sorry.

You've written about how, specifically within technology criticism, that world is also in certain ways very reflective of
technology, in that female voices and the voices of minority writers have been overlooked in a systemic way. Can you talk a little bit about whether you see that end of it changing at all?

I absolutely do see that changing, which is part of why I was looking at this larger ecosystem of people who are writing about these things. Certainly in the last couple of years it has drastically changed, which is all for the better, and I think that has also put pressure on Silicon Valley to change. So, speaking of now versus the future, we've at least seen the kind of "oh yes, we will work on diversity in hiring, we'll work on thinking more about users' interests and needs." Whether or not that's actually been effective is another question. On the writing side of it, I was really interested in looking not just at a set of people covering technology, but at the rest of the people who are contributing to a larger discourse about the role of technology in society. Some of that has to do with looking at a whole range of writers: not just technology journalists whose beat is technology, but people who think of themselves as critics, people who are writing an op-ed because their academic work is a direct response to, say, the current issue with Russia. My interest was in trying to articulate this larger ecosystem of people who are contributing, and a lot of that has to do with, for example, women writing blog posts about terrible things happening at their workplace, or critiquing a technology that doesn't include them: fitness tracking, the iPhone not having a menstruation tracker. Those kinds of pieces are coming from a lot of different directions, from a lot of different disciplinary backgrounds, and existing in a lot of different places, so obviously that's not limited to publications. But what is frustrating is that a majority of the traditional ways and places you would look for technology coverage were, for a long time, still dominated by your standard tech white dude bro. Sorry.

No, and as you're pointing out, just hearing the list of things you're mentioning: even the thing I'm talking about with GitHub, that was a woman blogging that she'd had a bad encounter with Marc Andreessen, and that's why I stumbled on it. There are things that, no matter how enlightened you are, you're not going to be able to do. I wonder: if you look at the example of all the coverage of harassment, in the media particularly, a lot of that is women journalists writing it. Is that a reason for hope? Imagine if there were a similar kind of push going on about the way Silicon Valley works, written by women journalists.

Yeah, I'm certainly, especially in this current moment, the last two months, I'm hopeful. I think when you look back at what Ellen Pao went through with her lawsuit, I think it was Kleiner Perkins...

Could you explain?

Yeah. So Ellen Pao was at a VC firm, Kleiner Perkins, and had a sexual harassment issue, and that kind of just got shoved under the rug, basically. But that was probably two years ago at this point. She now has a book out that is all about her follow-through on what to do about this kind of systemic sexism, and about not having support, not having people take her claim seriously.

I wonder what you thought of, I'm not going to remember the guy's name exactly, the Google engineer...

James Damore.

Yeah, James Damore. A lot of people look at me and go: look at Google, they fired this guy. And so you're sitting there saying
that they're libertarians, and in fact they fired this libertarian guy for his really insidious ideas and what he put out there. I wonder how you see that, because it was like a PR...

Yeah. James Damore, for those who don't know, is the author of the infamous ten-page Google memo. It included a pretty large critique of Google's internal culture, and it included some claims stating that women might be naturally, biologically less inclined toward coding than men. So it went over super well. He ended up getting fired, and then he went on a fairly large PR push after that, saying it was his autism spectrum that led him to believe this.

That's free speech, but I'm just saying, why is any idea... I remember seeing someone on Twitter saying, you know, Charles Darwin couldn't work at Google, and I'm sitting there thinking: the problem is that Google is now a company that isn't just doing programming. That would be bad enough; the computer lab being so non-diverse is bad, but this is actually affecting our society. And really, even Charles Darwin couldn't do other... I mean, there are so many roles to be played at Google. It's just a very weird way of defining what it means to be a tech worker, anyway.

The interesting thing on the free speech side of it was fascinating to me, because it was so much "I should be able to say what I think." Yes, that's true in a public forum, but this is actually within the company, and the company is basically saying: well, yes, you have free speech, but we can also decide to fire you; that doesn't preclude us from saying you don't fit in our culture anymore. Which I think is fascinating: that we have reached a point where Google can say that's not what their culture is, or not what they want their culture to be perceived as, anyway.

Yeah, sure, right.

But it's still indicative of this engineering mentality: a very transactional, very data-driven assumption, or backing, of an idea, saying "this is the way the world works," and all of these kinds of more just interventions are not valuable in his mind.

I think there is also, as far as biases within tech culture and tech products go, a very valid argument that as technology advances, as we become more reliant on automation and on algorithms, biases that you might once have been able to hold somebody responsible for, say, hiring only men, are different now: if an algorithm is doing it, you're losing that person to hold accountable. Would you mind talking a little bit about the role that explainability and transparency might play with some of these issues?

Sure. I think there's this great interest in, okay, we'll let the HR algorithm do the work, and then we get to say it's not biased, because it's science, and it's not a human deciding, ha ha. But then, of course: what are you optimizing for? This gets back to my main question, which is, okay, success: if you're building this algorithm to say what a successful person at Google looks like, then you're already baking in a lot of assumptions about their background, their history, their schooling, all of these things that continue systemic injustices, in the sense that you don't have access, or you don't have the right background, or you are just not a white man who gets along in his coding cohort. I think there are a lot of people talking about this, especially in AI ethics, and shout-out to the Berkman Klein Center and MIT for some of their work on accountability of algorithms, saying, you know, these
are, yes, objective, and yes, they are outsourcing the decision-making process, but looking into what the terms are really does matter. And I think that's still a hard conversation.

Yeah. There was a footnote in a book that Sherry Turkle wrote that I kept going back to. I looked at AI a lot; one of the chapters is about a guy named John McCarthy, who was a professor at MIT and then moved to Stanford, and he's the one who came up with the term AI. He was an early computer science pioneer, and an AI pioneer as well. I was really interested in the whole quest to create a thinking being: again, the idea of thinking of machines as people, thinking that a brain is an entity that could exist outside of a body. To me it was a very odd and revealing quest that these early computer science pioneers had. A key scene in the book is the debate between John McCarthy and a professor at MIT, Joseph Weizenbaum, over whether a computer could be a judge. They argued a lot, all the time; they appeared in debates against each other. McCarthy was like: of course a computer could be a judge; a computer can do anything as long as it's programmed correctly and has all the information it needs; why couldn't it? And to Weizenbaum, who was a refugee from Nazi Germany, that was a really obscene idea: that you would take something as human, as humane, as being a judge or being a therapist, and think it could be performed by a computer. To him that was just a real disconnect with reality. And Sherry Turkle had a footnote on that exact topic; I think she knew Weizenbaum before he died. It was about how she interviewed minority students at MIT in the early '80s, and they were very encouraging of the idea of a judge being a program, a computerized judge, because they thought: we know judges are biased and horrible, and this computer will be fair. I think it was the era when they really believed computers could be these separate entities, that there would be this thinking being. Then cut to ten years later, and the view entirely switched, because they realized that basically it's being given this garbage information about how judges rule in reality. What's it going to learn? How would it be any better than what it's learning from? That's the machine-learning argument. So, a long way of saying that these computers are neither good nor bad; obviously, it's the people who use them, the information they're given, that's the problem. They're not going to be any better than our society. Why would they be? Again, it's this fiction that the computer world is separate from the real world. It's a product of the world, and if we have a racist, sexist society, it's not going to fix that problem. How could it? And I think of one quote from Weizenbaum, an idea that kind of carried over into the whole book: a lot of the problem is that these computer programmers think that because they're so great at math, they should be solving all these other problems. He's like: math is easy, at least for him it was, but actually solving injustice, or writing a poem, that's hard. So that fundamental disconnect, thinking that if you're good at programming or math you should run our society, when we know how hard it is to fix our society, is the fundamental problem, I'd say.

Do you feel that technology has any role to play in terms of correcting these issues, then?

To contradict myself: there was this famous court case in the '80s, which was
arguing that the death penalty should be ruled invalid, because they showed systemic bias. So I'd say data, and showing systemic bias, can be very enlightening to the public. And the court basically said: you have to prove that there was racism in each case; you can't make some systemic argument that our criminal justice system is unfair, I've proven it with data, so we should fix it. So, on the other hand, I do think data could really enlighten us about how unjust our society is and what we should work on fixing, but I don't think a program is going to do that. Does that make sense?

Sure. Yeah. This leads into a bigger question I had following up on the book. I think you do an amazing job of articulating the ideology and the history of where these concepts are coming from, the hacker mentality butting up against the entrepreneurial model, but what I was left wanting more of was: okay, so what? And: what do we do? I always go back to Lawrence Lessig's points about what we do to change things; he wrote this in Code, and Code 2.0, a long time ago. So one of those things is, okay, we can decide what the technology is optimizing for; we can decide what terms we're designing the algorithms around, or what direction we're heading in, and that still goes back to the question of whether it's optimization toward efficiency or optimization toward justice, right? But I think there are still other parts we could start to unpack, like, okay, if the libertarian approach is a problem, or the mentality that has led to huge monopoly systems. Lessig's model has markets, code, law, and norms as the four levers one could use to change society, or change where things are going. I'm hesitant about markets, which seem not to be a real possibility here: we have ended up in a monopoly situation, and network effects basically mean the market lever is kind of impossible. Do we have an alternative to Facebook? Do we have an alternative to Google? To Amazon? Kind of, yes, but not really at the scale these companies are operating at, which is why we keep getting back to the regulation problem. That's the law piece, and I'm scared, because that's certainly not going to be a functional lever for the next four years at least; just look at the net neutrality moment, where we're rescinding all net neutrality limitations. Antitrust is still not really set up to address the way these tech companies are structured, and it doesn't really apply to free services either; the standard ways we look at antitrust, competition, pricing, harm to the consumer, kind of don't work. So I think we have to think about other ways to hold these companies accountable, but that has to evolve into some new model that's not based on our old levers.

Well, norms, I think, are something, right? We're certainly seeing it: what's happened with sexual harassment and the public, that's norms kicking in, in a way. And I guess I am leery of the idea of code ever being useful, because again, it's not knowing what you don't know. A well-intentioned male programmer, I think of the menstruation calendar again: it just won't be a viable solution. So, you mentioned in your report too that the idea of "what do we do now" is always a big question, and there are characters in your paper who talk about how: hey, I'm pointing out all the problems, give me credit
for that; I don't have the solutions; that's not my department, as they say. But I did try, and the arguments I was making were for smaller, local things, the kinds of points you're making that need to be fixed. I was thinking, though, about the power of narrative. Maybe a mistake I made in the book, and maybe what I try to talk about more now, is to embrace this individualism argument, because I do believe in individualism, in people self-actualizing, getting the most out of their lives. And maybe the way to think of it is that people should have a right to their data, and that that is a way of framing how something has gone off the rails: it was just considered normal for these companies to collect everything about you, and even though they're giving you services in exchange, it's not done in a very transparent way, and it's just fundamentally wrong. All I can fall back on are analogies: imagine if you walked into a store and someone followed you around going, "Oh, did you buy that? That's interesting, I'll make a dossier about you." It would be unthinkable; why would it be a norm? People point out that maybe the "efficient" thing is to go to a store, see the take-a-penny-add-a-penny tray, take 55 cents, and go, what can I buy with this? Did I break a rule? I don't know, but it's not our norm; it's not how we behave. So partly I am hoping for a norming away from these beliefs, and a belief that we as a society, if it's still possible, can come together and make better rules, and have certain values, like: your data is yours. And partly, libertarianism... because even the way you frame it with Snowden: in my gut I feel that if Google and Facebook and these other companies hadn't collected so much data to start with, the government would never have dreamed of it. They are the custodians of all this material, and they abused that, and that set us on a very wrong path, and we have to get back from that path. Wasn't there someone, the CIA director or someone, saying that Facebook has better dossiers on people than they ever could, because it wouldn't work for a government to come in and say, tell us everything you bought and what you like; it would raise your antenna. So I think we need to try to get back control of our society. That's a pretty open-ended comment, I know.

And I totally agree with that as a direction, but I still wonder how that actualizes. Does that mean we stop using Facebook? Does that mean we don't let Facebook do certain things? Does that mean we demand certain features and protections from these companies? Does that mean we have a collective action movement, like "fuck the default," or some kind of way of getting at the way to change it? Because it's one thing to say, yes, I need to have access to my data, or ownership of my experience, or to be able to voice my interests and needs to determine the way my newsfeed is filtered; we don't even get to... do you say, make a Facebook group about this?

Right now I'm with you. Does it take a movement? Yeah, I do think it takes a movement. I think things like Snowden, and I think the election, are going to be galvanizing moments. I was struck, watching the Senate Judiciary hearing I mentioned, that the sharpest questioner was this Republican senator from Louisiana, John Kennedy, who asked just the best, most pointed questions about
what's going on. Could Facebook, if I'm an advertiser, give me a list of people who are depressed, and could I sell them alcohol? Do you know who's overweight, and could I sell them diet pills? It's like, it should transcend petty partisan politics. I don't think it should transcend greed versus not greed, but I think middle America is definitely the one being hit hardest by all this; their wealth is being taken and sent to the coasts, by and large. So I think there is a chance for a genuine movement. If there is this belief in our political system, I do think something could change. Things are galvanizing; I'm hopeful.

Well, even in that case, I think that's one of those: okay, if we're going to define what our norms are, we at least have to have all these examples and all these cases of saying, yes, this is a possible way this advertising platform could be used, and, oh, by the way, we don't believe that's an appropriate use, right? I think it has taken us a long, long time to get enough of those examples, enough understanding of how those systems work, what they're capable of, and how advertisers are using them, to even begin to establish a threshold of what we define as appropriate or not appropriate. And for the most part, all of that is hidden.

Yeah, but it's heartening that the senator, I hadn't thought of it, and I didn't articulate it nearly as well: he went right at them, and they were just kind of hemming and hawing; he was very concrete. It was great, really impressive. So there is a sense that maybe that is more of the burden, and maybe that's what you're saying: the book, or the next thing I have to do, is to be more concrete and more galvanizing that way. I wanted to just tell the history, because, like what Chris was saying, I wanted to answer: how did we get here? Because I was genuinely baffled. I came at this stuff with a lot of optimism; Wikipedia is a real obsession of mine; I wrote many, many articles about it for the Times. I really saw it as an incredible, anarchic thing: all these little parts become this incredible whole. So I do have a lot of optimism. I was more just trying to answer the question of how we got here than how we can get it back, but that is definitely the next question.

I think we're going to have to open this up for questions in just a minute. I'm going to ask one last question, and then if anybody in the audience would like to ask a question, please go to one of these two microphones; we would love to hear from you. Because we are at MIT, I'm obligated to ask: what role does the university have? For students who are here, who are designing things, who are making their own startups, what ethical considerations should they be thinking of before moving into the real world?

Wow. I mean, because we're kind of saying how the deck is so stacked, and you don't want to say that to young people, I guess. When I was advocating for the local, there's a certain "tend to your own garden" in that, and I would think there must be a real thrill in trying to create a small project that can have better values, maybe not driven by the market, and seeing that come to fruition. And I do think there has to be a way not to accept what they're basically saying now, that because of the monopoly system we're living with, even new companies are only looking to be acquired. So who am I to sit there and say don't seek your billions, or your millions, but that would be the message: try
to nurture something that is smaller. I think there is a real yearning for that.

But even if somebody is seeking their billions, which, the fact that that's accepted here strikes me as very strange, what should they be thinking about when they're on that path? What questions do you feel MIT students should be asking when they're building their groundbreaking technologies?

I think the hardest question to answer is always: you have one intention for this technology, whatever it is you're building; how do you recognize the myriad ways the same technology could be used? That's always a really hard question to answer until somebody has discovered some other way to use it, or to monetize it, or to apply it to, you know, weapons and the military-industrial complex. So I think it's probably about learning how to see your own effort from multiple angles, to interrogate your own idea.

Yeah, and run it by many different people, right? That's the thing I think is missing from a lot of what happens in Silicon Valley: how much user testing do they really do where a user gets to tell them how they feel? They do so much A/B testing, but does that ever tell you what my intention was as a user, my experience, my emotions, all of these things?

Aren't you basically saying that right now they're just using the market as the only way to see the intent? They throw something out, and it's pivoting, right; they call it "I'll pivot."

Yeah. There are so many assumptions in the way most of these technologies operate, and that's kind of by design: the data-driven approach says we can look at the behavioral data and then determine what's going to get you to spend more time, or what's going to get you to spend more money. And that doesn't get into anything about what my actual intentions are. So yeah: actually talk to people.

Yeah, I would say forging real connections instead of virtual ones would be a real way to interrogate your idea, because I think even Mark Zuckerberg is coming around to the idea that there has to be some real basis to these relationships; otherwise it's not good for us.

I had one more thing, which is that so much of what we're talking about is the leadership role of these individuals; especially in the way you've structured the book, it's the men in charge who have these ideologies. But that filters into the culture of these companies, and I think it's really imperative to think about the more systemic, contextual way these ideas get played out: how they are baked into what your engineering goals are, your efficiency, how your job is checked, what your performance review looks like. Having some control over those questions seems to be another way we could cut into the overall culture.

All right, we are going to open this up to audience questions now. If you would like to ask, please use one of these microphones. Please be kind, and we're going to switch off from one microphone to the other. We'll start at this one over here, and then the next one is coming over to you.

My name's Nina Lytton. I'm here from the Humanist Chaplaincy at Harvard and the Humanist Hub in Harvard Square, so thank you so much to Radius and the folks here for bringing this important topic out into the open. I wanted to ask a question: Sarah, you had mentioned that the technologies don't take people's feelings into account, and I wanted to say, from an ethical perspective, as I experience it, I believe the use of emotions, likes and anger, as a proxy for hooking into the dopamine circuits and creating an addictive experience, excuse me if my voice is breaking, I think is very unethical, and I wonder what
your feelings would be about how to bring this to awareness. Absolutely, this is like whiskey and cigarettes and the rest of it, right? How do we regulate those kinds of things?

That's a great question. A couple of things. One is that there are already engineers and designers recognizing this. It again gets back to the optimization question: is it really just to get you to keep scrolling through an auto-refreshing, infinite feed? Tristan Harris is one of these engineers and designers, a former Google person, who is working on the question of "time well spent," at timewellspent.io, and that's really driven by the sense of: okay, we're engineers and designers; we can design these systems in better ways. On the addiction side of things, I really love Natasha Dow Schüll's work; I think she used to be at MIT, now she's at NYU. She has a book called Addiction by Design. It's actually about slot machines, but it is of course very relevant to the design of any technological interface. She very clearly articulates that these systems are designed to keep you in flow, to keep you going, not necessarily to keep you winning. It's a really interesting look at such a clear use of addiction, and people have naturally taken that and applied it to these broader consumer interfaces.

Hello, my name is Elliot Owen. I was born and raised in the Bay Area, in Silicon Valley; I come from all this stuff, and yeah, there's a lot of complexity there; there are plenty of good things and plenty of bad things. One thing that really struck me was your argument about inefficiency: if you have some large corporation efficiently optimizing to take over some market or some product, you're squeezing out the small people who live on the crumbs. But I see a flip side I can argue: when you pay a medical bill, who's still advocating for inefficiency, when we have trillions of dollars of waste in the healthcare industry? It goes both ways: there's a huge cost to inefficiency, and there's also a huge cost to the people you squeeze out. So I'm trying to think: if you had a tech company go through and try to completely redo healthcare and make it a very efficient system, you might be able to save trillions of dollars, but what happens to all the people who got squeezed out of that system? I think you can't really paint it in black and white either way; there are just trade-offs.

Yeah. I would think that having universal healthcare is a much more important goal than either one of those things, in my opinion. But I totally hear you, and of course there's a reason why these companies are successful and popular: there was a need for efficiency, for hailing cars, or for shopping. So I guess we're just saying it needs to have a higher goal. I think the way Sarah is framing it is really good: if your goal is just to maximize profit, that is not a good goal. That could explain why you'd want people to be addicted, because it maximizes your profit, but it's not good for society. And maybe another way of thinking, besides the gleaner analogy, is to think of it as pollution. The way the first questioner was comparing it to alcohol companies and tobacco companies: they also are giving people what they want; they're very popular; but that's not the only way you judge these things. So I think we need the enthusiasm, and the creativity is really vital to our society, but if it doesn't have some kind of larger goal, we're going to have a problem, and we are having a problem.

I would also follow up on the idea
that you know why hasn't Silicon Valley completely disrupted the healthcare industry yet and I think it really comes down to the allergy to a highly regulated ecosystem right like that is so far down the road of like we just aren't going to touch that I mean looking at the quantified self community and quantified self devices you know none of those are medical devices for a reason those are trackers and stuff like that yeah yeah yeah trackers you know Fitbits you know a lot of these companies like wanted to start getting into the what's the tricoder like let me scan your face and see if you have a like Star Trek vision no no no for like how sick you are and you know to have a checkup that's just like like check up in five seconds right the reason that that company is kind of you probably haven't heard of it is that you know they've gotten so far back with FDA approvals and things like that so what do you make of the life hacking stuff and the sort of like the blood transfusions and the you know because that's also part of that culture to me again I was you talk about you writing you think on a meta level I was trying to think about like the idea of to me it was related to the idea of creating artificial intelligence and the idea of kind of getting birth to yourself and not ever dying and sort of being again detached from reality and thinking of your brain and something to be on the cloud that lives forever so that's how I viewed in that kind in that kind of context of the the ideologies of that led to computer science how do you see that the biohacking yeah that kind of stuff in the is that medical or or is that well I mean certainly but then you know you've got a lot of hacker types who are you know willing to experiment on their bodies and just think they there's something about thinking differently about your like corporeal place in the world and you know whether or not that's a sacred thing or not you know but that's getting getting deep in the weeds but thank 
you so much.

Hello, thank you for the engaging discussion. My name is Murthy; I'm a graduate student in mechanical engineering. I grew up in India, and the one thing that really struck me once I moved here was the commitment to free speech and the protection of the First Amendment that the Supreme Court has provided in this country, which is unparalleled, in my opinion. So I was really taken aback when you mentioned free speech as a libertarian value, which I think is as much a liberal or conservative value, and I wonder what you mean by abuse of free speech. Of course the internet is not the best place for cordial discussions, and we all know that, but how do you think we can really maintain the spirit of the First Amendment going ahead?

Right, obviously this is something people argue about. You could argue that limiting the amount people can donate to campaigns is a restriction on free speech, and we currently live with a Supreme Court that thinks any rich person should be able to give as much money as possible to affect an election, that that is their freedom, and that corporations are people as well. The reason I actually thought the Russian meddling in our election was really revealing is that it put the lie to that argument of ultimate free speech. Because if you really believe free speech is an absolute, then all these Russian outlets or any other meddlers were doing was putting words and pictures together. What's the problem? Antagonizing people, getting people to hate each other, it's just words and pictures. But obviously there's a consensus, I think, in our country that that was a bad thing. Maybe our president doesn't agree, but almost everyone else agrees it was a bad thing to have a foreign country try to stir up antagonism among our people and try to pick a winner
of our election. So that would be a limit on free speech at one level, but to me it seems a very necessary one. I totally hear you. I want people to express themselves, and it's hard to belittle that; I understand how harmful it is to live in a society where you can't do that. But there are real costs. In essence, places like Twitter are allowing incredible amounts of it, and it isn't random anger, it's anger directed at certain groups. You have to look at it in actual historical context. That's how I would argue it. It's almost like thinking in code to believe that this thing called ultimate free speech is just going to work because it's the right thing to do, without recognizing all the actual real effects. That's my view of it.

Yeah, and I would add, especially in the Twitter context, I think they've done so much to protect free speech that it hampers them from addressing very serious behaviors like harassment, from embedding checks on that kind of behavior in the system. I think they've tried to do a whole lot so far, but there are a lot of people who are really underwhelmed by how that gets manifested in the code. As a user who's being harassed, what can you do aside from just blocking all these accounts?

And again, that's probably the scaling idea, that it should be automated. If you were living in a normal society and somebody were really picking on someone in a harsh way, we would say, don't do that. And if you said, it's my freedom to yell at this one person and make them feel really uncomfortable so they'll run away, we would say, no, don't do that. That's the fundamental thing: when you abstract it to the level of "it's my speech rights," it becomes, I think, dangerous. But I totally
hear your point. I guess we disagree.

Hi, I'm a humanities graduate student at MIT, and I'm also a six-year moderator on Reddit. So, nicely piggybacking off of that question: you talked a lot about governance, and in your reply to the last question you mentioned that these companies believe all of these regulatory behaviors should be algorithmically controlled or administered. But the fact is that they still rely quite heavily on human moderation, whether it's commercial or volunteer, as in my case. So what do you see as the current place, and possibly the future potential, of human moderation, commercial or volunteer, within these online social spaces?

Yeah, I naturally fall back to Wikipedia, which does some automated policing, I think, but relies on a lot of human volunteers, and I think fundamentally it has to be human. When we talk about what's bad about big, it's not just the monopoly part; it's that it is not human, not human-scale. There's a book called The Boy Kings, and the author has a beautiful description of how scalability was the most important thing, this growing fast, and scalability really meant you couldn't have human customer service; it was just removing people from it. Wikipedia has managed to grow quite big, though not to billions, by using people who are very motivated to do the right thing. So yeah, I think it's vital that you have a human community that will respond.

Yeah, and I think it gets back to the norms question as well: who's determining how norms are expressed for algorithms, or for automated platforms? That's a pretty hard question. That's where the interplay between technology and humans comes in; you filter through the algorithm but then have a human look at it. And I kind of stand on the side of: I think
we're going to be working alongside technologies and AIs and whatever else for a long time, and it's not going to be either/or; it's going to be both.

My name is Ines. I'm a postdoc at MIT and a social scientist by training, and I look at this conversation, and your work actually, from the perspective of both a political scientist and a European. We do have slightly different standards, and in a way I cannot help but want to ask you: where do you see the political system in all of this? Because you take a very individualistic explanatory approach, which is valid and very interesting, but I would argue that ultimately this is about how much influence we want to give government and how much influence we want to have on our own. That's why we have the German court case, or why we have the European Commission being very explicit about the use of cookies right now. So could you elaborate a little bit on this?

Yeah, in the book I definitely hold up Europe as a model that way. Again, it's the idea of the individual. It's interesting that you describe that as individuality, because the government is protecting individuals, but in America people would say, how dare the government be involved in protecting me? That's the catch-22 here in America. You could frame it as, look, isn't it great, we've got all these rules that protect individuals, and people say, get the government off my back; get the government off my Medicaid. People don't really appreciate what is being done on their behalf. The right to be forgotten is another case. We talk about free speech; in Europe they have this rule that
basically says that if you've done something wrong and served your time, you have a right not to have it be the way people talk about you, not to have something you did 20 years ago show up as the first thing you see on Google. But again, Wikipedia: I remember discussing this with the executive director of Wikipedia, and they very much don't like that. They're an encyclopedia. If you committed a bank robbery, or maybe something milder, declared bankruptcy 20 years ago, that's a fact, it's part of your biography, it should be in there; how dare you say we can't put that in. But I do think that's another reasonable regulation, one that recognizes how the internet is different from a newspaper or a court record. In the old days there were no limits and things were published, but it was very hard to go to the court to find out what happened, who declared bankruptcy. If you wanted to make that effort, okay, you could find out that 20 years ago I declared bankruptcy. But now it's immediately the first thing said about you. You have to adapt for that, and that's a different definition of free speech, I think.

Yeah, and I would take it one step further. I think part of what you're getting at is that what's at stake here is the legitimacy of these companies and individuals to govern and rule us. As institutions, we've kind of opted in to living in their worlds, and those are very different worlds from the traditional transnational, Westphalian view of the world. That's particularly interesting when we're talking about Zuckerberg's aspirations specifically, when you understand that he has built, or is beginning to build, the largest community. So I feel like that political term, legitimacy, is really operative here: did we actually sign up for this? Are these the leaders, are these the ideas
that we believe in? And if not, what do we do about that? My last point here is that my hope rests on the EU and the GDPR.

The GDPR is the General Data Protection Regulation, which applies, I think, starting in May 2018. It's data protection, but the big thing here is that it covers any company that serves EU citizens, even an EU citizen visiting the United States. So it applies blanket to basically any company, especially all the ones we're talking about, if they operate and serve EU citizens. That is the best hope for any solid regulation, and I think it's a least-common-denominator situation: in the same way that cars are manufactured to meet the highest standard, California's, I think we're going to see that level of regulation. Obviously it's a little easier to change user experience based on geography, so that's a trickier loophole element of this, but I am hopeful that it's at least pushing the conversation in the right direction.

Again, you're looking to Germany as the leader. Yeah.

I had so many thoughts while you were talking, because this is my life: all the articles you brought up, all the people you named. I am a software engineer and product manager who used to work in the Bay Area and is now at Berkman with Sarah, focusing on the social responsibility of my field, which is tech and engineering, and also on the intersection of tech and government: how do we get policy people to understand technology, but also how do we get technologists even a sliver of interest in what's going on over in DC, not only to make our government better but also to impact policies in ways that folks in Silicon Valley can also respect and care about. I wrote down notes because I knew I would just start rambling and lose my train of thought. On the first point, on harassment and diversity in tech, there
are a lot of women who've spoken out who don't get the same kind of coverage as maybe the people in the media: Kelly Ellis, Ellen Pao, Erica Baker, Susan Fowler, all the people who spoke out against Justin Caldbeck when he was a VC, and so many other countless, really brave women. I don't know how to amplify them even more and get them the same kind of coverage.

Yeah, maybe it's not enough for them; maybe you need women journalists. We were talking about women journalists, not just brave women in the field, and a lot of these stories have been women journalists writing about it.

Yeah. And in addition to that, something you said that really stood out to me, and that Sarah also talked about, was this idea of not only the market driving decisions but also this idea of utopia: you're going to connect all these people and the world's going to be a beautiful place of rainbows and unicorns. And the second part for me, coming here, this is my first time in Cambridge, I've been here three months, is that I often hear the narrative, "if those people would just stop caring about money, maybe they'd care about users." But on the ground, a lot of the engineers, and some of the engineering students in this room probably, have this deep sense of: we're here because we're changing the world. Facebook's mission of connecting people, and Google's "do cool things that matter," are what drive a lot of these engineers to go there. If you ask many of the engineers, what's the annual revenue for your company, they say, I don't know. They even put their ads and marketing people over on a different campus, and the engineers over here with an unlimited pot of money to go do whatever they want. So in a world where people actually are only driven by money, maybe we can help figure out what to do. But what do we do
when people genuinely believe that what they're doing is so good? We're at Google because we're going to make the world a safer place, because this is where real security happens; or we're at Google because there's the power and money here to really connect the world and teach someone in developing regions how to farm so they can pull themselves out of poverty. These deep missions, without thinking about the rest, that for me is a much harder thing to put a finger on.

You can even take your anecdote and switch it around and say there's something wrong about demeaning the marketing and advertising people. They're part of your company. Saying "they should be over there" is really saying they're not of a pure mission like engineers, and also that they're not as smart as we are. So part of it is that seduction. I wonder what you think of this, but I was very struck, in writing this, by the argument that the computer is a closed world. As a programmer you are in such control there; you can make the rules, and it's so clean, and it all makes sense to your mind. Maybe it's unfair, but I talk about how Zuckerberg's first program was a Risk game based on Julius Caesar, and now he's doing world conquest in real life. Who cares what he's doing on the computer, that's fine; it's the crossing off the screen into reality that the book's really about. I wonder, were you drawn to programming because of that closed world where it all made sense? Is that what we're talking about when they talk about rainbows and unicorns, that they can create a world that is coherent that way?

Yeah, it's so interesting to hear you say that, because that wasn't what drew me to the field. I
think perhaps there are groups that think of it that way, but I was drawn in by the utopia of, wow, I really can help the whole world connect with each other and do all these things. And candidly, studying computer science, this goes back to what Sarah was saying: what is it that you're graded on in class? Your algorithms, your theory classes. What are people's performance reviews at these companies? They don't touch on ethics or users; all of that is separate. So that wasn't what drew me in, but I definitely think the way we're trained, the way we're graded, and our performance reviews at companies gear us toward developing a certain way, even if we have other interests in mind. I don't know how to fix that; it's what I think about all the time.

And is it a question of actual impact? If that's what you believe you're doing, how do you know that you're doing it? How do you check? I don't think most companies have a way of following through on that, and there's not room for it. Once you finish one product or feature or whatever, you're on to the next thing. Aside from using it yourself, how much do you really get to understand how it's impacting users, how users actually use it, what their experiences are? And you both brought this up: getting people to think very critically about the impact of what we work on requires different training, social science, other skills that are not what these people have.

And it's the arrogance of thinking, I'm good at programming, so I'm also good at analyzing how my work will affect the city. Why would you be good at that?

One final point, and I'll sit down, I promise. I would love to talk to you more at some point later about your experience covering this field for
so long. There are definitely these roles, idols in the tech field, mostly men, people whom others look to as having paved the way, and they're usually in the "hard," quote, real computer science fields: the leaders in AI, or the original writers of the main algorithm for Google search. When they say stuff, people really listen, and the things they say aren't usually about ethics or users; they talk about really efficient algorithms and things like that. So perhaps there are ways we can influence the leaders that many people listen to.

Yeah, the interesting thing in doing the book was that I would collect each of them saying how smart the others were, but also how everyone was insecure about their own intelligence. You go to Quora and there's a thread, "Is Mark Zuckerberg a great programmer?", and it's like, of course not, anyone could have done that, blah blah blah. John McCarthy, who is the central figure in this book, was an assistant professor of mathematics at Stanford, passed over for tenure, went to Dartmouth and then MIT, became a computer scientist, and was brought back to Stanford as a full professor. And the mathematicians were like, oh, computer science is obviously very easy; this guy couldn't hack it as a mathematician and suddenly he's a full professor. So no one is secure when you have this world where everyone constantly has to be the smartest. But I hear you: maybe an idol would be the way to get some change, some important person endorsing this stuff.

Thank you for your panel and the insightful conversation. My name is Stefania, I'm a graduate student in the Personal Robots group, and I had a quick question around what you mentioned from Lawrence Lessig, Sarah. I really think a lot of these issues and externalities of scale came to be because of a
lack of good business models. So one question, and Eric von Hippel here at MIT actually just released a new book about free innovation (first he had Democratizing Innovation, and now Free Innovation): how do we think about monetizing value when you're not aiming for acquisition? How do we create and value what we create? Do you really think it's going to become decentralized, blockchain, everyone owns their own data, or are we going to go back to tribes that are interconnected? What are the scenarios you're imagining? Because I'm European as well, and I was just at an entrepreneurship summit in Estonia last week, and there was such a big debate around scale-ups, the startups that need to scale. People were really revolting against that, young people, senior people; everyone was asking, do we really need to scale? Is that always the model of success?

There was a lot in there, but those are great questions. I think that again gets to the optimization question: is scale the thing you optimize for? In your account of Facebook, it's purely, well, we don't know what the business model is. They didn't know what the business model was for a very, very long time, but the question was scale and network effects, and we'll figure the rest out. And by the way, we'll bring in Sheryl Sandberg, but that's another story.

Yeah, I think it's really hard for people to imagine other formats. Maybe Mastodon, or, what's the other one, Diaspora. How long have we talked about that, since it first came out, as one possible way to disrupt Facebook? Diaspora, I think, was a distributed, locally hosted social network infrastructure, built to avoid any centralized control over
information and data. I also think some of the founders were former Facebook guys, or NYU, I forget. And Mastodon is similar, I think, but more of a Twitter model, if that's a rough description.

Yeah, and the thing with even thinking about different models: blockchain in itself is ideologically very libertarian, right? The way she's phrasing it is interesting; I hadn't thought about it that way. Even the idea of tribes, though it probably makes more sense to call them communities you belong to, where maybe it's okay to share, but it isn't this universality of "I'm an individual in the world and we're all connecting like that." The other thing I'd say about business models, and again this falls back on regulation, is that you're supposed to make a penalty for things that are harmful. That's why you don't pollute; otherwise it would be a great business model to chop all the trees down and pollute the river and make the most money. So I think it is on us to add the cost to this. That's how it works.

Yeah, to extract the most value from them, or from what they're doing. That's the right way to frame the alternative, as opposed to the battle term.

I would also add that one other lever is the consumer-demand side of things. Even if it's hard to imagine a world without Facebook and Amazon and Google and Apple, what does it look like if we start demanding different versions of those things, or different features, or different ways of interacting with these companies? I think that's really hard for people to get their heads around, but it's probably the only way these companies are going to change.

I mean, the pessimistic view also is
that these companies are so big and powerful that it's very hard; they're going to stop this from happening. When you get sad and pessimistic, that's what you think about: the power they have to lobby. And, I mean, I know a president is fond of pointing this out, but that doesn't mean it's wrong, that Jeff Bezos owns The Washington Post. It has certainly led to some good things, but it's an ominous thing; anyone watching ought to see that, and I think it's kind of undeniable. So the fear that they're so entrenched is definitely a problem.

Thank you. I come from the sociological perspective of having delivered mail for 30 years in San Mateo, California, right smack in the middle of Silicon Valley.

So you have seen the technology.

So I have seen the houses go up, hundreds of thousands added, new zeros on property values, and I've had people who are 30 years old and work for Google and Facebook and whatever come into these neighborhoods and buy the houses. And the people who get pushed out of those houses are the people who are now the contract workers that help support Facebook and Google, and these are not just the advertising people you spoke of but the people who work in the cafeteria, the people who drive the dry-cleaning cars that bring the cleaning to these people so they can work 15, 20 hours a day. None of them are Apple or Facebook employees, even though they spend full time there. Yet they have to spend more time on the internet getting drawn in: can you pick up another shift? Yet you can't go over 40 hours, because then we might have to pay you benefits. So there's a disconnect. The Chan Zuckerberg Foundation, right now, I read an article about how they're trying to work on the homelessness problem. Well, one of the people I know who is actually
homeless is somebody who lost the benefits from a job and only has part-time work. So they only value the core center of who they are, but they place no value on the people who truly support the company, all the outside workers. MIT probably does have adjunct professors, but I bet you the person working in the boiler room is an MIT employee and probably gets full benefits. The lack of benefits is destroying the middle class, and that's a total lack of humanity.

They have so much wealth; that's what's very peculiar about it. You feel like they could at least take care of their own. We're talking about the societal effects, but this is their immediate backyard.

But I think the most common argument they make is, well, our core business is not the cafeteria and the cleaning; as these companies, we're going to do only our core business.

That's exactly how the business plan is developed. My niece worked for a startup company; she was employee number eight. Well, she had to travel to Paris and stay in a room with five other guys, where she was sexually assaulted by one of them. But this was before the company had a PR department or, sorry, an HR department, where you learn what the rules are. So a lot of these startups are very rogue, and they don't understand how to value everyone.

But even think about the logic of saying it's not part of our core business. Well, how are you getting food? How are the halls being cleaned? In the book I talk about the philosopher Susan Moller Okin, who argued that libertarianism is basically anti-feminist by definition. It's not a coincidence that it's all tech bros, because it's based on this fiction that you could be a Google employee and the food just shows up magically. Libertarianism is kind of
the argument that you show up in life, you're an adult male, you don't owe any debt to anyone who raised you, to the community that raised you; then it all makes sense. I would at least agree that libertarianism would make sense if you felt it was a fair playing field. If your ideology is that hard work is what you should reward, okay, that might make sense, but we know it's not a fair playing field, so by definition the logic is faulty. And it kind of is predicated on devaluing women and the role of family in getting us to adulthood.

So what happened to her after? Did she end up staying in technology?

With the company. And she really talked with them; she was in on interviewing who they hired for the HR department, and that person did get fired just on the strength of a conversation. But she was really worried about actually losing her job, and she didn't.

Right, so that's a good thing, but probably because she was really good at her job and maybe he wasn't as much. It shouldn't have happened; the things just weren't set in place.

But this is one thing I think could possibly be a way to bring everybody around: if all of the Amazon drivers hired right now for the holidays went on strike on December 23rd, people would be like, oh my gosh. Sorry, but you know.

It's a lost battle. Labor unions, I think, are the counter-effect; they are our hope in that sense. It's true. Thank you.

Hi, I'm Max, I'm an undergraduate in mechanical engineering and part of the personal robotics group. As an engineer who's really interested in user-centered design, in particular how design conversations have to change for different users, I've noticed that a lot of these conversations tend to be stuck in either academia or design firms; that's where they exist. Even in my capstone class, the huge emphasis they put
on user experience and user-centered design is great, but then the gap between that and the frameworks of the jobs people end up taking is very stark to me. I was wondering, as somebody who's going into the engineering field and wanting to be a professional in this, or for all of us here as consumers, how do we see that shift in thinking happen at these larger companies?

That's a great question. I think the best way in is to make it a business case: this is going to lead to better experiences and better user value. The question is how to speak their language and put a number on it, or integrate it into the way they want to value their process and their development. It's a tough question.

I think it's also a question of finding those people in the institutions and companies themselves who are thinking like that and have been influenced in those ways, or of working outside of them, like Tristan Harris leaving Google to advocate on a large scale and influence a bunch of designers to ask these questions in their own institutions, from the bottom up. That's arguably the most effective way to have the ethics come from the bottom up.

Thank you. Hi, thank you all for the work you do to bring these very important issues to light. My name is Aidan, I'm a resident fellow at Harvard Divinity School, so shout out to the humanist chaplain over here; there aren't a lot of us who come into these rooms.

Thank you for coming.

Yeah, no, I worked on tech policy in the Obama administration, so I'm kind of a weird hybrid person; I worked with Kathy over...

We need all of these hybrid people; that is what is missing.

Yeah, so my question is sort of related to that. One thing that I think a lot about is that
during the Obama administration we did a lot of work to bring all these techies into government, in part because, all right, if you want things to change, then come join us. So I'm thinking about this norms question, this question of how we interrogate our own products, and whether it's even realistic to talk about having teams within the leadership of some of these companies whose job it is precisely to do that: to think about what the effects on humanity are, what the justice lens on this stuff is. Because there is an element of, if that's not your job, it's a frog-in-boiling-water problem: you forget how to look at those things, or it's easy to believe the narrative of "we're here and we're changing the world" in these ways or those ways. Even if the aspect of the product you're working on is, say, connecting people to education who wouldn't otherwise have it, your job is not to look at a macro level at how the overall company is influencing the direction of things. So should those of us who are talking about these issues consider joining the companies themselves? But I also wonder if that's idealistic, whether people could actually be empowered in those roles. What would it take for Mark Zuckerberg to say, yes, we want a team that does this? It would mean having the humility to admit that right now they're not doing those things.

I think they're certainly thinking about it right now, at least from a kind of oversight perspective, for a bunch of different reasons, but maybe not quite to the level of having a chief human-experience officer or some version of that. I think this is why it's really important not only to look at the individuals in charge, the Zuckerbergs, but also to look at the
systemic support systems around them. What is the institutional org chart for Facebook? That matters. For Google, that matters. And how does that change over time? That also really matters as these companies scale, as they get bigger, as they start touching more things. And aside from trolling LinkedIn, how do we as a public get to look into that? Maybe the shareholder 10-K has a little bit of insight, but probably not much.

Yeah, there's some response to agitation. The book didn't dwell on this, but a major absence is the lack of a labor movement, and that is where there need to be these checks. Believe me, there could be a long litany of why labor unions are inefficient; looking at the whole of society, you could see labor unions as inefficient, as giving bad incentives, whatever the criticism would be. But they serve this vital role of checking these companies, and the companies are not going to do it voluntarily. We're saying there's not going to be self-regulation; it's not going to be them appointing someone to serve this role. It's only the reaction to the bad press and the valid concern about the election that's leading to some change. If there's agitation about the lack of diversity, that could lead to some change. If there were an actual labor union that would actually be able to strike and really affect the balance of power, that's the only thing that I think is going to make them change, and it's not beyond the realm of possibility that it would happen. But to expect an org chart somehow to reflect that, it won't happen voluntarily, because I think they think they are doing the best job. If there was any message of the book, it's that they think they are doing the best job, and they will say, look how successful we are; that is how I know I'm doing a
great job. And when you're talking about ethics or whatever these things are, go back to the example of the Google designer. He was saying, I have a vision for Google, a humane vision for how it should look; I'm a designer, I study this. And they're like, we're just going to test it, and people are going to like what we do. They're not speaking the same language. What we're talking about doesn't compute.

Yeah, I think that makes sense. It strikes me, though, that maybe now we're in a moment when they can no longer say, we're a morally neutral platform, and so we don't need people whose job it is to think about these things. But again, that may be naive of me. It just seems like if there's a moment to push on it, this maybe is it.

Yeah, absolutely.

I would just add: any product goes through a process where the lawyers have to do the checkbox before you ship, right? The lawyers are the last step, and engineers hate this, because it's like, oh, we built all this stuff and now the lawyers are saying we can't do it. That's because it's not integrated into the process. So imagine that, but not just lawyers integrated into the process: experience ethicists. Who are those people? How do they get hired? What were their calculus scores, right? This is a huge case for the humanists to find jobs in these companies. It's not just the engineers that we need.

And last question. You've been waiting so patiently; thank you.

Hi, my name is Heather, and I'm a lawyer, but not one of the lawyers you were talking about. The thing that has struck me is that what we study in law school, largely, is how the rules got there and why we have them, and the internet seems to be a giant erasure of history and of why we do things. Because, you know, I know lots of engineers, and the general mindset of engineers is, we can
figure out anything. And so that means expertise is not valued, because we can always learn it; we have all this information out here. But they've forgotten the part about bad information not being helpful. When you have a platform with no way for you to tell what's good information and what's bad information, you make bad decisions. In my corporations class, the first thing we studied was taxicabs, not actually the Uber issue, but more that what they used to do to avoid liability was one car, one corporation, ten cents in the bank. And that's why we have the rule of piercing the corporate veil: if you don't capitalize your corporation, then guess what, you don't get the benefit. We're losing all of this, because we have all of these people who were so much smarter than everybody else that they've decided they don't need it. And they're also really young; they haven't lived, and they haven't seen that there were reasons why we do things a particular way. Yes, we should reexamine them periodically, but we should recognize that there is a reason.

Yeah, there's a wonderful thing called Chesterton's fence. G.K. Chesterton talked about the idea that you walk in a field and you see a gate there, and you think, well, it makes no sense in the middle of a field, obviously I'll just tear it down. And he's saying you should at least have the humility to go, someone put it there, there was probably some reason. I don't know that it's the right reason, but things don't just happen at random; have some respect, as you're saying, for history and context. And he came at it, I think, from a right-wing perspective, a very conservative perspective, criticizing George Bernard Shaw. And you could see why: fundamentally, a lot of what we're talking about is pretty conservative values, and in reality these libertarians I'm talking about are highly, highly radical. I mean, it's something we haven't
seen since, you know, the Russian Revolution or something: that lack of respect for institutions and for history, and this belief that progress is a mysterious thing that happens instantaneously. So I think the stakes are very high, and sometimes it's easy to get confused by what's being pushed forward, but it's really a scary, dangerous ideology. That's what I was trying to say in the book, for sure.

Yeah. I would just add, to address the kind of allergy to regulations, that's a perfect example. There's a reason taxis are regulated, or have rules around them. This is the Uber model: well, we're just going to make it more efficient, that's all we're trying to do, so regulations of taxis and local jurisdictions don't matter. That was quite literally what came out of Travis Kalanick's mouth, and we all know where he has ended up. But all of that is to say, it's worth acknowledging that technology has politics. The libertarian stance is such that it's like, I'm just apolitical; this is efficient, this is market-driven, there are no politics involved. And I think what we can really push against is: yes, there are politics involved. Call it out; call a spade a spade. I was telling my editor that there's the line from the band Rush that says, if you choose not to decide, you still have made a choice. And he said, well, actually, it was Pascal who said that. So maybe it's a little more weighty that way. But that's the myth they believe: that they're not making decisions, when in fact they are making new decisions.

And additionally, the way you ask a question has an answer embedded in it.

Absolutely.

I mean, that's what you learn in legal writing.

Thank you all so much for coming out. And again, one last reminder before we
get a big round of applause for our speakers: Noam's book is up here for sale, and I cannot recommend it highly enough, so please go and buy it. And our mailing list, if you are interested in hearing about future Communications Forums, is right over here. Please join me in giving a really big round of applause.