Welcome everyone. First Media Lab talk of the year. I'm Kate Darling. I'm a researcher here at the lab. I'm excited and slightly terrified to be next to Douglas Rushkoff here today. I first heard about Douglas from our lab director Joi Ito, who said, and I quote, "Douglas is braver than I am." Which, if you know Joi Ito, the guy who invented the Disobedience Award at MIT, is pretty high praise. So for those of you who don't know him, Douglas is a fairly prolific media scholar. I think your website says media theorist. Although looking at his work, I feel like media is defined probably as broadly as we define it here at the Media Lab. How many books have you written? Depends if you count graphic novels. If you do, then it's like 20, but otherwise it's like 15. That's either way an inhumane number of books, including the book that we are talking about today, Team Human. So I have a signed copy of Team Human. It has a snarky inscription. You too can get a signed copy of Team Human if you like after this talk. We're selling books back there. And if you're watching this on the livestream and you're in the lab, please don't do that. Please come here. It's much nicer in this room. You can get a book. You can hug Douglas if he'll let you. Ask for consent first. Come join us in the atrium. A few housekeeping notes if you are watching on the livestream and you can't join us here today and you want to participate on social media: there's a hashtag. I think it's up there. #MLTalks. And you can ask questions on Twitter. The way this works is we're going to talk a little bit about the book, and then we'll open it up to a broader conversation. We'll take questions from the room and also some questions from Twitter that I'll sneak in, probably to Douglas's chagrin, because I know you prefer in-person questions. But please ask questions on Twitter. I'm a big fan of Twitter questions. So, Douglas, we go way back to last September when we first met.
I met you in New York. You were about to give a talk, and I did not know who you were. And my baby crawled up to you and tried to knock over your water. I think that's how we started talking. And I remember that conversation, just being struck by how warm and kind and mild-mannered you were. Just really nice. And then I sat down to watch your talk, and this nice guy just launches into the most powerful, passionate, angry speech I had ever heard. And I almost choked on my wine, because I realized who you were. I realized you were Douglas Rushkoff. The courageous guy. The brave guy. So I feel like the reason I didn't put two and two together is that it seemed like such a contrast at the time: the fiery rhetoric and your demeanor. But looking at your work and reading this book, I actually feel like it makes a lot of sense that you are both nice and angry at the same time. And that maybe we should all be a little bit more of both of those things. And I'm hoping we can kindle some of that fire here today. But I actually want to start with the warm and fuzzy side. So, you're not jaded about people. Your book is called Team Human. Like, don't you think people are kind of terrible? What gives you so much faith in the human race? Well, what else am I going to have faith in? You know? I mean, I certainly think people can be corrupted. I think people can become addicted to systems, to operating systems that they're not aware of. I think they can sometimes see invented things as given circumstances of nature and then respond accordingly. You know, if you're born into a competitive, angry war game, you're going to think that's the world. But no, I think human beings are just confused right now. You know, I can watch almost anybody, I mean, think of the darkest people in politics today, I can watch almost any of them and see the human being there. You know, I don't know if compassion is exactly the right word.
But I can see the person struggling, trying desperately to establish rapport in a world where they don't know how to engage with other people at all. But yeah, when I look at the story of human civilization, what I see is a story of our efforts to collaborate and coordinate and work together and forge solidarity and rapport with each other, and then how fear or capitalism or something turns those things against us, or against that effort. But yeah, I have to. I mean, unless we believe that some savior's going to come down, some deus ex machina thing, and fix it, then I've got to believe it will come from humans. And I think the current low sense of self-esteem we have as a species is largely manufactured, is largely a result of living in a dehumanized landscape. And I think our big problem now is we're having trouble understanding people in any terms other than their utility value. And I'm a Mr. Rogers kid, right? I was told that I'm special just the way I am. And you could say that that's, you know, a sick boomer illusion. But if you don't think that there's some essential merit or worth to humans, and if you don't have the experience of establishing rapport, of seeing somebody's pupils get larger and feeling the mirror neurons fire and the oxytocin go through your blood and bonding with another person, then you don't understand where our power actually comes from as a species, where the whole thing derives from. And then yeah, sure, you're going to end up being one of the billionaires building a bunker in New Zealand. Oh, tell that story. That's a good one. Yeah. That was what that talk was about. I had been invited to do a talk for what I thought was going to be this group of bankers about the digital future. And it turned out to be five billionaires, who brought me into the green room to pepper me with questions about how they should invest their money.
But eventually the whole conversation turned to where should they put their doomsday bunkers, you know, for the climate catastrophe or the electromagnetic pulse or the social unrest that was going to come. And they spent the majority of the time on the single question: how do we maintain control of our security force after the event? Because they know their money is going to be worthless, and then these guys with the guns will be more powerful. So should they have a combination lock on the food supply, where they're the only ones who know the combination? That's what one of them thought. It's like, that's really a recipe for waterboarding, you know, or shock collars or other disciplinary techniques. So I decided to, you know, I mean... Did they really say shock collars? Yeah, yeah. It was one of the... That's incredible. I mean, it was half facetious. So what do we do, shock collars? It was sort of more like that. You know, where you have collars around the guys, so that if they want to serve you, they've got to wear these things. You know, you'll be asleep. They'll change the controls. It doesn't work. So what I was trying to tell them is they're at the end of a scenario that they're thinking about wrong from here. In other words, rather than trying to figure out how much money they need to earn to insulate themselves from the world they're creating by earning money in this way, they could think about making the world a place that they don't have to insulate themselves from. But that's the anti-human bias that's so embedded in our technology culture today, in digital culture, and that's because digital culture is built on an unrecognized operating system of corporate capitalism, which has always been about getting humans out of the equation. You know, it isn't just digital companies that wanted fewer workers because they can't scale if they have humans.
It isn't just digital companies that think we have to use all of our technologies to manipulate people rather than serving people. It's much older than that. You know, and that was part of my trepidation coming here. I don't want MIT Media Lab people, particularly, to think that I'm railing against technology. I love technology, if anything. I'm disappointed in what we did with technology, because I really believed that the internet could have helped us practice collective intelligence. You know, collective awareness, collective activity. It could have, if not gotten us there, at least been training wheels for, you know, what at the time, in our little psychedelic world, we thought of as the Gaia mind, for the global neural pathways to emerge. And we just surrendered it so fast to the market that we're using it for the opposite. You know, we are not the users of the internet anymore, and we're not even the product. Even that would be something. We're the medium at this point. The net is playing us. We are the medium of our technologies. We don't use algorithms. Algorithms use us. We don't use our smartphone. Every time you swipe on your smartphone, it gets smarter about you and you get dumber about it. You know, and we can't even learn about the smartphone, because the algorithms in there are protected as proprietary black boxes. So we can't even know the systems, right? And with oppressive laws, at least with bad laws, you can see they're on the books. Oh, look at this bad law. Once the laws have migrated into code, they become subterranean. They become part of the operating system, and that's a little bit different. Yeah, what I really like about this, so, you know, you do talk a lot about technology and digital media in your book, and, you know, I'm a millennial. You know, the cutoff for millennials is '82, so I'm barely a millennial, but I'm there, and I'm extremely online, as the kids say. I love social media.
I met my husband on Twitter. We got engaged on Twitter. Had you met him before? No, we met on Twitter. And you got engaged on Twitter, but you met in between? Three years later, yes. But, like, I'm a huge fan of social media, and I believe that it connects people in new ways and in interesting ways, and not just social media. A lot of the stuff that we do here, from social robotics to affective computing to, you know, fluid interfaces, I think there are a lot of things we do that connect people in new ways, in interesting ways. And so, normally, a lot of the popular tech criticism out there, what I hear is that Douglas Adams quote: anything that gets invented after you're thirty-five is against the natural order of things and the beginning of the end of civilization. Right? Because their argument is usually just, like, look at that teenager absorbed in their cell phone, and that's their whole argument. And they sell books because there's a whole generation of people who are, like, well, yes, clearly, that's a bad thing. And it's the same argument we've heard about every new technology, every new medium. People said that about books when they came about, like, oh, the books are gonna destroy the kids, the rock music's gonna destroy the kids, right? But your argument is different. Your argument is not a criticism of technology, it's a criticism of the systems that co-opt the technology. Is that fair? Yeah. Yeah, it is. I mean, I'm as annoyed as the next guy by this rampant sort of Medium-style rhetoric that I see now. And I love Medium, I write for Medium, and I love a lot of it, but basically it works by putting two sentences next to each other and somehow connecting them. So it's like, you know: writers deserve to be paid for their work; the Internet Archive can be clicked on by anyone. Okay? You know, it's like, I get that you're upset, and that's really the only response I can have.
I got it. You feel threatened, you're upset, but you're not making sense, right? There was a day that, for me, shall live in infamy: the day that Netscape went public. Netscape was a web browser that was actually based on Mosaic, and Mosaic was developed at the University of Illinois at Urbana-Champaign as shareware, right? Netscape went public on the same day that Jerry Garcia, the guitarist for the Grateful Dead, died. And for me, it put something together. For me, it felt, and I'm saying it felt, right? And this is another one of those cases of putting two things together that are actually unrelated. But to me, those two things happening on the same day made me feel like the '60s communal, commons-oriented, countercultural cyberpunk values that I thought were going to be expressed by the internet were being surrendered to the needs of the IPO stock market. That Wired magazine won and Mondo 2000 lost. That the internet would be contextualized not as a cultural renaissance, but as an economic revolution. And an economic revolution really means we are not going to disrupt anything. When I read Kevin Kelly's book of that period, New Rules for the New Economy, it seemed extraordinarily reactionary to me. What that book was saying is: here is how, even though we have this seemingly disruptive digital technology, you all can still make money the same old way, by investing in things, externalizing your costs, extracting value, and moving on. So it's like, don't worry, the Walmart model will still work. And it does. We have Amazon. We have Uber. It's the same rulebook from the British East India Company on how to go to a place and colonize it: form a beachhead, extract its value, enslave its people, and move on.
And my hope, my naive hope, was that the internet, rather than being a revolution, would be a renaissance. That it would retrieve the values that had been repressed in the original Renaissance: the peer-to-peer values and local marketplaces, and all the kind of late-medieval mechanisms that got sidetracked or repressed by centralized currency and chartered monopolies and the replacement of the city-state by the nation-state, all these abstracted, scaled solutions to things. But instead, digital ended up being more about scale than anything else. And the problem, as I see it, is that human beings don't live at scale. We live locally, and the planet turns out to be local as well. So we're in conflict, and that's where you get these sort of Throwing Rocks at the Google Bus situations, where you have a company that, on an abstract level, is extremely wealthy, but at its actual physical-world operational level ends up being extractive to the humans who are trying to coexist with it. So when I was reading your book, it kind of starts out by saying that market forces depend on human predictability to operate, right? And so the market forces try to separate us for social control. And I was like, I don't know, Douglas, that's a really dystopian view of things. And then, literally just last week, I was at this conference with a bunch of marketing data analytics people, and I had never really talked to people in that world before. They were all super nice, right? But I learned so much. I learned that with that Gillette ad, they knew exactly what was going to happen, and it was exactly what they wanted to happen. They weren't trying to make a political statement or anything. You know, I'm naive, right? I learned that I'm part of the 45% that prefer lime Skittles to green apple. It was fascinating. So I met this woman there, and she was great.
She had this party trick she could do where she could ask you three totally unrelated questions and tell you exactly what type of menstrual product you use. Like tampons, pads. She could even tell whether you use pads with wings. And it's because, I found out, you're absolutely right. The menstrual product industry says there are three types of women. There are exactly three buckets, and they can sort you into a bucket depending on all these attributes. And I was like, holy... That's a fun party trick, but, you know, what's the bigger picture here? I mean, it's funny, because I kind of came up with this construction for that, even for that TED Talk. The idea that, you know, when the digital renaissance, whatever it was, was emerging, part of what made us excited about it was the new, the novelty. We were excited about the possibilities of an unbridled collective human imagination and what that would bring forth. You know, the digital future seemed like open terrain, infinite possibility. And investors don't want that. Investors want predictability. They hire scenario planners, some from even this very institution, I'm sure, at the Global Business Network, to figure out what's gonna happen so they can bet on it. You know, if you're betting, you want the most predictable outcome. You want to bet on a sure thing. So we've ended up, I feel, using data and technology more to figure out where things are going than to have some impact on where we might want things to go. And especially those billionaires, who saw themselves as so utterly powerless to influence the future that all they could do was build bomb shelters to prepare for the inevitable collapse of civilization. I thought, wow, I feel more powerful than they do. Is that because I'm an idiot? Or because they're so locked into their betting?
So when I look at the primary use of algorithms today, or of big data today, when I look at Facebook, what I see is an operating system that uses data from our past to sort us into a statistical bucket, and then uses behavioral finance and machine learning to get us to behave true to our statistical bucket. So if they determine with, say, 80% accuracy that I'm gonna go on a diet in the next two months, they're gonna start filling my newsfeed with, hey, Doug, you're looking fat, or, this is what the veins of someone who's not taking care of themselves look like. And they're not just doing it to sell a particular diet product. They're doing it to make sure I stay true to my statistical profile, to get that 80% up to 85% or 90%. So what they're actually doing, and I understand why, because they want to increase the predictability and ultimately serve me better if I want the thing that they've got, but what they're doing is taking that 20%, that Pareto-principle weird factor, and reducing it to 10% or 5%, and if they could get it down to nothing, they would. What they're actually doing is reducing our novelty. They're reducing the one thing that humans have over machines, if anything: our 20%. It's that anomalous behavior, that unpredictable thing. If we're gonna cure cancer or solve climate change, it's not gonna be the 80% doing things the way we do them. It's gonna be the weird 20% who figure it out. So if we get rid of that, that's a problem. Isn't that 20%, in our current technological parlance, basically what's considered noise? That's not noise. That's humanity. That's what I see as the thing. That's the quirky, weird thing I'm trying to promote and celebrate, and that's the part that seems soft and squishy, but I'm arguing that there's a weird, good reason to keep people around.
This was the argument I got into with the famous singularity guy on a panel for CNN, and they cut this part from it, where he was arguing that the singularity's coming, and people should accept that computers are our evolutionary successor, and we should be humble enough to pass the torch to them and then recede, and stick around as long as computers need us to keep the lights on, and then accept our inevitable extinction. It happens. And I was like, no, but people are special. We should be kept around. As beings, we can sustain paradox and we can enjoy ambiguity. We can watch a David Lynch movie and not understand what it means and still experience it as pleasurable. What is that, right? It's those soft, squishy, liminal, contradictory places. That ability to experience awe and confusion. That moment that the dog has, that the dinosaur didn't, when you confuse a dog and it cocks its head for a second, and we go, oh, we recognize that. Human, huh? That's the part I'm trying to celebrate, because I think that's where the magic of life happens. And if we intentionally stamp that out, and we have machines that are really good at shaving that off of who we are, at automating our behavior, we will never be as good at being machines as our machines. They will never be human, but we won't really care. So I'm worried for the day that computers pass the Turing test, not because computers will have gotten so smart, but because we will have gotten so dumb that we can't tell the difference anymore.
So, Larry Lessig. He railed against copyright legislation for years and years and years, and he popularized the whole copyleft movement with his work, and then, after years in the space, he realized he was still just fighting this uphill battle and getting nowhere. And he realized, oh, it's because the problem isn't copyright, the problem is our system of government, which is so corrupt that I'm never going to win this battle. And so he shifted his focus to fighting government corruption. And so in this case, you know, how much of this problem is technology versus just unbridled capitalism? I don't really blame the technology at all. Technology does not want anything. I promise you, Kevin's wrong on that. But it wants for something, in the sort of Shakespearean sense of wants for, as in lacks: it wants for direction or consciousness or intention, and that's what we would have to instill it with. You know, we were just talking about dear, dear John Perry Barlow's Declaration of the Independence of Cyberspace, and how it was inspired, but it also inspired a wrong turn.
You know, those of us in the early cyber days, we saw government as the enemy, and I remember fighting with Larry Lessig about this, because they had done Operation Sundevil, I don't know if any of you are old enough to remember that, where the government and the FBI went in and raided the apartments of these little raver hacker kids who had, you know, broken into AT&T, or broken into a shopping mall's computer just to see if they could change the thermostat. You know, the FBI is coming in there with handcuffs and tear gas, and we're all like, oh, fuck you, fuck the man. And at the same time, it was the Tipper Gore era, and they were doing the Communications Decency Act, and they were gonna shut down websites that had dirty words, and we were just like, okay. So then John Barlow writes the Declaration of the Independence of Cyberspace, saying, governments of the world, beware, stand away, we will govern ourselves, we don't need you. And we got rid of government on the internet. But stupid little raver kids that we were, we didn't know that government and corporations kind of balance each other, like fungus and bacteria in the body, right? You get rid of all the bacteria with antibiotics, and your fungus goes nuts. It was the same thing: we got rid of government, and we created this free space for corporations.
We didn't know that they would want to come. The corporations hated the internet at that point. The average household that had an internet connection was watching nine hours less commercial television a week in 1994. AT&T was offered the internet for like a buck, and they turned it down, because they were like, what is this, we don't want to have to maintain this stupid social thing, people are just talking to each other, we've got to get back into commercial media. I mean, my first book on the net was canceled in 1992 because they thought the internet would be over by 1993, when the book was supposed to come out. That's how little and stupid they thought this thing was. So the idea that we were wresting it from their hands seemed like a good thing. But no, it turned out to be a bad thing. And the problem is, it's not just the corporate-capitalism-on-steroids thing, but that the young developers drop out of college before they've taken a civics class or anthropology or sociology. These are kids, they're 19-year-olds, they don't even have the myelin sheaths fully formed on their frontal lobes, and they have impulse-control issues, and they're already computer geek kids. And instead of having their professors as their mentors, now they've got some Silicon Valley guy in a sweater saying, here's how it's done, we're going to get you VC. Your company, kid, instead of being worth $20, is now worth $20 million. And that sounds so good until they realize they have to pay back $2 billion, right? The hundred-X is the problem. So now it's, kid, we're going to take this network for connecting people to people, and we're just going over here, just a little bit. It can look like that, but we're actually doing this, right? So now your business plan is to extract value and data and whatever from people, and sell this company before the data bubble pops. But they lose what they were doing.
You know, so I don't even... And I've got to read Zucked. I haven't read it all, I just read the beginning, but it feels like even that book, McNamee's book, is kind of presenting Zuckerberg as a naif, as an innocent, and sort of saying that Sheryl Sandberg and her armies corrupted this adolescent. I mean, Facebook's original purpose was pretty dark, but at least it was social. You know, it was white-male toxic social, but at least it was social. I feel like it would have been easier to pivot sick social toward healthy social than pure capitalism toward healthy social. So you mentioned, you know, the young white male aspect of this, and a lot of people, including myself, would argue that more diverse people building technology, or even leading technology companies, would lead to better outcomes, because people's work is so influenced by their life experience. And, you know, if you have a 20-something dude-bro in San Francisco who's like, I want to make an app so I can order pizza with one button, like, that's, you know. And then you look at the technology that can't recognize dark skin, from photography to the automatic faucets in the bathroom to now facial recognition. It just seems that diversity in tech might lead to better technology, but maybe also to better business models. Do you think that that could be part of the solution? Yeah, I mean, we can blame capitalism for half of it, right?
And the fact that they're unconscious of capitalism. But the other half, I feel, is this kind of anti-human agenda that seems to just be embedded, particularly in Western culture. You know, I keep thinking about Thomas Jefferson's dumbwaiter. And yeah, he was a privileged white male, and he developed the dumbwaiter, and we're all taught that the dumbwaiter was there to save his slaves the effort of having to carry all the food up the stairs. But it didn't. They still had to carry the food up the stairs, and through a whole, like, two-mile tunnel from the real kitchen. The purpose of the dumbwaiter was to hide the slaves from the dinner guests. It was to externalize the labor so we don't have to see it. So it was ultimately a dehumanizing device, to make it look like slavery wasn't there. And that's part of our problem: we have in America such a pedal-to-the-metal, blindered, forward-looking understanding of technological development, where everything that we've done to get to this point, and everything that we're doing, all the externalized harm, is behind us. You know, it's all back there. For all the memory in these devices, there's no sense of memory. So we make movies about robot slaves that have a revolution and kill us. Where do you think that fear is really coming from? It's from a nation that was built on slavery and still hasn't acknowledged where the heck it came from. And it still hasn't looked even that far back, much less at the exhaust pipe sticking out of the back of every one of our lives, as if you can just go forward with it. So I feel like capitalism is a big problem, but there's also a more fundamental problem with any technology that we develop, and I would go all the way back to language and text. All of these terrific, potentially unifying or collaborative technologies and languages and media: if we're not aware of the affordances of the medium, we end up at the mercy of the
medium rather than in control of it, and digital is just the latest one of them. You know, when we got text, you could look at the invention of Judaism, say, as a society trying to deal with the potential downsides of a world of text, of a world where we're going to have a history and a future. You know, they remake their relationship with God into a contract; a covenant is what Torah is. They write down laws, because they're looking and they're saying, oh, wait a minute, when we start writing things down, now people are using text to keep track of their slaves. It's the first thing we did with text. People are lying in text. They're writing contracts that they don't follow. So what if we try to develop laws that are going to codify things? I mean, they were really trying to think about it. You can, and I have, analyze even the Ten Commandments as: these are the things that we're going to need to deal with as we move from an oral culture into a textual culture. It's kind of interesting. And they understood what was going to happen when we moved from an oral culture to a written culture. A lot of the rabbis were so upset that we were going to write this stuff down. They said, oh no, people aren't going to remember the stories once they're written down. People aren't going to have to. Learning the stories won't be a communal event. So then they made a rule and said, okay, okay, we'll make it so that if you read Torah, you've got to have ten people there, a minyan, to try to reinforce the social fabric of it. So if we had been that conscious developing radio and television and the internet, asking, okay, what are the biases of this medium, how are they going to change the way we relate, what ethical presumptions about humans might these technologies not recognize, and how can we compensate for that, you know, that would be a very different path. But I feel like we're developing this stuff on top of operating systems whose biases we don't even understand,
and we're just building on and building on and building on, and we need to disinter some of the biases and embedded values. And I would argue that rather than rejecting technology, all we need to do is retrieve essential human values and embed them in the technologies of tomorrow, rather than forget them utterly. So when I asked you what you wanted to talk about today, I know that you mentioned that artificial intelligence and technology might be interesting, you know, we're at the Media Lab after all. But there's actually another part of your book that was really fascinating to me. It's just a little part, but I thought it was really relevant to our institution here, and that's the part about education. You know, what is education? Yeah, I've thought a lot about that, because I've been teaching now for four or five years, and I have all these kids coming in, and their parents, all asking, what job can I get when I study media studies, you know, what's the job? And public education, I teach at a public university, public education was not developed for job readiness. Public education was developed as compensation for people who had to work all day. The idea was that the coal miner who was working in the coal mines all day should be able to come home at the end of the day and have enough education to be able to pick up a novel and appreciate it. That even though he's a coal worker, he should be able to live with the dignity of a thinking, conscious person, with real cognition, thoughts that are valuable. And plus, if we're going to live in a democracy, they need to be able to read the newspaper and be informed enough about the issues to actually exercise the Enlightenment value of voting. And instead, now we've turned the classroom into job training. We have CEOs meeting with high school principals and college presidents who are anxious to find out: what skills do you need our students to have so they can get a job in your company? Do they need to know Excel spreadsheets, should we
teach them that? Or do they want to know Python or Java? What do you need? So the classroom is now a way for corporations to externalize job training, rather than being these, dare I sound too idealistic, sacred places where young people get to, through mimesis, practice what it is to learn with a capital L. And that's where I got into Horkheimer's Eclipse of Reason. It's a great small book, an old one from the Frankfurt School, where he's arguing that there's a sort of capital-R Reason, the big, almost Platonic values (though I don't want to say Platonic, because it's really more Aristotelian, but that's a long story), the real ideals, the reasons we do something, versus being reasonable, the little-r utilitarian reasons. And we've started to think of education in a utilitarian way, in terms of inputs and outputs: how am I optimizing these people to be workers in the economy of tomorrow, rather than how am I enhancing this human being's ability to experience the essential dignity of being human? And it's so funny that even that is now almost considered elitist, or a luxury: oh, you're not the one who needs to get a job when you get out of school. There's an ass-backwardness in that, and I use it as one of the main examples of this reversal of figure and ground, how human beings have become the objects of our reality rather than the subjects. That's a dangerous place for us to be, and it's just not an appropriate way for us to teach each other and be with each other.

Can you explain the figure-and-ground concept for people who aren't familiar with it?

Yeah. Figure and ground started with, I guess, a Danish psychologist, and that famous picture that could look like a goblet or could look like two faces looking at each other. Some people see the goblet; some people see the faces. So it's sort of a test, in a way, of whether you're seeing the figure or the ground, the subject of the picture or the landscape in the picture. And you want a healthy balance, where you understand the ground and you understand the figure. But I use it really as a way of describing this kind of profound reversal between people and their technologies. Like when I think about the internet of things, I think the human beings are the things, right? We're the things in the internet of things that are actually being tracked and ultimately manipulated.

So education is one construction you unravel, but there are a bunch of constructions you talk about in the book that we just take for granted today, and this might lead nicely into the AI discussion, because my favorite part of the AI chapter was about automation and jobs. All day I hear people asking, oh, are the robots going to take all the jobs, or are they not going to take all the jobs? But you peel back an additional layer: well, what even are jobs? You say about jobs that today they're not a way to guarantee that necessary work gets done, but a way of justifying one's share in the abundance. What do you mean by that?

Well, what are jobs for right now? All politicians talk about, let me get people more jobs, more jobs, more jobs. But who really wants a job? When were jobs invented? Jobs were invented, we talked about that, in the transition from medievalism to Renaissance capitalism. People used to have small businesses, and then those businesses were made illegal, so they had to get employment; they had to get jobs. That was the first time since slavery that people had to sell their time. Talk about technological determinants: that's when they put the clock on the tower in the middle of town, to make it look fair that you're selling your hour instead of selling the bread, the thing that you made. You know, when I heard, and
originally it was Ben Bernanke and all those guys I was listening to, talking about creating more jobs for people, I was thinking: why do they want to create jobs for people? Is it because we need more stuff, we need more work done? We're tearing down houses in California because they're in foreclosure and we don't want market prices to go down. The USDA is burning food every week in order to keep market prices high. So why can't we let people live in those houses, or have that food? Well, they can't have it or live there because they don't have jobs. So then we have to, what, loan money to a bank to give it to a corporation to build a factory to create plastic doodads that nobody wants, so they have to hire an advertising agency to create demand for this crap, which people will use and then throw into the ocean, starving the fish that we actually want to eat. But at least this person could have a job, right? And if we're just trying to patch a fix, a kludge, onto the existing system, then digital technology will come to the rescue, right, and create gig jobs or TaskRabbit jobs, or these jobs will create jobs will create jobs. Okay, we'll pump out jobs, if that's what you're looking for. So I was like, I don't want a job. What if we look instead, and this goes to a whole lot of possible economic arguments, at what actually needs to get done? And if we don't have enough jobs, then we're going to have to share the jobs, so that everybody can have the experience of contributing. The fact is, we're not really near a jobless future. If we were, we wouldn't have to send kids into caves in the Congo to get rare earth metals. We wouldn't have to leave mercury in landfills in China and Brazil. We wouldn't have to destroy the topsoil in the next five or six decades. If we had more labor-intensive, careful practices in our production and agriculture and everything else, we might actually get to stay alive as a species. So there's not even a shortage of tasks for people to do. But we consider jobs another thing: you want to limit the number of bodies in your company so that you can scale infinitely and sell. You can't sell a company that has people, unless they're programmers, and even then, the acqui-hires, that's kind of faded as well. So yeah, that's another figure-and-ground problem. It's this ass-backwardness that happens if we don't interrogate the underlying assumptions of the problems we're trying to solve.

So what about AI, though? I see people constantly trying to create AI systems that are a replacement for human intelligence, that can perform human tasks, and we want to automate human jobs. Is that a good thing or a bad thing?

Depends what we want the AIs to do. Seriously, it does. AI is going to try to do whatever you tell it to do. So what's the function of a car salesman: is it to get someone into the car they need, or is it to get someone into your company's car? Those are two different things. I don't like AIs being told to get people to do something. That's language I've been trying to avoid since I first talked to technologists: how do we get people to do this? Even well-meaning, lefty, liberal, whatever: how do we get people to care more about the commons? How do we get people to... That whole construction, again, objectifies the person. So when AIs are about that, then no. If an AI wants to drive me around or run my subway, yeah, if it's safe, sure. And if it's going to make a calculation whether to kill the rat or the squirrel, because it can steer and kill one or the other, sure, that's a problem.

You don't have a preference, rat or squirrel?

I do, but I don't know where the actual... I don't know. For me, I think it's
prejudice. I think the rat, actually... isn't a rat smarter than a squirrel, even though the squirrel is cuter? I don't know. So the AI can figure that out. I just want to ride on the subway and drink my beer and read the paper. Leave me alone; can you figure it out? Because that's a painful choice I don't want to make.

Well, okay. So we're at the Media Lab, and we were talking a little before this about how there are a lot of students in this building who do care about what happens to their technology after they've created it, and they do worry. I told you about a student who was worried that McDonald's was going to get its hands on this educational toy he built for children and use it to exploit kids. So I guess the big question is: what can we do about that? Can we lean into the positive sides of technology? Are there ways to design it so it's less likely to be co-opted? Should we give up and go home? Is it possible to get it right in today's capitalist system? Are there examples of people getting it right? What do we do?

I mean, there are so many ways to get caught up, or to take a weird wrong turn. One of them is that a lot of times we design technologies before we have a use for them, and that's a tricky one. That's not to say we shouldn't do pure research; we have to. Sometimes it's just cool: how do electrons spin, how does this work, and all that. But, nothing against blockchain, I love watching blockchain conferences, because so much of it is: we have this ledger, what task can we retrofit it to that will do social good, and not allow this, and not allow that? Sometimes I feel like we've got these toys that we don't know how to use yet. Let's try some of this, try some of that. In the current landscape it's almost impossible to develop technology that won't be used in some other way as well.

So do we go higher up? Like, is it a political thing?

I think we go lower down, in some ways. If we keep interrogating the operating systems beneath what we're doing: what is it that's fueling my lab and letting me do this? Who am I working for? What control am I giving up as I develop this thing forward? It's really a case-by-case basis. Right now we are developing digital technologies with an industrial-age framework, and that's not going to work. This is not the fourth industrial age; it's something else. The industrial age is about one-size-fits-all, scaled solutions to whatever. And part of the beauty, if you remember the early cyberpunk era, was how distributed it was, how homespun: I've got my own computer, I've got my own thing here, I've got my own server. And I know it's the good old days, but we did share. We shared our apps on the back of the school bus. That's when you wanted six other kids to play your friggin' maze game, and maybe it would get to the neighboring high school, and this was on paper tape, back in the day, and they would play it. That was the original excitement: the way you knew that what you had done was good and worthy was that a lot of people were using it. That was kind of the point. And that doesn't really translate anymore into, well, the way you know it's good is that VCs have given you enough money to force your cab company onto a town that really doesn't want it. That's no longer a natural uptake of technologies. But yeah, I think you start by looking at an actual human need, and then think, how can I address this need with technology, and actually engage with people on the ground. I work a lot at Civic Hall in New York, which is a very well-meaning
place where lots of independent people come in and develop these civic technologies. But I'll talk to a guy, and it's, oh, I'm making this app for homeless people on the street to be able to use blockchain to get durable identity over long periods of time. And it's like, have you talked to anybody about this? Finally they launch the thing and they talk to the homeless people, and the homeless people say: I don't want durable identity. I'm hoping I get out of this and no one remembers who I am at this stage of my life; don't pin that to me. Or: I'm trying to use the benefits of two different shelters at once, so I've got two different IDs. Don't mess with me, buddy. You know what I mean?

So we're not connecting with people enough, again.

Right. And the technosolutionist urge, even when it's meant as "I'm going to do something good for humanity," still so often comes from the place of: human beings are the problem, and technology is the solution. And that's troubling, because right now, in most people's experience, technology is the problem and human beings are the solution. And that's the mindset, prejudiced as it may be. People understand that, well, they won't understand it in these words, but: Vannevar Bush went to Eisenhower and said, your colonial expansion is not going to work anymore. You can't grow capitalism on the backs of the third world. But I have a new territory for you to colonize, and that territory is going to be virtual. It's going to be this computer territory. This is going to be the new industry that lets the American economy expand. But what we didn't realize was that you can't actually colonize the internet. You colonize human attention. You colonize human data. You colonize human cognition. And that's what's been colonized. And people don't feel, I don't, anyway, like I have enough time in the day, in my own head or with other people. I feel at the mercy of these algorithms that want to figure out what kind of menstrual pad I use. And if it were just surveillance capitalism, if they were just watching me, that would be one thing. But they're not just watching me. They're tilting the very landscape to influence my behaviors; they're changing the world. It's like the Truman Show, where the internet that I get, which is everything at this point, is being rendered in real time by algorithms that are trying to get me to behave in particular ways. And that is a weird world to be walking through. And it's a world where, because I can't really see that, I don't distrust the simulation: I distrust the other people. I distrust the other people because I can't establish rapport with anyone anymore, because they're not looking at me, because they're walking down the street staring into their phones, or because I'm only seeing them on Skype, where I can't see their pupils get bigger. I can't establish rapport, and I don't blame the technology; I blame the other person. And that leads to a feedback loop of increasingly dehumanized developments.

Well, we are in a room with other people right now, so we can connect, and I do want to open it up to some audience questions. We have this question box, which, if you haven't seen it before, you just throw to each other. If anyone would like the box, it has a microphone in it, so please speak into the box and let us know who you are.

My name is Anna Hessenbrook. I teach innovation here at MIT. You describe my problems, you describe my mother's problems, you describe my children's problems, you describe my granddaughter's problems. But at the same time, half a billion Chinese have been lifted out of poverty since Jerry Garcia died. Isn't that more important?

So wait, I just want to get at the two things. Give it back to him, though. How are your grandchildren's problems and the Chinese getting out of poverty connected?

What we're talking about is our attention being lost, the dehumanization of our world. All the
values that we had as kids are sort of disappearing. That's something we in this room can all agree on, because we're rich; we're the rich guys. But there's a world of poverty out there, and there are half a billion Chinese, just to represent them, and they're not the only ones, who have been lifted out of poverty while we've been losing a tiny bit of our quality of life. So which one is more important?

And you attribute the Chinese getting out of poverty to AI, to Facebook, to Google, to what?

Sure, to electronics. To the electronics being assembled, and to the globalization of the system, where all of a sudden there's a lot of work in China where people can get paid, work that wasn't there 30 years ago.

I mean, it's an interesting model. So let's say, and it's possible, that white Western culture has run its course. We killed Native Americans; we enslaved Africans. So maybe the appropriate and ethical step for us to take is to create technologies that make us suffer and maybe end our civilization, while our consumption of those technologies creates wealth for the Chinese who are assembling them. Maybe. It still feels to me a little bit like a zero-sum game. It's sort of like saying: okay, all of America is addicted to heroin, but we're buying the heroin from these Arab and Chinese countries that are growing the poppies, and they're getting out of poverty as a result, and, fuck it, we're kind of bastards anyway, so let that happen. I could buy that as a kind of civilization-wide penance. But I think, and this is just me and 99 percent of scientists, we're also in danger of destroying the planet itself. And I don't know that arresting the American psyche in an effort to save the Chinese economy through industrialization is the easiest way to fix things. But I do see it. The real thing I would say is: however much I hate Facebook for Americans, you look at how Facebook or crypto are being used in Africa, and it's quite exciting. In Africa they just call the internet Facebook; that's how they get online, and that's how they do money transfer and find out about jobs. Women use crypto: here it was an investment scheme; in Africa it's a way for women to make money and hide it from their husbands, rather than having it beaten out of them when they get home. So there are a lot of ways that people who have more genuine, or more direct, needs are using technology in ways we can't quite imagine, because we're using it for entertainment. But at the same time, I feel like our use of technology in this way is paralyzing our ability to act as a civically guided republic and a potential catalyst for positive global change. I feel like it's distracted us from that purpose. I don't see how America, other than maybe through the purchase of Chinese industrial goods, which is still poisoning the planet, is actively, positively contributing to some kind of global harmony. I feel like we're descending into a kind of digitally induced nationalism and borders, a very binary, polarized, dehumanized way of seeing the rest of the world. But sure: some of what I'm talking about are white-people problems. Some of what I'm talking about, though, are species-annihilation problems.

Where did the box go? Thank you. I'm Neil Mohsenman, a PhD student in Pattie Maes' Fluid Interfaces lab, and my question is about cyborgs. Where do you think they will fit in Team Human? Imagine that in a decade or so we will integrate with AI, so every individual will have superhuman abilities. How do you see that unfolding with the current operating system of society?

What did you call them? Cyborgs?

Cyborgs, the human cyborgs.

I mean, I always start with: a person with glasses is a cyborg, right? That's sort of the beginning. I think it's always a matter of balance. Different people can tolerate different levels of enhancement before they kind of
lose their center of cognitive gravity, and that's sort of what we're going to see: how much can you do, how quickly, before the person tips into something else? So it's going to be interesting. Some of it, which is interesting to me, is about reacquainting people with the physical world, like the people who put on a little sensor that buzzes when they're facing north, and they really experience it as this kind of grounding thing. Or people who have a little shock or something go off when they're just about to fall asleep, so they can sustain the liminal state between waking and sleep. To me those are cyborg enhancements. So I'm more interested in cyborg enhancements that extend the nervous system, my perceptual apparatus, than in ones that extend my supply of fixed data. Having the Spanish language, the Cassell's dictionary, in here is not as interesting to me as these almost more humanities, artsy kinds of extensions. But I do think we're going to have some fallout. It's the same as with pharmaceuticals. And again, I think the guide should be, and this might be a white Western guide, but it's a guide: are we correcting the individual for the values of society? In other words, are we giving the person a drug so they fit into a sick, depressed, extractive society? Are we drugging, what is it, 30 percent of America is on SSRIs now or something? Are we drugging because there's a systemic problem that needs to be addressed, or are we actually enhancing in an interesting and fun direction? That, for me, will always be the litmus test of whether I'm interested in the thing. If we're just going to increase somebody's utility value by giving them the claw, so now they can, you know, make shoes, if you like, that starts to make me think of the human as the canvas rather than the artist.

Actually, I'm going to take this Twitter question, because I think it'll infuriate you. What do you think of the concept of voluntary obsolescence, where humanity is slowly phased out in favor of a new conception of what it means to be human? Team Human strikes me as needlessly adversarial. I think that's a transhumanist, don't you?

I guess "needlessly adversarial" is an interesting construction: why not just accept that humans are going obsolete, right? My argument to retain a place for humans is adversarial to those who would replace us; I get that. It's sort of like creative destruction, right? The robots come, that's the creative destruction, and the people go. No, I don't think I'm being adversarial. I think I'm arguing that we're not good enough at programming yet to take into account all of the weirdness of humans. I don't believe that we yet know everything that happens in a square centimeter of soil. We're only now getting scientists to agree that soil is alive, that soil is a matrix, that trees use soil to pass nutrients to one another, that the mycelia are more advanced than us and keep us alive. There's so much about us that we don't yet know that I'm concerned the Xerox copy we envision, even with its improvements, may leave something out. I still feel, again controversially, like there's something about record albums that CDs don't capture, even at 44,000-whatever cycles, that sampling rate. And maybe it's everything, and maybe it's all you need. But when it finally comes down to it, and this sounds almost theist, but I like Aristotle: I believe in the human soul. I think there's some kind of pre-existing something about us. I think we come in with value, that we don't have to prove our value. And until we really resolve a lot of questions about the quantum fields and all, I'm not willing to let us all go. I think the human project is still in its adolescence. And I will admit that in the 21st century, it is considered adversarial to argue for a place for humans in the future. If that's adversarial, then I understand my work. My work is important because I am arguing for a sustainable role for humans in our future, at least for the next couple of hundred years. I think it's worth keeping us around, more than three or four of us in a zoo, and I think that losing several billion people to climate change as things move on would be a catastrophe, a bad thing. People on the other side of the wall in Mexico are humans. They're not just MS-13 or whatever; they're human beings. But I don't mean Team Human as fighting words, I guess. They say, you only say this because you're human; it's hubris. And I say, yeah, fine, guilty as charged. I'm on Team Human, and that's something fun to fight for. Where I come from, old-fashioned as it is, humankind was sort of a given. I think we have a unique role in nature. I think we're the only ones who are self-aware in the way that we are, and I think we can be stewards. I think we can make nature less cruel. I think we can bring meaning to existence. And I don't yet have faith that the rapid deployment of digital technology, in the name of the next kind of human, is being done with care and precision, with an understanding of the underlying biases, with any consciousness of capitalism and the rules we're embedding in the technologies. I don't think we're wise enough to build the next species, as worthy as we are.

Well, I'm human, so thank you for fighting for me and for all of us. We're over time, but pick up a book, get it signed. I'm sure we can hang out a little more in this room. Please give our guest, Douglas Rushkoff, a big hand.

Thanks so much for having me.