in that very specific area of agency, of claiming parts of our visual space when we're online. So we invited Tristan to come talk about his work now that he's focusing on it full time, and we had some delightful chats this afternoon. I hope you'll help us be more reflective about how this is already an issue, not an issue for the future, something that we all face without thinking about it, and hopefully give us some ideas about what we can do. Thank you. I think they're going to make sure that this will work. Yeah. And this is an informal conversation, an informal chat. We'll have a little discussion, some presentation about things that you've been writing, for those of you who haven't been following Time Well Spent for a while. And please also bring your experiences of ways that your interaction with the web is being mediated in ways that you don't want. Thank you guys for coming. So I'll fiddle with it. So I'm Tristan. For the last three years I was product philosopher and design ethicist at Google, which meant that I was very concerned with how technology shapes people's social and psychological vulnerabilities. The reason I was concerned with that is that my whole life I've been interested in what influences people, and in questioning the felt sense that we have that we're agents of our own actions, that we're making choices, free choices. So actually, just to give a demo of this, I wanted to do a quick thing. Doc, would you pick a number between one and five? Free choice, any choice, I'm not influencing you whatsoever. Do I have to not tell you the number and you'll guess it anyway? You're going to tell me which number. Okay, five. Five. Yeah, I go high. It's what I tell every Uber driver. Okay. Would you just pull something out of there? What does it say? Yeah, what does it say? Doc, how did I know you would pick five? Is that what it says? Very good. Okay.
Because you picked it out of the pocket that had the five in it. What's that? Do you have a one or two or three or four in other pockets? Nope, I've got nothing. Because I'm that influential. Well, so I manipulated you into picking five. Okay. Oh, very good. No, actually I didn't. But the point is, he feels like he made a free choice, right? He feels like he made a totally free choice. Not anymore. And that's the thing that basically all my life I've been questioning. So when I was a kid I was a sleight-of-hand magician. When I was at Stanford as an undergrad I worked in the Persuasive Technology Lab, which basically taught young engineering students how to use people's psychological vulnerabilities to create more engaging products: to keep people coming back, to get them to finish sign-up forms, to keep them scrolling, to keep them as engaged as possible. My professor was BJ Fogg. If any of you know BJ Fogg's work, he basically trained up this whole cohort of engineering students in my class, along with the founders of Instagram, who are friends of mine. So this is not just some random thing. This is a whole cohort of people who are now in the industry using this on people. And in that class on persuasion, I became concerned about the ethics of it. When you learn all these tools for influencing people's minds, what's the ethical code that you use when you're doing that? And the professor, BJ, didn't have an answer to that question. So I'm just watching the screen here to see, good, because I'll have some demos. Yeah, for one minute. So there was no discussion about the ethics. And when I got to Google, the way I got to do this work as design ethicist and product philosopher was that our company was acquired by Google; I landed there through an acquisition. And after a year I was going to leave, because I was working with the Gmail team and I felt like there was this missing question that we weren't asking.
Which is: not just how do you build a better email client that gives people a better experience of email, but how does email fit into a life well lived? Where is email actually a net positive contribution to people's lives? And how do you redesign the urban planning, the intercommunication network of email, so that it actually ends up being best for people? So I was going to leave Google after a year, and I made a presentation about the moral responsibility Google has in shaping how a billion people's attention works, as a sort of goodbye manifesto. The presentation was a slide deck, 135 slides, with a big image and a few words per slide, and it made the point that never before in history, as you saw in the description of this talk, have a handful of people in technology shaped how a billion people spend their attention, and the thoughts that they have. For example, think about your thoughts, your attention, in the last 24 hours. Just think about where your time and attention went in the last 24 hours. Close your eyes for a moment and just think: where did your attention go? Think about all the places, all the screens, all the people you saw, all the people you talked with. And of the places you spent your attention, how much of that was a matter of choice? Were you choosing each time you put your attention on something new, or did it seem to just kind of happen? So the idea is that our attention... I've studied hypnosis, I've studied neuro-linguistic programming, I've studied persuasion, I've studied cults, I've studied basically so many disciplines that are about the art of how people get manipulated, because I was concerned about this ethical issue. And my hypnosis teacher would ask me: do you know the thought you're going to be thinking three thoughts from now? You don't. It's going to happen to you.
So when this thing in our pocket is essentially shaping how a billion people's thoughts sequence, how do you do that? So the presentation that I made spread virally throughout Google. It became a kind of hit, and instead of leaving I spent the next three years studying the answer to this question: what is the ethics of shaping so many people's lives? So this was me as a magician as a kid, and that was the story that slide was supposed to go with. What I want to convince you of today falls into three main topic areas. The first: I just came from a conference this last weekend at MIT on runaway AI. How many people have heard about runaway AI, or AI existential risk? Anybody? So this is a big public concern, it's in the media about every week, because we're going to build this intelligence and it's going to overpower human nature, and there's this concern about what it is optimizing for, and how we put human values into AI. What I want to argue is that the first wave of runaway AI is already here. It's already directing our thoughts, our behavior, what we're thinking and believing, through things like the Facebook newsfeed, which is optimizing for engagement. We'll talk about that. Second, I want to talk about how we can get around the current attention economy, where everyone is competing to influence people by structuring our attention in a certain way. Apple and Google, who control the two intermediary platforms, could restructure the way that actors compete for our attention in a way that might be net better for everybody. And surrounding these questions is a concern about agency, which for the purposes of this room we'll just describe as the ability to have the thoughts you want to have, to make the choices you want to make, and to relate to people the way you'd like to relate to them. That's the thing being compromised by all of this technology.
And then there's a big portion of the advertising economy which is dependent on this loss of agency right now. And so if you want it back, is it something you have to pay for, or should agency be something that's a public good, or sorry, a private good, your good, that you simply have a right to? So those are the three topic areas. I'm going to zip through this bit really quickly and then we'll get into more of a conversational style, but I do want to show some examples. So, just to give you a hint of influence and persuasion: if you get this email, this is one of the most persuasive things that a human being can probably receive. If you can't read it, it says you've been tagged in a photo. Why is it so persuasive? Because your social approval, your vanity, your social acceptance or rejection are on the line. It creates immediate false urgency: oh, I have to check this out right now. And of course I'm not just going to hit "see photo", I'm probably going to spend the next minute there. And we kind of know that too, and that's the whole point: we're influenced by this even knowing the consequences. Or, you pull to refresh your email. How many of you, right before you got here or right before the talk started, pulled to refresh your email? Probably did. Okay, great. And then how many of you have done it again less than a minute later? So why are we doing this? Think about this. Is this a choice? Is this a conscious choice? Is there a moment in your mind where you said, I want to check my email again? Or is something else happening? And what's going on here, in the case of my phone here, is that our phone is actually a slot machine. Slot machines in Las Vegas make more money than all of movies, all of baseball and all of the theme park industry combined. Just slot machines in casinos. Which is crazy, because they're played with coins. It's not like people are gambling hundred-dollar bills; they're playing with coins.
These small transactions with coins add up to this massive, massive industry. And it's because the slot machine operates on a principle of variable schedule rewards. It's one of the most persuasive tactics you could use on someone if you wanted to get them addicted: you just give them a reward sometimes and not other times, and it's very persuasive. And it happens when you wake up in the morning and you check your phone to see what you got. You check your email like a slot machine, to see what did I get. Sometimes I get invited to a conference and I get to come to Berkman, so I'm getting a variable schedule reward. I scroll a newsfeed, I'm playing the slot machine to see what's going to be the next thing that comes up in the newsfeed. Sometimes I get a really interesting thing, sometimes it's an ad, and actually that variability, sometimes something I like and sometimes something I don't, increases the persuasiveness. So this is very, very persuasive, and just to line it up: in the attention economy, you win the more you manipulate these instincts. So what I want to argue is that there's this race to the bottom of the brainstem to seduce people's psychology, and by the time you've raised $50 million for the ALS Association, that's the mechanism, the way you have to compete in the attention economy to ultimately achieve your original values, right? You all know what this is, the ice bucket challenge? The idea was you dump a bucket of ice on your head to raise money for ALS, and then you tag all the other famous people you know so that they have to do it. You've got, I think, Bill Gates and all these other famous people doing it. So why does this work? Well, this is just a classic race to the bottom of the brainstem; it's hitting all of these things. You have famous people covered in ice with that excruciating look on their face.
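The variable-schedule mechanic is simple enough to sketch in code. Below is a toy simulation, purely illustrative: the function names, the 1-in-5 payout ratio, and the fixed seed are my own assumptions, not anything from the talk. The point it demonstrates is that under a variable-ratio schedule the average payout is predictable, but the gap between rewards is not, and that unpredictability is what the slot machine (and the pull-to-refresh gesture) exploits.

```python
import random

def pull(rng, mean_ratio=5):
    """One 'pull' of a variable-ratio schedule: pays out with
    probability 1/mean_ratio, so the player can't predict which
    pull is rewarded, only that roughly 1 in mean_ratio are."""
    return rng.random() < 1.0 / mean_ratio

def simulate(pulls=1000, mean_ratio=5, seed=42):
    """Run many pulls and report (total rewards, shortest gap
    between rewards, longest gap). The total hovers near
    pulls/mean_ratio, but the gaps vary wildly from pull to pull."""
    rng = random.Random(seed)
    rewarded = [i for i in range(pulls) if pull(rng, mean_ratio)]
    gaps = [b - a for a, b in zip(rewarded, rewarded[1:])]
    return len(rewarded), min(gaps), max(gaps)
```

Running `simulate()` shows on the order of 200 rewards out of 1000 pulls, with some rewards arriving back to back and others after a long dry stretch; a fixed-ratio schedule with the same average payout would be far less compelling, which is the psychological asymmetry described above.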
It's a very persuasive thing, but should this be the way that nonprofits have to raise money? Should this be the way you have to win? Because at the end of the day, the race to the bottom of the brainstem results in this, and that's what I really want to draw attention to: as persuasion goes up, something else goes down. As the attention economy becomes this unbounded race to get people to spend more time and attention on things, there's an escalation of tactics. Imagine there's a door, and behind the door is the internet and the app economy, and behind that door there's this invisible gravitational force that's only been increasing for the last 10 years, getting more and more persuasive. The second you open the door to the internet, or you open the door to your phone, it's just going to be more and more persuasive over time, and as persuasion goes up, agency, or something about being human, goes down. And this is the different, indirect kind of power that Aldous Huxley was concerned about. There have always been these two visions of dystopian power. There's direct power, the oppressive, tyrannical power of George Orwell's 1984. And then there's a second form of power, invisible power: power people do not see, but which actually controls more of their behavior than the rest. There's a book called Amusing Ourselves to Death, written in 1985 by Neil Postman, which is fantastic and which I recommend reading, and he contrasts these two visions, saying that what the civil libertarians failed to take into account was man's almost infinite appetite for distractions. In other words, human nature is fundamentally limited. We are influenceable, and it really happens, it's a real thing, and unless we start from that position we're going to end up in some kind of future like that. So what do we do?
I want to make a quick contrast. With the future of AI ethics and runaway AI, we have this narrative that there's a handful of engineers at fewer than five tech companies, Google has an AI, Facebook has an AI, IBM has Watson, and they have this goal, which is to maximize revenue for some interest that they have, right? So there's a feeling of a perverse incentive, and the concern is that it becomes out of control and overpowers what we might want it to do. And then the blind spots: "we'll use AI for good" means that however the local group of individuals that runs that company defines the good will dictate how it gets used, and most likely it will be used for commercial purposes without that being transparently revealed. And then accountability: it's beholden to increasing the performance of the AI, not to ethics; ethics isn't built into the charter that controls what this thing is used for. But now take design ethics, the ethics of designers. We have a handful of twenty-to-thirty-year-old designers at fewer than five tech companies maximizing hooking people, maximizing engagement, and they're very, very good at it. Every single time you get distracted, there are a thousand people on the other side of the screen whose job was to get you to do what you just did. It becomes out of control and overpowers human interests, and the narrative is "we make the world more open and connected", right? It's not that that's not happening, but how does the narrative hide other truths that also might be there? And it's beholden to increasing engagement, not to the impact on people's lives. So how do you get around this? And by the way, the one on the left gets all the attention because in the future it might happen, while this one is affecting billions of people's attention every single day, every hour. So what do we do? Two companies are effectively the urban planners of the city that organizes the structure
of this problem, and those companies are Apple and Google, because they make the more or less neutral platforms, the iPhone and Android phones, that structure how things reach you and how you go out to reach applications on the internet. So what I want to argue is that there's an opportunity to change how these things work to better defend or protect human agency, which we defined earlier as the ability to make the choices you want to make, by changing these frames and structures. And second, they can actually restructure incentives: instead of app stores ranking things by the most downloads or the most sales, they could rank apps by the most benefit to people, as reported by the people themselves. And I often get the objection: don't Apple's and Google's business models conflict with bettering the situation for people, because don't they make money the more time people spend on their phones? In the case of Google, sort of, I mean yes on some level. In the case of Apple, sort of, meaning they have to provide a developer platform that lets app developers make money, so they basically need to tell every app developer: if you build an app on our platform, it's going to be good for getting lots of usage, and you can reach lots of users. And they're trying to compete on that. But if consumer demand were to shift and say, instead of more privacy or more shiny new phone features, we want something that protects human agency, they would have to respond. There's just no fire to put out right now. That's why I'm here: to create a fire that the companies have to respond to. So what would it mean to redesign these things to better support human agency? The basic design principle you'd redesign them by is to empower people's ability to self-reflect on the situation they're currently experiencing, and to then collect ideal preferences that would say
instead of this reflection or mirror of your behavior, what would you like it to look like, and then to adapt the design to fulfill those preferences. Meaning, imagine your phone shows you your top morning habits. It basically shows you: these are the things I see you doing in the morning with me, here's what I'm noticing in our relationship. And then you get to say: instead of those behaviors, I would prefer to look like this instead. And then it would adapt the design to fulfill and support your good. And I think this relates to something Doc talks about with conversation: this is in conversation not with your reptilian brain but with your deeper, reflective self, fulfilling you in some way. Effectively these phones are kind of like agency exoskeletons; they're things that should capture our values and be able to support the kinds of choices that we want to make. And I don't have to keep presenting; we can make this more of a conversation, we love talking, but there are some specific design solutions that could enable the kinds of things I'm talking about. So I can pause here and see where we'd like to go: I can walk through some of those design solutions, or we can open up for some questions right now. To get a sense of where people fall on that: you've laid out a nice spectrum of ways that you could enhance agency, and maybe we could talk for a minute, especially given the background you gave, about what it feels like right now. So you spent three years at Google, because everyone at Google thought this was interesting and they were asking you to follow up on this and propose ways to make things better. How did that play out? What was that like? So, I couldn't get a lot of the design changes through Google, because they were basically a departure from, literally, I mean, think about hundreds of people who are plowing
ahead on roadmaps that are predefined based on the competitive dynamics of the current market, right? That's dictating what we're going to design next, what's going to go into the next release, and there is no public narrative that this is the thing that everybody wants. So what I found was a lot of spiritual support, yes, this is a problem, we should try to be careful with this, and then a limit to what that would translate into in terms of prioritization. Keep in mind, Google is a very engineering-driven company, and the engineers, I think, misinterpreted a lot of what we're talking about as there simply being too many notifications, and we'll do a lot better by just reducing notifications. What I'm hopefully communicating here is that this is so much deeper than that. This is about whether your phone is in any way actually helping you live the life that you care about, and how you would do that. I'm wondering whether it's just that human beings need a catastrophe to wake us up. Do you think that's necessary, and how long do you think it's going to take? Because otherwise, we're feeling good, unless we each feel it ourselves. I'm surprised that the younger generation is so open to "sure, I'll share this, I'll share this information about my phone, a number, or whatever." That's not what I grew up with. How long do we need to wait, or do you think we need to create that? My experience of this is that people do see this as a problem. How many people basically agree with this? How many people disagree, who think the distraction is not a problem? Can I just put a theory forward? It's not my own. Marriage therapists talk about something called bids for connection. I think some people regard this as a self-care machine; particularly as you age, you need to connect, and so this becomes the way you give and receive bids for connection. I understand that as well. Another line
perhaps. I agree with you about the need for self-reflection so you can live the life you want, but you could have a lot of people who say they do not want to reflect; they just want to take what is there, available for them. So what you are doing, actually, is manipulating them into self-reflection, if you design in a way that pushes them to do that. And that goes a little bit against your fundamental principle. I just want to know your thoughts on that. I'm so glad you're bringing this up. I'm sure you're all feeling a kind of tension rising around "who are you to say?" Don't rush to answer that, and we'll get to it. In fact, what you just said was one of the other objections that I heard at Google. There's actually a whole new discipline here. I think we're entering a new era of understanding human beings as being influenceable. We're not just influenced sometimes; actually, in this room right now, your self-experience, where your attention is going, has been pre-designed. The thoughts you're going to have are very predictable, given where we are right now and what is happening in the room. And so, I call it, I don't know whether it's Schrodinger or Heisenberg morality: people don't have a problem when our attention is steered unconsciously by nature in its chaotic form, but as soon as you assign a direction and say "this is the good," people freak out and feel, well, who are you to say what's good for people? But, and this is the thing that happened at Google, which is driven by very libertarian cultural values: if you really care about libertarian cultural values, about being able to make your own choices, then wouldn't you want a superstructure that actually enhances that ability? And what we're talking about here is not imposing self-reflection on everybody. Right now there's actually no way, for the many of us who feel the weight of this problem, and for our children, who will not even know anything about this problem and will simply be subject to it, to enable
them with that choice. So that's one thing I want to say. Go ahead, Tom. Several things here, quickly. One is that the internet we have now turned 21 years old on April 30th, because that's when the internet went commercial: the NSF rule that shut out commercial activity was lifted, and browsers, email, the protocols we use were pretty much finalized at that point, and we've been iterating on that ever since. But that's only one generation; it's not very long. We've had 10,000 years to work out the subtleties of, for example, privacy. We are all wearing privacy technology; we don't have that online yet. So, toward your earlier question: survey after survey shows that over 90% of people will lie or hide in order to avoid the all-seeing eyes of those that are surveilling us. A colleague of ours here, Shoshana Zuboff, and I highly recommend reading her stuff, writes on a subject she calls surveillance capitalism. She feels that we've moved into an economic era where being spied upon by forces greater than ourselves is a basis of the way we do business, and the degree to which Google and Facebook are embedded in our lives puts us in this surveillance state. And it is a net loss of agency. My definition of agency, one of a number of definitions, comes from the Latin verb agere, which means "to do": simply, to act with full effect in the world. It's what we have with our hands and our voices, and we have it to some extent with these things. There is an interesting divergence, toward your design concern here, between Apple and Google right now on health. Apple has staked out health as an area they care about, a very long-term game that's based on individual privacy, and they've gone out of their way to say that there are zones in that phone that are yours and not ours, that your data ought to be yours, and so forth. And I think there's agency
around privacy. Privacy falls into the category of an intuitive moral trigger, something people feel intuitively inclined to want to protect. But the invisible ways that people's choices are shaped outside their awareness, as in the magic trick at the beginning, that's not a kind of agency people would know to ask for or defend, because they didn't have it in the first place. Yeah. In terms of agency, can you make an objective statement about how much agency we have now versus, say, a hundred years ago? Say I was living in a village of two thousand people. I don't have much agency. I cannot do anything I want; I am subject to social norms and laws, and people gossip, and all my actions are observable, and so forth. If I flouted the norm I would be immediately judged, and everybody would know. So I also lived in an all-seeing society, a society of a thousand eyes, in many ways. Now I live in a much more anonymous society where I have individual freedom, and the phone is giving me tools that allow me to explore hobbies and ideas far beyond that. So what is the trade-off? We've clearly gained some agency in many ways; we've lost some agency in others. So this is the point in the conversation where I think the human mind gets tripped up: there are some goods, there are some bads, it's complicated, let's just leave it and let things be, right? I see this all the time. There are goods and bads, but I think all we can ask for is simply identifying harms and saying, let's reduce the failure modes, let's reduce harms. There are some very clear areas where moment-to-moment agency around our time, our attention and the kind of menu that our phone gives us is not serving us, in ways that I think people will self-report. So can we simply minimize that loss of agency there? But, Peevie?
Zach, when you said the internet is 21 years old, that's true, except that for the vast, vast majority of people it's not; it's only a few years old, and it's only ever been on their phones, right? For the vast majority of people in this world, their experience of that device is of something that gives them Facebook and games, and maybe a few other things, or maybe not even quite that. So I think what's missing for me is the question of what should be there. Because of course your phone gives you a hit of pleasure, of course that's why we all use it, of course it's addictive, and of course if you're working a 14-hour shift at a crap job you want that hit of pleasure, of course you want to be distracted. Most people's experience is not checking their email when they're going to a big conference; most people's experience is playing a game to distract them from their day. So what's the point? What do we make it instead? I think that's the open question. That's a great question, and I'm not going to answer it; I think it's a great thing to pose. I just want to make an additional point, or certainly amplify one of those points. What the internet does, and the phone makes it extra personal, is, by design, reduce to zero the distance between everybody and everything, at close to no cost. And this is new to human experience; we haven't had this before. In the village example you gave, everybody lived in a circumstance where they were under their Dunbar number; Dunbar's number is, you know, 150 people or so, the most you can know. We not only live in a society where we depend on anonymity as a social grace as we go about, you don't want to be known by everyone in Harvard Square, it would be inconvenient and weird, but on the other hand we are terribly known to forces that we don't even know. We just have a lot to work out, really. Hi. Yeah, I think it's also useful to
distinguish between connections, which can be interruptions, that are reasonably straightforward, between you and people you intended to have some connection with, or things you are looking for information from, and the real balance-of-power issue: that a tremendous number of the things trying to get your attention are doing so in some fairly deceptive way. An example is things that pretend to be your friend, or to be another person, but are actually a front for a company whose entire purpose is to get you to buy things. I think there's a huge commercial element to this. When you said the internet is 21 years old, well, the internet is much older than that, but that's the commercial internet, and before those 21 years most people didn't think the internet was going to turn into a vast advertising medium; that just wasn't an issue people thought that much about. And it has. So if you look at increasingly intimate devices like a phone, becoming more and more commercial, and more and more stealthy in that commercialism, but continuously trying to get your attention, I think that's where a lot of the problem is. It's not the existence of interruptions or things like that; it's interruptions on which billions of dollars have been spent figuring out human psychology, in order to manipulate you into doing things that are to the benefit of someone else's commercialism. The whole point of opening this up at the beginning with a magic trick, the ground metaphor I was trying to create, was that the number of hidden, covert influences shaping what you're thinking, doing and believing is so much greater than we are consciously aware of. I want to get to some questions, but I also wanted to show a couple of concrete examples, so maybe I could do a couple more first and then we can get to our
questions and conversation. So, in terms of how you would redesign this: I'm claiming we can redesign this, we can make it better with design. Do I mean there's some path we can take? Is this just human-centered design like we've done IDEO-style, or is there something different here? What I want to introduce is a different lens on what design could be about. You can imagine, almost like a heads-up display, that there's essentially a timeline of reality for a human being using a phone, and we would like to design that reality in a way that aligns with what would be the ideal timeline for that person. So imagine that that timeline includes someone's thoughts and concerns, their behaviors, and their feelings and emotions. There are many more things you could draw, but I'm simplifying for now. I also want to draw attention to the top right, the thinking frame: there's a frame, a way of seeing yourself, seeing obligations, seeing people, that's invisible, and we'll get to that, but I want to flag it as part of what the agency, reality-construct level is. And there's also the amount of energy that gets expended to live out that ideal timeline. I want to make a distinction: if I'm using Facebook and I know that they're trying to get me to scroll forever, in one case I have to expend a lot of energy to say, okay, I don't want this to get me to scroll so much. The reason there's a harm there is that I have to expend a lot of energy to live out my ideal timeline. So what would this look like if we used this model to redesign a screen that we all use? Imagine you were almost pricing agency pollution, like a CO2 emission or something like that. What you'd be pricing is the measure of divergence between the ideal timeline that I would have, given my reasons, the thinking frame I would ideally want to have, the reflection that I'd be able to see, and then also how much energy I had to
expend to get the timeline of behavior, thoughts and experiences that I wanted. This might sound a little bit utopian and out there, but just walk with me for a little bit. So we're going to go through waking up. A billion people wake up in the morning, and the first thing they do, 80% of people report, is checking their phone. How many of you guys wake up and check your phone? Okay, everybody. I also want to credit Joe Edelman, my collaborator in Berlin; this example and this design methodology come from him. So, do we want a world where a billion people have lousy mornings? I say this because, when you look at the data, people don't feel so good about checking their phone the very first thing in the morning. So let's redesign it. Imagine there's this person, Jen, and we asked her: what would be time well spent for you, what would be your ideal timeline waking up in the morning, if we're basically honoring your agency and your values? And I want to make a distinction: not aspirationally, not unrealistic things she would never do, but an actual morning. She might say: I want to wake up, maybe get a couple minutes of fresh air, maybe do 5 minutes of yoga (she's not someone who's never done yoga before; she's just trying to start a habit), maybe do 5 minutes of journaling thinking about my day, and then go off to work. Something pretty simple and reasonable, okay? So what happens is that this is the screen she sees. What I'm drawing attention to is just how aligned, how empowering, is this to that timeline? What specifically is wrong about this? Let's quickly dissect it for a second. What does this make easy? If you think about this as a menu, you ask: what's on the menu? Currently on the menu, it's Facebook. So what is this putting on the menu for a person who turns their phone over, and how is it programming their mind to think about their life in a certain way? Let's just quickly understand it. What it seems to
make easy: well, it has these timestamps on the notifications, so one thing you could say it makes easy is checking when things happened. Another activity it makes easy is that each of these notifications goes to an app, so another way to think about it is that it's never been easier, first thing in the morning, to jump quickly to apps. It's never been easier to see discussions going on without you first thing in the morning. It's never been easier to see new content releases first thing. I'm just naming how empowering this is for how this person would ideally like to wake up. Okay. And then what kind of relationships does this encourage? How does it encourage her to see her relationships? We've got this sort of endorsement relationship: they liked your photo, they added you as a friend, they followed you. That's a way of seeing my relationships as people who endorse me, in a sort of shallow sense. There are these kind of chatty relationships about indirect objects, these kind of naggy relationships: this app is telling her to do something and plan her day. And then overall, what's the thinking style or frame? If you think of it as a gear shift for her mind, what mode is it putting her mind into? It might be something like: I'm way behind, there's a lot going on. So here's this fresh human being, coming up in the world, waking up, and her mind could literally have anything in it, and we're programming it with this. So what if the screen made room for what mattered to Jen? Instead of those activities, what would be her ideal activities? If we think back to that timeline, she wants to wake up, get some air, do yoga, journal, go to work. So maybe the two activities that might be more empowering for her would be to make a decision to get out of bed, to actually get up, and also to see that she has nothing but time until her first meeting, that she has plenty of time for her routines. How could you do that? What if maybe
there's this kind of bell, and when I ring it, it asks: which of your morning habits do you want to do today? And it gives you the sense that there's nothing but time between now and your first meeting. Then, instead of those activities and those kinds of relationships, what might be the most empowering relationships she could have for her timeline of yoga and journaling? You might say nothing, or you could say maybe companions: relationships, like buddies, that will help her do her activities. Someone who's up for yoga over Skype, right now. And why would that happen? Because basically, when people ring the bell, it asks which of your habits you want to do today, if any, and if you said yoga, you'd just get popped into this yoga group. So two people who wake up around the same time and know each other could automatically get matched into this group, and boom, there you go, you're off to the races. That's empowering. And then which thinking frames? Instead of "I'm way behind, there's a lot going on," what if it's "anything is possible"? You're a human being, you have full agency: what do you want your day to look like? There are lots of ways to interpret that, but you can imagine something that asks what's important to you today, and it might propose something more aligned with your timeline. So this is a deep redesign. I hope you see this is not human-centered design in the sense of where we put the buttons; it's asking what the most empowering way is for her to see her life and her day, and how the phone could support that. I just wanted to offer that first. There are many more examples, but I wanted to introduce it even though I haven't named some of the design principles yet. We had some questions; I can go over there, I don't know if they're still there.

So this is actually a social dilemma. It seems like there are not things as
redesigned and restructured as these, but my understanding is that there are some things that help with time management, for example. So do we have any sense of whether it's actually bad for Google's bottom line to give this as an option, given that there seems to be a market out there for these kinds of higher-level organizational choices?

Market meaning that people do download time management apps, is that what you're saying? I thought you were saying a different thing, which is: isn't this against Apple's and Google's interest?

So, how is it: I assume, given that they didn't want to take you up on it, the first assumption is that it's against their economic interest. Now there's this other piece of evidence, that people actually are willing to engage with things that help them combat these kinds of nefarious ambitions. How much does that bear on whether it is a social dilemma for the company or not?

There are two more seats over here, if you're standing in the doorway and want to come around, a couple of seats over here. So we're going to embarrass you into coming to a seat on the other side.

Two answers. So why, given that this seems like one reasonable choice that might be a little bit more empowering for people, wouldn't someone have built this? One of the things I want to draw our collective attention to, in terms of our ability to have things like this: Apple, for example, locks down the home screen, so there's no marketplace. If you have another idea about how you'd prefer your lock screen to work, you have zero agency, zero freedom, zero ability to do that. Now, when I was at Google, we tried; I was trying to create something that was not exactly like this, but a different lock screen alternative, and even that's hard with Android, because the code is very complicated, and there are the OEM contracts with Samsung and things like that. OEMs like Samsung and all the other manufacturers will customize Android with their own build, so even if you did something at the
Android level, these other manufacturers will kind of co-opt it. So there are a bunch of reasons why people don't have the agency to even have things like this, and that's one of the political actions to come out of this whole conversation: that should be a freedom you have. You should have a right to decide how your lock screen organizes and transmits things to you.

Yeah, I guess a couple of thoughts. One is that the notifications thing is an option. I have my notifications turned off, so I don't get notifications from all the things I don't care about, like that person, Jen, did. You have to go in and do it by choice, but it helps a little bit. But my other thought is something I've thought about for a while, which is the way TiVo works for television, being able to skip through commercials, and whether there could be a hack designed to likewise skip advertising and other types of notifications. TV channels probably don't like that you can buy a TiVo, connect it to your TV, and then skip through all the commercials, but couldn't something similar be developed for all sorts of online advertising, to basically shield you from having to be targeted by it?

So, if we model this problem: we have a persuasive attention economy, and then this one line of defense, which is the choices that people have and happen to make so far. And then there's agency. Doc has written about ad blockers as another form of this: what right do you have to install an ad blocker? It's someone saying, I should be able to install an ad blocker even if the internet business model depends on my attention being monetized. So this is a conversation about whether agency is something you pay for, or whether it should just be something you have a right to. And the thing is, it's been a slow creep, from like the 1950s with marketing and advertising. It's like, what's the big harm of a billboard? But when you suddenly make it
a moment-to-moment thing that mediates your moment-to-moment thoughts, the weight of it has bigger implications.

Several things. One is, none of us is going to have the same phone in three years; we tend to have them for 18 months, and the next phone is probably going to come from Apple or Google, so we're sort of in that world right now. But I think there are choices. I love this exercise and I think it's a really good way of looking at it. I also think there are other companies with agency besides these two, and besides Facebook in addition to that, and there's a lot more we can do on our side to assert our agency in the world. This system that we have right now is totally ripe for disruption, and there's a major groundswell right now: a general dissatisfaction with loss of privacy and with being surveilled, even if people are only taking casual means to address it. And besides Apple and Google, what other choices do people have? There's CyanogenMod. Cyanogen is a small open-source thing, but Linux was small and open source at one point, and now we're all using it. There's nothing to say that this is the only thing we're going to use. I mean, even what Facebook just invented with the chat bot is really basically just a conduit between any two endpoints that uses a messaging protocol, and that can actually facilitate a relationship of some sort. You're still doing it in Facebook's closed system, but it doesn't necessarily have to have the encumbrances of persuasion at all times.

So I push back a little bit on that. It's sort of like saying, well, if you want an alternative, you can drive 50 miles to a phone booth where you can call between the hours of 8 a.m. and 5 p.m.
to cancel your subscription. I mean, there are alternatives, but if a person has, let's say, invested all of their photos and videos in Facebook for 10 years, and Facebook doesn't give you a way to get that out, then there's no way to build up an equivalent social network: your connections are managed inside the thing you've already invested so much time in. In other words, there are lock-in effects. And the same thing happens with AI, where you have a sort of hegemonic power, where you aggregate so much power in one entity. It's not just a thing you use and can choose not to use; we have to acknowledge lock-in effects and network effects. You can't just say "use a different social network," because you'd have to have everybody switch to it at once.

And I think it's also quite fundamental to the way our society is structured at this point. If you look at something like Instagram, let's just say it was a totally neutral platform, forget that it's owned by Facebook, completely neutral, this wonderful photo and information sharing piece. But look at the experience of, say, a creative teenage girl who uses it. She starts by connecting with her friends, and she puts her photographs up, and then she gets involved in this art community, and people are putting up these great photos, and they're really stylish, and they're doing all these photo shoots. But it turns out that like 80% of those are not actually people; they're clothing companies pretending to be another teenage girl, but with these very high-end photo shoots and these great clothes. So you have this very competitive environment where you're constantly looking at things, and a little bit of it is real connection with other people, but you're primarily caught up in something that looks like a very interesting network of creative people, a lot of which is basically meant to persuade you to buy more clothing. So I don't think you can look just at the platform or at
what the screen is when you log in. I don't think there's that type of quick fix to this, because a lot of it comes down to the level of self-discipline. You could also have a nice poster in your kitchen that says: here are all the things I could do first, and then at 9 a.m. I will get my phone. There are a lot of things you can do like that, but there's a pervasive cultural element that is a lot harder to change. And can we create things that instrument culture better? Meaning, for example, there are workplaces... I'll give an example in a second, but if you have a question, hold your hand up.

Solving the problem from a design perspective might be useful, but I don't think it will solve the basis of the problem, especially from a privacy perspective. I mean, you're feeding your daily morning to Google or to Apple directly, so they have very accurate data about what you're doing, even more so with all of this data in one block. And at the end of the day, you're still using apps; you're still running your daily routine with the programs you've been using until now to manage your mornings. So it may be a comfort, and a visual, and maybe it may be even more misleading, because the data you're sharing, the whole economy, is just the same. Maybe a second point, about alternatives: well, there is an alternative. On those phones you have maybe two apps competing for your attention there.

What two apps?
I don't know, there's not a big app landscape there, so I never have this sort of problem when I wake up in the morning.

I want to acknowledge something you're saying, which is that the premise of the example I put up at the beginning, empowering self-reflection, capturing ideal preferences, adapting design to live out those preferences, involves a new social agreement. People are saying: if I want my environment to be more supportive of the life I want to live, I need to be able to give more information, to cooperate with it and for it to cooperate with me. So it involves this kind of design, this exchange, whether it's private companies or decentralized, blockchain versions of it. Would it be better to have a decentralized version? 100%. I'm coming at this mostly from the design side; I'm not personally focused on privacy. So I want to give another example that might get at this point, which is social agency. Part of this is about matchmaking and coordinating among multiple agents who have different ideal timelines. Someone might want to know, hey, are you still on the way to this meeting, or, hey, I want to fire over this document, and the other person might be focusing for a while and not want to get interrupted. So how do we coordinate the ideal timelines of multiple people, and how would that be solved by design? Should it be solved by design? I'm going to give you one example, and I give this in my TED talk. Let's say you have two people: Nancy on the left and John on the right. John remembers: I need to ask Nancy for that document before I forget; I have this thought and I need to get it off my mind right now. So what does he do? He fires off this message, and it completely bulldozes Nancy's attention. And every single time that happens, there's actually a cost of about 23 minutes before we get back to the original task we were doing. This is knowledge work specifically; Gloria Mark's research at UC Irvine demonstrates this.
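To make that figure concrete, here is a back-of-envelope sketch. The arithmetic is mine, not the talk's, and it leans on two assumptions: every interruption costs a full 23-minute refocus period, and the costs don't overlap.

```python
# Back-of-envelope cost of interruptions in knowledge work, using the
# ~23-minute refocus figure from Gloria Mark's research.
# Assumptions (mine, not the talk's): each interruption costs one full
# refocus period, and interruptions are spaced far enough apart not to overlap.

REFOCUS_MINUTES = 23

def refocus_cost(interruptions: int, refocus_minutes: int = REFOCUS_MINUTES) -> int:
    """Total minutes spent getting back to the original task."""
    return interruptions * refocus_minutes

for n in (2, 4, 8):
    print(f"{n} interruptions -> {refocus_cost(n)} minutes of refocusing")
```

On these assumptions, eight "quick" pings cost roughly three hours of refocusing across a day, which is why a coordination design might treat each interruption as something worth pricing.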
Her other research shows that if you look at how many interruptions people get in one hour, then in the second hour, even if they don't get interrupted, their rate of self-interruption has gone up. Meaning, if you get a lot of interruptions for one hour, in the second hour you will self-interrupt more frequently; there's sort of an increase in the clock rate of self-interruption. We all experience this: we can't read books anymore, because our attention is flying all over the place. So there are externalities to the mind, in how much our own internal thought timelines are shifting and switching. We might call that a less ideal timeline. So how would you fix this? Let's imagine we redesign this with a coordination infrastructure that captures two people's ideal timelines. Let's say this woman, Nancy, on the left, wants to focus for two hours. Say in the basic, built-in part of her computer there's a way she can say: I want to focus. How long? Two hours. Okay, great, and it puts up this focus mode for two hours. Now John has the exact same need. So what does he do? He goes to send the message. Now you might think, hold on a second, this is just like an away message: he sees that she's focused, but that doesn't change the fact that he still needs to get the thought off his mind right now, before he forgets. The cool thing here is that he can still send that message. It's like a stoplight; right now it's as if we don't have stoplights in our city and we're all just going, right? Instead, it holds the messages, and when Nancy goes unfocused, she'll get the message. So both parties get to live out their ideal timeline, with a coordination signal mediated by a better urban planner, which in this case is Apple or Google. Now you're probably thinking, hold on a second, I would never trust this, because I'd be worried: what if I miss something important, and I'm Nancy? So the point here is that the person can always interrupt, can always escalate a message, but for this two-hour period we're flipping the world
upside down: now you can only interrupt if it's a conscious choice, as opposed to an accidental or mindless one. You can imagine this as a design principle. It might seem like a single feature, but it's actually a design principle where a whole company could say: after Friday at 6 p.m., no one's messages get delivered until 6 p.m. on Sunday, or whatever they want to collectively agree would be the new social contract. That would be a way of instrumenting work culture so that there's a shared ideal timeline, created in a reflective context, and where these messages could still be escalated; you could still interrupt someone, and there would be a clear path for it, but you need the urban planners to better support this. I wanted to introduce that example because it's another powerful one, and it's what dating apps actually do: you're capturing two people's preferences, and you're making it safe to express your actual preference without offending anyone.

Given the time limit: you've been describing a lot of these interventions as design decisions that the providers of the operating system or the tool chain can make. Are there other levers that consumers and users can act on?

I think the options are really limited right now. I mean, there are so many extensions that help people. For example, there's one that blocks the newsfeed: when you get to Facebook you still get Facebook, but it blocks the newsfeed part of it. There's a separate extension that de-metrifies the red notification badges and takes away the number of likes, because that itself creates its own reward system. But each of these things is piecemeal, and so much of the onus is on you to install this whole collection of things. It's the whole "I have to drive 50 miles away and buy a bunch of stuff from these random markets to add to my web browser so I can kind of live the life I want with the internet." And if
there were more of a movement that said, we need Apple and Google to better support this, and here's a whole list of demands, I think that would be more empowering.

But, sorry, let me just put this out there, because this has been my work for the last 10 years. Imagine: every time this operating system gets updated, you get a thing that says, take a look at our 55-page document you won't read, and click accept. This is the world we've been living in for 150 years. What if Apple had to agree to your terms? You'd say: these are my privacy policies, these are my concerns, this is what I want you to agree to. This is within sight; we're working on it. We spun off an organization called Customer Commons. Just this last week we worked on one term, and the term says: I'll turn off the ad blocking if you just show me ads that aren't based on tracking me. That has far more teeth than Do Not Track had. Do Not Track, by the way, was an idea that came up at this table in 2007. And this is not something that happens inside Apple's system. Apple will not reform itself that way; Google will not reform itself that way. We need tools that give us ways to prove that free customers are more valuable than captive ones. Capturing customers has been part of business as usual. Listen to the language of business: you're not people, you're targets that they acquire and manage and control and lock in, as if you're slaves or cattle. That's how it is within business, and it has been for a long time. And we're living this out in a way where a powerful company like Apple or Google can leverage the internet to have even more control over people than they ever had before, in all the ways you eloquently put up front, which made it even scarier than I thought, and I thought it was pretty scary. But there are ways where we have some leverage. We have it right now, for example, with ad blocking, and ad blockers are one of the kinds of tools we've encouraged with ProjectVRM here. So my point
basically is, as Jay said, there are points of leverage that we have that are external to the Apples and Googles of the world but can give us something, and I think we just need to unify all of these things. It's really an agency movement. Privacy is agency; there's agency around your thoughts, agency around your behavior. You just need to collect all that together. And there are a few more questions.

One thing I'm hearing here: a long-ago takeaway from Brenda Laurel's Computers as Theatre was basically, design the action. Right? Computers as Theatre came out of game design, and now we have interactions instead of games. So you anticipate and influence what the user will do next, in part because you give them a limited set of options, and, like your magic trick, you want to make it work in certain ways and constrain things in various ways. And yet in this proposal for reflection you're not clearly offering us, the users, the ability to collaborate in the affordances that we might have, which would be useful. It's one thing to offer us the possibility of reflection; it's a different thing to insist upon it. Where's the insistence?
Well, if this new model becomes the new paradigm, and the only one, then you're just exchanging one for the other, as opposed to giving us our own agency in how we want to control our world.

But that's true of the world, right? If I'm unhappy with anything in my environment, I can't just change it. Right now there are private interests governing all of our environmental, situated decisions, so this is just bringing attention to that. Again, right now it's unconsciously designed; it's not like the Apple and Google designers want you to be interrupted all the time. Actually, I will point out that Facebook does want you to be interrupted all the time, because an interruptive service will outcompete a messaging service that's asynchronous. That's another part of the race to the bottom of the brain stem: if one messaging service uses that tactic, the others have to adopt it. And the only way to get out of that game is to go one level up, to the controllers who say what you can and can't do on the platform, and to make that a fairer, more even playing field for what would be good for people.

And Facebook used to give a lot more options for control. I mean, there are certain things I would love to tell Facebook: no, I never want to see that again, I never want to see this type of thing again. They never seem to hear that signal from me, and maybe the explanation of intermittent variable rewards explains their motivation not to fix that.

Exactly. I just want to make sure you got your question in.

Oh, thank you. My question keeps shifting as this conversation continues, which is nice, but I know you spoke to this a little bit: I wonder if you could say a little more about how you feel Google and Apple would respond to consumer demand shifting and instead demanding that more agency be given back to the user. Because for me, as a designer, the central project constantly is negotiating on behalf of user interests with business interests, and it's just very hard
for me to imagine any response to user demand for more agency over attention being anything other than just "no." That means the business makes less money. If I make software and I give you more agency over whether you use it in any given moment, you're going to use it less. And if I make hardware and I give you more agency over whether you're using any particular software at any given moment, you're going to want to use your hardware less.

Let's go to that for just one second, because there's a common misconception here. You're absolutely right when it comes to most businesses in the attention economy: a meditation app needs to hook people into a habit; The New York Times needs to hook people into reading; an addictive game needs to hook people; Facebook needs to hook people. A level up from that is Apple, and Apple is happy so long as people spend what ultimately ends up being, whatever, $800 on a smartphone every two years. That's the number one thing that makes Apple happy. Whether you use it all the time, or you use it in a way that perfectly aligns with your ideal life and the timeline you would live out, they would actually be indifferent to that.

Don't you think that if you rely on it a little bit less, you're going to want a new one less often?

That might be true, and that's an important consideration. I think that's actually already happening with Apple, and it's actually why their earnings just went down. But you had a question.

Can you talk about ways of measuring agency?

That's one of the things I want to pose as a research project, because what's missing here is a way to price or name or cost out that misalignment: how divergent is the influence or persuasion that I'm receiving? And there are some other design patterns I could go into a little bit, so I can get to that.

But more, I guess, what I'm thinking about is how good focus is as a proxy for attention. Your ability to focus declines during the day
, almost like a muscle that's being worked a lot, and we can measure that, at least to some extent. And your ability to resist notifications is pretty much in line with your ability to focus, right?

Right, and that's why, in terms of setting a north star, the whole point here was to live out the ideal timeline, the agency, the thoughts I'd want to have. There are many things inside the timeline I'm not going into; for example, am I inside of that thought timeline, or am I switching between thoughts a billion times? That's not the kind of thinking timeline I would want. And then, to live out that timeline, how much cognitive energy is required to resist my environment as it's designed for me? We're trying to reduce the friction between how the environment is designed and my finite conscious energy. Just like you said, it's a muscle, it's a battery; ego depletion is a real thing. We have a finite amount of vigilant energy to resist the world as it is, so how can we save that energy for the conscious choices that are most useful or most important to me? That would be one of the measures, and this gets to his question as well: what's the cost, the divergence, between the two timelines?

Have you looked at patterns of abusive relationships, and the reasons that compel people to stay in them, as a possible corollary to the relationships we have with our phones? I do feel like, as much as people resent the distraction, it's also giving them something. As much as you can say "I wouldn't want that," I actually think maybe that kind of distraction, that kind of battle, is one that people would rather have than to be faced with existential questions about their purpose in the world and that sort of thing, which they might otherwise have time to reflect on.

It's a fantastic question, and I think what you're bringing up is people feeling the need to
distract themselves all the time. This narrative comes up all the time: what if people like slot machines, what if people like to distract themselves? This idea that, oh, they actually like this, this is what they like; people like heroin, they like these different things. Is that the right way to interpret the behavior? For those of you who do meditation or something like that: if you get still, in the moment before something happens you can identify that it wasn't that what I was desiring was to be distracted; there's a surge of anxiety that needs to get put somewhere, and there's a conditioning where the more we do that behavior when we feel a surge of anxiety, the more that's what happens next time. But would the need be "I need Facebook," or is the need "I need something that helps deal with that feeling that just came up"? And if we didn't have these things, as ten years ago we didn't, we had other ways of coping.

There are a lot of questions. She had one first, sorry.

It's related to the last question. The proposal that you're making is about empowering self-choice, so in some way it reinforces the idea of an echo chamber, because you will be selecting a certain type of content, and you will be in some way enclosed in that selection that you make, leaving outside all the fantastic ability that the internet and online applications have to give you access to a whole new world. Maybe then you're not constantly touching reality. So, what about that?

Super important, right? What if people are just designing the bubble of their life out of their own current values and agency, and that's limiting them from things they actually get benefits from, like landing accidentally on a website? There are all sorts of positive things that happen when we get distracted. Absolutely no argument there. I would also say a couple things on it. One is: if you think of this
exoskeleton not as an exoskeleton just for you but as an exoskeleton for like all human values that we care about so I get to toss in kind of one part is how I want to live my life in relationship to other people and to the world and then other people get to put in their their values about how they want to relate to other people and that's how you get that focus match kind of alignment things but then for example with facebook like is there some way that we sort of say collect like there's some delivery process for facebook newsfeed which is currently a trade secret like not available for public review about how it biases which newsfeed stories get weighted we also don't have agency there and currently what they're doing is actually reinforcing that bubble as we all know filter bubble etc etc so I amplify what you're saying and saying there needs to be an accountability loop that makes more of that transparent and more of that updateable rewrite about device interoperability and differentiation between signals I mean I could have one work phone I could have one home phone and the distractions I have from each one of them is different and I might give different values to each of these and when you combine it for example having a desktop computer you'd have a whole universe of different signals that you might have to manage and give hierarchy so people are also self contradictory and we have competing values and there's a million types of these kinds of things but I mean I think the main point here is that there's some failure modes of agency right now that are addressable with some simple fixes and there's no current way to get that from our environment and I think we can all agree that there's some failure modes or some harms that could be reduced if we were to push on them what comes next yeah how do we get our time back you've convinced me that I have less time than I thought I did an hour ago you're an hour short now so let's do places like this I mean I think 
of this like the organic movement. Before organic, people didn't even question the food they ate. They didn't know there was a difference — that some foods affect you this way and others that way — and they didn't have a choice in the matter; they just bought whatever showed up on the supermarket shelf, just like we use whatever is in our pocket. The organic movement was different stakeholders shifting what was necessary to transition to something new. Farmers had to create a new community of practice and define standards for what it meant to grow something a certain way. It's the same here: designers need a new conversation, a new paradigm for thinking through what it means to design for agency, or for the kind of social fabric people want to have. That's a new community of practice — getting people together, talking about it, giving examples, naming design patterns, training trainers, all of that. Part of it is consumers: consumers have to be educated that not all food is created equal, and that they might want the stuff with the sticker on it. And then there's government, in naming that there are some human values that need to be pushed a little harder into the marketplace. I don't believe government gets the regulation and criteria-setting right, but I think it plays a role in creating urgency for companies to respond more rapidly. So it's the same thing here: we need users demanding this as a new category — agency, a shared concern about agency. We have privacy and AI in the media every single week; we have nothing about this, which is affecting literally a billion people's thoughts and behaviors all the time. And then it's a new metric: we need new shelf space in app stores and in browsers that helps surface the things designed to some standard of public good. Any questions?

I think there's a lot of parallel between this conversation and the future-of-work conversation happening around the on-demand economy and what workers are facing. I represent domestic workers in this country and have been very active in the national conversation about the future of work and similar issues. Two questions I'd have for you. One thing I've been thinking a lot about is this idea of algorithmic transparency — whether policy efforts should be not about determining what the algorithms should be, but about having some visibility; we're seeing this with workers not really understanding what gets them on and off platforms, and so on. And then the elephant in the room, I think, is the question of power, and whether the dilemmas you're addressing can be addressed without tackling investors and capitalism.

Yeah — particularly on that last one. When you marry an unbounded desire for growth with a finite resource, you get problems; that's capitalism. In this case, the resource being captured — the thing people want to grow every quarter, financially — is attention. The current model of advertising, to name it specifically, is time-spent-based advertising. The problem isn't the rectangle; the attention and distraction problem is not the ad rectangle itself. The problem is the unbounded desire to take more of your time, combined with the power to persuade you to spend it — those two things together. Now, I'm sidestepping your question a little by naming that. Could companies make more money? Let's jump to that. Currently — and I didn't have time to make this pretty — how much do you think Facebook makes on the average user per year? Take a guess: $5? $25? It
turns out it's about $7. It totally depends on the country, but averaged across all users it's — the last time I looked — somewhere between $6 and $7 per year. Okay: how many people in this room would pay — and this is an unfair question, because everyone in this room is worth far more to Facebook than the average individual — but would you pay, say, $6 a month to Facebook for a version that was about helping you spend your time according to your values? Some people are saying no. This is an interesting room, too — this is sort of the decentralization crowd. I think some people might. Now, I don't want that to skirt another big issue that comes up, which is: hold on a second, only some people can afford that, and that's unfair. But guess what: the default settings of the world are not good for people. Credit cards — name an industry, and the default settings, unless you maintain vigilance, are really not built to be best for people. You can only live in this world with a kind of vigilance, and there's a lot of writing about the extra cognitive cost people face during the day; I'm sure you know this narrative. So I raise this as a public concern — it's the same issue. Do we want a world where the only way to live the good life is to pay for it, or a world where the good life is available as a social good? Michael Sandel here at Harvard Law School has written a great book, What Money Can't Buy: The Moral Limits of Markets, and it's all about that: how much of the good life must people pay for, versus it simply being available as a public good?

Now, there's a problem that can emerge if too much of your current economy, as it exists, depends on the kind of predatory, persuasive, bad-for-people world. If, say, 50% of your economy depends on that, you can't just say "let's subtract that" and somehow magically pay for everybody to have a good life; there's a reality to how much money there is in the economy. One of the data points I use here: for the British Empire to abolish slavery, it had to take a haircut on its economy — about 2% every year for 60 years. In other words, you could raise the concern that slavery was unethical; but if it were 50% or 70% of your economy, could you say "you're right, that's unethical, that's wrong, let's abolish it now"? You'd have to self-justify — come up with some cognitive-dissonance answer, some rationalization that what we're doing right now is, I don't know, "they like what they're doing," "we're making the world more open and connected," whatever it is. So we have to work collectively to get the cost down enough that there's a safe exit path from the problem. Right now, too much of our attention economy and our current wealth is based on the amount of attention we capture. So if you told Facebook "I want my attention back," they'd say, "that's nice." Say you're spending 30% more time on Facebook than you want to, and you say, "I want it back; I want you to stop persuading me that way." They'd say, "well, that's nice, but you'd have to pay me for that 30% you're taking back, because it's my stock price." So again: how do we make that part of the good life? Conversations like this are about exactly that.

The good life, you know — we are getting services for free because we're paying for them with our attention and our data, and we'd have to pay some other way to compensate, to keep those services going, unless we want to shut them down. And I think, again, I'm going
to just go back to history: the good life has always been expensive, personally and socially. Going all the way back to the Greeks, the good life means dissociating yourself from desires, from stimuli, from things that might not work out the way you want. The good life has always been a personal choice — it actually is about agency. Something you didn't touch on as much, I think, is how much this requires our own agency: to say, I'm going to ignore these messages, I'm going to switch off my Wi-Fi, and that is my declaration of agency. It's not about changing the design; it's my personal choice not to engage with these apps, not to install the latest apps, and maybe to pay a social price for it. I'm not going to be on Instagram; I'm not going to be on all the platforms — maybe just two or three of them. Isn't that also part of the problem with agency?

Absolutely. It's just a question of fairness: do we want a world where only the tiny, tiny percentage of people who are educated, aware of companies' goals, who understand the systemic issues and know their technology well enough, are able to turn these things off? I totally agree with what you're saying — we don't want to eliminate personal responsibility; people are responsible for their lives. But the world is so configured around predatory practice right now that if you don't know better, you don't even have information about these things. Now, culture has always produced low-barrier approaches — heuristics, as in religions and cultures, rules of thumb that work well and protect you. You don't have to question them, you don't have to intellectualize them, but they're good — and they're not built around financial gain; they're just ideas, right?

So do you think we should go down that route instead of trying to change the way the designers and companies think — I mean educating the public and empowering them with heuristics, with simple rules and principles, rather than trying to change the way Facebook operates?

Yeah. So imagine a world — I think this is a useful thought experiment — where there are cars, and you live in a city where every time you get in the car and go someplace, you get in an accident. Your other option is: don't drive a car. Imagine that's just the norm — for whatever reason, we live in this reality where you either get in the car and always get in an accident, because the city is designed that way, or, if you've practiced driving your whole life and really know the routes and can swerve around things, you can sort of get away without one. Then there are the people who just say, "this is crappy, I'm not going to use my car" — but then they don't have access to many things. Or you do use your car even though you get in accidents. That's an all-or-nothing choice: either I have it and it always screws me over, or I don't have it. I think that's a design problem — you'd say we should just build some traffic lights in the city, right?

Another piece, especially on the consumption side, is the question of the environmental cost we're all paying for everyone else being persuaded to consume more by all the interruptions they're having. You might decide, "I'm going to shut all this off, use ad blockers, really limit my phone" — but, taking the very big picture, we're living in a world where we don't have the self-discipline to say we're not going to drive so much. We have massive amounts of global warming, we have problems with landfill, and so on — we just have a lot of consumption-based issues.
So to the extent that we're living in a wonderful, connected, information-based world built on the notion of advertising: either that is an incredible fiction and it's all going to come crashing down — I argue about this quite a bit, and it's not a total fiction; it's not actually driving people to consume more — or, if it is, we're all paying a very, very high cost for it. It may appear to be free, but the cost isn't just to the person, in their time and attention; it's a cost in terms of what we desire out of life. It's not just the immediate things, the ads you notice; it's this whole scale of dissatisfaction — the sense that you haven't gone to Disneyland enough, that there's another trip you have to take, another thing you need to have or experience, or buy after discarding the one you have. I think the constant consumer dissatisfaction we live with is very costly for all of us. And the other thing I heard you say is that the good life is not just an individual having a good life; the good life is a social construct that is supportive of the good life.

I would just say that Judith's point is a really important one: the costs are known. We know them, and we see it in the research on privacy I mentioned earlier. We see it right now. Has anybody tried the Brave browser yet? Try it — it blows away all the ads. It's done by the guy who wrote JavaScript, the former head of Mozilla, who created a new browser. It's based on Chromium, the engine under Google's Chrome. On commercial websites it speeds things up enormously, and the whole point of it is a reaction to all this; it's an antidote. If you try it, it doesn't have some of the bells and whistles you may want from others, but it makes browsing faster, and it is a market response. The guy said: wait a minute, people hate advertising — let's get rid of it, and then let's see what happens after that. Now, he has a kind of funny way of following that up — "I'll sell the advertising" — but ignore that for now. It's faster enough that you can see the difference.

One good example of Apple and Google disallowing a certain kind of agency: the default web browser, which you cannot change on an iPhone. This is another of the subtle ways they lock you in — any link you click on your iPhone launches the default browser, and that extra cost is like having to drive 50 miles to a telephone booth just to use it. These are small freedoms: you should be able to set that default browser. That would be part of this agency conversation.

And if it succeeds well enough, there might be enough market demand to shift the systemic forces. I don't deny that. What I'm saying is that, in general, people are better behavioral economists than we give them credit for; they understand a lot of these costs and will take measures. We have very small margins to operate in so far, because these platforms are so hegemonic, but there are tools being developed.

That's a very hopeful note on which to wrap up. Let's take a soft break in a couple of minutes — people can stay, and I see there are more questions — but if you have some time, one slightly more hopeful example. If you've seen the TED Talk already, I apologize for repurposing some material. It's an inspiring example of what it might look like if the world were really working for people. My collaborator was the CTO of Couchsurfing, and while he was there he created the metric for how Couchsurfing measures success. Do you guys know what Couchsurfing is?
You're telling me you don't know what Couchsurfing is? Couchsurfing basically preceded Airbnb. It was a for-free way to stay with people — people who had an extra place to stay and wouldn't mind hosting a cool visitor from Italy, or wherever, when they were in town, and showing them around the city, based on the reviews that visitor had earned. If someone has 500 reviews saying "this is an awesome Italian; when they come to town you're going to have a great time," you say: great, I'll let them stay with me.

Now, if you're Couchsurfing and you're managing this marketplace of people looking for a place to stay, you'd probably think: let's match guests to hosts. This many people were looking; this many got matched; success is that someone found a place to stay. But that's not how they measure success. The way they measure success is by the lasting, durable positive experiences and relationships they create between two people who had never met. And the amazing thing is they came up with an incredible way to measure that, which works like this. Say it's the first night out with the Italian who's staying with you in Boston for four days. They take the number of days the two people spent together — four, in this case — then estimate how many hours were in those days: how many shared hours were created. Then they ask both people: how positive was your experience overall? So they get a rough count of the positive hours Couchsurfing created. Then — here's the key part — they subtract all the time those people spent on Couchsurfing itself: say the pair spent 20 minutes clicking, poking back and forth, sending messages to hosts, looking for a place to stay, reading profiles, all of that. That gets subtracted as a cost. What they're left with is something they call "net orchestrated conviviality" — really just the net positive hours that existed between two people. And they do a lot of clever things to isolate only the new positive hours Couchsurfing created, deduplicating the people who would have found each other some other way, so they count just the hours Couchsurfing is responsible for.

I always use this as an example because — can you imagine going to work, and on the big screen with the metrics dashboard, the graph you see is the number of positive hours that would never have existed unless you did what you're doing? So when you think about an economy — an organic economy of time-well-spent things — imagine everyone doing that. Imagine a world where Facebook cares about time well spent for you on Facebook, where that's what you're getting out of it and that's how these actors in the economy measure success. That's the kind of thing we're talking about: moving from time spent to time well spent. And forget the website — that's what this conversation is about: how do you do that? We're going to stay and hang out. Thank you all for coming.
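The Couchsurfing metric described above reduces to a small calculation: shared hours, weighted by how positive both people found the experience, minus the time the pair spent on the site itself. Here is a minimal sketch of that arithmetic; the function name, parameter names, and the 0-to-1 positivity scale are illustrative assumptions, not Couchsurfing's actual implementation (which, as noted, also deduplicates pairs who would have met anyway).

```python
def net_positive_hours(days_together, waking_hours_per_day,
                       positivity, hours_on_site):
    """Sketch of a 'net orchestrated conviviality' style metric.

    Gross shared hours (days x estimated waking hours per day),
    weighted by the pair's positivity rating (assumed 0.0-1.0 scale),
    minus the hours both people spent on the site itself
    (searching, messaging hosts, reading profiles).
    """
    gross = days_together * waking_hours_per_day * positivity
    return gross - hours_on_site

# Four days together, ~10 waking hours a day, a very positive rating,
# and a combined 1.5 hours spent clicking around the site.
print(net_positive_hours(4, 10, 0.9, 1.5))  # → 34.5
```

The subtraction is what distinguishes this from an engagement metric: time on the site counts *against* the score, so the product wins by orchestrating value off-screen, not by holding attention on it.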