What is the stupidest thing you've done with respect to your privacy?

Join Twitter. I'm just kidding. Probably, like Farhad, just signing up for everything when I was starting out. None of it meant anything to me; I signed up for pretty much every service. There was this period, probably between 2008 and 2013, where if it was new and interesting, it was my job to figure it out. I'm pretty sure I gave my Social Security number to some companies or something. It was pretty reckless.

I fell in love with a guy who uses Alexa, and then we moved in together, and then I made him take Alexa down. But there was a period of stupidity there. Yes.

You just fell in love with Alexa. That was the main thing.

No, it's all the things Charlie said. I mean, Facebook, all the things. We write about it, so I also felt that professional duty to try things out. I don't know if you've all had this experience, but when you're signing up for things, you're in this altered state where everything seems like it's going to be great after you press "agree." So, yeah.

Finally, what's the most extreme thing you do to protect your privacy?

I live in Montana. So, yeah. That's the end of my answer.

I file a special form with the Oregon elections board to hide my address, and they fuck it up every time. I'm sorry, I shouldn't have cursed; this is a library. I don't know if it's extreme, but that's the thing I worry about: people knowing my address. My address used to be somewhat easy to find.
I've gotten doxxed before. And there are all these services I've signed up for (you see, I can't stop signing up for services) to get my address off the internet. I've filed those forms, but it still seems like an uphill battle. So it's not extreme; it's just that I'm losing at it.

I had to take a picture of my driver's license to use one of those "wipe you off the internet" services, and I was like, this feels like a real catch-22 here.

Too much information. We're actually going to talk more about addresses later, but for now I want to get a sense of how you and the audience feel, so I'm going to ask you a couple of questions. First, I want everyone to put your hand up. We'll first be talking about social media websites like Facebook, and we're interested to see where you personally would draw the line, what would be too much for you when it comes to privacy. So keep your hand up until I say something that would make you shake your head and say, no, that's a step too far.

When it comes to social media websites: if you're comfortable when a website collects detailed personal information about you, like your gender and your interests, keep your hand up. If you're not, put it down. Wow, okay, this game might end sooner than I thought.

If you're comfortable when the website uses your friends' posts to send you targeted advertisements, keep your hands up. If not... wow, okay, put them down.

If you're comfortable when the social media website receives information on what you buy in physical stores, keep your hand up.

And finally, if you're comfortable with them knowing when you walk into a physical location, because you share it with your phone app, keep your hand up, or if you would draw the line here, put it down. Okay, we're going to have to talk to you later to get your perspective.

One more topic: smart doorbells.
These, of course, are the doorbells with a little camera on them. Everyone, please put your hand up again.

If you're comfortable with a doorbell that sends video and images to your phone so you can see who's at your door, keep your hand up. That's kind of what it's for, right?

If you're comfortable with a doorbell that learns familiar faces so it can identify known and unknown people, keep your hand up.

If you're comfortable with a doorbell that gives law enforcement video clips if they're served a valid warrant, keep your hand up. Okay, we have like four or five people left, so I'll do the last one.

If you're comfortable with a doorbell that gives law enforcement access to its network of real-time video streams, keep your hand up; if not, put it down. Okay, that's fascinating.

Farhad actually wrote a piece that said it's time to panic about privacy, so I'm interested in getting his thoughts on the audience reaction here.

It seems like they're at a good level of panicked. The doorbells especially scare me. I don't have one of those smart doorbells, but the thing that has worried me more and more over the last few years is the proliferation of cameras everywhere. It started with the cameras we carry around on our smartphones, so now you're being recorded by anyone who has a camera, and you don't have much say over where your image goes, or your voice, or anything else. But now, beyond the people around you, the objects have cameras: cars have cameras, airplane seatbacks have cameras, doorbells have cameras. What I really worry about is all these images being cross-referenced to your location and time. There's this kind of massive surveillance system being created, and because we're consumers buying this stuff and tying it to the cloud,
We're part of it. So, yeah, the cameras, and us buying them in these doorbells and other things, even though they seem super convenient. I wouldn't want one of those doorbell cameras just for some of the conveniences; they seem like a step too far.

Yeah. Sarah and Charlie, do you have thoughts?

I mean, for me, it's just: why? And again, I am so many "whys" down the line that at this point the doorbell thing is just in another universe. There are cameras on Peloton exercise bikes now. When I realized this, I didn't even know what to say. You're going to pay that much money to have an exercise bike look at you? That's so strange. I just do not understand what the market thinks that we want, essentially. There is this prevailing assumption that this is what people want, but it's like, no, advertisers tell you what you want. We're just moving further and further along this path, accepting this wisdom that we do want more surveillance. But you just saw the hands in this room. It's true that we are all being tracked in these ways, and it's incredibly common. You're dead right that all of the stuff that Jenae listed is actually happening, but that doesn't mean we're comfortable with it. We just have to do it because we don't really have much of a choice. What are we going to do, throw our smartphones in the ocean? I can't find my way from Civic Center to the library.

Yeah, I mean, I agree. I think there's a real problem right now.
I think we're in this transitional period where we're starting to reckon with all this stuff, and I think one of the big issues is just transparency. The reports that came out this weekend, that have kind of trickled out, about law enforcement partnering with these systems: it's this closed loop that is sort of undisclosed, and that kind of opacity is broadly unacceptable. You can't have these partnerships with law enforcement that people don't know about; then you are actually creating a surreptitious surveillance system. Now, granted, if people know these things...

There was a report about it; I've been thinking a lot about this transparency stuff. There was a report in the Guardian about workers at Apple who have to listen to some of the Siri requests, and they were overhearing people having sex or arguing, things like that. To me, the big thing is not that it's happening. You can obviously be outraged by that, but it's the fact that it's not disclosed. I understand that there's a human infrastructure underneath technology. The problem is that some of these companies have gotten so brazen that they don't feel like they need to disclose this, that they feel like they can do all this underneath the surface. That's the thing I think we really need to reckon with: disclosure and understanding, giving consumers power.

Yeah, but with some of these things, the way you sell the technology is to cover up the way it works. The way Silicon Valley has worked for a long time is to tell you the great things that are going to be available with this technology, and usually it means some level of convenience, or ease of use, or some other thing you haven't been able to do before. And in
all of that, it takes a certain level of consumer understanding to know how it works and to see the risk there. Take the doorbell: if you think about it for a second, you can see all of the really scary things. The way the image from the camera gets to your phone is that it's sent through a cloud server first. And it's Amazon, and Amazon owns the doorbell company, so it has access to your house, because people are coming to your house all the time. There are all these connections there that, when you describe the entire system, sound scary. But the way technology companies sell gadgets is not to describe all of that, and the way people understand how this stuff works is not at a level of sophistication that reveals the scary things that are possible when you just buy a device that lets you answer the door or play music in your house or something like that.

I mean, it's the same thing with food and medicine and a host of other things that are regulated. We're just at this point where we should be regulating this stuff, but we don't have an FDA for Alexa. We've got a system that's wildly out of control, and we don't even have a framework to understand what is harmful about it.
We don't have an agency that is adequately staffed or adequately empowered to assess the risk to consumers.

Yeah, and in reporting on some of these things, we don't even know what the processes are. We can't actually explain them or understand them. When you're looking into certain parts of, say, the programmatic advertising ecosystem, you just come up against these black boxes, these walls, where the data goes in and stuff happens, and we don't necessarily know what. The only real way to get past that is for someone to become a whistleblower. So, yeah. We're in it.

Well, on that note: Charlie, you've written a piece telling readers they care more about privacy than they think they do. I think most people in this room might care about privacy as much as they think they do; they seem pretty informed. But when it comes to your average American, what do you mean by that?

Yeah. People who want to take advantage of people's privacy will always say that by agreeing to these services, you want them, right? By clicking "accept" on the terms, you're making that trade-off willingly. And you're not, necessarily. But there's also this thing of: you know about Cambridge Analytica, right? And yet you still use Facebook. That's used as an excuse, when it really doesn't reflect how we feel. Privacy is so overwhelming and all-encompassing. It's a lot like what a person I talked to, who did a study showing that people actually do really care about this stuff, compared it to: your 401(k). Basically, you don't walk around every hour thinking,
"I've got to check my 401(k)," or, "hey, I just bought a $7 latte at Blue Bottle; clearly I don't care about my retirement savings." We're always working against that, because we have to live our lives, and it's really difficult and mentally taxing. It's really hard to look at your privacy in a holistic way on the internet; you really can't do it. You're shedding information everywhere. All of you shed information just by coming here tonight with phones on you. To think about that all the time and make that trade-off in your head is maddening, so you work against your own interest. And I think that's a myth we need to dispel, that people don't care about this stuff. You do; it's just that we have lives to live.

I would also say there's a certain amount of fatalism involved, because these companies are very powerful and big and seem to have no regulators watching them, and there's a bit of hopelessness to it. Take the whole story of what happened to Equifax. They had this massive breach, and it was unclear to me why they were allowed to continue operating as a company after that, because they failed at their one job. Then they come out with this really underwhelming settlement, and now the money's run out and you're not going to get much from it, and that's the end. They suffered some losses in the stock market, and some executives were fired, but there was no real penalty for any of it. If you're a consumer, it feels like, oh, they all have your information, and they're not being very careful with it.
It's kind of everywhere already, and the whole story is kind of hopeless.

It's really hard with Equifax. Who wants this company to exist? I have a hard time picturing the person who's like, "I'd love for you to monitor my credit so that I can't buy a home."

Sarah, speaking of pieces of personal information that we don't want others to get a hold of: you've written that home addresses should be protected much more closely, almost akin to Social Security numbers. Can you talk a little bit about that?

Yeah. We used to have the Yellow Pages, the White Pages, and it was totally chill to have your home address just floating around next to a telephone booth. Those existed too. But things have changed. There has been a cultural shift where we now understand that when your home address is one Google search away, you could possibly be in danger. It is a very, very common way to harass someone. People have died from this, from having their home address found on the internet: someone calls the police, pretends there's an active-shooter situation, they send a SWAT team, and if the SWAT team overreacts, someone ends up dead. In the case I'm thinking of specifically, the person who died wasn't even the person who was targeted; it was a new tenant who had moved in fairly recently. The address was just an old, bad address, so this person had no idea what was coming. I think it's very strange for us to continue to think of home addresses as this piece of public information that we just jot down everywhere. Why does my physical residence need to be tied to a receiving address for paper mail, most of which is junk? It doesn't seem worth giving up people's security for what is essentially a big old advertising
thing, slash the fact that the post office simply hasn't changed its practices in many, many years.

On a slightly less terrifying note: is everyone familiar with FaceApp, the app people were using a week or two ago? For those who weren't, it was an app where you could put in your picture and it would show you what you'll look like as a much older person. Charlie, you wrote a piece about FaceApp and what it can teach us about what people do and don't understand about privacy. A lot of people used this and gave it all up.

Yeah, FaceApp was really interesting. It's been in the news before; it seems to flare up semi-regularly. But what was really fascinating about it this time wasn't just the virality and the privacy concern. There was a privacy concern, and then a backlash to that. Some people pointed out that a lot of the app's developers are located in St. Petersburg, Russia. I was one of the people who noticed that, alongside others, and then there was a backlash to that, which was: you're being Russophobic. What, do you think Putin's behind this? Putin doesn't want your stupid photos, right?
So there was this concern, and then part of the backlash was: well, Facebook does the same thing. If you don't want to give your photo to these people, you shouldn't give your photo to Facebook; they have years and years and years of all that information. All the confusion just shows that people are really concerned but don't know where to direct that energy. We all downloaded this app and were kind of giddy about it, like, "look at me as a baby," and, "look at me when I'm older." It all seems like fun, and then there are these stakes behind it, but we don't really understand what they are. I think it shows that we're in the middle of this real transitional period. Most people don't know what happens when they click "yes" on the terms of service. They don't look, or think to look. Because it's on the App Store, it must be completely vetted; it must be totally fine. We don't know what happens behind the scenes when we download apps: what pieces of software are in there from other advertisers, from other companies, from people who host their technology inside other apps. There are a million things happening that we don't understand. So I thought FaceApp was interesting because it brought all of that to bear in one incredibly compact news cycle.

Did any of you choose to use it?

No, I didn't. As Charlie mentioned, this had come up regularly; there have been other apps that have done this. It felt like one of those things where a startup comes out of nowhere and is trying to get everyone's photos. The intent seems suspect, and the payoff seemed low. Seeing yourself as an older person? I mean, that's going to happen.
It was the name that got me. It was a combination of all those things, and then it's the name, and it's just like: I feel this is a trap. This is some kind of a trap.

I just didn't do it because I didn't want to see myself looking like that. It's like the opposite of a filter. And then I pretended I was smart and that's why I didn't do it.

Sarah, you have also written about the insurance industry and how surveillance is being used in ways that sound kind of scary. Can you talk about that?

Yeah. There are all of these life insurance companies, for instance, that are offering perks for people to wear their Fitbits, and then they adjust premiums based on the Fitbit information. Life insurance companies scour social media postings to see what your life-expectancy risk looks like; so don't post a lot of pictures of yourself skydiving, I guess. What they do is say, oh, we're going to put you on this voluntary monitoring program, whether it's Fitbit, social media, whatever. Car insurance companies will do a thing where you have an app on your phone that uses the accelerometer to tell how fast you're driving, when you're braking, and so on and so forth. And they'll say, if you opt into this program, we can give you discounts. Which is just price discrimination, actually, but if you call it a discount, it sounds a lot nicer.
It sounds a lot nicer and You hear me mention these specific insurance industries and you may have noticed that I didn't mention the health insurance industry And it's because of the ACA prior to the ACA The insurance companies were buying up data for instance from pharmacies in order to deny people Insurance based on their pharmacy prescription histories So wherever it's legal insurance companies are going to do that like they're going to slurp up as much data as possible through whatever surveillance means are out there It's right now people there's this ongoing debate inside the insurance industry about whether or not it's Okay to use credit scores to determine insurance premiums Which you may have assumed that that was already happening, but it is actually like kind of a controversial thing I don't know why But yeah, like they're drawing together all of this data all the time and unless they're barred by law They'll do it and there is this increasing concern because the data about us out there is becoming so Nebulous like that's one thing to have a list of prescriptions, right like that's a really Straightforward. Oh this person's on this ergo. They have this disease ergo. We're gonna Deny them coverage, right? Now imagine you have Some machine learning algorithm that's going through people's Facebook Participation like what Facebook groups you're in or the kinds of your social graphs and It's been going through this for a while and it's now charted out like life expectancy of depending on of your associations, right At some point for some reason the algorithm spits out this bias Against covering for people who are all part of a Facebook group for burka gene Havers like people who have joined a Facebook group for people who have gotten their genomes tested and they all know that they have the breast cancer gene, right You now have genetic discrimination But it's come out of one end of an algorithm. 
That's a complete black box. And I think that unless we see more regulation in the future, we're going to see weird stuff like this happen all the time, where there is some kind of illegal discrimination happening, but it's happening in the guts of a machine that we can't see through.

Yeah, one of the things this points out: one of the arguments people make about privacy is, well, if you don't want your privacy to be invaded, or if you're worried about this, you can just opt out. You can just not use the newest service. But the insurance stuff, and the way these technologies spider into every part of our lives, really leaves little room to opt out; you kind of have to be in the system. You either get the discount or you don't; you get better deals on health care or you don't. And often, for work, you have to do it. If you're an Uber driver, you have to give up how you drive; that's part of the system. Same with truck drivers. Or if you're an Amazon delivery person and you visit a house that has the Amazon doorbell on it, your boss is kind of watching you as you deliver everywhere, and could potentially be making employment decisions, payment decisions, all of that, and you don't really know. I feel like we're past the point where a lot of people can decide, in this high-minded way, to opt out of this stuff. And the tough thing is, as Sarah mentioned, it's often not
It's not People making the decision sometimes, you know like that when the data gets abstracted and moved around It's not that it's always nefarious at all, but it's also not like it's not a human decision it is a decision based off of you know a series of constraints and and Approximating a human decision and that's really weird when it comes to you know Your your physical self where you are who gets to see what you know You're your opportunities later on and it's all based off of this kind of abstracted data that we just shed And I'm just gonna add the people who say that they've opted out Like are the worst human beings because they almost certainly have never opted out And there's some dude whose wife has a phone and does everything for them like it's it's always that I swear to God like it's No one actually opts out. You just let your girlfriend get surveilled for you Good point to bring it to a topic that's a little more local Farah had you wrote in May that San Francisco's Board of Supervisors was right to vote to ban the use of facial recognition Technology by the city's police and other agencies. Can you talk a little bit about that for people who might not have been aware of what's going on here? 
Yeah. There have been a lot of studies and investigations into how police departments around the country have been rushing toward facial recognition software, and there are basically no rules about how police can use it. The kinds of things they're doing seem like they just shouldn't be allowed in law enforcement. There are stories from New York about trying to find a match: you catch someone on a surveillance camera, and you need to find a match in the database. In New York, they're allowed to edit the faces. In some public documents that researchers obtained, there were instances where, if they found somebody on a surveillance camera and the mouth didn't match the right mouth shape for matching against the driver's license database, they would just go off to Google, find a mouth that better matched, paste it onto their suspect image, and then run it against the system. And somebody is going to get talked to by the cops because of this weird search. There are no rules about whether you have to disclose how the search happened to, say, defense counsel. There are just very few rules about evidence and about how cops should use this stuff. I thought it was a very bold thing for San Francisco to decide not to do that.

The San Francisco rule, as I understand it, only applies to the police and public agencies, not to private businesses that may use facial recognition. So it's not a complete ban, but it covers some of the more important ways facial recognition can be used. A number of other cities are considering or doing this; Berkeley, for example, is thinking about it. And I think we should just have a moratorium
on it, at least until there are regulations that cover this stuff. The other thing about facial recognition is that not only are police departments rushing to use it, but the technology is not very good. It's biased against people of color. It finds false matches. So it's something people are rushing to use that everybody knows is not that good, riddled with errors, and there are no rules.

I went to a facial recognition conference in May in Washington, D.C., which is as dystopian as it sounds. It was a lot of private companies and vendors, and there was this expo where people were going around, and the Department of Homeland Security is there, and people are writing checks, and everyone is getting paid, and it's really cool. I would talk to people and, I think this speaks to your point really well, there was this passing of the buck that happened all down the line. I would look at the technology and say, well, can it be abused this way? "No, no. I mean, theoretically, maybe, yes, but..." Then I'd say, well, who are you selling it to? "When we sell it to law enforcement, we train them." Great. What if they abuse it? "Well, we did all the stuff. We checked all the boxes. What more do you want from us?"
And I was like, maybe don't make it. Or don't sell it yet. Then I spoke to some law enforcement people who were there: "We train our officers very, very well." What if you have an officer who's biased against people of color? What if you just have human error? Nobody has good answers for these things, and I think until we have answers, until we have systems in place, it does make sense to slow this down. I am sympathetic sometimes to some of the arguments: we want to be able to catch criminals, to make airports safe, to do these things. I get that. I don't think it should be some kind of lawless, completely anonymous paradise where criminals can roam the streets and do whatever they want. But there's this real idea that we have to just keep moving forward, and there's no reason we can't consider this stuff and move deliberately.

I think facial recognition also really exemplifies a lot of the concerns about privacy, in this way: privacy isn't just about the invasion of ourselves, right? It is about what happens to our data and how it shapes society. Even if you're personally okay with your photos being out there, they're now part of a system that's being used to racially profile people. Does that feel good? That's actually very uncomfortable for me. One of the things about facial recognition is that it's not so much that people are being scanned that's the problem, per se.
It's that this technology is being used to take these already terrible tendencies in our society, these inequalities, these injustices, and make them worse, and also to come up with a justification that doesn't appeal to humanity at all. Now you just have an excuse for profiling someone. It's just stop-and-frisk with a magic box that you can hold up: "oh, hey, this is why I stopped and frisked this person, because the computer told me to." And we also know that the computer already has a baked-in racial bias. The examples I came up with, like the insurance industry, and a lot of these nightmare scenarios we keep bringing up, have to do with the worst parts of American society in 2019, the things that have already gone wrong. Just throwing data and surveillance at them makes them go even wronger.

Yeah. The thing that got me more scared about facial recognition was that last year I got pitches from, and talked to, more than a dozen startup companies whose whole plan is to sell facial recognition and surveillance camera technologies to schools as a way to stop mass shootings and other security incidents. The thing that really worries me about this technology is that people will buy it because they see it as a quick, easy fix for larger and more intractable social and political problems in the culture, and the technology will be used as a way to solve them, but it will probably eventually make everything even worse.

We're going to move to audience questions soon, but first I'm going to ask Sarah one more question that I'm actually dying to know the answer to. Has anyone else had this experience where you're talking about something with your friend or your spouse,
And then the next day you get a Facebook ad for it. So there's a rumor that Facebook is listening to us, and it's all very weird and creepy, and I'm dying to know: what's actually happening?

It's not real. It's not. The problem is that advertisers know you better than you think they do. They can read your thoughts without actually listening to you talking, so it's literally worse than you think, basically. Researchers have done these experiments with phones where they look at the data packets going out and try to figure out which apps are transmitting audio without permission, essentially. And there are some apps, for instance on Android, that are not in the Google Play Store, the sketchy apps you should not download to begin with, that do this, right? But if you're looking at an iPhone, for instance, or just Facebook in general, it's not transmitting audio for advertisers. What's happening is that the stuff that you click, the websites that you visit, the contact lists on your phone that Facebook harvests, your phone number, which is tied to all of these little bits of data, the Wi-Fi networks you connect to, your geolocation, the other phones in the same location, the other phones and devices and computers that are hooked up to the same Wi-Fi networks that you connect to, all of that information together makes a very accurate profile of who you are, who you're related to, who you're married to, what you're talking about, what you're considering doing next, what you're watching on television, what age you are, what's happening in your neighborhood. What ends up happening is that the advertisers know what you're talking about before you even say the thing. You're very predictable. I'm sorry.
I'm very predictable too. Also, you're only noticing it when it happens. If it's not happening, you just think, here's another stupid ad for something I don't care about. But yeah, this is bad, it's really bad. This myth that phones are listening to people, I trotted it out the last time Farhad and I were on a panel. We were at an advertising conference, so it was a very different audience, and people just kept saying, oh, I know, people do want to be tracked. And I was like, no, I really don't think people want to be tracked. And I pull out this example every time: listen, people have this fear of being tracked, and that's why they make up this myth about their phones listening to them, because they don't like the reality. The reality is way worse.

This is the problem with advertising in general: when it's really good, you're terrified. When it's really bad, it has no context in your life, you're getting a whole bunch of ads for the kitchen appliance you bought on Amazon six days ago, and it feels frustrating. So the system is just set up so there's no real way to win. The better it is, the more invasive it feels; the worse it is, the more you're just throwing money down.

I mean, we're talking about online advertising, though, and the current model. Ads weren't like this for most of human history, really. We've just sort of waltzed into a model where we don't think it can be any other way. There's no way to deliver a Super Bowl commercial in an interstitial ad when you're reading, you know, a TMZ post about Kylie Jenner. You can't have that good an experience. No one's ever like, oh man, show me that banner ad again.
I mean, the really scary thing here, one that I think about often, is this: if you start using this stuff a lot, if you look at Facebook a lot, if you look at the internet a lot, I really sometimes wonder what percentage of my thoughts are independent thoughts, versus how I'm being influenced by the algorithm, by the advertising. My job is to write columns for The New York Times; I'm in the business of coming up with ideas to write about, and I am often unsure how an idea got into my head. And I feel like this is a real problem for society. What does it mean to live in sort of a capitalist society if you're not sure whether it was of your own volition that you decided to buy this thing, or go on this vacation? I feel like we're at a point where it's hard to tell what agency we have as users, and as citizens, versus what the companies are kind of mind-controlling us to do.

On that mind-control note, we are going to move to audience questions. We invited people to send questions in, and some of those people are here tonight. Is Eva, or Ava, Galanis Rosenbaum here? Someone will bring you a microphone if you'd like to ask your question. I think you got to a nine on that last answer.

It's on. Can you hear me? Yes. Oh great, okay. So my question is: how do we draw the line between staying in touch with people, you know, far-flung friends, people all over the world or all over the country, and opting out of social networks using or selling our data? Basically, how can we stay modern and use the great options for staying in touch with people without compromising our privacy?

I don't think you can. And I think that's sort of the problem: everything is built on this. It doesn't mean that it's all bad, right? It's just, that's why I think we're in this really interesting period.
There are certain things you can do. You can communicate via encrypted text messages with people. You can use certain things like FaceTime, where there's very little chance that someone's gotten involved in the middle of that. You obviously can't control the person on the other end if they choose to take a picture of you, a screenshot, something like that. But I think the real issue is, if you want to really be modern, if you want to participate in all this, you really can't opt out, and that's the issue we're dealing with right now. That's why things like regulation are top of mind, because this is the system that we live with. We have this. And so, being very realistic: I write this privacy newsletter every week, and we have to have a tip of the week, and the tips always feel to me very akin to personal-responsibility tips for climate change, right? Where it's like, hey, should I fly? And I'm like, sure, I mean, yeah, the temperature is still going to go up. There's a collective action that needs to happen. It's not that personal responsibility isn't good; you should protect your privacy in all the ways that you can. But to think that there's some sort of switch you can flip, besides moving to Montana, I don't think there really is. And that's, I think, why we're all here tonight, why this is something worth discussing, why they're discussing it in Congress and everywhere: we've built this, and we have to deal with it.

So I have a friend who decided to start doing family Christmas cards, the old-school sort, where every year you get the family portrait and then you mail it out to all of your far-flung friends. She solicited everyone's addresses through a Google Docs form, and I believe she sent the link around through
Facebook. But there's only so much you can do, right, because of the way things are set up. Personally, on my end, I have been trying to withdraw from the social networks and really keep in touch with people via text message, and that is hard. I just drop out of contact with people for long periods of time, because I'm not good at this kind of thing. And yeah, that is sort of the trade-off: you're going to have to start actually putting in the work, figuring out that balance, if you want to pull away from the platforms. But that said, it's maybe a good thing for you to do, not for yourself but for your friends. Because there are people who would rather not have their information exposed to these companies, friends of yours who have a higher threat profile than you do, and your participating in these technologies puts them at risk, because of how it works.

Do you want to add anything?

No, I think that about covers everything. As I said, there's a kind of hopelessness to it that's hard to get around. Also, it's not the worst thing in the world to go back to the days when you saw all your high school classmates at the high school reunion, and that's when you found out what everyone was up to.

You have to fly to do it, and that's a lot of carbon.

It's a lot of carbon. And it's also that when you do start thinking about this critically, you start to realize some of the ways in which it's incredibly unnatural. I literally have Facebook friends who are privy to any of my photos, and I met them one time in college at a party. That's like me sending my Christmas card 300 times a year to a person you met at a college party.
It's very unnatural. So, you know, it's helpful to think about this holistically.

Is Cassie Peppered here? Okay, someone will bring you a microphone.

So the question I wrote is, I worked in sales for an encrypted-chat company. Basically those sales didn't happen; we were a startup. But I just know how difficult those sales can be, because companies don't trust their employees, and especially in cases of harassment, with everything that's come out, you know, they really want to make sure that everything is forever captured, that they can reference it. So with that constant monitoring of everything you say at work, whether in an email or on chat, I wanted to know how you would relate that to our private lives, or our not-so-private social media lives, because everything is being recorded those eight to ten hours a day.

Yeah, it's really interesting how that happened, and we didn't really get a lot of say in the matter. Work just became this place of surveillance. Amazon has a whole program for Alexa at work, you know, to control lights and video conferencing. The huge recent IPOs have all been about cloud companies that work by allowing people to communicate, like Slack does, but then keep an archive of everything people say, which creates an archive you may not have had before. You just talk a lot more digitally over Slack than you did over email, so things are archived more. Kind of everything you do at work, you know, even which office you're in, is tracked. And because it's work, there are very few rules about it. And I think it sort of tracks the way that we've been surveilled in consumer life, in that it's just become okay for your boss to know everything about what you're doing and keep a record of it.

I think there's a big-data problem just in general, in even understanding it. Sometimes I'll
talk, and again, this doesn't really relate just to Google and people who are using that information in lots of different ways, but, like we were talking about school tech earlier, in some of my reporting I've talked to a few school districts, and they talk about all the information they collect. And there's this idea that more is better: the more we have, the better our insights are going to be, the more we can slice and dice, and it's going to make our lives better in some way. And then you keep asking questions down the chain, and it's not clear they even know what they're going to do with it. It's just, we must collect it because it's there to be collected. And I think the jury is still out on whether that always creates the absolute best data sets, whether it always leads to the absolute best insights, and what that trade-off is. I think we just have this sort of data-solutionism thing where we think, the more we have, the more we'll know.

So, I'm sympathetic to the concerns about retaining data for sexual harassment inquiries. When I worked on a sexual misconduct case for an investigation a couple of years ago, some of the evidence disappeared, because the accused, allegedly, was a hacker and went back and got into the accuser's Facebook account and got rid of a bunch of stuff. Yeah, a lot of private messages just went missing. But I still think employers are collecting way too much, right? It's not just the chats, which is already quite a lot. We're talking about sensors at desks. There are of course all the cameras, but also a sensor that senses how long you're sitting at your desk, and so on and so forth. I mean, we were just talking about all of the things that are wrong with
American society, and hell, the surveillance stuff has made it worse. This is just another thing, right? Workers don't really have free speech rights against their employers. All of these anti-union rulings that are coming down, all of these pro-arbitration rulings that are coming down: it's not great to be an employee in America right now, and this is just another little cherry on top of the terrible pie.

Really optimistic group here tonight. Nine and a half.

Is Jason Kelly here? Good, okay.

So obviously everyone in the room cares about privacy, but sometimes corporations force us to choose between privacy and efficiency. When, in your view, is it okay to choose efficiency over our privacy?

I would just say that the way you frame the question is interesting, because I feel like often we don't know what we're giving up in privacy and what we're gaining in efficiency. We especially don't know when we sign up for the service, because you haven't even used it yet. You have to agree to the rules and the data collection practices that Amazon has for Alexa even before you turn it on, even before you know what it's going to do for you. I just feel like we don't have the capacity, we don't have the information, we don't have the overall picture. And usually when we start out using a service, we don't have any idea of how it's going to fit into our lives, which you'd need to make that trade-off. Maybe over time you can kind of tell. But then I think often of my Apple Watch, which gets a level of data about me even beyond what my phone has, which is already a breathtaking amount, and it ties that into what my phone has, and now it has this complete picture. With wearable devices, they know how much you're sleeping, they know your heart rate.
They know very intimate stuff about you, and I just don't know what they're doing with it. I know a little bit about why it's useful to me to know how I slept, but it's not that useful. I just feel like I avoid making those decisions. Those trade-offs sound interesting in theory, but I just feel like in practice we're not capable of making them.

There are obviously potential trade-offs, let's take security, for instance, and things like efficiency. I think where it gets really complicated is, like you said, you don't know what's happening: you have to agree to the rules of the game before you've seen the game. But then also the game really shifts. When we think about a lot of the services we use, they started as one thing and then they grew, in importance, in size, in impact, in the way that we make decisions. I mean, we've ported all of our lives onto some of these things, and you're doing that incrementally. It's the frog in the pot of boiling water: all of a sudden it's super intense and the stakes are really high. And that's where I think it really scrambles everyone's brains. How could you possibly, when you downloaded Facebook in 2004, have known what you were agreeing to? Of course you didn't. And that happens with a lot of things, but the speed with which it's happened in the tech industry, there aren't a lot of parallels. So I think that's really difficult. Now, when I sign up for new services and technologies, I try to occasionally do that thought experiment: what could this be down the line?
And that's kind of it. I mean, that's a game you can't win.

So we have about seven minutes left, and I'd love to take some questions from the audience. Here, first you, and then you, with the braids.

Yes, I'm curious about how you feel about being identified inaccurately versus being identified accurately. What I have in mind is, for example, if you go and look at your Google profile, it has sort of a list of the areas you're interested in, and when I look at mine, it doesn't really capture what I think is me. Or you Google yourself, and you notice there are all these white-pages apps that list you, carrying, supposedly, your address, supposedly where you live; they try to guess your race, your marital status. A lot of that you can see for free, and I find that a good part of the time it's wrong. So my tendency is not to try to fix it; let them be inaccurate. On the other hand, that can lead to issues with real nasty false-positive consequences. So I'm curious about your views on that.

It's a good question, and one that I've thought about often. You notice it when you have purchased a product from Amazon and then you're going around the web being advertised that product by Amazon, and you're like, shouldn't this very smart system know that I already bought this thing? And I've noticed the thing where you search your name and it has a past address for you, and I wondered, you know, if someone is after me, they're going to go to the old house; should I fix that or not? You don't really know. It's a good question. I'm not sure whether to correct the things that the surveillance system gets wrong about you or not. It's a complicated trade-off. I don't have a good answer for that.
I think it's really tricky.

Hi, so I work in product and privacy comms at a big tech company, and before that I worked in media buying, so I'm pretty aware of what companies know about me. And I was the person who kept my hand up the longest, because all of the things we said we weren't comfortable with, if you have a Facebook or a Gmail or an Instagram or a Twitter, or a computer that's not air-gapped, companies probably already know that about you. Why do you think we use the things we say we're not comfortable with? Is that a result of, no offense, sheer ignorance, or being completely complicit? Why? Like, people in this room said they didn't want social media companies tracking what they buy, but Gmail keeps a list of everything you've ever bought: every receipt that's ever gone to your Gmail account, every Uber ride you've ever taken.

I care about climate change, and I flew here for this event for work. I care about climate change, and yet I exhale carbon dioxide. Like I said, I'm not joking that I would not be able to find my way here from Civic Center; it's my ability to find places. My boss, when that big story came out about T-Mobile and other companies selling geolocation data from phones in real time to bounty hunters and such, she turned the location off on her phone, and that lasted 15 minutes, until she had to get somewhere and realized she couldn't use maps without it. I think, you know, there are some things that you just can't really get around.
I mean, I want my water to be fluoridated, but I live in Oregon, so now I'm supposed to use a fluoride rinse every night, which I don't do. Of course I would rather my teeth be healthy, but I don't remember to do a fluoride rinse. I would rather they put it in the water, because it's not dangerous. But this is a holistic sort of thing: we're living in a system where we all know that this thing has to change, and yet it's not changing. Well, we all know why it's not changing, but it's important to band together as a collective and demand that change.

Yeah, I think about this often with regard to other technologies. Think about cars: the thing that happened with cars is that the whole world shifted to accommodate them, and then you couldn't not use a car anymore, because things were too far away; cities were built in such a way that you needed a car to navigate them. And that's totally the same thing that's happened with all these technologies: social networks, Gmail, email in general, Slack at work. The physical and the cultural, your social life, everything has turned to accommodate these technologies, and it's hard to go back. It's hard to not use these things. So even though you may have problems with it, even though you may want stronger regulations, the only way to do it is through collective action. And until then, you're kind of stuck using them, or you're living in Montana.
Yeah, well, just because I think we're getting towards the end: I do think, with all of that, there is a sliver of optimism, potentially, in the fact that we are having these conversations. This is, again, the world we live in; we've rewired it, and opting out is basically a fallacy at this point. But the fact that we're freaking out about FaceApp, even just that is a thing that gave me a little bit of hope. People are thinking about, you know, the provenance of their weird app for the first time. Cambridge Analytica, I think, is a moment that really showed a lot of Americans that there are consequences to the data that's out there. The 2016 election shows the potential power of social networks to mobilize certain people toward certain ends. We're starting to see these things and we're starting to have these conversations. San Francisco with facial recognition: we're moving, very slowly, much slower than we need to, but I think there are reasons to be hopeful that we're starting to understand the system. We're starting to understand that this is the world we live in, and knowing what you're up against is really the only way you can begin to change it.

Yeah, I really don't think we're completely in the dark here. I mean, look at this room and all the hands that went down very quickly. We're in the state of California; you guys have CCPA, congratulations. There's real action happening. I mean, the U.S. Senate bill might be dead, but you live in California, so who cares?
But yeah, your state representatives are going to be receptive to your concerns about CCPA being amended in ways that you object to. They're going to be receptive to you saying that this needs to be stronger. Hopefully at some point the U.S. Senate will also feel that way. But there is real change in the air; things are happening. San Francisco banned facial recognition. We're not actually stuck. We sound like real downers because, I mean, have you seen the world? But I'm not saying there's nothing you can do; there's stuff you can do. I'm just saying that this whole thing of, recycle and then we can save the planet, we need to put that away. Let's put that away and talk about reforming the system and fixing it in a real way, not just putting a fig leaf on top of an industry that has gone completely awry.

I think we're back at eight.

All right, I'm glad we ended on a somewhat positive note, because we are out of time. Please give my brilliant colleagues a round of applause. We want to thank the San Francisco Public Library so much for hosting us. The New York Times is here with events every month, and next month we have the Times music critic talking about pop songs inspired by California, and there will actually be a dance party afterwards, so make sure you look that one up, with Noise Pop and DJ Red Corvette. Thank you so much, and have a great night, everyone. Thank you. There are flyers in the back for the program.