I'm Kevin Bankston. I am the Policy Director of the Open Technology Institute here at New America. At New America we're dedicated to fostering new ideas and new thinkers to address the new challenges that our nation and the world are facing. One of those is digital privacy, a key focus of the Open Technology Institute, which is the technology policy and technology development wing here at New America. Julia Angwin is a good friend and in my mind the Lois Lane of digital privacy reporting, a dogged award-winning investigative reporter who spent a decade at the Wall Street Journal rooting out corporate corruption, earning her team a Pulitzer in the process, before turning in 2010 to the issue of digital privacy. As part of the Journal's groundbreaking What They Know series, otherwise known as the series that launched a thousand FTC investigations and congressional hearings, or at least several dozen of them, Julia and her team of reporters and technologists did absolutely groundbreaking work investigating, reporting on, explaining and visualizing what data our cell phones, our computers and our apps are sharing, and with whom, both in government and in the private sector, when no one else was really digging into that issue. She's now written a new book building on that work. It's called Dragnet Nation: A Quest for Privacy, Security, and Freedom in a World of Relentless Surveillance, and here is my copy right here. And I'm super excited to have her here to talk about it today. I'm also very excited to see what new trouble she gets up to as a senior reporter at ProPublica, her new gig. We're going to be joined in discussion with Julia today, after she does about a 20-minute presentation, by Lina Khan, who's a policy analyst for our Markets, Enterprise and Resiliency Initiative here at New America.
She focuses on concentrations of power in the political economy, and she has lately been looking at how concentrations of data often translate into concentrations of power, often power over those in underserved or disadvantaged communities who are already on the wrong side of some pretty significant power imbalances. Tracking some of the same themes that Julia raised in a New York Times op-ed not too long ago asking the provocative question, is privacy becoming a luxury good? Spoiler alert: yes. Lina wrote a great review of Julia's book for The American Prospect, and I hope she'll share some of her perspective on the book and on the issue here today. But first off, thanks to those of you in the room who have foregone a beautiful spring day here in DC to come and listen to this. I'm sorry; I would be happy to take this outside if it were not for those on the live stream. And thank you to those of you who are watching via live stream. As I said, Julia's going to present for about 20 minutes, then there's a short video that I'm going to play, and then we'll have 30 or 40 minutes of conversation between the three of us, and then we'll open it up to questions from folks in the room, and perhaps over the Twitter if we see some good questions there. So without further ado, Julia Angwin. Hello, and thank you for coming. Does this work? Can you hear me? Yes, okay. So I thought, for those of you who haven't read the book, which is probably everyone, I wanted to just go through quickly sort of what is Dragnet Nation? What did I write this about? And I always start my presentations with this funny stencil headshot that I use online, because it's sort of a good entree into Dragnet Nation. I use this so that facial recognition technology has fewer photos of me to use for identification. So that gives a sense of where I'm coming from on this issue. So I thought I'd start with who am I and how did I get into this? I'm a technology person from the beginning. I grew up in Palo Alto.
My parents were both computer programmers. I learned how to type on this machine. And I am a technologist. I love technology. And so I don't come to this from a standpoint of, technology is ruining our world and we need to get rid of it. I come to it from a point of view of, how did we get here? And can we save all the great things that I love about technology? I spent a bunch of time, as Kevin was saying, covering the rise and fall of the dot-com bubble at the Wall Street Journal. And I basically got to a point where I thought that social networking was going to be really big. So I wrote this book in 2009 about MySpace. And I was so completely right about social networking. And I was so wrong about which network was going to be the one to write about. But that book is actually what got me into privacy, because one thing I learned from my reporting there was that these social networks had realized that their most valuable asset was your data. The data you uploaded to them was something that they could dangle in front of advertisers as a lure, and it was their competitive advantage. And so when I came back to the Wall Street Journal from book leave, I thought to myself, you know, who else is using my data? What do they know about me? And what can I find out about this topic? So I launched an investigation called What They Know, which launched in 2010. And just briefly, I'm going to talk about some of the things that we found at that time. We looked at all the different websites to see how many of them had invisible tracking technology embedded on them. This was dictionary.com, which had more than 200 trackers on its site if you went to 10 pages. We talked about what is known about you from a cookie that's placed on your computer. This was a girl who was 26 years old, and the tracking company nicely shared with me what they knew about her. I think they later regretted it.
But about her favorite movies and which, you know, quizzes she liked to take online. We tested apps and found which ones were sending your data. Pandora was particularly sharing-friendly with your information. These are all the different parties that it was sending your information to. And most of them were getting the phone ID, which is still actually being transmitted, but now you can reset it. You probably haven't found that little button on your phone, but if you look really hard, you can. And we talked about how it was affecting people, how Capital One was offering different credit cards to different people based on the information that it assessed about them. So this guy, I don't know if you can see in the fine print, but they assessed his income as downscale and his education as some college, and so he got kind of a middling credit card. Now, he could still apply for the other ones, but this was the instant steering in that direction. And we talked about how far it has gone. This was a story about a company that had actually figured out a way to know what cars you were looking at online before you got to the dealership. And essentially, you know, that's the classic situation where you definitely don't want all this information known about you before you get there to negotiate a price. And that's what the pervasiveness of personal data leads to. And we talked about the license plates. This was actually a great story. These are repo men, who drive around and have privatized the scanning of license plates across the country, and they sell a record of where you've been to anyone who wants to buy it. And then of course that led to the government. You know, I started with commercial tracking because my understanding of the world came from covering Silicon Valley, but I eventually realized, partly through Kevin, who was like, hey, you know, there are other people doing tracking out there.
So I started looking into government surveillance. And this was a great story where we obtained some documents from a secret surveillance conference where people sell to the government, and they put up all their brochures about the different ways that they could track people. Then of course, obviously, there was this big story out there that Edward Snowden brought, and I can't claim any credit for that. But, you know, we really learned as a nation how pervasive surveillance was, and everything that I'd written before, I think, seemed a little bit small-bore by comparison to the scale at which the NSA has been conducting surveillance, along with its partners around the world. Just, you know, to remind us all, there was the PRISM slide, which had all the names of our favorite, you know, email providers and services. And then of course this slide, which I can still never get over, which is the slide showing how they were trying to hack into Google's data centers. You know, just shocking revelations that have continued to unfold for almost a year now. And I think we're all still as a nation kind of grappling with what it all means. And that's sort of what led me to write my book. I started the book before the Snowden revelations, but I was already under the impression of, my God, is there anything to be done, right? Is privacy dead? And certainly, this was the question I got from people at parties, you know: I've given up. I can't even fight. I don't have the tools. So really, the question I tried to answer in my book is, can we do anything about this? Can we reclaim our privacy? And so I set about to do it. And I set about to do it not in a sort of "I'm going to go live like a hermit in the woods" way. I tried to do it as a working mother with two children. I need to be reachable by cell phone; I'm not taking extreme measures, right? So what can I do within the realm of the modern world?
And that's really the story, the quest that I set out on in Dragnet Nation. And so, you know, the very first thing I did was quit using Google search, because, I love Google, they have great results, and they seem sort of not evil most of the time. But I just didn't like my search history. It's a record of every single thing I'm planning to do or have done or might do; any thoughts I'm having about anything, I seem to Google them. When I looked at my search history, which dated back to 2006, it was shockingly revealing. So I switched to DuckDuckGo, which doesn't keep any logs. And that means, by the way, that when I searched for the Natural History Museum on DuckDuckGo, they gave me the one in London, because they don't know where I am. And it reminds you of how much Google knows, right? I did this in my hotel room, and they're like, you're in DC, yeah, we know that, right? And so it is a reminder that what you type into the search engine is not the entire input; there's more that goes in. And so, although I end up having to type those two extra words, DC or New York, I feel much more in control of what I'm searching for. I left Gmail; I went to riseup.net, which, you know, is hard for me, because people ask me, should I leave Gmail and go to Riseup? And Riseup is a very small collective. Not everybody can join. You need two invitations to join; I had to sort of lobby people to get me invited. And it has a very small quota. And so this isn't really a plausible choice for most people. But I wish there were more choices for email. This is one thing that is really surprising to me: I think there's most likely a market of people like me who are willing to pay, who want privacy, who don't want their emails read, scanned for content and advertised against, but for some reason, Riseup is the only one offering that right now. I unfriended everyone on Facebook, which was, I think, weirdly more emotional than I expected.
People really took it personally, but I wanted to keep a page, because you sort of have to be found on Facebook. You know, there's just something about it; you just need to have that entry point. But I believe that the list of friends being publicly available is too much of an exposure, particularly as a journalist, but even just as a citizen. We are who we associate with. My friends and I have the same food tastes and music tastes and movie tastes. Of course, I think I'm really individualistic, but my friends reveal a lot about me. And so I just don't want that publicly available. And I also deleted LinkedIn, which was really heartbreaking, because I had this panicked feeling that I would never get a job again. And that might be true. Call me in five years and we'll see what's happened. But I finally just took the plunge. It was like jumping off a cliff. Then I went to try to fix up my internet browsing. So I have been using this Aviator browser, which basically has two of my favorite technologies built in: Disconnect, which blocks all those ad-tracking technologies that I mentioned in the beginning, and HTTPS Everywhere, which encrypts your connection to the websites you visit and is very important. And both of those things you can add on to any browser. I just liked the fact that they were already built into Aviator, and it had a few other features that I liked. But surprisingly, web browsing is one of the things you can sort of succeed at. I managed to feel that I was doing a good job of protecting my privacy. And when I wanted to go hardcore, I used Tor, which anonymizes your location. The problem is it's slower, right? Tor routes your traffic so that the website you're visiting thinks you're in, say, Amsterdam. The problem is you are not in Amsterdam, so your browsing is like going to Amsterdam and back. And so it is slower. So I try to put up with that as much as possible, but sometimes I'm looking for speed.
And then for like super hardcore, I went to Tails, which you boot from a flash drive or a CD. And that means that if your computer is compromised, you can kind of route around it and use Tor from there, which of course meant that I had to buy a clean laptop to use. So I bought a used laptop on eBay to run Tails. And this is how it starts to get expensive and become a luxury good. And then I bought an encrypted cloud service, which actually cost about $200, because I didn't want to store things in Google Docs. And I do believe that if we want privacy, we should pay for it. I mean, in some ways, we have to take some responsibility for the culture of free, where we want everything for free, and then we're like, oh, horrors, they're taking our data. Well, you have to pay somehow, right? So I try to pay when I can. And then I created a fake identity, Ida Tarbell. The real Ida Tarbell was a muckraking journalist from the turn of the century; she's a hero of mine. She wrote all these groundbreaking stories about Standard Oil and how it was abusing its monopoly power, which really led to the strengthening of antitrust laws. So I realized that there were some things that I just couldn't do with technological tools, and I decided to use basically social engineering tools. Having created Ida Tarbell, I got her an array of exciting technology. She has her own credit card. She has her own Amazon account. She has a post office box. She has an OpenTable account. And all of this was legal, by the way. AmEx knows I pay her bills; it's as if she were my child. I just said, add this name to my account, and they said sure. They don't care; I pay. It doesn't matter, right? But it does mean that I have that veneer of anonymity in a regular store or restaurant. If I book a restaurant reservation on OpenTable under Ida's name and I pay with Ida's card, then there's this level of anonymity; the restaurant doesn't need to know it's me, right?
And in the old days, we did all that with cash. But the problem is that we don't use cash anymore. And I wish that I were the person who was going to go back to cash. When I started this project, I thought, I'm going to go back to cash. But oh my gosh, it was such a pain. So I decided, no, the fake credit card is going to be better for me. So those were my success stories, but I had some failures. And actually, I learned a lot about what we need to do to fix privacy from the things I couldn't fix myself, right? So one thing I couldn't do is I couldn't really get out of the data broker industry. These are the companies that collect your name and address and all the attributes they can basically buy about you from public records or commercial records, and sell them. They used to be mailing lists. So you got some junk mail, and it was sort of annoying, but in the end, it didn't feel that intrusive. But now data brokers are selling to, for instance, Facebook, right? Facebook buys information from a data broker and adds it to the information they already have, although I don't know why they need any more information than they already have; they have the world's best profile of you. And the brokers are also selling to these other online advertisers. So it's starting to be a much more robust industry. The dossiers are much richer. And the data is being used for many more things than it used to be. So I wanted to get out of it, but I couldn't. First of all, there was just finding out who the data brokers were. I put together a list, which took about a month, of 200 data brokers. Only 92 offered opt-outs. 65 of them wanted me to send in some sort of ID for an opt-out. And some of them wanted ridiculous things, like a Social Security number, or one company wanted a credit card number, which, well, then it isn't really an opt-out.
You know, I wasn't willing to give them that; it was too much information just to remove my information. And actually, the weird thing was I couldn't see my data. Even more than opting out, I wanted to know what they knew about me, right? It was just sort of this primal need: what do they know about me, right? Only 13 could I get my files from, out of more than 200. And that is just a situation that wouldn't happen, by the way, in most of the rest of the world. We're the only Western nation that doesn't have a commercial baseline privacy bill of rights type of thing that would basically say, commercial data gatherers have to provide you at least this amount of transparency, at least this much of an opt-out. And the Obama administration did try to come up with the Privacy Bill of Rights two years ago, and I don't think it's gone anywhere. But the sad thing about this opting-out situation for data brokers is that now only the bad actors have my data, because, see, I've removed my data from the good actors. The ones who voluntarily offer opt-outs now say I don't exist, right? But, and I don't know who info registry is, they're like, yeah, baby, buy our data. Which one of these addresses do you want? So it incentivizes the bad actors, right? And that is, I think, not a good market situation. The other thing that I really failed at was my phone. So I call it pocket litter, because pocket litter is a spycraft term, sort of old-school CIA, for when you find the litter in someone's pockets: there might be identifying information about that person or their associates, a number they're calling, something like that. And our cell phones are the ultimate pocket litter. They're the best pocket litter you could ever imagine, because they have all of our contacts and everything we've ever written.
And basically, in my case, my location and my text messages. I mean, there's really nothing about my life, I don't think, that you couldn't find out from the phone. But it's not very controllable, right? For instance, I can't put Disconnect or HTTPS Everywhere on my phone. And there are a lot of people who are in the business of trying to find out a lot about you through your phone. So this is interesting: this is the Where Conference; there are a whole bunch of conferences about location data, where basically an entire new industry has arisen in the past year to figure out where people are, and they use the Wi-Fi signal. So if you have Wi-Fi on, essentially anyone, including myself, could just set up some sort of sniffing device, and for every Wi-Fi device that comes by, pull in some information about it. It's perfectly legal. So as a result, I had to turn off my Wi-Fi, because I tried to opt out, but I think there were only 26 companies I was able to identify who did this, and not that many of them had opt-outs, and then you had to put in your MAC address, and it was just super complicated. So I ended up just not using Wi-Fi. I also tried to use encryption. So, you know, this is Whisper Systems' RedPhone; there's also Silent Text from Silent Circle. But both parties have to have it, and I had very little success getting anyone to use it except my children, who loved it because they were like, ooh, secret coded messages, yay. So I Silent Text with them all the time on their iPads, but that wasn't that privacy-protecting. So in the end, I had to get Ida a phone of her own, because I was like, look, I can't control this environment; at least I can have Ida be the one who's tracked. So Ida is now being tracked, but of course that is also completely ineffective, because there was a recent study that said you only need four unique location data points to uniquely identify somebody.
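That re-identification claim (plausibly a reference to the "Unique in the Crowd" mobility study, an assumption on my part) can be illustrated with a toy simulation. Everything below, the function name, parameters, and the model of a "trace" as random (cell, hour) pairs, is hypothetical and much simpler than the real study; it just shows how quickly a few known points narrow a crowd down to one person.

```python
import random

def simulate_reidentification(n_users=10_000, n_points=200,
                              n_cells=1_000, seed=1):
    """Toy model: each user's location trace is a set of (cell, hour) pairs.

    Returns {k: how many of the n_users are consistent with k known
    points from user 0's trace}. As k grows, fewer users match.
    """
    rng = random.Random(seed)
    # Random traces for everyone; user 0 is our target.
    traces = [
        {(rng.randrange(n_cells), rng.randrange(24))
         for _ in range(n_points)}
        for _ in range(n_users)
    ]
    target = traces[0]
    # Five points an observer happened to learn about the target.
    known_points = rng.sample(sorted(target), 5)
    results = {}
    for k in range(1, 6):
        known = known_points[:k]  # nested prefixes, so counts only shrink
        results[k] = sum(1 for t in traces
                         if all(p in t for p in known))
    return results

if __name__ == "__main__":
    print(simulate_reidentification())
```

Even in this crude model, one known point is typically consistent with dozens of people, but by four or five points only the target remains, which is the intuition behind the study's finding that a handful of spatio-temporal points suffice to single someone out, and why Ida's phone offered only a thin veneer of privacy.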
So anyone who looked at this data would very quickly realize that Ida and I have the very same fact pattern, and are in fact the same person. So it's a very thin veneer of privacy. So then I went to the very sad, sad state of putting my phone in a Faraday cage, which is essentially a bag, and that bag is lined with thin metal, and that prevents any signal from coming in or out. Of course, then I can't use my phone either. So we've really come to the very bottom of the rat hole of privacy, which is exactly where I didn't want to end up, which was like living in the woods in a tin hut. Essentially, putting my phone in this bag is like putting my phone in that tin hut in the woods, and that's not where I want to be. So in the end, I found that there were just some things I couldn't buy my way out of, no matter how much effort I put in: the data brokers; my phone, where I couldn't get any privacy software installed; and my encryption, which I couldn't really be sure was working, because unless I used that one clean laptop that I bought on eBay, which I don't always have with me, I couldn't be sure that my endpoints weren't compromised. And in fact, even that laptop is probably compromised. And in the end, privacy is also an ephemeral good. I couldn't really tell if all my services were working. I thought they were; I tried to test them as much as I could. But I really ended up feeling that this was a false choice. People say, we've chosen to give up our privacy. And I say, after a year of trying to get out of it: you know what, you can't make the other choice. I tried to make the other choice. And that's why I ended up writing this op-ed, "Has Privacy Become a Luxury Good?" Because when I added up what I spent last year, I spent almost $2,500. That is a lot of money, and for a good that I don't know if I got, right? And so the question is, is this the world we want to build?
You know, that's really the question I have for everybody reading this book. Is this the world we want to build? Let's make some considered choices about where we want to be in our dragnet nation. So that's my presentation, and we can have our chat. Oh, okay. Well, Julia's quest reminded me of a very funny, very true, very depressing piece of comedy, as are many pieces of content generated by the Onion, that we'll play right now. About your privacy while using Google: the Internet giant says it understands. Google is now offering users a chance to opt out and live in privacy in a remote mountain village. Tech trends reporter Jeff Tate has more. Thanks, Teresa. They call it the opt-out village, and it's just what you'd expect from Google. If you want to keep your information private, all you have to do is move to our 22-acre opt-out village and not speak to anyone from the outside world. It's very simple. Just go to the Google front page, click the opt-out button, and in minutes a van will come to your house and pick you up. That same day, a team of Google privacy experts eliminates your home address, guaranteeing it will no longer appear on Google local pages. And after just two days in the back of a van, you're there in the village. We can guarantee that there's no chance of Google reading your emails, because there are no computers. And because they're also monitored and tracked by Google, there are no banks or hospitals. Residents will be expected to know how to grow food, suture wounds and bury corpses by hand if they plan to opt out. And Google has gone the extra mile to ensure that users who choose to opt out are given complete privacy in their new home. A 30-foot-tall, 10-foot-thick physical data security wall keeps all former Google users from leaving the village until they decide they want to start using Google again.
The opt-out village can't even be seen by Google satellites, because the entire town is enclosed in a large metal box with no openings. Google says those wishing to opt back into using Google after their time in the village will be allowed to do so if they agree to be branded with a whimsical G on their foreheads to label them doubters. If you don't want to give us complete access to your most private thoughts and feelings, that's fine. You can just toil in the hinterlands and die young. And Carter says the opt-out village is already getting rave reviews. One of the first village residents sent this letter praising the total privacy inside the village, saying: all alone, no light, hard to breathe. Now that's one man whose data is secure. For the Onion News Network, I'm Jeff Tate. Thanks, Jeff. If you have any questions about the opt-out village, just type them in an email to a friend; Google will get back to you within 24 hours. And in just a minute: is your child missing out on teen sex parties? So, as with your quest in this book, I also love your modified Google motto: not evil most of the time. As you point out, we give Google a hard time, but at least we mostly know what they're up to, while there are a lot of actors we don't really have much idea about at all. But so, you went to a lot of trouble and spent a lot of money to try and enforce some level of privacy in your life. And I can imagine people asking, and I certainly get this question all the time and it drives me insane: Julia, why should I care? I don't have anything to hide. Why should I go to all this trouble to protect my privacy, so that this marketer doesn't know where I am, or so that the NSA doesn't sniff my emails of recipes from grandma? I have nothing to hide. Why should I care? I know. This is my favorite question, as you know. It's convenient, because I live with a person who says this to me. My husband is completely uninterested in privacy.
He's a professor, and he says, if only they would read my research papers, it would double the audience. Please, I'm begging you, NSA. So my answer to that question is that I think people don't realize what it means to say you have nothing to hide. Do you really want to have the same conversation with your boss, or with your child, that you have with your mother? These are just normal things: you phrase it a little differently, or you have a slightly different posture with these different people. And that's normal human behavior. And so the idea of nothing to hide is sort of ridiculous, unless you want to tell me that you are the same person across all those contexts. And then I have to ask you, who are you? But there is also a more serious answer to it, which we can go into, which is essentially that I believe there is a deeper reason why we should care about privacy, which is that it's about us as a society, right? It's a collective issue: do we want to make spaces for there to be confidential conversations? Even if you yourself don't want to have one, shouldn't there be places for them? I mean, so I definitely take your point about contextual privacy. We actually do show different faces to different people, and participate in different ways and different roles in our lives. And, you know, even if Mark Zuckerberg thinks that's a sign of dishonesty, that's actually the way human beings typically work. And I take your point about the societal issue. Frankly, before my stock answer, I'll share another stock answer, from Jim Dempsey at CDT, who is also on the Privacy and Civil Liberties Oversight Board. His stock answer to that question is: oh, you don't have anything to hide? Then hand me your credit card and drop your pants. Like, just go. But my stock answer, particularly at cocktail parties, is one where I'm only half joking. Actually, no, I'm not even joking.
When I'm asked this question, I say, well, you know, it's not all about you, you narcissist. It's not about your personal privacy. It's not about whether you're creeped out. It's about, from my perspective as a civil libertarian, maintaining the conditions for a democratic society. I mean, we don't have to look that far into history to see how people have used imbalances in information, used surveillance, often unlawful surveillance, not just to violate people's personal privacy, but to try and disempower political movements, to undermine the anti-war movement, to undermine the civil rights movement; people in power using power over information to try and stay in power. And so it's not actually about someone, you know, getting their jollies reading your private email. It's about the systematic imbalance of power that imbalances in access to data can bring, which is actually a perspective that you've been looking at as well. Yeah. I mean, one thing I absolutely love about your book is how it starts documenting categories of harm, you know, really expressly showing how everyday people are being affected. I think one thing that's interesting is to think about whether the language of privacy even starts getting at the full spectrum of harms. I mean, one of the most obvious questions is, what are the market effects of these pronounced informational asymmetries that are being created? I come at this from the commercial side, kind of looking at what it means for a lot of this data to be consolidated within a few players. It reshapes the balance of power both between companies and consumers, but also among companies. And just looking at, for example, recently, Monsanto and DuPont and John Deere have been moving into data analytics.
And so that's a situation where all of a sudden, you know, combines and tractors on the farm are suddenly transmitting all this information to just a handful of companies. And so farmers are all kind of looking around. And it's really interesting, because they're not talking about any of this in terms of privacy. They're thinking, you know, what does it mean for my business partner to suddenly know in intimate detail what the economics of my farm are? So yeah, just kind of thinking about how do we talk about this in new ways to try and get some of those other issues across. So, black boxes in tractors, as well as in our new cars; I did not know that. Thinking back to the first time we met, which was in 2011, when you were turning from commercial privacy to government surveillance: I remember specifically we were trying to puzzle out, Ron Wyden keeps saying there's some secret Patriot Act. He keeps intimating that there's something going on, really massive, that we'd all be really freaked out about if we knew, and yet he can't tell us what it is. And I remember theorizing about that and talking about Section 215 orders, which now the entire country has heard of, and sort of giving a crash course on ECPA, the Electronic Communications Privacy Act, which is the main law that governs law enforcement access to this data, and that we've been trying to update for quite a while. Actually, if you're an ECPA nerd, you'll understand why I'm very pleased and honored that I'm the person who introduced Julia to 2703(d) orders. If you're not an ECPA nerd, you're probably thinking, wow, what a nerd that guy is. But it's kind of important stuff. It was a momentous moment. And so there we were, puzzling in 2011: what is Ron Wyden talking about? Flash forward to last summer. We and many of our colleagues in the privacy space are in Berkeley for the Privacy Law Scholars Conference.
And we're all sitting in this nice hotel ready to, you know, share our papers with each other and talk theory. And then suddenly the world falls in on us and these Snowden docs come out. The first day, the night before the conference, the revelation that the NSA was getting all the telephony metadata from all the phone companies. Then the next day, news of PRISM, which was the so-called downstream content program where they're getting a lot of data from your Googles and your Yahoos and your whatnot. This happened when you were well into the book at that point. How did the NSA revelations change the process of writing the book and the conclusions you came to? And how do you think it's affected, you've been reporting in this area for quite a while now, how do you think it's impacted the narrative as a whole and the debate as a whole? Yeah, I still remember where I was standing when those revelations came out, because it was so stunning. And it was, to be completely candid, just heartbreaking for me, because I had been chasing that story for two years and I was like, wow. But it was also so reassuring because, to be honest, I had already decided to call my book Dragnet Nation and I'd already decided to focus on what I called widespread, indiscriminate, ubiquitous tracking, right? Surveillance when I'm not a suspect. And, you know, I actually had a lot of commercial examples and some government examples, but boy was I able to beef up that section. And it made me feel less paranoid. I had thought to myself, you know, well, am I right? Is it really that bad? Like maybe it's not that bad, you know? And then it was so much worse. It was jaw-dropping even for people, maybe you weren't surprised, but I have to say every week I continue to be surprised. I'm like, no, really? They're really doing that?
Like they really need the data that Angry Birds is sending to advertisers? I just wouldn't have thought of that, right? So it's been really helpful because it makes me feel less crazy. But I also think it's been a little bit hard, because people are numb, right? When I started this book, people would say to me, I just feel like I've given up, and that's like times two now, because people feel there's nothing they can do in the face of a state actor who has access to everything. It seems like, at least on the commercial side, so much of the dragnet is tied into just the economics of so many of these services, right? We've all kind of unwittingly, you know, signed up to hand over reams of this data to get these free services. I mean, I guess one question that comes to mind is, as we try to recover this data, can we do so given the current economics of, say, Google or Facebook, given that so much of their business is premised on this? And what could that look like? I mean, is that something that came up at all? Yeah, I mean, I try to pay for services because I do think that gives me a clearer sense that I'm the customer. But even then sometimes, you know, your data's being sold. At my previous employer, the Wall Street Journal, there were just as many tracking devices for people who paid to log on as not. So it doesn't always get you the answer, right? So I think of it this way: if we had assurances that our data wasn't going to be used in ways that would really hurt us, it would be easier to make this transaction. Right now, the problem is you're trading your data for a service, but you don't know the cost of your data. So it's like the bond market. It's completely opaque. So you have no idea whether that's a good deal or not.
Because if it means that the next time you go to the car dealership they're jacking up your price $2,000, then maybe it wasn't worth it. Maybe you'd rather pay 50 bucks for that service, right? So the fact is we just can't price our data effectively. And I think that's why this trade seems really unfair and scary. Thinking of the NSA stuff and what was expected or not expected, surprising and not surprising, or where you were at the moment: I remember I was in San Francisco for this event and visiting my old offices at the Electronic Frontier Foundation, where I spent years litigating over the last set of NSA revelations. And, you know, we were collectively beating our heads against a wall of secrecy. And so to have that news drop at that office at that moment was really quite something. And I'm hoping and praying that that litigation goes forward and is much more successful now that we have a lot more information in play. To some extent, I wasn't surprised. It was sort of more like confirmation. Like, yes, we've been saying this all along. We've been saying they've been getting all the phone records since 2005. We've been saying that. Yes, they're sitting on top of the internet backbone and sucking everything up and filtering it based on some weird formula that includes looking at all of those records. You know, we knew that. What we didn't know, and what's been really shocking in a fundamental, philosophically disenchanting way, is how much they have compromised the basic infrastructure of the internet. How they have compromised the encryption that you are relying on to try and reassure yourself that you have privacy.
How they have installed bots, compromised machines all over the planet, that are watching significant portions of the internet and are able to see you reaching out to Facebook or reaching out to LinkedIn, and then jumping into the middle of that transaction and inserting malware into your computer. Or, say, taking the tracking cookies that those companies or other companies are using to track you for ad purposes and using them to do their own targeting, saying, oh, we know that person is using that DoubleClick cookie. Now we know to look for that DoubleClick cookie on the network wherever we see it, and to jump in and attack that computer. So that makes me think of something that a lot of people call the surveillance industrial complex, which is this sort of meshing of private data collection and government surveillance. And I do have some quibbles with a few people who bring this up, because they often say, well, it is the business model of the internet companies that is enabling this surveillance. But I have not seen any revelations yet that, say, this type of record that is generated just because they serve ads was targeted, or that search logs were targeted, or this or that. It is usually either transactional records, like phone records that they were already collecting, or it is data that we have chosen to store in the cloud. But the one example where I have seen that innovations in ad tracking have actually facilitated NSA surveillance is this: them using the cookies companies use to track us for the NSA to track us as well. Did you see many other similar synergies between what the companies are doing and what the government is doing in your work? Or how do you see the relationship between the two? Put another, stupider way: who should we be more worried about, the companies or the government? So I kind of came to the conclusion that you can't separate out the two.
And I'll give you an example of why I say that. In order to register to vote, you need to tell your state your name, your address, oftentimes your birthday, your party, various pieces of data. Most states sell that list to commercial data brokers. Those data brokers then buy additional data, match it all up, build up a nice big dossier about you, and then they sell it back to the government. Many times law enforcement buys these files, or the fusion centers are using this type of thing. And politicians actually buy this data for their campaign targeting. So this is the cycle. And this is true for the tracking cookies. Google puts a cookie on for one reason, which is sort of not evil in some Googly way. And then the NSA piggybacks on it. And then it comes back around. And I feel that they're basically inseparable. And that is actually part of this problem of pricing your data. You can't price your data appropriately as a commercial transaction if all of a sudden it's actually being used for governmental transactions, which have a much higher price because they can land you in jail, right? And so then all of a sudden we have this problem, which is that we don't know what to do about our data. How much is it worth to us? And that's the sort of national angst we all feel about our data: we know it's worth something, we don't really have the tools to protect it, and we don't know how bad it can be. And that's why everyone feels anxiety. Well, I wonder, you know, perhaps this big data process will solve all of it. As many watching probably know, President Obama, in his speech about the NSA revelations in January, announced a 90-day process whereby the White House, and particularly the Office of Science and Technology Policy, was going to review the issue of big data and the economic and privacy and policy implications of big data, which is basically lots of data.
And the ability to find, you know, non-obvious patterns when you have lots of data, and what that means, both good, in terms of greater productivity and efficiency and discovering new things that are helpful, and bad, in terms of unexpected impacts on privacy, discrimination, and the like. I'm curious if you have any thoughts on that process, if you've been watching it from afar or have any opinions on it. I certainly have a few, but I'm curious what you think. Well, I participated in the NYU big data roundtable, and the point I made there actually was about journalists. I think one really big concern I have about surveillance is that it's already having an impact. Let's say you don't have anything to hide, right? Everyone in America, totally clean. My profession is in a crisis, right? We cannot promise confidentiality to our sources, because everything leaves a digital trail and the legal standards for getting that data are low. And so we can't get good stories, and so we can't be watchdogs for democracy. And that's actually a big-deal problem for our profession. So I was raising the issue that there are some things we do as journalists to police the government that should be protected, so that we can watch the watchers. Big data might be fine, but somebody has to be providing oversight. And we're already hampered in our ability to provide oversight. So I was actually making the argument that one of the biggest problems is the Computer Fraud and Abuse Act, which criminalizes some of the work that I did at the Wall Street Journal; under some prosecutor's reading of that law, it could be illegal, right? What weev went to jail for is actually very similar to what I did, which is essentially iterating a bunch of URLs over and over again to look and see what data is available on publicly available websites.
And so simply criminalizing and surveilling journalists is actually a problem that I think impacts everybody in society. It's what keeps our democracy strong. I know that's not the answer you're looking for, but it's something I just feel really passionate about. I mean, I'm certainly sympathetic on the issue of the overcriminalization of security research and privacy research, which is something we're very concerned about, and just generally the issue of the CFAA, the Computer Fraud and Abuse Act. We saw Aaron's Law introduced last year in the wake of Aaron Swartz's death to try and reasonably narrow the scope of CFAA liability. And although that bill didn't address it, I think it still could and should include some sort of sensible exception for security research. I mean, in many ways right now we're seeing research that would help us protect our privacy and security, you know, and make cybersecurity better, being chilled by this law that is supposed to be protecting our cybersecurity and privacy. And that's not an ideal policy result. That's really interesting, because it seems like what you're describing is, you know, how what we need are new laws, but if anything, the laws are going in the wrong direction. And I think we're seeing the exact same thing with corporations. I mean, recently corporations have started using First Amendment claims to try and say, you know, we have free speech rights. A couple years ago there was that case where there was this health data firm that was scooping up pharmaceutical medical records and selling them to pharmaceutical companies. Sorrell. You know, a few states said, hey, this violates privacy, let's ban this. And the companies came out and said, nope, we have a First Amendment right. And we've also started seeing, you know, companies insert arbitration clauses that limit how citizens can hold them accountable.
And so it just seems like, you know, at the very moment when we need new laws empowering us, we're actually seeing those laws go in a completely other direction, both with government, it seems, and on the corporate side. Yeah. I mean, the First Amendment issue is particularly troubling. We see this overreaching argument about how privacy regulations are not constitutional. We've seen arguments about how net neutrality regulations are not constitutional, ISPs saying, no, we have a First Amendment right to block your speech if we want to. It's not an easy question, though. You know, as a privacy and a free speech lawyer, I see there is a tension there: whenever you try to strictly or broadly regulate the distribution of true facts, that does raise First Amendment problems. Which is why I think that if you had a really strict data protection regime in the U.S., like you have in Europe, it might not survive the First Amendment. But we did have a relatively moderate version of that in the form of the administration's blueprint. They issued a consumer privacy blueprint two and a half years ago that embodied basic fair information practice principles, like notice of what's being collected about you, and the ability to see what's being collected about you and correct it if it's wrong, and a variety of other sensible notions like that. We never saw any legislation come of that. We never saw the administration really push it very hard. But what we have now seen is another process to look at the problem again and see what things we might want to add to the list of things we haven't gotten yet on privacy, which to my mind is a bit dubious. In terms of the big data process, I find that it did come out of the NSA controversy, and suddenly it's like, well, yes, the NSA is collecting a lot of data.
Let's start a process to look at the issue of big data outside of the context of government surveillance, because, you know, the companies are collecting a lot of stuff too, so you might want to go pay attention to that. You know, in the way that it was born, it seemed to me kind of like a transparent attempt to distract attention from the NSA issue, and perhaps even to split the emerging coalition of privacy groups and companies that are starting to work together to try and reform the NSA. Despite those questionable origins, I'm hoping it will be a useful process, and it's been auspiciously timed, because around the time that process began, a group of civil rights organizations was just putting the finishing touches on its own principles around civil rights in an age of big data, because big data doesn't just raise privacy concerns. It raises traditional civil rights concerns about discrimination against disfavored groups and underserved communities. We haven't traditionally seen the NAACP or La Raza or other traditional civil rights groups really getting in the game on this, but now they have a new set of principles, and they've been involved in the big data process. Your book contains a lot of interesting examples about how big data, whether it's a privacy issue or not, raises a power issue and can foster discrimination or reinforce stereotypes. One of the examples was about price discrimination: people were often being offered higher prices if they were farther away from a store, because being farther from a store made them more likely to pay more. But that also typically meant it was the people in the poorer neighborhoods, not near stores, as opposed to the rich neighborhoods with a big box store everywhere, who were actually getting charged more. The people with less money were getting charged more. Another example was Internet searches.
When you search for a name that sounds like a black name, you'd get all these ads for, oh, you want to look up their arrest records? You would not get that for the white-sounding names. It wasn't correlated with whether the person actually had an arrest record; that was the amazing thing about the study. It was correlated with whether your name sounded traditionally African American. It was all these ads for, like, buy their arrest records, buy the mugshots. So, putting aside traditional privacy concerns about misuse of the data, can you speak to that imbalance, and to the risks of discrimination and the sort of civil rights issues raised by it? Yeah. I mean, basically, I framed this in my book as freedom of association. One thing that arises from the First Amendment that is not written in the First Amendment is freedom of association. And the test case was NAACP v. Alabama. Alabama wanted to get the list of members of the NAACP, and the Supreme Court said no: there's a freedom of association right to be a member of a group and to have that membership be confidential. And the thing that's amazing to me about big data is that it is all about building associations, right? The whole point of big data is really that you don't know what you're looking for. You basically get a whole bunch of stuff and then you see what's associated with what. That is sort of the intellectual premise of it. And then you can find amazing and interesting and innovative correlations. But it also means that our ability to associate with people, places, and ideas is being collected and analyzed in ways that could end up being very discriminatory.
And of course, the story I tell in the book about this that moved me so much is not a commercial story, but a story about a young Muslim American man in the Bay Area, in Santa Clara, California, whose best friend from childhood wrote a kind of silly post on the social network Reddit, making fun of the TSA for taking deodorant and throwing it in the bin. Remember there was a period of time when they were against deodorant? Now they seem to be fine with it. But it was like an orange alert for deodorant at that moment. And so he wrote a post saying, you know, why are they doing that? Why are they throwing away my deodorant? I could just walk into a mall with a duffel bag and blow it up, right? Perfectly factual statements, but provocative. So a week later, his best friend takes his car in for an oil change, they put it up on the lift, he looks underneath, and he's like, what's that thing hanging off my car? And being a kid in his early twenties, he's like, I will post it on the internet, and the internet will tell me what it is. And the internet did. They were like, you are being tracked by the FBI. And the next day, the FBI showed up at his door and were like, can we have our tracker back? And he was in that unique position, which rarely happens, of getting to confront his surveillers and having them confess to surveilling him. And he said, why are you doing this? And they're like, well, your friend wrote this dumb post on Reddit, you know. And that was exactly the bad thing that we don't want to happen with freedom of association, right? He didn't have to join the young Muslim American men of Santa Clara to be affiliated, right? Just because his friend writes kind of provocative things, all of a sudden he's on the list. And you know, to this day, he gets detained at the border for extra screening. And that is a cost that he pays for this crime that was not a crime, that nobody committed.
Well, speaking of costs and GPS, that's a good segue. You've worked a lot with Ashkan Soltani, who's an independent technology consultant. He did a lot of work with y'all at the Wall Street Journal, and he's now working with the Washington Post on their NSA stuff. Ash and I recently wrote a paper on the falling cost of surveillance, specifically in regard to GPS tracking, looking at how it used to cost thousands of dollars if you wanted to follow someone for a day, and if you wanted to follow someone for 28 days, it would be astronomically expensive to send a team of cars following someone; comparing that to the cost of, say, a radio beeper from the 80s, which is what the case law was about back then, that sort of let you know if you were within a mile or so of your target and which direction they were going; versus the GPS trackers at issue in that case, and in the U.S. v. Jones case at the Supreme Court a couple years ago; then finally to tracking your cell phone, which at this point could basically be done for cents a day, depending on how long you're asking for. Looking at that exponentially dropping cost, and how it impacts our privacy: is that something you saw in your book? I recall some mention in the book about the falling price of storage, the falling price of being able to collect data, and how basically it's enabling mass surveillance that never would have been possible before. I mean, the best example in my book is actually when I went to the Stasi Archives in Berlin, because I went there and I thought, okay, let's see what total surveillance looks like.
Everybody says that in East Germany during the communist regime, you know, everyone was scared to talk and the Stasi were everywhere. And I went and I sort of filed a freedom of information request, and I got a couple of files, I translated them, but I also spoke to the archivist there, and I was shocked to learn that the Stasi had to work so hard, but they only had files on one quarter of the population. And so this perception of fear that they're watching everyone, they actually managed to generate that with a fraction of the surveillance that we can assume is happening now. I mean, the NSA revelations have already clearly demonstrated there's something on all of us in those files, and that wasn't even true for the Stasi. And they had to work really hard; the reason was they had to steam open the mail, they had to listen to the actual calls, their technology just wasn't good enough for scale. So then the question is, okay, we're in a situation where the state has all this information, but of course we're not communist East Germany, we aren't throwing people into jail for dissent, hopefully. But what will it take to make sure that won't happen? And I think that's really the debate we need to have. And the real answer to that question is always oversight, right? And when you have a situation where Congress says they were shocked at what was revealed about the NSA, where the author of the Patriot Act was like, that's not actually what I meant you guys to do, and the public was, you know, shocked, I think we can agree that we don't have appropriate oversight for this mass surveillance system, a tool of power that we have never seen a state have before. I think I agree.
I mean, you mentioned the Stasi, and I remember I've seen the graphic you use of, like, a hand-drawn social network they had, and if you compare that to, say, what's on LinkedIn or Facebook, it's like comparing finger paints to Picasso. I mean, it's just the sheer difference. Which actually leads me to a very specific practical question. You unfriended Facebook, you deleted LinkedIn, but there's still Twitter. So what about Twitter? I mean, I think about all the people I'm following on Twitter and who are following me, and, you know, it's an incredibly powerful tool to advocate with and to share opinions with and, you know, to strengthen relationships with journalists like you, but at the same time it's painting an incredibly intimate portrait. I mean, one can imagine some really in-depth social graph analysis on just Twitter that would be like, okay, those are the ones to round up. There was actually a great example: someone did a social network analysis of all the folks around the time of the revolution who were tied to all the different social clubs in the colonies, and you look at this network and there's, you know, a bunch of folks at this club here and over here and over here, and in the dead center of it is Paul Revere. I mean, really, if the Redcoats had been able to do social network analysis pre-revolution, we'd all be having tea this afternoon. Yeah, no, Twitter is a failing of mine. I really love it. It's a great way to broadcast. The way I've rationalized it is that it's a one-way relationship, right?
Both Facebook and LinkedIn require confirmation of, theoretically, a friendship or a business connection with someone, and the truth is I don't know who most of these people following me are, right, other than you and you. So I decided that it's opaque enough about who I'm really friends with; that's how I rationalize it. But it's true that it does reveal information. That's the problem: everything does, right? You have to give to get. And so one challenge has been, right, like, marketing the book is hilarious. It's not exactly privacy-friendly to be out here everywhere with my picture everywhere. I wanted my publisher to put the stencil picture on the book jacket and they refused. How do you get one of those? Oh, I just paid five dollars on Fiverr.com. It's just like a five-dollar tax. Okay. But five dollars, a few of those things, they start to add up. You got to $2,500 for the year. I mean, a lot of that is the burner phone, right? Because that's a monthly payment. And the portable Wi-Fi, you know, my MiFi, which is monthly. But yeah, it added up. My jaw was actually on the floor when I added up that number. I couldn't believe it. And so what do those of us who don't have that kind of money do? We'll be doing questions later. I mean, this gets to privacy as a luxury good. Is it reasonable to ask people to do that? And if not, then what's the answer? And if the only way we're going to get free email is with our data, I don't know, how do we make that bargain? Do we accept that bargain? Do we renegotiate? How do we renegotiate? Or do we accept that if you want privacy, you're just going to have to spend money? I think of it like cars. I basically feel like cars are really dangerous, but we get into them every day. And the reason is because we have a baseline understanding of what the safety measures are.
And then you can buy your way up the chain, right? You can buy your way to the Volvo, and then you're like, I'm sure nothing will ever happen to me, right? Or you stick with the little rattletrap like I have and you take your chances. But at least you kind of know you have a baseline. And I sort of feel like that's the goal for privacy: we should agree that everyone should have a certain amount of privacy. Maybe we decide that that's just for children and health and financial data, or maybe we decide on a baseline level, but we agree collectively on what we think are the right wilderness areas to wall off. And then, this is capitalism, you buy your way up the chain if you want more. You know, go for it, spend the $2,500. One practical question I had when I was reading your book. I mean, Kevin mentioned, you know, how you guys found that Staples.com was charging different people different amounts for the same stapler, depending on, you know, whether you lived close to competitors. I mean, how precise can companies get at this point? Do they have the technological capability to charge all of us different prices based on, say, our income or education? You know, how close are they to that? Yes, they have the capability. This is what I do: try to chase down where they're doing it. It's hard to do, but we have seen they're getting more and more sophisticated. You know, the credit card companies, we wrote that story about Capital One tailoring its offers in 2010. They continue to do that and add more granularity and more information. The truth is, right now it's technologically possible for every one of us to see an absolutely different price on the same website and never know the difference, right? The challenge for me as a reporter is that it took nine months to prove Staples.
We had to build a team, and we had to eliminate all the other possible explanations for the price differences. And time is a huge element, right? We had to capture the prices at the exact same time to prove that the difference was really happening at the same moment. It's a big technological challenge to do this kind of reporting, which is why I'm so passionate about not criminalizing that kind of reporting. I was at the National Retail Federation, they have this big annual conference in January, and it was really interesting to see how even brick and mortar stores are now trying to move into price discrimination in more subtle ways. They were developing these tools to, you know, track your eyes as you're looking at products and then, you know, send you a coupon in real time, so that if you're in the chip aisle, they'll send you a discount for the salsa. Different ways to actually arrive at the same kind of price discrimination, but through more subtle mechanisms, which again, yeah, is going to be really tough to track down. I write about one company in my book that is developing facial recognition technology that is being used at a national retailer. I have not yet been able to learn which one. But basically, when you enter, they pull up your face, and then all the sales associates get whatever data they have on you. And their dream is to identify the best shoppers, to give them the best service, and to identify the shoplifters, to kick them out. And of course, I know that this isn't going to end well for me, right? I'm going to either look like a shoplifter and get kicked out, or I'm going to look like the good customer and get charged twice as much. Either way I lose, right? And so this is what I'm worried about: once facial recognition technology gets really good, I think this is going to be a really difficult situation. I mean, you can imagine a system.
I don't think they're building this yet, but, I mean, looking at Facebook, you know, they have Facebook Connect for apps, and they'll share a lot of data with advertisers that doesn't individually identify you but allows advertisers to target you based on your unique interests and tastes. One could imagine you walking into a store that is Facebook-affiliated that would see your face and send it to Facebook, and Facebook wouldn't say who you are, but would say: they like this, this, and this, and they make around this much money, go for it. And I think we're very close to that. And I don't see anything stopping it except for people getting really freaked out. If they even know about it, which is again why it becomes important that folks like you dig it up before it happens, or as early as possible. That also, I feel, gets to another interesting point in your book, which is, you know, that the whole idea of anonymizing data only gets us so far, right? Because at the end of the day it doesn't matter if Facebook isn't giving them your name, if they're able to link this stuff, right? So in a lot of ways you end up finding that privacy policies often just give us a veneer of security, right? There are all these ways to circumvent them, right? Yeah, I mean, what is anonymity if they know everything about you? They know that at 10 p.m. you want to eat Cheetos, and then they sell you Cheetos at some extreme price at that time. It doesn't matter that they don't know your name. Well, there's also, you made this point in your talk, how at this point location data is itself personally identifiable, even without your name. I mean, there are very few people who are going to be sleeping at my house and working at this office. Right. In fact, there's only one, I promise. There's only one.
And, you know, it just takes those two bits of information to be like, oh, that's Kevin. Right. So what can you do? Actually, I guess that leads to my final question before we open up to the audience. When you talked about this book at Privacy Law Scholars, you said your hope for the book was that it would be like Evgeny Morozov with hope. And if you don't know Evgeny Morozov, he's an author who's written several books taking aim at internet exceptionalism and this idea that the internet's going to solve all of our problems, and instead highlighting how it's becoming a tool of control. And I think you wanted to highlight that as well, but also, in a way that Evgeny in his very fiery and entertaining way does not, evoke some hope about a path forward. Right. So, have you found hope, Julia? Well, I have to say that one of my reviews said this was really good, but this woman is irrationally optimistic. And I think that is actually true. I don't have a rational basis for my hope, but I'm a hopeful person. And I also think this is the kind of problem we have solved before. We are a nation of innovators and doers. And that's the thing I love about this country. And one other thing about this country is, we are rebels. Every hero of every movie is a rebel who fought against the man. Right. And so this is the kind of issue we love. It's a rebellious take-back-our-data issue. And, you know, it reminds me of environmental problems. You know, we haven't solved all of the environmental problems, but that was a collective issue. You could never prove that your particular cancer came from that particular plant. All you knew is that the air was dirty. Right. And eventually, you know, we decided to clean up the air and clean up the water. And we did it, it took a long time, but we did it through laws, the Clean Air and Clean Water Acts, but we also did it through social norms. We started recycling and we started picking up our dog poop.
And I think this is going to take the same type of thing. Like, we're going to have to be a little bit more thoughtful about the tech tools we use; maybe choosing DuckDuckGo is a little bit of a vote in terms of privacy. And then at the same time, we're probably going to need some collective action, like laws, to set some baselines. And so I have hope. These movements take like 50 years, though. I thought your latching on to Ida Tarbell was so apt in that way, because, you know, as we were transitioning from the agricultural to the industrial economy, we ended up passing a whole body of laws to regulate railroads, you know, and create the Federal Trade Commission. And it seems like that same movement we've yet to see as we've entered into the information age. Correct. And I chose her explicitly for that reason. And it's worth remembering that that movement took 50 years. And a lot of child labor happened along the way. It's sad, you know; change is hard. Indeed. On that hopeful note. You might want to pick another metaphor, because there is still this issue of global warming that might end all of us. But we're going to solve it. Yeah. So do we have any questions from the audience? Yes. Hi. So, a comment and a question. First, I actually love the analogy between cars and privacy, the idea of getting a baseline. And so it made me think in some ways, and maybe you are this person or there's someone else, of sort of a Ralph Nader of privacy, right? A person who is going to ensure that every single car, like, we know all cars have seat belts, right? That is the basic thing we can expect. There are certain safety innovations that will be in anything you drive. And I like that idea in terms of thinking about privacy, sort of that there should be baselines that anyone could expect regardless of, like, whatever, you know, crappy website or whatever it is that you're using.
The question that I had is thinking about the Capital One example, right? So, you know, they crunch all this information about you and they decide that this is the credit card that you get offered. So there's a part of me, sort of the civil liberties part and the civil rights part, that thinks, oh, that's too much information about you, and they should just, you know, send you whatever offer they're sending somebody else. On the other hand, you know, it's not quite price discrimination, right? And you can apply for other credit cards. And so I'm sort of curious: I have a little bit of a knee-jerk reaction to it, but at the same time it's not clear to me that it is actually as problematic as a lot of the other stuff. So I'm sort of curious to hear more about that, and if you think there's a better way. Like, is the answer just that the credit card companies either shouldn't be collecting the information, or really shouldn't be making any decisions about their services based on it, or is there something in between? Well, I think that's a good question, because with Capital One, the credit card industry is one of the few industries that actually is regulated. It's not a privacy law, but the Fair Credit Reporting Act actually is one of the few laws that gives you some rights over how your data is used against you. And so the truth is, if they showed you a lower-tier card and you applied for a better card and you got rejected, they actually have to send you some information about why, and you have the chance to dispute it, and you can look at the data underlying your credit score. And so I think that you're right, that is a gray area, because you can still apply for any card. But I guess I would like to answer the question about price discrimination in general, because actually a lot of economists say that's awesome, right?
There's a huge economic theory around the idea that you should pay what you can bear. If I can bear $5 more than you can, then I should pay it, because that's perfect pricing. And I think that's the debate that's going to end up playing out: whether that's just awesome. Let's have a perfect-pricing world. And that's where I really think we have to redefine what we mean by redlining, because right now we think about redlining as something like a neighborhood that shouldn't be discriminated against, or a racial category of people. But actually what's going to happen now is redlining around my head, right? I am being discriminated against. And so weirdly we're going to have to rethink discrimination, because it's going to be so individualized. And so actually, to me, it's a bigger question than that, which is: how do we understand individualized discrimination, when we've always thought of it as a category? And that's where I get into oversight. Essentially, the ability to police this stuff and to write about it as journalists, I feel, is so important, because I think in the end the public has to make that choice. Is this reprehensible? Because actually we are pretty good; we make mistakes, we put the Japanese in internment camps, but generally as a society we've been pretty good about reacting to things that are just discriminatory and trying to correct them, in most cases. I mean, I'll just say, I think this goes to the heart of how so much of this is not about privacy. So much of this is actually asking really hard policy questions that we haven't answered before, about fairness. If we're talking about what economic opportunities are available to you, whether you should be able to get credit on these or those terms, or whether you should be able to get this job, what criteria is it fair, and as a policy matter a good idea, for us to let vendors make decisions based on? Of course we don't want them to, and we make it illegal for them to, judge us based on our race or our religion or our gender. But which other criteria are fair game and should be fair game, that's not a privacy question. But it's critically important, and I hope, I expect, it's one that's going to be raised quite distinctly in the Podesta big data report, and I hope we're going to have a broader conversation about it. And that's why, by the way, I really kind of hate the word privacy. I use it because people understand my topic that way, but I end the book with what I call the unfairness doctrine, which is essentially: I feel like this is a question of fairness. We as a society have to decide what we think is fair, and I lay out some points about what I think are some tests of whether things are fair. And to collapse it into one thing, I really think if you're going to choose one criterion for fairness, I call it the publicity test. This is a longstanding ethical test that ethicists and philosophers use, which is basically: if your criteria can withstand public opinion, they're probably fair, right? And I think that's actually what the NSA is being subjected to, in sort of a brutal way, right now: the publicity test. And you know what, I think we've made our decision; the phone metadata program did not withstand that test. Now some of the other stuff, I think, might still take us a while to actually stop. But yes, right? And I come to it from a journalistic standpoint, so I always think of it in terms of journalists doing the exposing and then everyone deciding. But I think that's one good way to test the fairness of something. Julia, Mary Madden from the Pew Research Center. Thank you so much for your presentation. I think it's just perfectly in line with the moment, especially this week with the security breach.
And so I wanted to bring up the question of security a bit more, and sort of both the leakiness of data and also the messiness of data, because I think one of the points around fairness is, you know, if there is a mistake, and often there are mistakes associated with our data and our profiles, how can we be made aware of them or correct them? And then even if we have, you know, perfect encryption and perfect files stored, and we take all the precautions that you've taken, well, if the security is weak and that information is hacked, then all that is lost. So I wondered if you could speak to that a little bit. Yeah, you're referring to the Heartbleed bug. So, you know, I was thinking about this today, and Kevin, I'd love your thoughts on whether this is the right approach. But one thing I think about with the Heartbleed bug, and in particular with encryption, is that encryption used to be something that only spies did. And they had people who ran their code books back and forth, and those people were specially trained, and they knew how to handle a code book, and they were caught occasionally, but then they took the cyanide pill or whatever. I mean, it was all very dramatic, right? But now every one of us is carrying around our own code book, right? And this is the democratization of code. It's spectacular. But you know what? I am a terrible manager of my keys, right? Terrible. Right now I have my laptop locked in that hotel safe, because I was like, well, I should bring it because my keys are on it, but then I can lock it in the safe. This is idiotic. I shouldn't be in charge of my keys, right? And the Heartbleed bug is emblematic of that too. All these websites are in charge of managing their own security, and we're just not that good at doing it. And that's the beauty of the web, the decentralized nature of it.
And so I don't want to defeat that, but I do recognize, I think it's just worth pointing out, that the beauty of the decentralized nature of the internet also comes with this one problem, which is that we all have to defend our own forts, and we're not that well equipped to do it. And that's what a lot of security is about: how do we get everyone equipped to do that? Well, I mean, there are a lot of potential ways to talk about this. One of them is that we need to focus, I think, more on defense in depth. That is, having security that can withstand a failure of one element of it. I think that involves encryption at a variety of levels. You know, one thing I keep harping on ever since the President's Review Group report came out about the NSA stuff is: we now have a high-level, presidentially appointed review group with specific recommendations about encryption, about the importance of fostering the widespread adoption of strong encryption throughout the internet, the importance of the U.S. government not undermining those standards, as we've seen the NSA do, ensuring that we are not creating new vulnerabilities to facilitate surveillance, you know, creating secret backdoors, and, I think really importantly, not stockpiling the vulnerabilities that we find out about, but instead ensuring that vendors find out about them so that they can be quickly patched, like it looks like Heartbleed is about to be. But it's not an easy problem, and I think it's about having a variety of layers of security, such that when something like this happens, it is not catastrophic, and we don't have to all go and change every password we've ever used. Although now I'm also hearing, okay, maybe don't do that yet; in fact, it might be bad if you do that right this minute. It's very confusing. Yeah, I don't know what to do either. Go to the opt-out village and it will all be okay. You mentioned having two children.
My question is sort of: what are you trying to pass along to them, online-data-sharing-habits-wise? Because a co-worker of mine, I should disclose, I work here at the New America Foundation, a co-worker of mine said that, online-data-wise, we're past the Rubicon for all of us. There's nothing we can do now; it's just the people who are coming online now, the children, who can have the good habits and have a different data profile online. So my kids are young; they're six and nine. I always have to give the caveat that I haven't been tested by the fires of teenagehood. I can give all this happy advice, and then talk to me in five years. But the truth is, my daughter, who's nine, loves being online. She's a wired kid, and she's really into her privacy. The reason she's into her privacy is that she doesn't want her little brother to break into her account. I think that's perfectly legitimate. That is how privacy works. She doesn't care about the NSA; she's never heard of them. She doesn't care about Google, except for the fact that they give her good search results. But she cares about her little brother, and also she cares about me. She doesn't want me reading her email. This is a question of privacy as threat models. Everyone has a different threat model. My kids' threat model is primarily each other and me and dad and maybe grandma. And so once I cottoned on to that being the threat model, I gave her tools to address those threat models. So she actually really loves privacy tools. She uses encrypted texting and phone calling to reach me, because it's super cool. She actually started a business building strong passwords. So I came up with a way to manage my passwords. Essentially, I use a password manager to create most of my passwords. But for the password to the password manager, and my bank account and email, I wanted those to be extra strong and not in the password manager.
So I use a method called Diceware, where you essentially pick words out of a dictionary, and the dictionary's words are numbered. And you roll dice, because of course you can't trust any random number generators online, because of some people who have been undermining them. So you roll dice, you pick the words out of the dictionary, and then you string together a bunch of dictionary words, and the beautiful thing about it is you can remember them. So they're very long, they're very strong, and you can remember them. But I was really lazy. I didn't feel like rolling dice. So I was like, my daughter, why don't you do this and I'll pay you a dollar a password. So she makes passwords for me. She had a password booth at my book launch at the Brennan Center in New York, and she made passwords for people. It's like her little business. And she's really password-savvy now. And so it's been fun to watch her get really smart about this. And I think she's not an anomaly. I mean, if you see the success of Snapchat, I think that actually the younger generation is going to be more savvy. And certainly Pew Research has shown that younger people are much more savvy about their privacy settings and about which apps they choose to download. So I actually think that this next generation is going to be much better at controlling their data, because they have a much better sense of how important it is than we did, who kind of stumbled into it. How much does she charge? Oh, a dollar a password. Although she told me she was thinking about raising her prices. But I think we can still get one at a dollar. Yeah. Julia, you made a very honest statement when you said that the companies like Google and Yahoo, they're all in cahoots with the government. It's not like they're fighting each other; they're not in competition. That was true. But then you raised the point that press, or reporters, should have privacy.
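For readers curious about the mechanics: the Diceware scheme described above, rolling physical dice to index numbered words in a dictionary and stringing several words together, can be sketched roughly as follows. This is an illustration only; the word list here is a tiny stand-in for the real 7,776-entry Diceware list, and `secrets` stands in for the physical dice Julia insists on, since she distrusts online random number generators.

```python
import secrets

# Tiny illustrative stand-in for the real Diceware list, which maps
# each of the 7,776 possible five-dice rolls ("11111".."66666") to a word.
SAMPLE_WORDS = ["abacus", "bridge", "cactus", "donate", "ember", "fjord",
                "gravel", "hollow", "impala", "jigsaw", "kettle", "lantern"]

def roll_die():
    # Diceware calls for physical dice; the stdlib `secrets` module is
    # the closest software analogue to an untampered randomness source.
    return secrets.randbelow(6) + 1

def diceware_passphrase(wordlist, num_words=6):
    """String together randomly chosen dictionary words: the result is
    long and strong, but still memorable."""
    words = []
    for _ in range(num_words):
        # Five dice rolls form a base-6 number from 0 to 7775, which
        # indexes one word in the (real) list; we reduce it modulo our
        # small sample list (7776 is evenly divisible by 12, so no bias).
        roll = sum((roll_die() - 1) * 6**i for i in range(5))
        words.append(wordlist[roll % len(wordlist)])
    return " ".join(words)

print(diceware_passphrase(SAMPLE_WORDS))
```

With the real list, each word carries about 12.9 bits of entropy (log2 of 7,776), so a six-word phrase is far stronger than a typical human-chosen password while staying memorable.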
But then, maybe there are a few honest reporters, not that many of course, because the companies that the reporters work for, they're also in cahoots with the government, because I've always said that in America we say the press is independent, but the press is owned by big corporations, which are always in cahoots with the government. That's why they encourage these wars and so on, because they all profit from it. And the government and the companies that you work for are in bed together. So obviously, for the few honest ones, your editors are going to force you to change your story, adjust it, and all kinds of things, because ultimately you've got to go through the system, and the editors are the owners' representatives and so on. So given that, I think privacy is dead in America, given the way things are. And Mr. Obama now, he says, oh yeah, there's been too much spying, okay, we're going to rehash it. But all he's doing is window dressing, because, to be honest, he promised us when he ran that there would be no spying, and he has been worse than George Bush. And given that, I don't think whatever window dressing he does is going to help; it may make things even worse. And it'll stay secret till he gets out of office. I think that's more of a comment than a question. But I would just defend the profession. I want to defend the profession of journalism, in the sense that I think you might have been more right back when the media was much more centralized. But now there's this flowering of journalism online, and so many outlets. And so I think that actually we're in a great time for journalism, because there are so many outlets that are really small, independent, feisty, and, you know, probably underfunded, but doing great work out there. And the competition keeps people honest, I think. And I agree with you, there's corruption everywhere.
But I believe in making this country a better place and fighting for it. And that's what I think is the right thing to do. I think there are always things to improve. Speaking of feisty, independent, novel new models for reporting, you just moved to one. Yes, I did. I just left the Wall Street Journal after 14 years. And I'm at ProPublica, which is a fantastic outlet. So the managing editor of the Wall Street Journal, actually, when the company was sold to Murdoch, he left to found ProPublica. And his idea, which was the correct one, was that big newsrooms at for-profit companies like the Wall Street Journal and the New York Times, which are publicly traded and have to answer to the shareholders, are inevitably going to have to cut back on investigative journalism, because it's expensive and takes a long time. And sometimes it doesn't pan out. And it's just a high-risk business. And so he decided, I'm going to set up an investigative newsroom that is nonprofit and raises its money from foundations. And so that's what ProPublica is. It started in 2008. And I just joined; it's actually about 45 journalists now. And it's, I was actually just telling one of the New America fellows here, it's like heaven for journalism. All the people there are committed to super ambitious, long-term projects with bigger and bigger scope. You know, the question in a newsroom like the Wall Street Journal is, well, whose turf are you going to be stepping on if you do that story? Because, you know, you're going to be making the Washington bureau mad, you're going to be doing this. And I managed to make a lot of people mad and still do ambitious stories. But it was a fight. And the nice thing about ProPublica is that there's no fight. They're just like, okay, that's great. How can we expand and make that even bigger and more meaningful and more impactful?
And let's double down. And it has ten programmer-journalists like Ashkan. So at the Journal I had to fight every year to get funding for Ashkan Soltani to do data-driven investigative journalism. He's really a computer developer, not a journalist, but he has incredible journalistic sensibilities. And I wanted to bring that to my reporting, not at the end, when the newsroom would like you to just go to those people and have them build a pretty graphic to illustrate the story. I want developers at the front end. Because my reporting, like on Staples, is like a nine-month CS project. And so that's how I want to do my journalism, and ProPublica is super committed to that. They probably have the biggest investment in data journalism, even compared with the big papers. Yeah. Well, I really look forward to seeing what you come up with there. I recommend that everyone who hasn't already pick up a copy of Dragnet Nation, either outside of this room or online or at your favorite local brick-and-mortar bookstore, with cash, a privacy-protecting technology, or with a pseudonymous charge card, which now I'm going to look into. Yes. Thank you so much for coming. This was great. Thank you guys.