We aren't helping users understand that. You don't think Amazon's got it quite right? Well, I'm going to come back to that in one second. What I think, you know, this being Berkman, and Berkman being the incredible leader in thinking about the internet in this country, and now with this wonderful tied architecture in both senses of the word, I would love to see some sense of the principles that come out of this discussion today, right? What are the emerging principles we're trying to talk about here? I hear context, though as I say, context is, I think, very difficult to deal with; it's very complex. I hear usage from Dana. I hear control, though that's difficult too. I hear transparency, in our discussion before, which you mentioned as a kind of context. So what are the appropriate behaviors that enable a public, and public discussion, to be created, while at the same time preserving the rights and privacies that we have? So, what does Amazon do wrong, or what could they do better? What's the lesson? Well, that particular question is less interesting to me than transparency and usage. I don't want to hear what Amazon has wrong. Just for a quick second: the Harry Potter issue, right? The massively popular things badly skew their recommendation algorithm. So I'll get it recommended continually, and I can't stand fantasy. So it's not a matter of principles, it's a matter of execution. Yeah, well, it's a little bit of both. Okay, all right, keep going then. Yeah, I mean, to me, one of the things that I tried to inscribe in my distinction between the things which were less problematic and less threatening, and the things which were more problematic and more injurious, was the local use of locally gathered information. And to my mind, if you're just acting on something in its immediate physical ambit, that is, in space and time, that is ultimately going to be less of an issue for me personally than something.
I mean, I just don't believe in anonymous data anymore. I don't think we can speak of anonymous data unless something is in perfect and serene isolation. You don't think we can create policies to maintain the anonymity? I mean, you're right that the second you pull multiple data sets together, you can re-identify anything. Yeah. But couldn't we create an accountability framework in which that data was walled off effectively for use? In law, but not necessarily in practice. But it's worth trying, you know, because one of Google's founders said that if we used Flu Trends to predict the next pandemic, we could save, in his perhaps overblown calculation, a third of the population of Earth. So there are some contexts in which predictive analytics of that form don't really bother me at all. But there are other contexts in which predictive analytics so clearly cut against our social contract. I mean, think about it: the police would love to be able to use predictive analytics to site resources, to say, well, we know where the last 15 murders in the city have occurred, and we're going to locate our resources in the neighborhoods where we know the crime is about to happen, right? It stands to reason. And yet it cuts against our entire history of law and jurisprudence. And it's redlining, right? It's going to lead to that sort of thing, inevitably, wherever it's enacted. And I think, you know, if we can see that... You don't think that's... No, I think that's blown too far. I think that's good budgeting. I know a lot of... If the argument is that police could prevent someone from being murdered, how is it a bad thing to have them try to do that in that context? If it leads to kinds of profiling that are injurious to people's lives... That's a different question. That's a different question. I don't think it is. I think the two things have become very... That's where you go back to this idea, Shiv. You're not allowed to know that.
You're not allowed to act on that. So if you say we have to have, as a matter of principle, the same number of police per square mile throughout the whole place... No, that would be an absurdity. Okay, but that's where you're headed. Not necessarily. I don't know. I mean, I think this could require an afternoon. All right, fine, we'll take it offline. Jeff, you displayed that magnificent image of someone's data shadow, as you put it, being displayed, which I kind of love, as long as everybody knows what's going on, and you talk about controls over that. But back to creepiness. I hear people talk about using AR, the fact that we can use facial recognition, the fact that with AR we could have Google Goggles tell us things: I could scan the room and find out that the Weinbergers are vegetarian, that Ethan does this, and so on and so forth. That, I often hear, is described as creepy. So one of the principles... I think the value of what you propose is great. What is one of the principles you would have to have to make it operate in such a way that people would welcome it rather than call you creepy? Well, I have a 16-year-old son who's currently doing his final exams in high school, in a city school in Switzerland, and even in Switzerland there's a lot of cheating going on. And we've been talking a little earlier about how we could redesign the space to encourage probably better interaction, and maybe even a fairer way of acting. And I think the anxiety of those kids is mainly that there's such a lack of tracking of what people are doing. Because the fair people, who are not cheating, are getting punished in a world where the information is incomplete. So there's a benefit to them... I think there's a benefit to having their identities tied to the information that is legitimate about them. Well, I think the whole thing about the hyper-public and this campus that we're creating is a bit based on the premise that if you are...
essentially, if you are an honest citizen, if you're a good person, you benefit from things being public. Yeah. I agree. When I was driven into Amsterdam, at some point in the road everyone slowed down to the speed limit. Everyone. As an American, I was freaked out: what's going on here? And the driver said, there's a sensor and we'll all get tickets. So on the one hand, logically, that makes perfect sense; to an American, it seems creepy. It seems like a bad local use of this data. Logically, it says we're going to reduce accidents and save lives, and that's a good thing. But it's a case where our norms and our technology just don't match yet. Right? If we could all be in Google's self-driving car, that would be wonderful, but we lose control, and we don't like that. This is exactly what we're trying to negotiate now. Benefit: we're talking too much about fear, I think, and not enough about benefit. I think this is a slightly different issue. I think this ties into what Betsy was saying about shame and shamefulness. I think the more you are ashamed of something, the more you fear this lack of privacy. Right? Yes, but that will strike some as puritanical. Not that you're from... from the birthplace of that, yes. There's a little Calvinism in it... And I don't disagree with you, but I think the fear is that somehow it becomes a kind of judgment matter. That goes to Eric Schmidt's line: if you're going to be ashamed of people knowing you do this, then perhaps you shouldn't be doing it. I think it makes perfect sense, but, as so often, Eric Schmidt's lines get tweeted and misconstrued. What do you think of that construction right there? It has never historically sat well with me. I didn't figure it would. Let me come on out and get the discussion going. Anybody? It sounds like it's a question of abuse.
The design would be very good, but then it looks like maybe we need to have systems, and part of the design should be to prevent abuse. Because one person's concern about abusing a system of, say, predictive analytics is that essentially you're still guessing, and you could get it wrong, and humans tend to be lazy and rely on it too much. So maybe: how do you prevent people from overly relying on something that's just guesswork? I was talking earlier about the social network you're trying to create, and part of the issue here is the expectation of it: if Facebook were presented as a sharing service, where the whole reason to be here is just to share, Twitter-like, then the fears about privacy would be lessened. So the definition of abuse goes into the prediction of what you think this thing does. Yeah, exactly. So that's where the Harry Potter thing comes in, because the prediction algorithm is not doing its job right; that's why you said it's about the execution. So maybe we are relying on it too much, as opposed to being aware of it, but maybe it should be part of the design itself not to rely on it so much. I should point out, too, that I've bought one book a week from Amazon since 1997. Amazon has more information about me, my habits, my behaviors and predilections, as expressed through the exchange of hard currency, than any other organization on Earth, including Google, and they're still recommending Harry Potter to me. You can fix that; it provides the means, you can say, don't use this again. It gave you the power; are you trying to complain about it? I do use that, I check that box, and it comes back up; there's something in... You can't kill Harry Potter. Well, this is what we're being told. I'm sympathetic to your travails. That's not normative. Over here. Hi, I'm Jean Rosenberg from Baruch College in CUNY. I guess this week a couple of conversations came up around privacy.
Yesterday at Berkman, and before that at a corporate counsel conference that I attended, and I'm interested in getting the panel's read on it. When you think about privacy and what you've been discussing, you've been discussing the privacy interests of the individual, and what's at risk when that individual loses some of his or her privacy. But what about risks relating to third-party abuse? Take the example of insider trading: if you've got the CEO of one company traveling to some small Midwestern city, a third party can put together that maybe there's some deal in the works, and then take advantage of that, and basically take advantage of investors who don't have that information, who haven't researched that. Or what about an example some of the media team were talking about yesterday, about knowing someone's location and then being able to break into that person's house? What about abuse of information in ways that aren't necessarily offensive to the individual in terms of their privacy, but that can cause risks to their safety or their property? What about when they are offensive? We were told, when the backscatter and millimeter-wave machines were introduced at the airports, that those scans would be locked up and never available to any third party. And very early on it became obvious to everybody who had any knowledge of those machines whatsoever that the printouts were being passed around by the TSA people, that they were escaping from the containment that was designed for them, and that in principle there's no way to design these sorts of perfect isolation bubbles in which these facts can reside.
I tend to think the Canadian Office of the Privacy Commissioner has a little piece of doggerel that they use. It's a little cute, but it makes a lot of sense to me; it says: if you can't protect it, don't collect it. And that has to be my bottom line on all of this stuff, even if there is value that might be mined from that information in the future. If you have no way of protecting that information, then you have to assume that some harm can come to vest in the lives and the life choices of the people that information has been gathered from. And that's what doctors have been doing for millennia: first, do no harm. I'm just not convinced by the notion that there's all this amazing benefit out there to be derived from this that outweighs the potential damage, particularly to vulnerable individuals. Go ahead. Well, first, I'm all for protection of privacy, it's not that. But in the two examples that have been talked about, the Harry Potter example, or the intruder who gets the information to get into your house, the problem there is really that the information is not complete; there's not enough information. Amazon doesn't know enough about you, otherwise it wouldn't recommend that, and if the system knew about the intruders, that wouldn't happen either, right?
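The skepticism about anonymous data voiced earlier, that the second you pull multiple data sets together you can re-identify anything, can be made concrete with a small sketch. This is a toy illustration in the spirit of the classic voter-roll linkage attack; the `link` helper, the field names, and every record here are invented for the example, not drawn from any real data set:

```python
# Toy sketch of a linkage attack: a "de-identified" data set joined with
# a public one on shared quasi-identifiers (ZIP code, birth date, sex)
# re-attaches names to supposedly anonymous records.

def link(deidentified, public, keys):
    """Join two data sets on quasi-identifier fields, re-attaching names."""
    index = {tuple(row[k] for k in keys): row for row in public}
    matches = []
    for row in deidentified:
        hit = index.get(tuple(row[k] for k in keys))
        if hit is not None:
            # The "anonymous" record now carries a name again.
            matches.append({**row, "name": hit["name"]})
    return matches

# "Anonymous" medical records: names stripped, quasi-identifiers intact.
medical = [
    {"zip": "02138", "dob": "1954-07-31", "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "dob": "1962-01-02", "sex": "M", "diagnosis": "asthma"},
]
# A public roll with the same quasi-identifiers plus names.
voters = [
    {"zip": "02138", "dob": "1954-07-31", "sex": "F", "name": "W. Weld"},
    {"zip": "02145", "dob": "1980-03-15", "sex": "M", "name": "J. Doe"},
]

reidentified = link(medical, voters, keys=("zip", "dob", "sex"))
print(reidentified)  # the flu record is re-identified by the join
```

This is why walling one data set off "in law, but not necessarily in practice" is fragile: protection has to address the quasi-identifiers themselves, for instance by generalizing or suppressing them before release, not just by stripping names.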
So it's an uncanny valley problem: we've got almost enough information, but not quite a perfect amount. Yeah, so you can either try to hold it back or push it further. It's like a train that goes, perhaps, in the wrong direction, but at least accelerate it so that it crashes; at this point you're just holding something back. I want that to be the design principle built into everything, you know: push it further so that at least it crashes. I think that's beautiful. There's another angle here worth mentioning, which is that you can maybe get both worlds. So the location example, I don't know if it was Facebook or Foursquare, where there was the Please Rob Me thing: that's an individual with an identity posting their location. The flip side of that is the anonymous collection of location data, as done in the location-services industry at large, where it actually does provide tremendous value, and you can capture that value while protecting the identified data set. And I think that's the distinction that is most interesting to me: how do you get the latter while protecting the identities involved, so that you can prevent that type of breach? This is turning into the classic discussion we have these days around fear and privacy. That notion of "if you can't protect it, don't collect it" means you'll never collect anything, because you're always managing to the worst possible case, the highest common denominator of fear and liability: if something could go wrong with it, then you shouldn't collect it. I don't want to live in that world. I really don't. By the way, I'm a weird guy; I wanted to use my scan as my author photo. I wanted to push back on the panelists who paint this picture of privacy as: if you have something to hide, don't share; only the people who have something to hide are the ones who won't share. I'm reminded that many years ago, one of my first tasks was for a cancer registry, I mean a disease registry, and they wanted to make
sure the people who had an HIV diagnosis were not released. So it's a very simple thing: you have HIV, we're not supposed to release that information. But now you have a list of all these patients, and the only ones blotted out are the ones who have HIV. So it became immediately clear to me that privacy can't be only the issue of hiding that which, from an individual perspective... In the case of that registry, it had to be a kind of societal value: I had to also suppress other people who didn't have HIV in order to protect the people who did have HIV and adhere to the societal regulation. Well, I'll come back at you first myself: but then you affected those people whom you also blotted out. I wrote about my prostate cancer. I got incredible benefit by doing it under my name. It was my choice, and it was easy for me to do because I'm a white American male in the US, and I'm public and obnoxious. But I got incredible value back because my name was attached: I got friends who brought me advice, who brought me benefit that I could not have gotten otherwise. Right? So in the effort to come up with the safe rule for all, you affect people in two ways. Actually, that's not correct. Remember, I live in the world of parallel universes. In the universe that I talked about, you couldn't find out about your prostate cancer from this source; it didn't mean that there didn't exist another universe in which you could find out about it, and the example you gave was one where you gave out that information. The point is a very profound one. I wanted to go back to a point Betsy made a couple of times in what she was saying, which had to do with, maybe going back to a point in Latanya's slides this morning: maybe this isn't actually a trade-off at all. Maybe you can have both. Maybe you can have privacy protection and still have a flourishing advertising ecosystem. And one observation I would make is that what makes us individuals are the differences between us,
even identical twins are different, but what makes us economically valuable is our similarities. Do you think that leads us to any kind of potential solution to some of the intractable problems we've been discussing? I'm not sure I completely understood the question, so let's see if my response makes any sense. What you said resonates, in the sense that the types of value I think about from big data... I mean, Jeff, point taken that there's value in you sharing your individual life history, but I'm actually more interested in the aggregate sort of analysis of social data, where you do actually find similarities: you're applying statistical machine learning to a big data set to learn something interesting about that big data set, not to learn something about the individual pieces of data in there, even though, as has been pointed out, any one of those individual pieces of data is probably re-identifiable in some theoretical world. And so I think you can put in systems to provide accountability and protection of that aggregate data set, to protect against that risk. Spot on; I think your response was gratifying. I think it also goes beyond economic value. If you're looking at an individual, and those individual characteristics are unique, from a medical point of view you're in the territory of Dr.
House, you know, freakish medical detective work. For most people, it's by the similarities that doctors diagnose the problems we have. So I think what you say is potentially also applicable to medical-record protection. We have time for a few more. I wanted to go back to some of the examples you gave, Adam. This morning there was a lot of talk about how informed consent might not be the paradigm we are moving towards. It seems to me there's a possibility we're moving towards what might be called uninformed consent, and I was wondering, in the case where we have the option to choose to engage with systems but are not really told the results of what that system is doing or collecting, whether you see that as a trend, and if so, whether it's problematic. Yeah. So the construction I apply to these things is very much about another distinction which the morning also revealed to be increasingly problematic, which is that between public and private space. I mean, if we're blurring the thresholds and boundaries between these conditions, then some of the things that I'm talking about and asking for don't really hold up very well, and we should be really clear about that. The existence of a screen that's capable of capturing biometric identifying information about you doesn't bother me if it's in the Prada store; it doesn't bother me if it's in Walmart; it doesn't bother me if it's in McDonald's, because in some sense you've consented to that by crossing the threshold. And that could probably be argued as well, but it's a sort of clear identifying marker for me: that's not my domain, that's not what I'm designing for. I'm designing for what we traditionally call public space, and the condition of the street, and in that context there's no way to inform you or achieve your consent. Your consent is implicit; your consent is bound up in the action of passing in front of a screen like that. And again, you know, I might not be
the most active advocate for the things that I believe, but that doesn't sit right with me, just as somebody who walks through the streets of the city. Do I have anything that I necessarily need to feel guilty about? I don't know, but I don't know that that's the question. Are there policies and procedures which could be enacted on the basis of that information which would be detrimental to me, or run strongly at variance with my desires? Absolutely. And that has nothing to do with shame or guilt or any of those priorities. So again, there's this sort of user-experience challenge. If you're going to design something that lives in public space and is going to be gathering that information, and there is this burden upon you to inform people of that, and to get some kind of acquiescence from them, some kind of formal acquiescence, before proceeding, how do you design that into a situation where somebody is walking past at normal speed and the engagement itself might last a third of a second? If you can't respond to that in a meaningful way, then I think you ought not to be deploying that technology there. What I'm interested in doing right now, personally, as a designer, is finding out a lot more about the future history of informed consent, because frankly it's the model that I built a lot of my assumptions on, and if that's being subjected to evolutionary pressure at this moment in history, oh boy, I'd better find out what comes next, because it is implicit in what I design that there is this model of agency and consent. Thanks. Jenny? How many of you guys have ever heard of the band, or the individual musician, Momus? He has a great song called "The Age of Information", which definitely runs into the sort of utopic language around this; basically the substance of the song is: in the future everyone will know everything about you, and they will decide whether they're your friend based on who you actually are, which I
think was really prescient: when he wrote it 15 years ago, it was a very utopic, exciting idea for a song. But I wanted to push back on two memes that I keep hearing in this space, and that came out on this panel as well, that I'm not sure I believe. The first one Latanya addressed very clearly, which is this idea that if you don't have anything to hide, you're going to benefit. I really don't agree with that, and I actually heard Eben Moglen on Science Friday a couple of weeks ago saying it very clearly: while it may be true that you're not doing anything that is wrong, there are things that historically have always been private: where your children are, what medications you take, keeping what your symptoms are from your insurance companies. At a certain moment, there are real reasons for this level of privacy, and I think to contest it is crazy. The second thing I would say is this meme that's come out on this panel, that privacy is only about shame. I think that's a very narrow definition of the benefits and values of privacy. I may look at my husband in a private way; I may have shared stories with my best friends that are private, that are nothing about shame at all, but that are about a level of intimacy that I am granting to them. And the idea that somehow all of this is fair game in the future seems crazy to me, whatever positive idea is behind it. Just for clarity: do we think anyone said that privacy is only a matter of shame? I heard that a couple of times. I quoted Jonathan Franzen. But Franzen, yeah, it was just an interesting quote. Just for clarity: don't relax, but the meme has not taken off as badly as you think; that's all I'm trying to suggest. Hi, Jackie Kerr. I wanted to add to that, actually, on shame. I thought it was a great quote, but I've studied Soviet history, and I also thought of Joseph Brodsky's essays about the need to hide in private kitchen settings to have real conversation during the late Soviet period. And I think that privacy
can be constructed as being only about shame, perhaps, in a perfectly democratic society, but not in any society in which the public norms are not viewed as legitimate by all; in any non-democratic society you really run into those issues. I just thought that was a point worth throwing out there for the discussion. Thanks. Yeah, I mean, we know historically, as a matter of mission creep, that if information can be collected and used against you, then ultimately it will be used against you, across the board. We live in a democratic society at the moment, and I'm grateful for that, but I think to assume that the society will always be democratic, and that information will always have the protections it has now, is fatuous, in my personal opinion. So I completely agree with you. Dystopian, or neotopian? It's not a dystopia, it's historical. Okay, wait, I'm not going to take one myself; we have two more. Hi, Shreya Murthy, a Berkman intern. I'm actually writing a paper right now reviewing the work of Daniel Solove, who, if any of you are interested in sort of the legal and philosophical basis of the way we think about privacy, specifically in the courts, is an excellent read. He's got a paper called "A Taxonomy of Privacy", and one thing I think is really relevant to the idea of shame as, you know, the main justification for a right to privacy: the courts have actually recognized that there is such a thing as dignitary harm, which occurs when privacy is breached; that there is a certain sense of your person and your personal effects that is harmed when there is a breach of privacy. And I just think that's something we should recognize: this is an understood concept, that there is something specifically wrong with putting a person in a position where they don't feel comfortable acting freely, or thinking freely, or doing anything that they, in a society that recognizes free speech as such an important value, should be able to do. Yeah,
that's right, and it's not just about the transition to totalitarianism; it's about the transition out of totalitarianism as well. One of my favorite historical examples: all the information that was collected by the East German security state wound up in the Stasi archives. When that regime fell, that information became freely accessible all of a sudden, and people's life chances were harmed by it; only in retrospect were people judged on the basis of actions taken 20 or 30 years ago, without any understanding of the circumstances. You're right, bad things can happen. But all I'm trying to say is, if all we do is manage every choice we have, and we don't join in with our fellow human beings enough because of these fears, fears, fears, I think we're going to lose the benefit and the power of this tool of publicness that we now, thank you internet folks, have. Right. I think the fear is really relevant. You keep bringing up this concept, and I think it's very important, because it seems to me that we are still in the Bentham paradigm: we are being watched, we don't know when, or by whom, or who is collecting our data, so we are all in prison, in a way; a society in this condition, if we follow what Adam is saying, for example. Right? But there are other aspects of it. I mean, the norms of privacy keep changing, so the norms of privacy are not the same as they were in the Victorian age, or even before. This morning somebody said that privacy basically started in the Victorian age; you said something about that, and someone else the same, so it's, you know, culturally defined. Robin Evans wrote a very beautiful article called "Figures, Doors and Passages" where, analyzing architectural space, he talked about how in the Renaissance there are these palaces that lead from one room to another, and people would be passing by, and people would be making love or defecating or everything, and everybody had a sense
of privacy despite the fact that they were being exposed. Right? So the introduction of the wall in architecture, to separate what is private and public, is historical, as is the dissolution of that wall with the open plan. So we are in a new pattern, where definitions of what is private and public are changing. For example, a few years ago, maybe it's now ten years ago, there was an article in the New York Times in which the manufacturers of curtains said, we're all going out of business, because people didn't think anymore that they had to have them, right? In an age in which we are all so much more exposed through new media, people didn't feel that the curtain on the wall was such an important thing to have. So we are more exposed, and our concerns about privacy have changed too; as has been repeatedly said, they have more to do with our medical histories than with other aspects that may have preoccupied the Victorian age. So what is interesting to me about this conversation is that it expresses the fears of us as a society in multiple ways, and that's what we should be paying attention to. What I think we will agree on, at the end of it, is that one is making a bet on the future. So when a gay or lesbian person comes out, and uses the power of that publicness to fight down the bigots, and gets married in public and is registered, they're making a bet that they're not going to face a backlash and a turnaround, that gays will not again be shoved back into the closet, whether by totalitarian governments or anything else. And that's a bet that one chooses to make on optimism, and that's the kind of negotiation we're going through right now. I think we've run out of time entirely. Anybody want to respond to that? Or... okay. I want to thank the panel very much for a discussion that went in surprising ways. Thank you.