This session is called The Risks and Beauty of Hyperpublic Life. This morning, Jonathan said, well, it's just yet another privacy conference. And so I just want to emphasize that it's also a publicness conference, and to some extent this is a session that deals with that. There's a huge temptation to do things that are very, very public, and there are fascinating things that can be done with that: both dystopian and utopian images of the city of the future, where we can see the tracks of things, we can see all the data, we can see the movement. So a lot of the idea in this session is to look at the beautiful and tempting things that we can have in the hyperpublic world, what their risks are, but also what the value is. This session will be chaired by Jeff Jarvis, who's a professor at the City University of New York Graduate School of Journalism. He is, as many of you know, a widely read blogger about media and news at BuzzMachine.com. He's the author of What Would Google Do? and an upcoming book called Public Parts, on privacy and publicness. So, thank you. Thank you. I haven't been to Berkman for some time, and I just absolutely love hanging out with smart kids. It's such an amazing institution and a leader on the Internet, so I know I speak for the world in saying thank you, Berkman, for doing this. I'm also grateful that you constructed this whole day around something other than the word privacy. Jonathan was wrong: it's not another privacy conference. It's the first publicness conference. That's the way I want to look at it, and I think the two then necessarily come into discussion one way or the other. But to look at it from both sides, I think, is incredibly important. Just one bit of transition from the morning sessions, as my brain was swimming and my Twitter feed was going crazy. Thanks for that, everybody, because everybody retweeted danah all morning.
It's the notion, picking up on where Ethan was, that what I think we need is principles to inform the architecture and operation of where we're going. What we do now is wait for the service to give us the terms of service; instead, we should have a terms of service for the Internet, in a way, to give to these institutions, these corporations, these governments. I don't suggest there's a constitution or one set of laws. All I suggest is that we need a discussion, and out of that discussion we'll start to discern where we go with these principles. And I think that's what I hear happening today: trying to discern the principles of what it means to have a public and where that goes. We're going to talk about objects and cities and data. One thing that I'd like to keep in mind came out of the discussion of Street View this morning, when the Germans exercised their Verpixelungsrecht, the right to be pixelated, as they call it, and 244,000 Germans chose to have their homes pixelated. Then others objected because one resident of a building would have it pixelated while another resident wanted it clear, and it was typically, wonderfully German. But part of my objection to what I saw happen in Germany, one point I didn't hear this morning that I would like to add to the discussion, is that there was a diminishment of the public: a diminishment of the public space, the public view, and thus the public good, and what the public owns. That's an issue here in deciding what's public. It's also important to keep in mind that we all have a stake in that, and that it is a public good. So as we discuss what it means to have both privacy and publicness in public, and what it means to have a public, I think we have to look at it with some proprietary interest, an ownership interest, in that as the public.
Is there any other point I want to make? The other interesting thing I heard this morning: we talk about the English home, with its back stairs and hallway, inventing privacy, and the notion that danah mentioned too about the switch of default between privacy and publicness. Privacy for so long was a scarcity, and it's now a scarcity again. Privacy used to be expensive, then it was free, and now it's expensive again. I find that a fascinating economic thread: it's changing the economics of this and our presumptions about it, and how we look at the world through the architecture of both our spaces and our places and also our systems. Finally, I've been struggling a lot with the metaphor for the internet. It was Doc Searls who taught me not to call it a medium, because when we call it a medium, it brings the limitations, and the desires to control it, that a medium brings; Doc taught me to call it a place. When I've called it that on my blog, some people objected. The CTO of the Veterans Administration calls it the eighth continent, so it's kind of a new place, but that makes it sound as if we're all leaving from Plymouth Rock and abandoning our old homes, when in fact we remain resident in our old homes and also become citizens of this new space. And Sarkozy at the eG8 refused to call it a parallel universe. The internet, in my mind, is a platform for publics. It's a platform that enables us to come together and create publics, and that also enables us to do that as private people, and we're trying to find the balance there. But I think the important thing to discuss is that there are benefits to publicness, benefits to being able to gather together and to speak and act together as a public, and how we architect that world is what's going to enable what happens and what can't happen.
So that's my little preamble, and I'll pass over to each of the panelists, who will do up to ten minutes of a talk, and then I hope we're going to have lots of discussion in the room. I've had my say, so I'll shut up (that's a lie), but we'll go right around the room, so store up your comments and challenges. So, Adam first. Good afternoon. My name is Adam Greenfield. I'm the director of a shop in New York City called Urbanscale, and our tagline is design for networked cities and citizens. So we are very deeply invested in the question of how you design networked spaces that support the sorts of publics and the sorts of public debates, discussions, and activities that we've all been concerned with so far today. I'm interested in pushing back at some of the notions that we receive around this figure of the networked city. The networked city is often offered to us as existing in what certain people in this room call the proximate future, and I mean to push back very distinctly at that. I don't think the networked city lives in the proximate future at all. I think that we already live in the networked city, or what we call the smart city as well. And why is that? Well, we've got a pervasively, even a comprehensively, instrumented population, not merely in all of the cities of the developed world, certainly, but increasingly in the cities of the developing world as well. We have very widespread adoption of locative and declarative media, for example Foursquare, and now the ability to perform analytics, very much including things like sentiment analysis, on the utterances we make in those spaces. Increasingly, we live not merely among declarative people, but declarative objects as well. This is a project from a London designer named Tom Armitage. He grafted a Twitter account onto London Bridge. At first, he fed this account manually, and later on he automated it.
He scraped London Bridge's actual records to allow London Bridge to speak to you, in something approximating a human voice, through the medium of Twitter. Like: I am opening, I am closing, a vessel is about to pass. And this is a stand-in for a very, very large body of objects which are now increasingly speaking to us, with us, and receiving information from us. We also see objects and spaces with networked identities, informational shadows, like this building, the N Building in Tokyo, whose entire facade has been rendered as a QR code; when you image the building with your camera, you get taken to a website describing the building. This one's a little iffy, but it, again, is a stand-in for a class of objects, services, and spaces that increasingly we cannot fully understand except with reference to the networked spaces that surround them. And most interestingly to me, our cities are increasingly becoming comprised of objects that are capable of gathering, processing, displaying, transmitting, and/or taking physical action on information, which necessarily, to me, implies new modes of surveillance that do not always, or even necessarily, operate in the visual register. We've discussed some already today, and I'll be discussing some more in a couple of minutes. My contention to you is that tens of millions of people worldwide are already exposed to these conditions, and it occurs to me, and others as well, that just possibly we need a new theory, and probably even a jurisprudence, of public objects to help guide how we think about them and interact with them. So I'd like to offer you a taxonomy of effects.
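The bridge bot is easy to picture in code. Here is a minimal sketch of how such a declarative-object account might work; the record fields, the wording, and the `post` callback are hypothetical stand-ins, not Armitage's actual implementation or data source.

```python
from dataclasses import dataclass

@dataclass
class LiftRecord:
    """One scheduled bridge lift, as scraped from a public schedule (illustrative)."""
    vessel: str
    direction: str  # e.g. "up river" or "down river"

def announce(record: LiftRecord, event: str) -> str:
    """Render a lift event in the bridge's first-person voice."""
    if event == "opening":
        return f"I am opening for the {record.vessel}, which is passing {record.direction}."
    return f"I am closing after the {record.vessel} has passed {record.direction}."

def tweet_lift(record: LiftRecord, event: str, post) -> None:
    """Push the announcement through whatever posting hook is supplied."""
    post(announce(record, event))
```

Wiring this to Twitter would just mean passing the API client's status-update call as `post`; the object itself only needs to turn its own records into first-person utterances.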
I'd like to offer you a couple of very specific, concrete systems that I've observed being deployed in our cities over the last several years, actually over the last 18 months or so, which give us a way into thinking about what this class of objects implies for us, how we might wish to treat them, what sorts of rights and responsibilities we have with reference to them, and what sorts of rights and responsibilities the people who wish to deploy this class of artifacts have with reference to us. The first is something called the Välkky traffic sensor. I regard this as a more or less unobjectionable information-gathering technology in public space, and I regard it as unobjectionable for two reasons. The first is that it gathers local information and takes local effect upon it. This is a traffic safety beacon with a motion detector. It was invented by a Finnish company, and if you know anything about Finland, you know that it's basically dark there 20 hours of the day and 10 months of the year, so they need something to protect people at traffic intersections; it's very, very hard to see pedestrians moving through public space, and traffic fatalities, particularly in the northern parts of Finland, have been very significant. So this Finnish company designed this traffic beacon. It just has a motion detector. It detects when somebody, either a pedestrian or a bicyclist, is at a crossroads or a crosswalk, and shines a very bright blue LED at oncoming traffic to warn them of the presence of a person they might not otherwise be able to see. So again, a local effect on a local gathering of information: this information is not uploaded to the network. It is not archived. It's not stored in a database. Inferential analytics are not applied to it. And very importantly, there's a clear public good associated with the function and the purpose of this object.
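The privacy property being described, local sensing with local effect and nothing retained, can be made concrete in a few lines. This is a sketch of the pattern, not the beacon's actual firmware; all names are mine. The point is that the device's entire state is one expiry timestamp, so there is nothing to upload, archive, or analyze.

```python
class CrossingBeacon:
    """Local-only traffic beacon: motion in, light out, nothing stored."""

    def __init__(self, hold_seconds: float = 10.0):
        self.hold_seconds = hold_seconds
        self._lit_until = 0.0  # the only state: when the LED switches off

    def on_motion(self, now: float) -> None:
        """A pedestrian or cyclist was detected; warn oncoming traffic.

        Note what is deliberately absent: no timestamped event log,
        no network call, no counter. The detection takes local effect
        and then ceases to exist.
        """
        self._lit_until = now + self.hold_seconds

    def led_is_on(self, now: float) -> bool:
        return now < self._lit_until
```

The design choice is the interesting part: because detections are never reified as records, inferential analytics simply have nothing to attach to, which is the distinction Greenfield draws against the systems that follow.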
It is entirely possible that there's somebody in this room who doesn't think that the ROI of having these out there is worth it. Nevertheless, I think we can all more or less agree, and I don't mean to prematurely assert a consensus, but I think we can more or less agree that traffic safety is a public good. From this, we get into something where I begin to have concerns. This is an ad for Nikon that was deployed in the subway system in Seoul last year. Again, a purely local use of local information, but for a different kind of purpose. Here we've got motion sensors that have been deployed underneath a red carpet laid down on the floor of the subway system. There are paparazzi in the advertising image, and as the billboard detects motion past it, the paparazzi all swivel, the cameras flash, and you feel as though you're walking down the red carpet in Cannes or Hollywood or something like that. Again, you know, I'm not hugely threatened by this. I think that it's at worst mildly disruptive and disrespectful, and the reason I have concern about it is that there is no consensus public good associated with it. There's a purely commercial motivation. So yes, it's gathering information, it's acting on that information, and it doesn't have even the red herring of some kind of beneficial purpose associated with it. It is a purely commercial instantiation. And, you know, it is an open question to me whether anything like this could successfully be regulated; the harm doesn't seem to rise to a threshold of concern that would justify that, to me. And then we get into a system like this, which begins to raise those very concerns for me. This is a touchscreen vending machine that's been deployed in Tokyo since last year. It's called acure. The design blogs were all over this because the entire front surface of the vending machine is a single touchscreen.
There are very high-resolution images of the things that are available in the vending machine; you touch one, and your purchase is delivered to you in this very high-touch Japanese way. What's problematic about it to me is that not every customer approaching the vending machine is offered the same selection of consumables. There is a camera in the vending machine that attempts to characterize the person approaching it as to age and gender, and then the selection is tailored to a model of that age and gender in the norms of Japanese public life. I don't even know where to begin describing how problematic that is to me. It is both prescriptive and insidiously normative. And of course, all of that information is gathered up and compared across all of the other networked vending machines in the system. Inferential models are built on it, and those tune not merely the future selection of beverages that are offered to individual consumers, but even potentially the kinds of beverages that are marketed, the kinds of beverages that are produced. So if you're the kind of person, as frankly I am, who tends to fall outside these models, if you tend to be the anomaly that gets detected by the anomaly detector, guess what: you're kind of out of luck in this scenario. And again, there's no real public good associated with this gathering of information. There's a clear commercial purpose, and somebody stands to benefit from it. You could even argue that the person approaching it benefits in some way, that they're being offered some kind of savings of time and/or effort. But I'm not super thrilled about this. And then I get into the really scary stuff. This is an analytics package that's offered for use with video advertising billboards.
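The logic being objected to is worth sketching, because the normative part sits in one branch. This is purely illustrative, not the vending machine's actual code: the categories, the menus, and the classifier interface are all invented. The point is the fallback, where anyone the camera model cannot place gets a generic shelf.

```python
# Illustrative only: an invented demographic-targeting menu, not the real system.
DEFAULT_MENU = ["water", "green tea", "coffee"]

TARGETED_MENUS = {
    ("female", "20s"): ["jasmine tea", "fruit soda", "milk tea"],
    ("male", "40s"): ["black coffee", "energy drink", "sports drink"],
}

def menu_for(classifier_output):
    """Choose the shelf to display from a (gender, age_band) guess.

    classifier_output is None when the camera model cannot place the
    customer in any of its demographic boxes: the "anomaly" case.
    """
    if classifier_output is None:
        return DEFAULT_MENU  # fall outside the model, get the generic shelf
    return TARGETED_MENUS.get(classifier_output, DEFAULT_MENU)
```

Even in this toy form you can see the prescriptive loop: the table encodes a norm of who drinks what, the classifier sorts passers-by into the table, and purchase data then tunes the table further, which is exactly the feedback Greenfield objects to.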
It's called Quividi VidiReports, and it claims to be able to detect not merely your age, to within four bands, but your gender and ethnicity, off of indices of facial bone structure; they even bounce a lamp off your eyeballs and claim to detect whether you're paying attention or not. So again, predictive and prospectively normative. And this draws value up off the streetscape in a very, very insidious way to me, because even if you're passing by in the background and not paying any attention to it at all, you're still a data point in the advertiser's analysis of what's going on. Even if you very deliberately choose to turn your back on this, you're still, in a way, generating value for some third party, and that value is never returned to either the city or the citizen. These things are interesting enough because it's easy for me to understand them; they're concrete, discrete objects. But I'm also concerned about the case where the power/knowledge in the situation resides not in one object but in an ensemble of discrete things. Here is a networked bollard in Barcelona, where the access control function is provided by the bollard itself; you see it down at the bottom of the screen there. But the functionality actually resides in an ensemble: the data collection of the embedded sensor grid there, the RFID touch plate over there, and even the lighting and the signage and the wording of the law. This is an actor-network that's brought to bear on people, and yet most of us only ever understand it through the avatar of that bollard. So it can be a little bit difficult to understand, and it's even more difficult when that effect resides not in the physical object itself, but in code. The example I'll give you is from when I was in Wellington, New Zealand last year, where I was told about a referendum that the citizens were offered.
There was a ring of surveillance cameras that the city, the municipality, wanted to put up, and the justification that they offered the public to vote on in the referendum was that these things were for purposes of traffic safety. Well, people there thought that was a pretty good justification. They voted yes on it. The cameras were installed. And the year after, an upgrade became available for the software at the back end of these cameras that permitted facial analysis, facial recognition, and all of a sudden there's been mission creep. All of a sudden the same ring of cameras is being used by the police for a very, very different function indeed. And that upgrade in the software was never put to a referendum. Even to me, somebody as public-minded as I like to think I am, it's a little strange: I get the idea of putting a camera to a public referendum, but having us vote every single time somebody wants to upgrade a software package does feel a little strange to me. So I am just about out of time, and I'm really only into the interesting part now. If you'll indulge me and I can go a minute over, I think I can manage this. What I'm going to argue to you is that what we see here is a class of things that we should think of as public objects, and that we should define them and then begin having a conversation about what we want to do about them. Here's my definition of what a public object is: any discrete object in the common spatial domain that's intended for the use and enjoyment of the general public; any artifact located in or bounding upon public rights of way; or any discrete object which is de facto shared by and accessible to the public, regardless of its ownership or original intention. And having defined public objects in that way, this is what I think we ought to be doing about them. I think we ought to ensure their openness.
Where such objects are capable of the data collection activities that I've described here, they must be designed in such a way as to render the data streams they produce open and equally available. By open I mean open as an API: their specifications are published and available to all, and the data streams coming off of them are non-rivalrous and non-excludable. That is, there is no way to restrict the goods of those data streams to paying customers, and my use of a data stream in no way prevents anybody else's use of it. There are, of course, issues, implications, and objections, but I'm already one minute and 50 seconds over, so I appreciate your indulging me. A city that's provided with such things, where they are open, necessarily and inevitably has an enormously increased attack surface: vulnerability to hacking, vulnerability to griefing, vulnerability to disruption. Almost to the point that you'd want to argue that doing things this way had better produce an awful lot of collective value if you're going to take this kind of risk. I would obviously argue, given my prejudices, biases, and assumptions, that it is a risk worth taking, but by no means will everybody share that opinion, and I think this is a debate that pretty urgently needs to be had. I think that we have yet to develop, for these systems, the sorts of etiquettes and protocols of precedence and deconfliction that we have evolved over 7,000 years of the physical use of public space. I mean, it took us 100 years to learn how to live with automobiles in cities. I don't think we do it particularly well yet, and yet we have laws, we have etiquettes, we have standards of behavior that govern our interaction with that other order of mobile object in the city, and I would argue that this, again, is going to require new thought about etiquettes and protocols. But why would you do such a thing?
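The two economic properties named here, non-excludable and non-rivalrous, have a direct software shape. A toy sketch, with names of my own invention: subscription takes no credentials (so the stream cannot be restricted to paying customers), and every subscriber receives its own copy of each reading (so one consumer's use never diminishes another's).

```python
class OpenPublicSensor:
    """A public object whose data stream is non-excludable and non-rivalrous.

    Non-excludable: subscribe() requires no API key, payment, or identity
    check, so there is no mechanism for restricting the stream.
    Non-rivalrous: publish() hands each subscriber an independent copy of
    the reading, so no use of the data precludes any other use.
    """

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        # Anyone may attach; there is deliberately no credential check.
        self._subscribers.append(callback)

    def publish(self, reading):
        for cb in self._subscribers:
            cb(dict(reading))  # each consumer gets its own copy
```

A published specification for the reading format would complete the "open as an API" condition; the class above only captures the access and rivalry properties.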
I think it's justified because it moves against the capture of public space by private interests, which is something that's very important to me. I think we move toward a sort of general-purpose fabric of freely discoverable, addressable, queryable, and scriptable urban resources, and I can't even begin to anticipate now all of the potential upside uses to which everybody on Earth will be able to put these things. These will be the urban fabrics of the true deep mid-century. I think we're moving toward a place, in this inscription of technology, where the right to the city is meaningfully underwritten by the design of public space and the things in it, and, if all goes well, toward a revitalized physical manifestation of the public sphere, which I think of as the place where democracy happens and is seen to happen. Thank you so much for your time and attention. While the next panelist gets set up, let me ask you a question. The worst word that I hear in discussions of privacy is creepy. Right? We hear it all the time, and I always try to stop the conversation and say, define creepy; but we're going to regulate our world based on this odd, vague sense of creepiness. So that vending machine: I think I get it. It comes up, and it's telling you, we think you're a woman, so here's a product for you. And I'm guessing that if I asked why you don't like that, you'd say it's because it's creepy. But let me go to the next step and ask how you would solve that. It strikes me that what you're also asking for is transparency in these devices, as with everything we have. So if you walked up to it and it said, our computer thinks you're a woman, we're going to give you these products, click here if you don't want this, wouldn't that take away the creepiness? Wouldn't that be okay? We use targeting online; why wouldn't physical objects use targeting as well?
Yeah, that's the tension I'm working with. But I'm also a user experience guy, and one of the tensions I cut against is that the flow of having to authorize the computer to do that on your behalf is just a very bad user experience. Is it fixable? I don't know; it may be an intractable problem. I mean, I'm not a lawyer, I don't have a background in legal studies, so I've been working with my best available knowledge, which is about informed consent. Informed consent is the model I've been working with in the design of all these systems. What I've heard this morning is that better heads than mine are saying that informed consent is no longer necessarily a tenable model for these interactions, and I take that very seriously. So I am the first to admit that I don't know how this is going to play out. I think we have to design protections for people into these systems, but not at the risk of so overburdening them with alert windows and dialog boxes and those sorts of interactions that they become untenable and unusable.