So, since we're running late, I'm going to open this up right away to discussion from the group. But I think one of the recurring threads here that's a rich one for the broader conversation is the whole question of what kind of public space we want. The contested nature of what public space is, in an era of constant repartitioning and blurring of the boundaries, is obviously an interesting question, and that includes everything from various kinds of semiotic, conversational, rhetorical cues that mark inside and outside, to literal physical space. So I'm going to open up the conversation here. I can do the honors and pass the microphone if you want. Thank you. I'm Zeynep, and I wanted to follow up on what Ethan was saying, especially regarding the Facebook real-name policy, in light of what's recently been happening with the furor over Amina Arraf, the Syrian blogger. If you don't know, she was a fairly well-followed blogger who turns out to be at least partially fictionalized, whatever it is, and might be a complete hoax. Or it might be completely true; we don't know at the moment, but there's a lot of furor. And I started thinking about this before that, so I didn't just change my mind. But I've recently been thinking how, following from your cute cat theory, what's going on is a bit of a tragedy of the individual, where what's good for the commons is actually not good for the activists, in that the real-ish name norm on Facebook (which is "ish" because in my surveys 10 to 20 percent don't really do that, though as you point out, the people who get called on it are the people who are vulnerable) seems to really have helped activists, both in Egypt and Tunisia, by embedding political conversations in everyday conversations, like what you're seeing. So even though they put themselves in danger by being out there, they were also somewhat protected, one by being in large numbers, and also by creating an authentic place.
And I think under, you know, autocracies, you don't really have a sweet spot where you can have political efficacy, safety, trust, and anonymity at some perfect intersection. It's just a question of how you're going to balance this. And ironically, I used to criticize Facebook a lot for the real-name policy, and recently I've been thinking there's real value in that kind of space, even though Facebook is not doing it with, you know, maybe the best intentions. So I just wanted to ask you what you think. So I think one simple reaction to this would be to suggest that what's really going on is a raising of the stakes on multiple different levels. And this didn't really happen first in the activist space. I think it happens first around bullying; I think it happens first around, essentially, teen behavior in one fashion or another, where we're willing to look at highly flexible identities, and then suddenly there's this realization that some very uncomfortable things happen in these spaces, and we immediately swing and say, well, maybe these should be pure real-name identity spaces. Maybe if we just have names associated with them, those problems go away. And of course we also know that that's not the case. As danah's revealed, there are all sorts of ways to have those fights occur steganographically. I think what we're finding on a lot of the tools that we produced is that we produced them assuming that they are light and funny, for cute cats or for congratulating each other on birthdays, and that in a bunch of different ways those stakes are starting to get raised. And as those stakes get raised, we need to reconsider a lot of the design decisions that went into them. One of the design decisions we need to reconsider more than anything else is who is involved with setting those rules of the road and setting those terms of service. So far it's been an entirely corporate process, with only nominal participation from member bases.
That may not be acceptable in the long run. danah's made the argument in the past that as some of these services essentially start functioning as utilities, we might choose to regulate them as such, and try to figure out how to have a public interest involved in setting those rules of the road. For those who don't know Alessandro Acquisti's work, please go read it. One of the really interesting things he's doing, in a current experiment that I want him to finish writing up and that I've had the great fortune of hearing him present a few times, is that he can take a picture of you and, through a combination of photo-matching technologies and other things, predict your social security number with a high degree of accuracy. All of our conversation about the real name being the most important piece of personally identifiable information is, I think, going to be technologically undermined very, very quickly. So one of the questions then is how we actually deal with systems of power. Because what you're really dealing with, in the cases that you're talking about, is what happens when somebody goes from being part of a broader society to an individual who becomes identified by a system of power. And I think that's where we have to really look at all sorts of marginalized populations and their relationships to power in all its different forms. Because the thing is that the technology is getting to a point where it's going to be able to be leveraged by systems of power to really identify people when there is a need to do so. And no amount of individual hiding will allow you to hide within a collective in the way that we have understood historically. So the question then is how we rethink our relationship to power, and rethink what it means to deal with individuals and collectives in relation to power.
So some of these questions are sort of getting at something I wanted to bring up as well, which has to do with a privatization of public space and a reinterpretation of what those public spaces are, especially in transnational contexts. It seems to me not irrelevant that these are American corporations that are providing the space for notionally public discourse in non-American spaces. And the question, again, is what that means within a corporate context, because, turning it around, it's interesting to note that all it took for the US to shut down WikiLeaks was concentrating on two spots: Visa and MasterCard. The infrastructure that supports these things is a corporate infrastructure, but it's located in very particular kinds of places. So I wonder if you could speak a little to this notion of privatization and the corporation. And not just in Ethan's stuff; I think the question of publicness and the privatization of public space has been a big issue in architecture and urban planning as well, for instance, and across the talks. It's probably worth remembering that the reason the WikiLeaks shutdown happens is that WikiLeaks is forced by a DDoS attack to move off its own controlled infrastructure in Europe onto Amazon's cloud, and then Amazon responds extremely badly to public pressure. So, you know, while I'm thrilled that people are looking for interesting technological solutions to this, even Moglen's FreedomBox, all of these things, at a certain point these forces are incredibly difficult to deal with, and you find yourself flocking to hide behind the big rock. And the big rocks are a small number of companies that I think therefore have a special responsibility to think about how they protect public space. Just on the international piece of it, just on a tiny subsection of it: language becomes incredibly important within this. We have taken to changing activist best practice.
Activist best practice nowadays is: sure, use Facebook, use YouTube, use everything, but make sure there is an English-language description of your content. Even if you are putting up a protest in Kyrgyzstan, in Kyrgyz, you need an English-language description, because the people who are evaluating your content don't speak Kyrgyz. And when five complaints in English come in saying these people are advocating the slaughter of innocents, that site's going down. So you now have the responsibility, whether you speak the language or not, to try to speak the language of the people who are controlling the platform that you're on. I'm Jeff. Ethan, provocative as always. I'm struggling, though, with where you head for solutions. Regulation doesn't satisfy me, because I'm that way, and because, you know, we see regulation in China and Iran and all those places. Once you open up to one body to regulate, you open up to all bodies to regulate. But then there's the notion of what the definition of a public space for public speech is. I hear you say at the end of yours that maybe the current best option we have is to lobby the private organizations to behave well, which is where we're going now. I guess what I'm asking is: what is the other alternative to that, in your view? So, and this is hard because I'm trying to give a quick presentation in the context of other things as well. But Rebecca MacKinnon's got a new book coming out, Consent of the Networked, which basically ends up arguing that the problem is using these spaces while adhering to usage agreements that we have no ability to shape, and that ultimately what you need is a revolutionary moment, where the people who are using these spaces take some responsibility and power over shaping the rules of the road, ideally in cooperation with the benevolent monarch that is the company putting it forward.
And it's to the company's advantage as well as to our advantage, and there's the possibility that over time we see differentiation in the space between companies that are open to that process and companies that are not open to the process. The problem, Jeff, is that I've been working for ten years on the other alternative. I've been looking for the technological fix to this. I've been playing around with circumvention systems, all sorts of different distributed content. Ultimately, technically, we do not have a good answer to the question of how you deal with something like Facebook in a decentralized fashion. I can host video in a bunch of different places and spread it over Tor nodes and so on and so forth. But I cannot have a social networking conversation without some degree of centrality. And at that point where we have centrality, we need to figure out how whatever entity is doing the centralizing becomes part of a discussion about the idea that this is both a public space as well as a private space. I think, to tie both Paul's and Jeff's comments together, one of the other things to keep in mind is that corporations have incentives, right? And individuals have incentives, and sometimes they align and sometimes they contradict. And one of the most challenging things in thinking through this is that I often run into folks who just assume that corporate interests are so far removed from what their interests might possibly be that there's always a hatred. I actually think there's a lot of power when you can think about alignment, right? And alignment is one of the tricky things. Sometimes alignment happens through finding common interests, where you're sort of odd bedfellows. Sometimes alignment happens through forms of regulation, at a social-norms level, or at a legal level, or at a technical level.
And so one of the things, as we think about all of these individual roles as actors in this, is that a lot of it is to think through those incentive structures and find points of alignment where good work can happen and good change can happen. And I do actually believe that's possible. It's not just by rallying and trying to convince the corporations to behave like you want them to, but by finding points where you can actually agree on things and where it's in your shared interest to produce X, Y, Z. So rather than thinking of ourselves as in opposition, I think a lot of it is about finding those points of alignment and those ways of really coming together. My name is Diyu. I have a related question for danah. In terms of the problems with people not respecting social norms on social networks, have you seen designs, in terms of technology, which actually solve that problem? Is there even a hope that you could design something which by definition is much more conducive to people following social norms, as opposed to kids having to invent? Sure. So, you know, the design issue is a really interesting one from my perspective, because it makes the conversation really about the technology and what's in that space, rather than thinking about the technology as situated behind a whole variety of things. So, for example, one of the things I struggle with all the time is that people are like, oh, we have to build better technologies to make certain that on Facebook they won't share in these ways. All it takes is for a parent to lurk over the shoulder to violate any technical conditions that you could possibly have developed. Likewise, one of the things I love about teenagers is that their cultural logic around these things is not necessarily the same as that of parents and adults in general.
So, the vast majority of young people share their password, just like many of you probably shared your locker combination, as a sign of trust, a sign of relationship with people. So these technical solutions aren't quite there. That said, there are really important design implications that play out. There's a really big policy conversation right now about privacy by default, which I actually think is deeply problematic. The reason I think it's deeply problematic is that what people are really good at is adjusting to a set of technologies when they understand it and can work around it. So when I look at young people who are using Twitter, they know it's public by default and they act accordingly. The bigger challenge from a design perspective is what happens when you make changes. And changes are where there's a huge cost, because changes are where we undermine social norms. So in December of 2009, when Facebook decided to automatically change certain privacy settings, which resulted in a lot of people suddenly being public without knowing it, that's where we see a design violation. So part of it is that one of the challenges for a designer, as the system is evolving, is figuring out how to tango with users so that you can actually evolve the system, because the system has to evolve. No system comes out and stays that way forever. But you do it in alignment, and that is a very hard design problem. And it's really not about the initial design but the iterative design where we really need design efforts. Hi, Tim Sparapani from Facebook. Thank you all for your very interesting comments. Yes, I've outed myself. Hello, I'm here. If anybody's responsible for the architecture that Ethan was talking about, it's me and my team. And I think you make some very interesting points about how to hide advocates within a mass of people. Every design that you could pick for any networked system has consequences.
And I think that's the point that you're essentially making, and we have made certain choices. The one that has always bedeviled me: I come from a background of being at the ACLU, having spent a number of years there, and having really worried about the people with the guns and the ability to lock you up, rather than large corporations, which can annoy you and charge you money but can't really put you in prison. And so that's, again, a bias that I have in terms of the design that we look at. So here's the nub of my problem. The same system that finds and identifies fake accounts for advocates around the world also identifies, by intention, fake accounts from law enforcement and national security and secret police people around the country and around the world. If you think the conversations are difficult with respect to what happens when an advocate is outed, they are equally difficult when a law enforcement or national security agency intends to haul you in and say: why is it that you've just shut down our honeypot, intended to find and disrupt the following crimes, terrorist activity, whatever, protests that they would suggest? So I guess my question is this: how do we avoid, as designers, as people who want to build a structure (and this is to everyone on the panel), building systems that by design would create a pseudo-sense of privacy for people? Because literally, on the back end, if you look at a fake account, whether it's an advocate's fake account or a law enforcement fake account, it looks the same. It has all the same anomalous behavior; it doesn't look like the other, you know, more than half a billion people on Facebook. So I just hope you can throw some meat on the bones of that rambling discussion. Thank you. So I think there are a couple of interesting pieces of this that come into play.
I think the first thing that comes into play is that we tend to make design decisions about a platform, and then we tend to watch behavior occur atop the platform that surprises us. This has certainly been my experience in building social media platforms: people do things with them that I never thought about. I got sensitized to this issue in 1996, when the Malaysian opposition put more than 50,000 pages on the website that I was hosting, because it was the best way that they could promote the reformasi movement. And I found myself in this interesting position of having all this content which was clearly very sensitive, which I got a lot of complaints about, which was in a language that I couldn't read, but which, on the other hand, appeared to have a certain amount of political significance. And so we then had to sit down and start bringing in experts in Malaysian politics to figure out what the right thing to do was at that point. So I think once we start realizing that there are problems we didn't know we were going to deal with, that we're dealing with law enforcement doing stakeouts on our services, that we're dealing with activists trying to figure out how to organize on our services, behaving in ways that are technically against the rules of the road but may in some cases be admirable and in some cases be very dangerous, how do we figure out how we adapt around it? And the answer is that it's got to be some sort of ongoing process, as it obviously is. Now, danah has signaled that there's another complication in that process, which is that having a change made in an arbitrary and opaque fashion tends to break goodwill, and it tends to break the ways that people have figured out how to use those tools in creative ways. What I'm suggesting, knowing full well that it's provocative and probably unacceptable on the surface, but it's an interesting way to go from there, is to say there needs to be some way that we open that process.
There needs to be some way that you're talking not just with law enforcement, who I'm sure you are talking to about figuring out how to make that work, but with the activist community about the ways in which we evolve those terms of service over time, in a way that doesn't just recognize a legitimate law enforcement need to use that space, but also recognizes this responsibility that you guys have found yourselves with, by virtue of tremendous success, in becoming a networked public sphere, whether or not it was your intention. And there's a long conversation (thank you, David), there's a long conversation about how we could best do it. We can argue about, you know, GNI and all sorts of different mechanisms, but I think this conversation is going to be a growing one over the next couple of months and years: figuring out how we open that process so it's not just purely a lawyers' process, but starts becoming a community process as well. And I want to sort of second that. Oh, it was Weinberger. He always does crap like that, you know that, Charlie.
But I actually think Tim's question is extremely important, and I think Ethan's point is really critical and something that involves everybody in this room, which is that these puzzles are not actually easy, and it's not really clean to figure this thing out. In some ways, what we're really struggling with is what the moral fiber of the society that we want to live in is really about, and how we actually deal with it. And I'm dealing with it in a slightly different variant than Tim is, in that I'm thinking about it in terms of human trafficking, right? A lot of the same puzzles that we use to out, you know, activists can also be used to out people who are actually trafficking minors for sex work in really horrific, horrific ways. And so how do we actually balance between the things that we see as really doing good and the things that we don't? In many ways, that's actually the big puzzle to me about the neoliberal society that we're creating: we have these ideas where market-driven rhetoric and the idea of the individual can trump all of this, but I think we actually have some of the hardest puzzles for thinking about the moral society we want to create ahead of us. And I think one of the things is, as we dichotomize against these institutions, it's really important that we figure out how to come together and work on solving these problems, because they are really problems for all of us. I think with those words we're going to conclude. I want to thank the three speakers, Beatrice, danah, and Ethan, and we're going to transition to the final connections talk of this morning.