So, sitting immediately to my right, whom you saw earlier today giving a wonderful presentation about the Knight News Challenge award that she's won, is the wonderful Jillian York. Jillian wanted me to mention that she is in no way prepared for this discussion, because she found out about it around nine o'clock this morning, when I pulled her aside and asked her if she would participate. That said, Jillian is, generally speaking, one of the world's best-prepared people for this discussion. She heads up international advocacy and international freedom of expression for the EFF, worked on international advocacy and freedom of expression for a long, long time at Global Voices in another context, and we're thrilled to have you here. Thank you. Further to my right is Nathan Freitas, the benevolent dictator of the Guardian Project, and I'm glad that you called that out early on; it's good to take ownership of these sorts of things. I'm the guy with the mic, and that's always got a certain amount of dictatorship associated with it. Nathan is also deeply involved with the Tibet Action Institute, someone who has been coding and working through issues of freedom of expression online since before he was a teenager, someone who's thought really deeply about these issues from a technology perspective. And finally, on the end, our dear friend Emily Bell from the Tow Center at Columbia Journalism School, formerly with a different Guardian project, the pioneering British newspaper, where she was the head of digital content. In general, if I can put Emily on stage and on a panel, I do so, but specifically in this case I wanted to have her here so that we could talk through some of the implications of surveillance for the journalism community, something we were both involved with in an open letter, and the Tow Center is now very, very involved with questions of journalism post-Snowden.
We're not going to back up and do the entire Snowden revelations here. It's been an incredible year, and I'm guessing that most everyone in the room has at least a cursory familiarity with the story. What we're going to try to do instead is have a conversation about the world post-Snowden. What does this mean for activists? What does this mean for anyone trying to protect their communication through technical channels? What does this mean for journalists and for sources? What does this mean for any of us who are interested in trying to use the digital public sphere as a space where we can organize and where we can debate? Because she's had such a lousy day, I thought I would start with Emily and ask if you can talk a little bit about this question that we found ourselves addressing in this open letter: what do we think the Snowden revelations mean for journalists and for their sources? Okay, thanks, Ethan. Two things. First of all, I was not detained because I was speeding. I was detained because I ran into somebody, and it was my fault. I know you're not supposed to say it was your fault in America, but I'm British, so I'm sorry, it was my fault. And also, if you're going to run into someone, run into someone nice, preferably another woman. So we chatted, I charged her phone, and she was like, oh, it doesn't matter that you've totaled my car. Sorry. Back to the issue in hand, which is: what do these revelations mean for the journalism community? We can get into some detail in a minute, because there are, I think, very specific security implications which most journalists, including, as we said, Glenn Greenwald, who was the recipient of the leaks, simply weren't aware of. They didn't practice good digital security. There are a small number of journalists who are forced to practice good digital security because they are working in extremely challenged, very physically dangerous environments.
But they're a tiny percentage, and in general, the journalism community, pre-Snowden, had two problems. One was that its actual newsroom practices in terms of digital source protection were very poor. This is something for which we already owe a great deal of debt to activists from the EFF, et cetera, and to Chris Soghoian, who pointed out about two years ago, in an op-ed in the New York Times (and because it was in the New York Times, everybody read it), that journalists need to brush up on this kind of stuff. But I think there's a broader implication as well, which was something I'd been concerned about for quite some time, even when I was still operationally working at The Guardian, which is that as journalists we had this almost inappropriately close relationship with the technology industries. There was an outsourcing not just of function but also of thought to Silicon Valley. Every time I met a major chief executive or director general of this or that major media organization, they would always say, I've just been out to Palo Alto, and I've met Facebook, and I've met Twitter, and Google, aren't Google great? And I don't want to come over all Evgeny Morozov about this, because there is clearly a great deal of expediency for journalism in using freely available tools, but there was literally, I think, in many cases, no thought about what this really meant. And the great thing about Snowden's bravery, and the light he shone on how these relationships work, is that it really did demonstrate vividly for journalists not just how they needed to protect themselves and their sources, but how systems of power work now, particularly in the nature of big data and extensible social platforms.
And even if you are not going to be working with highly sensitive information, I don't think you can be an effective journalist these days unless you know exactly what those issues are, unless you understand those systems, and unless you understand that journalism should stand apart from those systems. I was not reading my tweets when I ran into that car, either, but some of the great things that have been tweeted about today are about civic activity and participation. Journalism is the fourth estate for a really good reason: we're not on anybody's side, we're there to hold power to account, we're there to go to jail, we're there to break the law when we see that things are wrong, and we're there to uphold freedom of expression not just for ourselves but for other citizens as well. So, as I say, there are two sides to this, one of which is a much broader question and one of which is a very specific security question. And we're addressing both of them in this great program, which Ethan was a real inspiration for, called Journalism After Snowden. A shout-out to the sponsors: the Knight Foundation and the Tow Foundation are both backing it, which is fantastic, because we need funding for tools and exploration in this area. It's been underfunded for too long, and we're up against a lot of money from some serious actors. Well, so let's turn over to Nate and talk a little bit about the tools in this space, because you, in my opinion, have taken on a problem that's critically important but also incredibly challenging.
I think many people in this room, carrying around mobile phones, don't understand that those of us within the human rights and activist community think of these first and foremost as tracking devices, and only then as communication devices. What's the Guardian Project trying to do? How far along are you? And how has the real and present danger in the Snowden revelations changed how you're thinking about that space? I think the issue of trust is now at the forefront of people's minds: what is the technology that I've entrusted my freedom or voice or culture or movement to? And that has been a window through which to start getting people to think about these tracking devices in their pockets. The work we're doing began out of my work within the Tibetan independence and rights movement, where, supporting Tibetans on the ground in China and Tibet and in protest situations, it became apparent how much these were tracking devices and how they were being used to map a social graph and then arrest lots of people very quickly. When I began talking about this, the only secure mobile phone you could get cost quite a bit of money and was being used by militaries, government contractors, oil companies, things like that. So on the large scale, the progress that we've made, along with groups like Open Whisper Systems and other projects that have thought about this, is huge. I mean, you can now take an off-the-shelf $100 smartphone pretty much anywhere in the world and install free, open-source tools like Tor. I work on Tor on smartphones; Mike Tigas is here, who does Onion Browser, Tor for iPhones; encrypted chat, encrypted calling, and it's all free and it works. That's great. I think the question for a lot of people comes down to: why should I trust you versus trusting Google or Twitter? And that's one that we're still trying to sort out as small, scrappy open-source projects that aren't quite Mozilla yet.
And I know Mozilla has a lot, and Mark brought up a great point about app stores. This is one of the biggest issues right now: we can build all the apps we want, but we have this barrier of the app store, which for places like China is a huge problem. Because the gatekeepers are not the open web, we are a little screwed in some cases. But I want to talk about that more, because part of it is just: how do you get in front of people, let them know they should use this app, and how do you get around censorship of apps? Ultimately, I think we've seen, and again this is what Snowden said: math works. The fundamental premise we have has not been broken or shaken; everything we're building upon is sound. There might be seven OpenSSL vulnerabilities tonight that I need to scamper home and deal with, but otherwise, we just need to do more, and we need to continue to spread the word and ensure people have access to these tools in an affordable way, not just on the latest smartphone. Let me make you Jonathan Zittrain for a moment here and push a little bit on this app store idea. There are a couple of things we often talk about when we talk about security in this space, and one of the big ones is users and usability. Certainly, getting to the point where you can say, download this app and make me secure, is an appealing place to be. What's the barrier at this point with the Android and Apple app stores, as far as trying to bring an app like yours to market? In the Google world, there is very little barrier. We have this great opportunity; it's gotten so much better than previously, with Java phones, for instance, and telcos, in that we can put our app in the app store and it's published globally, instantly.
And Apple, you know, is a gatekeeper, and they require more of a process. But what we've seen now is that in places like China, they've removed apps based on local requests. So there's local censorship of apps, and people aren't impacted in the same way as if there were censorship of DNS or the web. You hear, oh, this app isn't available in this country, and it doesn't seem quite as bad as when a website's blocked. And increasingly, going back to the trust issue, from a user perspective people trust the Apple App Store because it seems safer, fewer viruses, right? Apple sells this: trust us, we'll protect you. And I don't want to dismiss the user's stake in this. I don't want to say, well, you're a bunch of idiots because you're trusting a censored system; there is some value to the fact that Apple provides a good service. So it's that balance between what the user needs and benefits from in terms of curation versus, you know, it's a bit of the liberty-versus-security debate within the app store world. So thanks for that. Jill, I want to tell a story to ask you a question. Okay. So a little bit more than a decade ago, I was in Cairo meeting with an Egyptian human rights organization, an organization for personal rights. You and I both know that personal rights in the Arab context translates as gay rights. And I was there to do a security training, to figure out why they weren't using Tor and PGP despite the fact that we'd trained them three or four times to use Tor and PGP. And the folks there explained that they weren't going to bother using any of the silly security software, because everybody knew that the entire internet's traffic was routed through seven computers in the basement of the White House and that the president just read everything that was coming through. And I said, that's crazy. That's totally absurd. How could that possibly happen?
And then, of course, what we found out this year is that more or less that's what happened. So the question I want to ask, as a fellow activist in this space, is: how do we deal with what I've started referring to as security nihilism, now that we've figured out just how hard this is, how difficult it is to actually keep a channel secure and to keep a targeted individual able to use these sorts of tools that have been so important in the Arab world and really around the whole world? How is an organization like EFF dealing with this post-Snowden climate of security nihilism? Sure. Yeah. So, I do some trainings in the field, and we're developing a project called Surveillance Self-Defense that's going to answer a lot of the questions that people have: how do you use the Guardian Project's apps, things like that. And when I talk to people, I get two different perspectives. One is the Silicon Valley one. We saw in the past couple of weeks Marc Andreessen and Robert Scoble both tsk-tsking the privacy advocates: oh, you guys are useless, we're in the post-privacy world. Words can't express how I feel about them. And then on the other side of it you've got the privacy nihilists, the folks who say, this is pointless, everything's being watched anyway, or even, to another degree, I'm not going to use Tor because that makes me a target in my country, which is true if you don't have enough users. And so how do we deal with that?
That, I think, is a big and serious question that I can't answer completely, but some of the things we've been doing are looking at, okay, with journalists, for example: a lot of journalists are used to taking enormous risks and are willing to take enormous personal risk, but maybe aren't realizing that it's not just about them; it's also about their sources, about the people they're communicating with. And that's been true for a lot of the other folks I've talked to who have that approach, in the Arab world and elsewhere. You know, I find the other side of the coin actually more difficult to deal with, when you have these strong, powerful white male voices from Silicon Valley telling you that privacy is not important, and their voices are being heard over our voices. So I actually find that side of the coin much more difficult. But yeah, the privacy nihilists are real too. And I think that the ease of use and the improved user interfaces of some of these products make it a lot easier for me to get people to use them, for me to be able to say, okay, we're not talking about PGP anymore, because PGP is hard. TextSecure or the Guardian Project's apps? Not that hard. I don't say that to belittle anyone who does find them difficult, and I'm happy to sit down with you if you do and help you through it. But nonetheless, I think this is making it a lot better for us. So let's move into technocratic-white-male-privilege space and actually talk about this question of how online culture and commercial culture may be tied into all of this. One of the responses to some of the Snowden revelations has been: what's the problem? You're all telling everyone everything on Facebook anyway. Why would you care now that it's government instead of the private sector?
And Bruce Schneier has this wonderful line that I find myself repeating again and again, which is that surveillance has become the business model of the internet. With almost all of these large platforms we pay attention to, we have this very uneasy, uncomfortable arrangement in which we give up an enormous amount of data, and have ads targeted at us, in exchange for free services. Is there a connection between giving up certain amounts of privacy and targetability in commercial spaces and the fact that, while in a room like this we can talk about a post-Snowden world, in mainstream America we may not have had the sort of outrage and reaction that some of us were expecting? Did we somehow set up government surveillance by normalizing corporate surveillance, and what can we do about it? I'm just going to give a quick answer, because I want to hear what they have to say as well. Basically, first off, there's a difference. I live my life pretty loudly on the internet, everything's got my real name, but there's a difference between the things that I say on Facebook or Twitter and the private communications that I have with my sources, my family, whatever. The second thing is, a lot of people make the argument that social media companies don't have an army to back them, so their surveillance is not as bad. Well, first off, that's not true. They do have an army to back them; they're in bed with the NSA. But second, so what? That doesn't mean we should be handing over everything to them. Just as governments can change in an instant, so can companies. And I don't know if we're talking about the hostile-takeover-of-Google scenario or whatever scenario, but nevertheless, I don't want companies to have that information, and that should be a choice.
Yeah, I mean, this is where I get to play the crypto-communist, centralizing European on the panel, which of course I am. I've lived in America now for four years, and one of the principal cultural differences is really the idea of some sort of centralized power as an alternate balance to, if you like, the corporate world, versus living in a genuine free-market economy where, in America, the commercial rules are the rules. You have a public sphere here which is basically, as Clay Shirky has said, as lots of people have said, really a private sphere which tolerates free speech. What we're seeing in Europe at the moment is a kind of pushback against the white male technocratic voices. Not necessarily the most intelligent pushback, perhaps: this idea of the right to be forgotten, the ruling which came down a couple of weeks ago from the European Court of Justice, which is really about taking things off the internet, saying that if you don't like the way your search results render, and it's not in the public interest, you can have them removed from Google's cache if you ask nicely or through the correct channels. So it's interesting, because regulation here is small, and we're entering a global market where that kind of regulation is much more front and center in other markets. I think we need a balance to this enormous corporate power. It's an incredible thing, because American public media is Google, it is Facebook; it is shaping the way the world communicates with each other; it is imposing, and I say this nicely, American standards of free speech on the rest of the world, and in many parts of the world it's not that simple. So we have to have more than that; we have to have, I think, a really engaged, intelligent conversation with government.
We're not going to get that until you have more engagement, if you like, on the ground. Actually, I'd push back a bit, in that I'm surprised at how many people here have paid attention to the NSA story. Ironically, if you go back to the UK, it's had almost no airplay there whatsoever, almost none. And one of the reasons is that the state broadcaster, the BBC, well, it's not a state broadcaster, but, you know, the licence-fee broadcaster, has not really run with the story, the other press outlets have not run with the story, and people have kind of gone, we sort of knew it anyway. We need more vivid illustrations of why this really matters, and you don't need those vivid illustrations in countries where people are in jail, because there they are already vulnerable. But here, I think the debate is really important. Snowden was a series of events that pulled out the stops, got it beyond, I would say, Washington, New York, San Francisco. But there is a danger that it fizzles out, if you like, as a grassroots issue, as you say; people just say, well, this is expedient for us. I would just like to see more people picking this up as a political issue, because we do need regulation, and we do need some sort of governance in this area. I can't see otherwise how you counterbalance what's essentially the corporate might of a handful of companies in Silicon Valley. Well, so Nathan, I want to ask: is there a hope that the corporations are going to move around to our side? Are we likely to end up with secure communications software for our phones? There was a moment in time when some of us thought that Skype was perhaps a step forward and might be harder to tap than existing telecommunications.
That's proven absolutely not to be the case. Is this a moment where we might see someone move into the commercial marketplace, or is it going to have to be non-governmental organizations, activist organizations like your project, trying to figure out how to create what turns out to be very difficult to build and very powerful software? So, continuing a non-U.S. view of this problem: one of the apps that I engage with a lot is called WeChat, which is from China. It was adopted rapidly over the course of last year by Tibetans, and by hundreds and hundreds of millions of users around the world, and now it's being adopted in India, of all places, at a crazy rate. And WeChat being Chinese, every message you send goes through Shanghai or Beijing data centers and is subject to filtering and monitoring by the people we usually associate with closing down borders to people trying to get out, as well as surveilling those inside. So they're sort of one of my mortal enemies; like, oh, I hate WeChat, they're just making my life so much harder. Now, I was at a meeting where a researcher had started unpacking the code and looking at the guts of WeChat, and they decided to surprise me by saying: we looked in the source code, and it turns out they're using Guardian Project software in WeChat. WeChat uses our encrypted database software, SQLCipher. Why did a Chinese app company employ database encryption in their mobile app? I think because they cared about protecting their users and didn't want messages to be hacked or copied.
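For context on what an encrypted database library like SQLCipher actually does: it transparently encrypts every database page, deriving the encryption key from a passphrase via a slow, salted key-derivation function (PBKDF2). Here is a minimal sketch of that key-derivation step using only Python's standard library; the salt size, iteration count, and function name are illustrative assumptions, not SQLCipher's exact defaults or WeChat's configuration:

```python
import hashlib
import os

def derive_db_key(passphrase: str, salt: bytes, iterations: int = 64_000) -> bytes:
    """Derive 256 bits of key material from a passphrase using
    PBKDF2-HMAC-SHA512, the general construction an encrypted
    database uses before encrypting its pages."""
    return hashlib.pbkdf2_hmac(
        "sha512", passphrase.encode("utf-8"), salt, iterations, dklen=32
    )

# Each database gets its own random salt, stored alongside the file.
salt = os.urandom(16)
key = derive_db_key("correct horse battery staple", salt)
assert len(key) == 32  # 256-bit key
```

The point of the deliberately slow, salted derivation is that someone who copies the database file off a seized phone still has to brute-force the passphrase at great cost, rather than reading messages directly.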
And I think there are strange bedfellows when it comes to users and corporations and data, and we're starting to see these sort of post-national interests, which sounds scary but is also useful: if there's an alliance between users and the services that support them, against government surveillance, then yes. We're starting to see Google doing end-to-end encryption for that reason, and I think it's very likely we could see someone like Tencent do that. So, for us, we really see ourselves as a reference design. We have an app called ObscuraCam that automatically blurs faces when you take a picture; we want every camera app to have that feature. Harlo leads a project called InformaCam, which allows you to actually trust that a photo is not manipulated when you see it, and to verify a chain of custody, so that when there's a photo of a shark swimming down the East River in New York, you can tell if it's real or not. All of these things are big ideas and feature designs and reference designs, and we need industry to adopt them; we need the news industry and journalists to adopt these tools as the standard practice for protecting sources or verifying media. So, I think that concept of the toolkit, or the library, that's out there and becomes pieces other organizations can use, is incredibly powerful. And one thing implicit in your comment, Nate, is that one of the reasons we'd like large corporations to do this is that the scaling problem around these tools is really, really hard. It's been hard to get to the point where enough people are using Tor that it serves as cover traffic rather than as a clear sign that you have something to hide. But you can imagine, if Tor were part of the Mac operating system, or Windows, and so on, that starts getting very, very interesting as far as figuring out where to go on this.
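The chain-of-custody idea behind a tool like InformaCam can be illustrated as a hash chain: each custody event commits to the media's hash and to the previous link, and every link is authenticated with a key, so editing the photo or any recorded event breaks verification. The sketch below is a simplified stand-in using Python's standard library; the key, event names, and JSON format are hypothetical, and the real InformaCam uses PGP signatures and much richer sensor metadata:

```python
import hashlib
import hmac
import json

def record_event(prev_digest: bytes, event: dict, signing_key: bytes) -> dict:
    """Append a custody event that commits to the previous link."""
    payload = json.dumps(event, sort_keys=True).encode()
    digest = hashlib.sha256(prev_digest + payload).digest()
    return {"event": event, "digest": digest.hex(),
            "sig": hmac.new(signing_key, digest, "sha256").hexdigest()}

def verify_chain(media_bytes: bytes, chain: list, signing_key: bytes) -> bool:
    """Re-walk the chain from the media hash; any edit breaks it."""
    prev = hashlib.sha256(media_bytes).digest()
    for link in chain:
        payload = json.dumps(link["event"], sort_keys=True).encode()
        digest = hashlib.sha256(prev + payload).digest()
        expected_sig = hmac.new(signing_key, digest, "sha256").hexdigest()
        if digest.hex() != link["digest"]:
            return False
        if not hmac.compare_digest(expected_sig, link["sig"]):
            return False
        prev = digest
    return True

key = b"device-signing-key"        # hypothetical per-device key
photo = b"...jpeg bytes..."
chain = []
prev = hashlib.sha256(photo).digest()
for ev in ({"step": "captured"}, {"step": "uploaded"}):
    link = record_event(prev, ev, key)
    chain.append(link)
    prev = bytes.fromhex(link["digest"])
assert verify_chain(photo, chain, key)
assert not verify_chain(b"tampered", chain, key)
```

A verifier holding the key can confirm both that the photo bytes are unmodified and that the sequence of custody events is intact; swapping the photo, reordering events, or rewriting an event all cause verification to fail.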
I wanted to come back to something Emily said, because I thought your observation that not only are we in a privately held public sphere, but we're in a privately held public sphere that works on U.S. laws whether we mean it to or not, is really interesting. I was in Myanmar earlier this year; Jillian happened to be there as well. This is what happens when you go to a lot of the same conferences. We were, bizarrely enough, at a conference on freedom of expression in Yangon, which is not a place I would have expected to be, but Yangon has actually made some real progress around freedom of expression, and I was talking to a lot of independent journalists and a lot of activists about the internet. I found out two things. One was that the internet is Facebook. In Myanmar right now, you don't search on Google, because no one knows what Google is; you search on Facebook, because that's all you know. And the second thing was that my activist friends wanted Facebook to be censoring posts, and they very specifically wanted it to go after ethnic incitement and hate speech. And this, for me, as an American who is very, very interested in freedom of expression, was incredibly uncomfortable. I found myself sitting there and saying, but guys, you can't possibly trust this government; they're going to use this against you at some point fairly soon. The flip side to which was people saying, look, if this is going to be our public sphere, we need it to operate in some way other than the way it's operating right now, because this is going to blow up, and people are going to get hurt or get killed over this. How do we have this conversation? If Rebecca is right, and I think Rebecca is right, that the really powerful actors in this space are the internet hegemons, the powerful platform companies, then they do a great deal more regulation of speech right now than formal regulators. How do we open that conversation?
How do we have a conversation that deals with at least two points of view we have on the panel, where I know that Jill feels really, really strongly about freedom of expression and making sure we don't have censorship, and I take Emily to be making the point that some of these standards may actually need to be more local, or, if you don't want to take that point, I'll happily take it? Yeah, well, I would never stand against the American standard of free speech. It's protected, and you have First Amendment protections here which you kind of don't have in other places. But also, you know, you are actually a pretty self-censoring society as well. And there are places, in more volatile situations, where free speech really can mean people living in fear of hate speech and organized aggression. It's really difficult, because ultimately we should definitely be working towards this more open standard. But the main thing is just to get a debate where the major platform companies will even acknowledge that what they do is editorial. That's the most important thing, because you hear it time and time and time again: we're just a tech company. They're not just a tech company, and even if they were, everything they do is imbued with what you might call cultural or editorial standards.
So we can't have this debate with them until the leadership of those companies, until the Mark Zuckerbergs and the Larry Pages and Sergey Brins of this world, will really engage with it, and not, as Jillian says, just go, oh, it's over, we're now into this phase. This is not a glib point; these people have enormous power now over how people live their lives, and they need to be brought into that discussion, because nobody can pretend now that it's not an editorial act to tweak and use an algorithm to include or exclude results from something that they show. And it's journalism's job to understand these systems as well as these companies do, and to push against them, because they are new systems of power that we have to hold to account. I think the reason Snowden was so valuable was that it was the first time you began to get some acknowledgement. I mean, I've seen Eric Schmidt say, if you want to be safe on the internet, you just have to be very careful what you put on there. They're not saying that anymore, which is one thing, but we have to have more active engagement from tech companies, we really do. But on the issue of ethnic violence: we've seen in China people labeled as terrorists, or as local ethnic threats. They're cracking down on Tibetans and self-immolations, and if you share media of a self-immolation online, that is now punishable, so there are ways that this gets used to persecute. And I respect a society's need, or a human's need, to have harmony, but at the same time, what one person labels terrorism is another's freedom fighter, of course. And you raised the really interesting idea a moment ago that Tencent may actually sort of be on the side of increased openness in this space. I wonder if I can get you to expand on that, not just about WeChat but maybe also about Weibo. Yeah, I mean, there's this idea that if everyone on the planet used Twitter we would all be happy, but there's a draw for Chinese users to these platforms because they're native, right? They're national and local, and there's pride in that. But at the same time these companies are moving internationally, and so Tencent can't afford to have WeChat be seen as a Chinese company, especially in India, where there are all these Indian celebrities advertising WeChat to the Indian population. They want to see themselves as Google, as Twitter, as these global companies. So I think there is a millennial generation of tech companies, perhaps, that has more future-looking values around having some private communications. Maybe there's more nuance coming between the public sphere and the private, and this is the Weibo-WeChat kind of balance, where it's like, well, you should be able to say some things on WeChat that you can't say on Weibo. So, you know, I think there is a technocratic hope. I meet Chinese developers on GitHub all the time, right, and GitHub is this great gathering of code, and there's some hope that we all share these same perspectives on what we should do for the user, and that governments can get in the way of that, and so we should have end-to-end encryption and all these things. So I think, as a new form of international diplomacy, maybe it has possibilities, and I'm hopeful that at Google I/O, happening now, or any other big gathering of developers, you put aside some of the nationalism, much like the Internet Engineering Task Force does, so that you can make the internet work better.
So Jillian, you're involved with these conversations every day with large internet companies as they try to figure out how they navigate both constraints on the internet, censorship and surveillance within different foreign governments, and also as they try to think about what sort of controls they're going to have on this space. How are you talking with these different entities around these questions, around surveillance in particular? Yeah, so it's really tempting for me to go off into the speech zone, but I'm not going to do that. However, I'll just give an example so that I can try to tie it back to surveillance. If you saw, over the past couple of weeks Twitter took down some content at the behest of the Pakistani government and then put it back up, and the reason that happened, from all public accounts, is that the Pakistani legal order was questionable in nature, Pakistani groups were railing against the decision, et cetera. Twitter did the right thing by putting it back up, and we commended them, EFF commended them, on that decision today. But this is exactly the sort of thing that really concerns me. If you look at Facebook or Twitter or Google's transparency reports, they're handing over data to governments where they have no offices and that are not democratic. Facebook, for example, right in their transparency report: they give user data to the Pakistani government, and we don't know what that is. So sure, maybe it's a murder case and there's a law enforcement request that is legitimate, or maybe it's not. But it really concerns me that these companies are making these decisions by and large without any local expertise as part of it. And so while I can sort of trust Twitter's judgment to understand and correctly apply a legal order from, say, Germany or the UK, I don't really trust their judgment elsewhere, because really these are companies out in Silicon Valley where this kind of very arrogant air exists, "we know everything, we've got this right, this is how it goes," and
most of these companies don't have particularly diverse policy teams. So I think that's one of the big issues here: if you're dealing with user requests for data, and these are, you know, a form of surveillance in a way, then you need to be consulting not just with groups like the GNI but with local organizations in the country that you're dealing with, and I really, at this point, don't trust these companies to get that right. So with these three remarkable individuals on stage, I want to make sure that we have a chance to open up a conversation. The conversation probably will not be completed during this session, but I am happy to take some questions, and I can see, even before I'm rising with the microphone, that Rebecca already has her hand up, so we're going to give her pride of place with the first question. But queue up a couple more for other folks who want to get into the conversation. Thanks so much, Ethan. Nathan, I was really intrigued by your comments on Chinese companies and Tencent and Weibo and so on, and I've also seen in some research I've done that some of the Chinese companies are making efforts around consumer privacy and security against hackers and criminal attacks and so on. But have you seen any evidence that these Chinese companies are pushing back against data demands from the Chinese Public Security Bureau and State Security Bureau at all? No, and I think my developer optimism maybe is getting ahead of what they're capable of doing as companies within the Chinese realm, but I do think that they're not as horrible as maybe they're made out to be when compared to the state of US companies. There's no transparency, though, and we continue to have reports from our Tibetan groups of people who are detained, very frequently after posting things privately on WeChat, so that's a very good clarifying question. But I'm going to test what happens, because we're going to be building on top of the WeChat API soon, so we'll see if they block us, right, and how open their APIs are, which I think will be an interesting test. So the floor is open for questions regarding surveillance, regarding trust, regarding the transformation of journalism, the tech industry, activism within this space. I am really happy to see the hands coming up, and I will work to get the mic around to the different folks. Hi, my name is Caitlin, I work at WNYC. My question is about the sort of migration of journalists to tech companies and the way that they've created all these editorial positions. It's an open-ended question, but what do you think the responsibility of journalists who make that migration is to change and tackle some of these issues that you are talking about? My answer, and it's a great question, is that it has to be to educate and advocate, internally and externally, about what journalism actually is. There's always a problem when you are a journalist within an organization which is intrinsically not journalistic, which is that you are a low priority, particularly if you're in a company which, post-IPO, needs to service its shareholders, et cetera, because journalism isn't going to make many of these companies much money. It's actually going to make it hard for them to make money, because they will be supporting things which nobody wants to hear, which advertisers certainly don't want to sit against. But if you do hop the fence, or whatever, look, there are fantastic advantages to having platforms like Twitter. I advocate for it all the time in my classrooms, but I also say, be aware of how these systems work. So I do think that, you know, with Facebook, and
it's great that you have really smart people like Vadim Lavrusik and Vivian Schiller now inside these companies, but those are powerful positions now, and I think advocating for what is intrinsically journalistic versus intrinsically commercial is a really important part of that role. I was going to ask: you talked about protecting sources, but there's a challenge now in getting a source. How do you get a government source now? This may be more of a journalistic question, but who in the government is going to talk to you if they know the cell phones are tapped, the emails are tapped? So is there a new way that you operate just to even get sources now? Is it more that you've got to get out there and pound the pavement, or what? I'd like to hear from all of you on this, because I think this is really a question about trust, and we're all working with different populations on this. So Emily, sorry to cut you off, I just want to make sure that everyone gets in on this. Just very briefly: Leonard Downie wrote a great report on this, more or less a couple of months after the Snowden revelations came out, which talked to lots and lots of editors and national security reporters, and it said exactly what you say: the problem here is not just security around sources, it's the fact that nobody wants to talk, and that's a huge problem. And whilst you have the kind of arms race of the Obama administration pursuing leakers and sources through fairly ancient espionage laws, et cetera, you're going to have that problem, and that's a hugely different area that we need to address, because it's not about security, it's about culture. I'll give a short answer: one, Barton Gellman, I think, has sort of the best operational security in the business and should be recognized as such, and he can tell you how he communicates securely and what he does. And platforms like GlobaLeaks and SecureDrop, platforms that news agencies can put in place, raise the bar for the way they accept submissions of content, such that they protect their sources by default. So I think it's just: upgrade your skills. It's possible. There are free tools, there are free apps, and there are colleagues, the Freedom of the Press Foundation colleagues, who are already engaging with this and providing that entry point for news sources. I agree with that. I was going to say just about what Nathan said, but I guess I'll take it one step further and put out a public call for these wonderful platforms like SecureDrop and GlobaLeaks to take this stuff international. We're seeing a lot of US media organizations adopt these technologies, and that's where the outreach has been, but I think that these technologies need to now be adopted by journalism organizations around the world. Hi, quick question. I'm new to newsroom things, but I find the idea of operational security for journalists in newsrooms absolutely fascinating. However, in my discussions with people at various newsrooms, usually in IT and security, despite the myriad options that people have, they actually do not recommend anything officially, because of course the stakes are way too high. I was wondering, in the newsrooms that you had been in, were there specific instances when there were practices that were 100 percent sanctioned and taught? Well, two things. When I was at the Guardian, first of all, there was nothing that was systematic, and this is not just the Guardian, it's true of other newsrooms I've been in as well. I think there were certain practices that individual reporters practiced; getting them institutionally supported was harder. I think we're now entering a new phase, and SecureDrop is again one of those tools which you now hear talked about at the institutional level. That's actually an important distinction to make, between individual opsec for reporters and how journalistic institutions say: how are we going to adopt this, what's good, what should we agree on, what are the standards, and have that conversation among themselves. So in general, the standards of security and the types of tools were restricted, and the other concerning thing is that they were restricted to a tiny number of reporters, because it was always felt that outside that charmed circle nobody else would need that level of security. And as Jillian and Nathan said, these tools are hard, though they've got a lot easier in the last couple of years since I've not been in newsrooms, but we need more progress on that as well. But also threat modeling: I think that sometimes it isn't possible to guarantee safety, so you need to threat model, and your journalists need to think through: who am I exposing to danger here, and what can I do about it? It's not all about tools; some of it is also about thought process, and about low-tech solutions as well. And that was not routinely taught, and that's where places like the place I work, J-schools, have got to really up their game and close that skills gap. Yeah, I'm going to use my favorite line here and say that encryption works, but encryption tools are like condoms: they're 99 percent effective when used properly. But really, another thing that freaks me out is the idea of people pushing a certain tool as being the savior, or "NSA-proof." If you see that term, run in the opposite direction. I mean, yes, we use a whole bunch of different things at EFF, and I think that all of these tools are really important, but I don't want anyone to ever think that these are going to be perfect solutions. And
going back to the nihilism point, which I didn't get to comment on: one of the places I gain inspiration and hope is from my Tibetan colleagues, who, after 50 years of occupation, against the odds of a huge state power and the greatest surveillance state on record, are still optimistic and inventive, and find humor and find solutions and practical ways to move forward. So if you're feeling like you can't do anything, some of the work we're doing with the Safe Travels Online campaign is meant to give you inspiration to try and to take small steps. That's what keeps me going, and it's not about complexity, it's about intuition and the little decisions you make every day. Hi, my name is Ilaria, I work in C cell. I'm interested in your views on Tor. I have a machine that has Tails on it, and it's unusable: it takes too long to load, there are tons of websites that have been blocked, and the worst part of it is that even though it's very good for certain aspects, 80 percent of users on Tor use it for illegal purposes. So having this kind of privacy without accountability has this kind of side effect, and these kinds of tools, while maybe effective, might cause more harm than good in some places. Nathan, I think we're going to make you answer that one first. As a developer of Tor, I'll admit my bias. On the numbers of illegal or non-illegal use: I mean, Tor is illegal in lots of places in the world. And I know that Tails itself, if you're trying to use it for day-to-day activities, can slow you down. I think finding ways that tools like Tor can be useful to you in day-to-day life is a good thing, so consider that you may not need the ultimate security of Tails. I run Orbot, Tor on my phone and tablets, and I run Twitter through it. I say, I'm going to run all my Twitter traffic through Tor, and that's a great way to use Tor, because it doesn't feel slow, and people aren't tracking when you're using Twitter, and that's what works for me. I don't use Tails all of the time. So you have to find what works for you, you have to find where the technology is viable. And I'll push back and say Tor is not slow: I can show you streaming video over Tor and all sorts of fast, usable ways to use Tor. I use Facebook over Tor, and Google. So ultimately, it's not black or white. Finding one way that you can use a privacy-enhancing technology in your life today, right now, that works for you, is a first step. Don't just throw the baby out with the bathwater because Tails was too much to start with. It's like saying, I'm going to have no bags and go full organic, I'm going to grow my own vegetables: switching your diet overnight is not going to happen. So I'll speak to that. I was going to say, I recognize that there are places in the world where Tor is nearly impossible to use, Lebanon is a very good example of this, and that's a thing I know they're very conscious of. But in terms of the illegal use, okay, substitute cars. I'm sorry to use this analogy, and I'm still going with this one: cars cause pollution, they kill people all the time, they run over animals, so let's outlaw cars because they can be used for harm. Sure, it's true that Tor can be used for harm, but there are incredibly important uses of it, whether we're talking about activists in Egypt or people in the US who are looking up things about their health that they need privacy for, whatever the reason. I don't want to say the good uses outweigh the bad, because that's what life is, it's always about trade-offs, but I think that Tor is incredibly important for all of those uses despite the fact that it can also be used for illegal purposes.
So, one of the many things we try to do at this conference is set up panels, discussions, and talks that are going to give us fodder for conversation when we head out and have drinks and have dinner, and questions about where we go with Tor are definitely one of them. But Micah Sifry is always suggesting to me that one of the jobs in life is to ask a good question, so I'm going to ask him to close off our conversation by asking a good question. I'll make sure I end with my voice going up. Actually, it's three, one question for each of you. So Emily, my surveys of journalists in the US suggest that at the moment the number of American journalists using any sort of secure communications is somewhere between one and five percent. Why is that, and what can we do about it? Jillian, same question but related to
activists. And Nathan, my question for you is: who pays to develop these tools? I'm a big worrier about the adage that if you're not paying for something online, it's because you're the product. You are giving away these tools for free, but shouldn't we be paying for them?

Three fantastic questions. Let's start in the order they were delivered.

Okay, well, one to five percent is exactly where you would expect it to be, and I think the answer is that American journalists haven't felt the need to, they haven't been aware of what the risks or threats are, and they haven't really seen how this arc is going to progress, because they're busy people with a lot of stuff to do, and this stuff gets in the way. You've got lots of stories to write; have you got time to stop and learn all of this opsec stuff? If you're a local education reporter, does that really matter to you? I think what we know now, what Snowden shone a light on, is that everybody needs to do this, because journalism needs its own channels of communication. I was at a bar association meeting where a very senior lawyer came up to me afterwards and went, oh my god, I hadn't realized that of course, for the people we are representing, we have no client confidentiality. So it's not just journalists, and this is the start of an educational process. We have to be really active in that as J-schools, journalists have to be really active in reporting it, and the technology community, I hope, will be active in helping us figure out what these problems are and come up with some robust and easy-to-adopt solutions.

Okay, so my answer, and I'm going to keep it short: I think the first thing is that the developers need to listen to their constituents, listen to their users, and take
their feedback seriously. Another thing: make these tools multilingual, and start from that premise. Don't assume that your user base speaks English, or, even if they do, that they can use a tool with English instructions. And for the third, I'm going to paraphrase an Egyptian friend of mine and say that until these tools are as easy to use as a toaster or a remote control, a lot of people aren't going to use them. I think we're getting there, and that's one of the keys: make these things seamless to integrate into your normal daily life.

In terms of who's paying for this now and who should be paying for it down the road: we're lucky right now to benefit from various funding sources, including Internet freedom initiatives of the U.S. government through groups like the Open Technology Fund, so your taxes are paying us if you're American, or Dutch, or from a few other governments, Swedish maybe. The Knight News Challenge gave us a generous grant for work on InformaCam, and Ford and OSI and other foundations have taken an interest in this. Eric Schmidt himself gave us some funding. And I think that can go on for a certain period of time, but it can't go on forever. I need to talk more with Mozilla and Apache; they've figured out really fascinating ways to get lots of money from corporations and donations. Ultimately, I'm really excited to see how someone like Silent Circle is attempting to create a business model around privacy and security. Unfortunately, the minute you go that route, you tend to start prioritizing corporations and the military, and not the people I really want to help. I used to do that work, and I want to help people who maybe can't pay or don't have a credit card. So I'm looking for a longer-term plan, and help with that. And again, Mozilla,
Apache, Linux, the Web Foundation: I'm inspired by those groups because they've proven you can do this for 20 years and more, well beyond the two- or three-year grant horizon in my head.

So let's get a round of applause for our very smart, engaged, and extremely helpful panelists. I think they're all around for dinner and drinks, and I'll help you install apps. If you want to continue this, either on the geeky level of what should I do, or on the higher level of what this means about trust, what this means more broadly, these are three terrific people to connect with. In general, please keep in mind that one of the big reasons we do this is the hope that we can spark some interesting conversations and maybe collaborations, which is why at this point there is an open bar before dinner. So please head on out. Thank you all for a really long, informative, but wonderful day; thanks for hanging in with us, and have a wonderful evening.