So, as a professional journalist, I get to ask the first series of questions before we turn it over to the audience. And Rebecca, I'd like to start with you. You call in your book for a new way of looking at how Internet policies are conducted, and for empowerment on the part of network users, of us regular people who use the Internet, both to hold our governments accountable, wherever they are, and also the corporations, which can be as powerful as nation states. And I think in the past couple of months, with the online protests over the rather aggressive anti-piracy measures that were recently withdrawn in Congress, people feel that this can actually work, that enough Internet protest and sharing of information can lead to changes in government policy. But it doesn't seem as easy when it comes to corporations, who aren't responsible to voters. What would it take to get companies to balance their drive for profits with these more moral questions?

Well, I think we can look to other sectors and how companies have changed. Back in 1970, you had Milton Friedman, the economist, writing an essay arguing that the only socially responsible activity of a business is producing value for its shareholders. That was also the year of the first Earth Day, when I think we had a real social awakening about what sustainability means: that the long-term value a company generates, for society and for itself, is about more than just short-term profits. Delivering value to society is how you stay sustainable within a larger context, and it's about whether your products are ultimately contributing to the kind of life and the kind of planet that we want to have.
And so when it comes to the Internet, to telecommunications companies, to Internet companies, service providers, and so on, I think what we're seeing now is an awakening that we need digital sustainability. Just as it is not acceptable for companies to hire 12-year-olds, even though that might maximize profits, and it is not acceptable for companies to pollute our air and water, even though doing so might maximize their profits, the idea that such behavior is acceptable has now been socially deemed illegitimate; it was much more controversial some time back. And I think we're now recognizing that we have entered an era in which we depend on these digital platforms and services in all aspects of our lives, not only our personal lives and how we conduct our business or education, but also our politics. And if the way these networks are structured, the way they're managed, the way they're governed through their private terms of service and so on, is not compatible with the kind of values we want our society to have, the kind of freedoms we expect, the kinds of freedoms and rights and ability to hold government accountable that people are risking their lives for around the world every day, if these platforms and services are not contributing in a sustainable manner to the kind of world we want to have, that's not acceptable.

If I could just follow up on that for one point before I turn to Jillian. Are there any early examples of an online effort, or an effort by consumers in the U.S., that have been successful in changing a company's policy, either here or in some other country? Rebecca, that's actually for you.

For me? Oh, okay. Well, I think we're starting to see it. For instance, Google recently rolled out Google Plus, its new social networking service.
And they started out with a very similar identity policy to Facebook's, which is that you're required to use your real name. A lot of users didn't like that; they had been hoping that Google would be different. And people started actually using Google's own platform to lobby Google's management to change the policy. They have now begun to adjust the policy to allow pseudonyms in some cases. Not everybody's happy, and there are still a lot of bugs to be worked out, but you've seen Google's executives at least willing to listen and adjust. And I think that's an example of how, when people get organized, at least with some companies, you can see adjustments. And even with Facebook, you've seen, for instance, Jillian and other activists involved in trying to get Facebook to make certain features more secure, so it would be harder for hostile governments to hack people's accounts, and advising them: okay, here are some things you need to do. So definitely I think there are cases where, when you get enough people going to the companies and saying, here's the right thing to do, it's going to increase trust, it's going to increase the value of your service, not just the commercial value, but the way in which people value an environment as a trustworthy place to be, and that's in your interest in the long run.

Thanks for that. Jillian, Rebecca in her book describes an arms race between surveillance and the techniques for avoiding surveillance. And I guess China is probably the acknowledged master of this, and there are lots of other countries attempting to follow suit, notably recently Iran, which has really been beefing up its surveillance capability. And on the other side, we have a couple of scruffy activists who are cobbling together things like Tor, which allow people to evade surveillance.
Being maybe unkind to the activist crew, they're by and large motivated by idealism. It doesn't seem like a fair fight. How is that battle going? And is there anything that regular people can do to tip the odds?

Right. So when it comes to the fight that we're having right now over surveillance, you're right, we've seen increases in Iran and more recently in Syria, which is likely to have been helped by Iran. And it really is an unfair fight. You have activists who are trying to use the internet for protest and also for ordinary, everyday things, and then you have these governments basically spying on their citizens. So where the fight stands right now is that organizations like the EFF, and plenty of other organizations in the US and in Europe, more than I can name, are working to try to apply standards like the ones Rebecca has discussed, the standards that exist for some of these social networking companies, to surveillance companies. One of my colleagues was in Brussels last week, testifying to the European Parliament about how some of these tools have been used and how we think we can hold these companies accountable. But on the other hand, some of the regulation that's been proposed is problematic. Back in the 90s, regulations blocked encryption technology from being exported, and we had the crypto wars over that; now we do have the ability to export encryption technology, but there are similar issues with putting regulations on surveillance technology. As things stand right now, Syrians are prohibited from accessing certain communications tools because of existing Commerce and Treasury Department regulations. So the challenge there is to strike a balance between ensuring that the surveillance tools don't get into the wrong hands, while also ensuring that communications tools are accessible to all.
But it does seem like an unfair fight. So in terms of what people can do, I think we should really be looking at the same model of lobbying these companies. Users, and not just users, since obviously a lot of these tools have multiple uses, but stakeholders and everyone involved in this process need to be aware of how these tools are being used. And in the past year I think we've also seen a lot of excellent awareness-raising on the subject, not just from rights groups; Bloomberg's coverage, for example, has been quite incredible in terms of delineating how and where and what is going on. So I do think that people are becoming more aware.

Well, if I can follow up with you on that. It's telling that there are events people call things like the Wiretappers' Ball; I mean, information about this industry is actually fairly accessible to the general public, and you can see a problem there. But if we look at that export issue, as I understand it, there are virtually no regulations preventing the export from the U.S. and European countries of some pretty capable equipment to some unpleasant places, except for the ones we officially have no dealings with at all, because it's all dual use. It can be used for stopping porn or catching bad people, and it can also be used to spy on everybody. Is there any way to change that? Is there any way to have greater restriction, or do you think there's some chance of investor or public pressure on the companies? I hate to pick on them, but there's a company called Narus here. They don't sell machines to Ma and Pa on the street, and I don't think an EFF-inspired boycott would really make much difference to their bottom line. So how do you impact companies like that? I guess that could be for both of you.

Right. I mean, there is a big difference: some of the companies do produce tools that are used for home use as well, and in those cases it's much easier.
But when you've got tools like what Narus produces, the EFF has advocated a set of know-your-customer standards, whereby companies would be required to be transparent about who they're selling to, what they're selling, and what it's going to be used for. That's our position at the moment. I've seen proposals from other organizations that deal with similar ways of handling it. But I do think it's a difficult fight, and I think ultimately we will have to force the hand of these companies to be transparent; without pressure from the shareholders, it is quite difficult to get them to move on that.

Yeah, definitely I think we're starting to see socially responsible investors move into this sector. Until recently they haven't been thinking about technology companies in terms of the downside, and whether there are certain companies you just don't want to invest in. So that's certainly part of it. Part of it is just greater awareness. A lot of these companies, like Narus, for instance, have been getting away with it for a very long time, with no public reporting on what they've been doing or anything else, and there hasn't been enough reporting on a lot of these issues. Also, as Jillian says, these companies are not required to be transparent and report on where they're selling. And you have the problem that the US government is itself one of the biggest customers of these tools, and European governments are massive customers of these tools. Are they saying anything to the companies? Are they expressing concern about who else is being sold these technologies? Are they trying to exercise some kind of influence through their buying power, saying we're only going to buy products from people who set certain standards about the human rights records of the countries they're working with, with transparency and reporting requirements?
I mean, even if you don't have a law, the US government could have standards for its vendors: if you're going to be a vendor for us on surveillance and security technology, we need to know certain things about what you're doing, and we're going to report that publicly. They could do that if they wanted to, and that would have a huge effect. So there's a lot that could be done that would at least shed light on what's happening. And while these companies don't make consumer products, I think if this information is more regularly before the broader investor market, in the Wall Street Journal, which over the past year it has begun to be, though before then it really wasn't, you might start to see a bit of a shift.

Okay, let me talk a little bit about the face of activism as we see it now. It's been morphing a lot, this sort of digital activism. Certainly it's entwined now with the Occupy movement, and probably with most other movements from here on out; it seems to be a fundamental part of the program now. But it was also very much in evidence against SOPA and Protect IP in Congress. Still, the thing that I think most people think about is Anonymous, which is kind of mob-like and does things that are against the law. Because it's so diffuse, you can't have a sit-down with the leadership and say, well, why don't you point your guns over here. So for both of you: is there much hope of that getting moderated or channeled? Or, if people are annoyed with the behavior of companies or governments, is signing up to participate in a distributed denial-of-service attack against something they don't like just the easiest thing to do?

You want to go first? Sure, sure. I mean, it's interesting, because we've seen a lot of despicable things done by Anonymous, and then we've also seen them hacking the website of the Syrian foreign ministry. And it's very difficult for me to look at something like that.
I mean, in the case of Syria, what they had done was actually to deface the site by putting up tips and instructions on how to use the internet safely in Syria. And so in that case, it was really hard for me to look at that and condemn it. Now I do, because I actually believe that while those attacks can be used against governments and against huge companies in somewhat positive ways, they're more often used against small independent media and human rights sites that can't defend themselves. So I take the stance that I don't think attacks like that are a good idea. But with the protests, I guess it was almost a month ago now, over the Stop Online Piracy Act and the Protect IP Act, SOPA and PIPA, we saw a different kind of action taking root. We saw people raising their voices instead of lobbing grenades, so to speak; horrible analogy, I suppose. And I think that might be something we'll see continue. It's difficult, though, because when we looked back and debriefed over what happened with the protests around those two acts, it doesn't necessarily seem like those numbers are easily replicable. And so I've gotten the question a lot over the past couple of weeks: were we far over the tipping point, or, with the number of people and the number of organizations and companies involved, were we just barely there? And I can't answer that. So I don't know if we'll be able to see that sort of action replicated easily. I don't know what Rebecca thinks.

Well, to speak to the Anonymous point, and then maybe we can talk more about the Stop Online Piracy Act and the activism against it. I agree that there have been cases where Anonymous has done some helpful things, like in Syria, helping dissidents.
However, the ends-justify-the-means attitude that many members of Anonymous have troubles me deeply, having begun my academic career as a student of the Chinese and Russian revolutions and having seen where those went. OK, if a certain police department does something that violates citizens' rights, is it constructive then to go and hack their computers and expose the personal information of all their employees and their families on the internet, along with their home addresses? I'm not quite sure what problem that solves. Or, OK, BART did not act intelligently in how it handled the shutdown of its cell phone system, its wireless system inside BART. But by hacking their site and exposing the account information of everybody who had an account with BART, what is that achieving, exactly? How is that resolving the problem of the abuse of power, either by government entities or corporate entities? It's kind of like: King John was a really bad king, so Robin Hood went around robbing from the rich to feed the poor, and he was engaging in civil disobedience, and that's great. But he didn't actually solve any problems of bad governance. He was just sticking it to the man, and that's about it. He didn't bring anything forward. He was cool, and everybody thought he was great because he was sticking it to the man; romantic. But what problem do you end up solving? None. It took people who were being more constructive, saying, OK, we need to come up with a new way of governing other than the divine right of kings, and we need to build new structures that hold power accountable, which eventually led to the American Revolution and so on, to actually help solve long-running, entrenched problems of bad governance.
And so, coming back to what Jillian was saying, yeah, what we saw with the protests around bad legislation and bad governance, and people trying to figure out, OK, what are the solutions, is a positive step in the right direction. But I think we need to be careful, because I hear from younger folks, and it makes me sound so old, but I hear from a lot of people, some of whom have, I think, affiliations with Anonymous or with different hacker groups, who use the language of the ends justifying the means: innocent people, yes, are going to get hurt, but this is a war against massive abuses of power and there's going to be collateral damage. And that language really scares me, because, again, I started my academic career studying the Chinese and the Soviet revolutions, which also used similar language and didn't turn out so well as far as human rights were concerned.

I want to ask perhaps a fairly broad question. I steered away mostly from copyright because it seems like we spill a lot of ink in the journalism business talking about copyright, and there seems to be a lot of furor in Congress about copyright, and dramatically less about human rights. Maybe that's just because people in America make more money off copyright than they do off human rights, I don't know. But the broad question is: is there a lack of balance in the discussions about not just copyright but also security? Because we have a very big omnibus cybersecurity bill that is rearing its head again in the Senate and may or may not pass, and there have been lots of smaller cybersecurity efforts. But it seems like if that follows the model of the copyright legislation, it can be pretty short-sighted, technologically and in terms of people's individual rights. So I want to ask both of you to take a whack at copyright, and at security versus privacy, in terms of where the government is putting its emphasis. Yeah.
Well, if I could just take a crack at it first. We're facing a common problem with internet-related legislation, not only in the United States but I think around the democratic world, where legislators are faced with a problem: OK, we have attacks on our network, or we have theft of intellectual property, or we have child porn, or we have cyberbullying. And constituencies are screaming, do something. So they want to do something to resolve that particular problem, whatever basket that problem happens to be in, and they go for a solution without really thinking about how that solution is going to affect these other areas. Because the internet is so new, legislators tend to think about problems on the internet the way you fix your toaster or your refrigerator or your computer: you fix the problem, and then it's fixed. Rather than seeing this as a space into which society is extending, one that contains all the same contradictions that we have in the city of San Francisco. If you want to 100% solve the crime problem and have zero crime, you're going to have some trade-offs that may well be unacceptable, because you'll turn the city into something like North Korea. And if you don't want that, you have to think about what kinds of solutions balance other concerns and rights. How do I make sure that all affected stakeholders are consulted? That's what you do if you're governing a physical place. But legislators, I think, are not used to thinking in those terms when they're solving problems in the digital realm.
And I think that's what was good about what we saw with the Stop Online Piracy Act and the debate about that particular legislation. It was trying to solve a problem in response to one constituency that was screaming about the problem, without consulting everybody else who was going to be affected, to see whether there were other ways of solving it that might not be quite so damaging for free expression, that might not put in place technical mechanisms that look like the Chinese firewall and legal mechanisms that impose unacceptable burdens on internet companies to police their users. And just have a broader conversation about what the solutions are. Coming back to security, it's the same with policing in physical space: if it's just about whacking the criminals and cracking down, you're actually not going to solve the crime problem, because you don't have the community buy-in, you don't have the community involvement. In fact, you might turn people against you, and your measures will lack legitimacy. So again, it's about how you get buy-in from all the stakeholders when you're trying to resolve a problem. And whether it's child protection or cybersecurity or copyright, legislation just has not taken that approach, and that's why it tends to fail. Then they keep coming back, trying to add more legislation, and it still doesn't solve the problem; it just makes everybody else mad.

Yeah, I think there are a couple of great examples outside of the US that are really similar to this. In Tunisia, tomorrow it will be announced whether or not the government has to filter pornography. This was a long series of appeals, and the final appeal was made to the country's highest court. And tomorrow, if it's determined that they will have to filter pornography, then they will have to re-implement the same systems that existed during the Ben Ali era.
And what that means for Tunisians is the fear that the government will overreach. I think that's the same thing we saw with the Stop Online Piracy Act: the provision that would allow government blocking of websites raised the exact same fear, the fear that it would then be used to overreach, or would cause collateral damage. We're seeing this in India; we're seeing it in a number of countries around the world. And I think, generally speaking, the digital rights groups and the technical minds are not consulted in these processes. That's what we saw with those bills, and that's what we're seeing in Tunisia.

Well, I've monopolized enough of your time. I'll turn it over to audience questions now, which I will pick from the stack. If you have any more things to ask, please send them up. This is a question I'm thinking is for Rebecca; it's certainly outside of my expertise. Maybe not, maybe it's still interesting. How significant is the experiment with online free speech in Iceland, and in parentheses, I-M-M-I, close parentheses? Can it provide protection to dissenters, journalists, and regular users, or can it raise the bar on free expression online internationally? Are there other models that look promising? I have no idea what the questioner's talking about; I'm counting on one of you to bail me out.

So IMMI is the Icelandic Modern Media Initiative, but I am afraid to admit that I haven't followed its progression closely over the past year.

Yeah, I'm not quite sure exactly where it stands at the moment either. I know that there was discussion of setting up basically a set of laws and a regulatory structure that would make Iceland a haven for online free expression enterprises, including WikiLeaks; this is vaguely coming back to me. And ironclad protections against surveillance and whatnot, the details of which I would have to go and check.
But again, it's unclear exactly how far they've gotten, so I'm not sure.

So there is no free speech haven where you can just host everything and then not worry about it?

Well, there are people talking about creating islands, manmade islands in the ocean, that have no national legal jurisdiction, so that no actual nation state has the ability to demand access to the servers. But the problem is that in any nation state there are law enforcement issues, and there are constituencies demanding that terror be fought and that nasty people who do bad things to children be chased, and so on and so on. So it's really hard to make things ironclad against state access. And then if you do have law enforcement access, even in a very democratic society, how do you ensure that it's not being abused? How do you have the right kind of checks, and ensure that it's accountable? This is the problem that I think all democracies are really struggling with. We haven't worked out the model; there's no model democracy right now that has gotten it right. And I know there are a couple of people sitting around here who work on this full time as well. But I think there are some countries who've gotten some pieces of it better than other pieces, and you could maybe cobble together the laws from several countries and say that this is ideal. But I don't know that I've even seen anybody put it all together: here's the model set of laws and regulations and political checks and balances and corporate practices, all combined; here's what you need if you want to have digital infrastructure and corporate behavior that's really going to be democracy-compatible, maximally compatible with free expression and civil liberties; here's the model to go off of. I haven't seen that anywhere. We need it desperately.

Here's another question for you, another broad one. Is it possible to have censorship without violating human rights?
What is the balance that can be struck between the two? Oh, it's a tough one. Is it possible to have censorship? Do you want to take that one?

There are some people who argue that it's not, because the whole point of censorship is that people don't know what you're blocking. I suppose in theory, if human beings were perfect, and if government were perfect and so on, you probably could. But even in Europe, say, there are a number of countries that have blocking systems, censorship systems, in place for child pornography, and there have been some studies done on a couple of levels. One is: is there mission creep, and are there mistakes made in terms of some of the sites that end up on the list? And based on leaks and other information that's come out, in pretty much most cases there's some overblocking that happens. So there ends up being what people in the field call collateral censorship. It's very hard to do it completely right without accidentally censoring stuff that you didn't intend to censor, even when you're trying to get it all right. It's also very easy for stuff to get on the list in weird ways. So in the UK there's a watch list, not exactly mandatory, but one that most of the internet service providers use for blocking child porn. And for a short period of time, parts of Wikipedia were blocked, because there was a Scorpions album cover with a prepubescent girl on it, and somebody at the organization making the watch list put that Wikipedia page on the list as child porn. So there are these borderline cases that sometimes end up on the list, and then there's the question of who's deciding, who's exercising that power, and how you get something off the list. But the other thing is that people who've been doing research specifically on censorship of child porn are asking the question: is it actually solving the child porn problem?
Is it actually resulting in a reduction of the exploitation of children? And the answer is: it doesn't look like it is. It's just putting a band-aid on the problem for the general population. The sick people who really want the material are still finding ways to get it, and the people going out there kidnapping kids and exploiting them are still doing it. So it's actually just diverting people's attention from the problem, and not enough police resources are going into the hard problem of human policing. That's the other issue with censorship, and there needs to be a lot more research here: in what cases does censorship actually solve the problem you're trying to solve? Assuming it's a legitimate problem in the first place, as child porn certainly is.

Yeah, I would agree. I guess I was interpreting the question more philosophically, but I think that when it comes to filtering or blocking of websites, I actually don't believe that it's ever the solution. If we look at the way the U.S. deals with child pornography, which is matching photos against a database and taking down the websites, going after the hosting, I feel like that's a much better solution than what other governments do, which is, again, just blocking the sites and forcing the problem underground. And I would agree with what Rebecca said: I don't see it solving the problem, and the same goes for other issues as well. If we look to the United Kingdom, where last summer, during the riots, there were pushes for Twitter and Facebook to censor certain content, that was another example of the same thing: would doing that actually solve the problem, or would it just push people to use other, less recognizable tools, to coordinate by other means?
And now the same thing: just this week, the UK put out another paper recommending, again, the censorship of online extremism, and I'm just not sure how that's going to solve the problem. So I would say that no, with blocking of websites, that type of censorship, I don't think there is a way to do it while still respecting human rights.

Here's another question. With the New York Times story this weekend that the Justice Department has ways to monitor reporters without approval by a judge, how much safer online from their government are US citizens than Chinese citizens? It seems that both the US and China can and do monitor their own citizens when they decide to.

Well, that's certainly true in terms of the potential access, I think. The difference is what is done with the information, ultimately. There are a few more controls in terms of what the US government accesses under what circumstances, although those controls are excessively loose, excessively unaccountable, and not sufficiently subject to constraint. That said, in China you can post a tweet and have a policeman show up at your doorstep and take you into detention, and to my knowledge that has not happened in this country. I know that a tourist who was coming to this country posted a tweet and wasn't allowed in, but that's not quite so bad as throwing him in jail and torturing him. But I think we are definitely coming to a recognition that if you really want to be untraceable, if you really want to be completely unmonitored, or you want to be ironclad sure that you're not monitored, don't conduct your conversation or your interaction electronically; you need to be completely analog. Journalists and diplomats working in China have assumed this from the beginning. If you're going to a meeting with a source, you leave your cell phone at home with the battery out, and you don't arrange the meeting electronically.
Sometimes you just show up at somebody's house and then go for a long walk, and even then you might be seen. But yeah, the US Postal Service is certainly the most secure way to communicate with people. That's absolutely true. For handwritten letters, the government needs a warrant to read your mail, and the protections there are very clear cut. To read your email, they are not clear cut at all, and there are all kinds of gaps and loopholes. If you're using Gmail or Hotmail or Yahoo Mail or whatever and your email's over 180 days old, it's just totally fair game, because the law regarding electronic communications, surveillance, and government access was written back in the 80s, before you even had the World Wide Web, when they assumed that if email was stored on a company's server that long, it had been abandoned. So there are all kinds of loopholes. And the EFF did a study that came out last year looking at national security letters that are issued to companies for access to so-and-so's account, or a set of accounts, and so on. The percentage of requests that were actually of dubious legality was quite high, and the percentage of instances in which the companies actually challenged the legitimacy of the request was quite low. So even things that companies could challenge, they're not challenging on behalf of their users, and that's the even more troubling thing. And of course, a company like AT&T, as was found a few years ago, collaborated with blatantly illegal surveillance. In 2006, I believe it was, a whistleblower who had just retired from AT&T let it be known that actually, here in San Francisco, the NSA had a secret room, and they'd gotten the engineers to funnel all communications through that room, where it was being captured. 
There were no warrants or subpoenas or national security letters being issued to determine whose information could be looked at. It was just all being hoovered in. And the EFF and others tried to sue AT&T with a class action lawsuit, and it was thrown out. Why? Because AT&T is immune from liability for collaborating with blatantly illegal acts, thanks to a law called the FISA Amendments Act, which Obama as a candidate said he wanted to overturn, and then shortly before he got elected decided, well, maybe not, and now he definitely doesn't want revised. His administration is not favorable to revising the Patriot Act either, with its gaping lack of accountability when it comes to government access to information. We are not China by any means. I can tell you, I've spent enough time in China to know that the United States is nowhere close to what it's like in China if you're trying to organize a protest. Occupy Wall Street would not happen there. They occupied Tiananmen Square, and that didn't go so well, and people trying to do that since don't fare so well. But still, there are very troubling trends here. Okay, I think that's a pretty thorough answer. Not too long. Another question, from Mark Glazer. What happened to the money that the US State Department had to promote internet freedom? Hillary Clinton gave a big speech, and what happened next? So, let's see if I can give the broad overview rather than the detailed and whiny one. A lot of that money has gone toward tools: circumvention technology, and also Tor, so anonymity-enabling technology. Tools that are used by people here, used by police, used by people in other countries. So, tools that I would call, for the most part, relatively neutral. And some of that money has also gone toward trainings, which are less neutral. And then, of course, we've also seen a lot of contention over which tools that money's going to, and whether those tools are effective or safe. 
And of course, over who they're targeting. And so there's been a lot of back and forth in Washington over the funding of specific tools, and I won't get into that. But what I would say, as the broad answer, is that I think those tools have been incredibly useful in enabling the safety of activists in a number of countries all over the world. In that sense, they're wonderful. But in terms of the rest of where that money is going, I haven't seen a whole lot of output there. I'm trying to couch this in the right language, but I just haven't seen proof that that money has gone to the right place and that it's really making a difference. So, circumvention tools are wonderful, don't get me wrong; they do help people get around censorship in a number of places, but they're not a silver bullet. And I personally believe that that type of funding, and generally speaking those efforts, need to be diversified. And I don't think that we've exactly figured out how to do that yet. All right, I think that's fairly thorough. A member of the audience alleges that some digital companies build a backdoor into their software to allow government access. Is this true? Do you have any examples? And are there any legal restrictions that would prevent that? Well, a lot of technologies are required to have lawful access built into them, which is different from a backdoor as most people think about it, I think. Yeah. Folsom Street is a backdoor. That's a backdoor. As opposed to sort of colluding outside of lawful access. So it depends on the technology, really. Do you know of any? I mean, there have been great fears in other countries that Windows comes with its own NSA peek capability, but has any of that ever been exposed? Adobe in China. Well, it was hacked. Adobe was hacked. 
But there's a lot of software that has flaws in it that hackers then exploit to access other people's computers. But that's not the same as a backdoor; that's not something that the U.S. government, or some other government, required the company to deliberately build so that others could access user information. We could, of course, preserve deniability by hiring the hacker that finds the exploit. Right, yeah. I mean, not knowing what the intent of the person writing the question was, I think it would be worth pointing out that a number of these companies, even if they don't have backdoors, and I actually can't think of a good example at the moment of one that does, don't have safeguards in place for how they deal with legal requests from other countries. Obviously this is Rebecca's area of expertise, and it goes back to why some of these protections are now being put in place and why some of these multi-stakeholder initiatives are forming, but at the same time, we still do see examples of companies handing over information to foreign governments without much consideration of what the legal process is. And then, I would say even more commonly, we see companies taking down content at the request of foreign governments without really any consideration at all. There was actually a great study done recently by the Centre for Internet and Society in Bangalore, India. India now has this law that allows anybody to go ahead and request that an intermediary take down content, and the intermediary is required to do so. There are certain parameters, but they're very broad. And this organization did a study where they submitted something like nine requests, mostly to American companies, and seven out of nine of them, I believe (don't quote me on that), were complied with, and they were all absolutely outrageous requests. 
And so I do think that with a lot of these companies, not only are their terms of service not really compliant with free speech principles, but they're also just kind of taking down content at the request of anybody. I mean, one example, and again I'm not quite sure if this fulfills what the questioner was really asking, but Skype, in order to market its software in China, went into a joint venture arrangement with a Chinese company called TOM. And it turned out that that version of Skype was logging everybody's text messages on Skype and sending them to a public security bureau computer in China. There was a researcher in Canada who uncovered this. And apparently, the Skype head office claimed that they were unaware it had been quite this bad; they knew that the company had been doing something, but, you know. So there, again, this kind of speaks to a lot of companies not thinking through how they're going to protect their users' interests. And there are a number of Chinese companies trying to do business in the United States now and running up against a lot of suspicion that the Chinese government might have a backdoor in a lot of these technologies, and worry about that. And this is why I think it's really in the interest of industry globally to set up a set of mechanisms to prove: look, we have a global standard on how we're protecting user information, and we're allowing ourselves to be audited to show that we're actually adhering to these standards. There'd be a lot more trust, I think, in our networks if that happened. And I think commerce would benefit, and the entire network would be more valuable. 
Well, I actually wanted to follow up on that, because one of the things that's quite valuable in your book is that you talk about the insidious pressure on companies to collaborate. I mean, China's got some pretty nifty technology just in government hands, but the most powerful tool really is the fact that companies have to cooperate with their law, which pushes the burden of censorship down onto them. And that gets vague and maddening, and then they probably overreach. That's one of the things that drove Google nuts and eventually partially out of the country. But that happens in lots and lots of places, in lots and lots of ways. I know with your Global Network Initiative, you've tried to get the companies who sign up for it to adhere to certain best practices and explain what they are. But that's been around for a while, and there hasn't been a mass adoption by the Facebooks and a lot of the newer Silicon Valley companies. How is that going? Do you think more companies will participate in that? Well, I think if you look at other initiatives in other industries, it takes a few years for them to build momentum. If you look at the Fair Labor Association, if you look at initiatives in the extractive industries, trying to get oil, mining, and gas companies to sign on to principles and be held accountable, it can take a while to really get an industry to accept that it needs to be held accountable. And so, the Global Network Initiative: we launched in 2008, and Google, Yahoo, and Microsoft joined. We now have two more companies, and we're talking to a whole bunch of other ones. I think there's a range of companies who are recognizing they can't go it alone and that they need to figure out how to demonstrate to the public that they're trustworthy. And it takes a long time to go through their legal departments and get all the sign-off. 
It can take a couple of years for a company to get all the sign-off throughout the corporation to actually join an initiative. So it takes a while, and this year we're just going through our first round of independent assessment, where an independent assessor is actually looking at to what extent the companies are living up to their commitments. That process is still ongoing and the report is not out yet, so that's also a test. But, like I said, we're sort of at a 1970 Earth Day moment. When it comes to environmental standards, when it comes to human rights standards, you didn't get companies just to wake up one day and say, oh, I'm gonna do the right thing. It took a massive social, political, consumer, and shareholder movement over decades to get companies to recognize their responsibilities. And we've just not seen much pressure coming from the public, from consumers, from users, or even from government or from investors, on companies to step up and recognize that there's this new component to sustainability, and it's called civil liberties. Until you get a similarly strong and broad movement, in which the social norm, the social license to operate, includes respect for civil liberties and digital rights, a company that can get away with not being held accountable, of course, will. It's gonna commit to as little as it can get away with not committing to. So these kinds of things don't happen overnight, and they don't happen without a whole ecosystem of people saying, this is what we value about a company and this is what we don't. If I might add to that, though, I think one of the greatest things that's happened at the Global Network Initiative in the past year is that one of the companies that joined more recently is a different type of company than Google, Yahoo, and Microsoft, and that's Websense. 
And what's interesting about this company is that they produce filtering software. So, software that is sold to people primarily for their homes to filter the internet: child-safe internet, home filtering, great. Much better than the government doing it. But a couple of years ago, it was noticed that the country of Yemen was using Websense technology to censor political content and all sorts of other content. And when Websense found out about this, they were quite angry, and they attempted to stop Yemen from getting further updates. I know it took quite a while, but essentially what happened was they ended up coming around on this. And in the past year, and I don't know their particular impetus for joining, but in the past year, with all of the news around surveillance exports and filtering exports, I believe this company saw it as the right time to step up and really push those standards, and I hope that their joining will cause some other companies like that to start considering this as well. Okay, we might have time for two more questions, or this might be it, depending on how long your answer is, so you can spend some of your time on this one or all of it. What should individuals do to ensure that their privacy is protected as much as possible on the internet? Besides turning it off and resorting to the postal service. You know, I think the first step is a lot of what Rebecca's book talks about, which is that we as users, as the general public, need to become vigilant and start caring about this. People don't start caring about censorship, generally, until it affects them personally, but these privacy questions affect all of us personally. We just don't think that they do, because we're not criminals, and so we think that we're immune, presumably. 
But, you know, on the other hand, I think that's where we need to start: moving forward and pushing companies, whether it's pushing them to join the Global Network Initiative or pushing them to adopt their own policies. I mean, I think this is a matter of naming and shaming, and that's something that needs to happen. Yeah, definitely what she said, and it's public awareness. It's just being aware of what you're using and how that information is being shared, and who has access to it. I mean, I think a lot of people use stuff on the internet or mobile devices without really paying attention to how private or public it really is. And so part of that is just public education: educating ourselves, educating our children to think about, you know, I'm posting this thing on my friend's Facebook page, but is this really just as public as me sticking it on the wall over there and letting everybody see it, except magnified? Again, think about what you're doing and what its implications are. And a lot of that is just socialization, because the technology is so new. You know, we socialize our kids to look both ways before they cross the street. There was a time when nobody was used to cars and things were much more chaotic, and then you kind of socialized yourself to live in a new kind of environment with new technologies. We haven't socialized ourselves to this yet. So that's part of it too, I think, just in terms of education, from primary school on up. All right, let me get in one last quick question, and I'm gonna synthesize from a couple of the questions in here, because this is one of those relatively new things: targeted advertising and the sort of deep mining of information about individuals. 
Now, supposedly this is to show dog lovers ads for dog food as opposed to cat food, but it all goes into this world of shadowy middlemen who are auctioning the right to advertise to you in a split second. Is that the only place it goes, or is it going to go someplace else that we might find very unpleasant? Yeah, a security consultant who works for certain government agencies can buy the information, work out what your preferences are, and then maybe it gets used as part of an investigation in ways that are, you know, not appropriate. Well, how paranoid should we be? I think we need to demand transparency and accountability, and we need to be aware. We've gotten to the point where, yes, a lot of information is collected about us. We need to require that companies obtain more permission before they collect information, that they're more clear about how it's being used and sold, that there be more regulation about things being resold and so on, and that there be more reporting on it. So that at least if something is being abused against us, we understand who to hold responsible for it. And this, I think, is a lot of the problem: how do you constrain the abuse of this? Because sometimes we consent to the collection of our information because it's convenient for us. And so, okay, maybe we wanna do that, but we also wanna make sure that it's not abused for purposes that we did not consent to. And right now we just don't have the mechanisms to prevent that abuse. And I'm sure Jillian would have great things to add to that, but I'm not going to be able to give her the chance. I'm sorry about that; we're out of time. That concludes our program for tonight. On behalf of the World Affairs Council, I'd like to ask the audience to join me in thanking Rebecca MacKinnon and Jillian York for this excellent talk and discussion. Thank you.