So welcome everybody, my name is Micah Sifry, I'm the co-founder of Civic Hall. Welcome to Civic Hall. If you haven't been here before, we are a community center for civic tech, for people who work on using technology for the public good. You are in our event space, where we are pleased to hold events like these seven days a week. If you haven't visited our community workspace, which is on the other side of the building, feel free to check it out on your way out. I'm here to answer questions. Our staff are here as well. If you're interested in joining or maybe hosting an event here, you're absolutely welcome to do that. So today we're here for the launch of the Ranking Digital Rights report. And I have to say that it's a personal thrill for me to be able to help launch the report here at Civic Hall. Rebecca MacKinnon, the prime author and the driving force behind this report, is an old friend. I think I've known Rebecca from practically the early 2000s, when she and Ethan Zuckerman started Global Voices Online. She has been a fighter and a champion for digital rights. From that point forward, she's been educating the rest of us in many, many ways. And if you haven't read her book, Consent of the Networked, it's a must-read. It's one of the most critical books, I'd say, that anyone in this field has written on the world that we're in and what we need to fight for. So do go get Consent of the Networked if you haven't read it already. Here's how things are going to work. Rebecca has this enormous and incredibly valuable report, which is now available online. We're going to invite her up to give you the top-line findings, and we'll give her about 20 minutes to do that. There's really a lot of meat in this report. And then we've got two distinguished guests who are going to come up and help comment on it: Peter Micek from Access Now and Ellery Biddle from Global Voices.
And we'll have a little bit of Q&A together, and then we're going to open it up to the audience and take questions. This event is being live-streamed, and the hashtag for it is #rankingrights. So if you're tweeting about it or you want to follow along, or in fact if you are watching online and want to ask a question, use that hashtag, #rankingrights, and we'll try to pull some of those questions in as well when we get to that. So without any further ado, let me give you Rebecca MacKinnon.

This is really, actually really special that Micah's doing this, because as he mentioned, we've known each other for a while now, and Micah's been giving me platforms and helping me out ever since we met. And I just really want to say that the work Micah does, both with Civic Hall and over the past 10 years with Personal Democracy Forum and all of these other platforms, has provided a really important platform for conversation, for amplification of ideas, for projects, for connecting people. Plus your own books, which everyone should read as well. Micah is, I think, real connective tissue in this community and amongst different communities working on civic tech and digital rights and other things, technology and politics. So really, thank you. Also, before I get into this, a few other things. There have been a lot of people involved with this report, but there are only three full-time people on my team, two of whom are in the room today. Myself; Priya Kumar, our research analyst, without whom I would be dead, it's true; and Allon Bar, who is not here today, he's in Berlin, so say hi to him through the internet, he's with us virtually. Without him our data would never have, I just don't even want to think. He's also a real specialist in internet and human rights and has been vital in terms of thinking through how we're evaluating companies and being consistent in our standards.
I also want to thank Sustainalytics, the investment research firm, who are our partners. There's more information about them on our website and in our materials. We worked with a network of researchers around the world who helped us out, who contributed. So thanks to all of you; I won't go through the whole list. Their names are on the website. And we also want to thank Development Seed, who did our data visualization on the website. Really great work, and you can go on the website and poke through all the data. You can download the data, etc. As I'm talking, I'll be talking about top-line findings. We did hand out a little brochure that gives you an overview of our findings and of the top-line data, but definitely you'll want to go onto the website and explore, and you can also download the raw data as well. So finally, my final thanks is to the funders of this thing. We are obviously not taking corporate money; that would be weird. But Open Society Foundations, who are here, thank you for coming. The Knight Foundation, Ford Foundation, MacArthur Foundation, Media Democracy Fund and Hewlett Foundation are amongst the funders, and there's an even longer list on the website, but we're incredibly grateful that you've been willing to support this work, which has not always been popular in all quarters. We're really thrilled, so thank you. So without further ado, what is this index and why are we doing it? This has to do with holding power accountable in the internet age. Some of you have seen versions of this slide before, but people are increasingly dependent on internet and mobile services for every aspect of their lives. So how these services work, what they let you do, what they don't let you do, what they enable you to access and under what conditions, has an increasing impact on what you're able to do in your physical life, what data they're collecting about you, who they're sharing it with, et cetera.
And they also increasingly mediate our relationship with governments, for better or worse: not only our ability to figure out who we want to vote for, or political organizing, but also governments knowing what you're up to and what they're going to do about it, or how they control what you're saying. And so it's vital from a human rights standpoint that the internet, and the companies that run the internet or much of it, are adhering to commitments on human rights, and commitments to the freedom of expression and privacy rights of their users, if we want the internet to be something that is actually compatible with human rights. So what is this index? We evaluated 16 companies, eight of them internet companies. They include some of the obvious biggest U.S. platforms that are used by people all over the world, but we also wanted to make sure that we looked at some companies that are not the usual U.S. suspects but that are used by millions and millions of people in other parts of the world. So we chose a Russian, a Chinese, and a Korean company, and we'll hear more about Kakao later. And then a selection of some of the world's largest telecommunications companies, which collectively, in aggregate, are operating in lots of different countries and, again, have a real impact on what people are able to access, what they're able to say, what they're able to share, and who knows what about them as they're going about their online business. So this is a screenshot from the front page of our website. I know you can't see the company names very well, but it gives you a sense of how the data is organized. And as you'll see, the top-scoring company is only at 65 percent. If this were a test, they'd be getting a D in school. But I do want to say this is a diagnostic test. This is not a certification. This is a test you take at the start of the class to figure out where everybody stands, and then we can all get to work and figure out how we can all improve.
And so what does this mean? What do these scores mean? Basically, we found that anybody getting over 30 percent was making some degree of effort to respect users' rights, at least in some areas. None of them are doing perfectly. There's a lot of unevenness, as I'll talk about, but at least there was evidence of meaningful effort. Under 30 percent is really little or no effort. So that's how it breaks down amongst the telecommunications companies and internet companies, and you can go to the website and really poke around. So the headline is: there are no winners. As I mentioned, only six companies scored at least 50 percent. Nine companies scored below 37 percent, and nearly half were between 13 and 22 percent. And no company in the index is providing users with sufficiently clear, comprehensive and accessible information about their practices that affect freedom of expression and privacy. So going down into the findings a little bit, I'm just going to give you some slices of certain findings that I think help to illustrate what we found. So this is the overall scores for privacy. And again, I'm showing this to you not so that you can squint and look at the company names, but to see the arc of the scores. So Google overall got the highest score, but as you'll see, there are a lot of companies whose level of disclosure about what they're collecting, who they're sharing it with, and what they're doing is roughly the same, and it's roughly the same mediocre state. And then it goes down from there. One thing that's interesting is that you will have seen in the news, of course, a lot of debates about American companies' privacy practices versus European companies' privacy practices. And while data protection law in Europe is stronger, actually the two European Union companies that we looked at have less disclosure, particularly about what's being shared with authorities, than American companies.
So that's an interesting note, and it's a different observation than one about the law, or about what's happening under the hood that nobody's disclosing. In terms of what the companies are telling users about what's happening to their data, European companies are not disclosing as much as you might assume based on the popular narrative. To talk about some other surprising findings: when you're looking at these scores and you see who scored how much overall, it's really important to realize that when you get into the specific indicators, the specific questions, it really varies. So on the question about whether the company discloses what user information it collects, how it collects this information, and why, who scored the most? A Korean company called Kakao; they got 80% of the possible points on that question. The second-highest scoring company was Facebook, in terms of what they're disclosing about what they collect. Then you have a whole clump of companies that all disclose the same amount. It seems like there's almost a lot of copying and pasting from each other's policies, to some extent, in terms of what the standard practice is on disclosure. And then it goes down from there in terms of the amount that's disclosed. So again, just because one company is scoring highest in the index overall doesn't mean it's necessarily doing best on specific questions that are really important to people, like what it's collecting and how it's collecting it and why. There's another question about sharing of user information: does the company disclose if and how it shares user information with third parties? Again, Kakao scored highest on this. Look at where Google ended up. So again, they may have had an overall high score in the index, but there are particular indicators, again, that some people really care about, on which they did not do particularly well. On freedom of expression, these are screenshots.
One is a notification that Twitter provides when it withholds content on request from a government. The other is a screenshot of a block page you get on Vodafone mobile when they block pages that they've been asked to block for various reasons. So what are some of the questions about freedom of expression? It's about things like: are their terms of service clear? Are they explaining their enforcement clearly? Do companies notify users when they block or delete content? Are they transparent about requests they're getting, not just from governments but from private individuals, whether or not through a court, and from private organizations? Do they provide any information about the volume and nature of content they're taking down in enforcement of their terms of service? So on freedom of expression overall, Google got the highest number of points. This was on a number of indicators added together. In second and third place, again, Kakao scored really well. And Kakao is an interesting case of how this is not just a Western thing. This is not a set of criteria on which only Western companies can do well, or that only Western companies care about. We're actually seeing evidence that at least in countries with robust civil society and some rule of law, you can actually get companies really stepping up. And then Twitter also did very well. Going down into the data a little bit, I'm going to show a series of slides, not asking you to absorb which company did well, but just so you can see the progression of scores. So this particular indicator asks whether companies disclose anything about their process for responding to requests to take down user content or deny service. Only half of the companies in the index disclose anything about this process. And you'll see, interestingly, that one European company discloses nothing. When it gets to disclosing data about government requests to restrict content or access, it's even fewer.
Only six companies out of the entire index are disclosing anything about government requests to restrict content, although I would note that after our cutoff date for adding new information, Microsoft released a new transparency report where they did include some of this information. So next time we do this, their score will bump up on this indicator in particular. When you get to disclosure about private requests, so not government requests, not court orders, but private requests such as DMCA takedown requests, you have even less disclosure going on. Only four companies are publishing requests that they get from entities or individuals that aren't governments or acting on government authority. No companies publish any data or stats about the volume and nature of content being removed or accounts being shut down in enforcement of terms of service. And this is something that Ellery will talk about as a concern from a civil society standpoint. Now, there is some good news. And what is that? I'm going to focus the good news on what we call the commitment category. The commitment category looks at broader institutional leadership and the implementation of commitment and oversight. So things like: is there a commitment from the CEO? Are there clear company policies in support of freedom of expression and privacy? Is there oversight at the board or executive level over issues related to freedom of expression and privacy, or is it just left to some staff who care about it? Is there evidence of staff training? Is there evidence of internal whistleblowing when problems arise? In other words, is the company providing evidence that it has institutionalized its commitments? Is the company carrying out human rights impact assessments? That is, is it assessing the way in which its business is affecting the freedom of expression and privacy of its users and trying to mitigate that?
Are they providing any mechanisms for people to lodge grievances and get remedy if they feel that their rights have been violated? So on commitment, it's interesting. Some of the companies that score highest on commitment are members of the Global Network Initiative, which is an initiative that I actually helped start a few years ago, and which tries to get companies to commit to corporate principles on freedom of expression and privacy. They commit to carry out human rights impact assessments, and they commit to let GNI-appointed assessors then go and actually verify that they do these assessments at a level of quality that's meaningful. And so what we're seeing here is that the companies that are doing best on commitment are either members of the Global Network Initiative or members of something called the Telecommunications Industry Dialogue, which is a group of telecommunications companies that have also signed up to a set of principles based on the UN Guiding Principles on Business and Human Rights, and some of them are also doing assessments. And there are discussions going on between the Industry Dialogue and GNI about possibly merging or having its members join in the future. We'll see how that goes. But in any case, it's a very interesting mix here. Specifically within the commitment category, on the human rights impact assessment indicator, you're going to find this interesting. Who scored best? We actually have someone from Yahoo here; I won't force her to raise her hand unless she wants to. But Marissa Mayer, when she came in as head of the company, actually inherited a business and human rights program that Yahoo had set up around the time it joined the Global Network Initiative as a founding member.
That was in the wake of some scandals over the handing over of journalists' and dissidents' data to authorities in China, and Jerry Yang, one of the founders, being called into Congress and yelled at, and so on. Yahoo has since then put a lot of commitments into place, and it is clearer in its disclosures and in the information it provides about the thoroughness of its human rights impact assessments than any of the other companies in the index. So that's something to note. I'm afraid to say that Twitter provides no evidence of conducting human rights impact assessments, and even from non-public information, I haven't found evidence that they do them in any systematic way. And while Jack Dorsey obviously has made a lot of statements about freedom of expression and the rights of users, we see very little evidence of the institutionalization of their commitments beyond certain individuals in the company carrying a lot of it, which goes a long way, except when those individuals leave. So when it comes to a company, how do you institutionalize that? And then finally, on commitment with grievance and remedy: the top-scoring telecommunications company is an Indian company called Bharti Airtel. The top-scoring internet company is again Kakao from Korea. In part this has to do with regulatory requirements about remedy, but it also just goes to show that the industry as a whole really doesn't know what to do about remedy. And this is something that Peter knows a lot about and can comment more on. So we have a bunch of recommendations in our report, on the site and so on, and you can see them online in more detail. But we're really saying companies need to not just make statements; they need to carry out risk assessments. They need to evaluate what impact they're having on users and then figure out how they're going to adjust their business practices and policies to make sure that they are in fact supporting users' rights.
They need to be transparent and accountable, and not only about government requests. We are seeing more transparency reporting every year; companies are releasing more information about requests they're getting, but there's a lot less information about private requests and no information about terms-of-service enforcement. And I think our panelists are going to talk about why that matters and why it's a problem not to have transparency on that. We need better communication about what's happening to users' information. As my colleague Priya can tell you, we were really going at the privacy policies from this standpoint: if I'm a user of company X, and somebody were to create a dossier on me of everything that this company knows about me and my activities, would I be able to figure out what would go into that dossier based on what the company discloses? Going through all the policies, it was really hard to figure that out. I mean, Priya and several other people were working full time for several months just trying to work this out for some of the companies and working through the scores. Users need clearer information than that. We need to understand what is known about us. It's not easy, we recognize that, but we're not where we need to be either. There's a lot of dialogue, I think, that needs to be had about how we get to a better place. Grievance and remedy: I'll let Peter riff on this a bit. Particularly in a world where we have governments increasingly demanding that companies take responsibility for terrorist activity online and do more about hate speech. There are lots of bad things happening, and everybody's saying the companies have to police this stuff more.
First of all, there's an issue with that to begin with. But as more and more pressure is being put on companies to enforce through their terms of service and so on, if we don't have good grievance mechanisms, if people don't have a way to get their account reinstated or receive some kind of redress when their rights are violated in the course of companies attempting to do the right thing but not doing it properly, then there's no incentive for the companies to improve. And it's harder to prevent violations or identify who's responsible for them, and that's a really important thing that we're pretty far from figuring out how to do. Strong security practices: need I say more? Go to the website and see that part of it. And just for governments, really quickly, and again we get into this much more on the website: there are a lot of cases where companies are prevented by law and regulation from being as transparent and accountable as we'd like them to be. We've just got to do something about laws that do not support companies' respect for users' rights, and we need comprehensive examination and reform on that. And these governments, they're joining the Open Government Partnership and whatnot, and then not being transparent about what they're demanding of companies. That's just crap, right? Micah can talk more about that; he's engaged with the open government community quite a bit. So that's it. I'm going to stop talking. I could see our group here nodding their heads; I could see other people nodding their heads. So I hope we can just have a really fun conversation, which Micah will lead.

Great, thank you. And everybody please come up. This one's on, just in case. Okay, great. Thank you, Rebecca, for that. So just to dive right in, I thought we'd let each of our commenters take a few minutes to just offer some initial response.
We'll start with Peter Micek, who is the head of business and human rights work at Access Now, accessnow.org, which is an international organization that defends and extends the digital rights of users at risk around the world. He also teaches at Columbia's School of International and Public Affairs on internet policy and governance. Peter, your thoughts?

Yeah, thanks. And first of all, congratulations again on launching this. I've seen how much work you put into it, and I think we all benefit from the results. So yeah, from Access Now's perspective, this report is really an invaluable tool for a number of different stakeholders. First and foremost, we have the companies, right? Obviously they're the subjects of this report. But in the internet ecosystem, they're one player, and not the player that my organization really exists to support. Who we're talking about are users at risk around the world: really vulnerable groups, journalists, activists, human rights defenders who are forced online because of the shrinking space for civil society, for public engagement and civic engagement. They're forced to go online to find their communities, to have a voice, even to do business. Increasingly our daily lives are forced online and forced onto these privately owned and run platforms. Our rights are mediated by these platforms, and our access to human rights depends on their policies and on decisions made often far, far away, right? So we have decisions in country that telcos face. Just in Congo-Brazzaville about a week ago, there was an internet shutdown timed to the day when mass demonstrations were going to hit the streets to oppose a proposal to extend presidential term limits. This is the 15th recorded internet and mobile shutdown that we've seen this year, and for these we depend on hazy news reports and random anecdotes that sometimes reach our digital security helpline. And sometimes the only people we hear about it from are the companies themselves.
But one of the companies ranked in this report, which we've really targeted as one of the worst actors and secured some high-level human rights commitments from, is MTN. I'm happy to see that they ranked better in the commitments category than they did just about anywhere else. But they have not made a public statement about this shutdown, which may or may not have occurred, and which prevented their own users from perhaps telling each other: hey, don't go down that street, the troops are assembled here, protest elsewhere. These are the types of messages that people depend on in the moment, and that shutdowns prevent. We have struggled over the last couple of weeks and fought for that company to be more public about what it was ordered to do, and they're still fighting that. I think that nugget shows, first of all, the importance of corporate disclosures, as companies are sometimes the only actor who knows the extent of what they're ordered to do by the government and should be held accountable to the public. And by focusing on disclosures in this report, I think it does a lot for those users at risk, in terms of notice especially.

Great, thank you. Next is Ellery Biddle, who is the director of Global Voices Advocacy. And Global Voices, I suppose I should say a little bit about: it's a global network of citizen bloggers in almost every country in the world where people can do that. Only 137 countries, actually. Well, that's still a lot. And she's the editor of Advox, which is a program of Global Voices Online that is dedicated to reporting on threats to online speech. So Ellery, give us your first reactions to this report.

Sure. So I think the questions about terms of service and remedy jumped out at both of us.
And I guess a couple of examples came to mind for me. One: the Global Voices community has writers in many parts of the world, and we established a special project on free expression online when it became clear to us that this was vital to our own existence and work as writers on the internet. And we were aware of that because our blogs started being blocked. People started receiving different kinds of threats because of what they were writing online. Legal measures were taken against them. So this speaks to a lot of issues that we've dealt with quite personally. And this year, I'm sure some of you know about this case in Ethiopia that we have been experiencing for a while. Nine bloggers in Ethiopia, most of whom were contributors to Global Voices, were arrested in April 2014 and charged a few months later under the anti-terrorism law there. And they were imprisoned, some of them for 15 months and some for 18 months. The last of them were just released two weeks ago. And interestingly, they were acquitted of the terror charge. But what if they had not been? It has become so easy to use national security and the threat of terror, whatever definition that takes, to silence critics of governments. So then, when something like this happens, I think about a company like Facebook, which is always very clear in saying: we have to follow the law in whatever country we're in; if we get a court order, most of the time we just have to do what it says. Well, what if you have a court that says these people were being really critical of the government, we believe they may have intended to overthrow it even though there's not an ounce of evidence that that's true, so they're terrorists? What happens then? How is a company like Facebook, or any of these guys, supposed to behave in that situation?

Becca, do you want to tackle that question?
This is exactly it; there are a couple of different issues here. One is that companies have been facing a dilemma, and they've been making different choices in different countries, as you know: do we go into this country, let people use the service, and then follow the law, or do we not follow the law in this country and just get the whole platform blocked? And with the exception of China and a couple of other places, in most parts of the world companies have opted for: we're going to go in, we're going to engage, we're going to follow the law. But part of signing onto the Global Network Initiative is saying, well, at least we're going to be as transparent as possible, we're going to conduct human rights impact assessments, and we're at least going to try to understand the trade-offs we're making and make informed decisions about what we're doing. But that doesn't solve the fundamental issue for people in the Global Voices community; for this problem you're stating, it's very unsatisfactory. So companies are in this difficult position where at the very least they need to be very clear with users about what they're doing, but then what's the next step? And how do you empower people to actually change the situation without just contributing to the problem? I think that's where the deeper work goes. And I think this index is showing, at least: to what extent are companies being open about what they're doing? To what extent are they informing users? I think the next conversation that needs to be had on top of that is: okay, these things are still happening nonetheless, and they're deeply problematic. So now what do we do?

So, I have a follow-up question, and any of you can take this. You know, I was thinking about it: these 16 companies are the infrastructure for more than a billion-plus people in our collective digital lives.
And if the engineers and designers are the people who are setting up the highways and byways of that digital megalopolis, what this report is really about is what the rules of the road are, and whether you, as somebody operating in that space, are protected or vulnerable. And that isn't really the job of the engineers and the designers to figure out, though maybe it could be. But who are the deciders? Who are the people in these companies who are actually responsible for the state of affairs that we have now? Is it the lawyers? Is it the board? Is it the founders? Like, who are the real deciders?

You know, the buck stops at the top, right? And the problem is that decisions are getting made at all different levels. There might be a design decision that actually really affects somebody's freedom of speech, but the engineer who made that decision just has no training on freedom of expression, and it didn't even occur to them that it was an issue. So there's that kind of thing. But then there are also the lawyers, or marketing people, different people with different interests within the company. Which is why it's important, and why we had this commitment section: it's fine for the CEO to say we believe in free speech, but are people at the very working level, where the rubber hits the road, trained to think about these trade-offs? Is there a comprehensive examination throughout the company of what impact they're having, making sure that staff understand that, that people are appropriately being held responsible, and that there's oversight at the board level or at least at a high executive level, so that there's actually an effort to understand what's happening, the impact that's going on, and to mitigate bad effects?
So it's not one person, but obviously the board is ultimately responsible. The CEO is ultimately responsible for what their staff are doing, no matter what level these decisions are being made at. And the repercussions of those decisions are increasingly going up to the board level. Just in the past week, we've seen the chair of a European telco resign over, essentially, a failure to do due diligence in entering one of these emerging markets where anyone could predict they might encounter problems of corruption, which are certainly, in my mind, linked to a propensity for human rights violations.

Go ahead. Well, something I hope this project can help our whole community with is just understanding better how people in these big companies work together, because there are different levels of clarity and transparency about that. In the Global Voices community, for example, many of our writers are using social media in languages that are not major world languages, not UN languages, not written in Latin scripts. And when they have a problem or a piece of content gets taken down, the process for appeal, if there is one, the process for figuring out what happened, is often very slow. And the suspicion among really everybody is: Facebook doesn't speak my language. Facebook doesn't speak Tamil; Facebook doesn't speak Malayalam. They do have moderators; the way I understand it, these are third-party contractors who hire people who are fluent in the language, who probably live in the region, who look at a bunch of tickets a day and make a decision about what to do with each piece of content. That's really important, and we know a really small amount about what that work is, and what kind of thinking goes into the training of that part of the labor force.
And I think it's a huge issue that I want to get into. Just to speak to all of this further, in terms of the standards we're looking at and the questions we're asking: a company doing well in this index is the floor, not the ceiling. It's about whether the company is disclosing enough that users can really understand what's happening to them in the context of the platform. It doesn't get to actually addressing government abuses in particular, because that requires a whole other set of work. But at the very least, we need companies to engage in maximum disclosure about what they're doing and how it affects users' freedom of expression and privacy, so that users can make more informed choices about how they use the platform and, in a way, know who to blame for what. Because sometimes you just don't know, right? I've heard from people in Hong Kong whose Facebook account went down and who thought it was the Hong Kong government's fault, because nobody quite knows who to even take a grievance to.

Another point that's really quite useful for people to know is that the report does go into the variations inside these companies' services. Facebook's policies for users of the social network are not the same as for users of Instagram or WhatsApp, and people may be falsely assuming that whatever the policy is, it's uniform across all of their services.

That's right. We're holding the corporate entity to account; the buck stops with the board. So we're not just looking at one service; for each company, we're actually looking at several of their major services. And we did find, for certain companies, Facebook being a very strong example, but Twitter with Vine also an issue, that you'll have much stronger policies for one service than another.
Or just a total lack of clarity about whether the policies for one also extend to the other. I heard from some companies, "well, but we're doing really well over here." And I'm like, but WhatsApp has 900 million users; you're responsible for them too. So that's another issue as well, and it affects the scores: when you're looking at the scores, it's the whole corporate entity that we're looking at.

So another question, and then we're going to go to the audience soon. Is this problem inherent to these companies' business model? Given that none of them managed to get even a decently passing grade, over 65, how much of this is really rooted in their business?

Yeah, we did an exercise where we went through and took the highest score for every indicator, regardless of which company earned it, and created a fictitious company that was a composite of all the high scores. Even though we think everybody can do better on certain indicators, that still only gets you to about 75 percent, and that's without even doing the more difficult work that I think is not necessarily part of the business model. Now, there are a few indicators on which, to get a totally perfect score, you would have to show that you're not collecting data, so that does speak to the business model. But we're so not to the point in the scores where that actually makes a meaningful difference between companies. We put that in there just in case we end up ranking a company in the future that, for instance, doesn't store anybody's data. We don't have one of those in the index now, but we wanted to leave room for companies whose business models are super privacy friendly to get a higher score.
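The "best-of" composite exercise described above is a simple per-indicator maximum followed by an average. As a minimal sketch, with entirely made-up company names, indicator names, and scores (the real index uses 31 indicators across 16 companies):

```python
# Hypothetical scores per company per indicator (0-100). These values are
# illustrative only; they are not from the actual index.
scores = {
    "CompanyA": {"freedom_of_expression": 60, "privacy": 40, "commitment": 70},
    "CompanyB": {"freedom_of_expression": 50, "privacy": 55, "commitment": 45},
    "CompanyC": {"freedom_of_expression": 35, "privacy": 65, "commitment": 50},
}

indicators = next(iter(scores.values())).keys()

# For each indicator, take the highest score achieved by any company...
best_per_indicator = {
    ind: max(company[ind] for company in scores.values()) for ind in indicators
}

# ...then average those bests into one score for the fictitious composite company.
composite = sum(best_per_indicator.values()) / len(best_per_indicator)

print(best_per_indicator)  # {'freedom_of_expression': 60, 'privacy': 65, 'commitment': 70}
print(composite)           # 65.0
```

Note that the composite (65.0 here) beats every individual company's own average, which is the point of the exercise: even a "take the best of everyone" company falls well short of 100.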
That said, again, even without changes in business model, even without major changes in legal and regulatory structure, all of these companies can do much better.

Okay, so we're going to open things up for questions from the room. Do we have a mic for people in the room? You can steal my mic. I'll try.

Thanks very much; this is very illuminating. I'm Gus from Simply Secure. Speaking of the entirety of the company, I'm not sure if we're discussing lobbying by any of these companies, and specifically I'm thinking about net neutrality. I learned this summer at the Chaos Communication Congress from some representatives of hacker spaces in India that while a couple of major U.S. companies are lobbying in favor of net neutrality in the United States, they're actually lobbying against it in India, because their business interest is in having, what is it, zero rating or one of those sorts of low-level offerings. I'm wondering if that's something you're tracking and have included in this, because I was sort of shocked by that.

That's a really good question. We're not tracking lobbying here, mainly because we were limited in scope. It's also one of those things that's hard to evaluate consistently across companies and in different places. In India, you can actually get information about a company's lobbying activities; in some parts of the world, you have no idea. You might end up giving a company a worse score on lobbying just because it operates in a freer information environment, when actually there are even nastier things happening somewhere else that we have no information about. So we wanted to avoid those types of issues within the context of a ranking. That said, this is where I really make the plug: we've created a framework around research and data that we felt we were able to provide at a certain consistency and standard.
There's a whole lot more research, a whole lot more investigation, that I hope people will hang off of this, or build off of this, in different ways. This is really meant to be a scaffold: people can then ask, well, this just raises more questions, so we need to go research those. I really look forward to being able to link and connect what we've got here to what other people are discovering and tracking on top of it. That's where I think the real value is going to be. One of the things we're hoping to do in the next few months is start having conversations with groups that want to build research projects using this as a starting point, and really dig down in particular locations, particular jurisdictions, where there are jurisdiction-specific questions that don't lend themselves to a global ranking but are really important for people to know more about.

Hi, my name is Michael Conner. Congratulations, Rebecca; great work. A question following up on the earlier discussion about companies. There are no winners, you say, and so the question becomes: is that the result of incompetence? Of not caring? Or perhaps of complicity? And then a follow-up: how do you begin to trigger change within companies?

That's a really good question. This is why I wanted to do a ranking, because frankly there are a lot of companies with a lot of people in them who care about this stuff a lot, but the companies themselves don't necessarily have good comparative information about what else is possible. I think it's going to come as a surprise to certain companies that Kakao beat them out on certain indicators, and how it is that they're doing it. And we did have conversations with about half of the companies in the ranking about their preliminary scores.
And there were some cases where people made the case that what they're currently doing is all they can do, and we're now able to point to examples of somebody who's actually doing more. I think it's not that they were lying; they perhaps just weren't aware, because there hadn't been that kind of comparative analysis looking at the question in exactly the same way. So I'm hoping this can be a real tool for the companies, because there are a lot of people in these companies who want to move the ball in the right direction. But it varies from company to company. Some companies really haven't thought about these issues that much. In some markets there's not a lot of stakeholder pressure or media pressure, or there are other issues going on, or people just don't know. In some places, users don't realize that there might be another way for companies to do things, and that there are actual examples of companies being more transparent or having different practices. So I'm hoping this ranking will help illuminate that and incentivize companies; we've seen in other industries, as you know, since you do investor advocacy, that the more information you put out, the more companies are incentivized. And some of it is priority. For example, I was asking somebody from a company that's not actually in the ranking, I won't name the company or the person, why doesn't your company do X, Y, and Z, along the lines of what we're looking at here. And this person said, well, our CEO thinks that if our users aren't breathing down our necks demanding it, it's not a priority. And as you know, human rights is not always the first thing that the majority of the public is demanding.
Oftentimes they're demanding delightfulness and ease of use and fast video, right? That's what the majority is demanding the most. But for the people who need this stuff, it can be life and death. They may be a minority of users, but from a human rights standpoint, you have an absolute obligation to do everything you can to respect their rights. So the hope is that this helps prioritize these issues, and that it provides the questions that investors can ask of these companies. The questions companies get asked have been sort of uneven as well, so I'm hoping this might provide a little more focus. We've seen that some indexes on sustainability and labor practices have helped move the ball because they promoted shared learning, not just among the companies but also among activists and investors and other stakeholders, about what's possible, and provided a framework for a conversation. I'm hoping we can do that too.

I would add that I think some companies are starting to think more about usability in relation to fundamental rights, whereas five years ago it was more likely that the policy team was in an unattractive corner of the office while the people actually designing the stuff were having all the fun and talking to each other. In some contexts, at Google for sure, it seems to be happening more: there's more thinking about the different ways we can serve users and create products people can choose to use that are more privacy sensitive, that are a bit safer, and that somehow make the user experience actually easier.
That's sometimes thought of as impossible, as though those two things just can't work together, but I see developers moving in that direction in some companies and NGOs, and I'm excited about that.

Thank you. My name is Corey Abramson. Thank you all for the work you're doing. Looking at the report as a tool: you were obviously limited in your scope and your time, in how many companies you could work with and how many wanted to work with you. What are your plans to promote the report and connect smaller companies or newer startups with the information, so that they could have good practices from the get-go, as opposed to having to correct them down the road?

Great question. We're working with a lot of groups, including Peter's, and Peter can perhaps speak to that as well. We're going around to a lot of gatherings, and we'll be at RightsCon next year, just to meet with groups. I'm going to the Internet Governance Forum next week to meet with NGOs and academics to talk about how they want to use this, and to really encourage people to build their own projects off of it. I've also heard from people, and frankly I'm quite happy for people to find business opportunities in this: there are a lot of consultants who work with startups and can help them take this and figure out how to apply it to their situation. I totally welcome that. This is out under a Creative Commons Attribution license. We want people to use this; we want it to have as much impact as possible.

Yeah, I would just add, especially for the larger corporations, and the telecoms definitely count: we're going to pull the levers of the corporate machinery. Many of these companies are structured so that you can attend the annual general meetings. Shareholders can put forth resolutions.
There are a lot of activists and impact investors out there who are very experienced in dealing with labor issues, the apparel sector, the extractive sector. They just aren't aware of what the risks are in the information and communication technology sector, in tech and telecom. So this report helps define those risks, first of all. You may call them non-financial, reputational, or whatnot, but they do go up to the top and hit the bottom line. And once you can educate everyone from pensioners up to the equity fund chiefs on what those risks are, this provides a roadmap, with really rich and hard-hitting information, about the actual steps companies can take, whether they're a startup or a multinational telco.

Thank you. My name is Matthias. I'm a journalist from Berlin and on the board of the German section of Reporters Without Borders. I'm most interested in the observation you made about the European companies, which apparently have a very stringent data protection regime but are not following human rights so much, or maybe I'm being a bit mean here. There is a discussion in Germany and in Europe about whether data protection is the right means to further human rights, with some people arguing that it's not, and apparently your report gives credibility to that notion. Is that your take on the results as well?

That's a good question. Just to clarify, because we're looking at disclosure: if a company is doing things to adhere to data protection law but isn't explaining to users what it's doing, then it fails to get credit under our methodology. But I guess that is the point: we're finding that some companies in some markets are talking to regulators, but they're not talking to their users; they're not explaining to their users what's actually happening.
So this is less about what's happening under the hood, which nobody sees or which maybe the regulator has confidential access to, and more about what the company is telling its users about what's happening to their information or their ability to speak. Just to make that very clear: this is more about disclosure than about practice that cannot be seen by the public eye, because disclosure is what we're able to measure. But we're also taking the view that companies need to communicate these things to users, and that only talking to regulators, without making clear to your users what's happening in the process of your compliance with regulation, is ultimately not sufficient. So just to be clear, there were a few cases where companies say, well, we comply with regulation, but they say nothing else, and under our methodology you don't get credit for that; we can have a debate about that as well. But in the case of Orange, for instance, it doesn't even put its privacy policies online, and it's like, come on, folks, we can do better than this. But it's tough. There is an issue of compliance versus whether you have really institutionalized something; there have been people who've written about that as well, and this certainly adds to the debate. I don't think it conclusively comes down on one side or the other.

So, actually, a question from Twitter, from David Sullivan, who asks: what are the top three things companies could do to raise their scores next time around?
Well, there are the hard things and the easy things, the low-hanging fruit, and I'd love to hear each of your own versions of this. The seriously low-hanging fruit: some companies just lost a lot of points because, while they disclose what they do in response to government requests, they have no disclosure about what they do with private requests, particularly around user data. In some jurisdictions, companies sometimes hand over data in response to private requests even when they haven't come through a court order or a government process; in other countries that tends not to happen. But companies often don't clarify anywhere in their public materials whether they entertain private requests at all. Certain things are just no-brainers: you talk to somebody in the company privately and it's, "well, of course we don't entertain private requests." Well, why don't you tell your users that? Because a lot of users suspect that you do, including, I think, many people in the Global Voices community. So some of this, just being clear about what you're doing in different circumstances, could take things a long way. The second thing is basically: disclose everything that the law doesn't prevent you from disclosing. There are a lot of companies where the law doesn't let them disclose certain data, like the number of government requests they get. But the law doesn't stop them from disclosing basic information about how they handle requests: what their process is, what their policy is for receiving requests. The law doesn't
prevent them from doing that. So go as far as you can in the types of disclosures you're providing; don't just stop at what the law requires. The third thing is that we found a lot of cases, particularly in countries known as democracies, with rule of law, where if the law isn't requiring something, the company just doesn't get around to it. There are countries where, because the law doesn't make them disclose certain things, companies just won't do it until the law forces them to. So we need companies to be more proactive, and perhaps to use this as a roadmap for what to be proactive about.

Yeah, I have an answer for David, which is: I think every company that allows users to report on each other, to flag content they think violates the terms of service, or to flag an account they think is violating the terms in some other way, a key example being Facebook and their real-name, authentic-identity policy, should release data about that. Because there's a stunning amount of stuff that is online, and then it goes away, and then it comes back; and there are people whose accounts are suspended, indefinitely suspended, until they provide some information, but they're not really sure what happens to that information once they send it off. We just don't really know. It's been a huge issue that Peter and I have both been working on a lot in the last couple of months, and we can point to cases that are really scary and staggering, particularly in the area of disclosing people's identities in situations where they're not actually prepared for that. We're just totally in the dark. We have these great particular examples from people who have been willing to share their stories because of how upset or passionate they are about the issue, but we could know
so much more if we had that information.

Yeah, and just to add: we want our policymakers to have evidence-based, data-driven policy, right? We want companies to act the same way. If you want civil society to produce guidance based on the actual state of affairs, then we need to know what's happening on your platforms. You need to give us that actionable data, and not in the form of a press release, but the real hard numbers. And then, just to add on: digital security, data security practices, and encryption, especially end-to-end encryption, is one area where essentially all the companies failed. If you're really confident in the security of your systems and your networks, you should be able to disclose why you're confident: how often you update, what standards you use, and how much encryption you're using. Encrypt All the Things is an Access Now campaign that really puts forward that end-to-end encryption works and needs to be offered wherever possible, in the form of PGP keys or otherwise. And a starting point is just to be honest and forthright about what practices you're currently employing.

Hi, Madeline Earp; I work on Freedom House's Freedom on the Net report.
I'm curious about the effects this could have on public opinion, and the possible chilling effects that misinformation about company policies can have. It's often a problem that we face: researchers, even experts, will come to us and say, well, this is Facebook's policy, or a transparency report will be misquoted or distorted, and we'll say, well, where's the source for this? And it's all these 50 news reports, and it's just widely believed that this is a practice, when it's not actually supported in fact. That's also susceptible to misinformation by governments, who say, oh, it's the company's fault, as a way of distracting attention from their own practices. Our official answer is always that more transparency will defeat misinformation, but in some areas that I research, I wonder if that's really true, and how long it takes.

That's a really important question. And I just want to give a shout-out to Freedom House and the Freedom on the Net index, which launched just last week; it's the month of launchings. What was really interesting, one of the top-line findings in this year's report, was the increased pressure on private platforms to carry out government restrictions and surveillance and so on. But you're absolutely right, there is this issue of what information to believe. For most of our indicators, we didn't count press reports. There were a few very notable exceptions, which we noted in the researcher guidance in the indicator document; but for all of the indicators, with the exception of a couple in the commitment section where it was appropriate to look a little more broadly, press reports don't count. What matters is what the company itself actually says on its own platform, not what the CEO said
to the Financial Times, or what was reported in an Indian newspaper about what Twitter was saying off the record, or anything like that. None of that counts, because it's not verifiable; we're really only looking at the company's disclosed policies. Now, what we do not have the capacity to do is then go on the ground in all of these places and examine whether the policy they claim, the practice they have disclosed, the statement they made about what they do, is what we're actually seeing on the ground, or whether we're seeing something else happening that disproves the statement. We did not go and fact-check whether the companies' statements really are completely consistent with what people in the Global Voices community are actually experiencing out there on the internet in various countries. What we're hoping is that, since we've now provided this set of data based on what companies claim they're doing, not on what somebody in the media was told off the record by somebody who works at Twitter, people in your network, and in the Access network, and in the Global Voices network, now have the opportunity to say: okay, we have a repository of what the companies say they're doing, and we're going to call attention whenever we find that something else is actually happening, that the company, based on our experience and our documented evidence, is doing something different from what it claims in that policy. So this layer of research will hopefully provide the basis for the next layer of research, which is verification on the ground, at a more local level, where the rubber is really hitting the road, where
people are really experiencing these things.

Hi, my name is Anne-Marie Puente. I'm a lawyer and also a design strategist from Pratt Institute. I'm really curious to see how your indicators will evolve in the future, and whether any of the commitments you looked at talked about privacy by design, data protection by design, legal protection by design; I'm curious if you encountered that in your findings.

Rebecca, congratulations once again. My name is Jacqueline; I'm a law student, and as a regular user of Kakao, I was interested in Kakao as an outlier. I'm curious how much of that was due to government regulation, but also how much was due to the expectations of the user base, where, if users are so used to data collection over years and years, they would obviously feel more comfortable giving up certain levels of privacy, and Kakao as a result would feel more comfortable being transparent about data collection. What would be the takeaway from Kakao being an outlier? That's my question.

Thank you. My name is Torvin, and I had the pleasure of working with your colleague Alan Barr at a think tank; that's basically why I'm here. I'd be interested in whether you've been in contact with these companies before today's presentation, and whether there's been any feedback or interest in their own performance in the index.

So, on privacy by design and so on: we don't use that term in our indicators, because we ask very specific questions that enable us to really identify particular behaviors and policies. We ended up not using, in the indicators themselves, things that tend to be buzzwords, like "privacy by design"; we didn't even use "net neutrality" in the indicator that actually looks at net neutrality. We use terms like network management, and other more fact-based terms, about what
is actually happening. But that's certainly implicit, I think, in what we're looking at, and the methodology will certainly evolve over time. For this iteration we looked at internet and telecommunications companies only; there are no device companies in here, so Apple is not here, Amazon is not here, Samsung is not here. We'd like to expand it to include a somewhat broader range of companies, for very obvious reasons.

On the question about Kakao: they're a fascinating company. Some of their privacy policies definitely relate to regulation; some of it relates to recent scandals in Korea over surveillance, and, I think, to a strong civil society that's demanding more transparency. So Kakao issues transparency reports, and I think they're not the only Korean company that releases one, but not many Asian companies are doing transparency reporting. They are showing that you don't have to be a Western company to do this, and that there's demand. Korea is a democracy with a fairly robust multi-party system, and half the society seems to distrust the other political party pretty strongly, so there's a lot of incentive to want more transparency. Those are all factors, and politics plays in: Korea was the first country to elect a president thanks to internet organizing, so people really see the importance of this for their civic lives, and that seems to be part of it too. Somebody could write a fascinating paper about what's going on in Korea.

That segues to our engagement with companies, and Kakao did engage with us, along with about half the companies in the index. What we did in terms of engagement was several things. One was, as
we were developing the methodology we reached out to a broad universe of companies that were likely to end up getting ranked or were kind of in in the zone of possibly getting ranked and and asking you know them for feedback reaching out to companies at various conferences and so on letting them know this was happening putting various iterations of our methodology online so that you know people could see this coming from a million miles away it's not a surprise and giving them opportunity to give us feedback we did a pilot study in which we talked to the companies about their results even though we didn't release them publicly and for the index itself we we contacted all the companies that were in the index at the start of research saying you're you're the lucky winner of being included in the index and we'll be in touch with you in in a month or so with your initial results and you know we will welcome your feedback at that time and and so about half the companies did take advantage of this opportunity to give us their feedback and it helped us you know it was it was very helpful and you know just in terms of our making sure that we were evaluating the indicators in a way that was consistent with the reality of how the companies function while at the same time you know we didn't we didn't always you know some companies are saying well my score should change on this we didn't always agree with that either but but it was very important to have that dialogue and this is really just the beginning because this is not sort of this stone tablets down from the mountain about here is how the world must be according to the ranking digital rights team you know and this is our verdict this is a conversation starter this is you know we've applied these standards based on widespread consultation with experts building on a whole set of different kind of emerging privacy standards knowledge of many experts you know human rights standards that are increasingly widely accepted we 
built it on that but some of the details about where we need to be how we get there what ideal policies and practices really should look like we need to have a conversation about that and and so this is really meant to be again the framework for a conversation with companies a framework for a lot more research by a lot more people framework by for a lot more advocacy etc so it's a starting point so we're almost out of time I do want to say one thing I even just following the twitter stream here it's clear that Yahoo's business and human rights department has already posted something on their tumblr blog saying they're looking forward to continued conversation they're going to study the report so that's you've already had an impact with one of the biggest internet companies my last question for the panel and if you can just take a minute each is with the freedom house report that just came out which clearly says that freedom on the net at least in terms of government influence on people's freedoms is getting worse globally which way do you think the pendulum is swinging and do you have any optimism perhaps about the behavior of companies compared to the behavior of governments Hillary starts crying I'm afraid I don't and I I often hope that that's because sitting from where I sit how would I I get every day I edit a story about somebody getting arrested because of something they said on either Facebook or Twitter that's there's so many of the problems here touch that I know but I but I think well but I think that there's I think we're getting to a place where the company well for example in how you ask users to identify themselves and what kinds of information they have to give you in order to use their service it does implicate you know there there is at what point do John Ruggies principles on on business and human rights which say if you could prevent a human rights violation from occurring and you can kind of see that you should as a as a big company such as 
these are I think that maybe there's a maybe there's a bright spot in in efforts like internet org or free basics or whatever it's called where there is it's scary on one hand because it's a narrowing of the internet but they're the response to it in a place like India has been fascinating I mean there's a lot of pushback and opposition and it's not just from elites it's from people who who get what this means and want the company if they're going to do this stuff to do it differently so that so there there's a positive thing thanks yeah yeah I agree I think the the space for civic action is shrinking governments are generally seeing the internet as a threat and you know it's it's more incumbent on companies I think than ever to understand that they have you know a responsibility to respect human rights and to to jointly where possible with governments provide access to remedy but you know if if independent courts aren't an option they need to to look within and see what what they can offer in terms of you know resolving grievances as as efficiently and painlessly as possible as possible I think the tech sector tech and telecom sector has has a real golden key in its pocket which is that people actually like their products this isn't you know this isn't you know they're not going in and setting up mines that destroy communities they're you know providing in many ways connectivity that people depend on and want and so if you're looking at you know remedy it's it's often just turn it back on but you know be a little bit more transparent and and secure and how you do that and you know coming off of the big you know global connectivity push at the United Nations people you know I think recognize that infrastructure and and technology is is the way to implement all you know 17 and a half or whatever of the of the global goals and we just want to ensure that that takes place within a rights respecting framework you know that that human rights come hand in hand with with 
our connections yeah I mean companies have we're at this kind of crossroads right now companies can be part of the problem and can be part of the exacerbation of the problem or they can at least some critical mass of them can be part of the solution or at least contribute to a solution or at least prevent things from getting any worse than they already are you know sort of some level there but I think companies need to recognize that whether or not it's their intent they are becoming part of this problem they're part of the ecosystem that is the problem um and they and again companies that really want to think about how what do we need to do to make sure that this globally interconnected network maintains its value and doesn't become so balkanized and so full of distrust that it loses its value you know by any sense of the word value you know they need to be part of the solution and they're going to have to think creatively and innovatively right these are some of the most innovative companies on earth let's let's get to work with some some more innovation here um so the short term I think we're you know the winter is here a few years ago I said the winter is coming um and it's gonna you know we're we got a slog through this winter but you know I'm optimistic about sort of as a historian's daughter I think in the long run there's cause for optimism well that's a great note to end this morning panel on thank you everybody for coming thank you to Rebecca McKinnon and the whole ranking digital rights team for this great new report go to RankingDigitalRights.org to read the entire thing and and it's really rich there's a lot of great material in there Peter Meisek from Access Now, Ellery Biddle from Global Voices Online thank you as well for enriching the panel everybody else thanks for coming again you're always welcome here at Civic Hall and hang around there's coffee and and I think our folks will hang around here as well to talk for a few more minutes so have a 
great day thank you