Good morning, folks, and welcome to New America for this event to commemorate the release of the 2017 Ranking Digital Rights Corporate Accountability Index. I'm Kevin Bankston, director of the Open Technology Institute, New America's internet policy and tech development wing, where our mission is to ensure that all communities have access to an internet that is both open and secure. Which raises the question: how do you do that? One important way of fulfilling that mission, of protecting openness and security online and protecting human rights like free expression and privacy online, is obtaining transparency and accountability from the internet service providers and online platforms that compose much of the internet and make possible much of our activity on it. How do you do that, though? How do you get companies to, for lack of a better word, do the right thing and protect their users? Well, we actually just released a set of case studies on that topic, aptly titled Getting Internet Companies to Do the Right Thing. One of our conclusions was that rankings and report cards, relative comparisons of companies' performance that spur them to compete to see who can do more things, or better things, or faster things, or first things to protect their users, are one of the most powerful levers we have to effect change online. Which brings us to Ranking Digital Rights, whose detailed comparison of 22 companies based on 35 detailed indicators is unmatched in its scope and its rigor. And it's that kind of hard work that makes change. I'll give just one example: in Ranking Digital Rights' first report, the one indicator that every company failed was the one asking whether the company reports on how much content it takes down based on its own terms of service, voluntarily, as opposed to government demands for content takedown.
This is really, really important, especially today as we see companies stepping up their efforts to take down violent extremist content, harassing content, et cetera. Well, thanks to that kind of pressure, just this week Twitter issued its new transparency report with new data about its terms-of-service takedowns. With that domino falling, if the trend goes the way it usually does, this is going to become a common practice in the next five to ten years, and that will be due in no small part to the work of Ranking Digital Rights. It's that kind of change, based on that kind of work, that makes us at OTI, and me personally, so proud of Ranking Digital Rights. But we can't take any credit for it other than to say that we've provided it a home. RDR is an independent project with independent funding to avoid any potential conflict of interest, and with a growing expert team independently managed by perhaps the most independent human being I know, Rebecca MacKinnon. You may know her as one of the world's greatest experts on the interplay between internet freedom and the internet companies that make the internet possible, which was one of the primary subjects of her 2012 instant classic, Consent of the Networked. In many ways, RDR is an attempt to answer many of the questions and concerns in that book, to take the theory and make it into practice and a lever of change: an attempt to truly impact the state of the net. And in that way, it's more than an expert intervention and more than a labor of conscience, as has been especially evident to all of us at OTI and New America as we have watched the countless hours of work and the sleepless nights that have gone into getting this report out. It is clearly a labor of love for all involved, which is probably what makes me proudest of all.
So please join me in congratulating the RDR team on shipping this report, which they're about to tell you all about, and please help me welcome Rebecca MacKinnon to the stage. Good morning. Before I launch into the substance, I need to continue with a few thanks, because there are many people to thank. The Ranking Digital Rights team has been truly phenomenal. We have a couple of members of the team here today, but we are a distributed international team, so there are actually several other people in Budapest right now at an event that is streaming this event live into theirs. I'd like to acknowledge Amy Brouillette, our research and editorial manager, who is there in Budapest today, along with Lisa Gutermuth, our program manager; Ilana Ullman, our policy and communications analyst; and Nathalie Maréchal, a senior fellow on the project. And we have here in the room Laura Reed, our senior research analyst, and Andrea Hackl, who is also a fellow with the project. What I'm about to present, and what you can already see online if you've got a laptop or a device and go to rankingdigitalrights.org, is not just a report. We have a four-pager outside with the highlights, so you can pick that up, but there's a much longer report that goes into detail, and there's also an interactive website where you can really explore all the data, indicator by indicator, element by element, and also download it and do what you want with it. That is all due to very hard work by a lot of people, and also to a partnership with the SHARE Foundation in Serbia, an NGO that works on digital rights issues and visualization. And again, I really want to thank OTI and New America, the board, Kevin, and a number of OTI staff who've really made this possible.
Chris Ritzo, who has really helped us, kind of saved us in a few cases on the technical side, and Allison Yoast in communications, who has also been heroic, and many others; you all know who you are. So I won't take up too much more time. I'm going to put this down; this is our report. I also want to thank our funders. As Kevin mentioned, we make an effort to maintain pretty clearly distinct lines around our funding. Our oldest funders are three foundations: the MacArthur Foundation, the Ford Foundation, and the Open Society Foundations. This project would also not be possible without the State Department's Bureau of Democracy, Human Rights, and Labor, who have given us a generous grant under the Internet Freedom Program. We really appreciate their support, and their very light touch and supportiveness have been wonderful. I also want to thank our advisors. Since we're a little project within a bigger organization, I have an informal board; one of its members is Melissa Brown, who will be coming up later, and they're all listed on the website. We also had a lot of help over the years in developing the methodology from an organization called Sustainalytics, which provides data to investors on environmental, social, governance, and human rights information. So I'm going to give a presentation about the results of the ranking, and then we're going to have a discussion. Niels ten Oever from Article 19 will introduce the panelists in more depth, but we have Melissa Brown from Daobridge Capital bringing an investor point of view, and Arvind Ganesan from Human Rights Watch, who has been involved with these issues and many other issues around business and human rights for a very, very long time and has, I think, a really wise perspective on them. So, the Corporate Accountability Index: what is it about, and more fundamentally, why do we do it?
I think it's become pretty obvious by now that our relationship with our government and with broader society is mediated by our digital devices and technologies, and that we're dependent on them for our relationships. The technical decisions, business decisions, and design decisions being made by these companies are increasingly shaping what we can and cannot do in our lives, what we know, who knows what about us, et cetera. Therefore it's very important that these companies be held accountable to the public interest, that they be held accountable for respecting our human rights. Ultimately it's a sustainability issue, if you think about it. It's become a standard assumption that companies are responsible for maintaining a global environment in which we can live, that they're responsible for not polluting the air and water to the point that we're all dead, even if that might maximize profits. We're all very comfortable with that idea; we assume that companies should be contributing to a sustainable physical environment. The next step is to get companies to be accountable for contributing to a sustainable digital environment, the kind of environment in which we want to live, in which our rights are respected. What kind of future do we want our children and grandchildren to grow up in? Do we want our digital communications environment to be compatible with human rights and democracy? If we do, we actually need to make an effort, and companies need to contribute to an information ecosystem in which human rights, freedom of expression, and privacy are respected. So that's the broader overview, the premise of what we're doing and why we're doing it. We evaluated, as Kevin mentioned, 22 companies, and the idea was to select a set of companies that collectively touch the digital lives of the majority of the world's internet users.
So we have 12 internet and mobile companies and 10 telecommunications companies, distributed across the world. Our report goes into more detail about how we selected these companies. Of course, in an ideal world we'd love to rank more, but this is resource intensive. Still, we feel this gives a good representation of, at the end of the day, the extent to which internet users around the world are being given the information they need to understand how their information environment is being manipulated and who knows what about them. What kinds of principles is our methodology built on? One very core set of ideas comes from the UN Guiding Principles on Business and Human Rights, which state that governments have a primary duty to protect human rights. Of course, one could have many more panels about the extent to which they're upholding that duty. But companies have a responsibility to respect human rights as well and to mitigate harms, not only in the course of conducting their own business but also in how they interact with governments. And governments and companies jointly have a duty to provide remedy when rights are violated. So that's one of the core frameworks we're building on. There are quite a number of rankings out there today evaluating different types of companies on human rights issues. One of the more recent ones is the Corporate Human Rights Benchmark, which examines extractives and manufacturing companies, not internet and telco companies, on 100 indicators related to how they're impacting human rights around the world. So we're part of a broader ecosystem that is evaluating and benchmarking companies on a range of human rights issues in a range of industries. And you'll see, it's probably a little hard to see there, but they're very tough graders.
Most of the companies were in the 20 to 30 percent range. So we're building not only on broader human rights standards but also on specific standards that have been developed over the past decade or so around how to apply human rights principles to digital platforms and technologies. One of the earlier groups to develop such principles was the Global Network Initiative, which Arvind and I and others were involved in helping to create, particularly around government demands: when governments make demands to censor or to assist with surveillance, how should companies conduct themselves? What kinds of principles should they commit to, and how should they implement them? We build on that, but we also build on a whole range of other privacy and freedom of expression standards developed by a whole community, including Article 19, where Niels comes from, and others. So here's the index. It's probably hard to see in detail; you've got your four-pager where you can look at it more easily, and I'm going to break it down into pieces you can see more clearly. But first I want to give you an overview and explain the three categories we're evaluating companies on. Our indicators, the 35 questions we're asking about the companies, are divided into three main buckets. The first bucket is what we call governance. That looks at: do the companies have a clear public commitment, and are they showing evidence that they're actually institutionalizing that commitment? Which means, are they conducting human rights impact assessments to understand how their own business is actually affecting freedom of expression and privacy?
Do they have clear lines of responsibility, from their board and executive management on down, to ensure that freedom of expression and privacy matters are being addressed throughout corporate operations? So there's a set of indicators looking specifically for evidence of institutionalization of commitments, and we can go into that a little more later. Freedom of expression deals, of course, with questions like: are the terms of service clear? How transparent are companies about the way they enforce their terms of service, their own rules? But also, how are they handling government requests? Are they issuing transparency reports about government requests to remove content, block content, shut down networks, or deactivate accounts, all the different types of requests that affect freedom of expression? Also other private, third-party requests, because a lot of private actors make requests to remove content, for copyright reasons, for libel reasons, in different countries, and so we also want to see transparency around those policies and practices. There are also some questions about identity, and I'll get into specifics later in my talk. Then on privacy, we're really looking at three sets of things. First, does the company clearly disclose what's being collected about you, with whom it's being shared, and how it's being used, that is, the life cycle of information that could be used to profile you or to track you? We want to see disclosure so that people understand what's happening to their information. Second, we want to see transparency around government requests for user data and surveillance demands, and transparency around the policies for dealing with them. Third, we have a set of indicators around security practices.
We're not asking companies to disclose information that's going to help hackers get into their networks, but we are looking for basic evidence that there are policies in place and that credible efforts are being made to secure users' data and to address data breaches. So those are the three buckets. You can see from the scores, and from the four-pager you have, that essentially, if this were a test and not graded on a curve, we have two companies that got a D and everybody else got an F. We're tough graders; it's supposed to be tough. That's not to say that within the data, within the specific indicators, there aren't a lot of good practices; there are. But there's tremendous room for improvement by all companies, and to call anybody a winner, or even a leader, is perhaps going too far. That's one thing I want to make clear. Digging down a little into the results: these are the internet companies with their total score, that is, governance, freedom of expression, and privacy added up into a total. You'll see that we basically have the US-based companies plus the South Korean company Kakao at the top, and then the two Russian companies, two Chinese companies, and Samsung mixed in below 30 percent. Thirty percent is, I would say, the floor below which it is hard to say that meaningful, systematic effort is taking place around user rights. On telecommunications companies, we have just one US company in the index, because we're really looking globally. This is how the total score breaks down, and again, below 30 percent you have quite a lot of companies that are not disclosing much at all about their policies. We'll talk a bit more in the discussion, and as I take you through the data, about what that means. So now I'm going to break down the governance scores, because they tell us something interesting.
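To make the aggregation described above concrete, here is a toy sketch of how indicator scores might roll up into category and total scores. The indicator values, the equal weighting, and the simple averaging are all invented for illustration; they are not RDR's actual data or its exact weighting scheme.

```python
# Toy sketch: rolling indicator scores up into category and total scores.
# All numbers and the equal-weight averaging are hypothetical, not RDR's data.

def category_score(indicator_scores):
    """Average a category's indicator scores (each on a 0-100 scale)."""
    return sum(indicator_scores) / len(indicator_scores)

def total_score(categories):
    """Average the category scores into a single total (0-100)."""
    return sum(categories.values()) / len(categories)

# One hypothetical company, with a few indicators per bucket.
company = {
    "governance": [50, 25, 0],             # e.g. commitment, impact assessment, oversight
    "freedom_of_expression": [40, 10, 30], # e.g. ToS clarity, gov't requests, reporting
    "privacy": [20, 35, 15],               # e.g. collection, sharing, security disclosure
}

cats = {name: category_score(scores) for name, scores in company.items()}
total = total_score(cats)
print(cats["governance"])   # 25.0
print(round(total, 1))      # 25.0 -- well below the ~30% "meaningful effort" floor
```

With numbers like these, the company lands below the 30 percent floor the talk mentions, which is exactly the situation described for most of the ranked companies.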
This is the spread in the first bucket of results, governance, and I want to call your attention to the companies at the front and what that says about them. The companies with the green arrows are all members of the Global Network Initiative, which I mentioned earlier: a multi-stakeholder organization that works with companies both to make commitments to users' freedom of expression and privacy and to implement those commitments, particularly in their governance. So they have to be conducting human rights impact assessments; they have to show that there is executive and management oversight over these things; and they have to show that they actually have policies to deal with government demands around freedom of expression and privacy. The other leaders in this category are members of the Telecommunications Industry Dialogue, another group, many of whose members, though not all, have been observers of the Global Network Initiative; we're hopeful that some of them may even join soon, so stay tuned for that news. Those companies have also, for the past several years, been making concrete public commitments and reporting to their group about how they're implementing them, and that shows in their governance scores. Compared to 2015 and to earlier research we've done, we've definitely seen some of these companies putting in place more policies and processes than they had four or five years ago. Notably, further down in governance, we have Twitter and Apple, and one could do a whole panel just on why that is. But basically, for those two companies, we found that their public commitments to users' rights, and the evidence that they have institutionalized those commitments, both for freedom of expression and for privacy, are not extensive; they're quite minimal.
So they did not do well on governance. One of the major changes in this year's index compared to our first one in 2015 is that we added what we call mobile ecosystems. We added Apple for the first time, we added Samsung for the first time, and we also added Google's Android ecosystem, that is, devices running Android that Google controls directly. Largely because of governance and freedom of expression practices and policies, Google's Android outperformed Apple, which I think is a surprising finding for many of the people we've talked to about this so far. Breaking the mobile ecosystems down by category, you'll see Apple did respectably on privacy, disclosing privacy policies and practices of different kinds and security policies and practices of different kinds, but disclosed very little related to freedom of expression. Essentially, there's no transparency about how it polices its App Store or how it handles government requests to restrict content there, no transparency reporting about the volume and nature of the requests it receives, and no transparency about the terms of service it has for app developers or how it enforces those rules. That's what brought Apple's score down. The freedom of expression category overall was very interesting. Kakao, the South Korean company, has some very good practices around disclosing the different types of requests it receives to remove information, and it got some high scores. For companies looking at best practices, Kakao is an example of how best practice isn't necessarily found in the West; there are lessons to be learned elsewhere. Twitter is very strong on freedom of expression and its transparency around removals, relatively speaking, and again, Apple not so much.
In the telco sphere, what's quite interesting is that among the Industry Dialogue companies, the amount of transparency related to content is quite uneven. For example, Orange doesn't do any transparency reporting about the government requests it gets to block websites, and we know that in France there's been a dramatic increase in the amount of website blocking, but we have no transparency reporting about that. We're seeing more transparency from some of the others, but it's still under 50 percent, and many companies don't disclose anything. Digging into transparency specifically around third-party requests to restrict content, you'll see quite a variation. Again, Orange didn't get any credit, for the reasons I was just explaining; Apple got very, very little; and Kakao is doing pretty well, very competitive. Drilling down further into the indicator Kevin was talking about: in 2015 I had a slide showing the question of whether you disclose anything about content removed for violating your own terms of service, and it was a flat line of zeros. Now we're starting to see companies disclose this. Our research ended before Twitter released its most recent report, so it will be interesting to see how those scores continue to evolve in the next round. Network shutdowns: we had a new indicator this year on network shutdowns, looking at transparency by telecommunications companies about the requests they get to disconnect their users from the internet for specific periods of time, in cities or sometimes throughout a country. In India there were 30 shutdown requests last year alone, for example, and there's not very good transparency around that, so that's something we'd like to see companies working on. Now to privacy; I'm skipping around to some interesting things and obviously not going through all the results.
These charts look at disclosures related to the life cycle of user information: how much disclosure is taking place about what's being collected on you that could be used to identify you, with whom it's being shared, how the company itself is using it, how long it's being retained, and so on. We see Kakao doing quite respectably in that disclosure, and we see the European telecommunications companies all over the map. Even if they're complying with European data protection regulations, they're not telling their users very much, for whatever reason, and we haven't gotten a really clear explanation why, other than that they're complying with the regulations and users should know that. Our view is that from a human rights standpoint, you actually need to tell users what's happening to their information; just assuming they understand what the regulators are doing is insufficient. On collection, drilling down into that set of indicators, you see some interesting patterns in disclosure about collection of user information. Kakao, again, from South Korea, is the strongest performer. Google is not doing so well on that particular piece of disclosure, which might have to do with its business model. P4, sharing of user information: does the company clearly disclose what user information it shares, and with whom? Kakao again comes out first, Google is not doing so well, and Apple, surprisingly, is weak here; it's not clear why they don't disclose more, because they talk in fora and to the media about their policies, but for some reason those policies aren't disclosed systematically to users, so that's surprising. Encryption policies: Yandex, the Russian company, which overall didn't score so well, does quite well on its disclosure about what types of encryption are used for what services.
Data breaches: we had a new indicator this year on the company's policies for responding to a data breach, in terms of informing affected parties and so forth. Only three telcos disclosed anything; nobody else in the index disclosed anything. That may be of particular concern to investors, but obviously it's a concern to everyone else as well. So, to wrap up, we have a number of recommendations for companies, detailed on the website and in the report. They relate primarily to these points: we need companies to carry out risk assessments, and there needs to be greater transparency around all kinds of requests. Companies need to communicate more clearly about what's happening to users' information. If somebody were going to build a profile of me based on my use of Google, Facebook, AT&T, and my iPhone ecosystem, what kind of profile could be built? I need enough information to have some sense of that, so that I can make informed choices, and right now people are far too much in the dark. On grievance and remedy, the industry needs to develop better mechanisms for people. It's not necessarily always about financial compensation; it's about getting your content restored, or having the harm addressed if your privacy was violated, and working, when appropriate, with governments on better grievance and remedy. And greater clarity about security practices is clearly something everybody is very concerned about. Also, for governments: one of the things we found in the index, and I'm just wrapping up now, is that of course there were areas where companies were not disclosing enough because governments weren't letting them. So we need reform of laws and regulations so that companies can maximize their transparency.
There are a lot of regulations that prevent companies from disclosing requests of all kinds, and it's not clear what the national security interest is in that, so there are a lot of things that could be done with the law. More generally, the demands governments place on companies need to be compatible with human rights standards, and they are not; one could do many panels on that. And governments need to be doing transparency reporting themselves, which for the most part they're not. That should be a core requirement and expectation for any open government practices, and right now there's very little of it happening, even from the most open and enlightened governments. So that is a very broad overview. I will now invite our panel to come up, and Niels is going to lead the conversation. We're hoping to open it up to the room pretty quickly, and Dr. Hackl here is going to run around with a mic, so we will have a good conversation. Thanks. Hello everyone. I'm happy to take over the baton from Rebecca, but let's first jointly congratulate Rebecca on the launch of the index. So much for being nice; let's come up with some critical questions, but not before I introduce the other panelists here next to me. First, there is Arvind Ganesan, who is the director of the business and human rights division at Human Rights Watch. He has been one of the driving forces behind the Global Network Initiative, as Rebecca said, but he has been in this area for quite a while, also setting standards for the extractives industry, the EITI, and fair labor. Next to him is Melissa Brown, who has been one of the leading responsible investment analysts, who is a partner at Daobridge Capital, a board member at Bestos Analytics, and a leader for years in the responsible investment sector.
So we have a great panel; let's get the discussion going. I see we also have quite an informed audience, so let's get the discussion going early as well: raise your hand if you've got a question or a comment. But let's first start with a question to Rebecca. First of all, again, congrats on this work; it's excellent, and one of the excellent things about it is its longevity. It's not a one-off. We in the NGO sector sometimes have a short memory; we run from fire to fire, keep the adrenaline high, and then forget to build on the work we've been doing. This is an excellent advocacy tool because of that. But at the same time, the technology is changing so rapidly, as is the international legislation; there are new companies, new technologies. How can a ranking keep up with that? And if we look at the trend over the years of this ranking, when do you think you can retire? That's a great question, and rankings are something of a blunt instrument in that they are a snapshot in time. As we learned today from Kevin, Twitter has made some changes that didn't get into our index, because we had to close the research at the end of January, and things have continued to evolve. So we recognize that it's a snapshot of what the companies were doing during a particular period. In an ideal world you could dynamically update it quarterly or something, if you really had resources that I can't imagine. But doing it once a year provides a year-on-year snapshot. We acknowledge that it's a much more dynamic field, that regulations are changing all the time, and that there are a lot of companies we're not covering. One of the ways we're addressing that is that our methodology is completely open and our data is completely open, so people can take it and do more things with it.
So if there's a company you're really concerned about, or a set of companies we didn't include, you can take the same questions, or some subset of them, and analyze those companies yourself. And actually, I've heard from a couple of companies who've told me confidentially that they're using the indicators internally for their own evaluations, even though we're not ranking them, because maybe they'll get ranked in the future, or maybe their shareholders or other stakeholders will ask them about some of these things. So in a way the ranking is the beginning of a conversation about what the standards need to be more broadly, and about what questions civil society, investors, and policymakers should be asking of the companies and about the companies. Sometimes the advocacy, or the set of questions that get asked, is a little ad hoc. One of the things we found in our results is that companies tended to disclose more on the things NGOs have been bothering them about recently, and less on the things NGOs have not thought to bother them about. So we're trying to be more comprehensive in that regard, but we're also not trying to be the final word. We're putting this out there as a framework that people can use much more dynamically. So Arvind, let me turn to you. Many sectors have been through this: the extractive industry, the banking industry. Why do we still need to push so hard in the digital industry? Well, there are multiple reasons. One is that digital is in many ways far more complex and far more immediate to everyone than other industries. What an oil company does abroad doesn't necessarily affect you in the same way as what your cell phone provider and platform do.
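Since the methodology and data are open, the "analyze them yourself" idea above can be sketched in a few lines. The CSV columns, company names, indicator codes, and scores here are all invented for illustration; the actual open data published at rankingdigitalrights.org has its own schema, so treat this only as a pattern for filtering a subset of indicators.

```python
# Hypothetical sketch of reusing a subset of open ranking indicators on
# your own company list. The schema and values below are invented; the
# real open data from rankingdigitalrights.org is structured differently.
import csv
import io

raw = """company,indicator,score
ExampleTelco,F1_terms_of_service,60
ExampleTelco,P1_privacy_policy,20
ExamplePlatform,F1_terms_of_service,80
ExamplePlatform,P1_privacy_policy,55
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Keep only the privacy ("P") indicators and flag low disclosure (< 30).
low_privacy = [r["company"] for r in rows
               if r["indicator"].startswith("P") and int(r["score"]) < 30]
print(low_privacy)  # ['ExampleTelco']
```

The same pattern, reading the published data and filtering by indicator prefix, is how a researcher could apply "some subset" of the questions to companies the index doesn't cover.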
And one of the things that makes this ranking quite important is that if you actually take a step back from it, what it's actually telling you is: this is what the internet looks like for 3.7 billion people. It's using an Apple device or a Samsung device, through a service like AT&T, to get to an app store or onto a platform like Facebook. And now you have a way of assessing what the human rights risks are at every part of that, or what the human rights practices are at every part of that. So on a very personal level, it provides a roadmap for understanding the kinds of precautions you need to take, as well as what companies need to do more of. And I think the other issue is that there are few things as intertwined with government as the digital sector. A government cannot spy on what a company doesn't have, and what companies collect increasingly serves a very sophisticated business purpose. So at a very fundamental level, what both the private sector and the public sector do affects everybody's digital rights. And in that case, the ranking is helpful in understanding what every part of that chain is and where to push to get improvements. I think in a global sense, what we're also seeing is that industry after industry is being subjected to more disclosure, whether it's extractives or digital or others, through investors, through governments in Europe and elsewhere. So it's very clear that there is a public trend, reflected now in growing government regulation, that people want to know more about what companies are doing. And one of the things they wanna know is how it affects their lives. I was thinking about this as Rebecca was talking: I have an 11-year-old who got his cell phone this year. So he's got an iPhone on AT&T and, you know, texts and plays video games. Now I can look at this and I can say, okay, well, AT&T has certain issues, and the App Store at Apple has certain problems with privacy disclosure, which means as a parent I'm gonna say no to everything, basically.
But what it really says is, look, for 3.7 billion people, you now have a roadmap of what the internet looks like for them and what companies need to do to better protect their rights in each part of that chain. And that's gonna be critically important, because as we're seeing, just this last week in Europe and elsewhere, there is more pressure to disclose what companies are doing on rights, and in the digital sense there is going to be even more pressure, because governments want more access to that information as well. Thanks so much. That brings up a lot of excellent questions, but you mentioned investors, so let's now go first to Melissa. Melissa, companies often point to their customers as demanding what they provide. And then the customers, or consumers, or you can also frame them as citizens, often look to governments to regulate for them. And then governments and bureaucracies are not as fast, so they in turn look toward investors. So where in this chain is the change gonna come from? Well, I think it's safe to say that investors probably are very unlikely to drive the change at this particular stage of the game. I think investors are beginning to focus on a whole range of these issues, but investors are also not monolithic. So you have different groups of investors who will react to the types of data sets coming out of the work that Ranking Digital Rights has been doing. And I'm beginning to see really the front end of that. So there are groups of investors who have always been global investors, who pay attention particularly to human rights issues. Other groups of investors will focus on governance issues. And the governance backdrop for these issues is now beginning to be really very, very significant.
And then there's the final group that basically says, we understand these issues and we're interested in them, but we're particularly interested in them as they evolve into risk issues that may impair a share price valuation, or, longer term, risk issues that may shape the business models of the company. And we've touched on the word business model here. It's really relevant, and we've seen this play out. So let me just pick a simple indicator. I think it's safe to say investors at this stage of the game are probably likely to be most interested in some of the privacy indicators. Well, let's put that in context. Yahoo just had to revise their planned acquisition by Verizon and announce that they were gonna take a $350 million haircut, with longer-term implications for the value of the transaction, because of an effort to come to grips with the damage to the value of the company as a result of data breaches. What's great about the accountability index is that now you can go look at that indicator and say, so where are we on data breaches? We've got some terrific companies here, many of them with an established public track record, a long history of regulation, and really dynamic relationships with their customers today and, hopefully, their customers tomorrow. What are they telling us about data breaches? I think it's probably one of the indicators where we saw the worst performance across the board. So it's an area where the companies are very, very reluctant to say anything, and that's quite material, and it's very interesting to me as an investor, because I have a couple of questions to ask. One is, is it because the companies themselves are not in good control of the issues, the risk management issues related to their infrastructure and their commitments to their customers? Does that mean in two years there'll be a new player who's going to offer me a much more stable and secure platform that I can work from? Interesting business opportunity.
Or does it mean that the companies and their boards don't yet fully understand the issues, and possibly that their general counsels and their lawyers are telling them, don't say anything publicly about this because you may face some liabilities you're not prepared to manage? These are all important and complex issues. They all relate to the business models, and these are inherently interesting, significant investor questions. So the investors will come, but these are complex issues, and events like the one related to Yahoo are what get investors really started on these questions. So if we continue on that, where do you think the risk actually resides now? I think there's no question, and again this is a business model issue: essentially what the major internet companies in particular do, and to a degree certainly the telecom providers behind them, is outsource that risk. They outsource that risk to their users, and to a degree they outsource some of the compliance issues to the app providers, who are typically smaller companies and much less capable of dealing with it. So it's almost an old-economy model. We've simply outsourced the risks, and in many senses this is a road we've walked down before. To me, in some senses, it kind of looks like the auto industry in the 1960s or 1970s. We know this will be unsustainable. We do know it will change, but at this stage we're locked in this kind of process where we say, look at our cool products. You want them, don't you? You want them so much you'll take them even though they are not really yet very good products. So Rebecca, going back to you, and I'm forgetting the audience completely because it's fascinating stuff; my apologies to them. So, also building on your book, and combining that with what we just discussed, do you actually think that the distributed nature of the internet is making transparency and accountability harder?
Yeah, well, I think there were always a lot of naive assumptions that the internet would just make everything easier, and that's obviously not the case. I guess yes and no. The nature of the internet makes it harder to assign responsibility for things, particularly when you have a situation where governments are trying to influence companies and then companies are taking actions. The responsibility is often joint, or it's kind of opaque as to exactly who is responsible, which is why we're so focused on transparency. We need to know: who's responsible for this content getting removed? Who do I go hold accountable? And oftentimes you don't know. Or, who was responsible for my cousin in Turkey ending up in jail, and to what extent did that have something to do with his mobile service provider or some app or something? It's very opaque. And then you have the bigger challenge, which I wrote about in my book: you have this globally networked set of platforms and services, and then you have government jurisdiction that's very tied to specific countries, each trying to regulate or take action upon these companies in different ways, often contradicting one another, and then the companies trying to have globally consistent policies and practices. And it doesn't work, and it's not serving human rights very well in the digital space. And so one of the hopes is that because this is a global index, and because we're trying to hold up global standards, it enables stakeholders to assign accountability more broadly, rather than just depending on their own political system to provide it, which is the old model, right? The old model clearly isn't working. The idea that their domestic market and their domestic political system will take care of the accountability and responsibility issues is not working in this space at all.
And so you need more comprehensive data sets, which is why transparency, and requirements of transparency, become so important, and why having comparable data about what's happening on a global scale matters, so that somebody in India can say to Bharti Airtel and the Indian government, look, here's what's happening here, but something very different and much better is happening over there, so why do you have to do this here? And try to have a bit more positive modeling, rather than everybody modeling worst practices, which has been more of the trend lately. And so governance is increasingly about benchmarking and monitoring, about evaluation, and about enabling people, both affected groups and those who speak for affected groups, to figure out how to apply the right pressure and incentives on a more global scale. If we squint at the graphs, there are no winners, but we definitely see a difference between the Ds and the Fs, and some of the Ds are members of GNI. So we see that there are organizations that are sensitive and responsive to this data and are showing improvement, but there are definitely also organizations that do not seem to care. Will there be a differentiation between the forerunners and those who say, this is not of interest to us, we're just gonna give the best pricing? How do we ensure that doesn't happen, and what can we learn from other sectors in that respect? Well, I mean, I think it's not an accident that the companies that ranked the highest did so, because not only are they part of GNI, they're also the ones that got scrutiny the earliest. I mean, some of the first scrutiny was by Rebecca in 2006. So you're seeing, in some ways, the culmination of 11 years of pressure on them. So on the one hand you can say that, across industries, companies that receive pressure change, and you will probably see that.
The second thing is that, as disclosure increases, as Rebecca alluded to, you wanna create a race to the top and not a race to the bottom. And this is an area where there's a combination of clearly external challenges, like governments or criminal networks that are stealing data, but there's also consumer choice. You could theoretically see the 2019 Digital Rights Index giving individual consumers or groups the ability to choose somebody who's above 60% at every part of the internet or telecom chain, or something like that, to say, look, if we're gonna have it, we want at least this. So by ranking things, it forces other companies to look at what their peers are doing, but it also creates more disclosure, and hopefully it will create more choice. What it also says, and what's important, is that whether it's a data breach, or government surveillance, or a company taking data and using it for commercial purposes, the data is the same, right? It is only the people who use it, and the purposes they use it for, that differ. And so, knowing that there may be criminal activity, there may be legal surveillance or illegal surveillance, or maybe even a dubious commercial purpose, knowing what every part of a company is doing in greater detail gives a lot of people, consumers, governments, investors, and others, a better way to look at what a company is doing and exert pressure in a more focused way. But I think ultimately it will push people higher, or at least it will cause people to avoid the lowest on the chain, in a sense. At this moment I'd like to turn to the highest-educated mic runner in the city, the doctor mic runner, to see if we have some questions from the room. Could you please state your name and your affiliation? I'm Michael Nelson with Cloudflare, a web security firm.
I really like the work presented here, and I really like the fact that you're starting to see action. I'm particularly glad that we're starting to see data on what companies are doing to take down content according to their own corporate policy, not at the behest of law enforcement. So my question, I guess, is mostly for Rebecca. In your discussions with the companies you've surveyed, have you heard indications that others are going to start doing that as well, and what have we learned so far from the data that these companies have released about takedowns according to their own corporate policy? Yeah, that's a really good question. Beyond the companies that actually showed disclosure there, there have definitely been conversations with other companies about how to start doing this. And disclosure around terms of service enforcement is tough, because you're also dealing with spam, you're dealing with child porn, and so companies ask, do you want us to disclose all of these things, or what exactly do you want? And what we explain, and what a number of other groups, not just us, have been saying, is, look, people need better clarity about how your terms are being enforced and about the types of things being taken down, in what percentages and volume, right? This is about accountability; it's not just about numbers for numbers' sake. So there's definitely an ongoing conversation between the companies and a lot of groups about what good transparency around terms of service should look like, because there's no one good model, and right now we're seeing kind of piecemeal transparency. Twitter started at the very end of 2015, beginning of 2016, with some information about accounts that they had deactivated around extremist speech, and then Google released a bit of information here, and Microsoft has released a bit of information about specific types of things.
I'd have to go back into the data, but it's generally one or two categories of content rather than a comprehensive view. So it's a good first step, but everybody's kind of experimenting with how they release this information, and how they release it in a way that is meaningful. I think also companies just don't track it as much internally. What we've seen with transparency reporting around government requests, or around copyright takedowns, is that companies first have to put in place a system to track this stuff internally before they even figure out how they're gonna disclose it publicly. And sometimes certain disclosures don't take place because the companies don't have their own internal systems for tracking and analyzing the data until those are really sorted out. So sometimes, when you see a new set of disclosures happening, maybe for a year, maybe more, it's a set of processes finally poking to the surface that have been going on internally for a really long time, to understand the information and figure out how to communicate it. Is there one particular type of terms of service violation that's more common in the cases that you've looked at already? Well, there's not enough data disclosed to be able to answer that question. Right now, most of the disclosures have been around extremist content, although Laura can probably tell me off the top of her head, better than I can, whether there are a couple of other subjects that companies have also been disclosing. Yeah, non-consensual pornography and terrorist content are what get disclosed. But the data isn't telling us anything about what percentage of total takedowns that represents, or what the other categories are. There's just not enough data being disclosed to even know. So with this, I'd like to take the opportunity to ask a question back to an internet company, if you don't mind.
How is Cloudflare dealing with risk assessment for freedom of expression and privacy? Well, for one thing, we're talking to GNI every so often. Cool. But we also have a very clear transparency report, quite comprehensive, on what data we have been asked for by law enforcement. We do that every six months. We also make some very clear promises about what we will do for our customers. We're right in the middle of the internet; we process more than 10% of all the world's web requests every day. And if people don't trust us, they'll stop using us. So I don't know how we would score, but I think we'd do pretty well, because transparency and trust are essential to our whole business model. But we are also subject to these law enforcement orders, and sometimes they come with a gag order. We just spent three and a half years fighting to be able to tell the world about one of these cases, and we are succeeding in court so far. But for a company in our position, and for several of the other companies up on that list, if you don't have people trusting you, you don't have a business. And we will spend a hell of a lot of money to fight government requests that we think will endanger our customers' data. Quinn from Article 19. I actually just wanted to follow up on this a little bit, reflecting on the ranking: Microsoft scored quite high, and I wonder to what extent this actually goes back to this question of business model. I was in a meeting with Microsoft and all the other internet giants a year or so ago, where it was clear, because they were getting pressure from their business clients on privacy and data protection, that they were taking it very seriously. Whereas some of the others, who were more of the B2C model, were saying, no one's telling us they care; they just like our new shiny app. So to what extent is that really driving some of this enhanced transparency and protection? Because it's not longevity; Apple's been around almost as long as Microsoft.
So where do you see that playing a role? Yeah, that's a really good question. Business model is definitely part of it. And so, I guess, in a way this intervention is meant to push beyond business model as an incentive. But it's been pretty clear for a while that Microsoft has had a number of reasons why they wanted to implement these things. It also is kind of self-reinforcing. They've been involved with GNI from the beginning. They've been involved in conversations with stakeholders for a very long time. They've been doing due diligence. When they bought Skype, they inherited a Chinese joint venture that was a nightmare. And they did a human rights impact assessment, consulted with a lot of people in the human rights field, cleaned it up, basically, and implemented a change that is much better for privacy. And so that goes beyond the sort of enterprise business model and speaks to a company that early on just decided to make commitments. But business model is certainly part of it. I think it's easier for some companies to care about these issues than others, and exactly where their profits are derived from is part of it. But I guess what we're saying is that, beyond your business model, you have an obligation to respect rights, regardless of what your business model is. And so here's that picture. And just as some regulatory factors might give some companies a leg up and some companies a leg down, business model may be doing that too. But objectively, the facts are: these are where the disclosures are happening, and these are where the disclosures are not happening. And that's just kind of an objective, across-the-board measure. Thanks. Jason Pielemeier from the State Department's Bureau of Democracy, Human Rights, and Labor. Congratulations on the report. So, a question about the indicators within the freedom of expression bucket.
So Article 19 talks about the right to seek, receive, and impart information. Seeking information, clearly, in that sort of mobile internet context, you address in part by looking at the mobile ecosystem and the app stores and what transparency there is there. Imparting information, obviously: disclosure about censorship and government requests to censor information. Receiving information, though, I'm curious about, especially in the context of all the discussion around news manipulation and the most recent elections. Do you look at, or are there plans to look at, the ways in which companies disclose how they proactively feed information to users, whether it's news information or feeds from others, including advertisers? Just curious if that's already part of it, or if you have started to think about how to incorporate it. We don't currently have an indicator that's looking at that specifically. We've been asked quite a bit about algorithmic transparency also, and how that fits into the index. You could see, alongside the need for transparency around how you're enforcing your own rules or how you're policing content, that for how you're proactively manipulating content there could potentially be some way to develop an indicator. We'd need to spend more time looking at it. But definitely, I think, more broadly, beyond the questions we're looking at, there does need to be more transparency just in general. And I think it might also be useful, if there are some affiliated groups interested in looking at this question, to work out what an indicator looking precisely at that question would be: how transparent and accountable are companies about the way in which they're shaping your information environment in the context of the platform? What would that indicator look like? What would the objective questions be that you need to seek disclosure on?
That, I think, could definitely be a really interesting exercise if somebody wanted to help us think it through. But before we go back, I was just wondering if Arvind or Melissa had any responses around the business model question in particular, or anything else so far? I think, from my perspective, it's extremely obvious that what we're seeing, very much in real time, is business model issues being sorted out, and it's a little bit ugly. In a way, the telcos are more accustomed to a slightly longer-duration dialogue about a regulatory process. It's actually the internet companies that are now finding themselves caught by something that the more traditional utility providers are not so surprised by, and they're not, at every moment, wearing it particularly well. One thing that I would say, and I'm happy to refer to this intermittently: if you want a fresh snapshot in time of how some of the most highly paid securities lawyers look at some of these risks, it's worth your time to take a look at the risk factor section of Snap's S-1, the Snapchat company. It's very fresh, and in order to get that terrific valuation they had to spend a lot of money with very, very good lawyers who carved up their risks relative to their operating environment, and they tell you what they're frightened about, or worried about, or risks that they fear they may not be able to manage. And this goes very much to your question about the business model. Roughly in order, the top four risk factors for them: number one speaks to issues of privacy, safety, and security, and implicitly to regulatory risk and risks they may struggle to manage. They talk second about mobile operating systems and their dependence on other companies' infrastructures and, interestingly, a little bit to your point, standards that, as they put it, they do not control.
So when you have a very rapidly evolving set of sensitivities about something like associated content, or how others might view your content, think about Google over the last two weeks and their ad placement. These are immensely sensitive issues, and they're not going to get less sensitive. They're clearly becoming more sensitive, and simply saying, that's where our algorithm puts your ad, is increasingly not a very satisfying answer. It's not a very business-like answer, either. Snap also refers to their total reliance on Google Cloud and the risk of disruptions. That's their infrastructure. Those are their pipes. If Google Cloud can't deliver for them, they can't deliver their fun stuff. And then the final issue relative to their business model, really, really crucial: their inability to collect and disclose data. That's, how do we monetize? How do we make money? So this issue of the business model is immensely important, and your simple slicing of it, is it B2B or B2C, matters. It's easier to impose legal liability in the B2B context and to see this play out in a more stable, traditional, legally mandated way. The consumer area is much more volatile. And from an investor perspective, it means you're going to see a lot of events, and that's how investors will get better at this. We're going to have disruptive events to look at, especially when you have companies trading on very stretched valuations. Some of those can be really value-destructive. Some of them, over time, can be really value-enhancing, because, indeed, some of these companies will get better at this. Yeah, I mean, I think Microsoft is a good example that structure has an impact. I think one of the reasons Microsoft is able to do some things faster is because it's been around since the '70s, and so it has put all the structures in place to have a compliance regime or implement policies throughout the company.
The flip side is that if you look at telecoms, one of the reasons they have a lot of problems is that they used to be government entities, and institutionally they're not necessarily used to taking an arm's-length position from the government and saying no to what government wants. So that's a downside. But from our perspective, it shouldn't matter. If you take GNI, every company has the same time-bound period in which to do what it's supposed to do. And our view is that you can't either rely on or blame structure for not meeting human rights commitments. You've got to figure out a way to do it within the context of the individual institution. But, I mean, not just in this sector but in others, a company that already has systems in place, that has a general counsel's office with visibility throughout the whole company and can act quickly, will probably be able to integrate new policies faster than one that doesn't have any, or is just starting to figure that out. So, I should go back to the audience, but we see that size and visibility matter, and we just heard from the gentleman from Cloudflare that over 10% of the traffic is going through them. So is transparency and accountability a kind of consent of the centralized? Consent of the centralized. I mean, you know, this is part of the problem, right? You have some markets, take Myanmar, Burma, as one of the more extreme cases, where the internet is Facebook, right? A lot of people are going online for the first time through a smartphone, which they get at a shop. They purchase a smartphone at the shop, with a subscription via Ooredoo, which is in our index, and the person in the shop sets them up with the Facebook app and a Facebook account. And their entire internet experience is through Facebook, via Ooredoo, via whatever Android smartphone it is. And so it's going through those layers of dependencies.
And, yeah, it's not like you can choose between a dozen different DVD players and have all this consumer power. You're kind of stuck with that set of layers. And this is why figuring out how to incentivize transparency matters, why transparency is so important, and why we need to figure out what the levers are to induce greater transparency. Because it is a little bit different from, again, leveraging the power of consumers to choose among six different DVD models or auto models or something like that. People are kind of having to consent if they want to participate in the modern world, and they don't have a lot of choice. And so the question is how you get more accountability, and also participation and knowledge and understanding of what's happening. The choice ends up being more about how you're going to use the ecosystem that you're stuck with than about whether you're going to use it, right? Because you kind of have to use it if you want to interact with things going on in your life. But you can make choices about what you're talking about online or not, or with whom you're having conversations through Facebook and with whom you might meet in person, basically just how you're going to govern your behavior in that ecosystem. And that will have an impact on companies over time, because if you are constraining your actions in these ecosystems because you don't trust them, then those ecosystems are going to be less valuable in the various ways that connect to the company's business model. And so that's one way that companies are going to need to respond if people start changing their behavior as a result of greater knowledge of what's going on, or just greater knowledge of the fact that Ooredoo is completely untransparent about anything.
And so you don't know what they're sharing with whom, and therefore you'd better not give them the benefit of the doubt. And right now people aren't even aware of that. So, yeah, there are many layers here, but I do think that pushing for greater transparency, or at least shining a light on the fact that there's no transparency and no accountability, is the first step. But Arvind and Melissa may have more views from their experience on where you go from here. I think one of the interesting examples for someone like me to pay attention to, because of the real power of it, we've just talked about the Facebook example, but I live in Hong Kong, and if you're looking at China and you do business there, which I do, I'm on WeChat. It's very difficult now to be in touch with Chinese colleagues, companies, and other investors if you're not on WeChat. And WeChat is fascinating in what it discloses, and one of the things that's apparent is that the way they track users who originate in China with Chinese ID cards is different from the way they track a user who originates outside of the Chinese ecosystem; they don't track you in the same way. So Tencent is really worth the time, and they are a super app. They have everything. You can do all your financial stuff, buying, selling, travel. It's very comprehensive. It's a terrific consumer-oriented service, but all of my Chinese business partners and friends, I mean, we're very, very clear: there is no security. Anything that you put on WeChat will be shared. That's a given. But interestingly, Tencent is also a company that is looking to expand, and to expand significantly, in Asia and in North America as it can. So watch them, because what they're clearly saying is that they're going to shift their business model as they do it, that they're aware of the fact that there are different sensitivities.
What would be great in a disclosure context is if they can start coming up the curve and making that apparent to the types of people they really want to add as new users and people they want to do business with, because we have the right to ask.

Yeah, there was a question from... Thank you. Shanthi Kalathil, I'm with the National Endowment for Democracy. My question actually directly relates to these Chinese companies that have scored fairly low in your rankings, and it also ties into something that Mike Nelson said at the very beginning about trust being the key component. It almost seems, just given your comments just now, that some of these companies may, as they push overseas, develop some kind of bifurcated system whereby they don't really care whether they have consumer trust within their borders, but as they expand outside, perhaps they are under more pressure. So I'm curious about the whole panel's perspective: these companies are likely to be quite significant in the internet ecosystem in the coming years. To what extent are they subject to investor and consumer pressure on these issues, and how likely are they to adopt transparency and accountability practices that are more in line with what we would want to see as the global norm, given that they're not really subject to a lot of domestic pressure in this circumstance?

Why don't I go first on that one? I don't think the incentives are there yet. One of the problems we run into, and it cuts across industries, is that there aren't enough positive financial incentives in place yet to do the right thing. What's more likely to happen as they expand is that in certain jurisdictions, in Europe, for example, where there are mandatory disclosure laws forming, companies will have more interest in doing that. What's going to be needed, for companies that aren't doing a good job and even for companies that are doing a better job, are more incentives to do the right thing.
And in an ideal world, that could mean that the cost of capital has to be lower, that there has to be preference in government procurement, things like that, that incentivize doing the right thing. It's not there yet, so what I suspect is that it will be very, very piecemeal depending on where they go and what they do. And part of that is going to be public pressure, undoubtedly. As companies move into new areas, they'll get more scrutiny as well. But there isn't a holistic approach, and until that starts happening in a more holistic manner, it's going to be very piecemeal and still pretty challenging.

One thing that was really interesting with this index, because we added a Chinese company and a Russian company so that we were able to compare two Chinese companies and two Russian companies, is the differences between them. The assumption would be, okay, the two Chinese companies are going to be really similar in their scores because they're dealing with the same difficult regulatory environment. And actually there was a lot of difference between Tencent and Baidu, around the privacy indicators in particular. While neither of them discloses anything related to government requests or government demands or how they're handling those, Tencent was disclosing a lot more around the, quote unquote, consumer privacy-related issues: what's being collected, how it's being shared commercially, and so on. And one of the things we did with our research on all of these companies is that we had somebody who's an expert on the legal context of each jurisdiction where these companies are headquartered help us with an analysis to identify, okay, of the things that, say, Tencent and Baidu did poorly on, which are things they really have little control over unless the law changes, and which are things where there's no legal barrier to improving.
And we identified quite a number of things on which there is no legal barrier to improving. Basically everything that doesn't relate to government requests, anything that relates to private actions, they could improve transparency on significantly. And because of that, there are differences in their practices. So I'm hoping that this kind of research will get out into at least the more savvy discourse in China about these types of things, and that the conversation that takes place may eventually have an impact on the companies. Likewise with the Russian companies, there was tremendous difference between Yandex and Mail.ru on a number of things, even though they're both in a tremendously unstable and not very free environment. And you saw Yandex taking security much more seriously and disclosing that. So there are things that companies can do, and I'm hoping that our analysis, and we've got a section on the website that looks at the Chinese companies specifically and analyzes the differences in a pretty granular way, and the Russian companies as well, identifies what we think companies can change despite the tough regulatory environment. So I'm definitely hoping that as those kinds of things get highlighted and circulated more, perhaps investors might also seize on them. Because if an investor calls up Tencent and says, oh, you did badly in the Corporate Accountability Index, Tencent says, well, what do you expect, we're in China, and that's kind of the end of the conversation, right? But if the investor can say, well, we know you're in China, but there are all these other things that apparently the index people say there's no legal barrier to improving, so why don't you do that? Then it becomes more of a foothold, I think.

Yeah, granularity is opportunity in that sense, and it allows for much more intelligent and advanced policymaking. I was strictly told to stop at 11.
That's one minute from now, but I would really like to pick up all the questions. So there was one more question in the back.

Eric Kuhn from Echo Kilo. I know your work is based mostly on disclosure and data, but this discussion has flirted very closely with accountability in the context of some ethical issues as well. What are your views on looking at other aspects, such as legal actions against some of these very influential tech companies? It's hard not to see Google's ranking. For example, there was decade-long litigation in the Google Books matter that most recently went to petition before the Supreme Court. Is there any other sort of ethical basis that you would look at as far as accountability?

Yeah, that's a really good question. Because this is a global evaluation, we looked at questions of what if we include lawsuits, or what if we include incidents reported in the news, kind of controversies, or what if we include positive advocacy steps that the companies take as well. And one of the challenges is that because we're looking across a global set of companies, you end up punishing companies that are in more open societies, where the legal systems are more open to lawsuits against the companies. There are no lawsuits that we know of against Baidu, but would there be a ton of them if they were here? Yeah, I think so. So in terms of an objective measure across a global set of companies, we felt that including those types of questions would not be helpful in setting a global standard of what's required. And again, this is a floor, not a ceiling. These are the minimum things companies should be doing, and they should also be doing many other things we're not measuring, both doing positive things and not doing things that are going to cause people to sue them.
So that's, I think, an important thing. Also, importantly, we're measuring a set of things because that's what we can evaluate globally in a consistent way, but I think the real value is going to come when, at a more granular level, people build off what we're doing and say, okay, let's do an analysis within Europe or within India or wherever about what's going on, and add questions about lawsuits and questions about controversies, and build off of that to do a more in-depth analysis. I think that would add a huge amount of value.

Just to add something, not necessarily about the ranking itself, but in general, standards evolve. A really good example of that is the standard that the index, or the ranking, looks at about whether companies are challenging problematic laws. That's a relatively new concept, and where it came from is the Snowden disclosures. Once the magnitude of surveillance and how it was being done came out, companies could rightly say they can't disclose what the government is asking them to do, and human rights groups and others basically said, okay, if you can't do that, for understandable reasons, at least show us what you're doing to change the laws that are so problematic in the first place. So you can't let this just live in a black box and throw up your hands and say there's nothing you can do, and that standard can be applied anywhere in the world. So my expectation would be that over time you see, maybe not that precise indicator, but different indicators getting at that issue in a way that's disclosable and something you can rank as well.

I think one of the things I'm conscious of, especially where ethical issues come very close to how investors may or may not perceive things: there are two examples that came to mind as I was preparing for this event over the past two weeks.
Last week I was in New York and saw some contacts who are in the advertising industry, and one of the remarkable things they casually mentioned, as we were talking about how Google and others are playing such a significant role in changing the nature of ad spend globally, and we were talking about healthcare companies, was: oh yeah, you know, if you want to get a data set now that helps you target Medicare and Medicaid recipients for particular pharma companies, that's no problem at all. You can get identifiable data based on people's location services and geodata, people who go to Medicare and Medicaid offices. That struck me as getting really very close to a line, and that is truly troubling. I think part of what's going on is that we all need to sharpen our business and ethical senses a little bit about where these quite important lines actually do exist. Who's going back to the large ad companies, and indeed the large pharma companies, and saying: that's an inappropriate form of targeting if Google is providing you with that data off of their platform, that's a no-go zone?

Another area where we all need to get a little better: yes, we love the new bright shiny things and the apps and so on, but consider the people who help give us information, the softer sources we take in, which include the business press. If you want an interesting little adventure on that one, there was an article, I think on the BBC site yesterday, on the tech side, about a very cool new app that you can put on your phone that apparently helps measure sperm quality. And they described it as, wow, isn't this cool, and this is how it works, and so on, failing to ask the business question, which is: how is this app company going to monetize this app? Because the conventional means would of course be to sell the data. That's a little sensitive, I suspect. So that is the business question.
So we should be sharpening our sensibilities about these boundaries, which often have a profoundly ethical piece to them. And when these events come, and as I said earlier, investors are very responsive to this issue of events, I think that's a company that, when someone does their due diligence, is going to struggle to raise venture capital money from more mainstream investors, because figuring out how you're going to monetize that one is going to raise some important questions. So this is something we're all getting better at. You can see it obviously in the data trends around terrorist content and pornography, but these issues now are spread across the landscape, and they affect very significant populations who will see the issues in ethical terms.

So I'm not going to be able to top that. We've seen that there are no real winners in this analysis and that currently a lot of the risks are actually being outsourced to users. But there's also a clear way forward, with new standards being set here and in other sectors. And luckily we'll be able to monitor that, and to advocate for it, through the data gathered by Ranking Digital Rights, so we can hope that more human rights impact assessments, including assessments of impacts on freedom of expression and privacy, will be done. So we're looking forward to next year's impact assessment and hope that the future will be exponentially better. And on that note, thanks all very much for your presence and attention. Thanks so much, panel.