 Good afternoon and welcome to New America. I'm Kevin Bankston and I'm Director of the Open Technology Institute, which is New America's Internet Policy and Technology Program, focused on ensuring that all communities have equitable access to an Internet that is both open and secure. OTI is also the institutional home of the independent and independently funded Ranking Digital Rights project, which today is celebrating the DC launch of its 2018 Corporate Accountability Index. That index measures how well, or more often how poorly, 22 tech companies around the world are protecting the privacy, security and free expression rights of their users, using a set of 35 objective indicators refined over years of research and multi-stakeholder consultation. How many years? Well, let me tell you a short story about the development of the index over the years, and about the impact of just one of the indicators used in the index to evaluate the companies. I fondly remember coming to the first major private meeting of experts that RDR's founder and director, Rebecca MacKinnon, convened at New America to get feedback on her ambitious goal of such an index. That was in the fall of 2012, before I even worked here. From there it was three long years of hard work building, refining and applying the first version of the index's indicators, leading to the publication of the first rankings in 2015. And in that first Corporate Accountability Index, there was one indicator, just one, that every single company got a zero on. This was the indicator asking whether the company regularly published data about how much content it took down because of violations of its terms of service. 
At that time, although many companies published such data about government demands for information, about government demands for takedowns and about copyright-based civil demands for takedowns, no one reported anything about the content that they were taking down voluntarily based on their own content guidelines, even though that was clearly the largest category of takedowns and therefore the category most impactful on users' free expression rights. No one was doing it. But in 2015, RDR put a stake in the ground and made clear that, based on the growing consensus of the broad multi-stakeholder range of experts that they were consulting in creating their indicators, the companies that weren't issuing such reports, meaning all of them, were not doing enough to provide transparency and accountability to their users. Just as importantly, RDR made clear that those companies that did issue such reports would be given public credit for doing so. RDR made transparency reporting around content moderation a priority when it wasn't one before, and kept pounding on it year after year and index after index. Flash forward to today, three years and two indexes later. Just this week, on Monday afternoon, Google, via YouTube, became the first company to issue a detailed transparency report about its terms of service-based takedowns, highlighting how over 8 million YouTube videos were taken down in Q4 of 2017, along with details about how many of those were flagged by humans versus automated systems, how many violated which content prohibitions, and more. And on Tuesday, Facebook finally published its detailed internal guidelines about how it makes decisions about its own takedowns while expanding its appeals process for impacted users, both steps that are also responsive to RDR's free expression indicators. 
And now, now that those first dominoes have finally fallen, we're likely to see, cross your fingers, a revolution around content moderation transparency across the industry over the next few years. Just like how the first trailblazing steps on transparency reporting around government demands set the stage for an explosion of reporting across the industry once the Snowden surveillance scandal added gasoline to that fire. As that example shows, the progress from the first demand for a new rights-protecting practice, to getting one company to actually do it, then to getting a few more companies doing it as a best practice, and then finally to all companies being expected to do it as a standard practice, that process of driving adoption can take nearly a decade of grinding work and hyper-focus, the drip, drip, drip over time of water on stone. And that's the work that Rebecca and her amazing team have been doing now for over half a decade, and hopefully will be doing for many more years, pushing that rock up the hill, not just on that one indicator, but on 34 more and counting. 34 more angles to push companies to do better by their users, 34 more ways that RDR is making progress happen, slowly, methodically, drip by drip, one year at a time. It may not be flashy, but that is what real change looks like. That is why I think, pound for pound, RDR may be the most impactful project in the internet rights space, and that is why I'm endlessly proud that RDR calls OTI its home. So with that, I would like to congratulate Rebecca and her team on issuing their third Corporate Accountability Index and would like to invite her up to tell you all about their findings, after which she'll be joined by an excellent panel of experts. Thank you. Thanks so much, Kevin, for that really fabulous introduction. 
I'm going to do sort of a credit roll, sort of like after a movie, so that you can really appreciate all the people who put work into this once you've heard a bit more about the index, if Kevin's introduction wasn't enough. So we've been doing this, as Kevin mentioned, for three iterations, but a lot more time has gone into developing the index. And I want to talk a little bit about the reason why we're doing this. Of course, if you live in Washington or really follow the news at all, you're aware that there's a bit of a clash of power between internet giants and governments these days in terms of who wields power to shape people's digital lives and how that relates to their physical lives. So this map shows the world's most popular social networks by country. It's created by an Italian digital marketing entrepreneur who's been doing this for a decade, and it's very interesting: when he started, there were a lot more colors on the map. But all of this blue is all the countries where Facebook is the most popular social networking platform. And you see China with QZone and Russia with VKontakte, and then a few other anomalies on the map. But mainly, Facebook is one of the sovereigns of cyberspace, we would say, for much of the world. If you look at Alexa rankings, and Alexa is the company that ranks the traffic going to websites around the world, if you look at the Alexa ranking for the top website in every country around the world, this light blue is Google search. The pink is YouTube. So again, Google is the sovereign of cyberspace for much of the world. And then you have China and Russia and a few other small exceptions in a few other places. Which brings us to Ranking Digital Rights and the map that we show of the companies that we cover. Now, we would like to rank more than 22, but resources only enable us to rank 22. But we've selected 22 of the most powerful internet, mobile and telecommunications companies in the world. 
When you add them up, they're shaping the digital lives of most of the world's internet users. So not just North Americans and Western Europeans: really, when you add up these companies, you've got the sovereigns of cyberspace, you've got the top two mobile device sellers, whose operating systems, Google's Android via Samsung and Apple's, are shaping how people access other platforms. And you've got a selection of 10 telecommunications companies that, because of their global footprints operating across the world, are affecting the digital lives and the ability to access internet platforms of most of the people on the planet. So that's how we selected this group. And you've all got a four-pager on your seats that has the list, so you don't need to squint and memorize the list of companies here. But that's the selection, which is why we have two Chinese companies in the index, as you saw from the other maps. It's vital that we include Chinese companies in this equation. Also vital that we include two of the most powerful Russian platforms. In addition to Samsung, the Korean mobile device maker, we also have Kakao, which is a major messaging and internet platform in South Korea. We thought it was very important to include at least one internet platform that is based in a democracy with strong rule of law that is not in the West, because that helps us test out some of the assumptions about what is universal and what is not. So that's the set of companies that we're looking at. And these are really the choke points for our expression around the world. They're able to shape what we know and what we can say online, who we're talking to in what context, and who knows what about us and what they can do with that information. So this is this year's ranking. 
This is when you take all 35 questions we ask, where we're looking at companies' commitments and disclosed policies that affect users' privacy and freedom of expression. When you add up the scores for all 35 questions, this is how they stack up. Now, you'll see we've got two Ds and everybody else gets an F, effectively. So in that sense, while obviously there are some that are disclosing more than others, nobody is disclosing enough. And again, the list of companies is on your four-pager. This is a more detailed breakdown. We separate our methodology into three different categories. The first category is governance. What we're looking at is: does the company make a corporate-wide commitment to respect users' freedom of expression and privacy? Is there board, executive and management oversight over the way in which the company is affecting users' freedom of expression and privacy? Are there comprehensive impact assessments carried out by the company that track and anticipate the positive and negative impacts that the business operations of this company are going to have on the freedom of expression and privacy of users? Is there stakeholder engagement? Is there grievance and remedy when people's rights are harmed? We're looking at all those types of questions. Freedom of expression. A company does not get high marks for freedom of expression because it's the biggest free-for-all, right? That's not what we mean. We mean freedom in the context of human rights. That's very important. So it's not that the company with the fewest rules wins. If there are no rules, if there's no governance, life is nasty, brutish and short for everyone who isn't really large and really wealthy and likely male. That's why we have governance, and it's important. The issue is: is the governance accountable? And is it actually serving the rights and interests of the governed, right? 
That's what speech governance should be about. And so we're looking for transparency by companies about all the different factors that are shaping what you can say online, what you can access online and how you communicate online. We want to see transparency about the types of government demands they're receiving and how they're responding to those. Demands from other parties too, whether it's copyright holders or people flagging harassment. We want to see how those mechanisms are working, and the volume and nature of content being restricted or accounts restricted and so on. We also want transparency around things like network shutdowns by telecommunications companies, how networks are being managed and manipulated, et cetera. Privacy is in three buckets. First, you have what in Europe is known as data protection issues, and in the States tends to be called consumer privacy issues, but the whole question of what is the life cycle of user data. What's being collected? What's being done with it? With whom is it being shared? Under what circumstances? How much control does the user have over the use and sharing of that data? How long is it retained? Are you being tracked around the web, et cetera? We want to see clear transparency about that, and I'll show you some of the results on that question later. The second bucket relates to government demands for user data. Is the company being maximally transparent about the demands it receives for surveillance and for sharing user data with authorities? And the third bucket is security questions. Is the company providing credible evidence that it is taking strong measures to secure users' data from theft and breach and so on? So that's the index, and you'll see that the companies with the highest total score don't necessarily get the highest score in each category, and as you drill down to each indicator it starts to vary even more, as you'll see. 
So this year our methodology, our questions, were the same as last year, so we were able to track improvement and change. Interestingly, Apple was the most improved. Apple does a lot of things to protect users' privacy, but for whatever reason has not really disclosed a lot of them to users themselves. They've disclosed these things to security experts, but not actually in their official materials to users. So just by making more disclosures directly to users, they managed to boost their score a great deal. Their score on freedom of expression was less improved. The company does not really make a clear commitment to freedom of expression and has a lot less transparency around content removal in the App Store and that kind of thing. A couple other interesting things to note about the changes: on our website there's actually a page that documents everything that changed for every company, and we also have individual company report cards that talk about what was improved and what wasn't. A couple interesting things among the internet platforms: both Chinese companies in the index improved. They didn't improve on anything that relates to government demands; if you know anything about China, the reasons for that don't need to be explained. However, they did make improvements on security and on consumer data privacy issues, in terms of being more transparent about what's being collected and shared for commercial purposes, and some improvements in transparency around terms of service enforcement as well. So it's interesting to see that even in very difficult places, as far as regulation and law are concerned, some companies are trying to prove that they're doing what they can for their users. On the telecommunications side, the main improvements came from the three European telecommunications companies that recently joined the Global Network Initiative, which we'll talk about more in the panel. 
And that's also reflected most in the governance scores. So in the governance category, the companies that got by far the highest scores, for having much more systematic commitments and accountability mechanisms and risk assessment throughout the company, were all Global Network Initiative members. Not that it's perfect: there's much stronger risk assessment and accountability as it relates to government demands than as it relates to other things like commercial privacy or terms of service enforcement, and that's where a lot of the deficiencies lie. But we're seeing much stronger governance by GNI companies than anyone else. And most strikingly, on the one question we ask about the comprehensiveness of human rights impact assessment, the companies in the GNI are showing much more evidence of impact assessment than anyone else. So that's interesting. Moving on to freedom of expression at the telecommunications layer: for people who live outside of the United States, these types of pages are fairly common. One is from India, when somebody tries to access a website that's been blocked. Another one is from the UK, where some content has been restricted because it's perceived to be adult content; this is on a public network. So the question that we look at is how transparent telecommunications companies are about the various external demands that they're getting to block websites or block access to websites or apps. And really only three of the telecommunications companies tell us much of anything, even about their process for responding to third-party requests to block content. Everybody else is not transparent. And even with just data, transparency reporting about third-party requests to block content, even government requests, we're seeing very little transparency around the world by telecommunications companies. So globally that's a real problem. 
People do not know why content is being restricted and who should be held responsible for that content restriction by their telecommunications providers. Another freedom of expression indicator relates to network shutdowns. This is an issue that people working on internet freedom internationally spend a lot of time on. You have this in a lot of countries, in India in particular: there were 64 instances of the government in various localities just shutting down the internet, mobile data, completely in cities and regions. We only have three companies that showed much disclosure on their policies and processes for handling network shutdowns. The ones that did improve disclosure were Global Network Initiative companies as well, and shutdown requests are entirely a government demand issue. Moving to the internet and mobile ecosystem companies. Many people are familiar with what happens when a page on Facebook gets blocked or a YouTube video gets blocked. And in China, that's an example of a cute little block page that you get on Sina Weibo. But these types of removal and blocking notices are pretty common across internet and mobile platforms. And we're starting to see, as Kevin mentioned, transparency reporting, particularly around government demands to block and remove content at the platform level; such transparency reporting has been going on for some years to varying extents by some of the major platforms. So that's an example of Twitter's transparency reporting, which has been going on for a number of years. Facebook is starting to report more about content removals, although they report a lot less and their score reflects that. Google's been doing transparency reports about government demands and copyright takedowns the longest; on the left is their transparency report related to government content removal and blocking demands. And you can see the demands have gone way up in the last couple years. And this is one of the many reasons why transparency reporting is important. 
So you can see where the demands are coming from and what the trends are, both in terms of who they're coming from and what kind of content and so on. We want to see transparency from governments too, but corporate transparency is a start. Google also gets credit for its transparency around the right to be forgotten demands that it receives in Europe from private actors who want their search results delisted. When it comes to terms of service enforcement, we stopped our research before the announcements earlier this week, obviously. But these were the scores around transparency around terms of service enforcement. There was already fairly strong disclosure about what the rules were, and with Facebook's latest disclosures, we'll see that bump up even more. But there was very little data as of January about the volume and nature of content actually being removed under terms of service enforcement. And so with Google's latest transparency report, that will go up. These are some screenshots from YouTube's terms of service enforcement reporting. And as Kevin was saying, they have data on what types of people flagged the content, what sparked the takedown, whether it was automated or by a trusted flagger or another human, and so on. So it's very, very helpful. And then this is Facebook's recent disclosure. The most transparency that we're seeing on the privacy side is on government demands for user data. That's where the most transparency reporting has been happening. These are examples. And you're seeing transparency from some of the telecommunications companies on that too, even well beyond the Global Network Initiative. But certainly the commitment that GNI companies make to be transparent about government demands has really helped, I think, to fuel that. However, when it comes to the data protection, consumer privacy indicators, this is our bucket of what we lately shorthand as sort of the Facebook issues. 
Because these are sort of the questions that have been most in the news lately: how transparent are companies about what's being collected, how it's being used, with whom it's being shared, under what circumstances, how much control does the user have over the collection, use and sharing of their data, et cetera. The highest score is an F. And it goes down from there. And that's where Facebook was at the time. I think their latest disclosures might bump them up slightly, but it certainly doesn't bring them to the front. A lot of the disclosures were more just a kind of rewording of current practices, making them more clear. There were some substantive things, so it's not that their score won't change at all, but there's still a lot that needs to be done there. But this line here is just to point out that the telecommunications companies are all as bad as Facebook, if not worse. And that's one conversation we have not been having so much of and maybe need to have a bit more of. And just to show how, when you drill down to the specific indicators, the ranking changes dramatically from what you see in the overall score: when we ask how transparent the company is about what user information it shares and with whom, the Korean company is way more transparent than everybody else. And this has something to do with the fact that privacy law in South Korea is pretty strong. Google doesn't do so well, even though its overall score in the index is highest, because Google happens to just disclose more about more things than everybody else. But when you drill down into specific practices, particularly specific practices that relate most closely to the business model, you see other things happening. And Apple could be a lot more transparent than it is, and I'm not quite sure why it isn't. This is what we like to call the Cambridge Analytica indicator. 
How transparent are internet and mobile ecosystem companies about the options that users have to control their own information? This includes a sub-question, if you go on the website and look at the sub-questions in here, that has to do with how much control people have over the sharing of their information for targeted advertising, and you only get full credit if it's opt-in rather than opt-out. Facebook got the lowest score in the entire index, behind two Chinese and two Russian companies. Their latest changes may move them slightly up, but my hypothesis is that unless further changes are made between now and our next research round, they will not be at the front. Let's just put it that way. Apple is the only company that commits not to track you across the internet. Nobody else commits to that; they do track you, and the level of transparency about it is problematic. Among our various security questions, and there are several and we don't have time to get into all of them, but you can go on the website, there's a whole chapter in the report about our security questions, one question looks at whether the company discloses its policies for handling data breaches. Only Apple among the internet and mobile ecosystem companies discloses anything, and among the telcos there's very little disclosure, but Vodafone shows that you don't blow up and disintegrate and die if you disclose your policies. So that's some interesting food for thought. How transparent are companies about their security oversight processes? Things like: are you conducting a third-party audit? We're not looking for information that's going to aid an adversary in attacking your platform, just some basic evidence that you have processes. Kakao and Google are getting full points. Everybody else is disclosing a lot less than we think is necessary to reassure users of what it is you're doing, and in the telco sector there are also some problematic things. 
So one could go on all day if one really wanted to go through every single indicator, but we want to get to the discussion because that's more lively and we can get into a lot of other questions. We have a lot of recommendations for companies in our report on our website. In the individual company report cards we have recommendations for each company, which focus on: even if there's no legal change in your home jurisdiction, here are all the things you could do today to improve your score. And in each chapter we also have much more detailed recommendations around the specific types of subjects. But it really boils down to: we need much more thorough governance around these issues. We need to see very clear board-level commitment and oversight. We need to see risk assessment that's comprehensive. We need to see grievance and remedy that is meaningful, and we need to see clear stakeholder engagement and a real effort to innovate around business models, technologies and design that are actually compatible with enabling people to function in an information ecosystem that is compatible with the kind of society we want to have, and compatible with the exercise of human rights by users. Companies really need to be thinking about that. We have a lot of recommendations for governments in the report; this is just a few. But we found that there are a lot of companies that would get higher scores if the law in their home country was not so bad. So there are many jurisdictions that are making their companies uncompetitive on these issues. China is a very obvious example, but there are a lot of countries that have laws that don't allow companies to even disclose transparency reports on copyright takedowns, and what the public interest reason for that is, is beyond me. Or there's all kinds of transparency around content restriction and network shutdowns and so on that companies are not providing because the law prevents them. 
The lack of data privacy law around the world is also clearly a big problem. One example is MTN in South Africa, whose disclosure about its handling of user data is very poor, not because there's any political reason why they can't do it, but because the law's not forcing them to, so they're not bothering. And we see this in a lot of countries. So either stakeholders need to impose consequences, or the law needs to impose consequences, or some combination of the two. Now a little advertising for the website, put together by my colleagues, which is just really fabulous this year. You can really explore the data in a very granular way. You can go through each indicator and click on it and see how each company scored on each question, and you can even go in, let's say you click on Facebook here for this particular indicator, and you can see what score they got for each sub-question for each service and so on. You can also go and download the raw data and get all the researcher comments for every single score. So if you really want to geek out on our data, and if you really think that Google didn't deserve the scores it got, you can go into our spreadsheet and look at the researcher comments for why every single sub-indicator got the score it did. That's what you need to do with this kind of thing, because otherwise people ask, why did you give them this? And we can explain it, and they can look at it if they want to. So where are we going next? There are a lot of questions we didn't ask, obviously, and as the world continues to evolve and technology evolves, we're thinking about whether we should ask a question about transparency in relation to the use of algorithms, and also risk assessment in terms of the use of algorithms. What kind of transparency should we look for in terms of the deployment of AI, artificial intelligence, and risk assessment around AI, and grievance and remedy around AI? 
Also, should we be asking more questions that relate to the business models of the companies and the risks and transparency that we want to see around that, particularly advertising? So those are all questions we're going to be exploring in the coming months before we start research on our next index, and we'll probably make some adjustments to the next methodology in some way, hoping to have broader conversations with more experts and stakeholders through this process that we engage in, to test out the more difficult indicators that have less consensus around them, to really try and figure out what standard we want to set for corporate transparency around these issues. It's not always entirely clear at this point. Also, we're only evaluating 22 companies on a set of questions. There are a lot of other companies and technologies we're not evaluating. We're never going to be able to do it all, but we're partnering with people who want to take our methodology and adapt it and evaluate other things. So we're working with Consumer Reports on a set of standards for evaluating the internet of things on privacy and security. And our methodology is public on the website, so we're starting to see researchers around the world adapt it to local and regional companies. In New York, The New School recently applied our methodology to evaluate ISPs in New York City. They didn't do so well. You can download their report. And an NGO in Lebanon has used our methodology to evaluate the privacy policies of telecommunications companies across the Arab world. So we're very encouraged by this: we can't cover the whole world and all the things and all the issues, but we're really thrilled that we're starting to help provide a framework that people can use to explore the companies and the issues that have the greatest impact on their communities. And so I'm hoping that a broader ecosystem will emerge. Finally, my credit roll. 
All of this would not be possible without our team. We have six full-time people, but we also work with a lot of researchers around the world who work with us for shorter periods of time, with expertise in various languages and specific technologies. But our research team: Amy Brouillette, who's based in Budapest most of the time, which is why she's not here, but I hope she's on the webcast. Laura Reed, our senior research analyst, based in New York. Andrea Hackl, research analyst, who's normally here but due to a family emergency is not here today. That research team is the core that works with our researchers around the world to make this happen. Our program manager, Lisa Gutermuth, who is in Berlin some of the time and occasionally visits us here, but hi, Lisa, if you're on the webcast. Policy and communications analyst Ilana Ullman, who's also now living in Berlin. And of course all sorts of researchers and partners, and they're all on our website. Share Lab, based in Serbia, did our data visualization website and designed the graphics, and they're incredible, and they do a lot of great activism themselves. Allison Yost from OTI, the communications diva. There she is, she's hiding. She's being modest. This beautiful report, there are a couple copies of the full report out there, but also these four-pagers, and everything, that's due to her incredibly hard work and creativity. So just admire these things and admire Allison. And then last but not least, our funders who make this possible. As Kevin alluded to, we do not take corporate funding. We can prove under audit that we have no corporate funding. Our funders are the State Department's Bureau of Democracy, Human Rights and Labor, thank you, Laura. They have been supporting us since the previous index, and we're grateful for that support, and it's very light touch. 
And the Ford Foundation, Open Society Foundations, and the MacArthur Foundation, who were our foundational funders from the beginning, without whom this would never have gotten off the ground. So we really appreciate their faith in us. And there's also a set of advisors, including Leslie here and some others back there and out in the world, listed on our website as our advisory council, who've been giving us advice as we've been navigating a whole set of different pressures that people try to put on us. So thanks so much. And with that I will stop thanking people and thank the panel, who, I hope, will come up.

Oh, there is a clock, okay. Here we are, yeah, and there's a clock. Hello, thank you all for staying for the discussion part of this. We were just confirming that we will do audience Q&A towards the end of this. So if you have questions, and I'm sure you do after seeing just a glimmer of the immense amount of data that's in this report, please hold on to them, because we will definitely leave plenty of time for questions. I should probably introduce myself. My name's Emma Llansó. I'm the director of the Free Expression Project at the Center for Democracy and Technology, which is a tech policy advocacy group based here in Washington, DC, with offices in Brussels. I'll be the nominal moderator for this session, but we've got a lot of experts with a lot of great thoughts and things to discuss, so I don't imagine having to do a whole lot. Let me introduce the other two panelists real quick. Rebecca, who you know, hopefully, by now. And then we have Shanthi Kalathil, the director of the International Forum for Democratic Studies at the National Endowment for Democracy, who has also worked at such places as USAID and the Wall Street Journal in Asia, and has a lot of really interesting perspectives to share, particularly focused on the situation in China.
And then we also have Leslie Harris of Harris Strategy Group, who is also a professor at Georgetown University, former president of the Center for Democracy and Technology, a founding member of the Global Network Initiative, and the reason I am in this space, having hired me on as an intern a decade ago. So with that: as both Kevin and Rebecca said in their remarks so far, this may be the most focus on the role of technology platforms and telecommunications providers in our daily lives, in our societies, in our elections that we've possibly ever had in the history of the internet, not just here in the US but in countries around the world. More and more people, I think, are having to come to grips with the fact that there are gigantic companies out there playing a huge role in our access to information, our security, our privacy, and what exactly they're doing, and what we as people, as governed people, can know about it, can be kind of difficult to determine at times. We've seen a number of different events in Congress just over the past month, from the Cambridge Analytica hearings with Mark Zuckerberg a couple of weeks ago to yesterday's House Judiciary Committee hearing on the filtering practices of social media platforms, which is an incredibly important issue to actually be thinking really thoughtfully about, but which also just sort of featured a lot of staffers trying to keep straight faces as the discussions went off in lots of different directions.
There's a variety of levels of public policy conversation about these issues right now, but one of the things that I know a lot of us in the advocacy space keep coming back to is the need for real data, real information about what the practices of these companies are and what kinds of consequences and impacts on user rights they actually have. And this is where a report like Ranking Digital Rights comes in; it's the leader in the field as far as rigorous evaluation according to an open methodology that really enables not just understanding a particular company's practices much better, but also comparing across companies and across countries, and really getting a much more holistic perspective grounded in really solid methodology. So Rebecca, thank you, to you and your team, for contributing this to these public policy discussions that are so important for all of us. It proves that it can be done; we can actually have solid data to work from when we're thinking about these big, difficult issues, and I can't wait for the fourth report to come out. But to start off the conversation, I thought I'd like to ask each of you to comment: in this welter of privacy, security, and free expression issues and the role of these platforms, what's the one issue that you're most concerned about, or that you'd just like to raise for the group? And maybe we'll start with Leslie.

You just limited me to one, huh? So here's what worries me most, and this is in the wake of Facebook putting out its new platform that's gonna allow you to deal with all of your data. Number one, we can't get away from this advertising model. And as long as we have this advertising model, we're the product. And so I think it's really, really hard, and at least in the US, very soon, because we're losing the net neutrality rules, we're gonna have another whole set of players who right now don't have any privacy rules.
So the data collection and the value of data to make these companies grow is, in my mind, so powerful that basically, I want to have the ability to take myself out of being a fashionista; that's what Facebook sees me as, mostly. And who knew? Well, there are some other interesting categories. So I think it's fine to provide these tools as long as we acknowledge that, in the same way we talk about security theater, I think a lot of this is privacy theater, and I'm gonna be pretty direct about that. Secondly, and I think equally important, we're talking about algorithms and algorithmic decision making and profiling. There's nothing they're doing that isn't in algorithms. So to sort of say that your data is somehow subject to an algorithm... I can't remember where I just read this, but besides the fact that 65% of people get their news from Facebook, an equal number had absolutely no idea what an algorithm was, or how they were getting the content they're getting. So just as, back in the day, we would say consent is not the answer to privacy because you're putting the entire burden on the user, there has to be either a shift in business models, or all of this is privacy theater. And secondly, there have to be, on the part of companies, some kind of red lines. To me the biggest lesson out of Cambridge Analytica, and looking backwards, they can all go "oh, we didn't know what was going on," yeah, with the close partner they've done business with for years, is what I call the "everything's called advertising" problem. So if you gave some kind of transparency to what was going on, they'd say yes, they're doing political advertising, and people understand advertising: it's toothpaste. So we have to figure out where the responsibility is, the human rights responsibility, the ethical responsibility of companies, as they are being constantly driven by money and an advertising model, to draw some red lines.
I call Cambridge Analytica the "who can we turn into a Nazi" algorithm, and Facebook just said hey, sure, go on the platform, and you know, find those people who are angry, depressed, and can't get a woman, and send this out, because our research shows we can make them right wing. And so in some ways it's the combination of all our data and the pressure to turn everything into "I'm just advertising"; Zuck said that at the hearing, it's just advertising, right? So I just think, and I don't yet know, Rebecca, what it means for the index, I've been thinking about this a lot, but we have to shift some of this focus to some kind of substantive red lines. And I don't know if that's ethics. If you look at Facebook when they're doing their own research, they have the best ethical process in the industry; I did a study last year on what they were doing in research. But not when they have people running around the platform, right? So I just think that we may want to start asking some questions about that, and I also think we have to think of the difference between algorithms that think I'm a fashionista and algorithms that want to make me a Nazi, or that are trying to decide what jobs I should get, and start to consider what's a consequential algorithm, and as a company, is there a different level of transparency, a different internal responsibility, about those kinds of algorithms. I mean, if I am, according to Facebook, an African-American fashionista with liberal politics... I own all three... I just think that maybe algorithms are not quite as good as advertised. So those are some of my concerns. There's another whole thing about content curation, but I'll come back to that.

And Shanthi, from your perspective?

So allow me first to take a step back and heap some more praise on Ranking Digital Rights, which I hope will not be objected to by anybody, but just to give it some context. I think even just a few years ago, as both Rebecca and Kevin alluded to in their remarks, the idea of what went into this mess, this great
sort of miasma of information that the companies interacted with to then provide these services, it was a black box. There was just no way to get into it. And what RDR, and before it GNI to some extent, have done is to try to quantify it to some extent: look, here's what we understand about the things that are important for human rights, and for, as Rebecca so nicely put it, making sure that the way we want to live our lives is matched by the platforms that we're living our lives on. So that's by way of saying that I think we now have tools, and Rebecca held up the stack of indicators before; these are not just indicators, they don't just go into the project, they're actually a compilation of all these evolving best practices and standards as we know them. So this in itself is just a valuable component. I think for me, if we get to the question of concern, it's: what are the next black boxes? What do we not know enough about? What do we need to unpack more in order to understand how to compile the next set of these, and to understand what the best practices are? For me, over the years, my unit of analysis has always been, by default, to look at governments, usually authoritarian governments, and their practices within their borders. But I've always understood that to be inadequate, because in fact the global information space is not governed only by national borders; it is an intersection of corporate policies and government policies, and that's where the action takes place. And so we need to be able to incorporate these two things more fully. And for me the worrisome part is where it hits. I think I would probably echo the privacy and surveillance components in the issues that I look at, and particularly with respect to China, there's been a big flip over to incorporating elements of surveillance into everyday life. And in those sorts of environments within China, there's very little civil society can really do to push back against that; there's not a strong rule of law environment. What do
we need to know about this emerging aspect of our lives, which I fear is not going to be confined to authoritarian environments but will be more broadly felt outside of them as well?

Rebecca, do you have... I mean, I know the full perspective is in the indicators, but... You know, it's funny, last week, before the index came out, I was talking to a journalist in advance to let them know, and this person asked me, is there any company worse than Facebook? Yeah, most of them. But I guess the point is, when you think about the set of companies we looked at, these are the world's biggest publicly listed companies. There's lots of imperfections, but they actually do care about what the public thinks of them. And all of them, I think it's fair to say, as with most big companies, have different factions within the company: you've got the privacy people, you've got the security people, you've got the marketing people, you've got the PR people, et cetera, et cetera, the money people. And people are competing for resources and the attention of senior management. And I hear from a lot of people in a number of these companies who are at middle management level who are like, this index is very useful, because I can tell my boss that we didn't do very well on this particular indicator and we need more resources and management priority to do better on this, because this really matters for our company. And so there are people that really care, I would say, in most of these companies; even with some of the companies that aren't doing so well, we've had some interesting conversations with people at the middle management level. But there's a whole set of companies out there, I mean, there's the internet of things, which we're doing some work on already, and I gotta tell you, if we took these criteria and applied them to many internet of things companies, scores like two and three and single digits
kind of thing, yeah. But it also gets even more complicated, because a lot of internet of things devices are actually compilations of several different corporate entities working together. You've got an operating system, or sort of information platform, or payment platform, different companies, and then you've got the hardware, and so on, and none of them are clear about their policies, and none of them are taking responsibility for much of anything if something goes wrong. So it kind of gets worse from here, really, which is one of the concerns. The other concern is, sometimes people come to me and say, well, why aren't you looking at the companies that provide networking equipment, or why aren't you looking at the companies that sell surveillance software to the Egyptian government or something. And that latter category, they're the arms dealers; they don't care about rankings, because otherwise they wouldn't be arms dealers, and they're not consumer facing, so they don't care about users' trust in their product. They might be interested in the Egyptian government's trust in their product, but that's different. And the network equipment layer, I think you probably need a different theory of change to really incentivize that layer of company. So I kind of like to think of this as proof of concept for a certain set of types of companies that care about their relationship with people, but we need to think more about what's the pain point for different categories of companies, and what sort of data we need to put in the hands of which types of actors, to get change for different types of companies.

And just on my part, as far as big concerns, picking up on something that Shanthi said about the sort of, I guess, habituation of people to surveillance through what gets incorporated into these technologies: I'm also concerned about that on the free expression side of things too, and the environment that we're in where a content host, a
social media platform, potentially could try to comprehensively apply its terms of service across every piece of content that is uploaded. So if you have, say, a policy against hate speech, it's technically very difficult, and doing it would probably involve vast amounts of overbroad or underinclusive censorship of content, but they have the means to affect any of the posts on their service. And this is a really different environment from how laws about speech have applied to people in societies before. If we have a law in the U.S., say, against issuing a true threat of violence against a person, there are a lot of things that actually rise to the level of a true threat of violence that are never heard by anybody in government or law enforcement, or turned into a case, or turned into a prosecution. There is a big gap between the number of times the law is actually applied and the amount of speech that it could potentially be applied to. And I think the shift from an offline environment, where the laws, the standards, the rules exist and they're applied to a fairly small percentage of the cases that actually merit it, versus the potential perfect application of rules about speech online, is just a societal shift that I think we're still really grappling with and working through the consequences of. It also raises a lot of questions around whether platforms' terms of service should conform with human rights standards substantively, and what all of the different consequences of that are. But we've got a lot to talk about, so we won't take you deep into that right now. I wanted to follow up with one question in particular, I think for Rebecca and Shanthi, about the Chinese companies in the index.
Baidu and Tencent. And Rebecca, you had mentioned how they actually both showed improvement over time, and I just wondered about your thoughts on companies like Baidu and Tencent operating in the Chinese environment. I think there's sometimes a tendency to just say, oh well, it's China, there's nothing to be done, but here you see companies actually improving over the last year.

I mean, they're not improving their transparency about government censorship demands, and they're not improving their transparency about government surveillance, which involves government authorities sitting in their offices looking at users' activities. They're not transparent about that, and that is not improving. But when it comes to disclosing, again, what's being shared commercially, what's being shared with other entities that are not governments, what's being collected, how it's being used, we're seeing a willingness to be more transparent and some value placed on that, and also security. In China, the Chinese public is very concerned about hacking and theft; it's a huge problem there, and so for a company that can demonstrate it's making real efforts to shield, to protect, its users against criminals, that's a real commercial incentive. So there are definitely areas. Again, I think we see this in a number of jurisdictions: if you're gonna push the company to improve, you have to do a bit of an analysis about, okay, here are the things we recognize require legal reform. And in China it might not be very possible to see legal reform anytime soon, but in a place like India, for example, you could see civil society and the companies getting together and telling the government, actually, there's no public interest reason why this law is preventing this disclosure, let's get it changed, and it's in the company's interest to do that.
So, but yeah, in China it's very interesting. And the other interesting thing is that one of our audiences is investors, and this is another reason why the Chinese companies care, because a lot of major investors are investing in Chinese internet companies; they have big holdings in Baidu and Tencent and Alibaba and so on. And so I've heard from investors who have told us that our data is actually quite useful to them for their calls, because as responsible investors they invest in Chinese companies. And they then have greater clarity about what they can raise with their Chinese holdings, where they can actually have a real conversation, and what is going to be less fruitful. So that's also quite useful.

Briefly, just to build on that: what we've seen is actually an interesting shift. It used to be, back in the day when Rebecca was looking at this many years ago and looking at these companies, that the focus was just domestic, within China. These companies are now huge. They're some of the biggest internet companies in the world, and they have expansion plans, and they are increasingly tying up both with outside investors and also buying stakes in other companies themselves. And so they will be part of this global ecosystem. So putting that into that global comparative context is important. And I applaud anything that will allow those companies to do something that will help protect user rights even a little bit more within China. That said, I think part of the challenge here is understanding what a lawful request to take down content means in China. I mean, that can have a lot of scope to it. And so I know for the purposes of the index it's focused on lawful requests, and that's important because there has to be a standard.
That's why I think there needs to be this overlay of actually understanding the close interplay between the private sector and the state within China, which is actually becoming ever tighter now, as well as the weak rule of law environment in which the companies operate. And because of the expansion that I talked about before, it has the potential to have global effects. So this is a great way to bring that into the conversation.

And before we open it up to audience questions: one theme that came up in your presentation this morning was the role of the Global Network Initiative, the fact that a number of the companies in the index are members of the GNI, and how that seems to really have had an impact on the kinds of disclosures that they make. So I was wondering, Leslie, as one of the co-founders of the Global Network Initiative, could you give us a little bit of background on what it is, what the conversation was like when it was really first getting started, and how that's shifted over the roughly 10 years that it's been in existence?

I think a lot of people probably know, sorry, I'm losing my voice, that the process to stand up GNI happened in the environment of the Shi Tao arrest in China, and a Congress here that was trying to pass laws that some of us thought were unworkable. It was US-based; we actually had European companies in who did not stay in. But the focus was very much on how you respond to government demands for censorship, government demands for user information, and surveillance. And I think what was very clear at the time, because we had some European NGOs who wanted it to be broader and look at commercial practices, was that we were not gonna get that step one if we tried to expand it. So GNI really was, and I think for the most part still is, focused on the relationship between these companies and what government demands of them.
What's happened in these 10 years? How many years since you wrote your book?

Yeah, well, it was kind of prescient, I thought, because a lot of people were saying, "sovereigns of cyberspace, that's a bit much. Rebecca, you're too negative."

Yeah, you're so negative. Well, I remember saying that to Evgeny once upon a time, and now I teach his book like it's a Bible. Things happened, and the thing that really happened was this extraordinary shift of power, a technological development that allowed what used to be "we're collecting this piece of data and we're gonna use it for this purpose" to turn into basically your data being currency, to be run through algorithms for many different purposes, almost all commercial. And that's what is so important about this project. Because there was an enormous innovation that came out of GNI: the entire concept of transparency reporting. It's not written into those guidelines or principles, and I think that was one of the most exciting things; people thought, okay, we gotta do this, how do we do it right? So to see it now, that the companies understand, the same way they thought the world would collapse if they agreed to half the things they agreed to at the time in GNI, that if they now extend this to some of their own practices, that may be a positive rather than a negative. So I think that the time is right, and I hope in the next iteration we can really talk about pushing further; content curation is my next big issue that I wanna get them into.

Well, I wanna make sure we have time for questions. I'm sure there are plenty of thoughts out there in the audience. I don't know if we have mics... yes, we do have roving mics, so, any questions?

Hi, it's Andrew Renz from the Internet Governance Lab at American University. So I need to ask some factual questions of Rebecca to frame a particular question, which is: if I understand correctly, looking at MTN and Vodacom, you would have looked at the operations throughout Africa.
But Vodafone is based in the UK, and so it seems Vodafone's a relatively good actor, but MTN, which in Africa is a fairly evenly matched competitor, is not such a good actor, and MTN's based in South Africa. So when you mentioned that the law affected how they acted, it's the law of the home base that we're talking about here?

Right, yeah, so just to clarify: with our methodology for the telecommunications companies, we looked at two different levels, basically because we don't have many millions of dollars to hire people all over the world. What we ended up doing is, for each telecommunications company, we looked at the group-level policies. So the transparency reporting and the human rights commitments, the governance indicators, are basically all at the group level, and the transparency reporting indicators are really looking at global transparency reporting. But otherwise, for the commercial privacy and security indicators, and also the handling of government demands around content and information flows, the rest of those indicators, we looked at the home country operating company, because a lot of these companies have, in some cases, a couple dozen different operating companies in different markets, and because telecommunications companies are so physically localized, their policies differ in every single operating market, which is different from the internet platforms in that sense. So with MTN we looked at their group level and their South Africa operating company, and for Vodafone we looked at the global transparency reporting and governance commitments, but otherwise at their UK operating companies, both fixed line and mobile.
So that was just a methodological necessity, because there's just no way to properly examine and compare, and then average up in any kind of meaningful manner, scores for every single operating company. And we actually looked at one point at whether there is some way of doing spot checks for specific other markets, and methodologically it just didn't work in a way that was gonna make sense. But that's why it's really important, and I'd love to see more of this: Internet Sans Frontières, which is an NGO that operates largely in Africa, has recently done a report using our methodology looking at telcos in several African markets. And so that's why it's important that people take this and look more deeply at the operations of some of these companies in specific markets. Thank you.

Let me just add one thing to that. So Vodafone is among the telcos who just joined GNI, and they are gonna be assessed on their global operations. And part of that assessment is looking at specific events that happened in difficult markets. So if you happen to know of any of those that you would like to share, quite seriously, please do, because sometimes it's hard to dig them up; I mean, some of them are big. This is the first time they're gonna go through that kind of third-party assessment, and that's always a part of it: picking some set of difficult cases and examining how they're handling them. Thank you. Sure.

Hi, Nina Gardner, adjunct professor at Johns Hopkins SAIS. Thank you very much to all of you for all the work you're doing, and Rebecca in particular, because this Ranking Digital Rights index is extraordinary. I have two questions. One to Rebecca specifically: now that you've done all this huge amount of work, one of the things that came out very clearly in the hearings in the last couple of weeks is how poorly prepared our congressmen and senators are on even understanding what these issues are.
So, one part of my first question is: are you planning to do a little bit of preparation for all of these guys and women, to understand what questions to ask, so that we can actually move forward here? Because Zuckerberg was having a field day there. He got away with answering nothing.

That's a good question, and I'd love to hear some of your thoughts about legislative preparedness. I mean, I have spoken to some congressional staffers. That's not the same as educating members, but that's the longer game. Actually, just to give an advertisement for some other colleagues, there's a program called TechCongress, where they're actually placing technologists on the Hill, working in members' offices. And I think that's very important, among the very many things that need to happen. But if you're thinking about the level of education of this particular legislative body, think also about all the parliaments around the world that are grappling with these issues and trying to figure out how to regulate. I mean, I'm not sure how the Indian parliament is doing on these issues, or any number of these other governments that are having major impact on the digital lives of millions of people as they're grappling to regulate this.
I mean, part of the problem, I think, is that existing regulatory, legal, and judicial frameworks in much of the world are not fit for purpose in dealing with these issues, especially because they're cross-border. And you often have legislation passed in one country, let's say the legislation that's currently on the books in Germany, forcing platforms to take down content very quickly without any kind of judicial review. And it might make sense, maybe, in a highly democratic country with great rule of law, although even then it doesn't really make sense, but the implications of this for internet users around the world are very negative. And the people who passed the law don't answer to the rest of the world; they just answer to German citizens. And so we have a real problem with regulators in many jurisdictions. I had a conversation with a member of the European Parliament not too long ago where I said, you gotta consider what you're doing and how it's affecting people in the developing world and authoritarian regimes, and he basically said, I'm paraphrasing only slightly, I don't care, that's not my job. And so then we're sort of counting on, and this is one reason why GNI exists, we're sort of hoping that companies will push back against this mess, but then the companies of course have their own commercial interests, and there we are. When they come together with everybody else's interests, it's very powerful; when they don't, not so much.

So I've been in this space, in this internetty thing, since 1996, and in all that time, and in all the various places I've been, we would always, at a retreat, give an award, a private award, to the two people who seemed to know something in Congress about what we were doing.
And I have to say, in those hearings I thought it was worse. I mean, over the years, particularly in the House, where the members are younger, we'd managed to build up a group who were at least educable, which obviously the senators weren't. But having also lobbied for a really long time, the truth is it is probably ultimately more important to have smart staff, and I don't know what happened to smart staff at that hearing. The Internet Education Foundation, which was actually a spawn of CDT back in the day, continues to have never-ending events to educate on all these complicated issues. You know, I'd say some people oughta get out of Congress because they're too old, but I'm gonna be too old soon, so I don't wanna go there, you know? But no joke, two people, and one was always Ron Wyden; so over 25 years, we just keep substituting the second. I don't know what to say. I mean, we used to have an Office of Technology Assessment. We used to actually feel like it was important to ask somebody. Apparently they were too serious; you couldn't get a quick response. Why did Gingrich get rid of them in the 90s? I don't know. But we obviously need serious entities to advise Congress, and you saw the same thing back with SOPA. They didn't have a clue what they were talking about. Not a clue.

The Stop Online Privacy Act. Yeah, the Stop Online Privacy... I mean Piracy. Piracy. Yes. There was a secret acronym for that bill. Yeah, I don't miss any of that.

But one of the big questions too is just what level of technical understanding do we actually need legislators, versus their staff, versus regulators, to have? And I worry there is a little bit of a tendency in the tech community in general to call out every legislator on every misstatement or misstep that they make in describing what are, in fact, some fairly complex technologies.
There are a lot of different issues they have to have a little bit of knowledge about to be able to do their jobs. And that's where I think something like RDR is so helpful, because you can show graphics, right? If you can show something visual, that's a much easier entry point to thoughtful conversations than a 20-page white paper. To be able to say, wait, this shows me that Google is at the top of the US companies and Apple is at the bottom, why is that, right? That gives you an entry point into real conversations that will get very technical, that will get into real detail, but it immediately contextualizes things in a way that helps people understand what is relevant to them. And I think for everybody in our space, framing it in terms of the consequences for actual people, as the entry point to what are ultimately some really complex technical topics, is going to be a much easier way for legislators to engage. And can I just say, because I sometimes have a foot in this space and sometimes a foot in this other, broader space of international affairs or looking at democracy and so on: I think that sometimes in this space we get too caught up in thinking of the internet conversation as a screen-and-keyboard thing. And part of getting both lawmakers and the broader public aware of these issues is making sure people understand, broadly, that this is not just a screen issue; this is about how you want to live your life. I think this is the consequences issue you're getting at. And that goes beyond the companies in this index, because increasingly these issues will be relevant to so many other companies. This is essentially a way of staking out the parameters of a discussion that can then be expanded beyond these companies. 
Just as a brief example, I noticed that Airbnb has now agreed to share information within China about people who use its service. Okay, that's not a tech company per se, but that is incredibly relevant, and making things relevant in that way helps people understand: this is just one way of understanding it, but we need to expand our frameworks and bring the broader public in as well. And the other thing, too, just to build on what Shanti's saying, which is so important, is that this is increasingly going beyond what we traditionally consider the tech industry: automobile companies, consumer appliances. All of them are increasingly going to need to consider all of the questions in our index, and not just privacy and security but speech. A hacker a year or so ago posted a picture on Twitter of an internet-connected refrigerator with a screen that had been hacked so that somebody was running Pornhub off of it. So now you have expression issues in relation to home appliances, and I can bet you that General Electric and Electrolux have not thought about these issues at all, if they're even thinking about the privacy and security stuff beyond what they have to do for GDPR. Yep. Could I just ask, how are you engaging with investors, for example, to ask these kinds of questions? Because the pictographs are very useful for numbers of people who are not thinking about these issues, and there's nothing like investors asking those nice little questions. Yeah, well, I've been talking to investors. In fact, I'm speaking at an investor event in London on Monday, I have a few other conversations going on, and I've just recently written something for an investor publication. And Ranking Digital Rights actually put out an investor brief last fall, which is sort of a translation of what we're doing into investor thinking. 
But the primary argument is this: when you get beyond the die-hard socially responsible investors who have cared about human rights for a long time, and who more recently, as part of GNI, have been concerned about surveillance and censorship, investors thinking about these issues have traditionally only considered the cybersecurity issues, the data breach issues and the theft issues, right? So those few indicators that we have in our privacy section are what the traditional investor considers material to the business, to the value of the company. And the argument that I've been making, with the help of some of the people who advise the project who are investors, is that cyber risk is actually much broader than breach and theft. It's the damage to your brand, and basically anything that can cause harm to users, both collectively and individually, is a risk to your business and therefore a risk to your investment. Which means that everything in this index is at least potentially material, relevant to investors, and investors need to be demanding that boards oversee risks across all of these things. So that's the argument I'm trying to make, and I've got a bunch of presentations that I'm going to start giving, beginning next week, to investors in a number of places. And we'll see. Facebook's value seems to have... its shares are down 30 points, but its earnings are up. So, I mean, it takes a long time. How long did it take to get investors to care about pollution or climate change or even slave labor, you know? It has taken a long time, and there are some individuals in this room who've been working on this for decades. You don't get investors to get it overnight. We're just starting on this one, and the light bulb is starting to turn on over more people's heads, which is a good thing. 
I'm starting to get calls and emails from people who were not calling and emailing a year ago. So, yeah. But I don't know if anybody else has anything to say. Is there time for one last question? If there's anyone else out there. Hi, Sharon Bradford Franklin with OTI here. So, there's been a fair amount of conversation about how, just as you were putting this to print, there have been some new announcements by Google and Facebook, and we've obviously had an increase in attention from the general public, who don't focus on these issues the way we all do. And Rebecca said that these 22 companies do care. So I'm curious, from not only Rebecca but all of you: how optimistic are you that we will see great improvement before the next index comes out? I'll let you guys go first. I think I am optimistic that if there's lots of attention to this, to content transparency, content moderation and takedown transparency, we will see some other companies move in that direction. I mean, there are few companies that have as much content to take down as Google and Facebook. But I feel like that's doable, and that it's going to be a best practice five years from now. I feel really good about that. I don't feel really good about privacy, which, if you unpack it, is a lot more than privacy; we just call it privacy. And I mean, I would actually be optimistic about the fact that we are in this moment. You mentioned the moment that led to the formation of GNI, which was the Yahoo case, you know, and that really galvanized attention. We are in that moment again. And after that moment, you saw real change. So this is a moment that I think everybody in this community, within civil society and broadly, has to take advantage of and push. I'm optimistic. Seventeen of the 22 companies in the index made improvements between last year and this year, so I'm positive we'll see just as much improvement. But I would also caution about what this index is. 
This index is the floor, not the ceiling. This index is the bare minimum, you know, the easy stuff that they have no excuse not to be disclosing, for the most part. This is not the hard stuff; the hard stuff is the business model stuff, which isn't in the index. So one thing to caution is that even if everybody gets 100 on this, there are many fundamental problems that will not be solved. But at least they can do this, right? And I would just echo that I'm optimistic about more transparency on content moderation, since the companies are doing it voluntarily, but probably pretty pessimistic about where government regulatory efforts around how they do content moderation will go. So we may be on a path where we see the companies really filling up those bar charts and really increasing their scores on the sorts of things they have leeway to do, while also, I fear, operating in environments where they are much, much more constrained and restricted in how far in favor of their users' human rights they really can be. So with that, I think that is all of our time for questions today. Thank you all so much for coming. Rebecca and team, thank you so much for all of your work on this.