 Everybody, we'll let our panelists have a seat. Good morning. Thank you for joining us. Welcome to New America. My name is Cecilia Muñoz. I'm a vice president here, which mostly means that I have the pleasure of working with some extraordinary people, including some of the people that you will be meeting this morning. I feel incredibly fortunate to work at a place where you can arrive as a fellow and, as Rebecca MacKinnon did, work on an award-winning book, and then work on a really audacious idea, which is asking the question of how we might achieve an internet that supports and sustains human rights. The result of Rebecca and the team that she assembled chewing on that idea is the Ranking Digital Rights report that the team is releasing today. This is the fourth Ranking Digital Rights Corporate Accountability Index. And we've already seen that it's having some real impact, as you will hear. By the second RDR Index, more than half of the companies ranked by the first index had improved disclosures and policies affecting privacy and freedom of expression. By the time we'd evaluated companies for the third RDR Index, 17 of the 22 evaluated companies had made further positive changes. And between last year and this year, 19 companies made meaningful changes. In many cases, those companies told us that at least some of what they were doing was a result of the index. They are aware of it, they are following it, and they are thinking about their policies and their behaviors as a result. So this is what we were hoping for: to increase awareness on the part of the public, increase awareness on the part of the companies, and really advance something that multiple New America projects work on, which is addressing some of the most challenging governance and policy problems of our time, particularly as they relate to the way that we intersect with technology. So it gives us great pride to be releasing this report today. 
And with that, let me introduce Rebecca MacKinnon. Good morning. Thanks so much for coming. And thank you to New America for having incubated this project, and to all of the Ranking Digital Rights team who've worked on it; we finally have more people working on it than I can reasonably list from the podium, which didn't used to be the case. And our team is around the world. We have a few people in the room here; raise your hand if you're currently working on RDR. And we have an alum or two as well. And thanks also to those members of the team who are watching the webcast from Europe, and researchers around the world who have contributed to the index, who are watching from all over the place. On your chairs, we have a four-page summary of our key findings, top-line results, and recommendations. On the website, there's a much more detailed report, analysis, and data, so you can go through the data and see the results of every single company on every single indicator and so on. And I'll talk a bit more about the overview and the basic findings. So as Cecilia mentioned, the point of the Corporate Accountability Index is that we need an internet, globally, that sustains and supports human rights. And we're not going to have an internet that sustains and supports human rights unless companies are actually operating and designing their products and services in a manner that's consistent with human rights. That's kind of obvious, right? But actually benchmarking companies, coming up with the indicators for how you evaluate the extent to which companies are respecting human rights and the extent to which they're improving where they need to in order to get us to the kind of internet we want, takes some doing. It takes a lot of consultation with a lot of people to develop the indicators. For the 2019 index, we evaluated 24 of the world's most powerful internet, mobile, and telecommunications companies. 
We can't, with our limited resources, cover every single company that is affecting people's rights around the world. So we chose a selection of companies that, when you add them up, are touching the digital lives of the majority of internet users on the planet, and as the project continues, we hope to expand. You'll also see on the website that you can dig into the report card for each company and so on. So this is the index, and you can see it more clearly on your four-pager. We have the companies divided into the internet and mobile ecosystem companies and the telecommunications companies. This year, Microsoft came out on top of the internet and mobile ecosystem companies, and Telefonica, a Spain-based multinational telecommunications company that operates all over Latin America, came in first in the telecommunications category for the first time. And as you'll see, our methodology is divided into three categories. The first category is governance: does the company have basic commitments to freedom of expression and privacy? Do they carry out oversight? Is there board oversight over risks to users' freedom of expression and privacy? Are human rights impact assessments being carried out? Are those assessments comprehensive across all the different ways in which companies might be affecting freedom of expression and privacy? Is there grievance and remedy? Is there stakeholder engagement, et cetera? Then for freedom of expression, it's not about which platform is the greatest free-for-all for anything anybody wants to do. We look at freedom of expression in the context of human rights norms, Article 19 of the Universal Declaration of Human Rights. There are limitations to freedom of expression within the human rights framework: it is not consistent with human rights norms to use an internet platform to organize a pogrom to massacre your neighboring village, and it is acceptable under human rights law to limit such speech. 
However, any limitations need to be necessary and proportionate to the harm. And so what we are looking for in terms of companies' respect for freedom of expression is, first of all, that they are transparent and clear about what their rules are and how their rules are being enforced, that they have appeals mechanisms that are fair, and that they're clear about all the other ways in which people's speech might be affected or manipulated, or their access to information might be restricted, through platforms. So we want transparency about what types of demands governments are making to limit speech, and what types of demands other parties are making, say copyright holders, or people asserting a right to be forgotten, and so on. We want data, so you can go through the indicators and see what we're looking at there, and I'll talk a bit more about the results. And with privacy, we're looking at three buckets of things. One is what in Europe tends to be called the data protection related issues: the entire life cycle of people's data, what's being collected, with whom it's being shared, how it's being used, how much control people have over the use of their data, et cetera. We're also looking for transparency about the ways in which third parties might demand or access data, including obviously governments. So we want transparency reporting about government demands, and transparency about company policies for handling government demands. And the third bucket of issues relates to security policies and practices. We wanna see verifiable evidence that there are security audits, strong use of encryption, et cetera, and efforts to enable users to secure their data and understand under what circumstances their data may be secure or not, and whether the company is making best efforts. So those are the three different categories, and I'll walk through a few different results. 
This is the overarching score, which is basically the average of 35 different indicators. Obviously, when you drill down, you get very different results depending on the specific indicator, and I'll talk more about that. There are essentially three bands, and you'll see the top score, 62%, is not, academically, a really great score. So when people say, oh, somebody did great: no, that's not great, you're getting a D, and then everybody else did worse. Think about it in that way rather than saying, oh yay, Microsoft and Google are great. That's not what we're saying. We're saying that they did better than the others, who did worse. Also, this year we added two new companies. There had been 22 companies before. We added two this year: Telenor, which is based in Norway but operates around the world, including Myanmar, and Deutsche Telekom, which is based in Germany but also has subsidiaries all around the world. All the other companies have been evaluated in the past. This is the year-on-year improvements, and very interestingly, you'll see Telefonica made a lot of changes in the past year, and we know that a lot of those changes were responding to the evaluation of them we did previously, in addition to some other things that they're involved in, like the Global Network Initiative, which we'll talk a bit more about later. Interestingly, the two Chinese companies in the index and one of the Russian companies also made significant changes. The Chinese companies did so primarily in response to data protection regulations in China that don't deal with government demands but deal with consumer protection issues, and so they actually did clarify a lot of information about how user data is collected and shared and under what circumstances people can control it, so that's a very interesting trend to see. 
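The overarching score described above is, as noted, essentially an average of 35 indicator scores grouped into the three categories. As a rough illustration of that kind of aggregation (the actual RDR weighting and indicator structure are more involved, so the indicator names and numbers here are hypothetical):

```python
# Hypothetical sketch of an RDR-style score aggregation: each company is
# scored 0-100 on indicators grouped into governance, freedom of expression,
# and privacy categories, and the overall score averages across indicators.
# Indicator groupings and values below are invented for illustration.

def category_score(indicator_scores):
    """Average the 0-100 indicator scores within one category."""
    return sum(indicator_scores) / len(indicator_scores)

def overall_score(categories):
    """Flat average over every indicator in every category."""
    all_scores = [s for scores in categories.values() for s in scores]
    return sum(all_scores) / len(all_scores)

example_company = {
    "governance": [70, 55, 40],          # e.g. commitments, oversight, HRIA
    "freedom_of_expression": [50, 30],   # e.g. rules transparency, takedown data
    "privacy": [65, 60, 80],             # e.g. data handling, security practices
}

for name, scores in example_company.items():
    print(name, round(category_score(scores), 1))
print("overall", round(overall_score(example_company), 1))
```

Note that a flat average over all indicators is not the same as averaging the three category scores when the categories contain different numbers of indicators; which of the two a ranking uses changes how much each category counts.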
Yandex, one of the Russian companies in the index, seems to have responded not to anything good the Russian government did in relation to the internet this year, but to some of the things we evaluated them on last year, so that again is very interesting. And the two companies based in the Gulf, Etisalat and Ooredoo, actually declined. Ooredoo also operates in Myanmar, in addition to the Middle East and elsewhere, and Etisalat similarly operates in the Middle East and Africa. Samsung also declined, for mysterious reasons, but they don't seem to be paying much attention to these issues. So I wanna talk a bit about the governance scores, because governance, while it's a bit more amorphous to the community who work on internet rights issues, is really key to a lot of other issues, particularly emerging issues where it's hard to know what the standards should be. In that category, which you can drill down into in the index on the website and in the report, we're looking at commitments that the company is making, human rights impact assessments, board oversight, stakeholder engagement, whether there are internal whistleblowing practices, whether there's internal training on these issues, grievance and remedy, and so on. Interestingly, the companies in the Global Network Initiative all continue to score better on governance than others. Not that they don't have gaps, and I'll show you some of the gaps in a moment, but because of their commitments with the Global Network Initiative, which is primarily focused on principles for freedom of expression and privacy that deal with government demands, there are at least clear company commitments on both freedom of expression and privacy, and frameworks around implementing those commitments, that come out in our assessment. And just pointing to a 
couple other US companies: Twitter's governance does much less well, and on specific indicators even worse, and Apple shows a lot less evidence of any governance and oversight mechanisms, particularly around freedom of expression, and I'll show you more on that in a moment. Looking at one indicator in the governance category, which is the comprehensiveness of human rights impact assessments: again, I mentioned Apple and Twitter; they show very little evidence of any kind of human rights impact assessment. And you see real variation among the companies that are in the Global Network Initiative in terms of how comprehensive their human rights impact assessments are, because we're looking for them to be assessing the risks to users' freedom of expression and privacy not only in relation to government demands that they receive, but also in relation to their business operations: terms of service enforcement, business model, deployment of new technology, and so on. Drilling down into that indicator, we have one element that's looking for whether companies conduct any kind of impact assessment on their terms of service enforcement, which of course is a big issue as companies come under pressure to police problematic speech in a lot of countries. Only three companies disclose any evidence that they're conducting human rights impact assessments on their terms of service, either the terms themselves or the enforcement process: Microsoft, Telefonica, and Verizon Media, formerly known as Oath, formerly known as Yahoo. And in terms of companies that do any kind of impact assessment on deployment of algorithms, automated decision making, artificial intelligence: again, only three companies, Telefonica, Microsoft, and Deutsche Telekom. So some other companies that are in the news a lot about the use of AI and algorithms and automated decision making to police content, primarily Google with YouTube, and Facebook and Twitter, are nowhere to be seen 
in terms of whether they're conducting any kind of human rights impact assessment on their use of AI. On targeted advertising business models, nobody discloses that they're doing any kind of human rights impact assessment. Given what happened with Cambridge Analytica, and many other things that are happening with disinformation and the manipulation of data based on targeted advertising business models, the fact that there's no evidence of impact assessment around those business models is highly problematic, and we can talk more about that as we go along. One other thing to note: this graph breaks down all six indicators in our governance category by what we're looking at, but each score is also divided into how they're doing on freedom of expression related governance and how they're doing on privacy related governance, and you'll see the darker green is governance of privacy risks. Across the board, companies are doing more to govern privacy related risks than they are doing to govern freedom of expression related risks, so that's a trend across the board. We also have this fun graph, which is basically the gap: for each company, how big a gap is there between their governance of freedom of expression and their governance of privacy? The company with the greatest gap was Apple, in that they showed a lot more evidence of governance of privacy risks than governance of freedom of expression risks, and the ones in green are the ones that actually show a bit more evidence of governing freedom of expression risks than privacy. The ones with zeros either show no evidence of anything whatsoever, in the case of Etisalat, MTN, and Ooredoo, or are fairly even, in the case of Long's and Tolendil. But in most cases the emphasis of these companies is much more on governing privacy risks than freedom of expression risks, which, given how much pressure there is from governments around the world 
and all kinds of other stakeholders to deal with problematic content of different kinds, from terrorist extremism to disinformation campaigns to all kinds of things that most of us don't like, is a problem. How do you deal with this in a frame that doesn't also result in arbitrary censorship of activists, of journalists, et cetera? How do you make sure that content is being policed in a way that's accountable and consistent with human rights norms? The fact that the governance of freedom of expression risks is much weaker than that of privacy risks is problematic, and there's really no excuse for companies not to be governing their freedom of expression risks as robustly as possible. That, one should think, is a real counterweight to the pressure they are getting from governments and others to censor more content. So for those of us who work in advocacy, pushing companies on governance of freedom of expression risks is certainly something I would suggest we do a lot more of, and hopefully our data can help with that. Now, going into our freedom of expression category specifically: again, we're looking at companies' disclosures of their policies that affect freedom of expression. In other words, what their rules are; data about what they're taking down and what they're not, and in response to what types of requests, or whether it's due to their own terms of service; and all the ways in which somebody's content might be restricted, or their access to a service might be restricted or cut off in some way. That's what we're looking for here in terms of transparency. I'll note that Facebook's transparency is not as good as some others'. Apple's freedom of expression overall score is not so great, and as you'll see, the companies that fall below Apple are either telecommunications companies or companies that, other than Samsung, are not based in particularly democratic countries, so that's something to note. Looking at a basket of indicators, this is sort of the 
average of several indicators that look at transparency about how companies handle external requests: a government asking a company to take down posts or restrict accounts, copyright holders making a notice-and-takedown claim, right-to-be-forgotten demands, or, in the case of telecommunications companies, network shutdown demands. Transparency here actually did not improve much, and for a number of companies it actually declined. We talk more about that in our report: companies, in the face of government pressure, are not reacting with greater transparency about the demands they're getting; they're forgetting to improve and instead are in damage control mode. Just pointing out here, in that basket of indicators, some highlights in terms of where companies fall on the spectrum: I would like to point out Kakao, the South Korean mobile messaging and social media company, which does quite well on some of these indicators and in fact takes first place on some specific ones. That is, I think, one of several counterarguments we can make to those who accuse us of perpetuating Western values: actually, users all around the world, and we'll talk about this more in the panel, care about these things, and are able, both through policy and through their consumer activism, to exercise pressure on companies to respond, in places where that is politically and legally possible. So that's also something to note. On one particular indicator, we're looking for transparency reporting about the volume and nature of content being removed when enforcing terms of service. The first time we did the index, in 2015, nobody scored anything; there was zero disclosure, and the companies' response was: it's not reasonable to expect us to disclose this, it will only allow people to game it. We 
heard this, of course, about other transparency reporting back in the day; there were all kinds of reasons why this was impossible. But fortunately, thanks to pushing from some people in this room and elsewhere, companies began to improve transparency over time, and now all four companies, to varying degrees, publish data about what is being removed when they're enforcing their terms of service. It remains uneven, it tends not to be across the board, but we're moving in the right direction. Some of our colleagues at OTI at New America are also working on some of these issues, in terms of what ideal transparency reporting in that area should look like. Another thing we look at is transparency about network shutdowns. Telefonica and Telenor actually reveal a fair amount of information about the government demands they receive to shut down access to networks in countries where they operate, but it's very uneven, and there are quite a number of other companies that reveal very little. We have a big problem in a lot of jurisdictions where governments don't allow telcos to report to the public on what's happening, and this is another problem with law around the world, even in a lot of democracies: preventing companies from being transparent. 
In the privacy category, Deutsche Telekom, a new addition to the index this time, got the high score overall. Apple also did very well, despite not doing so well in the other categories. I would note that AT&T performs less well than a number of others; it certainly performs less well than Deutsche Telekom, to put that in the spectrum of companies across the world. The reason I'm highlighting Orange here is that they're based in Europe, and for telcos we actually looked at their group-level policies and then their home-country operating disclosures. You would expect that a European telco would do super well given the GDPR, but we look at a lot of things beyond the GDPR. We're looking for companies to disclose what they're doing with users' data to users, not just to regulators, and the GDPR focuses more on disclosing to regulators. We're also looking for disclosure around surveillance and government demands for user data, which the GDPR obviously doesn't cover at all. So the GDPR is great, but it's not the holy grail; it's not gonna get us entirely there. That's an interesting thing to look at. These are the basket of data protection indicators, and the lighter color is the 2018 average score for each indicator, looking at things like collection of user information: do they disclose the purpose for collecting and sharing user information, what are they disclosing about retention periods, et cetera. Most companies improved on all of these indicators, but given that the GDPR came into force about a year ago, you'd think there'd be greater improvement than there was. So again, we welcome the GDPR, but we think there's a lot more to do beyond it, and that's something I think we need to think about in the US context as well. Now, this one indicator is what I like to call our Cambridge Analytica indicator, in that we're looking for how much companies disclose about users' options 
to control how their information is used and shared. Here again you've got Deutsche Telekom at 63 and Orange at six, in terms of transparency to the public and to users; that's the variance just among European companies, with AT&T behind a lot of other telcos, so they could certainly be doing a lot better in the US. And I'll point out Facebook here: among internet and mobile companies, last year Facebook came in dead last place. They've improved; they are now ahead of Baidu and Mail.Ru. So they've improved somewhat, but if you hear folks from Facebook talking about how much they've improved since last year on this issue, take it with a huge grain of salt; take a look at our data in terms of what was a real improvement versus what was just semantic. In terms of recommendations, we have much more detailed recommendations throughout our report and on our website; even the recommendations in this four-pager are more detailed. But one very key thing is that companies need to be serious about identifying what the risks are and conducting due diligence across the board on all the risks that users face to their freedom of expression and privacy. We need grievance and remedy mechanisms that are better. I didn't drill down in this talk into the grievance and remedy mechanisms, but overall those are very poor, and particularly when you have governments putting pressure on companies to remove more content more quickly, you need very effective grievance and remedy mechanisms. And also on privacy: if you're collecting data and using it for various purposes, even if you're claiming to be very responsible and transparent about it, if you do not have solid grievance and remedy mechanisms, that's hugely problematic. Again, we need transparency with users, and users need to understand all the ways in which their information environment is being affected by the companies and through the companies. You know, that's what good should 
look like. As a baseline, that's not gonna take us all the way to an internet that sustains human rights, but it's at least gonna take us to a place where we can hold companies accountable in a more effective way. It's pretty clear that just adhering to law is not gonna get us there, and getting laws passed is always gonna be an imperfect process, so we need to push companies to go beyond legal compliance as their goal. The companies that do best in the index are going way beyond legal compliance, in their home markets and globally, and really thinking about the human rights impact, not just "am I gonna avoid getting sued or getting fined." And we need to see more innovation. We should commend Facebook for talking to stakeholders and trying to figure out new ways to address content moderation; we'll see how far that goes and how real it becomes. But we do need to see companies innovating to try and address some very, very hard problems around the global governance of data and speech, because our current legal mechanisms, and I think technical approaches, are not fit for purpose; they're not gonna get us there. This year we have much more detailed recommendations for governments in our report as well, given that governments, we felt, could use some more recommendations. Laws need to uphold human rights standards; that's sort of a no-brainer, but I don't think you can remind governments of that enough. There needs to be robust oversight over implementation, over governments' enforcement of their laws, to ensure that it's not abused, both on surveillance as well as on any kind of content removal on the freedom of expression side. We need governments to be transparent about what they're demanding of companies, and we need governments to make sure that their laws enable and require company transparency. Right now that is not the case, even in a lot 
of democracies, and it's a problem. If you're gonna pass laws, particularly laws that relate to speech, start with corporate governance: start with requiring risk assessment, start with requiring board oversight over the harms that companies might cause to users. That's a much better place to start than saying you have to take down problematic content within 24 hours and having companies then over-censor to a huge degree. So we'd like to see governments paying more attention to governance of companies' systems and oversight, rather than focusing much further downstream, at the result of poor governance and poor design, and we'd like to see them requiring and ensuring access to remedies: not only requiring that companies have remedy mechanisms, but making sure that appropriate legal remedies actually exist and are available to normal people. That, again, is a huge problem in most jurisdictions presently. So, what's next for Ranking Digital Rights, and then we'll hand it over to the panel. We're actually not gonna produce an index in 2020; our next index will come out at the beginning of 2021, and we're already starting work on developing new indicators to add to the index, looking at targeted advertising business models and the extent to which appropriate due diligence, transparency, and good practices are taking place around them, clearly a big human rights issue, and also at both transparency and accountability around algorithms, machine learning, and artificial intelligence, and the impact these emerging mechanisms and technologies are having on speech and privacy around the world. We need to add more indicators into the mix, and we also will be adding some more companies, at least Amazon and Alibaba, which have a slightly different structure and business scope, so we have to tweak the methodology a bit to add them in. So there will be some new companies added as well. And a couple other things to note: obviously we're only covering a couple dozen companies, but we welcome 
people to take our methodology, which is open, and evaluate any other companies you want. In fact, in New York last year The New School took some of our indicators and evaluated a bunch of ISPs in New York, and found them not doing too well. Because we only have one US ISP in our ranking, we're not able to do a deep dive on the US telco landscape, but that shouldn't prevent others from taking our methodology and doing something with it. A Lebanon-based NGO called Social Media Exchange took a part of our methodology, looking at privacy in particular, and evaluated a set of mobile operators across the Arab region, and again found some fairly shocking results: companies that in Europe have pretty good transparency, comparatively speaking, are much worse when they're operating in the Arab region. That was a finding that they've been trying to do some advocacy around. So we welcome people to take our methodology and hack it and use it however you find most effective for your own purposes. Another thing that we're finding, and that we're hoping to expand our work on, is that increasingly investors are starting to use our data and our framework to ask questions of companies, and to engage with companies to improve their governance, their transparency, and other practices. There's, as you know, a growing set of investors concerned about a whole range of climate, social, governance, and human rights issues, who draw upon data sets from a lot of different research organizations, and so we're trying to help investors use our data as a tool to push for better practice and disclosure by companies as well. So we're hoping to increase our work in that area too. 
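Since the methodology is open, a third-party re-scoring exercise like the New School or Social Media Exchange projects described above essentially boils down to picking a subset of indicators, scoring a new set of companies on each indicator's elements, and averaging up. A minimal sketch of that workflow (the indicator names, elements, and scores here are invented for illustration, not RDR's actual indicator set):

```python
# Hypothetical sketch of reusing a subset of an open ranking methodology:
# choose the indicators relevant to your study (e.g. privacy only), score a
# company's disclosures 0-100 on each indicator's elements, then average.
# All indicator names, elements, and scores are invented for illustration.

INDICATORS = {
    # indicator id -> the disclosure elements it is scored on
    "P1_privacy_policy_accessible": ["easy to find", "plain language"],
    "P2_data_collection_disclosed": ["what is collected", "why", "retention"],
}

def score_indicator(element_scores):
    """Each element is scored 0 (no disclosure) to 100 (full); average them."""
    return sum(element_scores) / len(element_scores)

def score_company(per_indicator_scores):
    """Average the indicator scores across the chosen subset."""
    indicator_totals = [score_indicator(s) for s in per_indicator_scores.values()]
    return sum(indicator_totals) / len(indicator_totals)

# A researcher's raw element scores for one hypothetical mobile operator,
# keyed by the indicator ids chosen from INDICATORS above:
operator_scores = {
    "P1_privacy_policy_accessible": [100, 50],  # findable, but legalese-heavy
    "P2_data_collection_disclosed": [50, 0, 0], # partial list, no purpose/retention
}

print(round(score_company(operator_scores), 1))
```

The design choice worth noting is scoring elements rather than whole indicators: it makes the result reproducible (another researcher can check each element judgment) and lets partial disclosure earn partial credit, which is what makes year-on-year comparisons like the ones in this report meaningful.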
So a lot to do, and we look forward to collaborating with a lot of other people in the space. With that, I will stop and welcome Ivan Sigal and the rest of the panel, whom he will introduce, and we'll have a bit of a conversation also with all of you in the room. Thank you. And I should say, Ivan Sigal, I didn't introduce you other than your name; he is Executive Director of Global Voices, which is an international citizen media community. Hi everybody, good morning. Nice to see some known faces in the room. I'm going to introduce everybody else on stage briefly. We're gonna talk for about a half hour, 25 minutes, and then we're gonna open up the conversation to the room. So to my left is Dhanaraj Thakur, the Research Director for the Web Foundation; then Jason Pielemeier, the Policy Director for the Global Network Initiative; and last, Amie Stepanovich, the Policy Manager for Access Now. I might kick off with a quick question for Rebecca to start. I had lots and lots of thoughts and lots to chew on with this really, really comprehensive report, and the more you do this, the more I recognize its power. There seems to be a really interesting cumulative effect over time that's starting to become really impactful, and I'm really curious to know if you could talk a little bit more about the relationship between the index and the companies and their behavior: both the companies you measure, and also the larger ecosystem of companies that you don't measure, whether there are any kind of knock-on effects that you're seeing. That's a great question. So we do engage with the companies, and we actually started engaging with the companies we thought we were likely to rank long before the index even really got figured out. 
So when we started to develop the methodology, we had conversations with companies about the kinds of questions we were thinking of ranking them on, and we got an interesting range of feedback. As we developed the draft methodology, we shared the drafts with companies and gave them an opportunity to provide feedback. We did a pilot, which Priya here was one of the people who worked hard on, where we tested out the methodology and shared the results with the companies we tested it on. It was only semi-public, so the companies actually knew more about their results in the pilot than the public did, because we wanted to make sure the methodology actually made sense and would incentivize the right kind of change; some methodologies, if you don't think them through, can actually create perverse incentives you don't want. So companies saw this coming from a long way away, and before we produced the first index in 2015, there were a couple of companies we piloted on that had already started making changes in anticipation of the first public index, and several companies in the index have continued to make changes very deliberately on the basis of how they were being evaluated. Before each research cycle, we notify all the companies: you've been selected to be ranked, here's our methodology, and in X number of months we're going to come to you with your draft results and give you an opportunity to give us feedback. Laura here is one of the people who's been heavily involved with that process. Then we look at their feedback, decide what to take into account and what not to, have a conversation with them about it, and come up with the final scores.
So we are engaging with the companies quite a lot throughout the process. Now, there are some companies that choose not to engage and don't respond to emails, and one of the Russian companies had a somewhat rude response at one point a few years ago, but most do engage, and when some do relatively well, we've seen press releases. There are also a couple of companies not in the index, one of which actually asked us if they could be included, and there have been a few startups who've come to us and said, how do we get into your index? And it's like, well, it doesn't work that way. We're only doing major listed companies with global reach; it's not like you pay us X amount of money and we put you in the index. But that shows there are some companies out there who think that if they were included, they would do relatively well vis-a-vis their competitors and that this would help them in their market. So I guess that shows it's of some use. And a couple of companies not in the index have also told us they use it internally to evaluate their own policies. So, yeah, does that answer it? Was there some other part of the question? I can't remember. Yeah.

Now, another really striking difference is the diverging trends you see between privacy and freedom of expression. I wonder if you could jump in on that, and also, Amie, if you could talk a little more about the privacy aspects you're seeing. Yeah, go ahead. Do you want me to take that?
I think the thing that really jumped out at me on the privacy piece, from going through it in pretty good depth over the weekend, is that I want to take the word "trust," print out the definition, and send it to every single company that gets ranked. They talk about this word a lot. They really want their users to trust them, because we're handing over massive amounts of data, but it's clear that when they say trust, they mean: we're going to comply with the law. Which is a good thing. It's good that they're complying with the law, and it's good that laws like the GDPR are getting passed and implemented. But as you said, it's not enough, and people at these companies need to start realizing it's more than that. There are processes, and this is one of them, that they can take to go that extra step, and I'm hoping they'll start engaging with those processes. While we do need a law (I've personally put a ton of work into the idea that we need one in the US, and our organization works globally), that's not going to get us all the way there, and we're not going to see passing grades just because laws exist. We've been working on privacy for a really long time, whereas the freedom of expression issues have really only hit the front-page headlines over the last couple of years, and I think it's going to take a while for that to even out, and for companies to realize that the threats to freedom of expression may be just as dire and just as great, and deserve just as much governance, as the threats to privacy. And to get all the way over to that word "trust," which I increasingly come back to in these conversations, because nobody really gets the definition. I don't trust you because you're doing everything you can to not be sued; that's very self-interested. I'm going to trust you if you're doing something to help me out.
Interestingly, the norms on freedom of expression among governments, and potentially companies, are actually going in the other direction right now. Look at the Christchurch Call: a lot of the regulatory tendencies we're seeing are pushing pretty far away from what we consider to be pretty standard norms for freedom of expression. I'm curious whether the norm-setting exercises around this that we've seen, in addition to Ranking Digital Rights, are succeeding, and I wonder if you could jump in on this with the Global Network Initiative, because it's fascinating to see how on the governance side that effect has been really powerful; we're not necessarily seeing it on the freedom of expression side, though.

Yeah, it's a really good point, Ivan. I think one of the big reasons is that if you look at the laws and regulations, and maybe even more importantly the soapbox politics, around free expression versus privacy around the world, but in particular in Europe, which is where a lot of the activity has been, the push on privacy is to do more to protect user rights in a kind of consumer-rights sense, while the push on free expression is to take more content down more quickly, to put out fires, often in the wake of horrible, highly publicized terrorist attacks or other events that get a lot of sympathy and get played up in the media. So you're absolutely right. The companies are being pushed to do more, and by more I mean not more of what RDR is asking them to do, but more takedowns, more moderation, more policing of speech. That's tough to do at the scale and speed increasingly expected of them while also building out the kinds of transparency and governance that would demonstrate improvement on the indicators.
So, not to put all the blame on the regulatory environment, but I think that's a significant reason why there's been less improvement on free expression than we've seen on privacy.

Dhanaraj, working with the Web Foundation, you're looking not just at what the companies are doing, not just at what governments are doing, but at whether the internet as we built it truly represents the aspirations for a democratic and decentralized civic space globally. Are either of the groups we're talking about right now, the governments and the companies, actively representing those interests in a way that fits with the aspirations of your advocacy and your research? Big question, I know, but if you could frame it in the context of what we're seeing here, that would be super helpful.

Yeah, the short answer is no, but there's definitely a lot more that can be done within the private sector, among governments and regulators, and even within civil society itself. One of the things the Web Foundation launched last year is the Contract for the Web, which Tim Berners-Lee, who co-founded the Web Foundation, created to address this very problem. The Web as it was created was meant to be a public good that allows everyone to actively participate in its development and to use it in a beneficial way, but over the last several years we've seen the reverse: it has become more centralized, more privatized, and less and less of a public good. Initiatives like the Contract for the Web try to push governments to make sure the Web and the Internet are accessible to everyone all the time, to get companies to place greater emphasis on people's privacy when they use these kinds of services, and to make sure there are mechanisms for people to get involved.
What I think is really crucial here, when you think about the applicability of the index, is how it can help us generate accountability for measures like the Contract for the Web or other global measures that try to see how companies, governments, and everyone else can get better involved. When you look at the indicators in the Corporate Accountability Index, you can think about specific ways to measure the performance of what governments might commit to through something like the Contract for the Web. So in terms of applicability, I see a lot of benefit for the index in that regard, because there's a lot more work to be done. To go back to your question about whether things are improving: right now I think they are not, but measures like this are very helpful.

So the measures form the basis for our ability to create a vocabulary for discussing what kind of rights we're seeking to have. But that's not sufficient for advocacy; it's a step in the direction of advocacy. It allows us to frame our advocacy in effective ways and to measure whether or not it's been successful. My question for all of you is: what kind of advocacy is necessary now? And I'll note very briefly that both of the other major institutional actors we're talking about here, the companies and the governments, have strong systems, incredible resources, and a long-term timeline for pushing forward their ideas. Civil society, in contrast, is often, I won't say fragmented, that's not the right word, but broadly disseminated; there's often a collective action problem. More resources overall, actually, but sometimes a collective action problem in moving forward. I'm wondering if you have any thoughts about short-term and medium-term advocacy strategies we can pursue to push on some of these main issues.
And the big ones we're really seeing are the rolling back of privacy norms around government takedowns and government requests for information, and the rolling back of freedom of expression norms in really big ways.

Well, I could just start. One of the many reasons we started this was a demand from a number of civil society groups for data. When you go to companies and say, we don't like the way you enforce your terms of service, actually having some data about how different companies compare helps; so does going to governments with data about how their policies are affecting what companies do in very concrete ways. The feedback we get is that that's helpful: the more research, the more data, the more concrete factual examples there are, the harder it is for companies and many governments to just dismiss what you're claiming. So that's what we're trying to help with. That's my judgment.

I think two things. One: you talked about governance, you talked about stakeholder engagement, and I wrote it down because it strikes me that every entity out there is involved in stakeholder engagement right now, and it's stressing civil society well past the breaking point. We need to start thinking as a community about what it means to engage, what we need holistically, who should be engaging, and how to go about those processes, because at the end of it they're going to say this was a multi-stakeholder project, except civil society had like two hours to devote to it and the company had like fifty, and that doesn't really look like a pluralistic process. Yeah. So we need to have a really big conversation about engagement, because we're seeing it from governments, we're seeing it from companies, we're seeing it from all different sides.
I get a different email every single week saying, here's this long process we're starting right now that we need you to be involved with. So that's one side. The other side is to stop the us-versus-them, and I'll frame it through the index: European companies aren't looking at government surveillance in Europe because they're so busy talking about government surveillance in the United States that they're not figuring out what laws exist, or don't exist and need to exist, in their own countries. It's this "we either trust or don't trust where we are, and we trust or don't trust them." Instead, we should look at standards like the Necessary and Proportionate principles, which are referenced in the report and the recommendations, I believe, and say: let's apply these same things everywhere. I think that will also help civil society come together, and we need to do that.

Yeah, so it's interesting to think about this. The Global Network Initiative was started ten years ago to focus on restrictions on free expression and privacy in situations where governments were pressuring companies: for free expression, it was often demands that certain content be taken down or censored, and for privacy, it was requests for user data. Increasingly, we have now seen a lot of concerns surface, very legitimately, around both free expression and privacy outside the immediate context of a government action: how are companies treating their users and their user data, and how are companies dealing with content moderation outside of any government request? Those lines get blurry, but I think it's interesting to picture a table with free expression and privacy on one axis, and government restrictions versus company restrictions on the other.
I think we're seeing more pressure now. For ten years we've been working at GNI on that interface between governments and companies, and I think we've made some progress; I think RDR helps demonstrate some of that. Now we're seeing a lot of pressure and a lot of actual movement toward regulation, the GDPR being the most concrete example with respect to user data and consumer privacy issues, and increasingly now on content moderation, with governments putting more pressure on and putting regulatory frameworks in place that would push companies to take more content down or do more with respect to certain kinds of content: hate speech, terrorist content, disinformation, et cetera. I think there's a real risk but also an opportunity there. The risk is that those regulatory frameworks are put in place very quickly, aren't really well thought through, don't fundamentally understand how the ecosystem works, and make the situation worse overall for users; we're seeing some examples like that, I think. But there's a real opportunity if that regulatory energy and this moment can be shaped, and this is really where civil society can come in, particularly the civil society groups that have been working in this space and really understand users and how they relate to these services, how the services work, and where the services fall short. This is where the RDR report really helps point the path forward. If we can turn this regulatory moment into one where we're encouraging more of the kinds of transparency and governance that RDR is focused on, it could end up helping us both avoid the worst-case scenarios of unintended consequences and improve the overall ecosystem, not just for users vis-a-vis the companies, but also in terms of the government's role. And the recommendations that RDR has for governments in the report, which are really
quite substantial in this version, I think are really spot on and hopefully provide a bit of a roadmap in that respect. And I would say, speaking just for myself and not on behalf of the Global Network Initiative: we are scrambling, as I know Access Now and others are, to keep track of all of these new emerging regulatory approaches, white papers, discussions, and actual legislative proposals. Most of them have been more problematic than good, but I do think the recent French framework to make social media platforms more accountable is interesting. It's kind of a think piece, but it demonstrates a much more sophisticated understanding of how the ecosystem actually works, and of the relationship between platforms, governments, and users, than some of the other efforts. I'm not endorsing it, and I'm not suggesting it's necessarily more positive than negative, but it represents a more thoughtful approach, and hopefully that's a step in the right direction. We'll see how that plays out.
Yeah, to go back to your question on advocacy: what's really important with good pieces of policy research like this is that they help shape the debate. One of the things I did not find surprising in the results was the low performance of the telcos, particularly those in low- and middle-income countries. When you look at the policy debates among regulators and those companies, they're often focused on issues of access and affordability, and they often put aside these other important issues around freedom of expression, privacy, and so on. What's really critical, then, is to take evidence like this and insert it into those debates, and that's something we could do better through the initiatives we have, engaging in those debates and with those regulators, particularly in those countries. I'll give you one quick example. We recently did some research on women's experiences of social media taxation in East and Southern Africa, in Uganda in particular. The women we talked to there said that these taxes, which are meant to limit social media use, ostensibly as part of governments' efforts to raise revenue but really to limit freedom of expression, were a direct assault on their ability to express themselves online, because the taxes were limiting their access to things like Facebook and WhatsApp. This was important to the women we spoke to. So it's not surprising that the two main companies that dominate the market in Uganda, MTN and Airtel, actually scored the lowest on the freedom of expression cluster within the index. I think there are lots of opportunities for companies like these, and also for regulators and everyone else engaged in these debates, to use this evidence to improve that aspect of the access issue: it's not just about access and affordability, but also includes privacy, freedom of
expression, and so on.

That's really interesting, and it's worth noting the effect of the telecoms and the internet companies coming historically from two different regulatory environments, and how those are converging, or not. Cognizant of the time, why don't we open this up to the floor? We want to hear from you, and we have two people with mics. When you speak, please introduce yourself first and ask a question, one that ends with a question mark and has a rising tone. Thank you.

Hi, I'm Mary Madden, a research lead at the Data & Society Research Institute. Thank you so much for all of this work. I know this presentation is a very brief summary of a very long and detailed project, so thank you for your continued work and advocacy in this area. My question relates to how you're thinking about this index moving forward. As somewhat of an outsider to the freedom of expression debate but more of an insider on the privacy debate, I realize why it makes sense from a legal and enforcement framework to consider the two separately, but increasingly the two are becoming more and more deeply entangled. If we look at an example like China, a complete surveillance state, you can see immediately, when you start to think about facial recognition technology, how surveillance and privacy harms directly impact freedom of expression. So I'm wondering, particularly as you're thinking about including companies like Amazon, and as you're thinking about how you account for not only governments' use of technologies but law enforcement's, how you address what seems like a really thorny problem moving forward. I'm just curious to hear you talk a little bit about that.

Would you take a shot at that?
Yeah, thanks so much, Mary, and you're absolutely right: privacy and freedom of expression are very intertwined and mutually reinforcing in different ways. We do sometimes encounter people who have the opposite question, that the two are opposites and come at each other's expense, and we say no, actually, they're very synergistic. So we get a range of opinions about their relationship. With the methodology, as you're trying to put together a taxonomy, you need to put things in categories, but overall, this is why the overarching rubric is the human rights of users, for which respect for both privacy and freedom of expression, and the exercise of both, are necessary, and why they're combined in one index. There are some people who ask, why don't you just do a freedom of expression index, or just a privacy index, and we say no, they overlap, and in the governance indicators they are very much intertwined. But there are also separate ways that companies look at them: oftentimes within companies, the people who handle freedom of expression and the people who handle privacy are in different departments and don't talk to each other, which is also a problem. So in terms of disclosures, companies, whether they ought to or not, tend to think about them differently. We certainly recognize they're intertwined; at the same time, there are issues that relate to speech and issues that relate to who knows what about you, which also affects your speech, and there are some distinctions we need to make in order to evaluate and really break things down.

Hi, Elise Dick. I'm a graduate student up at the Fletcher School studying new media and human rights. Thank you so much for your incredible work; this index is unlike anything else out there, and I think it's so necessary for
where we are in the world right now. My question has to do with the regulatory frameworks, and especially with the governance indicators. I'm curious whether any of you, especially in your work on the index, are finding that the growing regulatory environment and these emerging compliance measures that companies need to meet are incentivizing better governance practices, or whether they are holding companies back by creating growing compliance costs. Thanks.

I think it's a little bit of both. If you look at the GDPR, I think it's driving better work, as Rebecca said. But we've done a lot of work, for one example, on the Assistance and Access law in Australia, and there the gag provision is so broad that you can't even say you haven't received an order from the Australian government to totally undermine the security of your products; you can't confirm in advance, before they even reach out to you; you can't say anything. That's going to have a huge impact on transparency, and we've yet to grapple with the total impact that those sorts of approaches are going to wreak on efforts like Ranking Digital Rights, where what companies say on their website no longer matters because they're being ordered to say something else. So I see them cutting both ways, and we need to watch laws to figure out which ones need to be supported and which need to be opposed vehemently.

I'll just say I think that's exactly right, and I think it's really interesting that some of the same governments that are pushing laws or regulations to require companies to be more transparent about how they use user data in a commercial sense are also putting obstacles in place to discourage, or sometimes prohibit, transparency vis-a-vis the government demands and requests that the same companies receive. A lot of times, especially in less democratic environments, those laws are themselves secret, or they're written into licenses for mobile network operators, so
that it's not even that you can't say whether you've gotten a demand or how many demands you've gotten; you can't even say that you operate in an environment where that is legal. That happens to a number of our companies. Our mobile network operators and telecom companies, proactively trying to think about how they can be transparent in that kind of situation, started a few years ago publishing what they call country legal frameworks, which are basically reports produced by an outside law firm analyzing the legal environment in the countries where they operate. They said, this is essentially as far as we can go; this is the legal and regulatory framework we operate in, and where we can, we'll tell you more specifically about the kinds of things we have to deal with and the number of requests we get; where we can't, you can probably figure out what's happening. We've now consolidated those, and we have a new website that will be a lot easier to search. There are 50-plus reports like that which our companies have funded and put together. I think that's a useful step. It obviously doesn't go far enough, but the companies are, in those circumstances, handcuffed.

I'll just add really quickly one thought, which is that oftentimes when we address issues of regulation of expression and privacy, we assume the answer comes from within the regulatory frames of those issues rather than elsewhere. One tendency we've seen within the recent spate of expression regulations is that the solution may actually not be about expression; it may be about other ways that governments can act, especially in contexts of conflict or extremism, where the answer may not be only about expression but also about other positive actions governments can take to create better environments for their citizens: poverty alleviation, and a whole range of things in post-conflict environments where there's not
an adequate peace-building mechanism. So there's a tendency to kill the messenger around these issues, and governments have a tendency to over-regulate as an excuse not to face the more problematic, challenging, and long-term issues of actually governing their people.

Yeah, so on that note, who else wants to ask a question? Thank you.

Hi, I'm Isabelle with New America. I found this report really interesting, and I was just reading up online, on the website, where there's more information about the methodology, and I might have just not gotten there yet, but I was intrigued about how you tier the scores: what perfect is, what 100 is. As I was looking at the ranking of companies, I thought, oh, Facebook scored relatively well compared to others, and I know there are a lot of people who are really mad at Facebook right now, for example for mishandling passwords, their governance, et cetera. So I was just wondering what 100 looks like.

Right, so 100 overall, getting 100 on all 35 indicators, would actually also require regulation not to prevent companies from doing everything we're looking for them to do. But 100 basically means companies are being fully clear with users about all the ways your speech or your access to information might be manipulated or restricted; that you have a clear understanding of who has power over what you do and don't know, and what you can and cannot see and access; and who can know what about you, under what circumstances, and to what extent you can control it. Some of the indicators also look at good practice, so there's one indicator where you get 50% if you disclose that you track users around the internet, but you only get 100% if you don't track users around the internet. If you're doing it, it's better that you disclose it, but if you're
going to get 100%, you need to actually be doing the responsible thing that we believe is most human-rights-respecting. There are quite a number of indicators like that, particularly around some of the privacy and personal data indicators. The point is, to take the Facebook example: there are some things, particularly related to the reasons people are mad at them, on which they score very poorly. They are, however, disclosing more about the government demands they get to restrict content than Apple is, so let's give credit where credit is due. The media discourse doesn't focus on that, for a bunch of reasons that are fine, and this Cambridge Analytica thing is justifiably a big deal, but there are some things they actually do better than some other companies, and we have seen fit to give them credit for those, even though people say, how could you give them credit for anything at all? We go through and look at what the policy is and what they disclose, and we come out with a score. We're not running it through the lens of the media narrative about who's good and who's bad; we're putting together the facts and scoring the facts. Sometimes people get mad at us: well, Tim Cook is so great on privacy, how could Apple not be at the top of your index? And it's like: they provide very little evidence that they do human rights impact assessments on any of this, and they do not have a commitment to freedom of expression (they do on privacy, obviously). So there are some ways in which, just based on our methodology, they're falling short, and sorry if it doesn't fit the prevailing conventional wisdom and narrative, but this is what we found. And the high score is a D, right? It goes down from there. So it's not like anybody's doing great; it's varying degrees of bad, and some
are at the top end of bad and some are at the bottom end of bad, on average. Everybody has a lot to improve, in a lot of different ways. And since Laura Reed is here at New America for a few more hours (she's normally based in New York): she has our methodology, scoring system, and data all memorized, I think, so if you want to know anything very specific, ask her, and we've got some other colleagues as well who can get into the real nitty-gritty about certain things. And again, this is a global thing, right? So yes, Facebook does do better than Baidu and Tencent; they actually do disclose that they push back against government demands for user data and that they conduct human rights impact assessments, and there are some countries they have not gone into, like China. So they're doing something, and there are some companies doing nothing. The scale is what it is, but we do, on specific things, give credit where it's due.

It's worth noting, very succinctly, that the aspirations are based on universal human rights documents, to which the home country of every company we're talking about here is a signatory. So there is a very strong standard, and those standards have best practices around implementation as well; it's not just Rebecca coming up with these ideas.

Okay, we have time for a few more questions, and if anybody on Twitter is raising any particularly thorny questions, our colleagues are keeping an eye on that as well.

My name is Sarah Nelson. I'm a dual PhD candidate at Vanderbilt University, getting one PhD in history and another in comparative media. I have not yet had the opportunity to read the full report, but looking at your sets of recommendations, for companies on the one hand and for governments on the
other, it looks like even the government recommendations seem to suggest asking companies to simply better regulate themselves. So I'm wondering how you all are thinking about the problem of monopoly in the tech industry, and what the prospects for broader market regulation may be, if that's a consideration you have in the report. And following up on that, what do you think the implications here might be for the role of multilateral institutions? Where is the ITU in this, where is the UN, et cetera?

Cool, well, thanks for that, and I think some other folks will have views on that as well; anybody, please jump in. One thing that I would say, just in terms of multilateral institutions: the UN Special Rapporteur on freedom of expression and the UN Special Rapporteur on privacy, the human rights framework in the UN, actually draw on this. As for the ITU, the extent to which they're actually paying attention to human rights norms is not so clear to me, but maybe Donna Raj could talk a bit more about that. I'm trying to remember the other piece of the question: what about market regulation? Are the companies being asked to regulate themselves, or are there some stronger and more radical enforcement mechanisms? In our introduction we point out that the reason we're focusing on these companies is because they have too much power: a very small handful of companies have a huge amount of power over people's ability to exercise their rights, and that's a problem. So as long as they exist and have that power, we're going to hold them accountable. We don't get into the whole antitrust thing, which is really beyond the scope of what we do, but obviously there is a need to look at whether companies have too much power and what to do about them. Whether a particular person's proposal for how to break up Facebook is going to get us there, and is going to result in a much wider set of
companies that all respect users' rights a lot more just because we've broken them into smaller pieces, I'm not entirely sure. But at the moment we have companies that are not doing enough to respect people's rights, there's a small number of them, and they have too much power, and that is a fact. So we're adding that to the mix on corporate power, and I think there needs to be a much broader debate about exactly how you get to a place where companies are held accountable, there's choice, users' rights are respected, and how that fits in a regulatory framework. With the government recommendations, we aren't just calling on governments to get companies to regulate themselves; we are calling for stronger data protection and privacy laws. And I actually think that requiring governance and oversight is not a self-regulatory thing. It's just requiring that companies actually do their due diligence, not just over financial risks or consumer risks but over other types of risks, so it's not self-regulation. But this is a space where lots of different types of regulatory tools are going to need to be deployed, and other mechanisms that are not regulatory are going to need to be deployed, if we're going to get to where we want to go. There's not one regulatory approach to rule them all that's going to solve all the problems.

I guess we don't have any Twitter questions yet. So, I get the accusation a lot, although it's not always accusatory, that we don't do enough on competition issues, because they're so important. And I always want to say that there's a very specific specialization in competition, and I'm not that expert. I would love to be that expert; I didn't go into policy wonkery because I don't like digging into interesting issues. But because the policy space is being DDoSed, there's not a lot of time to suddenly gain a whole new body of expertise.
One of the things that I see missing, though, and my Twitter kind of butts up against competition Twitter, so I see enough of it, is that the conversation happening there is happening at a very high level. There aren't a lot of pieces that I can take and send out to our audience of lay people to get them to understand even what that discussion is, what competition law is, and how it applies to the digital space, because it's such a high-level conversation. If I were to name something I would love to see happen, it's a translation into something that real people can understand and engage with, because I think this is an issue that's only gaining more importance. And a lot of these things are connected. I think one of the reasons we don't see improvement is because companies don't have to improve: you're a captive audience, and they don't really need to worry if they're not providing you more. But people need to understand that, and see what they can do and how to advocate, and those materials aren't in existence as far as I can tell.
Yeah, I'll just say that I think these things are related. One of the ironies that I see in this space is that a lot of the regulatory proposals that are emerging are not about competition, they're about some of these underlying challenges, but some of the same governments will also identify competition as a concern; in Europe in particular this has been a big issue for a long time. And yet the kinds of things that are emerging to deal with hate speech or terrorist content, or even data protection in some ways, appear like they may actually help consolidate power among the few largest companies, who have the greatest ability to comply with what can be quite cumbersome regulatory mechanics. So, just as Amy was saying, there's this kind of disconnect: there is a conversation about antitrust and competition, and then there's a separate conversation about privacy and free expression. Those conversations are becoming a bit more linked up, but the regulatory proposals emerging on this side, I think, are actually going counter to the kinds of concerns being raised on the competition side. I'm not really sure why that is, but it seems like a challenge that needs to be worked out.

One more thing on the broader regulatory space. One thing that's really interesting is that for these free expression concerns, whether it's hate speech or terrorist content, there are concepts out of tort law that are emerging. This was most specifically articulated, or at least put out there, in this UK white paper on online harms, around the idea of a duty of care, which is a concept that exists in a lot of common law systems and apparently also in some civil law traditions. But to me that doesn't really make sense as a way to regulate situations where one user is saying something that harms another user and
you're trying to figure out how the platform is meant to be responsible or what their obligations are. I think the idea of a fiduciary with respect to privacy and data makes a lot of sense, because it is the company that is holding something that is not theirs, or where there is at least a divided property interest. But I'm really curious, and getting more and more concerned, about this idea of a duty of care, because I don't think it's very well established, and it seems to be more of a political talking point than a well-thought-out legal or regulatory concept. So I just think it's going to be really interesting to watch that space, and hopefully to engage with it and try to flesh it out a little bit more and really understand what it means, because I'm concerned that it could have some really negative unintended consequences.

Can I come in on that? I agree with what you're saying, and the point about competition is really important. Unfortunately, you see it developing in isolation from these issues and from the regulatory debates around privacy and freedom of expression. In fact, if you look at some countries, and again going back to that context, the issues around competition often involve viewing many of the companies here as having monopolies but being foreign-based, and then the reaction is to deal with them in a kind of nationalistic way. So you have issues around data nationalism, data localization policies, and so on coming up, which do not really address the underlying problem around privacy, security, and user control of data. So in many ways, the competition issues, from the perspective of some of these regulators, are obfuscating the real underlying problem. I agree that the competition issues have to be taken into consideration, and I would see that as complementing research like this.
If you look at Telefonica, for example, which has moved up really well, there are issues around the consolidation happening within their markets, within their subsidiaries in Latin America and the Caribbean, which is very important because that affects consumer choice, pricing, access and affordability, and all of that. So you'd have to consider, in parallel, that they've improved well in this regard, but what's happening on the other side? I'd consider both. And to the other point, the question was about multilateral spaces. I'm not confident that spaces like the ITU would even really seriously address some of the issues that are in this index. They haven't in the past, and there are lots of political issues that play out in those kinds of spaces which prevent that. They often prefer to focus on technical issues, and typically that revolves around access, because that's what the member states prefer to focus on. Unfortunately, that's one of the primary spaces for talking about these things, but these issues haven't emerged there as they should have.

So, I'm aware of time, and I know that we have a lot to say. I probably have about 20 responses to everything that's just been said, but I'll limit myself to one thought, which is that one of the challenges with regulating this kind of very complex multinational environment is that the companies tend to be innovating with their products two or three steps ahead of the regulation. By the time we figure out how to break up these companies, they may have broken themselves up three times over, just the way Google did with Alphabet two or three years ago. So I'm going to leave the last word to Rebecca to send us off into the day. Give us some hope.

Yeah. Well, you know, I think this takes an ecosystem.
And I think what we're really encouraged by is the growing number of groups who are doing their own research that contributes to the ecosystem and to innovative approaches to the problem. I think the other thing that's cause for hope is that in all of the companies we've engaged with, there are individuals who really, actually want to improve, and want to improve their scores not just for cynical reasons but because they actually do care about the internet. And we do want to be empowering these kinds of people. That's another sort of impact that we feel we are hopefully having in certain companies: empowering the internal champions with the data they need to get their bosses to take this stuff seriously. And I just want to thank everybody who's worked on this, and all the funders who believed in us when not everybody was clear why we were doing this, and why we were focusing on companies and not governments at the beginning in particular. It's very encouraging to see, over time, more and more people wanting to get involved with this work and engage with it. So thank you very much.

Thank you, Rebecca. Thank you.