Good morning from Washington, DC, and welcome to everyone joining us around the world. As you filter in, I see the number of participants is rising quickly, so thank you so much for joining us today. I'm very excited to introduce the launch of the 2020 Ranking Digital Rights Corporate Accountability Index. Before I hand it over for the presentation by our fearless leader, Jessica Dheere, I want to recall a little bit about how far this team has come. My name is Rebecca MacKinnon. I'm the person whose fault it is that we're here today. In 2013, I launched a research project here at New America based on what at the beginning seemed like a simple idea, inspired in part by Freedom House's Freedom on the Net index, which ranks countries on how well they protect internet users' rights. My idea was to create an analogous ranking for how well companies actually respect and protect users' rights. Back at that time, a lot of people were asking why we needed to do this. It was two years after the Arab Spring, and people still saw tech companies as largely liberating. The Snowden revelations had highlighted the threat of government surveillance and how companies can get caught up in the abuse of government power. Remember that time. But thanks to a few funders with foresight and the support of a broader research and advocacy community, we did manage to launch the first RDR Index in 2015. This is now our fifth index, and events have underscored exactly why this index is needed. I don't think we need to explain why big tech needs to be held accountable anymore. The other incredible thing is to see how much impact the work of our research team has been having. The Council of Europe has been referring to RDR's data and indicators in thinking about future regulation, as just one example. And as another example, and I could share many others, but we'll want to move on: this past year, four shareholder resolutions cited RDR Index data.
But while we're focusing on companies, governments really cannot be let off the hook here, and so I want to end with a challenge. In fact, our findings highlight that some serious government and regulatory failures are not being addressed. People who work in the field of business and human rights might be familiar with something that has come to be known as the Freeman theorem, which states that demand for corporate responsibility rises in proportion to government irresponsibility. Many companies in the index actually have low scores thanks to government policies and laws that compel censorship and surveillance of political critics, practices that violate human rights. On the other hand, governments of the world's major democracies lack a coherent approach to regulation that would actually require tech companies to protect and respect users' rights to the fullest extent. So this year's findings underscore that government responsibility, including regulation, is needed. But not all regulation is created equal. Laws need to be based on data and research, not just short-term, knee-jerk politics. Regulation needs to be coherent and well coordinated among countries that share the same commitment to protecting human rights, and it must be consistent with international human rights standards. On our website, we have more detail about our recommendations for governments, too. Finally, I just want to say how proud I am to have handed over the leadership of Ranking Digital Rights to Jessica Dheere, who led the production of this year's index and is building an incredible team of dedicated professionals who will be carrying out this work, I hope, for many years to come. Funders, I hope you're listening. I'm sure that the impact of this hardworking team is only going to grow under Jessica's leadership. So thank you all so much for being here today, and thank you to the RDR team.
As soon as I'm done talking, I'm going to post the link to our staff page in the chat for everyone to see. Thanks also to our funders, who continue to believe in us; the full list is on our website. And with that, over to Jessica.

Thank you so much, Rebecca. Hello, everyone. I and our whole team are thrilled to share the launch of the 2020 RDR Corporate Accountability Index with you. Next slide, please. One second while I recapture my notes. We are a small but talented team working from the US, Europe, and Latin America. You'll see their names in the acknowledgments and in the bylines of the pieces on the site, including on the company report cards. They, along with an incomparable network of researchers around the world, deserve all the credit and recognition for this year's incredible body of public-interest research that is the RDR Index. And as Rebecca mentioned, of course, none of this would be possible without the generous support of our funders. Next slide, please. As you know, RDR was founded by Rebecca in 2012, alongside the publication of Consent of the Networked, in which she foresaw the dangers of a world in which the corporations and governments that build, operate, and govern cyberspace are not being held sufficiently accountable for their exercise of power over the lives and identities of people who use digital networks. They are sovereigns, she wrote, operating without the consent of the networked. Consent of the Networked is source code for RDR. And based on that source code, RDR now promotes freedom of expression and privacy on the internet by creating global standards and incentives for companies to respect and protect users' rights. We apply these standards to create the ranking in the Ranking Digital Rights Corporate Accountability Index. Next slide, please.
And just in case you need a reminder: the RDR Index evaluates the world's most powerful digital platforms and telecom companies on their disclosed policies and practices affecting freedom of expression and privacy. We look at what commitments companies make to freedom of expression and privacy and how they operationalize those commitments across a range of services. We push for transparency as a way to establish a baseline of information about the policies and practices of tech companies, so that the stakeholders and allies best positioned to use that information, such as consumer rights advocates, policymakers, investors and shareholders, researchers, journalists, code and technical auditors, and others, can assess companies' performance from their unique perspectives and advocate for the change we need. We acknowledge that transparency is a first step and not the whole ball of wax. We focus on free expression and privacy both as individual rights that enable our ability to exercise so many of our other human rights, and because these rights are fundamental to healthy information environments, which in turn encourage robust social, economic, and political participation. In other words, these rights are critical to our individual and collective agency. Next slide, please. Here's a look at this year's RDR Index by the numbers. It includes 26 companies; Amazon and Alibaba are newcomers, bumping the number of global digital platforms we evaluate up to 14. We also evaluate 12 telecom companies, and we read through thousands of pages of policies on about 65 services, yielding more than 300,000 individual data points. Next slide. All told, the companies we rank span six continents, touch the majority of the world's 4.6 billion internet users, and represent a combined market cap of about 11 trillion U.S. dollars.
We rank the seven companies that "control the internet and its infrastructure," according to Mozilla's latest Internet Health Report. These seven, along with the Chinese search engine operator Baidu and IBM, which we don't rank, also happen to be the companies building the rules, systems, and business models for the future of artificial intelligence, according to Forbes. Three of them, Amazon, Google, and Facebook, have spent more than half a billion dollars lobbying U.S. and European policymakers over the last decade or more. So I think it's fair to say that the data we provide and our findings and analysis have never mattered more. Next slide, please. Our standards, our methodology, are grounded in the Universal Declaration of Human Rights and the UN Guiding Principles on Business and Human Rights, which say that companies have an obligation to protect, respect, and remedy in their operations. Companies should make public commitments to human rights, conduct robust due diligence to identify and mitigate human rights harms, and provide remedy to address negative consequences of harm should they occur. We chose this framework because it is legally binding through the conventions that have operationalized the UDHR, and because we need a global framework for companies that are operating globally, and human rights is the right framework for that. Governments, in addition to their duties under human rights law, also have a responsibility, as Rebecca mentioned, to create environments in which companies can respect human rights. Our indicators, and there are now 58 of them in three categories, governance, freedom of expression, and privacy, are structured progressively. We start with accessibility as the baseline: can a user easily access the policy and understand it? Then we get increasingly more granular and sophisticated as we progress.
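For readers curious how an indicator-based index like this rolls up into the scores discussed later, here is a minimal sketch in Python. To be clear, this is not RDR's published scoring formula or weighting; the function names and example scores are hypothetical, and it only illustrates the general idea of averaging indicator scores into category scores and category scores into a company score.

```python
# Illustrative sketch only: NOT RDR's published scoring formula.
# Each indicator is scored 0-100; categories average their indicators,
# and the company score averages the categories.

def category_score(indicator_scores):
    """Average one category's indicator scores (each 0-100)."""
    return sum(indicator_scores) / len(indicator_scores)

def company_score(categories):
    """Average the category scores into one overall company score."""
    return sum(category_score(s) for s in categories.values()) / len(categories)

# Hypothetical company: strong on commitments, weak on implementation.
example = {
    "governance": [60, 12, 25],           # commitment, due diligence, remedy
    "freedom_of_expression": [50, 40],
    "privacy": [45, 0],
}

print(round(company_score(example), 1))   # prints 33.3
```

The point of the progressive structure described above is visible even in this toy example: a high commitment score (60) gets pulled down sharply by weak due diligence and remedy scores.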
This year we integrated into our categories a new set of indicators specifically to assess companies' policies on algorithmic systems and targeted advertising, given the role these technologies are playing in our information ecosystem, amplifying disinformation and hate speech, and how they are driving company profits. We also engage with companies regularly throughout the process, and companies score better when they engage. Not because engagement itself earns them points, but because they can point out sources we may not have found, and we can highlight shortcomings in advance of the publication of the index, giving them an opportunity to clarify their policies and practices or, we hope, institute new ones. Next slide, please. So here it is at long last: the 2020 RDR Index ranking. Of course, most of you have seen it by now on Twitter, or maybe you've accessed the website. You know that Twitter took the top spot among digital platforms for its disclosures in the freedom of expression category, and Telefónica earned the top spot among telcos for its strong governance commitments. Amazon scored the lowest of any digital platform this year for poor governance disclosures and for what we found to be a lack of disclosure about platform rules and enforcement, among other things. You can read more about why these companies were best and worst in their categories, and about all the companies in our index, in our individual company report cards and in our key findings essay. But suffice it to say that we think the top spot is really not something to brag about in an index where no company earns above 53%. The long and the short of it: the most striking takeaway from our research in 2020 is just how little companies across the board are willing to publicly disclose about how they shape and moderate digital content, enforce their rules, collect and use our data, and build and deploy the underlying algorithms that shape our world.
We're facing a systemic crisis of transparency and accountability among the world's most powerful tech giants. At the root of this crisis is remarkably weak corporate governance and oversight of commitments, policies, and practices affecting users' fundamental human rights to privacy, expression and information, and non-discrimination. The impact: across the globe, companies are leaving users in the dark about how their content is moderated and how their personal information is collected, protected, and used to drive profits. Those of you who have been watching the RDR Index over the years may note that the scores seem a little lower than usual, and you would be right. New indicators introduced this year to address algorithmic systems and targeted advertising resulted in lower scores for most of the platforms we rank. On average, the drop was about five points. Next slide, please. In this graph, on the left, you can see that only six companies' scores improved this year. Notably, all the companies that improved despite the addition of the new indicators are based outside the U.S. and Europe. On that note, we must also mention our disappointment in the performance of many U.S. platforms, including Google, Microsoft, Facebook, and AT&T, most of which made only incremental changes to select policies, despite it being clearer now than ever that more holistic, systemic, and user-centered reforms to their governance, operations, and technologies are necessary. Very few companies earned credit for disclosures relating to the new indicators, so those that did score better this year made significant changes in other areas, and they represent some of our notable firsts. South African telco MTN published its first transparency report and announced a wave of additional improvements to human rights due diligence, which I hope our panelist, Marina Madale, will tell us more about in the discussion.
Mail.ru, owner of the Russian social media site VK, also published a commitment to respect users' freedom of expression and privacy rights. Emirati telco Etisalat clarified that its privacy policy applies to all its services, and Qatar's Ooredoo published a privacy policy for the first time. Axiata, the Malaysian telco that provides telecom and related services to 150 million users in nine countries in Asia, published guidance on how users can protect themselves from cybersecurity risks and a commitment to privacy, and it improved its disclosure of processes for some government demands and related due diligence. Baidu, which operates China's leading search engine, published a human rights policy, which it didn't get credit for because it was published outside our evaluation period. It also made some improvements in governance, published some data on third-party demands, committed to limiting data collection to what is necessary for the services it provides, and improved transparency about its policies and practices on security. While all these companies still fall at the bottom of our ranking, they have made notable strides, and they demonstrate that protecting and respecting human rights, even under some of the world's more repressive regimes, is possible when the will exists. Rebecca goes into this dynamic in more detail in her essay "China's tech giants can change," which looks at what is spurring the Chinese companies in our index to make human rights commitments and implement stronger privacy protections, and what the limitations are. Next slide, please. We also calculated what score changes might have looked like without the new indicators, just to be fair. Still, the U.S. platforms saw the least improvement, indicating that their policy improvements stagnated between these two indexes.
Seeing companies that aren't always driving the news cycle make meaningful policy improvements is encouraging, and it ought to give their peers in the limelight some pause for thought. Next slide. Scores are one thing, but the companies' collective failure to meet even our minimum standards of policy transparency puts the dangers of the abuse of digital power into sharp relief. Policy transparency affects people, and companies are not doing nearly enough when it comes to demonstrating their governance commitments to human rights, specifically around due diligence and remedy. Companies are also not disclosing enough about how they shape and moderate digital content, enforce their rules, collect and use our data, and build and deploy the underlying algorithms that shape our world. It's worth repeating. Next slide, please. On our website, when you go to the ranking on the homepage, you can click the buttons above it to see the scores by category. Here are the governance scores. If you were to dig deeper, you might see a couple of patterns. Where we have strong scores on due diligence, companies are providing evidence of conducting risk assessments related to freedom of expression and privacy, primarily in connection with government demands. You'll also probably note that there's a need to broaden the scope of their risk assessments, in line with our new indicators, to their own policies and practices, including algorithms and targeted advertising. U.S. companies have all made human rights commitments now that Apple published its human rights policy in late August last year, just before the end of our evaluation period. Next slide, please. But some really serious governance gaps remain. More companies are making commitments, but they're not doing the work they need to do to make those commitments stick throughout their operations. You can see here that companies scored 60% on average on making an explicit, clearly articulated policy commitment to human rights, including free expression and privacy.
But when we look at how they implement that commitment, which requires designing and implementing robust and systematic due diligence processes, their average score falls to just 12%, which is at the bottom of the graph. Providing users with opportunities to express grievances and seek remedy through a predictable process, a critical element of any human rights commitment, is also lacking: the average score on this indicator is just 25%. If you can go to the next slide, please. With the exception of Telefónica, as you can see in the chart, most companies fail to offer clear and predictable remedy to users who feel their freedom of expression and privacy rights have been violated. All of this clearly demonstrates that while commitments to human rights are being made, they aren't being properly reinforced in practice. Next slide, please. Governance isn't just an abstract concept, as Jan Rydzak and Elizabeth Renieris, two of our team members this year, make clear in "Context Before Code," one of three featured essays that accompany this year's index and help interpret the index findings through a human rights lens. This essay looks at companies' policies during the pandemic, giving examples of good and not-so-good practice by the companies we rank, on network shutdowns (the one pictured is in Myanmar, one of 213 government-ordered shutdowns documented in 2020) and on content moderation of COVID-19 mis- and disinformation. Their takeaway is that companies would be better prepared to respond to crises like these if they committed to strong human rights-led governance in the first place, especially by implementing human rights impact assessments. They write that strong due diligence can help predict, for instance, the rise of fringe movements in social media communities or the likelihood of coordinated extremist violence moving from online spaces into real life. Yet only four companies we rank conduct assessments of their own policy enforcement, where these kinds of threats often arise. And the next slide, please.
And just in case you're wondering, this is our indicator on conducting impact assessments on algorithmic systems. Pretty much crickets. There's a little bit of partial disclosure, but we need more. Next slide, please. And it's the same, or actually worse, for targeted advertising. Facebook gets partial credit, but its disclosure is not adequate: the report was created in a specific context, so it's not systematic. Next slide, please. These are our category scores on freedom of expression. You'll see Twitter topped this category in the ranking because of more transparency about actions it took to remove content and suspend accounts for violations of platform rules. It offers a lot of transparency about ad content and targeting rules, and it reports more data than most other platforms about government demands to censor content. It also gives more information about its bot policies than other platforms. One key pattern to note in this category is that while digital platforms were generally good about disclosing their rules for content, they didn't always back up those rules with data about how they enforce them. So let's go to the next slide, and we'll go a little deeper into the rules and enforcement data. We can clearly see a gap here between the columns on the right-hand side of the chart: the lighter, greener color is the rules, and the second column is the enforcement data. We ask that companies clearly disclose the circumstances under which they restrict content or user accounts, and then we ask them to report data on those content restrictions and on user accounts that violate the company's own rules. And what we can see is that more companies than previously are starting to publish transparency reports about content and account removals.
But there are still significant gaps in what companies are reporting. Next slide, please. And here's the overview of the privacy category. Privacy scores were seriously affected this year by the introduction of new indicators related to algorithmic systems and targeted advertising, simply because companies disclose so little about those systems. On access to privacy policies, we see a whole lot of color; this is the most basic indicator in this category, and it looks relatively good. However, on the next slide, when we look at the next indicator, access to algorithmic system development policies, only Telefónica earned credit. No other company explains to its users how it develops its algorithms. Understanding how algorithms are developed is critical to understanding the level of privacy a platform affords, since vast amounts of user data are used to train the algorithms that underlie these systems. Next slide, please. Well, there's of course a lot more to say about privacy, but let's move more directly into algorithmic systems and targeted advertising, since I'm probably talking too much. Here's a graphic showing how the development and use of these systems works. It comes from "Moving Fast and Breaking Us All," another one of our featured essays in this year's index, by Ellery Roberts Biddle, which interprets the data through the lens of our most pressing challenges and dilemmas. Numerous reports have shown how algorithmic curation systems collect and monetize user data and then, using algorithms developed and deployed to drive reach and engagement, exploit that data by targeting users with tailored messages and content. Next slide, please. So it would seem that committing to human rights in the development and use of these systems, not to mention conducting due diligence around them, would be a priority. Unfortunately, that's not the case.
As we saw earlier with the related due diligence reports, and as we can see here in the chart, our indicator here asks for an even more basic commitment. The companies that get partial credit mostly make ethical commitments, AI-ethics kinds of commitments, which aren't as strong because they're subject to interpretation and they're not legally binding. Next slide, please. On our indicator that asks specifically about algorithmic content systems, none of the social media services we evaluated offered adequate information about how they actually shape, recommend, and amplify user-generated and advertising content. Only Telefónica and Vodafone published explicit policies on AI and human rights. No U.S. platform made such a commitment; only ethical principles, as I mentioned. No U.S. platform, including Apple and Facebook, published any overarching principles addressing how it develops and uses algorithms. Similarly, despite years of research warning of the real harms, no platform in the index clearly disclosed whether it conducts robust, systematic impact assessments to evaluate its algorithms for possible bias or discrimination, such as we've seen in housing ads or in search results that objectify Black, Latinx, and other women of color. There is a notable lack of overarching commitments or due diligence mechanisms governing how platforms design, train, and deploy algorithms as well. Next slide, please. And telcos aren't off the hook either. All the telcos we rank have ventured into the mobile ad market, and they're remarkably opaque across the board. Only a few offered any information on targeting rules and on what kinds of targeting are prohibited. No telco offered any data on how it enforces its rules, such as the number of ads removed or accounts suspended. And they're also vague about data collection and about how data is used for targeted advertising.
The lack of disclosure is clear: the telcos do worse as a group than the digital platforms, and there are fewer high performers at the top of the ranking. Meanwhile, Telefónica and Vodafone, as I mentioned, are in top spots in part because they are the only ones that publish explicit policies on AI and human rights. Our index this year covers many other issues: government-ordered network shutdowns; a new indicator on zero rating, which also shows very, very little disclosure; indicators and data on data breaches and on policies for notification about and prevention of those breaches; bot policies; stakeholder engagement; security protocols and encryption. We hope that you'll explore and find the data that you need. Next slide, please. Finally, we have some pretty succinct recommendations for companies. On governance, we really need them to commit to and implement robust human rights governance that includes freedom of expression and privacy and due diligence, and to account for and mitigate harms from algorithms and targeted advertising. On transparency, we need them to maximize transparency: we need comprehensive information about how platforms and services shape and target content and about how data is accessed, used, and shared. And on user control, users need to have meaningful control over how their data is collected, used, and shared, including information that is inferred about them, and we need to give users options to control the recommendation and prioritization of content. It is essential to our agency, as we mentioned at the top. Finally, we can go to the next slide. I'll end again with Consent of the Networked and the call to action that Rebecca put at the end of her book. In closing, we know that without the tech services and platforms we rank, alongside hundreds of others, the pandemic would be even lonelier, less productive, and more difficult to endure.
But we have to ask: at what cost? Without more transparency from the companies, we can't calculate it. If people and lawmakers do not know the specifics of how these companies operate, it is much harder to hold them accountable for their negative effects through smart regulation and other measures, and we'll end up losing their benefits as we try to mitigate their harms. Fortunately, there's an ecosystem of actors responding to the call at the end of Consent of the Networked that you see here. When contact tracing apps were being touted as easy fixes to the spread of the coronavirus, privacy and anti-surveillance advocates sounded the alarm: experts pointed to the potential for misuse of the data they were collecting and to the inequities of designing public health systems around technologies not everyone can access, due to the persistent global digital divide. In the wake of the police murder of George Floyd, civil rights groups led the successful Stop Hate for Profit ad boycott, in which more than 1,000 companies pulled their ads from Facebook, bringing the link between targeted ad systems and viral online hate speech into the mainstream. Meanwhile, investors and shareholders also expressed concern about the environmental, social, and governance risks that tech companies pose in their portfolio selections, and policymakers began taking the threat of tech power more seriously, educating themselves, calling company heads to testify, and formulating regulation that in some cases tries to balance the benefits of technology with human rights. Last year also saw the launch of The Markup, an investigative journalism nonprofit dedicated to watching big tech by building tools like Citizen Browser, which, with users' permission, can monitor across multiple accounts which content, groups, and ads are amplified. And Consumer Reports, the leading consumer advocacy nonprofit in the U.S.,
redoubled its efforts to integrate its Digital Standard, a framework, based in part on RDR's methodology, for evaluating how well technologies respect consumers' interests and needs, into its product testing of digital devices. At the end of my introduction to this year's index, I proposed that if tech companies do not want to tell the world how they work, how they profit, and how they will factor the public interest into their bottom line, we will force their hand. We must find a way to govern tech collectively for the benefit of our societies, and we cannot afford to let tech govern us. And with that, I'd like to turn now to our panel discussion to talk about this growing movement, with both longtime leaders in the field of consumer rights and corporate accountability and upstarts starting to tackle this challenge with new perspectives and new tools. We're also privileged, and I think she's very brave, to have a representative from MTN Group; the South African telco made the greatest overall improvement of any company in this year's index, and I think that deserves attention. So let me close this part and introduce you to Marta Tellado, Nabiha Syed, and Marina Madale. I'll give you a little bit of information about their bios, and then we'll launch into the discussion. Marta Tellado is the president and CEO of Consumer Reports. She leads America's foremost consumer organization, an independent nonprofit that works side by side with consumers to advance truth, transparency, and fairness in the marketplace. Since joining CR in the fall of 2014, Tellado has transformed one of America's most trusted brands and iconic social enterprises, uniting its rigorous research, consumer insights, award-winning journalism, and policy expertise to drive social impact. Marta came to CR following 25 years of experience that included executive roles in public service, philanthropy, and mission-driven nonprofit management.
At the Ford Foundation, she was vice president for global communications and an officer of the board, and while there she led strategic communications and advocacy on a range of issues in the US and around the world, including economic fairness, free and fair access to an open internet, and civil rights. Next, Nabiha Syed is the president of The Markup, a new investigative journalism startup that explores how powerful actors use technology to reshape society. Previously, she was vice president and associate general counsel at BuzzFeed. Nabiha has been described as one of the best emerging free speech lawyers by Forbes magazine. Prior to BuzzFeed, she co-founded the nation's first media access law clinic, currently in its 10th year of operation at Yale Law School, and she served as a First Amendment fellow at The New York Times. Nabiha was an associate at Levine Sullivan Koch & Schulz, a leading media law firm, and she has worked on legal access issues at Guantanamo Bay, represented asylum seekers in South Texas, counseled on whether to publish hacked materials, and spoken about misinformation at the inaugural Obama Foundation Summit. For her work, Nabiha was named a 40 Under 40 rising star by the New York Law Journal in 2016. And Marina Madale is the general manager for sustainability and shared value at MTN. She is responsible for setting the strategic direction for the management of sustainability and shared value across the group, comprising 21 markets in Africa and the Middle East. Marina played an instrumental role in pioneering the first-ever pan-African transparency report, a key milestone in positioning MTN as an emerging-markets leader in sustainability. She has worked across multiple sectors globally, including oil and gas, energy, banking, and property development, and across multiple countries, such as Qatar, Mozambique, Botswana, Gabon, South Africa, and Australia.
So with that, I first would like to ask the three of you for some of your overall reflections on the presentation, or on the findings from the research, and what you think they're saying about where we are as societies and democracies and the role of corporate accountability. What struck you, what stands out, and was there anything that surprised you or anything that particularly reinforces or informs aspects of your own work? Let me start with Marta, if that sounds good. Hi Jessica, and hi to my fellow panelists, it's great to be here. A remarkable piece of work you just walked us through, really. You are pioneers; this is foundational to what all of us have to do. And I would just like to say it's an honor for me to be with my fellow panelists, because what you're seeing is really a new frontier of human rights, digital rights, consumer rights, and a new ecosystem of organizations that have to do the work. When you see the scale of the problem, you can't help but realize it's going to take many of us, and many of us from different directions, be it human rights, civil rights, or consumer rights, to really attack this problem and to bend the marketplace in the direction of putting people before profits. So that's an enormous amount of work, but it is foundational. There were a number of things that struck me. I'd say big kudos for including algorithms; that was just a game changer. I thought it added another layer of complexity and richness to the data, and, as we know from our own research, it's so important to begin to try to move the market in that direction. And how do you do that when they're not transparent? I'm really proud to have partnered with The Markup in their first investigation of the algorithms driving something like car insurance, really getting behind that and seeing that it has nothing to do with what you think it has to do
with, your driving record, but with the color of your skin, your neighborhood, your job. So all the discriminatory challenges that we faced in the hardware world are now part of this new algorithmic environment. It is not objective in any way; it really reflects some of the internal biases of the very people in companies that are constructing these systems. So, lots to talk about in the algorithmic space. The second thing I would say is: wow, Amazon, rock bottom. I guess it shouldn't surprise me, but I wanted to think better of them, because as we think about the environment we're in now, we're all online, we're shopping online, and what an incredible shot in the arm this has been for a company like Amazon. When you think about it, 40 cents of every American dollar during the Christmas holidays went to Amazon. Where's the reciprocity? Where's the recognition that consumers are driving their bottom line? That, to me, is remarkable. And then of course there's the work so many of my colleagues are doing on the way you track the profitability of hate speech and division, that we would be seeing companies encouraging that because it drives revenue, and really how corrosive that is to free speech and human rights. So I would just conclude, and pass it on to my colleagues, that I think what this shows is something that Consumer Reports has been working on for a very long time. We've been around for 85 years; I hope most of you think about us when you look for a product or service in the marketplace. And as proud as we are of those 85 years of service to consumers, we are now on the next frontier, as I said, of rights, because we have digital connected products. That's why we stood up the Digital Lab: because it's so important, and because we maintain that benchmarking does change behavior, and it does move and shape the marketplace. And that's what this enterprise is really about. It is a theory of
change that we have been squarely behind with you for a very long time. But now we've got to figure out what the tools are, because, as you say, there is no transparency here. How do you do that in a world where software is driving so many of the harms that we can't see? So I'll stop there, but I'd love to hear from my colleagues. Great. Nabiha, would you like to go next? Sure, I would love to jump in with the praise: this report is tremendous. I want to underscore what Marta said. Including targeted advertising and algorithmic decision making here as a focus was inspired, and the subject of much shock for me. From where I sit, those are topics that folks have been talking about for the last couple of years, and so to see that not a single company in the index assessed expression or privacy risks related to those categories... honestly, I didn't think it was going to be great, but I thought there would be one in the portfolio, and I think that indicated just how much work we have to do. Charitably, I might say that companies in this space don't know what disclosure transparency looks like, so we have to figure out what framework we want. Is it something like generally accepted algorithmic auditing principles, much like the principles we have in the accounting sphere? Is it something else? What does that look like? I think there's room for imagination and demands in that space. Cynically, I would say that targeted advertising and algorithmic decision making are so core to the business models of many of the companies in the index that we're not going to find out a lot unless it's mandated. Either way, whether you take the charitable take or the cynical take, or other takes I'm sure we'll get into in this panel, I think we're seeing that the limits of voluntary disclosure are pretty clear. And the avenue I'm excited about going forward is: what does regulatory intervention look like? I think a lot of people are thinking about
that space, and there are a number of tools at regulators' disposal to figure out that way forward. At The Markup we focus a lot on external, independent, third-party monitoring, because we think that no matter what corporations might voluntarily share, there is room for that independent assessment, because of the different outside perspective it provides. The third piece, and this is where being on a panel with a brand like Consumer Reports is so uniquely exciting, is that the way forward in educating consumers, in moving the marketplace and saying this isn't a set of foregone conclusions, there is something that can be done here, because you deserve more, is a really important opportunity and moment. I think consumers are ready to hear that; they have brands that they trust that are helping educate them in that realm. All three of these pieces working together will help move us forward, but I'm very excited to hear about RDR's experience moving things forward in other realms. I was also heartened to see that in 2017, when you asked companies about data breaches, only three responded saying, here's our policy and procedure, and in 2020 you had 13, which is making some headway. So I think we can also look to your learnings, because you've been in this space doing the rigorous work for so long, to see what levers of change work, what might not, and who else we need to go to. So thank you again for a great report. Thank you. And I would be remiss not to talk a little bit about the algorithms and targeted advertising indicators. That was a process that took about two years, primarily led by Nathalie Maréchal, working with our research director Amy Brouillette and others on the research team. It was again a prescient moment, I think, I wasn't at RDR at the time, but to sort
of identify that we needed to develop indicators around those tools, and then to take the time; there was significant stakeholder engagement, including with companies, in the development of those indicators. So, Marina, thank you so much for being on the panel. As I said before, we're delighted to have you; it's probably quite brave being a company representative on a panel like this. But MTN was the most improved company this year, and so it's a great example of what we can do when there's a will. I'd love to hear some of your reflections on the index, but also on the process of being evaluated like this, and how it has affected or had an impact on your policies and on putting human rights at the center of your work. Thanks Jessica. So, as you mentioned, MTN is operating in emerging markets, very unique, very dynamic, across Africa and the Middle East, and every day we are challenged in terms of how to continuously evolve. As you mentioned earlier, the score gets better when you engage, and RDR has been absolutely phenomenal. The evolution of the panel and of the index is incredible, and definitely algorithms and zero rating, especially when you look at the year that we've had, have been very strong features, and the question becomes: as mobile operators, what role are we playing, are we putting the necessary measures in place? So that really stood out. As we go forward, our challenge to RDR is to think of the effects of artificial intelligence, blockchain, and so forth. It's great work, and I think you'll be able to build on that foundation going forward, because more is happening, technology is advancing with the introduction of 5G and everything, so we continuously need to think of what human rights look like in that space. To commend RDR as an organization: engaging with the team fundamentally helped us to
understand where we're at, what's required, and so forth. One of the key things, where digital human rights are concerned, is that it's a journey; it's not a 100-metre sprint. You have to build blocks and keep building on them, and RDR really helped us with that. That's why you see that we put in the time and the effort and made sure we worked within our multidisciplinary team, because it takes everybody to help protect human rights. That was one of our key learnings out of the process, and part of that evolution is really what led us to continuously enhance our systems, our processes, and our governance, and it was also one of the things that led to us developing our first-ever transparency report, which we're extremely proud of. Thank you. That's fantastic, Marina. I wonder if I could, excuse me, pick on you a little bit more. I'd love to hear more about MTN's approach and how it is making a business case for human rights, both inside and outside the company. Where is this will to align with the standards that have been created, to do a transparency report, and to improve your policies? How is it being driven within the company? You said you have an interdisciplinary team; how does that work? Thank you. We're still learning, but thank you. I think the key to it all is really building knowledge and understanding. Aspects like digital human rights may be seen as restricted to particular departments, or seen as an abstract concept that you read about in the media, or that people approach you with questions about, but largely it comes down to each and every person in an organization. So the fundamental key has been building the understanding of why it is important, and then getting everyone on board to ask: what systems or processes do we need to have in place? And what you find, and what I found also interesting from the research you
presented, was the gap between commitment and implementation. What you find is that a lot of organizations are actually doing a lot, but they don't necessarily disclose it, and I think that's the heart and the importance of RDR: really looking and saying, okay, have we disclosed this? Even engaging with the team, the simplicity of how you put the information is fundamental. We had quite a lot of discussions with Jan and the team just around how many clicks it takes to find your terms of service, and you really learn a lot from that: make it easier for the consumers, make it easier for your stakeholders. So it definitely takes considered effort, it takes a multidisciplinary team, and it takes support from the top. I must commend our group president and CEO, Ralph Mupita, who is 100% committed; he spends the time, he engages, and if you get the tone from the top right, everything else comes right. And we found that our stakeholders engage with us on digital human rights quite a lot. They ask us, tell us more, share your policies, and a lot of them have actually reached out and commended us for the transparency reports in particular. Fantastic.
I guess I would like to go to Nabiha, I'll go back the other way this time, and ask you about Citizen Browser. Citizen Browser captures snapshots of what more than 2,000 volunteers see in their Facebook feeds, and you've released data from the browser. I wonder if you could tell us what the report covers, and a little bit more about the technical auditing, because one of the things that RDR is very clear about is that we evaluate what companies say about what they do, what their policy disclosures are, but we don't evaluate actual practice or the actual outcomes of those policies or operational policies. And so we see an opportunity for networks of organizations like ours to work together to fill in the gaps. Technical auditing, the kind of data-driven investigative journalism The Markup is doing, provides yet another data set that, when combined with RDR data, and perhaps consumer data, not consumers' personal data, but data about consumer habits, and Consumer Reports' data, would actually start to map, or tell us, what these companies are doing that they're not telling us themselves. Absolutely, I think that's right. This is a team effort, with everyone coming at transparency in different ways. There are the voluntary disclosures made by the companies, which are great; then there's the evaluation of how well that is encoded in actual policy, which is where places like RDR jump in; and then there are folks like us at The Markup. For those of you who don't know, we are a nonprofit newsroom; tomorrow is actually our one-year anniversary of publishing, so what a wild first year it's been. There's a simple and radical proposition at the heart of the newsroom: the public deserves to know exactly how technology governs their lives and what they can do about it. And to get that full
picture, sometimes you have to look at the system that perpetuates harm, not just the anecdotes that result from it. Because what happens otherwise is this game of whack-a-mole: you identify some harm, you tell the company, they say, oh, I'm so sorry, here's a blog post, and then everyone goes on their way, and no one is persistently monitoring. So we thought, why don't we actually try to build some tools to persistently monitor? This is really the beautiful brainchild of Surya Mattu, one of our engineers, and Julia Angwin, our editor-in-chief, who has been in the realm of algorithmic accountability for so long. They set out to answer a pretty straightforward question: what is Facebook recommending to its users? Just what is it? We should check. And so, as you mentioned, we've got 1,900 panelists who decided to download a browser that we built, which allows them to volunteer their data to us so we can track what Facebook is recommending to this panel. And it has been fascinating, not only for being able to track some of the misinformation that's moving around, but also for giving us the opportunity to note where Facebook did not live up to a promise it made to Congress. It told Congress that it was going to stop recommending political groups in advance of the election and during the transfer of power, and, using our national panel data, two of our reporters saw that 12 of the top 100 groups recommended in December, right after the election, during the transfer of power, which of course we were all very worried about, were explicitly political groups. Worse, they were being targeted to Trump voters, who of course were served a lot of misinformation in this time. So it helps to build these systems, this way to monitor, to just check in real time, in real life: are we all living up to the promises that we're supposed to? And I think that's really important. It's
expensive, it's slow. I always joke that what it cost us to build Citizen Browser is like a dev and a half at Facebook, right? So there's definitely a scale question here of what it means to have a bunch of TIE fighters going up against Death Stars, and you'll have to forgive the early Star Wars reference. But that kind of infrastructure is what we're trying to build, so we can build those tools and have people participate. We also think of that infrastructure in terms of decentralizing the tools so we can put them in the hands of consumers, which I think is a helpful last piece that I'll mention. We built a tool called Blacklight that anyone can use: go to themarkup.org/blacklight, put any website into it, and it'll show you the invisible trade that takes place when you load a page. So if you go to purinadogfood.com, or whatever place you decide to spend your money, you'll see what the canvas fingerprinters are, what the trackers are, what the cookies are, what's going on in that interaction. And I think that's important too, because we want to equip consumers with the ability to call out when something curious is happening. If someone says, I loaded this website, what is going on here, let me reach out to my university and ask them: why is it that the Twitter embed function sends all this information to you? Do you need it? What we're doing is making the public our agents: holding those companies accountable, educating themselves, making good consumer choices. So at The Markup we're thinking about tools and infrastructure as a way to help the accountability game move forward, but we really need folks, we need folks like you, to be able to do it.
I think we all need each other. I'm really intrigued by the scale question as you were describing it, and I couldn't help but think this is one of the things that we also struggle with a lot at RDR. When we first started publishing the index in 2015, we ranked 16 companies, and now we're ranking 26, and the research process is rigorous; it takes a long time, about six months. I think it was from June until October that we did the research for this index last year. And one of my questions is, as you noted, you have Citizen Browser, and it's specifically for Facebook: how do we scale this work? I think one of your solutions, engaging users, is a good one, and it's something that, again, we're also thinking about at RDR: how do we make our methodology into a tool that companies or groups can use to do their own assessments? And so I want to turn to Marta, because Consumer Reports has been doing this for 85 years and has really figured out, at least to some extent, some of the scaling issues of this kind of work, and I wonder what we can learn from you in this new digital corporate accountability environment. Well, I want to stay on this Star Wars metaphor and say that I think we've established that there is a disturbance in the force, and there's no question, right? So we've got these scrappy pilots here trying to figure this out, and boy, the scale question is tremendous. I want to break it apart into two pieces. I want to touch on some of the amazing comments that Marina made around culture, really, and trust. For research and data geeks like us, I don't think those two concepts should be dismissed: this notion of corporate culture and trust, and how consumers really can define the strength of a brand by how much they trust it. Those are important concepts. But to go to the scale question: the kind of investigative data journalism that
The Markup is doing, and that CR is doing, is expensive. It does require resources to be able to go down these rabbit holes and put the pieces together in a very scientific, data-driven way, in a nonpartisan way, and in an independent way, because at the end of the day that is a real strength of ours: we have to go to decision makers and to corporate giants with impeccable credibility about our data and our research. The comparative testing, when we do testing, it's not, does this product feel better than the other, do you like this color versus that. It is not about that. It's taking you behind the scenes on issues of safety, control, privacy, and security. The scale and the complexity of these products is much greater than what we've seen in the past, and so we've got to figure out how we're going to do some of that. Part of what we did in the Digital Lab was open ourselves up, with an open framework, so other partners can come to that framework and help us define how we create standards. Because what was so fascinating: the day we announced our Digital Standard, a number of corporations immediately called us and said, what are you doing, how are you doing it, what are the standards, how do we meet them, how high do I need to jump to be number one? So, you know, we've been at it 85 years. There is an enormous amount of consumer power that can be a catalyst for change. Consumers are a sleeping giant: we have to educate them, but they also need the tools, and that's that kind of innovation and capability. We've developed some privacy tools as well to help consumers, but again, it's that scale. So yes, we would like to see, when you get your TV and you bring it home, that it is not recording what you are doing, and that the burden is not all on consumers. With these products, when you bring them into your home, what do you have to do to secure your home? It shouldn't be that way. They should
be designed with consumer preferences in mind. And one of the strengths I think we can continue to draw on is that we are in the marketplace every day, serving millions of consumers: what are consumers' needs, and we need companies to be responsive to those needs. And that takes me to Marina and MTN. Boy, I wish we had more American companies that were modeling the kind of culture you're describing, because it is that culture that creates change. So I don't want to over-rotate, because, as I said, we're data geeks, but this notion that a combination of culture and leadership, along with the data and the consumer needs that have to be met, is also a game changer. So I'm really glad that you raised that. And then I'll just say a little bit about trust again. We know, by looking at data over time, that there is a lack of trust, and it's declining rapidly, in our political institutions, in corporations, and in our elected officials. And now we're seeing more lack of trust in the platforms. That's a turning point. I think we've been doing a lot of educating; we have been seeing a lot of hacking, the reporting The Markup is doing, companies that are trying to set new standards. I think we're at a crossroads here where we need to really turn the corner on this, and we're seeing it. We saw it just last night in California with the net neutrality ruling that came out of the courts; we saw it in California as well with the consumer privacy laws. But all of this has to be put to the test. And so I'll answer your question, Jessica: what have we learned? Well, not only do products and services have to be put to the test, so do these government laws. So how do you do that? One of the things we did was ask consumers to go ahead and test that: as wonderful as California's privacy law is, and as proud as we are to see it, just how easy is it to opt out? And what we found with consumer volunteers who
actually actively tried to take control of their privacy was that 62% of them said this was almost impossible: we couldn't figure it out, we couldn't even get the company to confirm that what I just did took effect. So, a total lack of customer service, not putting consumers first. I think it's all of those things. We have to engage consumers; they're a sleeping giant. The ecosystem is doing a terrific job, but we know that for some of these companies, self-policing is not a game changer. So we've got to address the market failure and the government failure, and we have to recognize the power that consumers have, and figure out how we begin to really work together to make that happen, whether it's the readership of The Markup, or companies that are modeling that, yes, this can be done. So I think we're at a really important moment. I knew I would do it once in the session; you would think after a year I would know how to unmute properly. But thank you, Marta, that's a really eloquent discussion of where we are and what the space is. I'm wondering, from Nabiha: what does your engagement with companies look like, and what kinds of impact are you seeing from The Markup and Citizen Browser? It's a great question. So, by virtue of being journalists, we maintain this external independence; independence is our guiding star, always. We do engage with companies: we go to them and say, here are our findings, what do you have to say? We're not in the business of gotcha journalism; if there's a countervailing explanation for this set of facts, we want to hear it, and that's a core part of what we call our Markup method. We want to bulletproof our results before we share them with our readers. But beyond that, beyond hearing what they have to say and incorporating it, given that we have been sharing that with the public, there isn't a lot of
cooperation; rather, we serve the consumers, we serve the public, we serve our readers, and we're going to audit from the outside. We can be the barbarians at the gate; we're not necessarily going to be on the inside working, understanding that all of these are different theories of change. Now, taking off my Markup hat and putting on my curious-lawyer hat, I am fascinated to see whether more companies actually ask for some form of regulation. I think specifically about one op-ed in The New York Times by Jeff Glueck, who was at the time the CEO of Foursquare, a location data company that collects data from people checking into different restaurants and locations. He said, we need to regulate location data, and here's why: the game of trying to figure out compliance in a million different jurisdictions is just too much; if we had clear standards, there would be a business case right there. And of course there are all sorts of downstream questions: what role do lobbyists play, do the regulations actually have the teeth they need? We can get to that. But I'm really curious: as Marta said, we're in this very special moment, and as Marina's presence on the panel and her great remarks indicate, we have companies who want to do something. Whether we're going to see that sort of voluntary "can someone help us, give us guidance" happen in this moment, I think that's a really interesting thing to keep an eye on. I'm very curious to see how that unfolds with the companies, especially those in the RDR index. Thanks. I can't help but ask, given the one-page "we support regulation" ad from Facebook in The New York Times that was circulating on Twitter: when you hear companies asking for regulation, do you feel that it's only in their self-interest, or is there also room, I guess is what I'm saying, for
common interests there? In terms of, you know, Foursquare saying it's easier for the business case, but is there also a human rights case that they would respond to? And maybe that's a question for Marina as well. I would love to hear Marina's thoughts on this, but what I'd say is, I think it goes to the principle that Marta illustrated: the trust question. If these companies feel that trust in them, and the success that flows from it, is predicated on doing the right thing, or at least inching closer to the right thing, then I think there is room for that sort of common interest to evolve. But I think we need to sharpen that proposition, and this is where consumers and users come into play, government maybe nudges things along, corporations play their part, and really a lot can happen out of that interplay. I'm not necessarily entirely cynical about this; I think it is possible. I think it picks up a little bit on the scale question we were talking about: who is lobbying on behalf of the consumers in the way that lobbyists can for many of these corporations, and how do we make sure that plays out in a way that's equitable and fair, and that everyone is at the table in the way they should be? So I'd have that set of questions, but with that, I'm very curious about what Marina's thoughts might be. Thanks. Marina, do you want to take that on? I'll try and add a bit. So, like both Marta and Nabiha, I definitely look at trust as a key factor, especially for us as corporates. If you look, for example, at the Edelman Trust Barometer, you see that trust plays a huge role; you see which stakeholder groups are most trusted, and corporates are still trusted to a degree, but that trust is starting to erode. So we definitely need to look at mechanisms, and I think it really is about listening and being responsive; that's part of how you build trust. Ourselves at MTN, we run an annual
reputation index survey, and we've been looking at some of the results of that survey. When you listen to our stakeholders, you find that we look at various parameters: reputation, relationship health, and trust, and we understand that there are different dynamics to each. But at the end of the day it's about how we respond as corporates, because communities and various stakeholders still hold a lot of trust in us, and we have a responsibility. So those are my key thoughts on that. Fantastic. Since we're about 15 minutes away from the end of the session, we're starting to get some questions from the audience rolling in, and Marina, the first question is for you. I suppose I'm supposed to say who it's coming from, because his name is here: from Peter Micek at Access Now, who I'm sure you know. Are there updates to MTN's human rights policy or implementation processes coming, including on substantive issues like internet shutdowns? What can we expect as MTN explores this new territory, and can you connect it to any lessons learned from your transparency report and best practices for other companies who might be watching you closely? Sure. So, Peter we definitely know very well, and we look forward to keeping up that engagement. What we find is that, as I mentioned, there are dynamics in our markets, so updating our policies is an annual, continuous thing. I think Peter will be very happy to see, and you can go to our website right now, the 2020 revised policy, and the goal is to update it absolutely every year, especially when you look at the transparency report. We learned a lot from it: we picked up that, yes, there were instances of freedom-of-expression-related internet shutdowns and so forth, but what we found was that when you looked at the data in terms of volume, it was a lot less. So we definitely continuously monitor, and I think that's part of instilling and implementing our due
diligence framework, which is something we work on with every single market. Very soon after we signed off on the policy, we actually had a few cases where we had to go step by step with the markets to make sure that we followed due process. Some of our observations on the transparency report: we learned that both citizens and governments are actually exercising their rights. If I remember the research, at least 60% of the requests that came in from civilians were for civil litigation cases, so people are contacting mobile operators wanting to get information and so forth, and the other 40% is really for their own personal use, whether they're applying for visas and so forth. So you find that a lot of these requests fall predominantly within that space. As to what we are learning: it is a journey, as I mentioned earlier. With the transparency reports now concluded, we continue to evolve and take stakeholder feedback, which has been really useful. We appreciate the comments from Access Now, from RDR, and from many, many other stakeholders who have reached out, and we are literally taking each of those comments on board. We continue to engage very openly, and we are also planning to venture into a digital human rights impact assessment as our next step. I think that's the key for any mobile operator out there: you need to be clear on the reality of your operating environment, and you need to take into consideration the regulations, the trade-offs, and sometimes the balance between international standards and local regulations and laws. We got a lot of questions around Sudan in particular last year; many stakeholders commented, "but you're silent," and they didn't realize how much was happening in the background.

We're very pleased that we were able to share that case study last year. Things people didn't know included that MTN was the last to shut down and the very first to come back on, to the point where customers were talking about it. Most importantly, what most people don't know is that we worked in the background to try and reduce the time period and engaged different stakeholders. There's a lot you sometimes can't speak about in the moment, but then, when you come to report, you get your stakeholders around the room and find a way to do it that's safe for everyone involved. So it's a journey, and everyone just needs to take it a step at a time and continuously improve; that's why the RDR Index and stakeholder feedback are absolutely fundamental to continuous improvement.

Thank you so much, Marina, that was really enlightening. Our audience is being very kind and has questions for both Marta and Nabiha, one each. Marta, the question to you is: do you think consumer preferences are not being taken into account by companies because customer data is at the heart of their business model? And do you think that customers can successfully push back if the sleeping giant wakes up? Do we have enough consumer power? I guess that's the question.

Yes, that's a great question. I'll take the consumer preference one first. When you have monopolies, consumer preference is not at the top of your checklist, right? Because if consumers lack choice, then that lever is difficult for consumers to pull. So I think that is one of the ways in which we've seen real government regulatory failures: we have got Goliaths and many Davids, and this is only going to work if we all work in tandem. We do not have an agile government, and I'm speaking in the American context; of course, when you think globally, it's really
challenging. Our government didn't think about this in the analog world, where it was already slow, and in the digital world there's so much blowing past that consumers are not being factored into. These monopolies are writing the preferences, and when you don't have much choice about whether to get off Facebook or Amazon, it's very difficult for a consumer to feel empowered. The second half of that question, remind me, was: do you think that consumers can successfully push back if the sleeping giant wakes up? Absolutely, and they have historically. Think about the environmental movement, when chemicals were being spewed every which way. All of us were really young kids then, but this was happening: products you could not see, could not feel or touch, that you inhaled, were miraculously the basis of so many things that were seeping in. We have a very different kind of lack of transparency now. We're in a digital world where all of these products, and the ways in which we have become the product, are not transparent to us. So again we've got to show and help consumers, and work with consumers, not just for them, to really understand: how does this impact your everyday life? How is it shaping the choices, the things you see on the internet, what's blocked from you, where you have absolutely no agency over things you cannot even control? Things that are mandated, like going out and getting car insurance, where there's an algorithm driving that; you need a mortgage, you need a student loan, and what are the algorithms driving those? We even uncovered medical algorithms that block certain patients who are designated for a transplant. How do you feel empowered by things you cannot see? I think it's a very interesting parallel to the chemicals that were seeping into our society as well.

So what I really want to stress is this notion that consumers, government, and the corporates together have to shape our future, and right now people have not been at that table, right? We know that our government is not working; we know it lacks some fundamentals around how to address this new breed of titans in the market that provide neither choice nor competition. We have always been for a healthy marketplace, so I think we're tired of waiting for the product and then dinging it when it fails. That's yesterday's way of accountability. The future of accountability is being at the table and having upstream impact, not waiting until a digital product comes on the market, because by then it's too late. So I really feel the way we're going to have power in the future is being at the table, having voice, having choice, but also having power. We've got to do our job to reveal the truth and grow consumer power if we're going to have impact.

Fantastic, thank you. And Nabiha, the question for you is: what can journalism more broadly do to support citizens and consumers in holding corporate power to account? So, part of the equation Marta was just talking about: what is journalism's role in that process?

I want to stand up and put my fists in the air after Marta's rousing framing of it, so I'll pick up on those threads. I think for the public to be at the table, journalists go into the room and turn the light on, right? There's a lot we don't know, and I think about our work at The Markup in these categories. There's the tech you know: you know you should probably know a little more about Facebook's content moderation, or why Amazon is serving the same sweatpants to me over and over again, and who's profiting from them, what's happening here. But there's also the tech you don't know, and so I think about our launch story, exactly one year ago, with Consumer Reports, about car
insurance algorithms. There are tenant screening algorithms, as Marta mentioned; there are the algorithms and the tech you don't know, and there is just so much to be done on that frontier: to turn on the light and say, this is what's happening, is this the world that you want? If not, I really believe we can rouse the public to say, we demand better, right? And, as Marta put it more eloquently than I could, we've done it in the past. I'm so bullish on what that type of people power is capable of doing in this moment. I would say it's consumers having choices and also being at the table. There's a really interesting dimension at this moment, too, of what's happening inside companies: think about Google and what their workers are saying, like, we don't want to work on this, we don't want to do that, right? That's a really interesting slice of people power happening in this moment, and I would say those workers are also at the table in an interesting way. So I think journalism's job is to show what's happening, and to remind people that a different world is possible. I think a lot about the role of imagination in helping people imagine a different future; it doesn't have to be this way, right? I was reading a great Brookings paper about algorithmic accountability that came out last week, and it referenced one of the regulators of the railroads a hundred years back, who started off with transparency: hey, railroads, we need to understand more about what you're doing here. Transparency is that step toward regulatory change, or whatever form of change it is; you first have to know what the problem is before you can do anything about it, and that's what journalists do a tremendously good job of. Now, it is an industry in great turmoil, so if there are journalists and outlets that you love, please do support them, because they're doing really tough work, the work that we all need to do together.

Thank you, Nabiha. We're at the hour, so instead of a round-the-horn wrap-up, I'll just try to highlight some of the key points, because you're all so inspirational. One is openness: the open-source methodology of The Markup, of RDR, of Consumer Reports' Digital Lab, and also MTN's transparency reports, and your openness about your process, Marina, all belong in any movement towards more corporate accountability. Corporate culture is a big driver of change; I'm hearing trust, listening, and being responsive. There are multiple theories of change in our space, and we need to knit them together to create the force and power we're talking about, and to equalize some of the asymmetry we're experiencing right now. Another is that we all face challenges of scale; I'm sure the cost of doing transparency work the way RDR, Consumer Reports, and The Markup do it is quite high, so hopefully our funders are listening. And because we don't take corporate funding at RDR, and I don't know the exact policies of CR and The Markup, in order to preserve that independence we also have to make some compromises and, quote-unquote, sacrifices. Regulation is coming, and consumers, users, and the people affected by all of these tools and technologies need to be at the table. Finally, I really like the idea of the tech you know and the tech you don't know: the algorithms, the big-data processing, and the targeted advertising are the tech we don't know and need to understand better, and I think the RDR Index does a really good job, if I can say that, of illuminating that, and just how much we're not being told. So with that I'll close. All of the panelists, Marta, Nabiha, and
Marina: thank you so much for your engagement, participation, observations, and insights. Thank you to our audience for staying, and thank you to the RDR team and New America for helping us launch the 2020 RDR Index. I hope you'll all visit it online. Thank you.