So many people have been speaking about the climate crisis, so many things have been put on paper. But the real question is: why is it that we're still not acting at the scale and speed that is necessary? The extreme weather events that scientists have long connected to the climate crisis are becoming far more frequent and far more destructive. For 150 years we built up a world based on the assumption that we can exploit the planet for free, and it translates into very dramatic impacts happening right as we speak. The climate crisis is a threat multiplier, which means it exacerbates existing inequities in our society. The impacts are felt most deeply by Black, Indigenous and communities of color. We're living through an explosion of inequality. We need to remember we're on the same planet, and this is the planet that we need to make sustainable for the whole of humanity. Climate change is impacting food security as well as political stability in many nations around the world. Five years ago there were 80 million people marching towards starvation. That number jumped to 135 million. What caused the jump? It was man-made conflict, like in Ukraine, compounded with climate shocks. No one is as vulnerable to climate change as farmers are. If you talk transformation, the first thing they want to know is: what must I do on my farm? We know that in this transition we require fast adoption of a lot of new technologies, and the question today is how to find the appropriate way to fund this technology. To put a number around it, it's an extra two and a half to three trillion dollars a year of additional finance that we have to find in order to get those emissions down. Financial institutions have a lot of roles to play, to bring the advice and provide the financing to make these transitions happen. Younger generations are demanding a sense of purpose. They want to look at companies and say: I am investing with you for this reason.
With the upcoming two COPs taking place in Africa and the Middle East, we have this tremendous opportunity to put emerging markets at the forefront of our collective response to climate change. International trade has to be part of the solution. How do we all get together to talk about a global carbon price that can guide us and help us decarbonize the world? The solutions are there. What we need is governments to regulate, to invest, and we need business to act with values. History will look at us: people, politicians, corporate leaders. These times require not only solutions but speed. There is nowhere else to look than the mirror. We are the ones that need to do this.

Hello and welcome from Manhattan. I am Adrian Monck at the World Economic Forum, and this session, Tackling Disinformation, is part of our Sustainable Development Impact Meetings 2022. Thanks for joining us. I am delighted to say that for this panel we have a fantastic group of experts. With me here in the studio we have Melissa Fleming, Under-Secretary-General for Communications at the United Nations, formerly in charge of all the comms for UNHCR and a very distinguished communicator and podcaster. I am also joined down the line by Rachel Smolkin, who is Senior Vice President at CNN, in charge of the network's digital output, and by one of the leading experts in disinformation, Claire Wardle, Professor at Brown University; Claire also advises many organizations on what they should do to combat disinformation. To start our session, Melissa, I wanted to turn to you. When you took over running communications for the United Nations in 2019, I imagine you had on your agenda the Sustainable Development Goals and a lot of the big global issues that the UN tackles. I'm guessing that you probably didn't have fighting global disinformation on the list of things that you'd be looking at. How did that become part of your mission?

You're right.
It was certainly kind of bubbling in the background as a phenomenon, but it really exploded with COVID-19. But I think it exploded also in our own awareness of the phenomenon and the problem. Because as the social media platforms became so dominant, there was already a proliferation of misinformation that was making what we were trying to achieve, a better world, a more inclusive, more peaceful and harmonious world, more difficult. But with COVID-19, we realized very quickly we were in a communication crisis unlike any that we had ever been in before, because this was a novel pandemic, and we were asking the public around the world to do things that they were very uncomfortable doing. And there was also so little known. We know, just from looking back at communicating vaccines or communicating all kinds of science, that communicating science is hard, because it's not black and white. It's nuanced. And in this case, the virus was changing. And we're used to, as public health institutions or the UN or WHO, putting out kind of press releases or dry documents in the form of PDFs. And meanwhile, very emotive content was going out, people expressing their fears. And people who were very active in the anti-vaccine scene and others were seizing the opportunity of people being so afraid, and injecting disinformation and misleading information, fueled also by some leaders and governments. So it was a cacophony of information. WHO called it an infodemic, which meant that if you were a user and you were trying to search, you were just confused, because there was so much information, some of it good, some of it eh, and some of it really, really bad. So yes, we started to study the phenomenon. And I know Claire Wardle has seen this as well, so maybe she can comment, and then we can speak a little bit about what we did and how we changed course and beefed up our operations to try to address it. Yeah.
I'd love to hear a bit more about that. But Claire, I mean, you've been studying disinformation for probably longer than almost anyone in the academic and communications world. Give us a sense of how big a problem we're facing. This is something that a lot of people in the media talk about. It's something that you hear about, you know, on shows dedicated to examining the media world. But is it something that really everyday citizens ought to be concerned about, or is it something that's small enough to be contained and, you know, we should put it in some perspective?

It is something that people should be concerned about. And I often say, we can't put everything back into a box; fundamentally, our information environment will always be polluted in different ways. And actually, as citizens, we have to learn how to navigate polluted information environments. And like pollution, some of it is by people who are trying to make money, and they don't care that they're polluting the environment. Some people are doing it for political gain. So disinformation, which is people deliberately creating false information to cause harm: the number of people doing that is relatively small, but they're pretty good at it, because they're learning and they adapt and they evolve. The bigger problem is the fact that many citizens, all of us as humans, are susceptible. So as Melissa said, one of the reasons that the pandemic was such a difficult time was because we were all fearful. All of our lives were turned upside down. And when, as humans, we are frightened, we're scared, we're unsure, our critical functions don't work as effectively. And so what that means is that we're all sharing information with each other, often trying to be helpful, but not understanding that actually it's false and it's going to lead to harm.
So this panel is called Tackling Disinformation, but really it's all forms of polluted information that we should be aware of and take different steps against. Stopping Russian actors who are deliberately trying to destabilise a country requires certain actions; preventing Uncle Bob from sharing misleading information at the Thanksgiving table is a different set of responses. So it's all of these different things that we need to learn and understand in order for us to try and mitigate the harms of polluted information.

Thanks for that. And I want to bring Rachel in here. I mean, Rachel, you have a huge portfolio at CNN, but, you know, you're in a position where CNN is both an organisation that's trying to make sense of the world and trying to establish the facts, and it's also part of a political war over who owns the narrative. And you're also now facing these folks that Claire just alluded to, who are in the professional business of fabricating disinformation, making your life even harder. How do you and your colleagues go about looking at that disinformation environment? Is this something that you've become used to, or is it something that you're still learning to navigate and make your way through?

Claire did such a good job with the framing of this as polluted information. I find that a really helpful way to think about it, because as we navigate this environment, the information being put out that is deliberately wrong is all mixed together with information that is wrong, misleading, dangerous, no matter what the original intent of the information is. So we are navigating this in different ways and in different spaces. We've been very much in the space around the 2020 U.S.
presidential election, the claims of a stolen election, the false claims, the whole Stop the Steal movement, where we've been really pressing to give audiences the facts, because the strength of our democracy is in our institutions and in public trust in those institutions. It becomes very complicated when there is not an agreed-upon set of facts and a single narrative. There are the facts, and in the media, it's our job to continue to point those out, no matter what the politicization around it is. The other challenge, and Claire and Melissa both got at it, is that we're no longer in one confined space. This is not just happening in the political space. It has been a huge issue during the pandemic. We've seen it a lot around the vaccines, and false claims and misinformation about the safety of COVID vaccines can scare people off. There are real health ramifications for this, and it can be quite dangerous to people who are taking in that polluted information, whether it is disinformation or misinformation. We're seeing it in the abortion space now, also a very potentially dangerous space for women, who are being told things on the Internet that are just not factually correct, not scientifically sound. We've seen it in the Russia-Ukraine war, with Russia spreading disinformation and then governments such as China picking that up. So we're really seeing it in so many different spaces and need to think about how best to get the facts out to audiences, and how to hold actors accountable and call out what's wrong without further spreading it ourselves. So that is always a balance that we have to be mindful of.

Yeah, I want to come back to you, Melissa, because you talked at the beginning about how you'd started to tackle this. What was the toolkit that you came up with to try and start to detoxify or depollute this information sphere?

Travel to where the disinformation also travels.
Claire has actually spoken about data gaps: we need to find where people are searching and get there first. But not with a kind of boring 50-page document; rather, content that is produced in an engaging form, that travels well digitally and works on social media, and also in many languages. We deployed our country offices all over the world to take the basic messaging, which really didn't change that much, on health guidance and on the efficacy and the safety of the vaccines, and produce content in such a way that it is locally relevant and travels in digital spaces, but is also in languages that people understand and in contexts that make sense. So we really took a lot of guidance from our local teams: what was trending there? Central messaging from Geneva or from New York isn't going to work for everyone. Another really key strategy that we had was to deploy influencers, influencers who have huge followings but were really keen to help carry messages that were going to serve their communities. And they were much more trusted than the United Nations telling people something from New York City headquarters. And finally, we had another trusted-messenger project, which was called Team Halo, where we trained scientists around the world, and some doctors, on TikTok, and we had TikTok working with us. And these scientists, who had virtually no following to start with, got verified ticks. They started bringing people in their community into their labs, into their offices, answering their questions, engaging with them. It really took off, and many of them became kind of like national media go-to advisors. So it was a layered deployment of ideas and tactics. But finally, and Claire also mentioned this, people need to be inoculated themselves.
And I think, you know, social media took off so quickly that people of all ages are very ill-equipped, especially in times of crisis, when they're feeling very engaged with what's out there, searching and wanting to help and wanting to share, to actually spot mis- and disinformation and to avoid being part of the problem. Finally, and none of us have mentioned this yet, we really think the platforms bear a huge responsibility and accountability, much more than they're exercising. They did step up. They did provide, you know, ad credits. They did take down quite a bit. But the phenomenon was still exploding on their platforms, and still is. And I'm just talking about COVID. I mean, Rachel talked also about conflict. There's the Ukraine war. We're seeing the phenomenon of hate speech that is making wars worse, that is actually fueling conflicts. And these are all, you know, phenomena that have always existed. It's just that they now have a distribution possibility that is so much more powerful than the means they had before the digital age.

That's really interesting. Because, you know, for people who don't understand how we got into this situation, perhaps we need to just take a step back. When social media and internet communication started, it wasn't the situation you had when the printing press rolled off, you know, when the founding fathers in the United States were battling with people publishing all kinds of scurrilous rags, and libel laws and other kinds of free speech rules came into being. The platforms were basically given a dispensation to say that anything that appears on your platform, as that word came to be, not publication but platform, is not content that you're responsible for. And that has been a kind of fundamental factor in the growth of these platforms. Because that's not true, is it, Rachel, for CNN?
You know, if you put out a news report, you can't turn around to the world and say, well, sorry, that's nothing to do with the editorial process here at CNN, it just happens to be unlucky, it was someone's opinion. So, Claire, can you just take us back a little bit, and maybe, Rachel, also just give us an idea of some of the ways that you actually professionally manage the information and the checks that you do, because I think it's quite important for people to understand both of those sides of it. There is real process here, and there's also some real structure to why we are where we are. Claire.

Yeah, all I'll say is that we all got one of these, you know, not that long ago; I think the iPhone was 2007. And none of us got a crash course. We didn't do a driving license for using a phone, and we got all excited about it. But fundamentally, I now have the same power as Rachel. I can create something right now, and if it's amazing, it might have a bigger reach than CNN. It would have to be very good. But nobody was taught how to have that level of responsibility. We didn't talk about it as publishing. We talked about it as posting, as sharing a status update. We didn't say: you have the ability to share information, and if it's false, it can really cause harm. And so I think what you're getting at, Adrian, is that the fundamental change with the internet is this idea that the platforms were just kind of the communications pipelines and there was no responsibility. It is hard to wrap our heads around why Mark Zuckerberg doesn't take responsibility, and he should do. But this is hard, because the absence of gatekeeping that we have on the internet has also allowed all sorts of voices to be heard and to flourish, and movements to develop that, in an age of gatekeeping, we didn't necessarily hear from.
So what we're going through right now, and you're right when you talk about the printing presses, is a revolution of the same scale. And we are, as a set of societies, trying to get through this period of adjustment: what does it mean when everybody has a mouthpiece? What are the norms that mean we can do this in a way that doesn't cause harm, etc., etc.? And there are calls for a change to what's known as Section 230, which would mean the platforms have to take responsibility for that content. But I do worry that if we have a knee-jerk reaction to that, what kind of speech then gets chilled? Or what kind of speech don't we hear? How do we moderate speech at that kind of scale? So I'm not trying to say the platforms don't need to do more. They absolutely do. But what we're seeing is that this technology has also allowed all sorts of magic to happen. And that's what we have to balance: the horrible side of the Internet with the joyous side of the Internet, which, to be fair, is human nature. And we're trying to get through this period now. And that's why, you know, we keep having these conversations, because it's hard and it's complex and it's nuanced. And I think that's what we're trying to balance: the human element of speech and communication with the technical capabilities of the iPhones, the computers, the Internet. And that's why it feels so hard right now, because there's no easy pathway through it, and we're figuring it out as we go.

And Rachel, Claire said there that everyone's got a phone, you can just post. I mean, as a journalist at CNN, you can't just hop onto Twitter and put an opinion out, or share something if you're not really sure it's correct. You actually have processes in place. You have editorial processes that go into making sure that what you put on air and what you share digitally is checked and researched and is stuff that you'll stand by.
Because unlike the platforms, you're under an obligation, aren't you, to make sure that you've gone through a thorough process of verifying what you broadcast and what you put out?

Yes, absolutely. There are many levels of vetting, starting with checking the information itself. I mean, it starts with the reporters, but there are many layers within the organization to vet that, to double-check. Well, yes, we heard this from one person, but have we gone to this other source, this other person, who might give us a more detailed understanding or might tell us that the information we have is wrong? Have we checked it? Have we thought about it from this other angle, from this other perspective? We have layers of editors. We have leaders who are experts in the areas, who think through the information, who connect it to other pieces we've done. We have people in the organization who look specifically at our standards and whether the reporting is meeting our standards. So there are many layers of people who look at these things within CNN, really working to get it right, for the exact reason you're saying, which is that once we put it out, we know we have a powerful platform, and we have a responsibility to serve our audiences with the most accurate information we can give them. When there is an error, I think that part of the process is also very important: to tell audiences exactly what we got wrong and fix it transparently. So we try to make sure we're very rarely in that space, but if we find ourselves in it, it's an important piece of accountability as well to be always as clear with our audiences as we can. And Claire nailed it by saying there are many sides to the internet. We started this discussion by talking about how people were so anxious during COVID and looking for an outlet to share that, to let that out. And that's important, too, as a way to bring those voices out and bring them together.
It's just incumbent on all of us who have platforms to share information to make sure that when there is information that goes to people's health, to their safety, to their understanding of and belief in the fundamental institutions of our government, that information is accurate and correct.

Thanks, Rachel. And Claire and Melissa, I just want to come back to both of you on this, because you talk, Claire, about the joy of the internet and the fact that we do hear so many voices. But we've learned quite a lot, haven't we, in the decades and hundreds of years of history of information and journalism, which is that you do need to have checks. You can't just shout fire in a crowded room. You do need to make sure that you've gone through processes when you're putting out information that could be detrimental to people's health, their well-being, their reputation, all of these things. And yet those lessons seem to have been put to one side entirely in the current situation, and we're all groping a little bit in the dark. I mean, does any of that need to be revisited now? Because we keep having these conversations as if something called the history of journalism, the history of newspapers and television and radio, doesn't exist, as if the internet is so different and so new. And yet what you're describing seems to be very old, which is a problem of people sharing made-up stuff, or even worse, people making up stuff deliberately to undermine other people and having that shared.

Yeah, but I'd say, you know, we're both British. We both have lived in a country with terrible tabloid newspapers. I mean, there are very good news outlets and there are news outlets that are pushing disinformation. There are very good politicians and there are politicians that are really pushing disinformation. So yes, platforms are absolutely part of the problem, but we can't ignore the full information ecosystem.
So we need to, you know... Rachel just did a great job of explaining all of the checks and balances. Many people have no sense of all of those checks and balances in the newsroom. They have this idea that Rachel has a thought and she just puts it out; she just happens to work at CNN, and she's a mouthpiece of the liberal elite. Unfortunately, we're seeing trust in institutions decline because we haven't done a very good job of explaining exactly what Rachel said, which is that what Rachel puts out on CNN is fundamentally different from what my best friend from high school decides to post based on their own experiences and their own reading of a scientific journal article, when they haven't done a research methods class and they're drawing the wrong conclusions from it. We're all in this space. We all have the power to publish. And so a lot of this goes back to teaching: teaching people to understand how to navigate this world and to think critically about all forms of information, whether it comes from Facebook or whether it comes from a news provider. Is that news provider doing the kind of checks that you would expect, so that you can trust that you're reading or consuming credible information? So that's the problem. It's so many elements of, I've talked about, pollution. It's all sorts of pollution coming from all sorts of directions. And so we have to be a bit careful that we're not like, oh, news is the answer, when, unfortunately, globally, there are all sorts of examples where the news is part of the problem.

But I would like to say that there is a crisis in public interest media, particularly in some countries. I mean, there is here too, if you look at the demise of local news and local newspapers, but in many countries around the world, in developing countries, there really is almost an extinction threat to the kind of media that would be that kind of check and balance out there.
And so then Facebook becomes the internet, and affiliated unaccountable portals or kind of fake news organizations spring up to fill that gap and fill that space. So I agree with Claire that we have a polluted information ecosystem that has many parts and many players. But I do think that the combination of the demise of public interest media and the rise of digital alternatives has been dangerous. I mean, we've seen it in the most egregious forms. For example, in Myanmar, I think it's the case that is cited the most: everybody got their cell phones, as Claire put it, and then everybody got Facebook loaded onto their cell phones, and that was their way of entering this incredible new world, without any education on how to navigate it. And then a government made a decision to dehumanize a whole sector of the population, the Rohingya, in such a way that it gave license to kill, license to drive out 700,000 people. And this happened with almost no moderation on the part of Facebook, and no real realization that it was going on on the platform. So it is an ecosystem, and I agree with Claire, it is very complicated. I think it needs to be looked at country by country and addressed in so many different ways: through education, through the bolstering of the kind of media that is going to provide factual, good reporting, so that people have news sources they can trust, and then on the part of the platforms, being more generous with their moderation capacities in countries that are very fragile. And we're seeing this elsewhere: I just recently visited Bosnia and Herzegovina, and there is a proliferation of denial of the Srebrenica genocide and glorification of war criminals. People there were saying this has reached a point where they fear spiraling back into war. And this is driven by this speech, which is traveling online, kind of uncontrolled.
So anyway, I could go on and on about all of the phenomena we're seeing. And finally, our peacekeepers around the world were recently surveyed, and almost half of them said that mis- and disinformation is a real problem for them in keeping the peace. Now, it's true, as Claire said, that in some places this could also be traveling on radio, for example. So it's not just on social media platforms. But I do think that our information ecosystem is a real problem if we want a more stable, peaceful, harmonious and united world.

That's really interesting. And I think there are a couple of things that I'd love to get the views of all three of you on. One is the professionalization of disinformation, which we've really seen in the 2010s: from the experience of Russian denials of involvement in the downing of an airliner carrying hundreds of passengers from Holland to Asia, to the use or non-use of chemical weapons in Syria, and right on until COVID, where we saw what looked like state-sponsored actors engaged in it. And even the creation of these kinds of front television stations, like Russia Today, channeling and broadcasting conspiracies and other kinds of information and acting as a kind of amplification for them. I want to talk about that, but also the other side: the platforms are based, and have their origin, in the US. They come from a very specific place, which is a place of an unrestricted battlefield of speech. And that kind of unrestricted battlefield doesn't exist everywhere, and we sort of know why, for quite good reasons. In Germany, for example, there are rules and regulations about what you can say in relation to the Third Reich, the Holocaust, those kinds of things. In the UK, with the huge issues that the UK went through in the 70s and 80s, there are defenses against racist speech and hate speech. And so you have got restrictions on what people can say and how they say it.
And I wonder what your sense is, the three of you, in terms of, A, what are the lessons to be learned from some of those measures? Or is it the case that the US example is the kind of absolute purest example that needs to be replicated everywhere? And also, if you're dealing with professional disinformation, how do you counter that? So two things I want to look at, but maybe start with the US background, because Claire, you come from both sides: you grew up in Britain, but you work in America. What's your view on the kind of unmitigated right to say whatever you please, wherever you please?

Yeah, I have to say there are many things about the First Amendment that are very, very special. But I do feel that it stops nuanced conversations about speech, and there's this idea of, well, it's a marketplace of ideas, more speech is good speech. But the truth is that algorithms are not unbiased. It's not that every piece of speech is equally weighted. There are certain types of speech, which tend to be more emotive and tend to come from certain people, that get more airspace. So my frustration is, I wish we could talk more about harm when it comes to speech. People say, well, misinformation, it's really legal speech. We know what to do about terrorist content and child sexual abuse imagery; that's illegal speech. But with lots of these examples, people say, Claire, well, that's legal speech. And I keep saying, it might be legal, but if it's leading to harm, can't we actually have a conversation about that? And I think, in the examples you used, Adrian, there are very strong cases of speech leading to very serious harm. And my worry is that we don't think about this problem in a longitudinal way. There was a wonderful New York Times documentary where they actually went back and found footage of KGB spies from the 1980s. And one of them says, it's like drops of water on a rock.
One drop of water doesn't cause any harm, but continuous drops of water will splinter the rock into thousands of pieces. And that's what we're trying to do to the US. Now, they said that in the 1980s, and you could argue that 40 years later, we're really starting to see it happen. So my fear, to your point, Adrian, is that people say, oh, the First Amendment, what kind of harm is this causing? Well, what does this kind of low-level conspiratorial, hateful, misogynistic content that doesn't break platform guidelines do over time? Where is that leading us? So I just wish we could have a more nuanced conversation about speech, because I worry that this idea that more speech is good speech is not really the case. And if you talk to people of color or women, their experiences on the internet look very different from, probably, your experience, Adrian. And so this idea that all speech is equal is not true, and I wish we could just have that conversation properly and talk about the long-term impacts of different types of speech.

Rachel, CNN is a global news provider, and you operate in many different markets with many different types of regulation. What's your perspective on that issue, about the kind of primacy, if you like, of the First Amendment in terms of the global speech environment?

We do operate globally, and so that means that for different countries, there are different guidelines or rules. We see that in particular around things like elections, which are handled very differently from place to place. I think, again, to Claire's point, the discussion to me here is less about whether somebody can say something, whether that's an ugly, offensive comment. I mean, yes, in the US in most cases they can. I think the issue is more about how it is handled, and that brings us back to the platforms, from discussions about algorithms to what the platforms are allowing.
Yes, somebody can stand up and make a comment; that doesn't mean the comment needs to be shared on a platform, and the platforms will set those guidelines. There was just a study yesterday that we wrote about on CNN. It was a NewsGuard study of TikTok which found that in searches for basic information about news stories, nearly 20% of videos in the search results contained misinformation, and that was on everything from the 2020 US presidential election to the Russia-Ukraine war to misinformation about abortion. So there is an issue where a platform is acting as a provider of information, and a young audience is coming to that, looking for information, perhaps not equipped to sort through, not media literate enough to sort through what is and isn't correct, and this is what they're finding. So to me, the discussion really has to stay in that space, not so much what can and can't be said, but how are we handling the information and what are we putting out? What is getting promoted? What is rising to the top, so that when users, and particularly young users, are searching for it, what are they coming across?

I think that's a really important point. And Melissa, I mean, to some extent, what you've just heard from Rachel there is that the speed at which users interact with platforms is so much faster even than the platforms themselves can manage or understand. It seems that when we saw the beginning of these platforms, there was a lot of hope about global communities, about everyone having a chance to share lovely pictures of their families and friends, and none of the kind of discussion about the darker side of what could happen. And TikTok is a new platform, famous for dancing videos and showdowns of people singing and that kind of stuff, but it's also very susceptible to exactly what you've been talking about in the disinformation, misinformation space.
So how do we balance that thing of seeing these platforms suddenly emerge from nowhere and get users, some of them with bad faith, bad intentions, and sometimes state-sponsored, jumping in with this kind of bad content? How do we deal with that, and how do you deal with that at the UN?

I mean, I agree with everything Claire and Rachel said about the phenomenon, and also, I can't remember the exact statistic, but it's an astonishing number of young people who get their news from TikTok and no other place. So with that knowledge, the responsibility that TikTok has is huge. And if there is that much mis- and disinformation traveling on the platform, obviously they need to do more to address it, but also to educate. But I do think we as news organizations, we as institutions, also have a continued responsibility to inform the world about the state of our world, to guide the public. And unfortunately, for example, at the UN, I was astonished to learn from my social media colleagues that we fall under a category called civic institutions, which means we're down-ranked. So our starting point is down here, whereas Joe conspiracy theorist can start here. And so Facebook tries to address this by giving us ad credits so that we can then come back and be at the same place as whoever wants to say anything. But it is an algorithmic shift that was deliberately taken to favor individuals over institutions. And the institutions who are there to serve the public for good are at a disadvantage. We also, though, have to get better at communicating in these spaces. And I think the humans who are running our governments, our public health institutions, also need to be more human in their communications, because that's what functions well on these social media channels. So it is, yeah, it's educating. It's hopefully elevating the content. We partnered with Google, for example.
If you Google climate change, at the top of your search you will get all kinds of UN resources. We started this partnership when we were shocked to see that when we Googled climate change, we were getting incredibly distorted information right at the top. So we're becoming much more proactive. You know, we own the science, and we think that the world should know it, and the platforms themselves also do. But again, it's a huge, huge challenge that I think all sectors of society need to be very active in.

So, it's been a really interesting discussion, and I wanna bring it to a close by just asking each of you. You know, some people watching this are gonna be saying, well, hang on a second. You guys, you're experts, you're institutional, you're mainstream media. You know, you're the people that I'm doing my research to kind of go around, because I don't trust you to deliver on what you say you deliver. You've got an agenda, you're part of these institutions that I didn't vote for, I didn't choose, I didn't pick. I didn't have a say in the editorial policy at CNN. I didn't get to pick the faculty at Brown or vote for, you know, the UN leadership or the World Economic Forum. How do you get to people like that and say, look, you know, you possibly might wanna check a little harder on what you're looking at, or you might wanna think again? Or have we lost some people to this debate? Some people, that's it, they're gone. Claire, you've been doing this probably for longer than anybody. What's your kind of sense check?

Yeah, I don't think we can just say to people "trust us" anymore, because if you look at the disinformation ecosystem, it's actually really participatory. People feel part of something. They feel like they have agency, they feel heard. The ecosystem that we all live in, it's still pretty top-down.
Like, you know, CNN might tell me to tune in at 11 or read this link, or, as you say, Melissa might publish a PDF and assume that I'm just gonna read it and trust it. We on our side need to understand: how can we listen more effectively? How can we be more representative in our newsrooms and faculty of the many, many people who feel like they're not seen and they're not represented? But again, we're not gonna come out of this quickly. So we just have to start on a process and say, how do we build back that trust? And we're gonna have to make people feel like they're part of something. At the moment they don't, but the other side makes them feel part of something, and that's why they're succeeding.

And Rachel, what's your kind of take on that? I mean, do you think we need to kind of rebuild trust? Do we need to rebuild communities of people that we engage with? How is CNN looking at this?

We need to do both; trust is earned. It's a huge responsibility and a privilege to do what we do every day. And we're very mindful of that, of trying to get it right, of trying to serve our audiences, of thinking about what information they need and are looking for and making sure we provide that in different forms, of bringing in different voices and thinking about different communities and how we reach different people in different places. News is consumed very differently than it used to be. So are we thinking about reaching people in new ways, in new places? Are there communities or people's voices we are overlooking that we can do a better job of incorporating into our coverage? These are discussions we have every single day in the newsroom, and we need to keep having them. They're crucial.

Melissa, you've probably been on the front lines of this in the last three years in ways you never expected. How have you looked at this issue?
Do you think we've lost some people to the conspiracy spheres, to the disinformation dispensers, or are there other things we can do to bring people back?

I think there are certain people who've totally got lost down rabbit holes, and they're hopefully gonna find their way out at some point. But I do think there are all kinds of people in the middle, and there is evidence that people are feeling really overwhelmed. They're feeling so much gloom and doom from the news environment as well, even the responsible news environment. So I think we also have a potential, and we're seeing it on our social media channels. The UN has millions of followers, and we put out a lot of messages that are really positive, hopeful; they give people agency, they give people the chance to engage, to take climate action, to sign on to initiatives, and they're taking part. So I think there is a hunger to be a part of something that is not conspiratorial, that is not hateful, that is not divisive, but that is working towards making the world a better place. There is a lot of positivity to be had, and I think we just need to pull people together to provide that kind of incentive and context, in combination with all of the other tools that we discussed here today.

Thanks so much. Well, I hope for those of you watching, you haven't come away with too much gloom and doom from our discussion. A big thank you from my side to Claire Wardle, to Rachel and to Melissa, and to all of you for joining for this session. So stay tuned for more from the Sustainable Development Impact Meetings, but from here in New York, thank you all very much.