The clear and present danger of disinformation is our conversation here this afternoon. It follows a session just now about disrupting distrust, and of course, those are connected. So I hope that's where we can start. I'm Brian Stelter, formerly of CNN, now a fellow at Harvard University. Let me briefly introduce our panelists. And since we're being live-streamed, a reminder that the hashtag is #WEF23, so we can try to put some real information out into the world to make up for all the crazy. First, with me, Věra Jourová, the Vice President for Values and Transparency for the European Commission. Next to her, Congressman Seth Moulton from the U.S. state of Massachusetts, the Sixth District, a Democratic congressman. Congressman Moulton, welcome. Jeannie Bourgault next to him, the president and CEO of Internews. Jeannie, welcome. Thank you. And A.G. Sulzberger, the chairman and publisher of the New York Times, welcome. I think we should start with you, A.G., talking from the newsroom and news-publishing perspective, and then we'll work our way toward some of the political parts of the conversation. How does this discussion of disinformation relate to everything else happening here today in Davos?

Well, first, thanks for having me as part of this conversation. As you can imagine, this is something I really care deeply about. I think if you look at this question of disinformation, it maps basically to every other major challenge that we are grappling with as a society, and particularly the most existential among them. So disinformation, and the broader set of misinformation, conspiracy, propaganda, clickbait, you know, the broader mix of bad information that's corrupting the information ecosystem. What disinformation attacks is trust. And once you see trust decline, what you then see is societies start to fracture. And so you see people fracture along tribal lines, and that immediately undermines pluralism.
And the undermining of pluralism is probably the most dangerous thing that can happen to a democracy. So I really think, if you're spending this week thinking about the health of democracies and democratic erosion, it's really important to work your way back up to where this starts.

And the term fake news, and then disinformation, was popularized six years ago at this point. Where are we today versus then? What do you mean, where are we today versus then? So this was a hot, popular topic. There was an awakening about it. The social networks felt pressure. But now where are we? And the same question for Jeannie. But where are we today?

Oh, I see. Yeah. It's a great question. And to be clear, actually, terms like fake news and enemies of the people have been popularized cyclically in society, and in some of the most repressive and dangerous moments: Nazi Germany, Stalinist Russia. So I think anytime we're hearing language like that applied to a free press, or more broadly to free expression, we should be really worried. Look, I think there's no doubt that society seems to have, at some level, accepted how much the information ecosystem has been poisoned. And I think it's going to require real sustained effort from the platforms, from political leaders, from business leaders, and from consumers themselves to reject that.

Jeannie, how do you see this, especially thinking about it from the different countries where Internews helps make sure news is being produced? Yeah, I think about the traditional view of mis- and disinformation. We often think of information warfare, and we're looking at the Ukraine crisis, where it's been devastating. And even looking at the arc of time: at the beginning of the Ukraine crisis in 2014, the Russian disinformation was trying to sow confusion in Ukraine, and now we see the Russian disinformation machine really sowing a single line inside of Russia for the continuation of the war.
But it seemed very much like a great power struggle that was happening. I think the big trend that we're seeing, and we work with news organizations in a hundred countries around the world trying to support their work, because we believe healthy information environments are so critical to solving the world's problems and communities' problems and countries' problems, the big change that we've seen is really the insidiousness of the multiple levels and layers at which mis- and disinformation is being used. I was just in the Philippines in September, and at literally every level of government, private business, you know, civil society, it's just everywhere. And so navigating it feels different than when it was just the big conversation about fake news. It feels like it's everywhere. And so the solutions are going to be slightly more complicated.

But you'd agree with A.G.'s perspective that at some level it's been accepted, a certain amount of this pollution? I do agree with that. And I think we've always had pollution in the information environment. But it is really, really bad right now, for sure. I also feel like, at some level, when we turn to solutions, people are also getting used to navigating it a little bit better. Again, in the Philippines, it's just an unbelievably complicated information environment, and yet people were able to find the information they needed. They knew whom they trusted, and they were starting to navigate it, interestingly, sort of differently. And so I'm a little bit rose-colored glasses sometimes; I'm accused of that. And in this case, I can see some glimmers of hope.

Good. You mentioned solutions. Congressman, can you give us the solutions? No. I think it's really tough. And honestly, this is an issue that affects my job profoundly. I think it's gotten a lot worse in just the last few years.
We have a lot of big problems that we have to solve in America. But take a problem like immigration, where you need Democrats and Republicans to come together, conservatives and liberals, to have comprehensive immigration reform: to do things that conservatives want, like securing the border, but also do things that liberals want, like making sure we have a pathway to citizenship for the immigrants who come here. And if you can't agree on the basic facts, you can't agree on how many people are crossing the border. You can't agree on whether a wall makes sense or whether we should strengthen border security at the checkpoints. That's a simple fact that dictates where a lot of the funding that Congress could spend on this problem actually goes. And if you can't have that honest conversation with your colleagues, it's a real problem.

And I noticed in the Congress a marked change after Donald Trump came to power and proved that lying works. He proved, especially to a lot of his Republican colleagues, that this actually works pretty well. And all these Republicans who disdained Donald Trump, and there are a lot of public examples of people like Lindsey Graham, who said the most demeaning things about Donald Trump and then jumped on his bandwagon, part of jumping on his bandwagon was buying into this game that you could just lie, and not only could you get away with it, but it might actually help your career.

When you're confronted by those lies, what do you do, personally or professionally? What do you find are the appropriate responses? You know, you try to have a conversation. You don't ignore it, or you try not to. But I remember having a conversation with a conservative colleague from the Southern part of the United States. And we were debating one of these issues, and I won't go into the details, and I don't remember every one of them.
But he said something to me that was just blatantly untrue. And I said, well, that's not true. I mean, I read this in the news this morning. And he said, well, where did you read it? And I said, well, it was in the New York Times. And he just scoffed and said, well, that's not true.

Let me turn to him. Do you have thoughts about that dynamic, both in the U.S. and abroad? It's not necessarily new, but it's not getting better: the fake-news distrust of any established brand, and especially a brand like yours. I mean, look, I think it's a different challenge domestically and internationally, right? So domestically, I think the most important thing that independent media organizations can do is to hold their ground and hold on to their principles, right? And so for us, that means treating independence, and following the facts wherever they lead, as a North Star. And when the facts lead somewhere ambiguous, it means conveying that ambiguity, or the debate around it, fairly and fully. So I really do think that that is the path. And I think there's a giant segment of America that wants that.

Internationally, it's a more complicated picture, and it's why the anti-press rhetoric of a country that has been synonymous with upholding and defending a free press is so dangerous. Most countries have far less of a tradition of tolerating a free press and free expression. You know this better than anyone. And so these anti-press terms like fake news were greedily gobbled up by autocratic regimes, and aspiring autocratic regimes, who then passed laws that they claimed were banning fake news but were actually banning the independence, the scrutiny and accountability, that is provided by an independent press.
So I see the bigger challenge on the international stage. International. There, what role do you see for government in this conversation? Your portfolio includes mis- and disinformation, as well as digital privacy and security, and the relationship between consumers and TikTok, for example. So how do you see the role of government here?

Yeah, I will correct you, or politely correct you. Sorry. Please, it's a session about accuracy. I don't like to hear about consumers. I love consumers, yeah, they are so close to my heart and stomach, but citizens, it's citizens. It's something else. You feel the difference. That's why we introduced this special regulation for political advertising: strict rules separating the selling of politicians from the selling of shoes or furniture.

So, I started from the end of what I wanted to speak about. Of course, I am a European legislator, and we are working on legislation together with 27 member states with very different histories and different instincts in their societies. We have countries which experienced censorship, like mine; I come from Prague, from the Czech Republic. We have countries which experienced Russian occupation, like my country again. We have countries which simply have different histories and different traditions. And to legislate on how the digital space should look is a pretty daring exercise, because we must come up with rules which will not be abused. And I will come to that, if you give me two more minutes.

When I come to the United States, and I was in Congress many times, and I spoke to American think tanks, and also with journalists; I was at the New York Times. The basic question I hear from Americans on how we are going to deal with disinformation online is: will you order the removal of content? It's so simplified. And I am almost shouting, of course not, because this is not the way to do that.
We try to do, in fact, three things. First, to make sure that the disinformers do not find the feeding ground: a society which is willing to get brainwashed. And here again, different traditions, different instincts, different sensitivities. So, for instance, the Russian propaganda was so bloody strong in the Central and Eastern European countries because there were some sentiments in the society already before the invasion of Ukraine. So the feeding ground, which means that our societies, our people, should get more resilient. It should be done through education, through the work of professional media. When I was at the New York Times three years ago, I asked the question, how did you increase your readership? And I heard from some of your colleagues: because the people became more hungry for the truth. But this is a long-distance goal.

The second, and maybe I will surprise you: better strategic communication from us who are the representatives of democratic governments. Aren't we lazy? Aren't we just too self-confident that the people will find the truth somehow? We should be more intensive in delivering trustworthy facts and information to the people. They are not stupid, and they have the right to trustworthy and reliable information from those they voted for. And also, do the pre-bunking. When you look back at the disinformation campaigns, they are primitive as far as the content goes. It was possible to predict. So why didn't we do more to prepare the people for that?

Only the third thing is the regulation. And indeed, Europe started to regulate. First of all, the Digital Services Act says the content which is illegal offline has to be treated as illegal online. So here it comes: terrorism, violent political extremism, hate speech, child pornography. What's the rest? Oh, that's enough, isn't it? Incitement to violence. It's another category. Disinformation? Well, in most cases, it's not illegal content.
So should we suddenly say that for the online space, disinformation is illegal? And this is exactly what I mentioned at the beginning: that we must come up with rules which will not be abused. What I heard from the people working for Facebook, on the board getting the complaints and the requests for removals, is that 90 percent of the requests are coming from governments. So the elected politicians mark as disinformation something which is uncomfortable. Yes, so let's be aware of that. That's why we created a pretty difficult, complicated system, the Code of Practice against Disinformation, where nobody is the arbiter of the truth. We invited, of course, the platforms, the advertising industry, the fact-checkers, professional media. It's a very broad exercise where I believe we might find some good results, also in relation to the better resilience of the people, because this code of practice is also covering this part. Sorry I was long, but you see how full I am of it.

I hope you're just getting started. Your point about, you know, uncomfortable truths being mislabeled as disinformation sticks with me. I hate this. It's a complicated term, and there's so much between a clear, verifiable fact and a clear, verifiable lie. There's so much in between. And so that's why you're saying the rules have to be set up in a way not to be abused. Congressman, should we in the U.S. learn something from the structures that the Europeans have adopted?

Well, look, I think in general the U.S. has a lot to learn in terms of data regulation, internet regulation. I mean, you're way ahead of us in that regard. But we believe very strongly in free speech. I believe very strongly in free speech. And I think there is a healthy concern in the United States that the EU might be going a little too far. So I mean, I think you have to look at this from both perspectives.
Yes, they're ahead of us, and they're doing some smart things. I know when I use the internet in Europe and I get all the warnings about cookies and whatnot, that actually makes me feel safer. That makes me feel better. And a lot of American consumers want that level of security on the internet, for your own data privacy and whatnot. But we also have a healthy concern that, you know, we're not going to be censored. And that's the world that we live in. I don't think anyone in America, well, I don't think lawmakers in America want to give up on the fundamental principle of free speech. And we're very careful about that.

But haven't we seen many Democrats in the last six years pressuring Facebook and Twitter and now TikTok to be stronger in content moderation? Hasn't that been a wave that we've seen crash over for years now? Well, I think this is the question of what, ultimately, we're trying to achieve there, which is some measure of public safety, right? I mean, sure, there are some politicians who are going to go out and just get angry at Facebook if they see things that are mean. I'm quite used to reading mean things about me on the internet. For a while when I started, I used to screenshot them, so I would just sort of get comfortable with the most heinous things people would say, to sort of inoculate myself to the issue. That's a reality of being in public life today. It's much more in your face than it was 30, 50, 100 years ago, but I think it's always been a reality. The difference is when, you know, I have a constituency that I'm trying to keep healthy, and I can't get them to take a COVID vaccine because of misinformation that's propagated on the internet. And that's where this becomes a much tougher, more difficult, but also just a bigger concern.

How do you counsel journalists in this environment, Jeannie? I'll tell you a brief, funny, maybe not funny story.
So I leave CNN, and there's a crazy website that posts an article saying I was arrested by military police. And then I get a fact-checking email, and I don't know whether to reply to the fact-checker and bother with this, right? The same website a month later says I've been executed at Guantanamo Bay. So the fact-checker emails me again, and I say, well, do you want to take my pulse? How do I disprove that I wasn't executed? And so to me, that is amusing and ridiculous. To other journalists, it might be really disturbing, and it might be worrisome to them. And sometimes I might fall in that category too. What do you tell journalists? How do you counsel them about being in this environment?

That doesn't strike me as a funny story. Okay, all right, not a funny story. Hey, I'm alive and well. Which I'm very glad of. It actually raises, I mean, if we think about some of the trends in disinformation, one of the most worrisome is gendered disinformation. The fact that those types of stories hit women so much worse, women politicians. And it's just proven across the board that women online get harassed, and online harassment becomes offline harassment very, very quickly. And so gendered disinformation is one of those trends that is, to me, one of the most terrifying out there.

So I wanna circle back to the comment about whether the platforms should be held accountable for part of this. They should be, and you're exactly right. The accountability is about keeping people safe, and gendered disinformation can be very unsafe for a lot of people. And so I think that's where we wanna go with it. The platforms, when it comes to content moderation, do have a responsibility for trying to keep people safer, and they can do more. And again, it's worse in the rest of the world. You know, the genocide in Myanmar of the Rohingya was clearly based on very poor platform moderation.
And civil society organizations like mine put a lot of pressure on the platforms, and there is responsiveness. I mean, in Myanmar, to get Facebook to hire people that could speak the local language and do a little bit more. But it just isn't sufficient, given the resources and given the incredible impact they have on the information environment. So that's one of the areas.

I wanna mention another sort of solution, because the policy one, as an American, I struggle with it a little bit, but I think there are some really interesting elements to policy work, and not just in the information landscape but also on election law and different laws that you can get more clear about. And as you said, political candidates should be held up to a different sort of standard. I also think what's really interesting for this group, in fact, is that there are industry players who have an impact as well. The advertising industry has a huge impact on this, and one of the things we miss is that disinformation and misinformation sell. A lot of it starts out as market-driven. We remember that with the election.

So that ridiculous story about me is just an ad play? It is. It's like, great, get this crazy news out there, right? I mean, that's not good for the ad industry in the long run, and we're finding more openness in conversations in places like this with advertisers and brands who say, you know what, we're done with that. And so we're trying to encourage them: hey, let's go back to the old school. Let's invest in local, good content, good information, supporting what A.G. was talking about, and making sure that's where your ad dollars go, to both help democracy and hopefully save media at the same time.

And just one other quick comment on this. I completely agree with what you're saying, and this concept of preserving public safety even under the banner of free speech is actually something that we've accepted for a long time.
You know, we get taught in grade school the concept of, yes, you're allowed free speech, but not crying fire in a crowded theater. One of the most amazing experiences of my life was as a young lieutenant in Iraq. There wasn't a great plan after the invasion; maybe that's now been accepted. I was assigned to work not with the fire department or the police department, like some of my colleagues, but with the Iraqi media. And it was my job, as a 23-year-old, to teach them the principles of a free press, to get this media that had only relayed things from the Ministry of Information in Baghdad to actually learn how to report the news. There was a lot of concern from the Iraqis, many of whom were working with us because they liked what we were doing in overthrowing Saddam. They said, look, we don't want this to go badly. So they wanted to have a lot of propaganda that was very supportive of us. But I said, no, no, and I found myself giving these civics lessons that I hadn't had for 10 or 15 years, explaining principles like this about a free press. And I think sometimes we forget that we have these norms established for traditional media that we've accepted for a long time. And we're just having trouble translating those to the social media world.

Translating them, right. A question for A.G., but let's open it up to everybody, which is about that social media world and this next generation of technology. Generative AI is one of the hot topics here this week. Everyone's heard about ChatGPT. A.G., have you thought about how that's gonna affect the New York Times and affect the news industry?

I'm sure there are literally 1,000 people at Davos right now that have a smarter take on ChatGPT than I do. But I do think that it's squarely gonna make the disinformation problem and the bad-information problem worse, right. Again, a lot of this won't be information that's created with the intent to mislead.
But based on everything I've read, I suspect we're gonna see just huge amounts of content that is produced, none of which is particularly verified, the origins of which aren't particularly clear. And that's even before we start getting into deep-fake technology, right? So I think we are getting to a point where tools are gonna make it harder and harder to solve this problem, or make the problem worse and worse. And I think that we're gonna have to go back to first principles, which is: if you don't want bad information, you need to crowd it out with good information. And Brian, you know as well as anyone that local journalism in the United States and around the world is in crisis. And I think it is not a coincidence at all that the flood of these other things, propaganda, conspiracy, clickbait, all came in as the journalism ecosystem that had been a trusted sort of guardian of these high ideals was at its weakest. So I think we both need to address the misinformation crisis, but we also need to rebuild an ecosystem that is weaker than ever.

Other thoughts on where we're going with AI? In these conversations? Did I miss the topic on media? What's that? You discussed the media, and I...

Well, I mean, I share A.G.'s fear that if most of the content on the internet is written by computers with no regard for what is true, then we're gonna be drowning in a way that we're not even drowning right now. Then we will fully disregard the value of the truth. But I do think there's a real market for... Nobody will believe anything. Sorry. No, no. I just think there is a real market for the truth. There's a real demand out there, as the New York Times has said about why you were able to increase subscriptions, there is a hunger for the truth. And if you were to tell me...
If I were an investor at Davos and you said you can either own ChatGPT and sell that across the world, or you can own a platform that will tell the world if something is written by ChatGPT or another AI, I think it's easy to imagine that the market, the demand, every teacher in America would immediately wanna buy that AI, whatever it is, that can identify ChatGPT. Now, there are a lot of students who would like to buy ChatGPT, but I would argue that the market would be even bigger for the machine that can identify disinformation than for the machine that makes it easier to write your fourth-grade history paper.

The verification layer, right? I just wanna jump in on that. News is in the same crisis zone, and it is the collapse of the news industry: newspapers decimated across the United States and not able to expand around the world. I mean, there's a couple of things that you look at, and at the same time as the disinformation, trust in news is eroding completely. The Edelman Trust Barometer shows that every year: news is way down there when it comes to trusted institutions. So the question is, how do you both rebuild media but also rebuild trusted media? And there's a few things; we were talking about it earlier. It is really about the news behaving as a trusted source, and being transparent, and being calm, and sticking by its ethics. And we're finding a real growth in trust in news when it comes to local news. The closer to home it is, the harder it is to lie; the closer to home it is, the more directly relevant it is to your life, and the more you turn to a local newspaper and say, okay, I need this. And so that, again, feels like a building block for building trust when it comes to news organizations around the world.

Questions from the audience. We have about 15 minutes, and we have microphones around to pick up your question for the live stream. There's a hand here in the back.
If we can bring a microphone up for you, and then one up here in the second row. Hi, thank you very much for that. Have we done enough to make the truth as attractive as the lies, which work with our confirmation biases, which we want to read because those lies happen to reinforce beliefs that we have anyway? Have we as a community made the truth attractive and compelling enough?

May I? Please. I like these philosophical questions. No, I think that we have sufficient data which show that a lie sells much better than the truth. And if we do not stop that, then the algorithms will simply work in the direction of better business and better profit. And that's behind our rules in Europe: we wanted to stop this trend. After having the data showing that it sells so well, and that a lie flies seven times quicker than the gray, boring truth, we started to discuss this under the concept of the Code of Practice against Disinformation. And here we have the biggest representatives of the European advertising industry, which confirmed that they will not monetize, or help with their money, those who spread disinformation. In other words, together we might make the business of spreading and producing disinformation much less successful financially. So that's also behind the concept of the code of practice.

One sentence on AI, or two sentences. We have a long-lasting discussion with the platforms on how to use artificial intelligence in detecting illegal and harmful content. And we were never very fond of seeing that broadly used, because just count with me. Yes, for child pornography, it could be used for detection when it comes to images. When it comes to text, which is much more sophisticated, well, they have to work with the language. So I don't think the AI is mature enough yet to be able to detect it, but this is just a matter of time. And for hate speech?
Well, we need the people who understand the language and the case law in the country, because of what qualifies as illegal hate speech, which you will soon have also in the US. I think that we have a strong reason why we have this in the criminal law. We need the platforms to simply work with the language and to identify such cases. The AI would be too dangerous for that. Where I see the usage of AI as useful for detection is when AI discovers AI production. Here I speak mainly about the bots and the production of disinformation, especially from the St. Petersburg troll factory. There was a lot of it. And here I have the answer to those who always shouted that we are censoring and not protecting freedom of speech: the protection of freedom of speech belongs to those who are real persons. Here the lawyer speaks. Let's take the good old legal traditions, and let's look at whose freedom of speech should be truly protected. It's not the robot. Not the robot. Yeah, so that's the concept. And it's very much behind everything we do: we look at the good old law, which the society is used to, and apply it to the online space.

It was also striking that Elon Musk, of all people, defended the idea of freedom of speech but not freedom of reach recently, with the algorithms. You were gonna add?

Well, I just wanted to add one thing, perhaps an unpopular thing, to that thoughtful question. Good. About making the truth as compelling as the lies. I think we need to accept that we may not be able to make the truth as compelling as lies. And for example, I still think it's a more compelling story that you are in Guantanamo. Yes, it is, isn't it? Than that you hosted a panel on disinformation at Davos, right? And I think it was objectively a more compelling story. You should see the comment threads. They're having a blast talking about it, yeah. But which story is more valuable? The lie is only valuable to the liar, right?
In that case, it was valuable because he benefited from advertising, right? A few cents of ads, right? Same with the politician: the lie is valuable to the liar. The truth is valuable to the public. And I think we need to shift from just trying to make the truth as compelling as possible, which I actually think carries some perverse incentives, and instead try to make it as useful as possible.

Let me say, instead of compelling or sexy, how about just as accessible as the lie? Great example this week, again, maybe fun, maybe not fun, here at WEF: right-wing Twitter saying that Klaus Schwab decided to back out of the conference and not attend due to health reasons. So I'm scrolling through Twitter reading this while Klaus is on stage. And I'm thinking, does a photo of him, a live photo, count as a debunking of this lie? But you're right, that lie only benefits the liars. It does not help anyone else. But we need the truth, in that case the truth of what Klaus was doing on stage, to be as accessible as the lie. And that's definitely a challenge.

If I may, on behalf of politicians, what's a really worrying trend is that to be caught lying is not a disqualifying moment anymore. It's the congressman, yeah, right. For the politician. And I think that here, again, maybe the citizens should be more demanding and look into what politicians promise in their campaigns. Because already the political campaigns are full of lies and unreachable goals. This is typical for populists. And then those who are lying in the campaign are normally lying when they get the power, yeah? Do we want that? So, yeah.

But I thought it was striking, Congressman, that you brought that up right away with regard to 2016 and Trump showing that lying works. Has that only become more true in the last six years? Do you see evidence that it's less true now, hopefully? Or does Congressman Santos just reaffirm it?
Well, I mean, Congressman Santos is, hopefully, an extreme case. Hopefully. I hope you don't know anything about him, but he's elected. No, that got a lot of laughs, so you probably do. And look, he actually presents a real danger to our Congress, not just to our reputation, but to our national security, if the Speaker gives him access to committee information. If he sits on the Armed Services Committee and gets classified briefings, he could turn around and sell them to Brazil, because Brazil wants to extradite him. I mean, this is a real threat that we now have sitting in Congress, and he's not being expelled because he voted for McCarthy. I think that there is a bit of learning that's coming from the, at least for right now, sort of downfall of Trump. And there are definitely some people who I think are reassessing the allegiance that they've had to this type of politics. But, you know, my comments earlier made it sound like this is just a problem on the right. I think that there's some good evidence, at least in U.S. politics, that it's more of a problem on the right, but it happens on our side too. It happens on our side too. And I don't think it's gonna go away anytime soon. What do you do as a Democrat to hold your own side accountable, then? Well, I mean, one of the solutions here is that leadership matters. And you're right, I mean, I hope that the citizenry holds us more accountable. I'm trying to take this gamble on being a truth-telling politician and hoping that that works for me. So I hope the citizenry buys into that, right? But it's also up to leaders to just realize that, you know, if you wanna be a good leader and you wanna do the right thing for your country, you're gonna have to adhere to some principles that might not always be in your immediate self-interest. Then the question is whether the truth can still win elections. No, that is a question. That is a question.
And you can point to examples where it hasn't worked, Trump and Santos prominent among them. But then you can also point to examples where we've had real truth tellers who've been able to persevere and win. And political courage is about being willing to do the right thing when it's not politically advantageous. It's almost by definition doing the right thing when you go against your own party, because that's often harder to do than going against the opposition. And we need more people who are willing to go down fighting like Liz Cheney, preserving her reputation. And look, there are plenty of things that I disagree with Liz Cheney about, but she's a great example of someone who was willing to lose an election for the sake of telling the truth. I think the Santos case, sorry to bring that up again, is also interesting from a news perspective. I mean, it's famous for a story broken by a local news outlet. And it's also famous, well, not famous for, but the local news outlet was a little bit extreme, I guess, as I'm understanding it. And so the story wasn't necessarily picked up as quickly as it could have been. And so I feel like there is a need for local news to cover local politicians, local news that was gonna cover that race. I mean, besides the political parties, obviously having a local news organization catching this type of behavior as it starts is where we want to start, right? I mean, it wouldn't have solved the Trump problem, but it could have solved the Santos problem. But then it took the muscle and the agenda-setting power of the Times for us all to know the name. There was a question in the second row. Let's get to more hands here, and then we'll go over here. Yeah, thank you. Good afternoon. My name is Humberto Rumbos. I'm from Venezuela. I'm a global shaper of the Global Shapers community.
And there is a situation of cross-contamination between countries and between regions, because, for example, in Venezuela we had in 2022 a lot of misinformation campaigns about the Russian invasion and about what's happening in the region. And there was nothing here; it was in Latin America. And this is happening in Venezuela, Cuba, Nicaragua and many other countries. This is a situation that is also happening in the Middle East and North Africa region. And so what can we do about this, from the point of view of a government, from the point of view of an international organization or an international newspaper like The New York Times, to address this problem? Because it is not a problem of Europe or the US. It is a problem that is global, and it affects the whole world. And maybe the misinformation is not illegal, but it's attacking our right to access information. So what can we do together to address this problem? I don't have a good answer for that, but I was just going back to the Philippines again. You can tell where I traveled most recently. It was so interesting to hear that the Philippines is called Patient Zero by Facebook policy people, because it's like the home of where you can experiment with misinformation tactics, and that the Russian disinformation, which is global, comes to all sorts of regions trying to muddy the debate. And the Philippines in particular is really interesting because they come in, they test different ideas, and then there's a giant diaspora, so this disinformation goes into the diaspora and then it comes back around, right? And so it's just a really insidious circle. There are reasons why it's coming in, and then there are ways it's coming back around to the world and sort of fueling disinformation around the world, even if it's going into a community particularly vulnerable to such disinformation.
So that's not much of a solution, but it is highlighting that the problem affects individual communities, it affects countries, and then it affects the global conversation. I think it'll be interesting to see if there is this revival, if we're going through a transition period and there is this revival of local news because of the trust that you get at the local level. And I think it's very much too early to say, but I grew up in a small town of 20,000 people, and we had our local paper that we got once a week, and you always read that local paper every Thursday cover to cover. And about three years ago, it became regionalized and essentially went out of business. This year, there are three competing new newspapers in my hometown, and they've all started up just to fill this gap in local news, and they're competing for the truth, they're competing for an audience. And you have to subscribe to all three. That's good. Let's get one more question. We have two more minutes left. Let's go right here in the fourth row. We can bring the microphone over, right over here. Yeah, thank you. Thank you so much. My name's Bani Dughal. I come from New York, and I represent the Baha'i International Community to the UN. My question is about the role of education. How do you see that filling the gap? Because clearly we need to teach our children a sense of discernment to be able to recognize what's true and what's not. I mean, we were all born before the post-truth times, and so I think we can understand the difference, but young children are being bombarded with fake news, and sadly, yes, they're seeing their leaders lying and getting away with it, so their sense of what's moral and good is also getting skewed.
I remember raising my sons at a time when we had Clinton and then we had Bush, and I remember a bumper sticker that made me very uncomfortable, which said, "When Clinton lied, nobody died." And my sons actually brought that up in a dinner table conversation, and I had to defend the truth vis-a-vis whatever was happening politically. But anyway, the question really is about education and what we can do. We just ran a really interesting story on Finland, which apparently has the best track record on teaching students and young people how to recognize disinformation. So there are clearly models at work. I also think it's useful for us not to overthink the problem too much. I mean, ultimately, what you're teaching people in those moments, as much as anything, as much as you're teaching them to recognize a lie, I suspect you're teaching them to recognize trustworthy sources, whether that's an institution like The Times or The Post or The Journal, whether that's scientists, whether that's academia, but being discerning about trust, and in some ways finding institutional proxies for trust where there are reliable, transparent standards. For example, at an institution like mine, when we make mistakes, we acknowledge them in public and we correct them, right? And I think that that's gonna be a big part of this. I also just think that at some point, given the central role of the platforms in disseminating bad information, they're gonna have to do an unpopular and brave thing, which is to differentiate and elevate trustworthy sources of information, consistently, and until they do, I think that we just have to assume that those environments are basically poisoned. But doing so, I think we all know, may suppress engagement, right, which is a North Star metric for a lot of these institutions, and it almost certainly will incur political backlash at a moment when these institutions are facing real regulatory pressure.
So that's a hard thing for them to do, but I have a hard time seeing how we solve it just on the demand side, without addressing some of the supply side. Supply side. Last 30 seconds, final thought? Just two sentences, one on Finland. Finland shows the smallest impact of disinformation on society, and at the same time, they are introducing the most intense educational program. It's a paradox, but in almost all the member states they are doing a similar thing. The EU is also funding this, so this is covering the young generation, and interestingly, in Finland, in Ireland, in several other member states, they started programs for elderly people in the public libraries to teach them how to differentiate, how to research for the trustworthy sources, and so on. I think that we still have some possibilities to be creative. I'll just put in one word, a final word from me, going back to where we first started: this is all about trust. It's really about trust, and trust is so easy to erode, so easy to lose, and it's really difficult to rebuild, but that's what we need to do, and it's going to be hard, and it's going to be every day, and it's going to be everyone in this room, practically. We have to rebuild the trust in information, and be good communicators, and step by step get there. But it is a topic where we can all actually make a difference individually. Every single person, you know, actually, indeed. Thank you to our organizers today, and thank you to our panelists. Thank you.