What concerns you most about the current state of play with regard to disinformation? Is it that the problems are so intractable that we find ourselves at a status quo that seems untenable and that we can't get out of? What really keeps you up at night?

What keeps me up at night is the absence of trust in any referee, in anything that might feel like an umbrella. Just take an example from the foundations of a legal system and a court system. If two people have a dispute so intractable and important to them that they really wanna be right, or win, whatever that means (and if one wins, it sure feels like the other one's gonna lose), and it's that bad that they are willing to endure litigation. They're ready to go into a courthouse and spend potentially years and tens of thousands of dollars trying to just get an answer from a jury or a judge and an appellate court, and all of that is to settle one thing: who's right here? It would sure be nice to know that at the end of that, when somebody wins and somebody loses, if they don't settle, that the loser, who is obviously gonna be disappointed, doesn't feel, and it is in fact not the case, that they were robbed, that it was a corrupt system, that they're left asking, why did I even have the faith to go into that courthouse? How valuable it is to have a legal system that can settle disputes without the system itself being rightly called into question in every case as to whether it is the problem rather than solving the problem. And however much rightful worry there is about whether, say, the American legal system meets that standard, how much less confidence there is in any credible party that is in front of us here, any possible party. Do you want Facebook answering this? All right, well, how about Snopes? Can Snopes be trusted?
The fact that we don't have a significant majority of people trusting anything is a huge problem, because you can move the pieces around wherever you want, but unless you can create more trust and more buy-in among us, a sense that we may disagree, we may favor different political candidates, but we'd all kind of like the truth, and we can achieve it among us as a shared thing and work towards it, we're lacking that right now. And I do have some ideas on that front, some of which were really inspired by these discussions. For instance, instead of Facebook throwing up its hands and saying, we're gonna allow all political advertising, but in nearly every instance, don't expect us to judge the truth or falsity, and Twitter saying, yeah, you don't want us deciding either, that's why we're just not gonna allow any political advertising at all, my thought was to have political ads, when submitted to a platform like that, get assigned to an American high school class, which, under the guidance of their teacher, with a grade from that teacher, and maybe the help of the school librarian, works through whether the ad contains such material disinformation or misinformation that it shouldn't be allowed on the platform. They write up their findings, they get graded as to how well they do it, and their findings are binding. And so that class, or maybe it's three classes and two out of three is the decision, they decide. It's my way of saying: all right, we don't trust anybody, but do we trust our own kids? And if we don't, what does that say? They're gonna be the voters in a few years. So that's an example of an idea that I acknowledge is clearly crazy. And yet when I think about it, I'm hard pressed to say why it's worse than the status quo, which is clearly unacceptable to me.
Do you think that this lack of trust in traditionally respected or trusted institutions is the result of the disinformation situation that we're in? Or do you think that the sentiment preceded it and has just been exacerbated by it? I remember something from talking with Renée DiResta for the first episode of this series. She said something so interesting to me, which is that social media has had this democratizing effect on who we consider to be a credible source, at the same time that we're experiencing so much disinformation that degrades the credibility of traditionally respected sources. Where do you think this has really come from?

Yeah, it's likely a sadly mutual cycle. Think about the number of people that would find credible some tale about 5G and how 5G relates to COVID. I mean, anybody could sit down and write a page of word salad that invokes a bunch of words having to do with physics to explain how the vibrations actually change the vibrations of the... and it's just incoherent. But the fact that that could have purchase, and among how many people, would be a way of asking that question. Is it that all you needed was to have your eyes encounter those words, and then it's like a mind virus? If that's the case, then even the employees at Snopes might need special gloves and masks and eye goggles to encounter so much disinformation and not become persuaded by it. But I don't know that that's the real model. So I think, again, it's a taxonomy. Some of it is stuff that, when almost anybody encounters it, might get them wondering and wanting some more information.
That's partly the worry about deep fakes: you see something, you feel like your eyes aren't lying, and all right, somebody better explain what I'm seeing. Versus a smaller group of people who were already inclined, for various reasons, including just wanting to rationalize what they already believe or want to have happen in the world, to be persuadable by some random conspiracy theory. And those are two very different kinds of dangers. In fact, when you look at platform responses, you'd probably want them tailored differently. It's, what was Lincoln's quote, some of the people being fooled all of the time versus all of the people being fooled some of the time, and what those false beliefs might drive them to do.

So our forum wrapped on May 12th, and our last two sessions were really heavily focused on COVID. Of course it's topical; so much of what we're seeing online is COVID related or COVID focused. In those last two sessions, platforms, researchers, and others in our group talked about the challenges they've encountered as they work to manage the sheer volume of disinformation surrounding this issue. And just recently, sustained attention has really shifted to issues of racial inequity and justice and police brutality. So I think, as we saw in the early months of the pandemic, sustained attention on a really high interest issue can pollute the information environment in a way that normal news cycles just don't, right? In a normal news cycle, you focus on something, then it moves on, and the cycle keeps going. As you take stock of the challenges that are mounting in the world at large, and maybe amongst the countering-disinformation community as well, are there particular reforms that you hope to see?
Well, I think part of the through line of the examples you're talking about is disinformation that could contribute to violence or to harm, including self-harm in the health context. And it makes the stakes real. If you're thinking about a particular person choosing to look for something about whether people really landed on the moon and then consuming videos that say they didn't, you might have one view, a kind of permissive one, that just says: whatever, people upload videos, other people watch them, it's called the marketplace of ideas. Tempered in the first instance by: all right, which videos is YouTube recommending, and how are you saying that that's a neutral choice? For which there's a lot of debate. But once you're talking about, all right, I go up to Bing or Google and I'm asking for a poison ivy remedy, and what it tells me to do is something like the opposite of what you should do, and then you're gonna end up in the ER, what's the marketplace of ideas argument around that? It's not a good one. And with COVID out there, it's in fact not even just buyer beware, as in, if you're gonna trust anything you see on the internet, that's your fault. Well, even if it is your fault, it still might mean that you're gonna be transmitting a virus to eight other people, and it's not their fault. So that's an issue. And when it's disinformation that could lead to violence and conflict, where people are putting it out exactly for that purpose, it makes it awfully hard to just say this is too thorny a problem to start judging, I'm not gonna wade into it, whether you're the platforms or you're society. And so, while acknowledging all of the difficulties that come from figuring out who's supposed to be the truth police here, having no police here also has stakes that are very real, very immediate.
And when the denominator of people involved is in the billions who are tuning into these platforms, and you know that a slight tweak to the platform here could greatly change the views of tens of millions of people, there is no neutral position. There's just whether you're gonna be stirring the pot or whether third parties, including state actors, will be stirring the pot.

I completely agree with you. So what is on tap for next year in particular?

Well, we have our work cut out for us, right? I think I mentioned before that we've taken up other issues, like cybersecurity and the ethics and governance of AI, and, having solved those, moved on to the next. And I of course say that tongue in cheek. This problem of disinformation calls out for more than just the one academic year's worth of focused attention in this program that it's been given. There's a lot of momentum, and I think enough collective feeling within the various groups that the status quo really isn't working, set against uncertainty about exactly how to solve it other than just keeping on with some of the measures already in place, that it's really calling out for new thinking and new experiments. And I'm also mindful that a lot of the action here, both in understanding the dimensions of the problem, through access to data about what's out there and what people are doing and how they're reacting, and in implementing whatever the attempted solutions might look like, sits largely in private hands. Figuring out the right way to bridge between those private companies that happen to shape speech so much and some sense of the public interest and public availability of that data is a really important role that our group can play and model in the coming year.
So my sensibility is that we'll really try, certainly through the November US elections but even beyond, to stick with this problem, with the kinds of relationships we've forged among us and the different groups we have at the table, and see if we can bring more to the table as we go.

Thanks so much for joining me today, Jonathan.

It's my pleasure, thank you, Oumou. Thanks.