Hello to those of you who've joined us, we'll be kicking off in a minute or two. Thank you. Right, so welcome to this panel on countering disinformation during global crises, hosted by the research group on science, technology, health and security here at King's in the School of Security Studies. My name is Philippa Lentzos. I'm a senior lecturer in the Department of War Studies as well as in the Department of Global Health and Social Medicine, and I'll be chairing this session. We've got a fantastic lineup for you today and I'll introduce each of our speakers in due course. The plan is for each panelist to give a 10-minute presentation on different critical areas. So we'll start with Henrietta, who will focus on the recent false allegations of bio-warfare labs in Ukraine. Rose will focus on disinformation in conflict and health events more generally. Ross will switch the focus to disinformation in the nuclear energy sector. I almost had you doing something else there, Ross. And Jean and I will round up our discussion with a focus on export controls and dual use. There should be plenty of time for questions, so don't be shy. Please post these in the Q&A box below, and you can do that during the presentations already, and then we'll collect them for the end of the presentations. A final note to say we are live streaming and the recording will be made available afterwards. So with that, I'm very happy to introduce our first speaker, who is Henrietta Wilson. Henrietta is a visiting research fellow in War Studies. She also works for other UK universities on a freelance and employed basis, including for SOAS and the University of Bristol, but we have a keen interest in retaining Henrietta for ourselves. Henrietta has a particular research focus on open source research and global weapons regulations. So with that, I'll turn over to you, Henrietta. Thank you very much, Philippa. 
Thank you for the warm words, and also thank you for inviting me to be part of this amazing panel. I'm really excited to be here and to hear the talks that are coming up as well. So what I'm going to do in my 10 minutes is give very broad-brushstroke comments about work in progress that Philippa and I have been doing recently, which focuses on the Russian false allegations of biological warfare associated with Ukrainian labs. So I'm going to give some comments about what the disinformation was and unpick some of the features of the campaign, before looking at some of the responses to it, what we can learn about what worked and what could have worked better, and things to think about going forward. So the first thing to say is some words about what this disinformation was. Clustering around early March, there were a number of high-profile Russian statements about biological warfare associated with Ukrainian labs. It can't be said strongly enough that there was no supporting evidence given for these allegations. And by contrast, there was a lot of supporting evidence for the refutation. So it's important to get that front and center: this is genuinely disinformation. There are no shades of grey here. The other thing to really point out is that the statements were very informal. They were non-specific. They didn't go into any details. They were given in settings like press conferences, mostly. They didn't invoke formal reporting mechanisms for the most part, and they spread very quickly. So I think this demonstrates some broader features that are useful to think about in the context of Russian disinformation in particular. What seems to be happening is a broad tendency to extrapolate from known facts to incorrect statements. So the known facts in this case were: labs do exist. There are labs in Ukraine. They are researching pathogens in order to protect societies against disease. 
They are funded by the US. They're not controlled by the US. And the reason I can say those things so clearly is that the labs and the US are completely transparent about the relationship, so we know quite a lot about what's going on. But as I say, the strategy for these disinformation campaigns seems to be to extrapolate from known facts towards what is then an untrue statement. As I said, it spread very quickly through digital technologies and was quickly amplified by Russian allies. The role of China has been particularly significant here. And there seems to be a roster of people that are ready to pick up and amplify incorrect statements. So, as I hinted at, it's worth saying absolutely centrally that this draws on a long history of similar disinformation campaigns, which use a very similar playbook: half-truths connected to known truths. Often in the chemical and biological warfare context, the disinformation focuses on disease outbreaks. So there have been disinformation campaigns about the origins of COVID, linking it to American facilities. A lot of disinformation campaigns are also associated with labs and known facilities. And as well as using these same narrative processes, the disinformation has often been delivered through the same set of allies. The same set of individuals in the same countries often pick up and support Russian disinformation. And why? Why do they do this? Well, we can speculate, can't we? I think there's a lot of work to be done on that. There were initial concerns that the disinformation about biological warfare was part of a false flag operation; that it might be signalling some sort of intention to use chemical weapons, perhaps, because that sort of dynamic had happened in Syria, where Russia had put out claims about other people's use of chemical weapons in order to muddy the waters about use. And there's a history there too. 
There are analyses of Soviet accusations of other people's uses of chemical and biological warfare that distracted from their own uses. So there were these concerns: was it a false flag operation? So far, thank goodness, that's not proven true. So the broader thinking now is that this is really more about controlling Russia's own population than it is about signalling anything about actual use. There is also another byproduct of this, and we don't know if it's intentional or not: it's undermining global governance structures. I hope we will get a chance to say more about that in the questions. So, more broadly, I've pointed out some particular features of Russian disinformation in this context, and I think it's worth picking up on some of the dynamics of it as well. In this case, it resonated very quickly with far-right conspiracy theorists, particularly in the US. And the dynamic by which that's happening is through digital technologies: the forums, the technologies seem to promote a certain sort of communication which makes disinformation spread particularly quickly. And there is this interplay with global governance mechanisms. There now exist several instruments dedicated to mitigating the risks of chemical and biological warfare, illicit proliferation or illicit use, and spreading false narratives around them clearly has the potential to destabilize those instruments. So, moving on quickly, I'll say a few words about what happened next. Well, there were immediate refutations from governments, from international organizations, from non-governmental groups, all of whom were able to say very clearly that the labs are not doing biological warfare. 
And this was really strengthened by new developments in open source research, by which I mean research that uses publicly available tools and information, and I'd point to Philippa's work mapping global biolabs as a really good example: by knowing the details of legitimate activities and by being very clear about those, you can be confident that they're not being misused. So there were immediate refutations, which rests a lot on the ongoing transparency embodied and formalized in international instruments and general industry reporting. And I have to say, in this context there's a stark difference in how Russia behaves in terms of its own transparency obligations, which tend to be far more shrouded than those of the places it's making allegations about. I should say that despite the refutations, the disinformation is nevertheless ongoing, and you can still hear people picking it up and repeating it. And it feels as though there's a mechanism by which some disinformation actually spreads because of the refutations rather than despite them. I think this really points to a differential burden of proof: a false allegation can be made without any evidence, and it's picked up and repeated and people believe it, whereas making a truthful statement requires careful wording, careful research and a high level of evidence, and still isn't strong enough to clamp down on the disinformation. So I think that's a really interesting, well, an important factor to look out for. Summing up very quickly, as I'm aware my 10 minutes is up: I'd like to point out what this means for us, how important it is for governments, international organisations, non-governmental groups and individuals to be really vigilant about the information they consume. That in turn implies a really important role for education, telling people what's right or wrong, giving people the facts. And it also suggests a need for more openness throughout societies. 
I'll point to November last year, when a UK intelligence chief gave a public talk about how societies needed to become more open to protect themselves. On the education question, though, just to leave on a hanging question: there's evidence that education works differently, that it's somehow not enough to simply correct people. In the climate change debates, people have researched what sorts of education help stop disinformation, and simply correcting people when they're wrong is not particularly effective. What's more effective is educating people in the processes of how information is made and spread. So on that note, thank you very much. I look forward to questions. Thank you so much, Henrietta. That was marvellous, really rich, lots and lots of things to get into there. I wanted to pick up on one thing that you said very early on, when you were talking about how this particular round of disinformation has happened. You characterised some of it as informal, and we have seen this: the press conferences and the sort of language that's been used, not just the outlets, but also the language, all of these things. I think one of the things it shows is that the settings in which claims are made really do matter. I know that just for myself, if I'm lecturing to undergraduate students, there is a certain authority that comes with that. When I sit around the dinner table at home, there's no authority that comes with that, right? The setting in which you make claims matters. And so the thing you came onto at the end was about the role of open source information, and I know you're very interested in this area. Open source information doesn't have this kind of automatic authoritative setting. So I wondered if you had any thoughts, and you may want to deflect this to our discussion later on, but how can we strengthen credible open source information? 
How can we sort credible information from less credible, less authoritative open source information? How can we encourage more methodologically strong open source information, open source information that doesn't overclaim? So if you have any quick thoughts, any initial reactions to that, then we can of course pick it up later on as well. Thank you, Philippa. Really, really interesting point to explore. I think this is an open question, and whatever I say now, I could talk for a long time about it, and the day after tomorrow we could all change our minds about it. I think you're right to pick up on open source research as an enormously optimistic possibility for helping in this area. There are now a lot of really talented people and non-governmental groups dedicating skills and time to finding out what's happening in the world through all sorts of digital technologies, which facilitate access to newspapers and industry reports as well as to remote sensing opportunities like satellites and tracking developments through social media. So it's enormously rich, it's extremely big, and it's very hard, maybe, for people to distinguish between what's good and what's less good in that field. What I see, in answer to your question, is emerging pockets of best practice happening in the world. So in the human rights space there is the Berkeley Protocol, which tells open source researchers how to do their work in a way that collects evidence that can be used in the International Criminal Court. There's the Stanley Center in the US that's developing an ethical framework. The bigger question about how societies use that open source research and know whether or not it's credible is maybe complicated, or maybe it's very easy. Various people talk about developing protocols or gatekeepers of certification which can give some sort of credibility to open source researchers. 
But it could be as simple as going back to the point I made about education and teaching people the processes, teaching people to read and recognize what the hallmarks of good intelligence or investigative work are. And some of them should be of no surprise to us: if people are open about their provenance and the sources they have used, if they're open about the confidence they have in their findings, then all of those things help us to read confidence into research findings. Yeah, thank you. Exactly, thanks, Henrietta. I think you're right. It is about, well, what are the typical characteristics, the things that suggest these are legitimate claims or these are not legitimate claims, right? It's about educating about what those characteristics are. Thank you so much for your excellent presentation. We'll now move swiftly on to Rose. Rose Bernard is a wonderful colleague. She's worked in the intelligence industry across both the public and private sector for over 10 years, also working with us at King's. At the same time, she's a woman of many talents. She specializes in group mapping, and her academic focus is on the use of novel intelligence techniques in public health emergencies of international concern. So we're very interested to hear from you, Rose, taking this focus on disinformation and different kinds of biological threats a little bit further. So without further ado, over to you. Thank you so much. Thank you so much for your words, and thank you to my colleagues. I really, really enjoyed your presentation, Henrietta, and I'm going to pick up on some of the points that I think are really, really useful. Before we start, I do want to get everybody on the same page when we talk about definitions, in terms of the broader pantheon of problematic communication, because disinformation and misinformation are only part of it. Disinformation is, I've been saying this so often, it kind of runs off the tongue. 
Disinformation is false information shared in full knowledge of that falsehood, usually with malicious intent. Misinformation is information shared without knowledge of its falsehood, sometimes with malicious intent, sometimes not; it's about whether you know it's false or not. We are also moving towards categorizing something that we're calling disrepresentation and misrepresentation, which is where information is not shared in falsehood, but it is shared in the knowledge that maybe there isn't a shared understanding of what it means, and so you can use it to influence. We saw a lot of this in newspaper articles around the vaccines, using those statistics for political gain, because people fill in their own biases. My personal bugbear at the moment is the American use of cyber attacks from Russia. They released a statement, they called it Shields Up, and they told all of their critical national infrastructure to be aware of Russian cyber attacks. Obviously everyone's critical national infrastructure should be aware of Russian cyber attacks, but there is no history of destructive or disruptive Russian attacks against the West, only data exfiltration, and they used cyber attacks to drum up a wider fear. So there is a huge interplay between all of these different types of misinformation and disinformation. One of the things that I really liked that Henrietta said was about how they have this tendency to extrapolate from known facts into new statements. There's this very 2016 view that Russians have troll factories, but that's not people sitting around in a building rubbing their hands together and making up narratives. What they're doing is largely taking things that are on the internet already, embellishing them and amplifying them. And this is one of the things that we see particularly with the new legitimacy given to the anti-vaccine movement around 2016. 
So around 2016, you have this massive explosion in a Russian disinformation campaign which is meant to destabilize the American governance system and the American population. The way that they do that is not by going in and saying, I'm going to target vaccinations, I don't want Americans to get vaccinated. It was literally just to take any kind of contentious area and blow it out of proportion. So you would get Russian bots organizing both sides: there was an example in the same town where the same Russian bot organized a pro-BLM march and an all-lives-matter march days after each other. They were just trying to polarize the electorate. The impact that this has, particularly on the anti-vaccine movement, is that it lends them a legitimacy they didn't previously have. There has always been an anti-vaccination contingent, ever since vaccines existed, and it got a boost in the 90s with the now discredited scientist Andrew Wakefield, but it was very much an echo-chamber community. Then what happens is it becomes politicized and amplified to a disproportionate degree, and it comes into this wider American political Republican fringe. So it starts out with people on forums, 4chan, Reddit, 8chan; it goes to Breitbart; and as Breitbart gets integrated into Fox News and those kinds of things happen, Fox News picks it up and eventually it starts becoming a political platform. Again, one of the things that I really liked was when Henrietta was talking about open source research, and Philippa, you talked about how we distinguish the credible from the non-credible sources. It's really hard, because if you are a person in the US, and also in the UK, and you want to get information about something you read online: New York Times, it's behind a paywall; Washington Post, behind a paywall; the Atlantic, behind a paywall. You know what isn't behind a paywall? Breitbart, Fox News. So all of your sources are already going to be biased. 
What happens then is you get increased transmission, the issue of vaccination becomes politicized, and you then have a very, very unstable population. This is contrasted against a much more targeted disinformation campaign in the Ebola outbreaks of 2014 to 2016 and then again 2017 to 2018, because that was a very, very targeted campaign by Russia and by China, saying that the outbreak was caused by America, that American healthcare workers were there to give people Ebola, not cure people of Ebola. But again, these rumors were already there in the population; there is already this mistrust of the West, which is pretty well founded, I think, and they are just amplifying it. This had an incredible impact on the course of the epidemic. People stop seeking treatment because they're worried to go to the treatment centers, because they think that they're going to be killed. People stop saying when they've got an infection because they think that they're going to be ostracized by their community. There's a huge increase in violence against healthcare workers, so some organizations pull all of their healthcare workers out, so people can't get treatment. This is one of the reasons why it just kept extending and extending: people weren't seeking treatment, and people weren't able to access treatment. So we have all of these dynamics that are specific to pandemics and to healthcare events. I do want to spend the last two or three minutes that I've got talking about the different themes of disinformation and how we track them. One of the projects that we're doing at the moment is trying to look at how things are changing and being manipulated across platforms: whether Facebook is more likely to host specific themes of disinformation than Twitter, all of these different things. And largely what we are finding is that it is becoming more helpful to track these things thematically. 
Yes, you can write an algorithm, you can code something, you can work out how many bots there are, you can get them taken down, but that doesn't address the interplay between misinformation and disinformation. And what we have found is that there are five main themes of disinformation. The first is racially motivated posts. These are anti-immigration, often anti-Black Lives Matter, anti-Semitic; not a very nice corner of the internet, but that's where they're coming from. It's playing on people's fears. Second is biologically motivated posts: anti-vaccine talk, biological warfare. I put the biolabs in that category as well; anything about disease or biology goes in there. Third, grand world order posts. This is the concept that there is a huge government network of people out there who are controlling the world and surveilling you through the various means governments have to surveil you. The fourth is foreign government aggression and political protests. It's based around casting a group or a country as very, very politically disenfranchised, very violent. And the fifth is just an open category, because there are some really weird things on the internet. I once saw a post where somebody was like, oh, you believe in the moon landing? And somebody's reply was, oh, you believe in the moon? And they were like, yes, I believe in the moon. But what we're trying to do with that is track the crests and falls of those themes over platforms. Is there going to be an event, for example, which we know is going to provoke a specific type of disinformation? Somebody should have known that Russia would do this kind of disinformation about biolabs in Ukraine; they've been doing it for so long. And if we know that, we can proactively go out and educate people, so that by the time they're getting this disinformation, they're thinking: that doesn't tie in with what I know already. 
I've already got this groundswell of understanding of the situation that enables me to think critically about the information that I'm hearing. And I've talked through my time very, very quickly, so I'm going to pause there and take questions. Thanks so much, Rose, that was brilliant. On your last point there about proactively telling people: is that sort of what you meant with what the US was doing in terms of Ukraine, when they were sharing intelligence about the sorts of things that might happen? Is that the sort of thing you mean? And outreach: I think it's all very well us doing it on an official level. Government transparency is great and fantastic, but it's not going to convince people who largely get their news from social media. Yeah, fair point. You have to use different sources of authority, right? Or target different groups with different sources of authority, absolutely. Ross, you had a quick question or a quick comment as well. Yes, thanks, Philippa. So Rose, at the start you mentioned a few different types of information, problematic information, I suppose. And one thing that I would potentially suggest adding is something I saw online, mal-information, which I understood to be defined as correct or true information that's shared in the knowledge it will cause harm: releasing confidential information about somebody that will then put them at risk, for instance. I don't know if that fits in with your typology or is adjacent to it, but I thought I'd mention it. Yeah, that's really useful, thank you. I think this sort of dialogue is very helpful as we are developing different characterizations, different typologies, figuring out what the features and mechanisms are and what is more salient. So yeah, sharing information I think is very helpful here. We'll move on over to Ross. 
Ross Peel is a former nuclear engineer and a postdoc researcher in War Studies based in our Centre for Science and Security Studies. Ross works on topics related to nonproliferation and international security in connection, unsurprisingly, with nuclear power plants, given his background. And he has a special interest in how states use the export of nuclear power plants to exert improper influence and control over nuclear newcomer countries. Clearly, energy is another one of those massive topics just now. Lots to say; we're keen to hear what you've got to say, Ross. Over to you. Thanks very much, Philippa. Hopefully my slides will be coming up in a moment, if not already. Yes, we can see them, thanks. OK, perfect. So what I wanted to talk about now, and I'm just making a note of the time, is what I've been observing, which is how certain states, and as a spoiler alert, it's going to be one of the states that has been talked about a lot already, are using disinformation to sow distrust of competitor nuclear energy organizations in the West. So I'm going to run over a very quick bit of background as fast as I can. The first nuclear power plants were built in the US, the UK and the USSR in the 1950s, and at that time nuclear energy was promised to be almost too cheap to meter. But over time this has not really been the case, for a number of reasons, partly through things like the increasing standards required for safety and nuclear security. This has led to increased costs and thus a need to increase revenues from energy. And the way this has been done is through economies of scale: we've been making these plants larger and larger, on the idea that if you produce twice as much electricity to sell, it doesn't cost you twice as much to do it. So plants have been getting larger and larger. 
But the problem we've arrived at now is that these things are so large that private corporations can't really cover the costs of building them; it's too much risk for them to accept onto their books. And so this has meant that private companies originating in Western countries have struggled to build them, whereas companies such as the Russian state nuclear corporation Rosatom and CNNC, the China National Nuclear Corporation, both of which are strongly backed by their governments, fully owned by their governments, and empowered and given a full mission from government to export nuclear power plants to other countries, are strongly enabled to do so. We've seen Rosatom doing this for a couple of decades now, and CNNC has certainly in the last few years been hot on their heels. Now, it's not really the topic of the presentation, but I will say that we're not stuck in this position, and we now have movements towards what we call small modular reactors, where we're going towards a different economic model. The idea is to build a large number of smaller reactors, which you can then export around the world. So that's how nuclear is reinventing itself. So how do Rosatom and CNNC compete for nuclear contracts? Well, they have a strategy that takes place over years, with a huge investment in sales, taking the countries they're promoting to through almost a customer engagement journey: starting from government-to-government wining and dining, and eventually going to the point where you're signing a memorandum of understanding to engage in specific areas, such as the preparation of legal frameworks for nuclear. As those become established and the relationship becomes trusting to the point where it needs to be, you then start to sign contracts for nuclear power plants. And those can take many forms, but one of the most concerning to me is the example there, the build-own-operate contract. 
So whilst these types of contract can offer a lot of value to host states if both sides actively seek that and want to formalize it, they're also potentially very exploitative. The customer state is effectively providing land, but in return is gaining nothing apart from very overpriced energy. The supplier state is in complete control: it owns the land, it builds the plant, it operates the plant, it owns it. It transfers very little knowledge or value to the host state, and in doing so maintains complete control over the nuclear asset and thus the power supply of that country. Now, finally, we come to this part, which is not well evidenced, and this is what comes next; this is a critical research area for me that I'm concerned with. It's the exacerbation of imbalances of power between states and what that might mean. You're effectively on a path towards soft power empires based on the control of energy, where states bring the critical national infrastructure that enables society, as we know it today and as many states want to achieve it, under their influence. Now, I'm using a lot of language here in my slide that sounds like the language of grooming, and in many ways that's how it occurs. You have large states with a malign agenda exploiting their position of relative knowledge, wealth and power to make states with energy issues or a desire for decarbonization trust them, become dependent on them, and turn away from others who might help them seek a way out or an alternative path. And the disinformation really comes in at every single stage of this process, where you are seeking to exclude competitors from your customer's mind to the point where it's not only technically, economically and politically impossible for them to break away from you. 
It's actually unthinkable, because they're 100% invested in you, and every element of their nuclear sector, from the underpinning legal frameworks to the nuts and bolts holding the reactor together, is wholly aligned with you and derived from you. And if you do this enough times, you're creating an energy empire, a network of states who are 100% reliant on you for their energy, and thus their economy and their society. And thus, willingly or unwillingly, they become your loyal subjects. Now, the disinformation comes in a few different types. The first objective, really, is to convince the potential customers that you will look after them, by minimizing the true costs of doing business with you. The second objective is to turn the potential customers away from any of your competitors who would offer them alternative options aligned with international best practice. You especially want to turn them away from, for instance, the International Atomic Energy Agency, the international body, because it will give them support. So you want to either turn them away from the IAEA, or you want to act as the sole interlocutor between the customer and the agency. Finally, you want to discourage your customers from forming independent regulatory bodies that would hamper your efforts to control the customer state without interference. And I've made this graphic of Neapolitan ice cream, because it seemed like a good idea at the time, to convey the idea that all three of these things come together as part of the same goal. They all come together to create crippling energy dependencies, which in turn minimize foreign state sovereignty and allow you, as the exporter, to exert control over states beyond your borders. So there are a number of claims that we see repeated in disinformation from various states. 
First of all: you will gain energy independence from doing so, even though I will control all of your reactors, your fuel and your staff, only I can operate them, and I will take all the revenue. Second: we will claim that your plants will never run over time or over budget during construction, unlike those in other countries. But it's very easy to make that happen when you're able to control the budgets, make the numbers go away and cut corners on the construction. Numbers three and four are linked. We often hear both that you will be used as a guinea pig for the foreign state's small modular reactors I mentioned earlier, and that our plants are ready to go now, unlike those ones, which is an apples-and-oranges comparison that doesn't make sense. Fifth, we've seen claims of: stay away from those other states and the International Atomic Energy Agency, because they want to exploit you. And finally, the last one: we will do all the regulatory work for you. You don't need to worry about whether this thing that has been built by a foreign power is actually safe or secure; just let us handle that for you. And as we've seen in every industry that regulates itself, that is not going to lead to good outcomes. So I've put a few examples of disinformation up on the screen there. These are taken from Russian propaganda in Eastern European and Central Asian countries. All of these were published in Russian, but I've run them through Google Translate, for my own understanding as much as for the slide. Now, I'm not showing this to say that Russia is the only state engaged in these activities; it's just who I've been looking at so far. So the top three are from BALT News, which is an allegedly Kremlin-produced propaganda outlet directed towards Russian-speaking Lithuanians, and it's active in Lithuania, Latvia and Estonia. In the top-left article, BALT News is criticizing US-funded reactor programs in Ukraine, Poland and Romania. 
The article calls into question the safety of US power plants, specifically those designed and marketed by NuScale Power, and claims that US nuclear energy cooperation with Poland has not yet been fruitful. The middle one, the "kilowatts don't smell" one, makes the case that Russian power plants are beneficial for Russia, Poland and the Baltic States because of a perceived threat posed by NATO to the region. And the "hysteria on the horizon" one falsely claims that Rosatom's construction of its nuclear power plant in the Kaliningrad oblast, this slightly separated part of Russia, may be complicated and negatively impacted by the, quote, "unfriendly position of the Baltic States and Poland towards Russian projects", end quote, referring specifically to those countries' relationships with the West. Sputnik, as you may know, is a well-known Russian propaganda outlet and proliferator of disinformation. Its article criticizes the US nuclear energy company Westinghouse in relation to Ukrainian President Zelensky's signing of a memorandum of understanding between Westinghouse and the Ukrainian state energy company. The article claims that the project is too expensive and suggests Rosatom as a more reasonable partner at a lower cost. And the last two articles, at the bottom right and bottom center, have a common theme, which is that Ukrainian nuclear power plants are unsafe and will lead to a repeat of the 1986 Chernobyl disaster. The middle one is particularly challenging because it cites a scientific journal paper released in 2016 called Reassessing the Safety of Nuclear Energy. But the article completely invents the contents of that paper and goes in a completely different direction from what it actually says. It claims that the authors predicted an 80% chance of an explosion at a specific Ukrainian nuclear power plant, which is complete nonsense. 
And being aware of the time, I'll skip forward slightly to say that it's not just on news websites; we also see this in academic papers. So this is an example of something that, in my opinion, should never have got through peer review, but it basically claims that Russian nuclear power plants promise absolutely everything to everyone. They're cheaper, more reliable, more modern; they adhere to international standards; they have short construction times; all the training will be provided, all the fuel will be provided, all the waste will be taken away. It really makes Russian nuclear energy cooperation sound like the panacea for the evil West's attempts to abuse innocent countries. So finally, what can be done about this? Oh, sorry, I forgot to move the slide on. So that's the paper I just mentioned there. And finally, what can be done about this? Well, we need to be countering disinformation narratives with the truth, as other speakers have mentioned, based on high-quality, evidence-based, accessible information that the reader or receiver can cross-check and look into themselves, ideally adapted to the national context of the reader or reviewer. And we want to be, as others have said, promoting transparency, building the capacity of others to challenge information rather than just accept it, and exposing the true costs of doing business with these dishonest states. So I will stop there in the interest of time, but thank you very much for the opportunity to speak. Thanks so much, Ross. And thank you all for keeping to time, so excellent, you're making my job much easier. Ross, I love your picture of Italian gelato. I think that should be in every presentation, in multicolors, et cetera. So we should set that as a goal for ourselves, to get some Italian gelato into every presentation. 
I really enjoyed all the common claims that you put together around what was going on in the nuclear power plant world. I think that is certainly one of the things that we can do as experts in our fields: to collate these sorts of common claims as a way of complementing not just an education about what mechanisms are used, but also about what the common themes are. I think that is somewhat similar to what the EU's EUvsDisinfo project is doing on a bigger scale, not within our specific silos, and I do think that is a really very useful exercise, because you can see what the different variations on different themes are. And I think there were a lot of parallels between your talk and what Rose was talking about in her session as well. We're going to move on to, again, another very different field that also has similar challenges with disinformation. And this is Jean, who comes to us from the financial sector, where he's worked for 15 years in transactions and management roles for large financial institutions. His research focus is, unsurprisingly, on counter-proliferation finance, but also on dual-use goods. Jean is a scientific advisor to the European Union's Partner-to-Partner Programme on dual-use goods, and he'll talk to us today about disinformation in export controls, I believe. So with that, I'll turn over to you. Thanks, Jean, for being with us. Thank you very much, Philippa. Hello, everyone. And thank you to the organizers for giving me this opportunity to speak about export control and proliferation finance in the context of disinformation. What I'll try to do today is look at disinformation in the international policy domain, and that relates to a journal paper I'm preparing. 
But before I get into substance, just a quick word to say that proliferation finance is quite used to disinformation, because evasion tactics rely a lot on lies and disinformation. Most of the investigations we do relate to finding the real facts about a sanctions evader, the true purpose of a transaction, what's behind the lies put forward to a financial institution, for instance. When we talk about proliferation, especially chemical, nuclear or biological proliferation, financial institutions need the export control lists that come from the export control regimes because, in a nutshell, financial institutions are not scientific experts. They don't know a lot about what dual-use goods are made of, and so checking information from a financial standpoint needs to be referenced against the export control regimes. What I'm going to speak about now is an attempt by China to revisit the international standard on export control; that's my main argument, that China is trying to revisit the international standard on export control. So I'll talk briefly about the mechanics around this alternative narrative on the international standard on export control, which has serious implications for the private sector, namely financial institutions, and which raises questions for the academic literature on disinformation in international policy. A bit on the concepts here on disinformation in international politics. Following what Rose said earlier, disinformation is incorrect by intent, but moving on to international politics, we know from the literature that we can sometimes expect government leaders, country leaders, to lie. Mearsheimer's 2011 book talks a lot about those lies: WMDs in Iraq, Vietnam, World War II, from a US perspective. So we can expect leaders to lie. That's expected, for instance, around intentions and capabilities; a statement like "I won't invade Ukraine", for instance, is typically a lie that is expected in the international politics arena. 
The point here, from a literature perspective, is that the literature has focused largely on domestic or bureaucratic politics, but there's very little on how a country might seek gains in international politics, especially by moving an international standard through disinformation. What I want to talk about is the UN General Assembly resolution passed last December called Promoting International Cooperation on Peaceful Uses in the Context of International Security. This resolution was adopted by a majority of 78 votes against 53. And as you might see in the Chinese head of delegation's comments, it was clearly meant to be against the export control regimes. I'm going to come in a minute to the mechanics behind this argument, but there's a broader context around the Chinese reaction to the AUKUS deal in the Indo-Pacific and also the Resolution 1540 renewal context. So this resolution was brought by China, supported by 16 sponsors and 13 co-sponsors, and you see in this map the result of the votes. Frankly, to me this map looks a little bit familiar; I've seen this map in other contexts. But interestingly, the literature also hints a little at this sort of North-South divide in export control, suggesting that the export control regimes are dominated by developed countries at the expense of southern countries. So how does this disinformation mechanic work? There's the resolution's narrative, the "peaceful purposes" framing: that science and technology are beneficial for countries' economic and social development. That is true; there are benefits from these technologies. But dual-use goods and technology might also have military purposes, and that's the purpose of export control. There's no basis to say, in this context, that export control is meant to deny access. The narrative says that some countries have legal mechanisms. 
That's what the resolution, or the Chinese delegates, say are put in place to restrict the export of science and technology to other countries, which are in fact the export control regimes, well built from lessons learned on hard proliferation cases. The irony in this resolution is that "peaceful purposes" quite echoes India's "peaceful nuclear explosion" back in the seventies. So there's a bit of an irony here, that "peaceful purposes" is being exploited again for a sort of counter-narrative on proliferation. When we come to the disinformation mechanics, there's the claim that denials are abused to restrict access to science and technology. This is not true; it is actually false, because denial rates in export controls are really low. Denials are national decisions, based on a risk assessment of the end user of a technology; they are not meant to deny access to technology, and they are in no way high in number. Denials are the exception. And I think the underlying argument in this resolution's narrative is to say that China doesn't deny access to technology; that's the implicit message here. The fact is that China has a very robust export control system itself and is also a member of the regimes, so there's a sort of ambiguity here about what they are doing and how this mechanic is working. The bottom line is that denials are country decisions. The export control regimes targeted by this resolution are in no way a system that prescribes what should or should not be denied; whether or not to export a technology to another country remains a national decision. Here you see some of the media reception of this resolution. 
These press cuttings are mainly Chinese-based and praise China for putting forward a resolution at the UN to address global security threats, saying that for the first time China has proposed a major resolution on arms control and disarmament at the UN and that China is promoting international cooperation. Interestingly, there has been very little mention of this on the Western side; actually, I'm still looking for one, so do tell me if you see anything in this domain. Export control officers and export control agencies are pretty aware of this resolution, but there's a sort of "so what?" reaction: what is China trying to achieve beyond this? Before getting into that question, a word on the relevance to proliferation finance, and how this impacts or can impact the private sector and financial institutions. Export control and dual-use goods control rely on legislation adopted by exporting countries, which are the ones producing dual-use goods and technology, and on internal control processes. Financial institutions are not really aware of these different technologies, and so they have to rely on lists if they want to perform checks on these transactions. Introducing a resolution like this, signalling that more than half of the world is not relying on export control, means a greater risk for financial institutions, because they will have to move away from a pure rules-based approach. Rules-based is black or white: I know what's on the list; if it's on the list, it should be denied or it should be reviewed. Under a risk-based approach, they have to make an informed decision, and this informed decision is even more critical according to the jurisdictions they operate in; especially when we're talking about export control, they need to be compliant in every jurisdiction they operate in. That will be a major challenge for financial institutions. Open sources could obviously be a tool here for financial institutions, especially to reach a level of confidence about transactional behavior. 
That is, to what threshold can they get in order to say a transaction is valid? That also gets into managing the level of false positives in accordance with the risk appetite of a financial institution. The bottom line, I would say, is that this resolution opens up more informational demand, especially in non-expert areas, about proliferation-related risk, with fewer references to rely on. More broadly, on the academic understanding of disinformation in international policy, a question I have heard several times in this context is: what were they thinking when they put forward this resolution? How was this disinformation produced? Because it seems to go against Chinese interests. They have robust export control legislation. They are a member of an export control regime, and a candidate for others. And getting half of the UN General Assembly to sign off on the resolution requires diplomatic effort, which goes against the other diplomatic efforts they are making to join those export control regimes. So there's a sort of paradox here about how this disinformation was produced. Another aspect is: how can we counter this narrative? We can't go back to the UN General Assembly and say, hey, by the way, you were wrong, that's false, the argument that was put forward in the first place was based on false premises. Although there might be a chance next year, because the resolution calls for the UN Secretary-General to produce a report on peaceful purposes, so there will be a sort of rendezvous point in a year for that. But basically it's difficult to counter, because it's wider in scope than a single or simple event. We're talking about policy framing, and so it becomes more difficult to counter. 
And I think my final point is on the alignment with the current literature on disinformation in the context of international policy, which says that in the short term you can expect results, but not really policy changes. That is true in this case: the export control regimes won't fall, because they are voluntary mechanisms, and countries will still likely apply export controls. But this especially shows the polarization, which goes beyond export control, beyond the topic. So in the short term there are probably already gains made beyond the topic of this resolution. The medium-term question is: can this disinformation be part of a more elaborate foreign policy agenda? It's not a single disinformation event; it's part of a broader foreign policy, which raises questions about the level of acceptance and support of the other countries, because we're not talking about China alone, we're talking about 76 countries who, together with China, adopted that resolution. All right, thank you so much, Jean, that was terrific, a really interesting case study, and one I remember when it came up, because I do cover the UN General Assembly and specifically the WMD segment where it was brought up. And I remember at the time, I write for the Women's International League for Peace and Freedom, where we provide these kinds of updates about what happened this week in First Committee, and we were discussing: well, where does this go? Does this go under my section on bio? Does it go under the chemical section? What are the Chinese doing with this export control resolution? So yes, very interesting to hear your analysis. For me, the most interesting point of your presentation was actually how you brought up this relationship between lying and disinformation, because it seemed to me at times that you were using the terms interchangeably, and to my mind they're quite separate. 
So disinformation certainly has a component of lies, but it also has a very large component of truths that are built on, right? And I think the way in which disinformation campaigns are run can be different from just lies. But at the same time, I think there's a lot to what you were saying about lies, and so unpacking that relationship a bit more could be very interesting. I'd also be very interested to hear what others have to say on that. The other quick point I wanted to make was that you talked about the importance of fact-checking, and one of the things we saw early on in the war in Ukraine was how a new way of spreading disinformation was to package it as fact-checking. So you would see Russia say, oh, this is what the West is claiming, but this is actually the truth, and that "truth" would be disinformation. With all this spinning, it can be hard to keep up, even when you're an expert in the area, right? Right. We've already got some questions in the Q&A box. I'm conscious of the time; we want to finish a little bit early, as some of us are double-booked, so we're going to go just a few minutes longer. I wanted to quickly get to these questions, though, and I hope the panel team is prepped for some very swift, quick answers. So the first question, and I think, Rose, this might work for you: when Russia's invasion of Ukraine began, did the overall Western media's disinformation towards the war, and the enormity of its move to demonize all of Russia and the Russian people, frighten you? I think there are two parts to this question, and I'll try to be quick. I have yet to see an enormity of campaigning in the Western coverage of the Russian invasion of Ukraine; I think it has been commensurate with what happened. However, I do think it has been used to frame certain things politically. 
The part of the question that I think is most interesting is the part about demonizing the entirety of Russia. Again, I think the concept of "the overall media" is unhelpful here, because actually a lot of this spreads through misinformation on social media. So it's not necessarily through official news outlets; it's much more people spreading things on Facebook and people demonizing an entire population. And that can lead to significant problems. We saw a lot of anti-Chinese violence at the beginning of the COVID outbreak; we've seen a lot of this mob mentality. So I think we have to be aware of the way that this misinformation in social media echo chambers can move into violence. Thanks, Rose. I think, as you're saying, it is important to distinguish between the government of Russia and the people of Russia, right? When the West talks about Russia or targets Russia, it's the government of Russia, not the people of Russia, that is targeted. We'll move on quickly to the second question here, a broad question on disinformation, and I wonder, Henrietta, whether you might want to answer this one. It seems part of the problem with disinformation today and false narratives is not that people don't hear other views, but that they simply don't trust these other views, so other voices are actively undermined. How do you think institutions and individuals can overcome this fundamental distrust? Thank you, a really important question. I think all the panelists have said different things in answer that suggest ways to do it. So the problem is not that the truth is buried; the problem is that the truth is not believed. 
I scribbled down notes from Ross's talk, and I'd love for other people to come in on this, about making sure the truth is out there, making sure that you rebut false claims, making sure there's transparency in the process; Jean demonstrated fact-checking. All of these are things that institutions or individuals can do, and I think there are existing mechanisms for doing them. As Rose pointed out, with newspapers and trusted brands you can get credible information. I personally feel it's a really difficult time for these to be implemented. I know there are initiatives building on these to formalize fact-checking, like the Centre for Information Resilience or the UN's Verified programme, which is trying to give a stamp of authority to things, but it feels like a very difficult challenge. I invite anybody else to respond quickly as well. I have some thoughts; I think that was a great answer. I think we also have to see this in the context of a sustained attack against institutions and the concept of expertise that has been going on since 2015, 2016. So we're not just trying to restore trust in the information that they put out, but in the institutions themselves, and that just makes the task much more enormous. I think that's right. A lot of this is anti-science, so how can we just throw truths or facts back at people if it's actually the facts that are the problem? I think that's right. We are running short on time, but I do want to bring in Ross and Jean, if you have any final thoughts that you want to share. I also want to give a quick shout-out to Alexander Kelly, based at the OPCW, who has popped a quick question in the Q&A chat. Nice to have you with us, Alexander. Your question is about how the governance structures have been undermined, and others may wish to come back on this in the closing remarks, which I will hand over to you in one second. 
I agree there has been this incredible unity, which should have, and I think to some extent has, supported international governance structures. It's a great question. But I think the way in which, for example, Russia has misused its platform in the Security Council to convey a lot of its disinformation has been hurtful to the Security Council, and likewise in the Biological Weapons Convention, et cetera. So I take your point completely, but I do think there are also examples where you can see it has really undermined some of the international institutions that we have in place. Right, so with that, I will turn the floor over; we'll go in reverse order, as we did with the speakers. So Jean, I'll hand over to you for a 10-second final takeaway message. Yeah, thank you, Philippa. I think the main message I had was that disinformation works on people's minds, but it can also work in the international arena, and there must be fact-checking in international politics as well. And the assumption that countries cannot be disinformed, because they would notice that it is disinformation, might not hold. Thank you. Ross? Excuse me, I'll just try and get all my bits and pieces unmuted. Well, I'll say thank you very much to everybody who's taken part in the panel. This was extremely interesting, and as someone who is primarily a nuclear person who has recently come into disinformation, seeing some of the things I've been discovering lately reflected in other areas as well has been extremely valuable. And I'm hopeful that by continuing to discuss these issues and do something about them, we can maybe promote those open dialogues and encourage the right people in the right places to be more questioning and more resilient against attempts to disinform. Thank you so much, Ross. Rose? I agree with Ross. 
This has been a really useful and interesting panel, and I think it has proved that this idea of disinformation and misinformation is much more nuanced: it's about mediums, it's about different objectives, it's about how different actors use it. And actually what I've come away from this thinking is that it's not enough just to take an institution-down or a newspaper-and-social-media-up approach; we have to somehow meet in the middle. Very good thought. Thank you, Rose. Henrietta, final word. My final words are: thank you very much. What an interesting set of talks I've just listened to; I've learned a lot from everybody. Thank you too for the really interesting questions. I think the question about the extent to which disinformation undermines global governance is a wide and rich one to consider, and there's a sense that all of this disinformation reflects and drives wider themes through societies. It's really important that we understand those wider themes and do whatever we can to make things move in the right direction. Thank you very much. Thank you, Henrietta, and a very warm thank you to all of you: a fantastic set of talks and an absolute treat to be allowed to chair you. I think we'll find a way to take this forward and bring together some of our themes, because there were a lot of similarities and things to build on, a lot of links between the talks. So thanks a lot from me. Thank you to all of you who participated, thank you for your questions, and until next time, take care. Bye-bye, everyone.