…open source verification. My name is Henrietta Wilson. I'm absolutely delighted to be part of this. I conduct freelance research on weapons regulation and it's really exciting to be part of the SOAS team looking at this really interesting project. Thank you all very much for coming here and welcome to everybody. We've got a lot of really exciting speakers and a lot of very exciting participants. Before I hand over to the speakers, I'm just going to say a very few words about how these webinars are going to work and what they're about. So, first of all, they've got a dual purpose. The main inspiration for running these types of webinars is simply to showcase the variety of open source verification that is happening right now all around the world. So, by open source verification what we mean are systems for tracking illicit activities that rely on publicly available information and are run by non-governmental groups and individuals. We're aware that technologies have absolutely transformed the capacity for looking at things remotely, identifying things that are going on, and also for communicating about those things. We can see this in activities designed to observe nuclear proliferation, small arms flows and movements of illegal radioactive materials; all sorts of tracking systems are happening in very different ways. So, we want these webinars to really explore that diversity of what's going on. But beyond that, we want the webinars to provide a space to consider some wider issues associated with open source verification. As different as all of these activities are, there are some commonalities around them. In particular, there are some challenges. Some of the activities encounter very similar ethical dilemmas. Some of them have issues around authenticating the data that they generate from open sources, and there are also some commonalities in thinking about the wider political significance of open source verification.
So, there are question marks around how much open source activities can contribute to bigger regulations against weapon systems, and that's particularly interesting to two projects at SOAS. First of all, the SCRAP Weapons project, which is looking at options for general and complete disarmament, and, relatedly, SOAS is looking at the possibilities and the desirability of a global weapons tracking system. With that in mind, we're thinking: could open source verification afford a seed, an inspiration, for a global system to track weapons? Would that be helpful to other global regulation systems? Is it helpful to have more diversity? So, these webinars are really an opportunity to explore those sorts of questions. What we're going to do today is have four short talks from people engaged in very different sorts of open source verification activities, and then we're going to have a comment from respondents who will reflect on the talks we've had. Throughout the webinar, we'd be really happy to have questions from all the participants, and you can submit those through the chat function. If possible, it'd be great if you could also give your name and affiliation; you can start asking questions or making comments right now. The webinar will be recorded, transcribed and published via the SOAS website after the event. And just to say, we're set to finish this webinar at 3pm, but we are going to keep the Zoom function open for another half an hour after that for anybody who wants to stay on for some informal discussions around the event. So, just to give you a sense of the running order, I'm delighted to introduce some really fantastic speakers, and I'm really grateful for their time in contributing to this. First of all, we'll have James Revill, who's a researcher on the WMD programme at UNIDIR, and he's going to be followed by Veronica Bedenco, who's an analyst at the Open Nuclear Network and Datayo.
Andrea Carboni is a senior researcher at ACLED and a research associate at the University of Sussex. Grant Christopher, a senior researcher at VERTIC, will be the final speaker. Also from VERTIC, we've got Anuradha Damalai, who's going to reflect and respond to the speakers. So, thank you very, very much everybody, and I'll hand over straight away to James Revill.

Thank you. Okay, hopefully I can get this working. Okay, can I check, can people hear me, first of all? Yep, okay, and I'm hoping you can see the slides as I go through. First of all, thanks to Henrietta and the organisers more generally for the opportunity to speak in this exciting series. I should note I am speaking in my personal capacity, so these views do not necessarily reflect those of UNIDIR or the UN. And I should also note that what I'm going to talk about draws from a series of papers that we are developing through a Norwegian-funded project on WMD compliance and enforcement, and we have some more papers to come which will focus more on open source verification and open source tools. In terms of what I'm going to talk about, if this still works. Yep, okay. I want to start off with a bit of an outline of some of the potential roles that open source information can play in addressing WMD treaty compliance concerns. But I also want to look at some of the users of open source information; I think asking this question of who will use open source, and how, can actually be quite helpful. And finally, I want to conclude with a healthy dose of realism. I don't really want to be a party pooper, but I think it is important to recognise there will be challenges in trying to use this technology. So to start off with some of the roles of open source, I think there are multiple ways this can feed into treaty compliance. The first, I think, is providing alerts of possible non-compliance.
Alerts of incidents such as disease outbreaks or alleged chemical weapons use will in some cases first become visible through open source data. To give you an example, ProMED was the first to report natural disease outbreaks including SARS, MERS, Ebola and the early spread of Zika. In a case where there was a deliberate outbreak of disease caused by a biological weapon, it is likely that tools such as ProMED would be amongst the first to report that event, or at least a suspicious outbreak of disease, as it would be understood at that point. Similarly, looking at chemical weapons issues, it was YouTube videos and social media materials which provided the first indication of a chemical weapons attack in Ghouta in August 2013. So there are two sorts of functions there. In addition, open source trade data can also be very useful in identifying anomalies with imports and exports, and open source satellite data can provide a number of functions which I think Veronica and others will speak to further later on. The second area to look at is the role of open source in contributing to investigations of non-compliance. And here it is clear, particularly in the work of the OPCW, that open source has become useful. So the IIT's first report indicated that they collected information from open sources; flight data, for example, was confirmed through open sources. Similarly, the OPCW fact-finding missions also drew in part from open source information that was cross-checked with other data. There's also a third possible role which I think open source can play in relation to WMD treaties: contributing to accountability. Open source information, once corroborated and authenticated, can serve as evidence to hold actors to account.
You can see one example of this in the International, Impartial and Independent Mechanism, which is using open sources amongst a number of other evidentiary tools to investigate and prosecute those responsible for serious crimes under international law. So there are these various different functions. I should point out that the latter one, contributing to accountability, is not the same as treaty verification, which I'll come back to shortly. In terms of who's going to be using these sources, I think you can loosely identify three categories. If you begin to look at the end user requirements, it can perhaps be useful in trying to target and develop data suitably. So if you take the case of NGOs, civil society and academia, these bodies are typically able to innovate quicker with open source technologies. They can produce assessments faster and they can be more forthright in what they say. However, questions have in the past arisen about the impartiality of NGOs. I think there are also limits to the extent to which NGO actors in civil society can feed into discussions around compliance; it's very often going to be a case of one-way traffic, with information going in. International organisations can also use open source, but they have to use it in a different fashion. They have to be very careful in what they say and indeed in the methods that they use. There is an expectation that international organisations will produce authoritative outputs based on technical assessments. This can be quite a slow process; it's not the case of an hour-long CSI episode where it's all over in the space of 60 minutes. It requires authentication and corroboration, and open source data needs to be validated along with the methods. Finally, for states, there's also clear usage of open source. It provides a valuable alternative to classified sources that can be useful in making points.
Indeed, in looking at treaty compliance, states will draw on all available evidence they have to reach a judgment around treaty compliance. This is important because it's essentially going to be states that enforce treaties, certainly in the case of major treaty violations. The last two, international organisations and states, seem particularly important as they will normally be responsible for verification per se, and there'll be a division of labour between the roles that they play. With this in mind, I'll turn to my final slide on this idea of a healthy dose of realism. I do feel slightly bad about being a bit of a party pooper in the first session of the first event here, but I do think it's important when you're thinking about open source verification, particularly in relation to treaties. This can be very useful, particularly as we face a current situation in which we have a highly contested information environment and considerable geostrategic tension. So, a couple of things to keep in mind. First is that open source is seen unfavourably by some states, who in the past have challenged both the authenticity and the basis of open source data. So we need to be careful that open source information can be authenticated. This is going to be even more important as deep fake images become more effective and efficient; authenticating material is going to be really important, and that may require drawing on a wide range of skills, for example in the area of digital forensics. It's also important to make sure that open source verification tools are corroborated with other information. Often the sort of technologies we're discussing can provide one single piece of what is a larger picture, which is required to address compliance with treaties. So we need to look at how any particular piece of open source data fits within a wider range of information.
The third point, and I think this is particularly important for international organisations and treaty secretariats, is that both the technology and the methods used will need to be validated. This in some ways armour-plates, or provides a shield against, criticisms around procedural or technical integrity. So having clear, agreed methods as to how these tools will be used can in some ways protect the use of open source. The fourth point relates to information management. In 2019, the IAEA Director General indicated that the agency was handling 140 million items of open source data every year. This raises the question of how international organisations, particularly treaty secretariat bodies, are going to be able to sift out relevant information, and how they're going to be able to collect, assess and preserve a wide range of open source information. That's something that could be quite tricky to do. I suppose my final point is that technological development and innovation in this area are really exciting and proceeding at a rapid rate. So if you are looking for fresh approaches to arms control and disarmament, there seems to be great value in looking at open source tools to see what they can provide. But at the same time, open source is not going to be enough in and of itself to verify treaty compliance. It can complement and augment existing practices, but it's not going to be a substitute, particularly in what is an increasingly contested information environment. There will be limits to what it can do. So I'll finish my short remarks there. I'm happy to take questions and, as I say, we have a series of papers and another six papers are coming out. If people are interested in this topic and want further information on what we're doing, I'm happy to send people an email update. Thank you for your time.

James, thank you.
What an amazing introduction to the sense of the possibilities that open source verification can afford, but also a real sense of some of the difficulties that it might face if it tries to engage with bigger political processes. I'm really interested to find out more about these sorts of areas, and I'd be interested to hear your thoughts about the extent to which some of these technological solutions might provide opportunities to bypass some of the geopolitical problems you referred to. So, I recognise completely that there are limitations that come with open source, particularly around validating the information that's generated, but it also seems to me that verification conversations in the past often got stuck in negotiations around what was politically acceptable to all contributing states parties. Now that so much can be seen by non-governmental groups, does that circumvent those conversations in any sense, do you think?

Sorry, could you repeat the question, I think?

Sorry, it's just a speculative question. I've also noticed you've got a question from Chenty about verifying the BWC: is it possible and necessary after COVID-19, or is there some minimum verification that would be agreed upon at the ninth review conference of the BWC? You can save these questions if you want to, you can jot them down, you can read the group chat. The thing that I was asking you about was, mindful of the fact that verification regimes in the past were dependent on what could be agreed politically by states, does open source verification afford opportunities for those sorts of stuck conversations to be bypassed, simply because so much is more visible than it was before?

I think there's certainly potential to do that.
I think, for the same reasons, given the current difficulties that we're facing in terms of the wider geostrategic context, getting acceptance around those tools may be more difficult now than it would have been in the past. So there's certainly technological potential to do that, but realising that potential will require overcoming political ceilings in order for these technologies to be accepted, which is perhaps becoming more difficult now. If I may, I'll respond to the BWC question later. Of course, thank you very much. Thanks James.

So I'm going to pass the opportunity to talk now to Veronica Bedenco. Thank you very much Veronica.

Thank you Henrietta. I just unmuted myself, and let me start sharing the screen. Okay, did it work? Does anyone see the screen? Good. Okay, so today I'll be talking about open source data analysis for nuclear risk reduction, basically presenting my organisation Open Nuclear Network's concept for nuclear risk reduction and how open source data analysis fits into it. Just to give you a bit of background on what Open Nuclear Network is: this is an organisation that was established back in 2019 in Vienna, Austria, and ONN is a programme that is privately funded by One Earth Future, a philanthropic foundation headquartered in the United States. ONN's financial and programmatic independence allows us to try to remain non-aligned and politically neutral, to reference the concerns that James raised in his talk. ONN's approach to nuclear risk reduction consists of two core elements, whose synergy, we believe, makes ONN's concept particularly powerful. The first element is open source data analysis. Decision makers in states engaged in conflicts that could give rise to the use of nuclear weapons need access to high quality, shareable information that enables them to make the best decisions in the face of conflict.
So our team of analysts produces operational insights using unclassified sources and leveraging publicly available data and technology. The second and equally important part of the concept is the engagement network. This is a concept to engage decision makers through a network of trusted intermediaries that we call the engagement network. The members of the network are mostly former senior civilian or military government officials, prominent academics and other experienced and well-regarded practitioners from the field. With the help of the engagement network, ONN hopes to transmit its analysis to top government levels. ONN's overarching goal, as I mentioned, is the reduction of the risk of the use of nuclear weapons, which we are concerned might be used in response to error, uncertainty or misdirection, particularly in the context of escalating conflict. Asymmetric access to information, when adversaries lack shared, timely and reliable information, is one of the most important risk factors. While intelligence services exist precisely to minimise those asymmetries and shortfalls, they do so for their respective governments, and sometimes sources and methods have to be protected and information cannot be freely shared. In a standoff relationship, adversaries may be incentivised not to be fully transparent about their intentions and capabilities, or to present them ambiguously. Such uncertainties and misunderstandings are critical risk factors for conflict escalation. And with the increasing public availability of data from unclassified sources and rapidly advancing analytical capabilities, we believe that civil society can and should provide alternative sources of trustworthy information and analysis. NGOs could provide independent assessment of the capabilities of the parties to a conflict, and this could contribute to alleviating the core challenge of asymmetric information.
NGOs could also offer independent verification complementary to that conducted by states, or perform fact checking to help clarify allegations or disputed incidents. This could be achieved without having to create bilateral or multilateral institutions or mechanisms, which would definitely save time in times of conflict. For both of the mentioned activities, verification and fact checking, open source capabilities are especially instrumental, as the process through which a specific assessment has been made can be reviewed and confirmed. Commercially available satellite imagery allows NGOs to continuously monitor activities around the globe and track changes: for example, movements of troops, military vehicles and large equipment, and the renovation and upgrading of sites and buildings, can all now be observed through rapidly developing and commercially available remote sensing capabilities. While governments could of course initially be uncomfortable with such third party involvement, a proven record of neutral assessments can increase confidence in the added value of a neutral non-governmental entity. One of the key tools that the analysts at my organisation use in their work is the open source data platform that we call Datayo. This platform was designed specifically by ONN to facilitate a more efficient way of data processing. Datayo brings together diverse types of data, such as text, images, video, satellite imagery, and aircraft and marine vessel tracking data, on one platform. By combining all those data sets, the platform gives users a more comprehensive basis for analysis. Another essential characteristic, and probably the most important one, is the accessibility of Datayo to the general public. We encourage individuals who are interested in open source data analysis to join the platform and contribute to professional conversations.
Datayo allows its users to interact with each other and discuss any of the data available on the platform. Those conversations allow for constant peer review of one's analysis, thus increasing quality and making analyses less biased. So with that, I'll stop here, and if you have any questions, comments or any points for discussion, I'd be happy to address those. Thank you so much.

Thank you, Veronica. Wow, an amazing insight into a project that's really operationalising some of the things that James mentioned. It's really interesting to hear about this model that you've got for crowdsourcing people to come and help authenticate the data that you generate. So it'll be interesting maybe to have a conversation with James about how that could work in the wider conversations he was talking about. I'm interested to put it back to you: you say that one of the roles of the Open Nuclear Network is engaging decision makers, so just a very quick response on how much appetite there is for that, how much interest there is among the people you talk to.

Right, so because we're a new programme, just established in 2019, we're still in the making. For the first year of our operation, we decided to concentrate on the conflicts on the Korean Peninsula. So we're trying to get engagement network members from the six countries that participated in the six party talks, namely the US, Russia, China, South and North Korea and Japan. Currently COVID has interfered with our plans, so it's really difficult to meet people, but we're still trying to stay positive, and we think it would actually be really interesting for those people to still be part of the conversation and be those peacemakers, so they can still make a difference. We think that's really attractive, and so we're looking forward to that.

Great. I forgot to unmute there for a second. Thank you very much.
I'm going to move on now to Andrea Carboni from ACLED and the University of Sussex. Thank you very much, Andrea.

Yep, hello. Thanks Henrietta for organising the panel and for the invite. What I'll try to do, I won't have slides, is provide a general overview of the work we do at ACLED, the authentication and sourcing processes we have in place, so the way we engage with open sources, and describe also possible applications of our data, and perhaps, if I have time, showcase one example of this application in combination with other open sources. ACLED is the acronym for the Armed Conflict Location & Event Data project, which is at the moment the most comprehensive real-time data collection on political violence and protest across the world. The project actually started as an Africa-focused initiative and then expanded to also include Asia, the Middle East and the Americas; we just launched two new projects on the United States and East Asia in the past weeks, and we are about to launch the Western Europe data as well, which will give our project global coverage. We work primarily with open sources, meaning that we rely on secondary sources like media outlets, reports from human rights organisations, data provided by INGOs, well-selected Twitter accounts, and some data that are shared privately, confidentially, by local partners in contexts where data collection is particularly difficult because the specific political environment doesn't allow, for instance, accurate media coverage. This means that we are not collecting the data ourselves; we don't have the resources nor the team to do that, but we rely on the quality of the sources we use, and we verify that the sources we use are trusted and provide information that is as accurate as possible.
Of course this is largely due to the real-time nature of the data collection and the general scope of the project: we try to work globally and have full coverage of all states, which means rural and urban areas outside of the usually most-covered regions as well. But again, because of the real-time nature of the data collection process, we are aware that there might be issues with data collection; this is why we periodically, I'd say weekly, review the quality and the content of the data we provide. I'd say that approximately 10% of all data we collect each week are corrections to previous events, simply because sometimes there is more information available that wasn't there at the time of the initial release, or because there was a lag in the availability of the information. And again, this is particularly important when there are spikes in protest activity in particular contexts, or changes in conflict trends that don't allow immediate access for journalists, particularly to warring areas. We can come back to that with questions later if you want to know a bit more about the data collection and the sourcing process, which is a big part of the internal reflections we have in our team. I'd say that the data are used by a number of different actors because of their granularity: we are not interested in aggregated conflict figures, but in providing disaggregated data on discrete conflict events, and we provide information on the specific date, location, the actors involved, the number of fatalities and some other additional information. And I'll try to connect here to one of the topics of the webinar, which is how we can actually track weapons.
We don't provide that kind of specific granular information, I'd say, although whenever it is available in news reports we provide as much information as possible about the weapons used in conflict events, and we note that in the data. However, we've seen that the partners and organisations that use ACLED data combine these data with other sources. We have data being used in academia, by international organisations and non-governmental organisations, for planning purposes for instance, or to see the outcome of specific measures and policies taken by governments, and by journalists. I wanted to showcase here an example from last year, from an investigation that was carried out by a group of French journalists who were working on the use of specific French-made weapons in the war in Yemen. Let me just share my screen so you can have a look. The investigation is called Made in France and was intended to investigate whether artillery weapons and tanks in particular were used in Yemen for offensive purposes, something that the French government had long denied but that, according to these journalists, was discredited by some leaked documents that they were able to access. Perhaps I can show here one of the specific applications; let me see, the website is a bit heavy. Yeah, here we go. So we've got these maps. This was one of the leaked documents at the time, which showed that in some areas along the border between Saudi Arabia and Yemen, Caesar howitzer cannons were stationed by Saudi Arabian forces and claimed to be for defensive purposes. These cannons were actually French-made, and the French Minister of Defence had denied that they were used for offensive purposes or in combat operations.
What happened was that these journalists contacted us and asked whether, in the areas within the range of the cannons, there were any events in which artillery had targeted civilian areas in particular, because that would, if not prove, because of course it's very difficult to prove that, then provide substantial evidence that those weapons had been involved in such offensive operations, particularly targeting civilian areas. What we did was provide such data, which was also combined with satellite imagery, so an additional layer of open sources being used here. These data, and I can also show some more, identified at least 35 civilians killed in 52 different bombardments located in areas in which these Caesar howitzer-type cannons were stationed. The same was also done for other weapon systems: Leclerc tanks, particularly along the coast in these areas of Yemen. So, to sum up, and I'll conclude here, sorry, I forgot how to, yeah, stop here. This was just one of the examples in which ACLED in this case not only relied on open sources but also contributed to a large-scale open source based investigation that combined multiple sources: satellite imagery, the use of conflict data, and knowing where particular weapons were stationed. So I'll stop here, and perhaps we can also talk more later about other applications or sourcing or whatever questions there are. Thank you very much.

Wow, another really amazing set of insights about the scope of what comes under the banner of open source verification and what can be achieved. As James introduced right at the start, there's this sense that there are issues around authenticating what you do, but as Veronica and Andrea
both pointed out, embedded in both your models, in quite different ways, there's a sense that you filter out what you're getting: you yourselves are authenticating the data as it goes, to make sure it's as reliable as possible, and you respond to suggested corrections. So I'm going to move us on to Grant now, but I think there are options for a really interesting conversation about knitting those all together with the bar that James set us to start with. So thank you very much; next is Grant Christopher with his talk.

Thank you. Hello everyone. Sorry, I just wanted to get everything settled before I started talking. So we're going back to nuclear, really tying in with Veronica, and I think it'll touch on quite a few of the things that James was talking about. But unlike James, who really had to think about the problem of the use of chemical and biological weapons and attribution, we're not perhaps thinking about attributing the use of nuclear weapons, but more about how many weapons people have and what the status of their nuclear infrastructure is. No one's really offered definitions, we've kind of skipped that stage, and I'm certainly not a definitions guy, but what open sources and open source intelligence are is roughly: the internet; broadcast media, so TV, radio, etc.; and print media, so books, journals, magazines. And then everything kind of gets thrown into the internet category, so social media and satellite imagery, which traditionally has always been a completely separate intelligence division, and there are people that used to be satellite imagery specialists that have now moved into open source satellite imagery work. So I want to, picking up on James's last slide, talk about the limits of open source before we start talking about how it's used. It's pretty hard: it's hard to get what you want, because there's often a scarcity of good sources. Maybe you want to find
information and answer a question, and then you realise there just isn't information available, no matter how good your team is, no matter how good your information management infrastructure is. But if you have several small pieces of information and you have a skilled team, you can go very, very far and answer some really interesting questions. It can be effective, and you can often work really fast; NGOs and small teams can move very fast, though often analyses will take a few years before they're ready to be made public. But, as people have said, you can openly share these findings: this is a way for states that don't really want to talk about national technical means, about spies and satellites and so on, to talk about what another country is doing without revealing sources. Really importantly, though, it's just a component when talking about verification regimes. "Verification" has been used in two different ways in this webinar by everyone. If you talk to OSINT people, verification means: this thing that we found, does it actually show what it appears to show? That's what verification means there. And then we're also talking about an international agreement, where you are trying to verify that each party is doing what it said it would do under the agreement, and that's what I mean when I talk about verification here. You cannot use OSINT alone to constitute a verification regime; you must have some other elements in your regime to do it. So, I come from VERTIC. VERTIC is first and foremost a technical verification organisation: we help organisations build and improve their verification regimes; that's basically the raison d'être of VERTIC. I want to talk about a very exciting recent case study I had the privilege of being involved with. This was really done by Dave Schmerler, who works for the Center for Nonproliferation Studies, and this work is also in partnership with the Royal United Services
Institute (RUSI). We're looking at North Korea from open sources, and Dave just published this analysis: a re-evaluation of North Korea's only known uranium yellowcake production facility. In the nuclear fuel cycle, as many of you know, you've got to get material all the way from mining it out of the ground to your weapons testing facilities and metallurgy factories, and you've got to make some fissile material, stuff you can put in bombs. So there are going to be quite a few facilities in every country doing this, and the real goal is figuring out what's going on at these facilities: what research is going on, but also what material is going in and out. Clearly the main product of this analysis is this large picture, but there are really important other sources of information that go into the analysis, sources that have leaked into the open source domain or moved into the open domain. They include on-the-ground inspections: North Korea has been on and off about letting inspectors in, and there's footage from one inspection that revealed really important technical information about how this facility processes the uranium; that was a really important part of the investigation. There were diplomatic cables from the Cold War that give us a clue about how concentrated the uranium is that is being mined out of the ground. And there are some important technical papers, in general about the process that goes on at this kind of facility, but also specific to what North Korea is doing. So the big takeaway: satellite imagery (this was titled "in the age of Google Earth", so it has a satellite imagery focus) is dependent on what other sources are available, and you really need to be able to do on-the-ground inspections and install technical equipment to have an appropriate and competent verification regime for assessing a nuclear programme. I've gone the wrong way, apologies. So I just want to
conclude on this. This is another brief case study; it again comes from CNS, really, but it's about how OSINT can respond to the news cycle, and what its role is. The picture on the top right is from the negotiations of the Iran deal, the JCPOA. Right up to when the deal was concluded, it had significant domestic political opposition in the United States and in other countries. This has been done before; this isn't the first instance and it won't be the last, but some people weren't happy with the status of negotiations and that something was going to be concluded. Iran had a history of concealing enrichment facilities, enrichment being one of the most sensitive and important parts of the fuel cycle, where you get uranium you could put in a bomb, and an organisation said no, they've got another facility, trying to erode Iran's credibility. I got contacted and asked about this, but thankfully the CNS team, Jeffrey Lewis, had already published a refutation of this on the Arms Control Wonk blog, and people associated with this, Melissa Hanham, were involved too. And this idea that the truth has to get its shoes on while the lie is travelling around the world: well, this refutation was so fast that the damage was really limited here, and I think that's a really important thing to think about, that OSINT can be rapid. There is also a bit of an asymmetry: it's often a little easier to refute something with OSINT than it is to prove something, and that is a very particular challenge associated with OSINT as well. But I would just like to leave it there with that thought. Thank you. Thank you very much, Grant, so much to think about there. We've had a question from Julia, thank you very much, that I'm going to put straight to you, but anybody else who wants to think about it
as well, it'd be interesting to hear your reflections. She's asking what, specifically, the uses of open source verification are, and I'm very interested to hear Grant's response to that, given that you're so categorical that it can't replace a formal verification regime attached to a treaty, which I think echoes the comments James started us off with straight away. I absolutely appreciate what you're saying, but it also feels to me... you mentioned that one of the tasks of conventional verification regimes is to build confidence in compliance. Well, it is very hard for a conventional verification regime to do that, so the fact that it's also very hard for an open source scheme to do so doesn't seem to me to necessarily rule it out. I don't know, I'm thinking off the top of my head. And secondly, Julia asked whether you see a difference between open source intelligence and open source verification. It feels as though there are such differences to me, but I'll be interested to hear your thoughts about that. Sorry, I'll unmute. So, going back to an earlier comment that was made about the sensitivities, and about how you'd actually get these things included in a BWC regime: one of the reasons "open source intelligence" is used is that that's what the practitioner community uses, but politically it's easier to just talk about "open sources"; that's pretty much it. As for my understanding: I have never had access to classified information, so I can't give a complete answer about what role open source plays in an all-source intelligence question, but I have spoken to many people who have tried to do these things, and hopefully what I presented gives a little more on Julia's question (I know Julia posted it before my talk). When you have all sources, then open source is one component; and when you have only open sources, then you have to rely on leaks and
other such things; I think that's the way it goes. There are some other really interesting discussions here: I think there have been questions of culture in organisations that actually have to answer these kinds of questions, where secret intelligence, intelligence obtained through certain means, was always seen as more valuable, even if you could get a much simpler, better or different answer from open sources. So the secret versus the easy-to-obtain shouldn't necessarily be worse or better information; it should be evaluated carefully. I don't think I can really answer that question fully, though, but that gives a look at it. No, that's really helpful, thank you very much. I'm going to hand over to Anuradha Damale, please, for your reflections on these four fabulous talks. Thank you very much. So, I'm Anu; thank you so much for the introduction, Henrietta, and thank you to SOAS for organising what's been a really interesting event. Thanks to James, Veronica, Andrea and Grant for your presentations as well. I'm going to quickly provide some reflections from what I've scribbled down during all of these talks; hopefully the things I say will encourage a bit more discussion and a few more questions from the audience. I work largely in the CBRN space, so that's chemical, biological, radiological and nuclear (swapped those over, sorry), when it comes to verification, so it was really interesting to hear contributions both within and outside of those specific topics. When thinking about the topic of the webinar, the first two or three things that struck me as interesting to discuss, which have been touched on to some extent: the first was this idea of who will guard the guards, that is, the verification and authentication of the methodologies and data surrounding verification in open source data collection. I'm also really interested in how this spills out of CBRN into space as well, and into conventional
arms, small arms, and into the ethics surrounding both the power dynamics in data collection and ownership, and also just how that open source information can be used. This is something that's been talked about a lot in my background in science, in open science: the ethics of open source information and how it can be used for good and bad. So what I'm going to do, really quickly, is summarise some key highlights from each of the talks and then leave it to questions from the audience, because I think there will be more than enough. James, I'll start with you. Thank you so much for outlining the three roles of open source that you talked about in the context of the work that you do, and for almost commenting on the ontologies of the methodology and data, in that they mean different things to different actors: they mean different things to states, they might mean different things to NGOs, and that is even more complicated when we operate in such a contested information environment, as you described it. You also mentioned, and Grant reiterated, this idea that open source is not a substitute for other tools in compliance but actually a part of the puzzle, almost a part of this whole system of moving parts that is used to verify compliance, or to judge the behaviours of different actors in the global setting. So I think it was a really good talk to start us off, about verification, its uses and its caveats, from quite a realist lens as well. And then moving on to Veronica: I've heard so much about Datayo and the work that Open Nuclear Network do, so it was really interesting to finally see it. I know that you've only started quite recently, I think last year, is that right? Yeah. So it's really great to see how you're directly addressing the power dynamics, almost, that I was talking about, this asymmetry of information. It's
something that needs to be addressed, especially when you talk about ethics and the verification of information, and especially touching on the diversity of the data and how they interact with each other. The sense of built-in accountability, I think, is really, really interesting. What I'm also quite interested to hear about is the agency when it comes to accountability: where does the responsibility lie, or does it lie anywhere, or is it as dispersed within the network as the accountability system is itself, almost? Is that agency also peer reviewed, when you talk about who is allowed to make a judgment on whether this data is good, bad, reliable, inaccurate, what have you? And then, talking about diversity of data: Andrea, learning about ACLED today was amazing. It was really interesting to hear about work outside of CBRN when it comes to verification, so political violence, perhaps more on small arms or conventional weapons. It was also really interesting to hear about how this is now a global network, so the coverage is almost quite decentralised. It was great to see the data visualisation work as well that you've done, which is not just the geographic visibility part; it was cool to see how all of the different data interfaces and relates to each other, and how you build an image of a singular data point or a singular event from all the different types of data that you have. It's also quite different to the verification work that I've been exposed to, in that we at VERTIC (I've worked within Grant Christopher's team) are very often looking at lots of different data and aggregating it over time for a specific country or a specific issue, whereas you have discrete packages, almost, of data for individual events. Open source information can be used for so many different things, but there are limitations to each, and
I'd quite like to understand a bit more, perhaps, the difference in the methodologies surrounding those two approaches, what the pros and cons of each are, and how they fit into a larger compliance or security regime, if you will. And then finally, Grant: it was a great review of the projects that VERTIC is doing, but I also really appreciated you defining what OSINT is, because a lot of the time in this field... you know, we literally have an institute called the Acronym Institute because there are so many acronyms, and I'm a big fan of just trying to make everything a bit more understandable. So thank you, firstly, for defining what OSINT is, but secondly also for outlining how difficult working in OSINT can be: it's almost like a puzzle in four dimensions, but as long as you have lots of different people working together, it can kind of work. Now, the one thing I really took away from your talk, Grant, was when you mentioned how some of the data was initially protected and then became open source, things like ground observations within North Korea. It's really interesting to think about how the nature of data can sometimes be dynamic, and so, by that logic, the quality of open source data can be very dynamic too. So I'm quite interested in hearing from you: the quality of a data point can differ by source, but it can also differ over time, so if you have a country, for instance, with a changing administration, perhaps the quality or amount of data you get from that country changes over time too. I'm quite interested to hear what you think constitutes good data, or how you apply intelligence methods to be discerning enough to understand what a good, reliable data point is. Or is it always going to be relative to the other data you have, more about how it fits in with the rest of the data? So, what sort of
judgment calls do you have to make when you're doing open source intelligence work? And also, when information comes from sources such as defectors, or countries that have a vested interest in a certain issue, how do the ethics of that scenario play out? I'm actually going to leave it there, because I scribbled down so many notes that it was really hard to provide a short reflection on what's been said, but thank you again to all the speakers, and thank you Henrietta and the SOAS team for organising today's event. Thank you, Anu, you've done an amazing job of encapsulating so much of the richness that we've got from the speakers today. I'm very mindful that we're at three minutes to three, and at three o'clock I said we'd segue to a more informal discussion; some people may need to leave at that point, so by all means do leave if you need to. I'm going to give each of the speakers a chance to respond to Anu's points, but also flag that you've had a couple of other questions through the chat function: one very specifically to Andrea, about the methodology for looking at the specific case study you mentioned, and one from Lydia Wilson, asking about how much disinformation affects the challenges of open source verification. So I'm going to go through in the order of the speakers, to give you a minute or two maximum to respond, and then, if you can stay for the more informal part, I'll give you a longer chance to respond. So, James, you first, please. Okay, thank you, Henrietta, and thank you for the questions; I'll try to be as quick as I can. Just to point out, Veronica, the comment on impartiality was not directed towards you or ONN, just in case there's any confusion there. If I could address Chen Wei's question, I think that would be quite important, because it was a very good question. There are tools that we didn't have last time there was a serious and significant discussion around BWC verification; one very small example could be online open source trade data. But
this is one of a number of tools, including those that Veronica and others have spoken about, which could be adapted and combined. I think there is potential that these could give us greater confidence in compliance, or indeed non-compliance, but I'm certainly hesitant about using the term "verification" in this regard, because that, to use Anu's term, raises different ontologies and different meanings there as well. But I think it really does point to the importance of preparing to have a fresh and constructive discussion in the run-up to the review conference scheduled for late 2021, and I'm hoping there's a window of opportunity there which states will seize. If I may, just one point in relation to Lydia's question on disinformation: this is not something new; it's something which has been around for a long time, it's just become faster and spreads much more widely and swiftly. I was always struck by a comment from the late Julian Perry Robinson; I'll paraphrase, because I don't have his eloquence, but the idea was that allegations of association with chemical and biological weapons have been used by well-intentioned and unscrupulous actors for millennia to vilify enemies. It's been around for a long time; it's quite a potent way of getting at people, even if it's not necessarily true. Anu, if I may, I'll respond to your comments later, and I'll leave it there. Sorry; thank you, James, congratulations, thank you for responding succinctly, and before I bang on, I will hand over to Veronica, please. Yeah, thank you, Henrietta. So, the first point I want to address was about using open source for verification, and I want to be clear, because in my talk I discussed how NGOs could perform verification, and I want to make it crystal clear that we are not talking about substituting state-level verification with just open source, but rather adding an additional level of verification that NGOs could
offer as an addition to the state-level verification. And then, to the comment that Anu made about responsibility, about who is responsible for the analysis that we do: I guess the ultimate responsibility still lies with the organisation, and I guess there is no way to ensure there will be no mistakes in the analysis we do. But if we do make a mistake, we would be the first to acknowledge it, yes, we did it, and rush right there to correct ourselves. That's where the tool, we believe, becomes really instrumental, because you put this in the open source and you crowdsource your analysis, so you have many, many people there to review your stream of thought, and all the methodology is there in the open. So we think this is one of the ways to make our analysis as unbiased as possible and to limit the number of mistakes we could make when performing that analysis. And with that, let me once again thank you, Henrietta, for organising this; I don't think I touched upon that when I was making my remarks, but it's a really great webinar, and thank you for having all of us. Oops, there I was, talking. Thank you, Veronica; it's thanks to all of you that this is such a rich and interesting conversation. So, over to you, Andrea, to respond quickly to these questions that we've had. Yeah, I'll try to answer quickly. The question was very large, and perhaps I'll also try to answer Lydia's question about disinformation, or in general about how we balance even the richness, sometimes, of information. What we try to do, at least, is, first of all, to triangulate the information we have, the reports we have, as much as possible and as much as is available. For instance, even in very polarised contexts, where the warring parties may try to spin their own positions, their own
figures, we always try to combine sources and see whether we can find more accurate information on either side. If not, we always rely on the most conservative estimate possible; we try not to inflate the scale of the violence. So we report the events, we make sure the events we capture are included in our data, but when it comes, for instance, to including the number of fatalities, which in particular contexts can be very politicised and used by different sides to push their own propaganda agendas, we always rely on the most conservative estimate. This is done consistently across the globe and across regions. Again, I've worked primarily on Yemen in the past few years, and this could be done in areas where there was a lot of coverage, where the media environment was rich and you had different positions on the same event, different reporting: sometimes one article reporting no fatalities and another reporting ten fatalities, let's say. We would always rely on the most conservative estimate in those cases. But the problem comes in environments where there is only one side reporting. This is the most difficult for us, because we have to make a choice between reporting the information as it is available, while at the same time being aware that we are using only one side, because that is all we have available. Northern Yemen is one of those cases where it's very difficult to have a full picture, because on one side you have Saudi Arabia, which is a very closed environment where media reporting is limited, at least in those warring border areas, and on the other side you have the Yemeni side fighting against Saudi Arabia, which is very keen on pushing some types of violence over others. In those cases, what we try to do, again, is to include the
information and then, again done consistently across all regions we cover, use a standard figure for fatalities, which is usually 10. Let's say we know there has been "an unknown number of fatalities", or "injuries and deaths"; these are the usual sentences you read. We rely on using 10, which is a kind of average estimate, drawn from some specific areas in Africa where we could estimate that this was an average number, in between the fatalities we were missing and the fatalities we were sometimes overestimating; that's why we use that. Again, and I'll close here: we always stress, particularly when it comes to fatalities but in general, because these are very politically charged data that can be politicised, that the data, and overall fatality figures, are usually estimates and should be treated as estimates that can be more or less accurate. We don't claim to provide the very exact number, because, since we can't do first-hand verification while working on large-scale projects, this is what we try to do: rather than saying this is the exact figure, we work on either time trends or estimates. I think this is a bit more productive and true to the reality of the data collection process. Thank you, Andrea. Very interesting; it feels as though there are different levels going on here, that your data maybe is bypassing some of the bigger treaty conversations that James alluded to, and we've got a question from Paul about that. I'm mindful that we're at eight minutes past three, so I'm going to invite Grant to quickly reflect on the questions so far, and then we'll segue into people beyond making direct comments across the board. Yeah, thank you,
Grant. So, I'll go straight to Paul's question, because I think it's the most interesting and difficult one, and Paul knows very well that we shouldn't expect OSINT to be some silver bullet for one of the most difficult problems in arms control, which is compliance disputes. But it can help: in the right context, where there is good open source information, maybe there's a way to help parties resolve disputes by openly sharing it, without revealing the sources and methods by which that information was first obtained elsewhere. Perhaps that's a way to talk about that. On Anu's questions about data over time, etc.: you can think about it this way: some states have become aware of OSINT, so we're in a capabilities versus counter-capabilities regime. They're known to censor their propaganda, change the angles that things are shot from, and so on; they know that people monitor their websites. There used to be a lot of low-hanging fruit, but now the community and the NGOs have built up capabilities, so there are additional things we can do; still, there's always going to be a capability versus counter-capability battle between getting good OSINT and it being concealed. On disinformation: I gave a pretty good example of quite elaborate disinformation. To do it well probably takes as much time as a good OSINT investigation, and at some point you've got to think about that. So maybe we're in a good place, where the OSINT investigators have the advantage and it's harder, at the moment, to craft a credible disinformation campaign. In terms of each individual piece of information that comes through, every good OSINT analyst has a checklist of things they can go through, be it a social media post or a video or an article
you find, and then you can go through technical authentication procedures; CNS uses a tool called Tungsten to do technical analysis of photography and imagery. So there are things you can do to be pretty sure that the thing you've got is what it says it is, or to discard it. Those capabilities do exist, and they are quite hard to spoof, I think. Deepfakes: we're in a new era where there are going to be a lot of problems, and on the political news-cycle side we're probably going to need an army of citizens refuting this material as it comes in; on the WMD side, I don't know. Thank you. Thank you, and again, for all of you, well done for summing up so quickly. We're now at twelve minutes past, so I think we are officially in the informal discussion part. I'm aware there are some questions in the chat function that haven't been aired yet. From Julia, we've had a question asking how you factor in the selective nature, the individual motivation, of open source verification. We've heard a very quick response to Paul's question, and Lydia had a comeback on disinformation. Do any of you three want to voice your questions in person? You now can: just unmute. There have been some short responses, but feel free to talk if you want to. No? So I'm going to... it feels as though there are some very... oh, Paul, Paul, yes. Well, I'll have a go. First of all (I had to miss some of this, so this point may have been made), open source may be more useful than it's politic to say, because open source ticks off what state intelligence organisations are finding: you get a synergy with what national technical means are picking up, and that all helps the totality of verification. But it's generally uncool to mention that, because it then gets denounced as a tool of spying. But it does
seem to me that this, as I tried to suggest, is important, but it really doesn't address the core problem of managing treaties on WMD, which is that whatever information you collect, it seems it can be ignored, refuted and denied. Maria Zakharova said something very interesting the other day in relation to Syrian chemical weapons: she of course rejected the OPCW Investigation and Identification Team's finding, but she went further and said, in effect, that no organisation is entitled to reach any conclusions about attribution or compliance except the UN Security Council, where we know what Russia's position, supported by China, is about vetoing any inconvenient conclusion. I might be wrong, but if I'm right that this is the fundamental problem in compliance assessment and verification, that powerful states simply don't want it to be possible against them or their clients, then I don't know how OSINT makes a fundamental difference, or how it ever could, with whatever better satellite resolution or information sharing. Thank you for that, Paul. It echoes all sorts of things that people have raised through the webinar, but also that, globally, geopolitically, treaties are encountering problems in verification. It seems to me, and I'm going to open this up to the other speakers and everybody else, that right here in this webinar we've got different models of what the point of open source verification is. Andrea gave a really neat example of how it wasn't tied into some big treaty regime for something to happen: the French journalists collected data and were then able to hold their government to account (Andrea, please correct me if I'm wrong). Likewise, Veronica, it feels to me as though your project is not necessarily about making treaties work; it's about generating information so that other people can do something, whatever that is; it will have a life of its own. I can see some nods, so,
Veronica, do you want to come in first, and then Andrea next? You're totally right; I think you summarised it quite well. We're not trying to replace a verification mechanism for any of the official treaties; rather, we want to empower decision makers with unbiased analysis of the information we find in the open source, and with those facts they can come to the negotiation table and start those discussions. So again, just to highlight: by no means are we trying to replace state-level verification; I don't think that's possible with the current capabilities of the open sources that are available. We're just offering that additional layer, to bring facts back to the discussion. Thank you. And, yeah, sorry, Paul, I'm just going to say: yes, facts, but disputed, inevitably disputed facts; maybe true facts, but certainly "alternative facts" from the point of view of very interested parties. But I guess at least it triggers some discussion: even if those facts are disputed, we at least start the conversation, because sometimes parties to a conflict are just not able to share the insights they have, because they're not allowed to; it's classified information. And here we come and offer them those insights and say, how about we talk about this? Even if they start to say, no, no, no, that's not true, those are allegations and whatnot, the discussion is there, and that's where we see our value. Okay, that's reasonable. Yeah, so my guess is it's worth thinking about the counterfactual, Veronica and Paul: if you didn't have conversations empowered by information that can be demonstrated to have some basis in fact, then surely it would be a worse situation for those sorts of conversations. Thank you. Andrea, do you have anything to comment? Because ACLED, it feels, isn't aiming to be part of these bigger conversations about constructing treaty-based regimes. No, definitely not.
What we do, actually, is treat our data as a kind of public good: we provide a resource for a number of different actors. We don't charge anyone; the data are publicly accessible from our website, and so they are used by a variety of actors. They are certainly used by governments, but there is also a variety of non-governmental actors that have very practical needs to include these data in their operations. One of the recent changes we made to our data was to include sub-event types, which allowed some humanitarians, particularly in worrying conflict areas, to verify whether certain roads were accessible for them, or whether those roads were instead marred by landmines or other forms of fighting. The way these data are collected, in real time, is of course a resource for many of these organizations, which sometimes don't have the resources to either collect or analyze themselves; we know this from direct experience and from conversations we have had with a multitude of organizations working in Africa, Asia and elsewhere.

Maybe I can add one very small comment, a slightly more general reflection. The data we provide are usually picked up by either journalists or governments; they may help shape or change the narrative, and empower someone in the media, in government circles or wherever, to make a change. But I don't think the data themselves are a resource that will change everything. They are an additional resource you can use, along with many other things, to push for that change. I'm thinking of how one of the most frequent uses of our data was
in relation to the number of fatalities in Yemen, particularly over the past five years. Official figures put the death toll at up to around 10,000, and based on our data collection we said that, at some point, it was ten times higher. Well, it wasn't that figure by itself that changed the narrative or built coalitions within some government circles; rather, it was those government circles picking up our data and then using it to challenge what had been the narrative within those circles. I don't know if I've answered the question, but that was a bit on the use of the data we provide.

I think that's really interesting, that really neat idea that the data themselves maybe aren't the only thing that's going on here. Building on both Veronica and Andrea, I'm going to pass this back to James and Grant, and to Paul and Lydia if you want to come in, so do be making notes. From what I've heard from Veronica and Andrea, there's a sense that good data matters; it can matter in different ways, but the starting point is to get some good information and check it. And in a world of disinformation, which Lydia picked up on, where there's loads of incorrect information out there, it's quite important that there are checks and balances. James pointed out there's nothing new in disinformation, but there's a sense that maybe there is something new in the scale of it. Grant was saying that lots of it can be spotted, though that might not always be true. Sorry if I get this wrong, but equally, in Veronica's and Andrea's models we heard ideas for how that disinformation can be managed to an extent: I think Andrea used the phrase triangulating the data points, and Veronica, you were talking about crowdsourcing. So a really useful function is perhaps this
filtering of this soup of information that's out there. And Paul has already raised the question: so what? We find this stuff out, but how does it tie into bigger aims? I want to put it to James and Grant, and anybody else who is interested is welcome to comment: what bar needs to be hit for this to make a really big difference? And is any such bar, to an extent, artificial, given what we've heard from Veronica and Andrea, namely that getting a conversation based on good facts is itself enough? So, James and Grant, I don't know if any of that made sense, but please feel free to come in.

Yes, it did; I'm hoping people can hear me. I agree with Paul's take on the limits of open source. Two things I would raise that might be worth consideration. One, which I mentioned before, is this role in alerting people that something has happened: instances of chemical weapons attacks would most likely first be identified, or you would first be alerted to them, through social media, and similarly, suspicious disease outbreaks would most likely first appear as alerts through systems like ProMED. There's a role there. The other possible role is in relation to accountability. Now, I realise that is not the same as treaty compliance; they are distinct and different things. I'm also aware that the history of prosecuting people for chemical weapons-related crimes is not particularly good, but it's a possible role these tools could play: open source data being corroborated, authenticated and used as evidence to prosecute crimes. It's a possibility. And I think the other factor is this idea that it provides additional texture to the picture of a state's compliance or non-compliance, which I think is nonetheless useful. Moving forward, the pendulum may swing back; we may not get back to business as usual, but this period of strategic tensions is not going to last forever, and there
may be possibilities in the future for actually building and accepting some of these tools for use in treaty regimes. But I think you're right in terms of the limitations; it's easy for states to dismiss it.

Thank you, James. Grant, did you have anything to add?

Yes, I've been thinking about Paul's point this whole time, and I don't have anything significant to add that wouldn't take the discussion quite far off topic. I think there are ways to address Paul's point, and it's very interesting and important. And Paul, you might have missed my talk, where I was saying that even in getting to the treaty stage in the first place, you might face strategic release of this kind of information, and being able to counter that immediately can be quite helpful as well.

Thank you, Grant. Lydia or Paul, do you want to come in with your comments?

Just this: is there a pattern here? In the case we heard about, French weapons in Yemen, that seems real and true, and I can believe that British governments have been embarrassed by revelations gathered by journalists, or in future by satellite analysts. But is the underlying pattern that these states are mostly liberal democratic in their behaviour and their accountability, and so it makes a difference to them? Does it make a difference to states that are very determinedly non-liberal-democratic and simply refuse to be held accountable by anyone except their own governmental structures? Is that the true pattern in the background of all this?

Paul, that's certainly where I would take the extended discussion; that's the lens through which to answer this. But it goes beyond my remit; I'd probably have to say I'm taking my VERTIC hat off and speaking personally. Thank you, but I think
that's a really interesting point to raise, Paul, and it connects back with a question from Julia about the potential for inherent discrimination within open source techniques: it's unlikely that everybody all around the world will have equal access to the technologies that are needed, or to the political protections that liberal democracies maybe afford, though I think that's also debatable, to be honest. We've had an enormously interesting conversation. As a final point, Lydia, would you like to come in on disinformation? Do you feel that the things you were mentioning have been addressed? It's been really interesting, thank you very much.

I didn't want to disrupt or derail the discussion too much. I just wanted to say that, whatever tools you use to overcome the disinformation problem, the aspect I see as new is that the other actors out there are aware of all of them and are able to match them straight away. So, for example, crowdsourcing: there are troll farms pumping out content, so they've got crowdsourcing on their side as well, and the manpower to disrupt our advantages in crowdsourcing. Or triangulation: they can produce triangulated sources from a country's conflict quite easily when they're in control there. So what I see as somewhat new about disinformation now is the speed with which they can respond; the arms race of open source information can be pretty formidable.

Thank you. Capability and counter-capability: we're in a capability-counter-capability regime. You've already alluded to that, Grant, haven't you, about being mindful of the need to catch up; sometimes you might have the advantage in that relationship, and sometimes the differential might move, and it might move very quickly. So, thank you, I'm going to draw this to a close because we're at time.
I want to thank everybody who's been part of this. To the speakers: thank you for your amazing knowledge and expertise, and for your generosity in responding to the webinar invitation; and thank you also for the really interesting set of questions that we've been able to respond to. We were never going to get a definitive set of answers in an hour and a half, but I think exploring the questions is useful in itself, as the start of a really interesting conversation about the issues, the challenges and the possibilities that open source verification can afford. Just to remind you, we've recorded this, and the recording and the transcript will be available on the SOAS website. We have another webinar with a similar theme on the 29th of July, which you can find on the SOAS CISD page. Thank you again very much; I hope to see you soon. Bye bye, thank you, thanks, thank you so much, really nice to see you all.