[Garbled audio: introductory remarks in Welsh, welcoming attendees and introducing the panel.] We have Alan Hill from Ridgeway, we have Hans Kristensen from the Federation of American Scientists, Benjamin Strick, who works with all sorts of organisations — I'm going to take BBC, Bellingcat and the EU Arms Project as some of them, because that's what he lists on LinkedIn — and we also have Nathaniel Raymond from Yale University. It's a great honour that they've all been able to join us. Before I hand over to the speakers, I'm going to give some short introductory remarks clarifying what these webinars are all about and where we're going with them. So to start with, our definition of open source verification is all monitoring that relies on publicly available tools and data. So it's often associated with technologies. In recent years, it has really come under a banner that's more typically called open source intelligence, or OSINT, implying use of data from satellites or social media and lots of different technologies to crunch and access the data. And our definition of open source verification definitely captures those sorts of activities, but it captures more than that. It captures activities that were happening before these technologies were invented. 
So there's a huge invitation with our definition to remember that monitoring traditional media is also important, and that building communities of verification practitioners is really important to generating good results. So this is the seventh in a series of webinars that have been aiming to showcase the diversity of open source work that's happening around the world. And I'm really pleased that it's done that. Over the seven webinars, we've had 28 speakers representing an enormous range of tools, methods, backgrounds of practitioners and also the sorts of things that people are monitoring. So we've had people who are monitoring human rights abuses, political violence, environmental issues and weapons technologies. So a really big range of stuff. And we're really mindful that there are enormous synergies and overlaps between different monitoring systems. So it's great that we've got such diversity. However, you know, it's only right to point out this is the tip of the iceberg: in the research conversations that have led to these webinars, I've engaged with a lot more people from many different places. And as we go forward with our project, we'll be really aiming to capture more and more of the diversity of this field. So the seventh webinar is also the last in the series for now. And so I want to say a few words very quickly about what's next in our work. Well, we've had a number of requests for community building activities. And in the very first instance, what we're going to do, which was mentioned in the invitation to this event, is run a virtual cafe, a chance for people to get to know each other through very short, small conversations. The first one of those is going to be on the 10th of December. So please save that date. We'll be sending more details about it soon. We're also working on an edited volume and various other research spin-offs from the work. 
So moving us back to what we're going to do today: we're going to have five short talks from this fantastic panel. We're starting with an introductory talk from Nathaniel Raymond from Yale and moving then through the order of speakers that I mentioned. Throughout the session, please do feel free to raise questions or comments via the chat function. I'll be fielding those and putting them to speakers as and when we can. And then later on, after all the talks have finished, I'll invite people to raise their questions and comments via their video and audio functions. So thank you very much. I'll hand over now to you, Nathaniel. Thank you for being with us. Thanks so much, Henrietta. It's a real honor to be here and to talk to you all. I want to do three things in the time I have to speak to you today. One, talk about what's not new in terms of the current moment in the OSINT field. I want to talk about what is new. And then I want to talk about why I think the end of 2020 into 2021 is going to be a really important six to 12 months for this emerging area of practice. First, just some background. I was formerly the director of operations for the Satellite Sentinel Project, founded by George Clooney in 2010 at the Harvard Humanitarian Initiative at the Harvard School of Public Health. That work led to the founding of the Signal Program on Human Security and Technology at Harvard in 2012, which led to the first, to my knowledge, satellite imagery, geospatial methods and open source class in the United States, which we taught for several years at Harvard before it came here to Yale. I just wanted to give that background as a launching pad for talking about what's not new in our discussion today. For me, I'm generally uninterested in technology and never really wanted to have anything to do with it. 
I was a human rights investigator and war crimes investigator beginning in 1999 at Physicians for Human Rights, as part of the forensic team that Physicians for Human Rights ran for the United Nations Office of the High Commissioner for Human Rights. My sort of doorway into investigations was the 2001 Dasht-i-Leili massacre in northern Afghanistan. And by way of background, I started working around 2008 with satellite imagery, with Lars Bromley of the American Association for the Advancement of Science and Stefan Schmitt from our forensic team, to try to determine where this mass grave we had been investigating had gone, because it got stolen. So my entry into open source investigations was entirely accidental, and was out of desperation and necessity. These other methods came in because our traditional scientific methods, such as on-the-ground forensics and DNA, didn't work so hot when all your evidence had been taken by a warlord. So, for me, in that moment — working with the New York Times, the American Association for the Advancement of Science and forensic professionals — I saw the potential promise of these methods, not only for retrospective forensic investigation, but for prospective investigation. And what I mean by that is, we could begin to basically set up collections of imagery and collections of open source information with the idea of trying to detect perpetrators of gross human rights abuses before they committed their crimes. And so this is the first part: what's not new. The journey of professionalization that you all are part of now is the journey of professionalization of every single type of forensic science ever — that is not new. Look back at ballistics, at fingerprinting, at the use of mitochondrial and nuclear DNA in human rights investigations, at the application of physical anthropology to human rights investigations. The process of maturation is pretty predictable. The first question is: how do we know that we know what we know? 
How do we establish a theory of forensics specific to the individual streams we are bringing to our analysis, and, most importantly, to the variability that comes when we fuse streams together through mosaicking? That's a problem of physical forensics. It is a problem in digital forensics. And the bad news is that it never goes away. The second piece of professionalization is: who's qualified to do this work? How do we determine that? Do we need certification? How do we review our work? And then, most critically, what are the limits? We talk a lot about the opportunity in OSINT, but as with any emerging field, the question is: what are our limitations? When do these methods inject more risk into analysis than solutions? And how do we measure that? So the takeaway from this first part is that it is an interdisciplinary process of multiple fields to professionalize a means of forensic inquiry. It always is, regardless of whether we're talking about OSINT or the previous antecedents in terms of DNA analysis, etc. It will be a combination of practitioners, a combination of academic researchers, a combination of governments and standard setters, and it will involve the intersection of technical, standards, legal and ethical issues. Let's talk about what's new. What's new is that the potential right now to rapidly fuse data is unprecedented and will only continue to grow. And we must constantly, as professionals in OSINT, look at the double-edged nature of the burgeoning availability of open source data, open source methods and tools. The double-edged sword here is: we have an incredible and growing opportunity to be in the game, to be involved. And that gives us an incredible and growing opportunity to screw up. And for me, one of the missing gaps in the growing visibility and integration of OSINT methods into traditional media — here in the United States, the New York Times and the Washington Post have open source units. 
We are seeing it increasingly taught; we teach it here at Yale. What's missing is a scholarly pedagogy to capture where we make mistakes. And I think, back to my experience with Satellite Sentinel, that we have failed from the beginning to create safe spaces and academically rigorous spaces to share where we screw up. In the case of Satellite Sentinel, we identified, I think, about six or seven alleged mass grave sites through a combination of imagery, ground data and witness data. I would say about three to four of those would stand up to academic scrutiny or review now. How do we begin to develop a peer review system that allows us to capture critical incidents, to capture where we fail? Capturing our failure in this field is the only way it will grow and mature. Capturing failure requires us to create basically standards of review and venues for review. And so that's why events like this make me so excited, because what's new now is that the stakes are higher and higher. A couple of years ago, I was involved with the Washington Post on an investigation of the Saydnaya prison in Syria and apparent evidence of mass execution at that prison. Why I bring it up is that it was a success. We were able to combine multiple pieces of evidence together, to both review and debunk some previous reporting, including reporting by the United States government, and to fuse the reporting of others into the corpus of evidence. The problem with that moment is that we never got to capture and publish our methods. And so, both for failure — looking back at the mass grave typing I wish we hadn't done in Satellite Sentinel — and looking at the Saydnaya example, the missing gap right now is that we are focused on technology rather than pedagogy. And so let's talk about — and I will wrap up, I'm rambling here — why the end of 2020 into 2021 I think is going to be so important. 
You may have heard that here in the United States we have a new presidential administration coming. It is very clear that there will be policy steps by the Biden administration and by Congress, likely in the next six months, that will probably address issues related to open source investigations and to the professionalization of open source investigations — including, if you've read the recent Human Rights Watch report, I think it's called 'Video Unavailable', on the takedown of data by social media companies. I think we have a moment in the next three to six months for action by the incoming administration, in conjunction with platforms, to begin to develop mechanisms to retain, retrospectively and proactively, social media data relevant to evidence of gross human rights abuses. Unfortunately, US internet policy and US social media policy is de facto, in many ways, global internet and social media policy. I think we have a moment with the new administration where we might be able — to use an American football analogy — to get a first down. It may not be a touchdown, but it will be a step forward. And so the challenge I have for you all is to think about what, in the European and in the British context, governments can do on your side to support you in your professionalization. What if we could talk to governments and ask for partnership in the professionalization of this field — whether it's evidence retention, whether it is investment in forensic standard development within one jurisdiction or across jurisdictions, whether it's training of law enforcement or other professionals, lawyers, bar associations, in these methods, or working towards judicial standards and educating jurists — there is a huge need within the ICC and the ICJ — in admissibility, in chain of custody. Now is the time where we need to take this from a hobby or a subfield into an increasingly accepted part of modern forensics and modern investigations. So that's the challenge. 
I think what will probably happen on the US side is we will see an opportunity, building on the US example, for there to be a best practice about how governments may start to engage in formalizing and enfranchising evidence retention as part of standards. So with that I will shut my trap, as we say in the United States. Thanks for having me. I really appreciate it and I hope you have a great webinar. Nathaniel, thank you so much. What a fantastic starting point for this webinar. You've given us a really detailed idea of the opportunities for this sector, the real things that you've achieved, and more generally what's possible, and you've also clearly laid out the challenges to it. And I really appreciate the immediacy of the frame that you put on that in terms of what's happening in US politics right now. So thank you very much. I'm going to remind everybody: please do make comments or questions via the chat function if you have any. But seeing as we've got such a full panel, I'm going to hand straight over to Melissa Hanham now, who is from Open Nuclear Network and Datayo. Thank you very much, Melissa. Thank you very much, and thank you to Professor Raymond for really giving a broad overview. I'm actually going to now dive really deep into a very small slice. The Datayo project is a project that just turned two years old yesterday, and it is a community. It is a repository for data that may or may not live on the internet anymore. And we are trying to address some of the ethical and pedagogical issues that Professor Raymond already mentioned, but we're not perfect. We're very young, we're funded, but not infinitely so. And it means that we're meeting a lot of these challenges in real time. So I'll give you a very brief overview, and then in the question section, if you have any specific questions about how we're operating or facing these challenges, we can go more deeply. 
So we have three factors that we try to combine in one place. Data fusion is the primary goal; as Professor Raymond already mentioned, this is the value-add that we see today. We collect data sets from all over: some of these are purchased from commercial providers, and some of these are scraped if that is within the terms of service of the website. Some of this is contributed by volunteers, and others are from our own employees. We consider Datayo's volunteers to be a distributed network of talent. These can be people who volunteer their time, people who are maybe in school and are trying to learn skills experientially. Some members are domain experts already; some of them are journalists, or even panelists on this very panel. And some of them are data scientists or other kinds of technologists who are trying to learn how to do data visualization rather than the subject matter itself. On top of that, our third leg is machine learning and mechanical Turk work. This is maybe the leg that's lagging a bit right now. So we do put out missions where we ask our users to do work for us to build training data sets that we can use for object detection. We should have sentiment analysis up next month, and some other services are already up, like topic modeling, using algorithms that are already in existence — which we did not build, but which we link back to from their original repository, GitHub or whatever. And we digitize documents, videos, photos and other types of information and tag it so that it's more usable. Our goal is to enable these kinds of actors — civil society, journalists, governments and those working on treaty compliance — and that's because we're primarily focused on arms control issues, particularly nuclear weapons. The exception to this is that of course we have to follow US, EU and UN sanctions regulations, and we also have to follow export control regulations. 
So because we have data sets including satellite imagery, for example, we have to make sure we don't accidentally export those to a user who may use them for purposes that are sanctioned. And we also have to make sure that we do not have users from certain sanctioned countries, or individuals who are themselves sanctioned. So we have a system of automated and human vetting in order to join Datayo, if you apply. Our goal is to have increased security for everyone. We believe that a better intelligence product is made in a more diverse, innovative way. So where information has historically been very siloed — perhaps not even shareable within a government, let alone with someone who you may view as an adversary — our goal is to collect purely open source information with a variety of users of different expertise, different capabilities, languages and local knowledge, and by that raise the quality of the conversation that's happening around arms control, so it is more informed. We're hoping that these informed conversations will ultimately let off steam before a conflict turns into a nuclear armed conflict. That's our hypothesis. I'm very aware that there is increasingly a have-and-have-not situation happening, not only among intelligence groups that are identified by governments but also in open source. I come from a university background where my university could afford expensive data sets, and that is how I learned, but the bar for new talent and new capabilities is set too high for the average person to learn. I think I tweeted out that I learned using Google Earth, and that is a true story and a pretty rare story, because most of the people in open source on the nuclear weapons beat come from intelligence backgrounds. There are very few people who have moved from intelligence to open source and perhaps shared that which is allowable, but this new wave of people who have only open source backgrounds is growing. 
Some of the capabilities we have in Datayo include imagery analysis. A lot of the spaces are just called workspaces. So this is an example of the map workspace, where users can look at satellite imagery, as well as points, lines and polygons that may be useful. This is actually Vandenberg Air Force Base, where a user can turn on notices to airmen to look for signs that there may be a hazard for airlines, or for civilian or military aircraft to be wary of. This allows you to identify different places you want to look, and then you can search for satellite imagery as well. As you can see, there's a social component, so users can talk to each other, and they can also make identifications by annotating themselves. This is pretty much free-form in this area, but we do bring all the sources, algorithms, data and user comments with you, so that if you decide to share this to our internal gallery, then all those sources come with you. And this allows us to keep track of what data, what sources and what individual works on each project. We have some photo forensic tools — so I don't know how many of you are fans of military cats, but we have some very beginning ELA analysis and metadata extraction that's allowable here, to give you a chance to see that not only the cats, but all the various military officials, have their medals polished up as well in these images. You can upload your own photo to check; it will not be shared with the rest of the group, although the admins of the site will see it. We have scanned documents; some of this is good for your own edification and learning. I put up those documents that I found to be very helpful to me in learning how to identify overhead imagery or ground photos and other kinds of things. But we also have scanned documents that are not yet optically character recognized; we apply optical character recognition to them, which makes them more searchable by text. 
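The basic error level analysis (ELA) mentioned above can be sketched in a few lines of Python. This is a minimal illustration using the Pillow library, not the platform's actual implementation; the function name and the resave quality are assumptions:

```python
import io

from PIL import Image, ImageChops


def error_level_analysis(img, quality=90):
    """Resave the image as JPEG at a known quality and diff against the
    original. Regions edited after the photo's last save tend to show a
    different error level than the rest of the image."""
    original = img.convert("RGB")
    buf = io.BytesIO()
    original.save(buf, "JPEG", quality=quality)
    buf.seek(0)
    resaved = Image.open(buf)
    # Per-pixel absolute difference; brighter areas = higher error level.
    return ImageChops.difference(original, resaved)
```

In practice the difference image is usually brightness-scaled before inspection, and ELA is a starting point for closer review rather than proof of manipulation.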
You can comment on the documents, you can comment on images inside the documents, and you can have discussions about them. We have not yet done any machine learning on this, but this is an area where we hope to have a lot of analysis. We have done machine learning on quite a few text data sets that are already in text format. The Korean Central News Agency has been in text format since late 1997. We collect all these different data sets, and we allow you to view them not only as text, but also in a graph format if that is what you choose. In this case, you can see where we applied the LDA algorithm for topic modeling, and you can link back and look under the hood if that's something you're interested in. The top 10 topics are identified by size, and the one I've highlighted here in red is related to DPRK, nuclear, military, war, force. So I know that this is a topic that is very prevalent in North Korean state media, and then I can do additional sentiment analysis on it if I'm interested in reading about it. We've got an article up on the front page of Datayo, the public-facing page, by Dr Clayton Bessa, who works with us, and you can read more about that topic. We do have videos. Thanks to Scott, we received over 40,000 North Korean videos, because Google has been taking them down due to perceived risks of sanctions violations — obviously on YouTube you do earn some income from advertising. So I think many of us in the crowd probably know that Syrian videos are coming down, and other kinds of videos are coming down. We host homeless videos, or videos before they go homeless, here. All the audio of the video is transcribed into text format, making them searchable. And very shortly that text format will go to our graphing workspace, so that you can do topic modeling, network analysis and so on on the video transcript as well. You can annotate videos by dropping a box on the video and writing a comment. 
And similarly, you can comment at the bottom of the video if you want to have a discussion with someone. If you make a visualization you like, then you can save it to our gallery, and other users can comment on it and have different types of discussions. It has to stay behind our password protection because of our licensing agreements with different data sets. So while this is a useful tool behind the wall, it's not fully public, and that's primarily because of the legal ramifications of doing so. I think I have a friend on the panel who really enjoyed this particular mission. In this case we asked users to identify surface-to-air missile sites. There were about 8,000 sites that I had culled together from different sources, where we thought there could be surface-to-air missiles. We gave an example and then we asked users to draw a box around a site if they saw one — because in many instances the sites were abandoned, sand or dirt blew over them, and so on and so forth. We created a very good training data set for object detection. If legally allowable, we will make that object detection algorithm available on the Datayo website, so that users can search large areas of satellite imagery in order to find surface-to-air missiles. But the new US export control law in January makes that a little bit questionable, so that is a legal question we're working on right now. We are new, and there isn't a lot of data — or, I guess, an economy of points — in our system yet, but we are trying to build reputation scores for all our users. So the more activity you do, the more points you accrue and the more your reputation increases. As we build out more missions and other types of activities, we will eventually get a baseline of accuracy, and we will start ranking people on how accurate they are as well. 
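The points-and-accuracy reputation idea described above could look something like this minimal sketch; the weighting scheme and field names are assumptions for illustration, not the platform's actual formula:

```python
from dataclasses import dataclass


@dataclass
class Reputation:
    points: int = 0   # activity: missions completed, annotations made
    correct: int = 0  # annotations later judged correct
    judged: int = 0   # annotations that have been reviewed at all

    def record_activity(self, pts=1):
        self.points += pts

    def record_review(self, was_correct):
        self.judged += 1
        if was_correct:
            self.correct += 1

    @property
    def accuracy(self):
        # No accuracy baseline yet -> treat as neutral 0.5, not 0 or 1.
        return self.correct / self.judged if self.judged else 0.5

    @property
    def score(self):
        # Activity scaled by demonstrated accuracy.
        return round(self.points * self.accuracy, 2)
```

Scaling raw activity by reviewed accuracy is one way to let a baseline of accuracy feed into rankings, as the talk anticipates.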
We want to gamify it a little to encourage participation, but we also hope that this may be a credential that you could cite on your resume, for example, or share as your reputation score. My last point is that I do want to talk about the larger organization. Datayo is a tool inside of Open Nuclear Network. You can find us online, and we have built a code of ethics. This came after several meetings with the Stanley Center in the United States, where we openly described some of the internal ethical decisions that we were facing, and found a lot of other people — including Hans, who's on the panel — also trying to work through how to handle some of these dilemmas. And so we've done our best effort at describing our code of ethics. This is our public-facing document. We are also working on an internal document, which describes more how we support employees as they face these different ethical dilemmas — so it's more like an enforcement mechanism. I'd be very happy to get feedback on this, because other than modeling it on journalism, we didn't have a lot of inputs. And so this is an area where we're still developing. So thank you very much for having me, and I look forward to your questions. Thank you, Melissa. A really, really fantastic follow-up to Nathaniel's talk, giving us all sorts of very rich detail about how you do it, how Datayo does it, but also a sense of what an open source researcher does. We have had a question in for you from Dan asking about how scalable these things are, but if it's okay with everybody, I'm going to ask him to put it to you in person after all the talks, to give you a chance to think about it and also so we can move through the speakers. I was really interested by the comments that you made around some of the vulnerabilities of the data being lost, and Nathaniel also made similar points. And I'm just putting it out there. 
This isn't a question, it's a kind of comment: it feels very much as though there's a sort of porous quality between what's closed and what's open. Something that has been open can become closed again, and what has been closed can become open. We've explored this a bit through other webinars, and I'm just interested to see these themes coming up again. But I will now move the floor on to Alan Hill, please, from Ridgeway, if that's okay. I'm moving through alphabetically by surname today. So over to you, Alan, please. Thank you, Henrietta. Let me just sort this out. Is this on okay? Cool. Yes, that's great. Thank you. Good afternoon, everyone. My name is Alan Hill. I'm the operations and technical manager for Ridgeway Information. We are a spin-out from King's College London. We've been going a few years now, and mainly what we look into is non-proliferation; other things that I look at now cover the wider range of non-proliferation, chemical weapons and other subjects. To go through this: last year, we were tasked to conduct research into some historical chemical attacks in Syria for a particular client. And on top of that, I thought, you know, this is probably a good opportunity to have a look, do something else and try a test case. And this probably falls into what Melissa was talking about, as well as Nathaniel, about data being lost. This becomes quite important in the open source world, because if we lose that data, how are we going to use it to formulate our assessments? And this has become more pertinent now, while the conflict in Syria is ongoing. Most information out there about atrocities and war crimes is still in the open source domain, and we need to make sure we retain it for future use. 
Okay, so just to outline the task: as with most things, identify the objectives, collect the original online data and information, which I'll cover in more detail, conduct the cold case review and compare results. So, identifying the objectives themselves. The main thing you want to look at is collecting what original data and information is still available online. For this attack, particularly when we looked at 2017, it was very interesting how much information was still available and how much information we could still use. Then conduct the cold case review of the attack using only the original data and information available now: looking three years on, using that information, would we come up with the same sort of assessment? And then compare the data and results and provide an initial assessment. This was a sideline to another project, so we were doing it in the wings just to prove the concept. So, apologies: there are maybe a few images in there that are quite graphic. But this is when we start to look at all the information and collect all the original online data. This is when it becomes more problematic, because suddenly, around these attacks, there is secondary reporting, there is fake news, and you are trying to get into the weeds of the initial information. And that's what you want when you're doing an open source investigation: the primary source detail, the primary source data. One of the issues we have with the data collection is identifying that primary data; you want that initial data. You don't want the secondary report, you don't want the BBC news, you want the data from the ground — that's going to give the best insight. And then we start to identify other data sources. Other organisations have already reported on these events, but we didn't want to take on any bias or preconception from the issues and information they had gained. 
But we would use them — actually use them and use their sources. So we took the OPCW reports and found all their sources — the Syrian Archive, the NGOs, news organisations and investigators — as a benchmark. The one thing we didn't have was a benchmark to say, right, is this information still the same as it was in 2017? And this is where it gets hard to gauge how much loss of data we have had over this period of time, over these three years. And one thing: if it's lost, we can't use it. When you're an open source investigator doing your research, you want to make sure you've got all the data available at hand. And this is where, when you're new to this, when you come to a new conflict, a new area, understanding the reliability of the data and the sources matters: how reliable is it, can you use it, is the source credible? All these sorts of things are learned over time; it's hard to gauge that reliability straight away. Especially when looking at conflict zones, there is the adverse nature of the material. Look at the conflict in Syria and ISIS, and what's going on in Yemen and around the world: there is a lot of material of an adverse nature. And you've got to think about your researchers, your investigators, the people doing this work. The amount of adverse material they're looking at over time is going to have an adverse effect on them. So you've got to monitor that and make sure they've got welfare support in place when they're doing it. So once we got all the information together — and we had maybe 200, 300 pieces of data — we started to go through them all. You have to review, verify and assess all the original data collected. So every piece of data would go through a stringent process of reviewing it, verifying it and assessing it. And I'm sure Benjamin, after my talk, will go on to how he does that sort of thing. 
So every piece of data we went through in fine detail, extracting all the information we could from it. What I tried to do as well was remove any bias, any unconscious bias, any preconceived ideas people might have about this attack. We went in fresh, with a clear mind, so any previous knowledge and opinion was removed from the review. We didn't look at any OPCW reports, any Bellingcat reports or other reporting, so that we had a clear, unbiased view. One thing I'm always adamant about with my team is: be objective and critical of the data. Always go into these things with an open mind. Don't have a preconceived idea; be objective. What information can you see? What does it tell you? Go through the what, the who, the where, the when, the how. And be critical of the data. Take the data apart in fine detail. Get everything you can from it, because that way you can determine how useful it is. One thing we created was a timeline from the data and information, which is quite important because we're trying to look at the event, to piece together what actually happened on the ground, and when the information was posted. Especially in open source work, something might happen at 2pm, but the first image or video doesn't go out until 2.30. So there's a 30-minute window. Why is there a 30-minute window between the initial event and information starting to be published on the internet? It could be down to things like internet connectivity or the removal of 4G connections. And when things happen, the first thing people do isn't always to start recording information and posting it online.
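The timeline idea described here, ordering collected items by their claimed times and flagging the gap between the event and the first published post, can be sketched roughly as below. This is a minimal illustration with invented timestamps and item names, not the team's actual tooling.

```python
# Build a timeline from collected posts and compute the delay between the
# event and the first piece of content published about it (dummy data).
from datetime import datetime, timedelta

event_time = datetime(2017, 4, 4, 14, 0)  # when the incident is believed to have occurred

# (source, claimed capture/post time) for each collected item -- illustrative only
posts = [
    ("video_a", datetime(2017, 4, 4, 14, 30)),
    ("photo_b", datetime(2017, 4, 4, 14, 45)),
    ("video_c", datetime(2017, 4, 4, 15, 10)),
]

# Sort by timestamp to get the timeline, then measure the publication gap.
timeline = sorted(posts, key=lambda item: item[1])
first_post_delay = timeline[0][1] - event_time

print(f"first content appeared {first_post_delay} after the event")
for source, ts in timeline:
    print(f"{ts:%H:%M}  {source}  (+{ts - event_time} after event)")
```

A gap like the 30-minute one mentioned above then becomes an explicit, recorded knowledge gap to investigate, rather than an impression.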
Last year when I was running a course, as we started the third day, the Turkish army crossed into Syria along the border, and we could actually watch it live and understand what live information really means. That's quite critical, because you can see it as it's happening. Then, identify your knowns and your unknowns. What do we know? What information have we got? What information can we gather? And when you start to clarify what you actually know and what you don't know, be very clear and critical about that. As I said about being critical of the data: what do we know? Is this information linked to this event? Can we tie it to that event, that location, that timeframe? What don't we know? Be very clear about it: is there a 30-minute window between an event and it first being posted online? Why is there a 30-minute window? That's when we start to identify our information and knowledge gaps, and understand where we don't have information. We can start to think about why there's a knowledge gap, but don't start putting theories into it. Be very critical: what is fact and what is fiction? What do we know and what don't we know? Then we start to form an assessment on the available data we've got. All of that was done independently, working only within the confines of the information we'd gathered about that incident. Then we start to compare the results. We conduct the cold case review assessment, looking at it independently with all the data we've got, and come to an assessment. Is it the same assessment as previously? Would we do anything differently? One thing we did start to look at, something Nathaniel has talked and written about, is data loss. We identified that between 2017 and 2020 we lost 10 to 15% of the data that had been available.
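The 10-15% loss figure implies a simple measurement: re-check the originally catalogued items and count how many still resolve. The sketch below is an assumed workflow with simulated availability, not the project's actual method; in practice `still_available` would issue HTTP requests or query an archive service.

```python
# Measure data loss by comparing the URLs catalogued during the original
# collection with those still available today (hypothetical catalogue).
catalogued = [f"https://example.org/post/{i}" for i in range(300)]

def still_available(url: str) -> bool:
    # Placeholder check: in reality, request the URL or query an archive.
    # Here we deterministically simulate roughly 1-in-8 items having been
    # taken down since the original collection.
    item_id = int(url.rsplit("/", 1)[1])
    return item_id % 8 != 0

surviving = [u for u in catalogued if still_available(u)]
lost = len(catalogued) - len(surviving)
lost_pct = 100 * lost / len(catalogued)
print(f"lost {lost}/{len(catalogued)} items ({lost_pct:.1f}%)")
```

The point of making the check explicit is that each lost URL can then be examined individually: was it a duplicate, or a unique primary source whose removal changes the assessment?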
Now, 10 to 15% may not sound like much, but you might lose that critical piece of information that was posted online, that critical video showing an atrocity, a war crime, which has since been removed by a government or by the social media platform. That's going to change your assessment straight away, especially when you're looking at historic information. What we did identify in this case was that the loss of data had little impact on the information gaps or on the assessment itself, but that was only one case. So, to close down to the conclusions. We know conflict is going to become more prevalent on the internet, and because of that there's going to be more information and data out there. It's going to be more important to recognise that data loss can have an impact on the ability to use open source data and information. And, as has been alluded to earlier today, data archiving needs to be more rigorous in order to ensure data and information is available for the future. Because I think at the end of the Syrian conflict there are going to be a lot more people going through open source information, looking in more detail at everything that happened, to see what can be proved, what can be used as evidence, what can be used as information to start to prepare cases against these atrocities as well. Thank you. And that's my contact address if you want to get in touch with me. Thank you, Henrietta. Great. Alan, what an interesting insight into an analytical unpicking of what data loss looks like, how much there is, and what implications it has for a study. Really, really interesting, and it would be fascinating to know more about how representative that experience is. I am going to keep us moving on through the speakers now. So I'd like to introduce Hans Kristensen, please.
Alan, if you could stop sharing your slides now, that would be great, and we'll hear from Hans. Okay, thanks very much. Pulling up the slides now; hopefully everybody can see them. Yes, right. Thank you very much. It looks great. So, thanks very much for the invitation, and very interesting briefings here, of course. I really like that they touch on so many aspects of this kind of work. I'm going to dive into what we do and some of our lessons learned over the years. I'll start by giving a brief overview of the project, talk very briefly about the methodology we've been using over the decades, show some examples of our work, and then some lessons learned. The Nuclear Information Project is a public information project that tries to glean, from God knows how many different sources, what the nuclear arsenals of the world are, how they are structured, the trends, and all these things. This is obviously work that goes way back before there was anything called the internet, anything called Google Earth, anything called cell phones, with millions of people running around taking pictures and videos all the time. This work and its methodology started way back in the 70s and early 80s. In my work I'm just standing on the shoulders of giants like Tom Cochran, Stan Norris and William Arkin, who were some of the people who developed this methodology and did the tedious work of digging out the information and trying to make sense of it, and so forth. What we're trying to do, of course, is not only create the best non-classified estimate of nuclear arsenals, but also explain the trends and histories, where we're heading.
But very much also to empower the public debate about this, and all the different constituencies that participate in it, both inside and outside government. The reason is that once you have information that may not be right on the money but is good enough, it actually enables officials to participate in a debate in a way they couldn't before, because they can't share the actual assessments from the intelligence community with just anyone, even within government. An important part is also to try to counter exaggerations and hype, which are prevalent in the world of nuclear arsenals. We see this all the time, right now of course in the debate about China, and North Korea for that matter. So this is a very vibrant piece of work. And we put all of this out in the public domain, because the essence here is to make this information available to the public so it can be used, so we can reach back into history. We produce the Nuclear Notebook in the Bulletin of the Atomic Scientists, which describes the arsenals, fact sheets if you will, about the nuclear states, and people can go back and use all of this information freely. There are no firewalls, no payment, nothing. It also goes into the SIPRI Yearbook, and what's unique about that is that it's a publication translated into a large number of languages: Russian, Arabic, Chinese, Korean, Spanish, you name it. That enables information like this to get out to audiences and users who might otherwise have a hard time accessing it because it's not in English, etc. And in some cases it also allows people in countries where they're not allowed to research and write about their own nuclear arsenals to grab material from outside and say, well, that's what they say out there, and so use it to have a conversation.
So there's a lot of information over the years, and our methodology is not particularly unique; like everybody else, we're looking for data on this, pouring it into some sort of prism, and trying to draw conclusions from it. There's a vast inventory of declassified documents and official public documents, whether they be descriptions of operations or budget documents or what have you. We obviously dip into satellite imagery to the extent we can; we rely a lot on it for illustrations. We do less technical analysis of satellite imagery; we've done a few things, but it's mainly to illustrate and to discover things. So we're maybe not as deep into satellite imagery as others, but we certainly use it frequently. And then personal contacts are so crucial in this work, and I'm not talking just about friends in the arms control community who work on this, but talking to people at all levels of this discussion and this development, whether they be practitioners, journalists, intelligence people, inside and outside government, people on the ground, locals, whatever. It's just very important to run this data set by people and hear their take on it. And then of course social media. Social media has been around for a while, but it's gotten a lot more advanced and immensely huge, as Trump would say. So there's a lot of data out there to crunch, and we're just a small team, a couple of people going through this on a regular basis, trying to make sense of the data: old documents, new documents, what have you. It's not by any means a big organization, even though the Federation of American Scientists sounds big. It's a very small project, and of course that also makes it fragile.
Just some examples of that: we very much use satellite imagery to document new developments and glean some technical insight from them, but very much to illustrate. We combine that with document searches and what have you, and put it out to the extent we think it's accurate. And then we learn from our mistakes and move on. Through this we've been able to make some discoveries over the years, and it's been really fun and rewarding. We can also use it to monitor and verify the types of forces that are deployed or operating, and glean clues about what they are. Just using one example here, the picture on the left is from a B-52 operation, where there are certain ways to identify whether the particular aircraft flying is a nuclear-capable aircraft or just another bomber; that is an important way of doing it. And this sleuthing kind of work also enables us to detect when a new system goes out: for example, we detected when the first US low-yield nuclear warhead went to sea on board a US ballistic missile submarine late last year, and they're now out on the subs in both the Pacific and the Atlantic. On Russia, we spend a lot of time using satellite imagery, and of course open source information, to glean insight into their modernization program and its status. What's important here is also that there's an enormous volume of information from the past that has, over the years, proven very, very important for this kind of work. And ironically, the work we've done has in some cases come fairly close to official estimates. What we've heard from some people in government is that it enables US government publications sometimes to lean on this kind of work and sort of spill a little.
It's not their entire intel, but they can just say, you know, up to 2,000 strategic weapons, and that's where our estimate has been for a number of years. So that's another effect of it. This raises another issue, which I'll come back to later: you're sort of damned if you do and damned if you don't, because if you put this information out, the players in the nuclear arms race, if you will, begin to tap into it and use it as their justification for saying the other guys are really bad and therefore we need more stuff. You get into this loop, and there's no way out of it. I think it's more important to document what we find and let the data and the information speak for themselves; people will use it for whatever purpose they find necessary, of course. We're also trying to find information about new capabilities being built into systems that are going through modernization. Here's an example of one of the silos in the Kozelsk missile division in western Russia. Just by looking at the image you can see the enormous difference in the infrastructure for this silo as the base matures; in the upper picture you can see new capabilities, and in this case we've obviously noticed, as have many other people, the inclusion of advanced air defense systems, a reflection of the expectation that these sites are going to be targeted in the future not just by nuclear weapons but by conventional forces. We also combine satellite imagery with anecdotal evidence, like stories people told in the past that turned out to be really interesting. In the case here, a former STRATCOM commander visited a Russian central nuclear weapons storage site and afterwards described it in public. Apparently the intelligence community wasn't very happy about that, but he did it nonetheless.
That account described the internal structure of the site. And here we have a picture of what's going on now at that storage site: how it's being excavated, how they're trying to upgrade and modernize it. On the Chinese side, monitoring where systems go in China is hugely interesting, because it used to be a very, very opaque dark spot on the map, where it was very hard to get information unless it came from people who leaked something. Since Google Earth came about, it's been extraordinary. I would also add cell phones for that matter, but certainly Google Earth has been an amazing trip. There are so many really, really good people today combing these areas looking for systems and developments; that's just a very potent development. It's unbelievable how much is going on, and there are some really good analysts over the last few years who have done pioneering work on this stuff; I won't go through all their names. Our work has, for example, discovered a missile training site up in northern China; we tracked and outlined it, described the infrastructure and new developments, and tried to geolocate all these types of things. We've even gone in and looked at where particular launch events happened, using geographic features and plant distribution during different seasons to hone in on when a particular test happened, etc. That's been a very fun exercise. We look at what kinds of launchers are roaming around an area, and we compare the visuals of these launchers on the satellites with what's displayed during parades, to get better at recognising these particular capabilities when they're operating in the landscape. We're also discovering new things, such as work on silos in China.
And again, just like previously with Russia, this also found its way into the Department of Defense report, the latest one. And again, it gives you an interesting taste in the mouth, because of course it's part of the arms competition as well; but that's part of governing this stuff. We also deal with the challenge of the variety of different forms of commercial satellite imagery over the years when you compare things. Here to the left is an illustration of a base in the Netherlands where, even when we purchased imagery from the company that commercially sold it, we discovered that they had manipulated the imagery so as to mask out a former nuclear weapons store. It wasn't even a current nuclear weapons store, just a former one. But you sometimes bump into these things, and you have to be really careful when you use this material. It's not a given, just because you buy commercial satellite imagery, that what you see is what you think you see. We're also using this type of work to push back on sort of silly secrecy, both in terms of images like this and in terms of when policies change. A recent development is that the Trump administration decided to discontinue disclosure of the size of the US nuclear weapons stockpile. We'd been pushing for that disclosure for many, many years and we're still pushing, but right now they've closed the door on it. The same happens on other issues, submarine operations, what have you; it's a trend under this administration, so hopefully some of that will change in the future. So, just some broad lessons from this. It's just so vital, we've found over the years, and this is a no-brainer of course, to have a very broad array of sources that you use for this.
And, as has been mentioned before, start from the original sources. This is one of the biggest problems of the internet, a curse of the internet literally: it is so easy to spread misinformation and disinformation, even by people who think they're getting it right, and once errors find their way into the headlines, it can be extraordinarily difficult to clear them out. We see very respectable organizations today with overviews of nuclear forces on their websites that just have really wrong information, for which there's no basis once you start going back and tracking the sources. So be really careful there. On the importance of this work: like I mentioned, it can help reduce official secrecy and sort of compartmentalized secrecy, and actually allow governments to open up to some extent. There's a huge difference between countries: the United States is extraordinarily open about its nuclear capabilities compared to others. Russia says very little; there were some statements after the end of the Cold War, but it's basically very opaque, although it does give a lot of briefings about how it's doing this and updating that facility and what have you. China also says very, very little about what it has. But China has also evolved tremendously in that it likes to showcase what it has, and when it showcases, well, it shows videos. Some of those videos might actually be authentic; others may be quite a mix of different sources and events and what have you, so it can be extraordinarily difficult, and you have to be really careful. And then of course there's the other issue: educating people, growing people up in this field so to speak, just takes a long time. On an issue like tracking nuclear arsenals around the world, you have to have an enormous memory bank to be able to quickly see what's going on.
Otherwise you're led down pathways that lead nowhere, or somewhere wrong. So it's really important not only that you can do that, but also that funders value this and actually sustain this type of operation for a long time, so you have the time to educate people to do it. And not least, build archives and data pages from day one. Be organized, do it; you'll love it later on. That can always improve, and it's a never-ending job for the couple of people we have working on this project; it's just an enormous task compared to just following what's going on. As I said, funding is scarce and it's a challenge. We've been lucky over the Trump administration; Trump has been good for us, and I'm sure we're going to have a hard time when things change again. It fluctuates enormously, and that creates enormous uncertainty in how you can structure and plan your future work, which is a big minus. Funding strategies vary by organization: we have many funders, big funders, that have supported us and many others, and then suddenly they get new leadership and are thrown into yet another strategic review, what should we work on and how should we do it, and suddenly everything is up in the air. All these factors are going on; we don't have taxpayers who just give us money all the time. Collaboration is essential and valuable; you can't do without it. Lean on others, run data by other people, you really have to do that. But it can also be a challenge, because there's so much competition going on, ironically, in this work. I call it turf: people make discoveries and they want credit for those discoveries; they don't want to just pass them on to someone else, or have them disappear into someone's database or what have you. So there is an element of that as well.
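The "build archives and data pages from day one" advice mentioned above can be sketched as a minimal record format: store each collected item with a content hash and capture metadata so it can still be cited and verified if the original is later removed. This is an illustration under my own assumptions, not the project's actual archive system; the URL and note are hypothetical.

```python
# Archive a collected item with a content hash and capture metadata.
import hashlib
import json
from datetime import datetime, timezone

def archive_item(content: bytes, source_url: str, note: str) -> dict:
    """Record an item with enough metadata to stand alone later."""
    return {
        "sha256": hashlib.sha256(content).hexdigest(),  # fingerprint of the exact bytes
        "source_url": source_url,                        # where it was found
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "note": note,                                    # analyst's working note
    }

record = archive_item(b"<video bytes>", "https://example.org/post/1",
                      "strike footage, unverified")
print(json.dumps(record, indent=2))
```

The hash matters because it lets anyone later confirm that a preserved copy is byte-identical to what was originally collected, even after the source has gone offline.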
But again, please, please, please train the next generation for this kind of work. I know the other organizations represented here are doing fantastic work on that. For us it's a challenge, but it's something we're actively engaged in with my new assistant here, Matt Korda, who's a really bright cookie. People like that need to be brought up, young people from different kinds of backgrounds, to be sure we have these kinds of capabilities in the future. I've babbled too long and I'll just leave it at that for now. Thank you so much. You know, I've used your numbers, I've relied on your analysis forever, and it's really exciting to hear you speak about where you get them from, and also the sense that you've given us a snapshot of verification before the age of Google, and how the new OSINT-style tools are now making a difference to your work. Before handing over to Ben, I'm just going to say it's again a really good reminder that a really useful part of open source research is that it can correct misinformation. It's hard, but it can do that. There's also this really interesting insight you gave us that institutional memory is really important, that you don't just get to be able to do this stuff very quickly. And one thing that open source organisations can do is store the stuff, which sounds very similar to what we heard about data loss from previous speakers. So thank you very much. I'm now going to hand over to Benjamin Strick. I notice we've got questions coming in, so all the panellists, please do keep your eye on those and be ready to answer them after Ben's talk. Thank you, Ben. Thanks, Henrietta, and thanks to the other panellists as well. I'm actually really honoured to be here, and a little bit nervous as well, because there are a lot of very professional people in this room, but I'll give it a crack anyway.
Okay, cool. So here's my presentation. Basically what I wanted to do is make a brief slideshow on visual analysis and answering questions. The other speakers have gone through a lot, so I just wanted to dive down into some specific case studies and show you how some of this works. A little bit about me: my name is Ben and I'm an open source investigator with the BBC at the moment. I'm a contributor with Bellingcat, and I'm a lead trainer for a project called the EU Arms Project, where we document breaches of arms export licences in regard to the Common Position for EU countries. I've got a very different background: I don't have a background in intelligence or a resource-rich intelligence setting. A lot of this is self-taught and, as Melissa said before, Google Earth educated as well, because it's free and I don't have to pay for it. I just thought I'd put this slide in here: I do often have a lot of conflict images in my slideshows, so I'll warn you before anything graphic comes up in this presentation. A little about what I do at BBC Africa Eye: I'm part of an open source investigative unit, and we've done some pretty well-known titles and really good investigations into some horrible things that have happened in Africa. One of them, for instance, was the murder, the execution, of two women and two children, which was said to be in Mali or in Nigeria. We actually found it to be in Cameroon, just by geolocating this mountain set, this ridge line that you can see here, and other things like that. We've also done some work on Sudan, when there was a massacre on June 3, where we were able to tell exactly who carried out the attack and exactly when it happened. And as you can see, a common theme in this work is using satellite imagery.
Now, more relevant to what we're talking about today: I've also been able to identify Turkish ghost ships delivering weapons packages into Libya in breach of the UN arms embargo, and we've been able to use a lot of satellite imagery to really show what's on the ground, which helps because a lot of the time people are trying to hide their activities. So we've been able to identify videos like this one, filmed on board a Turkish vessel carrying huge weapon systems, ACV-15s, GDFs, Korkuts as well, which many of you might be familiar with, and we've been able to do this kind of visual forensics and bring them together. Another piece of research I focus on a lot is influence, and this is something that's dangerous to us all because it infects the information we get online. Alan was talking about going to the first source, but sometimes the first source might actually be provided by a bot network, like this one in West Papua, which was run from an Indonesian marketing firm, or this massive Chinese network that I busted last year targeting activists in Hong Kong, or this more recent one, which uses AI-generated profile pictures. But what I want to talk to you about today is looking at the Wing Loong II in Libya, how we attributed it to the UAE, and some specific questions whose answers weren't available online and how we found them. So this is a graphic image: CCTV footage filmed in Tripoli at a small military recruit academy. In it we can see the young recruits turn and march, and then there's a flash of light and a lot of them are dead on the ground. This was a targeted strike. A lot of people said it was the UAE, but again, as Alan said in his framework approach, we really need to ignore that and look at what the facts on the ground are.
I would have said yes, it's the UAE, or it could have been Turkey, but we didn't have proof, so we started digging for it. So, obviously, here's where the academy is and where Tripoli is, just to give you a bit of an overview. This is where the GNA is, and the LNA is in the red area, so there are two sides in conflict here, just in case you don't know about Libya, though you probably do, being in this seminar. One of the first things we started with was looking at the shrapnel that was collected on the ground that day. We got access to that shrapnel through someone we worked with in Tripoli, and we can see this table full of shrapnel. We pieced it together and found that all of this shrapnel, put together, is a missile called the Blue Arrow 7, a Chinese-made air-to-surface missile. This is quite interesting because, from UN reports we'd looked at previously, and from common knowledge of which weapons systems or vehicles would fire this missile, we know it's carried by the Wing Loong II; that's been included in UN reports already. So we had plenty of evidence to attribute this to a Wing Loong II, but not to the UAE. Going through satellite imagery of air bases in Libya, we were able to look at one specific air base. We actually looked at all of them, but I'm not going to go through all of them today. This one is called Al-Khadim, on the eastern side of Libya near Benghazi, and we can see Wing Loongs on satellite imagery. We know these are Wing Loongs because we can measure their width and length on Google Earth and narrow that down to it not being any other drone but the Wing Loong, with its exact measurements. And we used high-resolution imagery for that too.
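The measurement check described here, ruling drones in or out by comparing dimensions measured on imagery against published airframe specs, can be sketched as below. The spec figures are approximate numbers from public summaries, the tolerance stands in for imagery resolution and measurement error, and none of this is the team's actual tooling.

```python
# Match a wingspan/length measured on satellite imagery against published
# airframe dimensions, within a tolerance for measurement error.
SPECS = {  # approximate public figures, in metres: (wingspan, length)
    "Wing Loong II": (20.5, 11.0),
    "CH-4": (18.0, 8.5),
    "Bayraktar TB2": (12.0, 6.5),
}

def match_airframe(wingspan_m: float, length_m: float, tol_m: float = 1.0) -> list:
    """Return the airframes whose published dimensions fit the measurement."""
    return [
        name for name, (span, length) in SPECS.items()
        if abs(span - wingspan_m) <= tol_m and abs(length - length_m) <= tol_m
    ]

# A hypothetical measurement taken off Google Earth with the ruler tool.
print(match_airframe(20.2, 11.3))
```

If two candidate types fall within tolerance, the measurement alone isn't conclusive and other evidence (shape, shadow, context) has to break the tie.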
So how do we know this base was controlled by the UAE? We looked at all the weapons systems we could see on satellite imagery: Black Hawks, Wing Loongs, refurbished AT-802 air tractors, and Hawk surface-to-air missile systems. Who owns all of these? The UAE. Great. So we knew they were in control of it. We also had a look at registries, which someone hinted at before, the SIPRI arms registry, and we can see that the UAE purchased these. But there were a couple of problems with this investigation that I'll get to next. We started to ask: who actually knows what a Wing Loong package looks like? If we go to arms expos, and I contacted friends who go to arms expos all the time and take photos of the stands where the Wing Loong is displayed: what comes in a Wing Loong package? What does it look like when you buy it? Does it come in a box? Does it come with anything else? Obviously China doesn't want us to know this, so we had to start going through really alternative footage, like this report from 2006-2007: a journalist walking through, talking about how proud they are of this new drone being sold around the world, called the Wing Loong. She goes through the control centre and talks about it. What we did was get a satellite image from that exact day at that exact location where it was being filmed, and from there we can start to identify the core components that are sold with a Wing Loong. She points to this satellite data system right here; we're able to measure that one, and we can see the little box on it, and this one, which is the pilot's room. Now we've found that every Wing Loong installation, even the ones we can see in the desert at this base, comes with this little pilot control centre and this little satellite data centre.
We couldn't find any of that information on Google whatsoever. So we're starting to bring these things together: what else can we find out that we didn't know before? For instance, these little boxes here. After doing this research we now know that when you buy a Wing Loong, it arrives like a product from IKEA: it comes flat-packed. We found that out by diving through Chinese state-backed agencies and looking at promotional videos about the Wing Loong, and we found this one. When they pack up a Wing Loong for shipping, they fold the wings to the side, take off core components like the nose, and put it all in a flat-packed box, and that's what we can see here. We can corroborate that further by looking at when these things actually moved from Libya to Egypt. We can count the exact same boxes, the exact style of boxes, moving across just recently, earlier this year, from Libya to Egypt, and we can see that on satellite imagery as well. So we know those are the Wing Loong containers. What I'll do now is start a new screen share, because I want to take you into Google Earth, which gives us extra information. Revisiting a site called Al-Jufra, a well-known airbase in Libya, we were able to find some interesting things. This is one of the bases where Wing Loongs have been rumoured to be present, except it's not a rumour, because we can prove it through satellite imagery if I scroll through my little timeline here. Henrietta, is that Google Earth showing? That's working, right? Yes, yes, I can see it. Thank you. Cool. So what we have here is a little Wing Loong present, and of course I can just measure his wings to confirm that it's actually him. OK, cool, so that's our little Wing Loong. What we can tell is that when those Wing Loongs were present at Al-Jufra, this little station popped up here.
That station also left after a bombing at Al-Jufra. So what we're able to identify is that the UAE would usually hide their Wing Loongs in a protected area of an airbase like this, but there's one thing we know is a definite sign of Wing Loongs, and that's these little installations, which we were able to identify by looking at Chinese media reports and cross-referencing them with satellite imagery. When we measure these and take their dimensions, we can confirm that they match the exact composition of the components that come with a Wing Loong when it's sold. So that's all I've got to show you in a very brief amount of time. If there are any questions, I know I really rushed through that as a compressed picture of what Wing Loong II systems look like and the whole war in Libya, but please go ahead. Ben, it was absolutely astonishing what you managed to whistle through in that short timeframe. I'm really fascinated by it. You get a really deep sense that the sort of work you do is not about quick-fix shopping lists of things you can tick off; there's a real craft involved in open source research, not least in how you know what you're looking for. You might not find the thing you start out looking for, so you look for components, for clues, for different sorts of evidence. Really, really fascinating. Thank you. So we have seven minutes left for questions, everybody. We have got some questions in: several for Melissa, one for Hans, and two general ones. What I'm going to do is read them out and give you each a turn to answer your bits and reflect more widely, and if there is time, which there may well not be, I'll invite the audience to respond.
So, Melissa first. In fact, this question is from Dan Plesch, who's just popped up in the speaker view. Dan, do you very quickly want to say it? Well, yes. First of all, this series has been absolutely fascinating and very productive, and it's been brilliant to see people coming in from very different areas of work into what seems to be building an epistemic community. I've been blown away by the quality of some of the presentations. But very quickly: one of the motivations behind our project is that while a very large part of the world thinks it's desirable and practical to manage the world's industrial systems to tackle climate change, and thinks that's a viable and an essential thing to do, broadly those same communities see the global monitoring of weapons as just too hard to think about. Going back to experience in arms control twenty or thirty years ago, it's always seemed to us that it's actually more feasible than people think it is, and these presentations, I think, only reinforce that. So the larger technical question is: if we start to think about what a global weapons management system might be, a globalised version, if you like, of the CFE and CSBM agreements of the 1990s in the OSCE, how scalable is the sort of work that Melissa and Hans and Ben do? How scalable might it be for dealing with the general-purpose forces of the world's armies, navies, and air forces, which I think is the 800-pound gorilla in all of this? We look at the micro level, and we look at the macro level of WMD, but this huge area in between gets almost no attention from NGOs, funders, and governments. Great, thank you, Dan. Adding to that, a question for Melissa: could you say more about how Datayo stores its evidence? And also a question from somebody who's now left, I'm afraid, asking how frequently you update the imagery on Datayo's sites.
We had a question specifically for Hans: could you say anything about the deployment of Russian nuclear forces, please? And a specific question for Ben has just come in, about whether you think states might start taking measures to evade open source research the more they know about it and the more prevalent it becomes. Then two general questions that I want all the panellists to think about. Again from Dan: when thinking about professionalisation of the whole field, do we think in terms of one community of practice, one set of professional standards, one ethical code, or multiple ones? And Paul Schulte asked what the evidence is for which sorts of data sets are the most influential and the most useful here. I'm very sorry, I've gone through those very quickly; time is really ticking. I'm going to ask Melissa to respond first, and I will ask you in the order you spoke, so you can be preparing your answers. Yeah, thank you, Melissa. Thank you for the questions, and you can follow me on Twitter and send me a direct message at @mhanham if I don't answer them all. I think the tools we have in Datayo are imperfect but getting better, and they are definitely scalable. Our constraint is the data, because we're collecting data from out in the world and bringing it in, and we have a finite budget. So the data we put on there reflects our mission, and it may not be yours. We're very interested in cost sharing and other opportunities for lowering that barrier. The data comes in whatever format the company delivers it to us in. So, for example, Maxar data is daily, but it may be two or three days behind the current date because it takes a while to get downlinked. We typically build bots that collect however often we believe the data to be updated, so if it's a news source or a government page that is updated daily, then we collect daily.
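The collection bots Melissa describes, each polling a source at whatever cadence it is believed to update, could look roughly like the sketch below. This is a hedged illustration, not Datayo's actual pipeline: the `Source` fields, the example URL, and the cadences are all hypothetical placeholders.

```python
# Minimal sketch of a per-source collection bot: each source carries its
# believed update frequency, and we only fetch it when that interval has
# elapsed. Source names/URLs here are invented for illustration.
import time
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    url: str
    poll_every_s: float      # believed update cadence of this source
    last_polled: float = 0.0

    def due(self, now: float) -> bool:
        return now - self.last_polled >= self.poll_every_s

def run_due_collections(sources, now, fetch):
    """Fetch every source whose polling interval has elapsed."""
    collected = []
    for src in sources:
        if src.due(now):
            collected.append((src.name, fetch(src.url)))
            src.last_polled = now
    return collected

# Dry run with a stub fetcher instead of real HTTP.
sources = [Source("gov-page", "https://example.org/daily", poll_every_s=86400)]
print(run_due_collections(sources, now=time.time(), fetch=lambda u: "<html>...</html>"))
```

In a real deployment the stub `fetch` would be an HTTP client and the loop would run on a scheduler, but the core idea is just this: the cadence lives with the source, so a daily government page and an hourly feed can share one collection loop.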
Our storage is Amazon Web Services, and storage really hasn't been a cost factor; it's getting cheaper and cheaper. It's the processing power that starts to add up and get expensive. When you click a button and think, oh, it just takes a few minutes, there's a computer server farm in Iceland somewhere dealing with all that data, and that's where the expense adds up. So to the extent we can pre-clean and vet the data so those computations are simpler later, we do that. If you have any other questions, please feel free to reach out on Twitter. I just wonder, Melissa, if you had anything to say about Dan Plesch's question on the scalability of the tools you use. Yeah, so imagery, text analysis, all of those kinds of things I think are pretty scalable. I did a training for the OSCE, I want to say a month ago, on using natural language processing. I think there are a lot of opportunities; it's really a case of political will. Many governments and international organisations who have had the chance to look at Datayo say, yes, this is really great, but we don't want to operate in public; we want our own siloed version of Datayo where only we look at this information. That's their right and their option, and it makes sense, but it does mean you lose that expertise and diversity from the rest of the world. Great, thank you very much, Melissa. So, moving to Alan: do you have any comments on these questions, generally put? The one on data, I suppose, is about what makes information more credible, and how to make it more credible. I suppose it's when you put the analysis behind it. If you can prove the analysis behind it to a very high degree, then the analysis can prove the information.
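The pre-cleaning step Melissa mentions, tidying and vetting data so the expensive computations downstream see a smaller input, can be sketched very simply. This is an illustrative assumption about what such a step might do (normalise text, drop empties and exact duplicates), not a description of Datayo's actual pipeline.

```python
# Hedged sketch of "pre-cleaning" before costly processing: normalise
# whitespace/case and drop exact duplicates, using a content hash as the
# dedup key (handy when records are large). Field handling is illustrative.
import hashlib

def pre_clean(records):
    """Return normalised records with empties and exact duplicates removed."""
    seen, cleaned = set(), []
    for rec in records:
        norm = " ".join(rec.split()).lower()   # collapse whitespace, lowercase
        key = hashlib.sha256(norm.encode()).hexdigest()
        if norm and key not in seen:
            seen.add(key)
            cleaned.append(norm)
    return cleaned

raw = ["  UAE  Wing Loong sighting ", "uae wing loong sighting", "", "New report"]
print(pre_clean(raw))  # -> ['uae wing loong sighting', 'new report']
```

Even a trivial pass like this pays off at scale: the server farm only ever computes over the deduplicated set, which is exactly the cost-saving trade Melissa describes.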
When you've just got one post with no analysis behind it, it's easily debunked or classed as fake news. All the work that goes on behind it is what actually proves the analysis and supports the judgment. Great, thank you. So I think that was answering Paul's question about what evidence is most influential, and what you're saying is that it's the process of giving the information you collect credibility that matters. So the question is not really about the sort of data you're getting; it's about what you do with it, how you crunch it, and the value you add. Yeah. Thank you, Alan. Hans, I'm going to move on to you then. OK. Let me just interject first: I misrepresented the question for you. Did you read the clarification? Yeah. So let me just share my screen while I answer. The question is about whether, or to what extent, Russia has nuclear weapons in the Kaliningrad region. And of course, we don't know. All we can say is that, yes, they have lots of weapons systems there that are capable of delivering nuclear warheads. But whether they also have the warheads for those delivery systems in that region is hard to say based on open source information. You can see there are some important developments going on, though. For example, here there's an upgrade of what clearly seems to be a nuclear weapons storage site north of the city of Kaliningrad. This has been going on for the last several years, and I think they're just about done with the bigger of the bunkers. Of course, this doesn't prove that there are or are not nuclear warheads in Kaliningrad; you need other information for that. But it shows that the site is active, being maintained, and apparently being upgraded. So at the very least they want to send a message about capability, or what have you.
What we hear from US officials is that, yes, there are several nuclear-capable systems in Kaliningrad, but they have not detected movement of nuclear warheads into that region yet; those are stored further back, inside mainland Russia. Thank you. The other questions as well, Hans? The one about scalability? Yeah, the one about scalability. It's theoretically possible, of course, but I think the real crunch, speaking just for ourselves, is the amount of resources available to do this work. That's what it comes down to. We are, in a way, suffering under the same predicament as the intelligence community: it's theoretically very easy to collect a lot of data, store a lot of data, present a lot of data, and all that kind of stuff. What takes time is to look at it, pick out what is most relevant, and draw the right conclusions. That takes analysis, and in the end it takes people to be able to make those judgments, and that takes personnel. So this is a real dilemma. For our part, for example, that has been a contributing factor in forcing us to limit ourselves: we don't look at broader defense issues but just focus on the nuclear part of it, because we simply don't have the capacity to do more. It's really interesting to hear that theoretically it would be possible to do more global weapons tracking, or more tracking generally, but the hard limit comes from how many person-hours you've got, so it's quite good to be able to draw limits and do what you do really well. In case it's not clear to everybody, Nathaniel Raymond had to go because he had a teaching commitment, so we're not hearing back from him.
So for the final point I'm handing over to Benjamin Strick to comment on these questions, and we had a similar question come in for you, about the countermeasures available to states. Paul Schulte sent a follow-up question asking whether we shouldn't expect states to just start hiding things in big boxes. So over to you, Ben. Yeah, awesome questions, guys. So first of all, on the countermeasures: this is not new stuff. Bellingcat is not doing new stuff, and none of us are doing new stuff, because the adversary always has an intelligence room full of analysts, ten thousand of me, ten thousand times the size of Bellingcat, looking at satellite imagery all day long as a job. The Israelis have even recruited analysts on the autism spectrum to stare at Google Earth twenty-four seven, mapping out every single thing. This is not new. I think what is new is some of the bad press that militaries now get, because it's not intelligence doing it, it's the media. So, for instance, Russia a couple of years ago sent out a communique, essentially, to superior officers in the military saying: stop your soldiers from recording and uploading, things like that. And I'm sure the UAE have done the same for their people in Libya. You know, stop taking selfies at Al-Jufra, because otherwise Ben's going to find them. Simple things like that. I also think that as technology progresses, they're finding new ways of hiding things. For instance, there's a new Turkish airstrip in the middle of Tripoli, built essentially in a residential district, just a smack-bang airstrip, and the way they hide their vehicles is by parking them under the unused apartment blocks, rather than in bunkers that are hidden and camouflaged. Because of course there's a bunker somewhere; where else is it going to hide? It's not going to hide in a McDonald's.
I think the next big thing coming up is commercial contracts for satellite providers. This is a very touchy topic, especially coming from journalism. Obviously, if you're a satellite provider and you have the UAE as a client, you're not going to provide much satellite imagery to NGOs or public groups that want to investigate UAE war crimes in Libya. So there are interesting situations like that which are going to become a turning point later, situations that may be indirectly assisting the enemy. And I'm not even going to call them the enemy, but indirectly hindering the documentation of horrible things as well. On Paul's question about evidence: I think that's a really good question. The issue with OSINT, or open source research, growing is that it's becoming a bit of a trend as well. People hop on Twitter, screenshot Google Earth, and want to be the first to identify this or that. But often the rush to get that information out sacrifices attribution, accountability, and proper documentation. I think the message needs to be: hey, it's cool if you can find something, but find it in the right way and do it with a research mentality, not an "I want to be the first journalist to report this" mentality, because that's an issue that will kill the industry and give open source a very, very bad name. And on scalability: the biggest qualm I see is collaboration between the communities. There are a lot of groups out there that want to be first, to get their brand name on the thing, to be the person or the group that found it, and they don't necessarily like sitting in the same room and sharing that reputation. I think if everyone came in with a complete community mindset and could say, hey, we're all doing the same thing, we're all trying to keep accountability.
Let's do it together. Sacrifice ego and do good work documenting North Korea, documenting human rights abuses in Africa, things like that. But if you want to respond to me, or send me hate messages or anything like that, I'm @BenDoBrown on Twitter. Ben, thank you very much, and I'm sure it's unlikely that anybody will be sending you hate messages. Really, really fantastic. I'm sure we've all got a flavour of how we could have gone on much longer with these questions, and I'm sorry it's so rushed. I think you brought us back full circle: the sense that open source can do a whole bunch of stuff, but there are challenges to professionalisation. That brings us back to Dan's question: is it one community of practice? Do we need multiple ethical codes? I invite you to look again at Melissa's presentation; she invited comments on Datayo's code of practice and the ethics they've written. And I do hope we can keep this conversation going. Thank you so much. What amazing speakers we've had today, what amazing questions we've had, and not just today but over the full seven webinars. The conversations we've had have been amazing. Please do save the date of our virtual cafe if that's something that interests you, and please do watch out for the other things we're doing. The date of the virtual cafe is the 10th of December. So it's now time to say goodbye to everybody. Thank you again, and I hope we see you soon. Bye bye.