Good afternoon, everyone, and welcome to New America. On behalf of OTI, SNV, and Mozilla, we'd like to welcome you to our panel on vulnerability management. I'm excited to introduce our moderator, Sharon Bradford Franklin, who is co-director of New America's Cybersecurity Initiative and the Director of Security and Surveillance Policy for OTI. She's going to introduce the rest of the panelists, but if you would like to tweet about today's event, the hashtag is #OTIVulnerabilities. So, Sharon, do you want to take it away?

Thank you, Andy. And thank you all for joining us here today. We have a great group of panelists. I'll introduce them in just a couple of moments, and we're going to have a conversation here first, and then I promise we will open it up toward the end for questions from the audience. So in April of 2014, the public learned about a vulnerability called Heartbleed. It was caused by a flaw in OpenSSL, and it allowed attackers to access a lot of sensitive information, such as usernames and passwords. Security researchers at the time termed this vulnerability catastrophic. And according to Wikipedia, at the time of the disclosure, about 17% of the internet's certified secure web servers were vulnerable to Heartbleed. There were immediate reports that the NSA had known about the vulnerability but refused to disclose it, because the NSA wanted to retain this vulnerability and exploit it for its own intelligence operations. Just a few short weeks later, on April 28th, the then cybersecurity coordinator for the Obama administration, Michael Daniel, went public, announcing that, quote, while we had no prior knowledge of the existence of Heartbleed, this case has reignited debate about whether the federal government should ever withhold knowledge of a computer vulnerability from the public.
He proceeded to describe White House efforts to design a process for decision-making about when the government would withhold a vulnerability and use it for intelligence or law enforcement purposes, and when it would disclose it so it could be repaired. Some time later, through some FOIA litigation, the Obama administration released a heavily redacted document that outlined a process called the Vulnerabilities Equities Process that was going to govern this decision-making. Then in November of 2017, just a little over a year ago from today, the current administration, through then cybersecurity coordinator Rob Joyce, released a fully unclassified document outlining the Vulnerabilities Equities Process in a charter. So for those of you not already in the weeds on this topic, the VEP, or Vulnerabilities Equities Process, is the process through which a government decides when to disclose a vulnerability so it can be repaired, and when to retain it so it can be exploited for law enforcement or intelligence purposes. Our conversation today will explore how the VEP works; what harms it causes to cybersecurity when the government does not disclose a vulnerability, or waits too long to disclose one; who has a seat at the table within the government to make those decisions; and what are the processes and standards under which they operate. We'll also discuss the US VEP charter one year after its public release, although of course its provenance goes back further than a year, and a proposed approach in a paper just released by SNV, which you may have had a chance to pick up on your way in, written by our author here at the end of the line. So our expert panelists will offer a range of perspectives on all of these topics, and I'm going to introduce them now before I start turning to them for a series of questions. So we have, seated immediately next to me, Heather West, who is senior policy manager and Americas principal for Mozilla.
Seated next to her is Daniel Moßbrucker, who's the internet freedom desk officer for Reporters Without Borders in Germany. Then we have Kate Charlet, who's director of the Technology and International Affairs Program at the Carnegie Endowment for International Peace, who is the former acting deputy assistant secretary of defense for cyber policy and served as the DOD representative in the VEP process from 2016 to 2017. And then at the end, we have Dr. Sven Herpig, who is the project director of the Transatlantic Cyber Forum for SNV. I'm not going to try to express what SNV stands for; he can do that for us. And he is the former deputy team leader at Germany's Federal Office for Information Security. So I'm going to start here with Heather in helping us set the stage for the problems that we're trying to address with vulnerability management. Heather, Mozilla had some direct experiences with issues of vulnerability disclosure that help illustrate all the equities that are in play. Can you tell everybody about that experience?

Sure. So Mozilla makes the browser Firefox, and we are a mission-driven tech company. One of the things that we try to do is figure out interesting new ways to foster trust in the internet and push back on things that cause people to distrust the internet. So this is a pretty big one. When Heartbleed comes out, everyone, especially in the tech industry, basically runs around in a circle, throws their hands up, and panics. We want to avoid that as much as possible. So when we started looking at the vulnerabilities equities process, we were looking for something that we could make a concrete contribution to that would have a positive impact on trust in the web. The vulnerabilities equities process is very wonky. It has approximately 100 syllables in its name. And I told folks internally, this is a really cool project, it's really interesting, but it's never going to make USA Today. I was wrong about that, by the way.
This has entered the discussion as we talk about the responsibilities of technology companies and the responsibilities of governments to really protect users. But one of the things that we were looking at as we poked around this: there is a series of FBI cases centered around a website called Playpen that the FBI used some hacking techniques to investigate. And regardless of whether you think that's a good thing or a bad thing, it's happening now, and we need to be thinking about how the government interacts with this ecosystem. In the Playpen case, the FBI decided not to share their vulnerability, that is, how it is they got into, I believe, somewhere north of 1,000 accounts with illegal activity on this site called Playpen. And we think that that isn't necessarily the best way to think about that. We believe that vulnerability was in Firefox, but we don't know. If the FBI wants to withhold a vulnerability, we want to make sure that we can trust the government when they say, you know, we went through it, we talked about it, and we need to withhold this one. And that piece, I think, is the core of this process as the government looks at these vulnerabilities. And when Michael Daniel wrote his blog post about Heartbleed, I think that was actually a good example of, okay, there's something going on here. So Mozilla started just looking into that. It was highly classified, hard to get information on, but through the FOIA litigation, which I believe the EFF did, Michael Daniel's comments, and that kind of thing, we started working on: what should a vulnerabilities equities process look like? Who should be at the table? What are the considerations that you should take into account when you're making these decisions? You know, do we think we're the only people that know about this vulnerability? Do we think that we, the US government, can do more good with it than it will harm others?
And so we were thinking about all of this, and it came together, and we were very, very happy with the reformed VEP charter that came out last year. And we'll get there.

Yes, and so I will let others chat about that. We'll come back, yes. So I turn next to Daniel. If you could help us further set the stage, based on your work with journalists and human rights activists on the ground and the perspective that this provides you on the problem that we're addressing here.

Yeah, thank you for the invitation. So, Reporters Without Borders, we also have an office in Washington, DC; I'm from the German section. We are a human rights organization working with journalists, especially in the field, and trying to secure their work, also physically. But I am focusing on digital issues. I'm a trainer for digital security, and maybe just to give you an impression of our work: we, for example, launched this year in Germany a fellowship program for journalists who had experience with digital threats, this is how we called it, and we asked for applications, and within two weeks we had, I think, more than 350 applications. And one of the criteria to apply was that the people had already faced digital threats. And after looking at the applications, I would say that in 250 cases it was really like, okay, wow, there was something going on before. So, yeah, surveillance is honestly a big issue for us, and in training journalists, of course, we're also talking about encryption, about anonymization tools. But I have to say, from my experience, and I've been doing this since 2015, journalists working in the field, especially working in autocratic countries, have completely lost trust, honestly, in their devices. So they say, well, my government is cooperating with some businesses based in Europe, based in the US, based in Israel.
And well, then it's nice to have PGP here, but I'm sure that my government has access to it. And this is problematic, because we are a bit helpless sometimes, because they're right. There are these vulnerabilities, and they know about it. And this is a bit problematic because, yeah, what do you say to them, right? You can't really help them. And one thing I learned, especially in the last months, so just to give one example, because examples often make it concrete: we have one fellow from the Democratic Republic of Congo at the moment, a well-known journalist in his country, and he came to us and we checked his devices. And we found out that he was infected with intrusion software, probably somehow affiliated with the state. That doesn't matter now, but the thing is, the vulnerabilities that were used there have been known for, I think, seven years. And this is also one of the problems, because the discussion often focuses on zero-days, but especially in developing countries, people just don't have the money to get the latest devices, the best iPhone. It's just a problem that they use very old devices that are very easy to hack, basically. This is maybe also something we can discuss today, the whole thing of patching.

So turning next to Kate. You participated in the Obama administration's development of the VEP, and then a bit in the current administration's efforts to revise, or at least declassify, a document that outlined this process. So if you could just outline for folks a little bit more: what is the U.S. VEP, what does it say, what does it cover, how does it work?

Sure. Yeah, and you gave in your opening a pretty good overall statement of the trade-off that you're making between retaining a vulnerability to exploit it for military, intelligence, or law enforcement purposes, or releasing it to the vendor for patching.
To make that a little more concrete, it's a very valid kind of trade-off discussion that has to be made on the national security side. These kinds of capabilities help agencies look for warnings of cyber attack, a whole host of national security needs. On the other hand, these vulnerabilities could have significant impacts on software and hardware used by financial critical infrastructure, by energy critical infrastructure, by the broad public. So it's a really challenging, and should be a kind of contentious, debated process that happens internally. From the U.S. side, the process of weighing vulnerability equities and this kind of trade-off had been happening for a long time within the NSA; after Michael Daniel's blog post, that was moved into the White House. But because it was in the White House, and it was treated with sort of this executive privilege and a high level of secrecy and sensitivity, there just was not a lot of discussion publicly about it, with the exception of what came out in the FOIA litigation and what came out in Michael Daniel's blog post. And I think there was a recognition and a feeling that a lot of these details were actually something that could be talked about. I, for example, go back to the criticism that I saw about, hey, the Department of Commerce isn't involved. Well, we just weren't saying anything about who was involved. But at every meeting, the Department of Commerce was there; the different agencies were there. We just hadn't been talking about it. So I think when Rob Joyce announced the updated charter, it was updated, but I'd say most of it was new transparency as opposed to brand-new content, which is how he portrayed it when he did the announcement. But it gave a rundown of the agencies involved. It gave a rundown of the different types of considerations that went into these decisions, like how prevalent is this vulnerability, to the extent that you can know it.
That's actually a challenging issue. How much impact would it have on the public and private sector in terms of the vulnerability it exposes them to? Do we have any indication that some other adversary is already using this vulnerability? It lays out a number of different criteria by which some of these decisions would be made, as well as the process through which it happens. In this case, there are lots of different agencies within the US government that could find and do vulnerability work, but they have to submit it into the process. The goal is consensus to either retain or release. And if there is not consensus, then it goes into this broader interagency discussion process, which we can detail if anybody's interested, to come to either a consensus or address areas where there might be disagreement.

Okay, so Sven, you just released this wonderful paper, Government Vulnerability Assessment and Management: Weighing Temporary Retention Versus Immediate Disclosure of Zero-Day Vulnerabilities. Folks had a chance to pick it up out there. Here's our plug, right? So, you make a number of recommendations in this paper; give folks, at a high level, some of the recommendations you would like to highlight. And also, you've been involved in conversations with the government in Germany about adopting some kind of VEP process, so if you could both give a high-level overview of your paper and tell us a little bit about the status of these proceedings in Germany.

Thank you, Sharon, and thank you for the invitation to be here today. So maybe let's start with the genesis and how we approached the German government. We were setting up our work on that in January last year, and we brought together some German experts in Berlin and some American experts here in Washington. And we were talking about: what are the issues that are prevalent on both sides of the Atlantic?
Where do we want to cooperate, and where can we learn from each other? And before anyone in Germany ever talked about the VEP, we came up with the idea that, okay, there's a VEP going on here in the US (that was before it was published), and maybe we can talk about it, and we can inform the German public and the German government about how this vulnerability management can look for Germany. So we worked on that for one and a half years, and while we were working on it, we got lucky: the US published parts of their workflow and the participating agencies in detail. So we could heavily rely on that and then improve on it with our working group of technical, political, and legal experts from both sides of the Atlantic. And then, when we had the first draft of that paper, the German government announced that they're going to do a VEP. And we got lucky again. We were like, okay, well, I guess now we will publish our version of how the German VEP should look before the German government actually publishes theirs. And that's what happened. We published our paper, I think, in September, and the German government will publish their VEP in February or March next year, maybe. So what happened was, first of all, we were able to do the agenda setting. The German government was actually quite happy with it, because they were not sure how the public would react to us talking about managing vulnerabilities, because until now, the discussion was always that the government should not retain any vulnerability at all. So us actually saying, well, there's a complex process that can be implemented, helped them to say, okay, we want to publish a policy; they have some people looking at the thinking there, and they're actually also saying we should make a process. At the same time, it helped us to influence what they are drafting right now, because you have to imagine how the German government sometimes creates its policies.
So you have one or two people stuck in an office for half a year trying to draft something, and they don't even talk to their technical agencies. And they're writing a policy on a technical issue, which doesn't work too well. And then they have to rely on public information. They might give a call to GCHQ or to the NSA, but information sharing on national security issues might not work too well either. So they were happy that there was a paper in German dealing with specifically what they needed. And that's how we ended up with what we put in the paper. The paper is, for a policy paper, I believe, very concrete. We drew charts, and we put in boxes, and said, okay, here are two pictures. So if you look at the pictures, you already get the idea of what we're doing, which is very good for our policy makers as well. And we had some high-level principles to follow. One is, for example, that this kind of process should be enshrined in law, maybe with a sunset clause and a review, but at least it should be put into the legal framework so it can't be changed on a whim by the government. And also that there should be an extensive weighing of the equities that go in there, because it's a national security issue. And it's very important that we carefully look at the different equities and weigh whether to temporarily retain vulnerabilities or not.
And then we worked through who should participate, who's going to be in the decision-making body. Then there's a problem that's very specific to most countries, not the US, but most other countries, which is: our government doesn't have enough technical expertise to make valid or very good judgments. So how do we bring external expertise in there and talk about classified information? But then also down to what Kate just mentioned: what are the criteria? So we're going through ten different criteria that should be looked at and evaluated. When should the government be able to retain a vulnerability, and when should it be directly disclosed to the vendor? Things like: where is it used, how critical is it? Can I remotely access an electrical power plant and shut it down, or can I just download the emails from a five-year-old Android phone, like Daniel just said? So what is that vulnerability, where is it used, how can it affect us, and how can law enforcement, intelligence, and the military use it operationally against our adversaries? And all of that in a very neat paper, I think it's twenty-some pages or whatever, and we published it in German so that the German government can take it on, and they were very happy with it. And now we're awaiting their version, and we're looking forward to criticizing their version. Which, as the last thing I'm going to say about that, it made it very easy for us to criticize them now, because we went ahead and published something before they published their policy in a new field. So we now have, in the public perception, the best reputation in Germany to actually criticize them, and they can't really argue too much with it, because they know how we were doing it. So it's going to be fun.
So, a question for all of you, based on something that is in Sven's paper. The paper makes the point that the question shouldn't be whether a government should disclose a vulnerability, but when. So even when the decision is made to retain a vulnerability and exploit it, that shouldn't be a forever decision, and the US VEP charter does call for reevaluation, when the decision is made to retain a vulnerability, on an annual basis. So I want to ask the panelists: why is it important to reconsider? And for people other than Sven, what do you think of the question of whether it should always be a when rather than a whether? Can we ever retain something forever? Thoughts on these questions? You want to go first, Heather?

So Mozilla very much agrees, and we were part of writing this paper, one of my colleagues and myself. But we do think that the reconsideration is an important piece of this, because you shouldn't say, hey, here's my tool that I want to use to get into that computer, I'm going to keep it forever. That is kind of explicitly what we're trying to avoid when we're talking about the importance of this process. When we were talking about it, we also thought of it as a shot clock. You can reset the shot clock and you can say, all right, I still need this, I have a use for this, and it is a concrete use. It's about making sure that people have a good reason to keep it. And again, this is all about creating trust, and talking about the process is a really important way to create trust. But if we think that there is a hoard of old vulnerabilities in a corner, that's not ideal. I also think that, from the government perspective, an old vulnerability just hopefully isn't that useful. I think in the US that's probably more true than for your folks in the field, but getting that in there, both in your recommendations as well as in the declassified charter, was great.
I'd like to make the point that, yeah, we can talk about the whether and the when. I do want to talk about the whether again. We had this discussion in the working group, so there were also, I hope I am allowed to say this, people who in the end did not sign the paper, because they say, no, we can't make this step, because it opens up the admission that we use vulnerabilities at all.

So you're saying some folks take the position that you should never retain it for exploitation.

I would like to say that, as an international organization, we're working a lot on the surveillance industry doing business with states that are not that democratic, let me put it that way. And on these questions, I'm deeply involved in an initiative by the European Union; they would like to reform their export control system for surveillance technology, and, for example, the question of vulnerabilities is not part of that. It's too far. However, I think the discussion is a bit similar. We're talking a lot about, okay, which states are allowed to get one. So this is kind of a whether, in the sense of: what are the conditions for states to get these vulnerabilities? Because yes, states can find them by themselves, but that is, I think, not the majority, though no one really knows. Especially when it comes to non-democratic countries, they buy them on the market. Either they buy tools to use, or they buy hacking as a service, so the company does everything for them. But it is also known that vulnerabilities are sold on the market, and I think for this international dimension, we would argue the whether is very important. And what we always bring in there is that such a deal should only be possible when the human rights standards in the exporting country are similar to those in the country that buys these products.
This is something we very much lobbied for, or tried to lobby for, in the European discussion. It's hard to make that point, because it is obvious that then a lot of deals would be impossible in the future, and the states in which the companies are based are often customers of these companies themselves, which makes it very hard to lobby for human rights standards in this international dimension.

So, Kate, Sven's paper, as you know, urges that it shouldn't be whether to disclose but when. Daniel's taken it a step further, and some folks say it shouldn't even be whether to ever retain. What's your take on this?

Yeah, I mean, I see valid purposes on both sides, with retention being potentially in the national interest, so I don't see it necessarily as a question of whether, but when. I think that one of the most important things that a VEP process does is force the organization who wants to retain a vulnerability to make a strong case for why they need it and why it's in the national interest to retain it as opposed to releasing it. So I think having this kind of discussion, and pushing this discussion, whether it's every six months or, in certain cases because something is particularly urgent, every three months or even the next day: the push is on those who want to retain it to make a case for it. And if that case no longer exists, if for whatever reason it's not useful, then the push should be to release. As Heather said, right, we've said, Rob Joyce has said, it's not a hoard, it's not a stockpile, and you have to live up to that; you're not just keeping these things around unless they're useful. I do think that, in terms of the timing of it, you do need to have a flexible process, so that where something is particularly of concern or there's a lot of debate about it, you can have that conversation more frequently.
Now, I do think that there are also instances where the question becomes: if there is a vulnerability that is so old, or it's in a product that is no longer supported, then even if you release it, there's not going to be a patch anyway. That raises another kind of question about whether it's in the national interest to release that, too. So there's one slight nuance to the question of whether versus when in my mind.

So, Sven, how did you come up with this formulation, and why is it so important?

I actually didn't come up with that. As we said before, there were a lot of experts involved in drafting the paper. In the end, content-wise it was already in there, but I think it was Eric, actually, who suggested that precise sentence, which I liked. And that's one thing that's often overlooked, and that's the reason why we can't go for whether: you can also retain a vulnerability for defensive purposes. So, for example, the government finds, or buys, or whatever, a vulnerability in software or hardware where the vendor doesn't exist anymore. What do you do with that vulnerability? Of course you can say, okay, we have this vulnerability, we publish it on the internet, and go for it. But if there's no vendor who can provide a patch for it, is that really that useful? So you have two options there. Either you retain it for defensive purposes, which means you retain it and hope that no one else will find it, because you don't know what to do with it. Maybe there's a mitigation method, which is not a patch but a workaround, that you can share with your critical infrastructure, with the government, or more broadly with the public. Or you can say, okay, we have a vulnerability, the vendor went bankrupt five years ago, so everyone who has a smartphone with operating system XYZ is inherently vulnerable.

How likely is that?
So how likely is it that a technology is widely used by critical infrastructure and mass communication tools and the vendor doesn't exist anymore?

It doesn't have to be widely used, generally. The other option would be, of course, to give out an advisory and say, okay, everyone who uses that tool, you can use it, but be assured that it's vulnerable and there will be no patch for it. So you can retain it for defensive purposes, or you still have the other way of saying, okay, here is an advisory: don't use it anymore, and if you do use it, be aware that it's basically very vulnerable. Both ways you can go. I'm just making the argument for why I don't think we should go for never, because there are instances where you retain it not only for operational purposes, which also makes sense, but for defensive purposes, which might make sense.

So a couple of you have already referred to the concept that governments sometimes buy these vulnerabilities from gray market vendors; it's not necessarily the NSA experts, or others in other countries, who are discovering them. Both happen. And so one of the provisions in the US VEP charter that has been somewhat controversial, and criticized by OTI among others, is a provision talking about non-disclosure agreements. It's not quite spelled out in detail, but basically the concept is that if the government purchases the vulnerability from a vendor and enters into a non-disclosure agreement, this is something that could lead to an exception from going through the VEP review. So I'd like to ask the panelists what you think of that kind of provision. Should we be having non-disclosure agreements? Are they important? Should we be narrowing those circumstances? Should they still go through the VEP process in the same way, with the non-disclosure agreement somehow factored in? Any thoughts you have on those questions. And we can go back in this direction this time, so, Sven,
you get to start.

So in the paper, we basically argue that zero-day vulnerabilities that are either bought via software, like a hacking tool that you buy, or bought as a service (please give me access to that phone) should also go through the VEP process. Because we've seen, in the US, in the San Bernardino iPhone case, that the government basically contracted it out to open the iPhone and said, we didn't buy the vulnerability, we bought the service of someone using a zero-day vulnerability to open the iPhone for us, so we don't have to give it into the VEP process. So that's why we included it as one of the guiding principles. And the main text basically says we are against NDAs, because you can circumvent the entire VEP process by just going for NDAs. However, there's a footnote to it which says, okay, well, we also understand that that might be very difficult for governments, but another way would be to make an exception and have a validated paper trail every time you use that exception for NDAs, so that at the end of the year you have an independent audit, parliamentary oversight, which asks, okay, well, how many times was that exception used, how many times did it go through the regular way, and is that exception rule still feasible, or should it be abolished altogether? I think that's a very practical way to go about it. But it ties in with discussions that we haven't touched upon in the paper, which is the entire market discussion of how feasible it is that our government can still buy the zero-day vulnerabilities, hacking tools, or services that they might need. That's what the process is also for, to evaluate that. If they say, okay, we buy from you, but we can't sign an NDA, it might end up being published four weeks later, because the vulnerability market right now doesn't give out exclusives. So you buy it from the vendor and then you publish it, and all the other companies or governments that the vendor sold it to are screwed, so that vendor is not going to sell
to you anymore, and again, it's a trust-based marketplace, so that might make the government not able to buy zero-days anymore, which is good or bad depending on where you stand, but of course it kind of contradicts, kind of abolishes the entire process. So I think we have to be careful there. And Sven did it very well, I mean, he kind of laid out the complexity. I mean, this is one of the more vexing components of VEP policy. I guess I would just say, on top of what Sven said, you know, a lot of people hate that the vulnerability marketplace even exists, but I don't think the way to deal with that is through a VEP policy. I think you have to figure out your approach and your thinking about how you feel about the entire dynamics here, and then come up with a strategy for how you want to deal with it. That's going to be much bigger and more holistic than any policy that you implement in the VEP context. So, say, for the US government: we've got some pretty skilled folks at the NSA and elsewhere, discover the vulnerabilities and use those, but don't buy any, maybe? I mean, I guess there's a whole range of options for dealing with it, but, you know, I just don't think whether you release or not release is going to be... if you are mad that there are these vulnerabilities that are not provided to the vendor to be patched, but that are sold for people to exploit, you know, the way to deal with that question is not just through the VEP process. You already mentioned some concerns about the gray market. Do you see a policy regarding NDAs as a way to address those concerns?
I have to say, in this very complex discussion and topic, this is one of the easiest questions, I think, to answer, because if a democratic state decides, okay, we need a VEP, because of, you know, all the pros and cons, but we are coming to the decision, we need that, this is something a company knows. If a company then agrees to make a deal with that country, this is then part of the deal. And if the company then says, okay, you have this VEP, we won't sell the products to you, okay, then we obviously have a company that does not match the democratic standards of its customer, so then it's the decision of the company. So I think... I know that NDAs are a tool for companies, and if I were a company I would do it too, because it's so easy, and states obviously need it. But when a democratic state says, okay, we need it here for democratic law enforcement, and we're doing only things that are in line with our values, with our constitution, okay, so then we don't do business with companies that do not agree with our democratic standards. And the great potential for misuse is very nicely shown at the moment in Germany. We have experience just at the moment in Germany with an NDA: a Trojan that German law enforcement is using is probably close to the border of what our constitution allows, let me put it that way. And we have elected politicians who have to control our law enforcement, and the ministry just says, oh, sorry, we can't tell you the details about that, because we signed a contract here. So people who are democratically elected are not allowed to check this technology in the interest of the public, because a private company says you are not allowed to do that. And this is something that, with my understanding of democracy, is not in line. Heather?
I think you guys have done a really good job, again, of explaining the complexity of the NDA question. I don't have a great answer. Mozilla's work around the vulnerabilities equities process was really around creating transparency and trust in the process, and the actual acquisition and the NDA, I think you can absolutely use it as a loophole, but I tend to hope that the folks who are working on this are really well intentioned. And the fact is that everyone I've interacted with who is a part of the VEP process in the US has really been well intentioned and wonderful and smart and really juggling these questions, and I don't think anybody thinks there's an easy answer, because on the one hand, if we say you can't enter into an NDA as the government, then suddenly you don't have access to the marketplace, and that raises a lot of very good, interesting questions. And now that we have a better understanding of what that process looks like, I think talking about acquisition and NDAs is due. So, continuing to talk about... Kate talked a little bit about the question of who has a seat at the table in making these decisions, and some questions that came up, before the more transparent version of the VEP was released, about whether Commerce was even participating. And there has been some criticism of the VEP charter that the number of entities who will weight toward retention for use for intelligence and law enforcement purposes outnumber the entities that will push more toward disclosure, or early disclosure, for repair. So I'd like all of you to talk about this weighting process: whether addressing it through the composition of who has a seat at the table is the way to do it, are there other mechanisms, do we have the correct balance now? Some other criticisms have been that the WannaCry situation might not have been nearly as disastrous if there had been a much earlier disclosure. So are the pressures inside the VEP leading toward disclosure
too late, even when they do lead to disclosure? So if you could comment on the weighting, whether a seat at the table is the way to go, what the mechanisms are to address this. We'll go back this way again, Heather. I think Kate actually mentioned that thus far the discussions have been consensus-based, which I think means that we're not counting votes, and I think that's appropriate. At the end of the day, if you enter that room and you say, all right, we're all going to find a solution together, I think that gives you a much better chance than some kind of adversarial discussion would, where agency A says, I need to keep this, and agency B just starts... I don't know, probably not very dramatic, but, you know, kind of saying no, no, no, we need to close it, we need to withhold it. You know, I tend to think that the considerations laid out in the charter are good ones and take into account that the US government has both a defensive and an offensive mission here, and making sure that that's represented in the process is a really key component. So I was quite pleased with all of the transparency and how they talked about the process. Yeah, I mean, I think it would be very silly to add or subtract agencies to this list to try to achieve some sort of representation of the trade-offs you're making. I think you have to have really good criteria and, you know, factors that you're weighing, that the group as a whole is looking at, of course from their specific agency lens, but also from the broader good. So I think the idea of, let's add agency X so that it better balances the group, just isn't the right way to go about making sure that the trade-offs are being weighed appropriately. I think that requires that you have the right kind of content and criteria that you're weighing. I mean, it should be a debate. If there is an agency that is affected that isn't normally part of the process, although that's probably few and far between because there are so
many agencies in the US charter, they should be brought into the process, and the system should be flexible enough to allow for that. And if somebody does disagree, there should be a process in place to elevate and escalate that conversation, if somebody really feels strongly in a way that the rest of the group doesn't share. I think it depends on who's doing the technical evaluation, because it doesn't matter if you have, like, ten agencies for retaining, normally intelligence, law enforcement, and so on, and then twenty others. The problem is the twenty others: do they send a representative who basically understands the concept of what the vulnerability is, what it can do, and what it can do in their sector, or do they send someone who has the technical knowledge to really understand what the vulnerability is about? So just last week the UK NCSC continued with their transparency initiative and published parts of how they run the UK VEP. Basically, they first have a technical evaluation, they talk about it, and if they don't have consensus it goes to more senior staff, and that more senior staff is mostly not technical, so they do a write-up in a non-technical way so that the senior level can understand what the vulnerability is about, and then they will make a decision, or not. And I think that's basically the key component, because if you have to make the technical analysis and evaluation and you don't understand the technical details, then it's easy to not be a champion of your own equities, because you don't understand what it actually means. And I think that's where the most important part of the discussion is, not, should we add another five or subtract another three. In the German debate, we have a civilian-only, defensive national cybersecurity agency, so it's only for defensive purposes, civilian and so on, and we will have, and we already had, a discussion about whether they should be included
in the decision-making body, because they have a very good reputation in Germany with companies, with society, and with other government agencies. So they argue, if we are not part of the process, we will never know about vulnerabilities that are not disclosed, so no one can say that we are not doing our job, because we are actually disclosing all the vulnerabilities that we know about; we just don't know about the ones that the VEP will retain. But the other side of the discussion is, well, if not you, who's going to be the champion for disclosure? Because you are the only ones who have the technical understanding of what's going on there, and you are the ones who actually want to disclose, or favor disclosing, vulnerabilities. So I think that's a very crucial aspect. So if we generally feel like we are maybe doing okay on the process point, who has a seat at the table, going for a consensus approach, do we also feel like we are getting it right, by and large, on when disclosure is happening? I mentioned the WannaCry example; there certainly has been a lot of criticism that they held on to that vulnerability too long and could have prevented a lot of damage. What about otherwise weighting the equity considerations? The US VEP charter has Annex B, which lists a lot of the criteria that are considered. Should there be a weighting of the criteria, or are we already getting the balancing right? So then we'll go back this way this time. I just want to... it might be an unpopular opinion, but I don't think that WannaCry was actually a failure of the VEP, because as much as we know about WannaCry, it's a vulnerability that was, inside job or not, stolen from the NSA, and when they noticed that, they told Microsoft. Microsoft provided a patch, rolled out a patch for the most current versions of their operating system, and then companies had between four and eight weeks to patch their systems, and only then was it operationalized as WannaCry, and many had not patched. And later on, of course, Microsoft provided patches for
outdated systems. In that case, I don't think that that was a failure of the VEP. The failure was that they were not able to safeguard the vulnerabilities that they were holding on to, which is a very crucial issue, and we haven't found a solution to that, because, honestly, if the CIA and NSA, two of the most powerful intelligence agencies in the world, don't know how to safeguard their vulnerabilities from third parties, then we are in deep trouble with every government agency that's retaining vulnerabilities. So, I mean, I think the specifics that have been discussed in the past are that about 90% are disclosed and 10% retained. That doesn't provide a ton of information to judge exactly the specifics of what decisions were made, and I think everybody will be looking out for the report that eventually comes out from the United States, subject to its new charter. So I think some of that is inevitably going to remain somewhat hard, and probably necessarily hard, to assess. But I do think, to Sven's point, there is this bigger picture: what are the steps that we are all taking to reduce the vulnerabilities that the public faces in the ecosystem? It's not just the act of governments releasing them; it's also the act of companies and vendors patching, and people accepting those patches. So for the effectiveness of the system, we have to look at it more holistically, at not just the government's responsibilities but others' as well. Daniel, should we have a weighting toward disclosure?
No... this is also the point I would like to make. I think we have to draw some distinctions between vulnerabilities. I think it's different when we're talking about critical infrastructure, where we have people who are really able to patch, with a big IT department, versus the area we are working on, which is basically mostly mass communication tools, where, I can just tell you from the field, we really have the problem that people don't know that updates are a good thing. They don't have the ability to buy the latest devices; they just have the experience that when I update my computer, it's slower. So this is also something... of course, maybe this shouldn't be part of the VEP, but when we're talking about patching, these are the problems in the field. Again, in most of the cases we see, when people are hacked, it's with very, very old vulnerabilities. So in May, Access Now brought out a report that German hacking and intrusion software is still used in Turkey, and the German government says, okay, but since 2013 we did not allow any export of intrusion software to Turkey. So either the German company circumvented the export control regime, or it is still possible to use this five-year-old intrusion software to hack people. And this is something I would really like: if the government says, okay, we would like to do that, they also have to support people, also in terms of media literacy, et cetera. Heather? Yeah, I think weighting just feels a little bit like the wrong frame to me, since we're talking about a very qualitative process, not a quantitative one, but we have definitely said that you should bias toward disclosure. And I think the numbers that folks have talked about, Kate and others, there are some public quotes about percentages of vulnerabilities that are disclosed, and I think that's fantastic, but the real question is the context of each of those vulnerabilities, and I think that an overly prescriptive framework might actually be counterproductive. So, some flexibility. In
our early drafts for VEP reform legislation, we did outline a number of interesting questions that we thought should be considered, but we never talked about it as a specific scoring framework or rubric. So we've alluded a little bit to some of the statistics, and the expectation, hopefully, of more statistics coming out. The US VEP charter says that there has to be a report on an annual basis, that it has to include certain numbers, and that it has to include an unclassified version, but it doesn't actually promise that that will be given to Congress, and it certainly says nothing about disclosure to the public, although I understand there's some hope that that will actually happen. It's been a year since the public charter came out, but I believe the reporting period is the government's fiscal year, so we haven't yet seen this report, which is hopefully coming soon and hopefully will be made available to the public. But I would welcome the panel's thoughts: do you agree with me that something should be made public, beyond Congress, and in other countries as well? What reporting should we have, and what should be included in it? Presumably a lot of details about vulnerabilities that have not yet been disclosed is not going to happen; you wouldn't be revealing those because they are classified. But beyond statistics, or what kind of statistics, what would we like to see in this transparency reporting? Start with Heather. I'd love to see numbers, but the numbers themselves, as I just said, don't necessarily tell the story. I want to know whether the process is working. How many times did it break down? Did we classify given sets of vulnerabilities as super high impact, and how often did we withhold those? I think there is a lot of potential context that transparency in reporting will give, and that will help people trust the process. I do think there are some really valid things that shouldn't go into that report and should stay classified, because that's not the way disclosure can or should work. And from our
perspective, the most important part is: we trust you within this process, and then let's talk about how best to... we're always going to be trying to find these vulnerabilities anyway. We're not trying to be oppositional, but we do want our product to be safe. But again, I want as much context as possible, and I think that will go a long way in terms of that trust. So, I work for Reporters Without Borders; our claim is freedom of information, so we're in favor of transparency because it helps hold governments accountable, maybe. So yes, I would personally, especially from the international perspective, be interested in, okay, what's the source of the vulnerabilities? Did they find them on their own, or did they buy them somewhere? And especially, is there an exchange, and with which countries? Because I think this international dimension is crucial in a world in which we have a globalized communication infrastructure. I can only say, from my experience now with negotiations about surveillance technology, where we are also talking about transparency, my experience is unfortunately quite bad. We do not even know what category of product, so, I don't know, telecommunication surveillance equipment, intrusion software, monitoring centers, and so on. We do not even know it from our own governments. And one thing that really comes to my mind now: I had a very good conversation with one of the German people negotiating on that thing, and I explained to him why transparency is important, why people have the right to know, and why it helps people in the field to know what's going on there. He was convinced, I really felt it, and then he made a draft for the negotiation and wanted to bring it in, and then somehow, when we talked again, maybe it's not that easy. So what obviously happened is that there was an internal briefing, and the intelligence agencies said, well, we do not want to have that public. So yes, in favor of
transparency, because it makes governments accountable for what they're doing, but I'm very skeptical that we will get it on this topic, vulnerabilities, which from my perspective is one step further removed than other problems we already have. Yeah, very unsure whether this will happen. Kate, what should the public know beyond the 90% number, or the current version of the 90% number? Well, I'll take this in a slightly different direction, which is, I think if you're looking for bang for your buck in transparency efforts, it would be to expand these kinds of processes, and transparent dialogue about these processes, to others. Right now there are two countries who have put processes on the table for public discussion, maybe one more coming with Germany, but if you go to... I guess the DNI has talked about 30 countries having offensive cyber capabilities or intel capabilities. Okay, well, not all of those are probably finding and using zero-days, but that still leaves a lot of countries that ought to be having this kind of conversation and laying out a process, showing that they do take it seriously, showing that there is some measure of oversight of their process. And to me, that's where the bang for your buck in transparency comes. I think one indicator that I would like would be how many times vulnerabilities that we retained were used against us by other states or other non-state actors, because that basically indicates that the process has failed. That would be my favorite number. We have an Israeli intrusion tool at the moment used against Israeli citizens. Technological sovereignty. So, Sven, you talked a little bit about the fortuitous timing of when things are coming out, and I think you both alluded to the UK. Just two weeks ago, when this event was already in the works, the United Kingdom's National Cyber Security Centre released its own statement of its vulnerabilities equities process, and we know Germany has a system in the works. If all of you could
comment on differences between the US and the UK and what they put out there, differences, particularly for Sven, that you anticipate with the German version, and whether, of these 30 countries, all of whom should have a VEP, there are reasons why we might want them to differ and not all look exactly the same for different countries. So I'll start at this end, Sven. Actually, I think the processes in the US and the UK look somewhat similar, and what we suggested in our paper looks similar too, and I think what comes out of the German policy discussion will also look somewhat similar, which I think is also somehow a good thing. The differences are in, for example, where the secretariat, where the center, is based: it can be the White House, in Germany the Chancellery, and in the UK it's GCHQ. And who's going to sit at the table: in the UK case, for example, it's mainly GCHQ and NCSC people, so people working on cybersecurity in the first instance; here in the US, and also in Germany, we will have a setup composed of the different ministries that all have equities. In Germany I additionally expect to see an independent expert pool outside the government; they will get an NDA or whatever, so they can be additionally drawn into these discussions and evaluations of vulnerabilities, because the technical expertise might not be sufficient, which is not the case in the US and the UK, where there is arguably enough technical expertise to evaluate those vulnerabilities. So in the setup and where it's based there are going to be slight differences, but the workflow as such is very similar, and I think the parameters by which you evaluate vulnerabilities will also be similar. Transparency reporting might be a bit different: in Germany, for example, we have a lot of transparency reporting on wiretapping, government hacking, and all that kind of stuff, so we're also going to see good numbers there. And I believe that the important thing about the transparency initiative by the UK, but also
in the US, and that we are doing more or less in the open in Germany, is that the other 27 countries, of which maybe 20 might not have the resources to spend on developing such a process, can now look at these three things that are out in the open, and they can say, well, if we want to draft one, maybe we can use some elements from this one and some from that one and build our own, or we can just take one of those approaches and make it our own. And that helps them implement something which, when it works well, basically increases overall national security without putting too many resources into it. Yeah, I think to me the main difference, again, as you said, was national and organizational structure. I mean, having it happen within GCHQ and the NCSC works for the UK, because that one organization is simultaneously responsible for intelligence and for protecting the public, so they can have that debate in a way that works for them, because you can bring those different equities to bear. In the US system, where you have a number of different agencies with different equities, it's run through the White House; I think it has to be run through the White House, because nobody else can bring everybody to the table and be the kind of forcing function to bring vulnerabilities up the system and to force the debate and decision-making between them. So I think every process is going to be unique to each nation, and so we should not get overly prescriptive about exactly how that works, because the structures are very unique. I think the commonalities between the different processes are things like, what are some of the criteria that get taken into consideration, and that's an area where I think you really can draw from the experience of the US and UK for some of the other countries. But to me, the things that matter are: one, that there is a process that involves competing
equities; that there is some measure of oversight, somebody who can make sure to be a check; and that there is an expectation and a need for those who want to retain to have to make their case and put meat on the bones to explain why they need to retain it. And so to me, those are the three recipe items that have to exist, and I think the other elements, like how a nation structures it, are going to vary pretty widely. I'm skeptical, because these kinds of processes won't be implemented in autocratic or dictatorial states, and whether we have a discussion about that at all there, which is problematic. But I also don't think that it will be the case in all the democratic countries. We have a lot of countries where, well, they feel that they face threats, and they really need strong intelligence, they really need strong law enforcement, so the discussion there is just a different one. I think you were, two weeks ago, in Israel; talking about these issues there is completely different. I'm very skeptical, especially when we're talking about a global solution, because everything we're seeing is a completely national one, because all the intelligence systems are still... of course we have intelligence sharing, but they operate mostly in a national framework. So we will have VEPs on the national level, maybe, but the global solution that is needed, I don't see happening. Maybe I can just react to that too. I think there's always this question, like when we're talking about cyber norms: should we agree to something when the Russias of the world are going to agree to it but then cheat? So there's always this question: is it still valuable if not everybody is doing it, and should we still do this even though most of these countries that you're talking about may never sign up to something like this, although maybe we could get China to see the value in it? But I see that it's a
national... it's in our national interest to have a policy like this, don't get me wrong. So I think that too, but I think it would of course be better to solve this problem on a global level, and this is what I don't see. I think that's a communication problem, because the communication should be: okay, even if you want your intelligence to be very strong, you would still need a process, because that process does not make your intelligence weaker; it just helps you determine what you should do with vulnerabilities. And if you're a country that says, okay, our prerogative is that the decision should be weighted toward retention for the use of our intelligence agencies, because that gives us overall more national security than disclosing it, well, disclosing it gives you overall national security by fixing it, and your infrastructure is everywhere. So I think that's a communication problem that we had from the beginning: having such a process is good for your overall national security, not only for these geeks and these IT security people, but for overall national security, if done right. So every country should aspire to have one and do it right, so their overall national security improves. But I agree with you that it's not how we talk about it in most places. Heather?
I mean, I tend to agree with everything that got said here. In a perfect world, I'd love to see this everywhere, and I hope that any processes that are set up have similar kinds of considerations and principles behind them and are then appropriately structured for whatever government structure they live in. So, fingers crossed. We've been working with a lot of non-US folks to encourage other governments to be thinking about this, and certainly people are thinking about it. So I want to open it up for audience questions. Two rules here: one, please identify yourself and ideally your affiliation, and second, please actually ask a question. Also, we have microphones, because we do have a webcast. Thank you. Is it on? Hi, my name is Karate, I work in cybersecurity for a large provider, and so... first, I wanted to thank you, you answered just about every question I had, but there's kind of a two-fold question here. One was private sector involvement, so using, say, the ISAC model, information sharing and analysis centers, for large providers and software companies, for example, being involved in this process, maybe cleared individuals who are basically, you know, sworn to secrecy. And the next question was, someone talked about oversight, and I just wanted to mention... Bloomberg recently released a so-called vulnerability for Supermicro, and they said, oh, there was a Chinese chip the size of a rice grain that was the vulnerability, and Supermicro announced today that they had done oversight: they basically had a third party analyze it, and there were no chips found. So everybody needs some kind of oversight, was what I was trying to say. Yeah, I'll jump in from that private sector side. I think having some private sector involvement would be super hard, because it would place anyone who... well, here's this vulnerability, should we tell you about it? Like, that's going to be a very conflict-of-interest side
of things. I do think that that is an important structure, and ISACs and ISAOs and all of that are fantastic, but I think that's the step that comes after the equities process, when we're talking about disclosure from that side. And I do think this is all very complicated; the Supermicro thing points to that, and I've heard a lot of people kind of go, I wonder what that was. I'm going to take the first question. Of course you're not going to bring someone from Google to the table to talk about a Google vulnerability, because that would not work; you might bring some other technical experts to the table when it's about the Google vulnerability. As I said, it's going to be difficult, of course, to have good accuracy or whatever, but at the end of the day, I think, maybe not in the US case, maybe even in the US case, we do need outside technical expertise to do a better vulnerability assessment. So of course we can make the decision that we don't want to include the private sector, so we don't run the risk that the vulnerability will be told to the vendor before we decide to do that, but then we lose the opportunity to get that extra expertise at the table, which makes for a better-informed, right decision. So in my perspective, we should bring the private sector, and maybe even civil society and academia, into the fold and have them help us better evaluate and assess vulnerabilities. But of course it would be up to them if they want to do that, if they want to be part of this secret body that decides on the security of the ecosystem and the internet. There's a webcast... there's a webcast. I'm giving you one more, but then I want to see if other people... does anyone else have a question right now?
Let's move on; I saw a hand over there first. That's fine, that's fine, you've got it. Hi, Patrick Barry, for this purpose I'm a fellow with the New America cyber program, and I wanted to ask a question on what you've been discussing and its relationship to what in some ways I think is as great or a much greater problem, which is the presence of known vulnerabilities and their relationship to cybersecurity incidents. I think Rob Joyce has been on record as saying that the vast majority of vulnerabilities that NSA identifies, or sorry, the vast majority of incidents that NSA identifies, have to do with known vulnerabilities, which suggests that perhaps the discussion has been focused on a very narrow slice of the overall cybersecurity problem. So I guess my question is: from the process that you guys have been working on for so long, or participated in, or modified to meet sort of a global context, what aspects of that process are important for pushing out to this much larger problem of people either not taking action on known vulnerabilities, companies failing to patch in time, individuals failing to download patches in time, this much bigger issue of the prevalence of known insecurities but very little action, which is, you know, personified or demonstrated through incidents like NotPetya, where the Western world is fairly inoculated against that problem but so much of the rest of the world was not? So if you could address that question, I would really appreciate it, thanks. This is my daily work, so I explain to people that an update is a good thing and that it's worth buying a new smartphone every two or three years, especially when you live in countries where you have a lot of phishing, et cetera, et cetera. So the soft but most important answer is, people need to be aware of that. I think a lot of it is already there; for example, I know it from my personal experience, Google, for example, when you log in, they say to you, okay, do the
security check, try to update, and so on. So there is a lot going on, but we need to do better and paint a broader picture. This is probably one of the largest challenges in the world; people also need to have the money to buy new devices. I don't think this is something you can solve outright. It's incremental improvement.

Right, and that is helpful, it's great, but it's not going to fix those problems. My experience in tech has been very much that companies are realizing that making things like security practices optional is a terrible idea. I don't want to think about it. I'm the user, and I'm a pretty savvy user, and I still don't want to think about it. But that comes with its own difficulties when you're thinking: okay, I'm forcing this update on all of these users, but it's for their own good. I do think there could be some improvement when we're talking about security updates, especially when we look at WannaCry: if you had an unlicensed copy of Windows, you didn't get the update, and you got ransomware. So I think we're working on that. We ran an education campaign about updates, I think last year. It was really nerdy, but people liked it.

I've been working on the VEP and how we can expand VEP processes more internationally. I'm personally very interested in the tradeoffs and the intellectual aspect of this, but I'm also very cognizant that there is an opportunity cost to how much you sink into it. That's why I'm not particularly a fan of long legislative processes that try to detail everything a VEP should figure out, and why I look to the three key ingredients you've stated: having a process, having to make the case for retaining a vulnerability, and having some kind of oversight involved. But we face so many difficulties in patching the things that do exist, in this whole
world of supply chain cybersecurity and vulnerabilities, and even in understanding where vulnerabilities enter the supply chain, so what's the opportunity cost? I think every country that finds these kinds of vulnerabilities should invest the time into having a vulnerability equities process, but in my mind there are limits to that, when you could instead be spending those resources advocating some of these other things that have major impact.

Three quick points on that. First, as you mentioned, the process doesn't deal with known vulnerabilities, only with unknown vulnerabilities, as we discussed. So it's just the tip of the iceberg, and seen that way, it might seem ludicrous how much resource we collectively spend on designing the process, talking about it, and discussing it. But it's really important, as we also just stated, that we have it, because it governs a certain problem. Second, known vulnerabilities are factored into the paper: they're part of how you assess vulnerabilities, whether you want to retain them, and for how long. For example, what do you do with a vulnerability in an IoT device that has no update channel? A known vulnerability may be known, but there may be no way to patch the devices remotely, and who connects to their smart light bulb to check whether there are updates to run? So that factors into the evaluation. And a third point, going completely off script here: you mentioned patches and how important they are. There is a development we're going to see in the next couple of months and years; the Australians just passed legislation about it, and I know it's going to be discussed in Germany, though not officially yet. The idea is that government hacking is now being drawn closer to
infected updates. So imagine individual updates being rolled out: maybe the WhatsApp on your smartphone gets a new update that is so awesome because it has a new feature no one else has, which is that all your WhatsApp messages are sent directly to law enforcement. Congratulations. If that picks up and becomes public, it's going to completely undermine the trust that people have in the update process. It would be one of the worst things governments can do in terms of overall IT security, which of course factors again into international security, because then we are talking about: well, I know my device has a vulnerability, but I don't want to patch, because maybe that patch is provided to me courtesy of our government, and then it depends, of course, where you live. I see the journalists in front of me: when I say how important updates are, they show me the news article saying that an update infected your computer, and that's going to be the end of it.

I know we had a question over here on the side. I'm Sean Lyngaas, a reporter with CyberScoop. I'm wondering, in terms of the market for purchasing zero days or flaws, whether you would recommend governments tilt toward finding their own rather than buying them from a shady middleman (obviously you're not going to get anyone to say that governments should never buy them). And in that vein, should we expect more pressure on governments in the wake of revelations about zero-day vendors and companies offering offensive tools, like Hacking Team and NSO Group? They've had some very unsavory business dealings: for example, around the murder of Jamal Khashoggi, Hacking Team, the Italian company, did correspond with one of the assassins and helped train him, emails show. So my point is: should there be more pressure on governments to definitively state
that we're not doing business with certain vendors who traffic in this kind of thing?

Most certainly, yes. One point we always bring in, and I think this is a good comparison: in Germany we have two rulings by the constitutional court on the conditions under which governmental hacking is allowed. The conditions are very strong, let me put it that way; it's hard to build a trojan that is technically in line with the constitutional safeguards, and some say it's impossible. But what we see is that trojans developed in Germany that are not in line with the German conditions are sold to other countries. So technology we ourselves judge to be against human rights is sold to others, and this is something where we say: this is just not okay. But we also have to acknowledge that our government is a customer of these companies, so it's hard to put pressure on a company you rely on. My usual picture is this: we're using taxpayers' money to pay companies that use that money to develop tools which are not in line with the constitutional safeguards, to violate human rights in other countries around the world. You don't have to think hard to see that's truly wrong. So yes, most certainly we should regulate that more, and what you mentioned about arms control for surveillance technology is actually a step in the right direction, because I don't think states should act like that.

In a perfect world, governments wouldn't transact with shady people, but wearing my pragmatist hat, that's just not the world we live in. Should there be more oversight? I think that would be a positive development. But governments are going to buy or find vulnerabilities regardless, which is why we've really focused on the process around deciding whether or not to disclose.

Any other questions?
Yes. I'm Alan Wheeler with the Chertoff Group. My question is about the international dimension on the intelligence side that was alluded to earlier. I was wondering whether any of you wanted to address the intelligence-sharing aspect here. You have the Five Eyes, and it's pretty widely reported that vulnerabilities like this are passed between them, and the NSA and the BND have a relatively close relationship as well. Certainly how one of these countries handles its vulnerabilities equities process is going to have an impact on how all of them handle theirs. If one of your allies decides to give up a vulnerability that you're using for an operational mission, that creates a whole bunch of issues and changes incentives for sharing in the future, both for vulnerabilities you're using for offensive action and for vulnerabilities identified for defensive activities. To the extent any of you have thoughts on that, I'd welcome them.

There's definitely discussion of these processes among Five Eyes partners. Say more countries, across NATO for example, were to pick up and build VEP processes. I think there would be an inherent tension around sharing a vulnerability before you've decided whether to retain or release it. If you're a country about to release a vulnerability, there's not always going to be the ability to share that decision, or a retain decision, really widely with a larger set of countries. So as these processes grow, you're going to see more and more tension between the need to coordinate and know what each other is doing, and the secrecy inherently involved in some of these decisions. My best guess is that as more countries grow capabilities and processes, sometimes sharing will happen and sometimes it won't; sometimes you'll have pissed-off
allies, and it'll be a learning process as you go through that experience.

I don't think that's going to be a problem within the Five Eyes; intelligence sharing extends to vulnerability sharing to a certain degree within the Five Eyes. But outside that, I don't see it, and an important aspect is this: if we have a vulnerability and retain it for operational use, and we want our allies, or like-minded states, to have it as well, whether for operational use or for securing systems, one important question is whether I would know if that vulnerability were exploited against us. For example, if I have this wonderful vulnerability and share it with the United States, I wouldn't trust that they're not going to use it against the German government or against infrastructure companies in Germany. If I don't trust that, then I have to know when it's being exploited against us, and if I can't know that, then I shouldn't share it.

What about sharing for defensive purposes? The government has made a decision to disclose a vulnerability so it can be repaired, but in the meantime it is presumably taking action to make sure it's not attacked through that vulnerability, or sharing it with partners so they can protect themselves.

I wouldn't know whether they're using it to protect themselves or not, unless I can know when they're exploiting it against me.

Other questions? I'm sorry, yes, Andy. That's where I was going: we would have time for one more question if somebody has one, but we're just about at the time when our reception is ready. Thanks to Mozilla for hosting our reception, and I hope you all will join us out there. Actually, you know what, before I send you to the reception, since we have a couple more minutes and no more questions: does anyone want to make any closing key highlights you want to emphasize? Go across and do that quickly, and then
we'll go to our reception.

Okay, quickly. I don't know if it counts as a highlight, but I'm actually very happy about what I think should be the first step: the internationalization of the debate, with more countries not only adopting such a process, but also with those who already have one (we know about the Dutch, we know about the Australians) being urged to go more transparent about it. They can do it like the UK, which has published everything; the US hasn't published everything, but it has published some things, including core parts of the process. This matters for a few reasons. First, there are all the other governments that want to do this and don't have the resources. Second, it helps all of us have a better-informed debate about what's going to happen in this space. And third, it's also good for the government: one of the reasons the US disclosed its process, I think, was all the speculation that they had known about Heartbleed. They can always be made out to be the bad guys, and most of the time they don't want to be, so that kind of transparency is good for them too. So I can only urge more transparency. Maybe international norms might play a role there, I don't know, but at least at the national level: try to implement a process if you don't have one, and if you have one, go more transparent about it, because at the end of the day, if you're doing it right, you have overall more national security. So why not do it?

I second what Sven said, and I'd make a last point on legislation. I know there is a lot of desire and push for legislation. In my view, if you're going to go there, less is more, because there's always this push to be as prescriptive and specific as possible in legislation, yet every country is still learning from this process, and it's going to be an evolutionary one. So in my mind, if you've got a process that's transparent, it's
got oversight, and you're on the record with it, there's not necessarily a need for legislation. But if you do legislate, make sure there's enough space for agencies to learn.

As someone working for a human rights organization, I am a bit of an optimist, so I really hope so. This whole debate about vulnerabilities is one of the very few in our area that is still relatively new, so we really have the chance to build something here. When we talk about mass surveillance, Edward Snowden told us that an intelligence-sharing system had been going on for years, and only now are we talking about whether we can find oversight for it. Here, as we've heard, we're more or less still at the beginning of the discussion. I really hope we use that and bring in the international perspective from the very beginning, and also, although we're always talking about national security or security as such, bring in the human rights perspective from the very beginning, because this is also a chance to do something for the people. That is what I hope.

Heather? I'm glad that this discussion is really going global and really thinking about this in the context of people. I'm a representative of a tech company, so my instinct is "I want to fix my product," but it's not actually about that. So I like that there's been a broad perspective on this panel, even though we don't disagree on too much. I think this was a great discussion.

Well, thank you. If you'll join me in thanking our panelists, and please join us for the reception, and thanks to those on the webcast.