Even before the Congress I did not know what the Wassenaar Arrangement is, and I've now learned it's an international arrangement that has some security implications for us. All about that will now be told to you by Walter, Nate Cardozo, Meredith Patterson and Richard Tynan. Have a warm welcome. Thank you. Just to introduce us briefly: Nate, who's on my far right, is with the Electronic Frontier Foundation, which probably doesn't need any further introduction. Meredith is mostly responsible for the language security, well, I call it a cult, but that's apparently not a nice thing to call it. Richard is the chief technologist with Privacy International, which probably does not need much further introduction either. I'm just a loudmouth who submitted a panel and has been helping out a bit with fukami in Brussels; fukami is involved in this policy area on behalf of the CCC, trying to make things less wrong on this topic. What we will briefly do is discuss: what the hell is Wassenaar to begin with, how much of a problem is it for the IT security community, and how did we get here. We were actually not supposed to have a panel on the program to begin with, and for that reason we will open up the floor to questions from the audience in about ten minutes or so. So start thinking about questions for the panelists, because it will be partially a conversation on the stage but also a conversation with this, well, not necessarily intimate room, but hopefully it will be a good conversation. Anyway, Wassenaar, what is it? It's a very poor suburb of The Hague, the Netherlands; the King of the Netherlands lived there, with ghetto-ish places like you see on screen. It's also a framework between most of the industrialized nations on export controls and weapons technology, mostly conventional weapons, and signatories range from basically all the NATO members plus Russia.
Most of the former Warsaw Pact countries are also signatories to Wassenaar. It is technically not a treaty but has a lower status in international law, and what it more or less does is set policies on conventional weapons as well as dual-use goods. Until December 2013 it didn't touch much upon IT security issues apart from cryptography, a remnant of the past crypto wars, which are reheating again. But in December 2013 two new elements were introduced as dual-use technologies: the first is surveillance technology, and the second is called intrusion software. To give you an idea of the definitions of these two: this is the definition of surveillance systems, basically anything that can intercept at carrier-grade IP networks and can target specific selectors. I'm bringing up this definition not because it's such an enjoyable definition to read, but because, from my perspective as someone interested in the intersection of policy, law and technology, it's very fascinating that in this idea of having surveillance technology considered a weapons-like technology worthy of regulation, because you don't want it to fall into the wrong hands, like extreme regimes and so on, some very funny exceptions were made: if it's for a marketing purpose, or for quality of service or quality of experience, whatever that may be, then all of a sudden it's not deemed to fall under these export regulations. Conversely, there's the definition of intrusion software: any of the following, the extraction of data or information from a computer or network-capable device, or the modification of system or user data, or the modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions. I'm not a techie, but having JavaScript sent from an HTTP server to a browser, does that alter the execution path of the browser? I think it does, and this may be a
bit of a broad definition. To give you an idea of why we bring this up now: this arrangement will be translated into legislation in the signatory countries. I cannot really speak for the US side; that will be covered briefly by Nate when I'm finished talking about the European side. In the European Union you have a regulation that basically copies the list of covered technologies from the Wassenaar Arrangement and says: if you want to export any of this stuff, you must apply for an export permit in your member state. The member states have added some additional rules on top of that. The country where I'm from, the Netherlands, happens to have a royal decree saying thou shalt not export any of the stuff on the list in the regulation, basically meaning what's in the Wassenaar Arrangement, or else you will be punishable by law. That particular European regulation is up for review, and that review will also include the updates that have been made in the meantime to the Wassenaar Arrangement. The current EU regulation on this topic is from 2009, and there have been roughly annual updates of the Wassenaar Arrangement, so in the upcoming two years there will be a new European regulation that may or may not affect your ability as an IT security expert from Europe to, let's say, travel to China. We don't even know what the Commission will actually be doing on this topic, but we do know what is happening on the US side of things, and for that I refer to Nate. Thanks, Walter. So on the US side, the Bureau of Industry and Security, which is a division of the Department of Commerce, is in charge of implementation of the Wassenaar Arrangement. In May of this year the Department of Commerce released its proposed rule, its proposed implementation, and it was atrocious. It was a shock to pretty much everyone involved, and it caused an enormous amount of fear and uncertainty in the infosec community, both in the United States and abroad. Intrusion software is not actually controlled under the
BIS implementation; only technology and knowledge implementing intrusion software is controlled, as well as command and control systems for intrusion software. The problem is, under the United States definition of deemed exports, and in the UK there's a similar concept of intangible knowledge transfer, simply talking about something like an exploit could be considered a deemed export and therefore controlled. So this is really, really, really bad. Another part of the US implementation is that if any of this software contains cryptographic or cryptanalytic capability, the person applying for the export license must on demand turn over source code to the National Security Agency, and you can tell who put that little nugget into the US implementation. So essentially what it means, well, okay, these rules haven't gone into effect in the United States, but if they do, what it will mean is that if you're a researcher in the United States, or a United States person, so someone with citizenship or a green card, you have to dump all your 0-days to the NSA before you can talk about them to anyone else, even in private. That's problematic, to say the least. Also, what the regulations don't do: they don't actually stand in the way of someone like Hacking Team or FinFisher selling their stuff to Ethiopia, Bahrain or the Saudis. All you need to do is get a license, and as we could tell from the Hacking Team dump, getting a global export license under Wassenaar is relatively easy. So what do the regulations do? They make bug bounties, conferences, and small teams working on defensive or offensive research next to impossible. They make distributing things like Metasploit or pen testing tools nearly impossible, and they create legal uncertainty for folks doing research on essentially anything interesting. If you could get a security paper accepted here at CCC, it would not be exportable under Wassenaar. That's of course a gross oversimplification. I began my
journey into Wassenaar when the BIS released its proposed rule. EFF was not closely following Wassenaar in 2013 when the intrusion software and network surveillance definitions were entered; I can speak only for myself, but I regret that fact. Okay, I'm going to have Meredith explain a little bit, from her perspective as an American living in Europe, how this affects her life. But I'll also announce that from now on, people who want to ask questions to these panelists should just walk to a microphone; from now on the floor is basically open. But, Meredith? Yeah, so I'm originally from Texas, but I've been living in Belgium since 2009, so this actually makes me subject to both the US and EU implementations of the Wassenaar Arrangement. The ones that, like Nate was just saying, are most relevant to me are the deemed export and intangible transfer regulations, which, I have to point out, are things that the US and the UK came up with, and I guess it was the UK that presented it to the EU, because intangible transfer was totally the topic of the day at one of the meetings in Brussels on the Wassenaar implementation that I went to. What does intangible transfer mean in practical terms? Well, in practical terms it means that if you have an idea in your head that qualifies as an export, you can't cross a border, and I don't know how they plan to enforce that. Oh, it's way worse than that. What it means, so I'll talk about it in practical terms: Open Whisper Systems is based in San Francisco; it produces Signal, TextSecure, all those lovely things that I have on my device so that I can talk to people here. And if there are people with multiple citizenships even sitting around a conference table in San Francisco, speaking about it to someone without a US citizenship is a deemed export, so you don't even have to cross a border.
All you have to do is open your mouth to get into trouble with Wassenaar. Yeah, and so I'm in the situation where, because of the citizenship I have and the place I live, there are two different sets of regulations that can basically constrain me from traveling, or from even talking to anybody, because I understand the general principle of how to construct exploits from differences in how input handlers parse things. That's obviously not the intended result of this, but it's only one of a lot of side effects that the people who wrote this language were clearly not thinking of when they wrote it. And that brings us to: we have now invented a new class of thought crime. And on that happy note, I would like to ask Richard how we got here. And I'll just follow on, with the press-your-button process. The intangible technology transfer: giving lectures and various things like that are problematic even if you don't necessarily know who's in the room; if they are a foreign citizen they can take it elsewhere. And there's also another problem: if you were to go to a company and disclose some of this information in your own jurisdiction, but that company had many offices around the world, which most companies whose products we use in our everyday lives do, they may actually send that information on internally, which may also include crossing a border to various other teams, so that those other teams could potentially fix the problem. In terms of the starting point for this, I guess it probably started back in 2010, 2011, where the goal was to limit the spread of tools like FinFisher, or tools with the capabilities of FinFisher, to regimes like Ethiopia and Morocco and various other places where we see that technology being used in a very, very nasty manner against journalists and human rights defenders. And very often these countries, what
we've seen is, they wouldn't even have the technical capabilities to develop this stuff by themselves. Ironically, in one of the Hacking Team dumps, the guys were very, very concerned about a particular country's technical skills and their ability to use their tools; not that they wouldn't get what they wanted, but simply that by screwing up the use of these tools they would actually reveal their use, and potentially provide a sample or some material that techies could then use to counteract them. So that was, I guess, the starting point. And as I understand it, back at that time Wassenaar was considered a very attractive target: you essentially lobby one organization who can pitch text into the agreement, and if it's all agreed, by default you get up to 41 different states who would follow suit. But of course, as we've seen, the implementation, the text, the various things like that have not been ideal, in fact far from ideal. Not only are the technical definitions not necessarily correct, but there's the lack of inclusion of factors such as the intention of the individuals when they're exporting this stuff. There's no distinction between an export for, for example, generating new Snort rules to detect malware that's on people's machines, or antivirus signatures, or potentially the disclosure of malware that represents an entire new class of attack, which would allow much more general safeguards to be put in place, like DEP, ASLR, or those kinds of things. So the starting point, I think, was well intentioned, and the implementation, as we saw, has left a lot to be desired. I think the question now is whether Wassenaar itself can capture the nuances of the problem, to achieve the goal but also to not screw things up royally for the entire internet security community and the very infrastructure that we rely on on a daily basis. But I do think we also have to look beyond the text of Wassenaar itself, because deemed export and
intangible transfer were bolt-ons from the US and the UK, so we have to look at the implementation and shepherd that through as well as the text. So I would say, I totally agree with Richie, in that the intention was good, but in my mind at least, trying to make a legalistic definition to separate good software, defensive stuff, from bad software, offensive stuff, is a fool's errand. The definitional problems overwhelm any possible benefit here. The chilling effect of a bad definition, which is exactly what we have, is worse than any of the possible benefits. Especially because, as one of the founders of EFF said in 1995 or 1996, the net interprets censorship as damage and routes around it. Right? This software isn't magic; FinFisher and Hacking Team are barely more functional than VNC. I'm exaggerating, but you get the point. It's the service and support that actually matter: you can give Ethiopia FinFisher and the IT folk there won't know what to do with it without the service and support, and it's the service and support contracts that actually matter. So take that for what it's worth. Software isn't weaponry; software isn't guns. You can't control it in the same way that you can control the export of physical devices. So, in my opinion, trying to fix the definitions in Wassenaar, well, I admire people who think they can do it, but I don't delude myself into thinking that I can do it. So I think we need to ditch intrusion software from Wassenaar altogether. Okay, we have a first question from the room. So, one important note about Wassenaar, and you haven't mentioned it yet: open source is exempt from Wassenaar. Well, so, open source is sometimes kind of exempt. For the purpose of this discussion it's exempt, but there really is not a bright line. I tried very hard to get, um, public domain, okay: public domain is exempt. That ain't the same as open source. Yeah, well, there's this
slight problem, to interject here: when you interpret international treaties, per the Vienna Convention you're supposed to use the common interpretation of words, unlike other legal documents, which have created their own reality. So public domain is no longer necessarily a copyright-like thing. There are two problems. The Wassenaar Arrangement is not a formal treaty, so I don't actually know how to interpret it. The second problem is, I've been looking for exemptions like that as well; I'm nowhere near as much of an expert as the other people at this table, but there are much clearer exemptions for, let's say, open source or source code availability for the crypto bits, less so for this. And I would just add to that, that while yes, in the text on open software, the general technology notes and general software notes, open source software, as I understand it, isn't always open source, at least in the initial embryonic stages. While yes, it may become open source later on, and it may enter the public domain later on, while it's sitting on the laptop of a developer who has yet to commit their changes into a repository that is open source, to the best of my understanding that is still not public domain information. That is still information that resides on the laptop of the individual or individuals who are creating the next version of the particular open source software, and this regime may limit the ability of those individuals to communicate in the formation of the next, the better, improvements to the open source software. We've had very conflicting statements from governments, saying things along the lines of: well, if you're working on something and you intend to present it at a conference at some time in the future, well then maybe it's open source. But how are we supposed to prove that? How are we supposed to demonstrate that? When we're working on bits of code and bits of technology, not all of them are going
to work; some of them aren't ever going to see the light of day. And it's in that process, before things become open source or before things enter the public domain, that this can actually limit the ability of people to conduct their work. And to make matters worse, the general software note, which is what Richie was discussing, is not included in all of the national implementations, including the BIS implementation; the BIS implementation specifically excludes the general software note exemption. There's another question on the left. So, I wanted to share an experience I had in dealing with Wassenaar for voice encryption, and how it may be related to this. For export of a strong encryption product there is a precise specification of whether it fits in the category of mass-market encryption tools or whether it's subject to export control, and it dictates very precise rules: like, whether the end user is able to change the encryption algorithm, whether the end user can acquire it directly and install it without any substantial support from the manufacturer. Those are specific kinds of details related to the way of deploying, and the purpose of, the technology for voice encryption, for strong encryption subject to export. So I'm wondering if it would not be useful to think about and propose, from a policy standpoint, a set of rules that create the boundaries of what should be subject to export regulation and what should not, exactly like strong encryption already has in the provisions of the Wassenaar Arrangement.
But I would say that it's very, very difficult to look at a specific piece of software, just the ones and zeros, and make a determination on that basis. And so, for an agreement such as this, or a regime, to achieve the stated objective that I mentioned at the start, it would have to capture things like what you just said about the intent of the individual who's going to receive it, or what they could do with it, and that was expressly not included in the intrusion software definition or any of the exemptions in Wassenaar as it stood, or in the transposing legislation in the EU. I mean, it's also not always obvious on its face what a piece of software actually does. If you've ever looked at the Underhanded C Contest, there are a lot of examples of software that appears to do one thing but actually does something else. I don't think we've seen this as an export controls dodge yet, but it's certainly possible. And just to pile on, on the crypto front: it has long been EFF's position that crypto should not be export regulated at all. In the United States there's very little regulation on publicly available crypto, so for open source crypto all you have to do is notify the NSA that you're putting it online; you don't actually have to ask for permission. Most people don't even do that, and it's not really enforced, as far as I understand.
And following on from Meredith's point, there's a very interesting competition that's held every year, and I think it's based on a Linux vulnerability that was found a few years ago, where a one-character change was inserted that, to any cursory reading of the code, might indicate that things were hunky-dory, but in reality what it allowed was basically anybody to gain root on the box, and it was all there in the open source. There's a competition each year to find new and innovative ways to hide, in plain sight essentially, issues in software that a cursory analysis might not actually reveal. We have several people wanting to ask questions over there. Okay, so first off, thank you so much for this panel, it's been very informative. I have two questions that are really tightly related. The first one is coming from a basic knowledge-of-policy standpoint: it seems that if you have an agreement that was presented, then there was probably a chain of events, like meetings and so on, that led to the formation of that agreement, and I'm curious: is it possible to go through the meeting notes to see who introduced these terms and what the context of the discussions was? The second question that I have: how have these kinds of agreements actually been enforced in the past? Because if this was introduced in 2009, I assume that there were things before then, because cryptography is not a new thing. I think the first question is best asked to Richie, and for the second one I'd like to refer to Nate. I also want to note that when asking questions, we are very thankful for your gratitude, but let's stick to the questions.
So, I guess, the first time, and look, obviously I may not be aware of the entire picture, but the first time Privacy International became aware of the text was just days before it was actually completed. So to the best of our knowledge there wasn't a consultation process, and it's almost a problem with the actual drafting process that people were involved not only too late, but that the potential consequences were huge, and they should have been involved at a very, very early stage, so we didn't have to get to the point that we're at now, where the implications are very real. I think most people will be aware of HP pulling out of a bug bounty contest during the year, or maybe it was Microsoft, I can't remember which company it was, simply because there were question marks over whether they could actually have people showing up and presenting bugs and zero-days and things like that, and what legal regime applied, did they need to get an export control license for every country, and things like that. So there is tangible evidence that this stuff is stopping things, but as far as I'm aware, there was no consultation outside of the governments who were drafting this. And so, as to your second question, how are these sorts of agreements enforced? I can only answer, I'm a U.S.
lawyer, so I can only answer in United States terms: they're enforced by the Department of Commerce, and they're enforced very rarely and very selectively. So that would be a big problem in the EU, because selective enforcement is, like, something you're not allowed to do. In Belgium? Yeah, Belgium being Belgium. But that selective enforcement gives rise to a chilling effect: the mere possibility that it could be enforced is often sufficient to trigger the negative consequences that everybody here in the audience, I think, is aware of. You want people to engage actively in improving things, improving the software that we use, and that necessitates, in the modern age, the ability for people to collaborate across borders. And so I'm not sure which is worse: the fact that it could be enforced, that people could go to jail, which obviously is a disgrace, or the chilling effect on people actually engaging in this research in the first place. You don't want to be on the receiving end of enforcement. I've been specifically told to take questions from the internet. Yes, thank you, I have a bunch of questions regarding open source again. In detail: what is the expected impact on projects like Metasploit, which is partly open source, and Kali Linux? So, Metasploit: the lawyers over at HackerOne, no, not at HackerOne, at Rapid7, have determined that there will be no impact on the open source Metasploit project and modules. Metasploit Pro, however, is subject to the license regime. So open source Metasploit: just fine, keep exporting, keep using, keep posting. Metasploit Pro is going to become a lot more expensive. Just another question from your backlog of internet questions? Yes, there's one question: what is the problem, in detail? Is it that exploits would be more open, or is there also a financial aspect to this? Individuals would be criminally liable for having exploits, basically. Oh, having exploits and crossing borders, or talking about exploits to foreign nationals.
So, hi, I'm an academic from the United Kingdom, and I was wondering if you could comment on the possibilities of this affecting research, particularly thinking of my own research, which would be classified under the very broad definition of surveillance. And the follow-up question is: what about teaching? How am I expected to teach an international group of students at my university without an export license? I'm afraid you do need an export license if you've got international students in there, because when you impart it, it would be deemed an intangible technology transfer, in the sense that the transfer actually occurs in the heads of your students: when you've delivered that lecture and they go on to their countries, that would be an intangible technology transfer. It's really not clear; they're not drawing clear lines about what kind of research is affected. Because politicians have been telling us to our faces, oh, researchers don't have anything to worry about, but they don't tell us why. And when I ask, okay, well, why, for what reason do I not need to worry about intangible knowledge transfer, I just sort of get patted on the head and they move on to the next question. Nobody really seems willing to clarify how this is actually going to be brought down on research, with the exception of Australia; Australia apparently seems perfectly happy to just shut off all crypto teaching in the entire country, but, you know, that's Australia. And we've already seen, I'm now forgetting the gentleman's name, but we've already seen one doctoral dissertation heavily redacted by his committee in the UK because of Wassenaar, and that's fucked up. And incidents like that are going to produce continued chilling effects; fewer people will get into research, because who wants to end up in a situation like that poor guy in the US who did a geography PhD on maps of critical infrastructure.
The Department of Homeland Security suppressed his thesis, and if you don't get to graduate, that kind of wastes all the time you spent on a PhD. So that's fewer people coming into the field, and people who are already in the field are going to have a lot of fear and worry: am I about to get busted for trying to publish a paper? And I'm also not sure what security research actually is, and where security research stops and ends. Are pen testers security researchers? People who go out and just tinker with software, say, reverse engineer bits of code simply because that's what they like doing, who are not affiliated with, say, an official university: are they covered? And so many of these proposed protections or exemptions for things like academic material, it just doesn't make any sense to me what, indeed, the definition of a security researcher is, and whether it includes pen testers and their collaboration across borders to ensure that all their tools are as up to date as possible. And I'm in exactly that boat: I'm not affiliated with a university, and I do not do security for my day job; I'm a bog-standard programmer at Nuance. All of the security stuff I do, I do in my copious free time, so I wouldn't even be protected by protections that only apply to academics. Well, luckily, in the U.S. you don't have to worry about that, because there is no security. But maybe a very short follow-up question here: is there any chance to get out of this by claiming that it's freedom of speech, for the academic area? So, I can give you an answer from the European perspective, and Nate can probably respond from the U.S.
perspective. As a kind of groundwork: for those poor European countries which don't even have a constitution, like the United Kingdom, and other kinds of backwards monarchies like the Netherlands and all that, the European Convention on Human Rights is actually the real constitution, and Article 10 says that, well, I'm paraphrasing, not quoting, there shall be no prior restraint on freedom of expression unless by law and proportionate to the public interest for which that law is brought into being. So there's no evident cut-off there for why you could or could not invoke it. We already do have the Cybercrime Convention, which has equally chilling, well, not equally, but also chilling effects on security issues, and it has never been challenged in that court, as far as I'm aware. But it's a very good question, basically: can we fix this? After Nate covers the First Amendment issues, I'd like to move on to that bit. Thank you. I'll just follow on with: freedom of expression, at least, as you said, in the UK and in the EU, is what's called a qualified right, like privacy is, and the qualification is based on law. Many times freedom of expression can be compelled to yield to the privacy rights of individuals; for example, would my doctor be able to engage their freedom of expression rights to breach confidentiality and publish information, and should I be able to prevent that from happening? So not only is there a balancing act between freedom of expression and the general legal regime in a country; freedom of expression and privacy very often collide in the real world. From a US legal perspective, if this proposed implementation went into effect, I would love to take the case that would bring it down. One of EFF's first major cases was Bernstein, which you probably know better as DJB versus the United States Department of Justice, and we got code declared as speech, which
was good, and we got cryptography pulled out of the United States Munitions List and into a much lower level of export control. So I think that, at least under US law, at least for open source software, even if it wasn't publicly available, even if it wasn't public domain, you would have a very, very good argument that the regulations would fail under the First Amendment. And then there's another thing under the United States Constitution, which is: if a criminal law, and all of these export controls are criminal, is not readily understandable by a person of average education and intelligence, then it fails; the doctrine is called void for vagueness. So I would say that this would fail as void for vagueness. I'm not sure about this, far from it. And the Church of the Weird Machines is an actual church in the United States, so then we get freedom of religion into the mix. Amen, brother. Oh, please do. So I must emphasize that I'm not aware of any European jurisdiction that has such a lovely concept as void for vagueness. Yes, definitely; no, I don't think any European country has such an exception. Basically, if you don't understand the law, it's still your problem. I blame France. Well, yeah, let's not blame the French for everything. I think there are some jurisdictions which have the phrase that ignorance of the law is no excuse, and I think that unfortunately might be operative in many of those situations. But since you're speaking: how fixable is this? Well, so, if we look at some of the objectives that were set out at the start, in terms of the very well-intentioned, you know, there was a very bad problem there to be fixed, there are a number of potential shortcomings in Wassenaar itself which I think we need to look at addressing, and potentially saying: well, if a tool or a mechanism that we're looking at isn't able to capture the kind of nuances of this problem, then we
may need to look at a different mechanism. Some of our open questions are that Wassenaar obviously looks at everything on the basis of the ones and zeros and not on the intention. As we said before, in court on DJB's behalf, it was such a catch-22 for the Department of Commerce that they eventually backed down completely. Sorry, but going to court is something you do after legislation is passed that is clearly unconstitutional or otherwise contravening fundamental rights. Both in the US and in Europe we are in the position that the pretty vague criteria of Wassenaar have to be transposed into more concrete legislation, and that process is still not finished. Especially on the European side, the only tangible thing we have is a non-legislative report by Marietje Schaake, an MEP, and while that non-legislative report has not been ideal, we managed to get in amendments that specifically acknowledge that this has potential for chilling effects on research. Although we inserted the words "bona fide research", which I regret in hindsight because it's probably not helpful either, at least there is potential, both in a European implementation of this bit of Wassenaar and in a possible revision within Wassenaar itself in the near future, to make this less bad. But I'm not sure how that will survive the process. On that: the most frustrating thing about all of this to me is that we keep finding ourselves at daggers drawn with people who are supposed to be our allies. We're supposed to be on the same side as the anti-surveillance people, but what's happened is that well-meaning anti-surveillance people who don't understand the technical landscape propose what sounds like a good idea to them without ever actually asking technical people. Then the NSA and GCHQ get involved and push their agenda, and the technical
language ends up constraining the very people the anti-surveillance people are supposed to be allies with. And then it ends up being our problem to massage people's egos and find them a way to save face when they're backing down, and I'm sorry, I'm not very good at that personally. Katie Moussouris did a fantastic job of it a couple of months ago, but we're kind of an abrasive tribe; being nice to people is not really our job. Well, maybe it is, but still, it's extremely frustrating to be put into this situation that we didn't ask to be put into in the first place. So if we can also get into how to avoid this happening next time, that would be real nice too. But we also didn't learn the mistakes from the crypto wars: the problems that happened back in the 90s seem to be repeating themselves, and the lessons that were learned then seem to cyclically go out the window. Unfortunately, I just wonder what they'll do next in ten years' time. Oh, sorry to interrupt at that point. First, a question from the internet. Thank you, Harald. The internet wants to know: are defensive technologies potentially going to be restricted by this arrangement? Yeah, name one defensive technology that is not simultaneously an attack technology. Or, in other words, and this is a quote, "will we end up with shitty everyday security because these things are not exported?" Yes, yes, that will happen if this gets implemented the way it's been written. Part of the problem is that the Wassenaar Arrangement definitions risk throwing away the solution to the problem they are trying to solve. Harden the endpoint, harden the server, harden the pipes: the tools to do all those things potentially get screwed by Wassenaar. We want to get secure devices and services into the hands of the folks who need them, and banning pen-testing tools is not the way to do
that. Number one, please. One of the insidious things about the ITAR regulations, the arms-trafficking regulations in the US, is that you are also liable for whatever end users use your exports for. Is there something similar in the implementation of the Wassenaar Arrangement? Is it just on the US Munitions List, or is it something different? It is not on the US Munitions List; it's on the EAR, the Export Administration Regulations, rather than ITAR. So you're not liable for end use or end user; you're only liable for the export, at least in the US. I think one of the problems we're seeing is that traditionally the kinds of technologies the regulators have been used to dealing with have been either solely military goods or so-called dual-use goods, and one of the things this falls into is that there's actually another use which is highly beneficial to the overall security of our systems. As the question from the internet pointed out, that is one of the consequences. So it's not necessarily clear that these things should be on something like a dual-use list; maybe we need a recognition that the positives of these things drastically outweigh the negatives, and that by not having the offensive tools with which to create the defensive, protective measures, we potentially leave ourselves open to something much cheaper: Gh0st RAT, or whatever RAT you want to buy on AlphaBay or whatever the new marketplaces are. Another question from the internet, please. Thank you. Is this Wassenaar Arrangement draft openly available, and where do you get it? It's not a draft; it's just at wassenaar.org. Yeah, it's not a draft, it's a published document on that site; it's final, not a draft. Number four, please. So I think the high-level point I'm hearing here is what we learned in the 1990s: the Wassenaar Arrangement, with its weird
reserved-word definition of "public domain" (it's in scare quotes) and its weird framework, has been an awful fit for crypto. Back then we sort of got ourselves out of it, that's great; now it's coming back. It's an awful fit for the sort of possibly intrusive software we're talking about here, but actually it's possibly a little bit closer in its original intent to this software than it was to crypto. After all, its original intent is to keep that heavy truck out of use as a troop transport; it is to keep this dual-use tool out of another army's hands. And what we're talking about here is keeping this overtly dual-use attack technology out of the hands of an oppressive government that wishes to use it against its citizens, while maybe keeping it in the hands of researchers. So that framework sounds attractive, but we're learning in this conversation that it probably has a lot of secondary effects that are unpleasant. I would like to hear from the panelists what your view is as to what a reasonable regulatory environment around these offensive technologies looks like. I heard Nate say that for crypto it is: just take it out of export control, make it freely available, it is so beneficial. Are we saying the same thing about offensive technologies? I will give the mic to Meredith for that, because she has published a paper on this together with Sergey Bratus. Well, honestly, my perspective has changed over the course of this conversation, because what Sergey and I have been writing about is how to change the language to something saner. One thing that's come up over and over again in discussions of the current language is that the language about execution paths is essentially meaningless. Nobody really agrees on what it means, and even if they did agree on what it meant, it wouldn't actually help. There's a wonderful paper from USENIX Security earlier this year called "Control-Flow Bending", and they take a look at
control-flow integrity systems. These are basically systems that try to whitelist which paths through the control-flow graph of a program are considered legitimate; you only allow those, and anything else is considered an exploit. It turns out you can actually have all kinds of memory-corruption fun and get arbitrary computation with printf; you can get an exploit on even theoretically perfect control-flow integrity. So what they've described doesn't even make sense and doesn't help. But I think what you said earlier about the provision of services is the far more important point. The government of some random third-world nation is not going to get a lot of use out of "here's FinFisher, have fun"; they need that support contract. Regulate that, or impose a strong liability regime. I'm lead counsel at EFF in our case where we're suing the government of Ethiopia for using FinFisher on a democracy activist in the United States. We were lucky enough to get a client where we were able to catch Ethiopia using FinFisher red-handed within the United States, which gives us jurisdiction to sue them. PI is pursuing a case against Gamma in the UK for the same thing, and EFF was involved in cases against Cisco for helping build the Great Firewall and against IBM for building South Africa's apartheid identification card system. That kind of liability on the back end, for doing the stuff that we don't want companies doing, would I think possibly be much more successful, and would have chilling effects on the bad stuff without touching the security research. And I think that's the really big ask, because good luck getting any government to be willing to be sued for anything. Well, I think that one of the points in
relation to the question is that, to someone with a hammer, everything is a nail, and Wassenaar is the tool that was there. But as has been highlighted just now, there are potentially other alternatives that may or may not be better, and whether the language can be fixed within Wassenaar to take account of the various different situations, or whether other mechanisms are needed, I think has to be explored. And as someone with a law degree, to me everything looks like a lawsuit, so I'm like: sue them. Question number four. I just noticed something in the new South African cybersecurity and cybercrime legislation; I'm not sure if it's influenced by Wassenaar or just by the current environment. Basically, in South Africa now, according to the cybercrime bill, malware would include any electronic or mechanical instrument or device that could create a vulnerability, or modify, impair or interfere with the ordinary functioning of a device, computer or network. So it occurred to me that if I modded my phone, for example, there would be malware on my phone, along with a whole lot of other things. And secondly, it also got me thinking: how does one actually distinguish between possessing malware and, on the other side, the people who are owned, who have been infected? Could that not be a catch-22 that might be useful in certain judicial contexts? I mean, an exploit is still an exploit that's been taken across the border, whether it's in the hands of a researcher, an attacker, or possibly even on an infected computer. We already have this problem with the Cybercrime Convention, and in practice it has basically not really happened, so we have no idea. The Cybercrime Convention is an entirely different framework from Wassenaar, which again most industrialized nations have signed, and in most implementations it has become a crime to possess a piece of malware. Even if you're just a victim,
you're technically a criminal. I'm not aware of any public prosecutor stupid enough to pursue such a case, but I would love to hear about it right away. South Africa is a signatory to Wassenaar, though; I just looked it up. And I think under US law there's always an intent requirement for virtually every crime except possession of child porn, so if you're a victim of malware and you happen to cross a border, you don't have the intent to export, so no crime. I think some of the other tools that might fall into this would be things like fuzzers and IDA Pro, with which you can manipulate various different bits and bobs or decompile a piece of software. But Nate, I think, mentioned at the start that, strictly speaking (or at least, that's probably not fair), the intent of Wassenaar was not to control anything which could end up on an individual's device, precisely for that reason: that an innocent victim could cross the border and be found with it. Now, experts have analyzed the actual text and found various different ways in which code or executables on a victim's device could potentially be caught under Wassenaar. Yeah, thank you. Oh, here's a wacky thing: under the US implementation, the code is arguably not subject to control, but the comments to the code definitely are subject to control. I think we're kind of running out of questions from the room. Number five, oh sorry, number six. You mentioned that Hacking Team very easily got a license, so why would it be very hard for all of us to get a license? Well, I think Hacking Team had a special relationship with the Italian regulator that allowed them to get that license, and I don't think people who want to play with software and figure out the problems with it should have to register with their government in order to get a license. They probably should be able to conduct the research themselves,
because then you're running into the problem of: well, suppose somebody in Canada gets their hands on FinFisher or Hacking Team, and then they have to go and register for a license with Canada, who are potentially customers of FinFisher and Hacking Team, to say, "hey, I'm doing reverse engineering or malware analysis on the very tools that they have bought." I wonder whether that information stays within the authorization department, or whether it may potentially get passed to the companies involved in the sale of the material, so that they get tipped off that, hey, somebody's got a sample of your latest malware version, and you might want to think about changing it. One thing that could be done to point out the absurdity of particularly restrictive licensing requirements could be the equivalent of a work-to-rule strike: DDoS the licensing agency with requests every single time you install OpenSSL. So that's one possibility. And on that note, thank you all for your attention, and enjoy!