We have an hour and 35 minutes. We're not going to talk all that time; this is going to be a roundtable discussion, so we'll open it up to the group after a presentation where we go through some of the basics. I'm sorry, your head is casting a shadow, sorry about that. I'm sorry also that we're not able to dim the lights just for this individual room, but we're going to make it work. This is Hacker Law for Hackers. I want to let you all know this is an on-the-record room. There's Chatham House rules in the other room; this one is on the record and being streamed, just so you all know. Leave now if that's a problem, please. And let's do it. So like the description says, we're going to go over some of the changes to hacking law that have occurred, particularly in the past year or so, 2021 to 2022, and look at some of the areas that are ripe now for community advocacy in changing hacking law to help hackers. So I'm Harley Geiger, and I am Senior Director for Public Policy at Rapid7. I'm an attorney and have been working in cybersecurity and technology policy for many years. And we are really blessed to have this man right here from the Department of Justice. Yeah, I'm Leonard Bailey. I'm serving as Harley's wingman today. I am from the Computer Crime and Intellectual Property Section in the Criminal Division of the U.S. Department of Justice. I am Special Counsel for National Security and head of our Cybersecurity Unit. Doesn't that sound very impressive? It is. My mom is very proud of me. She is in fact very proud of me. Very proud of you. Yes. So, believe it or not, there are no speakers. These microphones are actually just for streaming. So when we open it up to the group, we're going to have to pass around these mics, but there is no amplification. Can you raise your hand if you're having trouble hearing me? Someone should be loud. Okay, got it. Yeah, okay, really sorry. I thought I was already projecting.
Feel free to crowd in a little bit closer if that helps. I know the acoustics are not super. All right. First, this is not legal advice. It's very important that we get that out of the way. I really don't want anybody watching this to go and do a research project and think that this covers all the nuances in the law. The law is just riddled with little exceptions and important things, and we're not going to be able to cover all of that in this presentation. So if you have questions, particularly about a specific research project, if you're worried about legal risks, then you should talk to a lawyer. Just not these lawyers; do not rely on this presentation. Okay, so start from the beginning. Why do we care about hackers? Why do we care about the way that hackers are treated under the law? We need the talent. We need the insight from people like you. It's not going to be a small cadre of experts; it's going to be the community. And from the government's perspective, it's exactly the same thing. We know this is a complicated problem. We're not going to solve it ourselves. AKA, the government also wants to hire you. So, federal law has evolved in favor of hackers. I think that's a little bit provocative. I think that for a long time, the security community had this impression that federal law was stacked against them: that you will be nailed to the wall if you violate terms of service, you'll be nailed to the wall if you are doing IoT research in your own basement under a 40-watt bulb. And I think that it's time to challenge that perspective. One of the reasons why we should challenge it is the changes that have occurred just in the past year. So 2021 and 2022 had a lot of changes, and they were almost all, at least in the United States, at least at the federal level, universally in favor of the hacking community. Are the acoustics better? Am I projecting enough now? Sort of. Yeah. I'm doing the best I can without yelling. Sorry.
We're going to cover the Computer Fraud and Abuse Act and the Digital Millennium Copyright Act. These are the two traditional bogeymen of the security community. We're also going to talk a bit about the international perspective, so China's coordinated vulnerability disclosure law, and then some about the state laws. And then, like I said, we are going to open it up to hear your perspective. I think we have a lot of talent and experience in this room, so it's going to get interactive. We're definitely not going to talk for the entire hour and a half. So first up, the Computer Fraud and Abuse Act, the CFAA. This is maybe the most feared law in the pantheon for the security research community, and we're going to start with the basics. We're going to talk about what it does. It has a long list of restrictions; I categorize them like this. First, it's a criminal and civil statute, meaning that you can be prosecuted criminally by federal prosecutors, but you can also be sued privately. So you can be sued by a company or by an individual if you have violated the Computer Fraud and Abuse Act. Here with the CFAA, we are talking almost entirely about other people's computers. It is about whether you are authorized to access or use the computer in certain ways. If you're doing it on your own computer, you're presumably authorizing yourself. It gets to be a CFAA issue if you're interacting somehow with somebody else's computer. And it's not just, you know, the laptop that's sitting here; it can be things like Internet servers, somebody else's computer over the Internet. All right? Now, the CFAA restricts, among other things, accessing a computer without authorization, so whether you're authorized to touch my computer or not. Also, causing damage without authorization. Damage can include, for example, service disruptions, DDoS, dropping malware. It can even be altering code on somebody else's computer.
That's considered damage without authorization. Then there's intent to defraud: if you have intent to defraud and you are trafficking in passwords or access codes, that's also a CFAA violation. And then extortion is another one. So if you are saying, I will damage your computer or I will access your computer without authorization unless you pay me cash money, that is a CFAA violation. Oh, we have a new projector. Most of that stuff, though, is avoidable by good faith security researchers. If you are truly conducting good faith hacking, you are not going to be demanding money while threatening damage to a computer. You're not going around causing DDoS attacks or disruptions to service. And you also don't have intent to defraud; your intent is to patch the vulnerability, to improve security. But there is another provision of the CFAA, and this one is arguably the most broad and most problematic for the security community: exceeding authorized access. That is where most of the action in the past year has occurred. Do you want to add anything before we launch into that? No, I think you got all the buttons. So, exceeding authorized access. What that means is you have access to a computer, you're authorized to access it, but you may only be authorized to use it for certain things. So I'm allowed to access this computer for work, but I'm not allowed to go bug hunting on it. Because I'm not a hacker, personally. Or I may be able to access a social network but not use my profile on the social network to go bug hunting or to conduct security research from there. That would be exceeding the authorization that I have been given to access those computers. Now, that traditionally gave a lot of power to things like terms of service or acceptable use policies or employment agreements, because those were the documents, those were the ways, that your access was defined.
So if you join a social networking service, they will have an acceptable use policy or terms of service that says here are the things you're authorized to do. Usually security research is not in that list. Same thing with employers: they do not usually authorize their employees to also go bug hunting on their computers, unless that's what you do professionally. So that gave a lot of control to those documents. So that's some of the basics of the CFAA. Now, what has changed recently? One of the big whoppers is Van Buren v. United States. That is a Supreme Court case, decided in 2021, and it kind of declawed exceeds authorized access. The facts of the case are pretty colorful. Van Buren was a police officer, and Van Buren was authorized to access a license plate database as part of his work. But Van Buren was not authorized to use that same license plate database for personal criminal purposes. And that is what Officer Van Buren did, for money. He was brought up on charges under the CFAA. And the Supreme Court, in a six-to-three decision, came down pretty clearly in favor of a narrow interpretation of the CFAA. What they said was essentially that the question is not whether you've been authorized to use a computer for one purpose and then are using it for another. Instead it is: are the gates up or down? That was their analogy. Are you authorized to access the computer, or are you not? And so for Van Buren, what that meant was that since he was authorized to use that computer for work, it is not exceeding authorized access, it is not by itself a CFAA violation, to use it for other purposes as well. It may be illegal under other statutes, just not under this really broad provision of the CFAA.
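One way to picture the Court's gates-up-or-gates-down test is as two different predicates. The toy functions below are purely illustrative (they are not legal analysis, and the function names are ours, not the Court's): under the broad pre-Van Buren reading, misusing access you already had could create liability; after Van Buren, only whether you were entitled to access at all matters.

```python
# Toy illustration (not legal advice) of the two readings of
# "exceeds authorized access" that Van Buren resolved.

def exceeds_access_broad(has_access: bool, purpose_allowed: bool) -> bool:
    """Broad reading: using authorized access for a forbidden purpose counts."""
    return (not has_access) or (not purpose_allowed)

def exceeds_access_van_buren(has_access: bool, purpose_allowed: bool) -> bool:
    """Gates-up-or-down: only whether the gate (access itself) was open matters."""
    return not has_access

# Officer Van Buren: had work access to the database, improper purpose.
print(exceeds_access_broad(True, False))      # True: liable under the broad reading
print(exceeds_access_van_buren(True, False))  # False: no violation of this provision
```

Under both readings, someone with no access at all (gates down) still exceeds authorized access, which is why the decision helps researchers using systems they are allowed to touch, not intruders.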
This helps not only security researchers but also a lot of ordinary consumers who use the internet and should not be under threat of a federal hacking crime simply for doing things like violating terms of service, lying about your age on Facebook, for example. So that was a big deal. It also applies in part to publicly available computers. If a computer or an asset is on the internet and it is publicly accessible, then you arguably have authorization to access that publicly accessible computer. And if you have access to it for one authorized purpose, under Van Buren it is no longer an exceeds-authorized-access crime under the CFAA to also use it for another purpose. That theory about publicly available computers was put to the test in another case in 2022: hiQ v. LinkedIn. hiQ was scraping publicly accessible content from LinkedIn. LinkedIn went so far as to send a cease and desist letter to hiQ saying, don't do this any longer. And the Ninth Circuit, which has traditionally looked at the CFAA in narrow ways, said that LinkedIn could not bring hiQ up on CFAA charges (in this case it would be a lawsuit) for scraping its publicly accessible content. It was publicly accessible. So it put Van Buren to the test on this issue of publicly accessible computers. I want to give an example of where this may hit the road with an actual case, actual hacking. Did you want to add to that? Did I describe them correctly? You certainly did. I guess there are a couple things I'd toss out. One of the things I think we've learned from watching courts attempt to interpret technology generally is that they're not great at it. And what occurs is, you know, in my office we have these Talmudic discussions about whether something in the cyber world is like something else in the physical world, right? And that's what happens in these cases.
And so the Supreme Court, for example, in Van Buren spoke in the language of gates up or gates down. If the gates are down, the restrictions apply. If the gates are up, there was no lack of authorization. Exactly what a gate is may not be entirely clear. There is a part of the opinion where they make it clear that if, for example, there are files or directories that are, quote, off limits to you, then the CFAA applies. But what exactly off limits means is also something that's going to take some interpretive work in the courts. And so it's absolutely right, what Harley said, that this clearly has reined back the reach of the CFAA. There will be more work to be done, though, to figure out exactly what it means. There are things that it didn't address, things that were outside the ambit of the decision. For example, what is accessing a computer under the CFAA, right? Which I think is an important question for people in this community. When you're interacting with a computer, when does that cross over into being access that might be actionable under the statute? This case wasn't intended to address that, and so that's still out there to be resolved. All that said, everything Harley covered was exactly spot on. Okay. Does anybody remember what this horrible human being did? Yes? Good. That's right. So this horrible human being found the email addresses of AT&T iPad users exposed on the public Internet and managed to scrape a great deal of them. He then disclosed it to Gawker, I think, and was brought up on CFAA charges. And I raise this as an illustration that this type of prosecution was already difficult because there was a circuit split at the time. Van Buren resolved that split, and this kind of prosecution will be a lot harder going forward. It was publicly available information, and now there is not a circuit split.
So, not to say that there would not be any potential liability, but it would be a lot harder to bring a case like this now than it was previously. I just want to say one more thing about this case. I think it was United States v. Spitler. Harley hit on this earlier: there was a perception that security researchers were being prosecuted far and wide. And when we first began engaging with this community in 2014 or so, we took that back and we actually thought about it, because we were concerned. We wanted to make sure we weren't chilling research. And what we found was that in the last decade, this is the only case the federal government has brought against a security researcher for security research. Okay? There may have been other cases brought by state authorities, but this is the sole federal case in the last decade against a security researcher. That may surprise some of you. And just so you know, you don't have to take our word for it: the Center for Democracy and Technology also took a look at this, and in 2017 they found the exact same thing. That said, and we'll talk about this later, there are reasons why this community associates the CFAA with, let's say, aggressive restrictions. We'll get to that later, but I'll give you a spoiler alert: it has to do with the civil portion of the statute. So that is one change: that series of cases, Van Buren and hiQ. It's a big deal. It resolves the split that was in the circuit courts for at least 10 years, I think. Now, limitations. That change addresses a different part of the statute than causing damage or intent to defraud. So if you cause damage, or if you are out there trying to defraud people, this change is not actually going to help you. This is about exceeding your authorized access. And it's federal, so it applies just to the federal law, just the CFAA; it's different from the states, and we're going to cover the states in a bit. But it's still a big deal.
On top of that news, in 2022, in fact in the past four months or so, the Department of Justice changed its charging policy for the CFAA. And here is a look at that. Don't worry about reading through the block of text too carefully; we can parse this language a little later. But this is public. And who better to talk about it? In May of this year, we announced this policy. It's maybe the capstone of a number of years of work with folks like Harley and others in this community, who in a very open-handed way allowed us to better understand how the community works and to get more comfortable with the idea that there may be some allowances for actors in this community, again, in the interest of improving cybersecurity. So in May of this year, we announced this policy to assist the U.S. Attorneys' Offices. We have 94 districts across the country that have federal prosecutors in them who are responsible for these types of cases. We are the office at Main Justice that oversees this statute; every indictment across the country has to come through our office. And so we promulgated the policy to the U.S. Attorneys' Offices saying that they should decline a prosecution if the activity involved good faith computer security research. Meaning, and I won't delve too far into this because we'll talk about it later with the DMCA, essentially activity that involves accessing a computer solely for good faith testing, investigation, or correction of a security flaw or vulnerability. And it's done in a way that avoids harm to the public or individuals, and is primarily for the purpose of producing information to protect the class of devices, machines, or online services. So, you know, that's a lengthy way of attempting to capture what it is we understand people in this community attempt to do. And let me tell you, that wasn't easy to capture in words. It took a lot of back and forth, but we'll talk about this more.
Justice has changed its charging policy for charging crimes under the CFAA to say you should decline if the defendant is a good faith security researcher. The language that is up there, again, we'll parse through it a bit more, but it is borrowed from another part of the law where a lot of these conversations were happening, and it is now becoming fairly standard language for defining good faith security research, for better or worse; nothing is perfect. So this is, in the past year, a big change on the heels of those court cases. Now, the limitation is that this is criminal law, so it does not apply to civil suits. Remember, you can still be sued under the CFAA by private companies. And I will say that anecdotally, in speaking with researchers both within and external to Rapid7, we are seeing much more concern over private lawsuits than over things like criminal prosecution. It is a hard thing to measure, because a lot of the private lawsuits come in the form of a threat, like a cease and desist letter. It never reaches the stage where it actually becomes a lawsuit; a lot of times the researcher will back down, or they'll work it out with the company. But it is still a major problem for researchers. So the charging policy does not apply to that civil suit aspect. Do we want to say anything else about the CFAA? I guess there's only one thing I should say, just in the interest of candor. I said there have been no prosecutions of security researchers. There may be a question about, well, have they been investigated?
And there I'd have to concede there have been investigations, but you have to cut us a little slack there, and that is because if you're, let's say, penetration testing a system that you don't have an agreement with, what you are necessarily doing is something that looks identical to someone who is actually trying to break into a system. And so there's no way of telling what's going on until and unless there's a little bit of investigative activity. And so there still could be a knock on the door asking, what are you doing? The point is that has not progressed, and is not progressing, to prosecution. And, you know, we also are better at understanding the difference between certain types of activity that are actually a precursor to actual intrusions versus others. Like, for example, mere scanning. Mere port scanning is not going to end up with someone from the government, I should say, knocking on your door asking questions. That wasn't necessarily the case going back a decade ago, but I think that is something that has changed. So I just wanted to be clear about that one point. And so, just to recap the changes: those major court cases that are curbing in a big way the most problematic part of the CFAA, and then the DOJ's charging policy change, all since 2021. This is the next big important one. This is Section 1201 of the DMCA, the Digital Millennium Copyright Act. Here is a law that, if it were proposed today, would not even get a hearing in Congress. It is archaic and clunky and it wastes everybody's time. And arguably it is broader than the Computer Fraud and Abuse Act. Because remember, I mentioned that the Computer Fraud and Abuse Act largely applies to other people's computers. Section 1201 of the DMCA applies to other people's computers and your computer. So it can apply to you when you're doing IoT research in your basement, you know, under a 20-watt bulb.
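For readers unfamiliar with the "mere port scanning" mentioned above, it is nothing more than asking hosts whether a TCP port accepts a connection. A minimal sketch, to be run only against machines you control (the helper name `open_tcp_ports` is ours, not from any tool the panel mentioned):

```python
# Minimal TCP connect scan: report which ports accept a connection.
# Run this only against hosts you own or have permission to test.
import socket

def open_tcp_ports(host: str, ports: range, timeout: float = 0.2) -> list[int]:
    """Return the subset of `ports` on `host` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if s.connect_ex((host, port)) == 0:
                found.append(port)
    return found

if __name__ == "__main__":
    # Scan a small range on your own machine.
    print(open_tcp_ports("127.0.0.1", range(8000, 8010)))
```

Note how little this reveals by itself, which is the panel's point: a connect attempt looks the same whether the motive is inventory, research, or intrusion.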
But it does apply to you. What it does... there we go. What it does is it restricts, here comes the lawyer speak, circumventing a technological protection measure that controls access to a protected work, a copyrighted work. So what does that mean for hackers? Circumventing a technological protection measure on a copyrighted work is what you do all the time, with software. A technological protection measure can be something like encryption or a login, anything that is protecting access to software. And just because you license software does not mean that you have authorization from the copyright holder to conduct that research. And it can be software that is on your own device; it doesn't have to be on somebody else's device. So arguably this is broader. And a lot of times when I've seen cease and desist letters that are sent out to researchers, it's both: there's a citation to the CFAA with the DMCA thrown in there. So, I think the best thing about this law, the best you could say for it, is that it gives the Copyright Office, or technically the Librarian of Congress, but we'll just say the Copyright Office, the ability every three years to make an exception to this rule, this prohibition on circumventing TPMs. And 2015 was actually the first time that there was a security research exception to Section 1201 that did not rely on the authorization of the copyright holder. There were a lot of folks doing unsung work in the community to make that happen: Andrea Matwyshyn, the Center for Democracy and Technology, EFF. You don't hear their names proclaimed from the rooftops, but this process was where a lot of the cutting edge conversations were happening for security researcher protections. And it evolves every three years. So, 2015, 2018, and in 2021, I think what we finally have is decent protection that you can hang your hat on for the vast majority of research. And we're going to look at the language. So, good faith research.
Sorry if this is not a sharper image for you. Accessing a computer program solely for purposes of good faith testing, investigation, or correction of a security flaw or vulnerability. So, that's the research part. And then the rest are caveats on doing it responsibly: the research is carried out in an environment designed to avoid harm to the public; the information derived from it is used to promote the security of the devices that you are researching; and, they had to throw this in because remember, this is a copyright-based statute, you're not doing it in a manner that facilitates copyright infringement. Now, I don't know if you caught a lot of the language in the Department of Justice's charging policy. It's identical. The Department of Justice borrowed from this exception that came out of the Copyright Office. This is, in fact, the government's best stab at articulating what you all do when you're doing security research. And I know that there are a lot of different terms, like white hat hacker, ethical hacker. They went with good faith security research, I think because good faith carries a lot of legal precedent. And now we are actually seeing that term come up in a lot of other places that we're not even going to cover here. So, for whatever it's worth, of white hat hacker, ethical hacker, and good faith security research, the last one is the description that has momentum in a lot of policy circles. Oh, please. How many of you are lawyers? Okay, a good number. So, this one statute amuses me, in part because there's this triennial process, right, whereby the Librarian of Congress looks at proposed exemptions and decides whether to grant them. And, like, every three years; they sat around and said, two? No. Four? No. Three, for some reason. So, every three years, there's a chance to alter the law.
Back in, I believe it was 2018, we became aware that this was up for consideration, and we decided we were going to weigh in. And so we did something we hadn't done before, which was issue a formal letter to the Library of Congress in support of the security research exception, because looking at it, we felt it made sense. And then in 2021, it was up again, because some of the restrictions on security research that were in the 2015 exemption were considered maybe too narrow, and we looked at that again and found that we could actually agree with expanding it in various ways. And because my office, the Computer Crime and Intellectual Property Section, actually enforces the criminal DMCA provisions, the Library of Congress was pretty amenable to our position. So, just to give you a little more history on how this came to be. I want to put it a little more bluntly. The Department of Justice stood up for researchers on two separate occasions in this process over a period of years. They put their money where their mouth is and got involved with another agency, saying, do better to protect researchers. That is really substantive coming from a law enforcement agency. When they say that they want to protect researchers, it's behind the scenes, it's kind of inside baseball, but I was there. I was one of the people pushing the Copyright Office to have better protections, and the Department of Justice's letters had a big effect on the Copyright Office actually improving these protections. I'm sorry, I have to jump in here just so this is entirely a kumbaya moment. One of the things that did happen over these years is that people in this community were willing to sit down with us and have good, candid conversations about the way in which research is conducted.
And so I think this is really a good example of collaboration between what are many times viewed as antagonistic parties, but are actually two parties working towards a common goal, which is better security and protection of people's information and assets. So this was released in 2021, I think late 2021. But consider the timeline, right? It started before 2015. So this was a period of like seven, eight years just working with the Copyright Office alone on this, and I think EFF had started even several years before that. I mean, talk about the way that policy moves slowly. One of the reasons for that is... oh, we managed to do it. That's terrific. Awesome. Thank you. Yeah, so I'll tell you a war story. In 2018 we heard from trade associations and other interests who wanted copyright to be better protected and thought that should be prioritized over security research. And they said things like: if we don't have restrictions on security research, we're going to have unfettered hacking of elections. We're going to have people hacking planes. We're going to have people hacking cars on highways. They'll be hacking cars in such a way that they're violating pollution laws. To which our response had always been: well, there are other laws that take care of that. We don't have to rely on this one to do it. But it took, you know, at least four years to overcome that argument. And now the language that you see here does not have that same caveat in it. So this is an example of research. Why would this not fit the language that we just saw? Yes. That's right. That's right. So this was on a highway. But this particular example was brought up in the Section 1201 proceedings as both a for and an against. We were saying, well, look, the research actually did result in these vehicles being recalled for safety purposes.
This was valuable to society. On the other side, we heard people saying, no, this is terrible. Look at what these folks can do. They can stop it on a highway and put everybody at risk. So yes, this is one of those caveats. This is why this specific example made a lot of waves in policy land, and it's one of the reasons for that language there. Yes. So it would be authorized access to the device, right, to the computer, maybe, but not to the software on the computer. Because the software is copyrightable, and that's the way Section 1201 works: you have to get the authorization of the copyright holder. And in this case, they did not have the authorization of the copyright holder even if they had the authorization of the person who owned the Jeep. All right? Yes. I think it probably depends on the specifics. Yep. So, a big limitation on that. So now, again, just in the past year, you've got much better protections under Section 1201 of the DMCA for good faith security research. But there is a limitation: that protection does not apply, and the Copyright Office is not authorized by law to make it apply, to trafficking in the tools. So if you are making the tools available publicly, even not for profit, put it out on Twitter, then that may violate a different part of Section 1201 of the DMCA, one that does not fall under the protections that we just discussed. So the act of research is protected, but the disclosure of the tools and the techniques may not be. Yes. Yes, there will be time for questions, but we're going to be loose with it. If you're asking whether a civil lawsuit will look at the DOJ's charging policy change: it may be persuasive, but it's not going to control. And it's a different standard of proof, right? In a civil proceeding it will be a preponderance of the evidence, versus beyond a reasonable doubt in a criminal case. So a different standard would also be used.
That's what I would say. It starts to look like extortion, and I think that's not best practice for research under any circumstances. And I would just say it probably depends on the specifics, but you definitely don't want to give the impression that you have exploits that you might use if they do not pay you. That is something that comes up both under Section 1201 and under the CFAA. Let me repeat the question for the streaming, and correct me if I mischaracterize it. So, for example, if you're doing research on the dark web, you come across what is apparently stolen data, and you access it: are you violating the CFAA? And the simple answer to that, under those facts, is no. I'm glad to say that one thing I can recommend to you is that the Cybersecurity Unit that I head has produced various white papers on issues that people in the community and in the industry have flagged for us as legal issues they're interested in seeing some guidance on. The one we released in February of 2020 is on intelligence gathering on the dark web, or purchasing your information back on the dark web. If you do a web search for Cybersecurity Unit, DOJ, there will be a landing page, and on there are links to various white papers. One of them actually covers, and I think is rather instructive on, this issue. Yeah, no, I guess in lawyer speak, you wouldn't actually be accessing a computer without authorization, right? The data is out somewhere else. Right. You have not accessed the computer from which it was stolen; it's just data that someone has posted somewhere, and you're accessing their computer with authorization, since it's posted on the web. That would not be a CFAA violation by itself. But that's just the CFAA. There are other laws that can apply, like receipt of stolen property and some privacy laws, so it depends on the situation. Just be aware that there are more laws than the CFAA that may apply to that situation.
And on that score I would recommend the paper I flagged, because we get into some of the other issues in it as well. The simple answer is the facts will matter, right? So, for example, the case that we talked about before: he had accessed this AT&T server, which was configured to populate certain fields when you accessed it if you had an AT&T iPad. He identified what some would argue is a security vulnerability that shouldn't have existed. He then wrote a Perl script that allowed him to do that 116,000 times, which did move it out of what today we would consider good faith security research. Now, if you're saying you accessed this, you saw this, and then perhaps you reported it, that would help in terms of what we're thinking. You should be in the mindset, though, of thinking how will someone perceive my actions, not my intent but my actions, because it's very difficult to discern what your motives are. So if someone, for example, identifies your IP address, coming from wherever you are, as accessing this particular server, and there's nothing else going on, again you may get a knock at the door, there may be some questions. You should do everything you can, though, to make it clear what it is you're doing. Again, the same paper I talked about before goes into this a bit. That is, be sensitive: for example, you work for a company and there is a work plan that demonstrates that this is the manner in which you collect cyber threat intelligence information. That would be helpful, not necessarily dispositive, but helpful. That's the advice I can give you here: put yourself in the mindset of, without knowing your intent, what do my actions look like, and how can I demonstrate that my actions are not those of someone who would actually do this for malicious purposes.
Alright, thank you. I'm going to try to get us back on track. If we have questions that are not directly related to what we're talking about, we should hold those to the end. We do have time for a roundtable, so we will have time to get to the questions, but let's try to keep it to what we're talking about now, just for the flow of conversation, so that folks get the information. We were talking about section 1201 of the DMCA, and we said that this good faith security research exemption, which came out in 2021, provides good protection but does not cover trafficking. So trafficking in the tools, the exploits, the techniques. Trafficking sounds like it might be more complicated than it is, but it could just be public disclosure of those things, and this was the basis of the Apple v. Corellium lawsuit. That was partially settled in 2021, and we don't have the details of the settlement, but the claim originally was that Corellium was violating that trafficking portion of the statute. One of the reasons that it was settled, in fact, was pressure from the cyber security community, which put out a statement saying we should not be suppressing the use of cyber security tools. We don't want it to be the case that public disclosure of techniques and tools is outlawed under section 1201. It is a bad cyber security situation to say everybody must develop their own, because those tools will not be as good and able to actually improve cyber security. Yes, one of the things that was kind of eye opening about the lawsuit was that there did not seem to be an exception for open source, and if the techniques and tools can circumvent a technological protection measure to a copyrighted work, which a lot of open source tools do, then yes, it could violate that.
This is not a part of the statute that we see come up very often, and I think that it was under the radar quite a bit. It is also one where we don't have that opportunity to go to the Copyright Office every three years for an exemption; that has to be something that is done by Congress, and Congress does what Congress does. So we talked about the CFAA and we talked about the DMCA, and I want to look at the states as well. So wherever you reside, if you look up your state and then computer crime law, it will be worthwhile to take a look at those statutes. Many of them are very similar to the Computer Fraud and Abuse Act. They use similar language, access without authorization; some of them say exceeds authorized access, just like the CFAA does. What is different, one of the things that is different, is that evolution that we just discussed for the CFAA. There is no Van Buren counterpart for state computer crime laws, and the DOJ charging policy does not apply to the state computer crime laws. Those are enforced by states, they go through state courts, this is our federalist system. And so some of the states end up actually having language that is even more broad than the CFAA. Missouri, which is actually where I'm from, is one of those examples, and in 2021 we had this issue where a reporter for the St.
Louis Post Dispatch found on their department of education website that the social security numbers of many educators were exposed. This was not a very complicated exposure; it looked like hitting F12 and looking at the source of the website revealed it. It was disclosed to the agency, which initially said thank you. Somehow it became political, and the governor said that this was hacking and that we should have a law enforcement investigation. The highway patrol did investigate the Post Dispatch and did investigate the reporter. No charges were brought, and what the highway patrol said at the time was that there was no criminal intent. Obviously this is true, and it is also true for good faith cybersecurity research, but that's also not required in the statute. The statute does not require criminal intent; it actually does not require intent to defraud. It is whether or not you are accessing the computer at all. Missouri is also one of those states that says that if you disclose an access code or a password without authorization, then you are in violation of their state law. Now, under the CFAA, disclosing an access code or a password requires intent to defraud to be illegal. Here, again, no intent to defraud; it's just whether you should have known that you do not have authorization to disclose the password. So, for example, you're conducting IoT research. One of the routine things that you see with IoT cybersecurity research is finding a hard coded password in the device. Are you allowed to disclose that under Missouri law? Do you have authorization to do it? This is what I'm talking about. It would not be a violation under the CFAA, and arguably not anymore under the DMCA now that we have that protection, but you have state laws that are just as broad or more broad than the CFAA and don't have some of the same limiting factors that we've discussed. Maryland is another. So Missouri is one where it says if you are disclosing a
password or an access code without authorization. Maryland, in fact, home of the NSA, makes it a crime to try to identify an access code or a password. So not even just the disclosure; the act of research, trying to identify the password or the access code, is illegal, and again no intent to defraud is required. And then by contrast, just to show you some of the variation here, there's the state of Washington. The state of Washington actually has, and it's the only one among the states I know of, a bona fide security researcher protection. Here they call it white hat security research. The way that they do it is they say you can't do a bunch of stuff without authorization, just like the CFAA, but without authorization is defined, and without authorization does not include white hat security research. And if you see the definition of white hat security research up there at the top, you'll see that a lot of it also mimics the language that was in that section 1201 exemption. Yes. So, sorry, I guess the question is when should someone who is doing security research be concerned about Missouri law. Essentially anyone who is operating anywhere on the planet should be concerned about Missouri law, given the nature of the internet and the jurisdictional reach of some statutes. I guess the short answer is it depends, but there are various jurisdictional bases for these statutes. So if you have your victim in the state, so if the computer you are accessing is in the state, you certainly should be concerned, right? If you yourself are located in the state, you should be concerned. There is a separate question of, like, if a company is incorporated or has an office in the state, if their headquarters is in Missouri but the back end servers are in California, hosted by Amazon or whoever, do you have to be concerned? Not clear exactly. In the wonderful world of cloud computing there is the separate question of how exactly jurisdiction works. I can tell you in those first two
instances, though, you should definitely be concerned. And how would you know, right? How do you know if the computer that you are accessing is in one of these states? This is why it is something that matters to everybody, no matter what state you are in. Also, get a lawyer who knows venue and jurisdiction and can maybe remove it to federal court. Yes. The issues of consent kind of merge, but I think conceptually, for the Wiretap Act, 18 U.S.C. § 2511, and the Computer Fraud and Abuse Act, they are actually quite separable. The idea of two-party consent, both parties on the line consenting, is a requirement in I think 12 states across the country, but I don't think those map to any sort of comparable language in those states' computer crime laws. All right, I want to cover the international aspects of the law as well, and then we are barreling towards the roundtable portion, so I hope that there are more questions and we haven't answered them already. So in 2021 China, after a comment period, published a coordinated vulnerability disclosure and patching law covering vulnerabilities of network products. This law requires companies to have vulnerability disclosure policies, that's great, and it requires patching of vulnerabilities, this is required. It also, though, requires vendors to disclose vulnerabilities to the government within two days of discovery. So you discover the vulnerability, you flow it up to the government in two days. That is going to be a lot of vulnerabilities. Researchers, believe it or not, are actually kind of encouraged; you're encouraged to have bug bounties if you are a vendor. But remember, the vulnerabilities that you hear about flow up to the government within about two days. Now, what is very different from the way that we are used to things here in the US is that there is a strict restriction on public disclosure of the vulnerabilities and a strict restriction on publishing tools. So as a security researcher under this law, at least under the letter
of the law, and I can't speak to how it's enforced on the ground, I have no visibility into that, but the letter of the law is that you must report the vulnerability either to the vendor or to the government. Those are your options, and remember, if you report it to the vendor, the vendor reports it directly to the government anyway. So bug bounties, CVD, and it flows to the government. And the penalties for violating this can include imprisonment, it is a criminal law, but it also includes things like administrative fines; there is a wide array of penalties. So this is in stark contrast to the direction that we are moving in here in the United States when it comes to adopting coordinated vulnerability disclosure, recognizing the hacker community in our criminal laws, and also continuing to preserve the ability to disclose vulnerabilities publicly. Do you want to add anything to that? So this is the last slide, well, last substantive slide. So where do we go from here? I just want to wheel it back to the point that we made at the beginning, that a lot of the researcher community had previously been focused on the Computer Fraud and Abuse Act, that exceeds authorized access, terms of service problem, and on section 1201 of the DMCA. I would argue that those things are absolutely not perfect, but you are not going to get the same bang for your buck that you had prior to 2021. Now, areas of greater legal risk, I would argue, are in the US states, which have not had any of that same evolution as federal law, and international laws; China's CVD law is another big one. Speaking of CVD, another area to focus on is continued adoption of coordinated vulnerability disclosure by companies, and one reason for that is, remember, not all of the protections under the CFAA apply to private lawsuits, and those are often the greater threat than criminal liability. And for section 1201 of the DMCA, that protection, although it comes up for renewal every three years, I'm relatively confident it is pretty baked at this point. We may be
able to improve it some further, but I doubt that unless there's a major event we would do a lot of backsliding. What does need improvement in a major way is the trafficking portion. So I actually think that this is what we talked about here today: the progress on the CFAA and DMCA is in part, in large part, due to security community advocacy. It has been a powerful force, but now let's turn it towards the areas of greater legal risk, and especially the states, where we are all from and where our lawmakers have similar processes to the federal lawmakers. I think that is the next area where we can get the best bang for our buck in our advocacy. And that is the security research exemption from section 1201 of the DMCA, if you want to look at it some more. Figured we might end up talking about it in this Q&A session, which begins now. Yes, please. I've been coming to the GSMA to present on the China law; I'm the chair of the fraud and security group. We have an industry CVD scheme, and the law is still quite unclear to us. We have asked MIIT what the situation is and not got a response, so for companies that are involved in our CVD committee, some of which are Chinese, whether they have to disclose their vulnerabilities, that's the first point I want to make, which is of concern. The second one is on the international law aspect. So in the UK we have what's called the PSTI bill going through parliament right now, which is the Product Security and Telecommunications Infrastructure bill, which will be our IoT act when it gets passed, and in that there will be a requirement for companies to provide CVD. A lot of that work has been promoted really from here, so thanks, everyone, for getting that just over the line and into law. Thank you. The proliferation of coordinated vulnerability disclosure policies and standards I think deserves its own talk of similar length, but that is another major area of progress from the hacker community into policy. Peter, you, so, the UK is looking at the computer misuse
act also, right? And one of the things that the government is considering there, I think one of the questions that they asked, was should there be an explicit protection for security researchers, have I got that right? Yeah, so I'm David, Peter is around this week. The Home Office has consulted on this, and I think at the moment they're kind of quite busy with other things, but that's what they're looking at, and I think clearly, you know, the precedent work that's happening in the United States is going to feed into that. So I personally was quite pleased to see all of this, because we can put that into that work. I think there's a lot of input that needs to go into it, there's a maturation of thinking that needs to happen. So on the PSTI bill actually, when it went to the House of Lords, which is our upper house, there were questions raised about whether, by putting CVD into law, they're creating a defense for malicious hacking, and so then it got into this kind of, basically, letters of marque discussion, and I don't think we want to go down that road, to be honest. So, I mean, the UK has a process where they're looking at the Computer Misuse Act, it's sort of their version of the CFAA, so if you have contacts with the UK government, or if you want to get involved with that, see David and see about weighing in. That is an area of open consultation right now, looking at potential legal protections for researchers in the UK. Yes. I'm sorry, a follow-on question on this, as you can tell I'm from the UK as well. One of the dangers that we're looking at in the UK with that rethink of the Computer Misuse Act is organisations in the UK trying to define cyber security researchers as people who have received a certification handed out by a government department and who have registered on a government list as being validated security researchers. That's one of the big problems and one of the dangers that we're seeing in the UK, that their
definition of cyber security researcher is kind of being pushed towards a government approved cyber security researcher, which is a danger that we need to fight against, an ongoing struggle with this review of the Computer Misuse Act, because we kind of don't want to lock out people who are not government sanctioned in some way. I'm glad you raised that, that's absolutely real. The UK government also put out a consultation on certification for cyber security professionals, and this would be certification for a variety of cyber security professional jobs; if you work in the UK, or if your work touches the UK, this affects you. But one of them was security researchers, and yes, it lends itself to being on an approved government list, being certified in order to conduct security research, which is a huge barrier to entry for what is, in the United States at least, a very decentralized community. Yes, and sorry, that was UK specific. I wanted to quickly follow up on the issue of the program requirements and CVD requirements. Let's take the first part of that question first, and do you want to answer the CVD question? What do you think, Art Manion, on the spot? You heard Amit's question about the next frontier for CVD in the marketplace, what do you think? And a reminder that the microphones don't have a speaker, that's for the stream, so we still have to project. I did hear, I mostly tracked, Amit. Let's see. So I think you've touched on them here, and then, Amit, with the commercial contractual stuff, so bug bounty related, disclosure policy related, safe harbor, not in strict legal safe harbor terms, but, I appreciate, good faith security research. So contractually, and in bug bounty and disclosure programs, I think there's going to be progress there. If people are agreeing and planning those agreements, there's less likely to be any conflict, whether civil, or recognition from a government perhaps, or federal prosecution. The Chinese law is a good example, it's concerned a lot of us, although
the implications are not clear yet, I don't think. But what if, how many countries are there on the planet, 200 some, depending how you count, what if every sovereign state has its own report-to-the-government law, with different numbers of days and different industries and things? That has the potential to get really messy, and I'm sure particularly a multinational corporation, or a researcher who works across, you know, country lines, would appreciate something more globally standard, as opposed to, you know, needing a third party to tell them how to talk to this country, that country, that country. I think that could be a mess, but the jury's out still. Do you have a view as to where CVD adoption might spike next? Like, are there sectors, or is it just sort of company by company? I don't think I have a real view or opinion. There's a lot of progress, which is great to see. The UK was mentioned, and the rest of Europe, in ENISA and the NIS2 directive, has disclosure stuff built in there, I believe, and I think it's going to go through, so, you know, those are signs of progress. The Chinese stuff is concerning, but again, I'll speak for myself, I'm reading an English Google translation and don't know how the law is actually implemented in country, so not sure yet. So I guess for my own part, I don't know where the next sectors of the economy or sectors of the market that will implement CVD will be, like where sort of the low hanging fruit is, but I know that you've noticed the same thing, which is that coordinated vulnerability disclosure policies are now being built into a lot of cyber security laws. So the critical infrastructure incident reporting act has a nod towards it. There are laws like, you know, the IoT Cybersecurity Improvement Act, right, that had a requirement that if you are going to sell IoT to the government, you must have a coordinated vulnerability disclosure policy. NIS2, which is like a critical infrastructure protection law that is being proposed in Europe, at a pretty advanced stage,
requires coordinated vulnerability disclosure policies. We're seeing it in a lot of sort of regulations that you may not even expect, from different agencies, just for their particular sector, at the very least promoting coordinated vulnerability disclosure. You had mentioned, oh, please. I was going to respond to that, because my company has also been tracking the implementation of CVD by IoT companies over the past five years, and we're just about to do the research again. So we look at about 330 companies that make sort of the top IoT products in countries across the world, and when we started out, less than 10% of companies had any way for security researchers to contact them, and that includes, you know, their security.txt as well, and looking at their website and so on. And that only increased, even in light of all of this global activity promoting CVD, to less than 20% in our last year's survey. So even with the threat of regulation, even with laws in place, there's a huge number of companies, you know, four in five IoT companies, not doing anything, and that's just the bit that you can see, so it's really concerning to me. I mean, you know, when the laws come in, I'm hoping that obviously the stick comes out and they actually do something, but it'll be really interesting to see what happens this year and next year in our survey, whether there's this massive increase or not. There's this trope, right, that policy and regulation is always way behind where the private sector is. I don't think that's true for coordinated vulnerability disclosure. We're seeing it built into a lot of cybersecurity laws, best practices, you know, guides, things like that, out of DC at least, the UK, the EU, China, and I think that its representation in new cybersecurity regulations is more broad than where we're seeing it actually in the marketplace. So there is still a lot of catch up. I'm optimistic, because it is making its way into policy, but there is still a long way to go in terms
of actual adoption. So the question is whether the good faith exception is a matter of law or policy, I guess, and why should folks care. So in regard to the CFAA and the federal policy, it is a matter of policy, right? It is not written into the law, and that has certain dangers. Folks have flagged that that means it is possible that another administration could have a different view of this, or a different attorney general, or another head of the criminal division. I will say that we have a number of policies that I think are comparable that have persisted across multiple administrations, and they have not, as a matter of course, changed every four years or anything of that sort. But that is one important distinction: if it is written into law, it is less likely to change. The reason it is not is because it is very difficult to get anything written into law right now. So this is not a bad second to that. Congress can spend trillions of dollars, but they can't change the law. I did want to jump on one issue about bug bounties, something I wanted to say as a public service announcement, and it is something that we have noticed, I think, over the last few years. While we are very supportive of bug bounty programs, because we think they do a good job of letting the different parties know where lines are drawn, what is in bounds, what is out of bounds, the one danger that we see, which has manifested in a few places, is that in some instances it has also created a sense of entitlement among researchers, such that there is a feeling that if I did some work here, I better get paid, which can inform the manner in which any discussion about compensation or disclosure occurs, and in the worst instances turns into exactly what we talked about earlier, a conversation that looks a lot more like an extortionate type of demand rather than some engagement to try to improve security. So I just drop that as a caution. In the course of reviewing indictments that have come through our office
we've had a couple of instances where researchers used maybe some improvident words in the manner in which they approached this. And this harkens back to an issue that I flagged before, which is I truly believe it would be in this community's interest to work on norms, some sort of baseline understanding of what security research engagement looks like, so that those who, like me, are outside the community and would try to apply those rules have something to go to that's not imposed on the community by us, but rather something organically developed, so that there's actually buy-in and an understanding of the true depiction of the way in which things should occur. So that's my PSA that I wanted to just drop on you. I realize we skipped over your second question, on incident reporting and how that interacts with the CFAA, and one thing I have noticed: there are a lot of incident reporting regulations already on the books, and a lot that are being proposed, that would require a company to report a cyber security incident to the government, to their regulator. And the definition of cyber security incident is usually something that is significant, not just an insignificant exposure, but something that is relatively significant, so that the government knows about it and can take action to protect others. And this is another sort of sign of the evolution in thinking about policy and hackers: a lot of these laws actually have an exception to the incident reporting requirements, so that it doesn't qualify as a reportable incident if it is something that was brought to you by a good faith security researcher. Many of them don't necessarily use those words, and I think a lot of times it requires authorization of the computer holders, but if something like a bug bounty has brought it, or it's a coordinated vulnerability disclosure process, then it may not qualify for incident reporting purposes, to have to report it to the regulator. This is a sort of thinking that we, I think,
5 to 10 years ago couldn't imagine, but they're building it into things that are, you know, somewhat niche, you know, cyber incident reporting, so they have a mind towards this community even there. Yes. I got 80% of what you said. That's okay, can I ask you, at least for this question, to take the mask off? Sorry. The 20% comes from someone who is from the UK, about how discretion on the policy is exercised in the US by our investigators and prosecutors, taking as a model that there's a comparable law in the UK where a lot of discretion is used to determine what is in the public interest. So in terms of the discretion, I guess the policy does state, and, you know, it was promulgated by the Deputy Attorney General's office to all, that one should decline if it meets these criteria. The question of when it meets these criteria is a matter of discretion to some degree, but no more so, I would say, than applying our laws, right, to see whether a set of facts meets the requirements. And so I think there's in some ways unavoidably a certain degree of discretion that's exercised, but one of the positive things, one of the things that helps when you have exercises of discretion, is to have kind of oversight and transparency. And so this wouldn't be a decision that would be made solely by, for example, the US Attorney's office; we oversee all of the CFAA indictments, so it would come through us as well. And the point of that, which was a practice that was adopted in 2016, was to try to harmonize how these things are done nationally, right? If you have one collection point, you can have greater consistency than you would have in 94 different districts. And so there is discretion, but we believe that we have mechanisms for kind of leveling that out and making sure that it is consistent. You'll notice at the bottom of the charging policy it mentions consulting with prosecutors; that's working on this. Yep, specific applications. Hello. The question was, the policy applies to charging decisions, not
investigative activities, so is there any point in perhaps having a comparable policy that applies to investigations rather than just prosecutions? It's an interesting point. I don't think we have considered it, in part because at least we haven't yet detected a problem, but hopefully engagements like this would flag that, and if we did see a problem, it's something we certainly could consider, whether a similar policy on an investigative level would make sense. At this point it doesn't exist and we haven't really thought of it, but could that change if there actually were a need? Absolutely. So first, a suggestion to folks, which was that reaching out to law enforcement to establish a relationship, so that they know and are familiar with you in case there is a need for a knock at the door because there is some activity that they're interested in, is a good practice, and I would footstomp that. That in fact is something that we flag in the white paper I mentioned earlier; having that situation can be very helpful. Secondly, though, there was a question about whether accessing a criminal network, let's say you are a researcher who is accessing a ransomware group's server, whether that could potentially get you prosecuted under the CFAA. And I guess to that I would say this: whenever someone tells me prosecutors can indict a ham sandwich, my response to them is yes, but then we would have to try the ham sandwich, right, and that would be ridiculous. So it's the same thing here. One of the things that we think about, actually in 2016 we released our CFAA charging policy that was broader, about considerations for federal prosecution: there has to be a substantial federal interest in a prosecution for us to bring it. We do assess whether, for example, that's a case that is not going to play very well. Now there may be extenuating facts, and I'm not saying that it's impossible that such a case could be brought, but by and large it's unlikely. That said, I do think that the one thing that I
often caution researchers about is that even if they're being good guys, they should understand they do not have carte blanche, right? There are people, investigators, who have been given the authority to do certain types of activities that are invasive, and that should be left to them. So reporting is an important first step. Can we close that door please? Thank you. Only about five more minutes before we have to clear the room for the next panel. And let everyone else also know, if you don't have one of these limited edition DEF CON policy stickers, please feel free to see me. I laid them out on the table, but I know that the folks that were standing didn't get one; if you want one, then hit me up. Yes. I have a few suggestions, but I think the simplest one would be to contact the Electronic Frontier Foundation. They've worked in litigation in this area. They may not represent you, but they may know other people that do. It's probably easier than a Google search or me just giving you a bunch of names right now, so just contact them; that's a good first start. They also have online resources, I believe they do, yes. And a lot of it, at least in the United States at this stage, is going to be engaging directly with policy makers. I don't know of any open comment periods for the CFAA, DMCA, or the states. I do think that you get your best bang for the buck right now in the states; like I mentioned, the states don't hear from experts to the same degree as the federal policy makers do, and, you know, it may be slow going at first, but I think that once you build that relationship, you'll see it come to fruition in an open comment period. Then I think talk to Steven about what's going on in the UK. Generally, I guess I'd add that there are these rulemaking processes that are open for public comment, and people like you in this room actually do have an outsized voice in those, and what I mean by that is, talking to the people who in this process have to review them and then educate: the answer is not that 100
people said yes and 50 people said no; it's the quality of the comment that really is what matters to them, and having some sort of sophistication and understanding of the actual issues, and being able to present that, is something that the people who adjudicate these rules and decide how they should be shaped really look at. So these public comment periods are something that, if you're interested, you should take seriously. I actually think it would make a great DEF CON presentation to just go through where to find these public comment periods and how to engage with them, because there's this feeling that the government is in this ivory tower and you can't communicate with them, and it is actually not true at all. They are required by law to solicit input from the public whenever they come out with a lot of major rules. And so, another time, but there is in fact a lot of opportunity. I think DEF CON policy will be able to connect you with folks depending on what you're looking at. Everyone, thank you so much, thank you for coming, thank you to the people that were standing the whole time, and again let me know if you want a sticker, and hope to see you around.