Hello, good afternoon. It's a pleasure to be here with you in this wonderful village, and it's very exciting to see people hacking things in the back, because we're going to talk about the legal implications of exactly that. My name is Amit Elazari, I'm a doctoral law candidate at Berkeley Law and a CLTC grantee, here with Jamie.

And I'm a staff attorney at the Electronic Frontier Foundation. I'm on our civil liberties team and do a lot of work with the Computer Fraud and Abuse Act, which we'll be talking about today.

Cool. And as you might have heard, I'm Israeli; that's the accent. So in true Israeli fashion, I want to start with a direct question: how many of you know this guy? None? Nobody? Okay. This is Kevin Finisterre. He's a respected security researcher who found a vulnerability in one of DJI's drone systems, a vulnerability that, according to reports, leaked personal information of their customers. Now, although he has tons of hair, Kevin wanted to wear the white hat. He wanted to report the vulnerability to DJI through their just-launched bug bounty program. When this program was launched, it didn't come with a clear scope or terms. So Kevin contacted DJI, and according to reports, DJI confirmed that the vulnerability he found was in scope. Not only that, my friends: they offered him $30,000 for that bug. That's a lot of money; for the bug hunters here in the room, that is considered very high. But then the plot thickened. DJI also wanted Kevin to sign an agreement he found one-sided, one that left him exposed. And when he refused, according to reports, they threatened him with legal action under the notorious Computer Fraud and Abuse Act. So how does the story end? Kevin ended up walking away from a $30,000 approved bounty. That's right, my friends, a new Tesla. Let's take a moment to appreciate that Tesla. This is a wake-up call for all of us here. Legal threats are on the rise.
We hear more and more about security researchers, even reporters, being threatened over issues concerning security research and vulnerabilities. In fact, this is such a huge topic that the Center for Democracy and Technology, CDT, just asked 50 experts to sign a letter, basically going to the community and telling everybody that we need to address this now. The chilling effects are creating an atmosphere of anonymous, full disclosure of vulnerabilities instead of working together on coordinated disclosure. Not only that, they conducted interviews with 20 leading security researchers. Half of those researchers said that the DMCA, which is copyright law and which we're going to talk about, and the CFAA, the Computer Fraud and Abuse Act, the two main federal anti-hacking laws, had undermined their research in some way. There were concerns around those laws that affected their research. One researcher even said that he avoided implicating the CFAA when researching a vehicle. So these are really relevant concerns, and we need to address them now, even when it comes to bug bounties or vulnerability disclosure programs, which used to be considered quite safe. So this is a legal talk, a terms-of-use talk, and in true terms-of-use fashion, we have a disclaimer. Although we are lawyers, I'm not admitted in the United States, and we are not yet your lawyers. You can definitely talk with EFF about becoming your lawyers, but this is not legal advice. So let's take a deep dive into what we're going to talk about today. All right, let's see. So there are good lawyers in the world, like us, and there are crafty lawyers, and these are the tools we're going to talk about today that those lawyers use, on behalf of companies, to go after security research. The first is the Computer Fraud and Abuse Act, a criminal law that also has a civil enforcement provision. Very vague, passed in 1986. We'll talk about that more.
Also state anti-hacking laws, which are very similar to the CFAA; the DMCA and its security exemptions; contracts, in terms of service and EULAs; and the Consumer Review Fairness Act, which we're going to go over as well, in addition to wiretap laws. All right. So as one example, this is Nest's terms of service. People are, I think, hacking on Nest in the back, and we wonder if they've read these terms of service. If you breach a terms of service, that could be a contract violation, or get you into trouble with the CFAA. One very common restriction is this one here, against modifying, making derivative works of, disassembling, reverse compiling, or reverse engineering any part of a software product. And then there's also a limit on disclosure. So it's not just the security research that's prohibited, but actually disclosing it. So even if your work didn't constitute hacking, you could get in trouble under this disclosure provision. Yeah, and this is really interesting; this is a new development. If you look at this language here, it suggests that even for the kind of work going on right there in the back, analysis of the security practices of these devices, you need the company's consent before you disclose it to any third party. Now, here there is a new development, a new law. We still don't have much clarity, because the courts and the FTC have yet to weigh in, so this is still emerging. But it suggests that security researchers, as consumers of a product, should be able to communicate, like in a review, the actual implications of the product and their assessments of its performance, and that contracts trying to limit such disclosure, which is important for consumer transparency, are prohibited. So this is a new thing to look at.
What's interesting here is that they do not allow disclosing potentially damaging computer code. So you need to think about the limits of the proof of concept you're publishing. Are you going into enough depth to allow others to reproduce the issue? Probably not a good idea. Again, this is a law that is just emerging, and something to watch. But while this is a new law, unfortunately the CFAA and the DMCA are not new at all. They were enacted at a very early stage, before the internet as we know it. So let's hear more about those main anti-hacking laws.

All right. I want to talk first about the Computer Fraud and Abuse Act. As I mentioned, this is originally a 1986 statute. Congress was trying to go after serious computer break-ins, and actually cited WarGames in a Senate report. But back then, and maybe even still, Congress didn't necessarily understand how computers work, and had a little trouble defining what it was trying to get at. So it criminalized intentionally accessing a computer without authorization or in excess of authorization. The statute defines "exceeds authorized access," but the definition refers back to "without authorization." So the key terms of the statute are "without authorization" and "exceeds authorized access." Where's the line between those two? The statute doesn't say. There are other sections of the law; this is just one of them, the broadest section, and the language has to be interpreted the same way across every section. It also prohibits unauthorized damage, which is a separate provision of the law. And courts have been confused about what this language means. So there's currently a circuit split. At first, courts were interpreting violations of terms of service, or of employment contracts, computer use restrictions that your employer places on you, or of duties of loyalty to your employer, as CFAA violations.
So if you accessed the computer for non-work purposes, you were breaching your duty of loyalty and therefore violating the Computer Fraud and Abuse Act by accessing the computer without authorization. These are older cases. That interpretation of the law, taken to its end, means: if I lie about my age on Facebook, is that a Computer Fraud and Abuse Act violation? And the government actually tried to go after a woman for lying about her age on her MySpace profile, back in the day. Ever since that case especially, the constitutional problems with this being a completely broad and insane statute have become more apparent to courts. So courts started going the other way. The Ninth Circuit interpreted the law narrowly in a case called United States v. Nosal. In that case, the court said no: violating terms of service or computer use restrictions does not constitute a Computer Fraud and Abuse Act violation. It's not "without authorization" as Congress intended it. Instead, violating an access restriction, which the court characterized as circumventing a technological access barrier, is what makes a CFAA violation. Other courts started following that: the Fourth Circuit, the Second Circuit. But then there were a couple of interesting password-sharing cases with kind of bad facts. These are Nosal II and Power Ventures. These cases threw a wrench into this whole circuit-split situation, because they were about password sharing; they weren't really hacking. In both cases, the person who accessed the computer was using the password with the account holder's valid consent and authorization, valid credentials with permission. Power Ventures involved a social media aggregator that was basically scraping information and putting it all in one place, for users who wanted to go to a single place and check multiple social media accounts. And Facebook didn't like that.
So they sued them under the CFAA. Nosal II, the second round of the first Nosal case, involved an employee giving her password to somebody who came in and took trade secrets. Trade secret law definitely covered that conduct, so the court didn't need to reach the CFAA. And Facebook could have sued Power Ventures for intentional interference with economic or business relations, but instead they went after it under the CFAA. The courts somehow found a way to contort the law in a very confusing manner. I personally don't think these opinions are consistent with Nosal I, which is an en banc decision they should have been consistent with. So this has created a lot of confusion. The court said in Nosal II that if you're not the computer owner, you can't even give authorization, which is pretty confusing, because people share passwords all the time. The dissent recognized that there was no difference between the password sharing in that case and normal password sharing. Not that you should share passwords. Don't share passwords. And in Power Ventures, the court said Facebook users had given the company authorization, but once Facebook sent a cease-and-desist letter saying "you are no longer authorized" and Power Ventures kept going anyway, that was a Computer Fraud and Abuse Act violation. So now companies are actually trying to use this law to go after other companies for scraping publicly available data online. It's becoming an anti-competitive tool, and if courts interpret it super broadly, it will of course become an anti-security-research tool as well. And in fact, this is an old 2010 computer crime manual from the DOJ, which says it's relatively easy to prove that a defendant had only limited authority to access a computer, such as when they violate a terms of service.
The government has since walked back from prosecuting these cases, but companies are still doing it, and the way these cases are interpreted in the civil context applies equally in the criminal context. That is why people are so scared when they see this in a cease-and-desist or threat letter. We do have one good recent case, though, out of a district court in D.C. Security researchers and The Intercept, represented by the ACLU, brought a case against the government arguing that the CFAA violated the First Amendment, that it was unconstitutionally vague and blocked their constitutionally protected research. In that case, the court narrowly interpreted the CFAA to avoid the constitutional issue, and found that scraping, using automated tools to access information you can already access anyway, is not hacking, even when the terms of service prohibit it. Employing a bot to crawl a website or apply for jobs may run afoul of a website's terms of service, but it does not constitute an access violation when the human who creates the bot is otherwise allowed to read and interact with that site. And the court actually quoted Star Wars in the decision, which makes it extra cool. Yeah, so what you're seeing here with hiQ, which is another important scraping case, still in litigation in the Ninth Circuit, and with this Sandvig decision, is that there might be a future for our bots. The question on public-facing infrastructure is whether companies can use terms of use, or even technological measures like IP blocking, to limit researchers, companies, competitors, and algorithmic auditors, who are often engaged in scraping in order to uncover bias and deception in black-box algorithms.
All of these people, and all of us, are going to be greatly influenced by the two main decisions progressing right now: hiQ in the Ninth Circuit and Sandvig in the D.C. district court. So let's hope for a future for bots. Let me add one thing. Because you mentioned IP address blocks, it is our position at EFF that an IP address block is not a technological access barrier, because it does not actually keep anyone out. That's an issue coming up in these cases, and companies are trying to argue that it does. So that's one issue to watch. And this is extremely interesting, because in the oral argument in hiQ we just saw the lawyer asking: is it in fact a technological measure if it's just slowing me down and not actually preventing me from accessing? A great point. But the CFAA is not alone in the land of anti-hacking laws. Especially for you out there in the back, for IoT hacking, car hacking, all of these devices with embedded software, the DMCA, the Digital Millennium Copyright Act, an amendment to copyright law, is a very important anti-hacking law. This is a federal law with criminal and civil liability, mostly civil, and it basically prohibits circumvention of technological measures that effectively control, and this is one of the components of the clause, access to software code as a copyright-protected work. Now the question is: do we actually need copyright infringement when we evaluate whether a circumvention violated the DMCA? Here we have murkiness and a split. There are decisions suggesting that if you are circumventing for the purpose of interoperability, jailbreaking and the like, the DMCA should not be an issue; but when it comes to media, videos, and the like, courts have been more willing to find a violation even without a really close relationship to copyright infringement.
So that's the Lexmark decision on one side versus the Blizzard decision on the other, for those of you writing citations. Now, the courts, the regulator, and Congress recognized that this is a law that might not keep up with technology. Exactly like the CFAA, this is a law, from 1998, that was basically inspired by big entertainment companies seeking to prevent piracy. So in order to make sure the law keeps up with technology, we do have some statutory exemptions, for security testing and encryption research, but as you can see, they have tons of requirements. For security testing, you need authorization from the owner. Yeah, I'm sure that company is going to let you hack their product. We actually have empirical research showing that when people tried to get authorization, they were often refused, or worse, got legal threats back. So the statutory exemptions are murky, because they have tons of requirements and it's not clear what weight each requirement carries. We also have a temporary good-faith security research exemption. I talked about the law being at risk of not keeping up with technology and stifling new developments. So we do have a process: every three years, the Copyright Office engages in a multi-stakeholder discussion to figure out what carve-outs, what safe harbors and exemptions, we need to create from this very broad anti-hacking law. One of those exemptions, from 2015 and now pending renewal, probably going to be renewed, we're at the very final stage of the process, is the temporary good-faith security research exemption, which doesn't require authorization like the statutory exemption. But guess what? You thought it was going to be free and broad? No, we have tons of limitations. First of all, a device limitation. Right now, as written, it's geared toward devices designed to be used by individual consumers. So let me ask you: an elevator versus a Nest device? What about commercial printers? Right?
So this is quite vague. But the exemption also covers voting machines, by the way. So our friends right here at the Voting Machine Hacking Village: great job; the DMCA security exemption was really important for that. It also covers motorized land vehicles, Car Hacking Village friends; yes, they also rely on it to some extent; and medical devices, in certain circumstances. But that's only one requirement. Guess what? You also need to lawfully acquire the machine. So our friends here in the IoT Village probably got their machines from Nest; that's great. You need to lawfully acquire the machine. Then you also need to be engaging in your research solely for the purpose of good-faith security research, and not violate any other law. And this is the real main issue, because, as we will show you, there is a relationship between the laws. For example, if my Nest comes with terms of use, and I'm violating those terms because I'm hacking, you saw the anti-hacking language before, then I'm violating that contract. Do I get the DMCA exemption or not? Because now I'm risking CFAA and other law violations. So this is a very important component; there is a lot of debate right now, and a lot of people are requesting that this requirement be removed. Then there's the requirement that the research be conducted in a controlled environment. Is this a controlled environment? And so on. The idea is that there are plenty of requirements, and people are now trying to remove some of them. In fact, the Department of Justice itself weighed in on this issue, and suggested in its comments that there is a problem with limiting the security research exemption to devices designed for individual consumers. So we might see that removed. The DOJ also suggested that the idea that everything should happen in a lab-like environment is not realistic and not what we need; that's not how products are used in reality.
So whatever happens with the DMCA exemption is going to be really, really important for all you IoT hackers, so stay tuned. Now, I mentioned bug bounties at the beginning because there is another aspect here, which is the relationship between the CFAA, the DMCA, and, believe it or not, bug bounty contracts. This is my own project, Legal Bug Bounty. The relationship basically depends on the clause you just saw, which suggests that violating the CFAA or contract law might undermine the DMCA security exemption. So, by show of hands, who here has heard about bug bounties, or ever visited a page like this, or a vulnerability reward program? Yeah, we have some hunters in the room. That's cool. Well, this is on the rise. Not only bug bounties, but also vulnerability disclosure programs. One reason is that regulators are actually pushing for them, and that includes IoT regulators like the FDA and NHTSA. In fact, the FTC wrote just two months ago, if we're not mistaken, in one document, that failing to maintain a process for receiving security vulnerability reports from the community and addressing them is unreasonable under the FTC Act. This means we're going to see more and more companies coming to you, to us, to everybody, and putting in place at least a vulnerability disclosure program, a communication channel. And the language of that program, that contract, is going to be very important. Why? I'm not sure anyone here has really looked at a piece of bug bounty terms like this. These are bug bounties, not vulnerability disclosure programs, although those also have legal terms. What's funny is that there are a lot of terms of use in bug bounties, and we often ignore the legal part of a bug bounty and just focus on the technical scope. But this part could actually create liability. So I actually conducted research on this and read hundreds of terms.
And what I found was, in some cases, pretty conflicting stuff. For example, these are AVG's bug bounty terms. They say that when you submit a bug, you also agree to the EULA, the end-user license agreement that is geared toward users. And guess what? That EULA has the same anti-hacking language we just saw: no spoofing, no attempting to get unauthorized access, no hacking. So this combination of the bug bounty terms and the EULA creates potential civil and criminal liability. It shifts the risk to the hacker. And these are take-it-or-leave-it terms, so you should be careful to read them, address them, and know what's at stake. This is just one example. To summarize this point: from what I saw, there are a lot of cases where companies are shifting the risk to the hacker. And as we saw, this is not just contractual liability; it's CFAA liability and DMCA liability, because it all boils down to authorization. Now, a piece of good news. I don't know if anyone here knows it yet, but we are working together on standardizing safe harbors and legal language for bug bounties. This is my own project, Legal Bug Bounty; you can check it out. And in fact, we've had some developments. Bugcrowd has now launched disclose.io in collaboration with me. Bugcrowd is a hacker platform, and disclose.io creates, almost like open source, one standard type of language that bug bounty and vulnerability disclosure programs can adopt, in order to make sure they don't put a crowd of hackers at risk. So check that out; it's called disclose.io. We did have some success, and I really want to give a shout-out here to Dropbox, for adding to their bug bounty an explicit safe harbor from the CFAA and the DMCA, and to Mozilla. Yes, my friends, the pioneer of bug bounties launched a new safe harbor in its bug bounty this month.
So slowly we're going to see this type of language, with respect to authorization and waiver of potential claims against the researcher, in bug bounties and VDPs. This is key. You need to be on the lookout, because it affects your liability. If a company has that type of language in its contract, you are probably safer than if it doesn't, right? Because when they authorize access, and you stay in scope, you are like a pentester: the legal foundation of the claim is negated. So that's another part of the conversation, and I have a project where I list all the companies with safe harbors; you can check it out. To finish on that note, a final comment on wiretaps. I'm not going to get into the weeds here too much, but I'm happy to say this is actually one of my own projects; we just presented it at the Crypto Village. We conducted an at-scale audit that included a lot of security testing, let's put it that way. And one of the concerns, one of the actual comments we got from the reviewers, and I have my co-author here in the room, Primal, shout-out to him, was: is there a potential wiretap problem here? So let me explain the issue. We created a corpus, a database of 6,000 apps from the Google Play Store, and we built an infrastructure where we look at all the network traffic. But in order to do dynamic analysis at scale, we actually needed to, quote unquote, play the apps. What we wanted to see is what data is being collected at each and every moment the app is interacting with the user. But instead of having real users interacting with 6,000 apps all the time, which is impossible, right, we had to automate the process. So we used something called a monkey, an exerciser: software that acts like a user and touches the interface.
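To make the monkey idea concrete, here is a toy sketch in Python, not the actual Android tooling or the study's real pipeline, of what a UI exerciser does: it feeds seeded pseudo-random taps and swipes into an app so that no human ever generates the interaction. `FakeApp`, the event mix, and the screen dimensions are all illustrative assumptions.

```python
import random

class FakeApp:
    """Stand-in for an app under test; records which UI events it received."""
    def __init__(self):
        self.events = []
    def tap(self, x, y):
        self.events.append(("tap", x, y))
    def swipe(self, x1, y1, x2, y2):
        self.events.append(("swipe", x1, y1, x2, y2))

def run_monkey(app, n_events, seed=0):
    """Drive the app with pseudo-random input, like a 'monkey' exerciser.

    A fixed seed keeps runs reproducible, which matters when you audit
    thousands of apps and want to replay exactly what happened.
    """
    rng = random.Random(seed)
    for _ in range(n_events):
        if rng.random() < 0.7:  # mostly taps, occasionally swipes
            app.tap(rng.randrange(1080), rng.randrange(1920))
        else:
            app.swipe(rng.randrange(1080), rng.randrange(1920),
                      rng.randrange(1080), rng.randrange(1920))
    return len(app.events)
```

The point for the wiretap discussion: every event the app sees, and so every network request it fires, originates from this loop, not from a person.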
Bottom line: what was interesting here is that the wiretap claim wasn't relevant, because our communication wasn't human at all. The so-called, quote unquote, person interacting, the one whose network traffic was being monitored, wasn't a human but our software, our monkey. So when you're doing IoT testing, wiretap law is also important. Especially with speakers, Google Home and the like, make sure you don't have other people in the room that you're recording, or anyone whose consent you didn't obtain. Wiretap law is another thing to be on the lookout for. So I want to wrap up, and hand it to Jamie to talk about some practical recommendations we have for you here in the room.

All right. So, as Amit mentioned, one thing you can do is ask for permission, but that definitely doesn't always work, and it's not always an option. One of the first things you need to do is be aware. Are there terms of service? What do they say? Read them. And if there's a bug bounty program, read those terms too. If you get a cease-and-desist letter, or someone saying you're not authorized to do something, be aware that that could be a red flag for the CFAA. It's good to know what laws they're using, how they're interpreted, and that there's a circuit split, because then you'll know when you might have an issue, and you can do one other thing: ask EFF. So, who here knows what EFF is? Or who EFF is? Whoa, not everybody. We are a digital rights nonprofit. We've been around since 1990, and we have a Coders' Rights program where people come to us with security research questions. We represent people, consult, and give advice about what to do, what not to do, what the legal risks are if the research has already happened, and things like that. And it's free. So that's one thing I definitely want everyone to take away: come ask us if you have any questions. We're happy to help.
Also, CDT, another nonprofit, the Center for Democracy and Technology, posts recommendations online that can be really useful. Another thing to do is use your own computers, accounts, and devices, rather than your neighbor's without their permission: devices and accounts that you are definitely authorized to use. That doesn't always work in some cases. For instance, under the CFAA, if you're accessing a server or doing anything in the cloud, you're accessing someone else's computer, so this won't necessarily protect you. But offline testing can help: if you're downloading things and working on your own system, you're not accessing anyone else's computer. Also, minimize interactions with users' data. And consider second-hand devices, which helps with the first-sale doctrine: when you buy a phone new, you enter a terms-of-service agreement with the phone company, but if the device is resold, the buyer doesn't necessarily have the same contractual relationship.

And then again, bug bounties. Wait, what is this? Yeah, I'm going to add on to that. Sorry, a late addition, but maybe a news flash for the audience here in the room. Bounty Factory, the European bug bounty platform, they're really cool, recognized that this landscape isn't perfect and actually created a tool to report vulnerabilities anonymously via Tor. I haven't tried it, but I wanted to let you know; I heard it was used in some circumstances. The idea is basically to minimize your risk. I'm hoping we get to a point where there is no question: if you're engaging in coordinated disclosure and trying to work with a company under a VDP, there should be no risk of legal threats. But that's not where we are yet. Also, an important note with respect to bug bounties is where the lines are when it comes to demonstrating impact.
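On demonstrating impact without exposing people: one practical habit is to scrub personal fields out of any captured records before they land in a proof-of-concept report. A minimal, hypothetical Python sketch; the field list and record shape here are illustrative assumptions, not any program's actual rules:

```python
# Illustrative sketch: mask fields that commonly hold personal data
# before a captured record is pasted into a PoC report. This set of
# "sensitive" field names is a hypothetical example, not a standard.
SENSITIVE = {"email", "name", "address", "phone", "ssn"}

def redact(record: dict) -> dict:
    """Return a copy of `record` with sensitive values masked."""
    return {
        key: "[REDACTED]" if key.lower() in SENSITIVE else value
        for key, value in record.items()
    }
```

The idea is that a leaked record can still prove the bug is real, while the report republishes none of the underlying personal data.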
So again, the idea is: if you see users' data, that's a very, very hard stop. Even if you want to show the impact of a vulnerability in a bug bounty, you should be very mindful of what proof of concept you're producing, and communicate with the program owner if there is any doubt. We see a lot of tension and vagueness around where the line sits when it comes to proofs of concept. If you want to hear more about bug bounties, I'm doing a legal bug bounty talk tomorrow at Skytalks at three.

So finally, we want to give you some takeaways, some websites you can check out. Here is the EFF website and the Coders' Rights Project. Do you want to say something about that? Sure. Well, I already talked about the Coders' Rights Project. We also have a website with lots of blog posts about all sorts of things, which is a really great resource. We also have a booth in the vendor area if you want to come ask any questions. And if you want to contact the Coders' Rights program, email info@eff.org and we will help you. Yeah, and Legal Bug Bounty, you can check out my project, and the CDT report I just mentioned. It's a really cool report, based on their interviews with 20 leading researchers. They have a bunch of recommendations there, and they also explain the law. Check it out; here's the link, and you can also find it on CDT's website. And if you want a general CFAA overview, there are great talks by Leonard Bailey and others on YouTube, and there's also this great overview document. The main idea: be mindful, and consider learning more about this issue. Yes, the situation is not perfect, but we're doing what we can to deal with it, especially Jamie here, who is doing amazing, amazing work. I want to end on a good note, with a final, hopeful coda to the CFAA mess. You remember Kevin, who lost the $30,000 bounty? Well, guess what?
Although DJI threatened him with a lawsuit, later on they added an explicit safe harbor to their bug bounty policy, communicating to security researchers that they will not pursue legal action against good-faith research. And the point here is: this community has tons of power. It's not just the law; there are also reputational consequences that will come back to haunt a company that threatens our researchers. We have a lot of power to change things. Stay tuned, be involved, ask questions. We are here. And that's it. Thank you so much. Also, follow the IoT bill. It's not law yet, but it has a CFAA and DMCA safe harbor, and the people behind it are doing amazing work. That's also a positive final note to end on. Okay, we're here for your questions. If we have a bit more time, I think we do.