Good morning. If you're on the West Coast, welcome back, wherever you are. We are here with the EFF panel, which I am excited to see come back to HOPE yet again. HOPE, of course, is supporting the EFF with a fundraiser, and we would very much encourage you to visit the HOPE website and look at the information on that. We want you to donate to that fundraiser. We want to hit our goals this year to support the EFF, our nation's premier civil liberties organization for digital rights and for the internet. I'm going to turn it over to Kurt Opsahl from the EFF to introduce our panelists and for them to give us their update for the year. We're going to be going into a Q&A session a little bit later. If you're not already signed into Matrix chat, please do sign in there. Please prep your questions, and we'll be picking from those questions for the conversation for the rest of the two-hour session that we have here. Kurt, over to you.

Thank you. It's great to be back at HOPE. It's a little strange this year not being with everybody in person. I certainly miss seeing you all directly, but we're glad to be back and offering, at the EFF, the year in digital civil liberties, as we do. So for those who are new to this, this is basically an AMA-style conversation with the Electronic Frontier Foundation. We will give a short introduction soon. We'll go down the line of my co-panelists, for them to introduce themselves and to talk a little bit about their work. And then we turn it over to you for your questions. We hope that the introductions inspire some of your questions, but you can feel free to ask any questions that you'd like of the EFF. A couple of things to sort out as ground rules. One is that, as many of you know, the EFF does provide legal advice, particularly for this community. We have the Coders' Rights Project, where we represent security researchers who have legal questions about their research and about publishing their research.
However, this is not the time to bring your privileged conversations or questions about specific things that you may have done. You want that to be an attorney-client privileged conversation, which is not something that is shared with the entire world on a live stream. And this also includes thinly veiled hypotheticals about your friend who has a surprisingly similar situation. If you have a real legal question about your own situation, talk to us in a different forum. But we can talk more generally about legal issues. And if you do need to reach out, info@eff.org is the email address. It goes to our wonderful intake coordination team, who will then route it to the appropriate people. It goes into our ticketing system and such, so we will try to get back to you quickly with any kind of legal situation that you may have. So with that, we'll start out talking a little bit about some of the things that I've been working on. One, of course, is Coders' Rights. We've been talking to a number of people who are presenting at various summer security conferences. That has been exciting; so far, so good on that. But I can't go into much detail on the particular situations until their presentations have concluded and we've found there are no troubles. Another thing that I've been working on is COVID apps. We currently have a pandemic where contact tracing is considered to be a very important aspect of trying to rein in that pandemic. Contact tracing is the process where you find someone is infected and you try to figure out whom they may have infected before they went into quarantine. It's used to help stop the spread and get those people either to get tested or to quarantine themselves until they can get tested. So it can be very helpful. And a lot of people said, well, why don't we apply technology to that? And we totally appreciate that people want to use technology to try to improve the circumstances.
Oftentimes, when you bring technology to these things, it raises new, additional issues. And this is no different for contact tracing and COVID apps. So we've been looking at that, and there's a bit of a trade-off, I guess some people will say, but I think you can actually do it well without making sacrifices to essential liberties. And the rising winner in this is the notion of decentralized contact tracing, which is really proximity tracing: trying to get the records of when you've been in proximity with somebody else, as opposed to records of location. I think a lot of the bad ideas that came out in contact tracing were focused on trying to keep permanent records of where everybody was at all times, to enable contact tracing later. That's a lot more information than you need to accomplish the job, and it has a tremendous effect on civil liberties if there's a permanent record of where you've been. But nevertheless, your right of association is also an important aspect of civil liberties, so a permanent record of whomever you've met is also sensitive. The better way forward on that is user control: the information stays on the device until it is needed, you try to anonymize the information when it is provided, and it works by notifying people that they may have been in contact so they can take the next step of getting tested. So a lighter touch, but one more protective of civil liberties. But that is not all. How do we make sure that it doesn't do any harm to our freedoms? Well, a very important one, a starting point, is informed, voluntary, opt-in consent. It's a fundamental requirement. And this rules out informal pressure, for example, saying that you can't enter into this space unless you have this app working. Things like that are a way to manufacture consent. Users must also have the ability to turn it off.
There may be times when you're engaging in something sensitive, like political organizing, or maybe you're a healthcare worker and you're going into a situation where you're going to have high contact with COVID patients, and you don't want to create a whole bunch of unnecessary contacts, because you know what that situation is and you're taking appropriate precautions. Another key principle is minimization: proximity tracking for contact tracing should collect the least possible information. So this is maybe just the fact that you were in proximity, and only a vague piece of information about the time. You don't need to know the precise time. You don't need to know the precise location. You just need to know approximately when, so we can see where the quarantine time begins. Another thing that's probably very important to this community: information security. These are going to be apps that are going to be running constantly on people's phones. They're going to have access to at least Bluetooth, maybe some other functions on the phone. Unfortunately, I've seen that apps rushed into production due to a crisis sometimes skip over the information security step, and this will be a tempting target if an app actually gets popularized and is used by millions and millions of people constantly. So these apps need to be robustly tested by independent researchers. Also some transparency: put out the code, allow people to look at that, not only testing the app but looking for bugs within the code, and make sure as best we can that we've identified as many bugs as possible before it goes into popular use. And we need to make sure that the apps are addressing some of the biases that will come from use of these apps. Bias can enter in two ways. One is the bias inherent in who has access to smartphone technology and would be able to use the app. That is not 100% of the population.
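To make the decentralized, minimized approach concrete, here is a toy sketch in Python. This is only an illustration loosely inspired by DP-3T-style designs; the key sizes, rotation schedule, and all class and function names are my own assumptions, not any real deployed protocol.

```python
# Toy sketch of decentralized proximity tracing, loosely inspired by
# DP-3T-style designs. All parameters (32-byte seeds, 16-byte ephemeral
# IDs, 144 IDs per day) are illustrative assumptions, not a real protocol.
import hashlib
import secrets

class Device:
    def __init__(self):
        # A fresh random seed per day, generated and kept on the device.
        self.daily_seed = secrets.token_bytes(32)
        self.counter = 0
        self.heard = set()  # ephemeral IDs observed over Bluetooth, stored locally

    def ephemeral_id(self) -> bytes:
        # Broadcast a rotating ID so observers can't link sightings to a
        # person or a place; it reveals only "this beacon was nearby."
        eph = hashlib.sha256(
            self.daily_seed + self.counter.to_bytes(4, "big")
        ).digest()[:16]
        self.counter += 1
        return eph

    def observe(self, eph: bytes) -> None:
        # Minimization: store only the ID (coarse time could be added);
        # no GPS location is ever recorded.
        self.heard.add(eph)

    def check_exposure(self, published_seeds, ids_per_day=144) -> bool:
        # If an infected user consents, only their daily seeds are published.
        # Each device re-derives the ephemeral IDs locally and compares, so
        # the match happens on the user's own phone, not on a server.
        for seed in published_seeds:
            for i in range(ids_per_day):
                eph = hashlib.sha256(seed + i.to_bytes(4, "big")).digest()[:16]
                if eph in self.heard:
                    return True
        return False
```

The point of the design is that the server never learns who met whom: it only relays seeds that infected users chose to upload, and everything else stays on the device until it expires.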
And if you make things dependent on having a smartphone app, you're leaving some people out of that picture. It also may affect the resources that are being provided by the government. If they are looking at the app as the source of what the truth is and where resources are needed, that means these communities will be less likely to get the materials, because they didn't have the smartphones in the first place. And finally, a very important limitation: it's got to have an expiration date. It has to end. There has been an unfortunate history of things done in an emergency situation that continue after the emergency has ended. So any of these apps need to have an expiration date, along with the ability for someone to independently turn it off and get out. Anyway, that's just some of our thoughts on COVID apps. But let me turn it over to our next panelist, Alexis. Introduce yourself.

Thank you. Hi, I'm Alexis Hancock. I am a staff technologist at the Electronic Frontier Foundation. I primarily work on the HTTPS Everywhere web extension, which is available on Chrome and Firefox, packaged in Tor Browser, and also used within Brave. I primarily work on that piece, building tools within tech projects. I also focus on researching things around the realm of mobile phones and consumer privacy. And, with what Kurt said around COVID, I've been working on COVID immunity passport research and digital identities. A lot of it has entailed, as he said, technical solutions around enforcing what it could look like when we reenter society, without proven science or knowledge of what immunity actually looks like. So I've been seeing a lot of problematic apps out there, and proposals around COVID immunity passports in particular. That really concerns me around tech equity, and I've been researching the standards that are being put in place to enforce these things. The fact is, immunity passports aren't really standardized documentation.
It's not simply test results. It's an actual formal document that could have a dynamic status and a permanent status, depending on what type of technology gets used, and when, and in what context, especially with law enforcement, or your employer, or a venue you're trying to enter, or simply a space you're trying to enter in public. So those really worry me in particular when it comes to COVID, and I've been focused on that piece, especially with a bill in California that mentioned COVID immunity passports, not in that particular language, but it's pretty much hinting at that. We're scared that it would give a segue to formalize digital documentation as the first standard, and that could possibly lead to conversations around nationalized IDs in the U.S. and a database that's more centralized and more subject to breach. Other than that, I do a lot of research in other realms, where I try to focus on tech equity in particular, usually around usage of mobile phones, discussing with different communities, and my activism around how people can keep themselves safe with their tech. And with that, I work on the Security Education Companion at EFF, at sec.eff.org, to help security trainers and people out there stay safe and train other people to stay safe online. And I'll pass it to the next panelist.

I believe that's me, yeah. So hi, my name is India McKinney, and I am the director of federal affairs at EFF, which is a fancy way of saying that I am a lobbyist. The first time I ever went to one of these conferences and learned a little bit about social engineering, I felt very uncomfortable, because the principles of persuasion that social engineers use to talk people out of their passwords and their Social Security numbers are the same principles that I use to convince lawmakers that they should listen to us. And sometimes it really helps when current events sort of overtake some of the things that we've been working on.
So for example, one of the things that I've been working on for a number of years with EFF is our facial recognition advocacy: all of the civil liberties risks that come with facial recognition technology when it's deployed, who has access to it, and whether your face should be able to be used for your identification in all the ways that TSA wants to use it, which we think is terrible. What's really interesting, with some of the news events that have happened in the world, is that lawmakers are really attuned to what's happening back home in their districts. So all of a sudden people started calling us back, because we had already established this incredible body of record and body of work, and we had just kept putting it on their desks. So when they finally had some questions and they wanted to actually introduce legislation related to a ban on facial recognition, they called us to make sure that their legislation was actually going to do the things they wanted it to do. So that was a really great thing to see. I used to work on Capitol Hill, so I actually understand the legislative process from the inside, and I use some of that knowledge to help EFF figure out how to target our resources on bills that are actually a threat, as opposed to the thousands of other federal bills that never move. Every two-year cycle there are about 5,000 pieces of legislation introduced. Most of them are never going to go anywhere. You're never going to know about them. You're never going to hear about them. They don't matter. So what's the difference between those bills and bills like the EARN IT Act that are a very real threat in the world? For those that don't know, the EARN IT Act is a bill from Senators Blumenthal and Graham that would massively change the way Section 230 works for internet platforms, with a little side bonus of allowing the DOJ to actually force companies to break encryption.
EFF recognized very, very early on exactly what threat this bill posed and what the legislation actually said, because they were very sneaky about how they did it. And so we have been very active in opposing this bill from very early on, and I'm very proud of the way our advocacy around it has unfolded. In fact, in large part because of our activism and our lobbying and our successful grassroots efforts, the bill's sponsor has actually radically changed the structure of the bill. So now it's a little more complicated and a little more sneaky, still very bad, and we can get into all those details later if you want. But a lot of the way the bill process unfolded had to do with the way that we were successfully talking to the world and talking to other lawmakers and really convincing people that it's a really terrible idea to let the DOJ control whether or not you're allowed to have end-to-end encrypted messaging, which it is, full stop. So I'm really looking forward to taking your questions. My job is super awesome, and I really love talking about the legislative process, so I'm really interested to hear whatever questions you have for us on that, and I will pass to the next person.

Hello, I think the next person is me. My name is Naomi Gilens. I'm a legal fellow at EFF, where I litigate free speech and civil liberties issues that intersect with technology. So first of all, thank you so much to HOPE for having us back here. It is awesome to be here chatting with you. Thanks to all of you for tuning in. I'm just going to talk a little bit about what is on my docket right now.
So first of all, protests are happening across the country right now, of course, so at EFF we've been keeping an eye on how law enforcement is policing those protests. And because we are EFF, we're focusing especially on police abuses of technology: police tracking people at protests, compiling intelligence reports on journalists covering protests, monitoring people's social media feeds, all sorts of things like that. I've also been tracking the ways that the spread of COVID-19 has affected speech rights around the globe. A major trend that we are seeing right now is countries using the pandemic as an excuse to enact laws that prohibit the spread of false information online. Okay, so that might sound like kind of a good thing on first listen, but this is a huge problem, because what this does is give the party that's in control of law enforcement the power to decide what is true and what is false, and then enforce that with the full power of the criminal law. So what we're seeing across the globe is that governments in power are using these kinds of laws as a pretext to detain, interrogate, and prosecute people who share information that doesn't align with the official state narrative. That includes journalists, whistleblowers, political dissidents, members of opposition parties, and just any citizen of the country who's sharing information online about what they're observing or their own experiences. And that chills people from talking about what's happening in their countries, or investigating official actions, or challenging the official narrative, at a time when independent reporting and investigation should absolutely be fostered and encouraged, not suppressed in any way. The final thing I'll talk about right now might be of particular interest to all the hackers out there in the audience.
I've been working on issues surrounding the right to do computer security research in this country. Many of you, I'm sure, are familiar with the Computer Fraud and Abuse Act, or the CFAA, which is the federal anti-hacking law. This law is notoriously ambiguous; courts have been trying to figure out exactly what it means since it was enacted in the 1980s, and different courts in the country have interpreted it in different ways. This has led to a patchwork effect where certain types of security research might be a crime in some parts of the country, but in other parts of the country maybe it's fine. So there's a lot of confusion around this, and counseling people about it is a lot of what we do at the Coders' Rights Project that Kurt was talking about. I definitely encourage all of you, as he did, to reach out to us if we can help you wade through this. But hopefully this year we'll get at least a little bit of clarity, at least about part of it, because the Supreme Court is hearing a case called Van Buren v. United States, and this finally gives the Court the chance to weigh in on whether violating use restrictions is a crime under the CFAA. EFF submitted a brief in that case on behalf of computer security researchers. The nine justices on the Supreme Court may be very concerned about malicious computer break-ins, and we are concerned that they don't fully realize the importance of public-interest, independent computer security research. So what we did in our brief is, first of all, we explained that independent security research is critically important to the overall security of this nation, from shoring up election systems and software, to making sure that our critical infrastructure systems are secure, to securing medical devices or automobile software. So we explained the importance of this kind of research and how it's contributed to security in the past. And second of all, we explained that even though this kind of independent
research is obviously in the public interest, searching for vulnerabilities that bad actors could exploit might require computer security researchers to violate terms of use. Companies have terms that prohibit all kinds of things, from reverse engineering to scraping to, you know, what have you, that a researcher might need to do. But the government's really broad interpretation of the CFAA would make that kind of research a crime if it violates terms, and that's a serious disincentive that prevents some independent researchers from doing this really important work, work that's to everyone's benefit. I'm going to stop there for now. I am happy to take questions about this, I love talking about this, and we're expecting a Supreme Court decision about it sometime in this next year, which is very exciting, so stay tuned for that. But for now I'll pass it along to Rory.

Hey, thanks, and thank you again, HOPE, for having us. I'm Rory. I'm the most recent addition to the EFF activism team; I started in March, so it's been an interesting time to join the organization, to say the least. I'm the grassroots advocacy organizer, which is part of the organizing team, which manages the Electronic Frontier Alliance. So the Electronic Frontier Alliance, which hopefully you've heard of, is a network of local organizations working on important local issues that aren't always taken up by national organizations like the EFF. Unfortunately, the EFF can't be everywhere at once, so it really falls on these local organizations to advocate for their city and their state, to make sure that our digital rights are defended, and to make sure that their neighbors and people in their community are also empowered to take action and are well informed about how to stay safe and how these technologies work. So the EFF started the EFA, I should say, to support these organizations and give them a system of support, and I'm happy to say we now have more than 70 grassroots organizations in our network across the entire US, and it's
continuing to grow. Prior to the EFF, I was actually a member of one of these EFA organizations, the CyPurr Collective in New York City. So I'm lucky enough that I'm now on the managing side of it, but I was a member of one of the groups earlier, so I get to see the whole picture of what it has to offer. Groups in the network remain completely autonomous, and it's really important to us to keep it distributed instead of overly centralized, because we don't want to be the bottleneck preventing people from organizing. We want to make sure that they are empowered to work together and share resources, all while we still offer whatever support we can, and plenty of opportunities to join our campaigns when they align with their local issues. And I think a really cool aspect of the alliance is that, while the EFF might be experts in digital rights, these community organizations tend to also be experts in their own community. An example might be a student group; there are many student groups in the EFA, and they know plenty about student rights and issues particular to their campus. So we can help lift up the digital rights component of that, for things like the upcoming semester, where there are a lot of concerns with the pandemic, concerns with contact tracing apps being required or proctoring software being required. We can help them on the digital rights front, and they bring their own expertise to the issue as well. Members of the EFA are just asked to have accessible events and to endorse the EFF's five core principles, which are supporting free expression, security, privacy, creativity, and access to knowledge. As long as they endorse those principles, we're happy to work with folks. It's okay if they don't have full alignment with EFF's stance on issues, as long as we're all working towards the same end. We want a really broad coalition, a really broad network. Broadly speaking, these groups fall into three buckets. One is community education advocates, like the
CyPurr Collective I was a part of, or CryptoParty Ann Arbor in Michigan, which work, usually with libraries or universities, to help people learn how to stay safe, holding CryptoParties. There are also hackerspaces and makerspaces, like Crash Space in LA or DEFCON 201 in New Jersey. And then there are our advocacy groups, such as the Surveillance Technology Oversight Project in NYC or PDX Privacy in Portland. It's really interesting to bring all these different kinds of strategies and all these different types of community engagement together, especially when groups cross those different categories. Some recent big wins I want to throw out there: especially for the educators and makerspaces, it's been really hard to pivot to being online. That sort of community work kind of necessarily involves hanging out with people in your community and talking with them face to face, so it's been a difficult transition. But I'm really happy to say our groups have been incredibly resilient and have been sharing resources on how to have streaming events and maybe even conferences; for example, Yale Privacy Lab is a member that had the Flatten the Curve summit in July. And then just a shout-out to DEFCON 201 and Ethics in Tech, again, for being super involved in streaming events, and of course at events like this one; at HOPE we have the CyPurr Collective and DEFCON 201 submitting talks and engaging with the community in that way. And then of course advocacy: we've had some really great wins and great efforts. I'll shout out the Surveillance Technology Oversight Project in New York for passing the POST Act, or the Public Oversight of Surveillance Technology Act, in New York City, an amazing win thanks to all of their efforts. So now the NYPD must set policies on surveillance and actually follow those policies. And then there are groups like PDX Privacy, which have been doing great work on banning face recognition in their city and passing things like CCOPS, or Community Control Over Police Surveillance and
militarization. So we have these groups across the country doing these amazing local things. I want to just quickly plug: if you are part of a group, and I think there are a lot of folks here that are, feel free to email me at rory@eff.org or organizing@eff.org, or if you want to learn more, there's EFF.org/fight. Yeah, happy to work with y'all.

All right, well, thank you, Rory. So Kurt, do you want to take some questions now?

Yes, absolutely. We've had some good questions come in, so thanks for keeping those questions coming. So one of the questions, do you want to read it off?

I'll take the last question. Yeah, so we have a few questions already in the chat. I just want to remind people, if they're not logged into Matrix, please log into Matrix chat, throw your questions in the session Q&A channel, and we will get them teed up for here. So one of the things that's come up a couple of times in the comments in the Q&A is summed up, I think, by this question: what more can we do, besides encouraging others to visit sites like the EFF's, to inform and educate others, from attorneys to activists to average technology users, to learn about their civil rights and liberties, and violations of such, and to advocate for themselves?

I can take that, yeah. Um, that's a really, really great question, and it's sort of the fundamental basis for all of the work that we do. So the first thing is, there's a lot of stuff out there that's really scary, and especially once you start looking under the hood of what is technologically possible, there's a lot that can really freak you out. The thing is, though, that doesn't mean you should just give up. There have always been a lot of threats in the real world, as well as now the digital world, but when we leave our home every day, we lock our front door. When you go somewhere in your car, you get out of your car and you lock the door. That doesn't mean that somebody can't break into your house. That doesn't mean that
somebody can't break into your car. But you still lock your door; you still lock your house. So the thing to focus on is the specific actions that an individual can take, both to make themselves safer as a person and also to help create a system with more safety, more transparency, and more openness to protect us all as a structure. So, you know, something that's really easy: you want a password manager. You want to make sure that each one of your passwords for all of your sites is unique and long and complicated. A lot of us at EFF use 1Password; we do not endorse any products whatsoever, I'm just telling you what we happen to use. You want to make sure that your passwords are unique and different. You want to make sure that you're using two-factor authentication on anything that has to do with your personal information, and that you're not giving your personal information to people who don't need it. Especially in the pandemic, it's really annoying: a lot of legit services that I want to buy from want me to give them my credit card number over the phone, and I just don't do that as a practice. So there are just a couple of things you can do that are not that difficult. You know, if you want to start getting more into OPSEC, you can start looking into other things like Tor and other tools to protect some of your browsing history, but you don't have to start there. Start with the password manager. And then for other things: going to the EFF website, signing up for the action alerts, working with the grassroots groups. You definitely want to be talking to your elected representatives. Technology is becoming a much, much bigger part of our global structure, our country's structure, and they need to hear from their constituents. One of the things that I run into a lot on Capitol Hill is that a lot of Capitol Hill staffers don't fully understand technology in the way that y'all do, and that's okay. They understand the legislative process, but they
need to hear from people who do understand the technology, and they need to understand what the limits of the technology are. A lot of lawmakers, and a lot of people who don't know technology all that well, sort of think technology is magic. And if it's all magic to start with, why can't you sprinkle the fairy dust and make it do exactly the thing that you want it to do? Encryption is a great example. You can totally have end-to-end encryption where it's totally safe from all of the hackers and the bad people and whatever, but the DOJ has this secret key that only they have, that they will never ever misuse, that they can use to go in and get the messages from the bad people. Magic super fairy dust. That sounds like a great thing in theory; it's just that the math doesn't work. But if you don't know that it's based on math, if you don't understand how it actually works, then, if it's all magic, why can't you have the magic do the thing that you want it to do? So you want to make sure that your voice is being heard. Decisions are made by those people who show up, so you want to make sure that you show up. Go to town hall meetings, ask questions, talk to your elected officials. A lot of stuff is happening at the local civic level: city councils, school boards, all that. It can be really boring, but make sure that you just show up and say, hello, this is who I am, I live in your district, and this is what I think. That does a lot more than you think it does. I don't know, Alexis, was there something you wanted to add to that?

Yeah, um, in my security training experience, the one thing I just tell people, especially technologists, because I feel like we fall into this category a lot when we explain things to people: there's a difference between informing people and telling people what you know. A lot of us are very excited to talk about things we know and try to share that information, but if you're not coming down to the level where you're
informing someone on what they can and can't do, what the limitations are, and what tools they can use, and translating it to what their needs are in their context, then the information can get lost and be overwhelming. So I just wanted to add that piece.

Thank you both. So our next question: how do you think the relationship between masks and facial recognition will evolve? Who wants to grab that?

I can take that question. So I've actually been looking at masks and facial recognition. NIST, the National Institute of Standards and Technology, just published a study on masks and facial recognition. They are actually working to try to make it so that facial recognition can get past masks; they're working with the DHS and Customs and Border Protection. But nevertheless, it was a very interesting study, because it revealed in great detail how well algorithms were working with masks. Masks, unsurprisingly, do help protect you against facial recognition, but there are a lot of differences out there. So a couple of handy tips that came out of looking at that research: you want to have a mask that goes all the way up the top of your nose, to where your eyes are. The more of the nose you cover, the more difference it makes, up to 36 times more protective than the median algorithm for full coverage. Then also, full coverage on the sides helps; that's about two times better than the round masks, the sort of more typical N95 construction dust mask style. And black was better at protecting against facial recognition: black led to a lot of "no face" results, where the algorithm wasn't able to detect a face in the first place to measure against, or to compare against. However, this is seen by the government as a problem, that masks are making facial recognition more difficult. Facial recognition is something that they really want to do. In the DHS report that came out last year, they were
doing something on the order of 45 million facial recognition scans each year. They're frustrated that masks are getting in the way, and so the next study is going to look at new mask-enabled algorithms that will try to identify people despite their wearing a mask. I'm very curious to see how that goes, and whether the lesson you can draw now — that masks will help protect you from existing facial recognition algorithms — will continue to hold. The other interesting aspect of the research: I was talking about various things that would increase the error rate, but it was increasing the error rate in the no-match direction, which, if you're trying to protect your privacy, is the direction you want. It was not significantly increasing the error rate for false matches — that is, giving a false answer saying that you are somebody else. That's good, because one of the dangerous things that happens with facial recognition is that it matches you with somebody else's picture, and then all of a sudden they think you've done the crime and you get caught up in something. We've had people who were arrested because their face matched somebody else's photo, and police just went and arrested them. And unfortunately, a lot of these algorithms show racial bias: they are less effective and have more of these false matches for minority communities. That has a terrible effect on civil liberties, where you get a false accusation based on a flawed algorithm. So that's the science side of this: they're going to try to make it so the algorithms can figure out who you are even with a mask on. But there's also the politics, which is trying to get laws passed — so far mostly effective at the local level — that prohibit police use of facial recognition.
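The two error directions just described — false non-matches (the privacy-friendly "no match" outcome) and false matches (being mistaken for someone else) — are typically computed by thresholding similarity scores. Here's a minimal, illustrative sketch in Python; the scores and the threshold are entirely made up for illustration, not NIST data:

```python
# Illustrative only: toy similarity scores, not NIST data.

def error_rates(genuine_scores, impostor_scores, threshold):
    """FNMR: fraction of same-person pairs scored below the threshold.
    FMR: fraction of different-person pairs scored at or above it."""
    fnmr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    fmr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return fnmr, fmr

# Masks tend to push same-person scores down, raising FNMR (more
# "no match" errors) without raising FMR much.
genuine_unmasked = [0.91, 0.88, 0.95, 0.84, 0.90]
genuine_masked = [0.61, 0.55, 0.72, 0.48, 0.66]
impostor = [0.12, 0.30, 0.25, 0.18, 0.09]

print(error_rates(genuine_unmasked, impostor, 0.7))  # (0.0, 0.0)
print(error_rates(genuine_masked, impostor, 0.7))    # (0.8, 0.0)
```

The study's point maps onto the first number: masks raised the no-match rate, the direction that protects privacy, while the false-match rate stayed largely flat.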
I think, actually, Rory, some of that's been with EFA groups — do you want to say a few things about that? Yeah, the local EFA efforts have really been great, with recent success in Boston, and it looks like in Portland too. It's part of our About Face campaign: if you go to eff.org/aboutface, there are a lot of materials there, including sample language you can use to advocate for these face recognition bans locally — and, yeah, meet up with local EFA groups to work on that. All right, super. So, speaking of facial recognition, our next question is focused on New York: what do you think of the LinkNYC kiosks around NYC that have three cameras in each of them? Yeah, I will quickly speak on LinkNYC and then I'll pass it to Rory to add to that. What I think about them is that the initial rollout of LinkNYC was inherently flawed in a lot of ways. It was an initiative that allegedly was bringing Wi-Fi to the general public in New York City, and as someone who's from there and lived there for quite some time, it was very troubling to see the rollout of it. There was no real secure way of handling the Wi-Fi — there's a discussion around VPNs and such — but the cameras in particular: we're worried, well, I'm worried in particular, because of the relationship with the NYPD and the way they could use that footage and that data, since it's a public service handled by the city, in partnership with Google, so there's a corporate surveillance aspect to it. Those are my initial worries and concerns, especially since they rolled these kiosks out in all five boroughs — I believe, I'm not sure it's in Staten Island, I don't really include them half the time, but they're a borough, right? I've seen a massive rollout of these kiosks everywhere, and you see them normally by train stations, so I figure the footage and the extra surveillance is usually around transit centers in particular, seeing people walking certain patterns from day to
day, so you can link a lot of data that way. So that's my initial thought on those, and I'll pass it to Rory. Yeah, definitely. I also recently lived in New York, and LinkNYC definitely felt like a Trojan horse of sorts: "we're providing community Wi-Fi and modernizing the city," and then there are all these questions of surveillance and privacy violations. I want to really plug and defer to ReThink LinkNYC, another EFA member, who are calling for a few demands: that LinkNYC halt construction, remove the surveillance cameras and Bluetooth beacons, answer public questions — because there are just a lot of unknowns about how these kiosks are working — provide genuine community Wi-Fi, to make good on that initial promise, and institute some sort of oversight to make sure the network's not being abused. So definitely check out ReThink LinkNYC for more depth and policy on that. Okay, thank you. So, changing from one coast to another but staying on the streets: we have a questioner who, during the course of protesting in Seattle, was learning about the way activists are doing their research on cops via PACER. What is the state of liberating PACER? How much progress have we made? Yeah, I can speak to that. PACER, for those who might not know, is this government-run system — a document retention system that collects all the papers that are filed in federal courts. This is what lawyers and judges use to look through the dockets of federal cases and see what's been filed — and the general public, too. And all of these materials are publicly available, right? There's a First Amendment right to access documents that are publicly filed in courts. But because they're collected online in this database, the problem is that this database is fee-based. So even if you could physically walk up to a courthouse and ask to inspect documents — which of course right now you probably can't, and in general maybe you can't,
because these courthouses are all over the country — your option is to find them online, and then you're going to be charged for them, and the charges are often pretty exorbitant and add up quite quickly. This is a huge problem, because it limits public access to these public documents. But the good news is that there is a great project set up to address this, and it's called RECAP — which is PACER backwards. It is an online archive, and there's a Chrome extension, I believe, and I think there's a Firefox add-on. The way it works is that you download this extension, and then when you go to PACER to see a document and pay a fee for it, it will automatically upload a copy of that document to the RECAP archive, where it will then be available for free to anybody. So if you just download RECAP, you can go to their archive, find it online, and see what's available. It's not everything, but a lot of documents saved from PACER are available there. I would really recommend, first of all, definitely look for documents there before going to PACER, and then, if you do go to PACER, please download this extension, because it is a benefit to everyone and a huge boon to public transparency in the courts. Super. So, we had a shout-out from you all to DEF CON 201 earlier, and they said thank you very much for that, and Side Pocket sent in this question: What is your opinion about President Annoying Orange wanting to ban TikTok? We at DEF CON 201 feel that while TikTok is a nightmare — and we have a TikTok to inform people on TikTok about its issues — it seems obvious that TikTok is being banned not because of privacy invasion, but because the privacy invasion is for China and not for the US. Pound sign, delete Facebook.
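One more note on RECAP before the TikTok question: the archive is hosted on CourtListener, which also exposes a public REST search API, so you can check for a free copy of a document before paying PACER. This sketch only builds a query URL; the endpoint path and parameter names are my assumptions about that API, so verify them against CourtListener's own documentation before relying on them:

```python
from urllib.parse import urlencode

# Assumed CourtListener search endpoint (verify before relying on it).
RECAP_SEARCH = "https://www.courtlistener.com/api/rest/v3/search/"

def recap_query_url(case_name, court=None):
    """Build a search URL for RECAP documents matching a case name.
    'type=r' is assumed to select RECAP (PACER) results."""
    params = {"type": "r", "q": case_name}
    if court:
        params["court"] = court  # e.g. a court identifier such as "scotus"
    return RECAP_SEARCH + "?" + urlencode(params)

print(recap_query_url("Van Buren v. United States", court="scotus"))
```

Fetching that URL with any HTTP client would then tell you whether the filings are already in the free archive.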
All right, yeah, I can take that question. I've been looking a bit at the TikTok situation, and let me say, as a starting point: President Trump has a technique he often uses, which is to say something like "I will ban TikTok" without explaining what that means — what law might be invoked to do it, what authority he has or whether he has the authority, and what exactly it would mean. We have seen, for example, from the US actions against Huawei, that when they said they were moving against Huawei, what it meant was that no federal funds would be spent to buy their routers, and they were not going to be used in various government systems. That's a kind of ban the federal government could easily do: it could say that no federal employee shall have TikTok on their government phone — though I don't know if that would really make a big difference to TikTok's usage, or make them care. You could also say no federal funds shall be spent on TikTok, which I guess could affect some advertising: the federal government would no longer advertise on TikTok. But could there be a real ban? One form a ban might take is: no US person can use TikTok again, I hereby order it. That doesn't work — there's no authority that gives the president the power to do so, and if he nevertheless asserts that power, it raises serious constitutional issues. First Amendment issues are raised when a medium of expression is cut off by the government: all the people who are using TikTok for their expressive activities, most of which have nothing to do with national security — and protecting national security is the asserted reason for the ban. If you're a teenager making a cool dance video, that may not implicate national security, even if the Chinese do know about
that. I would also say that you can simultaneously think that TikTok has bad security and bad privacy practices, and that the government shouldn't have the power to ban it. Sometimes people try to conflate those two things, but remember: if the president had the power to say, by fiat, "I hereby ban this app," they could do it to Signal; they could do it to a host of encrypted messaging apps that people rely upon for secure communication. They can't do that. There are other things a "ban" might mean. I suspect the thing that's actually happening now involves this entity called CFIUS, the Committee on Foreign Investment in the United States, which reviews acquisitions of US companies for national security implications. ByteDance, the Chinese parent of TikTok, purchased Musical.ly and merged it in to create the TikTok you know and love today — or hate, for that matter — and CFIUS can say: you need to unwind that investment. That might be something that could happen. It's not exactly a ban. But the weird thing was that yesterday, after reports came out indicating that ByteDance was perhaps going to be asked to divest TikTok — which probably meant just the Musical.ly aspect of TikTok — the president doubled down by saying, no, no, I'm banning it. And this was in response to reports that Microsoft was interested in purchasing TikTok. So yet again, he's saying something, but it really should be the obligation of the government to explain what authority it has to do that — what constitutional or statutory authority. That doesn't happen; he just says things. So we'll see whether an idle comment made to reporters on Air Force One ends up having meaning. But if it were attempted as one of these broader bans, it would have severe constitutional issues.
Absolutely. Our next question: have you thought about organizing a federal citizens' grand jury, as described by Justice Scalia? Yeah, let me answer that one: no. Citizen grand juries are a performative thing that some people have done — people have done this to "indict" Mueller, if they didn't like what he was doing with the investigation of Trump; they've used it about 9/11. It's a method of activism: you call it a citizens' grand jury and say you think these people should be indicted. It doesn't actually do anything; for anyone to actually be prosecuted, they need to be indicted in the ordinary course of things. And that's not really our style of legal work. One, we are not prosecutors — we're on the defense side. We are the defenders of the internet, and we're trying to protect people, so we're not going to get into the prosecuting end of things. And second, we'd rather work through the legal system — file lawsuits in the courts and try to create precedents; that's our legal side — and then we have our activism side. All right, super. Next up: the CFAA. Can you discuss what you would consider to be the likely best-case and worst-case outcomes that might stem from the CFAA decision, as well as the aspects you think the court will consider most critical in making its decision? Sure. The Van Buren case is actually fairly narrow in the specific issue it presents to the Supreme Court, and it might be helpful to briefly describe the actual facts of the case. The case is about a police officer in Georgia who had access to a law enforcement database as a normal part of his job, and abused that access by using the database to look up a woman whom a friend of his told him he was interested in. It turns out the friend was an FBI mole. So
the federal appeals court in his jurisdiction, the one that covers Georgia, said that even though the cop was allowed to access that database for his job, he violated the CFAA because he accessed it for an improper purpose. That decision was in direct conflict with a decision from a few years back out of New York, with eerily identical facts, that also involved a police officer who had access to a law enforcement database for his job and used it to look up women in a way that wasn't allowed. The facts diverge a little in ways I won't get into, but you can look up the "Cannibal Cop" case to learn more. The appeals court that covers New York said that's not a CFAA violation: even though he used the database in a way that was improper, it was a database he had access to — he wasn't breaking into it. So: two rules directly in conflict, and that's really the question before the Supreme Court. And of course it's not only about police officers. We're hoping there will be a broader decision that addresses terms-of-use violations generally: if I have access to my work computer, or if you have access to Facebook, and we violate the terms of use, is that going to be a crime under the CFAA? That's the answer we're hoping to get from the Supreme Court. In a best-case scenario, the Supreme Court would, first of all, agree with the majority of appeals courts that terms-of-use violations are not a crime, and we'd love to see more sweeping language about the importance of computer security research and how it is impacted by the CFAA, as well as about some other problems with this aspect of the law — like the fact that making violations of companies' use restrictions into a criminal offense really delegates the
power to write criminal law to private companies. That's something that should happen through Congress, through your elected officials, where citizens have a voice in saying what we want the law to actually be. There are a lot of other problems with the CFAA, and unanswered questions about how it applies, that are very unlikely to be addressed by this case. That includes things like overcriminalization — the very, very high sentences the CFAA imposes. Say one computer security researcher does one study in the public interest that involves violating use restrictions: under the CFAA as it stands, they can end up facing possibly decades in federal prison for doing that. So sentencing reform is something we really want to see, but it's not actually before the court in this case. And then there are just a lot of questions about how the law applies — for example, if you share your password with someone and then they access an account, is that authorized or unauthorized under the CFAA? That's probably going to take another Supreme Court case to answer. Okay, thank you, Naomi. So, the next question: a lot of us phreaks are doing party lines and VoIP calls and discussing opsec against lawless evil actors. When it comes to call recording, if one caller is in a one-party consent state — the questioner says Philadelphia, but I assume they mean Pennsylvania — and another is in a two-party consent state, for example New York, which law applies? I can speak to this. Like every question posed to a lawyer, the answer is: it depends. It's a little complicated, and the law on that question is actually going to vary from jurisdiction to jurisdiction. As a general matter, I would say the best thing to do is comply with the most privacy-protective law. But the Reporters Committee for Freedom of the Press also publishes a guide called "Can We Tape?" — a 50-state guide that
actually does give state-by-state details, so it could be a helpful resource if you want to figure out exactly what the rules are in your specific jurisdiction. The "Can We Tape?" guide also has a section that discusses the issues around interstate phone calls in more detail, but it reaches the same conclusion: use the more restrictive, more private rule. Thanks. So, the next question is on HIPAA: any thoughts on HIPAA and privacy in our new paradigm of online medical care? The questioner feels this is increasingly problematic, especially for those who have no alternative for their medical care. I can only speak to that a little bit. HIPAA is a great law because it's a good example of what a federal privacy standard can look like when it's a floor and not a ceiling. One of the cool things — coming from a consumer data privacy standpoint, which is entirely different from medical privacy — is that the federal government has set a floor: there are certain things that are so sensitive and so unique to an individual that there have to be all kinds of explicit permissions from the individual for that type of information to be shared beyond the initial point of contact. So when you go to your doctor and they do a scan of your insides, they are only allowed to share that information with very specific people for very specific purposes. And each state actually has different ways that HIPAA has been interpreted, or different protections on top of the federal standard. It's an incredibly complicated piece of legislation. There's another person at EFF who has spent quite a lot of time looking at the intersection of HIPAA and some of the consumer data privacy proposals we've been working on, and what I know about it, I know from being in the same meetings with him. It's a really, really complicated area. HIPAA is not completely ironclad —
it's not like only you and your doctor get to see it. It's more like you, your doctor, and your insurance company get to see it, but they can't advertise based on the information collected in the course of a legitimate proceeding. There are a lot of protections built into HIPAA, which is why you can't just text your doctor — you have to go into a secure portal to get encrypted messaging — and why you can't just email them — you have to go to a special website with extra layers of encryption. There are a lot of hurdles you have to jump through, and that's all because of HIPAA trying to keep you and your information private. So it's an ongoing landscape. I'm sorry I don't have more on that; it's an incredibly complicated landscape and I don't want to get out over my skis. All right, thank you. So next up we have a short question about how to engage: what is the best way to engage with elected officials who are far from us on the political spectrum? Yeah, I can speak to that. That's definitely a question I hear a lot, since I'm working with organizers in all different parts of the country, on all ends of the political spectrum. But luckily for us, the EFF mission and the EFA principles are extremely popular, and there are ways to frame issues differently that will often be more appealing to folks. People think about these issues in terms of the narratives they hear. Something like face recognition, maybe for this audience, is very clearly privacy-invasive and limits our autonomy; some folks see it as helping the police — a good thing, because it helps them catch criminals. So I think it's about reframing that. We would say that's not the case — but it's also definitely a case of big, invasive government getting involved in your life and altering your personal life. By engaging on the individual level,
you can definitely reframe the discussion in a way that hits on these values, which are ultimately very popular. And in terms of elected officials, the nice thing about organizing is that if you can't change their mind, you can work against them, block their efforts, and maybe get someone else elected. So, yeah, I definitely encourage organizing and building support across the political spectrum on the ground, and then hopefully you can press the elected officials into doing the right thing. Yeah — so, as the resident lobbyist, I lobby everybody on the political spectrum. In the House of Representatives there are 435 votes, and if you're going to win on a particular legislative measure, you need a majority; in the Senate there are 100 votes. Especially on our issues, there's an area where libertarians and civil liberties advocates come together — a Venn diagram of those people — and, as somebody was mentioning in the chat, and this is totally correct: focus on the thing you agree on. There are definitely members — look, I am not a single-issue voter. As much as I work at EFF, there are a lot of other things I care about in the world, and so I care about your positions on those too. But this is my day job, so I'm going to talk to you only about the things in EFF's portfolio, in the areas I want you to focus on right now. If we're talking about FISA and Patriot Act stuff — there was a lot coming up earlier this year having to do with which programs we're going to reauthorize that allow the government, the NSA, the intelligence community to secretly listen to more messages, and there's a lot of distrust around that particular process — some of the lawmakers I work with who agree with the EFF position that we need a lot of reform, I don't agree with them on literally anything else. And some of the things I overhear when I'm sitting in their front office areas, while I'm
waiting to meet with the staffer to talk about the Patriot Act — it's like, I don't want to hear it, I don't want to have anything to do with that, because this is what we're focusing on. They're on the committee that is going to vote on this legislation; if they're going to support our position, if they're going to support the legislation, that's what I'm here to talk about right now. We can talk about all the other stuff later, in my personal capacity, but right now this is what we're focusing on, because it's important. So you pick the things you want to focus on and then just drill down on that. Okay, so let's focus on maybe another legislative question. The questioner asks: do you think there are any legitimate government uses for this technology — they're talking about facial recognition and biometrics — and the question goes on to ask whether the only path forward is a full ban on facial recognition. That's a really great question. We are currently advocating at the federal level for a full ban, a full moratorium, on facial recognition right now, because the short answer to the rest of your question is: it depends. What we know right now is that there are so many abuses, and potential abuses, in the way this technology is being used that we need a full moratorium, a full ban, on it immediately — so that we can figure out whether there actually are any circumstances where it would be appropriate, and what the correct safeguards, transparency requirements, and oversight capabilities are that would need to be put in place later. You can't do that while the technology is still on the street, still being used all over the place, while people are worrying about which face mask covers your nose all the way to the top so you don't get tripped up by the thing. It's important to put a ban on it now, because that also supports a lot of the great work that the EFA groups have been doing, and state and
local governments. I mean, San Francisco has a ban on law enforcement use of facial recognition, and other cities do too. We really want to make sure we're supporting those local efforts, and we think the best way to do that right now is a ban on facial recognition. There's actually a bill in Congress right now, on both the House side and the Senate side — from Senators Merkley and Markey, and Representatives Jayapal and Pressley on the House side — that would put this type of ban in place. We have an action alert on the EFF website if you want to contact your representatives and tell them to support this legislation. It is a ban, right now — and again, this is facial recognition; a ban on facial recognition is something we've been advocating for a very long time. Once the protests started and people started realizing exactly what law enforcement can do with this type of power if it's not curtailed or looked at critically, we started getting a lot of phone calls back from legislators. So we'd really like to see this legislation move forward, and anything y'all can do to contact your elected officials would be great. Super. And of course, we support EFF at HOPE, so just another plug for the donations to EFF as part of HOPE — let's make sure we hit our goals there. I also want to mention that we still have the Q&A chat open in Matrix. Please post your questions there; this panel will be continuing for the rest of the hour, up to 1:50 Eastern, so please do throw questions into the Matrix chat for the EFF panel. The next question we have: are there any plans for the EFF to set up a presence on the fediverse? So, I'm not aware of any plans to do that. The fediverse is a bit interesting because it federates a number of different networks, and that may be handy. EFF has a number of online presences, and it requires some resources to maintain an additional presence, so we don't just jump
into the next latest thing. Sometimes we've put a bunch of effort into something and then it became less widely used, so we're a bit cautious about which ones to put that kind of effort into. In some cases we maintain a presence and it seems to be worthwhile. So I'm not aware of any plans to jump into the fediverse, but it's something we may consider. Okay — Rory, you're on the activism team, are you aware of anything on the slate? Not at the moment. I think it's a cool idea, but, yeah, like Kurt said, we have to be strategic about how we use our resources, both in terms of hardware and in terms of hours — our wonderful tech ops team has been working around the clock getting us all working from home. So, yeah, I think it'd be a cool idea, but no current plans. Yeah — but it is very interesting, because the fediverse tries to solve a little bit of that by having a number of different protocols federated into one place. But we haven't gone there yet. Okay, thank you, Kurt and Rory. So I think the next one is also asking a little about how you're working and what you're interested in doing differently: has EFF thought at all about forming any state EFFs to focus on state legislation? So, going back through the mists of time — we're now at the 30th anniversary of EFF — in the first decade, in the '90s, EFF did have a chapter model. There were a couple of state chapters, and there were also some international chapters. That turned out to be sort of unmanageable, and we abandoned that model, though those that had been chapters at the time were allowed to continue — we didn't want to cut them off — and a few still survive to this day: Electronic Frontiers Georgia and Electronic Frontiers Austin in the United States, and Electronic Frontiers Finland in the EU. And I don't think we have done much
recently with them, but Electronic Frontiers Italy has also done some things in the relatively recent past. And here's the thing about state legislation — Rory can add a little bit to this. Electronic Frontiers Austin and Electronic Frontiers Georgia are part of our Electronic Frontier Alliance, and we have worked with them on state legislation issues. There was one fight where Electronic Frontiers Georgia was really useful, a powerful voice advocating against some bad security legislation being proposed in Georgia, and it made a big difference that it wasn't some San Franciscans coming to tell Georgians what to do — it was people who lived in Georgia, CS people from Georgia Tech who were in the community there, going to their representatives and talking about it. Both of those chapters have also had a strong presence at big conferences in their areas: EF-Austin with the South by Southwest conference, where we've done some joint things with them, and EF-Georgia with the Dragon Con conference, where they run the Electronic Frontiers track and we've gone to speak and talk a lot about EFF issues. So it's great working with these organizations, but our model for doing that today is the Electronic Frontier Alliance. Rory, do you want to add anything? Yeah, I just want to urge you to be the EFF you want to see in the world. If you're interested in starting an EFF chapter, there's no reason you can't start an organization and join the EFA, work on state-level issues, and, as Kurt mentioned, we'll be happy to help you every step of the way. Okay, this one I'm assuming is a follow-up to the prior one on phone calls and one-party versus two-party states: does a normal phone call have more legal protections against wiretapping — normal versus — yeah, I'm wondering about that. I don't know how you interpret that. I interpret it as
maybe the party line versus the VoIP call, but if it's not meaningful we can skip it. Well, I'll say a few things about this. Yes, I'm a little unsure what "normal" means — it could be a plain old telephone system call, I guess; you might call that a normal one. There was some difference for a while under CALEA, the Communications Assistance for Law Enforcement Act, where the internet was treated differently. They changed the regulation a little — well, significantly, I should say — to add voice over IP: for a long time CALEA was hands-off the internet completely, then voice over IP was brought under CALEA, while other forms of internet communication might be treated separately from that. What that means is the question of whether or not the provider has various obligations to assist law enforcement — so maybe it's treated differently in that way. But if you go up to the Electronic Communications Privacy Act, that covers all sorts of communications; the Wiretap Act is about communications that go over a wire, so it would cover whether it's a normal phone call or an abnormal phone call — they'd have all the same protective armor. And then finally you go up to things like your constitutional rights, and those are not technology-dependent. So that's a little more color on that. Yeah, and actually, we did just try to clarify what the question meant — there was a bit of confusion there — but it's about mental health. The question originally was: a normal call versus a call discussing mental health issues. I don't know if you want to add anything in that context. I mean, I think all communications deserve protections and privacy — medical ones, I think a lot of people can see why, but all of them do. And that question is related to HIPAA:
What I was talking about was protections vis-à-vis government law enforcement listening in on your conversations, and HIPAA is not about protecting you from law enforcement.

Okay, so the next question we have up is actually a follow-up on the facial recognition conversation. We've been talking about the government uses, but should we also work to ban facial recognition technology in private spaces, like corporate-managed environments?

Yeah, so that one's a lot trickier. It's easier to focus at the federal level because we have a lovely thing called the Fourth Amendment, and that is a great backstop to all of the things we're trying to do to protect biometric information, and your faceprint is a pretty out-there, easy-to-understand version of that. Once you start getting into the corporate space, the relationship between employers and employees, that is an entirely different landscape of workers' rights and the contract between the employer and the employee. There are a lot of things about various requirements that are concerning, and again it really depends on the specifics of the individual situation. There are a lot of people working on workers' rights specifically related to privacy and security and biometric safety. I'm not fully versed in all of that, but I know that it's very, very different. There could also be other legitimate uses, where it's not connected to the internet and it's not part of a database. The way your phone, if you've got an iPhone, recognizes either your face or your fingerprint is technically facial recognition, but it's not set up in such a way that it's ever going to be used or accessed by anybody at Apple, by anybody; it's not in iCloud, and it can never be stolen. There are a lot of different ways this can work, so there are ways where it would probably be okay and ways where it's deeply, deeply concerning. So it really depends on the specifics of what you're talking about.

All right, super. The next question is also coming back to the medical side of things: why was there such a focus on protecting medical information very early on, while every other aspect of our life was fair game for surveillance?

Yeah, I can talk briefly about this and then maybe turn it over to my colleagues. The first thing I would say is that one of the biggest hurdles we always face in getting people to care about privacy and surveillance is getting people to think that it impacts them. There is a narrative some people have that they have nothing to hide, so why not have mass surveillance to improve safety? Obviously, to us, the idea that anyone has nothing to hide is just not true. So a lot of our advocacy and our litigation strategy around surveillance is working to reframe this for people: reframing it so people aren't asking whether they have a criminal enterprise to hide, but whether they value their own privacy. Do people have things they don't want broadcast to every member of their family or their community or government or workplace? One of the examples that always has a really big impact on people is medical privacy, and this is an example we always bring up in litigation. We always talk about how, just to give an example, if you have location surveillance, say automated license plate readers that track your car, then okay, maybe people think, I don't care if people know I'm going somewhere in my car. But when you point out that that means your car is being tracked to the oncologist, or to the Planned Parenthood, or to the AA meetings, or whatever medical appointments you might have, people do care about that. So I think this is just an area where people inherently really value their
own privacy.

Yeah, and to add to that: normally in my security trainings, what I try to tell people is, okay, you have nothing to hide, but what do you have to lose? Those are the things I usually discuss with people. The exercise I always start with at these workshops is: if you lost your phone right now, what would happen? What do the logistics look like behind regaining your accounts? What would happen if you didn't have a PIN code on your phone and someone could just unlock it? What happens, and what could people look at? Are you comfortable with that? Do you feel okay about the state of your affairs, personally, if you just lost your phone, say you left it on a bus or at a bar or on a train or anything of that nature? Walking people through the exercise, you can kind of see the freak-out that happens: man, if I lost my phone right now, that would be devastating to my day, and possibly my week. So that is what I usually go for when people go the route of "I don't have anything to hide."

I want to add one other thing, which is a maybe weird fact about your privacy. Many people wonder, what is the most protected information about you? It turns out the answer is what videos you watch. The Video Privacy Protection Act is the strongest privacy protection that the federal government has enacted, protecting against anyone finding out what videos you've been watching. This came from court confirmation hearings, where they were looking at confirming, I think it was Bork, and some enterprising reporters found his video rental records. As it turned out, they were not actually particularly shocking video rental records, but Congress simultaneously realized that what they had rented at the video store could be found out, and very quickly enacted an extremely strong privacy protection for what videos you watch. It talks about videotapes, but it defines them in such a way that it is still a relevant statute online. So there it is: even more than your medical privacy, even more than your financial privacy or your Social Security number, it's what videos you watch that Congress has protected the most.

Yeah, I mean, the thing with medical privacy is that it's very easy for people who are not in this community to understand why it's important. Congress passed, and George W. Bush signed into law, the Genetic Information Nondiscrimination Act, because that was when gene sequencing first became a thing in the early 2000s. If you got tested and found that you had the BRCA gene, the breast cancer gene, Congress really wanted to make sure that you couldn't be discriminated against, either in hiring or in insurance prices or all sorts of other things. So it was really, really easy for people to understand the connection between this personal, private thing and the potential negative consequences of it. Again, in our world we understand what all of this privacy means; it's sort of second nature. So at EFF, when the whole Cambridge Analytica scandal broke, we were all a little confused, because we knew this kind of thing had been a problem for a while; we understood why it was an issue. But all of a sudden there was this special secret sauce where everything happened at the same time, and the rest of the world understood exactly the type of information that all of these data brokers have about all of us, what that means for us, and what they can do with that information. It became terrifying to a lot of people who had never thought about it before. So: you know that I like Doritos, you know that I like this type of Dorito, and that means I'm more likely to vote for this type of political candidate, so you're going to advertise and send me this type of message. That's terrifying. And that's when we started talking about a bunch of consumer data privacy
laws in the United States which is great there we're still a kind of a long way from getting a solid piece of legislation in that to protect regular data the way that we protect medical data but we're in a good place in the conversation and we're continuing with that so you know keep up the good work folks all right super so the next question we have is as a society we put a lot of trust into the idea that the staff at the private companies holding our data are not abusing their access for example we trust that Gmail does not read emails anti-competitively to outbid competitors when hiring candidates this trust is sometimes attacked as we saw in the recent twitter hack that made use of insiders and of internal tools what legal safeguards are in place and are there efforts to strengthen these legal safeguards my short question for you all right well I mean it raises a very important point I'd say like the insider threat and this was very strongly illustrated by the recent twitter hack where they got onto the account management tool but we've also seen you know years before there have been other instances in which the means of attack to get onto a system was either an issue with the an insider going rogue or an insider innocently being compromised but with a someone escalated to that insider's privileges and was able to to do things and it absolutely makes sense to for companies who have this kind of position to look at things to have all sorts of protections against that kind of internal access so they may need that access but they maybe make sense to have a signal for certain things two people have to approve the the access if it is like in twitter's case if you know someone is going to change the email address of a blue check account with a million followers maybe multiple people need to sign off that that is a real request do these kinds of things but for for the large part as far as legal protections that are involved in this it has a lot to do with unfair trade 
practices that is to say if they have made promises to you about security and didn't live up to them then they can be held accountable for making those false promises however if you ran a website and said it's you know it's the yolo site and we have no security whatsoever then you wouldn't have to worry about the the false promises issue though you have a really terribly insecure site and actually twitter and is under a consent decree with the fdc about their security practices from a run-in that happened many years ago one aspect of it was that there was like a pseudo uh password and i believe it was uh chuck norris was the password for for sudo uh because you know they thought that was funny but it turned out that that was uh uh ended up being a big problem for them and led to a 20-year consent decree with the fdc anything else to add to that one duck duck i'm not hearing you i don't know if anybody oh my apologies uh the the wonders of trying to keep background noise down um so what is eff's position on electronic voting where actually seeing your vote counted is impossible yeah eff absolutely opposes electronic voting where that's happening without a paper record um and it's such a good question because this is such an important thing for everybody to be talking about and thinking about right now um you know look electronic don't think it'll surprise anyone in this audience to say that electronic machines are subject to breach and malfunction and that's not just a hypothetical right this happens all the time uh we see independent researchers who do pen testing into electronic voting systems are able to breach those systems to do it extremely quickly to delete people's votes or change people's votes um and these breaches are not even always detectable right you don't even know when they're happening which makes it so important to have paper records so that you can detect these kinds of breaches and correct for them and not only protect the integrity of people's 
votes and of the election results, but also protect the appearance of integrity in the election. Because almost of equal importance is for people to believe that the results are reliable, and having paper records is just absolutely critical for that. We want paper records, and we want regular risk-limiting audits to go through and make sure that everything is showing up appropriately. And of course these are really important things right now: everybody should be registered to vote, everybody go vote. This is incredibly important, now and always, but start now.

All right, our next question is, I think, about continuity of government and particular criteria in our U.S. federal system. I'm just going to read it out as it's written here: are there different majority definitions for the House of Representatives and the Senate? Such as, if the Senate has 100 seats and 10 senators have died of COVID, are 51 votes still required? Getting 51 out of 90 could be more difficult than getting 51 out of 100. If terrorists bomb Congress, like in Designated Survivor, and fewer than 30 senators survive, hopefully fewer, a majority would not be possible. Does anyone want to comment on that?

So that is a great, if slightly terrifying, procedure question. The way the rules are written, it is a majority, and here's how that works. There's a process after the election where the members get seated. States control a lot of election infrastructure, so the members get sent to Congress, and then Congress seats them. There's a process, usually on January 3rd unless it's a Sunday, where they all get sworn in and are officially seated in Congress, and that's what sets the total number of Congress on that day. At the beginning of this Congress, for example, there was an election dispute, in North Carolina I think it was, that took a couple of weeks to resolve, so that district wasn't seated for a couple of months. There weren't 435 members of Congress; there were 434 members of Congress. And so the majority is based on the number of members who are seated. If members die, there's a process to remove them; they're no longer seated because they're dead. Members can also take official leaves of absence. There have been times when members have taken maternity leave, since you can't vote in Washington if you're actively having a baby, or have taken official medical leave for a cancer diagnosis and treatment, things like that. So there is a difference between just missing votes and not being around, because then you still have to get 51 votes. When John McCain was nearing the end of his life, he was still an officially seated member of the Senate, and so he still counted toward the overall total. He could come in, famously, and thumbs-down the vote, protecting the Affordable Care Act as it then stood, while he was undergoing cancer treatment, because he was still a seated member of the Senate. Had he taken official leave from the Senate, the total number of seated senators would have been slightly less. So in the event of a Designated Survivor situation where you only have 30 members of Congress, it's defined as a majority, and two-thirds, not as a static number. We tend to conflate those two things a lot, but they're not the same, and the rules are written very clearly to be majority and two-thirds, depending on what you're talking about. So it's set up for that particular type of situation, but there is a process members have to go through to be seated, or to temporarily not be seated, and it does change the margins.

All right, well, thank you for handling that, India. Our next question is a slightly shorter one: we have a question about COVID and COVID tracking. What is EFF's position on the mobile apps and the COVID tracking on iOS and Android?

All right, well, so we talked about
this a bit at the very beginning of the show, but for those who have joined since then: in my introduction I was talking about some of the work we've been doing on contact tracing apps. In particular, we're focused more on proximity tracing; the location-based apps are too much of an infringement on personal privacy and are not necessary. We put out a series of principles that we would like to see: we want it to be voluntary and opt-in, there should be regular security audits, you should be able to turn it off and on at your whim, and there has to be an expiration date. As for iOS and Android, that's probably a reference to the Bluetooth framework that has been put out by Apple and Google. That is a protocol, so the apps will sit on top of that protocol. We haven't seen one that got very widespread here in the U.S.; there are a number of apps out there, but they haven't coalesced around one app that is widely used. That protocol will probably be the more likely one, because it doesn't really work to use Bluetooth to determine proximity unless Bluetooth is on in the background all the time, and you have to use this API in order to have Bluetooth on in the background all the time, so people will probably tend toward it. But to really judge whether an app meets these standards of security, privacy, and civil liberties protections, the complete picture is necessary: not just the protocol, which we know about, but a deeper dive on the app itself being put forth by the public health authorities.

Okay, next up, on data brokers: do any in particular stand out as a greater threat, and how do we defend ourselves from data brokers?

Sure. So, on data brokers: data brokers are a way that your information gets stored and sold to the highest bidder, and this makes it very easy for people to find out about you. Sometimes data brokers will actually package information to be sold to consumers, marketed as, say, background checks; more often, data brokers are selling it to entities like corporations or governments. The best way to protect yourself from a data broker is to not have your information go into the data broker in the first place. So you might check out things like Privacy Badger, the extension you can put on your browser to block tracking cookies. But the truth is, it's hard not to have your information ever get into a data broker. If you are perfect about this all your life, and then one day you click the wrong box and say, "oh, I agree," then that information is out there, and that broker sells it to another broker, and on to another broker. So it's very hard. You can go to a number of data broker websites and say, "I'm opting out." Some make that easy, some make it difficult, and some use dark patterns. It's a process you can go through, but you are blowing against the wind; it may help, but it's not going to get you all the way there. And then there's legislation: in Europe there is the GDPR, and in California there's the CCPA, so there are some statutes being contemplated or enacted that try to give people some of these rights to revoke their consent. If someone revokes their consent, then something that was previously done through consent has to stop. So data brokers, at least at the least shady end of the data broker spectrum, theoretically will respect your consent and, if they're required to under these laws, may remove your information upon request. Again, the GDPR also does that. It protects people who are in the European Economic Area, basically Europe, but it does not protect people in the United States. Nevertheless, many people use it from the United States, because companies will say, well, it's easier for us to apply it
generally. I don't know that data brokers are going to be that friendly about it.

I'd also like to add that one of the things we are really, really strongly advocating for in any federal consumer data privacy legislation is something called a private right of action. That means that if a data broker has violated any of the statutes that get passed, any of the ways your information is supposed to be protected, then you as an individual, or EFF on behalf of a class action of its members, and so on, is able to sue the data broker because they violated the law, as opposed to waiting for the attorney general or the DOJ to build a case. There are a lot more enforcement mechanisms just built into a private-right-of-action system. We see this in Illinois in particular: they have a Biometric Information Privacy Act, which we really like, that has a very robust private right of action. There's currently a lawsuit, with the ACLU as one of the lead litigants, against Facebook for violating the BIPA law; I believe EFF is one of the amici in the case. That's the type of thing you see with a private right of action: it allows individuals to step forward and say, you violated this, and this is not okay, you need to stop, and that can happen almost immediately. It's a really, really great way to make sure the law is doing what it's supposed to be doing to protect your information. Private right of action.

Okay, so our next question is about two-factor, and I think we all understand that corporations ask for phone numbers not just so they can send you a text message, but for other reasons. The questioner is asking about the trade-off there of convenience versus privacy. They agree that having 2FA is safer than not having 2FA, but at the same time feel like it's being used to force giving up more private contact information. What thoughts do you have on those trade-offs? I can answer that.
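The one-time tokens that come up in this exchange are typically TOTP codes (RFC 6238): the service and your authenticator app share a secret, and each code is an HMAC of the current 30-second time window, so no phone number or SMS is involved at all. A minimal illustrative sketch in Python; the base32 secret used here is the RFC 6238 test value, not a real credential:

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, t=None, digits=6, step=30):
    """Compute an RFC 6238 TOTP code from a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of `step`-second windows since the Unix epoch.
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): choose 4 bytes based on the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


# RFC 6238 test vector: at t=59 the 8-digit SHA-1 code is 94287082.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8))
```

Because both sides derive the code from the shared secret and the clock, a leaked password alone is not enough to log in, which is what blunts the credential-stuffing attacks described in the answer.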
With two-factor authentication, or multi-factor authentication, one of the biggest problems is that it hasn't been rolled out in a way that's uniform from account to account. Handing over your phone number for SMS has been considered the worst practice of 2FA so far, because of all the vulnerabilities with SMS: not just corporations themselves using the phone number, but also vulnerabilities like Signaling System 7, which underpins the SMS infrastructure. So SMS is generally not what I tell people to go for first. And I would like more accounts out there, more services that offer 2FA, to have multiple ways of recovering your account, because that addresses the single-point-of-failure aspect: if something happened to your phone and all your 2FA tokens were on your phone, then you're kind of left out in the wind, and it's very difficult to regain access to your accounts unless you set up some other way, or a secondary email, to recover.

I'll go ahead and say that I don't think this is an official EFF standpoint, but this is how I normally approach 2FA and multi-factor. If an account offers different ways of factoring things in, I use things like YubiKeys, which are on my person. Yes, you can lose these; that's the trade-off. I could possibly lose a YubiKey; my toddler can run off and hide it, like she did a couple of weeks ago, and I had to search for it in the couch. Or you can store your recovery codes in another place, possibly on your desktop somewhere, encrypted. You can also use different tokens, like one-time tokens on your phone, so that if you do lose your phone, you at least have your YubiKey, or I can at least get to my recovery codes stored somewhere else. Does that expand the attack surface, having multiple things in multiple places? Yes, but if you store them in a way that makes sense for you and your threat model, that's what I usually suggest.

2FA has been an issue because a lot of my friends outside my technical network are really annoyed by it, especially my academic friends. They really hate that universities use things like Duo and force people to use 2FA just to access their email and so on. So it's not something that feels convenient at the moment, because there hasn't been a rollout that's uniform from account to account, and there hasn't been much discussion informing users about why this is good or bad. Right now I'm seeing something called credential stuffing, as some of you may have already heard, which has been one of the most useful ways of accessing other accounts and escalating privilege with other people's accounts: if you simply have the plaintext password, accessed from some database somewhere that got leaked onto some Pastebin somewhere, and you've got their email, you can try it against other accounts. 2FA helps protect against credential stuffing in that way, and that's why I am really, really behind 2FA. But I do understand it can feel inconvenient, especially if you don't store your factors in a way that makes sense for you, and especially if you're new to it and you just use SMS or one-time tokens on your phone and then lose your phone; it can be a huge hassle to get your account back. So that's the way I usually teach 2FA and MFA: parse it out in the way that makes sense for you. Maybe not all of your accounts at once; maybe your most important accounts first. I try to avoid a rollout where somebody feels like it's just being forced on them, and they're not able to access anything because they lost their phone, and they're scared of those two-factor tokens. That's not the general position behind 2FA, but that's where it's kind of led us, so we have to reckon with that as a security community.

Okay, thank you, Alexis. So our next question: the Internet Archive is being sued by book publishers. Does the EFF have a position on that case? Will you be helping them at all?

Absolutely, we are helping them. We are representing them, along with our good friends at a law firm called Durie Tangri. Just as background here, what's happening in this case is that the Internet Archive is a nonprofit organization that functions as a digital library: it provides electronic access to all kinds of information. And it has a book lending program, just like almost all libraries right now, that allows people to check out digital copies of books. The program, just like at almost all libraries, lets people check out books for two weeks or less, and it only lets readers check out as many copies of a book as the Internet Archive, or its partner libraries, actually own. So if the Archive and its partner libraries only have three physical copies of a book, then only three patrons can read that book in its e-format at a time. This is how pretty much all libraries are dealing with ebooks right now, and we are very proud to be working with Durie Tangri to defend the Internet Archive, and really all libraries, from the publishers' copyright lawsuit.

Thank you, Naomi. We're coming closer to the end, so I'm going to ask you to keep the remaining responses brief; I think we can probably fit in two questions. The next question: what global alliances or privacy organizations does EFF work with? What is the biggest global privacy issue? Can a GDPR be created and work in the U.S. for government and corporate oversight by customers?

So, we work with a lot of organizations around the world. We are members of EDRi, for example; EDRi is European Digital Rights, which is an
umbrella group of lots of organizations throughout the European Union, and as well we work with Privacy International in the UK. We also have a network of organizations we work with throughout the rest of the world. I've done a lot of work in Latin America with local NGOs who are doing what we call a Who Has Your Back project, modeled on EFF's, where we examine the practices of various online service providers and how well they protect your data against government intrusions, direct government data requests. The local NGOs we work with customize that for their particular legal and political situation and do reports about things in their country. So we are absolutely working with lots of global organizations.

We are running out of time, so I'll keep this kind of short, just one thing on the GDPR: I think just taking the GDPR wholesale and bringing it to the United States is not a good idea. There are things about the GDPR where perhaps we've learned some lessons from how it was implemented that could be improved, and there are things which are specific to the European experience. Also, worldwide, a lot of countries have other data protection laws of their own, and not everybody is keen on having Europe be the decider of what the best data protection policies are. So it's an important thing, but I wouldn't say just take the GDPR.

Super. Well, thank you very much, Kurt, and thank you to all of our panelists from the EFF today. We did have one other question we were going to get to, but frankly I think the answer to it is a lot of what you've heard already: go look at the EFF website to stay up to date with news and case law updates, and by all means to donate. And please, if you haven't donated to EFF, please look at the hope.net website for our goals during this conference; we want to make sure we're keeping up with our expectations on the donations there. So again, thank you to the whole panel. We are going to be having a short break with our info beamer and with our bumps, hope.net/bumps.html, before we start the next conversation at the top of the hour. Thank you very much. Thanks, everybody. Thank you.