All right, greetings from San Francisco, and welcome to our spring members' event. I'm Aaron Jue, head of the membership team at the Electronic Frontier Foundation. And also on screen is my colleague, Christian. Hey, I'm Christian. I'm the member outreach assistant at EFF.

Well, we're really glad to have you all here. If you folks in the audience would like to say hello to everyone in the room right now, feel free to do that in the chat box. You can just type in your name and generally where you are in the world, if you'd like. So, Rose, you got it. Thank you. So again, thank you so much for coming. A few times a year, we invite current members like you to meet up with fellow Internet freedom supporters and the EFF staff. While we started doing a few in-person programs this year, these virtual ones have been really great for connecting with you all. We'll keep experimenting with timing, content, and everything else to keep you in the loop, and of course, suggestions are always welcome. We're really grateful for the members that make EFF's work possible, and we're really proud to fight for online privacy, free expression, and creativity rights for everybody.

Now, people sometimes think of EFF as a US-focused organization, but the web stretches across traditional boundaries like nothing else. In fact, one in ten EFF members lives outside of the US. We all understand that a robust and secure web is incredibly powerful and important. Now, the European Union has made some impressive strides in digital rights legislation in the last few years, but it has also made some troubling missteps. So today we've invited a couple of staff members who will tell us about the current and upcoming proposals in the EU that could change how millions of people use the Internet, and their potential impact on tech users everywhere.

So first up, we've got EFF's International Policy Director, Christoph Schmon. Prior to working at EFF, Christoph led the Consumer Rights Team at the European Consumer Organisation, BEUC, and was appointed to several expert groups of the EU Commission in Brussels. Christoph's expertise includes EU policymaking, international copyright law, and online intermediary liability. Today, he focuses on EFF's international policy strategy and helps to make sure that digital rights are respected and enforced around the world. So welcome, Christoph. Thank you so much, Aaron and Christian and everybody. We're really glad to have you.

So to start things off, I was hoping you could tell us a little bit: there's a lot of legislative activity happening in Europe. Why is EFF weighing in on this, and what are you hoping to accomplish?

Yeah, I mean, I'm quite sure that many of you might have noticed that there aren't many big tech companies in Europe, but there's a lot of appetite in Europe to regulate big tech. What we could see is that many national bills on platform regulation are popping up in Europe, and we see loads of activity at the level of the European Union. A very good example that every one of you knows is the General Data Protection Regulation, the GDPR, which sets out high standards for personal data processing. We saw that this bill, the GDPR, served as a blueprint for similar bills around the globe, and we see that other European Union bills could produce a similar Brussels effect, such as the unfortunate Copyright Directive, which contains the infamous upload filter provision.
So basically requiring platforms to scan all the content we upload, which has dramatic ramifications for our online experience. These examples showed us at EFF how important it is for civil rights defenders to make sure that the European Union and the national governments in Europe adopt the good kind of legislation rather than the bad kind. That's the main reason why we are weighing in more and more on Europe: we try to come up with constructive policy suggestions and try to ensure that red lines are not crossed. And we see that more and more ambitious European Union bills are being prepared to deal with internet legislation and regulation.

One big package we have been working on for the last three years is called the Digital Services Act package. It was a super ambitious package consisting of two separate bills: the Digital Services Act, which tried to find a new balance between the responsibilities of online platforms and user rights, and the Digital Markets Act, which was designed to make it much easier for companies and innovators to compete with mainly US tech companies. The creators of those bills stated that they would like to set a benchmark for legislation at the global level, and at EFF, of course, we welcomed the discussion about how to update perhaps outdated rules on online platforms and how to address the problem that many big tech platforms have built entire ecosystems around us. But we were quite alarmed as well, because, as was said before, things can easily go wrong if lawmakers aren't careful. That's the reason why we decided to work on these proposals from the start and to build up more and more capacity in Europe.

Now, you mentioned the Digital Services Act, and I hope you can tell folks here a little bit more about this bill. What's the current state of play there, and how are you expecting it to impact users?

Sure. The Digital Services Act, or DSA, so it's yet another acronym for you to remember, is extremely ambitious. It wants to regulate what terms of service should look like on platforms and how content moderation decisions should be made. It would like to tackle risks that occur on platforms. It wants to deal with safety online, with platforms' responsibility for illegal content, and many, many other aspects, such as law enforcement. So it's a big deal, and the big challenge is that it's a giant, monstrous bill, monstrous in its size anyway. It was overwhelming, and you can expect that thousands of lobbyists tried to jump on the opportunity to shape that bill to benefit their sector. It's a horizontal piece of law impacting all sorts of platforms and service providers, messaging apps and everything else you could think of. And for us, the big challenge was how to regulate all of that at scale without breaking things.

So what we did was work with EU lawmakers even before the bill had been presented and try to guide their work as well as we could. We came up with two basic suggestions. First, preserve the stuff that works online and that helps make the internet free; there are principles that contributed to a free internet, like the idea that liability for speech should rest with the speaker and not with platforms that merely host content. And second, make sure that freedom of speech is protected by rejecting any sort of scanning or upload filter ideas.
And on the other hand, we tried to be constructive and came up with a few suggestions for how to fix what is broken online. For many of us, it's very difficult to understand why our content on Twitter or Facebook is suddenly gone, and why our right to anonymity is not always respected. So our claim was: let's not get stuck in the dilemma of who can censor better, platforms or governments. We came up with this suggestion: let's try to think of how to give more power to the users. How can users be able to decide for themselves what to see online? How can they get better options to switch platforms without losing the contacts and friends who stay behind? For example, this could be done through some interoperability, which is something the Digital Markets Act ultimately addresses; I'm happy to talk about this one later.

So we made all those suggestions, and well, it was super tough at times. There was a lot of sympathy among members of the parliament for transforming platforms once again into the internet police, and we did a lot of campaigning to stop this from happening. And now, after three years of work, it doesn't look so bad. The European lawmakers, which is basically the European Parliament and the member state representatives in the Council, not quite the same as but similar to the House of Representatives and the Senate in the US, came up with a political agreement. And it looks like the Digital Services Act does not follow in the footsteps of the bad kind of legislation like the Copyright Directive. There's no language in there that supports upload filters, and platforms aren't held liable for any content that pops up on their services. What it does is focus a lot on processes rather than on which kind of speech you can see online. So whatever you may have heard about the DSA making it very hard for Elon Musk to comply if he actually buys Twitter in the end, it's not really true. What is true is that in the future, platforms that would like to operate in Europe, like Google, Meta, or Twitter, will need to explain why users see certain content. Amazon will need to explain why certain movies are recommended by Amazon Prime, and why content is removed. And if it turns out that content was removed wrongly, users will have an option to appeal against that and to have their content or their account reinstated. Those sorts of things are the good part of the DSA.

And then there is something that is not so good; not everything is sunshine and rainbows in there. There's a lot of ambiguity when it comes to a new requirement for platforms to carry out risk assessments and to mitigate those risks on their platforms. And we see this push towards co-regulation, the idea that platforms should work together with law enforcement a lot, which is perhaps a persistent issue in the European Union: there's a lot of trust in government agencies and the police to do the right thing. Agencies are trusted to flag problematic content, they can ask platforms to hand over user data, and all those things add up to enforcement overreach. We're concerned about how this part will turn out in practice.

Yeah, it's pretty interesting to think about, especially from an American perspective, how that contrasts, and in some cases doesn't contrast, with the way things are handled with our law enforcement.
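A quick aside to make the interoperability idea above a bit more concrete: one small ingredient is plain data portability, being able to take your contacts with you when you switch services. Below is a minimal sketch in Python, with invented contact data, that exports contacts as vCards (the standard address-book format); this is only an illustration of the concept, not anything prescribed by the DMA.

```python
# Toy illustration of data portability, one ingredient of interoperability:
# export contacts in the standard vCard format so a rival service can import
# them. The contact data below is invented for the example.
contacts = [
    {"name": "Alice Example", "email": "alice@example.org"},
    {"name": "Bob Example", "email": "bob@example.org"},
]

def to_vcard(contact: dict) -> str:
    """Render one contact as a minimal vCard. A full exporter would also
    emit the structured N field that vCard 3.0 requires."""
    return (
        "BEGIN:VCARD\r\n"
        "VERSION:3.0\r\n"
        f"FN:{contact['name']}\r\n"
        f"EMAIL:{contact['email']}\r\n"
        "END:VCARD\r\n"
    )

# Write a .vcf file that address-book apps on other platforms can read.
with open("contacts.vcf", "w", newline="") as f:
    for c in contacts:
        f.write(to_vcard(c))
```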
So you mentioned actually quite a few things, but I want to know: what's coming next for the DSA and DMA, and are there other bills we should be aware of that are in the pipeline?

Yeah, I mean, both bills, the Digital Services Act and the Digital Markets Act, have now been politically agreed, but they still need to be formally adopted by the lawmakers. And in the meantime, we are still giving a lot of input to negotiators to make sure that the recitals that explain the text are correct, because the negotiations were quite chaotic in the end. I think we can expect both bills to be adopted before summer, perhaps after summer, and they should kick in sometime next year. And what we are going to do at EFF is accompany the enforcement process of both legal acts and make sure that they're properly interpreted, in the most fundamental-rights-friendly way. At the same time, there are loads of other bills cooking in the European Union, and it's sometimes not easy to keep track. There's the EU Data Act, which settles who can access and share machine-generated data in the EU, and the EU AI Act, the Artificial Intelligence Act, which has produced a sort of risk-based framework for how to deal with AI and automated decision-making. But our main focus will be on yet another legal act: the EU CSAM scanning proposal, which is a very concerning proposal for human rights defenders. And it's the same problem again, that the European Union would like to place platforms first in line to check and police content. But I will hand over to my colleague, Joe, who knows everything about this bill.

Well, thank you so much, Christoph. So, yes, that brings us to our second guest. That's EFF Senior Policy Analyst Joe Mullin. He was recently appointed to our Mark Cuban Chair to Eliminate Stupid Patents, actually. Prior to EFF, Joe worked as a reporter covering legal affairs for Ars Technica and American Lawyer Media. He's also written for the Associated Press and the Seattle Times. Now with EFF, he's focusing on encryption, platform liability, copyright, free expression online, and of course, patents. Welcome, Joe.

Hey, thanks a lot. It's great to be here. And I want to talk for a few minutes about this encryption proposal that we're concerned about in the EU, but I actually want to take a big-picture look at it first. Why does EFF care about encryption? EFF always has, since its founding in 1990. Well, why is that? It's because we care about your right to have a private conversation. It's a basic human right, and our democracies actually won't work without private conversations. So as soon as EFF was founded, there was a realization among the founders that if we don't take that right to have a private conversation and make sure that right is protected and persists in the online world as well, we won't be able to have functional democracies in the age to come. So that's why we care, and that's why we've fought battles to defend encryption for decades now, literally.

So what is going on in the EU? The EU Commission, which is the executive body of the EU, sort of like the presidency in the US except it's a group of people instead of one, has a new proposal, and the proposal follows a troubling trend. It's not the first governmental body to come out with a proposal like this.
It's a proposal that would mandate that private companies start looking at user files, scanning them, comparing them to a law enforcement database, and then reporting users who violate the law to law enforcement. We saw this in the US with the EARN IT Act, which was meant to do the same thing, and we had to fight against that. And we also saw this late last year, when Apple was pressured by government bodies to install scanners on its own phones, and we mounted our "Apple, Don't Scan My Phone" campaign, which has been successful so far.

So the reason they're doing this is child abuse: online child abuse and images of children being abused and sexually abused. That is the reason they want to scan user files. This is not the first such reason. About five or six years ago, the DOJ in the US tried the same thing with terrorism. They said that because terrorism is such a serious crime, they really need some kind of system of what we call a backdoor. And we continue to call it that, because that's what it is. They need a backdoor to encryption so that they can check even files that are encrypted, even messages that are encrypted. They just need to do one little check to make sure there's no terrorism. So now they're using the same tactic with child abuse, which is a terrible crime. And we have no problem with law enforcement; I hope law enforcement is successful and does an excellent job. But the problem we're having is that law enforcement bodies in different countries, instead of having conferences about how to do excellent law enforcement work in a world with strong encryption, have conferences about how to break encryption. And they try to convince us to, if not be on their side, stand down and be neutral. We won't do it, and we haven't done it, because we're not going to lose, just like we didn't lose the battle against EARN IT and we didn't lose the battle over Apple phone scanning. I don't think we're going to lose this one either.

Anyhow, that's the big picture of what's happening in the EU. The EU Commission put out a long proposal, 135 pages, which is even more straightforward than the other anti-encryption proposals we've seen. It would mandate that companies, under certain circumstances, look at user files and user photos, and it would not just compare those scanned images to known child abuse images. In some situations, it would actually go further than that. It says you need to do things like detect "grooming," which is a vague phrase that means different things to different law enforcement bodies in different countries, but in this context generally means a type of communication that can lead to future child abuse. So what they're really talking about there is not just scanning user files, but possibly scanning plain text against an AI of their own creation. There is no way to look at this except as a mass surveillance scheme. And they've tried to use a lot of different, essentially new linguistic forms to try to get around the fact that they haven't been able to win a 30-year battle to invade everyone's privacy, because of EFF and groups like it. So they've tried things like client-side scanning, which is something that would break the promise of end-to-end encryption, even though it wouldn't, in some technical sense, break through the encryption. Because what end-to-end encryption really should mean is that there is no one between you and the intended recipient of your message who can read it.
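To make that definition concrete, here's a minimal sketch of end-to-end encryption using the PyNaCl library (the parties and the message are invented for the example). The point it illustrates: a server relaying the ciphertext holds no private key and cannot read the message, while client-side scanning would inspect the plaintext on the device before this step, which is why it breaks the promise without technically breaking the cryptography.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each endpoint generates a keypair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob with her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at six, bring the documents")

# A relay server in the middle sees only these opaque bytes. Without a
# private key it cannot decrypt them: that is the end-to-end promise.
# Client-side scanning would read the plaintext *before* this point.

# Bob decrypts with his private key and Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"meet at six, bring the documents"
```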
So that's what's happening in the EU. This is a pretty new proposal, and so far, I would say I'm actually cautiously optimistic, because it has landed with a lot of skepticism. We already have members of the European Parliament saying they're very concerned about this. That's where this is going next: to a committee procedure in the European Parliament, and that's going to continue for some months. That's a time when we can have a lot of influence. Just yesterday we had eight members of the European Parliament in the EFF office in San Francisco, where we met in person and were able to discuss a bunch of issues, including encryption.

And I think that's what's so great about EFF. It was great to see in the chat today, actually, how many supporters we have coming from abroad and from EU nations. So in addition to the fact that our supporters span multiple continents and so many different countries, EFF really has played a unique role in these battles, and I want to give you just a sense of what that looks like, if I can. EFF is really unique among the organizations that are fighting to defend strong encryption because of the room we have. We can get in the same room, a virtual room nowadays, people like Christoph, who's experienced at lobbying the EU and knows all the ins and outs of the EU Parliament. There's me: I've been working on campaigns and activism around tech policy for four years at EFF, and I was covering these issues as a journalist for more than a decade before that. And then I'm also in the room with our director of technology, Jon Callas, and Erica, who's a cryptographer. And then we also have lawyers who work on encryption, like my colleague Andrew. To have all those superpowers in the same room together is really something that our allied organizations, as great as they are, don't have. So that's what you get with EFF, and it's fantastic.

And I don't want to take too long because I want to get to Q&A, but I want to thank everyone here for your support in those incredibly important battles in the last two years, because the reason I'm able to come here and tell you that we haven't lost yet is you. In the first EARN IT battle in 2020, when they wanted to break encryption, we sent more than 200,000 messages to the Senate. And together with our allies, we had more than 60,000 people ultimately sign a petition telling Apple not to put scanners in their phones. And that's why those things haven't happened yet. I mean, we stopped it. Unfortunately, we lost a vote in the Senate committee, but it's really because of opposition and your speaking up that it didn't go any further than that. And it's also because of you speaking up that there aren't scanners in Apple's phones. The part of Apple's webpage where they described the scanners they were going to put in their phones is deleted. They just deleted it. So, you know, that's what you call a victory. And I just want to thank everyone here for that. We're going to be fighting another fight this year in the EU, but with what we have here and the support outside of EFF too, I'm cautiously optimistic so far. So I think I'm going to leave it there, because I want us to be able to have time for questions, unless there's something, Aaron or Christian, you want me to cover.

Real quickly, someone asked in the chat, and Chris posted a link to the actual proposal.
Is there a name for the particular proposal that you're talking about that we should be aware of or pay attention to?

A name for the current proposal? Yes, sure. I mean, we just call it the EU scanning proposal here. I'm going to post in the channel a link to the blog post we have up about it on eff.org, from two weeks ago, the day it became public, and that links to the text of the proposal, which is all in English. It's also available in other languages if you prefer another language. The EU is calling it the "Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse." We shorten it because that's a long name, but yeah, that is the official title. And so we link to a press release and the text of the proposal in the blog post I just put up. And that's the first of many, I think; we're going to keep blogging about this issue.

Thank you, Joe. Yeah, I was actually wondering about that myself. And it's almost kind of funny: a lot of times with US legislation, they'll use some kind of insane acronym that contradicts what the thing is actually doing. So I was sort of waiting for that.

Yeah, right. I mean, we were expecting the acronym. The EU does not seem to have the same kind of budget for the world's best acronym creators that the US Congress seems to have. It seems like you don't even get to square one in Congress unless you have a clever acronym. But the EU has not embraced that, which is to their credit, I think.

Well, I think we're ready for questions, Christian.

Hey, yeah, let me bring everyone up. Let's see. OK, cool. So we got a couple of questions in the chat, but feel free to keep posting questions there, or if you would like to speak your question, you can raise your virtual hand. I think you just go to reactions and press the raise-hand button, and we can give you a quick shout-out. So, let's see. First off, I guess we can start with encryption a little bit. I forgot to record the names of the questioners, so sorry about that, but someone asked: can the EU law really kill Signal, and end-to-end encryption in general?

Yes, it could. Yeah. The different laws we face off against have different kinds of methods, but I think the EU proposal would require companies, under some circumstances, after they get some kind of formal notice from a government body (that's the system it describes), to be able to access user messages; they would need to scan their body of data. And this applies not just to private messages, but also to cloud services. So that would be the point at which a government body in the EU could say, well, we served you this notice and you're out of compliance, and then you'd be at a penalty stage. I don't know, I haven't read that far into it. I'll ask Christoph to jump in, because this is getting into the process stuff and I want to make sure I'm getting it at least vaguely right. But this is a long process. I mean, when the EU Parliament passes something, it still doesn't become law right away; the member states then have to go on to work it into their own laws and procedures.
So there's a lot of room for this to change at various points in the process. But the proposal we have does describe essentially a system of putting companies on notice, requiring them to be able to look at user messages and scan them as part of that notice, if the notice is serious enough, and then, if they don't comply with the notice, taking punitive action against them. Christoph, I'm going to check in with you to make sure I described that right.

I think you described it perfectly. I mean, there are limits to what you can do. There is a human rights charter in the European Union, right? We have the Charter of Fundamental Rights, with a constitutional right to privacy, a constitutional right to private life. And our argument, after all, is that if you compel platforms to monitor and scan what users share online 24 hours a day, then this would be monitoring of user communication, an intrusive act against privacy, and unconstitutional, so we reject the proposal already on those grounds. The thing is that the Commission, the executive body, would say: no, it's not monitoring of user content; you figure it out, nerd harder, find a way to detect the content, perhaps using hashing or other methods. And in the impact assessment, a document that analyzes the effects of the proposed law, the European Commission referred to Apple's client-side scanning child safety proposal to say that this would be one way that would perhaps be acceptable: you don't monitor all types of content, but you check content against a database of hashes. The problem is just that the anti-encryption proposal is not quite clear on how platforms are supposed to detect content if they can't look at communications. But there are all sorts of legal arguments for why this is a terrible idea, technical arguments for why it's not going to work, and practical arguments for why it's not the right way to actually address CSAM content. We're trying to bring all those arguments into the debate, and we need to see how this will work out.

Yeah, so there are a lot of stages. I think the direct question was, will this ban Signal, right? I mean, there are a lot of different stages that happen before that, including, at the end, legal defense. If they're trying to disable or ban important services, that's a point where EFF and other people can still step in at the legal stage. That's why we have a litigation team, in the US at least. But we have to kind of fight the battle at multiple stages, because there are bad things that can happen way before you get to that stage. We actually want using strong encryption to be normalized. That's actually the goal, and when we lose ground on that, that's a loss in and of itself. We don't want the border guard at Country X to look at your phone and say: oh, you have Signal, we're going to send you to a special room; we're a little worried about people that have Signal. That's why we want encryption to be used by big services and normalized. Signal has millions of users, but it's still a lot smaller than things like iMessage and Google Messages. There always might be a possibility for us to get more freedom and more privacy through niche services, super small services, but that's not actually the goal; that's like our fallback position. What we want is privacy, security, and encryption to be normalized and easily available to everyone.

Yeah, exactly. Someone said in the chat: you want the guard to say, oh, I use that too.

Well, I guess kind of going along with that: you mentioned that the DSA is like a directive, so it tells EU countries that they must implement a law around this directive. Do we need to focus more on the details of what nations will implement rather than on the DSA specifically? There was a question asking something along those lines.

Interestingly, the Digital Services Act is a regulation, not a directive, and the same holds true for the Digital Markets Act. So the Act will apply directly in all European Union member states. That being said, many provisions, like procedures, will have to be implemented, or we need to breathe life into some requirements, so there's a kind of technical implementation. And it will take months for member states to figure out how to set up their authorities, for the Commission to step up its enforcement capabilities, and platforms will get some time to adapt their mechanisms. For all those aspects, we can help make sure that no one is doing a stupid thing. And there are still many open questions which can only be solved through enforcement. There will be cases before courts, and ultimately the Court of Justice of the European Union will decide whether something has been interpreted correctly or not. So there are many options for us to make a difference, and we're going to monitor and dive into the enforcement process of the DSA and the DMA.

Cool. So we have another question from, I think, Bastian Pure. They asked: you talked about requirements to scan content. What about identity? To what extent would platforms be required to identify users? Would anonymous accounts and content creation still be legal?

It's a good question. That was something that we successfully prevented from happening in the DSA. There was a lot of hope from some parliamentarians, many of them of French nationality, to use the DSA more as a monitoring tool and to monitor private user communication, those sorts of things. We fought back against those attempts, and working with some committees of the parliament, we were quite successful, even in pushing our idea that you should have a right to anonymity online, which would have been perfect. We didn't achieve this in the end, unfortunately, which would perhaps have been helpful against that new scanning and encryption proposal. So there's no right to anonymity in the European Union, but there's no prohibition either. And in some states, like Germany, courts have set a high bar for platforms that require you to provide your real name, because it's considered an unfair practice to do so. So we still have a chance to push the European Union to adopt such a right, which would be great. But now, looking at the chat control CSAM proposal, it doesn't look good indeed.

I could take a swing at that, yeah, about the chat control proposal. There are things in the EU CSAM scanning proposal that are of concern around anonymity and identifying users.
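On the "check content against a database of hashes" idea Christoph just described, here's a toy sketch of exact hash matching in Python, with invented values. Real deployments (PhotoDNA-style systems) use perceptual hashes rather than SHA-256, precisely because an exact hash changes completely when a single byte changes; fuzzier perceptual matching then brings the opposite problem of false matches.

```python
import hashlib

# Stand-in for a law-enforcement database of known-bad image hashes.
# The value here is invented; real lists hold hashes of verified material.
known_hashes = {hashlib.sha256(b"example-known-image-bytes").hexdigest()}

def matches_database(file_bytes: bytes) -> bool:
    """Flag a file only if its exact SHA-256 digest is on the list."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes

original = b"example-known-image-bytes"
altered = b"example-known-image-byteZ"  # a one-byte change

print(matches_database(original))  # True: exact copies are caught
print(matches_database(altered))   # False: any tiny alteration evades it
```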
So the thing they have in there is a section that talks about how providers of communication services that have identified risks on their services, which is something they might be told to do by the government, need to, and I'm quoting here, "take necessary age verification and age assessment measures to reliably identify child users on their services, enabling them to take the mitigation measures." So how does that look in the real world? The way that could look in the real world is: Country X says, we think there are child abusers on Signal, you guys have to install some kind of age verification system. That's in the law that we have, and that's not good. I mean, we've seen age verification proposals in the US too, and they're pretty concerning, because they come down to deanonymizing proposals. They ultimately will infringe the right to anonymous speech. There's just no really great way to do it. I mean, I'm a parent of a very young child who thankfully doesn't use a computer yet, but kids usually find a lot of ways to lie about their age, upwards, in order to get on the services they want, even ones that are age-restricted right now. So we think it's problematic, but there is language like that in the proposal.

And I would add to that, actually, there are already discussions among our allies about compromise positions that members of parliament might ask us for, even if they're not comfortable with this full proposal. They could ask for things like identifying any user who uses cloud storage. That's a thing that might get asked for, even though it's not in this current proposal. I think there's a real threat of someone saying, well, we're not going to scan every message, but if you're offering encrypted cloud services or cloud services, shouldn't you at least know who your customers are? Like a bank has to know who its customers are for a money transfer. And our answer is no, because there's just no way to do that without hurting people who need anonymous speech. And that's going to come down on human rights workers on the front lines of serious conflicts, journalists and their sources, oppressed minorities, LGBT people in countries where that's not legal and they can be punished just for being who they are. Those are always going to be the first victims when you move against anonymous speech.

Cool. Along these lines, we've been talking about a lot of EU proposals and such, and Owner asked: what are the US implications of these proposals? Presumably there would be a trickle-down effect, right?

Yeah, I mean, you can see it in how we're all clicking buttons now about accepting cookies, right? That's because European privacy rules impact American users. The web is inherently global, and yes, it will affect you. Global companies aren't going to make a special internet for the EU; it's not feasible. So insofar as the EU makes good regulations, or at least manageable regulations, that could be good. It's going to affect the whole world. And a point I didn't make in my earlier talk is that it definitely will affect US users.
And also, there's a really important point that we made during the EARN IT battle that we're going to make again, in a very pointed way, during this EU battle, which is that if you build it, they will come. The first users affected aren't going to be Americans, aren't going to be people in other democracies. They're going to be people in countries where they don't have the right to elect their own leaders, people who are already subject to regimes of censorship, regimes of control, regimes of surveillance. Those countries will say: the EU has a system for scanning every message and reporting back the bad guys. Well, we've got bad guys in our country, too, and we're going to need to make a list, and we need your help to do it. So yeah, it'll have big effects beyond the EU.

Cool. And then we have another question from Harold Bergman: Is protection against personal transparency the only way to go? Would it not be a second path to fight for government transparency? And if such government transparency had been established, would it be more acceptable to establish people's transparency? I'm trying to rephrase the question because I think it got mixed up in typing.

Yeah, I guess I'm having a hard time parsing the question, but I will just say that at EFF, we definitely fight for the private things to be private, but we also fight for the public things to be public. Your government needs to be open and transparent, and you need to be able to get those records when you need them. Sometimes it's complicated, but it's usually pretty straightforward, and we're definitely able to fight for the public things to be public and the private things to be private. I think it gets really complicated in areas that I don't work in, like interoperability, where we say, well, we want your social media posts to be able to go with you to another service. What about all the people who commented on your posts? What are their rights, et cetera? Those get into really complicated questions about different people's privacy rights. But government transparency has always been important to EFF, and we have a lot of people working on it, and litigation as well.

Cool. So we have another question: Activists were concerned about content filters under the EU Copyright Directive. What does EFF think of the recent ruling, and does it do enough to protect free speech?

No, it doesn't. Just to explain again what this is all about: we had the infamous Article 13, later Article 17, of the Copyright Directive, which basically compels platforms to use filters to monitor user content and check it against copyright-protected content. And there's litigation going on in Europe challenging the compatibility of that provision with the Charter of Fundamental Rights. That challenge was brought by Poland, the country Poland, against the EU lawmakers. And there has been a recent judgment on that; let me drop that in the chat. There you go. And the bad thing is that the court basically confirmed our understanding, which we had hoped, though without high hopes, it would not: that Article 17 indeed will lead to upload filters. But what the Court of Justice of the European Union did do was interpret exceptions and limitations in a way that is very privacy- and fundamental-rights-friendly.
The court basically held that an upload filter would need to accomplish both filtering out content that is copyright-protected and enabling uploads of content that is not copyright-protected. Since upload filters don't work very well in practice and are error-prone, that could only mean that platforms may only filter out evidently illegal content. So that was a good step. On the other hand, the court completely failed to explain the parameters that would help platforms decide when to block and when not to block content, and it completely sidestepped the question of whether upload filters can ever be reasonably implemented. We always say at EFF that they don't work: upload filters don't understand context, and lots of what we upload is context-dependent. There is no fair use provision in European Union law, but there is something similar, copyright exceptions, and the court completely sidestepped that issue. Which means that in the end there will be upload filters in the European Union, but they can't be used for all uploads and for all types of content: perhaps entire movies, but not mere snippets. Those are the sorts of dividing lines platforms need to respect, but no one knows what this really means in practice. So I think the ruling is a step in the right direction, but on the other hand, it cements the idea that upload filters can be established by law, by lawmakers. And I am personally fearful of what this means for other legislative files, you know, that spillover effect: for files dealing with hate speech, for the DSA. We have the same problem now with the chat control proposal, this idea that upload filters are a good idea. And we will see that spillover effect outside of the European Union too. So in this sense, it's not a good ruling.

Thank you. I think we probably have time for one more question, so we'll take this one from Augusto Herman, who asks: in cases where applications are required to scan users' own devices and report back suspicious cases, is that also some kind of DRM, in that it makes people's devices work against their owners' interests, including but not limited to hogging system resources, reporting false positives that may ruin a person's life until proven innocent, etc.? Could you guys expand on that a little bit?

Sure, I can take a shot at that. The answer is yes. We're talking about the installation of software that's not in your own interest and isn't working for you, making your device work not for you but for the government. So it's really bad news, and in that way, it is comparable to DRM, right? DRM is a form of copyright control created by copyright owners that's not in the user's interest. So yeah, those are all concerns. Those concerns were really on point while we were fighting over the Apple plans last year, and I think they still are. And the questioner also raised the point of false positives, which is a big concern to us with these new proposals that are based on child abuse, because they do have false positives. It's very difficult for us to get information about what kind of false positives there are, because just handling or viewing this type of material is illegal, so there aren't really good independent audits of it. But we are now starting to get different forms of transparency reports from tech companies, and we're looking at those. Some are better than others.
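To see why false positives loom so large here, a back-of-the-envelope sketch helps. Every number below is invented for illustration (none comes from the proposal or any vendor), but the base-rate effect is general: when the targeted material is rare, even a very accurate scanner produces overwhelmingly false alarms.

```python
# Hypothetical figures, chosen only to illustrate the base-rate effect.
daily_items = 10_000_000_000      # items scanned per day across a big service
prevalence = 1e-7                 # fraction of items actually illegal
true_positive_rate = 0.99         # scanner catches 99% of illegal items
false_positive_rate = 0.001       # scanner wrongly flags 0.1% of innocent items

true_flags = daily_items * prevalence * true_positive_rate
false_flags = daily_items * (1 - prevalence) * false_positive_rate

print(f"correct flags per day:  {true_flags:,.0f}")    # ~990
print(f"innocent flags per day: {false_flags:,.0f}")   # ~10,000,000
print(f"share of flags that are innocent: "
      f"{false_flags / (true_flags + false_flags):.2%}")  # ~99.99%
```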
But we're starting to get information about how well the most widely used scanning software works. There's some software that scans specifically for child abuse images, and we're getting some information about how well it works, and the answer is: not that good. And a false positive can ruin a person's life. So it's not great, and I think it's not a coincidence that we don't have that much information about how common false positives are, because the information we're getting is not going to make our opponents look good. It's concerning to us, and I think it'll be concerning to everyday people.

Very well. Thanks so much, Joe. So actually, I think that's pretty much all the time we have for today. I want to thank our guests, Christoph Schmon and Joe Mullin, for being here, and I want to thank you all in the audience for joining us today. EFF members are our best bet for a brighter future online, so thanks for being on our side, and I hope to see you at our next member event. Thank you very much. Thanks, everyone.