Hi, I'm Scott Jones, acting director of Electronic Frontiers Georgia, and this evening's presentation is "What's All the Fuss About Apple's CSAM Solution?" Our speaker tonight is Joe Mullin from EFF. Joe, if you could start by introducing yourself, then lay out what Apple proposed and remind us again what CSAM stands for.

Sure. My name is Joe Mullin. I work on the activism team at EFF, on a few different issues including encryption, which is how this issue came across my desk. I'm going to give an overview of the work we've been doing on it. I've been at EFF for three years and a bit, and I've been working on encryption issues for most of that time.

I'm going to share my screen here and show a timeline. In early August, Apple came to us and explained that they were going to roll out some new features in the new version of iOS that they said had to do with child safety. They were going to introduce two types of scanning on all of their devices. One would scan for what's now called CSAM, and what used to be called child pornography; the acronym stands for child sexual abuse material. They were going to put a new feature on their phones, an on-device scanner that would search through the photos of anyone who had iCloud Photos switched on, and scan them against what is essentially a government database: a database maintained by an organization called NCMEC, the National Center for Missing and Exploited Children, which is a quasi-governmental agency. Apple would scan against that database and report matches on to law enforcement.
They were also going to do a second type of scanning on the devices of minors on family accounts, a more general kind of scanning for nudity, really for anything their machine learning algorithm had determined was, quote, "sexually explicit." What does that mean? I don't really know; it's a culturally dependent term, and we don't know where Apple would have drawn the line. The feature hasn't rolled out, which we're happy about. But that's the phrase they used to describe the second type of scan.

We've been working on this issue for about a month, and we think both types of scanning are very problematic. It caught us a bit by surprise. To give a timeline of how it came up: Apple told us they were going to announce it a couple of days before they did. They told a variety of organizations, civil society groups, maybe some press; I don't know the full array of people they reached out to. They announced two days later, on August 5th, and we responded the same day they rolled out their announcement. Their features are described on a page they created called Child Safety, at apple.com/child-safety. They released an FAQ, and we wrote a response post, which I'll show now.
Let's see. The page we've made is reachable just by going to eff.org/apple. We had a petition up, but what we have now is a general information page that links to everything we've done. This is the post we put up on August 5th, the day Apple came out with its plan.

Let me give some of the context here. The government, in different forms, particularly the FBI but other agencies too, has been complaining about the strength of online encryption for over 20 years now. There are some agencies in the government that are very bothered that there are some messages they can't read, and they think they should have access to every message exchanged online. EFF disagrees with that point of view. We always have. We think you should be able to have a private conversation in the digital world just like you can have a private conversation in real life, and we don't think that conflicts with the goals of doing good law enforcement or finding people who are doing bad things.

So we've resisted a number of these efforts over the years. As the public discourse about encryption has changed, and there's been a more widespread public understanding of why we need encryption for privacy and security online, what we see now are different ways to wordsmith around it. There have been a few examples just in the few years I've been working on encryption. Out of the UK there was what was called the "ghost proposal," where UK intelligence agencies floated the idea that they would be added into private chats as a silent third party, and that somehow this wouldn't break encryption. So we had to write about that.
Well, when we talk about end-to-end encryption, it's an idea, a promise you're making: that only the sender of a message and the recipient of that message have access to it. And there are ways you can break that promise even if you don't break the actual encryption algorithm. For example, these are some blog posts we put up in 2019, when the new way to talk around it was "client-side scanning." The idea is that if you have some kind of scanner on the device that just finds and reports bad stuff, then somehow you're maintaining encryption while still being able to service a law enforcement demand to do some type of searching for information. So in a sense we had blog posts about this out before it happened.

I'll apologize again for the background noise; they're paving the streets outside my apartment, which is extremely exciting for my two-year-old son but sometimes disrupts my video chats in the era of the home office.

We see Apple's proposal as another version of client-side scanning. It breaks the promise of end-to-end encryption even if it doesn't literally break the encryption. Going back to the timeline: this is another version of what we've been seeing for a long time, which is essentially an attempt to create some kind of backdoor for government agencies, or quasi-governmental agencies, to look at people's messages. So we responded, and I'll go through the timeline of our response. We started by publishing some blog posts.
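The distinction above, breaking the E2EE promise without breaking the cipher, can be sketched in a few lines. This is a toy illustration under assumed names, not Apple's actual design: the point is only that a client-side scanner inspects plaintext before encryption, so a third party learns about message content even though the encryption layer itself is untouched.

```python
import hashlib

# Toy sketch of client-side scanning (assumed design, for illustration only).
# The scanner runs on the plaintext *before* the E2EE layer, so a third
# party receives reports about content; no cipher is ever broken.

BLOCKLIST = {hashlib.sha256(b"forbidden-image-bytes").hexdigest()}  # hypothetical database

reports = []  # what the third party ends up receiving

def client_side_scan(plaintext: bytes) -> None:
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        reports.append(digest)  # information leaves the sender/recipient pair

def encrypt_for_recipient(plaintext: bytes) -> bytes:
    # Stand-in for a real E2EE cipher; what matters here is *when* the
    # scan runs, not how the encryption works.
    return b"<ciphertext:" + hashlib.sha256(plaintext).digest() + b">"

def send_message(plaintext: bytes) -> bytes:
    client_side_scan(plaintext)              # scanning happens pre-encryption
    return encrypt_for_recipient(plaintext)  # E2EE layer is left intact

send_message(b"hello")                  # benign message: no report
send_message(b"forbidden-image-bytes")  # matches the list: reported
print(len(reports))  # -> 1
```

The message is still "end-to-end encrypted" in transit, yet `reports` holds information about its content, which is exactly the broken promise described above.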
We wrote a general response post the same day as their announcement. We also wrote a post about the global implications, which are really concerning, because if and when this is rolled out, it's a scanning system that's going to be catnip for governments that already have not just surveillance but, in some cases, online censorship regimes in place. We think those governments will insist on utilizing this technology. Apple has said flat out that they'll resist that pressure, which is an admirable thing to say, but that's not exactly the tack we want them to take, because we don't think they're going to be the first corporation in history to resist dozens of governments around the world who ultimately have authority over them and their employees. We think the best thing to do is to not build systems that can be put to that use. We wrote about the global implications in a post on August 11th.

Apple started changing their system within days of receiving a lot of pushback from us and from others. Later in August we launched a petition. We wanted to give users a format to speak out, to say that they weren't happy with this program and wanted Apple to withdraw it. That petition ultimately got over 27,000 signatures. The same day our petition launched, two other organizations launched petitions. One came the same day as ours, from Fight for the Future, another online organization that deals with freedom in the digital world.
They launched a petition the same day as us, which ultimately got over 16,000 signatures, and an organization based in Canada called OpenMedia had a similar petition. Ultimately the three of us decided to do a joint event: we would hold a press conference and present those petitions to Apple together on the same day. We did that on September 8th. There was also a coalition, organized by the Center for Democracy and Technology, of a lot of different nonprofits and NGOs opposed to this program; I think they ultimately had 90 organizations.

So there was a big public outcry on a number of levels, and Apple has said they're delaying the program, which we're happy about. But we went ahead with our petition delivery, and we also held protests in front of Apple stores in eight different cities, the day before their big press event on September 14th. We did get asked by some reporters and other folks: this program seems to be dead in the water for now, so why are you still moving forward with it?
But it was important for us to move forward, to say we weren't going to be silent about it in September. To some degree these things happen in a political atmosphere, and a lot of different forces in society, not just Apple, are taking measure of what the response looks like. September is a commercially important month, frankly, for Apple and for all the consumer-focused technology companies, which have pretty significant fall events, because the holidays are such an important time for sales for our whole economy. So we weren't going to back away from our protest or activism plans at a time like that. We wanted to insert the debate into that press cycle, and I think we were really successful at that. We had great uptake, not just in the regular press but also on social media. We flew a banner over Apple's headquarters and did some fun, attention-getting things like that.

So that's where it stands right now. Discussions continue. There's an update on Apple's post that says they're delaying it, but they're still thinking about moving forward. Instead of just going on in a monologue, I thought I'd stop there and see what this group would be interested in me following up on. There are a lot of different things we could talk about, but this has dominated not just my workload but that of several other people at EFF for the past few weeks. We feel good about what's been accomplished, and we're taking this notice of a delay as a win for now.
At the end of the day, EFF is a law firm, and I'm not a lawyer. I worked in tech journalism and policy journalism for over a decade before I came to EFF, but I've been writing about lawyers and working with lawyers for a long time now, and one thing I've learned is that victories come in different forms. Sometimes the organization or person you're dealing with just stops doing the thing you were complaining about, and they don't necessarily write a public letter of apology saying they've seen the light and changed their ways, as much as you'd sometimes like that to be the result. That's not always how it works out. So I'll pause here and let the folks in the group direct the discussion. I see some notes and one question in the channel; I'll answer the question Keith has there, and give other folks a minute to ask a question too, so I can hear what part of this is most interesting to you, whether it's what Apple's doing, or how EFF works and how we came up with our response, or whatever you'd like to hear. And I'm going to keep drinking water, because I caught a cold; fortunately I caught it after we did our protests.

Okay. Also, Kivas has raised their hand, so after Keith's question, if you could move on to Kivas.

Okay, sure. So Keith's question is: isn't Apple capitulating to China and Russia already? Yes, they have taken actions in the Chinese market. And then, what was it, I'm having a moment where it just escaped me, but there was an item that was a pretty big headline just in the last week or so.
Right, this was the Russian thing: there was an opposition app, and Google and Apple were told flat out to take it out of the store. It was a nakedly political move, and they capitulated; they both did it. So the idea that they wouldn't capitulate on a possibly higher-stakes issue, presented to them in a more threatening way about their own systems, is kind of hard to believe. And that app was the kind of thing that, if you live in a democratic country where people are able to choose their own leaders, is absolutely critical to freedom. It was a voter guide: here are the opposition candidates we think are good. The kind of content we take for granted our right to publish here.

So yes, there are those capitulations, and we've had discussions about how we could elevate Apple's commitment, if they're serious about what they're saying, and what we could ask for. We're thinking about whether we could ask for, for example, specific written commitments that if they were asked to do that type of scanning, they would leave the market; they would decline to be in that market rather than capitulate to the demand. I don't really think that's going to be possible. I don't see a US tech company agreeing to get out of the Chinese market, for instance, which is probably the first country people think of when we get concerned about a demand for censorship. And thank you for posting that link to the Navalny story; that's exactly what I was thinking of but couldn't name.

Sorry, was it Kivas who had the next question? Yes, I see Kivas is typing. There we go, there's the question. Okay. So the question is: for those malicious governments, can the technology be repurposed for them?
Or is it hidden or obfuscated/encrypted? Domestically, I'm not sure I see the issue if the scanners are accurate, because it seems they won't get any info other than an indicator of whether a photo is CSAM or not.

Okay, let me take what I think is the second part of that question first, because the two types of scanning work pretty differently. The scanner that's looking for CSAM scans against a list of known images from the database controlled by NCMEC. The way the procedure would have worked is that Apple would have gotten a flag if hashes of an image matched the CSAM database, and if there were a certain number of matches, more than one, it would get forwarded to Apple. Then Apple had a second layer of review: a human review to make sure it wasn't a false positive. After that, they would forward it to law enforcement. That's how it works, and there are other companies that do scanning similar to this. In some ways, and I'll give Apple credit for this, they did put in some additional safeguards that other companies don't have. But I think the reason they did some of those things is that they honestly thought there wasn't going to be opposition from civil society. And actually there was a lot, because, to be melodramatic about it, this is one of the last forts to fall.
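The matching flow just described, hash against a known list, require more than one match, then human review, then law enforcement, can be sketched as follows. This is a simplified illustration: Apple's actual design used a perceptual hash ("NeuralHash") plus cryptographic threshold techniques, so the exact SHA-256 matching, the sample hashes, and the threshold here are all stand-ins.

```python
import hashlib

# Simplified sketch of the described pipeline. Exact-hash matching is a
# stand-in for Apple's perceptual-hash scheme; the sample "known" images
# and the threshold value are illustrative assumptions.

def photo_hash(photo: bytes) -> str:
    return hashlib.sha256(photo).hexdigest()

def human_review(matches: list) -> dict:
    # Second layer: reviewers confirm the flags before anything is
    # forwarded to law enforcement; false positives are dismissed here.
    return {"flagged": True, "match_count": len(matches)}

def scan_library(photos, known_hashes, threshold):
    matches = [p for p in photos if photo_hash(p) in known_hashes]
    if len(matches) >= threshold:   # a single match is not enough
        return human_review(matches)
    return None                     # below threshold: nothing reported

# Stand-in for the NCMEC-maintained database of known-image hashes.
known = {photo_hash(b"known-image-A"), photo_hash(b"known-image-B")}

print(scan_library([b"vacation-photo"], known, threshold=2))         # -> None
print(scan_library([b"known-image-A", b"known-image-B"], known, 2))  # -> flagged
```

The design choice worth noticing is that every stage after `photo_hash` happens outside the user's control, which is the core of the objection discussed below.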
There are a lot of other companies that do this already, but Apple made a particular commitment on privacy. So there are a few things that are different here. One is that Apple markets themselves as the privacy company; they say they take privacy really seriously. Two is that the fact that other people are doing it doesn't make it right. Philosophically, the concept they're operating on is: "We're not going to look at your stuff, except for this one thing, this one crime, because we think it's really bad. We don't think your stuff should be proactively, preemptively scanned, because that would be treating you like a criminal. But for this one thing we will, because we've been asked so many times by so many different agencies, and everyone agrees it's such a bad thing, and we put in these safeguards."

And we had to stand up on this one and say no. Because at the end of the day, this is mass surveillance. It's a type of scanning where you assume a user could be a criminal and then check to make sure they're not. It's scanning that's not in the interest of the user; it's not done for a feature that helps the user; it's not under the control of the user; and it's about sending information to a third party. Those are three big red flags: things that are bad for users and that controlling governments want to use and abuse.
And we're going to speak out against that. They did this with terrorism in 2016, and at that time we were protesting in front of an Apple store, the same Apple store, in favor of Apple, because they stood up to the government and said: we're not going to build a backdoor into our software. "We cooperated and gave you a lot of material, but we're not going to endanger the privacy and security of all our users to make life a little easier for law enforcement." They did the right thing. Now, in many ways, the same organizations are back, and the line this time isn't terrorism; it's child safety and CSAM. There's a new crime they want to scan for, but they have to get the same answer, and we gave it loud and proud.

Part of that, too, is Apple's promises. It was 2019 when Apple put out their big "Privacy. That's iPhone." campaign: what happens on your iPhone stays on your iPhone. That's a pretty direct promise to users that they wouldn't be doing something like this. And I would add that it was a global campaign. If you do a Google search on images from that ad campaign, you'll see billboards not just in San Francisco and New York and other American cities; there's also a billboard in Dubai, where it's illegal to be LGBT and you can be thrown in jail just for being who you are, and where a lot of other freedoms that exist in the West don't exist. When you say "Privacy. That's iPhone." on an Arabic-language billboard in the busiest airport in the Middle East, that's a big promise. When you link it to an Arabic-language website where you explain that Apple believes privacy is a fundamental human right, that's a big promise. And I think users have a right to say: you don't get to just roll back that promise eighteen months or two years later. Those are people who have very valid reasons to not
want the government to scan anything on their phones, and who don't want anyone to know about their personal life or what images they share, or with whom; it's really no one's business. And they're up against dangerous forces that will portray those things in the worst light. So that's what this is about: asking Apple to keep its promises. That's a long answer; Kivas, I hope I answered your question as well as I could.

And I want to say there are false-positive issues with both of these types of scanning. It's a more limited problem when you're scanning against a particular database, which they are for the CSAM scanner; they can legitimately say the false positives are going to be extremely limited, although you can spoof those scanners if you're working to spoof them. But in the case of the machine learning scanner for minors, they almost acknowledge that it would have a lot of false positives. There's no way not to have false positives there, in part because what people view as sexually explicit changes from culture to culture, even within the United States, and China is not the U.S., and the U.S. is not Sweden. So I think that would have been really problematic, and there's a whole other set of issues with that scanner that I haven't talked much about.

Okay, any additional questions? Kit Kat, did you have something? "I don't have any questions." Okay, and the rest are comments, I guess. Well, thank you all for your interest and your thoughtful commentary on this.

"I had heard that this would only be supported in iOS 15. I wondered if it's there in iOS 15 and asleep, or if it would require an update." We're pretty sure it's going to require an update.
I think they told us it would require an update. Before they told us about the delay, I would say we had an unofficial impression that this would be rolled out in an update to iOS 15 that would come around December. Don't quote me on that timing, because I don't know, but that was our unofficial understanding. Since they've announced the delay, we don't have a timeline, because they haven't told us how long the delay will be, other than months. They said the delay is for the purpose of listening, and we're glad about that. We think they're going to talk to us again, and to some other civil society groups they didn't talk to the first time. Part of the problem is that they felt like they had talked to child protection experts, but who they really talked to is NCMEC, which is a quasi-governmental agency: it gets its budget from the federal government and was created by Congress in order to work with law enforcement. That's fine, they have a job to do, but they're not an independent NGO, and they're not the only voice you need to consult.

I haven't talked that much about the scanning for minors, but we're pretty concerned about that too. It's a much broader form of scanning that's going to have more false positives, and we think it has a lot of bad possibilities. Minors have the right to a private conversation too, and we just don't know what those false positives are going to look like. But we've definitely heard from LGBT groups who feel that those types of scanners end up targeting them and treating them like they're the bad guys, and that's not good. The other thing is that it will rope other people into a dragnet surveillance.
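The false-positive worry can be made concrete with simple base-rate arithmetic. Every number below is an assumption for illustration (scan volume, error rates, prevalence), not a figure Apple has published; the point is that even a seemingly small error rate produces a flood of wrongly flagged photos, so that most flags end up being false.

```python
# Base-rate arithmetic for classifier false positives.
# All numbers are illustrative assumptions, not published figures.

photos_scanned_per_day = 100_000_000  # assumed volume across in-scope accounts
false_positive_rate = 0.001           # assumed 0.1%; optimistic for a nudity classifier
true_positive_rate = 0.95             # assumed sensitivity
prevalence = 1e-6                     # assumed fraction of photos that truly qualify

false_flags = photos_scanned_per_day * false_positive_rate
true_flags = photos_scanned_per_day * prevalence * true_positive_rate
precision = true_flags / (true_flags + false_flags)

print(int(false_flags))     # innocent photos flagged per day under these assumptions
print(round(precision, 4))  # fraction of all flags that are genuine
```

Under these assumed numbers, 100,000 innocent photos get flagged every day, and well under 1% of flags are genuine, which is why the next example of a family beach photo is not far-fetched.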
Apple's promise on the family feature is that this will only happen if the parents in a family plan opt in. But here's an example of a potential false positive: let's say I take a family picture of my two-year-old running naked on the beach or playing in the bathtub, and I send it to a family message list of six people, and one of them is my 17-year-old nephew, who's on this scanning program. Well, then he's going to get a message from Apple saying, "Are you sure you want to view this picture?" All kinds of flags are going to go off. We think it's not good to be sending messages from an authority suggesting that people's images are bad or problematic or criminal when there are serious false positives involved. With that scanner Apple emphasizes things like "Apple will never know about it," but it can still cause damage to people's relationships and to their lives.

"I wonder if it would have some of the same biases that facial recognition has, such as operating differently on darker skin colors."

It's hard to know when you don't have it in front of you, right? We didn't; our tech folks asked them some specific questions, like what the training inputs are. When you have a machine learning system, there are inputs into it. What are they going to use to learn to detect sexually explicit material? Some of the inputs for how they build their model will come from media that's out there. It can come from mainstream pornography, for example, if you're trying to build a machine learning model for sexually explicit material, and again, that's going to be reflective of one type of sexuality. It's going to overrepresent some
It's going to have it's going to over represent some Body type some skin colors. It's going to have all kinds of biases Built into it Um, there's a there's a good question in the channel. What's your opinion if apple moves their scanning? to their iCloud servers that's an important question because um, you know in some sense, it's right like well, they could just uh scan the cloud um uh, I think there is You know There are certain issues where like if we if we had to say is it worse. I think there's some people at the ff that feel apple scanning is kind of Worse in some sense because it's on the device and I certainly think there's People that I've talked to you that just it has a feel to it. Um, that's a bit great year, but You know And we there's sort of this you could speculate about why they're doing it I mean we've been asking cloud services Including apple to encrypts their their cloud And we got some positive developments from the whatsapp on that front pretty recently um You know, I mean we think in general if you're promising privacy in cloud storage Then you should deliver on that. So if you're promising encryption and privacy on the cloud then You should deliver on it. It's a complicated question, right because um as the service provider as the storage provider It's also legitimate to say, you know We don't want our cloud and our servers to be used for anything malicious Or criminal You know, that's understandable So there's kind of a hypothetical like well if You know if the scanning was Disclosed Then Would that be okay? 
I guess the answer would be that we wouldn't like it if it were replacing a private service, if you're taking away someone's privacy by doing a certain type of scanning. It's better to disclose it, but we think there should also be space for end-to-end encrypted conversations and end-to-end encryption on storage.

"Well, isn't there also a fundamental question about how we're governed, about the fundamental principles of the Constitution, of the Bill of Rights? This is a continued attempt by the government to put pressure on, to make incursions where most people don't think it has the right to go as a government. Now, normally, if a neighbor decides he believes in the 'I have nothing to hide' argument, which is a false argument, and he thinks there's no problem with it, well, he doesn't have much effect, because he's just one guy. But if that guy happens to be the president, or the CEO of Apple, he can literally affect the entire world with an attitude most people would not agree with when it comes to our rights and privacy. And that's also part of this: people feel betrayed, not only in the sense that Apple made a promise they seem to be going back on, but also in the sense that here's somebody who is not the government, so you can't invoke the government-boogeyman argument, but who at the same time is siding with the government boogeyman everybody's afraid of. And that scares people, rightfully so, and there's nothing we can do to control it except vote with our wallets."

Those are some great points. Canary asked in the channel: what countries will the human CSAM checkers be based in? We know very little about that human review step. Apple just told us there will be human review; that's it. I imagine it's going to be US-based, because they're only rolling out this feature in the US for now; that was one of the things they told us. And I don't even begin to know the details of how
you do a human review of an illegal image. CSAM has a special legal place, because it's a type of content that's not protected by the First Amendment, and viewing it is itself a crime. So I imagine they would have to work directly with law enforcement to create the human review system.

"Yeah, but we already know there are hash collisions. We already know they've had hash collisions, so we know the AI makes mistakes. So if you took pictures of your wife, or your kids, or something you consider very private, with your phone, suddenly you've got people you don't know, somewhere, reviewing your private photos, simply because the system inappropriately flagged them. Now people other than the ones you've chosen to share those photos with are seeing them. You have no control over that, and frankly, you have no control over what those people do with those photos. It's like sending an email to somebody: it doesn't necessarily remain private. That can happen with this content too. And that's another issue: what happens when somebody breaks in, downloads a hundred gigabytes' worth of photos out of Apple's database, and releases it on the web? Now all your photos are out there. So yeah, there are some things about this that get seriously creepy. If I encrypt a photo because I want it to remain private, there's nothing illegal about the content.
I just want it to remain private. And if I put it up on the internet to store it, I fully expect it to remain encrypted and accessible only by the people I choose. This completely undermines the entire process: no, except if we deem it necessary, we'll look at it, and you don't get any control over that. My immediate take is, well, I'll never own another Apple product. But if Apple does this, count on every other company going the same way."

Yeah, I mean, Google is, I've been told, scanning its cloud for CSAM right now. So it's not like we can just tell people to switch; we'd like Apple to keep its promise. In a way it was a win for society at large when Apple put in encryption as a default on iMessage, and we'd like them to get back to that. We're hopeful that they will. But times change; like I said, five years ago we were protesting sort of in favor of Apple and the stance they were taking. That's not the case today. And the options are limited, right? There will continue to be end-to-end encrypted products and messaging services, but we also want privacy to be mainstream, privacy by default, because that's better protection for people like journalists and activists, people trying to tell the truth about what's going on in their societies and the world around them. We don't want them to have to
We don't want them to have to Download maybe an unusual privacy specialized app that Then just just by virtue of them having it might raise suspicion or flag them in some way We kind of want privacy protective features to really be mainstream and to be a to be a default not something that you have to Seek out some sort of, you know, highly specialized thing that not that many people Is That people unfortunately feel safe on the internet And what they don't realize is like I feel safe here in my house From attack by spets gnats, which is the russian special forces I feel safe because they got to come across borders and I got armies between me and them All of that but on the internet The spets gnats or the organized crime or the cyber warriors in china are literally my Store neighbor and there is nothing between me and them. So when you start seeing these type of Fundamental security being eroded It's not just between me and apple. It's not just between me and the government It's between me and every other person on the internet, which is all the other bad actors in the world That's the scary part people don't realize that when you lower that shield to our government you lower it to everybody yeah Well, thank you guys for this opportunity. I'm gonna have to sign off now To go pick up my child Okay. Um, thank you very much. Uh, yeah I'll um Go ahead, but then I'll I'll need to shut off the recording and once we close it down And then those people are who are here. We can stay and socialize for a while Great. Well, it was great to meet y'all and get a chance to talk about this and, you know, scott has my email and Uh, if you want to do it again reach out and you know, you know As we say in the journalism business only time will tell so who knows what's gonna happen in the future Thank you guys for your time and the opportunity All right. Thank you very much and uh, I'm gonna go ahead and shut off the recording now