Hello and a warm welcome to our little talk about chat control, or chat monitoring. This talk came into existence relatively spontaneously, because it's a topic we will have to live with a lot in the coming months. So we have Konstantin, Kallisi and me, Tom, from the Digital Society. We want to say thank you very much to the studio Suthaus in Berlin, who spontaneously built the studio for us, and thank you very much to the content team, who fit us in at short notice. I'm sure it cost a lot of nerves, so thank you very, very much. We created this very spontaneously because it's a topic some of you might have heard about, and it is very high up on the political agenda, or it should be, for people who care about our rights as citizens. We will go into detail a little later on what exactly it means, but it's about pretty heavy surveillance measures that are supposed to be introduced and that will also affect end-to-end encryption. There are concrete plans on the EU level already, so we think we have to talk about this and face it. We'll get back to the action we are planning at the end, and how you can participate. Right after the talk there is a workshop planned on what we can do and how we can do it in the coming months, because we think that right now we have a chance to change those really dystopian plans and maybe stop them before they come into effect. So, chat control or chat monitoring: some of you, or most of you, have probably heard about this before. It was a topic in recent years, but I want to explain quickly what exactly it is, because there is some confusion. Especially last year, it was used to refer to an exemption from ePrivacy. And ePrivacy is something most of you will know about: the ePrivacy Directive, which is still in effect. So we are talking about data protection when communicating digitally.
And the main point of this directive, from their perspective the main problem, is that surveillance is not allowed. So it's guaranteed, or supposed to be guaranteed, that you have trustworthy communication: that neither state agencies nor companies can surveil and control our communication. So what has happened so far with chat control or chat monitoring? We have this slide — I'm not sure if it is readable. The ePrivacy Directive instructs the member states to make sure that surveillance and interception of messages is made impossible, or is banned. Lawyers say you have to make certain that this doesn't happen. We all know there are exceptions, but basically it should be made certain that whenever we communicate, via phone or whatever channel — this is not only about chats, but let's not get into detail — our communication is not monitored at all times by companies, state agents, or anyone else. To be a bit more concrete: this directive used an older definition of electronic communication, and that definition was replaced in 2018. So suddenly there were new definitions: what is electronic communication, what are electronic communication services? And it was defined that not only number-based services, but also services independent of phone numbers, are covered. Before, there was argument about this. And then some companies noticed, and the Commission as well, that Microsoft, Google and others had already been scanning communication before that — mails, chats, and so on — for documented child sexual abuse material. There were different methods for this. So they noticed, apparently only then, that the scanning was already happening, and they got hectic. In September 2020 there was a new proposal, and they said: well, we need an exception from the ePrivacy Directive. And it was passed as a regulation, so it came into effect immediately.
And it said that scanning for this material, and reporting it or calling the police, was still allowed. Already in July 2021 this came into effect. Some people worked against it, but all in all it went through pretty much unchanged. We will call this chat control 1.0: the scanning that was voluntary. Back in April 2021, the Commission had announced that, yes, it's important to introduce these exceptions, but they already said that in a few weeks they would also present a draft that not only allows voluntary scanning but makes it mandatory. And it was quite obvious back then that it should also include end-to-end encrypted communication. A few weeks have turned into a full year by now, and we don't really know exactly what's planned. But Commissioner Johansson is very fixed on this: it should be mandatory for service providers to scan for such material. This is all public knowledge. What the plans are in detail, we don't know, but we have to see that we are dealing with a Commission headed by Ursula von der Leyen, a German politician with a bad history on surveillance. That's the Commission planning these things. We don't know any details yet, but there was a leak recently — a report about possible plans. Apparently these rules will be quite far-reaching: grooming should be punishable as well and be scanned for. And the issue is how to scan for new material — that would only be possible through AI or something similar. That's something we won't go into that much here. So it's a bit dystopian, what we have to expect, particularly given the very clear commitment the Commission seems to have. At the same time there is a parallel development concerning encryption. This is not something that has only been in focus for the last year or so; European governments, and governments worldwide, have been at this for a while.
In 2020, the Council of Ministers — under the German presidency at the time, with the German ministry heading the respective committee — produced a paper from which I want to quote a few things. Back then, in 2020, the idea was that in a dialogue with the IT industry they wanted to come up with new technologies to circumvent encryption in communication. And they cited terrorism, organized crime, sexual abuse, but also a large number of cyber crimes and crimes enabled in cyberspace. You have to keep this in mind when we talk about the concrete plans at issue right now: when they talk about sexual abuse of children, they also have these other applications of the new technology in mind. Back then, the whole thing was mostly discussed in terms of end-to-end encryption, which should be banned or circumvented. In the last few years the debate has turned to client-side scanning, which you may have heard about. There were plans from Apple that did not refer to communication through a messenger, but to matching hash values from a database — which I will come back to — against files on the devices of the users. In 2021 these plans were put forward by Apple, and there was quite an outcry from civil society organizations, politics and scientists, who all said — the scientists in particular — that this was a bad idea, and within one month Apple backpedaled and said: we are going to postpone these plans. So that's the history leading up to this. I have already alluded to the meaning of chat control, and I will now ask Konstantin to talk about it. — Thank you, Tom. Yes, I'm going to sketch out what we actually talk about when we say chat control. There will also be a more technical view on this later on, but I'm trying to break it down so you see why, in my view, this is a form of surveillance infrastructure that we haven't seen yet.
So it makes sense to first delineate this from traditional or classical surveillance. The plans that Apple put forward are helpful for understanding what is envisaged and what we are talking about. Normally, when you use a messenger to communicate, the way it works is that from your smartphone, be it via a server or directly, you send a message to another smartphone, and traditional surveillance would start by eavesdropping on that communication: intercepting it and decrypting it. What started with Signal and other messengers is that end-to-end encryption has become such a standard that just about every messenger offers, or should offer, this feature now if it wants to be regarded as respectable — and this is regarded as a problem by those supporting chat control. Because this protection of our private communications of course protects everyone, including criminals who would like to transmit depictions of child sexual abuse through this channel, and the solution is therefore supposed to be an attack on the confidentiality of all communications. There are three potential escalation stages here, which I'll talk about, and all of them undermine end-to-end encryption on smartphones in order to find out which messages are sent and whether child sexual abuse images are included in those messages. The first escalation stage is to look for known depictions of sexual abuse: you have known images and hash values for them, and the messages that are sent would be matched against those values — so the messages would be compared to the values in databases.
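The matching in this first escalation stage can be sketched roughly as follows. This is a simplified illustration, not any real deployment: actual systems use perceptual hashes such as Microsoft's PhotoDNA, which tolerate resizing and re-encoding, whereas the cryptographic SHA-256 used here matches only bit-identical files, and the database contents are invented.

```python
import hashlib

# Invented stand-in for a database of hash values of known abuse images
# (here simply the SHA-256 of the demo bytes b"known image" below).
KNOWN_HASHES = {
    hashlib.sha256(b"known image").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """The 'fingerprint' of an image: here a plain SHA-256 digest."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_outgoing(image_bytes: bytes) -> bool:
    """Check before sending: is this image's fingerprint in the database?"""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(scan_outgoing(b"known image"))    # True: matches the database
print(scan_outgoing(b"holiday photo"))  # False: no match
```

Note that with a cryptographic hash like this, changing a single pixel produces a completely different fingerprint — which is exactly why real proposals rely on fuzzier perceptual hashing, with all the false-positive questions that brings.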
Then there is the search for hitherto unknown images. For that you would need some kind of artificial intelligence which tries to recognize: what is the age of the people in the image, what is happening, is this a case of child sexual abuse? And the third and gravest escalation stage is to try to detect grooming attempts by adults towards children. That would require a content analysis of communications, and probably linking it to other data that is available about the person. We know from the leaks and from Commissioner Ylva Johansson's statements that this scenario is actively in debate and that they are trying to establish it. So what have we done about this so far? From the time the Apple plans became known, at the latest, we knew that we were dealing with attempts to establish a large surveillance infrastructure, and that we as civil society have to stand against that. So we tried to build an alliance with stable foundations. Together with EDRi, European Digital Rights, a paper of principles was published stating certain red lines that any draft regulation would have to respect. Because of course we support the attempt to protect children, but you cannot do this with a regulation that violates fundamental rights, that will be nullified by the courts, and that at the same time restricts the fundamental rights of everyone. This paper makes clear that there cannot be blanket surveillance without cause: it has to be targeted, it has to be controlled, it needs judicial authorization, and independent institutions have to oversee it. Client-side scanning is an impermissible technology which would affect the communications of everyone and surveil everyone. We also say that it's not a solution to use solely technological measures for a social problem: social work, prevention, child protection — measures that really tackle the problem at its roots — have to be put forward, rather than trying to tackle it at the end. Now, with these principles that we agreed on, we have applied pressure to decision makers. We wrote a letter to the European Commission; we had conversations with various actors in civil society to bring this position forward; we made input to the German coalition negotiations with the new government formed last autumn — and you may have noticed that the coalition treaty says that automated scanning is rejected by the German government coalition of the Social Democrats, the Greens and the Liberals, and we will make sure they keep their promises. We tried to raise awareness and were able to make a few journalists aware, and of course we do advocacy work as well, by contacting people from the European Parliament, for example. So what is going to happen now? Presumably, at the moment, the draft is still being agreed on within the Commission. It has been postponed, and keeps being postponed, which might be due to the fact that even within the Commission there may be doubts about the legality of this draft — and surely it is also due to the massive resistance that has come from civil society. So we have had an effect already: we and all the others have been raising our voices, and it is necessary to keep being loud. The envisaged date of publication is the 11th of May, with a question mark, because this has been postponed before, and we notice that Commissioner Johansson is having a hard time finding support for these plans. Where are we now? We are actually before the proposal of a regulation, which is depicted in this graph, in this photo. So it is a good thing that we started building an alliance early, that we are not surprised but are articulating our concerns even now, and that we built a foundation to resist this draft regulation, hopefully even before it is published. And this is where I am going to hand over to Kallisi, who is going to talk about the technical side of things. — Thank you. So how does it look from a technical standpoint?
It already sounds bad, and I think once we look at the technology it gets worse. As Konstantin mentioned, let's first see how it usually works when you communicate using a messenger. Usually it looks like this: the green line, that is the end-to-end encryption of our communication. So Assad and Bettina communicate, and the whole communication is end-to-end encrypted. That means no one can read it — not even the server that facilitates the communication. So now we come to the question: can we still scan this? Right now, as Tom mentioned, the big tech companies are allowed to scan unencrypted communication. For example, uploading something to the Google Cloud — pictures or the like — can be scanned. And the idea to do this for end-to-end encrypted communication as well works similarly. The first escalation step is looking for already known material. Microsoft, for example, has a big database of known pictures of abuse, and these images are turned into hash values. So what are hashes, or hash values?
They are like a fingerprint of an image. A so-called hashing algorithm is applied to the image, and the result is supposed to be unique, so that this fingerprint matches only this image. And then there are two scenarios. The first one is that you scan on the device — there where the magnifying glass is, that's where the search would happen. That would mean that whenever you are sending an image, a hash is created, and then your phone contacts or uses a hash database and compares this hash against all the known images that are part of the sexual abuse material; on a match, the image would be sent to the service provider. For some of you this might not sound that bad, but you should know that it's attackable. There is additional software on your phone, and if someone can compromise your phone — or maybe you root your phone yourself — then you can either trick the software or simply change it. You might inject hashes or change the hash database, so that pictures of, say, protests are added, and then you get flagged. There are other open questions, but we'll come to that later. The second scenario, or the second version of this, is to have everything on the server. We all know that even if our phones are basically as powerful as computers were ten years ago, we still have limited space, so of course we can't download all the hashes — or at least that would make our phones slower, which is unwanted, of course. So the second option is to compare the hashes on the server: your message is still end-to-end encrypted, but the image is turned into a hash, that hash is sent to the server, and the server then compares the hashes. That's the second version — it always goes via the server. And if something was flagged, of course it has to be checked by a human. And we should remember that the second escalation stage that Konstantin already presented was scanning for previously unknown material. So until now we were looking for material we already know exists, but now we want to look for new material that is previously unknown.
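The manipulation risk described a moment ago — whoever can write to the hash database decides what gets flagged — can be made concrete with a small sketch. All names and data are invented, and SHA-256 again stands in for the perceptual hashes real systems would use; the point is only that the scanner itself has no notion of what "abuse material" is, it only trusts the database:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # SHA-256 as a stand-in for a perceptual hash.
    return hashlib.sha256(data).hexdigest()

def scan(image: bytes, hash_db: set) -> bool:
    """Flag the image if its fingerprint appears in the database."""
    return fingerprint(image) in hash_db

protest_photo = b"photo taken at a demonstration"  # invented stand-in bytes
hash_db = set()  # the database shipped to the phone (or held on the server)

print(scan(protest_photo, hash_db))  # False: nothing to match yet

# Whoever controls the database controls what the scanner reports:
hash_db.add(fingerprint(protest_photo))
print(scan(protest_photo, hash_db))  # True: the same photo is now flagged
```

The same asymmetry applies whether the lookup happens on the device or on the server; only who must be compromised changes.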
How can we scan for material that isn't yet known? Well, AI fans will now shout: yeah, use AI image recognition! And what such a system basically does, or can do, is guess: is there a person in the image, how old is the person, how much skin is visible, and roughly how would humans react to it. That's what could be used. But of course the problem is: if we have young people who just text and maybe send each other nudes, then this classifier could flag it. Or maybe you are on holiday with your kid, and there is a picture from the beach, and maybe the kid is naked — and because the kid is very young and a lot of skin is shown, it flags your picture. Again, there are two ways to do this. First, on the phone: we need already trained classifiers, and such a classifier is used to sort images into two categories — it could say, yes, this is a young kid and likely abuse. So what do we have to do? We have to train classifiers, and to do that we need data and we need people who say: yes, this is correct, and this is not — simple classification. All this is then put on our phone, and the classifier runs over all the images before we send them. Should the classifier flag something, the material is sent to the provider, where it is checked again, and should it really be prohibited material, then of course there will be a corresponding reaction. The other option, of course, is to have this classifier run on the server. We already talked about this: phones are powerful, but still limited, so it would be better to outsource this and have it run on the server. But now we have to check all the pictures on a server, so all the images are taken out of end-to-end encryption. And of course this is a problem, because if I can break into the server, I have access to everything. Then there is the question of how long the material is kept on the server, and the classifier can be manipulated if you have access to the server — but the same, of course, is valid for the phone: I can always attack the system. And I don't know about you, but I don't like the idea of uploading pictures of my small kids to some server. That's why I use end-to-end encryption — because I don't want that. And now there is scenario 3, which seems to be the most likely scenario so far: not only scanning pictures but also scanning text. Now we need text recognition software, and this software tries to detect grooming. It tries to detect age, which is difficult, of course, in the grooming context, because an older person who attempts grooming changes their language. So how do we differentiate between sexting between teenagers and grooming, or attempted grooming? It's difficult, and it would imply that end-to-end encryption has to be broken. Because if we do it on the device, then we have a classification problem: the classifier is trained, but AI can be wrong, and it's very difficult to take context into account. And if we have it on a server — well, we all know that if we scan the text on the server, there is no end-to-end encryption anymore. So we have a whole array of problems that we cannot easily solve. So much for the technology behind it. But of course there is not only technology — some people might think, yeah, we can develop something for that — there is more. What's the basic problem with this kind of control and scanning? I mentioned it before: with technology we always have the problem of missing context. If I send some picture of my kid to my partner, or if it's abuse — that's very difficult to differentiate, and we would all be put under surveillance. And especially with teenagers, when they do sexting, it would always mean that this material is taken out of context and reviewed by strangers, and that's a problem. Then we have the image review done by providers — because usually it's not sent directly to policing agencies, since they have a lot on their plate already — so there is review done by the
platforms, and we know that it is usually done by third-party service providers whose workers are badly paid; it's a very intransparent process, and we don't really know the criteria. The next point is the legal aspect. Looking back at the title of this talk: the secrecy of letters is in the title, and there is the secrecy of communications, which is a fundamental right in the German constitution, for example. So this is the affected law, and there is an opinion by a former judge of the ECJ who said that the ECJ simply would not accept this proposal. But of course it has been our experience with data retention that as soon as the instrument is there, it takes a long time to get it removed again legally. The next issue is mass surveillance without cause. Psychologically, the fact is that we would no longer be able to trust our devices — and our mobile device is the most personal thing we possess. For most people it is a kind of diary: I think there are many people who simply put all their notes, whatever they think, their images, everything, into their mobile phone. So the feeling that you lose control over your device, because there might be software on it that controls it, is grave. I remember the wonderful slogan of the Reclaim Your Face campaign. There is this so-called chilling effect: if I feel under surveillance, I basically regulate myself. So this is not just a threat to the personal space and the feeling that someone is constantly intruding there; it is an issue for activism, for people who get engaged politically and want to take part in activism. Economically, we know this debate from other areas, of course: the giants like Microsoft and Google have their software in place, they have an application for this, and smaller platforms will have a problem. There are ideas to have an open-source solution available — okay, so there is an open-source solution, but then again you have to ask how much I would really trust this software. I may no longer have a black box, but there still might be changes — this is not something I want — and people with less technological knowledge will simply not be able to understand what's going on. On the political level, we have said that the Council of Ministers back in 2020 already declared that these innovative surveillance technologies are something they want to use, and concretely that means the probability is quite high that they are going to use them — and use them for other purposes too. In a time when climate activism is criminalized, when Viktor Orbán is censoring a lot of things in society, you really have to ask: do you want to give this tool to anyone? And I think the answer has to be no, because once it's there, it's always going to be exploited and abused. And if we in Europe do it, it will be out in the world, and once it's out there, others will want to use it — which means that very quickly, images of political protests might be affected. Now, pragmatically, practically: we already talked about the fact that implementing this without breaking encryption is not possible. And it has to be made clear, too, what a chat actually is. There is a big question: how about decentralized services? How about pinboards where you put up notices to exchange messages — it's not really defined. How about end-to-end encrypted emails? This might be a difficult issue. And very fundamentally, the security authorities are under a lot of strain already. The Swiss Federal Police already said that a large amount of these reports are simply too much for them to look at: there were too many false positives, so the whole tool wasn't of much use. And just reacting to reports again gives you too much data — and more data isn't necessarily better. And the other question is about the databases that contain the hash values used to compare content against: where would they be stored? We have seen the current Europol debate — they have a large amount of data but cannot really say what's actually in there — and the
question therefore is how well suited such databases are for use in this context. So, fundamentally: is this a good idea? Clearly no, it's not. And one other thing you have to keep in mind is that as soon as something like this is used, the perpetrators will be driven underground, and that doesn't mean it will be any easier to find them. Now, what we call for, as Konstantin already said: we do not want end-to-end encryption to be undermined; we don't want any obligation for scanning in messenger apps; we want investment in victim protection programs and good police work rather than blanket surveillance; and we want a good legal basis for deleting this data — because it has been in the press that the German Federal Criminal Police Office does find data but has no legal basis to delete it, so no data gets deleted. And we have seen that we are at the very start of the debate, in contrast to other issues where we only got involved when the debate was already in the middle of its process. So we have a chance to really raise the alarm right now. For one thing, we can apply pressure to the German Interior Minister Nancy Faeser, who of course sits in the Council of Interior Ministers, and remind her of her coalition treaty. And you can always call your MPs, your members of parliament, talk to them about it, and make them understand what is at stake. You'll find us on Twitter and Mastodon under chatgeheimnis — that's German for chat secrecy. The hashtag is #chatkontrolle, German for chat control. We're happy about any donations for our campaign materials; the Twitter and Mastodon accounts will give you the links to donate. What you can also do: there are platforms like Abgeordnetenwatch where you can ask German members of parliament for their positions, and we've already had some clear answers, for example from a Green politician. So these platforms are a cool way of contacting members of the German parliament at least, and asking them to take a position, to make their position known, to remind them of the coalition treaty, and to make them aware of what makes sense in that regard. Now, how will it continue? After our Q&A we will have a workshop as well, to show you how you can take joint action, maybe in a different kind of context, and to discuss what we can do to make the breadth of civil society aware of things. I think the motto of Bridging Bubbles is very well suited for that, because who, if not us, is placed to bring this message forward and tell people that there is a real problem coming and we have to do something — because our communication is under threat, and we need encryption as a fundamental right and as a foundation for our democracy. I think it's a very important issue, and for several people it will perhaps be kind of hard to grasp, but I think there are good ways to make people aware of it, and that's what we need to do. And that, I think, leads me to the questions you may have right now. Any questions or comments or requests? None? Oh, that's a pity. Do you have any questions? There's nothing in the pad — apparently the interpreter hasn't opened the pad yet. In that case I will close the session here. Please follow us on Twitter or Mastodon, and if there are questions you can reach out to us: Konstantin and myself can be reached through our personal accounts, and Tom via the Digital Society. I very much recommend following European Digital Rights to stay updated at the European level. I look forward to seeing you in the workshop; the Big Blue Button room is linked on the self-organized sessions page. I haven't got the link in my head at the moment, but I think the DiVOC website will lead you there. It's a self-organized session, as I said, and I hope to see you in the workshop.