like your feedback, hashtag C3T on Twitter. We also have a Twitter account called C3Lingo, and an email address if you're still using that old medium: hello at c3lingo.org. Our speaker Markus Beckedahl is a hacker, activist, blogger and journalist at netzpolitik.org, won the Grimme Online Award for netzpolitik.org, and is, if you please, a quasi-traitor: he was famously investigated for treason — which was a scandal in itself — and the investigation was quickly dropped. So, our favorite traitor. Good morning. I'm a bit confused now because I was waiting for a certain door to close. Oh well, never mind. The good thing is that I can tell a few more things about you. Why not? Who of you was at the Camp in 2015 and saw the talk where Markus and André talked about the investigation against them? The translator can say he was there, in the booth, translating that as well. I wouldn't have expected that; strong numbers here. Who saw his first talk at this Congress, whichever day it was? A few more. If you're wondering why we can't start: there's a door that's locked, and we have to have all emergency exits open before we can start. But that is not tragic, because it gives us a little break, and the next talk will start much later — it will be the break. So there's no pressure, except that the translator wants to get some food. We were told that we can overdraw by no more than two minutes, for cleaning and stuff. Let's call the lockpickers for that emergency exit. Methodisch Inkorrekt, last night's presenters, would be another way to open that door, I guess. Otherwise, we have two translations for this talk, and that has worked for a lot of talks — so, the people in the booths there, let's give them a signal here. Okay, the door is open. Great. Thank you for coming. A bit unusual, perhaps: three netzpolitik.org talks all at the same time. I would love to be in those other ones too. Well, it's all on video, I've heard. And in Russian as well, so later.
Okay, a topic that is actually quite dry and that hardly anyone is interested in. But from our point of view, it's one of the most important topics at the moment, because what I'm going to talk about has parallels in many different net-political debates. Privatized law enforcement is the trend that more and more classical law enforcement is no longer done under the rule of law, with due process, but by private actors. And the state is sometimes even interested in passing on responsibilities and getting rid of them that way, hoping, from its point of view, that other actors will solve these issues. Laws being privately enforced is not that new — press law is one example, I've been told by lawyers. But the thing is that everywhere in those processes, a court gets involved somewhere in between or at the end. With privatized law enforcement, you have private actors deciding, on the basis of terms and conditions that probably no one ever reads, what is allowed to happen on their platforms and what is not. And that is quite a problem. Many will say: it's not a problem for me, I'm not on Facebook. But many other people are on Facebook, for example, and that is a problem. Large parts of the public space, the public debate, are now in the hands of private actors — comparable to a shopping mall. Because a shopping mall is not the same as a marketplace. On a marketplace, of course, you have basic rights: freedom of assembly, freedom of opinion, you can stage rallies. But who has ever successfully staged a rally in a shopping mall? And right now we have the problem that this privatized law enforcement, which as a concept is a bit older, is becoming dominant because private platforms are becoming dominant.
Perhaps many of you here will say, I'm not on Facebook — but you have to live with the fact that a quarter of the German population gets information from Facebook every day. Facebook is one of the dominant platforms for opinion forming, if not the dominant one. And Facebook is just over 10 years old; 1.6 or 1.8 billion people are supposed to be logging in there. And Facebook is just one of many platforms that, as private actors, in effect exercise their house rules — terms and conditions you're perhaps only informed about once a year, when you have to accept changes. So they determine with those rules how we live, and of course they also determine the technology that we use. Other consequential effects in the public space are the problems when courts are more and more overrun and there is not enough competent personnel, for example on IT questions, in the courts and prosecuting authorities. This is something that we all should have cared about in the past — or they should have — but nothing has been done. And if you look at voluntary cooperations between private actors and states and where they come from, there was one debate seven years ago in Germany that made this very obvious. That was Zensursula: Ursula von der Leyen, the then family minister, I think, wanted to have censorship against child abuse images. The providers with a large market share came together in a voluntary cooperation with her and agreed to receive filter lists from the Federal Criminal Police Office (BKA) and make the websites on those lists inaccessible — showing a stop sign instead. That was Telekom, Vodafone, Arcor, HanseNet, Telefónica O2. We said that this was a censorship infrastructure that would no longer be under democratic control, and there was an outcry. The Pirate Party gained ground; even the Social Democrats recognised that the voluntary cooperation was a problem, and parliament then passed the Access Impediment Act instead. A new coalition then came into power.
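The stop-sign blocking described here typically works at the DNS level: the provider's resolver answers queries for blocklisted domains with the address of a stop-sign server instead of the real site. A minimal sketch of the idea — the domain names, addresses, and blocklist below are invented for illustration; the real filter lists were, of course, secret:

```python
# Illustrative sketch of DNS-level stop-sign blocking.
# All names and addresses here are hypothetical examples.
BLOCKLIST = {"blocked.example"}            # hypothetical filter list from the BKA
STOP_SIGN_IP = "192.0.2.1"                 # hypothetical stop-sign server
REAL_RECORDS = {
    "blocked.example": "198.51.100.7",
    "allowed.example": "198.51.100.8",
}

def resolve(domain: str) -> str:
    """Resolve a domain, diverting blocklisted names to the stop sign."""
    if domain in BLOCKLIST:
        return STOP_SIGN_IP                # user sees the stop sign, not the site
    return REAL_RECORDS.get(domain, "0.0.0.0")

print(resolve("blocked.example"))   # -> 192.0.2.1 (stop sign)
print(resolve("allowed.example"))   # -> 198.51.100.8 (real site)
```

One argument in that debate was that such blocking is trivially circumvented by simply switching to a different DNS resolver: the content itself stays online, only the provider's lookup is diverted.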
Conservatives and Liberals, who repealed that law — and the Pirate Party was gone a bit later too. But there was this huge debate about cooperation: why cooperations with the private sector are bad in terms of law enforcement, why there should not be a censorship infrastructure in private hands like this. And that was good. But the problem is: we are getting it back, and no one is interested. Between 2009 and now, there were many attempts at the international level, particularly in intellectual property law, to install filters. Why in intellectual property law is a long story with lots of complications. For as long as I've been doing netzpolitik.org, I have had to deal with the ancient state of intellectual property law, and it's not getting better. One of the few things that have actually been reformed in intellectual property law in recent years concerned the industry that serves people with cease-and-desist orders — which come with a fine, or at least a fee, immediately. And that's perhaps the only thing that has been improved. We had a large debate on ACTA, which had been negotiated in secret since 2008 — against piracy, as it was called. So there was this debate in 2010 or 2011. And the mechanisms of privatised law enforcement were right there in these texts: states wanted to agree to impose rules on internet service providers and platform operators to install filters — to solve intellectual property issues, as it were, so that the law would not have to be reformed and everything could continue as it was. In parallel to ACTA, at the EU level, there was a dialogue with internet providers and platform providers about illegal uploads and downloads, and the EU Commission suggested the voluntary application of filters and the blocking of peer-to-peer traffic.
The Commission already then pushed service providers to adapt their terms and conditions to give themselves unlimited opportunities to delete websites and links to websites being shared — which was a violation of the Charter of Fundamental Rights of the EU. You have to imagine: we know in Germany that if the government introduces a law that violates the constitution, that is bad, because the government's task is to protect the constitution. And the EU Commission apparently thinks: well, the Charter of Fundamental Rights, who cares about that? We'll just do something else. And people were thinking: if providers do this voluntarily, then it's all right — and no one can go to the courts. That's one of the problems that privatized law enforcement carries. This dialogue about illegal uploads and downloads failed. Fortunately, ACTA failed a few years later too, and no one dares touch it again, at least not in terms of privatized law enforcement. But a lot of things did continue to happen. Google, for example, removes about one million links every day that intellectual property rights holders report to Google via forms — an estimated 340 million removals are thought to have happened in the last year. Google says it takes them six hours to remove a link. So you can imagine what a long, long check there is of whether every removal request is actually justified. The list of collateral damage is a long one: many GitHub projects, legitimate videos, that have suddenly been removed because someone else thought they held the intellectual property rights. So that, again, is another form of privatized law enforcement. Google does this because, through the Digital Millennium Copyright Act, the DMCA, the American copyright law, it is obliged to remove content within a single day in a takedown procedure. If it doesn't, it gets expensive, and Google doesn't want to pay that bill. So they'd rather remove too much than too little.
So that's about that. Now let's come to terror. For many years, people have tried to install censorship infrastructures. With child abuse images it didn't work, at least in Germany — in other countries, unfortunately, it did. With intellectual property law it didn't work, ACTA failed and all that. So: terror. That's where they are on a roll. In 2011 and 2012 there was an EU project called CleanIT that even then not many people cared about. It could be called the most stupid collection of suggestions for internet rules in human history, as European Digital Rights called it at the time — a verdict I would sign immediately. Even then, in the CleanIT context — a semi-official EU project on fighting terrorism — interior authorities and secret services from a few states came together with internet providers to draw up joint measures to fight terrorism on the internet. The project was basically a big talking shop: you receive money to meet, and then there is a certain catalogue that you publish at the end. The suggestions it included at the end were things like private industry standards for upload filters and takedowns, and an alarm button for the internet. So CleanIT wasn't a master plan by the EU, but it was a blueprint for the upcoming EU Internet Forum — which maybe you've never heard about, but which has a big role now. The EU Internet Forum — at first called the forum with the community of internet service providers, the name was later shortened — has been meeting since 2014. The first meeting that we documented was at the margins of the G6 meeting of EU interior ministries: Germany, Italy, France and three others. And it involved the US, Canada — we didn't quite understand why — and Turkey. Then they sat down to dinner together, in one-plus-one meetings with Facebook and other actors, to set a direction for how to deal with terror on the internet.
Then it got a little more formal, and this is the group meeting in the first forum — the EU sadly publishes only small pictures, so you can't really see all of them. The goal of that forum was "not to reduce freedom on the internet, but to reduce the usage of the internet for terrorism." That says pretty much everything about the goal of that task force. There have been various meetings since then, and it has grown a lot. At the beginning it was about dealing with the challenges that arise from terrorists using the internet, finding ways to react to terrorist online propaganda, developing counter-narratives, and voluntary possibilities to deal with other things. And another topic that was added was cryptography — because while you're doing terrorism, you can of course use cryptography too. And terror, as always, means: we need a counter-narrative. Here we have Cecilia Malmström. She is now the EU trade commissioner; before that, until 2014, she was the home affairs commissioner. And she got the nickname "Censilia" — we were happy that the name fit, just as "Zensursula" had worked before. She worked on building a forum for all the European players in communications, so that everything that is illegal, and potentially illegal, could be dealt with — and so that there would be an easy way to put a counter-narrative onto the internet. That reminds us of ACTA: after the successful campaign against ACTA, the EU Commission started its own narrative for ACTA. So this is a way for the EU Commission to build counter-narratives against things they don't want. And that's where Dimitris Avramopoulos comes in, the Commissioner for Migration and Home Affairs. He is also in charge of Europol — Europol being the European police agency, something like the BKA at EU level. And they had a nice idea.
We were talking about terrorism, but now we can extend it to migration and other things, like going after traffickers. And instantly there were three task forces that were supposed to destroy human trafficking rings. A typical pattern: we're talking about terror, everybody is looking at ISIS, and as soon as progress is being made, somebody comes in and adds another topic on top of it. The EU Internet Forum was founded officially in 2015, and five internet companies were invited. We were wondering how ask.fm got on this list — we really don't know. They apparently have 160 million users; we don't know how many are active. Maybe they wanted to invite Apple and somehow ended up with ask.fm — we don't know. When the forum was founded, they wanted, among other things, to fight terrorist propaganda on the internet, and to talk about the problems that law enforcement has with cryptography and how surveillance interfaces could be put in between. The interior ministers say: we can't and don't want to enforce or introduce laws about content that may not be shown by providers — but we appeal to the humanitarian responsibility, the ethical responsibility, of providers to take certain content down if governments ask for it. What could possibly go wrong? If this works with child abuse images, why could it not be used for terrorism — and companies start to define in their terms and conditions what is and what is not acceptable. So they become judge, jury and executioner, and delete content. Europol, as we said, joined this and had the great idea to install a referral unit for reporting unpalatable internet content. Of course, what is unpalatable is always a question of perspective.
This referral unit was then placed at the anti-terror unit, and it suddenly became a center for blocking and deleting unwanted internet content. As a Commission document showed, Europol is supposed to help member states by identifying and removing certain content. So we have a situation where, every one or two months, behind closed doors, these large platforms meet with the EU Commission, and various things are on the table. The whole thing is probably a kind of backroom deal, horse trading behind closed doors: if you want things like certain monitoring interfaces while users keep using cryptography, you'll have to offer something else. For a while now — oh, only 10 minutes left, okay — for a while now we have been trying to get information on this using freedom of information law, and that was rejected, because publicly releasing details about the engagement and cooperation with industry could result in these industry representatives becoming subject to threats by terrorists. The EU Commission points to the fact that Facebook and YouTube are deleting many IS accounts, which leads to Mark Zuckerberg receiving death threats on Twitter — and this is why we cannot know what the EU Internet Forum is doing. Not a joke; you can read up on this. We didn't find it very funny. And while we were told, sorry, we can't tell you what's happening, we have to be concerned for the safety of the negotiation partners, the Commission would — to support that argument — tweet lavish pictures of everyone involved at the negotiation table. That's the smaller one; if you click on it, you see the whole thing. And the Ombudsperson of the EU has started a procedure, initiated by EDRi — who sometimes publish articles at netzpolitik.org — over this denial of their freedom-of-information rights. The Ombudsperson is investigating.
In parallel, this year our interior minister started to put on the pressure. He told a morning TV program: we are in intensive talks with providers; bomb-building manuals should disappear from the internet; and the people that call for freedom of expression there, I don't find convincing. We want providers themselves to be liable if crimes happen on their networks. You could translate what the interior minister said in that morning show as: "That's a beautiful platform you have there. Wouldn't it be sad if you had to be liable for your users' content?" And that brings us back to the horse trading: pressure on platforms to start doing things "voluntarily", because otherwise we will look at the liability exemptions — the protections that platform providers have, which the EU is trying to revisit. So all these problems around privatised law enforcement and the terror debate produced results in the last few months. For a few months now, we have had upload filters — we call them censorship filters — on the Facebook, Twitter and YouTube platforms. And that gets us to the PhotoDNA technology, developed by Microsoft and used by Facebook and others. We were already aware of it back in the child abuse image debate. It assigns a hash value to an image, puts it into databases, and compares uploads against these. So the large platforms, quite unnoticed by the public, have installed these filters. They receive blocking lists from the censorship database, and all the images that are uploaded are checked against that database. And that database is fed by police authorities — what could possibly go wrong? The providers have also agreed to put everything into the database that most probably violates the terms and conditions of all these platforms.
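The hash-matching upload filter described here can be sketched in a few lines. PhotoDNA itself is a proprietary perceptual hash designed to survive resizing and re-encoding; this simplified sketch uses an exact SHA-256 match instead, and the blocklist entry is invented:

```python
import hashlib

# Simplified sketch of a hash-matching upload filter, as described in the talk.
# Real systems (e.g. PhotoDNA) use perceptual hashes robust to re-encoding;
# here we use exact SHA-256 matching for illustration only.

def image_hash(data: bytes) -> str:
    """Compute a hash for the uploaded image bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist, fed by authorities and platforms as described above.
BLOCKED_HASHES = {image_hash(b"known-bad-image-bytes")}

def filter_upload(data: bytes) -> bool:
    """Return True if the upload is allowed, False if it matches the blocklist."""
    return image_hash(data) not in BLOCKED_HASHES

print(filter_upload(b"known-bad-image-bytes"))  # -> False (blocked)
print(filter_upload(b"some-other-image"))       # -> True  (allowed)
```

Note that with an exact hash, changing a single byte of the image defeats the match — which is why real deployments use perceptual hashing, and also why the matching logic, like the list itself, cannot be audited from the outside.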
If you look at those community standards, you can imagine that they more or less forbid everything that could happen. Another problem: no one is controlling what is on those filter lists. No one really cares. So that's one thing: what we fought off in the censorship debate at the German level has now, this very month, been installed on all these platforms. And then we have the next problem. An EU parliamentarian from the Bavarian conservatives — Monika Hohlmeier, daughter of the former Bavarian Prime Minister Franz Josef Strauß — is the rapporteur on the terror directive, a directive that is being fast-tracked, of course, terrorism, you know, more or less behind closed doors. She also had the idea of installing censorship infrastructures at the DNS level. By publishing and commenting on the draft that we saw, we were able to prevent this; instead there is now wording about member states being able to decide themselves whether to install censorship or not. But the main problem, or one of the main problems, with the terror directive — which again no one is looking at, and which has to be implemented in the next few months — is that it contains a definition of what terrorism actually is. And you can imagine that's all fairly relative, right? For us, the Islamic State clearly is terror. China would call the Dalai Lama — the one we give awards to — clearly a terrorist. And the other problem is that, from our point of view, certain forms of civil disobedience, actions such as occupying a coal field, could be called terrorism; and "illegal system interference" is another item. Online protests — say, Operation Payback by Anonymous against Visa and Mastercard, with its DDoS attacks — could from our point of view be called a political demonstration on the internet. Had it been registered with the authorities as a demonstration, it would probably have been possible.
But this was not limited to Germany, of course, where such a registration would happen. So that could be called terrorism, and upload filters could be used against it as well. The document that defines terrorism also often talks about "provocation". That again is an interesting question: could every Fefe posting be a public provocation? Fefe being a very well-known blogger in the German hacker scene. Okay, I'll have to rush things a bit. At the end of the year, there was a code of conduct to fight hate speech. At the time, fake news was already an issue, and so they were thinking about fighting fake news with counter-narratives. Of course, we think of things like Der Postillon, the satire magazine; others may think of the consumer protection minister from the Green Party, Renate Künast — not the current minister; she held that office until 2005, I think. And then we have Günther, whom we sent to Europe to solve the internet problem: Günther Oettinger, the commissioner for the digital agenda. He tried to reform copyright and managed to publish a draft that makes things even worse, which hardly anyone thought possible. Among other things, it's about compelling all platforms, not just the large ones, to install upload filters — to look through everything that is uploaded and check whether it's compatible with intellectual property law, whether anyone holds rights to it. This is all modelled on YouTube's Content ID. And the thing is, this applies not only to photos and videos; it also applies to Wikipedia. Imagine Wikipedia having to check each upload for whether intellectual property could be violated — what could possibly go wrong? All right, so Günther Oettinger has, well, been moved on, but his draft is still there. And we have the next things on the agenda: TiSA, the Trade in Services Agreement, which we leaked together with Greenpeace.
And thankfully, they projected our logo onto a nuclear power station — not something I get every day. So TiSA is a trade agreement on services, involving a couple of dozen parties including the EU and the US. Shortly before the end, a paper suggested defining "interactive computer services": according to the definition, all platforms on which more than one user can access certain content at the same time. And those services should no longer have to justify to their users what content they delete, as long as they haven't created it themselves. So if there is a voluntary cooperation and these platforms start to voluntarily delete all this, they can no longer be held liable for those removals — cannot be held responsible — if they simply consider the content damaging or offensive. You can imagine Facebook with their moral values, what they delete: female nipples forbidden, male nipples allowed. So it all looks pretty bad. And that's how it is. We are expecting up to 10 different EU legislative processes next year — the ePrivacy reform, the copyright reform, platform regulation — that need much more attention. We need counter-narratives, to stay in the language of the EU Commission. The answers that we have are important: how we can do without upload filters. We were able to turn the debate around by presenting alternatives to blocking in the child abuse image debate, and we need similar arguments when it comes to the Islamic State. Of course, we cannot present solutions to everything, including fake news and all that. So we need more people to join in thinking critically and thinking up alternatives. Don't give up. We face the big challenge that, in times of democracies becoming more and more unstable, a huge censorship and control apparatus is being installed in parallel. And we are the ones who have to say: what could possibly go wrong? We have to fight against this. We do this at netzpolitik.org.
If you want to support us, we're looking for your donations, recurring donations too. I have to stop; the doors will be open soon. Thank you for listening. Fight for your digital rights. Get involved, get others involved. Thanks, and have fun at the Congress. Yeah, thanks for listening. Your translators were Zebalis and me — please give us feedback: hashtag C3T, Twitter account C3Lingo, email hello at c3lingo.org. Signal angel, questions from the internet? No, not a question. Okay, no questions. Thanks a lot.