 I'm very happy to have these guests here. We have Thomas Lohninger from epicenter.works and Chloé from EDRi, and they are talking about content takedowns: Who cleans the internet? The EU plans to sweep our freedom of expression under the carpet. I'm very happy to announce them and that they're here. Give a big applause to Chloé and Thomas Lohninger. Thanks. Thank you. Thank you, and welcome to everybody. Yeah, let's get started. We're going to talk about content takedowns and content moderation, or what you could also call platform regulation. The next 40 minutes will be about what, in my prediction after doing digital rights for 10 years, is probably the biggest digital rights debate that we'll have in the European Union. But before we get into that, we have to start with the basics. And the basic in this case is a weird, complicated term called intermediary liability. What are intermediaries? You can think of it if you go back to dead trees, to classical media. If you publish a newspaper, you are liable for every article that somebody writes in there. So a publisher has to take responsibility for the content that they are putting in their paper. For classical media, that works, and media and liability law around these things has developed over decades and centuries for radio, for television, for all types of media. But that logic, of course, did not work when the internet came about. There have been several cases in the EU in the 90s where the police thought an ISP is just like a newspaper: they have to be held liable and responsible for the content of their users on their service. And so several ISPs, in Italy to the left and in Austria to the right, were raided and their servers were confiscated. Thousands of websites of companies were offline because the police thought, okay, we have to arrest the internet, and took all the servers away.
There was even a famous case in Germany around CompuServe, where the CEO of an ISP and host was actually charged with criminal offences over pornography because one of their customers made content available that was illegal. All of these cases, of course, created a huge uproar. In Austria, the internet was even shut down for a day, and then the EU reacted to that. As a result of this legal uncertainty, rules were adopted at European level and formalized into what we now call the e-commerce directive, which was adopted in 2000. It gave internet companies basic legal protection against illegal activities that are taking place on their systems. And the rationale at the time was to ensure that a unified digital market would develop and expand in Europe, at a time when very few people had access to the internet and the global digital corporations that we know nowadays didn't exist. One of the key provisions of this directive, Article 14, says that companies are not responsible for illegal content that their users are generating unless they obtain knowledge of it. In such cases, they have to act as quickly as possible to remove it, and that was, until now, the European model. Yeah, and you can also think about intermediary liability as a sword and a shield. That analogy mostly holds true for the US; in Europe it's a little bit more complicated, but the basic principle still holds true. So intermediary liability protections, these safe harbours that Chloé just explained in the e-commerce directive, act as a shield. You are not liable, your users can do whatever they want, and when a problem comes up, you have to deal with it of course, but you don't have to think through every eventuality of bad things that could happen when you start a new service. That shield was very influential and important for creating the diversity and innovative capacity that we have witnessed over the past decades in the open internet.
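To make the Article 14 mechanism concrete, the notice-and-takedown rule just described can be sketched as a small decision function. This is a deliberately simplified illustration of the legal logic, not an actual implementation; all names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class HostedContent:
    illegal: bool                 # is the user-generated item illegal?
    host_notified: bool           # has the host obtained knowledge of it?
    host_removed_promptly: bool   # did the host act expeditiously?

def host_is_liable(item: HostedContent) -> bool:
    """Sketch of the Article 14 'safe harbour': the host is only liable
    for user content once it has actual knowledge of the illegality and
    then fails to act expeditiously to remove it."""
    if not item.illegal:
        return False
    if not item.host_notified:
        return False  # no knowledge -> the liability exemption applies
    return not item.host_removed_promptly

# A host that removes notified content promptly stays protected:
print(host_is_liable(HostedContent(True, True, True)))   # False
print(host_is_liable(HostedContent(True, True, False)))  # True
```

The point of the rule is visible in the last branch: liability only attaches after knowledge plus inaction, which is exactly the shield the speakers describe.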
But intermediary liability also gives companies a sword. They can moderate on their own. They can proactively moderate, they can moderate based on laws or terms of service, and they can do that at their own choosing. In the US, the debate right now is a little bit: if you don't use your sword more often, then we'll go after your shield. Now we also want to show you a brief video that summarizes that concept of intermediary liability with the most famous and most painful example that we had in Europe, the copyright directive. The video was created with Alexander Lehmann for the Pledge2019 campaign. The open internet we all know and love, where everyone can participate, will soon no longer exist. EU politicians are about to pass a law which they say is supposed to combat unauthorized copying. In pursuit of that goal, they're about to make a fundamental change that would affect all of us. It all comes down to the main question: who is liable for files illegally uploaded on the internet? So far, the person uploading illegal copies is also the one responsible and liable for the content. The app or website they do this on is innocent unless they're made aware of the copy and do nothing about it. It's like when a crime is planned over the phone: that doesn't make the phone company responsible. Today's legislation makes sense. Nowadays, everyone can communicate and share with the whole world. It's simply impossible for websites to manually review each one of the billions of images, videos, texts, and audio files we post online. But that's exactly what the politicians want to change. In the future, as soon as something goes online, the site would be just as liable as the person who posted it. So what, you may be thinking, why is that a problem? Because then the only way for websites to operate legally would be to verify every single post by every single user, ensuring that it doesn't infringe any copyright worldwide. If they cannot guarantee that, it cannot go online.
But it gets worse. This new law is not only demanding something technically impossible, it's also threatening one of our fundamental rights, namely freedom of expression without unjustified censorship. No program can tell for sure whether a parody, commentary, or remix is legal or infringing on copyright. Making these decisions today takes lawyers, judges, and long trials. And yet websites will be expected to somehow make them automatically, millions of times a day. To avoid massive fines, the platforms will have to filter extremely strictly, and a lot of perfectly legal content will get caught up in these filters. But unfortunately, technical feasibility and censorship are not the only problems. Before this bill has even passed, the EU Commission has already presented an even further-reaching law. This one would require filters for so-called extremist content, with each post having to be checked with law enforcement agencies. We're talking about nothing less than EU-wide internet censorship machines. What could possibly go wrong? If we don't act now, we could find the internet in Europe scrubbed clean of anything challenging, surprising, weird, or enlightening. The big winners of this new law would be multi-billion-dollar companies like Google or Facebook, as their budgets would still allow them to implement the new guidelines. All smaller platforms would only be able to keep offering their services if they use the filter systems provided by the big corporations. The smaller companies would have to trade in their data, which would make the big players even more powerful. All of us would be the losers of this new legislation, because the structures that make the internet so diverse today would die tomorrow as a result of this law. If you want to stop this dull and miserable filter net, you have to take action now. These websites tell you how.
Call your representatives in the European Parliament today and tell them that you will only vote for them in the 2019 European elections if they vote against upload filters. Yeah, that vote, of course, we lost by five votes, so five MEPs we could not convince, and henceforth the copyright directive was adopted and is now on the way to being transposed by EU member states. The discussion doesn't stop there, but before we come to that, just to summarize again what has just been said: there are some problems now in the online ecosystem that make certain people believe that the old rules from the e-commerce directive need to be fixed. Why is that? Today's intermediaries are very different from the ones from the 90s, obviously. We are now witnessing what we call the platform economy, which is a centralization of the net around a few players. Most of what people do online today is mediated through intermediaries, and those intermediaries are a few giant corporations that dominate the ecosystem; the reality is there are very few possibilities for credible challengers to enter this market. The second problem is that there are many, many more people online today than there used to be. Everybody can post content 24/7 online and can possibly reach a global audience, and that for free. And this is quite overwhelming. I mean, this is how it translates into numbers, and this is how the online communications landscape looks today: millions, even billions of users whose online activities are mediated by a few platforms, and they post an enormous amount of content per day, which makes it impossible even for those powerful companies to control and assess each piece of content that is posted. And that's a big problem. These platforms subject the communications of billions of users to monolithic speech regulation policies.
And it is impossible for a single set of rules to encompass and accommodate the diversity of cultural norms in the world. What's more, those platforms not only host content, they actually curate it: they push it, they delist it, they demote it. They decide for their entire user base which voice gets heard, which viewpoints get visibility, and which do not. And because they mediate most of our communication, their content regulation rules, the so-called community guidelines, become some sort of constitutions that actually regulate the speech of a quarter of the world's population. That's a huge problem. And they start behaving as if they were actually the ones making the law. Recently, Facebook created an oversight board which is supposed to make decisions on content moderation cases. In this way, they are acting more or less like a Supreme Court that decides on and interprets Facebook's terms of service. Yeah, and on that point, it is important to understand, there's really a core concept here. The laws that we are accustomed to, which regulate which speech is acceptable or not, are always contentious. I mean, the case law of Europe's highest courts even says that fundamental speech protections exist particularly for challenging speech: for still legal but very hateful statements, even statements that create an uproar and might spark a demonstration. These are exactly the edge cases that we need to protect with freedom of expression. Yet most of the content moderation decisions that happen today are not even based on law. It's safe to assume that around 80 to 90% of content moderation decisions are actually about the terms of service of each platform. And those are not laws. Fundamental rights protections do not apply to these texts that companies have written themselves and probably change every week.
And that's so important because that also changes the dynamic of any legislation that comes further down the road. There is an ample amount of cases where platforms have acted really irresponsibly. Take TikTok, the Chinese social network for short videos that's particularly popular with the younger generation. The Guardian and netzpolitik.org recently leaked the content moderation guidelines for the humans that are moderating the content on TikTok. And what they found is that this platform intentionally curbs the reach of people with disabilities, with autism or Down syndrome, or people who are deemed not fit, who don't have a so-called normal, regular body size or who are fat. All of these people are curbed in their reach, so their posts and their videos never reach a critical audience. And when platforms rely on automated tools, on filters, for content moderation, it doesn't get better. Actually, when they try to do the right thing, which is, for example, fighting hate speech against people of color, it ends up with black people being the ones most censored, rather than the violent racist speech that is targeting them. And that's because the technology is unable to understand the nuance in every language. That was the case with Twitter here. Before presenting to you what's coming up at European level, let's have a quick look at what happened in the past years, because both the EU and its member states have started to lead a true crusade to clean the internet. They have been adopting, very quickly, legislation to tackle all sorts of problematic content. Yeah, and of course the Copyright Directive is first and foremost there. We really wanted to win this fight, not just because of Articles 11 and 13, now 15 and 17, so upload filters and the ancillary copyright for news publishers, but also because we knew that in the upcoming fight around the Digital Services Act, about the intermediary liability debate, it would be a bad start if the Copyright Directive went down as it did.
The other file that was mentioned in the video is the regulation on terrorist content online, which is still up for grabs and will probably be adopted next year. The Audiovisual Media Services Directive is a particularly nasty piece of legislation because it is a law that doesn't say companies should do X, Y, Z. It is a law that says companies should have terms of service that do X, Y, Z. So it is kind of outsourcing or privatizing things like the moral development of young people. And then we also have two pieces of soft law, the Code of Practice on Disinformation and the Code of Conduct on illegal hate speech. These are not even legislative acts. It's basically the Commission sitting down with Facebook, Twitter and Google and saying: you really don't want us to regulate you, do you? So just come up with a self-regulatory scheme and then we'll let you know. And actually, member states have not been resting on their laurels either. They've been very prolific. The first one to adopt its own anti-hate-speech law was Germany, and it was actually copy-pasted by a lot of its European counterparts. All those laws are currently under debate in their national parliaments and governments. But basically there are copycat laws in France; the UK is talking about it, Ireland and also Croatia. One of the common denominators of those laws is that they shift the responsibility to decide what to block online onto the shoulders of companies. And that's very convenient for European governments, because companies do not have to respect human rights law like the EU Charter of Fundamental Rights, contrary to the very same governments, who do have to abide by it. A very, very convenient way for them to shrug off the difficult task of balancing the fundamental rights at stake. What is actually the playbook of these laws? So again, they push content takedowns to be based on terms and conditions.
So contractual rules rather than state law. That's a very big problem for the rule of law. Then, some of those laws go even beyond actually illegal content: they also cover harmful content, content that is more or less undesirable in the eyes of the legislator, but they don't give a legal definition, obviously. What else? They also incentivize companies to act very quickly to make the decision on the content. And that leads companies to use, if they can, automated means, so-called upload filters, to do the job very fast. And how do they incentivize companies? With high fines: if you don't comply with the rules, you'll be fined. Yeah, to summarize that nicely: platforms are put in a really difficult position with a strong incentive for overblocking. When a piece of content is notified to a platform, they can either just delete it and be done with it, or start a quite complicated, expensive legal assessment: is this within the rules or not? And if they decide it wrong, they could face the risk of penalties or even liability for that content. Hence, you have a strong incentive for overblocking, and what they do is not really a legal assessment. But to increase complexity further, it is not just about content moderation. This whole debate about platform regulation will also include e-commerce. There's a famous case where counterfeit L'Oréal products were sold on eBay; L'Oréal sued and ultimately lost. That case was in 2010, and the European high court went with intermediary liability protections back then. But as you have seen with the previous dossiers over the years, both at the member state level and at the EU level, it is actually more or less difficult for us now to hold that line. The CJEU, the European high court, decided in 2017 that yes, Uber is a taxi company and does not benefit from the liability protections.
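The overblocking incentive described here can be made tangible with a back-of-the-envelope expected-cost comparison. All numbers below are invented purely for illustration:

```python
# Invented illustrative numbers: the cost of a careful legal review per
# notified item, the probability the review wrongly keeps illegal content,
# and the fine risked when that happens.
review_cost = 50.0   # lawyer time per notified item (EUR)
error_rate = 0.05    # chance a kept item later triggers a penalty
fine = 10_000.0      # penalty per wrongly kept item (EUR)
delete_cost = 0.1    # cost of just deleting on notification (EUR)

expected_cost_review = review_cost + error_rate * fine
expected_cost_delete = delete_cost  # deletion carries no penalty risk

print(expected_cost_review)  # 550.0
print(expected_cost_delete)  # 0.1
# Deleting on sight is orders of magnitude cheaper, which is exactly the
# structural incentive for overblocking described above: the laws punish
# keeping illegal content but not deleting legal content.
```

The asymmetry is the key design flaw: as long as over-removal costs a platform nothing while under-removal risks a fine, a rational platform deletes on notification rather than assesses.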
In the US, we had SESTA, the Stop Enabling Sex Traffickers Act, which was adopted with bipartisan support in the middle of the Trump administration in 2018. Both Democrats and Republicans could agree on that law. And it again removed liability protections for hosts. Tumblr deleted around 17 million sex-positive or sex-educational blogs because of that. Because they could not bear the cost, or were not willing, to actually go through that content on their platform, they just massively deleted millions of blogs with a lot of content. And that is a sign of which internet we could end up with if suddenly every platform has to take responsibility before stuff even appears online. And this is not just a Western debate; in India, too, the regulators are looking at that. TRAI launched a consultation and they are also proposing new rules on intermediary liability. That whole debate is global in nature. It's important to understand that every rule that we make here will have repercussions in the rest of the world. NetzDG, the German law on content moderation, has been copied by 17 countries already, the first of which was Russia. When we call for Facebook to respect the decisions of a court, people in Azerbaijan hear: okay, that same court that blocks a hundred opposition websites every month should now also decide about my Facebook posts. So it is quite complicated to get it right here. But yet I believe that Europe has the best cards to come up with a fundamental-rights-based solution to this intricate problem. Now about this reform. Yeah, what is coming up? Since 2000, it was quite silent around the e-commerce directive, but in the last decade there has been a lot of movement. Thomas already explained to you that there was quite some case law, but there were also several calls at European level to reopen the text or to complement it with another, more specific piece of legislation. But nothing really concrete came out of it.
But now, with the new Commission in place since 1 December this year, and with its new president, it is confirmed that the review of the old e-commerce directive will happen, and it will take the name of the Digital Services Act. To be honest, what will this future legislation say or contain? We don't know exactly yet, but there are several ideas on the table. Among them, there is a possibility that the reform looks at the current definition of what an intermediary is, what we call an intermediary and how we categorize them, trying to update the old definition, trying to take into account the new internet companies that emerged in the 2010s, like Airbnb, for example. It will look at another core principle of the e-commerce directive, Article 15, which prohibits member states from putting an obligation on platforms to monitor all their content, to look actively for illegality. And this is a big question mark for us, because will this principle be upheld or not? That's a big question, because the copyright directive already kind of started going against that very principle. And then it will also look at some obligations for platforms or intermediaries in terms of how they regulate their content, their practices, and how they do content moderation on their services. Another thing, which is actually very welcome from our side: this whole list here is based on a leak from the Commission at the working level. We actually have good people working on that file. They have been doing that for over a decade and they have a really good understanding of the issue. The question is how the political side will deal with these good ideas that are being discussed. But yeah, rules on online advertisement are good, because we have to talk about the business model. Most of these problems are symptoms of the strong market concentration and of the few very dominant platforms that we have on the internet today.
And so actually tackling the attention economy and the attention merchants, as Tim Wu put it, is the right thing to do, also because the full scope of this phenomenon called targeted online advertisement has, I would say, not been completely understood, and we need more science, and for that we need more data. And another thing, which is actually really hopeful, I think, for the people in the room here, is interoperability. Yesterday, Moxie Marlinspike gave a talk where he basically bashed the idea of decentralization. What he is missing is really that it's not about decentralization in the nineties sense, it's about interoperability. It's about making dominant platforms open again, forcing WhatsApp to establish a protocol where competing messaging services like Signal or Threema can communicate with people on these other networks, so that competition can happen even in an economy that is strongly based on the network effect. Interoperability is not a Swiss army knife that can solve all problems, but if it is applied in a case-by-case way that also solves the privacy and security problems that come with it, I think it would be a really visionary thing for a European internet that is decentralized and open, and not concentrated between China and the US. Another thing that of course many people in government want to see is accessibility of data. Many local governments and cities have problems with taxation of Airbnb, for example, which is refusing to cooperate to allow for these types of taxation. Forcing them to hand over this data is, I think, in general something to look at. And lastly, all of these rules will most likely be enforced by a new entity, by a platform regulator at the European level. Again, that's still up for debate. Some people want to see media regulators do more, or telecom regulators, but I would think that we'll see a new regulator for these tasks in the EU in the near future. And that is why the Digital Services Act, DSA, is often dubbed the constitution of the internet.
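The interoperability idea sketched above, a dominant platform being forced to speak a shared protocol so that competing services can deliver messages to its users, can be illustrated with a minimal sketch. The wire format and the service names are invented for this example:

```python
import json

# Two hypothetical messaging services that agree only on a minimal,
# vendor-neutral wire format -- the core idea behind mandated
# interoperability: the format, not the company, defines the network.
def to_wire(sender: str, recipient: str, body: str) -> bytes:
    """Serialize a message into the shared wire format."""
    return json.dumps({"from": sender, "to": recipient, "body": body}).encode()

class Service:
    """A messaging service that accepts any message in the shared format."""
    def __init__(self, name: str):
        self.name = name
        self.inbox: list[dict] = []

    def receive(self, wire: bytes) -> None:
        # Any service that speaks the shared format can deliver here,
        # regardless of which company operates it.
        self.inbox.append(json.loads(wire))

service_a = Service("BigMessenger")      # the dominant platform
service_b = Service("SmallCompetitor")   # a new entrant

# A user on the small service reaches a user on the dominant one:
service_a.receive(to_wire("alice@small", "bob@big", "hello across networks"))
print(service_a.inbox[0]["body"])  # hello across networks
```

In practice, a mandated protocol (think of federation standards such as ActivityPub or XMPP) would be far richer, covering identity, end-to-end encryption, and abuse handling, which is why the talk stresses applying interoperability case by case rather than as a blanket rule.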
And that is actually a quote from somebody in the Commission working on that file. And to complete the picture of why this is important, let's look at the more mechanical side, the lobby side, what Chloé and I are working on on a daily basis. The stakeholders that will be party to that debate are, of course, all of the US internet giants in Silicon Valley. It will be the whole of the European internet industry: every telco, hoster, and content delivery network provider. So basically all of the actors of the General Data Protection Regulation will be at the table with a stark business interest in that file. But that's not all. You also have a big opportunity for a copyright battle revival with all the classical actors in this debate: the classical media, think about newspapers, broadcasters, publishers, a.k.a. the return of Axel Springer. Then you will have the entertainment industry, which represents more or less all the rights holders from music, cinema, et cetera, and is actually represented by an army of lobbyists. And then obviously the providers of upload filters like Audible Magic, Facebook, Google, who have a really big financial interest in this dossier. And it doesn't stop there. Brick-and-mortar stores, taxi drivers, hotel owners, so everybody who is attacked or affected by the sharing economy or the gig economy, they of course will have a say. And particularly in national debates, those are strong lobby organizations. Then it's us, the digital rights community. And I think we have to be essential in this debate, because in a way it is about the soul of the internet, as that question about interoperability and liability could easily change the landscape of the internet that we have today. So I think that's why we should care. Then of course human rights organizations; Amnesty International is waking up to digital rights.
Those actors have a different mindset, but I think we have to see them as allies. And lastly, also every type of marginalized group, every anti-racism organization, every feminist organization. We heard about people with disabilities. So take all of the things where we as a society currently disagree whether something should be shown or somebody should be allowed to say that, mix them in a big bucket, stir it, and then say: you, global, private, profit-oriented company, please solve that for us. So if you aggregate all those stakeholders, I would dub it not the constitution of the internet, but definitely the mother of all digital rights debates, just because the interests are so big, and that's why we will definitely not have an easy time with that. Okay, what's next? When should you be ready to act, more or less? This is all super blurry, but rumors hold that the first public consultation will be launched early next year, in 2020, in which all of you can participate. We would be very glad about that. The proposal of the Commission, the first draft, will probably be released at the end of 2020. Then the serious business begins when the text reaches the two co-legislators, the Council of member states and the Parliament, which, to be fair, will probably take more years to find their own positions on the text, their own versions of the text. And so we expect the negotiations between all the parties after that, in 2022, and the adoption probably some years later. To summarize: the 90s showed us that we needed to adapt the rules because the internet is not paper printed on dead trees. Then for a decade it was mostly silent. For the last decade, we had to fight off attempts to open up this law, and now it's going to happen. So we have to deal with that reality, and the good lesson to learn is that in European legislative processes, the earlier you come to the table, the more effective you are.
The same letter that you could send one day before the final vote will mean nothing, but if you send it months before the dossier is actually released or the consultation is launched, you can really bring ideas to the table that will end up in the law. That's why we at epicenter.works have launched platformregulation.eu, which is an attempt to actually tackle that problem in the form of a request for comments. It's an RFC-based website, and you can please comment on it; we will of course continuously improve these ideas and hopefully we'll see some of them later in the law and the legislation that will be released. Then lastly, epicenter.works is a donation-funded organization. We have very strict rules on our fundraising, and the work we do, which is that of a public watchdog, is also not welcomed by many people, because we can't be bought. We always try to speak truth to power, and that means we are even more dependent on the support of many individuals, and a diverse group of supporting members is actually what makes this work possible. You can become one if you go to support.epicenter.works. Now my time for advertisement: follow our work. If you want to be updated on this file, if you want to take part in the many, many battles at European level that concern all of you, follow our work. Follow the best newsletter in Europe at edri.org/newsletter, and also the one from epicenter.works, and follow us on the, unfortunately, Twitter channels that we have. Thank you. Time for questions. Thank you so much. Thank you so much. That's good. Again, the contact details are here, and we have questions from the internet and we have questions here in the room. I would like to start with number two, please. Thank you. So when lobbying against some of the things that we will be lobbying against in the coming years, one thing I hear is that the right of speech is a human right, but the right of reach never has been. So how do I, how do you react to that?
How do I best react to that? Sorry, could you repeat the question? Yes, yes. So what people say is that the right of speech is obviously a human right, but the right of reach, that is, using the platforms, is not. Yeah. Do you want to answer that? Yeah, I mean, it's actually touching on the complicated thing, because it's a very simplistic view to think that it's just about content being published or being taken down or deleted. In many cases, it is about algorithmic curation, what actually ends up in the newsfeed. It is also about which content gets monetized, or when a user gets suspended. If content gets deleted, does it also get notified to the authorities? Often that content is also evidence of a crime. If we speak about the stuff witness.org is working on in Syria, or the documentation of child abuse, in all of these cases deletion might not be the most viable step to take. We wanted to talk about that, but because of time we didn't go into the details; the policy proposals are all tackling that. And when it comes to the question of responsibility and algorithms, there will be a separate dossier that we don't know much about, an AI regulation, which might also be mixed together with the DSA or come in conjunction with it. We don't know yet, because those decisions haven't been taken, but you will surely see these questions also being addressed in the upcoming legislative term. Thank you. Okay, we'll go on with another question from the internet, please, Signal Angel. Hello. So, one of the questions from the internet is: does the law require Facebook and Google to snoop on data that is not presented to the public? To snoop on data in the sense of monitoring it? Monitor, yes. It's actually a good question. I mean, if we talk about a private WhatsApp group or a private Facebook group, it all depends on the national legislation. I know it from Austria:
If you have more than a certain number of people in that group, you have a public forum. And, for example, if you were to deny the Holocaust in such a closed WhatsApp or Facebook group, that would still be an illegal offence. But then the question is, where's the judge? So, if that gets notified to the platform, they should take action. For a Facebook group, Facebook could do that, and would probably also do it if they did their job. On an end-to-end encrypted WhatsApp channel, they technically simply cannot, as they are not party to that communication. I hope that answers the question. Okay, thank you. We have another question at microphone one, please. Hello. I'm from the United States. I run a non-profit transit internet service provider, running about 6% of Tor network exit capacity. So I facilitate quite a few onion circuits, which are decentralized and encrypted, so we can't see what's in them. How does this affect me? And how can I get involved? And can I get involved? Mere conduit. Yeah. I think where you would actually fall nowadays is under the definition of mere conduit intermediaries, which have fewer responsibilities than hosting providers, because you're just providing the network, not really hosting anything on your own servers, if I understand correctly. The rules that I mentioned, with the knowledge and the responsibility and the liability exemption, don't even apply to you; you're even more protected. As for the Digital Services Act and what's coming up, I'm not sure, but I don't think there is an intent to actually modify the rules that apply to mere conduits. It's more about the hosting providers, like the big ones, the forums, the blogs, and so on. Yeah. Okay, thank you. Microphone two, please. So with this question, for example, we see that where the responsibility falls is also a big part of the negotiation as well.
So you have given an interpretation now of how this law affects the current way that publishing works on the internet. But don't you think that this change could also incentivize innovation in the way that we do publishing on the internet, and maybe incentivize more decentralized approaches to publishing, merely in order to escape responsibility? Of course. I mean, one reading would be that platforms just die off and everybody has to have their own website again. But where would the discussion then happen? In the RSS feeds that we all have on our devices? Even something like Mastodon would be directly in the scope of legislation like this. So every type of aggregation, every type of newsfeed, every type of interaction even, would fall within the scope of that law. And that's why interoperability, I think, is a nice synthesis of this centralized-versus-decentralized debate. Absolutely. Okay, thank you. I'm very sorry, I see we have more questions, but unfortunately time is up. Please give another round of applause for Chloé and Thomas Lohninger. Thank you so much.