Hello everybody, and welcome to this evening's CIPIL webinar. I am delighted to welcome here today Dr. João Pedro Quintais, who will be talking to us about Article 17 of the new Directive on Copyright in the Digital Single Market, the DSM Directive, and the EU rules on online content-sharing platforms that this provision sets up. João is a postdoctoral researcher and a lecturer at IViR, the Institute for Information Law of the University of Amsterdam. And in fact it is at IViR that I first met João; we both wrote our PhDs there more or less in parallel. It feels like that was yesterday, but I realized today, João, that it was over half a decade ago now. João's doctoral research was on alternative compensation systems in EU copyright law, but since then he has expanded his areas of research. Among his other research interests, João has in recent years been working on questions related to the role and responsibilities of online intermediaries, including the issues raised by Article 17 of the DSM Directive. Last year I had the pleasure of working with João on a co-authored article putting forward an alternative solution to the one set up by Article 17 for copyright infringement that occurs on online platforms. More recently, João has collaborated with Dr. Martin Husovec on the licensing mechanisms applicable in relation to Article 17. This indicates not only the breadth of the topics to which this provision gives rise, but also the depth of João's knowledge of the area. So, João, it is wonderful to have you here today. The floor is yours.

Thank you so much, Cristina. What a kind introduction; it's really, really nice to be here. Although I am in my living room, it is quite nice to be here. We miss you in Amsterdam, but now it feels like everyone is working from home. So just a general remark: I hope everyone is doing well and healthy.
I'll try, over the next 35 minutes or so, to discuss this very interesting and complex topic of Article 17 and the new content-sharing platforms. But first, a few preliminary remarks. One important personal remark is that I am working from home and this is about the bedtime of my daughter, so I always feel it necessary when I have an evening webinar to say that she is the queen of the apartment, and you might hear her speaking very loudly or screaming "daddy" or "papa" or some version of that. So I apologize for that, but there is really nothing I can do; this is her space and I'm just living in it. On a more serious and substantive note, this is such a political discussion that I feel the need to say what my perspective on it is, where I come from, and what I'll try to do in this presentation, because there are a lot of heated discussions on this topic. I come from academia. I am involved in a number of projects that are publicly funded through the European Union, including the Horizon 2020 project reCreating Europe. I'm also, with Cristina, co-managing editor of the Kluwer Copyright Blog, and I'm an editor of the wonderful resource created by Martin Kretschmer and his team at CREATe that tracks the implementation of the CDSM Directive, the Copyright in the Digital Single Market Directive. But I have not done studies or legal advice for any of the stakeholders interested in this debate, and I think this is important. Because what I'm going to try to do is be very clear about what I think is descriptive analysis, what I think the law tells us, where I think there are reasonable doubts, and where I have policy preferences.
This is obviously apart from my general normative preferences, which are pervasive in my analysis, but I'll try to identify those so that we're not in a situation where people think I'm making certain claims just because I'm on this or that side of the debate, and I'm happy to discuss this in the Q&A afterwards. So, having said this, on the menu for today: I'll go quite quickly through a bunch of topics, but the idea is to give you a broad general vision of what this provision means and the broader context of how we got here. Then I'll dig into the content of Article 17 and discuss three issues: the definition of online content-sharing service providers, the authorization mechanisms and the nature of the right, and then the fundamental discussion of preventive measures versus user rights or freedoms. So, on to the broader context. I think we have to understand that Article 17 is just the tip of a broader iceberg, which we're discussing now, namely the Digital Services Act, which is going to be a REFIT of the E-Commerce Directive. A lot of the solutions that you see coming up in Article 17 are part of a broader discussion initiated by the Commission in 2016-17 on tackling illegal content online, and one of the categories there was copyright; as usual, copyright takes the lead. But the major instrument in this area that is currently being negotiated is the Digital Services Act. I know this is very small on your screen, and I don't want you to squint your eyes too much, but the idea is to put the CDSM Directive, the Copyright in the Digital Single Market Directive, which is below the line of the timeline here, into some sort of context, and for you to understand that this doesn't appear out of nowhere.
Almost in parallel, there are these Commission efforts: a communication on illegal content online and a recommendation, both of them pushing for proactive measures by platforms vis-à-vis illegal content, which then find an echo in the draft of the CDSM Directive. But as the CDSM Directive comes in, at the lower part of the graphic, below this green box, you see that there is a very important challenge to the validity of certain provisions of Article 17: the action by Poland against some of these provisions, based on freedom of expression. And as the process progresses, what you'll see is other instruments starting to take shape, so the terrorist content regulation is one of them, and in June and September of this year you have the public consultation for the Digital Services Act. At the same time as that consultation, you have a particular aspect of Article 17 being discussed by the Commission with a number of stakeholders. Article 17 towards the end says that, in order to better define certain of the mechanisms in this provision, the Commission is to carry out stakeholder dialogues; that's the acronym SHD, below the line in green in the graphic, and those are still ongoing. But in September of this year, so very recently, after the sixth stakeholder dialogue, the European Commission actually came out with, let's call it a preliminary guidance, which they call the targeted consultation, which sets out some ideas of how this should look. It has now been commented on very aggressively, and very passionately, by different stakeholders, and right now we're in a holding pattern to see where the Commission is going to come down. But as this is happening, other things are taking place that shape the discussion. You see there, in July 2020, a very important Advocate General opinion in the joined cases of YouTube and Cyando, sometimes referred to as YouTube and Google.
This week, on Tuesday, we had the CJEU hearing in the Poland v Parliament case, where the same Advocate General was involved; he is going to issue an opinion in April 2021. This is important because it might indicate where things are going. We'll discuss this a little bit later, but I want to flag already that this ruling is going to come out at about the same time that member states have to have their implementation done, as per the deadline for transposition of the directive. So it's very much a live topic, with quite important decisions to be made by member states in this respect. Now, in light of this broader context, which I hope is clear and which we can come back to, how did we actually get here in the particular lane that is copyright? Well, there's a short story, which is lobbying and politics, and here you can see that the JURI Committee, the Legal Affairs Committee of the European Parliament, with rapporteur Axel Voss, pushed most of this provision forward. But I think there's a longer story we need to look at, and it's always nice to tell it as a fight between two competing narratives. On the rightsholder side, pushing for stricter measures in this provision, there's the "value gap" narrative. And on the other side, on the user rights side, there's the "upload filters" narrative. Clearly, at the end of the day, the value gap narrative prevailed. You can see a meme here; part of the argument of the opponents, those who said this provision is unacceptable because it imposes upload filters, was that this would be a danger to the free and open internet. But on the other side, what you have is this very powerful narrative, and I have here an example from the record industry's representative body, IFPI, which basically says: look, we have a problem here.
We have a value gap between what the platforms, the user-upload platforms, or so-called user-generated content platforms, are paying versus what other online service providers are paying. And the example has always been: YouTube is this terrible player that pays next to nothing for the content on its website, versus Spotify. I'm not going to have an economic discussion here about whether or not there is a value gap; in my view, this has not been proven economically speaking, but I do think that from the legal perspective we really should look at this with a bit more attention. So what does this narrative mean, this parallel between YouTubes and Spotifys? My narrative device is to ask whether we're comparing apples and oranges, and I've used this before, so those of you who have seen presentations from me before will recognize a lot of this. I call YouTube the orange and Spotify the apple, because there's a nice colour scheme there. Let's start with Spotify. Spotify is an online music service provider; all of this is under the pre-DSM Directive legal framework. It basically intermediates between the public and copyright holders. From the perspective of copyright holders, what they have to license to Spotify are authors' rights, a lot of them aggregated through collective management organizations, and then related rights, record producers' and performers' rights, a lot of them aggregated in record producers. Now, the regimes that apply here are quite clear. On the collective rights management part, we have a specific directive that enables a regime of multi-territorial licensing; that's the MTL acronym there, and CRM is the Collective Rights Management Directive. Apologies for all the acronyms. But on the side of the exclusive rights being licensed, this is a clear case of direct liability.
It's reproduction and communication to the public, and these are Articles 2 and 3 of the InfoSoc Directive. There is no great doubt about the legal regime applying here. So this is a clear situation of an online music service provider, which would be directly liable unless it obtains licenses to exploit these exclusive rights, either individual licenses or collective licenses. Now, the case of YouTube is obviously different. YouTube is a user-generated content platform that intermediates between end users and the public, but we have to understand what legal regime applies to it. Of course, the end user is here the one uploading; the copyright-relevant act is the upload by the end user. This can also be reproduction, but it's mostly about communication to the public. And in this particular case, it's very clear that the act of the end user is subject to primary liability, which is harmonized under the InfoSoc Directive: Articles 2, 3 and 4, but only Articles 2 and 3, the rights of reproduction and communication to the public, matter here. Article 5 deals with exceptions and limitations, and then there are rules on enforcement relevant for copyright, in particular in Article 8. This is the setup for the user. But the perspective of the platform is quite different, right? What we have there are mostly non-harmonized national regimes of secondary liability, with partial harmonization in the E-Commerce Directive, because these platforms are considered hosting service providers and benefit from a liability exemption, a so-called immunity or safe harbor, coupled with a ban on general monitoring obligations. So they are not subject to direct liability; they are subject to a regime that ends up being a notice-and-takedown regime; that's the NTD there.
They're also subject to injunctive relief for content uploaded by their users, and member states are allowed to impose certain other obligations under the guise of duties of care. This has led a lot of these platforms to implement variations of content recognition technologies; that's the CRT there. But that's not the same as primary liability; it's a totally different regime. So there's a more or less clear line between those regimes as far as the law in the books is concerned; of course, there's a lot of interpretation. The question has been: how do rightsholders monetize these uploads on platforms? Because the platforms operate under the shadow of the liability immunity or safe harbor, there have been a number of mechanisms to monetize: some licensing, some monetization through ad revenues and other types of deals, and usually big rightsholders have privileged access to the platform and act as what's usually known as trusted flaggers. Now, this setup is obviously tenuous, because the line is not always clear. It has become less clear in the relationship between platforms and rightsholders as the Court of Justice of the European Union, in interpreting the exclusive right of communication to the public in Article 3 of the InfoSoc Directive, has slowly eroded what is traditionally a strict liability tort by introducing mental elements. This starts a little further back in the case law, but the main cases where it is explicit are GS Media, Filmspeler and then The Pirate Bay. And there are a number of pending cases. The first two there in red are now joined cases involving Google and YouTube, and the Puls 4 TV case I think also involves YouTube. These cases flat out ask the question that gave rise to the creation of Article 17: are these platforms directly liable for communicating to the public?
So far the answer has been no; the line has been eroding, so they're close to the line, but the answer has been no. And there has been a very lengthy opinion, which I'll come back to, from the Advocate General I just mentioned, that actually says no, they are not liable. As far as the law stands today, that division persists despite the introduction of these mental elements. This is important because we already see that there's a clear difference between the apple and the orange in my analogy. Now, where are we specifically today? Well, we are in a very complicated situation because, as I showed you in the timeline, we have the Court of Justice of the European Union trying to answer the question that gave rise to a legislative change, and so far the answer has been mostly no. We have the Polish challenge to Article 17 based on freedom of expression, which could invalidate key parts of the new regime; there was a hearing, as I mentioned, this Tuesday, and the ruling is going to come out at around the same time as the deadline for implementation of the directive. Then there are the stakeholder dialogues under Article 17(10), and the targeted consultation, which already gives more or less an idea of what the Commission wants to do, but most member states are still in a holding pattern, waiting to see what the guidelines say before deciding how to implement this provision. National transpositions are starting to take shape, but in some cases there's a sort of regulatory competition as countries try to stake out their positions: you see especially France pushing for a stricter implementation of the provision, leaning more strongly on the preventive measures, and other countries like the Netherlands, well, not in the first version of its draft, taking a different approach.
In academia there's also a boom in scholarship, and I've been involved in a lot of it, pushing for a more user-rights-and-freedoms-friendly approach; Cristina is also, I would say, on that side. There's the European Copyright Society, and then there are a number of stakeholders, among which I would put not just CMOs but also organizations of a mixed nature, with stakeholders and academics, pushing in different directions in different countries. So this is where we are today, but it will probably change next week. Now, with that setup in mind, having given you the broader context and a little bit of how we got here from the legal perspective in copyright, I will now explain the very complex mechanics of Article 17, and then go to the issues I mentioned before. I'd like to start with this quotation from Séverine Dusollier's recent article in the Common Market Law Review, which calls Article 17 "the monster provision", both by its size and its hazardousness, always a very nice figure of speech. Her point is that if you want to discuss legislative intent here, it's quite difficult, because the provision was highly contested; and it is such a broad and complex provision that it actually opens itself up to a lot of interpretations, and that's both a blessing and a curse, as we'll see. Now, going through it: who does it apply to? Let me just push the dialogue box a little to the side. Well, it applies to online content-sharing service providers, OCSSPs. Whoever came up with this acronym really hates academics, because we have to write it all the time. What the provision does is go for a positive definition: this has to be a user-uploaded content platform that hosts a large amount of works, organizes and promotes them, and has a certain commercial competitive effect. So cases like YouTube and video-sharing platforms are clearly covered.
I'm going to come back to this, because I think other so-called obvious cases are not so obvious at all. There are a number of exclusions: electronic communication services, business-to-business cloud services, online marketplaces, and so on and so forth. This is a combination of examples that would otherwise fit the positive definition and entities that lobbied to be specifically excluded, after which the legislator abstracted a category from the particular example. The obvious one is non-profit online encyclopedias, meaning Wikipedia; and the last one is clearly GitHub, because they were quite involved in the discussion of the directive. I think they're rightfully excluded, but this is a bad way of doing law, I would suggest. If you do not fit under the definition of OCSSP, or you're excluded from it, you are considered a non-OCSSP platform. That means that you're outside the scope of the directive and inside the scope of the previous regime I mentioned. This differentiation of legal regimes means that we're going to have a lot of platforms that fall under this new regime and platforms that fall outside of it, and they're going to coexist. And the question is: how is the pre-existing framework, and whatever case law continues from the Court on that side, going to affect the interpretation of Article 17? So you can already see potential for trouble. Okay, what is the "what and how"? Well, if you do fall under the definition of OCSSP, you are now communicating to the public; you are no longer a hosting service provider benefiting from the safe harbor. You are communicating to the public independently of knowledge of the illegality of the upload, so there's a clear differentiation of legal regimes. This means direct, primary liability.
The OCSSP becomes, in a certain way, a copyright user, explicitly not benefiting from the safe harbor in Article 14 of the E-Commerce Directive, meaning that this is clearly lex specialis to the E-Commerce Directive. The platform will be directly liable, except if the following happens, and now please bear with me because this is quite complicated; I tried to make drawings so it's a bit easier. It will not be directly liable if, option one, it obtains "authorization", and I put this in quotation marks because authorization, as you will see, might mean many things from the rightsholders. That authorization must cover the non-commercial uploads by its end users, so there's an extension effect; actually, in my view there's a joint nature to the two acts, a shared legal construct. And I say non-commercial as a shorthand, because the law says the uploads can also be commercial but not generate significant revenues; those uses are sort of priced into the license. So that's option one. If you do not obtain authorization, you go to what I would call option two, which a recital calls a liability exemption mechanism, in Article 17(4). What we see in this option two is, first, that you have to make best efforts to obtain an authorization. So the regime says: if you obtain authorization, you're okay; but if you don't, the first cumulative condition is that you made best efforts to obtain such an authorization. Now, there's obviously a link between these two concepts, and I think Senftleben and Metzger, in the European Copyright Society's comment, have very nicely put them as expressions of the same duty of the platform. And they say it's very difficult to know what "best efforts" means, but there has to be a proactive search for publicly known copyright holders; it's almost a sort of notice-and-action-to-license system for the others.
So if the rightsholder is publicly known, I have to go and meet them; if they're not publicly known, then basically they have to notify me and I provide a license, and then we get into some sort of pre-contractual obligation system. But I think the essence here is: it's a fuzzy concept that's difficult to pin down, and the directive doesn't say much about it, but you must comply with it because it's a cumulative condition. Now, there are two more conditions. So you didn't obtain authorization, and you proved best efforts. In addition, you have to make best efforts to ensure the unavailability of specific works for which rightsholders provided the relevant and necessary information; this is what's colloquially known as an obligation to implement upload filters. And you have to act expeditiously, subsequent to a notice from rightsholders, to take down infringing content, so that's basically back to the notice-and-takedown regime, and, in addition, make best efforts to prevent its future upload, so that's basically what could be called a notice-and-staydown regime: you receive the notice, you take the content down forever, then you have to filter, or include an automatic recognition system that doesn't allow it to be uploaded again. This notice-and-staydown is what's colloquially known as re-upload filters, just so we're clear on how these terms are usually used. All of this is one side, the licensing and preventive measures side of the provision, but the provision has a whole second part. In that second part, we can call them mitigation measures, though they're quite a bit more than that, you have a proportionality assessment and factors for most of what's above. There's also a regime for small and new platforms, which is mostly window dressing, because one of the requirements is that the platform is not more than three years old; so if you're a new platform trying to get venture capital after a year and a half.
You probably already have to start complying with all of this, so no one's going to give you money for it. Then there's a number of other, more relevant things: the provisions on mandatory exceptions and limitations, which make them similar to user rights, in Article 17(7), coupled with 17(9); a ban on general monitoring in 17(8); and a number of what I would call procedural safeguards, on complaint and redress mechanisms. Now, this is all very complicated, and I won't go through all of it, but if you want to know more about the ban on general monitoring, you really should read Cristina's and Martin Senftleben's very recent study. Its main conclusion, Cristina forgive me, I went and put a tweet of yours there, is that basically you have to be consistent between the ban on general monitoring in Article 15 of the E-Commerce Directive and this new ban, and you have to be prospectively consistent in the new Digital Services Act. Here you also see the importance of the copyright directive as a sort of Trojan horse or precursor of what's to come, depending on your perspective. Cristina and Martin also say that other preventive measures may be admissible in this area, but certainly not the type of filtering provisions and systems we've been talking about so far. In the Q&A we can get into that, but I would highly recommend that you read this study. What I will talk about a bit later is the user rights part of the safeguards, namely in one of the issues that I address. So hopefully by now, although I know this is quite complex, the different mechanisms of the provision are clear. In sum, I think Article 17 is a mix between the apple and the orange in my analogy, a sort of legal hybrid, and I think this legal hybridity comes with some serious consequences for how it has to be interpreted and implemented down the line.
What I think is pretty clear is that there's a normative hierarchy in the provision, which puts at a higher level licensing, on the one hand, and user rights or freedoms, on the other, as the main goals and obligations of the directive, with preventive measures at a lower level. This is what Martin Senftleben, in a different article, has called the Bermuda triangle: licensing for authors, the liability exemption mechanism and preventive measures for platforms, and the user rights or freedoms for users. It also comes out very clearly in the arguments at the hearing last Tuesday by the European Commission, the Parliament and the Council, and in the targeted consultation, which said a very simple thing we've been saying for a while: if you read the provision on its face, there is an obligation of best efforts for the preventive measures and an obligation of result for the provisions on mandatory exceptions and limitations. The latter is clearly stronger from the perspective of normative hierarchy, and you can't ignore that reality when implementing the provision. Now, moving on to the different issues, very quickly. The first is an elephant in the room: what is an online content-sharing service provider? I won't go very deeply into this; these are my top provocations for the discussion. I think it's very clear that YouTube and Vimeo are covered, but some authors have argued that Facebook, and other social networking websites, would not be, because they don't necessarily fulfil the requirements of the positive definition, which, read with the assistance of the recitals, would lead you to the conclusion that the platform's main function must be to organize and promote copyright-protected material. Their main function would lie elsewhere; their main competitive effect would also lie elsewhere.
Now, I'm not saying this is correct or not, but I am pointing to this open definition. I leave the citation here for those who want to read the slides, which I'll make available, but the point is quite a serious one: if you do a case-by-case assessment, it is possible that, depending on how these factors are implemented, some platforms that we would think fit this definition, because they do host even large amounts of copyright-protected content, might not fit it. Certainly, as you go down the line from the bigger platforms, doubts start to creep in about some of these sharing websites. So that's one important point. Now, something I will dig a little deeper into is authorization, and I think the question here is: what is actually the nature of the right in Article 17? I have a very lengthy and quite boring article written with Martin Husovec that should come out of peer review very soon. But I think it's an important point, because what we do is look at the different options of interpretation, and we see four possible options: (a) it's a subtype of communication to the public within the minimum standards of international law; (b) it's a subtype of communication to the public outside those minimum standards; (c) it's a special right of communication to the public; or (d) it's a new sui generis right of communication to the public. Under the first option, everything fits neatly within the international minimum standards: Article 3 of the InfoSoc Directive is basically an implementation within that scope, and Article 17 would be a subtype of that communication to the public.
Well, this is clearly not the case, because a lot of things that currently fit, after the case law of the Court since Svensson on linking, and Filmspeler and others, would not fit in the concept of communication to the public, in my view, and I think most commentators who have looked at this seriously have said the same. So I'll move on from that. The second option, option (b), is a little different: it posits that the European standard has gone beyond the international minimum standard, which is what this image tries to show, but that Article 17 is squarely or fully inside the scope of Article 3 of the InfoSoc Directive. Our argument, and there are a lot of arguments here, is that this is actually not the case, because the current interpretation of Article 3 in all those cases I mentioned does not go so far as to clearly cover the hosting platforms that we're talking about; but more than that, the very construction of the right in Article 17 is such that it puts it outside the requirements that have been, let's say, creatively invented by the Court to interpret Article 3. So it just doesn't fit; even if you think it overlaps partially, which I grant, it really doesn't fit. So our preferred interpretations are then... Sorry, I have lost control of my presentation, so I have to share my screen again. Okay, let's see if this works. Do you see it, Cristina? Apologies. Is the whole presentation visible? Cool, thank you. So, as I was saying, option (b) is really not convincing, so what we think is that this is either a special right or a new sui generis right.
The special right would basically be a qualification, like a lex specialis: it would fit into the type of activities already covered by Article 3, but the specificity of the regime means it is its own regime, so it doesn't follow — it is carved out from — the InfoSoc Directive and has its own rules, which has some serious implications; there are further arguments there that we don't need to go through. The last option is that it is actually a new sui generis right — something completely different, created anew outside the regime of Article 3: these are different activities, they were previously covered by the safe harbour, and they are now fully outside, with their own regime. My main point here — sorry to rush through this — is that in our view the answer is something between option C and option D. This is also, in a way, what is endorsed by the AG opinion in YouTube and Cyando, although obviously not exactly what we're saying, and it's endorsed by the EC's targeted consultation, which calls it a lex specialis. And you might say, "Okay, João, those are really nice diagrams, but why does any of this actually matter?" Well, it's because if you accept that it is an exclusive right within the scope of Article 3 — the first two options — what you have is only a limited number of possibilities for implementation: direct licensing, voluntary collective management, or collective licensing with extended effect. Those are the options accessible to Member States, and they are not very satisfying in terms of making the provision operational. If you accept the other options — the ones shown below, which are not faded out — and follow our interpretation, then we can potentially even talk about having exceptions and limitations for part of these activities, in the spirit that one size does not fit all sectors; that gives you more possibilities of implementation.
So that's the argument in broad strokes. Now, very quickly, to preventive measures versus user rights or freedoms, which is a very hot topic. The schematic is: you didn't get authorization, you made best efforts to obtain authorization, so you comply with Article 17(4)(a), and then you apply preventive measures. Now, how do you apply these preventive measures? The only offers we've seen so far are filtering mechanisms — content recognition technologies — that would apply the so-called upload filters or re-upload filters. These are not suited to the types of uses that are permitted by, or aimed at by, Article 17(7), those user rights and freedoms that I told you sit at a higher level. So, not only for all the reasons I list here, but also practically: there is a contextual nature to a lot of these uses which, as these companies themselves admit, this type of technology is not attuned to. The great problem at this step is that if you do implement this type of measure, you will encroach upon Article 17(7), those user rights or freedoms. So how do you deal with this situation? Well, there is one approach which basically says: we just put the filters on — this was the first version of the Dutch draft — and all that is needed is a complaint and redress mechanism, so users can complain, and if they complain and the content is actually covered by an exception, then we're all good. Well, this is really not satisfactory, because what it means is that in fact these uses will be blocked, and that makes the preventive measures more important and more relevant in the structure of the provision than the user rights — so it's actually inconsistent and not proportionate.
And it would end up — I think that's also the conclusion of Christina's and Martin's study — implying a sort of general monitoring obligation, which is actually prohibited by the provision itself. Now, one possibility we have to explore is how to make sure that these user rights or freedoms prevail over the preventive measures in the schematics of the implementation. How do we do this? Well, one approach — don't get scared, you don't have to read all of this — is what we proposed in a recommendation signed by about 60 European academics: constructing a system based on the existing case law of the Court. The important part is really here: building on that case law, and also the recent Facebook case on defamation (Glawischnig-Piesczek), we try to find a solution that asks what you can actually filter. You can filter either exact matches of files, or equivalent content, because that is something the Court has, unfortunately, recognized for these cases. For those you would have a blocking rule, but if it is not one of those cases, the content would have to be left up while the complaint and redress mechanism runs. There are a number of mechanisms to do that — one is flagging lawful uses — and if you do that, perhaps there is a chance the provision is valid. But that would mean a change in the types of systems platforms are using now; it's not complicated, but it is a different type of approach. Now, this has been opposed very strongly by rightholders' organizations, on the argument that the harm this causes rightholders is much greater than the harm a de facto filtering system would pose for users. Coming back — and I promise this is my last slide, Christina — to the current discussion: a variation of this same mechanism is what the targeted consultation from the Commission is proposing.
They just call it something else — I believe they call it "manifestly infringing" as opposed to not manifestly infringing, and you can filter the manifestly infringing content but not the rest. But we have a problem here, and the problem is the following. This is precisely what was discussed in the hearing on Tuesday before the Court on the Polish challenge to Article 17, and it is the focus of the challenge: the fundamental freedom of expression concerns raised by the preventive measures I just explained. So Member States are in a bit of a complicated situation. The AG opinion comes out in April 2021, and the decision will come out in the summer. In the meantime, they will get to consult the guidelines from the Commission, and they have to implement the provision. What do they do, then, if they adopt an implementation that the Court later declares invalid, and they have to change the law immediately again? So I would say we have a lot of fun times ahead, but I hope I have at least managed to provide some clarity on the topic. So thank you very much for your attention.