And now to announce the speakers. These are Diego Naranjo, who is a senior policy advisor at EDRi, and Andrea Belu, who is campaigns and communications manager, also at EDRi. EDRi stands for European Digital Rights, which is an umbrella organization of European NGOs active in the field of freedom, rights and the digital sphere. The CCC is actually a founding member of it. And they will be talking about "Citizens or subjects? The battle to control our bodies, speech and communications." The floor is yours. This one, this one here, this is my phone. There are many like it, but this one is mine. My phone is my best friend. It is my life and I should master it as I master my life. This is my phone, but what makes it mine? I mean, it might be quite obvious right now that I'm holding it for all of you. What is not that obvious, though, is that my phone is also holding me. On one hand, we use our phones. We use them to connect to the Internet, get online with our friends, exchange opinions, coordinate actions. On the other hand, we are used. We are used by third parties, governmental and private, who through our phones, through our devices, monitor. They monitor our location, our bodies, our speech, the content we share. At EU level right now, there is a sort of a pattern, a tendency, a trend almost. Certain laws like the ePrivacy regulation, the copyright directive and the terrorist content regulation have this very central core that we call body and speech control. It looks like it is really the driving force at the moment. So in the next 40 minutes or so, what we will do is give you short updates about these laws, talk to you a bit about what their impact is on us and what they mean past article X and Y, and hopefully convince you to get involved in changing how they look right now. Well, we represent EDRi, as Walter was mentioning before. We are 39 human rights organizations from all across Europe.
We work on all sorts of human rights in the online environment, so-called digital rights. We work on data protection, net neutrality, privacy, freedom of expression online, and so on. And Andrea and I are glad to be here for our very first time at 35C3. Now to continue that adapted quote from Full Metal Jacket: my phone without me is useless. Without my phone, I am useless. We spend most of the seconds of our lifetime around devices that are connected to the internet, whether a phone, a computer, a fridge or whatnot. This means that these devices pretty much become attached to our bodies, especially a phone. Tracking these devices, therefore, is equal to tracking our bodies, to controlling our bodies. For the purpose of this presentation, we will talk about online tracking in terms of location tracking, the tracking of our devices; behavioral tracking, the tracking of users on websites: how much time they spend on which part of a website, where they navigate next, how many clicks they make; and the tracking of communications sent between two devices. First, location tracking. There are on average more screens in most households than there are people. We carry some of these devices in our pockets, and they hold more personal information than most diaries used to. Our phones need to be tracked because they need to be able to receive and send calls, messages, data. But this opens, of course, new ways to use location data for commercial purposes, but also for state surveillance. When it comes to behavioral tracking, tracking our behavior online provides a lot more information than just location; it adds on top of it, right? A user can then be targeted according to that tracking. And this targeting process, not the tracking itself, basically represents the business model of the internet nowadays.
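As a rough illustration of what that behavioral tracking makes possible, here is a minimal sketch of turning a click-stream into a targetable profile. This is not any real tracker's code; the event fields, category names and URLs are all invented for illustration.

```python
from collections import Counter

def build_profile(events):
    """Aggregate a user's page-view events into a targetable interest profile.

    Minimal sketch: real trackers fold in location, device fingerprints,
    cross-site identifiers and far more signals than shown here.
    """
    return {
        "pages_seen": len(events),
        "seconds_on_site": sum(e["seconds"] for e in events),
        # The most-visited content categories become ad-targeting segments.
        "top_interests": [c for c, _ in
                          Counter(e["category"] for e in events).most_common(3)],
    }

# Hypothetical click-stream for one user.
events = [
    {"url": "/shoes/red", "category": "fashion", "seconds": 40},
    {"url": "/shoes/blue", "category": "fashion", "seconds": 25},
    {"url": "/news/elections", "category": "politics", "seconds": 90},
]
print(build_profile(events))
# {'pages_seen': 3, 'seconds_on_site': 155, 'top_interests': ['fashion', 'politics']}
```

The point of the sketch is the talk's point: the richer the event stream, the sharper the profile, and the sharper the profile, the more valuable the targeting.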
For this reason, the more complex and detailed someone's profile is, the more accurately the targeting can be done, the more effective and efficient it is most of the time, and therefore the more valuable the data about that profile is. You can see here a really good infographic from Cracked Labs on Acxiom's and Oracle's profiling of populations. You see the number of variables and the depth of information it goes into, and you get that business model's cash flow. And this business model is quite interesting. I couldn't imagine a postman going to my physical mailbox at home, going through my letters, and then putting in some advertising leaflets according to what he reads. Right now, Gmail and many other services do exactly that. They live, as you well know, off reading your emails to sell you stuff that you don't really need. Facebook conversations are now, through the API, an option too. They want to read those conversations in order to find patterns, for example of intellectual property infringements, especially counterfeiting, but also copyright. Also, WhatsApp metadata is used by Facebook in order to know who your friends are, who you contact, who your family is, so that those social media services gain more and more power, more and more data, and more and more profit, of course. The Lives of Others: quite a good movie. I guess everyone has seen it; if you haven't, you should. It is basically about a Stasi agent who follows the life of an individual over a period of time. After the Snowden revelations, this movie has in a way changed from a drama into a soft comedy, because the capability of surveillance services to surveil all of us, and of companies to get so much intimate information from us, has doubled, tripled, or perhaps grown exponentially compared to what the Stasi could do back then.
And all of this, to some extent, is going to be regulated by this regulation with a very, very long name. I guess half of you have fallen asleep already just reading it, but I'll go through it quickly to let you know what it is about. Why do we need to know about the ePrivacy regulation? Why is it important for the control of our bodies and our devices? ePrivacy is about online tracking. You might have heard of the cookie directive, the one that brought all those cookie banners onto your screens, partly due to a bad implementation of that directive. It is also about your emails: who is going to be able to read your emails and use the data from them to sell you advertising or not, and how confidential that information can be. It is also about your chats, how you communicate nowadays with WhatsApp, Signal, Wire or any other service. And finally, it is also about location data. Who can track you? Why can they track you? And what are the safeguards that need to be put in place in order to protect your privacy? I can imagine many of you saying: don't we already have this GDPR thingy, all those emails we received in May? Yes, we do have that, but the GDPR was not enough. After more than four years of discussions to achieve this General Data Protection Regulation, we achieved a lot, and the GDPR was the best possible outcome in that political scenario. There is a lot to do regarding its implementation, though. We have seen problems in Romania and in Spain, and we expect that to happen in many other places. But we still need a specific instrument to cover the right to privacy in electronic communications, including everything we mentioned before: metadata, chats, location data, the content of your communications, and so on. So ePrivacy is basically meant to complement the GDPR and be more focused on exactly the topics that Diego mentioned. What did we advocate for, and still do?
Privacy by design and privacy by default should be the core principles, the pillars, of this regulation. Moreover, politicians need to recognize the value of maintaining and enhancing secure encryption. Cookie walls: we should be able to visit a website without having to agree to being tracked by cookies. This is another topic that we strongly advocated for. And finally, content should be protected together with metadata, in storage and in transit. We actually succeeded. At the end of last year, 2017, the parliament adopted a very good, very strong text. It addressed most of the problems that we pointed at and supported the values we were pushing for. But it has been quite a ride; it wasn't easy. As Diego said, we are a network of 39 organizations. They are not just legal people or tech people; it is a combination of both. So when we provided our input in the shape of analyses or recommendations, some of them listed there, all sorts of skills were combined. And this played a big part in our success: the fact that we were able to provide a comprehensive yet complex analysis of what encryption should look like, of how cookies should act, and also a legal analysis of existing legislation. The diversity of our skills became productive. Did we win? Well, we are on our way. After the EU parliament adopted its position, it now needs to enter into a discussion with the member states, in what is called the Council of the EU. So the parliament, with a strong position, now has to talk with the member states, right? Currently, the negotiations around ePrivacy are not really moving forward. They are being delayed by the national governments. They claim that there are issues that need to be tackled, that it is very technical, that we already have the GDPR and need to see how it is implemented first.
And member states fear that another layer of protection may impede the growth of some businesses in the European Union. And if this was not enough, they are also afraid of getting bad press from the press, which right now depends to a high extent on behavioral advertising. Publishers say that without tracking you all over the internet, they are unable to sustain their business model. And of course, since the national governments, the politicians, are afraid of that bad press from the press, they are quite cautious about moving forward. Online, we exercise our free speech in many ways, and one of those ways is how we produce, share, or enjoy content online. Our opinions, and the people with whom we communicate, can at a given time be seen as a threat by certain governments. We have seen that trend in governments such as Poland's, Hungary's, and to a certain extent Spain's as well. All of this information can also be very profitable, as we see with the mainstream social media platforms we were mentioning before. So there are political and economic reasons to control speech. And the best way to control speech is to control the way content is shared online. Right now, there are two proposals that pose huge threats to freedom of expression online. Both propose upload filters by increasing liability for platforms, making the companies responsible for the content which they host. One of them is the famous Article 13 of the copyright directive proposal. The second is the regulation to prevent the dissemination of terrorist content online. Both of them, as you will see, are just another way to make private companies the police and the judge of the internet. This is the first one, the proposal for the directive, again with a long name. Just stick to the short name: the copyright directive. And this copyright directive is based on a fable. The fable goes like this.
There is a wide range of lonely, poor songwriters in their attics trying to make money, and the fable is told for their audience. Then there are these big platforms, mainly YouTube but also others, that allow uploads and make a profit, and these platforms give some small amount of money to these authors. The difference between what the authors earn and what they supposedly should be earning is what they call the value gap. The fable, though, conveniently hides the fact that the creative industries' revenues have been increasing by a high percentage year after year after year, and keep growing, especially in the online world. What is the solution to this problem? Well, as you can imagine, it is a magical algorithm. This algorithm will filter each and every file that you upload to these platforms, identify it and match it, and will block or allow the content depending on whether it is licensed or not, on whether they like you or not, and in the end according to the terms of service. As we will mention, there are technical and legal problems with upload filters. In essence, if they are implemented, it will mean that YouTube and Facebook officially become the police and the judge of the internet. The other big fight that we have is around terrorism, to be specific, around terrorist content online. After the Cold War, once communism fell, we needed a new official enemy. Terrorism is that new threat. It is very real to some extent, we lived through it in Brussels recently, but it has also been exaggerated and inserted into our daily lives. We see that in airport controls, surveillance online and offline, restrictions on freedom of assembly and expression all over Europe. And whenever a terrorist attack occurs, we see pushes for legislation and measures that restrict our freedoms. Usually, those restrictions stay even after the threat has disappeared or been reduced. Again, there we go with the long name.
Let's stick to the short name: the regulation to prevent the dissemination of terrorist content online. This proposal allegedly aims at reducing terrorist content online, not illegal content but terrorist content, in order to reduce the risk of radicalization. This ignores what we have seen through experience: that a lot of radicalization happens outside the online world, and that radicalization has other causes, which are not online content. It seems that politicians need to send a strong signal before the elections: we need to do something strong against terrorism, and the way to do that is through three measures. Three measures, three, as we will see in a minute. First, speedy content takedowns. My favorite. Platforms will need to remove content which has been declared terrorist content by some competent authority. This definition of terrorist content is, of course, vague, and also incoherent with other relevant pieces of legislation which are already in place but not implemented all across the EU. This removal needs to happen within one hour. This is a sort of fast-food principle applied to the online world, to audiovisual material. And they give you some sort of complaint mechanism, so if you have any problem with your content being taken down, you can go and say: this content is legal, please put it back. But in practice, as you read it, you will see that it is likely to be quite ineffective. Also, overblocking will not be penalized: if platforms overblock legal content, nothing will happen to them, but if they leave one piece of illegal content up, they will face a sanction. The second measure is so-called measures for voluntary consideration. According to this second measure, the states will be able to tell platforms: I have seen this terrorist content on your platform, and it is really, really bad, so I really felt I had to ask you, could you be so kind to have a look, just if you wish, of course, no worries.
And the platform will then decide, according to its own priorities, how to deal with this voluntary request. Third, good old upload filters. That is the third measure they are proposing. Upload filters, general monitoring obligations in legal jargon, are prohibited in EU legislation. But anyway, let's propose them and see what happens. And in order to be able to push them into the legislation, let's give our filters an Orwellian twist and call them something different: proactive measures. Platforms will need to proactively prevent certain content from being uploaded. How will they prevent this? Upload filters, of course. I mean, proactive measures. And whether it is copyright or terrorist content, we see the same trend, this one-size-fits-all solution: a filter, an algorithm that will compare all the content that is uploaded, match it against a certain database, and then block it or not. We will need many filters, not only one: filters for audio, for images, for text, and also one specifically for terrorist content, however that is defined. So this is basically the principle of lawmaking today: we really want filters, what can we invent to have them? We have got an issue with filters. Well, quite a few issues, but one big issue first of all: they are illegal. The European Court of Justice said, in the case SABAM v. Netlog, that a social network cannot be obliged to install a general filtering system covering all of its users in order to prevent the unlawful use of musical and audiovisual works. Despite this, it seems that automated content filters are somehow okay, as long as you don't call them general filtering covering all users. Of course, there are also the technical issues. One of the best examples, a magnificent example of how filters do not work, was James Rhodes, the pianist who a few weeks ago tried to upload a video of himself playing Bach in his living room.
The algorithm detected some copyrighted content owned by Sony Music and automatically took the content down. Of course, he complained and got the content back. But it is a good example of how filters do not work, because a piece by Bach, who died almost 300 years ago, is of course out of copyright, and yet the video of a famous artist playing it was taken down. We can imagine the same happening to much of your content. So not only do filters not recognize what is actually copyrighted and what is not, they also don't recognize exceptions, such as remixes, caricatures, or parodies. When it comes to copyright, filters can't tell. And this is why memes were a central part of the protest against Article 13, and why we will show soon that this kind of filter has huge potential as a political tool. Another issue with automated content filters is that they don't recognize context either. When it comes to hate speech or terrorist content, they can't tell nuances. A girl decided to share her traumatic experience of receiving a lot of insults in her mailbox from a person who hated her and was threatening her. She copy-pasted them into a post on her Facebook account, and her profile was taken down. Why? Because the automated solutions can't tell that she was the victim, not the actual perpetrator. And this is very likely to keep happening if this is the solution put forward. Then it is also a problem for SMEs, of course, because these filters are very expensive. YouTube spent around $100 million to develop Content ID, which is the best worst filter that we have online now. So we can imagine how this is going to go for European SMEs that will need to copy that model, probably by getting a license from YouTube, I imagine, in order to implement those filters.
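To make concrete why such filters fail, here is a minimal sketch of the match-against-a-database mechanism described above, using exact hashes. Real systems such as Content ID use far more elaborate perceptual fingerprints; this is not their code, and the blocklist entry is invented.

```python
import hashlib

# A database of fingerprints of "protected" works. With exact hashing,
# only bit-identical copies match; perceptual-fingerprint systems match
# more loosely, but neither can ask whether a use is a parody, a quote,
# or a public-domain performance.
BLOCKLIST = {hashlib.sha256(b"original studio recording").hexdigest()}

def filter_upload(data: bytes) -> str:
    """Block an upload if its hash appears in the blocklist."""
    digest = hashlib.sha256(data).hexdigest()
    return "blocked" if digest in BLOCKLIST else "allowed"

print(filter_upload(b"original studio recording"))           # blocked
print(filter_upload(b"original studio recording, remixed"))  # allowed
```

Either way the filter is blind to context: an exact matcher waves through trivially altered infringements, while a looser perceptual matcher also blocks the lawful parodies and out-of-copyright performances the talk describes.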
In the end, this will just empower the big companies who already have their filters in place, so they will just keep doing business as usual, while a new company that would like to develop a different business model will be prevented from doing so, because it would need to spend a lot of money on these filters. Then there is the issue of privatized law enforcement, the privatization of law enforcement. Attributions change: tasks that used to be state attributes are now shifted over, with a "can you take care of it?", to entities that are not really driven by the same values that a state should at least be driven by. I'll give you one example, from a study called the MANDOLA project, commissioned by the parliament to look at the definition of hate speech in different EU member states. Their conclusion: there are huge disparities between what hate speech means in Germany compared to what it means in Romania or in the UK. So in this context, how can we ask a company like Google or Facebook to settle on the definition? Are their terms and conditions the standard that should influence our legal definitions? Am I the only one seeing a conflict of interest here? And there is the problem that once we have these filters for copyright infringements or any other purpose, like terrorist content, we will of course have them as a political tool. Once we have them for copyright, why not use them to look for dissidents in every country? Who counts as a dissident changes very often; I see that in Spain, but I see it all across the EU nowadays. So once we have filters in place for one small thing like copyright, why not for something else, something more political? There is a really interesting example from Denmark. A year or a year and a half ago, the Social Democrats announced their immigration plan. They made a video in which Mette Frederiksen talked about how great their plan is.
Some people were happy, some were sad. Some of the sad ones decided to criticize the plan and made a video about it, a critique in which they caricatured her, using two audio bits from the announcement video. The Social Democrats sent a letter to the NGO accusing them of copyright infringement and threatening a lawsuit. Obviously the NGO thought: we don't really have enough money to go through a big court case, so we are just going to take the video down. And they took it down. Now, why is this case important? If an automated content filter for copyrighted material had been in place, the Social Democrats wouldn't even have had to lift a finger; the job would have been done automatically. Why? Automated content filters can't tell exceptions such as parody. And this is a very clear case of how copyright infringement claims can be strategically used to silence critical voices in the political sphere. So we see a few threats to fundamental rights. First, on privacy: they will need to scan every single piece of content that we upload. Then we will live in a sort of black-box society that will affect freedom of speech. We will also face over-censoring, over-blocking, chilling effects, and tools which are going to be repurposed as political tools. In a nutshell, rights can only be restricted when there is proof of necessity, when the measure is proportionate, and when the measure is also effective. These filters are not necessary for the ends they want to achieve, they are not proportionate, as we have seen, and they are not effective, as we have also seen. So they are, in effect, an unlawful restriction of freedom of expression and privacy rights. Now, obviously we were also unhappy about this, and I mentioned before how we organized within our network to fight for a strong ePrivacy. When it comes to copyright, this fight went beyond our network. It got a lot of people mad.
People like librarians, startups, the UN Special Rapporteur, all of those there and more, and in the end even YouTube, who thought about endorsing our great campaign. What we learned from these fights is that we really need to share knowledge among ourselves. We need to team up, coordinate actions, be patient with each other. When it comes to different skills, it is important to unite them. When it comes to different perspectives, it is important to acknowledge them. If we are separate individuals by ourselves, we are just many; but if we are together, we are one big giant. That is where the impact lies. Now, this is basically a call to you. If you are worried about anything that we have told you today, if you want to support our fight, if you think that laws aimed at controlling our bodies and our speech should not be the ones that rule us and our internet, it is time to get involved. Whether you are a journalist writing about privacy or other topics, or a lawyer working in a human rights organization, whether you have a technical mindset, or whether you have no clue about laws or anything like that, come talk to us. We will have two workshops, one on ePrivacy and one on upload filters. We will be answering any questions you have and can't ask today, and trying to put together an action plan. We also have a cluster called about:freedom, which you can't see there, but it is right by the info point. Do you have any questions or comments? Thank you. There is ample time for Q&A, so fire away if you have questions: walk to the microphones, wave your hand for the Signal Angel. Are there questions from the internet? Microphone number one. Good question, I don't know. I don't think that is going to be possible, but they are going to find a way to do it, because either they ban encryption in those channels, or it doesn't matter, because they will make you liable.
If you have a platform with encrypted channels and everything is in there, but for whatever reason they find any copyrighted content which is not licensed, which you are not paying money for, they will make you liable. Perhaps in practice they will not be able to hold you liable because they will not be able to access the content, but if they find a way to do so, they will make you pay. Okay, microphone number two. Thank you very much for the presentation. You have been talking a lot about upload filters, and a lot of the telcos and the lobbyists are saying that upload filters don't exist. The trilogue mechanism for the copyright reform is, as I have heard, ending in January, and there will be a solution in the European legislative process. How will we be able to inform this process and influence it, to try and make it better before the European elections? Well, we still have time. That is why one of our main goals here at 35C3, apart from enjoying the conference for our very first time, is to mobilize all of those who have not been mobilized yet. Thousands of people have been active: they have been tweeting, they have been calling their MEPs, the Members of the European Parliament, they have been contacting the national governments. But we still have time. The vote will be sometime around January, we still don't know; we are afraid it is going to be sooner than expected. But this is the last push, the last push to say no to the entire directive and no to upload filters, and that is why we are here, because we still have time. Worst case scenario, we go to the implementation phase, of course: we go to the national member states and say, do not do this, this goes against the Charter of Fundamental Rights, and we stop it there. Either now, which is my hope, or, in the worst case, we will stop it for sure in the member states. Microphone number one. What do the companies have to do with that?
Well, they could do it for different reasons, because they could get bad PR. Imagine you are a small company in Hungary, and Orbán comes and tells you: you need to block this, because I think it is terrorist content, even though it comes from a humanitarian rescue organization. What would you do if you are an SME that depends, perhaps not on the government, but on the general structure? You could get bad PR from the government; you could perhaps be penalized because you are not acting promptly on this terrorist content. But it is true, that one is only for your voluntary consideration. Again, microphone number one. Thanks for the talk. When I see a problem, I also tend to think: oh, there is a technical solution, so it is hard for me to admit that maybe there isn't, even though it does look like that is the case here. Also, you mentioned the workshop being aimed maybe more at, I mean, anybody can come, but more at people with a legal background. I don't have that; I am a developer, but I want to understand how the system works. I understand a little bit about the European process and the regulatory process, but not so much. So what is the most efficient way for me, as a developer, to get a better grasp of how this system, all those regulations, get implemented, and all the different steps? Well, we didn't come to a lawyers' computer congress, we came to a Chaos Computer Congress, so I hope you can make chaos out of it. We need developers, we need lawyers, we need journalists, we need graphic designers, we need people with all sorts of skills, as Andrea was saying before, and we need developers to develop tools that work. If you are capable of developing any calling tool, or any other sort of tool that we can use to transform our message and take it to Brussels, to the Members of the European Parliament, to the national member states, we really need you. If we need something, it is developers. We have enough lawyers in this world.
I think we have too many in EDRi with myself already, so we need you tomorrow and the day after tomorrow. Okay, any other questions? In that case, I'll ask one myself. Andrea, what would be a good start at the member state level if you wanted to campaign but have never campaigned before? What, can you please repeat? What would be a good start if one wanted to campaign at their member state level and has never campaigned before? Yes, campaigning for dummies. Well, we have a lot of organizations in EU member states. As a person who had never campaigned before and was looking for someone to campaign with, two years ago in Denmark I was advised to look for the Danish EDRi member. So I did, and we managed to organize a lot of great workshops in Denmark, where nothing existed before, because the Danish EDRi member had a very good grasp of the political environment and of how the dynamics work, both politically and with journalists, and of what the interests of the various parties are. So I would say that finding your nearest EDRi organization is the first step, and then unite with the rest. And if there is no EDRi member, you can always contact consumer organizations, you can contact your members of parliament directly, you can organize yourself with two or three friends and make a few phone calls; that is already enough that you can do. There are many ways for you to help out. Of course, make sure you contact your country's MEPs. At the European level we are being represented, and we actually get to elect the parliamentarians; they are the only ones who are elected by us and not just proposed by governments or other politicians. So if we want to stay connected to our member state but influence a law at the European level, like the ones we talked about, it is very important to let our EU parliamentarians know that we are here and we hear them, and that they came from our country to represent us at EU level. Thank you. Any other questions?
Signal Angel, do we have questions from the internet? Unfortunately not. In that case, we are finished. Thank you all for your attention.