The future laws on AI in the EU are being shaped right now. And while many voices, from the police to governments, call for widespread video surveillance, a broad coalition of 14 European organizations calls for the opposite: a ban on facial and other biometric recognition systems in public spaces. The next talk is about these efforts and will be given by five people: Andrea, who organizes campaigns at EDRi; Eleftherios, co-founder of the Greek civil society organization Homo Digitalis; Ella, who is policy and campaigns officer at EDRi; Filip, who is an engineer, artist and activist; and Riccardo, who is a journalist and researcher for the Italian Hermes Center for Transparency. In the talk, they will present the campaign and its goals, and explain what you can do to support it. We're really happy to have you here. So good morning, good evening, or good day, everyone. We don't know when this is being broadcast yet. But this is the Reclaim Your Face talk for this year's Congress. And we wanted to start with an imagination exercise. Imagine you're in a public space. What do you do in this public space? You're probably walking: you're on your way to work, or to your house, or to grab a beer, or you're just walking for the sake of it. You might be meeting a friend to hang around in the park, or some people for a demo. Maybe you're in a public square and you just decided to join some folks for a random concert that started spontaneously because some folks had their guitars around. The point is that public spaces are spaces for communities. They're a platform for group dialogue and civic action. They're spaces where we can really exercise our freedoms, be it our freedom to gather in an assembly to speak out against injustices, or the freedom to document and record abuses. Public spaces are the areas where we want to be treated fairly and not discriminated against because of how we look, how we walk, what we wear, or who we kiss.
Public spaces allow us to decide for ourselves how we want to be seen by others, and to have autonomy over what actions we take. However, the introduction of biometric mass surveillance into our public spaces is threatening our communities, our freedoms, our autonomy, and the very expectation one has to be treated fairly. When we say biometrics, we mean any type of data that relates to your body or your behavior. Biometric data is sensitive data under EU law. What better idea than to combine this sensitive data with an unlawful practice: mass surveillance. Mass surveillance is any monitoring, tracking, or other processing of data of individuals or groups in an indiscriminate or arbitrarily targeted manner. Our qualities, behaviors, emotions and characteristics are used against us. Our dignity is under threat. People are objectified, commodified, dehumanized. The use of these technologies, like facial recognition, is manipulative, for example coercing people into avoiding certain places or events. The problems with biometric mass surveillance are many: from being constantly monitored, to being treated by our government like a potential suspect, to being discriminated against because of your skin color, a religious accessory, or because you're holding the hand of the "wrong" partner. Biometric mass surveillance technology, such as live facial recognition, is being deployed in European public spaces every day, in secret, with no evidence of the need for such deployments and no respect for our rights or our dignity in the public space. This is why the Reclaim Your Face movement is calling for a ban on biometric surveillance in European public spaces. We are a coalition of organizations across Europe, as well as organizations with international reach, such as Article 19, Privacy International and Access Now, and European reach, such as EDRi. My name is Andrea, and I come from EDRi, European Digital Rights, the umbrella organization of 44 digital rights organizations.
Today, I'm joined by my colleagues: Ella, also from EDRi; Eleftherios from Homo Digitalis in Greece; Filip from the SHARE Foundation in Serbia; and Riccardo from the Hermes Center in Italy. In the next hour or so, we will give you an update on what the EU legal landscape looks like, we will zoom in on the mobilization in Greece, Serbia, and Italy, and we will discuss how you can help out if you're as concerned as we are. I'll hand over now to Ella, my colleague, who will give an intro into what the EU legal landscape looks like. Thanks, Andrea. So, what are things looking like for biometrics in Europe? Well, the EU has actually been developing rules on biometrics since as early as 2004, and parts of the EDRi network have been advocating for just as long to make sure that this is done in a way that's lawful and that respects people's rights. Probably most notably, in 2018 the now world-famous General Data Protection Regulation, or GDPR, as well as its lesser-known counterpart for police purposes, called the Law Enforcement Directive, both came into force. These rules explained, really for the first time in European law, what biometric data actually are, and they also established the principle of a ban on their use. Specifically, this meant that the processing of sensitive biometric data was now forbidden in EU law. But there are a number of really broad exceptions and loopholes to this ban, which has opened the door to deployments that have quite clearly violated people's rights and freedoms. And it's been a really similar story in non-EU European countries like Serbia, too. So, in February or March of 2021, we're expecting the European Commission to propose a new and potentially quite ground-breaking law on how the EU will regulate artificial intelligence. This law will likely have ramifications for other European countries, and probably the rest of the world, in terms of the standards it sets.
And it's likely to include rules on the use of what the Commission calls remote facial recognition, which we call a form of biometric mass surveillance, and there are many other forms of biometric mass surveillance out there, too. Over a year ago, there was actually a leaked draft of a paper on artificial intelligence, which revealed that at one point the European Commission had considered a three-to-five-year ban on some uses of biometric surveillance in public spaces. But ultimately, and very unfortunately, they made a political decision that so-called innovation and profitability were more important. So we're really hoping that this time round, when we get this new proposal for a law early next year, they'll be more aware of their obligations under fundamental rights law. To quickly give an overview of what's happening in Europe and what we're doing about it: because our lawmakers and our politicians are not yet taking decisive action to protect European public spaces, democracies, and essential rights and freedoms from biometric mass surveillance, our coalition is stepping up to demand that they do so. We've been raising the alarm about the fact that any use of biometric surveillance technologies to scan everyone in public spaces inherently constitutes a form of mass surveillance. And as a network, we've been following the deployment of these systems in almost every European country. After investigating the high levels of abuse, the harms posed to individuals, communities, and society, the resistance to this biometric surveillance from people across Europe, and the emerging global examples of people's rights and freedoms being severely violated as a result of the use of these technologies, we have decided that enough is enough.
Over 20 organizations in the Reclaim Your Face coalition have been exposing why biometric mass surveillance is so harmful, and they've been providing strong evidence for why we need to ban it. We've focused on the lack of transparency and justifications for existing systems, with, for example, members of the CCC in Germany filing freedom of information and data subject access requests for information from authorities and companies. We've focused on the need for clear legal limits, with La Quadrature du Net in France successfully litigating against unlawful uses of facial recognition by authorities. And across Europe, we've focused on the shocking absence of respect for human dignity, human autonomy, and human rights. So I'm going to pass over now to three of the brilliant organizations in the EDRi network that have been resisting biometric mass surveillance in their cities and countries, so that they can share more about what's been happening specifically in Greece, Serbia, and Italy. Without further ado, I would like to introduce Eleftherios from Homo Digitalis in Greece. I think Riccardo is the first one to go, actually. But I'm sure reshuffling is not an issue. Riccardo will detail what a black hole Italy is when it comes to details on the facial recognition used by police, and how local municipalities are trying to catch up with the latest innovative technologies while completely disregarding fundamental human rights. Riccardo, you've got the floor. Thank you. In the past three or four years, Italy has seen a slow but constant introduction of these biometric surveillance technologies, ranging from face recognition to other kinds of metadata analysis of video feeds. On this map, you can see three main points of entry. In the city of Como, a face recognition system was introduced, but later, thanks to the intervention of the data protection authority, the system was stopped because it was deemed illegal.
The city of Turin and the city of Udine are both trying to introduce biometric surveillance systems. In the case of Udine, they are specifically talking about face recognition systems, while in the case of Turin, they are talking about a kind of metadata analysis of the video that would allow the police to monitor the movements of citizens across the city, to distinguish whether someone is a man or a woman, and to check and detect what kind of clothes or objects they are carrying. These are examples of what's going on at the local city level. When we look instead at the Italian police, at the national level, the scientific police acquired, back in 2017, a face recognition system that can be used during investigations. To give some examples of what's going on, what we've done, and what the data protection authority has said: in the case of the city of Como, this is a summary of an article we wrote in English, which you can find on Privacy International's website, so it can be spread around. Basically, the city of Como was approached by Huawei with an offer to acquire a new, innovative system for making the city safer. So surveillance as a safety measure, but also framed as solidarity between citizens. Through our investigation, through freedom of information requests, we obtained documents that clearly show how the data protection impact assessment carried out by the city of Como was basically a rubber stamp, meaningless because it was done after the system was acquired. It essentially claimed that face recognition equals video surveillance, which it is not. And for this reason, the data protection authority stopped the city of Como from using this kind of face recognition system. From the documents, we've also seen that Huawei, the company involved, was basically pushing for the introduction of these innovative technologies.
And in the end, the city of Como wasted public money on a system that the data protection authority says lacks a legal basis to be used. From the documents that we obtained thanks to the freedom of information requests, we can see the other capabilities of the technology being sold by Huawei. In this case, it's a sort of metadata analysis. We can see that their system, in addition to face recognition, can also track abandoned objects, loitering, crowd density, face detection, head counting, and abnormal speed detection. These are all kinds of metadata analysis that criminalize behavior. Basically, if there is a suspicious behavior, which is not clearly defined, the system can send an alert. This kind of metadata analysis falls within biometric surveillance, the biometric processing of our data. A similar system is also being installed in the city of Turin, as I mentioned before. If we move on to the national level, talking about the system acquired by the Italian police: this is a screenshot of the system in use, shown on national television. You can even see the address of the website. It consists of two different components. There is SARI Enterprise, which is basically an upgrade of the manual search that the police used to do in their mugshot database. Previously, they did this by writing down the details of the suspect. Now the police can use a face recognition system to automate the process and make it faster, a kind of optimization, and they can search images during investigations. Given an image from a CCTV camera, during an investigation, they can match this image against the database to see if there's someone they know and whether there's a match. As Andrea was mentioning, Italy is a black hole when it comes to this information. Whenever we ask questions about this database, we don't know how many people are in there.
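The one-to-many search described here is worth unpacking. Systems of this kind typically reduce each face image to a numeric embedding vector and then look for the gallery entry with the highest similarity above some threshold. The internals of the Italian police system are not public, so the following is only a generic, hypothetical sketch of that idea; every function name, dimension, and threshold below is an illustrative assumption, not a detail of the real system.

```python
# Hedged sketch of embedding-based one-to-many face search.
# All names, dimensions, and thresholds are illustrative assumptions.
import numpy as np

def normalize(v: np.ndarray) -> np.ndarray:
    """Scale embedding vector(s) to unit length along the last axis."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def search_gallery(probe: np.ndarray, gallery: np.ndarray, threshold: float = 0.6):
    """Return (best_index, score) if the probe's cosine similarity to some
    gallery embedding exceeds the threshold, else (None, score)."""
    sims = normalize(gallery) @ normalize(probe)  # cosine similarity per entry
    best = int(np.argmax(sims))
    score = float(sims[best])
    return (best, score) if score >= threshold else (None, score)

# Toy example: a 3-person "gallery" and a probe close to person 1.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(3, 128))                     # stand-in 128-d face embeddings
probe = gallery[1] + rng.normal(scale=0.05, size=128)   # noisy shot of person 1
match, score = search_gallery(probe, gallery)
print(match, round(score, 2))
```

A real deployment would use a trained face-embedding model rather than random vectors, and the threshold directly controls the trade-off between false matches and misses, which is exactly why independent evaluation of these algorithms matters.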
The overall number is two million images of Italian citizens and seven million images of foreigners. But when it comes to these foreign citizens, it's not clear whether they are citizens of other European countries or migrants. We know for sure that migrants are in there, because the AFIS database, which is the name of the database used for this face recognition system, is a database that also includes people's fingerprints. So if you are required to give your fingerprints to the Italian police, you are included in this database. Evaluations of the algorithm used by the police are also lacking; every time you ask for information, they basically refrain from giving you any kind of detail. In 2018, the Italian data protection authority opened an investigation into the second aspect of this face recognition system, which is the real-time one. Two years later, the investigation is still ongoing, and we don't have any details regarding the legality and the possibility for the police to use the real-time system, which was acquired in order to monitor public demonstrations and public events, and which falls within a kind of biometric mass surveillance. But we're still waiting. I now leave the floor to Filip to introduce what's happening in Serbia. Thanks, Riccardo. I hope everyone can hear me. So, as Ella said, Serbia is a non-EU country, and we are in a kind of hybrid regime at the moment: the media are controlled and freedom of speech is suppressed. Officially, we are leaning towards the West and the European Union, but we also have a strong influence from the East and China, based on, I would say, mutual interests. We are part of China's Belt and Road project, along with some 70 other countries, mostly third-world countries, and we are probably victims of that debt-trap and soft-power influence.
So Huawei managed to sell us a huge infrastructure with around 8,000 facial recognition cameras, deployed on poles in the streets, on police cars as of a few days ago, and soon also as bodycams on our police officers. Since we are not into geopolitics, we don't care whether these are Huawei's, Siemens', or some U.S. company's cameras. We just care that this is completely unlawful. Two years ago, our minister of police said that there would be no significant street entrances or passages between buildings that would not be covered by cameras: "We will know from which entrance of which building the perpetrator came, and from which car." This was kind of shocking to us, so we started doing research on this. We sent a lot of freedom of information access requests, and we were denied, mostly on confidentiality grounds. So we did some OSINT, open source intelligence, research. We found a nice case study about Belgrade on Huawei's website, which was kind of weird, because we didn't get any information from our government but we got it from a Chinese website. And as soon as we published it, it was removed from their website, which was kind of interesting. We analyzed the laws, and we were pretty sure that this is unlawful. We also found some other ways to obtain information, like this one, while the cameras were being deployed on the streets. And finally, we realized that we had to reach out to the community, because by now we knew that the whole system is really bad for society, and that it's also unlawful. In Serbia we have a data protection law which is basically a translated GDPR. The actual purpose of this system was never defined, and its necessity was never demonstrated. Belgrade is not an unsafe city: we don't have terrorist attacks, and we don't have much low-level crime. So why do we need this kind of system? Also, the data protection impact assessment has not been approved yet, but the whole system is still being installed in Belgrade.
So what did we do next? We started reaching out to the community. This is a small exhibition at an art festival: we put one bench under surveillance, with QR codes pointing to our cute little website, which was just a means to leave contact information for everyone interested in joining this fight. We promoted it at music festivals and hacker events, and we printed a lot of these stickers to promote the website. At one point it went viral, and a lot of people reached back offering help in many ways, which was a good sign for us. We gathered them at a nice event and then began collaborative work in several domains. We had a group called .txt, where journalists were speed-writing everything our fellow citizens needed to know. We had the .pdf group, legal experts who were analyzing the laws and working out strategies. We had the .html group, web developers and designers creating the website. We also had tech people analyzing patents and other resources. This was one of the blueprints, and it came to be something like this: we made a really nice map called "the architecture of a face recognition system". This is yet to be published, but we used our understanding of this system and some of its details to help our fellow citizens understand the issue. This is a part of the website that explains how this whole system works and why it is so bad for the whole of society. Finally, the people from the local hacklab in Belgrade helped us deploy Surveillance under Surveillance, a German free and open source application for mapping cameras. We deployed it, and then we started "the hunt for the cameras", as we call it, with our community in Belgrade and across Serbia. We invited people to recognize the cameras that recognize them. These are the three most common cameras used in this system.
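A camera hunt like this has a practical wrinkle: several people will report the same camera from slightly different GPS positions, so reports have to be merged before they go on the map. Below is a minimal, hypothetical sketch of that deduplication step; the function names and the 15-metre merge radius are my own illustrative assumptions, not details of the actual SHARE Foundation or Surveillance under Surveillance tooling.

```python
# Hedged sketch: merging crowdsourced camera sightings by distance.
# The 15 m radius and all names here are illustrative assumptions.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def merge_sightings(sightings, radius_m=15.0):
    """Collapse reports closer than radius_m into one camera location."""
    cameras = []
    for lat, lon in sightings:
        for cam in cameras:
            if haversine_m(lat, lon, cam[0], cam[1]) < radius_m:
                break  # duplicate report of an already-mapped camera
        else:
            cameras.append((lat, lon))
    return cameras

# Two reports of the same pole-mounted camera, plus one across the street.
reports = [(44.81720, 20.45720), (44.81721, 20.45722), (44.81760, 20.45800)]
print(len(merge_sightings(reports)))
```

For a map on the order of a thousand cameras, this naive pairwise scan is entirely adequate; a spatial index would only matter at much larger scales.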
We set up a Twitter profile, which instantly became popular. People were sending us photos from the streets with GPS coordinates, and we were able to gradually fill in the whole map of Belgrade, which counts more than 1,000 cameras at the moment. So we launched this crowdsourcing campaign, which was really nice, and it has been quite successful, because we now have a strong community supporting us. A few weeks ago, we ran a nice petition campaign asking the government to ban biometric mass surveillance. And finally, of course, we managed to start a crowdfunding campaign, because, as you can see in this internal meme we made, one of the components in our anti-biometric-mass-surveillance starter pack is resilience: we mostly do this pro bono. So we started this crowdfunding campaign, and it's going really well. We're hoping to ban biometrics in Belgrade and remove all the cameras. So thank you very much. I will leave it now to Eleftherios. Thank you, Filip. Eleftherios, if you would like to turn on your camera and... Hello, yes, thank you. I had a small issue with my camera before; I hope that now you can see me well. Thank you, Filip. Thank you, Andrea. Greetings from Athens, Greece. In the next few minutes I will present my slides about the actions we have taken in Greece. I will speak about the central database of biometric information, about the use of drones and cameras by the Hellenic police during demonstrations, and about an important contract for smart gadgets that will enable the Hellenic police to use facial recognition technology during stops and checks in our streets. So, without further ado, to begin with: the Hellenic police and a private vendor called Intracom Telecom signed a 4-million-euro contract last year.
The company will develop and deliver to the Hellenic police smart devices with integrated software enabling facial recognition and automated fingerprint identification technologies, among other functionalities. The devices will be very small, about the size of a smartphone, and police officers will be able to carry them and use them during large-scale police stops in order to take a close-up photograph of an individual as well as to collect their fingerprints. The fingerprints and photographs collected will then be immediately compared with data already stored in EU databases for identification purposes. So we decided to take some related actions and to file freedom of information requests with the Hellenic police. The replies we got were very unsatisfying, so we decided to proceed with a complaint before the Hellenic data protection authority in March 2020. It is important to remember that no related legal basis exists for the use of such technologies in Greece, and that no data protection impact assessment was conducted by the Hellenic police prior to the signing of this 4-million-euro smart policing contract. We were very satisfied to see that the Hellenic data protection authority followed up, and in August 2020 they started an official investigation into this contract. Now, moving on to the next action, I will briefly speak to you about the central biometric database of the Hellenic police. Our police collect all fingerprints of Greek passport holders in a central database. It is again important to underline that, based on the EU laws on passports as well as the settled case law of the Court of Justice of the European Union, fingerprints shall be stored in the passports themselves, the very documents we carry in our bags and in our pockets. So EU law neither prohibits nor allows for national central databases of biometric information to exist.
If member states want to proceed with the creation of such databases, they have to create their own laws. But we also need to remember that, based on European data protection laws, the processing of biometric information such as fingerprints is allowed only where it is strictly necessary, subject to appropriate safeguards, and authorized by a national law. So we had to take action again, and in June 2020 we decided to file two strategic complaints before the Hellenic data protection authority, acting as data subjects. Two months later, the data protection authority replied to our complaints and again started an official investigation. Now, in addition to these two actions, I would like to briefly speak to you about the use of drones and other types of surveillance mechanisms by the Hellenic police. Back in April 2020, we filed a freedom of information access request before the Hellenic Ministry of Citizen Protection, because the Hellenic police had used drones to monitor people's movements during the COVID-19 lockdown measures last Easter. Also, a few days ago, we filed two more freedom of information access requests before the chief of the Hellenic police regarding the use of drones and pole-mounted cameras at public demonstrations. It is important to mention that back in April 2020, the Hellenic police didn't have a legal basis to use drones or other types of cameras and surveillance to monitor individuals' movements. The related presidential decree, the legal basis, was only adopted a few months later, in September 2020. So with our first freedom of information access request to the Hellenic Ministry of Citizen Protection, we demanded to know what the legal basis was for using these drones in our city centers back in April.
With our second round of freedom of information access requests, we asked the Hellenic police to give us access to the related data protection impact assessment that it is obliged to carry out, as well as to the administrative decisions that the Hellenic police is, again, obliged to publish under the applicable laws. And I will now give the floor back to Ella and Andrea to continue the presentation. Thank you, Eleftherios. Ella, I think now you will give us some tips on the next steps for our campaign. Absolutely, thank you, Andrea, and thanks to the real Eleftherios this time. I'm sorry for my mix-up earlier; I hope I didn't confuse anyone. So yeah, next steps: how are we continuing to resist biometric mass surveillance in Europe? Well, we're continuing to reveal abusive uses and to contest the implementation of biometric mass surveillance across our public spaces in Europe, because we're just not seeing enough action from regulators or data protection authorities. Instead, we're seeing law enforcement, governments and private companies really taking advantage of the democratic and legal vacuum that we're in right now. So in 2021, we plan to do even more to engage people across Europe to help us challenge those in power and to seek answers about what's really going on. This is going to include everything from formal challenges and requests for information from authorities, like freedom of information requests, through to public workshops and even partnerships with artists. Our big-ticket item for 2021 is going to be a European Citizens' Initiative calling for a ban on biometric mass surveillance practices in the EU. This initiative, also known as an ECI, is a form of legally recognized petition which will call on the European Commission to take concrete legislative action.
We have very clearly told them that right now, the lack of specific EU laws to limit uses of biometrics, and the problems of enforcing the existing general principles, mean that they are in violation of their obligations under the EU Charter of Fundamental Rights, and that this causes so many harms, in the ways that my colleagues have just explained for just three of the countries where this is happening, systematically, across Europe. And so, soon, we are going to need over one million European nationals to sign our ECI to say that they agree with our call to ban biometric mass surveillance practices. We really hope that we can count on you for this. The ECI is also going to give us a platform to extend the coalition that we've been building, bridging across issues of labor rights, media freedoms, social and racial justice, women's rights, LGBTQI rights, the environment and more. Because after all, biometric mass surveillance is an issue that can cause very serious harm to certain groups, but of course it can also have a chilling effect on absolutely everyone, because it really goes against that fundamental right that we all have to respect for our private life. Andrea, I'd like to invite you to let everyone know how they can join us and be a part of the Reclaim Your Face movement. Thank you, Ella. So, biometric data includes data about our bodies and behavior. That means everything from fingerprints, palm prints, palm veins, faces of course, DNA, hand geometry, irises, retinas, typing rhythm, gait, voice and much more. Companies and governments in Europe are innovating and trying to find new ways to capture your identity. Yet under EU data protection law, this data is especially sensitive: it is linked to our identities and can be used to infer protected and intimate information about who we are, our health, and more.
If you think this is problematic, and if you're worried about all the information my colleagues have mentioned so far, well, get involved. Over 11,000 people have already written to their governments. You can go to the ReclaimYourFace.eu website and sign a petition to address your government. From February, though, we're mobilizing to address the EU. So, two things: what can you do now, and what can you do in February 2021? As you've just heard, we're focusing on gathering evidence and gathering support. So right now, for starters, you can write an email to your mayor or to a city councilor in your own city, asking them to promise that they will not deploy biometric mass surveillance in your city's public spaces. Did they reply that they're positive about it? Get in touch with us and let us know, and we'll celebrate together. In February 2021, we will launch the European Citizens' Initiative that my colleague Ella mentioned. In order for this to be really powerful, we need to reach one million signatures; then the European Commission will be obliged to respond to us. This means we'll need a lot of help. Do you want to get involved in gathering these signatures? Again, get in touch. Finally, what if you're an organization, or you know an organization that might be interested in joining our coalition? We're looking in particular, as Ella mentioned, for organizations that cover a broad range of areas: media freedom, freedom of assembly, disability rights, sex workers' rights, labor unions and more. Let us know, put us in touch, and let us build together a civil society movement that will make your grandchildren pretty proud of you. Thank you. I'd like to invite my colleagues to turn on their cameras. You can get in touch with us through our Twitter handles, of course, or by writing directly to info@reclaimyourface.eu. I'm seeing them pointing in all directions. We're looking forward to your questions, and see you around.
Thanks a lot for the overview of everything that's happening in Europe right now. I was really interested in what made you start the campaign, or what was the point where you thought: now this has to change, we have to run this campaign, bring organizations together and stop this. Yeah, so thanks, Rick, and hi, everyone. And on the point about experimenting with the Q&A: we are very happy and comfortable experimenting with a Q&A. We are not happy with experimenting on people's faces and bodies in public spaces. So that's kind of what's driven us to launch this campaign. From a policy and legal point of view, we've been following developments across Europe for many, many years across the EDRi network, and we were noticing a trend of really harmful uses just going up and up. We were seeing pretty much every police force in Europe starting to trial these technologies in a really sketchy way. We were seeing private companies rolling them out in supermarkets, concert venues and football stadiums without any public debate on what this means, without properly considering the risks, without knowing what it means for the law. And then, when there were rumours that the European Commission was considering some sort of ban, a three-to-five-year moratorium on some of these technologies in public spaces, that gave us a lot of hope as civil society and as activists. And it never materialized. And we realized that civil society isn't really being listened to, and the public isn't really being listened to, when it comes to these biometric technologies. But, as I've noticed when I've spoken on panels alongside CEOs from surveillance tech companies, they are the ones that are being listened to by the EU. They're the ones that are getting millions of euros to essentially experiment with our faces and bodies, with our public spaces.
So we decided enough is enough: we need to do something about this and make sure that our European values and democratic principles are actually adhered to. It's really, really important to us that this is something that is properly considered, not something that's just done so that a few years down the line we hear: well, it's everywhere, it's too late. It's not too late; this is really the time for us to take decisive action. But yeah, Andrea, am I missing anything?

Yeah, no, I think that was a great overview. And if I may chip in some thoughts: we started working on this already more than a year ago, right? So when the corona crisis and all the measures to stop the spread of it were not even in our worst dreams. But I feel like in the past months, the general public has grown more and more aware of where surveillance measures can lead, what they look like, and they see them above their heads when they go out in their cities. And I feel like this awareness is also a very big push for this campaign to move forward. And like Ella mentioned, when you see that people are outraged about this, and when you know that the EU is making new laws on the topic, there is no way we can stay quiet. So I feel like, and I hope not, but I have a feeling that more and more reasons to carry this fight forward will appear, so we better start mobilizing.

Ella, you mentioned that in many cases there's no real debate; people just start implementing systems and listen to the bosses of big companies instead of the general public. What do you think is the reason for that? Why are we missing out on these really important debates? What do we need to do to have these debates before systems are introduced?

There are so many reasons, really, why it's happened this way. A lot of the pushback, shall we say, that we hear from the European Commission, for example, consists of things that just aren't true. So there are a lot of myths out there.
People say: oh, well, you can unlock your phone with your fingerprint, so it's too late. And actually that's really not the same thing as, suddenly, every time you leave your house to go to a protest, to walk in a park, to do nothing at all, you're being tracked. And not only tracked, but that's etched into your identity in a really immutable way that's going to follow you around and connect you to lots of people. It's absolutely not the same thing as unlocking a phone. So we hear these kinds of arguments that aren't really true. And we face the security argument a lot, and we have governments in particular using it to justify unlawful activities. Actually, if you look at the EU law that we have, there are of course provisions that allow governments and police forces to protect their citizens. That's a legitimate policy goal: to keep people safe. It's just that there are controls on the way they're allowed to do this, so that we have those checks and balances that also protect us from abuses of power and from arbitrary surveillance. So these excuses of "oh, well, we need to do this and it's justified" don't hold up; there is a very clear legal threshold. And what we're seeing is that pretty much every single European country is not meeting these legal thresholds, which are designed, actually, to keep us safe. And when we think about what it means to feel safe and to feel secure, to know that you can express yourself, whether that's your gender identity, your religious beliefs, whatever it might be: you should be free to be yourself, and secure in knowing that you can vote, you can attend a community event, you can participate in the things that make life worth living, the public activities that make us feel part of something bigger than ourselves.
And I've said this before: you only have to look at every dystopian science fiction film and book ever to know that the sort of society where there's someone breathing down our neck, capturing every bit of us to put in a database, and turning us into walking barcodes is not a society where everyone feels safe and secure. It's actually one where we're treated as if we're criminal suspects all the time. And we know that once people understand it in those terms and see that this doesn't contribute to the sort of free, vibrant, and open society that I think the vast majority of us want, it becomes a lot easier to understand the nuance of this debate and to not necessarily pay attention to a lot of the political statements we hear, or often the marketing. You hear tech companies spout all sorts of glossy things, and often they can't do what they say; we've found many that would not be lawful if they could do what they claim. But that's a lot of what we hear. In civil society, our budgets are not as large as the tech giants', which is why it's really important that we find creative ways to raise our voices.

Yeah, just to add to that: I think biometric data is really a goldmine for government surveillance, but also for surveillance capitalism. So when we're talking about this, we're talking about private companies, of course, on the one side, but we're also talking about governments. There is a common interest here that, I feel, is not made clear to the public, for obvious reasons. And I feel like there's also a misunderstanding when we discuss what the real public interest is, because of course governments and companies can shape the discussion around the public interest and what it means when it comes to biometric data, biometric mass surveillance, and face recognition. But is that really the public's interest? Is it really in the public's interest to feel like a suspect whenever you step out of your house into a public space?
Is it in the public interest to fear that whatever database your face is stored in will leak tomorrow, and you will never be able to change your face again? Is this in our public interest? I doubt it. So I think perhaps we can also do better across civil society to have a common narrative about what we want this public interest to mean.

From your explanations, it seems to me like you're implying that many of these misunderstandings, and the fact that people see these technologies in different lights, actually come from people not being informed enough about how these technologies work and what implications they have. And what I got from your answer is that the difference between a utopia and a dystopia lies in the nuances of what you think might happen as soon as these systems are installed. So what do you think we need? We as a movement, as a people, as civil societies: what do we need to do to get that knowledge to the people who are making these decisions, so that they can come to our side and see that this is actually a step towards a technological dystopia, and not a safe and nice future for everybody?

Yeah, well, if I may, I'm thinking translation and tangibility. We often talk about human rights, but these are such theoretical, non-tangible concepts that very few of us actually envision something when we spell them out. And I feel like we need to make these concepts much clearer and much closer to the everyday reality of people. This is something we can definitely work more on. Also, personally, I think the role of art is so important in conveying the impact of biometric mass surveillance on our freedoms: without mentioning the human right that is at stake, but rather the experience you will have if you don't take action. And finally, we spoke a lot about dystopias and utopias, and I feel like we're doing that already.
We really need to envision what we want this future to look like, so that we have a common vision and can make it clear that our values are shared; we're all working towards the same goal here. Ella, please add.

Thanks, Andrea. And yeah, just to wrap up what Andrea said: absolutely, we need to work towards a positive vision of the kind of world that we're trying to protect and create, where everyone is free, not put in a box, not judged and labeled and controlled based on who they are, but where we encourage that beautiful full spectrum of human difference, human diversity, human ability. So: trying to build that world and, as Andrea said, translating it not just in terms of concepts but in terms of languages, making sure that this is a multilingual, accessible, broad movement, so that whatever language you speak, wherever you live, wherever you come from, you have a place in the movement. I hope that was short and sweet enough.

That was wonderful. Thanks for being here today, although remotely, and thanks for recording and giving your talk. I think there's not much more to add. Everybody who's watching this and is interested in further discussion, please go over to the discussion room. And you two, I don't know if you're already there, but please move over to the discussion room as well, and then we can start the discussion there. Thanks a lot.

Thanks to you. Looking forward to seeing you in real life next year, hopefully. At some point, everything will happen in real life again. Bye. Bye. Bye-bye. Bye-bye.