Okay, hello and welcome to today's event on perspectives on encryption and child safety. We're gathered here today because Apple has announced two new features for their products. One of those would allow parents to see what children are doing in Messages in some cases. And the other one would scan the photos they upload to their iCloud Photos library for CSAM, which is child sexual abuse material. Both of these had good intentions in mind, but at the end of the day, they do pose some novel questions, both for the child safety that they purport to help and for the broader question of privacy on our devices. Apple had, you know, an eight-story billboard saying that what happens on your iPhone stays on your iPhone, and now they're suddenly changing that. And we in the community of people who worry about what's happening on your devices, and who worry about the people that these things might harm, which it turns out to be quite a broad range of people, have some worries about this that we want to make sure are discussed. And previous discussions about this topic have tended to be more among the usual suspects in the field of encryption and technology. And so the goal of today's event is to make sure that a broader range of voices are heard in this discussion, as they should be, because it turns out that the devices that almost everybody in the world is using will affect almost everybody in the world. So we have a very large group of people who have kindly taken the time from their day to come talk. They focus on child safety, they focus on surveillance in parts of the world without a great human rights history, they focus on a whole variety of things, which is why I'm excited for you to hear directly from them instead of me, who is, oh, let me introduce myself. I'm Erica Portnoy. I'm a senior staff technologist at the Electronic Frontier Foundation, and I will be moderating today's event. As for the logistics of the event, we're going to start out with opening statements from some of the people attending, after which we'll transition into a panel-style discussion. I have some questions prepared, but if anybody watching on the YouTube live stream or any of the attendees would like to discuss a particular topic more in depth, please put it either into the YouTube live stream chat or into the Zoom chat for the meeting participants. And without further ado, I would like to start off with our first speaker of the day, who is representing the Child Rights International Network, Diana Georgiou. And I will let her say for herself what their organization does and what their perspectives are today. Thank you very much, Erica. Hi, everyone. It's a pleasure to meet you all, and thank you very much to EFF for organizing this panel and for having us on. So Child Rights International Network, or CRIN for short, is an international human rights organization focusing on the rights of children. And we work on a number of projects, from pushing for justice for the survivors of child sexual abuse, to rights-compliant approaches to counterterrorism measures in the UK, and access to justice for children's environmental rights. And CRIN started looking into children's rights in the digital environment around eight years ago. And the entry point into children's rights and tech was actually a policy paper on the protection of children's access to information, including online.
And the paper highlighted disproportionate restrictions on the right to information, which were being defended on child protection grounds. And CRIN's argument was that access to honest and objective information appropriate for children's age is a prerequisite for all children's rights and should be part of any child protection strategy. So in our work, we are looking at the whole range of children's rights. And our approach is that children's rights are interdependent and mutually reinforcing. So in this discussion, and for us in beginning to explore the issue of encryption and child sexual abuse material more generally, we aim to help move the debate beyond a framing of protection of children versus privacy. And we aim to recognize that children's rights are on both sides of any balancing exercise that we make in this area. So any attempt to tackle the distribution of child sexual abuse material should take all children's rights seriously, from protection from sexual violence to privacy and free expression. So I'm really looking forward to this discussion and to learning about your work and your approaches to the issue. Thanks very much. Thank you so much for that introduction, Diana. That was great. You know, we have a bunch to get through, so I'm going to keep it short in between, and so please forgive me for not having longer introductions. Next, from the National Center for Lesbian Rights, we have Maxi B. Hi there. My name is Maxi B., and I am working with the National Center for Lesbian Rights. Our focus is on supporting LGBTQ people through policy, education, advocacy, and legislation. We started to become focused on this issue because we believe that the approach that Apple has taken with both parts of its policy to tackle the CSAM material it is trying to combat on its platforms has been kind of a problematic one in several ways, but today I'm mostly going to focus on the parental notification system. With the parental notification system, we're mostly concerned because it highly impacts LGBTQ youth in particular. It's an opt-in feature that will notify parents of children's behavior depending on the kind of imagery they send and receive. We have huge concerns mostly because LGBTQ youth in particular are very vulnerable to changes in their security and their safety, and with, of course, the ways that the electronic frontier is changing, we think that these sorts of changes pose a really serious harm to their well-being. So today I'm going to mostly focus on that. Thank you. Thank you for that, Maxi. And next up from Defend Digital Me, we have Jen Persson. Thank you, Erica. Defend Digital Me is a campaign organization for children's safe data use in the education sector, and for the last six years we've been researching the safety tech sector in the UK. The government here has vowed to make the UK the safest place to go online, but what are the costs to children's rights? We agree very much with the two previous speakers that children's rights and children's voices are left out of this conversation far too often. In fact, if you look at the Australian eSafety Commissioner's research carried out in 2019, she found that when it comes to monitoring, young people were split in their views: the majority, 71%, believed monitoring systems and scanning content were helpful in preventing negative experiences, but over half, nearly 60%, were uncomfortable with these features running in the background.
And a sizable minority found that those features would be intrusive and were unsure about their effectiveness in ensuring online safety at all. So we feel that children's rights are easy to say, but actually much harder to hear in these policy debates. And it's very important that those rights are respected, given that General Comment No. 25 on the UN Convention on the Rights of the Child was published only this year, with a view on surveillance that digital surveillance should not be routine, that it should be very carefully applied with particular targeted measures and only where the least privacy-intrusive measure available can fulfill the desired purpose. And it also drew out the importance that the digital environment presents particular problems for parents and caregivers in respect to children's privacy. And that recognizes the child's right to privacy, but also that the child's rights to freedom of expression, to freedom of conscience, religion, and thought can all be impacted by surveillance in the digital sector. So we're here to really advocate for children's voices to be heard, not only to discuss their fears of misrepresentation and their concerns about trust in the family and institutions, but also to ask industry to consider the reputational risk for themselves and individuals when looking at digital monitoring and surveillance in their private sphere of life. Thank you. Thank you, and thank you for bringing in those reports and numbers. That's really helpful to hear. Next up from Freedom Network USA, we have Jean Bruggeman. Hi, thank you. I'm the Executive Director of Freedom Network USA. We're the US's largest coalition of survivors, advocates, and organizations working to end human trafficking and ensure comprehensive services and support to survivors in the United States. Among our members, we serve over 2,000 survivors a year, working with adults and minors, sex and labor trafficking survivors who are both US citizens and foreign nationals. We work from a human rights-based approach, where we understand that human trafficking is the inevitable result of sexism, xenophobia, and racism in our policies, and that thus we must engage in dismantling these systems in order to end exploitation. So understanding that sort of background, we do welcome technological advances that help keep children safe, but we have to engage in careful consideration of the complex issues at play. Although we often use child sex trafficking as sort of the reason to engage in really extreme measures, what we know is that young people can be involved in sexual activities in many ways that can be difficult to understand from simply looking at their images or online activities alone. Additionally, determining the age of an individual by sight or photo is often impossible. We also know that young people rarely disclose who might be controlling them through their online activity. And what we know from working with survivors in the US is that those exploiting young people include parents, siblings, friends, people in positions of trust like teachers, law enforcement, and other people in power, including those in government and in corporations. So it's important to also understand, on top of that, that young people are arrested for prostitution and related crimes, including sex trafficking crimes, every year. So these are people who we might see as both victims and perpetrators of these crimes, and our legal system often sees them as such.
But understanding their actions, culpability, and the most effective strategies to meet young people's needs is very complex and needs to address the full range of these experiences without assuming that all young people engaged in sex acts are being abused or want intervention. And in fact, intervention that is not wanted can often be harmful to these young people, especially in cases of our queer and transgender youth, where intervention might actually bring them to the attention of the very harm-doers that they're trying to hide from. So I don't bring any solutions to the table. What I bring is the complexity, the questions, and the challenges that I think need to be carefully understood before engaging in any actions that could cause more harm than we're seeking to prevent. Thank you. Thank you for that nuanced view. I'm sure that'll be really helpful going forward. Next up from Hacking Hustling and Reframe Health and Justice, we have Danielle Blunt and Kate D'Adamo. Hello, everyone. I'm Blunt. I'm an organizer with Hacking Hustling, which is a collective of sex workers, survivors, and accomplices working at the intersection of tech and social justice to interrupt state surveillance and violence facilitated by technology. Good morning. I'm Kate D'Adamo, and I'm from Reframe Health and Justice. We're a queer and trans people of color collective working at the intersections of harm reduction, anti-violence, and criminal legal reform, including a focus on expanding concepts of safety, health, and wellness for all people who trade sex. And so earlier this year, Apple shared a set of interventions regarding young people's access to technology. And after criticism, a narrow set of advocates were asked to give line edits on the policy, and Apple held a series of conversations that were fronted by these hand-picked advocates. Many things were missing. Missing were the marginalized communities who will be most impacted by these changes and have struggled to be heard in tech spaces, which value profits over people. Missing were conversations on the efficacy of these interventions and any evidence that they would be successful in addressing said harms. And missing were conversations about evaluations of the program's success and transparency about how often these identifications were wrong. Missing were any protections against misuse, how misuse would be reported, or how to make sure that these narrow intentions would not be forgotten during negotiations with authoritarian governments. Missing was the context that this has come in the midst of years of government demands for private companies, including Apple, to surveil their users to an extent that the government itself is barred from by the Fourth Amendment, and then to report on their users in an effort to expand policing. We're here to offer the perspective of sex worker activists, and our concerns are based in our experience in organizing. We are not speaking from conjecture, but from a very long history of having tech policies written in the name of protecting children then be used to harm sex-working communities, including the youth who traded sex in those communities. We also bring concerns that there is an underlying assumption that Apple is neutral on the sex industry at all. Apple products have never been a safe space for adults in the sex industry. In 2010, Steve Jobs himself said, quote, we do believe that we have a moral responsibility to keep porn off the iPhone. Folks who want porn can buy an Android.
Right now, you can't have sex-related apps on the App Store, meaning that we can develop harm reduction apps and create bad date lists, but we cannot effectively distribute them because of Apple's decision to prioritize its moral stance over our community's well-being. This conversation does not exist in a vacuum, and we should not treat it as such. Privacy on our devices and encryption on our communication platforms have offered sex workers the ability to connect to community, negotiate with clients, and enact the very basic harm reduction tools which keep people safe. We are a community regularly erased and banned from technology, whether it is being banned or censored by Twitter, or fundraisers being deleted by GoFundMe, users weaponizing content moderation systems to erase marginalized communities, or accounts being closed and our money stolen by PayPal. Pretending that this specific intervention does not sit within a context of a host of abuses against sex workers, including an explicit antagonism against us by Apple, ignores everything that we know about how tech companies have treated the sex trades, and all of these erasures begin by being left out of every decision-making conversation. There is no precedent that the expansion of policing and surveillance for a specific intervention has ever stayed within those parameters. Erasing users' privacy should not be framed as a novel conversation with no implication for the wider impacts. When we think about those who will be inevitably harmed by compromising privacy, we can simply look at those who already struggle to safely access information and connection. That means sex workers. That means LGBTQ youth seeking information on their bodies, communities, and sexuality. That means anyone in Texas scrambling to find information on abortion care, something that we've already seen Instagram compromise. This means anyone who does not consider home to be a safe place, and for whom parental control over their access to technology means isolation and the potential for abuse. Like similar policies which have passed without community feedback, these policies will increase harm to the communities that they purport to protect. Let's not have a siloed conversation when we have never had a siloed intervention, and let's not ignore the context that we are living in. Let's begin our conversations by addressing harm, not just hiding it and calling the damage done "unintended consequences." Thank you for helping to situate this in that broader context. That was quite helpful. Okay, going on to our next speaker, from R3D Mexico we have Luis Fernando Garcia. Thank you. I'm Fernando Garcia, and I'm the executive director of R3D, the Red en Defensa de los Derechos Digitales, a digital rights organization from Mexico that focuses on expression, privacy, and many other human rights issues in Mexico. We want to briefly put on the table the context in which people in Mexico experience technology and how the efforts that have been announced by Apple have raised certain concerns for us. First, it's important to understand the context of countries like Mexico and many other places in Latin America, in which there is a history of abuse of technologies and of laws supposedly dedicated to combating crime but then repurposed to attack civil society and journalists.
Mexico is one of the most dangerous places in the world to be a journalist, and in recent years we have been able to investigate several cases of surveillance of human rights defenders and journalists, including by exploiting vulnerabilities in Apple products. For example, in the case of the Pegasus malware, which in many cases exploited vulnerabilities in Apple products to surveil journalists and human rights defenders, including journalists who, after they were surveilled, were actually killed. It's also important to understand that in many places in the world, and sometimes in the US, there is a very rosy and naive conception of the good cop going after the criminal. I don't think that's really true in the US either, but in Mexico, for example, it's important to understand that the line that divides organized crime and the government is usually blurry or nonexistent, and many times legal and technological tools that are entrusted to the government are then exploited by, or for the benefit of, organized crime groups. So we are really concerned about how the efforts that Apple has mentioned could be used to attack vulnerable communities. And finally, just to mention our concern, based on history as well, about how cooperation between US and Mexican authorities has been exploited, including cooperation related to CSAM, for censorship and for surveillance. For example, there's the case of a webpage that was designed to bring to light police abuse during protests in recent years, and that was taken down thanks to cooperation between Mexico and the US, which exerted pressure on a private company, GoDaddy.com, to seize the domain and censor that website, which included a lot of evidence of police abuse. Also, we've seen recently how US-Mexico cooperation on intelligence and policing has resulted in the persecution of migrant rights defenders who have been framed as traffickers, et cetera. So we are really concerned, and we think that Apple probably is not looking at how this is going to play out in many places in the world. Gracias, thank you. Wow, thank you for all of that. That's interesting to see how the government and industry are working together. And I also want to say a special thank you to anyone coming from Latin America, where it is a holiday today. Thank you for joining us on this holiday. Speaking of which, next from Derechos Digitales, we have Rafael Bonifaz. Hello. Derechos Digitales is a human rights organization that focuses its work on human rights in the digital world. This issue with Apple sounds like something done with goodwill: we want our kids to be safe. However, this puts everyone that uses Apple iPhones under surveillance. This can be used not just for child pornography, but for government abuse in the future. And we think this opens a high risk for people using iPhones and a risk for everyone on the Internet. So we think privacy is a human right that should be protected, and we don't like Apple's approach to this issue. Thanks. Thank you for that. Next up from Article 19, we have Vladimir Cortez. Hi, everyone. I'm Vladimir Cortez, Digital Rights Program Officer at Article 19 Mexico and Central America, a human rights organization defending and promoting freedom of expression and access to information.
We certainly are concerned that in countries like Mexico and others in the region, the lack of democratic controls will lead to the abuse of backdoors implemented by companies like Apple. When governments want to access information, what we see is that they do not respect legal requirements, and they end up using these backdoors to chase people and to silence dissent and critical expression. CSAM is a super sensitive issue, and restrictions on human rights in the name of the interests of the child are valid, but from a technological and human rights point of view, there is a strong need to have safeguards. If you weaken the protection of information, even if you do it with good intentions, then anyone can exploit that weakness for other purposes, as we also experienced, and as Fernando mentioned, with the Pegasus malware used against journalists and human rights defenders. We also recognize that the use of hash databases to detect illegal materials must be pursued within a strong rule-of-law framework that includes safeguards for fundamental rights. In the case of Mexico and other Latin American and Caribbean countries, there is a structurally weak legal framework that has been abused by governments against journalists and human rights defenders. From Article 19, we have warned that actions vital to helping rid the world of child sexual abuse material are being misused: some actors are misusing their sweeping censorship powers to censor works of art and to criminalize those who create and produce them. These actions disproportionately affect those from marginalized, artistic, and LGBT communities, and have resulted in the arrest and imprisonment of people who are innocent of any wrongdoing against children. One of these is a 17-year-old Costa Rican girl who was arrested for posting drawings to her blog, on a referral from the Canadian Centre for Child Protection. In other cases, according to state authorities, 90% of reports received from the National Center for Missing and Exploited Children related to innocent images. In that scenario, we are really concerned, and we really believe that if we are lacking these frameworks, and as we can see how this can be exploited without really guaranteeing and safeguarding children, then this can also be used to criminalize and to persecute journalists and human rights defenders. Thank you very much. Yeah, thank you for talking about those definitely real worries that we have. Next up from Access Now, we have Namrata Maheshwari. Thank you, Erica, and thank you, EFF, for organizing this much-needed event representing diverse views. I am Asia-Pacific Policy Counsel at Access Now, and I also coordinate the organization's work on encryption. Access Now is an international organization that works to defend and extend the digital rights of users at risk globally through our presence in 13 countries. We provide thought leadership and policy recommendations to the public and private sectors to ensure the protection of fundamental rights. We also operate a 24/7 digital security helpline, whose advice often includes encryption, for people facing human rights crises, to protect against surveillance and invasions of privacy. Our concern with Apple's announcement is primarily that it jeopardizes the privacy and security of all Apple users and anybody communicating with them, and our concerns are several. The first is function creep, which has often been tagged as a speculative or academic concern.
It is not: we're seeing in our part of the world that there is legislation already in place that will be exploited to make demands to use this technology for purposes other than CSAM. One example is the traceability mandate in the 2021 IT Rules in India. Australia, for example, already has online safety legislation, and there's the online safety determination that has been proposed, all of which will be used to exert pressure on Apple and other companies to use these tools for purposes other than CSAM. The second thing is that we think it is extremely problematic for the average user, and having said that, these tools need to be designed keeping not just the average user in mind but also the most at-risk user. This goes to what Danielle mentioned earlier about very critical voices, of folks that are going to be very directly impacted, being left out of the conversation, and we think at-risk users and vulnerable communities absolutely need to be made part of the conversation much before the announcement, while the tools are being created, and not after there has been a backlash and a pause has been announced. And this goes to the third point: any tools that are created for the purpose of content moderation need to draw not just from technical input but from a much wider range of inputs, a much wider harm reduction framework, including sociological perspectives. Our concern is also that these tools need to increase user autonomy. I think any tool that leads to the device working on somebody else's behalf, and that does not enhance the user's control over their data and over their communications in any manner, is fundamentally opposed to human rights. And there's also a problem with the broader message that Apple sends out to users around the globe, to governments, and to the private sector with a move like this. The message that goes out is that there is willingness to go that extra mile in the name of combating CSAM, whether or not you might actually do it, and without an adequate analysis of the costs at which any marginal benefit may occur. To be sure, CSAM is a serious issue, and steps need to be taken to address it, but they absolutely need to be in line with principles of necessity and proportionality, and without risking the human rights of everybody involved. I'm happy to go into more detail through the panel discussion. Thank you. Thank you. That's a really helpful framework. And our last speaker from the introduction portion, from SAFEnet, the Southeast Asia Freedom of Expression Network, we have Damar Juniarto. Thank you, Erica. Hello, everyone. Thank you, EFF, for organizing tonight's discussion. I am Executive Director at SAFEnet, the Southeast Asia Freedom of Expression Network. SAFEnet is a Bali-based regional organization focused on defending digital rights in Southeast Asia. In the context of Southeast Asia, I think everyone on this panel should know that privacy is very important, especially for countries in Southeast Asia that have weak protection or no protection yet for digital privacy, like in four countries in Southeast Asia. So this new Apple move to scan iCloud Photos for CSAM actually raises a safety concern. This move has the potential to be exploited by authoritarian governments to access more data of at-risk groups and users. For example, in Indonesia, the Minister of Communication and Informatics of the Republic of Indonesia recently passed Ministerial Regulation Number 5, or MR5 for short, concerning private electronic system operators.
Inside it, Article 13 creates an obligation for platforms to terminate access to prohibited electronic information and electronic documents, and Article 15 covers the timing and procedure for terminating access to prohibited content. This will make the situation more harmful for people. With this new ministerial regulation, MR5, the government can actually get access to data: the regulation will give access to the backdoor of the electronic service operators' systems whenever the authorities request it. And the government is not only allowed to request access to traffic data or electronic system user information, but also specific personal data and communication content. This may include financial data, biometric data, health data, and any data or metadata on users, including users' political views and sexual orientation. So we think that Apple should be very careful in releasing this CSAM scanning of iCloud Photos, because it could make the situation more harmful for any users coming from Indonesia, especially those who are active as journalists or human rights defenders in Indonesia. Thank you. Wow, thank you for that. Okay, so that concludes the opening portion. And now we're going to switch into a more panel-style discussion. I have prepared some questions, but they're really just a framework to help draw out your expertise. If there's something you would like to say, please feel free to bring it up, or you can ask in the chat to make sure that we direct the conversation that way. You can either jump in, or if there ends up being a bit of a queue, use the hand-raising feature in Zoom. Okay, and so just to start us right off, I wanted to begin by focusing on the children's rights portion of this. You know, some of our speakers talked about how the surveillance of minors, and keeping in mind their human rights, is an important thing that we also have to think of. But I want to start off talking about the current examples that we've already seen of what happens when children are put under surveillance. In what ways does this harm them? What technology have we seen, and what are the behaviors that we've observed? How does this affect the development and life of children who are subjected to this surveillance? So I'm just looking for any sort of, you know, particularly concrete examples that you've seen. I'm interested in hearing about some of those specifics. This is open to whoever would like to jump in. So I'll go first. Hi, everyone. My name is Hye Jung Han. Just to introduce myself, I'm from Human Rights Watch, and we work in 90-plus countries around the world. And so the questions I'm going to be posing today really draw on our expertise from our children's rights work around the globe, including, as many have said, with the most at-risk users. So when Apple came out with its announcement, one of the most surprising and disappointing things for me was just the paucity, the lack of detail. Because the devil is in the details, right? How can we trust a corporation to do what it says its intentions are? And how can we trust that there are adequate frameworks and legal safeguards for appropriate use, especially in contexts where the rule of law is not the same as in other countries around the world?
And a couple of examples. As Diana mentioned in her opening remarks, if you're looking narrowly at child rights, there are certainly child privacy concerns, both for the privacy of those who have been a victim or a survivor of child exploitation online, and in addition for the privacy of child users around the world in different contexts. And how do you balance that out? Unfortunately, it is a balancing tradeoff, but there are so few details that it was very difficult to figure out what considerations Apple had made on behalf of child users everywhere and all of their rights writ large. And so maybe two examples I'll posit for the group here. The first is an investigation that Human Rights Watch did in Russia. In 2013, Russia came out with what is informally known as the gay propaganda law. It was aimed at, quote unquote, protecting children from information promoting the denial of traditional family values. And that was widely interpreted and implemented as a ban on any LGBTQ-related material in print media and on the internet. And so websites that provided life-saving information or safe spaces for LGBTQ youth were all taken down, and their operators were put in jail and really harassed and persecuted. And so in a country like Russia, it's very easy for us to imagine what happens when a government like that compels a company like Apple to start searching for what it deems objectionable content under its own law, because this is legal in Russia. So what can Apple do to try and prevent any potential new tools from being misused by a government like this to go after queer youth and queer teens who are looking for life-saving information or trying to seek out safe communities? So that's certainly a concern of ours. At the same time, the other example I'll give: one of Apple's proposed safeguards was that we're only going to search for hashes that are confirmed by two or more hash databases provided by child safety organizations, which sounds really great. It's certainly a good start to trying to figure out how you can make sure that you're not relying on a child safety database from one jurisdiction, which may be subject to certain pressures by a single government, and how you kind of diversify that risk. But my question here is, when it comes to CSAM and encryption, which is, of course, as we all just talked about, extremely fraught, there is certainly a security alliance around the world of certain governments that have been very strong in their stance on breaking encryption, such as the Five Eyes Alliance. They also coincidentally happen to have some of the strongest child safety databases in the world. So you could imagine, with the UK and the US, for instance: the UK's version of NCMEC is the Internet Watch Foundation. But in 2014, there was a High Court ruling involving the list of blocked websites known to distribute CSAM that is held by the Internet Watch Foundation. And suddenly the luxury watch manufacturer, Cartier, was able, through this High Court injunction, to get a bunch of websites that it thought were copyright infringing added to this database that was only meant for CSAM. So that's an example. It's not about image hashing, but it's certainly an example of political and lobbying pressure in which a database that was only meant to hold a very strict definition of CSAM, and to check other websites against that, was certainly manipulated.
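To make the "two or more databases" safeguard discussed here concrete, the following is a minimal sketch of the intersection idea in Python. Everything in it, the organization names, the hash values, and the threshold constant, is hypothetical; it illustrates the logic described above, not Apple's actual implementation.

```python
# Hypothetical sketch of the "two or more hash databases" safeguard:
# an image hash is acted on only if it appears in lists operated by at
# least two independent child safety organizations. All names and hash
# values below are invented for illustration.

HASH_DATABASES = {
    "child_safety_org_us": {"8f3a2b", "c91d40", "77e012"},
    "child_safety_org_uk": {"8f3a2b", "5b6c7d"},
    "child_safety_org_ca": {"c91d40", "8f3a2b"},
}

REQUIRED_SOURCES = 2  # flag only on agreement between independent lists

def should_flag(image_hash: str) -> bool:
    """Return True if the hash appears in at least REQUIRED_SOURCES lists."""
    matches = sum(image_hash in db for db in HASH_DATABASES.values())
    return matches >= REQUIRED_SOURCES

print(should_flag("8f3a2b"))  # True: present in more than one list
print(should_flag("5b6c7d"))  # False: present in only one list
```

As the Cartier example shows, the safeguard only holds if the lists really are independent: a single court order or lobbying effort that pushes non-CSAM entries into two of the source databases defeats the threshold entirely.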
So we certainly have concerns about how Apple can build in more safeguards, or how we can help companies, not just Apple, around the world to try and build in safeguards, or to prevent certain safeguards from being misused by multiple actors that we know to be working together in the child safety and anti-encryption spaces. Oh my god, that's a really good point. I don't think anyone's brought copyright into this discussion so far, but now that you mention it, right, that's what we see happen all the time. Copyright is used on YouTube to remove objectionable content, to remove anything, by copyright lobbies. So wow, that's a great, horrible point. Yeah, thank you for that explanation there. Did anyone else want to take a shot at answering? Okay, yeah, I see a hand up from Jen. Thanks. I think that's an extremely important point. Copyright is one of these additions that is considered "legal but harmful" in the sort of UK online harms, online safety school tech. And this is a big threat, I think, of the scope creep of this type of technology. And whilst everybody talks about balancing rights and children's rights, I think actually we're in an impossible situation when we look at balancing here. There is no real balance; there is a question of fundamental human rights. And you either support them and uphold them, or you don't, because at some point, once the line gets crossed, you cannot go back. Our school safety tech in the UK has a policy of scanning all incoming and outgoing content, everything that passes a child's screen, webcam, counseling chats, confidential, private conversations; everything is scanned and matched against keyword libraries for identification of risks to self, risks to others, or conduct and contact harms. But a lot of those technologies are also including, slowly but surely, copyright in those considerations. And the most insidious scope creep that we have here in the UK is that once you can do this sort of scanning of content and activity monitoring for one type of activity, you can do it for another. And so the UK's anti-extremism and anti-terrorism legislation has brought in the Prevent duty, under which children are monitored and tagged by their activity and profiled as potentially involved in extremist and terrorist activity. Now, it's completely opaque. There is no transparency with children themselves, or with parents, about what these implications might be; the consequences are not foreseeable. But there are three things I think we should bring out quickly. One is the chilling effect that this potentially has on activity. And that's been well documented by Dr. Emily Taylor in London City University and her studies of life through a lens, what that means to children, how it impacts their development and free and full development. Second, I think the error rates that are, again, sort of common across these types of technologies are opaque. There is no requirement to publish error rates or identify what they are. And in fact, false positives and false negatives are almost impossible to identify accurately. But it does mean that companies can then wash their hands of problems such as children being wrongly tagged as suicide risks for having looked up cliffs and somehow being flagged by the system, or being tagged as a potential gang member for having looked up the words "black rhinos," two real cases that teachers have come to us with from UK safety tech that's tagging children with these profiles and behaviors.
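As a rough illustration of how the keyword-library scanning described here produces exactly these kinds of false positives, here is a hedged sketch in Python. The risk categories and keyword lists are invented for the example; real vendors do not publish theirs, which is part of the opacity problem just described.

```python
# Hypothetical sketch of naive keyword-library scanning of the kind used
# in some school safety tech. The categories and keyword lists below are
# invented for illustration only.

KEYWORD_LIBRARIES = {
    "self_harm_risk": ["cliff", "overdose"],
    "gang_activity": ["black rhinos", "gang initiation"],
}

def scan(text: str) -> list[str]:
    """Return every risk tag whose keywords appear anywhere in the text."""
    lowered = text.lower()
    return [tag for tag, words in KEYWORD_LIBRARIES.items()
            if any(word in lowered for word in words)]

# Two innocent school queries trip the "risk" tags, mirroring the real
# cases described above:
print(scan("Geography homework on the white cliffs of Dover"))  # ['self_harm_risk']
print(scan("Biology report about black rhinos"))                # ['gang_activity']
```

Because substring matching has no notion of context, lowering the false-positive rate requires human review, and as the discussion that follows notes, the flags are often retained even when schools suspect they are wrong.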
And that data is then retained by schools, even though they know there might be mistakes in it, because schools are risk-averse. And I think this is the question with this sort of scope creep of the technology: if you're risk-averse, you keep the data, you profile people longitudinally for life. And the real implications of that for a child's development, and the uses that data can be put to, not only in Western and Northern countries in terms of visa applications or applications for insurance, but around the world, how those types of longitudinal life profiles might be used, are extremely worrying. And I think those cross-cutting issues of error rates, of the chilling effect, and of that scope creep touch across all of this conversation. Oh my god, thank you for those examples there. You know, I'm actually interested in hearing more; you mentioned a few things there. I wonder if you know a bit about the political situation of how it got to this point, right? Do you know about the interplay of the government and schools? Who's driving this? Where does that pressure come from? Is it the law that they have to do this? How do these things come into place? Do you know a bit about that? I'd be interested to hear. I think it's a lack of transparency of activity between industry and the public sector, in that the tools and technology have been developed and then offered and procured to schools that have felt the obligation to do everything possible under the umbrella of protection of children. But unfortunately, that idea has never been risk-assessed for what those implications might mean. So there is a statutory duty since 2016 to prevent children from being drawn into extremism and terrorism. But again, I would refer back to the concerns with the sort of countering violent extremism agenda. I mean, the 2016 UN declaration on freedom of expression and countering violent extremism summed it up. It said there are insufficiently clear definitions of extremism or radicalization, and some governments target journalists, bloggers, and political dissidents, as colleagues on the call have said, under these same umbrella terms. So we don't have adequate law, but rather these permissive gateways, and this scope creep has crept in. And the problem has been that once the technology has been adopted in schools, it's very hard to give up. And it's also very hard not to adopt what schools are offered in terms of technological advance or innovation, as that scope creep is packaged: we can now do this, would you like to offer more? And of course, it now means we're actually doing in the UK what you prosecuted ten years ago in the States, which is enabling schools to take webcam photographs of children covertly. In the Lower Merion School District case near Philadelphia, it was found that hundreds of photographs had been taken of children through their webcams covertly by the schools; I think there was an over $600,000 lawsuit that settled that. But again, there are not clear laws, there are not clear definitions around this particular activity. And whilst other legislation should be enough to make sure that we understand how data about children is used, it's simply not sufficient at the moment. And again, it's packaged under that child protection umbrella, which makes it very, very hard to then have a nuanced discussion about what's actually important across the full range of rights and all of its implications. Yeah, thank you. You mentioned, oh, okay, great. I see Maxi has his hand up.
Was I muted when I said that? Sorry, Maxi, I see your hand is up if you'd like to go next. Yes, thank you so much. I also just wanted to echo the fact that there is a paucity of details, not just about the automatic iCloud Photos scanning that checks for CSAM, but also about their other program, which is the parental notification system. On this particular issue we're highly concerned, because the system utilizes a machine learning algorithm to detect whether or not explicit imagery is being sent to or from child accounts in Apple's program. So in their family plans, you can mark a user as a youth, a minor, specifically if they're 13 or under. If they send or receive imagery from their account, what ends up happening is it notifies them that the image may be explicit, using a machine learning algorithm in this particular mechanism. So whereas in the iCloud Photos one there is sort of a human intervention, this one is much broader in the sense that it determines automatically whether or not an image is explicit. And according to their own documents, what is explicit or what is inappropriate is very poorly defined. And we found that highly concerning, just because "inappropriate" and "explicit" have often been used to target LGBTQ communities. In school districts, as has been mentioned, a lot of the time these well-intentioned efforts to limit access to inappropriate or explicit imagery often limit access to content or information about LGBTQ issues. We run into issues all the time where, if we are engaging with a school district, our own emails get blocked because we're at the National Center for Lesbian Rights, just because we have "lesbian" in our name. Apple, I think, is going to face similar challenges as it tries to delimit what exactly is inappropriate and explicit, because content filtration is never values-neutral. You will always have to decide, according to some framework, what is inappropriate or explicit. It's also particularly overbroad. It surveils not only the child for whom this program was enacted; a well-meaning parent could enable this feature to monitor their child's behavior, but it also monitors everyone who is in contact with that child. It monitors anyone who is texting them, and they will probably not be aware of what is occurring. And this could include their classmates, their friends, other youth who are in contact with them. We think in particular of LGBTQ youth who are exploring their identities. They're discussing with each other, they're trying to seek information, and we're concerned that if they live in an unsupportive household, or if they live with people who are not supportive of LGBTQ identities, then this could out not just the child who is being surveilled but also anyone who is in contact with them, who may not be aware of this. Thank you for that. Okay, I'm seeing, maybe I missed it in chat, Diana, great. Yeah, I see your hand is up. Okay, Diana. Thank you very much. I just wanted to respond to some of the issues that were brought up, because I think there was a very powerful point made about, for example, the UK counter-terrorism strategy and how that is being framed. Research by civil society organizations and work in the communities has shown that the way this was framed was about safeguarding. Safeguarding children, for example, most at risk of terrorist radicalization.
And this is how it got some support in communities, even marginalized communities, in the UK's case religious communities, so Muslim communities. So it was by framing it as a safeguarding issue that it actually achieved its purpose of getting support in the community. So I think that's a very important point. The second one would be around issues of consent, because Jen mentioned the massive collection of children's data in a way that is very obscure. It is not clear what data is being collected or who it's shared with. And there have been cases before UK courts where it's taken children years to get inaccurate information about themselves deleted. So it's also about disregarding, in a way, what children know and what they consent to, because when they are flagged as potentially at risk of terrorist radicalization, they don't know; it's not something that is transparent at all. So in any discussion about privacy and children's privacy, consent should be a very clear part. And we see that in some cases, where there is this mission creep, consent goes out the window. And I just wanted to very quickly respond to your question, Erica, about how children feel, how they react to being surveilled, and how this affects their development. Because what we are finding in research is that children really value their privacy. If they feel they are being constantly monitored by those with whom they have very personal relationships, that causes them distress. It undermines their trust in these relationships. And I think it's very important to stress that this erosion of trust is a child protection issue in itself, because it makes children less likely to see these relationships as a source of help when they need it. And also, UK research has shown that surveillance by parents is no longer seen as being about discipline and control alone. It's about care. It's about safety. It's about what a responsible and caring parent would do. And it is promoted and normalized as an approach taken by a caring and responsible parent. Thank you. Thank you. Yeah, that's a great point. We've definitely seen that one of the most important ways to stop this sort of abuse is to have those trusting relationships with adults, and anything that erodes that is certainly going to be harmful. Next on the stack, from the chat, we have Hye Jung Han. Hi, yeah. I would actually love to pose a question to the others in this group and just hear what you all think. And I'm going to offer this question very cautiously, because as you'll hear, it's very fraught with a bunch of issues that I can see, but I'd love your thoughts on it. So my question is, what do you all think about the possibility of Apple providing the ability for people to report abuse on iMessage, specifically around CSAM? And of course, that comes with a bunch of caveats, because we know that tech companies, the biggest tech companies in the world, have stumbled when it comes to anything to do with content moderation and reporting mechanisms. So that's a huge caveat there. There are a bunch of questions around, well, what would happen after that? Who would that go to? How, or whether, would law enforcement be involved at all in different countries? Or should they not be involved? How would that all work? I'm leaving that as a black box. And there are a lot of fraught questions that others have mentioned in their opening statements, but I just would love to hear your thoughts on that.
Okay. Yeah, it looks like Chris, is that an answer to this particular question? I don't know. Can you unmute yourself, or do I need to unmute you? Thank you. Can you hear me okay? I think one thing I would add on the issue of reporting: I like that it's a manual thing, but I do know that there are a lot of people who fundamentally misunderstand the difference between sex trafficking and consensual adult sex work, and who would use a report feature to report any sort of erotic material. And so I think we need safeguards around anything like that, to make sure it's not used to cause harm. Going back to some earlier discussions around children and keeping children safe: I come to anti-sex trafficking work as a survivor of sex trafficking as well as a professional, and we know that up to 30% of commercial sexual exploitation of children is done by a parent, caregiver, or family member. So, specific to any policies that would notify a parent or caregiver, we have to be aware that if you're notifying parents, not only may we be outing children to their parents, but we also may be notifying their traffickers of their behavior. And especially given the number of young people already involved in the system, not to mention the foster care system, who may have experienced a high amount of abuse in their homes already, we can't make the assumption that home is a safe place. We need to support those vulnerable to exploitation in the system. The other thing I would say, in terms of increased surveillance, is that a lot of sex trafficking survivors, during their trafficking or as adults, sometimes circle back and are doing consensual sex work. And a lot of the things that those survivors who are now sex workers do to keep themselves safe and out of vulnerable situations rely on privacy. I saw that Namrata, you had your hand raised next. Yeah, I just wanted to very quickly respond to the question about user reporting and blocking mechanisms. I do think platform-based safety tools that enable user reporting, complaints, and blocking are aligned with the idea of enhancing user autonomy and placing the decision in the hands of the user regarding the kind of content that you want to report and when you want to report it. But what I want to emphasize is that there is an ecosystem that has to support this kind of reporting mechanism. It wouldn't be enough just to create that option and make it available; there is a lot that has to happen for it to be effective. And this includes creating the kind of user awareness program that equips users, especially youth, with the know-how and the knowledge that they need to be able to utilize these tools. And very importantly, there has to be very detailed follow-through by the trust and safety teams at these companies. There were reports recently, for example, which said that a large percentage of children who had been targeted were in fact recontacted by people that they had blocked and people that they had reported.
So I think unless these reporting and complaint mechanisms are strengthened to improve outcomes and to ensure follow-through, they might in fact be a weakness. They might result in revictimization when the perpetrator who's been blocked realizes that they've been blocked. So as important as it is to recommend strengthening reporting mechanisms, I think it is also important to take a step back and look at what else needs to be strengthened to make sure that they have the desired effect. Similarly, I think that between the two, between an automatic system that monitors messages or the content of messages and user reporting, user reporting is preferable as a step that engages the user's consent. You know, asking a user to consent, a user who says, I want this information to be looked at, I want this information to be reported, is an important step in making them involved in this process. But then, as was mentioned, it's important to follow up with further consent from that user. Any escalation by a third party is dangerous. You know, in the situation of parents being notified that their children are viewing explicit material, the parent could then, on their child's behalf, accelerate or expand on this process and report the information to, say, another child's parents. If they don't support their child engaging in same-sex activity, for example, it could be possible that other minors, or their own minor, could be charged with crimes, sometimes even as adults, which is a serious issue. So making sure that the escalation continues only with the consent of the user, the person who is directly affected, and controlling where that goes, I think, is integral. Whether that involves, you know, law enforcement, or whether that involves just content moderation or just blocking, needs to be highly dependent on the user's consent. That's a great point. Next in the queue we have Jean. Yeah, a couple of things. I mean, first, following up on that immediate question of self-reporting: echoing everyone's comments, reporting what, to whom, for what purpose? And acknowledging that this would not be the same in every country, let alone in every jurisdiction within each country. What we know about human trafficking in the United States is that the response differs dramatically from location to location. In some places they have very robust service networks; in other places there is little to no response. In some there are very discriminatory responses, or very law-enforcement-led ones, while others are very victim-centered and service-provider-led. And that leads to very different responses. And in fact, what we also know is that a bad response can be even more harmful than no response. When someone expects that there will be a specific response, or that they will receive services or assistance, and in fact gets an abusive criminal legal system response that actually is harmful, that denies their existence or doesn't recognize their exploitation, which often happens, especially with marginalized populations, or that puts them at risk of deportation or other forms of harm, then folks are even less likely to come forward for assistance, because they've been sort of trained, right, that it's not going to be helpful, it's going to be harmful. The other point I wanted to make about the development of any kind of surveillance system or response is that abusers use these to their advantage.
And we've seen this happen dramatically in the domestic violence context, where abusers are using any form of surveillance or support that's been designed for parents to be able to monitor and protect their children in order to follow, abuse, stalk, and harm those that they seek to abuse and exploit. And traffickers will do the same. So I think, you know, there's this thought that, oh, we'll give parents these options to be able to monitor their children. How do we stop these tools from being used by abusers who are committing domestic violence, child abuse, or trafficking and exploitation, where they put people on a quote unquote family plan when in fact they're not family members, or where they are family members who are under the abuse and control of another family member, or where it's just an exploiter who's been using this as a tool to surveil, monitor, and control others? So I think, again, as I promised initially, I don't have an answer. I don't have a solution. But what I have is real concerns about ensuring that every aspect of these decisions is being carefully considered. How else could this technology be used to commit harm? And what safeguards have we put in place to ensure that it's not perpetuating further harm? And I think none of these questions have been answered yet. Thank you. That's a great point. Next in the queue, we have Kate. Yeah, and I also kind of want to touch on some of the questions that have been asked before, because I feel like so much of what's being discussed is around automation versus opt-in. I think a lot of this also comes from the history of digital surveillance and digital and corporate oversight: talking about child protection, and automating it, is how you demonstrate due diligence in order to not have government oversight. And I think that has kind of been the history of the way that technology has happened. When we're talking about Section 230 and the protections it offered, the way to break that down was to talk about child safety and to remove those protections in a way that was automated. And because automation is the way to protect your bottom line, that's what it always defaults to. And so this has become really the pattern. I feel like we're just in the newest iteration of a conversation about child safety with new technology that started 150 years ago and has played out very similarly every single time. And so when we say that this conversation is siloed from its impact, I think that is a very intentional thing, because it is, at the end of the day, incredibly expensive to deal with the problems that we're talking about. When we talk about automation, we're not talking about people who are paid to review things. We're talking about automation that leads to reporting; we're not talking about human reviewers in the meantime. And the reason why reporting to customer service at any of these companies isn't happening is because that's just too expensive at the size these companies have been allowed to grow to. And so I think that as much as we can propose that, we have to also be really cognizant of the fact that this is a decision: around automation, around not reviewing, around having things red-flagged and tagged and having those accounts closed and frozen to instill fear in people so that they self-censor, because that is the answer that doesn't affect anyone's bottom line while also closing down their liability for third-party content.
It demonstrates due diligence so that the government doesn't come in and actually regulate the company in a way that would affect its shareholders. And so as we're discussing this, we have to unsettle the conversation from that framing. When we talk about reporting, we have to start talking about the fact that when you report something, nothing really happens. Service providers, including myself, have reported any number of cases of violence, of harassment, of trafficking to tech companies, and most of it goes nowhere. And when people call something like the trafficking hotline, often they're being referred to services that are deeply underfunded, some of which are all-volunteer. So if we're actually going to talk about child safety, about what it means to report something, we have to talk about the fact that those reports often don't result in the answer we want anyway. And especially if these are conversations built on trust, if disclosure is built on trust, if sharing the very personal information of your victimization is built on trust, we are collapsing that trust by saying we're reviewing everything, we're surveilling everything you look at. Either you're going to be reported, and we know the impact of mandatory reporting on people's willingness to disclose, whether it's to healthcare providers or to service providers in any other context, we have data showing that people do not disclose if they know they're talking to a mandatory reporter. All of that has to be part of this conversation as well. And the ultimate impact is that when we do have surveillance, people either lose access, and that is sex workers who put words referring to sex and words referring to money in the same message and all of a sudden your account is closed, that is distributing harm reduction information and your account is closed, or you get reported. This happens very frequently. Just in organizing over the last few years, we've had to figure out how to organize differently because of automation and the impact it's had on simply connecting to share information, whether it's about sex work, whether it's organizing, whether it's, literally, we put out a press release about a raid that had happened, and it was censored and we couldn't get it out. We've had to figure out how to organize and connect with people to talk about safety multiple times over the last couple of years, because the automation of this has gotten so much worse, all under the guise of "we have to protect children," and therefore we are not allowing people to connect to each other. So the answer is either self-censoring, not being able to connect to your community, and/or just knowing that you're going to lose everything every couple of months.

It sounds like we have a direct response from Blaine.

Yeah, I just wanted to add on to that: this type of surveillance or content moderation changes the way we speak to each other, the way we organize, and the way we talk about harm. When Kate was talking about having to alter language in order to share that report, I think something gets lost in that translation. The information becomes less accessible, and it becomes more difficult to find community to connect with when you can't speak openly about your experience.
And I also wanted to quickly address the question of how we see this content moderation impacting people. When I think about this intervention, I think not only of the parents who get to receive information about what their children are doing on their phones, but also of abusers outside the home who might weaponize that system, getting something reported to a parent when they know the parent will take action. This is something we see in the sex working community and in online spaces, where many of us lose access to our accounts because a malicious user weaponizes the content moderation system to make sure our accounts are taken down, to make sure we don't have access to community. So right now, specifically, I'm thinking of an abusive partner who knows that their partner's parents have this feature turned on, and who sends something so that the phone or device gets taken away, because they know how the parent will respond to that type of alert. I think it's important that this is part of the conversation as well: how other people might misuse this technology. And it's really important that we continue to think about the ways this might be misused.

That's a really good point. Thank you. Vladimir, I see that your hand's up.

Thank you. Yes, actually, when the question was raised about how user reporting can be or is implemented, I was thinking about how we prevent it from being abused. It was mentioned that sometimes reports lead to no action, but as Blaine was also mentioning, sometimes the reporting system itself becomes abusive. We can see, for example with DMCA reports, at least in Latin America and other countries, how reporting is abused to take down information of public interest and other information that is important for public and social debate. So how do we create safeguards to prevent this type of mechanism from being abused? The other thing I was thinking about, to raise these concerns in the Latin American context: during protests, recently in the case of Colombia, police were doing this kind of cyber patrolling, sometimes invoking CSAM provisions, and this fails on three important levels. First, it often does not comply with the principle of legality. There is a very broad and very vague mandate for police officers to go around preventing what they call cyber threats, for example to fight disinformation or to profile suspicious activity, and these vague provisions end up leading to criminalization, and sometimes to identifying and prosecuting certain expressions. So in the context of Mexico and Latin America, these kinds of provisions, the idea of the cyber threat, and even child sexual abuse or CSAM mechanisms and provisions, can lead to the criminalization, identification, and prosecution of people who were protesting. So there is a need not just for transparency about how these powers are used and how these kinds of backdoors can be abused, but also, and I will insist on this, for legality.
And then, as was also mentioned, when everyone knows that somehow they're being surveilled, this creates a chilling effect on freedom of expression and makes people refrain from participating in civic space. So I think we have to see all of this in light of how surveillance actually operates in Latin American countries, particularly during protests, and how CSAM provisions can also lead to this criminalization, because we already see these very broad and very vague cyber patrolling practices, for example in the case of Colombia, and now we are also seeing legislation that tries to restrict things even further. If there aren't any safeguards, this can restrain active participation in civic space.

Those are really good points. It looks like our queue is empty, so I'm going to hop on to the next question that we have here. We've heard that children are at risk of abuse both from their caregivers and from others, and the latter case seems to be what Apple is worried about in particular. How can we actually empower children to recognize that they're in this sort of dangerous situation, without this surveillance, without disrespecting their human rights? What do we know that actually respects their ability as human beings to recognize these situations and make these decisions, while acknowledging that they're still young and may not yet have all the experience they need? What social structures can we put into place? What information?

I mentioned that one of the things we can do is educate young people and give them accurate information, just as we would about any other sexual health issue. We know from the evidence that empowering them with accurate information works: it's not just about warning them that exploitation happens, it's about how you navigate boundaries in your everyday life and how you do that online. Here are some things that can help you protect yourself in your online interactions; here are some things that are red flags. A systematic review of the literature on online safety programs found that a lot of them fall short by treating online safety as something separate from prevention education in general, when in fact being on a screen is simply part of everyday life. So instead of having a separate, siloed discussion about online safety, we need to integrate the two and teach safe online practices as part of broader education. We also can't pretend we can simply keep children from having online interactions at all; we know that doesn't work. And our hope is to reach parents as well as children: to empower children with information, and at the same time to empower their community, their parents, and their peers.
Adults, too, need to know that feeling a sexual interest in a minor is not romance; it's child sexual abuse. So we need to counter that with information and empowerment, and with changing the cultural conditions that enable adults to exploit children. We're not going to arrest and criminalize our way out of this. We're certainly not going to surveil our way out of it, as was mentioned earlier, with surveillance technologies that will only expand and end up harming the very people they're meant to protect. Thank you.

I see Kate, you had your hand raised.

Yeah, echoing a lot of what Chris said: we have to begin with comprehensive sex education that includes a range of identities and a range of bodies, that is consent-focused, and that starts young, and we just don't have that right now. The way we approach a lot of this is very much based on stranger danger: fear of the people outside your home, trusting only the people inside your home, and giving children very few options for who they can go to in school settings. If you're talking about the number of police officers versus the number of nurses and counselors, that ratio is terrible. And with exploding class sizes, teachers don't have the one-on-one time to develop the kinds of relationships we need. We know the answers to this question: you have to start with a wealth of information. The reason young people are seeking out more and more online is the dearth of information they're receiving from other trusted places. So this gets treated as an exploding, scary thing, when realistically we have not put in the foundational structures we need to treat young people and children as rational, thoughtful beings who are navigating their world in a reasonable way. Instead we're saying: we have to come in and offer you protection, and we have to do that by not providing you information, by cutting off the resources you have access to, which is the exact opposite of what we should be doing. The other point: one of the things that bothered me deeply about the SIO conversation with Apple is that it began with an organization completely misdefining CSAM and completely misdefining trafficking. If we start with a flawed understanding, one that doesn't recognize the significant differences between the experience of a very, very young child suffering physical and sexual abuse at the hands of a family member and that of a teenager who has been pushed out of the foster care system, or has run away from a home that doesn't feel safe, and is now trying to access resources in a way that affirms their identity, these are two different experiences, and we cannot lump them under the same thing. So I appreciate that Jean came in saying, I'm here to provide nuance, not solutions, because we use this language, we misdefine this language, in a way that gets clicks and retweets. And what that does is completely oversimplify the problem, which leads us to a completely flawed and oversimplified solution.
And so, broadly speaking, more access to information and more access to trusted spaces, on young people's own terms, is what's needed every step along the way. And it has to be responsive to the fact that we are talking about a range of different scenarios that cannot all exist under this one umbrella of "what do we do about CSAM?"

That's a great point. Next on the queue.

Just echoing a lot of what folks have been talking about: young people need access to resources and comprehensive sex education, resources about what abuse looks like and what healthy relationships can and should look like. Young people need access to non-judgmental mental health services without fear of court-mandated reporting. And young people need community with other young people and positive relationships with trusted adults. It's important to understand that young people are capable of consent and are going to explore their sexualities, and that this exploration is not synonymous with abuse and should not be treated the same as abuse. Surveillance isn't a stand-in for resources, and tech interventions aren't the answer to how we protect young people from abuse.

I completely agree. Jen, I see you had your hand up next.

I very much echo the points made, but if we're asking how we empower children, it's really important to recognize that too often, especially in the policy sector, adults are talking about children without them. Too often we're saying, how can we do this thing to you? We empower you. In fact, most of the research we've seen recently is about children asking to be listened to. Tomorrow, the Observatory for Monitoring Data-Driven Approaches to COVID-19 in the UK is publishing research, and all of its work with young people has brought out the same message: we are not being heard when we're asked how we want our digital rights or our data to be used. It very much echoes that consent question: we're left out of this conversation; we're not enabled to make decisions that affect our lives; we can see that these systems have discriminatory effects; we can see that our disability is not recognised in this space; we can see that our friends and classmates from marginalized communities are left out of participating in these conversations; and yet we're not able to influence any of it. This feeling of disempowerment is not self-inflicted; it's something we're doing to children by not listening to them. So our plea would be: while it's very important to pass on skills, training, capability, and capacity to children in different degrees as their capacity develops, we must also recognize that capacity doesn't necessarily come with age. One of the big pushes at the moment is for age verification, for age to be some sort of key that unlocks the next stage of your digital development, when in fact capacity is what matters rather than age. And we must recognize that children must be listened to, that they're not homogenous, that they're not one group. In fact, when we're making these decisions coming out of Silicon Valley, the US, the UK, Australia, we're not necessarily considering the children in Sub-Saharan Africa, the huge importance that digital access has in their lives, and what these decisions mean for them.
I would just give one case study of a young LGBTQ community member we spoke to in a workshop this year. They said: I cannot go online at home, because of what my parents might say or do if I were found researching these materials. I cannot go online at school, because I'm being monitored at school. So the only way I can access the information I'm trying to find is by using what they called the dark web, going through Tor to find safe channels. And of course, adults turn around and say, well, why are you going into that riskier space? In fact, we're chasing children away from the safe channels they need by imposing this sense of "what can we do for you" rather than enabling what it is that they need. I think we need to listen much more to what children say, and the research is out there. I'd love to know, if Apple had done research with young people, what their findings were; how they made sure that the community of young people consulted was inclusive, representative, and dynamic; and how that would be assessed over time, because those views are not static at one point in time and may change with experience, and as these kinds of technologies, if they were to be rolled out, take effect. We mustn't say, well, that's it, it's a done deal, it's a decision, it's over. There must be continuous assessment and listening to young people about what the effects might be.

Yeah, I would love to hear that as well, and I'm sure they would appreciate your expertise in helping to run those sorts of studies. Okay, one more on this topic, from Maxi.

To put it in a more targeted way: in Apple's specific case, for their parental notifications, there's a world of difference between a system mandated by the parent and implemented to surveil their children, and a system that allows children to choose to report content to their own parents. The former, where the parent mandates the surveillance against the wishes of the child, or without even including the child in the conversation, fosters distrust, suspicion, and fear in both parties. Whereas if children have the option to report content or messages they don't want to see to their parents, that actually builds trust and builds the relationship, because it is driven by the child, who chooses to engage the parent and asks them to step in as a parent in these situations. I think that would help not just with child sexual abuse but also with bullying. A lot of times children don't know how to approach these conversations, or find it difficult to approach them, and giving them the option to quietly ask a parent to intervene, or to quietly ask them to view material, would help build the relationship rather than tear it apart. But again, this is driven by the child. You have to listen to the person who's affected, and if the child is exploring, or is discussing things with people they trust and wants the privacy and confidence to do that, that's important to maintain. We shouldn't step in with technical solutions that override their consent.
I think that's a great point. We've definitely seen that building trust is a great way to go forward there. I want to change gears a little bit and make sure that we're hearing from our international representatives here as well. So I have a question. One of the groups that Apple worked with to build this program says on their website, as part of their official platform, that minors under 18, that's every minor under 18, should have no access whatsoever to encrypted communications. I'm interested to hear what the effect might be in the communities you work in if this were the case, if anyone under 18 could not encrypt their communications at all. And sorry, I actually didn't see whose hand was raised first, Vladimir or Namrata. First one, Namrata.

Thanks, Vladimir. I think that position is unreasonable and very flawed. Privacy online, digital privacy and security, is crucial for protecting your right to privacy, your freedom of expression, your freedom of assembly, and scores of other human rights. And human rights and fundamental rights are not limited to adults. Even from a purely legal, technocratic standpoint, none of these legal instruments say that rights belong only to adults; whether it's an international instrument, a human rights convention, or a domestic constitution in a particular country, these rights are guaranteed to all citizens, to all persons, to all human beings. So I don't think it is fair to limit encryption, a crucial tool for privacy, security, and freedom of expression, to those aged 18 and above. And for children in particular, several other rights are at stake as well. Encryption is important to protect the right to education: to be able to have conversations with fellow students and teachers in a free environment, to be able to ask questions openly. And to some extent, encryption is important for freedom of thought, for intellectual development. For a child, for a young person, developing their views on issues of politics or sexuality cannot happen in an environment devoid of encryption. The same way a child needs privacy to talk to somebody they trust in a room, without somebody listening in or the possibility of somebody listening in, that same space has to be available digitally. And we've seen that over the last year and a half more than ever. So at a time when we actually need enhanced protections, enhanced spaces for children to communicate securely, to think freely, to search freely, the opposite is happening: these spaces are being eliminated or severely compromised. In some ways the effect can be compared, it's a bit of a stretch, but can be compared to an internet shutdown. We saw that happening in Jammu and Kashmir and several other parts of India: when there is an internet shutdown, children are not able to access education the way they should. They have to go to cyber cafes to connect; there are several additional steps they need to take. The moment encrypted channels are not available, the same thing will happen with all kinds of research projects, or with conversations they're seeking to have in private spaces: those will simply be shut down for them.
And that is very, very critical. It will result in a violation of their fundamental rights. And the very basic point, like I said at the beginning, is that one cannot restrict human rights and constitutional rights to adults. That is fundamentally flawed.

Thanks, yeah, that's a great point. Vladimir, looks like you're next.

Yeah, definitely, a close one to what Namrata was just mentioning. Imagine being told: from here to here you have encrypted communication, but, sorry, you are young, so you cannot have any recognition of a human rights standard. Just to reinforce the point: encryption and anonymity have been recognized by international human rights bodies as an essential part of freedom of expression and privacy. The other thing, again thinking about protests in Latin America, just recently in Cuba and Colombia: young people were the ones participating actively in those protests. Imagine not having the chance to use encrypted communications while knowing that you are being surveilled, prosecuted, and criminalized by the government; open, unencrypted communication puts them in great danger, at great risk. So I think we need to reflect that if it's a human rights standard, it applies to everyone. And particularly in the context of protest and social mobilization, encryption and anonymity are vital for the young people who are mobilizing, expressing their demands, and using technology for those purposes. So from that perspective, I would say the opposite: we need to preserve encryption and really think of it as a vital, strong need for many different activities, from education to access to knowledge to communication, and keep it as an important tool and an important standard for everyone, in particular for young people taking to the streets and using encrypted communication to mobilize.

Thanks, yeah, that's a really good point about young activists in particular. Next we have Rafael.

Yeah, thank you. I just wanted to say that it's important for kids to know what their rights are, and privacy is a fundamental human right. Cryptography allows us to protect our privacy, and we can understand privacy as the ability to decide what part of your life you want to share with whom. Kids should be aware of that; they should know what part of their life they're sharing with whom. The internet is surveilled. Everything we do on the internet is surveilled, and kids should know that, and they should know what alternatives they have to communicate securely. Just as we talk about sexual education, this is education kids should have from a young age, because that's the world we're living in. Thanks.

Absolutely. Luis?

I would just want to reiterate that it's important to think about who children's adversaries are, who wants to attack them.
And I think it's really misguided, someone living in a fantasy world, to say that kids shouldn't have a right to encryption, when their adversaries will absolutely use that against them. And to reiterate: organized crime, which is very much involved in committing crimes against children, is often the state in many places in the world. If you are by default making children's communications less safe, you're making them more vulnerable to their attackers, which includes people with access to really powerful legal and technological tools to use against them. So by requiring that no child has a right to encrypted communications, you are probably making them more vulnerable to their adversaries, to the people who want to abuse them, which includes people in government, and people in government who are part of organized crime. So yeah, I think we should really dismiss this view, because it's really harmful for many people around the world.

Yeah, it's a great point, especially when your government is the organized crime; that certainly wasn't in the original threat model. Damar?

Yeah. In September 2019 there was a group of students under 18 years old who communicated with each other using encrypted communications to join a peaceful protest rally in Indonesia. What happened next is that their communications were exposed in the media, with everyone saying that, based on what was inside those communications, these students were the ones behind the protests against the government, and those conversations were used as the basis of their prosecution in court. Without encryption, I think this could happen more and more, because in the context of Southeast Asia, many of the young generation join rallies and protests, and they need that encryption. Not only in Indonesia, but also in Hong Kong, in Thailand, in Myanmar right now, young movements have risen against authoritarian regimes, and without encryption it will be difficult for this young generation to be safe, given what is happening in our region.

Thank you. Yeah, those are certainly all places that have had high-profile protests, particularly led by young people, so it's a great reminder. Diana?

Thank you. I just wanted to draw a thread through what has been said, because I think it relates very closely to my introduction about not seeing things as either-or: either protection or privacy, either one right or the other. This very broad, no-nuance proposal to prohibit encryption for under-18s is a very good example of seeing things simplistically and not seeing human rights as mutually reinforcing. We've had the argument that human rights obviously do not pertain just to adults; children have those same human rights. The other argument, which I think is needed for the next step in this debate, is that those rights are interdependent and mutually reinforcing. When we talk about countries where individuals have been arrested or imprisoned for online content relating to political, social, or religious issues, where young protestors are targeted, arrested, and subjected to violence, that's not just a freedom of expression or freedom of assembly issue; it is a protection from violence issue as well. So protection from violence
does not sit only on one side of the debate, in opposition to civil and political rights that are somehow more important; civil and political rights have an element of protection in them, and everything is mutually reinforcing.

Yeah, that's a great point: it's all a question of safety, just different aspects of it. Jen?

If anyone's looking for more information pitched at a less expert level than the panelists here, EDRi, European Digital Rights, has some terrific material around encryption, the kind of balanced, nuanced information that is often missing from other circles and policy discussions. But the thing I want to raise is that children are often what we call in British English the canary in the coal mine: the test subjects for these changes in technology. And whilst it's not what Apple is doing right now, I'd bring it back to the fact that this safety tech is already in schools. Exactly this kind of removal of encryption, workarounds of encryption, putting software and services in what are effectively man-in-the-middle attacks, is how this safety tech works. These systems are set up so that they're able to access everything on a screen, which means bank logins and secure passwords, and the companies will openly say in their frequently asked questions: we try to have technological solutions to this, but we might miss some passwords and pick them up. That's not an answer to breaching your fundamentally secure digital access. They're very open about it, even putting the most invasive case studies they've got into their marketing materials. There was one where a teenager had suffered an attack, and she was writing offline to her mother using a school computer, and the company put into their marketing materials: we were able to identify this, and we flagged it to the school. That young woman had chosen to write something to her mother, to a trusted relation, to tell her that this thing had happened to her. She didn't expect these third parties, these strangers, to interfere with her private sphere, tell the school, and then have some sort of intervention follow. She would have chosen to do that herself if that's what she'd wanted. So this breach of encryption as a tool is the way into a breach of somebody's personal and private life in every aspect, and the technology aimed at children is doing it in the most invasive ways. And I think we all need to be alert to where this goes next. One of the suppliers of safety tech advocates for what the direction of travel should be, what happens next with this type of technology, and what they'll say, and we've heard them talk at conferences, is that you use this type of technology to monitor children's activity to keep them "on point": are they working in class on their task, are they focused, are they staying on screen, are they looking away or distracted? Because this is what employers will want in the future; this is what work partners will want them to be like. Some of these technology companies are very open that this is what they want the world to look like: your employer will be able to monitor everything you do; there will be no safe, secure content on your screen in any employment sector either; every activity, and whether you're on the screen or not, will be monitored. So all of this discussion, and what Apple chooses to do at this moment, is a fundamentally important decision about
the direction of travel for how we preserve fundamental human rights and freedoms in the digital environment. Because it's not just about the technology now; it's about what it's going to lead to next, and about normalizing for children the acceptance that whatever you do, we'll be watching you. And then suddenly at 18 you expect them to understand: oh, now you can do safe online banking, now you can do safe, secure shopping. No, that's not what the technology is going to be used for. The ways they've managed to normalize the lack of encryption and the lack of secure spaces will be used to invade every aspect of your daily digital life. I think Apple has to be very, very careful about normalizing this attitude in children and this acceptance by children, because this is the next generation of adults, arriving very quickly, and the kind of world they want to live in is not the one being offered to them right now by technology companies.

Wow, I think that's exactly right, and we've definitely seen that happen, especially in the last couple of years as more work has been done from home: things that were tested, maybe tested on children, expanded to the entire workforce. I think that's exactly right for what happens with children. And we also see a very similar expansion, as others have mentioned, around the world, from scanning for one type of thing to scanning for another type of thing. We've definitely seen it with terrorism content and the GIFCT database, and we've already had a couple of other examples; if anyone wants to throw in another one, you're welcome to. Looks like the queue is empty, so we can move on to a new question, unless anybody has anything else. That was so well said; I think that's exactly right, it's about the larger direction. That's why we're so worried about even what might be a narrowly scoped backdoor: we've all seen such things expand in other cases. But I also want to talk about what it is they're even purporting to do here, particularly on the CSAM scanning front. It seems clear, and even Apple has admitted, that the CSAM scanning is essentially being driven by law enforcement interests. NCMEC is technically an NGO, but they work with the government, essentially helping with the law enforcement side of removing CSAM. And those on the law enforcement side often talk about the revictimization of the people who appear in that CSAM, because they're trying to connect what is really just another form of law enforcement, like any other enforcement against possession of CSAM, to helping people. They're trying to say: look, we're not just doing law enforcement, we really are trying to help people. But what I'm interested in is: what are the actually effective measures to help people who have been sexually exploited? What can we as a society do or change that would genuinely help people the most, whether that's a technical or non-technical intervention? If the goal of the CSAM scanning technology is to help the people who have been victimized and depicted in the CSAM, whether they're still children or are now adults, is this actually helpful? Is this what we should be doing? Is this where our resources and energy should be going? I'm interested in hearing from anyone who might
have any information or research about any of those things. Jean? And that's how you turn off your mic.

No, I, um, for Freedom Network, what we see is that primary prevention is what's really missing in the conversation here in the United States. As I said in our introduction, what we see is that human trafficking is the inevitable result of the racist, sexist, homophobic policies we've pursued in the United States that put people into positions of risk. It's our lack of affordable housing; it's our non-existent safety net; it's the lack of affordable healthcare that puts people into positions where they're desperate to meet their needs and unable to reach out for the services and support they require. On top of that, it's the criminalization of sexual economies, which ensures that people who have engaged in commercial sex, who have worked in the sex trades, are limited in their ability to try other trades, to take other jobs, and still access safe housing and other forms of employment, and sometimes even to fully engage in parenting: people lose their parental rights because of their history in sexual economies. So what I think we need is a fundamental shift, a change that actually protects and supports the individual, that ensures that no workers in any industry are forced to engage in harmful work, left unprotected, and not made safe; and, additionally, real support and services for families to ensure that children have their needs met. In the United States, our child welfare system is really a punitive system that removes children when the state determines they're not being kept safe by their parents or caregivers, instead of focusing on ensuring that families have all the services and support they need to meet children's needs, and that children have access to safe, caring adults who will meet them where they're at and ensure they have safety, the food they need, the housing they need, the clothing they need, and the connections they need to be safe. Those are the things that really help. Those are the things that are needed. They are not easy; they are not a technology solution; they are not a shiny penny; they are not quick and attractive. And they are what needs to be done to really make a change.

Yeah, that's a great point. Those all sound much harder than implementing this one change in our technology. Okay, we are actually coming close to the end of our time, so I wanted to give anybody who wants one a chance to get in any final thoughts: things you might have heard other people say that really resonated, or that you feel might apply in a different context, really any final thoughts. And if not, we can wrap this up for today. If anybody wants to jump in, just feel free to take the mic. Yep, Maxi.

I just want to say thank you so much for hosting this. This was a wonderful conversation with really excellent perspectives. And I also want to highlight the fact that these systems we're building will persist. Data that exists today is not necessarily safe tomorrow. We are, right now, building the foundations of digital privacy, of how we're going to engage with the digital world, and we're really deciding, as a world, which direction this will go: will we go towards broader surveillance, or will we go towards broader protections and the consent of everyone who engages in the system? And I think now
is kind of a pivotal moment, because what we decide today will resonate far into the future. Thank you.

Thank you. Yeah, that's a great point, and I think it may be the perfect closing summary for us, and maybe what we all want to consider looking forward. I want to thank each and every one of you for taking the time to be with us today. These are really hard topics, but I appreciate you bringing all of your expertise and knowledge together; this has been a great discussion, and I feel like I've definitely learned some things. Hopefully anybody watching from the technology or policy communities will be able to take away some of these insights as well. The event will be available on YouTube if anyone would like to rewatch it. I'm looking forward to continuing these conversations with every one of you, and with the larger community, in the future. Hopefully we can work together to build a future heading in the right direction, and make sure that the decisions we make about technology are building the future we do want to see. Thank you so much.