I want to welcome everyone to our first FTGS ISA Global Voices Seminar Series. Anwar is the first presenter in the series, which was created by the FTGS Section to foster more of a global conversation amongst all of our section members, and with section allies as well. So we're really taking advantage of the technologies of COVID, right, and fostering a global conversation across different time zones. I'm so excited to be promoting this series. My name is Amanda Chisholm. I work at King's College London in the Department of War Studies, researching gender and security, and I'm also this year's FTGS Section Chair. I'm so pleased to also be one of the co-chairs of this seminar series. I'll have Lua, as co-chair, introduce herself in just a second, but first I want to quickly introduce our presenter, Anwar Mhajne. Anwar will be presenting on the topic of her paper, "The Application of International Humanitarian Law to Israeli Cyber Strategies Against the Palestinians." This is really cutting-edge emerging research, and I'm so excited to have this platform for you to present. So I'll stop talking now and quickly pass over to Lua to introduce herself, and then, Anwar, we'll let you get going on your presentation. So, Lua. Hello, hello, everyone. My name is Lua. I'm a PhD student at York University in Canada, and I'm studying transnational networks of feminists and also right-wing women in the Americas during the first half of the 20th century. I'll be here helping out with the chat and helping run things. Thank you, and welcome, everyone. That's fab. So, Anwar, we'll just let you share your slides now. And while you're setting that up, I just wanted to let the audience members know that questions and comments are strongly encouraged. We want this to be an interactive discussion, so you can either place your questions in the Q&A box.
And Lua and I can read them out loud to Anwar after she's done her presentation. Or you can raise your hand and ask them live. I should also quickly note here that Alexis has kindly agreed to be Anwar's discussant, so after Anwar presents, Alexis will offer some discussant commentary. But now I will honestly be quiet and pass the virtual floor over to you, Anwar. Thank you, Amanda. I am talking to you from my son's playroom, so that's why you see a lot of pictures in the background. I'm really happy to be the first person to start the series, and I'm very excited to attend the next webinars. I think this is an important project, and I want to thank Amanda, Lua, and everyone for organizing this. A little bit about my interest in this research and why I started thinking about cybersecurity. I attended multiple panels at ISA, the International Studies Association, and I knew somebody who did cybersecurity research, but they did it from a realist perspective, specifically an offensive realist perspective about how states should behave in the cyber realm. It was all about developing attack capabilities. And I am also a Palestinian Israeli, a Palestinian with Israeli citizenship, so I've always been interested in technology and the way it's applied in the context of Israel and Palestine. Both of those came together for me, because I felt alarmed that it was all just about developing capacities. I mean, there are also defensive capabilities of states, but I felt that the civilian element hadn't been focused on enough. That's how I started thinking about this project and my other writing on cybersecurity from a human security perspective. Of course, I also wanted to incorporate my feminist training, so we'll see if I'm doing this correctly or not, trying to speak to multiple bodies of literature all at once.
So now we're aware of cyber attacks happening by state actors and non-state actors; we heard about the 2016 election, and then more recently about Pegasus and Israel. With more attacks happening, more people are paying attention to the dynamics of the cyber realm. Even though this is an evolving reality that is becoming more prominent and more present, the focus on how to legally navigate cyber attacks and data during armed conflict, specifically during armed conflict, is still lacking. The two bodies of law that I looked at were international humanitarian law and human rights law, and I'll talk about the differences, and why I'm focused on international humanitarian law and human rights law, or why I'm mentioning both but not others. Initially, because IHL developed before technology became very prominent, there are no really specific protections in the Hague Regulations of 1899 and 1907, the Geneva Conventions of 1949, and the Additional Protocols of 1977. Even with the amendments, which I'll talk about a little bit, we can see that — I'm not sure, can you see the slides? I don't know why you can't see them. Sorry to interrupt — we can see the slides, it's just that we can see all the slides, including the ones coming up. Oh, I see. Yeah, here we go. Okay, better. Good. So what was I going to say? Again, I'm going to talk about some of the amendments; the amendments still don't address it adequately. So what is international humanitarian law? It's the law of armed conflict. It regulates what happens during armed conflict and sets rules to limit the effects on civilians. It actually has safeguards for attacks on private property, but data is not considered private property in that sense, so that complicates things a little bit.
I also wanted to talk about human rights law versus international humanitarian law. They both apply during armed conflict. However, in cases of emergency, states can suspend human rights law, but they can't do so with IHL, with international humanitarian law. Human rights law has advanced a little further than international humanitarian law when it comes to privacy: you can see some mention of privacy in Article 12 of the Universal Declaration of Human Rights, and afterwards it was included in the International Covenant on Civil and Political Rights, specifically Article 17, which says that individuals should not be subjected to arbitrary or unlawful interference with their privacy. To address some concerns about human rights law and privacy, the International Committee of the Red Cross issued a commentary in 2020, saying that these issues are covered under international humanitarian law alongside human rights law. However, they don't really clarify the scope and nature of the rights to privacy and data protection — and I'll talk about the difference between data privacy and data protection in a little bit — specifically during times of war. For instance, there was a 2004 ruling of the International Court of Justice on a case concerning the Palestinians, and it acknowledged that Article 17 of the International Covenant on Civil and Political Rights, on the right to privacy, applies in the Palestinian territories. However, that wasn't enough, because it didn't clarify why it applies, in what fashion, or what the specific consequences of the application of these laws are. Also, again, there are no specific mentions in the general amendments to the IHL treaties, and the ICRC customary IHL database excludes any real mention of privacy or data protection within the 161 rules it identifies. So now I'm going to take IHL and feminism.
I know it's a lot of literature that I'm trying to speak to. I found multiple feminist critiques of international humanitarian law, but a lot of them are about sexual violence, gendered victimhood, and narratives of motherhood. I'm trying to challenge my feminist colleagues to add another element, which is data privacy and data protection. I'm seeing more and more work emerging on data feminism and things like that — I'm currently reviewing a book about it — but I want to make a case for why data protection and data privacy could be incorporated within feminist critiques of IHL. In a lot of ways, data privacy and data protection are important because if we disregard them, we are replicating the existing inequalities that feminists and post-colonial scholars have been addressing, right? State versus non-state actors, civilians and non-civilians, and how even minority groups' rights are being abused in the cyber realm due to the development of surveillance technology. And I'll talk about the Palestinian case. It's a very specific case, because surveillance there is at a much wider scale, with cameras and biometrics everywhere in the Palestinian territories, and Jerusalem is also a specific case. I don't know if you're keeping up with what's happening right now, with the journalist who was killed today, but that's a whole thing that is not directly related to us. So it is important to think about data privacy, because cyberspace is heavily populated by civilians. A lot of activists, Palestinian and non-Palestinian, rely on the cyber realm to organize in virtual space. Of course, we started out with the Facebook revolution, but then we have WhatsApp, Telegram, and other apps that activists use to organize.
In the Palestinian case, it's even more crucial, because Palestinians are separated into two different territories — Gaza and the West Bank, which are not connected — and then we have Palestinians living in the diaspora and Palestinian refugees. Using these apps, and the platforms where they have to share their data, is essential for them to organize and be active, which is good for activism in one sense, but also bad, because with advanced surveillance technology these activists can be targeted easily. Information can be collected, and there have been documented cases where information was collected about people to force them to collaborate: if you're, let's say, gay in a conservative community, or somebody who's cheating on their spouse, this information can be used to blackmail, and has been used in some documented cases. So it replicates these inequalities, and of course it's gendered — there are now reports about women activists who've been targeted by Pegasus as well, if we want to talk specifically about how women and cyberspace interact — but in general I want to focus on inequality in the cyber realm. So, data privacy and data protection: I said I want to clarify the differences, because they're important. Data privacy is usually about who has access to the data. Data protection is about what mechanisms exist to prevent people from accessing the data. Both are important. States can justify accessing data — let's say Israel will justify, in the name of security, collecting biometric data and a crazy amount of text messages and all of that — but then what happens if the information is leaked, or somebody hacks into the system? What kind of obligation do they have to the Palestinians in that case? Can they tell them? Should they tell them? Nobody really talks about these issues in the context of data protection and data privacy either.
Again, data — I believe it was Yuval Noah Harari who said data is the most precious possession you have in this age, and I think that's because it gives power to a lot of people, and it can take power away from people. Feminists have also talked about the private-public dichotomy. Usually the private has been critiqued, because women have been pushed into the private realm, and the public is the space women were prevented from occupying. But in this context, privacy is something that should be pushed for, because in the cyber realm, when it comes to data, if there's no privacy and no protection of that privacy, that can cause gendered harm, plus exacerbate racial, class, and ethnic inequalities as well. Speaking to my feminist colleagues: we need more work on data protection and on how feminists can delve into the cyber realm. As I said, I'm noticing more work right now — I'm excited to read a book called Data Feminism that I'm sure will address some of these elements. Feminists can start addressing data protection by analyzing and questioning governments' and private institutions' power over users' data collection and distribution. Your data can be collected without you knowing. Of course there's consent, but we know that's not really consent, because nobody has time to read so many pages of terms and conditions. Also, sometimes, even for these communities, the level of education and awareness is not enough, and because of the security situation in a lot of places, there's no opportunity for anyone to give real consent for data collection. Why does IHL specifically apply in the context of Israel-Palestine? I talked about surveillance technology, but also because, under international law, an occupation falls under what's regulated by international humanitarian law.
When I presented this somewhere, some scholars said, why don't we talk about human rights law? I do address it, but in cases of emergency, as I said, states can suspend some elements of human rights law, and that's why I'm focusing on international humanitarian law. These laws are important again because of the occupation status: because Palestinians are considered an occupied people, Israel has certain obligations to the Palestinians. Israel, you know, is praised and known for its cybersecurity capability — talk to anyone and they'll tell you Israel is the high-tech nation — but the advancement of surveillance technology in Israel is immediately tied to the IDF, and specifically one unit, Unit 8200. That unit was actually established before Israel became a state, and it was doing surveillance at the time — surveillance of Arabs — but it became more sophisticated over time. A lot of people who graduated from or served in that unit own a lot of big companies that are known to us. There's even one company called Check Point — in the name itself you can see the application of the colonial aspect. Some people call this cyber colonialism; multiple articles have been written on that. They talk specifically about how Israel has control: Israel gets to decide when Palestinians can have access to the internet and what type of internet, and it controls the physical infrastructure of the internet. I just read something today about how Palestinians are now negotiating with Israel to access 4G technology. So even access to technology — the physical elements of the internet, as well as who has access to it, when, and at what speed — is all controlled by the Israeli government. Oh, sorry, I went back. With all these limitations, cyber activism is very essential to the Palestinian cause. I'm trying not to talk too much, but I'll answer your questions.
But that also makes Palestinians exposed, because if you're using the cyber realm to organize, that means you're going to be surveilled. We noticed with the Arab Spring it was like the Facebook revolution, but authoritarian regimes noticed that too, became aware of the power of technology, and now know how to navigate it as a tool of surveillance — it can be an oppressive tool at the same time. It's easier to watch you and gather more information about you. And Israel is known to have arrested multiple students, journalists, and others on the basis of surveillance technology. I'm also working on a project with 7amleh — I don't know if you're aware of 7amleh. It's a Palestinian digital rights organization that focuses on surveillance of Palestinians. They document cases, especially collaboration between social media companies — such as Google, and Facebook specifically, I guess, because of Instagram as well — and the Israeli government, and how that's impeding Palestinian digital rights. It's fascinating to see the number of requests from the Israeli government for information from Facebook, and the number of times Facebook complied. So we can also look at it from that perspective: collaboration between private companies and the Israeli government to suppress activism on Facebook and Instagram. I don't know if you're paying attention to Bella Hadid and the whole controversy that her posts about Palestinians have been censored multiple times — and you'll hear that a lot, even with hashtags like al-Aqsa. I'm working now on a paper addressing how their list of dangerous individuals and organizations is shaping how they censor people online: once a group with "al-Aqsa" in its name is labeled a terrorist group, any mention of al-Aqsa gets flagged by an algorithm as something that supports terrorism.
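The context-blind flagging described here can be sketched in a few lines. This is a hypothetical illustration, not any platform's actual pipeline: the function, term list, and example posts are all invented, simply to show how a filter keyed to a banned organization's name collapses every use of that name — including a reference to the mosque — into "support for terrorism."

```python
# Hypothetical sketch of a context-blind keyword filter.
# The term list, posts, and function name are invented for illustration.

def is_flagged(post: str, flagged_terms: list[str]) -> bool:
    """Flag a post if it contains any listed term, with no look at context."""
    text = post.lower()
    return any(term in text for term in flagged_terms)

# A single entry drawn from a hypothetical "dangerous organizations" list.
flagged_terms = ["al-aqsa"]

# A post naming the mosque is flagged exactly like anything else.
print(is_flagged("Friday prayers at al-Aqsa Mosque this morning", flagged_terms))  # True
print(is_flagged("Heavy rain expected in Jerusalem tomorrow", flagged_terms))      # False
```

The point of the sketch is that substring matching has no notion of named-entity disambiguation or of Arabic morphology and context, which is exactly the gap discussed later in the conversation about algorithms that "see one word" and block all the content around it.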
Which is fascinating to look at as well. NSO — NSO is the company that produced Pegasus, and there's the whole controversy of selling it to Bahrain, the UAE, and Saudi Arabia, which wanted access to it. So what's being used on the Palestinians, tested on the Palestinians, is now being exported to help authoritarian governments suppress activists as well, which is also important to look at. There's direct contact between this private company, NSO, and the Israeli Defense Ministry: the ministry has to approve any sales to foreign governments, so it's not as if NSO operates outside of the state. And the founder of NSO is also a former member of Unit 8200, which is a fascinating thing to think about. Pegasus is the program we're talking about. It does surveillance in two ways: one, it simply exfiltrates your data; two, it can be used as a real-time surveillance tool, which is terrifying. It was found on the phones of six Palestinian human rights activists who were in organizations that Israel labeled as terrorist groups, even though they're human rights organizations. So you see the labels of terrorism, emergency, human rights, and surveillance all interacting at the same time in this case. One of the activists, I think, is now filing a suit in France over Israel violating his privacy. Since 2000, CCTV surveillance has increased, and now Israel has a program called Blue Wolf, which allows surveillance to be done with smartphones — it's not just stationary cameras. Soldiers can do it with their smartphones, and they've been taking pictures of men and women without their consent and entering them into a database. That also creates a sense of insecurity in the Palestinian context. Again, all of this is in the name of security, right? Emergency.
So what I'm trying to do with this — and I don't want to take more time; I guess we can get into it in the questions and comments that Alexis will have — is to say that data is an important thing to look at and examine, and that it needs to be codified not only into human rights law but also into international humanitarian law in the context of armed conflict. It needs transparent guidelines for when and how this right to privacy and protection should be applied, and what the consequences of violating such laws are. Now, that's not to say that Israel hasn't violated international humanitarian law or other international law before — the Israeli Supreme Court recently made a decision that contradicts international law, deciding that it doesn't apply in this context. So that's what I'm trying to do with this project. I don't want to talk too much, but there are lots of questions about Israel's obligations regarding Palestinian data. If the data gets hacked, do they tell these users? Or not really "users" in this context — the people whose data was collected; even using the term "users" is weird here, because the data is being collected without your consent. What is Israel's responsibility to protect the integrity and security of its databases? And then you can also complicate that: when is surveillance technology allowed to be used, how should it be used, and what kind of data is allowed to be collected? So I'm going to stop here and pass it over to Alexis. Thank you. Oh my goodness, that is such a fascinating and terrifying presentation. I'm just going to pass the floor over now to Alexis for some further commentary and questions. Alexis, the floor is yours. Yeah, thank you. So I have some notes on this paper.
I'll start by reading what I already have, and then with whatever time is left we can move into a more open discussion, and hopefully some folks in the audience will have questions to engage with too. I wanted to start by saying that the paper feels extremely timely on multiple counts, but especially insofar as it addresses the need for more feminist engagement — especially in feminist IR or feminist security studies — on questions of data privacy and the potential weaponization of data. Obviously you have a really powerful case study of that here, but even more so thinking holistically about the COVID-19 pandemic and the concerns it raised over the use, misuse, or harvesting of people's health data. You mentioned Pegasus, and it's worth noting, as you alluded to, that Pegasus has been used to surveil feminists in India and the UAE, but also women journalists in Mexico and El Salvador most recently — which happened even after the first wave of Pegasus revelations came out. Now we have these debates going on in the United States with the possible overturning of Roe versus Wade, and there has been some conversation about the way that cell phone location data, and data from fertility apps, could be misused and exploited. These are conversations that we need to have — and arguably needed to have a few years ago. But we generally see how the gendered, classed, and racialized patterns in data literacy and in technology access leave minoritized populations vulnerable in ways that are not sufficiently addressed by national or international law. So along those lines, the discussion in your paper about how privacy is treated in international humanitarian law is, I think, an incredibly strong point to make.
On the one hand, your discussion highlights how privacy is defined in really androcentric ways that don't acknowledge gendered impacts. These definitions also ignore the impacts of sexuality, race, ethnicity, and so on. The notion that objects, but not data, can merit the protection of privacy under international law presupposes the idea that unregulated data can't pose risks, but of course we see through your case study that that is not the case for many people. In addition to what we discussed, we've seen other examples in the international community where data misuse has posed real risks and resulted in real harm to people. I'm thinking specifically of some of the research that has come out about the Islamic State, where we saw that authorities in the Islamic State were confiscating people's phones and using searches of phone data basically to persecute members of the LGBT community. We've also more recently had issues in Afghanistan with the potential — and actually, I believe it's been documented at this point — misuse of biometric data that was collected by the United States and other allied militaries, which has since fallen into the hands of the Taliban and can now be used to identify, for example, people who worked with the United States. The case study portion of your paper, I think, brings together multiple threads. On the one hand, there's the notion of surveillance as an ongoing activity, and that brings with it questions about how that surveillance is racialized. Related to that, there's the issue you bring up of cyber colonialism: how Israel is controlling who can access the internet, under what conditions they might do so, and what types of information or services people are able to access when they can get online. The third dimension to the case study that you talk about in the paper is the application of humanitarian law and the laws of war.
And so I'm going to separate those things out a little bit, because what I actually think is that you may have the basis for two separate papers here. The first could be one that deals with issues of digital colonialism and surveillance. That discussion has clear linkages to the literature in surveillance studies, which is an interdisciplinary field, but one that clearly links technological surveillance environments to historical systems of colonialism, segregation, and even slavery — thinking about examples like the development of fingerprinting in colonial India, the use of passbooks in apartheid-era South Africa, and of course the surveillance of indigenous populations in North America through multiple means. To that end, if you were interested in developing a paper that focuses more specifically on this as a surveillance issue, there's work in surveillance studies, particularly the work of David Lyon — I'll send you these citations afterwards. There's also an excellent collection called Feminist Surveillance Studies, published by Duke University Press and edited by Rachel Dubrofsky and Shoshana Magnet, and I think that has really clear implications for what you're looking at here. So I think you have one paper, potentially, that deals more closely with the chronic surveillance of Palestinians, how that's facilitated by technology, and the unresolved questions of how that data can be used or misused by state actors or corporate actors, which is an important dimension of your paper. There's also the possibility there, of course, to draw out a comparative case study, because unfortunately there is no shortage of examples where we're starting to see data use or misuse and technological surveillance deployed in support of racialized surveillance.
You mention a little bit in the paper the controversies over the use of facial recognition technologies, especially in the US and the UK, where we know those have lower levels of accuracy when applied to women and people of color; there are potentially cases in Europe we could look at too. I also thought about surveillance of populations in India, especially how India controls access to the internet in regions like Jammu and Kashmir, and even in China, where the use of surveillance against Uyghur populations or against demonstrators in Hong Kong is really analogous. So unfortunately, if you wanted to look at this from a comparative perspective, there is no shortage of cases to choose from. At the same time, I also thought you had another interesting dimension that could be developed into its own paper, one that deals with the more acute deployment of disruptions to information access. In the version of the paper that I looked at, you talked about things like the use of signal jamming, blocking deliveries of the specific technologies needed for the infrastructure of internet access, and digging up cables. And it's here, I think, where international law governing conflict and post-conflict environments most clearly applies, and where your examples show in a strong way the failure of international humanitarian law to keep pace with the expansion of conflict into the "cyber domain," in quotes. I think the case for expanding international humanitarian law is strongest where you can indicate how the state itself is acting as an agent of harm, and where the state is abdicating its current responsibilities under international law to protect civilians. And I know that's something we've discussed before, but I think both of those themes integrate really strongly with feminist IR and feminist security studies.
This can also become a messy but interesting discussion, because private actors — as you highlight, companies like Meta — are also playing an important role in facilitating what's happening and providing the parameters in which it can occur. So I think approaching the question from that angle could actually prompt even deeper reflection about whether international humanitarian law can be enough, insofar as the aim of this law, as it was developed, is primarily to constrain states. But we see that states may have limited power to actually bring private cyber weapons developers like the NSO Group, which you mentioned, to heel or into compliance. So with the caveat that I'm not necessarily an expert on international law and the arms trade, it seemed like there's also an interesting possibility for you there to bring in another legal avenue: not only international law as it applies to human rights, but also international law as it applies to the arms trade. I wonder if something like the developing international law that attempts to deal with the movement of small arms might provide an interesting parallel or analogy for how international humanitarian law can engage states but also, potentially, private actors. And with the small arms issue in particular, I know there's a case of how civil society can be mobilized to push for those expansions and changes to humanitarian law. Just some final points: in your case studies, I did also want to see maybe more of a callback to the gendered impacts of cyber weapons or surveillance. I do understand that these issues definitely highlight the need for feminist scholarship to be more active and engaged on issues of surveillance and cyber weapons.
So it definitely wouldn't hurt to emphasize that continually, but in terms of impact, I did feel that your discussion of the tools centers on racial and ethnic dynamics, and I wondered if gender dynamics are present too — especially whether this might be a place to bring in masculinity, because I know that has been a focus in the literature on preventing and countering violent extremism, right? This is the way that young men of color, especially in Western societies, are made hyper-visible by these technologies, and I suspect that might apply in your case too. But that's all the feedback I had, so I'm happy to turn it back over to you all for any further comments, questions, or engagement from the audience. Great, thanks, Alexis. While we're trying to conjure up some questions from the audience, I have some questions and thoughts too, but maybe we'll just pass the floor over to Anwar to respond to some of Alexis's comments and questions. Yeah, no, I think that was great. As I said, I know that I was trying to speak to different things all at once, and I knew there was potential to expand it into probably multiple papers. I think the comments were very helpful for mapping out a revision process that hopefully will bring in all these elements. But you know, there's a lot to cover — that's what's exciting about this. There's more room for research, and the more research you do, the more questions you have, and that hopefully will produce more research in this area. So, Alexis, I really appreciate the feedback. I think it's interesting to think about P/CVE and masculinity.
Though my initial thought is that women also use digital technology at high rates. You know, there are cases where, because some women were prevented from participating in public spaces, or worried about being in public spaces, they utilize the digital realm more often than men. So I wonder how that translates in this context as well. I don't know if it fully applies here, but it's an interesting point to think about, along with the role of civil society. On Meta, as I said, I'm working on something with Hamla about Meta, and that could be a full book: how they're behaving, their collaboration with Google, Twitter, and the government, and how they regulate these issues. With Meta, Facebook, it still doesn't sound right to say Meta, but the name covers Instagram, WhatsApp, and their other technologies. What's interesting about the way they regulate is that they have their own community standards, but then they comply with local law. And in the context of Israel-Palestine, who is making the law? It's the oppressor. The PA has been critiqued as well: not only Israel but also the Palestinian Authority has been using digital surveillance to suppress people. So the question is: how much collaboration is legitimate, and when are these companies responsible for applying human rights law and a human rights approach to their collaboration with oppressive governments, right? I mean, this whole conversation started with the Rohingya crisis and how Facebook was used for incitement, but then regulating things becomes complicated. Now you have the language element as well, because of these algorithms; just as identification with biometrics becomes harder with minorities, so it is here.
So with algorithms that don't understand Arabic or the context: if you flag certain groups as terrorist groups, and the algorithm sees one word from a listed name, then all that content is blocked automatically, right? Because it's not sensitive to Arabic and other languages, that creates another set of problems as well. So there are so many layers to cover in the context of Facebook. I mean, to add to the layers, and just where else you could go, I'm also thinking of the work of Claudia Aradau on big data, and on surveillance in the context of migration and the migrant crisis in Europe, and how different technologies and apps have been used on irregular migrants. I love these terminologies, right? Legal terminology: irregular migrants. Also as a space of experimentation, right? Like you mentioned, how these different data technologies and surveillance and data-gathering get tested on particular communities, then marketized and sent elsewhere. But she's doing some really interesting work on data failure as well, and on non-knowledge; I don't know if you've come across those concepts. You need to write a book, maybe a couple of books, on this. And what does that mean when we think about the limitations, the data flows, where data goes, and spaces for resistance and spaces for appropriation, doing data otherwise? Because, yeah, your presentation does highlight that it's terrifying, right? So I wonder: have you found in your research any spaces of creativity, any spaces where you see different communities appropriating data or using data otherwise? That's another research paper. I've been keeping up with social media responses, you know, activist responses to the censorship on Instagram and Facebook.
So for instance, one of the creative ways they're doing it: if you mention anything about Israel in Arabic, or Al-Aqsa, sometimes it gets blocked, as I said, because of the problematic list of dangerous organizations and individuals, and because whenever the algorithm sees one word from one of the names on that list, it blocks any content related to it. So now activists are putting in a dot instead of writing the full word. If I wanted to do it in English, it would be I-S-dot-R-E, and then you continue the word, just to mess with the algorithm. So that was one of the interesting things, seeing how activists are working around it. There are lots of organizations dealing with the digital rights of Palestinians, documenting abuses, spreading awareness, and they are asking Facebook for accountability. Facebook was forced to create a special committee, I can't remember the name of the committee, to address the Palestinian conflict, what's happening with surveillance there, and why certain content is being blocked. So there's a lot to see and witness, and there's even resistance from within the company itself, because employees are saying this is ridiculous; there are reports on that as well. But it is fascinating to examine this more in depth and see what other strategies have been used. I mean, I guess you're still sending so many tentacles out there. Just reflecting on your research right now: what has been most surprising for you in exploring all of this? Surprising. I knew that surveillance existed and that it gets normalized. As I said, I grew up in Israel; it just gets normalized, right? We all know the Mossad, or the Shabak, is collecting data on everybody, right?
But the extent to which surveillance affects day-to-day life, even walking in the street, your data is being collected without you using your own phone, right? That's the scary part, along with how the data is utilized without your knowledge, and the extent to which, if the actor using the data is considered legitimate, a state institution, there's not really much regulation, especially in cases of emergency. Just claim emergency, just claim terrorism, slap that label on it, and you're done, right? So for me, once I understood the extent of it, I could just walk around and view the world differently, right? And I can't say it's a pretty way to view the world at this point, but it's true: it opened my eyes to all these levels of discrimination and how all these inequalities translate. And we saw that highlighted even more because of COVID: how inequality in physical space is directly tied to inequality in the cyber realm, and who gets to regulate that space first, right? It's the privileged ones, the ones who have the chance to regulate, the chance to create these laws and codes, and then there's how the minority communities, the communities affected, are responding to that. Of course, the role of private companies and the government was another thing to understand, and it's complicated, because a lot of them are based in the US, and there are different US laws, then different EU regulations, then different country-specific regulations, and it's interesting to pay attention to all of that. Again, thinking about Palestinians and their data: seriously, you have no privacy whatsoever, nothing. There are reports of women afraid of sleeping without their hijab because they know the cameras are directed at their houses. Sorry. No, go ahead.
I was going to ask exactly that: whether this cyber surveillance is pushing people to go offline, but even if they go offline, they're still being surveilled, right? And that's the new thing with the Blue Wolf technology, where soldiers are using smartphones, and the controversy was that they were getting rewards based on how many pictures of people they take and add to the database. So, yeah. And, you know, because you don't see physical harm, you don't immediately see somebody bleeding, a lot of people don't think of it as something you need to pay attention to. But the harm goes beyond the physical; it's about the quality of life, the insecurity people face. And that's another gendered element: if you're a woman and you don't even feel safe in your own house, right? You feel you have to wear the hijab because otherwise you're exposed. Yeah, and there's also an interesting gendered component in the way the private and the public get blurred. Feminists have argued for the longest time that that's not a clear line, right? But the way that line gets significantly and profoundly changed, and how that impacts mobility and, as you just said, everyday life, right? Yeah, and it definitely blurs the public-private dichotomy. At some point, in some areas, it doesn't really exist, because they're connected. Yeah. I have another question; it's more a personal question. How has this affected you in the online realm? Have you been attacked, or influenced, or impacted by this? Because I'm from Brazil, and I know researchers doing research on Bolsonaro get attacked on Twitter.
You know, doing work on Israel-Palestine in general is tricky for different people, right? I am privileged in the sense that I do have the Israeli passport, so supposedly I have the protections of an Israeli citizen. It's different for somebody who lives in the Palestinian territories without citizenship, under military law rather than civilian law; these things are regulated differently there. But as academics and scholars, you do get attacked, of course, for any critique of what's happening in Israel. The equation of critiquing Israel with anti-Semitism is also problematic, the silencing. I write op-eds, so I'm used to getting those emails. But it's part of the job, right? Yes, and along those lines: what do you find most challenging methodologically about doing this sort of research? I'm actually thinking about data collection itself. I'm thinking, okay, how can we actually collect data on that impact? Just thinking about that has been tricky. As I said, some organizations are doing that, but you can't really get enough from the Israeli government, right? You don't really know the extent of it. And with cyber technology, governments around the world want to keep it secret in a lot of ways; they don't want to expose what kind of cyber capabilities they have. And the problem with this is that once something is out, it's out: Stuxnet was exposed, and now anybody can access the code and use it, or, knowing that the technology exists and is possible, recreate it. That's another problem as well, and part of why the US hasn't pushed for regulating cyberspace: they're benefiting from the lack of regulation, right? So for me, it's the data element. What kind of data can you get? How transparent can governments be in sharing?
Even Facebook, with the dangerous organizations list, I think it's the Dangerous Individuals and Organizations list, it was leaked by The Intercept; they didn't share it with anyone. They're not really clear on the guidelines: how do they surveil, how do their codes work? I think the EU has now passed a law forcing these companies to be transparent about how their algorithms work and how they do these things. So it's that element. And also understanding all the technical aspects of this is a challenge, because it's multidisciplinary, interdisciplinary research, right? You know, if I can just jump in, I'll say that the companies often don't know how their own algorithms are working. We see this especially in the case of social media recommendation algorithms and the aim to better understand the spread of misinformation or disinformation, especially in the past couple of years. Twitter comes to mind specifically as an organization that is funding its own research into how its own algorithms work, because they don't have complete knowledge of that. And what they're finding, in what has been released so far, is that they are recognizing unintended consequences too. So that uncertainty, I think, is baked into the system. Yeah, absolutely. So there are lots of elements that are challenging about this. And again, since you don't get transparency, understanding the impact of these technologies, and the roles of the private sector and government in what's happening, is hard to assess. You rely on reports from human rights organizations that document complaints from people on the ground, rather than on what governments are saying or claiming to improve. So, yeah. I talked to somebody from Oxfam who does the IT side, and she was thinking about data protection.
She's like, I don't know how to do data protection from a feminist perspective, because she deals with the technical stuff: rules like don't connect to public Wi-Fi, these small measures to protect the data they collect. Think about it: human rights organizations collect a lot of data, and whenever it gets hacked, that exposes these vulnerable populations, and the organization itself, and it could be exploited by criminal groups, by non-state actors, by state actors. But she was saying that for NGOs operating in conflict zones, sometimes the only internet access you have is through public Wi-Fi. So what do you do in that context? I think that was an interesting thing to consider as well, not just from a theoretical perspective but in practice, right? It's tough. Yeah. Gosh, this is such a fascinating conversation. And when are these papers coming out, Anwar? Do you know yet? Alexis and I are working on an edited volume with multiple chapters that address different elements from a critical security perspective. We submitted the book proposal and we're waiting; everything is taking longer these days. But with the Hamla paper that I'm working on, they want it out this summer. I highly recommend you check out the organization. I think the name Hamla is harder for people who don't speak Arabic, so I'll look up the link and send it in the chat. But it should be out this summer, and I think it's important to look at their work as well. And hopefully we'll hear back about our book soon, because I'm excited about it. I think it will highlight recent issues. Actually, one of the main reasons I thought about the edited volume was to start the conversation, let it go, and expand the research on that topic.
If that happens, I think that's a success. Yeah, this is such an important conversation. And like you said, it's terrifying, because it's so difficult to know the extent of the problem, right? You don't even know what data is being held or collected or stored, let alone the significant lack of state and international law around regulation and transparency. I just want to thank you so much for sharing this research, and I can't wait to see how it unfolds. And when you do get published, you need to let all of us know so we can circulate your research more widely. Thank you so much, Anwar, for being first up and making this seminar brilliant; and Alexis, for your lovely commentary and thoughtful engagement with Anwar's work; and Lua, for your questions; and the audience, I know you're sending your positive vibes while listening in. I'm so grateful for this, because it's just so lovely to be in intellectual spaces where you feel intellectually nourished, you know? Because of COVID, I think we've had fewer and fewer of those, so I'm really excited that we have these spaces to engage, and it was lovely to learn more about your work, Anwar. So I will leave the final word to you, Anwar, and then we will get this recording up and circulate it around social media. Yeah, thank you very much. I really appreciate this conversation. As I said, this is a new area of research for me, and it helped me rethink a few things and also think of new projects to work on. So that's a plus, right? Yeah, and grant proposals, right? Yeah, absolutely. Great, well, thank you so much, everyone, for participating, and stay tuned through social media by following the FTGS Twitter handle and visiting the FTGS website for the next seminar. Have a lovely morning, afternoon, or evening, wherever you are in the world, and we'll see you again soon.