My name's Tim Pringle and I'm chairing the panel. We've got four speakers, and each of them will introduce themselves: Darren Byler, Nisha Kapoor, Rahima Mahmut and Maya Wang. Each speaker will speak for eight to ten minutes and then we'll open it up to the Q&A. So mostly a repeat of the previous session, except that we are going to finish at quarter past one, because even though we don't give you lunch, we're going to give you three quarters of an hour to reflect on the fact that you haven't got any lunch. So you can't say we're not generous. That's what we're going to aim for. Please do have contributions ready; the discussion was great in the last panel and we want to try and emulate that, but try to keep your contributions brief. Okay, so we're going to start off with Darren, and Darren, I think you're going to come over here because you've got a PowerPoint presentation. So thank you, and welcome to all the speakers.

It's a real honor to be here. I'm Darren Byler, an anthropologist at the University of Colorado Boulder; I was at the University of Washington, I almost said that. I just started a postdoc position there looking at infrastructure in China. I study surveillance systems and technology systems, both in China and as they move to other places. I also have another project, related to the Uyghur context, where I'm looking at forced labor. I'm not going to talk a lot about that in my presentation today, but if you want to have a discussion about forced labor in China, I'm happy to do that too.

Okay, so the title of my ten minutes is Technologies of For-Profit Colonialism and Making Lives Matter. In the summer of 2017, nearly all of the men in Shulpan Amar Ken's family were taken to reeducation camps. Aside from Shulpan's husband, it was just women who were left. Because her father-in-law was an imam in the local Kazakh community, her whole family was deemed unsafe or suspicious. In the months that followed, state workers tasked with carrying out the reeducation of Muslim minorities in Xinjiang entered her home on a regular basis to inspect the remaining members of Shulpan's family. They used a wide range of technological tools to do this. For instance, they scanned her family members' bodies and their belongings with handheld metal detectors, the kind an airport security worker might use. These devices were manufactured in Xinjiang by a technology firm called Dali, which had shipped these scanning devices across Xinjiang in mass quantities. A spokesperson for the company joked in a speech that only the bathrooms in Xinjiang were without surveillance. In fact, even the bathrooms in Uyghur and Kazakh homes were not safe from the scans. The police were looking for electronics: unreported smartphones, SD cards, hard drives, language-learning devices such as a Quran reciter, a pen that you can scan over the text and it will recite it to teach you the Quran. Many, many people had these in 2014 and 2015 when I was doing my research. These are the types of devices that they were looking for with their scanning equipment. According to the guidelines that the state issued to enforce the religious de-extremification policy, having five or more digital copies of unauthorized teachings resulted in a criminal charge of promoting terrorism and extremism. Possessing fewer than five could result instead in being labeled a pre-criminal in need of reeducation in the internment camps.
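To make the mechanics of that threshold concrete, here is a minimal sketch of how such a rule could be encoded. The function, field name, and labels are hypothetical illustrations for exposition, not taken from any actual policing system.

```python
# Hypothetical sketch of the "five or more digital copies" threshold
# described above; names and labels are illustrative assumptions only.

def classify_device_scan(unauthorized_copies: int) -> str:
    """Apply the de-extremification threshold to a device scan result."""
    if unauthorized_copies >= 5:
        # Five or more copies: treated as a criminal charge.
        return "criminal charge: promoting terrorism and extremism"
    if unauthorized_copies > 0:
        # Fewer than five: labeled a "pre-criminal" slated for reeducation.
        return "pre-criminal: reeducation in internment camp"
    return "no flag"

print(classify_device_scan(7))  # criminal charge
print(classify_device_scan(2))  # pre-criminal label
```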
The police installed cameras at the front gate of Shulpan's home and made visitors register their names and ID numbers. Shulpan said, quote, our home became like a government office. The neighbors began to avoid Shulpan's family. Shulpan said that on numerous visits, the police plugged her smartphone into a scanning device that looked like this, which could recover deleted data in a matter of seconds. Although the AI system learned something more about the patterns of her behavior each time they did this, she said it did not detect anything extremist. She said there was a reason for this: since it was a non-Chinese iPhone, they couldn't find anything. If it had been a Huawei phone, they could have. They asked me, why are you using this phone? They said I should have been patriotic and gotten a Chinese phone. I purposefully bought the iPhone from a Kazakh from Kazakhstan because I knew it was safer. Otherwise, I would have definitely been sent to the camp, because before I cleaned my phone, I had lots of religious content on it.

Shulpan was not alone in her fear of surveillance. Many of the dozens of others I interviewed as part of the research for my book project on reeducation technology said that they, or others they knew, were detained because of digital texts, audio clips, and videos that they had shared on their smartphones. To be clear, already in 2014, when the Chinese state declared the People's War on Terror and began to discuss the 75 signs of Islamic extremism, which Rachel mentioned in her remarks, it began to list digital files, WhatsApp, and VPNs as signs of suspicion. But initially, none of these new regulations were enforced. Now, with new surveillance tools, people's digital footprint from years prior could be used against them. So the kinds of activities they were doing on WeChat back in 2014 and 2015 were now detected and being used against them. The surveillance system was a product of so-called preventative policing, which drew on theories associated with the British Prevent program, mentioned in the previous panel. Prevent, or CVE, countering violent extremism, relies on the theory that religious ideology or piety leads to violence. In Euro-America, it justifies cameras in mosques, teachers reporting on politically active Muslim students, watch lists, no-fly lists, all of those things. In China it took on Chinese characteristics that drew on the Maoist past and on counterinsurgency theory as practiced in Iraq and Palestine.

Like people throughout the region, Shulpan had her irises, face, and fingerprints scanned and her unique voice signature recorded. They took her blood and a DNA sample. She said, the village government leader told us openly that those who refused would be taken to a reeducation camp. This is an image of a Uyghur person who went through the scanning, recreating what that looked like. The system is supported by a really unprecedented data set of face scans, iris scans, DNA, all of that. The biometric data that was added to Shulpan's citizenship file, as part of a new smart ID card and checkpoint system, enabled the authorities to track her movement over space. Once this system was fully implemented, by the end of 2017, it became impossible for her to enter a bank or shopping mall without having her face scanned and matched to the image on her ID at the fixed checkpoints at the entrance of every store. Shulpan said, on average, over the span of a single day, I had my ID scanned more than 10 times.
This is a map, drawn from internal police documents from Urumqi, the capital of the region, showing the frequency of scanning and the data being collected. This is done at a very large scale. Shulpan said that she also began to change her habits. At the advice of a police officer, Shulpan and her husband started going to dance parties and drinking in order to show that they were not religious. Once, on their way home from a party, the police followed them and pulled them over. They asked both her and her husband to use a breathalyzer, a device similar to this one. Even though she wasn't driving, they tested her as well. When they found that her husband was not drunk, they asked him why he had not been drinking. He replied, I didn't drink because I had to drive, which they grudgingly accepted, saying that was a legitimate reason not to drink. They were pleased, however, to find that Shulpan's blood alcohol level was elevated. They didn't say anything and just let them go. Looking back at it now, Shulpan said, we had to perform the way they wanted us to perform. If they said drink, we drank.

Many Turkic Muslims I have interviewed over the past two years have said that reeducation technologies, and the terror they feel because of the treatment their relatives have received in detention, have made them live their lives differently. They had always been very careful not to do anything wrong, but now they were intentional about posting content on their WeChat walls that reflected the talking points of the reeducation campaign. A lot of this centers on something called demonstrating positive energy. They posted things in Chinese rather than Kazakh or Uyghur. They smiled at Han people and served them first. They smiled and said thank you, good, okay, yes, yes, yes: hao, hao, xing, xing. They said that was the way they responded to everything. They did whatever they were assigned to do. They embraced post-truth as the new reality. It became normal.

So what are the implications of all of this? Technology has been central in the modern history of systems of control, ranging from barbed wire and automatic weapons in North American internment camps to the passbooks and checkpoints of apartheid South Africa. The technology used in Chinese projects to contain and transform Uyghur populations takes these systems of control in new techno-political directions. As Saskia Sassen notes, following 9/11 the built environment itself has become a technology of war. In a world of counterinsurgency and CVE, the pursuit of total knowledge of civilian populations, through systematic assessment of their behavior and control of vital infrastructures, is becoming the norm. For stateless populations, it is producing a new form of for-profit colonization, an intimate form of material and epistemic violence that seeks to eliminate and replace indigenous sociality. This economic and political formation, which I name terror capitalism, justifies the dispossession and exploitation of Muslim populations by defining them as potential terrorists or security threats. It attempts to generate profits in three interconnected ways. First, lucrative state contracts are given to private and state-owned corporations to build and deploy policing technologies that surveil and manage targeted groups.
Then, using the vast amounts of biometric and social media data extracted from those groups, the private companies improve their technologies and sell retail versions of them to other states and institutions such as schools and corporations. Finally, all of this turns targeted groups into a ready source of cheap labor, either through direct coercion or indirectly through the stigmas associated with their surveillance status. Already, similar systems are being built in India, Palestine, on the southern border of the US, and in Hong Kong, where the slogan you saw everywhere when I was there in January was: today Xinjiang, tomorrow Hong Kong. These examples show us that the surveillance of marginalized people will likely get worse before it gets better. They demand a re-articulation of what the Taiwanese-American scholar Shu-mei Shih describes as a minor transnational politics. To confront terror capitalism, a new social movement that links targeted communities to tech workers of conscience and builds legal frameworks and decolonial platforms is necessary. As Judith Butler would argue, this is what is necessary to truly make Shulpan's story count. Of course, her story does count, but it's not always recognized, it's not always heard, and the loss of her family members is not always treated as grievable. And so I'll leave it there. This is a provocation, a way of thinking forward: how do we organize to resist this collectively? Because without that, we're left in a very difficult position, and the marginalized people around us are going to be susceptible to really great harms. Thank you very much.

Rahima, I'll leave it to you to introduce yourself.

Yes, thank you. Thank you very much. It's an honor to be here to talk about this situation as a Uyghur myself. I would like to talk through my own personal experience and give you a bit more of the inside story of what is happening at the moment; I'm sure the previous speakers have already mentioned quite a lot. I was born in the city of Ghulja, in what was East Turkistan, though the Chinese call it Xinjiang. I was brought up in a large religious family. The discrimination and persecution of the Uyghur people has a long history. As a Uyghur, I experienced frequent discrimination from my childhood until I left my homeland in 2000, and witnessed brutal crackdowns on moderate dissenting voices, especially the Ghulja massacre in 1997. Since coming to the UK in 2000, I have not been able to return home because of my involvement in activism against the Chinese government's human rights violations against my people. The last time I spoke to my brother was in January 2017, and he told me in a shaking voice: please leave us to God's hand, and we will leave you to God's hand too.

My work as an interpreter and translator brings me first-hand information on those who have suffered in the notorious 21st-century internment camps, as well as the heart-wrenching accounts of mothers and fathers who have lost their children, young and old. Every Uyghur family has a similar story, one more horrible than the next, as a result of the brutal ethnic cleansing and cultural genocide that has been taking place since 2017, behind the closed eyes of the international community. The most painful part of the job is that I cannot offer those in pain and suffering any words of comfort and hope.
I worked as a consultant and translator for the documentary Undercover: Inside China's Digital Gulag by Robin Barnwell, which was broadcast on ITV in July 2019. The information in it was gained both through interviews with those who had experienced and escaped the camps and through an undercover operative within the Uyghur region. If you haven't watched this documentary, I recommend it. There is one part I believe is the most extraordinary: an account by a Chinese official, from a conversation between the undercover reporter and the official in Urumqi. This is what he said: The authorities say it is a special situation here. What is happening is excessive and too extreme. The police look at Uyghurs with suspicion. If a Uyghur refuses to be checked or asks why, they just lock them up. There is no procedure. The undercover reporter asked, do Uyghurs feel their human rights are being violated? They don't have human rights. It is not about violation. They just don't have human rights.

We also interviewed a Uyghur IT expert who worked for a state-controlled high-tech surveillance company. The extensive interview record is with me, and I can share some of what he said about the surveillance situation. He said everyone was fearful, whether those already taken inside or those waiting to be taken in. Our relatives, under the influence of state terror, dare not even greet one another openly without fear. It is unbearable to describe life there. A knock on the door from anyone would bring extreme anxiety and fright. On my last visit I spoke to the expert again, and he told me that the last time he spoke to his mother, he learned that his neighbor's daughter had hanged herself on the very day she was released from the camp.

I have also worked as a translator for the BBC documentary China: A New World Order, and translating the interview records was heartbreaking. I would like to share just one mother's story. This is what she said: My mother-in-law was looking after my children. My sister-in-law could not get hold of my mother-in-law for a few days, so she went to visit. When my sister-in-law got there, the children told her, our grandmother was taken away by police three days ago. My mother-in-law is 72 years old. I could not believe it. She is old and in poor health. She started sobbing. Three days after my mother-in-law was arrested, my children were taken to an orphanage. My heart was just torn apart. I never thought that this kind of disaster could happen to me. And this is just one example. This lady came to Turkey in 2016 because her father was ill, and when she wanted to return, that was the time the mass arrests started, so she was trapped and couldn't go back. Her mother-in-law, a 72-year-old woman, was looking after her children; the police arrested her, and three days later all the children were taken away. Up until now, she doesn't know what has happened to her children.

The most difficult part of the interpreting job is interviewing the people who have suffered extreme torture, especially rape. Only two weeks ago, I went to Germany to interview a lady we call Rokia, and we spent five hours. I had to ask the researcher to leave the room because I couldn't translate; it was impossible for me to translate the details of the scene and how it happened. It was just too much.
And she said that in prison and in the camps, 99 percent of women are actually experiencing rape, but they wouldn't say so because they feel too ashamed to say it. Therefore, she said, I have to tell the world, because this is an extreme situation and the world should know what happened. I think I should leave my speech here and take any questions. Thank you very much.

Thank you, Rahima. So now we'll ask Nisha to give her presentation.

Thank you. Thanks to the organizers for inviting me. I'm Nisha Kapoor, an associate professor at the University of Warwick in the Sociology Department. My research has broadly focused on the war on terror in the UK and the US, looking at extreme cases, in the sense of state disciplinary measures, of individuals who have been impacted: individuals who have been subjected to citizenship deprivation, passport removals, extradition, deportation, and so on. I had a book a couple of years ago with Verso called Deport, Deprive, Extradite, which is part of a broader project documenting these cases but also thinking about what they illuminate about the authoritarian practices being enhanced through the security state in the name of the war on terror. Subsequent to that, I've been increasingly looking at surveillance and surveillance capitalism. My knowledge of the Chinese context is very tentative, so it's been really great to be part of this, because it is something I've been thinking more about. But my comments today are focused more outside China: on the trade links and the competition between the US and China, and the discourse of human rights that gets played out within this competitive surveillance-capital context, with the UK being a really important trading market for the surveillance technologies of both states.

So I'll begin by saying something about some of the artificial intelligence and surveillance technologies that have been cultivated by Google through the war on terror, framed or justified in terms of trying to make the internet a safer place. The wealth of artificial intelligence being generated en masse by companies such as Google, who surveil our online behavior to create what Shoshana Zuboff has termed prediction products that anticipate what we will do now, soon, and later, is not simply rooted in the desire to make us better consumers. It is a project intricately embedded in the US military objective of conducting information war, for which it was realized in the late 1980s and early 1990s that the internet and social media would be essential. Nafeez Ahmed, an investigative journalist, has laid out in great detail the intricate connections between the US military and security agencies and Google, not least through the circulation of personnel between these two camps and the creation of networks between governments and big tech for enabling this. And although there is much to comment on and reflect on in this relationship, I just want to stress that surveillance capitalism is deeply tied to the securitization agenda of the war on terror, where surveillance techniques and technologies are being cultivated through a focus, as we've heard this morning, on Muslim suspect communities.
We see this partnership between the military, the security services, and Google come to the fore in multiple ways, but one particularly interesting way is through Google's Jigsaw project, of which Yasmin Green is the Research and Development Director. At the end of last year, as many of you might have followed, Vogue fronted a cover of three Muslim women who have played leading management roles in the counterterrorism industry, with really glamorizing pictures, glamorizing the counterterrorism industry if you like, playing on the role of feminism in Western counterterrorism agendas and on ideas around human rights and freedom of speech: a sort of imperialist feminist reinvigoration. Yasmin Green was one of these women. As Research and Development Director of Jigsaw, she has overseen projects such as the creation of the animated character Abdullah X, who fronts a YouTube series voiced by a former extremist, actually a British Pakistani I think, that explores themes of young Muslim identity in society and aims to steer young minds away from extremism. There is also another project called the Redirect Method, which Jigsaw states is focused on reaching those who are actively looking for extremist content and connections, and which uses pre-existing YouTube content and targeted advertising to direct young people off the path to extremism.

So what you have through these projects, the animated YouTube series and the Redirect Method, is the profiling of individuals online and the manipulation of search engines to redirect them, to steer what content gets seen. The kinds of censorship or redirection that we usually hear about in the Chinese context are being used in certain ways through projects like the Redirect Method. For example, in her TED talk Yasmin Green talks about how, if you search "how can I join ISIS", you'll be redirected to other information: academic articles, more critical articles about what ISIS is and does, or articles that work in the interest of Western agendas. That's unlike the results you get if you search "how can I join the IDF" or "how can I join the RSS"; there are no such sanctions on those searches. But the discourse or rhetoric narrated around this project is about keeping people safe, about dealing with extremism, deradicalization, and freedom of speech. Yasmin Green, along with others such as Sara Khan and Nikita Malik, has been celebrated for glamorizing the counterterrorism industry, for drawing on liberal feminist notions of human rights discourses and freedom of speech to justify the surveillance, suppression, and criminalization of suspect or vulnerable Muslim youth. This kind of imperialist feminism has not hurt Google's reputation; on the contrary, it has been celebrated as one way in which Google can play a role in progressive securitization measures. Google has certainly not received the same kind of international scrutiny for human rights abuses in that context.
It's taken the collaboration with the far-right Trump regime and its role in China for Google to come under critique from its own staff members, for participation in military programs and for its role in developing the Dragonfly project. Google was providing AI assistance as part of something called Project Maven, a drone program for the Pentagon; as I mentioned, Google's inception has been very much closely tied to military agendas in the US. But following protests from its staff opposing the deal and calling on Google to write a new charter to block work on weaponized AI, Google has subsequently and openly ended its contract with the Pentagon, and it was also stated last year that it has officially suspended its work on the Dragonfly project, a censoring search engine developed for the Chinese market.

So what I'm really trying to get at, and this is just something I'm starting to think about, so it's not as coherently formed as I'd like it to be, but I guess that's for discussion, is to problematize the way in which human rights in relation to China get invoked in the West. On the one hand, we have the far right invoking human rights, for nationalist and trade reasons, as we heard at the end of the last panel, as a legitimation for not working with China or for seeing China as a security threat. And we also have liberal-left factions invoking a human rights argument in relation to surveillance technologies, not necessarily because the surveillance technologies themselves are a problem, but because of which corporations are actually doing the work. So you have Amnesty, for example, saying that if we're going to employ certain surveillance technologies in the UK, we need to think about the source of those technologies, which companies are providing them, rather than thinking about the problem of the technologies themselves, which is really my point.

Against this ongoing trade war between the US and China, it has generally been in Google's interest to maintain a collaborative relationship with Chinese big tech and surveillance corporations. In July 2019, the Intercept reported that the OpenPOWER Foundation, a non-profit organization founded by Google and IBM, set up a collaboration between IBM, the Chinese company Semptian, and the US chip manufacturer Xilinx. Together they have worked to advance a breed of microprocessors that enable computers to analyze vast amounts of data more efficiently. So Chinese and US companies are working together on this. According to the report by the investigative journalist Ryan Gallagher, Semptian, which is a Shenzhen-based company, is using these microprocessors to enhance the capabilities of the internet surveillance and censorship technology it provides to security agencies in China. A company employee said that its technology is being used to covertly monitor the internet activity of 200 million people. Through a subsidiary called iNext, it is selling internet surveillance and censorship tools to governments, and it is working with US companies in order to develop technologies that enhance its ability to do this. Google's mission to expand its surveillance empire to China is constrained by these two factions.
On the one hand, you have Trump's alt-right factions, who are invoking nationalist trade interests and using human rights arguments to condemn Google for its interactions with Chinese companies. And you also have the protests, as we've heard, by Google's employees, for different reasons, but also invoking similar human rights arguments. What you have then, for my purposes, is this coming to the fore in the UK, where we've heard a lot recently about Britain's contracts with Huawei and the extent to which Huawei would be allowed access to Britain's 5G network and telecommunications. But beyond telecommunications, the UK is an attractive prospect for any company working in the security industry, because it's one of the most surveilled countries in the world, with up to an estimated 6 million cameras, one for every 11 people, throughout its towns and cities. So it's a really ideal market. And in terms of artificial intelligence technology, where China is leading the world, the UK is currently sourcing products mostly from Hikvision, from NEC, which is a Japanese company, and from Palantir, the US rival.

A recent report by Steven Feldstein shows that Chinese technology companies, particularly Huawei, Dahua, and ZTE, supply AI surveillance technology in 63 countries, 36 of which have signed onto China's Belt and Road Initiative. Huawei alone is responsible for providing AI surveillance technology to at least 50 countries worldwide; no other company comes close. In contrast, AI surveillance technology supplied by US firms is present in 32 countries. So we see that the trade war, the security reasons, the security risks currently being invoked by the US, are really underpinned by this difference in market access. The most significant US companies for AI surveillance technology are IBM, Palantir, and Cisco.

What's interesting in terms of how this is playing out in the UK is that, on the one hand, you have Palantir, headed by Peter Thiel, the big Trump-supporting venture capitalist, who has not been able to crack the Chinese market in the way that he would have liked, but who last year controversially and quietly secured a 28 million pound contract from the Ministry of Defence, taking the total value of UK government deals won by the company to at least 39 million pounds. Palantir has become a lightning rod for concerns in Silicon Valley, but has maintained a comparatively low profile in the UK, even though its UK office is now its largest. It has taken responsibility for a number of military contracts, but it is also working in welfare. The AI technology now being used, as part of what amounts to a social credit system in UK welfare, to administer benefit claims is run on Palantir technologies. Some 140 councils out of 408 are using Palantir software, at a cost running into millions of pounds, as a result of the cuts and austerity measures that have been delivered in the UK, which mean there are fewer people working to deliver the benefits system.
And on the other hand, you have Hikvision, whose camera and AI technologies have been developed through the SkyNet program, tested and developed in Xinjiang. Hikvision cameras are used prolifically in the UK: across the London Underground, across the parliamentary estate, across London boroughs and boroughs throughout the UK, in public spaces, by private companies in shopping malls and markets, in schools, in hospitals, in universities. I think something like 1.2 million Hikvision cameras are reported to be in use in the UK, with all the possibilities that that allows for. I'm running out of time and I don't have a nice conclusion, but it's really to ask, to provoke us to think a bit more critically about the way in which human rights arguments get invoked in contexts outside of China, and about what that means in relation to the market politics between different corporations who are fighting for market access for their own profit reasons. Thank you.

Thank you, Nisha. So now, hopefully, we should have Maya Wang from the States on Skype. Maya, can you hear us? Yes, great. Can you hear me? We can, loud and clear. So over to you. Can you introduce yourself, Maya, please?

Yes, I'm the senior researcher on China at Human Rights Watch. I'm actually based in Hong Kong. Thank you for inviting me; unfortunately I'm not able to be there, so sorry if I repeat some of the things people have said earlier. I think my presentation sits somewhere between Darren and Rahima on the one hand and Nisha on the other. I'll describe a little bit of China's mass surveillance systems. I've studied China mostly over the last decade, with a focus on surveillance in Xinjiang in the last couple of years. We have observed the Chinese government implementing mass surveillance systems as far back as the year 2000, nearly 20 years ago, and we have seen it develop as a multi-layered, multi-dimensional program. At the very base of the system is the requirement that everybody has to have an ID number, and the requirement that people use their real names in accessing many kinds of services, including when they travel on long-distance buses, when they get a SIM card for a phone, or when they get broadband internet. At the next layer is the use of biometrics and artificial intelligence: the collection, as Darren also mentioned, of facial images, voice samples, iris scans, and DNA. This collection is done en masse and often without being tied to any criminal record or criminal suspicion. In Xinjiang, for example, these biometrics are being taken from children as young as 12 years old as a requirement. In addition, there is the collection of phone-identifying information: the IMEI numbers and MAC addresses of people's phones have been collected and put into massive databases run by the police. And what is interesting in Xinjiang in particular, but also across China, is the use of multiple authentication techniques.
So as you go through your life in Xinjiang, it's not just facial recognition identifying you. When you walk through checkpoints, you present your ID and you go through facial recognition, but some of the checkpoints are also equipped with these things called data doors, which look a bit like an airport security gate you walk through. Unbeknownst to the person walking through, the data doors are also pulling the IMEI number and MAC address from your phone to identify who you are. And if there is any irregularity between your phone, your face, and your ID, these data doors send alerts to the authorities to flag you as someone who is somehow problematic. In the past, if you look at surveillance by governments or by companies, you had some ways to circumvent it; we have worked with human rights defenders over time and taught them how to secure their devices, right, use a VPN or whatever to avoid device-level surveillance. Now this kind of surveillance is getting even more intrusive; it is moving to the physical, biometric level, and in multiple ways it's very difficult to evade. Let's say we developed some kind of mask to get around facial recognition someday; in fact, there are already the beginnings of such masks. But if you walk through the data door, it's going to be very difficult to defeat three different types of recognition altogether to avoid detection and tracking by the state.

At the top level, after biometrics and the use of artificial intelligence, is the use of big data programs. Again, you see that in Xinjiang with the Integrated Joint Operations Platform, IJOP, a big data system that Human Rights Watch published a report about last year, where we discovered that the authorities are using the system to receive information from multiple sensor systems across the region. By sensor systems I mean mostly the data doors I was talking about, but also facial recognition systems and CCTV surveillance cameras installed in public places, all contributing information to this big data system. The IJOP also gets information from government officials who carry phone apps that they use to go around and collect data from people in the region. And it flags any kind of irregularity, as defined by the system, by the engineers of the system who decide, for example, that using too much electricity is a sign of extremism: if people are using a device that doesn't belong to them, if they are driving a car that doesn't belong to them, if they use too much electricity, if they enter their house through the back door instead of the front door, if they donate too much money to the mosque. The system then alerts government officials to go and interrogate the Turkic Muslims involved, ask them specific questions, and vet them; some of them are sent to political education camps. The reason I want to talk about these different layers is that I think, when we consider the diffusion or use of surveillance in Xinjiang, in China, and across the world, there are different kinds of mass surveillance we are talking about.
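To make the logic just described concrete, here is a minimal sketch of a data-door style cross-check combined with a few IJOP-style irregularity rules. Every field name, rule, and threshold is a hypothetical illustration; the internals of the real systems are not public.

```python
# Hypothetical sketch of the checkpoint cross-check and rule-based
# flagging described above. All names, rules, and thresholds are
# illustrative assumptions, not details of the actual IJOP system.

from dataclasses import dataclass

@dataclass
class RegisteredIdentity:
    id_number: str      # national ID presented at the checkpoint
    face_id: str        # identity returned by facial recognition
    phone_imei: str     # IMEI registered to this person
    phone_mac: str      # MAC address registered to this person

def data_door_check(reg: RegisteredIdentity, seen_face: str,
                    seen_imei: str, seen_mac: str) -> list[str]:
    """Compare sensor readings against the registered identity and
    return a list of irregularities to report to the authorities."""
    alerts = []
    if seen_face != reg.face_id:
        alerts.append("face does not match presented ID")
    if seen_imei != reg.phone_imei:
        alerts.append("phone IMEI not registered to this person")
    if seen_mac != reg.phone_mac:
        alerts.append("phone MAC not registered to this person")
    return alerts

def ijop_style_rules(monthly_kwh: float, used_back_door: bool,
                     mosque_donation: float) -> list[str]:
    """A few of the 'irregularity' rules mentioned above, with
    made-up thresholds standing in for ones chosen by engineers."""
    alerts = []
    if monthly_kwh > 400:        # "too much electricity"
        alerts.append("abnormal electricity use")
    if used_back_door:
        alerts.append("entered house through the back door")
    if mosque_donation > 100:    # "donated too much to the mosque"
        alerts.append("excessive mosque donation")
    return alerts

# Example: the phone carried does not match the registered one.
reg = RegisteredIdentity("650100...", "face-123",
                         "356938035643809", "AA:BB:CC:DD:EE:FF")
print(data_door_check(reg, "face-123",
                      "356938035643809", "11:22:33:44:55:66"))
# -> ['phone MAC not registered to this person']
```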
If you compare Xinjiang with the rest of China, you see that the rest of China is using similar layers of surveillance, very similar ideas, very similar infrastructure, except that Xinjiang is much more intrusive and visible. Elsewhere in China you have similar layers, but at the same time there has been some, and growing, resistance against this kind of surveillance from the population. Last year, parents were very uncomfortable about brainwave-detecting headbands that supposedly measure students' level of concentration in schools; those devices, used in a couple of schools, were actually stopped by the relevant educational authorities, though I can't remember which province. You also have a law professor suing a private zoo in China over the use of facial recognition on visitors entering the zoo. And when Beijing announced in 2019 that it was going to use facial recognition to sort people into different buckets for security vetting in the subway, so that depending on your facial recognition results you would be assigned a different risk level for security checks, that announcement created a lot of debate in Beijing and beyond about the use of surveillance technologies. So there was some resistance in China, some good discussion about these technologies. But this kind of resistance has been pretty much eliminated during the coronavirus outbreak.

And it's interesting to me that now, with the virus outbreak, the government is arguing that because it's an emergency, we must use these kinds of technologies. We are seeing, for example, the development of this system called the health code, where people are divided into red, yellow, and green buckets. Red means you have to be quarantined for 14 days because of the virus; green means you can go anywhere; and yellow is somewhere in between, seven days of quarantine. (A minimal sketch of this color logic follows below.) And in China, quarantine conditions are pretty harsh and rough. The health code makes this determination based on an algorithm that is unclear: precisely how people end up red versus green, and how from one day to the next people can go from green to yellow, is a big question. But importantly, the health code app is a little bit like Xinjiang, though not entirely comparable, in that it depends on intrusive surveillance. The app generates the color code depending on where you have been and who you have been in touch with, and it also sends your real-time location data to the police, unbeknownst to you. And it implements a kind of movement restriction that is in some ways also similar to Xinjiang. So you can see that in China these levels of surveillance increasingly resemble what is happening in Xinjiang.

And then you have outside of China: in the Philippines, in Zimbabwe, in Kyrgyzstan, you see the diffusion of similar technologies. I think, Nisha, you were talking about US, Japanese, and Chinese companies all selling these kinds of technologies.
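As flagged above, here is a minimal sketch of the red, yellow, and green assignment just described. The real app's algorithm is opaque; the inputs, rules, and the police-reporting note below are assumptions for illustration only.

```python
# Hypothetical sketch of the health code color logic described above.
# Inputs and rules are illustrative assumptions; the real algorithm
# is not public.

from dataclasses import dataclass

QUARANTINE_DAYS = {"red": 14, "yellow": 7, "green": 0}

@dataclass
class MovementHistory:
    contact_with_confirmed_case: bool
    visited_outbreak_area: bool  # e.g., travel through a locked-down city

def assign_health_code(history: MovementHistory) -> str:
    """Assign a color bucket from location and contact history."""
    if history.contact_with_confirmed_case:
        return "red"      # 14 days of quarantine
    if history.visited_outbreak_area:
        return "yellow"   # 7 days of quarantine
    return "green"        # free movement

code = assign_health_code(MovementHistory(False, True))
print(code, "->", QUARANTINE_DAYS[code], "days of quarantine")
# Note: as described above, the real app also reports real-time
# location to the police without the user's knowledge.
```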
What we see for Chinese technologies right now, though, is more facial recognition systems being sold to these other countries, and not so much these multi-layered, multi-dimensional systems being present there. I call the technologies in other countries more off-the-shelf facial recognition systems, because facial recognition is neat and packaged; you can just sell it. But if you look at the diffusion of internet censorship, the Golden Shield project, sorry, the Great Firewall and internet censorship, from China to the rest of the world, it took a decade or two for that China model to diffuse to different parts of the world. And it may be that over time we will see the Xinjiang model spread beyond Xinjiang, and beyond China as well. I am not entirely pessimistic about that, and I can explain why.

But I would also respond to what Nisha was saying about how the use of surveillance technologies is not just a uniquely Chinese or Xinjiang problem. You're absolutely right. I think Darren was mentioning how at the US border you see the mass collection of DNA from migrants, and the use of facial recognition in mosques. In the Netherlands, a court recently overturned the algorithmic, automated decision-making processes used to surveil welfare recipients. The use of artificial intelligence and automated decision-making is increasingly, across the world, being trained on marginalized peoples and increasing the power of the state, to such an extent that I worry it will, over time, significantly erode human rights and democracies around the world. What we need, as an advocacy organization like Human Rights Watch sees it, is international standards that rein in the use of these technologies, some kind of human rights standards that speak to their use worldwide. And you can see why that is important: in China, there is almost no law or regulation governing the collection of private data and personal information that circumscribes the surveillance power of the state, such that essentially the government can do whatever it wants. And in the places where it can do most of what it wants, i.e. Xinjiang, the intrusiveness is the greatest. Whereas in places like Hong Kong, which Darren also mentioned, the fact that we have privacy laws, courtesy of British colonialism, did protect Hong Kong from the worst types of surveillance that you actually see in China. Hong Kong is actually still leading the region in terms of the protection of privacy rights, despite being increasingly under threat under Chinese rule. So I would just end there and say: a lot of these systems around the world, when we're talking about facial recognition rather than the Xinjiang or China multi-dimensional type of surveillance, are actually quite new. They were put in place around 2019, and there is a possibility of rolling them back if we actually stigmatize their use around the world. So thank you, and I'll end there.
Now, in the fine tradition of abusing the position of the chair, I'll ask a few quick questions of my own, four questions, hopefully to give you time to formulate even better ones.

First, to Darren. Fantastic presentation, really detailed stuff. Two questions, really. One is a little bit about method: how do you do it? How can you do this research in Xinjiang? And the second is more about the implications: based on your knowledge of the technologies, what will the implications of what's happening technologically around surveillance be for future research, either by you or by others following the trail you're blazing, or Adrian Zenz or others?

To Rahima, after that profoundly moving presentation: it struck me, the contrast between the high-tech horror that Darren talked about and the very physical horror that you talked about. It made me think of the very worst forms of colonialism: of stolen children, of language repression, of rape, of Hannah Arendt's, you could invoke Hannah Arendt's, right to have rights, by quoting that policeman. And it made me think of Mamdani's research, where he also invokes Hannah Arendt's work, looking at the Congo. My question would be: how do you see the way to combat this? Is it simply, though simply is not the word, a question of confronting what is essentially colonial, if not colonialism then a colonial narrative? Or is the other stuff, around surveillance technology, equally important? Does that question make sense? Thanks.

Nisha, again, a great presentation. My question, following on really from what Maya talked about towards the end of her talk, is: does it matter who owns the technology? Does it matter if it's sourced from a company operating, and this doesn't make a lot of sense in terms of globalization, in a state that is restricted, at least to a certain extent, by liberal democratic checks and balances that we don't see in more authoritarian states? Or does that matter less and less now, which seems to be the way your talk was going?

And thank you, Maya, for your very rich, comprehensive talk. My question is on something you maybe didn't mention, but that I know Human Rights Watch has some data on. You noted in one of your reports, I forget which one, maybe the 2019 one, that satellite images suggest some of the detainment camps in Xinjiang are being closed, and that about 80,000 detainees, maybe Darren can speak to this, are being shipped off, in various forms of what you call coercive labor, to factories in China, maybe to do work that Han workers aren't yet willing to do because of the virus. And also the transfer of other detainees, shall we say, to the larger prison system in Xinjiang. I wonder if you could speak to that a little. So if we go in that order, maybe starting with Darren, and briefly, then we can open it up to the room.

Sure, great questions. So, how did I do this research? Carefully. I've been going to this region for 10 years now as a researcher, and because of that I have lots and lots of connections, a large social network, and I've seen changes over time.
The last time I was there was 2018, when already hundreds of thousands of people, ultimately over a million, had been sent to camps. Lots of people had disappeared; there were checkpoints everywhere. So I couldn't really interview my friends. I didn't contact them, as some of them were in camps, so it was impossible. And I was being watched by the camera system, so I was very careful. It was an observational, ethnographic trip. I went through checkpoints and observed how people went through them. I was detained several times, and that's informative too. At checkpoints I don't speak Uyghur, I just speak Chinese, so they don't know that I understand what's being said in Uyghur, and you can really pick up on what's happening in those spaces and see the mechanics of the system, how it works, the forms of terror as they move from body to body in those spaces.

But the richest ethnographic detail I've gotten has been through going to Kazakhstan, and also Turkey. I haven't been to Turkey myself, but I've heard from people who have gone; there are people there you can speak to who have come out recently. And in Kazakhstan there are lots of Kazakhs from China who have come across the border over the last three years, and they can speak to what the system was like and how they lived with it. I was just there for about a week on this last trip, and over that week I interviewed 40 people. The call goes out that the researcher is in town, and people show up to be interviewed; there's a waiting list, people wait all day to speak to you. So there are lots of people to speak to and lots of opportunities for researchers to do this kind of work. It really does advantage you if you have a background, if you understand what the system is like, so you can ask informed questions and fill out the data set that way.

What does it mean for future research? That's a good question. There are people in the diaspora you can talk to, that's one thing, but you can also use the master's tools. Because all of this is online, it's digital; if you have friends who know how to obtain materials, as Maya and others do, you can get data very quickly. We can assess things using satellite imagery, we can use night-light imagery to see whether these camps are actually being closed or not, and we can get the internal police reports, 40,000 of which I'm working through right now, that show in very specific detail what's happening on the ground, the capacities of the system, and what its limits are. So there's knowledge we can gain. It's not enough on its own to push back against the system; we need the global stigmatization of these systems to actually get that motivated and going. But it's one thing that can be done.

Thank you, that's quite an important question, and a difficult one to answer directly: how to combat this evil. Every morning when I wake up, I think about that question, and whether there is any way the situation can change. What motivates me is that the more truth comes out, the better: the real human stories, the human suffering, and especially more women's stories, because I believe approximately 27 percent of those in the camps now are women. I translated the book The Land Drenched in Tears, the memoir of a Tatar woman, Söyüngül Chanisheff, about her imprisonment before the Cultural Revolution and her life during it.
When I compare what is happening now with what happened at that time, we can say it's maybe 100 times worse. The last time I spoke to Söyüngül Chanisheff, she said: at least I had freedom even in that little cell, because there wasn't a camera watching me, and I sometimes had the opportunity to speak to the people in the next cell by pretending to read the newspaper aloud. At the moment that is not the case; even in your own home, you don't have that freedom. So exposing this matters, and I'm also very glad about the documents leaked in November, twice, and the more recent documents. When the truth is exposed, more people can join the campaign. And maybe, I am hopeful, though I'm not an expert in legal procedures, I wonder whether there will be international action. Also whether the Chinese people can take any action after all these truths are revealed, because a lot of the time people don't believe what is happening; but now I think more and more people are believing it. And then there are all these most terrifying stories being told: brave people like Sayragul, like Rukiya, like Jelilova have been coming forward to tell of their experiences, and I think that is very powerful. For any organization or country to take action, first of all, they must learn the truth. So I think that is quite, quite powerful. Yeah.

So, does it matter who owns the technology? No. I mean, there are a number of points here. I think the problem with the idea that artificial intelligence or surveillance technologies are used with more checks and balances in liberal democracies than in authoritarian states is that it fails to recognize that liberal democracies are liberal on paper; as I think Natasha was saying in the earlier panel, they're doing what they can, they're sort of envious of the authoritarian states for being able to use surveillance technologies in a more efficient way. So it's always a kind of dance to try and impose some of these surveillance technologies within a framework that alludes to something democratic. But the very premise of the surveillance technologies is in its very essence contradictory to any kind of genuine democratic settlement. I think even, as Darren was saying about the use of surveillance technologies on the border in the US, and equally we can think about the UK, these technologies are used against Muslim communities. So even the Cambridge Analytica scandal, or, as I was mentioning, the stuff that Google does, doesn't arouse public upset, because it's happening to Muslims, it's not happening en masse. And even when we had the debate around privacy, when it was revealed how much communications data the state does have access to and monitors, it was a debate about what surveillance is necessary, surveillance for all, with a range of different responses: from, well, if you've got nothing to hide, then you don't need to worry, to, protect most of us and just surveil the terrorists, not everybody else.
And so, yeah, I think we always have to think about what's happening within states as well as the broader global capitalist dynamics around trade, and to think critically about disingenuous invocations of human rights that often get made for other reasons.

Thanks, Nisha. Maya?

I would like to respond to that question as well. As someone, and as an organization, that is deep in the trenches of fighting against surveillance technologies globally, I actually think it makes a major difference. Just as an example: we can sit in London talking about surveillance technologies and not fear that the police will barge in and take every one of us to a cell. That is a testimony to why civil society, a free press, and independent courts matter in having debates and raising awareness, which is something we can't do in Xinjiang; we couldn't have this panel in Xinjiang. And when we write to companies about surveillance technologies, we usually do get responses from companies that are based in democracies. This is not to underplay the enormity of the challenge of holding companies like Google or Facebook, which are really big, very powerful companies, accountable. But the fact is that we have very, very little leverage over companies like iFlytek, Megvii, or SenseTime, which are based in China. They can also wage many different battles against those of us who try to expose these abuses, raising the personal costs, even imprisonment. I'm not saying that they as companies do that, but they are backed by and related to a state that is very powerful in reaching those of us outside of China as well. Some of us sitting in Hong Kong also face risks that are increasingly similar to those in China, simply for researching some of these issues. So to people who feel these companies are so big that it's really difficult to challenge them, I would emphasize that we have a bit more leverage outside of China, in democratic societies. That's one point.

Then, changing gears, the other issue, about forced labor. First of all, we have to be very skeptical about the Chinese government's claims that all of the detainees in these political education camps have been released; the claim is actually that all of them have been released. As I understand it, some Uyghurs outside of China in particular continue to know nothing about what has happened to their family members. This claim that the camps have been closed requires careful scrutiny, using the methods we have been using, including satellite imagery and documentation research. But secondly, you mentioned that some of these detainees have been moved to prisons as well. Even when I was researching and interviewing people about these camps in 2017, there was a fluidity in where people went. They would be held in detention centers, which are formal facilities, and then be moved to political education camps; or some of the people in the camps were themselves coming from prisons, and vice versa. There was a bit of movement. At the same time, during this crackdown over the last few years, there has also been an influx in prisoner numbers in Xinjiang; the New York Times has documented that.
There is also the use of mass surveillance, as we have documented: the Integrated Joint Operations Platform and the multiple checkpoints have essentially locked down Xinjiang, a region a third the size of Europe. An incredible feat, if you think about it, for an authoritarian country: to be able, through technological means, to calibrate people's ability to move around depending on how much loyalty they show the government. So I would argue that Xinjiang's problem is much greater than just the camps themselves; it involves forced labour, movement restrictions, mass surveillance, prisons and detention centres, and most of these problems persist. The government is merely arguing that some of the camps have been closed. The problem in Xinjiang is shifting shape because of the attention paid to spotlighting these issues, thanks to the work of people in this room, but it has not gone away. And recently the Australian think tank ASPI has done a great report tracing how Uyghurs and other Turkic Muslims are being shipped elsewhere in China as cheap labour, rented out to factories that supply dozens of brands, including well-known international names.

Wow, okay. I'm going to take batches of three questions. Please keep it brief, and if you can, indicate which member of the panel you would like to answer your question. I'm going to take people who, I might get this wrong, haven't spoken so far. The gentleman in the blue jumper first.

Hi, I'm David Stroup, from the University of Manchester. I think this has been a really great exploration of how this is not just about the implementation of surveillance technologies by authoritarian governments in places like Ürümchi; it's also evident that this whole system has roots in places like Palo Alto and London, right? To that end, I'd be curious how any of you, I guess Darren and Nisha in particular, are thinking about how markets might have an impact at the other end, in the UK and the US. Does concern for profit have an impact on companies if they get outed as being complicit in this process? And even with collaborations in the academy, when you talk to your colleagues, particularly in STEM fields, how do you get them to think about reputational costs, about perhaps being more circumspect about what their work is used for? What kind of impact does that have in America, in the UK and elsewhere? Thank you.

The gentleman in the check shirt, second row from the back.

Beg your pardon, but I feel there is something really missing so far, which is: why on earth are they bothering to do all this? And is it possible that the thing that's missing is corruption? The big companies must be marketing their surveillance products under the cover of huge payments; it's a cheap way of doing it. A question, I think, for Darren.

Okay. The woman next to the guy in the white T-shirt, third or fourth row.

Thank you. Hi, I'm Catherine, a master's student here at SOAS, and I'm really interested in how surveillance technology is used on refugee and migrant flows. So maybe to Maya or to Darren: can you talk about which are the biggest companies whose technology is being employed in this way, and what aspects you see tying back to the state? Thank you.

So, Darren, do you want to tackle the first question?
Yeah, I'll try to answer quickly. The role of markets in pushing back on this or incentivizing it, from David. I think it's probably more about incentives, because all of the big tech firms, in the US at least, Google, Amazon, Microsoft, are competing against each other for market share, and all of them are doing these kinds of things. They need to be at the cutting edge of face-recognition surveillance. I gave a talk at Google about what's happening in Xinjiang, and a lot of the questions I got from the audience were about capacity: how good are they at doing these things? It had something to do with them thinking about where they're at, I think; that's how the questions felt. There were also many who were very concerned with the human rights aspects, but there is certainly a competition that pushes people to do these kinds of things. Microsoft's ethics statement is that they'll do it if it's for a democracy. It's like a patriotic duty to help the US military, and now they're also working with AnyVision, which operates in Palestine, which is also a democracy, they say; Israel is a democracy, so it's okay. So there are ways in which they frame these things that allow them to justify them. A lot of the ethics boards that most of these companies have are set up to do exactly that: to give permission to follow through on these technologies. Microsoft is now reviewing its relationship with AnyVision because NBC did an exposé that really showed how the system works: you can plug in someone's name and find out where they are in real time. The system is very similar to what's happening in Xinjiang.
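Mechanically, that "plug in a name" capability is a watchlist lookup over a continuous stream of face detections. The following is a minimal sketch of that general technique only, not AnyVision's or any vendor's actual system; the embedding size, similarity threshold, and all the names below are illustrative assumptions.

```python
import numpy as np

# Hypothetical reference embeddings for enrolled people. In a real system
# these would come from a face-recognition model; random placeholders here.
enrolled = {
    "person_a": np.random.rand(512),
    "person_b": np.random.rand(512),
}

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def locate(name, detections, threshold=0.6):
    """Filter a stream of (camera_id, timestamp, embedding) detections,
    i.e. every face any networked camera has just seen, against one
    person's reference embedding. The matches are, in effect, that
    person's location trail in real time."""
    ref = enrolled[name]
    for camera_id, timestamp, emb in detections:
        if cosine(ref, emb) >= threshold:
            yield camera_id, timestamp
```

The point of the sketch is how little is needed: once a camera network emits face embeddings, "find this person" is just a filter over that stream.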
Moving on to the next question: why are they doing this? It's very complicated and hard to answer quickly, but basically the Uyghur homeland, the Kazakh homeland, in northwest China is the source of a great deal of natural resources. Twenty to twenty-five percent of China's oil and natural gas come from this region, along with 84 percent of China's cotton and 25 percent of the world's tomatoes. So having control over this space, which is also framed as a key zone on the Belt and Road, is really important for the Chinese economy. That's really what's motivating a lot of this: it is resource-driven settler colonialism. What's built out of that, though, is a surveillance system that has become its own thing, with its own momentum, and the region is a sort of incubator space in which to build out these new kinds of technologies, experiment with them, and find new markets for them. So now Chinese tech firms are framing what they're doing as leading the world in counterterrorism: they're doing counterterrorism and counterinsurgency better than the West, and it's a competitive advantage for them to be developing these kinds of tools. There's also the labour aspect, where Uyghurs are seen as a source of potentially cheap labour, and the tech is being used in the labour space as well. Amazon does this too, tracking biometrics to make sure that people are efficient. In Xinjiang they're not really looking at efficiency in quite the same way; they're making sure that people show up to work on time, that they speak the language they're supposed to speak, that they go through checkpoints and have their phones checked three or four times a day. There's lots of tech used in the factories as well.

Thank you, Darren. Rahima, can I direct that question, why are they bothering, to you as well? How would you answer it? You're not up on the screen like Darren, but I think it would be interesting.

Yeah, I would add that of course I agree with Darren's answer, thank you very much. Another motive, according to some of the high-tech experts we interviewed, is to use this technology to maximize arrests, to criminalize the Uyghur people and other Turkic Muslims, because the CCP sees all Uyghurs as potential separatists, and in order to arrest them it needs some kind of excuse. Using this high technology, the so-called Integrated Joint Operations Platform, the data is collected into what I would call a reservoir, analyzed, and connected to the public security bureaus. Every day, according to this expert, names pop up from the analysis of people's movements, under three different colors: green, yellow and red. Red is dangerous, yellow is suspicious, green is normal. And no one can guarantee that they are clean. Even this expert was detained for three months. He said: I knew the technology so well, I knew how to avoid it, but somehow when I swiped my ID going through a checkpoint it came up yellow, and for that they arrested me. One dangerous thing about this technology, to give a very simple example: if you stayed in a hotel on the same night as someone who served a prison sentence in the past, your name is automatically linked to that person. You don't know how to avoid that, and in that way your ID becomes yellow and you become suspicious. So the design of the technology, according to these experts, is such that they knew it was very unfair, that it could wrongly accuse people of being criminal or dangerous. But that's what they need: they want to maximize arrests. This is genocide. That's my answer.
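What Rahima describes is rule-based guilt by co-location. Here is a minimal sketch of how a hotel co-stay rule of that kind could work in principle, reconstructed from her account rather than from the platform itself; the record layout, colour labels and the rule are illustrative assumptions.

```python
from collections import defaultdict

GREEN, YELLOW, RED = "green", "yellow", "red"

def flag_hotel_costays(stays, former_prisoners):
    """Flag anyone who spent a night in the same hotel as a person with a
    past prison sentence as YELLOW ("suspicious"). `stays` is a list of
    (person_id, hotel_id, date) check-in records. This is a reconstruction
    of the rule as described, not the platform's actual logic."""
    flags = defaultdict(lambda: GREEN)
    nights = defaultdict(set)          # (hotel, night) -> guests present
    for person, hotel, date in stays:
        nights[(hotel, date)].add(person)
    for guests in nights.values():
        if guests & former_prisoners:  # an ex-prisoner stayed that night
            for person in guests - former_prisoners:
                flags[person] = YELLOW # linked by co-location alone
    return flags

# Example: "B" shares a hotel night with ex-prisoner "X" and is flagged.
flags = flag_hotel_costays(
    stays=[("X", "hotel1", "2017-08-01"), ("B", "hotel1", "2017-08-01")],
    former_prisoners={"X"},
)
print(flags["B"])  # -> "yellow"
```

The flagged guest has no way to anticipate or contest the inference, which is exactly the unfairness the interviewed expert acknowledged.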
Thank you, Rahima. Maya, do you want to speak to the question on surveillance technology and migration? As briefly as you can, please.

Yeah, I would say that it is not very easy to single out which companies are involved in that; I haven't done enough research globally on how surveillance is practiced on migrants. Some of the bigger names that people have just discussed would be applicable, though I don't know enough about the US context and how they have been used on migrants there. In terms of facial recognition companies, there are only a few that are really big globally and expanding: for example NEC, SenseTime and Megvii; in voice recognition, iFlytek; but also, of course, Amazon. So I don't know enough specifically about migrants to say more. But there was also a question about motivation. I think profit is definitely a very important one, but in the context of China it is social control. The government and police are in a constant race to figure out the best way to control people; you can see that as far back as when the party came to power decades ago, and technology was thought to be a very promising arena for it.

We've got three minutes left, so I'm going to take two questions: Sophia there, and the chap in the middle in the stripy shirt; that's you, Paul. Sophia first.

What problems does it cause to use that terminology? Sorry, obviously I have a very loud voice; I didn't notice the mic. Another question is: are we falling back onto this framing of the weak Uyghur suppressed by the strong Chinese government, and isn't it rather more complex?

May I just abuse my position as reader of the live stream to ask my own question? Which is: isn't it the case that in the UK and the US, and maybe many other contexts, we are accepting surveillance technology into our environments for reasons of convenience and even wellbeing? I'm extremely concerned about the extent to which surveillance is envisaged in our universities, where privacy concerns can disappear because we're talking about improving our students' learning, supporting their wellbeing, avoiding suicides. These are real projects that are going on, and they are of extreme concern. And look at a recent story: SenseTime, which makes facial recognition technology in Xinjiang and has sold it to the Chinese government, is also marketing its technology to the UK higher education sector, and a UK minister was meeting with the company to talk about that. If any panelist wants to respond; that's more of a rant than a question. Thank you.

A very important rant as well. The gentleman, as briefly as you can, please, sir.

Thank you. Can the UK's Modern Slavery Act, if anything, be strengthened on this front, to follow up on what Maya mentioned about prisons and forced labour? We know that big companies like Tesco source prison-made goods, such as the Christmas cards recently reported, yet they don't seem to have faced any consequences. And what role does the UK consumer, or consumerism in general, play in enabling or resolving all this? Thank you.

Thank you. In view of the time constraints, I'm going to allocate those questions. Nisha, could you respond to the question around the introduction of these technologies in higher education? Rahima, could you talk to the question of terminology and the issue of Uyghur, in inverted commas, victimhood? And Darren, are you comfortable with that last question? I am; I can answer part of it. Then we'll go and get lunch. Okay, so let's go with Nisha first: Nisha, Rahima, Darren.

Hello. Yeah, I agree, it's hugely problematic. I think one of the ways in which surveillance receives a kind of hegemonic consent is through its sale as a positive thing in our social lives, in all kinds of ways. It makes me think about the problems we obviously have within the university and within broader social settings: surveillance technologies measuring heartbeats and health data that will have all kinds of implications in the future, perhaps for things like insurance and access to health and welfare services. But one of the things that occurs to me about the use of surveillance in these other domains that are seen as positive is, as was just said, that it is about social control: not just monitoring populations, but, now, in democratic states, not so much trying to gain consent as actually trying to modify behaviour. That is what the "positive" surveillance technologies do: they modify behaviour in all kinds of subtle ways. So, to make a slightly different point, it won't be long before we get requests for lecture capture not so that students can keep up with their work, but for things like public health.
So the corona, these recent health scares, are doing a lot to promote the idea that we need greater surveillance technologies in order to create non-contact social spaces. That's one way in which our behaviours get modified or the social space gets changed, and I think that's the worrying thing about the way surveillance technologies are being used or invoked: it allows a normalization of surveillance in all kinds of other ways.

Thanks, Nisha. Darren, I've changed your order; could you address that last question?

Sure. I can't really speak to the UK's specific anti-slavery laws, but in the US context I can. It's very hard to prove this stuff, because you just can't get access to these spaces, or the companies do an audit of a factory and, when they show up, everything seems fine and the people tell them they're not forced to work. They're very good at preparing for the inspection and at obfuscating what they're doing. Often these companies are subcontracting to other companies, so the actual contract is with a parent company in another part of China, and initially they don't understand that the manufacturing is happening elsewhere; they don't know that Uyghurs are being forced to come and work in their factory. That's starting to change because there's more and more documentation; we're looking at it more closely, and that means people are starting to be held accountable. But usually it's a one-off sort of thing: a story comes out in the AP or somewhere, the products are taken off the shelf, and they say they didn't know and they're sorry, basically. So we need a more concerted effort, and I think it's beginning to happen, especially around cotton, because so many cotton products come from this region. The difficulty is that many of these companies, like Gap and H&M, are very concerned that they'll lose market share in China, because a lot of their sales happen there. So it's not just their supply but also their market, and for them it's a huge thing to take Chinese-made cotton products off the shelves completely. There's a lot of power and a lot of money at stake, and that makes all of this difficult. If we don't put pressure on these companies, they will do nothing; we have to sustain the pressure, otherwise nothing will happen.

Thanks, Darren. Rahima?

So, is the question about East Turkistan?

The question around the terminology, East Turkistan, and also the question from the live chat on victimhood; any feelings you have on that?

It's actually quite a difficult question, and it is being debated quite a lot: whether we should continue to say East Turkistan, or whether by doing so you become victimized or labeled a separatist. But I think China is maybe the only country that victimizes people for using the name of the former republic, and I think every Uyghur person has the right to use the terminology they want: East Turkistan. Of course, people are free to use Xinjiang if they think that is the more appropriate official title, but Xinjiang itself means "new territory." So that is my answer to that question.
Right. Maya, I hope you don't mind, but we are really hungry here, and you're about to go off for a wonderful Hong Kong supper while we're probably having sandwiches. So I'd like to thank all the panel, Maya over there in Hong Kong, Nisha, Darren and Rahima, for some fantastic presentations, and thank you for the great questions and contributions from you guys. Thanks very much.