Every day, your movements are tracked, your purchases logged, your searches saved, and your face scanned. Facial recognition technology is becoming more widespread daily. Privacy International has reported that 24 countries have already implemented location tracking to help enforce quarantines, and if you were thinking that increased face mask usage might help protect your privacy, China's facial recognition algorithms have already figured out a way around them.

In January 2020, The New York Times reported that a company called Clearview AI had created a database that makes it possible to snap a photo of a stranger and reveal that person's identity. The technology was built on more than 3 billion images scraped from public social media accounts by Australian founder Hoan Ton-That, whom the Huffington Post revealed had collaborated with anti-immigration alt-right political operatives. Clearview AI's technology is in use by more than 600 law enforcement agencies in North America, including the FBI, the Department of Homeland Security, and ICE.

What if your software facilitates a world you didn't want to live in? How do we make sure that won't happen? Can we resist the surveillance society? Should we?

Kate Rose says yes.

I think you have a right to consent to how your information is used, especially if it's meant to be at some point used against you or used extrajudicially. I am trying my best to explore these systems and see how we together can work to confound them, or to put data into them as we please.

Rose is a designer, cybersecurity expert, and founder of Adversarial Fashion, a line of surveillance-resistant clothing: masks meant to block facial recognition cameras, and shirts patterned with fake license plates meant to feed bad data into the automated license plate readers now ubiquitous in the U.S. and Europe.
I came up with a couple of different designs that not only read effectively into license plate readers, but gave me enough wiggle room, design-wise, to also make a pattern that is aesthetic and fun to wear, something people would actually like to buy.

Rose's concern about extrajudicial use of personal data is less far-fetched than ever in the age of coronavirus lockdowns. Politico reported in late March that the Department of Justice had asked Congress to pass a law allowing indefinite detention of U.S. citizens without trial during national emergencies. Unauthorized movements picked up by surveillance could be a pretext for such indefinite detention.

Privacy rights need to be more enshrined. Any data that's collected about you must have a warrant before it is used.

Rose is one of several designers trying to fight surveillance with fashion. While Rose's license plate shirts and dresses disseminate bad data, other anti-surveillance designers use fashion as a form of obstruction, marketing makeup techniques, or these glasses donned by Electronic Frontier Foundation researcher Dave Maass.

I really love how people are exploring different ways to counter surveillance technology and to empower people to do so. But at the end of the day, people should not have to wear a mask or put on face paint or wear complicated t-shirt patterns in order to protect their privacy. Our government should be protecting our privacy, and the power structures as they are should be valuing our privacy.

Maass and his colleagues at EFF successfully lobbied the California Legislature to pass a law that, starting in 2020, puts a three-year moratorium on law enforcement's use of facial recognition technology, including by departments that were experimenting with Clearview AI. It means that law enforcement agencies in San Diego County will have to stop using a shared facial identification system available to officers on handheld tablets.
San Diego was one of the first agencies that we identified that was using mobile biometric technology: face recognition that they could use from the palm of their hand. It was this wide-ranging system where one law enforcement agency gets the technology and is able to disseminate it across the entire region.

And the data didn't stay local. San Diego, a border county, regularly shared access with the federal government.

We saw that local law enforcement agencies had given face recognition devices to Border Patrol and ICE, and we don't know how those agencies used that technology.

San Francisco and Oakland have outright banned the use of facial recognition technology by law enforcement.

This idea that we're suspending AI facial recognition, for example, in San Francisco or in Oakland, or the idea that California might be putting a moratorium on it for a couple of years, is idiocy. To be honest, I think lives will be lost as a result of that.

Zoltan Istvan is part of the transhumanist movement, which holds that humans should use technology to modify the human body and experience. He's implanted an RFID chip in his hand that allows him to unlock his front door.

It's going to be very useful to the human race, but we've just got to get over it being creepy.

Istvan envisions authorities using facial recognition and other AI surveillance to prevent terrorist attacks by recognizing abnormal behaviors or suspicious individuals in crowds, or aiding the government in fighting human trafficking. And that's not counting other biometric surveillance capabilities, such as scanning for elevated body temperatures to isolate feverish individuals during a pandemic.

Let us look at what it can do for human trafficking. Let us look at what it can do for overcoming criminality in our cities. Let us look at what it can do for overall safety.
There are so many different things where we could use this kind of guardian angel out there to recognize that we're having a hardship and maybe even going to die, and authorities can be notified.

FaceMe software was originally marketed for virtual makeup demonstrations before evolving into a product serving a wide range of uses, from logging into apps to controlling entry and identifying intruders in secure facilities.

We can achieve a precision level of up to 99.58 percent. Literally the only companies that are slightly more precise than us come from countries like China or Russia, which have difficulty going to market outside their home markets.

Richard Carrier is FaceMe's general manager. Although the majority of the company's clients are in the private sector, it has supplied technology to governments around the world. And Carrier agrees with Istvan that facial recognition technology could be a giant boon to public safety and that it could reduce the likelihood of police interactions turning violent.

If I'm a citizen and cops come to me, I'd be very happy for them to know who I am even before they come to me.

And Carrier says the company won't sell its technology to repressive governments or law enforcement agencies.

I'd like to believe that we would only associate ourselves with police forces or law enforcement organizations that are respectful of individual rights.

But law enforcement agencies are already showing a lack of accountability in how they use facial recognition technology. According to a fired whistleblower, the police department in Chula Vista, California, failed to properly report to a federal oversight committee how it was using a facial recognition program. Chula Vista PD declined our interview request.
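A precision figure like 99.58 percent sounds near-perfect, but at crowd scale even a small error rate produces many wrong results. A rough back-of-envelope sketch (our illustration, not FaceMe's methodology; it assumes every error is a misidentification and that each face in the crowd is checked once):

```python
# Back-of-envelope: what a 99.58% accuracy claim implies at crowd scale.
# Assumptions (ours, for illustration): errors are uniformly distributed
# misidentifications, and each face is scanned exactly once.
accuracy = 0.9958
crowd_size = 70_000  # roughly a sold-out football stadium

error_rate = 1 - accuracy
expected_errors = error_rate * crowd_size
print(round(expected_errors))  # ~294 misidentified faces per scan of the crowd
```

The point is the base rate: a system that is wrong 0.42 percent of the time, applied to tens of thousands of people, still flags hundreds of people incorrectly.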
What we're saying about all surveillance technologies, whether license plate reader systems or face recognition, is that agencies are being very promiscuous about who they share data with, without really ensuring that the rules in place locally are also being followed nationally or by other states. They want to collect it all, but they don't really care about protecting it all.

Maass also worries about China's use of facial recognition in conjunction with its state-run social credit system, which assigns citizens a numerical score based on their behavior. China has also rolled out increased pandemic-related surveillance that monitors for fevers and flags individuals not wearing protective masks during an outbreak.

The thing that we can learn from China is that this surveillance, as it continues to grow, is going to be less and less about public safety and more and more about controlling people. It's about social control.

I think the social credit system that China is using is absolutely awful. And the problem is that they're setting such a bad example for the rest of the world that everyone's turning their back on AI facial recognition. There is a good way to use it. It's just pure safety. Is there a terrorist walking into the Super Bowl stadium with a suitcase that somebody didn't catch? Is there somebody walking into a mall with a gun? These are situations where I think AI facial recognition, as well as other types of surveillance devices, can really help the public.

Istvan also believes our entire conception of privacy will need to be revised in the coming years.

I believe in a society that's totally transparent, a society where everybody can see what everybody is doing. Now, I'm not saying in your bedrooms, but certainly anywhere in the public sphere. And it has to go both ways: you must be able to look into the lives of all government officials. I would like a law that all police officers are always filmed whenever they're on duty.
I would like something very similar for any kind of politician. I don't want backroom deals. I don't want things the public can't see. Privacy, I believe, really does steal our liberty away. It's transparency that's going to give us all the freedoms we want.

I do think conceptions of privacy are changing, but I think they're strengthening. People didn't necessarily have a whole lot of concern about the images they posted to Instagram or Facebook, but now, post-Clearview AI, it's coming out that all of those images were scraped and put into a face recognition database. People are concerned and outraged about that, and we are seeing a backlash against this company and the law enforcement agencies that work with it.

But Rose says that as the technology becomes more powerful and pervasive, Americans will need to take a page from the protesters in Hong Kong, who have used face masks, encrypted communication, and, most importantly, mass disobedience to resist authoritarian control.

The things that seem like anti-surveillance actions that don't matter by yourself matter a lot when you hit a critical mass of people. Everyone there decided they were all going to ignore the mask laws. They all decided they were going to get out umbrellas. They all decided they were going to get out laser pointers and point them at the cameras to disrupt the collection of information on the people next to them who maybe didn't have a mask, or whose mask was knocked off. It's that kind of belief in your power: even if you think it might not work 100 percent of the time, all of you together have this tremendous power.

Rose's aim isn't just to design clothing that thwarts today's systems, but to cultivate a community that continually develops new methods to confound the surveillance state as its tools evolve.
I'm just a single human, and you actually have all of the skills at your disposal to try some of those makeup techniques, or to see if anything in your closet that you might already wear has the ability to confound or interfere with these systems. It's a really important opportunity for us to try to get as far ahead as we can before we have to start playing catch-up again.