Thank you. Ms. Lynch, you're now recognized for five minutes.

Chairman Chaffetz, Ranking Member Cummings, and members of the Committee, thank you very much for the invitation to testify today. Since my 2012 testimony on face recognition before the Senate Subcommittee on Privacy, Technology and the Law, face recognition technologies have advanced significantly. Now, law enforcement officers can use mobile devices to capture face recognition-ready photographs of people they stop on the street. Surveillance cameras boast real-time face scanning, tracking, and identification capabilities, and the FBI has access to hundreds of millions of face recognition images of law-abiding Americans.

However, the adoption of face recognition technologies like these has occurred without meaningful oversight, without proper accuracy testing, and without legal protections to prevent their misuse. This has led to the development of unproven systems that will impinge on constitutional rights and disproportionately impact people of color. The FBI's Interstate Photo System (IPS) and Face Services Unit exemplify these problems. The minimal testing conducted by the Bureau showed the IPS was incapable of accurate identification at least 15 percent of the time. This has real-world consequences. An inaccurate system will implicate people for crimes they didn't commit, and it will shift the burden onto innocent defendants to show they are not who the system says they are.

This threat will disproportionately impact people of color. Face recognition misidentifies African Americans and ethnic minorities at higher rates than whites. Because mugshot databases include a disproportionate number of African Americans, Latinos, and immigrants, people of color will likely shoulder exponentially more of the burden of the IPS's inaccuracies than whites.

Despite these known challenges, the FBI has for years failed to be transparent about its use of face recognition.
The Bureau took seven years to update its privacy impact assessment for the IPS and didn't release a new PIA until a year after the system was fully operational. And the public had no idea how many images were accessible to its Face Services Unit until last year's GAO report revealed the Bureau could access nearly 412 million images, most of which were taken for non-criminal reasons like obtaining a driver's license or a passport.

Without transparency, accountability, and proper security protocols in place, face recognition systems may be vulnerable to security breaches and misuse. This has already occurred in other contexts. For example, in 2010, ICE enlisted local police officers to use license plate readers to gather information on gun show customers. In 2015, hackers breached the Office of Personnel Management's systems and stole sensitive data, including biometric data, on more than 25 million people. And in 2015, the Baltimore police may have used face recognition and social media to identify and arrest people in the protests following Freddie Gray's death.

Americans should not be forced to submit to criminal face recognition searches merely because they want to drive a car. They shouldn't have to worry their data will be misused by unethical government officials or stolen in a security breach. And they shouldn't have to fear that their every move will be tracked if the networks of surveillance cameras that already blanket many cities are linked to face recognition. But without meaningful legal protections, this is where we may be headed. Without laws in place, it could be relatively easy for the government to amass databases of images of all Americans and use those databases to identify and track people in real time as they go about their daily lives.
As this committee noted in its excellent 2016 report on law enforcement use of cell site simulators, emerging surveillance technologies like face recognition require careful evaluation to ensure their use is consistent with the protections afforded under the First and Fourth Amendments. And just as with cell site simulators, transparency and accountability are critical to ensuring that face recognition's use not only comports with constitutional protections but also preserves democratic values. Justice Alito noted in his concurring opinion in United States v. Jones that in circumstances involving dramatic technological change, the best solution to privacy concerns may be legislative.

Just as this committee found with cell site simulators, the use of face recognition must be limited. Specifically, law enforcement should be required to get a warrant before accessing non-criminal face recognition databases and before conducting real-time tracking and identification. I urge this committee to introduce legislation to do just that. Thank you once again for the invitation to testify. I'm happy to respond to questions.

Thank you. I appreciate it.