My name is Rashida Richardson and I'm the Director of Policy Research at the AI Now Institute at NYU. "Dirty data" is a term from the data mining research community that refers to data that is missing, incorrect, or misrepresentative. I'm specifically focusing on its use in predictive policing technologies, which attempt to predict where a crime may occur. The concern with dirty data is that it reflects the environment in which the data was created. So if it was created in a place where there are consistent discriminatory and unlawful police practices, those practices will be embedded in the data, and then when that data is used in a predictive analytics setting, those systems will likely reproduce the problems embedded in it. I think this needs to be part of the wholesale reform that's happening in police departments around the country: we need to scrutinize the context in which this data is being created, organized, and used, and figure out the scenarios where we need to limit its use or not use it at all.