Good evening and welcome to the New America Foundation, where we work to foster new thinkers and new ideas to address the new policy challenges of the 21st century, including the topic of tonight's panel discussion, Insecurity: Race, Surveillance, and Privacy in the Digital Age. My name's Kevin Bankston. I'm the policy director at the Open Technology Institute, which is New America Foundation's tech policy and tech development wing, where we're working to build a stronger and more open Internet for stronger and more open communities. I'm here to warmly welcome you to tonight's discussion, those of you who braved the rainstorm outside and those of you who are tuning in on the Internet. I guess you don't really tune in. And I'd like to especially thank those of you who are joining us here in D.C. for the Knowledge Exchange, a three-day convening of advocates, activists, experts, and social innovators that's intended to bridge the divide between those of us who work on policy inside the Beltway and, you know, the rest of the real world outside of the Beltway. That annual event, like tonight's panel discussion, is sponsored by our good friends at the Center for Media Justice and Consumers Union, so I'd like to give a big thanks to those organizations for making tonight possible. Tonight's discussion about how new technologies of data collection can both help and harm underserved and minority communities comes at a really interesting time. We're expecting this week, perhaps as early as tomorrow morning, that the White House is going to be releasing its big report on the social and political and economic implications of big data, that is, techniques and tools for collecting, storing, and analyzing unprecedentedly large sets of information, including personal or private information.
These new tools can be engines of efficiency and economic growth and teach us new things about ourselves and the world we live in, but they also create a new risk of data-driven digital discrimination and the risk of reinforcing existing inequalities through automated decision-making without appropriate oversight. You're hearing now in the press reports that the White House report is going to focus a great deal on these issues of data discrimination and data fairness, thanks in no small part, I think, to the work of civil rights and consumer justice groups like CMJ, which recently released a set of civil rights principles for the era of big data, which I expect will be mentioned more than once on tonight's panel. So tonight's discussion is, I hope and expect, just the beginning of a much longer and wider-ranging discussion about how big data and surveillance technology are changing how we need to approach questions of privacy, questions of fairness, questions of discrimination, and access to opportunity in the 21st century. I'm really proud that New America has been a part of that conversation, and I'm delighted to now pass the mic to Delara Derakhshani from Consumers Union and then to the moderator of tonight's event, the founder and executive director of CMJ, Malkia Cyril. Thank you, and good night. Enjoy the discussion.

Thank you all so much for joining us tonight. My name is Delara Derakhshani. I serve as policy counsel at Consumers Union. We're very pleased to present this panel tonight as part of our annual Knowledge Exchange gathering, which brings local grassroots organizations and social and media justice organizations together with policy advocates to discuss critical issues in both telecommunications and privacy. The value of bringing together all these diverse viewpoints is being able to share perspectives, and we really look forward to hearing the perspectives of all of you tonight.
At this time, I'd like to introduce the moderator of our esteemed panel, Malkia Cyril, founder and executive director of the Center for Media Justice.

Yes, I am the senator and the president. The senator of media justice, Malkia Cyril. Thank you. Welcome, everybody. I'm really glad to be here with you. And some of you I've been with for a couple of days, a day and a half now. And I love it. The Knowledge Exchange, as Delara and Kevin both mentioned, has been going on for about eight years. This is the eighth year, so what, what? And here in this eighth year, we've had the privilege and really the honor of sitting down with, I think, some of the most interesting leaders across this country, who are thinking in new ways about how we should think about the future of big data. How do we incorporate our struggles for justice and equity into the 21st century, a digital age? It's a question that I think we all sit with, but it's not a new question. The question of how we engage with the communications platforms of our society as we fight for equity is an old question, in fact. It's a question that was asked in the 1800s during the fights for abolition, the abolition of slavery. It's a question that was asked throughout the global south in the 30s and the 40s. It's a question that was asked even as Europe transitioned from feudalism. It's a question that has been asked throughout history. The fact that we're in the 21st century, the fact that we are now in a digital age, only serves to make us part of history. It's not something brand new. So I think I'd like to start there, yes? That we are on the edge of a new day, but we are also on the edge of a very old past. And that means we have a lot we can learn from. There's a lot of lessons. Today, our conversation is about insecurity. Let me just look at the title real quick: Insecurity: Race, Surveillance, and Privacy in the Digital Age. And I want to open up, well, we're going to start this panel in a minute.
I want to open up the conversation by just framing a way of thinking about this question. So for the last, I don't know, 10 months, how long ago were the revelations, the Snowden revelations? It'll be a year. It'll be a year. So we're about to come up on the anniversary. So for about a year and a little longer, because this was in the news prior to Snowden, we've been talking in a public way about the question of digital privacy. How are the internet and other digital technologies being used to help or harm our society and our economy? What strikes me is that this conversation is about being private. Now, I know I'm a private person, you know, I don't like everybody knowing my business. But for me, the issue really hasn't been about an individual question of privacy. A Jeffersonian concept, an elite individualism, liberalism: mine, my information. That's not what this fight is about for me. For many people like me, the fight is about sovereignty. It's about the right to control the land on which I walk, the information that I carry around with me. It's not about privacy. It's about sovereignty, self-determination: the ability to decide who my next president will be, the ability to protect my voting rights, the ability to protect my right to employment, my ability to get health care, my access to my own health information, my right to control who knows what about me and how it is used against me, right? I am not convinced that this is a conversation about privacy. For some of us, the question of whether or not governments and corporations have the right to intrude is a new question to consider. And then for some among us, it's an old question. There are people in this country, millions, who have been struggling with government intrusion from the day that they're born. There are women on welfare who struggle and have to turn over their information every day.
There are people all over this country whose medical information is in the hands of big insurance companies, and they can't control who does or does not get it. There are consumers all over this country buying products from Walmart, being tracked as they move through the internet, being sold and marketed, excuse me, being marketed products, and they can't say no. There are people in this country who don't have the right to consent. And that is an old problem. And I'm very familiar with that problem. Can I ask if there are those among you who are also familiar with that problem? Yes, okay. So I'm not alone. And I didn't think I was alone. I thought I was among several hundreds of millions of people who are concerned about the question of whether or not our First and Fourth Amendment rights are in fact distributed. Let me just say that again. Are our First and Fourth Amendment rights distributed? What does that mean? That doesn't mean, do they exist. They exist. There is a constitution. It exists. We cannot debate that. Whether those rights are enforced equally across racial groups, across class, across gender, that's a question I think we need to grapple with. And if the rights are not enforced equally, then we have a problem when it comes to trying to enforce those rights on new platforms. Right? If those rights were already not enforced equally, what happens when those rights then have to be translated to a digital environment? So what we're talking about here today is how we reframe the conversation about big data, about digital privacy, about surveillance, from one that is simply about protecting individual liberty, although that is important, to one that is about protecting the rights of groups.
We want to talk about how we can move from a conversation about big data, surveillance, and digital privacy that is about furthering inequality, pushing forward the amassing of capital, and instead talk about how these questions can be focused on countering discrimination. This is the challenge. This is the new conversation we're trying to have. And that is fundamentally different than the conversation that emerged after the revelations of Edward Snowden. So with today's panel, y'all are about to be, I would say, amazed and wowed, but you probably already know so much of what we're going to talk about today. But the interesting thing is that the folks who are here with us today are individuals who have not generally been included in the public conversation about big data and digital privacy and surveillance. These are perspectives on race and discrimination, on income inequality, that generally have been left out. And why this is important is that these pieces, as Kevin mentioned, are going to be addressed to some degree in the White House report on big data. And that's critical, because these are the pieces that, in fact, make up the bulk of how we as individuals will experience whatever legislation exists or does not exist. We will experience it either as supercharged discrimination, or, if we're lucky and we're organized and we're mobilized, we'll experience it as something way more beneficial. So I want to first introduce our panelists. Actually, I'm going to let our panelists introduce themselves, because I think that they each have a lot that they can say about what they've been doing. So I'll ask each of you, as you speak, to start by introducing yourself, where you work, and kind of what brings you here. So we'll start there. How we'll open this up today is I'm going to read for you a piece, a section of an article in the American Prospect, "How Big Data Could Undo Our Civil Rights Laws."
And we're going to use that to open up this conversation and allow our panelists to react to some of it. OK? Does that work? Does that work for you? All right. Big data will eradicate extreme world poverty by 2028, according to Bono, front man for the band U2, because he clearly is a digital privacy expert. The article didn't say that last part. That was me. But it also allows unscrupulous marketers and financial institutions to prey on the poor. Big data collected from the neonatal monitors of premature babies can detect subtle warning signs of infection, allowing doctors to intervene earlier and save lives. But it can also help a big box store identify a pregnant teenager and carelessly inform her parents by sending coupons for baby items to her home. News mining algorithms might have been able to predict the Arab Spring. But big data was certainly used to spy on American Muslims when the New York City Police Department collected license plate numbers of cars parked near mosques and aimed surveillance cameras at Arab American community and religious institutions. Until recently, debate about the role of metadata and algorithms in American politics focused narrowly on consumer privacy protections and Edward Snowden's revelations about the National Security Agency. That big data might have disproportionate impacts on the poor, women, or racial and religious minorities was rarely raised. But as Wade Henderson, president and CEO of the Leadership Conference on Civil and Human Rights, and Rashad Robinson, executive director of Color of Change, a civil rights organization that seeks to empower black Americans and their allies, point out in a commentary in Talking Points Memo Cafe, while big data can change business and government for the better, it is also supercharging the potential for discrimination. That is the launch point for our conversation today: the question of supercharging the potential for discrimination.
We'll kick this off by asking all of our panelists first to introduce themselves, and then to answer this question. Given the concerns raised by the article in the American Prospect, can you share some concrete examples of exactly how big data might supercharge the potential for discrimination? And how might or might not the White House report on big data address this concern? And I'm gonna start down at the very end here, and we'll work our way back to me.

Sounds good. Good evening. My name is Grace Sheedy, and I'm a researcher at the United Food and Commercial Workers Union. I work on the Making Change at Walmart campaign there. We support the organizing efforts of Walmart employees nationwide, and we're also working to hold Walmart accountable as a corporate citizen. So that's where we're coming into this conversation from. Last year, Walmart's global CEO of e-commerce told investors that he wanted Walmart to know what every product in the world is and know who every person in the world is, and then have the ability to connect them together in a transaction. So Walmart's big data ambitions are very real, and we've taken note and done some work with CMJ and Color of Change especially to understand what's going on there. But stepping back and looking at retail in general, I think a very salient example of how big data can be used to discriminate in our everyday lives and habits comes from the example of staplers being sold at Staples, which some of us heard earlier. The Wall Street Journal investigated online pricing at a variety of retailers, and one example that came out of it was that two different people who lived a few miles apart, not worlds apart, were being shown different prices for the exact same items. And it seemed like maybe the prices were different because one person lived closer to rival retailers than the other person did.
But effectively, one person lived in a more affluent neighborhood with more retail options, and that person was shown lower prices than the person in a less affluent neighborhood was shown. These kinds of practices can result from the collection of our data as we navigate the internet, and the reality is that we don't necessarily understand how our information is being used, so it's extremely difficult to recognize this discrimination when it happens and to hold companies accountable or to right these wrongs.

Good evening, everyone. My name is Seeta Peña Gangadharan. I'm a senior research fellow here at the New America Foundation's Open Technology Institute, and I work at the intersection of digital inclusion and surveillance and privacy, data and discrimination issues. The example or story that I want to give comes from how I got into this issue several years back, and that was around the issue of subprime lending. And I should just add for context that this question of history is really important. When I was originally doing this research, I wanted to talk about subprime lending and the types of targeting and database creation, data profile creation, mostly about black and Latino consumers. I wrote about it in the context of the types of data profiling that we saw in our analog past: around redlining in the 20s; around medical profiling, also from the 20s through the 70s, with experiments on prisoners and African American communities; as well as racial profiling in the 1980s. And I wanted to link that history to the issue of subprime lending and how there had been an effort within the financial services community to identify consumers who were high-risk consumers, subprime consumers with lower credit ratings, and target them with these risky financial products.
And the ways that they did that, I think, are continuing to inform how sketchy financial services and products today are marketed and targeted towards certain communities and entrap people in a cycle of debt. And what we saw was, at that time, lenders, third-party companies, consumer financial services companies like Bankrate, and some of the credit agencies collecting information about your web behavior and integrating that data with offline information about where you live, your other consumer habits, and so forth, to target you with these products. And we know today that blacks and Latinos were targeted very aggressively, and they also bought into these subprime mortgages and other subprime products at higher rates, and today are feeling the effects of this process with greater severity than other populations. So we know today, for example, that blacks have not recovered from the recession as much as other populations have. I think that there is some promise in the White House review of big data. There was an article a few days ago in the Huffington Post that had John Podesta, special counselor, talking about how he was surprised, for example, to hear these questions of discrimination surfacing in conversations about big data and privacy, when in fact, I think, he and others were expecting this to really be a conversation about privacy and innovation and not about discrimination. And so that is really encouraging, and that's really exciting for many of us, myself included, who were in the room really trying to push some of these issues. I think what I'm hopeful for moving forward is that we actually can sustain these conversations, where the issues of discrimination, where the issues of predatory targeting, really come to the fore with a precision and with a broad understanding of what's at stake in these issues. Thank you.

Good evening, everybody. My name is Hamid Khan. I'm here from Los Angeles.
I'm an organizer with the Stop LAPD Spying Coalition, which is a coalition of diverse communities in Los Angeles, including formerly incarcerated people, youth, artists, academics, lawyers, and several concerned communities. And the coalition's primary goal is to expose and organize against local law enforcement's policies of surveillance, spying, and infiltration. I also wanna recognize one of our founding members, Mariella Saba, who's a leader with the coalition as well, who's attending this conference. And I just wanna say hi, folks in Los Angeles; it is really raining out here a lot. But I just wanna start by briefly saying that tomorrow is May Day, and two years after the 1886 Haymarket riots, after which we know that several people were hanged, the chief of the Chicago Police Department said that revolutionary movements must be carefully observed and crushed if they showed signs of strength. When we talk about self-determination, that really goes to show something, and it set the tone for the development of the police red squads. And there's a long history of police red squads, starting from Chicago, Los Angeles, New York, of how these covert and illegal police operations worked, and it continues on. I wanted to talk about this in the context of history because what we are dealing with today, big data and whatever we wanna call it, whether it's invasion of privacy, whether it's racial profiling, this has been going on. It's not a moment in time; it's a continuation of history. And in order for us to better understand, I think it's always helpful that we look at the intent, the development, and the practice of how it evolves and how the structures get created. I'll give you three very simple, or clear, examples of the intent, development, and practice of surveillance and infiltration.
So the Los Angeles Police Department, as many of you may know, has been at the forefront of intelligence-led policing for many, many years. The LAPD launched this new concept of predictive policing, which basically says that if you put previous acts of crime into a database, it'll crunch the numbers and predict where crime may happen, and then it creates these hot zones, right? In essence, because it only picks up survival crimes, people who are just trying to survive committing petty crimes, it creates these occupied zones in poor communities. This goes back to a grant that was given by the US military in 2006 to a professor at UCLA, a professor of anthropology. And the idea was to predict acts of insurgency and terrorism in Afghanistan and Iraq. And obviously we can't talk about this without corporate profit and money-making, so the good professor decided that there's a lot of money to be made. By 2009, he brought this whole concept home, and he did a presentation to the LAPD and the US military where he used the faces of Afghan men, their faces covered in scarves and shawls, a weapon sitting at their side, labeled as insurgents and terrorists, and right next to them he had faces of Latino youth from East Los Angeles labeled as gang members and future insurgents and domestic terrorists as well. So the intent was clearly to demonize and profile and identify that this is the enemy. It's the same thing; we talk about the current face of the enemy as Muslim and Arab, but there's a whole thing going on. So that was the intent: how do we sell it? How do we stoke this racism and sell this program? And now Professor Brantingham has formed his own company, called PredPol, and is making thousands of dollars with contracts from police departments. Let's talk about the development piece of that.
Just about last week, it was revealed that in early 2012, the LA Sheriff's Department had an airplane flying over the city of Compton, which is predominantly African American, in South Central Los Angeles, with these very high-powered cameras, and it circled Compton during the daytime for about seven or eight hours a day, mapping the whole city of Compton, looking at people, and rendering these pixels of people moving on the ground. So that's the development piece, with discrimination being so inherent in this thing. So let's talk about practice now. We primarily work in Los Angeles around the suspicious activity reporting program and the fusion centers and intelligence-led policing. Last year, the Inspector General of the LAPD released an audit of the LAPD's suspicious activity reporting program. The audit took a four-month sample and reviewed that four-month period. And in that sample, what we found out was that 82% of these SARs were filed on individuals who were identified as non-white. And the largest share was African Americans. So again, look at how the language is created, what it is couched in, to protect us from terrorism, and yet what the practice shows at the same time. And the last thing I just wanna say about what to expect from the White House: just as Seeta was saying about John Podesta's statements, I think it's very clear, because what you see is that a lot of this conversation is anchored in economic interest. High tech and technology, they call it the crown jewel of economic development in the United States. So where would that go? When you intersect the crown jewel of economic development with the needs of national security, I mean, I don't know, I really don't have a whole lot of hope, because there's gonna be a lot of smoke and mirrors, but time will tell where we go with this thing.
Boy, this is a fun panel for me to be on. Let me tell you, I am probably the person on this panel who spends the most time actually talking about privacy. My name is Chris Calabrese. I'm one of the privacy lobbyists for the ACLU. So I'm probably the person on the panel who comes from the other direction. And I think the best way to capture how different this conversation is, and how far we've come even with the White House report in the last couple of months, is that usually when I do these panels, it's four other white guys sitting next to me. And I look out in the audience, and it's 65% white guys and 35% white women. And the conversation is just not about race. It's not about income disparity. It's all about data and personally identifiable information. And that's the world I come from. And I'm not disparaging that; I'm just saying the conversation has been very narrow and very limited up to this point. So simply having a major White House announcement and report, one that did not start out as a race-based discussion, come to embrace the obvious racial realities is just incredibly important. And I'm very excited; I feel like the people here, the people in this room, have made that happen. It didn't happen accidentally. It happened because we went out and found the examples, and we organized, and we went in to John Podesta and we said, here are the examples, and here's what you need to pay attention to. And to his credit, hopefully, he listened and he's incorporated them. So to make this a little bit of a conversation: with everything you said, I was thinking to myself, oh yeah, that was one of mine, I have something on that too. So I'm just gonna kind of build on a lot of what you said, because all of this stuff is such a reality. Let me find the right piece of paper. So, suspicious activity reports. We mentioned this idea of collecting information and sharing it.
So through the wonder of FOIA, I can actually tell you what some of the suspicious activity reports say. So this is what gets reported. And by the way, almost anybody can report a suspicious activity, right? There are lots of people, especially people who do security at critical facilities, for example, who are trained to report these kinds of things. But anybody can do it. They're supposed to be vetted. Then they go into, well, two different national systems, and then they get disseminated nationwide, okay? So, you know, these are actual reports. "Suspicious gathering at private residence in Elk Grove of individuals of what appear to be Muslim faith or Middle Eastern descent." That's the whole suspicious report. "Suspicious conversation overheard. The neighbor, one of four clean-cut Middle Eastern males, was speaking excitedly in a foreign language." This is suspicious activity in the United States. "Information regarding trending at Sunrise Mall. There was a substantial increase in the presence of female Muslims fully dressed in veils and burqas." I mean, they just go on. It was amazing to me that they actually gave us this stuff. But I think that anybody who doesn't think that race is really integral to surveillance has not been paying attention, as we've said, for the last, well, more than 100 years, and it is still happening in a very blatant way. The FBI, and we did a FOIA on this just a couple of years ago, has been engaged since 9/11 in an ongoing program of racial mapping, to racially map communities, not because there is any particular suspicion, but just because the assumption is that suspicious people and dangerous people will come from these communities, so we must know them in advance. They must have profiles so we can go back and, you know, round up the usual suspects later. So, you mentioned the Chicago police.
So I'll end with a very concrete example of this from the Chicago police. So here's a new tactic, very similar to this predictive policing. Actually, it's not very similar; it's the same thing. The Chicago police have created a new tactic they call a heat list, which they define as the 400 most dangerous people, excuse me, the 400 people most likely to be involved in violence in the city of Chicago. And it draws on a wide variety of different categories. It's not just, you know, you were arrested for assault last week or something; it's who your friends are, it's what your social media looks like. And what they're doing is going and talking to these people and saying, well, you know, we're keeping an eye on you. So just to unpack it a tiny bit before we leave it, because it's astonishing, and it's astonishing for a lot of different reasons. One is, it's probably wrong. So much of the science is like snake oil; it's somebody selling something, and they found somebody who'll buy it, right? Two, even if it works, it's not clear what it's working to do. I mean, if you've got somebody who might be involved in violence, perhaps a visit from a social worker or, you know, an intervention of another sort might be better. What are you trying to accomplish here aside from finding people and putting them in jail? And third, even if, amazingly, this happened to be accurate, and even if it was accurate in predicting something that was legitimate for a police officer to be investigating, is it fair? Is it fair to look at your friends and make determinations about whether you should be investigated based on who you're associated with? I mean, this is pretty fundamental constitutional First Amendment stuff here, right? There should be no guilt by association in the United States.
There is, of course, but that's the standard that we should be holding these institutions to. So I will stop there for now.

A lot of good stuff. So first of all, that was excellent, right? A wonderful way to kick off this conversation. We are in a moment when, on the one hand, we have a media conversation about big data that is about making our lives better, right? We're going to use big data to improve our education, our educational system, to improve opportunities for access to health care. You know, we're going to learn from this data, and we're going to use what we've learned to improve our lives. In addition, we're going to learn from this data to make us safer, right? So that is on the one hand. We're going to use this data to predict crime. This is wonderful. We're going to determine who can have access to what products and services. And we're going to be able to market to everybody in the world. That just sounds exciting to me. And so that's on the one hand. On the other hand, we have this issue of discrimination being supercharged. So it existed, and now it's being supercharged. So the question at hand here is whether or not the uses of big data, and the way big data is currently being organized and used, will make us safer, right? That is the charge. That is the assumption that is being leveraged, that it will. So let's turn to that question, right? Will it? And I'm going to turn to Hamid. And I'd like to ask you: right now, while many are arguing for the benefits of big data to help counter crime domestically and terrorism both internationally and domestically, a study by the American Muslim Civil Liberties Coalition found that the dragnet surveillance of Muslims in the US was actually destabilizing, created a huge amount of insecurity, and actually hindered the ability of law enforcement to identify and deter real threats.
Do these uses of big data and digital technology, in your opinion, increase or decrease national security? What are the implications for the so-called suspect communities that are being observed? And given that, are some of those communities organizing for change? So I think in order to talk about it, we have to really look at the whole architecture of surveillance. And just the scale and size of this apparatus, from the NSA down to local police, is overwhelming. I'll give you an example, going back to the Suspicious Activity Reporting program, which was launched by the LAPD; they were the launching pad for the national initiative in March of 2008. In March 2013, the Government Accountability Office released a report on the national SAR program. By then, and this report goes back to 2012, the program was in 46 states, two US territories, and Washington, DC. It had already been incorporated by 14,300 law enforcement agencies around the country. It had already trained 295,000 local law enforcement officials out of the estimated 800,000. It had trained over 53,000 EMS workers and first responders. It had 300 DOD entities as part of the program. And it had over 53 federal agencies as part of the National Suspicious Activity Reporting Initiative, including such agencies as Housing and Urban Development and Wildlife and Fisheries. So when we look at the sheer scale and size, it's overwhelming. And when we have talked to the LAPD about it, they proudly say, yes, we are looking for a needle in a haystack. OK, well, let's talk about the needle in a haystack. So that's one piece. The other piece, and I do want to put it on the table, is that we talk about the revelations of Edward Snowden, but I hope that as we move forward, we don't have to wait for any more revelations, because this is a fact. This has been going on. 
And we cannot be asleep at the switch, because if we keep on waiting for revelations, it's going to mess us up a lot. Then you have the whole NSA piece, which by some estimates picks up about 1.7 billion communications a day. So when you look at the sheer size, it is overwhelming. And what did they come back with, General Keith Alexander of the NSA and the Director of National Intelligence, James Clapper? Lying to Congress, starting off by saying that we have thwarted 49 terrorist attacks, and finally, with the president also standing up, coming down to one: a cab driver in San Diego, who was then charged, I believe, with material support for sending $8,800 back to Somalia. So I think this is what we need to look at. Let's look at some other stuff. In October of 2012, the US Senate Subcommittee on Homeland Security and Government Accountability released an audit of intelligence gathering at fusion centers. For people who are not familiar with fusion centers, these are a central hub-and-spoke system of information gathering, storing, and sharing. There are about 85 or so fusion centers around the country. One of the largest is in Los Angeles, which, I'm proud to say, we shut down on April 10. We did a national day of action, and we put it in lockdown. And Mariela had done an incredible skit. So what came out was that the intelligence gathering at fusion centers, and I quote, was flawed, irrelevant, duplicative, outdated, a potential violation of civil liberties. The Washington Post, in their headline, said fusion centers are pools of ineptitude. When the Los Angeles Times talked to the LAPD's chief of counterterrorism, Deputy Chief Michael Downing, his answer was, and I quote, there's a lot of white noise, but an occasional gold nugget. But when he was pushed further, he couldn't point to a single conviction that had come from that gold nugget. 
So when we talk about whether it's less safe or more safe, what we are seeing now is this whole apparatus that invades our privacy, promotes racial profiling, and creates a culture of suspicion and fear. And it is based on bogus standards, which are completely vague and overly broad, like "reasonable indication." And then, of course, there's the waste of resources, where billions of dollars are being spent. Now, when we talk about the implications for the Muslim community, I think some of the things are very clear, because the Muslim has now very successfully become the fifth face. And we talk about the fifth face in the long history of the Other. You have the savage indigenous, to justify policies to displace people. The criminal Black, to justify Jim Crow and segregation. The illegal Latino. The manipulative and disloyal Asian. I'm reminded of Executive Order 9066 and General DeWitt saying, and I quote, that persons of Japanese ancestry contain enemy-race blood, hence are inherently disloyal and will always remain unassimilable. These are the arguments that are made. And now the fifth face comes in the face of the terrorist. So it has obviously created a siege mentality in the community. It has created a culture of suspicion and fear. But I think it has also created an opportunity, and at least that's the position that the Stop LAPD Spying Coalition takes: how are these communities building bridges? How are we learning from the histories of the strong culture of resistance that has been built over centuries in this country? How are we celebrating that power, regardless of everything, under the harshest conditions? I always think: what were the conversations going on 200 years ago? What were the conversations going on 100 years ago, under the harshest conditions? People organized, people fought back. 
And so, while the implications are tough, and this is a different reality, we are also looking at the opportunity and celebrating that people's power. I think that's where we need to build these movements as well. So what we have here, if I'm understanding you correctly, is a situation of predictive policing that is fusing a variety of streams of information into centers of information, whether they be fusion centers or other kinds of places where this data is being collected. And on the one hand, we have a conversation that is about intent: we're gonna use this information to make your life better and make you safer. But the impact is very different. The impact actually follows a trend of racial bias. And so we can't really be in the conversation about intent; we have to be in the conversation about impact. And the third kind of I, because if you know me, I like to speak in threes, would be interests, right? So we have to have a conversation about whose interests are being served here, okay? Since we can't talk about intent, you can say that the intent is anything, right? Anybody can say, I intend for this to be wonderful. The impact then can be terrible. So then we need to look at whose interests are served. So I wanna ask this question, and I'm gonna break it into two parts. One, we have a situation where we're training police to discriminate, right? We are actually, literally training police to discriminate. And secondarily, we have a situation where they're looking for, quote, a needle in a haystack, is what you said. And yet the actual processes and practices create the haystack, correct? All right, so they could never find a needle, because they are creating a haystack that is unbearable. Gold nugget. Exactly, as big as a nation. And so I wanna ask you, Chris, to just stay on this question of law enforcement. 
One, how should we understand the interests, right? What interests drive this set of policies and practices? And then two, how can we begin to connect across different sectors and movements, so that the work that is happening in LA, for example, or the work that is happening in Oakland, where organizations are fighting to ensure that there's no fusion center built there, or any of the places where fusion centers are being opposed, how can we begin to connect the local and the national strategies in a way that actually is gonna make a difference? So, interests, and then what can we do to really begin to collaborate in a different way? Well, I'm gonna start in maybe a surprising way with the answer to interests. I think it's important, perhaps, to be sympathetic to your average law enforcement officer on the street, even your average government intelligence agent, as hard as that might be. Because when something bad happens, or even when something bad sort of might have happened, like when a plot gets thwarted, say, for example, the underwear bomber sneaks onto the airplane and the bomb doesn't go off because of existing security practices, there's still all this finger-pointing and blame, right? A great example of this is the terrorist watch list. The terrorist watch list is a million names strong, people who aren't dangerous enough to arrest but need to be searched before they get on an airplane, or can't be allowed to fly, or have some other potentially suspicious characteristic which we never have to prove in court, right? It's really your ultimate suspicion list. But from an incentive point of view, if I'm a local cop, nobody ever lost their job for putting somebody on the list. But God forbid that somebody doesn't put them on the list and they do do something, then you're done, you're toast, right? 
So there's such a pressure to have perfect safety and perfect everything. And you're displacing the harms onto the people that, in the past, it has been okay to displace harm onto. That's not okay, but it's also a function of a system that is so demanding of perfection that it refuses a realistic assessment of what the harms are and what society we're in. So in terms of uniting our communities and our work, I think this kind of discussion is really helpful: recognizing that it's the same thing happening in the fusion center in LA that's happening in fusion centers across the country, the same predictive policing popping up everywhere. And then next, starting to form common alliances, because these programs really are starting to be demonstrated as not working. There are people on the right, too. The author of that fusion center report that was so incredibly damning is Senator Tom Coburn from Oklahoma. Nobody's idea of a liberal, but he doesn't like wasteful government programs that are stupid, and he's very clear about calling them out. So you find those allies, and you make that case. I often say that if I could get rid of the terrorism programs that just don't work, 80% of my job would go away. It's rooting out the excessive money that's pouring in through grant funding to buy surveillance cameras and drones. So I think there are alliances to be made, but you're gonna have to make them on efficacy grounds. You're also gonna have to do what I think is probably the most critical piece of this privacy discussion, which is demonstrating harm. Too often people are told, oh, it's privacy, it doesn't really hurt anybody. So what? You know, I don't care about my privacy, I have nothing to hide. 
Which is just a loser of a statement, but I'll leave it alone. But obviously there are real harms here. Real, actual people are getting interviewed by the FBI and put in jail and kept from flying because of privacy violations. So we use that common ground. We use that reality. We bridge to libertarians. We bridge to people who care about privacy and technology issues, and we try to build a broader coalition that passes laws and policies that protect all of our interests. So as we think about this: right now, for example, the rate of incarceration of African Americans is higher than it was in the 1960s, during legal segregation, right? So we're in this moment when we have this fusion of data that is literally criminalizing people, increasing the rate of incarceration, creating all kinds of harms that we need to demonstrate. I wanna move from the question of law enforcement to the question of financial equity, right? So we have this fusion of data on the one hand, government data, and on the other hand, we also have a role that corporations are playing. And so I'm gonna turn to Sita, if you don't mind. I read another article in the American Prospect in which Virginia Eubanks suggests that poor and working-class Americans already live in the surveillance future. So first I'd like you to react to just that statement, you know? She says a bunch of stuff about what that means, but I'd like to ask you: what are some of the economic threats posed by big data? And do you think that big data is gonna help close or expand the racial wealth gap in America? I'll try to take all of those, but I actually wanna respond first by saying that, for me, I see a lot of through lines between the issues of government surveillance and the issues of corporate surveillance. 
And when I think about the through line, I find it really useful to come back to the work of someone named Oscar Gandy, who has talked about this concept of cumulative disadvantage. As we become more and more dependent on these kinds of statistical aggregations of who we are, and that information is then used to make predictions about us and determine our digital reputations, our digital destinies, there's a real threat that we become captured by that process. So in the case of law enforcement, we use predictive policing to target certain kinds of communities. And one of the consequences of that is it creates a heightened sense of insecurity, both physical and psychological; it creates a greater sense of mistrust within the community. We see communities falling apart. And that type of action can catalyze a stronger police presence, and so forth. And so the cycle continues. In the context of economic justice, I think the analog is that as people are targeted because of their income and because of their race, and it's determined that, oh, by the way, you can make a profit off of poor people through certain kinds of predatory products, that same pattern of cumulative disadvantage takes place. So in the case of the article that you cited, Virginia Eubanks was saying that the future is now for poor people. That future has been now for poor people for decades, for centuries, right? Poor people have been watched continuously, every movement that they make. They are also blamed for every problem that comes along with economic inequality, with income disparity. There is a cycle that makes it possible to keep people in that state of fear and that state of powerlessness, I think, within, or powered by, these new technological predictive systems. Now I'm forgetting the second and third questions, but I think you were talking about... Racial wealth gap. The racial wealth gap and how that's exacerbated. 
So I think it becomes easier with digital technologies, as you collect information about people, to kind of cordon off the type of world that you can access. And this isn't just when you use your phone or when you go online. This is when you're at the cash register at a certain vendor. This is when you're in the shopping mall. This is when certain businesses make their way into your community. There are all sorts of layers in which you become path-determined as a result of all of these algorithmically driven technological systems that we're increasingly reliant on. A huge problem that I've seen in the work that I've done, and I've spent a lot of time talking to communities that are trying to learn how to use technology and go online and become digitally literate, comes down to two concerns that I hear often: that there's no choice but to comply with the technological systems that we have, and that there's no way to redress problems. I don't know who to turn to when I see certain outrageous things come across my inbox or flashed at me on my phone. I don't know who to turn to when I feel like I've been incorrectly targeted or sold a certain product, or it's been determined that my insurance rate is gonna be this much as opposed to this much for somebody in my neighborhood or otherwise. And so that opacity and that lack of redress, I think, make it difficult to see out of that problem. Now, where are the opportunities? Because I think that was your third question. I think, similar to my co-panelists, that there are several opportunities. Part of what we're trying to do in events like these is really bring attention to what is going on behind the scenes: who's involved, who's responsible, who can we target? That is an important part of moving ahead. 
I do think that there may be some ways in which we can flip the script and use technology, or use these automated decision systems, or use the environment that we're in, in ways that are not expected by those who design them or implement them. So I don't know if this is a great example, but as you talk about the Suspicious Activity Reporting program, I always wonder: why haven't we staged a campaign where we inject more noise into that system as part of the process of pointing out how ridiculous it is? Similarly with law enforcement. I think this came up recently in the news when the New York Police Department said, post or tweet your images about the NYPD, and there was a huge rash of images that pointed out police brutality. That's another example of how you can flip the script. With economic issues and financial inclusion, I think it gets a little bit trickier. There are certainly good reasons why, for example, we should be sharing data within public assistance programs to make it easier for people to sign up for these services, stay in these services, and get back to where they need to be. But we want to make sure that kind of data sharing is done very carefully and very responsibly and with a very narrow purpose. That could be a huge benefit, and there are other examples of how big data can be mobilized in the service of social, racial, and economic justice, but we have to move those along. We have to really support that and use it as a countervailing force to a lot of the bad uses, the bad applications, that we see. So it sounds like, as you talk, Sita, about the question of cumulative disadvantage, you're raising that, on the one hand, there are some opportunities to use this data in ways that will increase equity, right? And then there are some ways that the data is being used that will decrease equity. 
It sounds like what hangs in the balance is who controls how it's used, right? We go back to this question of interests. And some of the ways that technologies are being used hide the mechanisms of economic inequality. So what I want to do is turn to Grace and ask you to talk a little bit about the specifics of Walmart. We've been talking about government intrusion, and we began just now to talk about corporate intrusion. What lessons can we learn from this very specific case, this very specific company, that might help us understand both what the harms are, but also what the potential is, for example, for workers to organize around some of these issues? What's the potential for consumers to organize around some of these issues? Give us a picture of what that could look like. Sure. So Walmart has made it a major priority to grow their online sales. Walmart is, as we all know, the largest private employer in the world. It's the largest company in the U.S., and what Walmart does tends to set the standard for other companies in terms of both employment practices and business strategies. So currently, investing in e-commerce is a major priority for Walmart. They don't want to cede too much ground to Amazon, or any ground at all, and the company in recent years has spent about $2 billion buying up tech startups and creating this Silicon Valley-based business. 
So we were able to look into Walmart's data collection practices specifically last fall, and we realized that Walmart probably has consumer data on as many as 145 million Americans, given its scale. We worked with a technologist to find out that Walmart is sharing consumers' online data, through its apps and through its websites, with more than 50 third parties. That kind of information includes every page and product that shoppers look at, unique identifiers for users and their devices, system information like your device type and operating system, and location information. These are the kinds of things that retailers and data brokers and other companies use to build profiles about us that we don't have access to, or any ability to opt out of or correct. One of these third parties that Walmart works with says that it has information about 80% of US email addresses, and the company has talked about connecting data from customer social media interactions to real-life transactions. And Walmart, like many retailers, maintains the ability to track people's locations and movements through stores. So we've been using this example to talk about what this kind of corporate surveillance means in the retail sector. We've been able to talk with workers about it too, and there are a lot of examples of Walmart workers raising really important concerns about the business model of the company and ways to improve Walmart's business that have been gaining widespread attention. So this is something that we're starting to weave into the conversation, and I think that it's really important, because Walmart has specifically called out its intention to dramatically increase its marketing to communities of color. 
The company believes that basically all of its future sales growth is going to come from communities of color, so that's who's really implicated in this large-scale data collection and targeting. And I would also just add that I really agree with what Sita was saying about how this isn't a world of government surveillance over here and corporate surveillance as something totally different and separate. Some of that certainly became clear with the Snowden revelations: it's all one world. And more recently it has become clear that Walmart is interested in using facial recognition technology in its stores. I thought this was some crazy marketing scheme where they'd recognize the faces of their customers, but they actually clarified in the press that they want to do this for law enforcement purposes. So, yeah, I just really agree that we're not all operating in separate worlds here. And then I would also add that we've been hearing some rumblings of stories from workers that they're being talked to by their bosses about their social media interactions. There's this other level of worker surveillance that we're really interested in getting to know, and in defending the rights of workers to come together for a voice on the job and speak up about their concerns. Thank you so much, Grace. So, as we kind of wrap up: whether we're talking about the rights of low-wage workers or the rights of low-income consumers, African Americans, Muslim communities, Arab communities, Latino communities, we're talking about a variety of people who are considered suspect, to be watched, to be tracked, who do not control the technology that is being used, and who simultaneously accumulate a disproportionate amount of disadvantage from that use. 
You know, it's a good place to end, right? As we think about the question of interests, as we think about whose interests drive this level of intrusion, this level of monitoring, this level of surveillance, it is a good place to end with Walmart. That is good. Thank you. So I will end by saying: we have a bunch of algorithms, a bunch of math, that is adding up to discrimination. And ultimately, it sounds to me like if the math adds up to discrimination, then we have to build a movement to change the equation. And that's what our work is today. That's what our work is going forward. We are actually over time, because it's such an amazing panel of brilliant people, but what I'd like to do is offer you the opportunity to ask one or two questions, if there's something burning, and then we'll wrap up. Does that work for you? Okay. I'll take the sister right here in the back, and then I'll come to you two. Hi, my name is Jamila Brown. I work for the digital rights organization Access. My question: I feel like it would be remiss if we didn't talk about gentrification and its role in big data, particularly with Walmart. Yet again, there are six Walmarts that are set to open in Washington, DC, which not only displaces the population but also exploits their consumerism. On top of that, I know from personal experience, because I've worked in public housing projects here in DC, that you have HUD and the DC Housing Authority working to get low-income people online. So I'm wondering, because it really spans all the areas of consumerism, surveillance, being pushed out, et cetera: what role is gentrification playing, not only in this city, because as we know it's happening nationwide? Sita, would you like to take that? So it's a great question. I don't know if I can speak so closely to the issue of gentrification, just because I don't follow that as closely as other issues, but I think it's an important point to raise. 
Certainly when new housing developments come up in Washington, DC, at least in my understanding of the issue, it does seem very tied to attracting a certain type of person and moving out certain other kinds of people, really using housing development as a way of excluding certain groups from being a part of the city. And insofar as public housing is concerned, there is, I think, a noble effort to provide access and availability to communications infrastructure, though there is a danger associated with that: oftentimes people are being brought online without the proper steps to make them safe in these environments, to keep them from being surveilled, or to let them know how they are being surveilled and what kinds of information is being collected about them, whether that's for public housing reasons or otherwise. So I do generally support this idea that we should have access to communications technology, but I think it has to be done in responsible ways, and I'm not sure I'm seeing that in the ways that urban development projects are happening around the country. I'm gonna ask Hamid actually to take this a little bit too. And if you can, speak to how some of this big data, some of the predictive policing, for example, is being used to make decisions about what gets built and what doesn't, and what kind of infrastructure ends up in some communities and what kinds of infrastructure end up in other communities, and so forth. So last year in July, the Department of Homeland Security released a memo. In that memo, they identified three low-level cases of arson in Vancouver, Grand Rapids, and Seattle, Washington State. And what they did, and it's posted on our website, stoplapdspying.org, is that they actually defined gentrification in that memo. 
And when you read that memo, they use gentrification to highlight the dangers of anti-gentrification work. That's what they did. And if you take the words "anarchist extremist" out of it, it reads like a definition of anti-gentrification that could come from any housing rights group. So in essence, the message they were sending is that it starts with low-level activity. They say in the memo that organizing, flyering, and rallying are a danger to this gentrification as these developments are taking place. And the last page of this four-page memo clearly states that suspicious activity reports should be filed if you observe these kinds of behaviors. So gentrification is key, and I'm so glad that you raised it, not only as a way of seeing how these databases get used, but how they get enforced as well. In downtown Los Angeles, starting in 2006, they launched the Safer Cities Initiative, which came after surveilling the Skid Row community, an unhoused community, in a 50-block radius. This was Bill Bratton, who was the chief of the LAPD. Within two years, and this is how displacement happens, they issued 36,000 tickets for jaywalking and urinating in public, 36,000 to unhoused communities. In essence, it reminds me of the underlying theme of SB 1070, the Arizona anti-immigrant law, which was attrition through enforcement. Every municipal code then gets used, which leads to attrition and displacement of people. So when we have these conversations about big data, and obviously thank you for raising the issue of corporate interests, it's not just about corporations: private individuals are now trained by the FBI to be terrorism liaison officers, who are linked up with the fusion centers as well. 
The Washington Post published a study back in July 2010 called Top Secret America, and in it they highlighted over 1,900 private corporations; we don't even know how many are there. But I think, and I know we're gonna end real soon so I'll stop right there, that when we look at this in the lives of communities around racial and economic justice, starting from the gang database, the war on drugs, the war on terror, the war on crime, the felon database, the probation database, the parolee database, and on and on, down to children as young as 10 years old because of gang injunctions, it's nothing new in their lives. This has been going on. So rather than feeling insecure and paranoid, communities are fighting back. And this is what we have learned, because the coalition was anchored out of Skid Row, learning to debunk this whole language of national security and to ask: what does it mean to be suspicious? What does it mean to be transgender on the streets of Los Angeles and to be considered suspicious and undesirable? So in a way, as we are building this narrative, there's a tremendous opportunity to really build a long-term movement out of this as well. Again: intent versus impact, and interests. I like that. I think we should take "attrition through enforcement" with us from that response. I saw this young lady in the very back. Hi, you actually just, there, I'll stand. You actually just touched a little bit on what I was gonna ask about. I'm really interested in the issue of gang databases and gang surveillance. I know that was really spearheaded and piloted in California, and it seems like more law enforcement agencies across the country are taking up what California's been doing and using it as a model. 
And I was wondering if you could just talk a little bit about trends that you've seen, specifically with these gang databases and gang surveillance, both on the state level and in how states and local agencies are increasingly sharing that data amongst one another, and what the consequences of being in that database might be for individuals who get kind of swept into that net. Sure, but actually, yes, tack on, and then do you want us to take that? Or can you take that? So related to that, I just wanna ask if you can explain specifically who these people or institutions are that are creating these gang databases and other similar kinds of databases. What do we know about them as well? In terms of the companies, okay. So one of the founding members of the coalition is the Youth Justice Coalition. If they're watching this, I wanna just honor them and their presence and kind of bring them with us. They're really the experts in dealing with gang databases and gang injunctions, so I'll just make an attempt based on my familiarity with it. Gang injunctions are basically restraining orders. That's what it really comes down to. The city attorney and the police go to the judge and get a restraining order. This trend started in California back in the 80s, so again, look at it as an extension of the war on crime and the war on drugs and what that meant. There are now documented cases where we're finding out that children as young as 10 years old have been going into these gang databases. As we speak, there are 46 or 48 communities in Los Angeles that already have gang injunctions. What's really interesting, and what ties to the gentrification piece of this, is that when you look at these gang injunctions, they don't start in communities where there is heightened gang activity. They start in communities which are affluent and rich. So it is again the displacement and weeding out of the undesirable.
And gang injunctions are used to completely displace them. They go into these communities and they walk in with 400 John Does. And then they have these indicators, just like the suspicious activity reporting program, where you'll be in a SAR database if you're taking a photograph, using video cameras, drawing diagrams, and what have you. There are these 12 or 14 indicators, and if you're wearing certain types of clothing and you meet two or three, maybe four, of those indicators, you are put into a gang database. There are stories of brothers from the same house who are put in gang databases. They cannot be together at all, so one goes to school at one time and one goes to school at another time. So this runs very deep. Again, kudos to the Youth Justice Coalition. There's a community called Echo Park, and this is the first time that the city attorney has been convinced to revisit a gang injunction in five years, so now the young people may have an opportunity to challenge whether they have been placed in a gang database. But beyond that, it goes with you all your life. And just as somebody was talking about earlier, if, after 1980 or 1984, you've been convicted of a murder or sexual assault or something, you cannot be employed, you can't get jobs and all that. So it is very deep, and it's really something that is impacting a lot of communities. But again, it's the youth who are fighting back, the youth who are really galvanizing and completely fighting back and gutting this. So as we wrap, the question that is before us is basically: are we going to allow our digital platforms to build a security state that is used to track, monitor, and enforce income inequality, essentially, right?
That on the government end, the security state is built through private databases owned by private companies and ultimately ends up being used to deepen and organize income inequality. Or are we going to have a situation in which that data is actually used to counter criminalization, to counter income inequality, to produce financial equity, and to actually move the representation of these voices to the forefront? That is a debate and a question. It's not a foregone conclusion. And the outcome depends largely upon our ability as a movement to move the questions of race, income, and class to the forefront of the conversation on digital privacy. And that is, in fact, where we'll end tonight. Can we thank our panelists, and thank all of you for being here with us. Thank you.