Good morning. Hi, and thank you so much for visiting us here at New America. New America is a non-profit, non-partisan public policy institute that invests in new thinkers and new ideas to address the next generation of challenges for the United States. I'm Kevin Bankston. I'm the policy director of New America's tech policy and tech development wing, the Open Technology Institute, which seeks to foster a more open society through more open technologies. That mission includes working to ensure that new technologies help to empower communities and to reduce rather than enhance the inequities in our society. As such, we at OTI are very proud to be signatories to the subject of today's discussion, the Civil Rights Principles for the Era of Big Data. Too often, discussions about privacy have failed to address the issues of justice and equity and have failed to include the voices of the civil rights community. These principles, which seek to remedy that failure and inject new voices into the privacy debate, come at a particularly critical juncture, just as the White House has initiated a comprehensive review of the way that big data will affect how we live and work, our relationship with the government, and the question of how to maximize the innovation and opportunity that comes from this data while minimizing the risk to privacy. We think that consideration of civil rights must be a core component of that review. Indeed, I and a number of the folks on the panel today will be attending a White House-convened workshop in New York on the ethics of big data on Monday to promote the principles we're going to be talking about today. That includes Seeta Peña Gangadharan, our senior research fellow, who has long been focused on issues of digital inclusion, data profiling, and social justice. Seeta's going to be moderating today's panel, and so I'm going to turn things over to her. Thank you. Good morning and welcome, everyone. 
As somebody who has spent a lot of time in the field working with and in communities, especially communities of color, I encounter many, many people who, like me, are excited about harnessing digital technology to improve our lives. We are excited about the power of digital technologies to help us achieve personal and collective goals. But there's power in digital technologies, power that can be inhibiting, power that can be limiting, and a power that reflects the choices and values of those who design, deploy, and disseminate our digital tools. However it is used, that power affects us. I think most of us would agree that no one wants to see digital technologies promote unfairness, unaccountability, and inequity. That is especially salient for members of underserved communities who, for generations, have had to deal with unfair data collection and analytical practices. And that's why we're here today. This morning I'm joined by five individuals with whom I've had the privilege of working for the past year or more on, among other things, the Civil Rights Principles for the Era of Big Data. We aim to have a frank and open conversation about big data, not only in the context of privacy, but equity. Let me introduce our panelists briefly. All the way to your left, we're joined by Corrine Yu, Senior Counsel at the Leadership Conference on Civil and Human Rights; Hazine Ashby, Legislative Director at the National Urban League; Chris Calabrese, Legislative Director, or sorry, Legislative Counsel at the ACLU; Jason Lagria, Senior Staff Attorney at Asian Americans Advancing Justice; and we do hope that we'll be joined shortly by Rashad Robinson, Executive Director of Color of Change. I imagine he's stuck in traffic, because I understand it was quite difficult this morning. Our panelists here represent vast social networks of citizens and consumers across the United States who provide thousands of data points in their daily digital interactions. 
Given how grounded all of you are in your communities, I anticipate that you will have many stories to share that go beyond the purely legal or technical discussions on predictive analytics. And on that note, we do want this to be a lively and interactive dialogue, so I'm relying on you, our panelists, to feed off of one another and to prompt one another, and I'll try to interject with questions of my own and also from the audience, both those who are here with us in Washington, D.C., and those who are tuning in by webcast. And we'll leave a greater chunk of time towards the end to field the audience's questions. Please do join us at the hashtag #DataJustice and post your questions there so we can follow along. So, Corrine, I'm hoping that you can get us started and tell us: how did these civil rights principles arise? Who is this group of signatories, and how did we get to this point? So I can only do that if I can figure out the technology of the mic. Is the mic on? Yes. Okay, so you can hear me. Great. And do you want to introduce Rashad, who's coming down the hall? So, thank you, Seeta, and I'm delighted to be on this panel and delighted to be sitting here with all my colleagues and friends. I think we're going to have a great discussion today. I just wanted to note that there are a number of important milestones that we're going to be commemorating this year or have already begun to commemorate. It's the 50th anniversary of the War on Poverty, of the creation of the Great Society, and of the Civil Rights Act of 1964. It also happens to be the 25th anniversary of the World Wide Web. So it's a very good time, I think, to look at the intersection of civil rights and online privacy, and that's what we're going to be doing today. Now, these are not the kind of anniversaries you celebrate with flowers and candy. They are anniversaries where we... Celebrate with panels. Yes, we do. And talking heads. Exactly. 
We talk a lot about them, and we look back and see how much progress we've made, and we look ahead to see how much is yet to be done. And maybe 50 years from now, when we've got trackers in our heads at that point, we're going to be looking back at this time and thinking about what was happening here. Now, civil rights groups have been thinking about issues of data and technology and privacy for a very long time. The collection of data, such as through the decennial census, is important for a whole host of reasons: for civil rights enforcement, for the implementation of federal programs, for redistricting, and it's a top priority for the civil rights community for that reason. Technology in the form of television brought the civil rights struggles of the 60s into the homes of Americans across the country, so they could see what was happening in the South, and it was a direct cause, a direct reason why we have a Voting Rights Act now. And as for privacy issues, well, these are issues that the community has been concerned about for a long time because of the long history of surveillance being used against these communities. So we looked at today's moment, including the White House review of big data, as a great time to bring groups together to think about these issues and determine whether there's common ground. And as you heard, two weeks ago the coalition of civil rights and media justice organizations came together to endorse the Civil Rights Principles for the Era of Big Data. And this is a historic step for many of the groups who came to the table to put forth these principles, a first step into the conversation about big data, and it's just the beginning, and we're looking forward to seeing what happens next. Now, our expertise is in civil and human rights, and we want to bring that expertise to the table, and I appreciate Kevin's comments about how these voices are not often heard as part of the debate. 
Our intuition was that there are broader issues in play than what kind of algorithm we're going to be creating. Not that algorithms aren't important. If I understood them, I would think they were even more important. But I do know civil rights, we all know civil rights, and whether we use the language of big data or privacy or civil rights, the questions are the same. So we're talking about criminal justice. We're talking about jobs. We're talking about financial inclusion. We're talking about what kind of society we want to be, whether concerns of equal opportunity and equal justice and fairness are going to be part of that society, and whether civil rights concerns of bias and discrimination are going to put us all at risk. So that is the frame through which we are looking at all of these principles, and that will, I think, help guide our discussion today moving forward. I wanted to start with the first principle, which is to stop high-tech profiling. Now, one of the major reasons why we wanted to address these issues is because big data has the potential to supercharge discrimination in ways that victims don't even see. And I think this issue of profiling is a great example of that. Now, racial profiling refers to the practice by law enforcement of targeting individuals not because of their behavior, but because of their personal characteristics. It's unconstitutional. It doesn't work. Civil rights groups oppose it. And yet it persists. It persists in the context of street-level crime, you know, stop and frisk, driving while black or brown. It persists in the context of counterterrorism in the wake of 9/11. It persists in the context of immigration enforcement. And now it's cheap because of technology. So we are seeing surveillance and profiling, high-tech profiling, done in a way that it's never been done before. And that is a cause for concern. 
So we have license plate readers that can record the cars parked at certain mosques and then track them thereafter. We have a debate that's still unfolding. We don't know where it's going to end up. We hope that our principles can inform that debate. So I think my own view is that you have a flawed practice when it comes to profiling. Why are we going to put it on steroids and make it even worse? And so our first principle, stop high-tech profiling, is something that I think folks in the civil rights community are very concerned about, and they are pleased to have discussions with those who can develop policy. I know that my colleagues at the table might have some additional comments they want to make about the profiling issue. So should we just open it up? No, I was just going to say, I mean, it's interesting that one of the things technology sometimes does very well, in a very troubling way, is to add a sort of air of impartiality to things. It's like, well, it must be fair, the computer spit it out. Well, garbage in, garbage out is an old programmer saying, and if you put bad data in, you'll get bad data out. So to put a concrete lens on this, the Chicago police announced recently that they have what they call a heat list: 400 people who are most likely to be involved in violent crime in Chicago. Now, it's not entirely clear what you have to do to be on the heat list, but it is clear that it's not just sort of your criminal background in the past. It's not just that you've been arrested, or that you've been convicted of assault, for example. It has to do with things like your associations and, you know, police interactions that may not have anything to do with being arrested or convicted of anything. So, you know, it's worth asking: who are the police targeting? What criteria are they using to make that target? How do you end up on the heat list? 
And if you are on the heat list, by the way, the police may come to your door and say, we're watching you, right? We're keeping an eye on you. Well, it's been my experience that a really good predictor of whether someone is going to be arrested for a crime is whether the police are watching them. You know, because if you watch me all the time, I'm pretty sure you may find me doing something wrong. I mean, it's not me, because I never did anything wrong, but other people do things wrong. So, you know, the focus itself causes potential discrimination. So the fact that the computer algorithm spit it out maybe gives it a gloss of being fair or reasonable, but that doesn't make it more so, and it's potentially relying on all the same biases and problems; the old data has just kind of been cleaned up by an algorithm. So that's, you know, that's one, and I think we can continue to elaborate on those. So on that point of fairness, because I think that's going to be a question that comes up for all of us here on this panel, I think that extends to some of the discussion that we've been having around fairness in automated computer decisions. And I'm wondering, Hazine, if you can take us down that route and explain what's at stake. Of course. So, and I want to back up for one second, too, and say that fairness is what brought the Urban League and, I think, the other four legacy civil rights organizations to the table, to looking at these issues. And I don't want to conflate the two, so I want to make sure that everybody understands that we are talking both about surveillance and data brokerage. So there is the government aspect to it and there's a commercial side to it. And it's because of that fairness. Like, for instance, the mission of the Urban League, right, is to promote economic empowerment, self-reliance, and parity for African-Americans and other underserved urban residents. 
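The feedback loop described here, where watching a group more closely produces more arrests, which then seem to justify more watching, can be sketched in a few lines of Python. All of the numbers below are hypothetical, invented purely to illustrate the mechanism; this is not based on any real police data:

```python
import random

random.seed(0)

def arrests(detection_rate, offense_rate=0.05, trials=100_000):
    """Arrests observed when each offense is caught with probability
    detection_rate -- a stand-in for how closely police are watching."""
    count = 0
    for _ in range(trials):
        # Everyone offends at the same (hypothetical) rate...
        if random.random() < offense_rate:
            # ...but only the people being watched tend to get caught.
            if random.random() < detection_rate:
                count += 1
    return count

watched = arrests(detection_rate=0.9)     # heavily surveilled group
unwatched = arrests(detection_rate=0.09)  # lightly surveilled group
print(watched, unwatched)
```

Both groups behave identically, but the heavily watched group ends up with roughly ten times the arrest record, and if those arrests are then fed back in as "risk" data, the disparity compounds.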
So you get to that issue of fairness when you start looking, one, at the surveillance that's happening by the government, and understanding that there's a potential for African-Americans and Latinos, who are overrepresented in our criminal justice system, for that surveillance to have a lot of negative effects on our communities of color. But then if you go over to the commercial side, you also realize that we don't know what amount and what types of information are being collected about our communities. So I love technology. I think everybody that I work with loves technology. I mean, if you had seen me this morning, you would realize that I am woefully, woefully directionally challenged. So I could not find my way without my MapQuest. I could not find my way without Google Maps. So I rely on those heavily. But because I rely on them heavily, that means that I know there's information being collected about me that I have no idea about. So I don't know how many people in this room got to either attend the Senate hearing in December or read the report that came out from that hearing. There's one quote that I really wanted to share with everybody, and this one is about the commercial side. I'm not even going to touch the government side right now; I'll let Chris from the ACLU really discuss most of that. But on the commercial side, they had a quote from one of the respondents, one of the data brokers. And this was the quote: the amount of available data has created an unprecedented amount of info about consumers, their attitudes and behaviors, perceptions about brands, what they're buying, and even where they happen to be at the moment the data is captured. I don't know about you guys, but I remember reading 1984 when I was in high school and just thinking how unrealistic that was. But the fact that it's not unrealistic... Looks like you didn't become a privacy lawyer. 
The rest of us were like, oh my God, that's the future. Exactly. Well, no, it's true. But just thinking about that, and looking at it in the frame that we look at everything through at the National Urban League, being about economic empowerment and jobs: if you look at it that way, you know that there is virtually no federal legislation that protects the information that data brokers collect about you, because those collections are not normally considered consumer reports and therefore don't fall underneath the Fair Credit Reporting Act. All of this can then be ascribed to you when you're applying for a job, when you're applying for a house mortgage or car loan, when you go to the doctor. I mean, I think it's Experian, right? Experian or Equifax responded to the query from the Senate for this hearing saying that they had 75,000 different sets of data points. That included what shampoo I bought, when my last OBGYN visit was, how many miles I traveled in the last four weeks. It included how far I lived from public transportation; for employment, you could consider what type of public transportation I take. Now, as a civil rights person, you know that redlining occurs a lot. We know that there are transportation issues. We know that there are certain things that are ascribed to communities of color that aren't ascribed to other communities. So I think that's why we're here, because we don't know. There's no transparency yet in the system, and we hope that these principles will help create some sort of transparency so that we can then help educate our communities about what's truly at stake for them. So I just want to jump in to say: if you can all hold in your heads for a moment what might be the flip side of big data, big data collection, big data analysis for civil rights principles, kind of thinking of the more positive aspects of data collection. Just hold that in your head. 
I think we'll get into a conversation later about that, but I want to give Chris the floor to take us deeper into that area and talk about preserving constitutional principles. Thanks. Yeah, no, this has been such a fun conversation. I mean, the ACLU sits in a really nice place in that we do a lot of civil rights work and a lot of privacy work. So we've been thinking about these issues for a long time, and I've frequently felt like they were sort of flip sides of the same coin. I mean, Hazine was talking about it very clearly. It's like, if I collect lots of information about you, I can make judgments about you, right? I can decide whether you're a desirable or undesirable consumer. I can rate you. I can give you better or worse services. To me, is that a privacy issue or is that a discrimination issue? I don't know. It's probably both, right? So to have these conversations is really useful. You know, when we look at the government side of it, I think that the problems that we've seen are, at least certainly to the ACLU, clear and compelling. The third principle, just to note it, is to preserve constitutional principles. This is actually pretty straightforward, right? I mean, the Bill of Rights exists essentially to protect minority viewpoints and minority rights. It's not the only reason the Bill of Rights exists, but it's one of the reasons. So we've seen a host of discriminatory data uses or surveillance practices that seem, at least from the ACLU's point of view, and I think from many unbiased observers, to be based on discriminatory or faulty premises. For example, when you collect the license plates of everybody who attends a mosque, surely not all those people can be guilty of anything. Or when the FBI engages in a practice of racial mapping, where they go into communities and attempt to map the entire community based on sort of broad or vague generalizations about the types of criminal activities they might be engaged in. 
No specific suspicion. Just let's map the entire community. Let's collect information. For us, that's a very clear example of the intersection of what we're talking about here, and then it plays out in really pernicious ways. I mean, I think the government watch list is a great example of this, right? For the one of you in the back who's not familiar with what a government watch list is, basically there are lots of different kinds. The government maintains a very large list of potential people to keep an eye on. One of the subsets of the list is the no fly list. So these are people who are not dangerous enough to arrest but too dangerous to let fly. So, you know, if I actually had any evidence about your wrongdoing, I would of course arrest you. But since I don't, I'm just going to kind of keep you from being able to, for example, visit your dying mother or, you know, work in your job which requires you to travel. And a surprising number, or an unsurprising number, of those people are Muslim. You know, again, if they had evidence to arrest those people, they would arrest them. But as it is, they can just put them on a secret list without a real ability to contest it or get off. We haven't really been talking about particular people, but I'll put a face on this. Abe Mashal has a specialized business in dog training. So it requires him to fly around the country and, you know, train dogs for particular purposes. Since he was put on the no fly list, it has put a significant crimp in his business and his ability to, you know, fly around and work. So, you know, this is a Marine veteran. This is someone who served his country. But because of his religion, he's on a no fly list and can't get off. Incidentally, FBI agents told him they would take him off the list if he agreed to become an informant for the FBI. 
So you can see how these pernicious practices play out, how they undermine our constitutional values, things like due process, the ability to contest the determination the government makes about you, and how they're carried out by technology. You know, decisions are made using data analytics. You go on a no fly list, which is promulgated through databases, which are then linked, and every time you fly you've got to provide your identification so you can be checked against the government's no fly list. So you can see how the technology spreads out. And so all of this intersection ends up disfavoring particular groups and communities. I've now been talking a long time; you can see that I got fired up, and that's what happens. But there are a lot of these, and I think that we can, you know, talk about the heat list. We could talk about COINTELPRO. We talked about racial mapping. I mean, the intersection of this is a deep and fertile terrain, and I think we're only beginning to explore it. Thanks, Chris. Rashad, if you could walk us through, I think, one component of the problem that Chris and Hazine and Corrine have been describing, which has to do with control. Great. Well, first of all, thank you all for having this conversation. I think it's incredibly important, and obviously timely, given the conversations that have been happening. At Color of Change, we've been involved in these issues around sort of technology and our rights on the internet since 2010, when we took a stand as one of the few national civil rights organizations around net neutrality. And in many ways, this work is an evolution of that for us. It's an evolution of us trying to find ways, using the voices of our members, to hold big corporations accountable and the government accountable for how we get to have a voice in our democracy. 
The internet and our ability to go online and access the tools of the day is no longer a luxury. You can't get an education, you can't apply for a job, you can't deal with certain government programs without being able to access the internet. And so the way in which the new economy works and is driven relies on our ability to be online. And with that comes all sorts of changes, or loopholes, to the civil rights protections that have been won, earned, and fought for over the last several decades. And so as we look at the Civil Rights Act, as we look at the fair credit rules that have been enshrined and fought for for generations, we're entering a new frontier where the rules are not clear, and the kind of predators, the villains, the heroes, the stories that we oftentimes tell ourselves in the way that we're able to do our work, none of that is as clear. And so let me talk you through this a little bit, and I'm going to try to see if we can get you involved. How many folks go online sometimes and you get an offer for something, a buy one, get one free, or an offer that if you sign up here at this site, they're going to, for me it's like, you get offers on clothing that you would get a little bit cheaper. How many people sometimes enter their information? Does anyone do that? Oh, come on. I mean, come on, y'all. Let's be clear. So we enter our information in that. And so maybe this room, because we're all here, we showed up to a conversation around big data at 9 o'clock in the morning, we're not necessarily the kind of target audience for my raising of the hand. But you don't even have to enter your information. I mean, if you walk into a store now, they have sensors the size of a deck of cards that'll ping how long you've been in the store and what you buy from your credit card data. So you don't even have to enter information anymore. And that's the part, because it's what's being collected about you offline and what that ascribes about you. 
I'm getting there. And so the point is, right, in the old era, you went into the bank and you gave them your information, and they couldn't then use that for predatory lending. Now, sometimes they did, and we could catch them, but there were rules against that. There were rules against going inside of stores in the past, giving your information, and that being used along the lines of race or other protected classes to allow the targeting of certain communities. What we're entering in this new era is third-party collectors, collectors that are not public-facing corporations, that sit in between you and maybe your Walmart or your bank or someplace else. And these folks are explicitly collecting data along the lines of race or other sorts of identities, and that is not completely against the rules. And so we've seen, for instance, that if you live in certain communities, and the store decides that the competition is much further away, they'll offer you a different price, in that case a higher price, than someone who lives in an area that has a lot of competition. And what you're seeing is that data being collected and being offered along the lines of race: explicit lists of black folks who are unemployed, or Latina women with children. And those lists are now being sold specifically to these corporations, and those businesses are able to use that to then target and make decisions. So our ability to participate fairly in this economy, our ability to get the same thing at the same place at the same time, which is really the hallmark of the work that was done for decades in the civil rights community, is really in jeopardy, because the new technological age really opens up so many opportunities for folks whose clear agenda is to make money. 
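The pricing pattern described here, a higher online price quoted where the nearest competitor is far away, can be sketched as a tiny rule. The threshold and markup below are hypothetical, invented only to show how a formally race-blind rule keyed to location can still end up pricing by neighborhood:

```python
def quoted_price(base_price: float, miles_to_competitor: float) -> float:
    """Price shown to a shopper, given distance to the nearest competitor
    (hypothetical rule for illustration only)."""
    if miles_to_competitor > 20:            # no nearby competition...
        return round(base_price * 1.15, 2)  # ...so quote ~15% more
    return base_price

# Same item, same retailer, different neighborhoods:
near = quoted_price(100.0, miles_to_competitor=2.0)   # competitive area
far = quoted_price(100.0, miles_to_competitor=35.0)   # no competition nearby
print(near, far)  # prints: 100.0 115.0
```

Because distance to a competitor correlates with where people live, a rule like this can reproduce neighborhood, and often racial, disparities without ever naming race.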
And that's not to say that that is necessarily a bad thing. But without protections, without a look at how we enshrine a new set of protections to deal with the technological age, how we advance in this time and place a new conversation around what are the do's and don'ts of this new information highway, then we are going to be looking, you know, at an economy that increasingly keeps certain folks locked out, and from a criminal justice perspective locked up, and increasingly keeps folks in the place where they started. And for us as an organization that started in the aftermath of Hurricane Katrina with a single e-mail that was sent to about a thousand folks, and has grown over the last eight years into nearly a million members, an organization that works specifically to hold corporations accountable and relies on everyday people being willing to go online and make their voices heard, to be able to click a button and share something on Facebook and to ask other people to join them so that we can move folks up a ladder of engagement to make their voices heard, not just in this new economy, but in this new democracy, our ability to rely on the internet to do what it's supposed to do and to protect our privacy is the clear civil rights issue of our time, and it's something that all of us need to be deeply concerned about and fight for. Thank you. So, Rashad has... Yeah, exactly. You've been talking about how we're locked out, and I'd love for Jason to talk about how we might be included, but still excluded, when we're inside of these databases or inside of these digital systems. Sure. 
Again, my name is Jason, and I'm with Asian Americans Advancing Justice | AAJC. We're a national civil rights group based here in Washington, D.C., and just for those who aren't familiar with the Asian American community, since we often aren't included in many of these discussions: we're 6% of the population. We make up 18 million people in America. And there's the fact that 60% of us are actually foreign-born. And one out of three of us doesn't speak English proficiently. And another great thing about the Asian American community, when we look at the tech adoption statistics, is that they lead the way in basically everything. They lead the way in cell phones, smartphones, internet adoption, broadband adoption, and the one I'd like to mention the most is we lead the way in video watching online by, like, over an hour on average. The principle I'm going to be talking about is, you know, protecting people from inaccurate data. So there are huge databases, and sometimes, more often than sometimes, there's inaccurate data in those systems. And a great example that's really been a thorn in the side of the Asian American community is the E-Verify system that's run by the government, which checks people's authorization to work in this country. As you know, you have to have authorization, especially if you're not a citizen of this country. But what we've seen from the E-Verify program is that naturalized citizens are 30 times more likely than natural-born citizens to get a false positive in the system. And also, legal non-immigrants, so for example, high-skilled workers who come to this country from Asia, are 50 times more likely to get a, you know, false positive. So why does this happen with this database? Part of it is that, you know, the database has to get updated. 
So if you're running through the whole immigration system, it's got to get updated, and that's updated daily, but there are delays in that. But, you know, there are also other parts of the system that just show the inherent errors in the system. So, you know, the database has first and last names, for example. So if you look at a Thai last name, which can be really long, all you need is one letter wrong, or an extra space at the end of the last name, and that could throw your record off. If you look at Hispanic last names, for example, people often have multiple last names. Filipinos do too, because the Philippines was once a Spanish colony. If you look at East Asians, their names are often switched, and people entering them into the system don't understand that sometimes, and that gets messed up. So that leads, again, to just higher rates of false positives in this system that disproportionately affect the Asian-American community. Along those lines, also, what this principle is about is appropriate ways to, you know, access this inaccurate information and also to address it. But what we've seen when people are trying to fix this inaccurate data in the system is that it sometimes takes over an hour just to even deal with a, you know, a certain government agency. A lot of times, I think like 13% of people who are trying to get this fixed spend over $100 just to get it fixed; sometimes people hire lawyers just to get it fixed. And for some people, $100 is a lot of money. So again, these databases are set up with these totally innocent fields, but, you know, given a certain population, it leads to this disproportionate impact. 
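The kinds of name mismatches described here are easy to reproduce in code. Below is a hypothetical sketch, not how E-Verify actually matches records, and the names are made up for illustration: a brittle exact comparison fails on a stray trailing space or on flipped surname/given-name order, while even light normalization handles both:

```python
def naive_match(enrolled: str, presented: str) -> bool:
    """The kind of brittle exact comparison that produces false mismatches."""
    return enrolled == presented

def normalized_match(enrolled: str, presented: str) -> bool:
    """Slightly more forgiving: trim whitespace, ignore case and the
    order of name parts. Real matching systems need far more than this."""
    canon = lambda s: tuple(sorted(s.strip().lower().split()))
    return canon(enrolled) == canon(presented)

# A trailing space, or East Asian surname/given-name order flipped,
# makes the naive comparison report a different person:
assert not naive_match("Srisuwan", "Srisuwan ")
assert not naive_match("Kim Min-jun", "Min-jun Kim")
# A human would call these the same person, and normalization agrees:
assert normalized_match("Srisuwan", "Srisuwan ")
assert normalized_match("Kim Min-jun", "Min-jun Kim")
```

The point is not that the fix is hard; it is that fields which look "totally innocent" encode assumptions (one first name, one last name, fixed order) that fail disproportionately for some communities.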
So again, that's just a government example. But if you're looking at fixing inaccurate information: who's ever had to fix their credit report? Luckily, mine's okay. But we've heard tons of horror stories about people trying to fix their credit. Now imagine how that looks once we start talking about big data, with people trying to fix their data in some huge, completely opaque private data system. We haven't gotten to the point of figuring out how we're actually going to deal with that. But it's a good discussion to have. And I was going to say that there's not even a right for you to fix your data. Yeah, for some of this kind of stuff there are laws, but at this point I'm not sure there are laws that specifically cover it. No, there are no laws that specifically cover fixing the data that's being collected about you outside of your consumer report. For your consumer reports, there are. There are some protections, because right now it's self-regulated: if a company wants to give you the opportunity to correct your data, it can. You can even opt out, but you don't even know who the collectors are. You can opt out if you go find more information about it, but how many people actually have the wherewithal, or even the savvy, to understand how to do it? And even when you do opt out, they're not obligated to delete the information they've collected about you. So there's just a lot here, and one of the main things I want everybody to take away from this panel is that it's about education. We need to have more information about what's happening.
It's only when you have that information that you can really push and understand what's at stake for our communities. Without that information, and without education in our communities, we run the risk, at a time when we need economic growth more than ever, of not having it. Yeah, to make this really concrete: people are segmented in ways that would surprise them, make them very uncomfortable, and harm them. If you know somebody is in financial distress, that they're three months behind on their mortgage, boy, you can really get in and sell them some product that's supposedly going to magically save them from the financial harm they're experiencing right now. Now, that is illegal using, for example, a credit report. But if you gather the same information from a data broker, you're skirting that law. And this is not hypothetical; the Senate Commerce report that's been referenced is very clear on this. They're creating lists of struggling seniors based on racial and other characteristics. So I know you're struggling. I know you. I know your particular circumstances, I can market directly to you, and I can use the knowledge I have about you to figure out how to take advantage of you. That is a huge power disparity that, without access, is impossible to remedy. I just wanted to underscore something that you said, Sita, and that Chris just alluded to, which is that these issues affect all of us. We've been focusing quite a bit on the impact on different racial minorities, but the Leadership Conference is a coalition of more than 200 organizations that represent basically America: not only racial and ethnic minorities, but religious minorities, the LGBT community, seniors, young people, women, labor.
And you'll see as you look at the principles that they embrace that: it's underrepresented populations that need to take this into account, but it's really all Americans, because of that very fact. You just don't know, and I think the crosscuts would be very startling if you were able to ascertain them. So I just want to make sure folks understand that when we talk about civil rights, we're not talking about this group or that group. These are fundamental American values of equal opportunity, justice, and fairness that we're all concerned about. In the Senate report, they listed some of the titles the companies gave to different groups of people, and as I read it, I wondered where I would fall. Some of them were: Rural and Barely Making It. Ethnic Second-City Strugglers, wonder what that one is. Retiring on Empty: Singles. Tough Start: Young Single Parents. Credit Crunched: City Families. It's just interesting. What do your reading habits say about you? What does the fact that you go to Whole Foods, or stop going to Whole Foods one month but now go to Safeway or Giant, say about you? What does it say that I work at the Urban League but then go home and read tons of historical romances? I want to know. I want to see that data, and to see what attributes are being ascribed to me. So I want to jump in here with a question, because I promised I would, and I want to challenge you to think. Everyone has talked about how data collection can segment us, harm us, make inferences about us, and make money off of us. I want to think about ways in which big data collection and analysis can be used to enhance civil rights. I'm wondering if you've given any thought to that and can talk about it. Yes, go ahead.
We're about to head into an election cycle, and anyone who does electoral politics knows how we use voter files, how we try to target various communities and move our message to them. Many of these questions around how folks are reached during election cycles and issue campaigns, whether by a political party or by third-party organizations, sit on the same data frontier. These are important questions we have to be asking ourselves as our communities become more segmented. As someone who specifically works to empower black folks, and especially as a next-generation civil rights organization, I look at what the housing crisis did to black communities, and the fact that communities are becoming more dispersed. Simply dropping someone off in a black community to knock on doors is not the same as it was 15 or 20 years ago. So the question of how we reach people, find people, and target them with a message to turn out the vote means we're thinking about some of the same things Walmart thinks about when it wants you to come into its store and buy a product. That does not mean there should not be protections, though. It does not mean we don't need to push back and win protections that will ensure our privacy online is protected, and just as importantly, our civil rights. And we will see over and over that if we think of the good that can be done through our targeting and our message development as equal to what big, powerful corporations are able to do with our data, to advance an economic agenda that keeps those at the bottom at the bottom and gives those at the top more and more.
If we think of those as equal, then we are really missing the point of where we're heading in this new age. And one of the things... I was just going to say, one of the ongoing discussions we always have at the ACLU is about data collection protecting our rights. There are clear disparities, for example, in education: how many racial disparities there are in kids being pushed out of schools because of their race. We need that data, and it's not a question of spying on those kids; it's a question of spying on those schools, right, and making sure those administrators are doing what they should be doing. By the same token, we've worked a lot of those things out, too. If I'm looking at student test results, I don't need to know the specific kid. We want those posted online, at least in aggregate, so we know which schools are doing a good job and which are not, and some of the racial characteristics of the test results as well, but I don't need to know individual students. There are ways to provide information, to require data collection, that don't in turn produce some of the harmful effects we've been discussing. So I think there are ways we can do this that make sense. Since we're talking about the benefits of big data, I want to raise an uncomfortable question, not exactly a benefit, and see if you think it's interesting. What do we do when the information, the big data analytic, is accurate but unfair? Here's the example. I'm the Chicago police, and I'm creating a heat list, which I brought up before. It's based on lots of characteristics, but let's say one of the main characteristics is who your friends are, right?
So I can look at your Facebook page and say: if three people in your Facebook community have been shot, there is a better chance you will be involved in violence than someone with no one on their page who's been involved in violence. We're not saying it's a guarantee or even a likelihood, but it's more likely than for someone else. Now, maybe that's an accurate result. Is it a fair one, though? Do we want the police, or the government in general, to be judging us based on our associations? Should I be worried about who my friends are because the government might direct more scrutiny at me? That analytic might be right, but is it fair? I think that is a very difficult question we have to grapple with, because some of the analytics that companies create will be accurate. They may know they can get me into Target with a $5 coupon, that someone with less money than me will come in for a $1 coupon, and that someone wealthier than me might take a whole $10 coupon, and they know that because they know how much money I make. That's an accurate analytic. Is it a fair one? Are we worried that the poorer consumer is getting even less economic benefit, getting the worst coupons? I think somebody at that end of the spectrum would say: jeez, I'd really rather everybody get a $5 coupon, or even a $3 coupon, than be the one who gets the 50-cent coupon because they know how much money I make. So it's not all about accuracy. It's also about fairness and justice, and I think it's that kind of problem that drew a lot of us to this. I want to give Jason the opportunity to jump in as well, and while Jason is talking, please prepare your own questions, including those of you joining us by webcast.
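[Editor's note] Chris's earlier point about publishing school results in aggregate rather than per student can be sketched as a group-by with a minimum-cell-size suppression rule, a common disclosure-avoidance practice. This is an editorial illustration with made-up data; the threshold of 10 is an assumed value, not any agency's actual rule:

```python
from collections import defaultdict

# Hypothetical per-student records: (school, race, passed_exam)
records = [
    ("Lincoln High", "Black", True), ("Lincoln High", "Black", False),
    ("Lincoln High", "White", True), ("Lincoln High", "White", True),
] * 6                                              # 12 students per large group
records += [("Lincoln High", "Asian", True)] * 3   # a group too small to publish

MIN_CELL = 10  # illustrative suppression threshold

def aggregate(rows, min_cell=MIN_CELL):
    """Publish pass rates per (school, race), suppressing small groups."""
    groups = defaultdict(list)
    for school, race, passed in rows:
        groups[(school, race)].append(passed)
    report = {}
    for key, results in groups.items():
        if len(results) < min_cell:
            # Too few students: publishing could identify individuals
            report[key] = "suppressed"
        else:
            report[key] = round(100 * sum(results) / len(results))
    return report

print(aggregate(records))
```

The aggregate report reveals the school-level disparity (a 50% versus 100% pass rate here) without exposing any individual student, which is the distinction Chris draws between watching schools and spying on kids.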
I'll talk a little about Chris's point and then go back to your initial question, and it's definitely a tough question. Everyone I talk to about this issue asks where we draw the line between something that's actually beneficial and something that could potentially harm the communities we represent. Thinking about this, I always felt there are already things we can look at, existing models of where we want to make sure there are protections: we already have laws on fair housing, fair lending, equal employment, and educational access. That's a start for looking at where this data can be used for good or bad, or both. Going back to your initial question about Asian Americans: we are the first group to say that data is great. If you just say "Asian," that doesn't really help our community, when we trace our ancestry to 30 different countries and speak over 100 different languages and dialects. Take the Voting Rights Act: if a jurisdiction has a certain number of people with limited English proficiency, it has to provide voting materials in their language. Or take media diversity, where there aren't a lot of Asians represented and not a lot of content that's linguistically or culturally relevant for us: maybe it is good for a cable company, or whoever, to have data on your ethnicity and what language you speak, so that we could actually have content that's good for us. So we are the first to be out there saying that data can be good, that some practices are okay even though protections are still needed. It's a balance we have to strike. Let me jump in there. I'm going to challenge you on being the first to say the data is great; I do believe they used social science data in Brown v.
Board of Education. So I think data has always been amazing, and we love data. Data is what companies use right now to make sure they meet their diversity numbers for employees. I was talking to a company recently: collecting data on their hiring and interview processes, and analyzing it, showed them the difference between their pool of applicants and who they actually hired, and they were able to change that by looking at the data. That's amazing stuff. On the employment side data can be great; on the consumer side data can be great. But going to what Chris said about fairness, and how you really align fairness with the data that can be collected, I would say this, Chris: I think it's a lot easier on the consumer side, and even on the employment side, to make that delineation. On the government surveillance side, I think it's a lot harder. I remember exactly where I was on 9/11. It was 9:03 when we found out; my civil procedure class had just started at Howard Law, and we did not know what had happened. The police came, the Park Police came, and told us we had to leave and go somewhere. We tried to get to the Metro; the Metro was closed. By the time I got to a TV, almost an hour later, I was just in time to see the first tower collapse. I just remember that feeling I had: wow, I hope people got out. And knowing, I mean, I can feel myself tearing up about it now, knowing that everybody could not possibly have gotten out of that building. I think that's where we are in our society right now on the government side. Most Americans remember that feeling. You know what their aim is, and it's just not as easy to say that you have to stop it, or that you shouldn't
have that information. So I completely agree with things like ECPA reform: it's been a long time, since the 1980s, since we've had any legislation that actually protects email communication or any electronic communication. I'm looking forward to seeing what the Supreme Court says in Riley and in Wurie. I don't know if you're familiar with them, but those are two Supreme Court cases up for review right now about whether, incident to a lawful arrest, the police can search your cell phone or smartphone. I don't think they should be able to. I know how much information I have on my smartphone, and I definitely wouldn't want the police to have it. I don't think I do anything illegal, but I still don't want them to have that information. Is that what was intended with the Fourth Amendment? Should they be able to do that? When the courts said yes in the 1970s, they were thinking about things that could actually be harmful to police officers; is a cell phone harmful to a police officer? So we'll see. When you come down to those issues, they're some really difficult legal questions, because they have such large implications for the safety of our communities, the safety we all want as Americans. Can I just, okay, and then I really want to open it up to the audience, since we're a packed house. This raises a really good point, something I think many Americans intuitively feel, because they want the government to protect them. And it raises something we haven't talked a lot about yet, which is that there's a fair amount of analytics floating around at this point that don't work. There's a real snake oil problem we need to look at here, and it's particularly pernicious on the government side. I
mean, there's a data mining report that goes all the way back to 2008. All the way back to 2008, I know. But basically it found that data mining has no predictive value for finding terrorists; terrorism is just too small and varied a thing to be pulled out by big data analytics. Similarly, looking at the NSA scandals, we saw the Privacy and Civil Liberties Oversight Board and the President's hand-picked advisory board both say that this Section 215 NSA phone records program, getting everybody's phone records, has not actually helped solve any terrorist crimes. So there is a safety issue, but before we get into the privacy questions, we should be using tools that work; then we can decide whether we need to balance privacy and safety. I had to get my pitch in. So I want to open it up to the floor. Hazine wants to chime in first. Can I get a show of hands of those of you who want to ask questions? Great. Because my question today was going to be: how do you know what's working when you don't even truly know? We sit around this entire realm and talk about these positions all the time. But go on. I just had to say it, thank you. Hi, good morning, and thank you all for being here. I wanted to bring it back to the point Chris was mentioning earlier about using all the data that we have and making the best use of it. In my experience, I've seen a lot of federal and state governments not collect the right data, so I was wondering what insight you all have about promoting more accountability and transparency around really collecting the right data that we need. I think the principles in general are meant to encourage government and private entities to develop and use data in a way that promotes exactly what you asked: equal opportunity and equal justice. I think we have not, they're pretty general right
now, because we're still working on more research about what's behind the opaqueness of the data being collected on both sides. When we get to that, we hope eventually to have deeper recommendations, but because it's such an opaque process, it's hard to say exactly what we want to do. I do know that we all support ECPA reform on the government side, protecting electronic communications and making sure that the privacy laws we do have take into account the technological advances that have happened. Just to add from our experience: I think before 2000 the census category was "Asian American," and before that it was something like "yellow" or "Oriental." We definitely have a lot of experience educating, at least on the government side, about the need for the right kind of data. If you're just recording "Asian," for example, and not collecting ethnicity, you're not really doing a service to a lot of communities that have very specific health problems, or that have the highest rate of poverty of any racial or ethnic category, like the Hmong community. So at least on the government side, there are a lot of advocacy groups pushing to have that data disaggregated. On the corporate side, though, it's a different story. They're collecting data for themselves; for us, it's been about showing why it makes business sense to collect the right kind of data. If you're collecting data and it's not going to make you money, it just doesn't make sense for a company to do it, so we teach them why it's important, and profitable, to have the right kind of data. We have a question up in the front here. Hi. I guess this question responds partly to something Hazine said, but it's really for the whole panel. You talked partially about the opacity of the data, and you wanted
to talk about a transparency principle. They've collected that you work at the National Urban League and that you read historical romance novels, and what does that mean about you? I think there's a problem in that the data brokers themselves don't know what that means about you, because the whole process is done by machine learning. So I'm curious whether you've thought about what a transparency principle would even look like, because seeing the raw data they collected will not in any way tell you how it's being used. How would we find that out? Yeah, I'm sure you have some ideas too, and I agree it's a good question, but in some ways it's an edge question, because, first of all, we're not getting anywhere near all the data points. Acxiom says they have something like 1,500 data points; trust me, I looked at Acxiom's transparency report, and it did not give me 1,500 data points. And then there's a second tier of opinions about me. Now, they may not be able to give you every evaluative judgment, but they could tell me, for example, what lists I'm on, who I've been sold to in the last month, how many companies, how I was categorized, things like that. Or the types of ads I'm being served, and whether they're sharing my data with a third-party company that's using it to inform the ads I see. There's a huge amount of transparency that could happen beyond what we have now. And I agree with Chris: it's not just finding out what they have on me and what it says, because obviously those are just two random examples, although those two things really do say a lot about me. Until we start getting to see all the data points, and especially who they're selling the information to, we can't really drill down and get a little
bit more nuanced in our recommendations and know exactly what more transparency we need. For instance, one example sticks out in my mind, because I'm at the Urban League and we talk about jobs, and African-American job numbers have been extremely low for a long time. A month or two ago there were a couple of articles written, I think one in the New York Times, about ways in which employers are using predictive analytics to decide whether somebody will manage to work there successfully. My problem with that, and I don't know all the questions being asked, but I do know what I would want us to dig a little deeper into: one of the questions they asked was how long it takes you to get to work and what sort of transportation you take. Last year the Brookings Institution did a whole piece on transportation and jobs, and how jobs are leaving the urban areas and the central business districts. So if you're a low-income person and you live in an urban area: in a lot of our urban areas, like Chicago and Atlanta, over 60-some percent of the jobs were 10 to 35 miles away from the central business district, and Detroit, which everybody knows just filed for bankruptcy, is the highest at 77%. With a large population of African Americans in these areas, knowing that that's a question that came up, I would love for us to get more transparency about it. Because if I know that the people in the communities we serve don't have access to great transportation services and are likely having to travel 90 minutes or more to their jobs, and that this is a question that could potentially keep them from getting a job, that is of interest to me. I say all that to say that I
don't know exactly what transparency we need; I just know that we need it. As we get more information, we'll be able to dig down a little deeper and say: this is what we need. And I think it's going to be an ongoing process. First, I know you have something to say about this, especially since your organization has been involved in doing research on corporate collection of consumer data. Yes, we actually do, and I actually raised my hand to address a point that I've been stewing on, trying to figure out the right way to say it. You brought up a point, and I know you may not have meant it this way, Hazine, but I don't want us to leave this room with that 9/11 moment as an example of why it's okay in this new era. I didn't say it was okay. I know, I know. I did not say it was okay. But "hard balance," right? A hard balance doesn't mean that when the seesaw moves, it becomes okay at times. And that's what I want to say: whether it's 9/11, whether it's police surveillance in inner cities, these are issues where certain communities become the target, where oppressed communities are put in harm's way by powerful forces and then don't have a voice to stand up and push back. I know exactly what you're saying, and that's why I struggled with even bringing it up. But I do know that when we have these "hard balance" conversations, the seesaw always tilts toward putting oppressed communities in harm's way, at the expense of these ideas of safety, which oftentimes don't ever pan out, and of protection and security for folks in Nebraska who are worried about 9/11 happening again. These are
sort of the ways in which these discussions oftentimes get railroaded away from how we ensure our civil rights protections are secure. It's also easy for us to put aside our civil rights and human rights in times of stress and challenge in this country, which are exactly the times when we need to elevate and advance them as a people. Those hard questions are exactly the questions people should be asking, and that's why we have the principles: we want to be part of this conversation. We may not have the answers, well, that's one of the things we're saying; we're definitely saying more than that. One of the things I'm hearing, and this comes back to a point Karine was raising earlier, is that this is a conversation we want to have with everyone, because these issues really affect everyone. The question about transparency is very difficult to tease apart, especially when you get down to the nitty-gritty details, but we need to have that conversation in the context of thinking about discrimination and fairness. I'm going to ask Kirsten, if you could go to the back; we have a question all the way in the back. Chris, were you going to respond quickly? No? Okay. Hi, I have a couple of questions. First, I wanted to know whether any of you think there's any possibility for legislative action on dragnet surveillance, the PRISM program, and the telephone monitoring programs. During the Bush administration, the Democrats were heavily critical of those programs, but ever since the Obama administration came in, it has kind of normalized the process and removed it from
the political debate. So I'm wondering if there's any potential you see for progress there. And then I'd also like to know whether there's any potential judicially, following the two federal judges who struck that justification down, and what the track record and timetable for that is. Those are in my wheelhouse, so I'll take them in no particular order, because I'm not sure I remember the order. I think there is a real possibility of legislative change on the Section 215 program, if for no other reason than that there are sunsets on some of those programs, so Congress is going to have to address them. I suppose they could do a blanket reauthorization, as they have in the past, but that seems less likely given the disclosures of the last nine months. As to what change will look like, I think we don't know; the USA Freedom Act is certainly the ACLU's view of the appropriate way to end the 215 program and continue to safeguard both our privacy and our civil liberties. In terms of court action: we have a lawsuit, so I'd better say that there's a realistic possibility of winning things in the courts, or I will be in a lot of trouble. But I do think, we've already seen one federal judge agree with us and one federal judge disagree, so that's at least a puncher's chance, right?
And I think we'll eventually hear from the courts one way or another. In terms of normalizing things, I agree with your point. It's really unfortunate how many programs have continued under the Obama administration. It's not just warrantless wiretapping; it's drone assassination; Guantanamo is not closed. There are huge post-9/11 issues that remain unresolved, and both parties will have to deal with their legacy for many years. But I'm a strong believer in the pendulum swinging back and forth, and, if I can mix my metaphors, in the arc of history. I believe that some of these programs, all of these programs, are fundamentally undemocratic, fundamentally unfair, and wrong, and I think the American public and American society will come to embrace that view, and we will see an end to many of the worst of these practices. I want to relay a question that has come through the Twitter feed, a question about young people. This is a question for everyone, but I know that Color of Change has a very young and engaged membership. And you can call me a young person anytime; it's not happening anymore. The question, or the statement, was: I'm pretty sure that young people today don't care about data brokers. I'm wondering the extent to which that's true in your local chapters or among your younger membership; how are young communities of color thinking about these issues? Young people care if they're being unfairly targeted by the police through programs like stop-and-frisk. They care if there are surveillance issues happening in their communities that target them unfairly, or that increase issues around the school-to-prison pipeline. These issues around third-party brokers are new issues, and we're also in an age where millennials and younger generations have grown up with the internet, used to signing up for Gmail
or email anymore and having ads that are targeted towards them and that's just been the frontier in terms of how people have sort of experienced the internet and have experienced technology that the idea of everything from applications that you put on your phone and even when you delete them they actually never really go away the idea of saying what you want to say on twitter or sending what you want to send on snapchat the ideas of sort of holding back information is different for this generation but the concepts of how people should be treated the principles of fairness the principles of being able to sort of roam freely and make your voice heard this generation of not being judged by sort of characteristics of how you were born this generation is much more open than other generations and so the idea of the impact that these third party brokers could have on sort of fundamental principles that millennials and the generations that are sort of after them sort of have believe and live this is the work of organizations like all of ours up here to sort of translate these issues and give people something impactful to do but the idea of saying the next generation does not care about their rights to be able to move freely in our country and be heard and be treated with equality is not true and the fact that this generation like generations before them will stand up to forces that sort of violate their fundamental values and beliefs Jason you wanted to add something one thing that is cool that we've done as a coalition is we've actually done polling and we've sat behind the one way mirror and listened to people talk about these issues and the things that I took away for them were like people know it's happening but they don't know the level of it and what exactly they're doing but the other part is there's a lot of apathy or there's just helplessness that they feel they know it's happening but there's nothing they can really do about it and I'm really looking forward to 
having these principles so that, for me especially, I can go into my community and say, hey, let's have a talk, let's see exactly what you know and what you don't know, let me educate you, and let's see where we all come out on this.

And I would say that the polling and the focus groups we've done are exactly the reason I think we need to continue to educate our communities. Going back once again: I did not say that it's ever okay to use surveillance against oppressed communities, and I want to make sure that's clear, though I didn't see that on the Twitter feed, by the way. But this is one of the things we found. In our focus groups, we did them by race and we had people of different incomes, and you could watch and hear what people were saying. The thing is that even when you know it's being used against your community, there is still that extreme tension, because we just don't have enough information about what's going on on the government side. Until we get that information, you can't really reach people's feelings. You need data to help people understand that the emotions they have connected to 9/11 don't exactly translate to what's being gathered on them. Until you can educate them, because we've received transparency, we can't do our jobs as communities who work to empower our communities of color. It's just hard.

We have a question up in the front, and then I see somebody with a purple sweater back there. And if you could start thinking: we wrap up at 10:30, so if there's a burning question that you think has to be answered, one this panel can't leave the stage without answering, we'll try to answer it.

Hello. Okay, I saw two things. One was a TED Talk
by Eli Pariser called "The Filter Bubble," and the other is a documentary called Terms and Conditions May Apply. My question relates to the way in which I receive information, and how it looks different for me than it does for other people. I'm sorry, pardon my ignorance, but with Facebook and Google, are they third parties? Are they the ones interpreting what I see? Who is?

That's a tricky question. Google is both a first and a third party. They're an advertiser that collects a lot of information from other people, and they are also collecting a tremendous amount of information about you directly, obviously, as a first party. So they are making judgments, but again it's difficult to know. Google has some transparency tools that allow you to know more about yourself than, for example, a data broker certainly will tell you, but it's not clear yet exactly how much information is out there. I wouldn't call them a data broker, though, because they don't sell data. They're not one of the companies that collect data and then sell it or give it out. They might collect it from other people, they might get it from other companies, but from what I understand, they do not sell data.

Do you want to pose your question? And then I'm going to ask our gentleman in the purple sweater to also pose his question, because we're running out of time.

Just one thing: what about something like a Ghostery-type add-on? Does that do anything?

Well, it certainly will prevent you from seeing ads. I think it would prevent some third-party collection, though I don't know that it prevents all third-party collection, and I don't think it would do anything about first-party collection.

Yes. As the person here who works with the Open Technology Institute, I'll just say very briefly that it alerts people to the type of tracking that's happening, and it's one tool in the toolbox that I
think is available to us as consumers and citizens to understand how tracking is happening and what we might do about it. I think what our panelists are talking about, though, is that there's more to it than that. There are ways of engaging communities, ways for our civil rights organizations to engage with communities to help grow awareness and talk about these issues to a greater extent. So it's one tool in the toolbox. And our gentleman in the purple sweater?

Glad I wore this purple sweater. Thank you for taking my question. I want to share a quote I heard, which is pretty great. The credit does not go to me; it goes to Dan Ariely, who's a professor at Duke. He said big data is like teenage sex: everyone is talking about it, nobody knows what they're doing, but everyone thinks everyone else is doing it, so everybody claims that they're doing it. I feel like that's fairly accurate, because the conversation, even among people who follow big data and get that joke, is not very deep or sophisticated. One of the questions I had for you is about data scientists. I was curious to find that there's not a data scientist on the panel. I think that's an important perspective, because as algorithms become the gatekeepers, in a sense, in making these decisions, I don't think any of them would argue that there aren't assumptions they're making. So I guess my question for the panel is: how are you getting that conversation going with the other side of the house, the data scientists? And very quickly, two resources I think haven't been mentioned which I think are important. One is the Council for Big Data, Ethics, and Society in New York; I don't know if you know about it, but the Data & Society Research Institute is a very solid new initiative. The other is a paper that just came out from Washington College of Law called "Big Data Ethics," a very good piece laying out the principles. If people haven't seen those, I encourage checking them out. Thanks.

Well, I'll just
say very quickly: great question. I think this in many ways is an invitation to have that discussion. I will certainly say, just speaking for myself, that since the principles came out I've heard from quite a few people, and we've been involved in an ongoing discussion with the broader privacy community about these questions and about these principles. So I think the short answer is that we are very much aware there's an enormous amount of complexity. I think we've surfaced some of it here, and our hope is that these principles will be part of helping us educate each other and our communities about what the best practices and the tools are, so that you can, for example, tell a good algorithm from snake oil, and then decide, even if it's a good algorithm, whether we think it's the right algorithm, or a fair one.

I'll just say briefly, and I think this might be a nice way to wrap up, that the conversations we're starting here are conversations that are going to travel far and wide over the next year and the next few years. As Kevin mentioned at the beginning of this event, some of us will be participating in the White House's comprehensive review of big data, where we anticipate these questions of discrimination and fairness will come to the fore. So we're beginning to have those conversations with the data scientists and with some of the legal scholars, like Deirdre Mulligan, Cynthia Dwork, Neil Richards, and Kate Crawford, those kinds of individuals who have initiated some really good thinking about these issues, and putting that directly in conversation with the experiences, the stories, and the history that all of these civil rights and human rights organizations are familiar with and deeply engaged with. That's the kind of conversation we look forward to moving ahead. So thank you so much for joining us, and please join me in thanking our panelists.