Okay, the microphone, yes it is, cool. Hello everyone, thank you very much for being here. My name is Eric Null. I am senior counsel here at Open Technology Institute. On behalf of myself and my co-sponsor, Brandi Collins-Dexter from Color of Change, who will be moderating today's panel, we're so grateful that you could join us for a discussion of centering civil rights in the privacy debate. And we hope you can join us after the panel for a reception. Every time we talk about privacy protections, we should be centering civil rights concerns. The right to privacy exists to, among other things, protect against unfair and inappropriate uses of personal information. And personal data is the raw material that fuels today's discrimination. So civil rights and privacy are directly linked, and we'll hear a lot more about that from today's experts. But before we get to our panel, I have the distinct pleasure of introducing our opening speaker. Many of you likely already know her or know of her, but Francella Ochillo is the vice president of policy and general counsel at the National Hispanic Media Coalition. She's been an active leader on many important civil rights issues, like ensuring equitable access to broadband and diversity in media. She's also a key advocate on protecting privacy and has given voice to the civil rights concerns associated with privacy protections. For these reasons and many more, we are elated that she took the time to join us today and share her thoughts. Francella. Good morning, good afternoon, you guys. I'm so glad that you have me here. And first, thank you, Eric, for that warm introduction. I don't know that people always know about me, but I sure do knock hard on doors when you don't return my emails. So I just appreciate you guys having this conversation today, because sometimes I think it is a little bit elusive to talk about privacy and civil rights and where those two things intersect.
I was actually on a panel this morning where a person from an organization actually said out loud, what does one have to do with the other? And so I think that that was very emblematic of the work that we have to do in terms of education and just making sure that everyone understands what is at stake and why people are concerned. Now that Congress is getting very serious about crafting privacy legislation, I think that it's important for us to have specific demands about what consumer protections we want, to make sure that the data collection, data management, and data sharing practices that companies use aren't turned against marginalized communities. And the reality is that we live in a time when companies will do anything to get their hands on our data. It is actually one of their most valuable resources, even though it's never reported on a balance sheet. It's something that is a treasure trove for them to get business leads and also to generate new profit-making opportunities. So I think when we're thinking about their ability to be able to track us online, it's important to think about what goes into that process. And essentially we have tech platforms and also third parties that are very difficult to identify who are creating data profiles about us. And those profiles, essentially, they can identify who we are. They essentially can predict our behavior. They decide who's in our network and who we're gonna see and what we're gonna see online. They actually have the ability to create demand for products and services and even content that we never even imagined that we wanted. And so I think when we're thinking about all of the magic of what AI and these data profiles can do, I think it's also important to identify how those data sharing tools can be used against us. That information can be exploited in many different ways. It's been manipulated to advance voter suppression campaigns.
It's also been used to stir hate and division and actually turn communities against each other, targeting communities of color and religious minorities, just to name a few. Data can also breathe new life into online fraud and predatory lending schemes. It can extend socioeconomic divisions by incorporating digital redlining. It can also render consumers powerless when they receive data breach notifications which may or may not provide adequate remedies. I wanna make sure that I explain, always in plain language, that these data intrusions have very real consequences and that the harms are long-lasting, often for communities and people who have very little access to redress. We need to make sure that we're clear and decide specifically what we want from Congress and use every lever of power within our reach to get it. As we move forward, I want to encourage everyone to think about three things when we're making our strategy as a coalition. First, when we talk about incorporating civil rights protections into privacy laws, it's important to be clear about what civil rights laws were designed to correct. The landmark legislation of the civil rights movement that was passed back in the 1960s was actually built on a movement of resistance and sacrifice that had already lasted centuries. And hundreds of years later, after people of all colors were confronted with the horrors of what was left over from slavery, that was when Congress finally decided to pass federal legislation to ensure that every American had the right to be free from certain types of discrimination. And so ideally, every American would then have had the right to equal and fair treatment in society. When those laws were codified, no one imagined a scenario where tech platforms would be able to discriminate online in ways that legislation was intended to prevent in our own neighborhoods.
But this is where we are, and we're in a place where my data profile will determine whether I have access to certain employment opportunities or housing advertisements. It will have an impact on my salary. It will have an impact on how much I pay for a mortgage. And that very profile, that maybe a third party that I couldn't even identify created about me behind closed doors, will last indefinitely online and be almost impossible for me to correct. So when I think about what type of privacy laws we need, I wanna first think about privacy laws that acknowledge the enduring economic, political and cultural consequences of racial oppression that still exist in this country today. Secondly, advocates must continue to document the disproportionate harms to marginalized communities. Our most powerful arguments will rely on data that tells its own story. Data that is originally collected with good intentions can easily be used to justify over-policing in certain neighborhoods, especially communities of color. Zip code and income might translate into certain customers paying higher prices for goods and shipping while other customers in better neighborhoods pay lower prices. Ethnicity and your "like" history on social media platforms can flag you for interest in receiving hateful content or make you a target. We need to generate our own analysis to show how exploitation of data leads directly to socioeconomic harm. Finally, I implore each of you to immerse yourself in your own talent. That means that you have to be persistent and clear about elevating your community's message in language that they actually understand. It's not enough for us to have conversations in DC that exclude people. We need to make sure that, because there are lots of different stakeholders in this debate who might all agree on where we're going but disagree on how we're gonna get there, what are you doing to bring your constituents' message into those closed-door rooms?
If you're really good at grassroots advocacy, then find a way to mobilize your community and make sure that they understand why they should care about data privacy. If you're really good at lobbying, be strategic about your congressional outreach and make sure to elevate messages from other grassroots organizations that don't have the same access that you do. If you're really good at writing, then make sure that your blog posts and your articles approach people in language that they can actually understand and arm them with information so that they can participate in this conversation. I think that it's also important in closing just to note that we know that the US is already behind in developing privacy regulations, so we know that they're gonna move forward with this conversation with or without us. If you are not satisfied with the options that are on the table for the privacy framework that you want for your digital ecosystem, do not settle for someone else's vision or someone else's imagination. Bring your ideas to the table. You have a powerful voice and you have a unique opportunity to actually be a part of a conversation that will help shape the digital economy for the 21st century. Thanks to you guys. Thank you. That was so great. I really appreciate that framing. I feel like I needed it. I'm really excited about this panel. I think we have a lot of different voices up here to talk about what privacy and civil rights mean to us. I'm excited to dig deeper into this, sort of outlining data intrusions and redefining how we discuss privacy and Jim Crow online. So much so that I didn't want to cut anybody, so I'm just gonna be standing chilling up here for the next hour and a half, because I was like, no, we need everybody up here. So thank you in advance. I'm gonna read the bios of our panelists and then we'll kind of dig into the questions. So all the way down at the end, you've got Gaurav Laroia. Gaurav is Free Press's policy counsel.
Free Press is a nonpartisan advocacy organization created to give people a voice in the crucial decisions that shape our media. He works alongside the policy team on topics ranging from internet freedom, net neutrality, media ownership, consumer privacy, and government surveillance. His human rights and civil liberties work has taken him from Capitol Hill to Uganda. He worked at the Government Accountability Project protecting the rights of national security whistleblowers and prior to that served as legislative counsel at the American Civil Liberties Union. Next to him, we have Erin Shields. Erin is the national field organizer for internet rights at the Center for Media Justice, a national nonprofit working to build a powerful movement for a more just and participatory media. Erin joins the org with more than five years of grassroots organizing and advocacy experience in the areas of technology, affordable housing, and criminal justice. Prior to working with CMJ, she was a community organizer with Bread for the City, fighting alongside DC residents living on low incomes for equitable redevelopment without displacement. She also previously worked as a government affairs coordinator at the Internet Association on issues of net neutrality, privacy, and internet governance. Erin recently served as a 2016-2017 co-chair of Black Youth Project 100's DC chapter and has committed herself to the work of liberating all Black people. Next to her, we have Miranda Bogen. Miranda is a senior policy analyst at Upturn, an organization combining research and advocacy to promote equity and justice in the design, governance, and use of digital technology. She focuses on the implications of machine learning and artificial intelligence for civil and human rights, and particularly on the intersection of digital technology, automated decisions, and economic opportunity.
She has conducted research and authored work on issues including global data ethics, the governance of AI and automated decisions, commercial transparency practices, consumer protection, and corporate accountability in the digital age. Then we have Alisa Valentin. Alisa is the communications justice fellow at Public Knowledge, a nonprofit public interest group that works on intellectual property law, competition and choice in the digital marketplace, and an open internet. Alisa specializes in shaping tech policy that centers groups at the margins, including low-income communities, people of color, and rural America. At Public Knowledge, she works on broadband deployment and access, artificial intelligence, privacy, and intellectual property through a social justice angle. She also serves as an adjunct professor in the D.C. area. She has taught courses at Howard, Trinity Washington University, and Montgomery College. And then finally, we've got Priscilla Gonzalez, campaign director at Mijente, a digital and grassroots hub for Latinx and Chicanx movement building and organizing. Priscilla has worked with grassroots organizations since the early 2000s to build power toward long-term system change. For a decade in the domestic workers movement, she helped to build organizing infrastructure, organizing capacity, and reach amongst a broad sector of allies to help pass the nation's first Domestic Workers Bill of Rights, in addition to serving as part of the team that launched the National Domestic Workers Alliance. She then went on to Communities United for Police Reform, an unprecedented campaign for police accountability, where she helped to co-lead efforts that resulted in the New York City Council's passage of the Community Safety Act. So yeah, these are our panelists. Thank you. This is gonna get heavy, so I thought we'd start with a lighter icebreaker. So to everybody, let's save Google the trouble of having to spy.
Can you tell us your favorite song? Bonus points if it has a privacy or apocalyptic theme to it. Oh, should I start? All right, I went with Digital Witness by St. Vincent, like my tech go-to. Was it gonna do that too? Okay. Is it on the nose? Is that what you were looking for? My real favorite one right now is Before I Let Go, but the Beyoncé homecoming version. Yes. But when I'm thinking about privacy things, or when I'm like building out privacy curriculums, I listen to the song that goes "I always feel like somebody's watching me." Yes, I love listening to that just to get fully into the vibe, yeah. So I have this kind of cultural musical amnesia and I never remember song names or artist names, so this is the one area where I'm like, algorithms are cool. Tell me what to listen to. So I have a theory that every time I say something is my favorite song, then it somehow makes it into the internet age, like Too Close by Next, and it was a whole like "why you always lying" thing that happened on Vine. And then Before I Let Go, I love Frankie Beverly and Maze, I love Beyoncé too, but like I really love the Frankie Beverly version, so I'm upset that I have to hear Beyoncé every single time, so I hope that Beyoncé stans don't get me on Twitter. Well, I was thinking about the privacy one and totally thought about Rockwell, Somebody's Watching Me, but what I'm listening to these days, I don't have a particular favorite song, but the album I'm listening to is Romeo Santos, Utopia. So if there are any Bachata lovers in the audience, it's really good. Thank you, I need that playlist. Mine is Private Eyes by Hall & Oates. So moving on, let's start really broadly. I'm gonna kick it to Erin. Can you talk to us about like what is this concept of digital civil rights? What does that really mean, and how do you see that framework shaping the privacy fight, or not shaping the fight? Yeah, certainly.
So I think Francella really took us to church just a minute ago and was really talking about sort of the source material for digital civil rights. So historically in this country, marginalized people have been engaging with the legal system to gain some semblance of rights, and that's certainly over the last century, for sure. So I'm thinking of Dred Scott, I'm thinking of all of the Civil Rights Acts, all of that, right? And so traditionally these fights were about material conditions or material things like housing, jobs, your ability to be recognized as a human being, public access or public accommodations and schools, things of that nature. But our lives have become increasingly digitized. We've been pushed online, whether that's by choice, so we're engaging in social media or sharing our lives online, but also a lot of services demand or require you to be online for some part of delivering the services. And so a lot of our lives have entered into this sort of digital realm. And that includes things like the invention of credit cards, credit scores, algorithms, cell phones, that expanding into smartphones that are able to track location, the internet and social media. But like a virus, the discriminations that we saw and the oppressions that we saw before like to mutate in order to survive. And so an example of that, for instance, would be chattel slavery. After the abolition of chattel slavery, we saw these oppressions mutate into the U.S. prison system, correct? And then now we're seeing advocates going after the U.S. prison system and attempting to reform it or abolish it entirely. And we see that sort of mutating into electronic monitoring, which is sort of this digitized version. Another example could be racial covenants in housing and redlining. And we see that mutate into Facebook allowing housing providers to select via racial categories who's seeing those ads for housing. Same with jobs and a whole other host of categories.
And so we see the old regime of oppression and discrimination really being updated and compounded by algorithms and the ability for corporations, these third-party data brokers, social media, the government, a whole host of actors, to collect an extreme amount of information on us and also to guess about us too, so they may not even have the information, but they're guessing about us and then delivering us services or not delivering us services based on those guesses. And so digital civil rights, for me and for the organization that I'm working with, is the concept of extending those original rights, so those rights that were fought for, as Francella said, over a century, into the digital realm, right? And so that includes net neutrality, with our right to speak and be heard, and certainly privacy, which is the right to determine what information is collected on you, how it's stored and what it can be used for, and to be able to shift some of the power imbalances that we've seen in this data space that has gone mostly unregulated. Thanks, Erin. I wanna tug a little bit at that thread around sort of law enforcement and the legal system. Priscilla, take us to the intersection of privacy and surveillance, particularly as it pertains to the communities you organize. A lot of people view these things, privacy and surveillance, as different issues. Do you? If so, how do you see them running parallel or overlapping? And if not, what do you think people are missing by separating the two? Yeah, so, Mijente has historically focused on immigrants' rights, comes out of a long history of organizing around deportation defense and immigration enforcement. And so our entry point into this conversation is looking at the role of the tech industry, essentially, in the current political climate, in accelerating the targeting, surveillance, detention, and deportation of immigrants.
What we're seeing, or what we've known, is that, just to kind of take a 50,000-foot view, immigrants, people of color, vulnerable communities often serve as the laboratories for testing new technologies. There are connections among the corporations and the technologies that have been used in war, that we see along the increasingly militarized US-Mexico border, that we then experience in the interior of the United States, specifically, as was said earlier, in over-policed and criminalized communities. And so it's important to see the common themes across the harms, the systems, like literally the systems that are being created for local law enforcement and the systems that are being created to support government agencies, the actors, the tactics, in order to really appreciate the full breadth of what's at stake. The tech industry is taking contracts, government contracts, to increase surveillance, facilitate war, incarceration, criminalization, commodifying data. All of that is connected, and they're doing it in a way that doesn't have any regard for civil or human rights. The tech industry is making all of those connections and moving a coordinated strategy. And so our understanding and our approach to counter it also has to be coordinated. I like to say that what's gonna put tech in check is a multi-sector, multi-strategy people's movement. We have the same targets, and we're also all in the dark about the deals that are getting struck between the tech industry and government under the guise of national security, or even in obscure procurement processes that nobody understands or knows enough about to intervene. And so as part of this conversation and bridging the privacy and surveillance conversations, I think we need to learn as much as possible, because there's just so much that we don't know.
We need to be sharing information, we need to be talking to each other, and again, just coming up with ways to build power, to counter the influence and the impacts, the long, far-reaching impacts of the tech industry that we have yet to see. Thank you. I'm definitely gonna come back to that question of corporate power, but I wanna kick it over to Alisa to pull another thread out from what we've heard. I've heard you talk about the fact that despite the fact that poor people are less likely to be digitally connected, their privacy is most compromised. Can you say a little more about what that means and what that looks like? What are some of the ways we're being tracked and may not even know it? Yeah, so I want to uplift the work of Mary Madden from Data & Society, if you all saw the piece that she published in The New York Times a week or two ago. It's called The Devastating Consequences of Being Poor in the Digital Age. And I really wanna uplift that work because she talked about the fact that there are two ways to exist as a poor person in a digital age: being invisible and being hyper-visible. So she says that poor communities are targeted with predictive policing methods and also excluded by hiring algorithms, which I know is a lot of the work that Miranda is doing. However, there's also the chance that you can be invisible, because you can't really use your agency, you don't have the resources to seek redress. And that's where the important idea of a private right of action comes into play. And I know that Senator Markey has that in his bill, the Privacy Bill of Rights. But I think that's something that's really important, because communities of color as well as low-income communities are the people who are being taken advantage of, so they should be able to use the court system to seek redress. And as far as the ways in which we're being tracked, I can't help but think of Chick-fil-A, which, Chick-fil-A is kind of problematic, but yeah.
So let's say that you're trying to order your meal before you get there. And Chick-fil-A will always say to you that they need your location information. They do not need your location information. You guys don't need to know if I'm in Little Rock, Arkansas, or I'm in Southeast DC. You guys do not need to know the information. All you need to know is that I need my order once I press the Prepare My Order button. So yeah. Thank you. So I'm gonna switch it to Gaurav. I've heard you talk a little bit about secondary unauthorized, non-consensual, secondary uses of information, which sounds like way over my head. Can you say a little more about what that actually is and what are some examples of that? Sure. So fundamentally, I think we all believe that an individual should have the right to know why and when and how their information is being used, and it should ideally only be used for the purpose for which it was gathered in the first place. And so this conversation isn't about Facebook, but this is a really good example, I think. It came out over the last couple of months that Facebook asked users for their phone numbers to effectuate a security provision on their website, so there wouldn't be unauthorized logins to your account. It turns out that Facebook used those numbers to facilitate their advertising delivery. I think it's clear to see why that's an unfair bargain that they made, because that's not why people gave over a pretty sensitive piece of information about themselves. There are other examples that don't have to do with the platforms. For example, with the cell phone companies just earlier this year, it was found out that the precise geolocation data they use as a provision of service, for you to be able to get a cell signal and connect those calls, was being handed over to data brokers and, through various transactions, ended up in the hands of bounty hunters. And now it is possible to pay someone $300 to find out the precise location of a person's cellular device.
And so these kinds of uses, obviously the secondary uses we're talking about, and in general in the civil rights context, it is I think fundamentally immoral that companies are using the information about your ordinary life online, the clicks you make, the websites you visit, the fundamental bargain of typing in a website name to get that information, and that information is being turned around and used to build tools that exacerbate segregation, that fundamentally undo the important rights and progress that we've made in the last decades to build a more equitable society. And that's not a bargain that I think generally people want to make. Thank you. So actually my original check-in question was gonna be like, what's your favorite dystopian fiction? And then I was Twitter-stalking Miranda and she was saying something about how dystopian fiction actually normalizes dystopian technology in our everyday lives, and I felt like a bit of shame that I was gonna ask that question. So can you say a little more about what you meant by that? Also, like, can machines be racist? I'm asking for a friend. Yeah, so there was a great event here earlier this week about how sci-fi frames are influencing how we understand artificial intelligence specifically, but I think that extends beyond artificial intelligence to data uses in general. I think a lot of the reason we're all interested in privacy at the moment is because we've heard all of these bad outcomes. We've heard about Cambridge Analytica trying to exploit racial division in this country and manipulate how people perceive the elections. We've heard about discrimination in housing advertising online. We've heard about Amazon's racist hiring AI that was saying women were probably not good candidates because they didn't resemble the candidates that the company had seen in the past.
These are tangible outcomes that I think weren't in the conversation before that are now becoming more clear. On the surveillance side of things, I think we have a picture of what might happen in the future because we've seen it in fiction, and even when that fiction kind of tries to lay out really clearly why we don't want this to come about, it creates the perception that moving in that direction is inevitable and it's just gonna be bad and we're not going to be able to really fight against it. And I think that bridging those two narratives is really important when we're talking about civil rights and privacy, because it's both of those things. It's the reasons why we're motivated to talk about it today from the commercial side, but the commercial side is bleeding over into the surveillance side, as I think we have talked about and we'll probably talk about more. And can machines be racist? I think they certainly can, and often they're just reflecting society as it is today. So much of that is becoming more invisible, more difficult to push back against because, A, it's invisible, or, B, it's hidden behind the layers of the private sector that at the moment we don't have the levers to hold accountable. We aren't able to detect what's going on to get them to make the changes we need, and I think that's why we're motivated to push for stronger privacy legislation or data legislation in general, because this is happening. Data is being used to personalize our experience online, and as I said last week, when I hear personalization, I think discrimination. We're segmenting people into what they like, what other people like, who they are like, what other people they're like, and that's exactly what was happening in the offline world that we worked so hard to prevent and to make better, because it wasn't working for people, it was hurting marginalized communities, and so it is being rewrapped in a new package of efficiency.
The internet is great, let's have everything be specific to you, but that's leading to these outcomes, exactly the reason that we're all here today. Thank you. So that was a lot. We've kind of laid out the stakes here. I want to zoom in a little more and talk about specifically what are the fights and interventions that we're seeing. So I'm gonna come right back to you, Miranda. We've seen story after story of the ways in which tech companies are shifting not only how we engage in elections globally but also how we can access information and opportunities. Can you say a little bit more about the expanding role tech platforms are playing in society, and what should we be on the lookout for moving forward? I think we're seeing the privatization of a lot of public spaces, both public discourse but also how people are coming across information, how they're accessing public accommodations, how they're accessing opportunity, and so that's really changing the channels that we have to push for rights. I think we have the right to privacy from the government. We don't have right now a right to privacy from the private sector, and yet that private sector is taking on many of the roles, or at least facilitating many of the same things, that we used to expect from government, and so a lot of the fight right now is focused on advocating to companies. A lot of us up here have been involved in that process, trying to get companies to improve their practices, to change their content policies, to stop allowing bad actors to use their tools in ways that could exploit divisions, exploit people by their identity, and that's gotten us a little of the way there, but there's not a strong enough incentive for them to do that, especially for those companies that are less publicly visible. Facebook and Google and their subsidiaries are easy punching bags and they have a lot of public attention, but Amazon has been getting slammed for this and they haven't reacted in the same way even.
Their customers are often law enforcement, or they have such a broad market share that they just don't have to respond to some of these concerns. Every other company in the country and in the world is starting to collect data that they didn't used to, hotels, cars, everything like that, and we just don't have the capacity to advocate to every single one of them. And so I think we need to take it to a higher level and say, how do we hold these actors accountable that are collecting our information, that are facilitating our lives? Because right now it's so piecemeal, and it's a big fight that we've been involved in, but we need to take it further. Yeah, you said a lot there that struck me. One of the things you said was we already have a right to privacy from the government, and I was like, okay, yeah, I wonder about that. Priscilla, I've heard Mijente talk about Amazon's cloud industrial complex. I've heard a lot about facial recognition and Amazon, but this was a little bit of a new concept to me. Could you talk a little more about what that means? Also, I see y'all talking all the time about this group called Palantir. Who's that? How does that fit in? Yeah, so, also increasingly, yeah, new information to us too. So coming out of, again, as I was saying earlier, our deportation defense work and immigration enforcement work, we were just trying to understand and figure out, how is it that ICE is finding our people? How is it that they're hunting them down? Where are they getting this information? Who's helping them? And so we commissioned some corporate research and ended up finding dozens of links between the tech industry and immigration enforcement agencies, and two of the major protagonists that emerged were Amazon and Palantir. So Amazon is no longer just the one-stop shop for the world's consumer needs. Most of its revenue comes from Amazon Web Services. It is the largest broker of cloud storage data on the planet.
It has the most federal authorizations in the US government to store its data. Where Palantir comes into play is that it provides what are called mission-critical services for ICE to track, detain, and build cases against immigrants to deport them. And all of that, all of the Palantir data, gets stored on Amazon's cloud. Then, going a level deeper, Palantir is also one of the largest contractors with local law enforcement agencies across the country. That means local police are using Palantir-built systems that talk to each other; they're interoperable. That means that ICE also has access to the information that's on there. So what that revealed to us was that even localities that have passed sanctuary policies, which essentially prohibit collaboration between local police and ICE, are not actually safe havens for immigrant communities. Peter Thiel, who is the co-founder and chairman of Palantir, is a Trump supporter. So when we talk about the tech sector, or how they present themselves, many of the companies present themselves as do-gooders or as apolitical. Like, actually, that's not true. Just last week, we uncovered and exposed Palantir's role in the family separation crisis that reached a peak last year. Border agents were basically able to use the Palantir-built systems to build profiles of families for arrest, including relatives or guardians who would show up to claim unaccompanied minors. So we're seeing, under the current political climate, tech as a product and an instrument being weaponized, and companies like Palantir and Amazon are all too happy to oblige and do the government's bidding in profiling, targeting, and punishing en masse those that it deems undesirable to be in this country. So yeah, that's a lot.
I know we're having this discussion right now about all the ways in which tech companies are out of control, and a big question emerging is who's gonna actually rein them in. I know that Public Knowledge has done a lot of work around pushing the FTC to take on more of a role in reining in tech companies. Yesterday, I noticed that the chairman of the FTC told Congress he did not want the agency to have broad rulemaking authority to write data privacy rules. So I'm just curious, Alyssa, where do you think that leaves us? Can you say more about that work? Do you think the FTC is the right body to take on the privacy fight and tech accountability? Yeah, so Public Knowledge believes that the FTC should be granted broad rulemaking authority because, while we think we know what the privacy-related harms are right now, we have no idea what they're going to look like in the future. But alongside that broad rulemaking authority, we have to make sure that they have the right resources. We need to make sure that they have experts that can speak to these issues as well. And something that we're working on a lot at Public Knowledge is the intersection of competition and privacy. Charlotte, who's here in the audience, is doing a lot of that work, and we also just released a paper by Harold Feld yesterday that relates to this issue. Okay. All right, so we've got some regulatory stuff moving, a lot of stuff popping off in legislation. Earlier this year, Free Press and the Lawyers' Committee for Civil Rights Under Law released model legislation calling on Congress to protect civil rights and privacy online. Talk to us a little bit about that legislation, what are some key components, and also, just in general, what's up at Congress when it comes to this fight? Sure.
So I think in this space, we realize Congress is hungry for solutions to this problem, fingers crossed that they will get to effectuating them. And in that space, Free Press and the Lawyers' Committee built on work other groups have done in coming up with proposals. Where we wanted to innovate, specifically related to the discussion we're having, is making sure that there are strong civil rights protections inside whatever privacy law comes out of Congress. And so this is a bit of a list, and I'll be very brief. In our model legislation, we wanted to make sure that it is prohibited to use personal data to discriminate against protected classes, which is the way civil rights law is currently structured in things like employment, housing, credit, and education, whether directly or by disparate impact. And as Miranda has spoken to, that's often how this kind of discrimination will show itself when we're talking about the way algorithms affect people's opportunities in those categories. We also, and this is a gap in current federal law, classify online businesses as public accommodations. Why that's important is that right now the bar downstairs has more obligations regarding civil rights to the general public than any online platform does. That varies by state, but there's no federal consensus on it, and I think it's really important to remedy that particular situation. And so by classifying online businesses as public accommodations, we make it unlawful to process personal information in a manner that segregates, discriminates in, or otherwise makes unavailable the goods, services, advantages, or accommodations that online businesses provide.
We also require companies to audit algorithms for bias and privacy risk, have robust transparency, give the FTC the ability to effectuate those rules, and give ordinary people a private right of action, the ability to protect their civil rights. And connecting this to the history of civil rights, it has been true that government agencies have failed time and time again to protect people's civil rights. So in the context of the privacy debate, we find it very important to make sure ordinary people and advocates can go out there and protect their rights when government agencies today are falling down on that job. And as for where this conversation on legislation has gone in terms of civil rights, I'm actually very surprised and pleased that, in a relatively short period of time, as the circle of people caring about what privacy looks like has grown, making sure that civil rights is protected in this space has genuinely changed the conversation on Capitol Hill. There's been a series of hearings on privacy, let's say in the Senate, over the past year, and you can see the evolution in members' thinking on how they feel about the protection of civil rights inside privacy. I mean, and I laud Chairman Wicker for this, in the most recent hearing he admitted that there need to be serious considerations of what kinds of use restrictions Congress puts in place for how data is used. And it is heartening to see an emerging consensus on that view inside Congress. And I sincerely hope that whatever both the Senate and House Commerce Committees come out with really centers these kinds of protections. Thank you. So it sounds like there's a lot popping off. I know there was also a civil rights framework, a set of principles, that was released around privacy. Another coalition we're part of, around digital and privacy rights for all, also released a framework. But while all of this conversation is happening, it's like we need protection right now, right?
I think that's pretty clear from everything we've heard. So Erin, tell us about the work the Center for Media Justice has been doing around personal digital security and defending our movements. Sure. So the Center for Media Justice hosts a national network of a little over 100 grassroots organizations and some larger nonprofits that do work on a whole host of issues but care about the concept of media justice, specifically around things like privacy and net neutrality, et cetera. And one of the ways that we relate to our network is by acting as capacity-building support for them. The feedback that we were getting from our members was that they were experiencing intense surveillance when they were organizing around anything from police violence to stopping evictions or gentrification and displacement generally. And so one of the ways that we felt we could step into the gap, well, I'll share two different ways. One was we partnered with some of our MAG-Net members around the country to put on what was sort of a split event. Half of the event would be a discussion with the community organization about the type of privacy that they were looking for, the type of surveillance that they were experiencing, or the history of surveillance in the city that we were hosting it in. And then it would be paired with a digital security training for organizations in the area. So we would have digital security practitioners come in and try to give folks tips about how they can reduce harm and mitigate the ways that the state was surveilling them. That looked like talking about VPNs, or talking about how to wipe personal data from the internet, or what to do if you're being doxxed. It covered a wide range of things, and it was actually tailored to what the actual organization wanted to talk about.
I also wanted to mention, though, that it was sensitive to the fact that digital security for communities of color is often a very sensitive topic. It's also one that often leaves people with a lot of anxiety, and digital security trainings often don't help people feel safer; they can leave them feeling a little bit more exposed. So we wanted to acknowledge that in those trainings, and also try to, again, reduce harm as much as possible, and give people space to talk about the history of surveillance in this country and about things people could take into their own hands to help themselves feel a little bit safer doing the work that they were doing. The second thing that we did was develop a knowledge base. It's called Defend Our Movements. So if you type defendourmovements.org into your phone or your laptop, it's a very comprehensive, what we call a knowledge base, that holds a lot of the information that I was just talking about. We partnered with May First/People Link, which is a movement technologist group, which is a whole other conversation, to build out a website that could answer a lot of the questions that some of the organizers we were working with had. You'll go on there and you can select by topic and things like that. And if you don't see your question answered there, you can send the question in, and there are actually people on the other end who will respond to you. So if you're being doxxed in real time, you can talk to them about what actions to take, what to do. We've been sharing that with community members and organizations as a way to provide real-time technical support for folks who are having their websites messed with, or who are themselves being doxxed, and all of that.
So we're deploying these to see how helpful they are, and also collecting feedback from the organizations and folks that we're working with to see what we can do better and how we can keep updating them. Thank you. So yeah, now we've gotten to the "for real, for real" portion of this panel. Oftentimes when I'm having this conversation around privacy, I don't tend to see panels that look like this. Often I get invited to panels because they wanna sprinkle a little bit of civil rights funk on top, talking about it like it's separate, when for us it feels very intimately interlocked. Alyssa, you are one of my favorite people to follow because you are just constantly dropping some #TechPolicySoWhite gems. I'm just like, yep, yep, yep, yep. For those of you who don't follow her, you should. Tell us a little more about what you've been seeing out in the field that informs that hashtag. Yeah, sure. So I don't just talk about tech policy on my Twitter, by the way. If you do follow me, I talk a lot about music and Real Housewives and that sort of thing as well. That's why I love you. So yeah, I went to Howard University. I got my PhD there last year. So coming into the tech policy space, it was a completely different kind of environment. And I noticed when I would go to various meetings and various events, I would oftentimes be the only person of color in the room, or I would see the same friends that I always see, like Francella and Yosef and Sean. And so I realized that was definitely an issue: we'd be having conversations about communities of color, but they weren't actually involved in the panels. And I remember specifically going to a privacy panel that was all white folks. There were about four men on that panel, and they were talking a lot about communities of color. And that really bothered me.
And so at the beginning of Black History Month, February 1st, I dropped a blog on the Public Knowledge website called Tech Policy So White. And I talked about the importance of making sure that we have panels that look more like this one. Because when these kinds of conversations come from folks who are actually in the community, I think that can really impact the kinds of policies that we see. So that's something that I want to see more of. And I think it's also really important to diversify the organizations that are in this space and to make sure that we're teaming up with traditional civil rights organizations as well. Thank you. Priscilla, Mijente has been really intentional about intersectional and inclusive movement building. That's one of the things I appreciate most about the organization. Is an intersectional movement for privacy and data protection truly possible? And what will it take? Yeah, I think it's possible and absolutely necessary, as we've been talking about. Particularly when we talk about the most vulnerable communities needing to be at the center, who are getting hit in all of the different ways that folks have talked about on the panel. One immediate example I can think of where we can see that it's possible: for folks who may have followed the fight in New York City against Amazon moving its HQ2 there, that was truly an intersectional movement. We saw what it could look like, and we also saw what it could do. Because folks on the ground from community-based organizations, who were facing displacement, were fighting for the right to live with dignity. They were fighting for the right to work with dignity. They were fighting to be decision makers and visible stakeholders in defining what economic development in the city should be. And our intervention in that, in terms of supporting the local calls on the ground, was also talking about Amazon as anti-immigrant, and how powerful a narrative that ended up being.
And in general, across Silicon Valley, as I said earlier, many companies like to see themselves as just providing a neutral service to the world. So I think an intersectional movement, with multiple sectors involved, with a deep understanding of the connections among all the different harms, and a deep commitment to making sure that the solutions reflect that, is ultimately what's needed. Particularly when we take, again, a 50,000-foot view: privacy and surveillance issues are just manifestations of what we're up against, which is essentially this global trend of power and control of our society getting centralized into the hands of this unholy marriage of state and corporate power that has a clearly misogynistic and white supremacist agenda. And so when we look at it from that vantage point, we need to be talking to each other across all of the different sectors that we're coming at these issues from. Thank you. Gaurav, I'm gonna put you on the spot a little bit, since you work at a white organization. Free Press lives at the intersection of, it's white-led, sorry, it's very diverse. You guys live at the intersection of campaigning, advocacy, research, and policy. So tell us a little bit, how does Free Press think about centering marginalized communities in your day-to-day work? Sure. I think it is true that it's been an evolution over the life of Free Press, and I'm genuinely very happy that it is not just something that we tack on at the end of every policy ask we have: oh, what about black and brown people? I think that by genuinely, purposefully thinking about the most affected communities in the work that we do, we actually are able to solve the problems that we're talking about. I mean, just look at this conversation, for example.
I think the privacy conversation has moved not just because the downside risks have become more apparent, but because the circle of people caring about this issue has gotten bigger. More champions have arisen, and it's given the privacy debate, frankly, a different kind of moral center than it ever had before. There are other projects at Free Press where, by centering affected communities, we're actually able to get work done. We have, for example, a project about Puerto Rico and fixing the communications infrastructure after Hurricane Maria, and I think if we're able to solve that problem we'll really do something about the government's response and about climate resiliency. Thinking about these problems in the abstract, without really centering the lives of the people that are most affected, doesn't move the ball at all and doesn't help the people we're trying to help. Thanks. I do wanna leave some time for questions, but I also don't wanna leave folks just in the doom and gloom we've been in for the last two hours. So, have we won anything? Where are the places where we're winning? What's going good? I'm gonna toss it out to all of y'all, whoever wants to catch that ball. I'll do one of them. So just last month Facebook settled an assortment of civil rights lawsuits that were bringing up a lot of the issues around discrimination in housing and job advertisements.
There was a class action suit, there was a suit by a fair housing advocate, and the ACLU had filed a claim before the EEOC about job ad discrimination. All of that came together, and in concert with the civil rights audit that has been ongoing at Facebook, the company agreed to take away some of the tools that we found particularly risky, problematic, terrible, that were available to people in these areas that honestly we'd already legislated offline. And I think it was a demonstration that there are some laws that do apply on the internet, but it still took a fight to show a big company that, to get them to move on it. So that was the silver lining. But back to the gray area of that cloud: it didn't solve the whole problem of discrimination in online ad targeting. At Upturn, together with some academic researchers, we were testing Facebook's ad system to see what happens if we run an ad for jobs and we don't target it to anyone, if we don't use those tools, which were still available and are now being taken away. We just said anyone, in this case in the state of North Carolina, could see this ad. Who's gonna see it?
We found that Facebook was delivering those ads in ways that were skewed based on gender and race, even when we told them not to do that. And I think that's just an example of the way all this data still exists in that infrastructure; they were just taking away some of the tools that people could use to get at it. When we talk about discrimination in machines, discrimination in algorithms: that algorithm is just trying to predict who's clicking on what. It's looking at the past, it's seeing who's clicked on what. That's dependent on who was shown what in the past, who knew what they had access to in the past, and that's being pushed forward into the future. So I think it goes back to the point that we have to do that pushing from the outside. We have to show these examples of where it can go wrong and what can be done to change it proactively. But that doesn't solve the whole problem, and certainly Facebook's not the only issue here. This very same thing is very likely happening on any website. Like I said, anything that's personalized is gonna end up falling into that same pattern. Even if they're not collecting sensitive data or demographic data, our behavior just happens to reflect those demographics, because that's how society works. And so we're gonna see that sort of disparate impact happening online in a way that we can't control, unless we get a handle on what data is allowed to be used for what, and what we know about what's happening. Any more, like, good news? Yeah, we gotta get an upper after that.
Well, I don't know if this is an upper, sorry, but I would say the fact that we're just having this conversation right now is really important, and that we see that privacy is not just kind of a transactional matter; we see privacy as a civil right. And something that I want to add is that when I was at Howard, my professors were always like, you have to conceptualize the term, conceptualize what you mean by privacy. And I think we have to remember that depending on what community you're from, your cultural background, people define privacy differently. And I think it's high time for us to also make sure that we have research in this space that's led by people of color. And let me disaggregate the term people of color; I just read an article in the LA Times about don't say people of color when you really mean black people. So in this sense, I mean black people need to lead this space, because we own these devices at far greater rates than other folks, and we use messaging apps and things at far greater rates than white folks. The only place where we're not using apps more than white folks is augmented reality apps like Pokemon Go. But this is from a report by Nielsen, by the way; I didn't make this up. So I think it's really important that there are people of color, black people specifically, who are leading this research, so that we can define what we mean by privacy and what our needs are, and I think that we can then have legislation that follows that.
Yeah, and I think a really great follow-up to that is that we're seeing in movements a shift toward acknowledging the real impacts data has on our lives, and especially on the issues that we're organizing around. Some of the black- and brown-led research that we're seeing, by people who are in the communities that they're researching, is things like Our Data Bodies. I'd love it if everyone could look up Our Data Bodies. They're doing research in LA, in Detroit, in Charlotte, about utility shutoffs, about displacement in the housing area, and they've put out really great and interesting tools to support people in communities around understanding data and how to keep themselves safer, along with research on the issues that they're working on. Another one that's really, really great is by a Media Action Grassroots Network member, Stop LAPD Spying. They recently, or maybe last year, put out a report called Before the Bullet Hits the Body, and it's about predictive policing: everything that goes into the algorithms prior to when a police officer pulls the trigger and shoots someone. Both of those are really great examples of communities researching themselves and using data to liberate and enhance the campaigns that they're working on. And they actually recently, they being Stop LAPD Spying and the coalition they're a part of, won some concessions against the LAPD around predictive policing, by showing essentially that the program was built on bad, dirty data and that it couldn't be used reliably. And so the LAPD ended up stopping the use of the program, if I'm remembering correctly. So please look into both of those.
I'll just say super quickly that I think it really does matter that people are now having, I'll say, a more sober assessment of what these technologies mean, and what the deployment of these technologies means for the communities that we're talking about. I think only once the shine wears off of the tech companies, and I think it is kind of wearing off, can you recognize that this really is just another industry that sometimes can provide us good things and often can exacerbate the problems in society. And I think just that recognition, and that change that we've seen, really matters. It sharpens our advocacy, and it creates a world where something different is possible. I would also just add the many different examples that we've been seeing of tech workers inside of these companies being in resistance to their companies' agendas, but also to discrimination internally. That's been really heartening, because I think we need tech workers as important and key allies in this fight. Thank you. Do we have any questions out in the audience? Yes? Sean Davis with the National Consumers League. So there's been a lot of great conversation on how to involve and help communities of color, but I'm just kind of curious as to your thoughts on how to get communities involved, or how to compel them to act. Because oftentimes, I mean, everyone here pretty much knows about privacy and digital civil rights, but when you go to a person on the street and you ask them, they're not really sure. How do you arm communities, like I was mentioning earlier, how do you arm communities in changing the narrative?
Well, I would push back a bit on the sense that communities aren't responding to privacy. I think there are limited ways for communities to respond to privacy, especially if you're talking about government surveillance; if you're seeking services, you're very limited in how you can protect yourself from the government collecting information from you. But I do think that both of the projects I mentioned earlier relied heavily on participatory research. I think people understand they're being surveilled, but they're probably willing to talk to you about it in a different language than the one we're talking in here, and one is not better than the other, it's just different. So I think it means taking some of the information and some of the research that is being generated elsewhere and translating it, both out of an academic context into regular layman's terms, and also into a way that feels relevant to communities of color, not abstract, and that affects the material conditions of their lives. It means listening when communities of color bring up the ways that they're already surveilled, and not being dismissive of that, and meeting people where they are. And it means holding more spaces, and this is something that the Center for Media Justice is really interested in doing, where people can come together and actually talk specifically about these issues, and where people can collaborate so that more of these projects, like Our Data Bodies and Before the Bullet Hits the Body, can be generated and shared.
Thanks. I was also thinking about the study, I can't remember the group, but they released a finding that black people are more likely to read privacy policies than any other group, which, that's my mom, so I know for a fact that's true, and it doesn't make us any safer. But yeah, definitely shout out to Our Data Bodies, because they were in Skid Row, they really went into the community and talked to a lot of different folks. Do we have, yep. Bert Lee, Committee on Education and Labor, hi guys. So my quick question is, working on the Hill on a lot of these issues, what I'll hear is that members and even staffers will say that this is just tech, and tech is out there, it doesn't affect everything. But a lot of the people I've worked with, a lot of people up there on the panel, and you guys tried to help me through that wonderful meeting that we had, try to bring to bear the reality that tech is everywhere and that these conversations affect everything that we do. So I would love to go down the panel and get maybe a one- or two-sentence answer: if you run into a member, what would you say to them if they ask why this is important and why we should care? So, I mean, I think there's growing recognition, and I'd say this to a member, that if we don't address these issues, we're gonna sleepwalk into a society that is further segregated, where opportunities are not available to people based on characteristics, or analogs of those characteristics, that I think we finally collectively said can't happen. And I think a recognition that that future is very possible should put a little fear into those members and have them recognize that now is the time to do something about it. That means I go next. This is a tough one for me, because part of me would wanna push back; part of me doesn't believe that people don't think that tech impacts our lives in various ways.
Like, I think people provide cover for themselves by saying that, but I'm not sure that you can live, and particularly, well, I don't know, be a senator or a congressional member, and not know the ways that technology is impacting people. Perhaps we need to lay out the particulars of it, and also, I think, probably the ways that it pertains to people who are living on low incomes or no incomes: how their information is tracked and sold and used. But I would probably end up saying something snarky, not helpful. To me, and I think this is relevant not only for the Hill but for our organizations, tech is not an issue area. Tech is a mechanism through which everything else is happening, and so it's not enough to have a tech staffer. Just like all staff need to understand how budgets work, all staff need to understand how technology intersects with their other areas. So many of the issues we're talking about up here are not tech specific. Maybe when we were talking about net neutrality, that was still kind of in the telecom area, but when we get into surveillance, that's procurement for national security, for law enforcement. We get into housing, we get into education. These are things that exist in the government; the government has been involved in these forever. Tech is just another way it's happening, and so it's infrastructural. It's not a siloed thing that either is in their portfolio or is not. Yeah, I agree with everything everyone else said. It's not a siloed issue, and I think we were talking a little bit earlier about the importance of storytelling. So I think it's really important for us to get constituents in front of them who can talk about the issues they face related to predictive policing or discrimination in housing, education, predatory lending, and that sort of thing. Yeah, I mean, I think the only other thing, just echoing everyone.
The only other thing I would add is that in addition to this being a civil rights issue, a human rights issue, an economic issue, it's also about democracy. I think just this past month there was a revelation of the Department of Homeland Security commissioning some intelligence firm to surveil the Families Belong Together protests, you know, folks who were protesting the family separation crisis. So, you know, what is that about? That totally implicates Congress in needing to investigate and figure out what's going on. I really wanna ask a money-in-politics follow-up question, but that's another panel. Yes, sir. Hi there, I'm Aaron from Spitfire Strategies. I just wanted to ask: a lot of times in this space we talk about the consumer side of things and how this is impacting consumers. But we're also seeing that this is really impacting people who are working inside corporations and organizations. I mean, with yesterday's news about the Uber and Lyft strikes, and how that algorithm is messing with their pay. How do we change this conversation to include people who are impacted by these algorithms in the spaces where they work, beyond just trying to get a job? Yeah, I think too often workers are carved out of privacy law. Discussions of that are happening in California, and there are some exceptions, I believe, on the European side of things as well, around working for a company or entering into a different kind of contract with them, and whether the company should have access to everything. And I think it's just a matter of making sure those voices are at the table. I think we need to do a better job reaching out to workers' rights advocates. And I think some of the companies are trying to play both sides of it as well, to say, these are our workers and so we need their data, but also, these are not our workers, we don't want to pay them the benefits that they deserve. And so you kind of need to articulate on both sides of those things.
Like fight back on both angles. But it's certainly a community, and just like we've been broadening the coalition to talk about privacy with groups that are beyond privacy, I think that work still needs to continue. And the wider we can spread this coalition, trying to talk about why privacy is not just a data issue on its own, the stronger the chance that whatever ultimately gets passed, fingers crossed, addresses all communities.

I do think it's true that the siloing of these issues serves the people who want to divide and conquer the various constituencies of people that care about this. And like you said, it is incumbent on the panel and the people in the room to make sure that we bring more people into the conversation, so that, like I said, these problems can actually be addressed. Yes. Sorry, with the green frame.

So I'm Tim Wright from the Center for International Private Enterprise. We're part of the National Endowment for Democracy and the US Chamber of Commerce. And just for the record, I'm a fierce anti-white supremacist. This might seem counterintuitive, but there's a lot of talk right now about America's internet freedom agenda in the world: what it is we're doing with the internet, how we can push back on authoritarian abuses. And I'm looking at news reports where a Chinese company was forced to divest from Grindr because foreign intelligence was potentially using it to target LGBTI people within the intel community. Is there any way this kind of work can be brought to help people in our military and intelligence communities, who are also being targeted, perhaps by Chinese intelligence or others, to help them understand what the risks are and how these things are generally abused, so they're also safe?
I will say that's the Venn diagram of Chinese subterfuge, I mean, other foreign government subterfuge, and the kind of stuff that we're talking about, and it's not something I've necessarily thought a lot about, except perhaps in the voting rights context, where it is true that this kind of information is exploited, both domestically and by foreign actors, to divide this country and to prevent people from voting. So perhaps there's actually a larger set of issues there, and I would be happy to hear about them.

I think overall it raises a question of what values we care about on the internet. In that particular case, there was an overlap of values but also economic incentives and security incentives. But I think that's something we need to be thinking about in terms of how we deal with privacy overall, because Europe has taken the reins, saying: we as a group of nations value privacy so much that it's a fundamental right, we're going to lead on that, and we're going to make sure that technology that's either born in Europe or operating in Europe follows those values. I think right now we're not doing a very good job of ensuring that our values, at the very least the values we want to have, are being safeguarded in technology. And at the same time, I'll say there are a lot of problematic things happening in China, certainly the oppression of the Uighurs and the overall collection of data, but from a cultural context that may be more acceptable there than we would find it. And I think if we were looking from the outside at US practices, at US private sector sharing of data with the government and how our government treats minority groups, there are some similarities that privacy protection is intended to address here as well, along with thinking about how that affects companies that are operating internationally. Did you still have a question? Blue shirt? Yep.
Hi, I'm Alex Toll. I'm a college student interested in data privacy, and I had a question about how the discourse around data privacy as a human right and as a civil right might be different, because until recently I think there was a reticence, especially in the United States, about calling data privacy a human right. How do you all find the discourse around data privacy as a civil right to be different, and how should it be different?

One of the reasons, I mean, right, the US has decided, the way we operate, to have a different set of standards and laws internally than in how we think about those things internationally. So it is true in some way that there's a different set of standards: there's a whole body of human rights law that covers a lot of these issues, and then the United States has a whole body of civil rights law. And so I think that both a rhetorical connection between the two and a substantive connection between the two is really important, but it's often about thinking about exactly how the body of law that we're talking about, civil rights law specifically, controls in this circumstance and how it should be applied. Certainly the United States should also abide by the various human rights standards that govern this space, but I think perhaps we practitioners in this space kind of hide behind the legal distinctions and avoid talking about it as a human right, and I think maybe that's an interesting distinction there.
I don't know if this answers the question, but we at the Center for Media Justice have been pushed by one of our partners, Equality Labs, on this very question, because of some of the work that we do, particularly around accountability for Facebook. Well, first I'll say, much like what I think Alyssa was saying earlier, I'm pretty sure this is what you said, or no, Priscilla had said earlier, about how technologies are tested on communities of color here in the US: often, too, technologies are tested on the global south before they're brought here. And in the context of corporate accountability for Facebook, there are struggles similar to the ones going on in the US that are happening in places like India, Bangladesh, other developing countries and places in the global south. And what we're being offered here is not what they're being offered in India in terms of concessions around audits, around transparency, around the way that their products are deployed. So I think there needs to be a bridging of those movements, or at least conversation between organizers on the ground there and organizers here who are working on the same things, because I do believe, like you mentioned, the link is there. These companies are global, they're deploying globally, and in order to make sure that nobody is left behind, they need to be collaborating. Any other questions?

Yeah, thank you guys so much for a great panel, and I'm sorry that I came in late and missed part of it, but I wanted to ask Gaurav, I think, about the model legislation that you guys have created and what kind of reception you're getting. I was really interested particularly in the public accommodations provision and how that might work.

So I generally think it's not just the fact that we put this out; the entire community has really pushed Congress to think about these issues in this space and recognize where the harms are.
Senator Cortez Masto put out a bill that recognizes civil rights harms and directs the FTC to conduct a rulemaking on how data practices affect civil rights. Senator Markey's civil rights bill does something similar, but it also classifies online businesses as public accommodations and directs the FTC in a very specific way to issue rules that prevent data from being used to deny economic opportunities in all those categories we talked about, and it updates the protected classes to include more, like sexual orientation. So I think the hunger was there on the Hill to find a way to incorporate these protections, and I think our work is part of a larger puzzle to make sure that that's addressed. As I said, I think the reason the public accommodations issue is important is because the current situation is very strange. Like I said, for instance, the bar downstairs has more obligations to the public than a purely online business does, and for clear reasons that seems untenable: that they have a freer hand to discriminate than the store you walk into.

So thank you to the panelists. I think this was really great. I think there's a lot of stuff that's still left on the table. As we were talking about labor, I was thinking of these focus groups I went to a couple of years ago, where it was clear that black people are most worried about workplace surveillance, and that was something we didn't really get a chance to talk about. But shout out to Coworker, United for Respect, and other groups that are doing work around this. I think there are larger conversations around monopoly power, like what does it mean when Facebook sets up a data center in Iowa or Alabama and gets 20 years of not paying property taxes in exchange for like 70 jobs. There's a lot that we have to start talking about, and I think one thing is clear: those conversations can't be separate.
So as we wrap, real quick, like real real quick, could we go down the line and say one thing that people could do walking out of here today?

Call a member of Congress and tell them this is important.

If you're a researcher or an academic, first of all, be in community with people who are on the ground, grassroots community organizers, and also make your research accessible to those communities. I know that's more than one thing. I'll stop.

I think we've gotten the most traction when there have been concrete stories to hand to lawmakers. So work with the communities that are doing that local research, and if you have connections with journalists, pitch those journalists on writing those stories so they can get lifted up, because the lawmakers are saying, tell us about the privacy harms. And we don't want to talk about data breaches there; we want to talk about real harms that are affecting real people.

Yeah, I was in a conversation recently with someone who said that we need to figure out what our non-negotiables are as it relates to privacy legislation, and one of those non-negotiables should be protecting civil rights. I do think that if we protect our most vulnerable communities, everyone will benefit from it.

Well, just this morning we had a team on the ground in Northern California. They did an action in front of Palantir Technologies, trying to talk to workers there. So if you can, today, tweet at Palantir Tech and tell them to drop their contract with ICE and stop playing an ongoing role in deportations and family separations.

All right, thank you. Can I get some love for our panelists? Thank you so much. This was really great. And also for Francella, who really anchored what this was about. Thank you to New America for hosting us, thank you all for being here, and onward together. Thank you.