Welcome to The Future of Democracy, a show about the trends, ideas, and disruptions that are changing the face of our democracy. I'm Ashley Zahn, Vice President of Learning and Impact at the Knight Foundation. Before we get started, I want to share a little bit about why we at Knight Foundation believe that these conversations are important. The Knight brothers believed that a well-informed community is essential to a representative and well-functioning democracy. We also believe in engaged, equitable, and inclusive communities. At Knight Foundation, we've invested in an interdisciplinary field of knowledge and practice focused on the governance of the digital public square. We now know that the digital public square is an essential part of our democracy. It connects people to information and to each other, and it offers endless possibilities, but it also brings challenges to our communities and to our democracy. Knight Foundation has also conducted polling amongst Americans about their views of major technology platforms. Last year, we found that two-thirds of Americans believe that technology companies have too much power, but that Americans are split on whether technology companies or governments should be responsible for content moderation. Last year, we saw the real-world consequences of mis- and disinformation online, and that has created a unique moment where conversations about changes to social media platforms are gaining urgency. And we can't have conversations about changes to social media platforms without talking about the unique impacts of these platforms on communities of color. And so for that reason, I'm incredibly excited about the guests that we have here today. Allow me to introduce you all to Dr. Dominique Harrison, the director of the Technology Policy Program at the Joint Center for Political and Economic Studies. Dr.
Harrison previously was a program director at the Aspen Institute, where she led projects focused on the intersection of media, technology, and policy, and she's also taught at Howard University, University of Texas at Austin, and Trinity Washington University. Dr. Harrison, welcome to our show. Thank you. Thank you for having me, Ashley, and thank you to the Knight Foundation for thinking of the Joint Center for Political and Economic Studies. Absolutely, we're so excited to support your work. Before we get started, tell our audience a little bit more about the Joint Center and the Technology Policy Program. Absolutely. So the Joint Center was founded in 1970 to support newly elected black officials who were moving from civil rights activism into governance. And the Joint Center quickly evolved into America's black think tank. It became a policy hub for government officials to understand what was going on with African Americans around the US. Today, we explore a number of issues such as staff diversity on the Hill, the challenges and opportunities of the future of work, economic policy and the status of black communities, and lastly, tech policy. So our tech policy program is dedicated to exploring the impact of emerging technologies and developing legislative strategies to improve the lives of black communities. We are focused on three issue areas. The first is platform accountability, which builds off the work of Joint Center President Spencer Overton, who has written on such topics as reforming Section 230 because of voter suppression tactics and ad practices. It also includes issue areas like misinformation and disinformation. Our second issue area is broadband access and adoption, where we're concerned with the kind of programs and policies that are needed to ensure that black communities have access to affordable high-speed internet.
And lastly is our privacy and algorithmic fairness issue area, where we explore critical privacy issues and big data practices that produce discriminatory outcomes for black communities. Great, thanks. You guys are doing such important and great work. So excited to have you here today. So we're gonna be talking about both the opportunities and the challenges related to tech platforms. And before we even get into the year that was 2020, tell us a little bit about how technology has been important and powerful in black communities long before 2020. Yeah, thank you. I mean, I love this question. It gets me back to some of the topics that I would teach at Trinity Washington University. I taught a class on media and social change. And really what I tell folks is that media and communications technologies have always been important and powerful for black communities and other communities of color. I mean, we can start with Africa, the birthplace of humankind. For centuries, people used drums to communicate with each other from far away, to send messages from one village to the next much faster than a person could travel on foot. And centuries later in America, African Americans have used newspapers as a medium of expression to fight for liberation and rights, to demonstrate racial pride, and to inform readers of events that were happening in African American communities across the nation. For African Americans in the mid-20th century, radio was the most popular medium of communication. There were stations owned by and targeted towards black folks that shared news about what was going on in black communities around the US as well. And TV was also one of the most important tools in the civil rights movement. Television provided the American public with a means to witness the struggle of African Americans for civil rights by showing the horror that peaceful young black demonstrators endured, especially in the South.
This week alone, we honor and we celebrate the 60th anniversary of the Student Nonviolent Coordinating Committee, which included young activists and organizers in the civil rights movement. And today, social media platforms have served as venues for political engagement and social activism. More importantly, black communities have used social media to capture moments in real time of black people being abused and murdered. It's a long history, and it's important that people understand just how essential media and communication technologies have been for black communities and for social change. Thank you for sharing that. I think it's so critical that we understand the context in which we're having these conversations, and that even though it felt like the world changed last year, there's a long story and a long history that is important to know. So last year, amidst the pandemic, amidst our nation's racial reckoning, amidst the elections, all of us changed how we interacted with technology in our lives. But could you talk to us a little bit about how technology platforms have been particularly important for black communities? Absolutely. So, I mean, as I mentioned earlier, social media has enabled black communities, and the world really, to hold powerful people accountable for their actions. And it has given a voice to black communities around the US. The online community known as Black Twitter has also used social media platforms to collectively organize, increase voter participation, advertise black-owned businesses, and increase visibility online for black people and the issues that matter to us the most. I mean, I think what we noticed over the past two or so years is that the events that have happened to black folks have always been happening. But what's so important about social media platforms and communication devices is that you can capture these images, you can capture what's happening, and then you can share them with the world.
And that's powerful because often we need that evidence to really encourage reform and policies and a number of actions that happen within law enforcement. And so I think what we're realizing is that we need more opportunities to share these stories outside of the platforms and the terrible videos that we have to witness. But again, just how essential it is for sharing what's going on in our communities. Yeah, it's incredible the ability we have today to capture something right in front of us. You mentioned the transition from radio to TV was an important transition because people could see images, and when you see something, it's different than when you only hear it. And so I think that's incredibly powerful. So we've talked a bit about how important these platforms are, but we also know that they bring challenges. And so can you tell us a little bit more about the tech-related harms that communities of color face asymmetrically? Yeah, so big tech facilitates a number of threats to the civil rights, civic engagement, and health of black Americans. Over the past year alone, black people have been bombarded with online voter suppression tactics, misinformation about COVID-19 and vaccination, and white nationalist violence online. And I speak about a number of these issues in my recent blog that's posted on our Joint Center website. So I wanna focus a little bit more on artificial intelligence technologies, so AI. So businesses like social media companies use AI systems and machine learning algorithms to automate simple and complex decision-making processes. They also use it to sell products and services, or as a tool for advertisers to target their ads. And algorithms are used in many contexts to make inferences from data about people, including their identities, their demographic attributes, their preferences, and their future likely behaviors.
Often, though, the data that is used is based upon the information and actions of racist institutions or practices by different institutions in the US. And so for example, law enforcement has used AI technologies to police communities of color. They use predictive analytics and data-driven metrics to inform policing tactics and practices. And historical data based on unlawful practices, such as false police reports, unconstitutional searches, targeted stops, and arrests, has led to biased algorithms that disproportionately rank black and brown individuals and their communities as being high-risk for crimes. So the point is that technology has incredibly useful attributes for our communities. But the way in which these new systems and tools are being built and used by social media platforms has disparate outcomes for black and brown communities and other communities of color that we need to address. Absolutely. And so for the less tech savvy amongst us, who maybe have heard of the term algorithm but don't really know how they work, and particularly how they're trained, can you talk a little bit more about the data that algorithms are based on or how they're developed? Yeah, so basically these kinds of harms occur because of AI technologies and the systemic and repeatable errors that create unfair outcomes. And these decisions have led to discriminatory results and material consequences for black communities, again, based on the kind of historical data that is used to inform the decisions or the behaviors of our communities and other communities. And so as it relates to artificial intelligence bias, AI bias is a systemic error in the coding, collection, or selection of data that produces unintended or unanticipated discriminatory results. And these biased results are then used by humans to make decisions with implications that result in discrimination for communities of color.
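To make the mechanism described above concrete, here is a minimal sketch with entirely hypothetical numbers: two neighborhoods with identical true offense rates, but unequal patrol intensity, produce arrest records that a naive frequency-based risk scorer mistakes for a real difference in risk. Every name and figure here is illustrative, not drawn from any real policing system.

```python
# Minimal sketch (hypothetical data) of how biased historical records
# propagate into a "risk" algorithm: if one neighborhood was patrolled
# far more heavily, its recorded arrests are inflated, and a scorer
# trained on those records ranks it "high risk" regardless of the
# underlying offense rate.
from collections import Counter

# Same true offense rate (5%) in both neighborhoods...
true_offense_rate = {"A": 0.05, "B": 0.05}
# ...but neighborhood A was patrolled 4x as heavily, so 4x as many
# offenses there ended up as recorded arrests.
patrol_intensity = {"A": 4, "B": 1}
population = 10_000

# Historical arrest records reflect patrolling, not offending.
recorded_arrests = Counter({
    hood: int(population * true_offense_rate[hood] * patrol_intensity[hood])
    for hood in ("A", "B")
})

def naive_risk_score(hood: str) -> float:
    """Frequency-based 'risk': share of all recorded arrests."""
    total = sum(recorded_arrests.values())
    return recorded_arrests[hood] / total

for hood in ("A", "B"):
    print(hood, round(naive_risk_score(hood), 2))
# Despite identical true offense rates, A scores 0.8 vs B's 0.2, so more
# patrols go to A, inflating its records further (a feedback loop).
```

The scorer never sees race or neighborhood demographics directly; the disparity enters entirely through who generated the historical records, which is the "systemic error in the collection of data" described above.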
So it's important to note that we know that social media companies are not intentionally seeking to harm people and communities, right? But we do know that, again, the way that the data is used and the ways in which the tools and the platforms are used by people have resulted in discrimination. And so those are the issues that we are most focused on, and it's important to investigate them and develop policies to address them. And so what are some of the specific examples of ways that those algorithms have potentially produced even unintended outcomes? Yeah, so that's a really good question. Again, some of which I write about in my blog, we see the consequences of bias very clearly in housing, employment, and lending, industries where AI technologies are used, among many others like health, right? And so for instance, in the consumer lending fintech market, companies have used the data of African Americans to make lending decisions that have resulted in higher interest rates on home purchase and refinance mortgages. In the criminal justice system, black people have faced bias because of the data used by risk assessment tools to predict the potential for recidivism. We also see this in the employment industry, where companies have used AI recruiting tools. I mean, one company we've seen in the news in the past couple of years found that the data used by its recruiting system was not rating candidates in a gender-neutral way. And the AI models were trained on resume data from the past 10 years, which was composed mostly of white men. And so what we had was that the recruiting tool taught itself that male candidates were preferable. So these are just some of the issues that we see now, but there are a lot of harms that are occurring with AI tools being used in education and the health system. So what are some of your ideas about what can be done to address these harms?
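The recruiting-tool story above can be sketched in a few lines. This is a deliberately simplified, hypothetical model, not the actual system from the news: gender never appears as a feature, yet a scorer that weights keywords by their frequency among past hires still ends up preferring whatever signals correlate with the historically dominant group.

```python
# Minimal sketch (hypothetical data) of how a resume screener trained on
# a decade of mostly-male hires "teaches itself" to prefer signals
# correlated with male candidates, even though gender is never a feature.
# "football"/"volleyball" stand in for arbitrary gender-correlated keywords.
past_hires = (
    [{"keyword_football": 1, "keyword_volleyball": 0}] * 80  # dominant profile
    + [{"keyword_football": 0, "keyword_volleyball": 1}] * 20
)

# Naive "learning": weight each keyword by its frequency among past hires.
weights = {
    k: sum(r[k] for r in past_hires) / len(past_hires)
    for k in past_hires[0]
}

def score(resume: dict) -> float:
    """Rank a candidate by how much they resemble past hires."""
    return sum(weights[k] * v for k, v in resume.items())

equally_qualified_a = {"keyword_football": 1, "keyword_volleyball": 0}
equally_qualified_b = {"keyword_football": 0, "keyword_volleyball": 1}
print(score(equally_qualified_a), score(equally_qualified_b))  # 0.8 vs 0.2
```

Two equally qualified candidates get very different scores purely because the training data encodes who was hired before, which is why auditing the training data, not just the model, matters.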
Yeah, so one of the things that we're grappling with at the Joint Center is whether government needs to have more control and regulation over tech companies. That may be the approach that we need to take. And as many will remember, the tech CEO hearing about two weeks ago was trying to really get at these issues, because there are a number of harms, and what we can see is that we don't think tech companies are doing enough to address them. So, I mean, I think the first part of this issue to understand is that policy is not race-neutral. And when policy leaders do not examine pre-existing disparities, there's a risk of exacerbating them. And this also has to do with products that are not race-neutral, right? And having to examine the existing disparities to ensure that we don't exacerbate them. And what's essential to understand is that policy impacts specific communities in unique ways. And we need to understand the benefits and costs of technologies on communities of color. And we need members of Black, Brown, and Asian communities to be in leadership positions in the federal government, leading agencies that govern tech, to explore the specific policies needed to protect our communities. Lastly, big tech companies need to make fundamental changes by undergoing independent civil rights audits and by improving and enforcing their policies. There's a lack of transparency and accountability at tech companies around the mechanisms in place and the decision-making procedures within their companies. And we've seen this with a number of stories about universities that have sought to use data from technology companies to understand what's going on, and the pushback from these companies against using that data or the research that would be developed. We've also seen this in the news with certain prominent African American research scientists, AI ethicists, who have been pushed out of companies because of the research that they have produced.
So it makes us wonder, how much control do companies have over the kind of decisions and policies that they develop? And are they making the right decisions as it relates to the kinds of harms and bias that come as a result of their tools and platforms? I mean, lastly, I'll say that tech companies need to understand that their commitment to racial equity is not just about the murder of black people or even the lack of diversity within their company. Racial equity is connected to the future of AI technologies and their ability to prevent discrimination and harm. Yeah, there's so much to unpack in everything you just said. I want to start with, you mentioned understanding the disparate impacts. And so we at Knight Foundation are funding a lot of research with respect to social media companies. What are some of the areas where you feel like we've done a lot of research and some of those impacts are known? And what are some of the areas of opportunity where you say we really need someone to study this? Yeah, I mean, I think that there are a number of studies that really look at disparate impact, like I said, in employment, lending, and housing. And those areas are really important because we have laws that govern those industries. We have civil rights laws that are able to say that if there is discrimination occurring, federal agencies can step in and hold companies accountable. I've seen research also in terms of the kind of data that is used in policing, right? The kind of databases that have been developed to try to understand where crime is occurring or who might be committing that crime. I've seen research that addresses facial recognition technologies, showing that the ways that black and brown communities are being identified are negative or more harmful for our communities. I think there's much more research that we need in a number of areas as it relates to black communities.
There is certainly a space, a gap to be filled as it relates to these platforms. I know Howard University, which has funding from the Knight Foundation, is conducting some of this work to understand misinformation and disinformation in the election and how that is connected to how black folks voted, or whether they did not vote, and so on. I think we need more of that kind of research as it relates to the 2020 election, right? Also as it relates to the insurrection that happened on January 6th, to understand the kinds of communities and conversations that were happening online that led to what we all witnessed on TV. And so in order to do that work, think tanks, nonprofits, and educational institutions need to be supported, right? Specifically, HBCUs like Howard University need to be sought out to conduct some of this research. They are anchor institutions for the black communities that surround them. And I think they have a close relationship to those communities in order to really garner the kind of research and data and information that we need to make sense of what's going on. So I think there's an opportunity for much more work in this area, but also we need to connect the research to policy solutions, right? We need to understand, okay, these are the problems, but these are the kinds of strategies, this is the kind of language that we need to implement within legislation to ensure that our communities are protected. Absolutely. And so another key challenge as we're thinking about policy solutions is that technology rapidly changes. It's constantly evolving. And let's just say our policy solutions don't develop as quickly. So how do you see the tension between sort of the slower pace of policy and the rapidly evolving challenges in technology? Yeah, I mean, I think that's a great question. You know, policy, as I've learned in this new role, is so important to ensuring that there are guardrails for companies and for protecting communities.
And it's important that, even if members of Congress don't fully understand what's going on in technology, there are other stakeholders who are making that information palatable for people to understand and to address. And so we see some of this work in, you know, I've seen members of Congress working with AI ethicists as well as other kinds of technologists to understand, you know, the landscape as it relates to AI and tech, to try to develop the legislation to ensure that we do address those issues. But we know that it takes a long time for legislation to be passed, and it is important that members of Congress stay up to date with the kinds of topics and issues that are going on as it relates to the tech field. So Congress needs to keep front and center the question of the role of government in regulating big tech to prevent harm and bias. And we need federal agencies like the Federal Trade Commission, the EEOC, HUD, and the Consumer Financial Protection Bureau to use their power and authority to hold companies accountable with the laws we have now. We have places like California, Illinois, and New York developing measures to regulate the use of tech. And until there is federal legislation that targets AI technologies and their use, states will have to create their own laws to protect their communities. And many advocates in the space that we work in have hope that Congress will develop federal policies to address AI discrimination and data privacy in the coming year. We know it's an important issue. Obviously, COVID-19 is a very important issue, so those things are on the docket, but there is support from members of Congress and, like I said, tech advocates to ensure that we address data privacy issues and AI technologies. I want to pull the thread on what you said, because most of the time, I think when we think about policy, we think about the federal government. But as you mentioned, there are several steps that states are taking.
And so what are states able to do that the federal government can't, and are there key ways that states should be acting in this time? Yeah, well, I think it gets back to what I mentioned earlier. It takes a long time for legislation to be passed, right? You need everybody to buy in for a bill to come to fruition and to come into law. But states have a little easier process in terms of developing regulation to hold industries accountable. So that's why we're able to see those places pass privacy measures, like, let's say, in California to protect their consumers, or New York, which is dealing with issues around surveillance technologies and regulating how businesses use them. But what's great about it is that we actually can see how the laws play out in these states, which can help inform better decisions and strategies that we can make for federal legislation. Now, of course, companies don't like the fact that these states are coming up with their own policies, because then you have to deal with your products in California, your products in New York. Exactly, right? And so many of them do want to see a federal privacy legislation. But I think that, in the absence of one, we just need to see our local communities create laws to do what they need to do in the moment. And so I think there's much more that we will see in the coming years, until there's federal privacy legislation, that states will have to develop in order to, again, address these issues. Great, thanks. We're getting a bunch of really interesting questions from the audience. So I wanted to ask, what steps do you think that everyday users of social media and communities can take to push big tech to make changes? Yeah, I mean, as I mentioned at the beginning of our conversation around the importance of social media in black communities, being able to express your voice is very important in advocacy and social change.
Social change actually begins with dissent; there's upsetness, you feel mad because something is going on, right? That encourages you to want to do something, right? Which leads to this question. And I think that it's important to express the frustration, the anger, the feelings that you have about what's going on online, but also to show the kind of ramifications and material consequences that our communities are facing. And that kind of information urges and encourages, I think, government to pay attention and to notice that something is going on, right? Often when we hear members of Congress delivering their speeches at hearings and such, they actually talk about real-world situations, families that are impacted in a number of ways by the topics that they address. And so it's important to make your voice heard. It's important to write letters. It's important to be a part of coalitions. It's also important to be a member of organizations like Color of Change, which advocates for these kinds of issues and pushes back against much of what big tech companies are doing and saying. And so I think that's one of the many great ways to really push for steps that can help in advocating for these issues. Great, thanks. And we've got a question on the pandemic specifically. So what is the shift that was caused by the pandemic, with everything moving online? Like, how has that given the tech industry more power or worsened issues, or not? Yeah, I mean, I think that's a good question. I mean, we can see from the data that social media platforms and online companies have made a lot of profit over the last year because of COVID. And why is that? Because most folks were pushed online for doing business, for earning, for learning, right? And so, yes, I think in many ways these issues have been exacerbated, because we are engaging in so many conversations and doing so much more online. And so I think, though, it's an important moment, right?
Because we now start to recognize the harmful issues and actions that have come about because of users, whether it's white nationalists, violent protesters, et cetera. We can trace some of that behavior to the conversations on platforms and the kinds of groups that have formed, because people, in the moment of last year, also had frustration, honestly, with government. So we saw these issues in a much more prominent way because they were a part of our everyday lives. We couldn't escape it. We've also got a fascinating question about rural areas of the country, which we haven't really talked about, where people are suffering from a lack of broadband access. And companies do seem to be expanding access into these areas, but can you talk about what that means in these conversations? Yeah, so for us at the Joint Center, I'm actually working on a report on broadband in the Black rural South. I think it's important that we all understand that broadband issues impact Black Americans in urban and rural areas. And oftentimes when we talk about rural communities, much of what comes to our minds is white communities, but we know that there are a large number of African Americans who live in these rural states and rural counties, and so that is very essential. And so you can imagine that broadband issues that impact rural communities impact Black Americans in disproportionate ways as it relates to access to broadband. That has been essential, as we have seen, again, for earning, learning, for health, telemedicine, right? Often in rural communities, hospitals and clinics have moved away or shut down, giving Black communities fewer options for receiving services. And telemedicine is a tool that can be and has been used to connect these communities to doctors and nurses. But if you don't have broadband access, then you're not able to make that connection, right? And you miss out on those kinds of opportunities and services.
So we're really interested in understanding what are the kinds of policy solutions that need to be promoted to ensure that we're connecting all communities, right? And much of our focus as we advocate at the Joint Center needs to be on these communities that have a lot of poverty, which are located in what we define as the Black rural South, which is 156 counties in 10 states across the US. So we will promote that research as it becomes available, but I think it's a great question. And I think we need to make sure that we are focusing on these issues as we have an infrastructure plan that's being decided on right now. And we're trying to figure out how states should use monies that go towards broadband infrastructure to connect communities, children, and families, and to ensure that the monies that are allocated are really targeting the communities with the most need. Thank you. I think that's a really important note to actually end on. I can't believe how quickly the time has flown by. Thank you so much for this conversation today. I think it's hugely important and critical at this moment in time. So we're just so glad that you could join us here today. Yeah, absolutely. Thank you so much for having me. These issues are very important to me and they're very important to the Joint Center. And so we look forward to working with individuals, communities, and organizations out there doing this important work, and to being partners so that we can really advance the kind of social change that will be beneficial to Black communities and all communities of color. Great, thanks so much. You're welcome. Thanks. All right, to close us out, I want to make sure that our audience knows a little bit about some of the programming that we still have coming up over the next several weeks.
So one of the things that we're very excited about is that we're supporting a series of workshops by the National Academies of Sciences, Engineering, and Medicine, specifically on Section 230 of the Communications Decency Act, which is one of the key areas being evaluated in technology policy over the next few years. So if you're interested in that, there should be a link in the chat, and it's on April 22nd and 27th from 12 to 6. Just as a reminder, this episode will be up online, and you can find all of our episodes of Knight Live on kf.org slash knightlive. If you have any questions for me, please feel free to email fdshow at kf.org or find me on Twitter at Ashley Zahn. Our music today was by Miami songwriter Nick County. Thank you all so much for joining us.