Good evening, and welcome to the final panel of the convening on creating an anti-racist future. My name is Robert Hampshire; I'm an Associate Professor of Public Policy and Data Science at the University of Michigan's Gerald R. Ford School of Public Policy. We're really glad that you decided to stick with us to the very end and join us for this final panel, which will actually be a call to action for us as a public interest technology network and community: how can we move forward in proactive ways to create an anti-racist future? Throughout the convening, we've seen many researchers in the public interest technology community exploring the intersection of technology, racial equity, and social justice. For this panel, we're going to explore what it means to adopt an explicitly anti-racist framework to inform researchers and practitioners alike in PIT. You'll see that the great guests and experts we have on the panel will keep this free-flowing and lively here at the end of the conference. But ultimately, we want to challenge ourselves as a community to really move forward to make this anti-racist future. So with that, let me introduce the great set of panelists we have assembled here today. I'll do a short introduction for each of them, and then we'll get going to the meat of the panel, okay?

First, I'd like to introduce Dr. Jessica Simes. She's an assistant professor of sociology and associate director of research for the Boston University Center for Antiracist Research. Her research examines racial and health disparities in criminalizing and punitive experiences and the broader consequences of mass incarceration for communities and neighborhoods of the United States. She received her BA from Occidental College and her PhD in sociology from Harvard University. Welcome; we're really looking forward to this interaction.

Secondly, I'd like to introduce Dr.
Fallon Wilson, the co-founder of several organizations, most recently the Black Tech Futures Research Institute. Dr. Wilson was also the former research director at Black Tech Mecca. She has received many awards, including the Kauffman Foundation's Open Knowledge grant that actually helped launch the Black Tech Futures Research Institute. Dr. Wilson's research also includes looking at first-generation college students and alternative pathways within the tech ecosystem, some work that we've done at Michigan as well, looking at career pathways in tech ecosystems. So, Dr. Wilson, we're really looking forward to gaining more insights about your work. You've got a BA from Spelman College and also an MA and PhD from the University of Chicago. As a public interest technologist, I've followed you over the years as you discuss issues of race, gender, and faith. I just want to say that again: faith. I think there's a big role here for the faith community that is potentially untapped, so hopefully we can talk about that, the faith community and civic tech issue. So, welcome.

And certainly, last but not least, we have Mutale Nkonde, the founding CEO of AI for the People, a nonprofit communications agency. She started her career as a broadcast journalist, producing documentaries for many networks that you know, but she has also been an AI policy advisor who has worked very closely on legislation with members of Congress. During her time in AI governance, she was involved in launching and leading the first legislation around algorithmic bias and deepfakes in Congress. So she brings deep experience from broadcasting and journalism, but also legislatively and from a communications standpoint. We're just so fortunate and lucky in many ways to have you on the panel. So welcome, all three. I know that all three of you have biographies that are extensive.
And so we've really just scraped the surface, so we can jump into what we mean by creating an anti-racist future, particularly the role of public interest technology. So let me just tee Jessica up here. Given that you're the associate director of research at the Center for Antiracist Research at Boston University, can you first give us a sense: what does it look like? What does it mean to take an anti-racist view on public interest technology, and what does that anti-racist future look like from your view?

Thank you, Robert. And thank you all. I am so honored to be on this panel. I am truly, deeply humbled and so excited to share and engage in this wonderful and truly urgent conversation. I want to start with a really powerful quote from Dr. Angela Davis: "In a racist society, it is not enough to be non-racist. We must be anti-racist." I think, for us at the center, but for all of us who are working in public interest technology, being anti-racist is a radical stance and an active stance against racism, which is unbearable in our society. It is a crisis in our society. It is a pandemic in our society. So to be anti-racist is a radical choice in the face of centuries of dehumanization, of removing people from their ability to have lives full of dignity and well-being and political power. From a research perspective, we think that racist research historically asks the question: what is wrong with people, meaning what is wrong with individuals or groups, and why aren't they just figuring it out? We ask a different question at the center, and I think all of us do, which is: what is wrong with policy?
And our belief is that framing research on race and racism around anti-racist questions leads to anti-racist narratives, anti-racist policy solutions, and really impactful anti-racist advocacy campaigns that cut to the root of racism, which is racist policy. And I'll finish here: racial data science, I think, asks two important questions. How do we collect data ethically, in a way that can really shine a light and be an important way for us to share the reality of racism? And how can our approach to data itself be anti-racist as well? So I'm going to stop there and let my other panelists jump in, but that's how I would start answering your question.

No, this is great. That sort of lays the framework, with the quote from the incomparable Angela Davis, and that's just a really great way to start us off. When it comes to an anti-racist framework for public interest technology, let me maybe turn to Fallon to build off of the things that Jessica mentioned. Does that jibe with your conception of an anti-racist future and the role of public interest technology?

Yes, definitely it does, but I would probably add a little bit more to it, particularly through the lens of Blackness, right? I think when you think about public interest technology, you really need to acknowledge some things. We need to first acknowledge that Black and brown people have always done technology; this is not new just because we're now thinking about it in this new discipline. I think it's really important to think about the history of it, right? And of course, Charlton McIlwain's book Black Software should be foundational reading for all those who are thinking about building an anti-racist framework in this new discipline that we're trying to canonize.
I think you also have to acknowledge that data bias and big data and all the things we love to talk about in the network, and on the practice side as practitioners, are not new either, right? Black and brown bodies have always been tracked and surveilled, and various technologies have been used to do that over time. I think you also have to acknowledge, as we try to really operationalize what an anti-racist framework would look like for public interest technology specifically, that you've got to look at some of the research that's out there, like the work done by the Kapor Center and their Leaky Tech Pipeline, which looks at all the structures that layer and intersect to make it very difficult for Black and brown people to co-create this new tech world. And then I think you also have to interrogate the definition of public interest technology. I know that's something we don't like to talk about, because there are so many definitions flying around; I can't find the definitive one, right? Of course, most of them deal with the public interest, the public good, but we can problematize the public good and what it means for people who look like me. Sometimes it doesn't necessarily intersect. I'll give an example. One practitioner organization that you could say does public interest technology work is Code for America. Many of us are familiar with it, right? Back in 2016, I was one of about ten African Americans at the conference, and they did a presentation on how to make drones more efficient at dropping bombs on enemy targets as a public good. I sat there saying to myself: is this what we mean when we talk about public interest and public good? From an anti-war framework, and also as a person of color, what does that look like? And you think about how those technologies could also be used here. So I think you have to acknowledge that Black folks and brown folks have been doing technology.
You have to interrogate the concept, and you also have to interrogate the conception of public interest technology. How did it come about? What spaces was it nested in, and how do you really make it open and accessible to people who are not at research-one universities, who are part of various non-academic settings? Because it's always very interesting to me. Of course, I have a PhD. I went to the University of Chicago. I went to Spelman, which I love; shout out Stacey Abrams. But I also realize that we spend so much time doing research on the people, yet we have not figured out how to translate these worlds and these concepts of civic tech, gov tech, and public interest technology in a way that people would take up and join our cause. And lastly, I would say: yes, I think you have to interrogate the concepts from the beginning. If it's going to be a true anti-racist framework, you have to talk about the origin of it and how people use it and think about it. Who's at the table and who's not. And I think that's where we begin. So I'm going to pause right there for my other panelists.

Yeah. Dr. Wilson, that's incredible insight about interrogating the public interest, right? And looking at it from a critical perspective. One of my colleagues at the University of Michigan, Shobita Parthasarathy, has this term, critical data science. So is there a critical interrogation of what the public interest is? Certainly that hasn't been in the interest of African Americans and people of color in this country for an incredibly long time. So I think that's a great starting point: being critical about that. That's really a great insight. And so, Mutale, let me turn to you to tie together these definitions of what this anti-racist future might look like. I kind of look at you as a visionary.
You've always been ahead of the curve on many things in AI and data science and governance. So what's your vision for this anti-racist future?

First of all, my fellow panelists, I think, have done a really comprehensive job in terms of definition. But where I would take it a step even further is: when we're speaking about Blackness, what Blackness are we speaking about? I'm in the process of writing my first book, Automated Anti-Blackness. And I had to acknowledge that, as a woman who grew up in the UK, who is an African immigrant, I naturalized in time for the election, y'all. So I did what I had to do, because I'm clearly a Black woman and we did what we had to do; but that's a different Zoom. But many of the people who are prominent and get attention are African immigrants, right? So even when we think about this idea of Blackness, we also have to understand that within the tech industry there is particular pushback against African Americans, those who have been here generationally. And so as we're thinking about justice, there has to be intergroup justice, to make sure that if I'm in a room, then I'm in a room with Jamal, Keisha, Tanya and them. Otherwise, I don't know that it's truly reflective of this country. The other thing I would say, and this is where I stake my claim, is that I do believe that policy is the delivery system for ideology. So if I'm really going to think about an anti-racist future, then the way that those markets are reshaped has to be codified into law, and codified in such a way that it doesn't inadvertently promote white supremacy. Because I think all of us would agree that this racial caste system, to use Isabel Wilkerson's phrase, was something that was written into law. This is not natural; to be racist is not natural, right?
To be racist is really the result of laws, policies, and practices that have been adopted over 500 years and that bring us to what we think is natural. So in my speculative view of an anti-racist technical future, which is what I speak about a lot, we would have everything that has been mentioned prior, but we would also have changed our laws to incentivize that type of behavior. Can I give you an example? When we were writing the Algorithmic Accountability Act, for example, one of the things that we demanded was a kind of FDA-type agency. That meant that if you looked at an algorithm (we know full explainability isn't there because of IP), but you could see that the algorithm was discriminating against any particular group, it would not be allowed to be released into the marketplace. Versus what happens now: you and your white friends, probably male, design an app, it's in the marketplace, and then Black women's lives are in danger, or whatever. So that would be the first thing. And the second thing in terms of my vision for an anti-racist future: ultimately, I'm a professional communicator. So in the projects we've done, we use popular culture, we use film, we use journalism, we use all of these delivery systems to make sure that if you are saying defund the police, you also know that you're talking about facial recognition technology. If you are saying that you want to live in a just society, an anti-racist society, you're also thinking about the algorithmic decisions that are being made about you, and pushing against that. I'd be happy to speak more as we get into this conversation, but I would say those would be the two major things in my vision.

OK. Again, you've laid the groundwork for a vision that we can maybe all get behind. Really looking forward to your book when you're finished, by the way; a plug there.
But I think you've moved us to synthesize the things that Jessica and Fallon have talked about with anti-racism and what that future looks like, and you've also started to move us toward examples of what that might look like in practice, like the algorithmic transparency work in legislation that you worked on. Fallon mentioned Charlton McIlwain's book Black Software and how Black people have been involved in technology for many, many years. I find myself at this intersection. My PhD is in engineering, and now I find myself in a public policy school. But I know that my PhD was paid for by Bell Laboratories, and Bell Laboratories has an incredible history of Black scientists going back to the 1930s. They broke the color line before Jackie Robinson: one of them named Lincoln Hawkins, who basically invented fiber optics, and another named Dr. Jim West, who created the modern-day microphone. So the microphone on all of your computers was basically invented by a Black man at Bell Labs. There's this great history of Black people being involved in technology. And particularly now, I think we accelerate that beyond just involvement, converting it into actually anti-racist technology. Because one thing I have learned is that just having the bodies there in the same structures is a first step, but it doesn't convert completely into an anti-racist technology or perspective that pushes us forward. So let me turn back to Jessica here. Are there particular examples of work, be it in data science, PIT, civic tech, or more broadly, that you would hold up as work that moves us closer to this anti-racist future?

Yeah. I mean, first, the work of my colleagues on this panel, very much.
I'm really inspired by Data for Black Lives. And something that one of my panelists said earlier has really stuck with me, which is that I think we can think really expansively about who gets to be involved in data: data collection, data analysis, writing about and narrating the data. I think one of the most powerful things about the Movement for Black Lives, broadly, not just Data for Black Lives, is that the movement has really shown how much data can drive a whole new narrative and a whole new set of policies. But it comes from the community. And so, while there are so many wonderful organizations doing this, I also want to make sure that it is always anchored in the community that is most affected by all of these policies.

Yeah, that's just a great point, to keep us anchored in the community, the users, the people who can actually co-create these things. That's really incredible. So, speaking of the work of the great people on this panel that pushes toward this anti-racist future: Dr. Wilson, can you say a bit about Black Tech Futures and how that work fits into what we're talking about?

Is it possible for me to build on something Jessica said first and then jump there?

Absolutely.

I think you're right. I think part of it is that data can be liberatory in the sense of co-creation. But I've come to find that when we think about public interest technology research projects, once again, we have those issues that we're most comfortable talking about: biased algorithms, having a competitive market with ISPs, interoperability issues. We have very privileged conversations and issues we like to have conversations about, and to do research projects on, and to mobilize people on.
But then I think back. I said, can I just get a broadband data map of those who have access to the internet who look like all of us on this panel? The FCC has not updated those things, right? And companies will not give that data access. So we think about developing a pipeline of public interest technologists, and you want it to be as diverse and as varied as the various bodies and minds and spirits that are on this call. But I can't get there, because so many people who look like me lack access to quality, high-speed broadband. The challenge with that is, it is not a sexy topic. It is not talking about biased data. It is not talking about how Facebook algorithms are trying to get you. There's something very class-specific about it, right? And the assumption is that everybody has internet. That is not true. What we realize in this pandemic and this new remote-learning world that we all find ourselves in is that even if you do have access to broadband and you have five people at home all using the same connection, it drags and it creates disparities. I fear to think about the learning loss of Black and brown children, because we have yet to figure out how to address the broadband issue, because I can't get data on who actually has access. So when you think about a project for PIT, that is a project. Is it sexy? No. But is it needed? Yes. And so, part of the work that I do, and I'm excited because, you know, I had to fundraise for this institute to launch, and it's launching next week, is entitled the Black Tech Futures Research Institute. What we believe is that the true answer to policy, as my beautiful panelists laid out, is at the national level, but you have to start at the municipal level, right? You need to understand at the base what is going on in the municipalities as it relates to smart cities, and then how do you translate that to everyday Black and brown folks?
So they can understand it, co-create policy, and build an aggregation of conversations across cities, in the South in particular. I'm probably rambling, I'm a little excited, but I just believe that if we really are going to have an anti-racist framework for public interest technology, we will have to broaden the set of issues that we tend to talk about, and we really have to start at the local level. There are a lot of amazing university partners doing great work, and I think a lot of them are thinking about these things, but it is not, once again, I hate to use the terms, fashionable or exciting, because who wants to look at broadband maps? Nobody. But it's needed when we think about creating a foundation, and it's needed when we think about research, and it's needed when we think about how you build a movement. If no one can access the internet and we can't hold the companies accountable, it will only be the privileged and the Ivy Leagues in these spaces having these conversations, and it will never be a mass movement or equity in this new technology world we're building.

One of my colleagues at Boston University calls it digital redlining. Another thing I'd layer on, too, is telehealth: how many people without access to broadband or internet are not accessing very basic diagnostic services because they can't access telehealth? So it's a huge problem, and it's absolutely affecting Black and brown communities the most. And I would say very quickly that my work is at all levels. My next project is actually city-specific: New York City. So I would encourage us not to think about all our work in terms of one project; there's an expansive universe.

Yeah. So, Mutale, can you say a little bit more about that, particularly some of the work you've done at AI for the People? What can you tell us about this new project, or about examples of the kind of work you're doing?
Yeah, sure. So AI for the People started a year ago. I often say that we are a baby of the PIT network in many ways, because it was launched through a conference while I was at Harvard, called Black in 2020. At that point we were looking at two major questions around how social media algorithms would impact the 2020 election, going down the disinformation road, and we chose to follow racially charged disinformation agents. We were looking at three cities: Philadelphia, Detroit, and Milwaukee. So the research was broadly on the internet: how were these messages traveling? Then we worked with Black Lives Matter-affiliated groups in each of those three cities and launched a counter-narrative, because we are storytellers. Our partner was MoveOn.org, and we were able to launch a campaign called Vote Down COVID, because in each of those three cities we found stories, particularly from young Black men. The videos that we released spoke specifically to COVID, but even that was gendered: the guys would talk about losing their jobs, and the women would often talk about their families and life and being a teacher and being at work. Our target was a hashtag called Vote Down Ballot, which was released by a group called the American Descendants of Slavery. We started the research project in November last year and we're just finishing up our last data dive next week, and we actually spotted that Ice Cube was being mentioned and becoming active within this network around July. So I'm really looking forward to following that. And we had spotted that Black men were much more susceptible to the disinformation messages. So we're really excited about that work.
We outperformed the hashtag, and we placed an article in the Harvard Kennedy School Misinformation Review. What we're hoping is that that will lay the groundwork for disinformation work being done by Black people, for Black people, looking at Black things, because AI for the People is, friends, Black people doing stuff about Black people for Black people doing Black things. So we were really proud of that. That was our launch project, and we are just transitioning into a project that looks at facial recognition. Our agenda is set around the congressional work I was able to do, which was in three broad areas. First, algorithmic bias, which I always say is just the idea that computers can be racist, machines can be racist. We do that work because, to Dr. Wilson's point, if the old ladies that I go to church with don't know that an algorithm is biased, then I haven't done my job. What's the point of having a communications firm in the network if my mother from the church doesn't know that when she goes to get her benefits, there's nonsense afoot? That's really my audience. Secondly, in deepfakes, we look at what we call information integrity, because we believe that the first disinformation campaign was that Black people are not human. We take it there. We're not waiting for all of these fancy schools, and I know you're probably all watching me shout out the Ivy League, but Black folks knew about disinformation before you all wrote your papers. So the fact that we don't have more experts in that field seems very ludicrous, and we're interested in changing that. And the third thing we look at is biometrics, just because, in terms of market forecasting, technology is moving away from the screen and really toward ambient technologies.
So things like facial recognition, gait recognition, Stingrays. And we were really fascinated with the George Floyd protests and what happened within the context of COVID and racial justice, because we saw two things. Jessica, I am so excited that you're here, and I hope that we will get to collaborate going into the future. Dr. Kendi's book was number one, because people had this renewed interest. And The New York Times said that the Black Lives Matter protests after George Floyd may be the largest social movement in US history. How did they know that? They scraped social media data, looked at where people were uploading, and found that 500,000 Americans of all races reported being at a Black Lives Matter protest on June 6th, from unique places all over the country. So from an organizing standpoint, that's a lot of white people out there saying Black Lives Matter, and we don't live everywhere. So there were places and spaces where we are not, but that message had been found. And so we were really interested in, well, what was the surveillance around that? So we are doing work with Amnesty International, specifically looking at the NYPD and their use of biometric technology on protesters in New York City. We have all the FOIL requests out. Shout out NYPD: we will sue you and get our data. Shout out Department of Justice: y'all are about to be out, so we will hopefully have someone we can work with. And then we're really looking at where these technologies are placed, much like Dr. Wilson said: if we don't have these maps, we don't understand the scope of the problem. What do these contracts say? How are they used? And then we're working with activists. We're finding that the Black activist community really gravitates toward our work. And I think that's just because all movements have art, and we create art around these themes.
And we're producing a film that goes through not just the activist story and the COVID story, but also the way biometric technology was used to arrest two activists here in New York City. And you can read about it; if you can't find it, I will send it to you. You can contact me. But that's the work that we're doing, and our goal is to create the type of pressure that will push for a ban, because again, AI for the People really does believe that policy is the delivery system for ideology. And so we want to change the way stuff looks, whether we are working on that project or not. So, we're a baby org, but our model seems to be really robust.

Yeah, what an incredible set of activities and partnerships and community that you all are building and leading. I think one thing for PIT-UN, our network, is really embracing this broader notion of, one, what technology means, and we'll come back to that, but also the set of stakeholders and people who are involved. You said something that was beautiful: movements have art. You all are creating art, and there are activists involved, and various data scientists and storytellers. I think that broad coalition is something that we can really move forward on, not just in some academic way where we're researching and writing papers, but in the real world, on the ground. And I think that's...

And policy makers too. As you and I shared earlier, I was surprised when I got an email from the Biden folks saying, we've been following the work that you're doing; this is something that we're interested in; send us ideas, send us proposals. So policy makers at all levels.
On the facial recognition work, we're working with the office of the New York City Public Advocate, because they're really interested in introducing legislation building on the stuff that we had done at the congressional level, and they want to see whether it would work at the city level. Because they need art too. Plus, this is fun. Everything in our work is centered in Black joy. If you just want to have some fun and advance some justice.

For sure. And I'm going to turn to Dr. Wilson here in a second, but this idea of really proactively, directly being pro-Black, and working on turning data on its head for that purpose, and having that up front: oftentimes, and rightfully so, we think about data bias and misinformation, and those are issues, but we can also turn those things around to proactively help build social movements, to actually think about outcomes and education, and do it in a way that's proactive. I think that's part of what this anti-racist future looks like. But let me take a little step back. We've mapped out a bit of this vision and some of the exciting things that are happening, but part of the PIT University Network's work is also getting students, traditional and non-traditional, ready to engage in this field and this work. I know that Dr. Wilson particularly thinks about career pathways and the ways in which people find their way into this kind of work. I think you have some interesting perspectives on that. Do you want to tell us a bit about some of your work?

Yes, yes, yes.
I just want to say: four or five years ago, when public interest technology was coming on the scene, I said, and I still say, that for first-generation Black students this should be the ultimate career choice, primarily because they major in, and also tend to work in, human-service-based careers in government, right? Social work, anything relating to impact and community. Even first-generation college students who major in STEM and computer science will tell you: the reason I do this work is not to be the Mark Zuckerbergs of the world, in disruption; I want to have community impact. And so for me, I feel like I've preached it to many foundations, and if you're watching, give me a call: we really should be thinking about how historically Black colleges and universities should be nested within PIT. I know Howard is a part of the network, and I want to shout out Andreen for working very diligently to get more historically Black colleges and universities to be a part of this network, primarily because of what they stand for, right? They are pro-Black, let's be clear, and they are about creating Black and brown leaders who are committed to justice and to creating an integrated narrative of what it means to co-create a future for people who look like me. So for me, public interest technology was a natural fit for first-generation Black college students. Even for me: Robert, you talked about how you were an engineer and now you're in this space. My research was really in gender studies; it was Black feminist to its core, looking at how to create safe learning environments for Black and brown girls, right? I tell the story of how I got into technology. Unlike the beautiful Mutale, I can't code myself out of a paper bag. Literally, I cannot. Matter of fact, when I tell people what code is, I say it's a bracket, zero, zero, slash, bracket.
That's what I tell them, right? But listen, the reason I learn about technology, why I learn about interoperability, why I learn about biased data, is because I want to be able to translate this world for people who look like me. I grew up poor. I had an uncle who drove a truck. He made $24 an hour. When we had a crisis, we went to that uncle who had the $24-an-hour job and benefits to deal with the issues in our family. When I found out that automation is likely to decimate truck driving and other types of low-skill work, it worried me. And I said, I cannot be someone who is committed to the liberation of my people if we cannot figure this out, right? And so I do this work of translating, and I have been a public interest technologist since 2016 outside of a university setting, right? Telling the story of how we have to better translate this world. I will say it again: we have very privileged conversations with either academics or activists, but in the middle there is a whole different group of black and brown people who know nothing about this world but who have been pushed into it because of a pandemic. And now everyone is trying to figure out: how do I do remote learning? How do I get access to broadband? How do I connect with my church and my mosque and my family? How do we do community? And so technology has come and just landed on them, and we have not prepared them to figure out how to negotiate it so that they are not taken advantage of by Zoom. I'm not saying Zoom is taking advantage of them; I haven't done any studies, I don't know. But I love that every black church I know is on Zoom. The larger point I'm trying to make here is that if we are really trying to grow a network of additional types of students who will go on, once we canonize public interest technology, we need to ensure that historically black colleges and universities are part of this discussion.
We have to think about the frameworks and the issues, and they are going to have to be broader than what we tend to talk about. And lastly, you need to talk to these amazing people on this panel and others like us to access our stories of how we decided to do this work. It is not simply because I want to disrupt and be a Mark Zuckerberg and have a unicorn tech startup. It is because I care about the lived experiences of black people, about their ability to say how they live their lives in a fully automated world. And I think that's where I would land on that. Yeah, so Dr. Wilson, I think your story is so powerful, and actually an exemplar of many others, people of color particularly, who are working in this PIT space. I was leading a project at the University of Michigan for our PIT year-one grant where we interviewed sets of PIT practitioners of color to get their stories. I think I've talked to some of you already. We really tried to distill what those career pathways look like. Oftentimes, some are people coming straight from STEM and technology pathways, but life can go in many different ways. Many people, particularly people of color, are coming from non-technical backgrounds: a lot of people from marketing and communications, people from the arts, who are now working squarely in this space and actually leading. The pathways we're finding, from our interviews and also from data, show a very complex picture of how people get into and move through career paths. And that's something students of all kinds, traditional and non-traditional, can really engage with. This is an open, interdisciplinary field focused on impact, and anyone who is interested in that impact and in engaging with the community can really plug into it. So I think your story is so great.
And so let me turn a little bit; we have a couple of questions related to this. Someone asked a question, and maybe this is best for Jessica, about students who are interested in anti-racist work and research: do they make that connection with technology, in your experience? Such a good question. And it actually builds on what Dr. Wilson was saying, in my view. I look to our universities: how much are you prioritizing anti-racist curriculum? Is it relegated to half a day in your courses, or is it a required course for your computer engineering students and your data science students? Do you prioritize this in your broader curriculum, and how do students feel when that isn't a priority, when they don't feel like their experiences are reflected in the broader curriculum, especially in technology? Something that we want to do at the Center for Antiracist Research is build a set of curricula and tools to provide whole semesters' and years' worth of education around what it means to practice anti-racist data science and anti-racist technology broadly. But to push on the idea, and we were talking about pipelines and career trajectories: I have had black students come into my office and say, I've never had a black professor. I've never taken a class on anti-racism. And that to me is a total institutional failure. We can do better and we must do better. Our courses need to reflect this, because frankly, you can't work in technology without the understanding that what we're doing is reproducing inequality unless we interrupt it and stop it. So it's a failure in our education. It's a failure from a justice perspective. It fails our students. And so I think my students in particular are really eager to learn about these things and to find experiences both inside the classroom and beyond.
No, that's a good insight, particularly about building curriculum, because many universities in the PIT-UN network have projects about building curriculum, particularly in computer science: really rethinking the ethics component, and now going beyond it to put anti-racist content into the computer science work much more directly. So I think many people are probably going to be reaching out to you, Jessica, about curricula for students and technologists to go through, particularly with this focus on anti-racism. That's really strong. Now I want to open it up again for questions for those who are sticking with us here on this Friday night for the last panel. Please put your questions and insights in the chat window, and I hope to get to the bunch of them that are coming in. The next question ties into something that, Mutale, I think you may be best positioned to answer. Us academics can get into curriculum and writing papers, but how do we translate that to practice? I'm a professor in a public policy school. How do we convert and translate those things into policy action? I think you touched on it a little with some of your experience, but do you have any advice on how to translate this PIT research into practice? In my particular case, I had been an active member of the Congressional Caucus on Black Women and Girls since about 2013, because they had met me when I was contracting with Google and doing a lot of community engagement work in the New York City area. So there was already trust and credibility. As I built relationships with that particular network, I started to learn the policy priorities of folks. I was very strategic: I was looking at folks on the committees that were overseeing technology.
And I found that they were interested in diversity in hiring in the field, which I thought was kind of a red herring. That's not really the real problem. I've been in those tech firms; they hire black people all the time. Those black people stay for fifteen minutes, because they're not here for that, and they leave. So I was able to push them towards algorithmic bias, especially after Cathy O'Neil's book dropped, because it was so easily digestible. I could then go to them with something they could understand, really aligning with their existing priorities and then illustrating how technology was changing them. So, biometrics: for No Biometric Barriers, I was working with somebody who was interested in housing writ large, right? Who had been looking at public housing, who had a long history in public housing. And then here I came and said, facial recognition is going to change housing for black people, not just as a technology in the entranceway, but do you know about Ring doorbells? And that started with the algorithmic bias; I really have to thank Cathy O'Neil's book for even raising the point, the idea that technology wasn't neutral. And then with deepfakes, that became a real issue because I was in the Congressional Caucus on Black Women and Girls, and we were at that point reading Safiya Noble's book about the pornification of black women and girls, Algorithms of Oppression. And then there was a deepfaked video that had been made of Scarlett Johansson at the beginning of 2019, and there was all this press about Scarlett Johansson. What happened to her was terrible, but I was then able to go in because I was in a black feminist space. The Combahee River Collective was something we were really interested in. We're reading Audre Lorde, we're reading bell hooks. Like, I'm with my people, right?
And so then it was very easy to make a black feminist case for intervention on that particular technology. That's my particular story. I think for people in the network who are interested in impacting policy, the first thing you have to have is real relationships. No matter how many papers you write and how great you are in the academy, that actually doesn't track outside of the academy. You have to have some type of impact. I mentioned these incredible books that women had written, for example; that was a way of having impact. And that gives you a platform, because often you are speaking to people who have three minutes to hear what you have to say, and what you say in those three minutes has to align with existing priorities. So my advice would be that the same work we put into making sure our papers can withstand critique, because our methodology is good, has to then be applied to the policy space. And we're at this exciting moment in some ways, where we're getting a new administration that is more interested in science. But let's not joke: this is a neoliberal ticket who are already hiring people from industry. So for the next four years, if you are somebody who is interested in racial justice and technology, in racial literacy and technology, which is something I speak about a lot, and in anti-racist technology, then we have to build a movement, because Eric Schmidt is not it, and he is their number one person. But I've been here a while. This is what, my eighth, my ninth year? They know when I'm coming; I'm about to set it off. So, we'll move on then. You're bringing up a great point about having what I call a dance partner, right? When you're building those relationships with policymakers, you need a partner on the other side who can really elevate those issues and actually value them, but it's a conversation.
You're a communications person, so those lines of communication and the language need to be clear, so that we're addressing issues of importance to them and their constituents. There's a question here. Yeah, sorry, go ahead. The thing is that we always partnered with researchers. Researchers in the PIT-UN network, for example, end up being great partners for us, because if they cannot translate their research, we can. That's our job. And I'm always looking for people to testify on panels, because for the people who get to give testimony, being written into the record, whether it's at the state, city, or municipal level, is really important, and there should be a diverse field. I definitely take an organizer's stance: if I'm the only person saying this, I've done something wrong. In every city, in every place in this country, there should be somebody saying it. And if we look at this last election, who delivered for us? It was the hoods of the cities. It was people in the hood in Detroit. It was people in Philadelphia. It was people in the hood. So those people need to be represented, as well as the academics doing that work. Quite frankly, I'm very academic-adjacent: I've taught these classes at Stanford, and I'm on another fellowship at Notre Dame, where I'm teaching a class around feminism and technology, black feminism and technology. And I'm often the only person they've ever seen saying these things, and I'm oversubscribed, because people don't even believe it's a thing. They act like it's magic. Yeah, I think the role that you and folks in your position play in this public interest technology network is very important: to communicate some of the research, translate it to policymakers, and facilitate those connections and interactions. Part of being a partnership community is that we have to embrace all aspects of it. So this is a really great point.
We're reaching the end of the panel, but I want to come back to a question I brought up at the beginning, something Jessica mentioned and that Dr. Wilson and I touched on just briefly at the start: the role of community. You said, and I have found this too, that students who want to get into this space are really motivated, again, not necessarily by disrupting some technology or wearing a hoodie; they want to build things for community, for their communities, from problems they have lived through, and solve them for people they love, right? So that community aspect is something I want to bring us back to, bringing back the human element, the person element. Can you say something from your experience about community? We've touched on communities of faith. What's your perspective on those? Actually, I was saying we really didn't touch on communities of faith, but we really should. So let's touch on it now, in the last couple of minutes that we have. Yeah, and I'm going to be quick, because I know, Jessica, you've got something amazing to add to make this sense of community beautiful. I would simply say that the personal is political, and the political should be public interest technology. And I say that because the whole notion of owning one's space, one's narrative, requires that you are able to translate it to people who look like you. It is one thing to have, once again, a messaging campaign that is impactful for those in the congressional halls. However, it is not sustainable unless you have a movement of people behind it. And that is the reason I love faith communities for having these conversations about public interest technology, about civic tech, about gov tech, about activism, and what I'm now calling Black Church Futurism, right?
If I said it, you heard it here: Black Church Futurism. These are places, outside of historically black colleges and universities, where you have, at least even virtually now, congregations of black bodies and minds thinking through concepts. And if they can take deep concepts on death and on life, they can easily take deepfakes and say, no, it is not a Michael Kors knockoff bag. Get it? Deep fakes. They can translate to our communities what this world is, how they can co-create it, and why they should matter. We have been very good at creating a vernacular around incarceration and what it would look like to be a liberating space against that norm. But we have not done a good job with this new world that we are so excited about on this panel. I can't figure out how to say it: how do you say "anti-racist" to Mother Montgomery, who has been over the church deacon board for like fifteen years, without her looking at you like, Fallon, speak a language that is familiar to me? And so that goes back to another script, another code. And I love how we use metaphors. Yes, the New Jim Code, Dr. Ruha Benjamin, Spelman graduate, right? But also the code, right? What is the code that we need to translate for communities? And Jessica, what have you got? Come on, take the assist. It's so hard to follow that. But what I want to say is that the community is at the center of what we want to do. The community knows what it needs. It knows how it has survived, and it will continue to survive. Data and data science can lead to accountability, but it's also about collective action, and it's about voices, and it's about stories, and it's about the public good that can come from centering the community in our work.
And, you know, I think about my work on mass incarceration. People have survived this, people have lived through this, and we have to listen to them for those solutions, because the people who are most affected by racist policy are the people who have the understanding, the knowledge, and the ability to solve those problems. We cannot silo them. We cannot keep them away from this process. We have to center them. They have to be not just at the table; they have to be the table, and build the table. They are the whole reason we are here. Yeah, what a great way to end the panel, with that call to action. This is the vocation: for us to really go from here, from this convening, to our work, to our lives, to our communities, to help create this anti-racist future. So I just want to thank my three incredible guests, and all of you here at the convening, for joining us for this last session. And I want to thank, from the depths of my heart, New America and the PIT-UN community. With that, thank you.