My name is Jenny Toomey, and I am the director of something called the Catalyst Fund at the Ford Foundation, and we're going to talk about that. And I have been given the incredibly wonderful task of talking with Ezenay here about public interest technology and why it's so important, and how it's going to make a better world for us or prevent a way, way, way worse world. And in order to do this quickly and get to sort of the meat, because that's what we want to do, which actually reminds me there's a little bit of housekeeping: we'll ask questions, but there's also an app on your phone if you downloaded the SoCAP app, and you can start sending questions as you have them. And Erin will let me know if there are things we should be asking to be more responsive to you, because frankly we know a lot about public interest technology. We think about it all the time, but it's sometimes a complex concept. So if there are things we can direct our conversation towards to make it more useful for you, that's what we want to do. I'm also going to do a little sort of preamble of information for you, so that we're all on the same page. And I'm going to do it from my notes so that I don't forget anything. So why are we talking about this? Well, the rapid growth of technology is absolutely transforming every aspect of our lives. We know that. You guys live in San Francisco, or you're in it. You know how much the world is changing. But the workforce that's actually creating the entire system that we're living within doesn't really represent the rich diversity of knowledge and individual experience that we have in this world. And so that means that we're actually designing technology that doesn't necessarily work for everyone, or that is missing some of the wisdom that would actually allow it to be better. A wonderful organization that works on these kinds of issues reports that Black women represent less than 2% of the tech workforce.
And that Latinx women represent 1.5% of that workforce, and that Native American women are 0.01% of that workforce. So when people of color do not enter tech, first of all, they earn less than their peers. They tend to have less senior roles. And this has incredible impacts both on design and on remuneration for these individuals. The opportunity gap results in a pretty homogeneous workforce that sometimes reproduces systemic inequalities in the systems that we all have to use. And we've all heard about algorithmic bias and some of the consequences of these kinds of design decisions, like rampant online gender violence, disinformation, polarization, and other kinds of issues. And these systems foment a kind of inequality that harms the most impacted populations of our society. So this is one of the reasons why the Ford Foundation, which is a social justice foundation, not a tech foundation, really cares about these issues. Every issue we work on is being either advanced or undermined by technology. And we believe part of the solution, part of making things better, is to increase the diversity of the individuals who are designing the technology that we have. There are organizations that are working to build a pipeline of diverse technologists, and we're really lucky to be joined by Ezenay, who is a board member of one of those organizations; she'll be talking about it today. And we'll review how we got here, and what barriers individuals and systems face in solving these problems. And with that, I think we understand what at least we're trying to talk about. There's a question already: how does Ford define public interest technology to differentiate it from not-for-public-interest technology? Right. And I'm happy to answer that as I'm doing what I am next required to do, which is to introduce myself and to ask you to introduce yourself. But I'd also say that we love questions.
You can raise your hand, but wait for the mic, because this is all being recorded. I should have said that before. So how does Ford define public interest technology? I've been at Ford for over 15 years. I came in to build a portfolio, it was called Internet rights back then, but it was trying to build a field of advocates who could ensure that the long-fought-for rules and protections, human rights, civil rights, are not corroded by technology, and that there are clear ways to protect the public in the technology environment. So that was my job. And most of the people I was funding were not technologists. They were policy advocates. They were researchers. They were media experts, right? Because this was quite a while ago. And the one thing my grantees desperately wanted was technologists. Not just technologists, because they didn't think that these things would be solved just by technical solutions, but technologists who had the ability to translate and collaborate. So technologists who could work with lawyers, with policy experts, with advocates to understand what are the harms we're experiencing and what would be the protections and fixes we could put in place. So I went looking for these technologists, and I couldn't find them. There were very few technologists being trained with these sort of hybrid ethical values, and there weren't pathways for them to go into government or civil society, or to elevate themselves within philanthropy. And so I went to my boss, Darren Walker, who runs the Ford Foundation. I said, you know, Darren, we need these technologists, but we don't have them. I can't find them anywhere. They can't even conceive of roles they would play in these other sectors. And he said, we've done this before. This is something we can work on. Because back in the 60s, during the civil rights movement, there was a deep need for another kind of technology: the technology of law.
We needed lawyers who were going to represent people who were engaging in civil disobedience. We needed people who were going to establish the legal defense funds to bring court cases to establish rights and protections. We needed people going into government, into agencies that would try to enforce those rules. We needed jurisprudence that could knit up to the Civil Rights Act. And none of that existed. It's funny, because we just think, of course you can get a law degree and go and do all this stuff. But back then it wasn't so simple. And so investments were made by Ford and other foundations to build all these things: pro bono work, legal defense funds, legal defenders, regulatory agencies. Darren said we need to do the exact same thing now in technology. And so now I run the Catalyst Fund. It is a three-year, $50 million investment that Ford made, and we're in our last year of it, to basically build out the infrastructure in the same four sectors where public interest law existed. So the first of those sectors is academia, which is the supply side. We need to actually train technologists differently to be able to be effective in these different scenarios, with a values-based framework. So they're not just solving the technical problem, how do I get X to do Y quickly and efficiently, but thinking about what happens when X does Y. And if we do it this way, are there harms, and how do we mitigate them? The other three areas are all demand side. So government doesn't even know it needs these technical experts. It doesn't know how to recruit, place, and hire them. So how do we create pipelines for that? Civil society is always sort of starving, right? They're all on starvation cycles. They generally don't have the extra money to bring a technologist onto their team, or even know how to do that. So how do we build up the tech capacity in civil society? And then finally, even the private sector.
Because in an environment where there are consequences for designing things that harm society, we're going to need to actually retrofit our engineering teams and our design teams and our data teams to include a different kind of intelligence, one that can think not just about how do we do this quickly, efficiently, faster than our competitor, but how do we think about the consequences of doing it this way versus that way? And how do we ensure that the things we're building enshrine the values that shore up democracy, equity, and the other things we care about? So that's what we think of as public interest technology. Aspects of this have existed forever. There's civic tech. There's community tech. There's tech for good. We're not saying this is supplanting those. We see it more as an umbrella framework in which we can raise up all the good work people have been doing in other areas and figure out how to make it permanent, professional, the kind of thing where a technologist can make a choice and feel like they can have a really rich and lucrative career, but not just in the private sector. Okay? Any questions about that? Good. All right, Ezenay. Who are you? Yes, who am I? So my name is Ezenay, and I am currently a board member for an organization called Black in AI. And Black in AI's mission is to uplift Black AI researchers and provide support, resources, and visibility to researchers who maybe didn't have that visibility before, or didn't have a place in this field. I'm also concurrently a PhD student at UC Berkeley in computer science. And a lot of my research I would characterize as public-interested technology.
So to me, what that means is not only building technology with the public and society in mind, but also interrogating the current systems, and potentially redesigning those systems, to encode the values that we want them to encode; to make sure that they're not harming the communities, my communities, that are definitely most impacted in the wrong kind of way; and designing these technologies with all the people who are going to be using them in mind, especially given the fact that AI is impacting so many different sectors, so many different communities. So it's making sure that everybody has a seat at the table, and everybody is involved in the design process and in how it's integrated into society. Yeah, so that's a little bit about me. That's great, and that was going to be my first question: what does it mean to you? But I'm not going to let you off the hot seat. I'm going to ask you another question now. Are you ready? I'm ready. Okay. So how did you even end up in this field? And what inspired you to stay in it? One thing I can say is, when we tried to find those technologists, like eight years ago, we did this huge scan and we found about a hundred different places, from elementary school to actual career placement, where the kinds of technologists we were hoping to have, people trained with a different sort of value set, would fall out of the pipeline. Everything from public schools not requiring coding, so that people fall out of a pipeline that would get them into a job at a professional tech company, all the way through to cultural concerns about not wanting to work in that environment. But the thing that was so surprising to me, and one of the biggest barriers, was that these technologists couldn't even conceive of what a career pathway in other sectors would be.
You know, they were going to change the world through an app or through a startup, but they couldn't imagine how they could change the world in Washington, or in an NGO. And so how did you even conceive of this? Yes, so I think I was definitely one of those students, in elementary and middle school and high school, honestly, where I did not know what I wanted to do, and didn't think that what I'm doing now was even possible. When I was in high school, I was very interested in advocacy work, especially in education. That was always a passion of mine, and when I went to college, it was a similar story. I volunteered in a lot of organizations that did math or English tutoring for middle school and elementary school children. So that has always been something near and dear to my heart. And honestly, when I got to college, I started in the lowest-level math class, and I really thought that I could not continue. But starting in that class, I think, was the reason I got to where I am today, because it really did instill in me that math isn't this kind of scary thing that we make it out to be. I mean, it is scary, but once I had that good experience, I could draw on it to motivate me to continue. And I realized that I actually really liked math, so I just wanted to continue. And then I realized that there was a need to combine them. That wasn't until my senior year, when I took a class with another Black woman technologist at my university. And she was actually the first Black woman to get a PhD in CS from MIT: Latanya Sweeney. And yeah, yeah, some of you know her. Yeah, no, amazing, amazing.
For those who don't know her, I think she's most famous for a study she did on the Google search algorithm. She Googled her own name, and found that the search engine came up with advertisements associating her name with having a criminal record, ads for a lawyer to get you out of jail, or something like that. Just from her name. So a lot of that is associated with whatever biases were encoded in the algorithm at the time. Since then, I think Google has changed it. But in general, there have been other studies of this technology coming up with results that are not great, really horrible, for marginalized communities, and for Black women especially. And so after taking that class with her, I was like, wow, this is my first example of a Black woman who's doing this amazing work, combining her skills as a technologist with her love of and interest in public service and in helping her community. And after that, I was hooked. I was like, this is what I need to be doing. This is something I really care about. This is where I can combine my skills and put them to greater use in service of society and the communities that I care about. And then I guess I got involved in Black in AI because I looked around in my classrooms, in my math classes, and didn't see many people who looked like me, who even wanted to do the same thing that I wanted to do in terms of an interest in public-interested technology. And so to me, the work of Black in AI is about increasing the visibility and the numbers of the people who want to do this work, and in general of people who just want to be in this field, as it's doing so much. So I think the word that comes to mind is access. That is a huge value of mine, something I really care about.
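[Editor's note: to make the mechanics of that kind of ad-delivery audit concrete, here is a minimal sketch of the disparity measure such studies report. All the counts below are fabricated for illustration; they are not Sweeney's data, and the function names are invented for this sketch.]

```python
# Hypothetical sketch of the disparity measure used in ad-delivery
# audits like Sweeney's: how much more often does an arrest-related
# ad appear alongside searches for names from one group than another?
# Every number below is made up for illustration.

def ad_rate(ad_shown: int, total_searches: int) -> float:
    """Fraction of searches where the arrest-related ad appeared."""
    return ad_shown / total_searches

def disparity_ratio(group_a: tuple, group_b: tuple) -> float:
    """How many times more often group A saw the ad than group B."""
    return ad_rate(*group_a) / ad_rate(*group_b)

# (ad_shown, total_searches) -- fabricated example counts
black_identifying = (60, 100)
white_identifying = (20, 100)

ratio = disparity_ratio(black_identifying, white_identifying)
print(f"Arrest-related ad shown {ratio:.1f}x more often")  # 3.0x on this toy data
```

A real audit would use many names per group and a statistical significance test, but the headline number is this kind of ratio.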
And so my work in Black in AI, I think, lends itself to that, in terms of providing more resources and opportunities for people who look like me, and also those around me who are interested. And I realize I don't have a question here, but I want to make sure that we understand a little bit about the origin story of Black in AI. Can you talk about who started it and what made them start this organization? So the origin story of Black in AI: it was founded originally by a group of people, but the two main people I'll mention are Timnit Gebru and Rediet Abebe, who were, you know, at the time among the few Black women in the field. They met at a conference and just looked around and were like, nobody else is here who looks like us, and kind of bonded over being the only such people at a top AI conference with thousands and thousands of people there. And so they decided to start a Facebook group and just invite people they knew who were in the field or interested. So it started, I think, with about 10 people, and from there word of mouth kind of spread, and it grew to the thousands that it's at now. And the next year, they decided to plan a workshop at, again, a top machine learning conference, one of the top ones. If you're an AI researcher, you know this conference, called NeurIPS, and you're going to be there, because that's where all the companies come, and that's where all the top research in the field is presented. And so they hosted a workshop there, and invited and paid for hundreds of Black AI researchers to come be at this conference to present their work, and just offered them a platform, a stage, to tell their story and present the work and the things that they're interested in. And from there, it's just kind of ballooned. Like, now we have a workshop every year.
We host it at different conferences, so not just this one, but other smaller conferences, and also conferences in other fields, to try to bridge across fields. And in general, we've just seen Black in AI have a lot of impact on Black researchers' lives, including my own. Before I was a board member, I was also just, you know, a member trying to figure out how to get involved. And so I went to the first Black in AI workshop, where I met Timnit and Rediet in person, and from there was really eager to help. So the next year, I was organizing the workshop, and the year after that, they asked me to join the board, given how much work and thought I had put into building the organization. And so, yeah, right now we have a lot of different programming. Essentially, right now we're trying to strengthen our academic programming, creating this pipeline that you mentioned of building technologists and equipping them with the skills and resources that they need. So the pipeline from undergrad to grad school: we're strengthening that pipeline by providing resources for people applying. It's really expensive to apply to grad school. The application fees, the hidden fees that come up, really can prevent people from applying. So we provide financial resources, and we provide knowledge resources, because not everybody knows which schools to apply to, why to apply, which programs, or how to write a personal statement or a research statement. And especially when you're coming from schools in other countries, where the system is different from the U.S. system or the Canadian system or whatever system you're applying to, we're just trying to provide those resources so that there's not an information barrier. And also community, because I think the one thing, even though this...
I'm in a PhD program currently, and, you know, it's really difficult. I actually was in a previous program, but ended up not continuing in it and switching into the program that I'm in now. And throughout my career there have been numerous times where I've just really felt alone, not feeling able to continue, you know, just losing hope. And this year, thankfully, in my program we have an unprecedented number of Black PhD students in our cohort, which really does make a difference. I can't tell you how much just having that community, having other people who kind of understand your experiences, where you're coming from, who you can relate to in this way, makes a difference. It really, truly does. Even just in the audience, I have a fellow PhD student here with me, a Black woman. So just having that support, I think, really does make a difference in terms of longevity through the program. And so, in general, trying to provide that for people across the globe, honestly, is the goal of Black in AI, and why I'm still here. That's a great story. So would you like to ask me a question? Yes, I wanted to ask: how is the Ford Foundation thinking about public interest technology now? Like, why now, and what are the plans for the future? So I said a bunch about our approach, about the different quadrants that we're investing in, but I can talk a little bit more about the why of it. You know, I'm going to be 55 in February, and when I went to college, I had a word processor on my desk. It was not a personal computer. I think there probably were some, but most college students didn't have them. Most people who are in charge of everything at this moment are a little bit older than me, so they have that experience, maybe even more extremely.
And so what you find is congressional offices, heads of very important nonprofits, heads of certain academic disciplines haven't really been exposed to tech as a system. I mean, they live in tech as a system, but they think of tech as a tool, right? When you say you're really going to need more technologists in your team, they'll say, that's great. You know, help us figure out our online organizing strategy or something, which is great because tech is a spectacular set of tools. You can do wonderful things with it. But we're thinking more about the blind spot about tech as a system and the way it actually... I think of it almost like if you have a square glass and you pour water in it, that water is in a square shape. And if you have a round glass and you pour water in it, it's in a round shape. The tech system is actually framing the shape of the world that we're in, but it's kind of invisible because it's threaded through everything. So Darren Walker at Ford, my boss, is just a really unique individual because he's older than I am, but he's not afraid of the fact that this change is so extreme that it's de-centering. I think for a lot of people in power, it's very de-centering to say, I'm going to have to really think about everything differently and change it. So I'd rather not think about it. I don't find it delicious. I'm not going to be good at it. You know, at 61, I'm not going to learn how to do AI. So I'm just going to ignore and I'm going to contribute through law, through policy, through the areas where I've made my career. And so the challenge with that, of course, is we're perpetuating a knowledge gap in all of the systems that actually build up the rules and laws and protections that allow things like democracy to thrive. 
We created an enormous asymmetry of power. You know, I always like to think good democracies are balanced between the innovation and opportunity created by the private sector; the stewardship and constraint of government, the regulatory environments and the ability to create markets and allow wonderful things to happen; and then they're balanced by civil society, who can vote them in and vote with their dollars. And so in a perfect world, it's like a three-legged stool. It's nice and solid, and each of these three pressure points holds the others in line. But for the past 20 years, only one leg of the stool has understood technology. And if everything feels like it's at sea right now, if everything feels precarious, it's because, well, what would it be like if a stool had one leg that was 20 years longer than the other two? Everything feels rickety. But it's solvable. Bringing tech experts into these spaces, in collaboration, is the only thing that will solve these problems. And so Ford thinks about it because it touches every issue we work on, whether it's criminal justice or anything else. Right now, in something like 27 states, there's a requirement that judges take into account sentencing algorithms that determine, using opaque, non-transparent data, whether somebody should get parole sooner or later, or how long their sentence should be. These were tools that were built by altruistic people who were afraid that there might be judges with bias, and who thought that by creating this external correction, using science and data, they would actually protect people from the bias of judges. Unfortunately, the data they used was biased data, and it's in a black box now. And so one of our grantees, Julia Angwin, did a long piece of research and proved that the outcomes from these supposedly neutral systems were much worse for people of color, and that it was consistent.
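[Editor's note: for readers who want to see what "proving the bias" looks like in practice, here is a toy sketch of the group-wise error comparison that audits like Angwin's run on risk scores. All records below are fabricated for illustration, and the function name is invented for this sketch; it is not her team's actual methodology or data.]

```python
# Toy sketch of a risk-score audit: compare false positive rates,
# i.e. the share of people flagged high-risk who did NOT go on to
# reoffend, across two demographic groups. A gap between the two
# rates is the kind of consistent disparity such audits report.
# Every record below is fabricated.

def false_positive_rate(records):
    """Share of non-reoffenders who were nonetheless flagged high-risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

group_a = [  # fabricated records
    {"high_risk": True,  "reoffended": False},
    {"high_risk": True,  "reoffended": False},
    {"high_risk": False, "reoffended": False},
    {"high_risk": True,  "reoffended": True},
]
group_b = [  # fabricated records
    {"high_risk": True,  "reoffended": False},
    {"high_risk": False, "reoffended": False},
    {"high_risk": False, "reoffended": False},
    {"high_risk": False, "reoffended": True},
]

fpr_a = false_positive_rate(group_a)  # 2/3 of non-reoffenders flagged
fpr_b = false_positive_rate(group_b)  # 1/3 of non-reoffenders flagged
print(f"False positive rate gap: {fpr_a - fpr_b:.2f}")
```

The point of the sketch is that the overall accuracy of a tool can look fine while its errors fall much more heavily on one group, which is exactly why the data and methodology cannot stay in a black box.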
So we were trying to do something good, but without technologists helping us design it on the front end, we've actually baked a biased system into the criminal justice system in 27 states. How will we fix that? We're going to need technologists to come in and rebuild that system, take that system apart, undermine that system, prove the bias of that system. And that's just one example, but every single issue you're working on is probably experiencing these kinds of disruptions and challenges. And without clear-eyed people who can see the system of technology in that space, we're going to just replicate the kinds of bad outcomes that we're trying to fight against. And build new technologies, potentially, because I do think that sometimes it just takes a redesign of the whole system, and in general, in the same way that we're trying to build more efficient algorithms, building algorithms from the start with these values and these societal impacts in mind, I think, is also the main thing. Well, it's great. Let's go deeper into that, then. So what specifically are you trying to do? I'm going to take two of the questions and mush them together. And why is representation so incredibly important in what you're trying to do? Yeah. So with Black in AI, the goal, the mission, is to try to democratize access to the field of AI, specifically for Black researchers who are interested in the field, and just in general Black people who are interested. And so I think for us, what representation means is bringing more of the people who are impacted by these algorithms to have a say in how they're designed and how they're deployed, and in general how they're used and for what purpose, right?
Because if we're going to design these systems and say, okay, it's going to impact this sector and that sector, it doesn't really make sense if you're deciding that unilaterally and you're not including all the people who are going to be impacted. Because then you have blind spots at that point. Well, can you tell us how bad the representation gap is? Oh, yeah. So some facts, some statistics. In 2021, there was a study done by the Computing Research Association; they collect statistics on students graduating from PhDs in CS, and even on faculty. And in 2021 there were roughly 1,600 people who graduated with a CS PhD, and only 18 were Black women. So yeah, it's not a lot. That seems like a pretty steep gap right there. Yes. And it was even worse before; that's 2021, and before that the numbers were much smaller. So I think we're starting to see a huge increase. Like I said, in my university right now, at least in my cohort, there were 100 PhD students who were accepted, who matriculated to Berkeley, and 11 of us are Black. So, you know, we're rising, at least in percentage, but that is a very unique situation. It is not the norm. It is not the same across other universities and other programs. And so there's still a huge gap, and I think our role is to try to, one, provide access, provide the opportunity, share those resources with people who are interested in pursuing a career in this field, and hopefully bridge that gap ultimately through the academic pipeline. Because another thing that I think matters is not only having technologists who are building the technology, but also people who are teaching technologists and nurturing future technologists. I think that really matters too.
And so it's creating more avenues, both for building the technology and for careers in academia. And yeah, we have other programs that we're working on, but I think the academic pipeline is where we started because, owing to our founders' origin story, it's what they know best. Timnit and Rediet both went through their PhD programs, and they went through them alone to get to where they are now, and that's not something that they want other Black students to have to experience. And so fixing that pipeline was their starting point, and that's what we're strengthening. And now, I think, Black in AI does have a vision to expand to other pipelines, in other industries as well, in addition to academia. Also, yeah, in the private sector and whatnot. Yeah, and there's wonderful work that some people are trying to do in that space, but there's not a lot of progress, which is really sad. There's a great group called Code 2040; Ford has been funding them for maybe seven years, and what they realized is that there were all sorts of biases in how people were being hired, so people who were qualified would fall out of the final rounds. And then, when people do get in, there are cultural concerns that make it really hard to work in some of these companies: when they're isolated, or when the motivation is about solving the technical problem without the context of the downstream harms, it can feel like maybe you've worked so hard to get someplace only to realize that the work you're doing isn't as meaningful as you need it to be. But all right, let's say we solve this. Let's say that, you know, 20 years from now, we have, across the board, a representative group of technologists graduating every year, in alignment with the diversity of this country. What kinds of different outcomes do you think we'll see in the world?
And of course I'm not holding you to it, but I will call you in 20 years to make sure that you did do this stuff. But what's at stake? What could we get? What would be in the world if we had a more representative set of technologists building the world that we're living in? Yeah, I mean, my belief, at least, is that in general we'll have systems that are held accountable, and we'll have more tools to equip people with the ability to advocate for themselves when they interact with these systems, and to be more knowledgeable about how these systems are working, and how they are working in their communities. And, I think you talked about this triangle, right? Just having all these systems in place to try to mitigate these harms. Because I think even with a diverse workforce, some companies and people think that once you have a Black person in the room, the problem is solved. But that's not the case. That's one step, but in addition to that, it's continuing this work, continuing to have those systems in place, and, like you said, and like the Ford Foundation is trying to do, ensuring the longevity of these systems, interrogating them, and making sure that they're not actively causing harm. That's also part of the solution. And so I think in time we'll have that. And I think the problems that we try to solve will also change. A lot of the tech right now is focused on these very minute problems that are meant to provide profit for these companies. And so, moving into public-interested technology, the goal changes. Now you're trying to figure out: how do you build technology to improve society? Not just to maximize clicking on this ad or this link or whatever; it's more trying to figure out, okay, maybe if I show you this image, how is that going to impact your well-being, potentially? And how do we change different things for different people in different communities?
We're trying to build systems that take that into account, incorporate it, and get feedback from the users as well. Hopefully that's the goal we move towards. Because, as Eli Pariser said in a conversation I had with him recently, technology is a tool, and it will lean towards power. So if the power wants the tool to do something good, it can do something good. But right now the incentives are very, very focused on profit, and so this incredible opportunity we have to build better worlds, worlds that are more inclusive and more peaceful and more equal, all this technology is not being used much for that. It's being used in ways that are having the opposite effect.

So we've got ten minutes now until we leave. I was hoping to stop talking sooner so we'd get more of your questions, but this is the time that you get to shine. I've been seeing lots of nods, curious faces, a lot of attention. Does anybody want to ask us questions about PIT? And we've got a mic. Okay, here we go.

Thank you for all the insights from both of you. I was really curious: what are some of the specific strategies the Ford Foundation has deployed to intervene in the private sector? Which I find really difficult, and I'm sure not just in the States but in Korea as well, with gender and all the different diversity issues.

Yeah, so the Catalyst Fund is amazing. Imagine you've got an amazing boss who says, here's 50 million dollars, try to solve this social justice problem. There's nothing to complain about. But if I had one little complaint, it would be this thing called the pandemic, which happened right in the middle of that period. That meant that of all the four quadrants where we wanted to develop strong work, the private sector is probably our weakest one, largely because social justice foundations and private sector companies speak a different language. It
doesn't mean there aren't really big areas in the middle of the Venn diagram where we could usefully work on things that would be beneficial for both of us. But we could never get in the room with them, because we couldn't get out of our own room to actually figure out: when you say this, what do you mean? When we say this, this is what we mean. Oh, this would be really good for us to work on together.

That said, we do have a couple of things. There's something that was announced yesterday, and if you look me up on Twitter there's an announcement there: we've been working with Microsoft and the New America Foundation, so civil society and a private sector company, to begin imagining what open layers at the top of the internet stack would look like. So much of the base of the internet is open protocols and standards, but at the top layer it gets very private very quickly, and that's where the things we don't love are happening: around identity, around privacy, around finance. These are areas that, for many obvious reasons, developed in the private sector. But the question is, if you could come up with open protocols in those upper layers, would it allow more people access to build competing tools and technologies in that space, which would then allow for more competition for our dollars and our attention? Right now you have civil society yelling at the tech sector, and the tech sector saying, we're not doing that, you're wrong, you don't understand. It would be much better if we could actually say, could we build something together? Could we try to build something together? So that's one thing.

The other two are obvious ones tied to public interest law. Pro bono work is a huge thing in private law firms, right? You can work at a very high-level law firm and they can pay you to go work on a capital case and defend somebody who's on death row. That's a precedent; nobody makes fun of
it. It's not just seen as altruism; it's good for the company and it's good for the world. We need more of this pro bono work. There's a group called U.S. Digital Response that recruits technologists and places them in government and NGOs to help them get over this knowledge-gap hump. So that's one, around pro bono. And then also civic leave: if there's a way to institutionalize, and then put guardrails and protections around, companies' ability to allow some of their technologists to go into government or into civil society to work on different projects, and if we can get a bunch of companies to agree to it as a normalized and protected piece of work, we think that would be really valuable as well.

That said, if companies ever get regulated, they're actually going to have to incorporate public interest technologists on their teams, because there will be consequences for designing things that create harms. Right now people get away with a lot of stuff, but in an environment where there is constraint, and where historic laws that have been corroded are reinforced in this technical environment, people will have to think on the front end about what they're designing and whether it will harm someone. So that would be another area. We have five minutes left for questions. Oh, there you go.

How can I help in bridging the gap between AI and art, but also mixing them both in? Because I think there definitely is a mixture there, and I want to leverage that and really promote it within the digital web app space, and make artists aware of that unique connection. Do you feel that there is that connection there?

Oh yeah, there is definitely a huge connection between art and AI. I think people are still finding out about it, but there's a lot happening now. There's AI-generated art, where people train AI models to generate art that looks similar to popular art that we know of but also incorporates different styles, and
there's a lot being done in that space, and broadly in the AI and creativity space, that I know about, so I'd be happy to share more of that with you. In terms of incorporating AI on your platform or with your artists, I think one way is definitely knowledge and information access: maybe doing some workshops on how these models work, and then potentially using them to do mini projects to create new forms of art. That's potentially one way. And there's also, I guess, a more technical way, like incorporating AI in the app itself. I'm not really sure how the app works, but yeah, we can talk. I do think this is a huge space. There is one artist, I have to remember the name, but she spoke at one of the Black in AI workshops a couple of years ago. She works directly at the intersection of art and AI, and so I'd be happy to share all of that information with you.
Great, who gets the last question? There you go.

I wanted to ask what the PIT program, or the Ford Foundation at large, is doing in terms of implementing clean tech solutions or climate solutions with the programs that you all offer.

Right now, not very much. What I would say is that Ford is huge: there are ten offices around the world, and it's been around for over 80 years. It's trying to transform itself to include a tech lens in all of the program areas. One of the ways we do that is by bringing technologists onto our teams, so we have tech fellows who come in and sit with the criminal justice team or with the immigration team, so that those blind spots don't affect our grantmaking. But the way we do our climate work is to empower indigenous communities in the regions where our offices exist, like Nicaragua and Mexico and Brazil, to steward the lands. That's been how we do climate, which means we're not very much in the rooms where people are talking about tech solutions, about how the technology itself is going to help the world and the planet survive, or about best practice. Although we do have a LEED-certified building, so there probably are some people thinking about some aspects of how we're doing that. But we're not much in the climate space except in that very specific way, and so we haven't done much there.

Okay, if you have other questions, feel free to come up and talk to us. It's been wonderful to talk with you today. Thank you for hosting us. And I also have to thank Erin for helping us get all of this organized, because it's been a pleasure. Thank you.