Okay, welcome everybody. I'm going to go ahead and kick things off today. My name is Sue Hendrickson. My pronouns are she and her, and I'm the executive director at the Berkman Klein Center for Internet and Society at Harvard University. Today I'm co-hosting this event with Amritha Jayanti, the Associate Director of the Technology and Public Purpose Project at the Belfer Center. And we're really pleased and honored to have with us our special guest, Afua Bruce. Afua is a leading public interest technologist who has spent her career working at the intersection of technology, policy, and society. Her career spans the government, nonprofit, private, and academic sectors, as she has held senior science and technology positions at a data science nonprofit, the White House, the FBI, and IBM. I had the privilege and honor of working with her before joining Berkman, when I was in private practice and she was at DataKind, and I'm thrilled to be joining her again. She has a bachelor's degree in computer engineering as well as an MBA, and is the author of the book that we're going to be talking about today: The Tech That Comes Next: How Changemakers, Philanthropists, and Technologists Can Build an Equitable World. I'd like to start by acknowledging that Harvard is located on the traditional and ancestral land of the Massachusett, the original inhabitants of what is now known as Boston and Cambridge. We pay respect to the people of the Massachusett tribe, past and present, and honor the land itself, which remains sacred to the Massachusett people. With that, I'd like to note that the event is being recorded; audience members will not be shown. If you wish to ask any questions or leave any comments, please use the Q&A function on Zoom. We welcome you to use the Q&A function throughout so that we can pull questions together, and when we get to the end we'll have a curated set of questions to ask.
And this is a friendly reminder to ensure that your mics are muted during the event. With that, I'll first turn it over to Amritha to say hello, and then to our special guest Afua. Thanks so much, Sue, and thank you so much, Afua, for joining us today. I just wanted to briefly thank Karen Agia for Ciarando and Becca Tabaschi, who are on the Berkman and TAPP teams, for helping us coordinate this event today. Great to get into your book, Afua. So, to kick things off, I'll toss you an initial question to lay some groundwork, which is really introducing the concepts within your book and also the motivation. We heard from Sue that you've had such a breadth of background in the tech sector, the public sector, the private sector, and so on, so we'd love to hear an overview of the book and what your motivations were for writing it. Yeah, absolutely, and thank you both so much for doing this event together and letting me be in conversation with you; I have tremendous respect for both of your work. And really, as you mentioned, we worked together while I was at DataKind, and Amritha, I am happy to be a non-resident fellow in the Technology and Public Purpose Project, so I'm just really grateful to be a part of this conversation today and excited for it. So I sat down to write the book The Tech That Comes Next with my co-author Amy Sample Ward. We recognized that we both think deeply about how technology affects communities, how it affects communities in positive ways and in not-so-positive ways. We thought a lot about how the technology development process works, how the funding and use of technology is subject to so many different forces from society, and where communities actually have a voice in that process.
So as we were thinking about things, we realized that, you know, there is a lot of research and a lot of writing about all of the harms of technology and all the ways that technology has been used to inflict harm and disenfranchise people, but we wanted to write a book that could imagine a more equitable world. What would that look like if we took some time to step back and really think about what the future of technology could look like, and especially what the future of technology in the social impact space could look like? And so that's where we started. We started by thinking of the tech that comes next as really being about the tech but also about community, because we want our future world to really center community needs and community perspectives. In doing that, we started by saying, well, we have to be really clear about what it is we value. Regardless of whether people articulate their values or not, when you develop technology, when you develop ways of interacting in society, it reflects values. And we can see over time some of the values humans have had, which can be to concentrate power, to concentrate information, and to concentrate resources. So if we think about communities more broadly, what would it mean to build a more equitable world, and what values would we want to be really clear about? In the book we actually lay out six different values. The first is that an equitable world requires that we value the knowledge and wisdom of lived experience, recognizing that the most impacted individuals and communities need to be central to decisions and solutions about priorities. The second value we articulated was that an equitable world requires that we value the participation of a diversity of people in decision making, in planning, and in building technology; regardless of their technical knowledge or training, we need to make space for both technical and non-technical expertise.
The third value is that an equitable world requires that we value accessibility from the start. So thinking about the language we use, thinking about the tools we design and whether they are accessible to people, thinking about where we hold community meetings and how we solicit feedback from communities and users, and what that process looks like in order to actually be accessible. Number four is that an equitable world requires that we value the multiple ways that change is made, meaning that we have to recognize that even though we might want systemic change and a more equitable future, there are needs that have to be met today, there are needs that have to be addressed today, and so how do we live in that tension of making long-term progress while meeting immediate needs? The fifth value we articulated was that an equitable world requires that we value the strength of collectively creating a vision of a better world. It means that, you know, Amy and I spent a lot of time writing this book, but we are not the be-all and end-all on this; we need other people, which again is why I'm so excited to be in this conversation with both of you, Amritha and Sue, to expand who is included as we think about what it means to collectively create a vision of a better world. And the sixth value we articulated was around the dedication of individuals: valuing the dedication of individuals and communities in pursuing knowledge, recognizing that there are many different types of knowledge and different types of expertise, and we want to hold a healthy respect for lived expertise and learned expertise in their different forms. So now that we know the values we're trying to build around, we say, let's create a more equitable world, let's imagine a more inclusive future as it comes to technology.
We had to figure out some way to organize, and the way we chose was to identify five specific roles that most people will be able to find themselves in. We framed the roles so that you might identify with one today and with a different one in another couple of weeks, so recognize that you can move across roles, but also that we need to go across some traditional silos in order to really develop the technology that comes next and to really build a more equitable world. The first role we identified was the role of social impact organizations and employees at social impact organizations, and one of the key questions social impact organization employees are thinking about is: how are staff supported and resourced to align technology with the mission and communities' needs? How do these organizations really see technology as a way to advance and enhance their mission, and not compete with their mission or their resources? The second category is technologists. As a trained computer engineer and former practicing software engineer, this is where I most readily identify. With technologists, we are looking at how we can change the tech development process, and really asking: how are technologists investing in the leadership and capacity of the impacted community to support their long-term ownership? What does it look like to distribute power differently? What does it look like to protect rights and privacy in those situations? The third category is funders and investors, everything from philanthropists to venture capitalists to corporations to individuals, and we ask funders and investors to really think about changing the way tech and social impact funding works. So we ask investors: are you committed to funding inclusively, for intentional engagement, iterative processes, and long-term support?
The fourth category is policymakers, who have the ability to really change laws and policy, one of the strongest levers we have if we're going to make systemic change. A question for policymakers to contend with is: are you meaningfully engaging communities most impacted by digital divides and technological harm to inform the development of policies? Are you thinking about the policies that govern technology and the technology that governs policies? And then the final role we identified is the role of communities, recognizing that people live in multiple communities, communities from diverse backgrounds, communities of historically overlooked people, and we ask these communities: what is it that you want to change, and what are your biggest dreams? We really use the answers to those questions to drive the rest of the process: to drive the policies, to drive the technology, to drive the funding, and to drive even some of the goals and formation of social impact organizations. So with those framings, we try to be a little bit more clear about what we mean by equity and what a more equitable world looks like, and then give a framework of five different roles and the actions each of those roles can take, because there's a little bit of responsibility for everyone here. That's the framework we use to really think about the tech that comes next. That's great, thank you so much, Afua, that's such a comprehensive overview of the content. One thing that I loved about the book is that it's very action-oriented; it's almost like a toolkit, with the questions and so on that are in it, so it feels like it really helps us carve a path forward.
One concept I really liked in the book, too, that I'd love for you to speak a little bit on as we move into more of the meat of the conversation, is the distinction you make that tech is not the solution, and how important that framing is as we think about how tech can be applied in different sectors and spheres, to ensure that we're building it to really elevate the values that we've articulated rather than assuming they're integrated or embedded already. So can you speak to that distinction, especially in the way you think about the narrative around tech now and where it should be as we move forward? Yeah, absolutely. Again, I am an engineer by training. I love engineering. I love technology. I think it's amazing. I think engineers are amazing, and we can do wonderful things. But if we always approach all of the problems as though technologists or engineers or computer scientists have all of the answers, and sort of assume that because someone can write some really well-designed code, or, frankly, not-well-designed code that still runs, they have all of the answers to drive every other aspect of society, we really miss the mark. We need to really focus on what the problems are that we need to solve, and then work backward from there. There's an example in the book of an organization called Rescuing Leftover Cuisine, for example, that existed to rescue leftover cuisine, as the name might suggest. They started out as, literally, you know, a team of volunteers trying to address food insecurity: they'd get a call from someone, a restaurant or someplace, saying, I have leftover cuisine, and they'd call a volunteer to go and walk to the place, pick it up, and take it to the next place. And that worked for some time, but then they thought about how they could be a little more intentional about this, and thought about the technology that might help facilitate it.
And so rather than just asking someone to design a technology or a platform that might facilitate that, they actually spent some time really intentionally designing their own process. They created, I think it was a Trello board, with requests that people could put in, and by people I mean volunteers, staff members, and people from the organizations donating the food, to document the requests and the features they wanted in this technology. Then people could upvote or downvote things, and when decisions were made that either aligned with the clear community wishes or contradicted them but were necessary, clear communication could be had. And that is how the platform that still drives Rescuing Leftover Cuisine today was developed. They were able to do that, and they were able to go from one small organization that operated at this human scale to an organization that now operates in, I think, more than eight cities across the country. So again, it's really about focusing on what the actual needs were, talking with the community members involved, using that to help define the technology, and then moving from there. When you skip some of those steps, you can get into some of the cases we've heard about: people designing, you know, an app to solve hunger, and here we are years later and that app has not solved hunger; or designing technology to, you know, help assist people to do something, but you didn't actually talk to the people who needed the assistance, and so those devices either aren't used, or if they were used, they actually inflicted harm on people. And so that integration, and really focusing on the problems and the communities who are most affected by them, is really important.
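[Editor's note: the request-and-vote process described above can be sketched in miniature. This is an illustrative sketch only; the class names, roles, and voting rules are assumptions for the example, not details of the organization's actual board.]

```python
from dataclasses import dataclass


@dataclass
class FeatureRequest:
    """One request on the shared board, submitted by a volunteer,
    staff member, or food donor."""
    title: str
    submitted_by: str  # e.g. "volunteer", "staff", "donor"
    votes: int = 0


class RequestBoard:
    """A minimal Trello-style board: anyone can submit a request,
    anyone can upvote or downvote it, and the team reviews requests
    in order of community support."""

    def __init__(self) -> None:
        self.requests: dict[str, FeatureRequest] = {}

    def submit(self, title: str, submitted_by: str) -> None:
        self.requests[title] = FeatureRequest(title, submitted_by)

    def vote(self, title: str, delta: int) -> None:
        # delta is +1 for an upvote, -1 for a downvote
        self.requests[title].votes += delta

    def prioritized(self) -> list[FeatureRequest]:
        # Highest community support first
        return sorted(self.requests.values(), key=lambda r: -r.votes)


board = RequestBoard()
board.submit("SMS pickup reminders", "volunteer")
board.submit("Donor self-service scheduling", "donor")
board.vote("SMS pickup reminders", +1)
board.vote("SMS pickup reminders", +1)
board.vote("Donor self-service scheduling", -1)
print([r.title for r in board.prioritized()])
```

The point of the pattern is not the tooling but the transparency: requests and votes are visible to everyone, so when the team does override the ranking, the gap between the decision and the community's wishes is explicit and can be communicated.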
And there's the other example in your book, too, that you provided, of DataKind working with John Jay College to collaboratively develop a program to identify students at risk of dropping out and to help propel them across the finish line. It seems like it was exactly the kind of model you were talking about, of engaging with what the community needs were to figure out how to design it. One of the things that stuck out to me with that, and I think you noted it in the book, is that the weight of particular variables in algorithmic decision making had serious real-world consequences in determining who received services and who didn't in that context, so it was important for framing. And one of the things, as a kind of technologist, and since we're both in academia here, that I was wondering about is: how do you believe technologists should be trained differently to handle the ethical and policy implications of their design, deployment, and development choices, given that it really is ultimately not the tool but the people and the problem that you're trying to solve? Yeah, absolutely, and just expanding on the John Jay College and DataKind example a bit, which you summarized quite well. I also just want to note that the algorithm, or the models, built in that project helped identify students who were at risk of dropping out and then identify which interventions were best.
That was a recommending tool for people who were then ultimately involved in making the final decision and implementing it. And so, as we think about training students differently, I think it's really important to also talk about some of the limitations of technology, some of the ways that we want humans to continue to interact with technology, and some of the decisions that we might not want to cede to computers, which are unfeeling, which have had biases encoded in the way they work, and which then operate at scale and very quickly to make different decisions. I think another thing that's important when we discuss training students differently is to recognize the different perspectives that matter in thinking about some of the ethical considerations. So yes, absolutely, engineers should take engineering and design courses. But it's also important to start to identify ways to build empathy, ways to assess a data set for completeness and where you might go to help build out that data, ways to work across disciplines and recognize that people from a law background, a social science background, or other backgrounds have different perspectives that are really important to incorporate in the design process. Because then, as people are thinking about coding different weights and different variables and different values, some of those considerations can be translated into the code and have a different impact on the overall process. And so really showing some of the practical side of what's working and what's not working is really important. And then I think, finally, centers like the ones both of you run, that really have people who are deeply researching some of these issues and their positive and negative implications: hearing from people who are doing that research and dealing with those technologies on a day-to-day basis, I think, is incredibly important.
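[Editor's note: the human-in-the-loop pattern described here, where a weighted model recommends and a person decides, can be sketched as follows. The feature names, weights, and threshold are hypothetical, invented purely for illustration; they are not from the John Jay / DataKind project.]

```python
# Hypothetical risk indicators, each normalized to [0, 1].
# The weights encode value judgments, which is exactly why they
# deserve scrutiny: changing them changes who gets flagged.
WEIGHTS = {
    "missed_assignments": 0.5,
    "low_first_term_gpa": 0.3,
    "reduced_course_load": 0.2,
}
REVIEW_THRESHOLD = 0.4  # also a value judgment, not a technical fact


def risk_score(student: dict[str, float]) -> float:
    """Weighted sum of the student's normalized risk indicators."""
    return sum(WEIGHTS[k] * student.get(k, 0.0) for k in WEIGHTS)


def recommend(student: dict[str, float]) -> str:
    """Return a recommendation for a human advisor.

    The function never makes the final call: a flagged student is
    routed to an advisor, who decides on the intervention.
    """
    if risk_score(student) >= REVIEW_THRESHOLD:
        return "flag for advisor review"
    return "no flag"


print(recommend({"missed_assignments": 1.0}))
```

The sketch makes the earlier point concrete: the weights and the threshold are where values enter the code, and keeping the output a recommendation rather than a decision is what keeps the human in the loop.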
And the community of information that the two of us get from the people affiliated with our centers really is amazing in driving the research, and just the understanding, of a lot of these issues and implications as we think about how to grapple with them. One thing you touched on in there was silos and breaking those down. I was fascinated, when I was looking at your bio, that at the White House it mentioned you had overseen 100 different federal interagency working groups, tackling challenges from the environment and sustainability to homeland and national security to STEM. Interagency working groups are usually a really hard thing to do, because you are breaking down silos and building those bridges across them. One of the things we're seeing in all of our work is that in today's world there's so much need for interdisciplinary, multi-stakeholder problem solving, but the necessary actors in that historically haven't been broadly enough defined. So I was just wondering whether there is expertise we should cultivate to bridge those divides and build those solutions, and whether there were lessons from your experience building these interagency coalitions that could help us get different silos of expertise to more logically and seamlessly work together on these kinds of issues. Yeah, absolutely. I think what I would first say is to develop some patience; managing, for myself, 100 different interagency working groups took a lot of patience, and a lot of finding people to be the right messenger to different groups. Even though I say that somewhat jokingly, patience really was critical then, and I think it's also really true when we think about developing working groups.
With working groups in general, and when working across different silos, we need to make sure that we take the time to really develop a common language; you know, engineers might have a different language than lawyers, who have a different language than social scientists, and so on. So really taking some time at the start to make sure that everyone understands what the other person is saying, and what that looks like. Then recognizing that the time to actually let people speak, and then process and react, is not insignificant. So as you're building project plans, as you're designing processes to incorporate people from traditionally different silos, allowing the time for some of that development and that information exchange to happen is really, really important as well. Piggybacking on that a bit is this idea of cohesion. You've named a few different stakeholders that really have power in the way we form and create our tech, and so as we think about moving forward, how funding streams can shift, how tech developers can shift the way they're thinking about things: how do you think about the cohesion of change within the stakeholder groups that we have, to ensure that they're coordinating in the way they move forward and actually implement change that really integrates, for example, community perspective about technology? Because, again, the tech developer is sometimes sitting in their own corner, thinking about things and doing some design experiments, but that's not well interfaced with some of the philanthropists who are funding certain projects, and so on. So, to that point of cohesion, how do you think about that as we think about these key stakeholder groups?
Yeah, absolutely, and Amritha, I think that is a great question, and one that I'd love to hear your thoughts on, and Sue's thoughts on as well, because I know in the very interdisciplinary work that you two have both done, I'm sure you've got a lot of examples, probably both good and bad. But when I think about how we start to encourage this, I think there are a lot of different levers to pull from each of those different groups. As we're thinking about funders and investors, for example, I think that if investors start to ask questions about whether you talked to a certain group of people, whether you did some type of check or, you know, due diligence process, or who was included and who was not included in those conversations, that can start to move the needle. And I know there's research by some folks in the Technology and Public Purpose Project about that; Cornell Tech has done some interesting research on this as well, and there are other places too. On the technologist side, I think it's really important to recognize that there are technologists at all different spots in the process, and there's at least one little tweak that people can make at each step. Maybe your part of the tech development process is when you are doing user requirements, and so you really think about what inclusion and equity look like there. Or maybe it is when you're writing code, and then maybe it's your job to ask: who did we get this from, who did we get our understanding from of what our ultimate goal is and what this code is supposed to do or not do, and to keep asking questions along the way. So recognizing that no one person has to transform the entire system, but there are different levers at many different parts that many different people can pull.
So I don't know, did you want to jump in with your thoughts on some of that? I was just going to echo that. You know, I'm a committed generalist who's worked in all forms of emerging tech, and each time you have to kind of knit these folks together; people come from very different perspectives, and you have to figure out a way to create that dialogue and create the baseline for discussion about these issues. One of the things I really enjoyed about the book was both the values-based framework that you give people and the very concrete questions that enable that dialogue: okay, so if I'm a funder and I want to have more inclusive development, how do I go about that process and get started? I think as people start to ask those questions in their respective silos and start to understand the need for each other better, hopefully we can change the paradigm as to how people approach these different issues. Yeah, definitely. I think one point that you both touched on is the idea of communication: number one, bringing your thoughts to the table, but also having the right language to speak. One thing that we notice, especially through our work at TAPP, is that everybody has a different way of communicating, so to bring different stakeholders to the table there's work to be done to make sure that we're all speaking the same language, and also that people's incentives are presented when they come to do the work, to ensure that we can align those in a way that makes it more productive. But it also points to a cultural shift that kind of needs to happen, and I want to circle back to a point you made, Afua, about focusing on the problems, and also the fact that you have a computer engineering background and have done tech development.
I have a computer science background too, and I've worked on some tech teams, and one thing that is common is that engineers often get really excited about hard problems. They care a lot about solving the problem as a certain kind of technical challenge, but then the values get lost and the point actually kind of disappears in the obsession with some of those technical challenges. So I'm curious, from that perspective: when we think about problem scope and the intentionality around design, how do we catalyze the right kind of cultural shift so that we're focusing on the problem in the right way? Yeah, absolutely. I think we can tackle it from a couple of different angles. One is just acknowledging that there really are hard problems out there that need to be solved, problems that need some super complicated technology and people who really understand the nuances of how to develop technology systems and how to secure different systems in different ways, as people move about, as people change locations, as people adjust their identities or the ways they show up in different spaces. There are really big challenges around privacy concerns, how your data is protected, and how your information is shared or not shared, so there are extremely complicated problems that need to be solved, including in the social impact space. Galvanizing people around the fact that those problems exist, and that they are worthy of serious technology expertise, is really important. That said, sometimes an organization just needs a spreadsheet, maybe not a full-out blockchain solution; they just need a spreadsheet, and so also recognizing when that is the right solution matters.
You know, I think it's important to be upfront with people about that, to say to the engineers that some of your time might be spent on the really exciting work and some of your time might be spent teaching someone how to use a spreadsheet, or developing a new spreadsheet, and just setting expectations in a more reasonable way, I think, matters a lot. It's also true that even for complicated problems, or perceived complicated problems, in the private sector, the work is not always complicated or exciting. Talk to almost any data scientist who wants to write the most complicated data science algorithm, and they will also tell you that they probably spend a lot of time cleaning data, which is not something anyone likes to do. Or maybe there are people out there who like to do it; I have yet to meet them, and I am not one of them. You know, you sort of take the good with the bad, and there are those perceptions of some fields, and so I think doing a better job of setting those perceptions for other types of work, and applying some of those perceptions to technology in the social impact space, could go a long way. It is one of those things where the shiniest tool is not always the best; sometimes the simple solution is what's needed. There's a question that's kind of a related follow-on, so I just wanted to throw it in here: with respect to the developers, so often the developers don't get to talk or interact with the people who actually use the products; instead developers are given a list of requirements, which we've all had, you know, here's the list of requirements, go execute on this. How have you approached changing that dynamic in organizations that may not be set up for that dynamic to change? Absolutely. Again, recognizing, you know, the power imbalances in different spaces and who has access to which lever to pull.
In those cases, sometimes start not with the developers themselves but with the requirements process: what the requirements gathering and requirements definition process looks like, what information is captured at the requirements level, and what changes you can make there, because those can have a big impact. So starting in that requirements process by asking for some changes, to document which community this came from, who was spoken to, and what was prioritized or emphasized, I think, is important. Also, many different development shops, certainly in the federal government and in some private sector places as well, have your privacy impact analysis at the end of the process. It's worth thinking through whether that type of privacy impact analysis, along with the assessments of how the technology is used and how it impacts communities, has to take place at the very end all the time, or whether there are ways to have some intermediate checkpoints along the development process. Changing some of those things can start to get at this, even in an organization that is more rigid and may want to do what is seen as protecting the developers' time. And then, to one of the other questions: have you seen good examples of ongoing engagement of non-technical but issue-proximate stakeholders engaging with tech development? Are there good models out there that you've witnessed? Yeah, absolutely. I think I've witnessed specific organizations working really hard to do that well, and sometimes specific teams within organizations doing that really well.
And, you know, I now consult with nonprofits directly, and I worked with a particular nonprofit in the benefits access space that, throughout their development process, also had a team of people, not the developers but a separate team, who consistently went out into their community to ask: how are things working, what do you want to see, what do you want to change? Where the interaction came back into play is that every other week both of those teams met together: here's where we are in the development process, and here's what we've heard from community members. That exchange was just part of an ongoing process, built into how the organization works, and so I think that was one model I've seen that was very successful. Interesting. Actually, could you elaborate more on what that communication scheme looks like? Often there's a dance that happens and people bring recommendations to the table, but how it gets translated into actual technical development and requirements is a whole other step of the process. So how have you seen that be successful, and are there best practices that you've noticed that you would recommend to other groups? Absolutely. I think an underlying theme might be that clarity is kind. Be as clear as possible in collecting that information from community members. So not just leaving it as, this person talked to someone at a particular client site and it turned out that most people were on phones, but stepping back from that. And this is actually an example from that same organization I mentioned: the team had spent a lot of time developing a website interface for a computer or laptop, and then, when the customer team went out, the particular office that was being used as a community center to help people access benefits only had one computer.
And so then you had most of the staff gathered around on their phones trying to bring it up. The message of "a lot of people are on their phones" is probably not sufficient on its own to really rethink requirements, but adding the context, people don't have access here, so let's think about how people might access things, where they might access them, how they might train others, and how they might interact with their customers, could then lead to a different set of requirements and additional requirements for the development process. I'd love to switch gears a little to talk about one of your other stakeholders. I was intrigued by your discussion of the roles of philanthropists. A lot of the time they're left out of the map of how organizations work on things, even though they're so critical to the funding for the social impact sector and to what you can actually do with technological solutions in the sector. So how do you make sure the technology funding is democratically accountable and not just reinforcing the funders' values and assumptions when they come in? Because we'd all like funding, but how do you make sure it's geared toward the values you're looking for and the projects you're looking for, and that you have that kind of accountability? There is so much packed into that question, so many different ways to touch on it and so many different points. I think there are a couple of things that are important when we think about funding, including from the philanthropist space. You know, organizations that are funded by philanthropic dollars often want more philanthropic dollars in order to execute against their mission. 
And so some of the points that we make in the book are that it's really important for funders, for philanthropists, to think holistically about the funding process. Yes, perhaps fund an initial project, but also think about what sort of overhead support is needed, what sort of support is needed for some time to experiment and explore new options, and a little bit of tolerance for the risk that something might not work initially. The John Jay College example that we mentioned earlier involves some of this freedom to play around; that's how the team was able to test something like 20 models before arriving at the final two models that were used. So building in the time and funding to have that exploration is really important. I think it's also important, at the end of the process, to build in time and funding to share back out to the community what worked and what didn't work. Really allow some funding, again thinking holistically, for organizations to share with similar organizations what's working, or to train other organizations on what's working, or to allow the community itself to be trained on how to maintain the technology that has now been developed with them and will be used by them. They should also have some agency in maintaining that technology themselves. And then, of course, one of my pet issues is the need to fund infrastructure, that digital infrastructure, so that as an overall space, advancements can continue to be made. 
I love that you threw maintenance in there. When I was in private practice there were so many times I'd work with nonprofit organizations that had a grant that allowed them to build and to create, and then changes happened, either in their world or in the technology, and they just didn't have the follow-on funding to figure out the next chapter for the interesting things that had been built. And just building on that: it's so rare anywhere in technology that the first version is the final version. We wouldn't expect Facebook's first version of technology to be what we use today, and we probably don't expect today's version of Facebook to be what we use in the future. You know, Airbnb just made a big announcement about some big changes, and it was adding categories of housing. So we accept the fact that there will be multiple iterations of technology in the private sector space, and we should expect that in the philanthropic space and the social impact space as well. I think funders need to recognize that, you know, it's cool to fund the first version, but funding version 1.5 and version 2.0 is also really valuable and incredibly important to get to systemic change that really makes a difference in people's lives. 
So, I'll quickly flag that we're around 2:45, and we're going to shift over to audience Q&A in about ten minutes, so if you all could think through questions and put them in the Q&A chat, we'll field them in the next several minutes to make sure we can answer them. But I would like to ask a follow-up question on the philanthropic angle, which is, to your point, there's institutional change, but from an individual funder's perspective, how can a funder say, let me shift the way I'm thinking about this? How can they implement or catalyze some of those changes from the individual's perspective in a way that then leads to the kind of institutional change you've alluded to? I ask because there's also a TAP fellow in the audience who works on some of these issues, and I know there are people in the community listening who are probably wondering what individual power they have to make that change. Yeah, absolutely. So let me start with individual power, in the sense of a specific program officer, where your question starts. From the perspective of, let's say, an individual at a large philanthropic institution: ask and have conversations with your grantees about what some of that holistic support looks like, and push on it. Okay, we want to fund this program specifically, but what other support might you need? Have you thought through the maintenance? Have you thought through spending some time to better refine a proposal? Think about different ways to give, even if it's the same amount of funding, with one part to do some of the exploration and another part to do the implementation based on what you found. 
And then, as you're assessing different organizations, ask organizations to get very clear about what community means to them, or what equity means to them, just as I started out by defining what equity means here. Ask organizations specifically: what does it mean when you say you ran, or are going to run, an inclusive process? Tell me what that looks like in your overall project plan and your activities. So ask folks to get a little more clear and a little more specific about what those processes look like, and then about what additional funding might be needed to support the time to do that. I'm actually curious about something else. One thing we've been looking at at TAP is the portion of funding for tech that actually comes from philanthropy versus private money, like venture capital. So I'm curious: how do you see philanthropy playing a role in the way we fund values-oriented tech, as compared to the other financial or private capital schemes that flow toward companies like Facebook and fund maybe more consumer-focused tech? What is the role of philanthropy in that larger ecosystem when we think about how we pick the winners that then define our tech landscape as we move forward? Yeah, absolutely, and I'm going to put this question back to you, Amritha, after I have answered it, because I definitely want to hear a little more about some of the research that's been going on. But one thing to think about in terms of what philanthropy can do here is to recognize that we don't need to have only a handful of technology solutions, which then become the be-all and end-all of technology. 
Sometimes in the private sector we can fall into the trap of thinking that only a few organizations are worthy of money and worthy of being held up as examples of how wonderful technology is. But in the social impact space, and I'd say broadly, not just in social impact but in tech in general, we need to understand and recognize that we want technology to be more accessible for everyone, and so we're going to need more approaches. If you look at who receives venture capital funding, or even who receives philanthropic funding, the diversity of the people leading those organizations is not great. So we really need to think about how to ensure that a wide, diverse group of people get access to the technology, and get access to the time and the resources, to go from an organization that is walking a neighborhood to pick up food and deliver it to someone else in that neighborhood, to building a structure, essentially a tech platform, that enables neighborhoods and cities across the country to be fed. I think that is some of the power philanthropy can really ignite when we recognize that there are many people who are capable of designing technology systems, many people capable of designing technology that will advance the missions we've said are important. So let's fund them, and let's make sure we're funding a wide swath of them. 
Yeah, that touches on an issue that a lot of folks in our BKC community have been concerned about: the potential for adopting systems in the social sector that encode algorithmic bias, further marginalizing and disenfranchising people. Are there specific things you think philanthropists and changemakers can do to harness the knowledge of communities to design less harmful tech? Beyond just reaching out to the community, even when something is developed with that insight, are there still accountability measures, audits, or other procedures and protections you put around it to make sure the data sets really are helping, and the solutions really are helping, the marginalized communities you set out to support? Yeah, absolutely. I'll answer briefly and then let Amritha respond to this question and the last one as well. I think algorithmic audits are incredibly important, and we should continue to do them, or rather do more of them, since they are fairly new. But I also think you should be asking the communities who are impacted what is working and what's not working. Even if someone is not a trained engineer, a trained technologist, or a trained policymaker, they will be able to articulate, they are often able to articulate: am I being harmed by this? Can I actually access the resources I was told I was going to be able to get? Can I communicate with my friends and family? Do I get the time and space to think and actually do the things I was told I would be able to do, or that I articulated I wanted to be able to do, with the intervention of this technology? Do I still have some agency in making any changes? The community will know. So I think more philanthropists can have a stronger tie to the community, or ask their grantees to really show that interaction and the reaction from the end users, the people most impacted by the solutions. 
I think that is one step we can take. Amritha, what would you say? I think that's spot on, and I'll just echo your points. What we've noticed, especially in terms of philanthropy, is a bit different because we've looked at it from a slightly different angle: philanthropy that funds tech policy research, how that sets priorities in terms of research, and how that then informs these conversations about what values-oriented tech development really looks like and how we craft policy that's informed and effective for the communities we're looking at. I think what's interesting is that there's so much potential for philanthropy to play a really big role in this space, but right now there also isn't much accountability around philanthropies as institutions, sometimes, in terms of where the money is going and how it's really benefiting communities in certain areas. Often it's assumed to be in good interest because it's this form of giving that we really elevate. And it is great, and we think there's a huge role it can play in agenda setting and in the kinds of tech that get elevated. But we're trying to think about how we can ensure, to your point, that they are really interfacing with the communities they say they want to benefit. How do we make sure there's an accountability scheme around that, and how do we make sure those dollars are really going toward the kind of impact they say they want to have? We don't have the answers to that yet, but those are some of the questions we're thinking through, which is why it's great to hear your thoughts on that and on how we build that community engagement in a way that's really effective. And really, it's pulling all of the different stakeholders together to figure out how to solve this. 
Well, with that, I did want to turn things over to the Q&A so that we have time for it. Thank you so much for these comments. Maybe we'll just run through the questions in no particular order. The first question: if the pursuit is to build a better world, could there be a model wherein big tech internet companies, who are criticized among other things for their size and wealth, could be asked to substitute, in part, local taxes on global revenues with a contribution to a global development fund, which could be highly impactful, which they could operate together in the interest of humanity, and which they could manage with the efficiency with which they manage their commercial projects? Would that be a business-like design for global philanthropy? Interesting concept. It's about how to bring private sector dollars into a fund, though some of the questions would still remain as to how that fund gets allocated, for what, and for whom. Yeah, absolutely. So is there a space for private sector dollars to make a difference? I think there absolutely is. We've seen similar funds for other challenges; we have the IMF, world development institutions, and more that pool resources and then manage those resources with varying levels of success. What's important is, yes, to look for different ways to get additional funds, but also to make sure accountability is built throughout the process: how are we continuing to assess the impact for communities, and how are communities able to assess the impact of what is said to have been done for them or said to benefit them? So I think that's really important. I also think it's important to recognize that efficiency matters, but sometimes we have inefficient systems because we're humans. 
And humans, you know, like to have relationships, which are often inefficient, or like to have different interactions, which are also inefficient. So I think we need some awareness that even as new systems or new structures are built, there might still need to be some inefficiencies to account for the fact that there are humans we are designing for and humans who need to benefit, not just technology systems divorced from that human application. Yeah, absolutely. Well, there are a lot of good questions in this chat. Let's see. Okay, for the next one, I see one from Claire Walsh: sometimes we're faced with difficult decisions, like what to do when our company's survival depends on doing business with another company with a poor record or bad practices. Do you have any ideas for those circumstances, apart from looking for a new job? I think it goes back to that question of individual power and how we can cultivate the change we want to see, perhaps in an institution that creates a lot of friction. Yeah, absolutely. It certainly does go to the question of individual power, but it also goes to the value I laid out at the start of balancing systemic, long-term change with immediate needs. If you're making this decision on an individual level, I think it's really important to recognize that it is an individual decision, and the calculus you make may be different from what other people would make; the calculus you make today may be different from what you would have made six months ago and what you might make six months from now. 
So again, look for ways where, if you are dedicated to building this more equitable world, you can find the best way for you to do that. That might be finding a different job; it might be staying in your current job and spending more time partnering with an organization like DataKind, or one of the many other organizations with structures to both get community input and strong technical expertise and combine those together, and supporting the work that way. Or maybe you have the finances, or can partner with other people who have the finances, to do some of this funding differently, as we've talked about before, and create space for organizations to have more of the infrastructure, more of the experimentation, or more of the knowledge sharing. So I think there are a lot of different ways for individuals to contribute on these problems, and some of them might ask people to leave jobs, or to really advocate within jobs, or might ask people to find different paths in and pull different levers in the system. If you don't mind, I'd actually love to probe you a little more on that, because you've had such a breadth of experience and you've hopped between sectors and roles. Is there a time when you felt this kind of tension? You don't have to name an organization, but was there a moment when you felt that your individual concept of what change looks like, or your values, and the institutional values just didn't align, and you had to make that decision of, do I stay and make change here, or do I leave and find a new home? Absolutely. I would say that I have been fortunate: in the positions I've held, the teams I've worked on, and the organizations I've been employed at, I have always felt enough of a connection to the organization, the team, and the mission that I haven't struggled in a way that made me leave an organization for that reason. 
There certainly have been teams I've been on and organizations I've been a part of where I have desired more. For example, one of my strong commitments throughout life is to diversify the tech space, and if you look at my employers, I've worked in spaces where there are not a lot of Black women. So I've looked for ways to create that change, or work toward the change, because one person can't single-handedly change the numbers or change what diversity looks like in the tech space: finding ways to advocate for it within the organization, and finding ways outside the organization, finding other associations to be a part of, to join with other people who also care about diversifying the tech space. So I'd say for myself that's how it has manifested. It's really important to figure out how we can continue to diversify the tech space and be the voice for the change you want to happen in your organizations. And I hear what you say, and that's been my experience as well: there isn't a simple answer in a lot of these organizations, and there aren't simple choices about how to effect that change and how to figure out the pathway and work through it. One of the other interesting questions we got here, going back toward the private sector, is: how do you think private companies can join the public sector to achieve social change? I think it's a really important component that wasn't featured as much in terms of the stakeholder base here, but they obviously play a really important part in the tech space. 
This is another question I'd love to hear from both of you on, Sue and Amritha. From my perspective, there are many ways the private sector can engage. One is partnering with the social impact sector in different ways, and there are a number of public-private partnerships that have been created, especially over the past several years. There have been a number of ways for the private sector to partner with nonprofits to provide technology expertise. One of the things we talk about in the book is making sure that nonprofits still have agency in that pairing, so that it's not just a sales mechanism but actually asks: what does that organization need, what are the community's needs, and so on. I think that's really important. I also think we should find different funding vehicles and new ways of funding this work, because 501(c)(3) status is not going to make sense for everyone. So what are different ways of funding this work? How could the private sector also encourage different types of structures for organizations and companies to really support public goods through technology? 
Yeah, and I worked with a lot of philanthropic arms of private sector organizations, which were really important in driving change in these spaces from a funding and resource perspective, among others. And I found one of the areas in your book interesting, too, where you talk about private sector solutions not mapping perfectly onto these needs, and often not being a wonderful gift when it's just, here's the technology I have, you can have it for free. It's really about thinking critically about how that engagement can happen in a public-private partnership in a way that is driven by the values you mention in your book and by the needs of the social impact organization and the communities they're serving. Absolutely. I think it speaks to the opportunities that exist, especially when the incentives are aligned. There are sometimes specific opportunities where the incentives of the private and public organizations really overlap, and when we can find those, a lot of fruitful things can come out of it. But related to that, there's another question here that I think is really interesting and plays into it: how can more responsible companies address competitive pressures from less responsible companies? When you're thinking about the marketplace, there are companies that do want to align more with some idea of social impact or values, which, so the claim goes, makes them potentially less competitive. How do you think about survival and market pressures that then shift the way you do product development or put out products to consumers or communities? 
Yeah, absolutely, and I believe one of the TAP fellows has done some research on this, so I'll ask you to maybe recap it; I'm forgetting the name of the person who's done the research, but I would love to hear any thoughts you have on that. You know, I think it's important to think about different vehicles for private sector companies, such as creating a nonprofit arm as a way to separate the two. I also think looking at who's on the board of an organization, whether it's a nonprofit or a private sector company, is important, because if the folks on the board are asking about and prioritizing the responsible use of technology, that carries a lot of weight in how the organization is structured and the goals the organization pushes toward. So those are some different things to consider. Yeah, the TAP work is by one of our fellows, Afsaneh Rigot, who is actually affiliated with Berkman as well. She's done some interesting work on that claim that such companies are less competitive. What her research essentially shows is that if you design for marginalized communities, design in a more responsible way that considers the security and safety of higher-risk or more vulnerable groups, you end up creating safer products that have more potential for market capture, so you end up being more competitive. That's the business case that's presented, and I think when we consider these questions, the more we can make that business case in terms of incentive alignment, and show that you end up reaching a bigger slice of the population because you're by nature creating better, more robust products, then there's that win-win situation. That's the dream of tech development, right? That we can get there and get everybody to go down that path. 
Absolutely, and there was a Wired article this week that she wrote recapping that, so I can drop the link in the chat. It was a great piece, and I highly recommend folks read it. Yeah, if one of our team can drop that in the chat, that would be great. So, another question, from Farhana: isn't having the right data for analysis key? What are the biggest challenges with data in the public sector? There are so many. How long do we have? So yes, there are absolutely challenges with working with data in the public sector. First, there's just the collection of data, which is very haphazard and happens in different ways. Also, tying back to the funding piece we talked about before, organizations that start off as just a small group of individuals might not prioritize collecting data as they build up their organization over time, so the data has not been built up along with the organization, and just having sufficient data can be a challenge. Then there's organizing the data that has been collected. Organizations can sometimes go from collecting no data to collecting a lot of data, perhaps over-collecting, because this information might be valuable sometime, somewhere, and we might want to use it for some type of measurement and evaluation at some point. That leaves you with a lot of data and less know-how about what to actually do to analyze it. And then there's the challenge, whether you're in the social impact sector or not, of whether the data sets are representative and complete; that exists in the social impact sector as well. So again, there are lots of challenges with data, but that doesn't mean the challenges are insurmountable. 
There are ways you can look at using data science and more to do some analysis; there are ways you can look at partnering with different organizations, different schools, and other channels to collect data, to help build out data sets and start from somewhere; and there are ways you can partner with organizations to define strong research questions, so that there's a framework for analyzing the data that's been collected or will be collected in the future. That's great. We've focused a lot on funding schemes and tech development, so let's move to one of the questions here on the policy side, which you touched on in the beginning, but this one is specific to emerging countries: what do you think are good ways to help policymakers from emerging countries tune into these debates and ideas? Oftentimes they are either entirely disconnected from new regulatory trends or simply copying and pasting legislation produced for contexts outside of their own. Yeah, absolutely. I'd extend some of the advice about interagency working groups, or working groups in general, to this context as well, starting with that first step of really taking the time to translate, and not just across a potential language barrier, but really translating: this may have worked in the US, so what is the environment here in this particular country? What does it look like here? What might you want to take, what might you not want to take, what worked well and what didn't work well as you adopt those frameworks? I think that's really important. I'd also say that just having the space to share some of these practices across multiple countries, countries from different parts of the world with different-sized economies and more, creating that space to really understand best practices for different environments, is really important. 
And I'd also say the reverse is important: making sure that emerging countries, and some of the challenges they may be facing or some of the ways they've figured out how to solve problems, feed back into other countries that might be more advanced technologically. What does that process look like, to really make sure there's that circle of learning? Okay, we're almost at time to close out, so unfortunately we won't have time to answer all the questions in the chat, but I would love to throw you a kind of close-out question, which is this: there are so many conversations now about tech and surveillance and security and how we can build better tech. The nice thing about your book is that it's a roadmap for action-oriented change, culturally, institutionally, individually, and so on. So I'm curious: out of all the potential paths forward, what makes you really excited about the future of tech and policy, to make sure we can build that more equitable future? Yeah, absolutely. I would say two things really excite me. 
The first is just the ability to say no, and to recognize that we can say no to how we've done things in the past. We can say no to different surveillance technologies, or to different infringements on privacy, and more. No is always an option, and there are some things we may just not want to do, so we have the power to say no and to really advocate for others to also say no to different technologies. That excites me, because then we can direct our resources to what actually matters, which is the other part of what excites me: figuring out different ways and creating pathways for more people to engage in the tech development process, for more people to engage in conversations about what policies would be useful, for more people to engage in questions about what equity means and how we bring more people to the table, and really ensuring that our resources and conversations are inclusive and available to more people. That's wonderful. Well, thank you so much for your leadership, your advocacy, and your writing, and for talking with us today. We really appreciate all you're doing to help find us a way to a more equitable tech future, a better future. Thank you, and thank you to our team for helping pull this together, and to everybody who joined us today. We really appreciate it, look forward to interacting with you more, and are really happy to be able to bring this dialogue to everyone. Thank you so much. Thanks so much. Bye, everyone.