So hello, I'm Anne-Marie Slaughter. I am the CEO of New America, and I am coming to you from the back of a New York City taxi cab, which is just one reflection of the technology we now all increasingly take for granted. So welcome to The Tech That Comes Next: How Changemakers, Technologists, and Philanthropists Can Build an Equitable World. I've been looking forward to this event for months, and I'm so pleased to be joined by Afua Bruce and Amy Sample Ward, the authors of the eponymous book that's just been released, The Tech That Comes Next. Afua is an adjunct faculty member at Carnegie Mellon University's Heinz College of Information Systems and Public Policy. She is also a former New America employee; she was the strategy director for our public interest technology program. Amy is the CEO of NTEN, the Nonprofit Technology Enterprise Network. And we are delighted to be able to spotlight their work. This book, I think, is hugely important, and it is rooted in the same principles that shape New America's public interest technology work in this field and in the world. We're also thrilled to be joined by two leaders in our Public Interest Technology University Network, Dr. Latanya Sweeney and Dr. Charlton McIlwain. Dr. Sweeney is the Daniel Paul Professor of the Practice of Government and Technology at the Harvard Kennedy School and in the Harvard Faculty of Arts and Sciences. And Dr. McIlwain serves as the vice provost for faculty engagement and development at NYU. I thank you both for joining us and for being part of this event. I invite all of you to consider what would happen if we reimagined a world in which, as the authors put it, humans are centered and justice is pursued.
It is a question that guides our authors as they lay out a vision of what such a world would look like, and break down how actors at all levels, and at all types of organizations, can use these principles to transform their day-to-day work and use technology to get there. Technology is a tool, never an end. Currently, and I'm quoting, we live in a world of exclusive access, systems of control, and centralized power structures. The Tech That Comes Next directs us to a world whose pillars are community relationships, shared resources, safety, and accountability. And I will just add, as a network theorist, someone who has studied and written about networks for 30 years: it is a decentralized world, but still a world in which power is held accountable. So we are really looking forward to this discussion. We're going to hear about their experiences working in the field and speaking to practitioners across the spectrum: people working at social impact organizations, technologists, funders, policymakers, and community members, all of whom are impacted by pieces of technology. Public interest technology work is happening. There are public interest technology careers all over the place. We simply need to come together and institutionalize it, recognize it, and support it as a field. So thank you all for being here. We're going to have a moderated discussion first, and then we'll turn to Q&A. We invite you, as you heard in the recording, to send in your questions. And with that, I'm delighted to turn the mic over.

Thank you so much. It's just wonderful to be here with everyone today, and to talk about this very exciting book, The Tech That Comes Next. The title alone ignites an amazing vision. Amy, how did the book come about?

Thanks for having us. I'm really looking forward to the conversation as Anne-Marie set us up.
I think we're going to talk about every single bit of this book, from what's been shared, and I'm really looking forward to it. So the book came about, well, I guess because Afua and I are folks who've worked in many different aspects of what technology means for social impact work: how it gets built, how it gets funded, how it is supported. And also, I don't know if you all have looked around, but this world has a few things that need to change in it. So the combination of our experiences, and of working together on other projects like NTEN's Equity Guide for Nonprofit Technology, really helped us feel like there are actually a few things we have to say. What we did with the book, hopefully, is set a few of those questions up so that more folks join us in that conversation.

And I think something that's really important to start this conversation on is values. Values are really important to talk about because ultimately what we value shapes what, and also how, we build whatever we're building, right? So we wanted both to situate this conversation, and to situate the book, in values that we saw as integral to community-centered models of care, community-centered models of technology development, of engagement, wherever you're going to go. There are six values that we outline at the start of the book, and I thought it would be helpful to start this conversation from that same place, because ultimately we're trying to create an equitable world. We're not ultimately just trying to create technology, right?
And if we're trying to create an equitable world, that requires a few things of us, in our view. It requires that we value the knowledge and wisdom of lived experience; that we value the participation of a diversity of people in every part of the process, from decision making to planning to building; that we value accessibility as a priority from the start; that we value the multiple ways change is made, which means both immediate needs and systemic change efforts can be addressed; that we value the strength of collectively having this conversation, of collectively creating the vision of that equitable world; and last, that we value the continued pursuit of learning, skill building, and experience together.

What's really exciting to me about those values is that that's really what we talked about first. We didn't write the whole book and then at the end say, oh, we should add some values, right? The book's process was the same process we're really trying to advocate for: you start with those values. And then, I mean, we could have written four more books. Please don't sign us up for that; that's a whole lot more words to write. But from those values, you can probably anticipate how many conversations you could have just about what it means to prioritize accessibility, or what it means to really value the lived experience of lots of different folks. As we open the conversation up from values, it changes what we're talking about in technology beyond just, you know, what did you write it in, or is it open source or not. We get to peel back so many layers when we start at the values. Afua, do you want to chat about who we talk about in the book?
Absolutely, and again, echoing what Amy said, I'm just so grateful to be a part of this conversation today and so thankful to New America for putting on this event. I'm really excited to dive into the book and talk about how we can build a more equitable world. As Amy and I were working on the book, and had gone through and outlined the values we wanted to center, we had to figure out who our audience was and which roles we saw as having key responsibilities in creating an equitable world. We decided to structure the book around five main roles, with recommendations for each, and I'll briefly go through them to frame our conversation for the rest of the day.

First, we talk about social impact organizations and the individuals working at them: people very focused on the mission of their organizations who are struggling to figure out how to build technology, and how to build solutions in general, for inclusion and with a focus on accessibility. How are we making our data systems equitable? How are we developing new models for change and new ways of interacting? And then, for both the staff and the clients of these social impact organizations: how are we investing in technology, what should the relationship with technology be, and who should have access to it? Spoiler alert: everyone should have access to it.

The next role we looked at was that of technologists, the people designing, developing, and deploying technology. There we really start by expanding the definition of who we see as a technologist.
We recognize that technologists can come from a wide variety of backgrounds, a wide variety of genders, races, locations, and more, and that the training and the ways people develop expertise in technology can vary greatly, from formal education to on-the-job experience. What matters is really understanding the technology and how to deploy it in ways that advance the mission and move missions forward. Technologists need to be really focused on ethics, security, and privacy. That's important in general, and we're having a lot of conversations about it, but it's especially important when we talk about working and making a difference in the social impact space, because there's so much data that really matters and the responsibility is so great.

The next role we talk about is that of funders and investors. Money makes the world go around; it's true in many different places. We really wanted to examine what roles and responsibilities funders and investors have to make a difference and to make progress towards building that equitable world: what it means for funders to fund comprehensive needs and capacity, to fund specialization and the ability to work together, to share ideas and to build, and to redefine what scale is. Scale could mean growing really large, or it could mean growing to a point and then recognizing that the model needs to shift and adapt for different areas and different communities.
Then we talk about policymakers and the role that policymakers have in pulling levers for systemic change in society. Policymakers really have the opportunity to increase access for individuals: to resources, to technology, to the ability to participate in systems. Policymakers can also work on being proactive, proactively including and centering community members and their constituents in policy development. And policymakers have the opportunity to understand the implications for adoption and compliance when it comes to technology and to systems in general.

And then finally, but perhaps most importantly, we talk about communities, and how all of the work we're talking about really needs to be rooted in community. The technology we're talking about developing, and really the world we're talking about creating, is about community; it's about people. It is moving beyond charity and really centering communities, recognizing that communities often know their own needs and have the ability, or should be empowered with the ability, to advocate for themselves. So what types of models for change can we consider for communities? How can we better center communities in the technology, in the funding, in the social impact organizations, and more? When you look at all of those together, we think there's a really strong foundation for building an equitable world.

Thank you, Afua and Amy. I want to jump in here and say first, thank you for your book. As a technology historian, I often get stuck in the past, where it is often easy to be pessimistic about the future. So to see, as Dr. Sweeney said, just the title of the book, The Tech That Comes Next, frame our sense of a future, a dividing line, a break from the past in many respects: I appreciate that particular aspect of the book.
But I want to jump in right where you left off, Afua, and talk a little bit about community. One of the things I've been critical about is some of the language we use when we think about technology, even public interest technology, when we talk about things like tech for good or tech for all. In my view, when we talk about those things, frequently the "all" ends up being certain people; certain people benefit and others are left out. One of the things I appreciate about your book is that it sets us off in a different framework, because it begins with thinking about community and centering communities. So my question for you is: in thinking about community as a starting point, how do you define things like equity and justice in the context of public interest technology, or in thinking about the future of technology? What do those words mean to you in this context?

Thank you for all of that. We need to schedule a separate conversation just to dig into what you just said. I'm going to start, and then Afua can have more time for the more eloquent answer. My immediate response to your question of how we define equity and justice and community starts with the preamble to the question, because ultimately "technology for all" is a bit of a misnomer. We don't need every single person to use the same technologies in the same way; we all have different needs, we all have different lives, we are all part of different communities. So I think there's a bit of capitalism's engine of technology, which says get it to as many consumers as possible, there need to be as many users as possible, that's actually tripping us up, when we don't need everyone to use all the same things.
And we don't need them to be used in the same way. So I think part of the benefit of starting with community, and thinking about community first, is that it's easier and more natural to already have in your mind that people differ. In your community, there might be people who love to play the piano, and people who love to play the guitar, and people who are like, I would love to listen to you, please don't make me play an instrument, right? But maybe music is still valued. So when we start with community, we already create a stage where there can be nuance and difference that's based on really meeting your needs and your goals and doing what you need. So when I think about an equitable world, I mean we are free, right? And what free means is that we are each meeting our needs, that we are each safe. And that requires something very different when we think about tech development, because it means success is not 100% adoption. Success is people using the tools they need to get done what they want to get done. A tool could also go away once that need is met; we're not committed to making sure a technology stays around. If we're good: goodbye, sunset the tool, right?

And you mentioned the importance of people using technology in the way they need it for their needs. So when we talk about designing that technology, it's okay to be specific about who we're designing for, really thinking about who is in the community and what their needs are, and recognizing that different community members might have different needs. That might mean different flavors of the technology: someone might want to look at a screen that has large text and a really high contrast level.
Others might need a low contrast level instead, and so what does that mean for how you design the technology and how you interact with it? When we talk about data collection and using data: have we been inclusive in the data we've been collecting? Are we interrogating the data sets we're using, really asking whether they are complete, whether they are representative of the communities and the people who need to be able to use that information and the tools we're designing? So I think it's about recognizing the differences within a community, but also being really specific about who you're designing for, so that as we move away from the language of designing for all, we really ensure that we are designing for Black and brown people, for example, or designing for people with varying levels of familiarity with technology or familiarity with specific issues.

Awesome, that's great. Yeah, absolutely, and just to continue that, let's think a little bit about the role government and policy could play in helping to foster and create this vision. In many ways, the role government has played with technology so far is primarily looking at the companies, their economic models, and their basic practices, and it has not been quick to respond when there have been issues, and certainly not necessarily responding with questions around equity and access. If you start with community, and community first, it not only speaks to different needs but also to a decentralization of the way we think about technology. I was wondering if you could comment a little bit on the Digital Equity Act in light of the book.
Yeah, absolutely. One of the case studies we highlighted in the book was about the Digital Equity Act and the way it came together. One of the things we talk about is that the Digital Equity Act, although it happened at the federal level, really built on conversations that had happened at state and local levels before. It also was an example of how community-based organizations, organizations that were in communities, talked to people about how they wanted to use broadband. Oh, you don't just want broadband to look for a job; you want to still have access to the internet to communicate with your friends, to order food, to get benefits, or whatever other needs you have. And we saw how that then translates into developing a more holistic policy that is reflective of the actual needs on the ground. In the book, as we look at this example, we talked to a particular advocacy organization that is working with so many of these groups on the ground, working out a process to use among these groups to find out how people need to use digital tools, how people need to get access to broadband, and where they would like to get it. Then there's the balance of how you communicate that to policymakers, how policymakers learn and adapt and build these lessons into the policy development process, and ultimately into the policies that are created. It means recognizing that even though community members might not be fluent in the policy language and technology terms we all love to use when we're in those rooms, they still know what they need. They know they need to get a job; they know they need to communicate with their friends and family. So then you engage in something of a translation process, translating that into policies and into laws that can really help address the systemic issues.
What would you add to that?

I wanted to add a little context: if folks are not familiar with the Digital Equity Act, it's not a long-standing federal policy. This is new. And what I think is really important to the story Afua is sharing is that it really came from years and years of local and state-level policy work. That work was happening before the pandemic. While the pandemic seems to have opened some people's eyes, like, oh, it turns out not everyone's online, and oh, it turns out you need the internet, a lot of folks would never have gone without the internet by choice; a lot of people prioritized it and saw it as a necessity for them. So this work is long-standing, and I think that's just helpful to remember. Again, as we talk about the value of doing this work in community: communities have always said, we need the internet. It's not like there are just 45 million people in the US saying, I don't know what's online for me. They literally cannot get broadband where they live; the other framing puts the onus on the wrong people, right? So when we think about policy, especially around technology, I just want to highlight and underline what Afua said: communities know what they need. So often policymakers, as people but really as a system, are not engaging community in those conversations. In the book we talk about policy about technology, but there's also technology in policymaking, to make it participatory and more accessible. So I just challenge folks, as you're thinking about this: the Digital Equity Act is an example of both sides. It is policy about technology, but it is also policy created through many years of engagement, on- and offline, to build that power and push it forward.
Thank you for that. I want to follow up right where you left off, with that word power, and ask a question similar to the equity one. We think a lot about power residing with policymakers and folks who are in power, and when we think about the tech that comes next, about creating a different world and fundamentally transforming our tech ecosystem, power is a central element of that. So my question, Afua, is this: you talked about the five groups of folks you focus on in the book. I'm curious how you see each of those groups exercising power in terms of technology change and development. And is there an example where you have seen communities or individuals exert power to change, in a small way or a big way, the tech context in which they live and work?

Yeah, absolutely. Across the different roles, there are certainly power imbalances that exist in our world today, whether in the implications of funding, or among policymakers, or even among technologists. The case study we have in the technologist chapter is about DataKind and John Jay College. John Jay College is a four-year institution in New York whose goal is to produce graduates who can go off and do great things, many in the criminal justice space. John Jay College partnered with DataKind a few years ago to look at how to get more students to graduate. John Jay College said, you know, we have a lot of programs and interventions in place to make sure students get through their freshman year of college and on to their sophomore year, but we're starting to notice that a number of students complete three quarters of the credits they need to graduate and then don't graduate. Why is that? Let's dig into that a little bit more.
So they partnered with DataKind, which did some data science work for John Jay College, running a lot of different models using something like 20 years of historical data; as an academic institution, John Jay College had a lot of data on hand about students. Ultimately, DataKind was able to develop a model that identified students at risk of dropping out and helped identify potential interventions to take; the John Jay College staff then made the final decisions and implemented those interventions. John Jay College credits this algorithm with graduating an additional 900 students over a two-year period, at a cost of only about $250 per student to the university. When you think about the number of lives changed, families that are more supported, people who can pay back student loans, and things like that, it really has a tremendous impact. What I love about this example even more is that, in talking to the John Jay College staff, they described how the DataKind data scientists really explained to them what was working. The John Jay College staff could now explain to you what a random forest model is, for example; they knew they could understand the technology, articulate what they needed to see, and trust that the technologists would deliver it. That started to shift a little of the power imbalance that sometimes exists between technologists and social impact organizations. In fact, the John Jay College staff say that now, when new tech vendors come to campus to sell them something, if those vendors can't answer their questions or can't explain their products, they don't engage in those contracts, because they now understand the power they can have and their ability to communicate.
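The early-warning pattern described here, students who have completed most of their required credits but then stop enrolling, can be sketched in a few lines. This is a hypothetical toy illustration, not DataKind's actual model; the field names and thresholds are assumptions made for the example:

```python
# Toy sketch of an at-risk flag (illustrative only, not DataKind's model):
# flag students who finished most of their required credits but have not
# enrolled for several terms, the pattern John Jay College noticed.

def flag_at_risk(students, credit_share=0.75, idle_terms=2):
    """Return names of students past `credit_share` of required credits
    who have been unenrolled for `idle_terms` or more terms."""
    at_risk = []
    for s in students:
        progress = s["credits_earned"] / s["credits_required"]
        if progress >= credit_share and s["terms_since_enrolled"] >= idle_terms:
            at_risk.append(s["name"])
    return at_risk

students = [
    {"name": "A", "credits_earned": 95, "credits_required": 120, "terms_since_enrolled": 3},
    {"name": "B", "credits_earned": 40, "credits_required": 120, "terms_since_enrolled": 0},
    {"name": "C", "credits_earned": 100, "credits_required": 120, "terms_since_enrolled": 1},
]

print(flag_at_risk(students))  # ['A']
```

A production model like the random forest the staff mention would learn its thresholds from historical data rather than hard-coding them, but the point made above holds either way: once the logic can be explained, the organization, not the vendor, decides whether the flag and the intervention make sense.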
And it has also changed the way they interact with some of their funders, really pushing for enough funding and time to experiment with solutions before implementing them, so it has shifted some of the power dynamics with their funders as well.

I'm going to talk about power all day. And it's interesting: in the book, it's not like we're necessarily advocating that, say, funders and investors do more to get together and build power. Building power is not a universal goal across all of these chapters. What building power means, and to what end that power is wielded, is of course different with different communities, and different with technology in different ways. I think it's been really incredible to see, despite the huge influences we have in the US (and I'm sure there are folks joining from other places, so, TLDR, it's probably the same where you are), these big forces where it's a few major players, or a single player where you live, to get internet from. And yet there's municipal broadband, where communities have said: actually, no one's coming here, no one's prioritizing the reliable, fast speeds we need and deserve, so we'll build it ourselves. And then they also benefit: the revenue, the maintenance, it's all part of the community now. That's a really cool way of thinking about building power. It's not, we formed a PAC and now we're lobbyists; it's, we built the tool we needed, and now we can do whatever we want with it, and we don't need to justify that this was a workforce development effort or that it's all to get our whole city employed. Internet should be a utility. It doesn't matter whether you took a shower or drank the water; you needed water. It doesn't matter whether you were working online or reading something; you need the internet, and you should be able to have it.
I think another way of thinking about technology as it relates to communities and building power, something we talk a bit about in the book, though I'm going to reference an example that's not in there, is technology being handed over to the community for maintenance and for the longer-term run, to determine where it goes. Open Collective is an organization and platform supporting lots of fundraising and mobilization efforts, where you don't need to have a 501(c)(3), you don't need to pretend that's the only way you make change. They're really working on what the transfer to community ownership looks like, so that the long-term plan for the tech platform is that it belongs to the community. So as we think about building power, part of what we talk about in the book is building up the strengths and skills and technical expertise in the community, so that you can transfer ownership to the community and they can really run the tool or the platform or the solution. That's my short answer: just as with policy, you can use technology to build power and network and communicate and organize, but you're also building power to influence technology and have it be accountable to the community in the end.

Something I found really interesting, both in the book and in this conversation, as you were talking about power and the role of users in the John Jay example, is the question of blind trust. In general, most people think of technology as a black box: what comes out of it must be right; it can only be designed the way it's designed. I think you are really challenging this, and you're asking all of us to challenge it and to move forward. Could you tell us more through the lens of blind trust in technology?
Yeah, Amy and I are both wanting to jump in on this, so I will talk quickly and then pass it over to Amy. Dr. Sweeney, to your point, people often want to have a blind trust in technology. Technology is the be-all and end-all, the best arbitrator we can have, is what some folks might say. That is not what I would say, and I don't think that's what we argue in the book. In fact, we talk a lot about how technology is ultimately a series of choices, and we can decide to make different choices. Because it's a series of choices, and it's created by humans, it's susceptible to all of the positives and flaws that humans have, and so it's susceptible to all of our biases. So as we build technology systems, and even as we make decisions about how to deploy them, we should really interrogate the assumptions we're using. Everything from: if you're designing a form and you've said you're going to be inclusive, are you giving people enough options for race, enough options for gender, for example? If you're making predictions or doing analysis of data, have you done a check to make sure that data is actually inclusive, and if it isn't, have you made an intentional decision about what to move forward with or what not to move forward with? One of the things we talk about in the technology chapter is that part of being a technologist in this space means understanding when to say no, when to say no to a particular technology intervention: because technology isn't the right solution, because the data's underlying flaws cannot be compensated for enough and aren't valid for the situation, or because the technology solutions that have been proposed are inappropriate for the situation. You might not always need a blockchain, for example, to solve every single one of your basic database problems. I know, shocking, shocking.
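The data inclusivity check described here can be as simple as comparing each group's share of a data set against a reference benchmark before building anything on it. A minimal sketch; the field name, groups, and benchmark shares below are all hypothetical:

```python
from collections import Counter

# Illustrative data-inclusivity check (hypothetical field and shares):
# compare each group's share of a data set to a reference benchmark and
# report the groups that are underrepresented beyond a tolerance.

def representation_gaps(records, field, benchmark, tolerance=0.05):
    """Return {group: shortfall} for groups whose share of `records`
    falls short of the benchmark share by more than `tolerance`."""
    counts = Counter(r[field] for r in records)
    total = sum(counts.values())
    gaps = {}
    for group, expected in benchmark.items():
        actual = counts.get(group, 0) / total
        if expected - actual > tolerance:
            gaps[group] = round(expected - actual, 3)
    return gaps

# A made-up survey sample versus census-style benchmark shares.
records = [{"race": "white"}] * 80 + [{"race": "black"}] * 10 + [{"race": "asian"}] * 10
benchmark = {"white": 0.60, "black": 0.25, "asian": 0.10}

print(representation_gaps(records, "race", benchmark))  # {'black': 0.15}
```

A gap flagged this way doesn't dictate the answer; it forces the intentional decision the chapter calls for, about whether to proceed, collect more data, or say no.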
But really recognize some of those assumptions, really push on them, and come back to the values we talked about before. The decisions you're making to design or deploy this technology: are they aligned with these values? Are they inclusive? Do they recognize and see all of the people in the communities? Amy, over to you.

Yeah, I agree with everything Afua said, like always, for infinity, but especially all of that. And I'd just take a step back and say: y'all, humans created the world we have today. Are these the same humans you're going to assume created every technology perfectly? No. Humans have not proven that I should be trusting them as a whole right now in this world. So instead of trusting technology, a different word that I've really found helpful is celebrating. In our conversations, our planning, our budgeting, our prioritization, are we celebrating that technology was the solution? Because technology is a tool. You know, my dad was a career laborer, and he didn't come home from the construction site and say, that jackhammer today was a real success, right? That's not where the celebration is. The celebration is that we are meeting our needs, we are meeting our goals, we are meeting our missions, we're actually changing the world. And when we change what we're celebrating, it's just like changing what we value at the start: the celebration is that we are going where we want to go, not that we like the look of the fields in this database. It helps us reframe that we do this because we want an equitable world, because we want to be in community, in relationship with each other as people. Not: the platform I used to talk to Afua was, oh my gosh, the best.
So glad! No, I'm celebrating that we got to work together and we got to talk and build something together, right? That's one of the things I've really loved, Afua, about you highlighting the John Jay example, because oftentimes when we think about things like power and participation, it's to the end of building a new or different type of technology, when really the celebration is that there are 900 students who were educated, graduated, and set up for career and economic success. That's the human element, that's the part that we really want to change and champion about the potential impacts of technology. Before we continue, I want to take a moment to highlight the book again. We're talking about it, but I want to make sure everyone has a chance to know what it is and to be able to get it. So when you get a chance, all of you who have joined us today, please, please, please go out and purchase your copy of The Tech That Comes Next: How Changemakers, Technologists, and Philanthropists Can Build an Equitable World. You should all be able to see a link on your page; you can click on it right now on your screen and go straight to the book and purchase it. Also, the public interest technology team is going to be raffling off, I believe, five books to registrants of today's event, so no action is needed from any of our participants. The team will do a random selection and let winners know the further instructions for getting the book. But please don't leave today, or shortly thereafter, without getting your own copy. All right, that's it for the plug. Thank you for the plug. We always have to figure out, as we were discussing before, how you sign those books and all those things in our virtual world of virtual book tours, and I would add that it seems like there's room for an additional technology vision of how to make that happen. I want to turn now to questions from the audience, starting with one from Doran Moreland.
It seems that leadership is critical for measuring and advancing equity-focused work. How do you identify leaders of organizations who are seriously committed? Great question. I love talking about leadership, because I think it's a really great opportunity to question some of our assumptions when we use the word leader. Especially if we're talking about a technology leader inside of a social impact organization, that could be someone with literally any job title you could find on LinkedIn. That person might be the technology leader in that organization, and similarly might be the person most committed to an equitable world. Having the privilege of putting CEO in your email signature does not mean you are now magically a technology leader, an equitable-outcomes-focused leader. Instead, and this goes back to the building-power conversation, it's about making sure that whoever you are inside an organization, and whoever you are talking to or working with, you find the right people. In the example with John Jay College, not every meeting was with the college president, right? It's finding the people who are ready to wiggle a little in the system so that the wiggle room gets bigger and bigger, finding those folks and helping them build power internally. And as an organization, one of the best ways to build power, especially if you do not have CEO in your email signature, is by bringing the community in. It is a lot harder to say no to a staff person who is also able to say, and here are ten community members who also need this change, who are also harmed by the way our database says these are the only three identity markers, right? You build power by having the community right there with you from the start.
And again, it might not be a director or a manager or even someone on a tech team; it's often the staff who are closest to the community, program staff, engagement staff. They're able to not just be the liaison but literally bring a chair into the room, bring the community in, and change those conversations. And I think reframing leadership around the folks who are willing to take action helps us get rid of some of the old hierarchy and old ways of thinking that aren't necessarily serving us as we move into a more equitable world. Absolutely, everything Amy said. One thing our book is not is a checklist: follow these ten easy steps to build an equitable world, or follow these five easy steps to become a leader. But what we do talk about a lot, and Amy touched on this as well, is the importance of having conversations across these roles, and then using that to determine what our next step is, how we move forward, and how we always bring it back to the community. So as you're looking for leaders, look for people who are engaging in those conversations across all of the different roles we talked about before, and who are tying it back to the mission: what are we ultimately trying to create in the world, and how are we making sure we keep moving forward and making smart choices about technology, about relationships, and more along the way? Let me bring in another question that maybe gives us an opportunity for a real-time case study of sorts, and that is a question about the metaverse. What are some of the equity implications that you wish those designing AR and metaverse platforms were considering now as they build? I would go back to our values and say accessibility.
And accessibility does not just mean that a screen reader can enter the metaverse; accessibility means a lot of things. When I think about the folks who are already in incredible positions of privilege, working on the metaverse when 45 million people just in the US can't even get online, that automatically makes me ask: well, who is the metaverse for? Because I don't know that it's for a lot of us. And that means whatever investments you make and whatever path you choose today really determine where it goes, and if it already feels like it's not for a lot of us, who do they think is coming to the metaverse five years from now? That's not to say they can't change that. But the biggest question for me is: who is this for, and who benefits from this being invested in today, versus other investments that would bring more people into the conversation? Yeah, as Amy said, humans will human, right? We have a long track record of seeing that. As we framed our book, we framed our work around what is coming next: what are the values we're trying to create, what intentional choices do we have to make to make sure things are equitable, to make sure people have access, and to make sure the systems we build are built around inclusion rather than exclusion. And so I challenge designers of the metaverse to think about those same questions, to continue engaging in those thought processes, and to work out what they mean for the actual products, services, and systems being created. Well, you know, reading the book also makes us reflect on technology design in a different way; it raises questions. So I'll stick with the metaverse for just a minute. The metaverse is being built right now, and it's being built around a set of values. It's not as though technology is value-less. It's packed with values.
And the question is whose values the metaverse is going to serve. So what do you think: is it that Meta should engage communities in discussions about inclusion in the metaverse? Or should we think about building augmented and virtual reality systems for communities, in other words, engaging communities to see what kinds of virtual reality experiences would be beneficial? Some people have thought about this; for example, some have raised the question of how you use virtual realities to establish empathy, to live as the other. That's not something currently on the list of anything Meta is considering. But I wonder if that's where you're trying to get us to go. I don't know if you have any reflections on it. Yeah, I'd say that's exactly where we want people to go: centering communities. So, for people designing the metaverse, that means going out to a variety of communities, both those who are early adopters of metaverse technology and also, intentionally, asking who is not here, finding those communities and determining why that is. Is it because there is no place for them? Is it because they perceive there's no place for them in the metaverse? Is it because the metaverse isn't accessible? Is it because there's no desire for the metaverse, or because they envision a different way to interact with it? Really going back to: let's find the different communities, let's talk to those communities, and then design from there. And, in that engagement with communities, I think letting go of the assumption that the metaverse is what they need. Right. Going in to say: we can create technology, we could probably create some pretty advanced technology, and we have a whole lot of resources, I mean, we're talking about Meta slash Facebook, there's a whole lot of resource we could put behind this. What do you actually need?
I mean, and we can go into this or not, or I can bleep out the names, but it is already starting from a problematic point when it is not Meta leading this, right? They already built a product, back in 2003, a whole lot of lifetimes ago, from a very specific use case with very specific values. A lot of us have never been surprised when it's been problematic; it was built problematically, right? And we see this in social impact technology all the time. There's technology that we are being sold, that we are being marketed, that was never built for our use cases, that was never built for us, but now we are a sales vertical that was added on many years later, right? So I would argue, just like with any other technology, if it's being built and sold for a very exclusive, specific group, and then downstream it's, oh, we could get these people on here too, it's not for us and it's not going to meet our needs. We need purpose-built, intentionally built tools, and Meta saying "the metaverse" is not, I think, going to meet that threshold. It needs to actually ask communities: what do you need, and how are we in service to that end? If I can stay with this for just a minute, because I think in a lot of ways that sums up the real crux of your book and your ask: what role do we have to play, what role do those five groups that you focus on have to play, in helping make this happen, assuming we kind of know where things are moving if Meta and so forth has the agenda? What role is there for the rest of us to say, let's slow down, let's be more intentional, let's maybe even put some boundaries and gates around what gets developed and what doesn't?
And I especially think, again, back to that power question: the folks who already have a lot of power, funders, investors, policymakers, are the ones who have the most opportunity to change that dynamic. Funders have a bunch of power because they are often the ones funding these projects, and then they say, and we want it done really fast, and we want v1 to be perfect. Right? That has to change. If you instead say, we are here for the long haul, we will continue funding five versions until it gets closer and closer to better, and that can be on whatever timeline is appropriate for the community, then instantly there's breathing room. But that came from the funder, because of this broken, beholden relationship that we have. Same with policymakers: there are definitions, there are policies, there are all these pieces we can use as the mechanics to say, actually, it doesn't need to work that way, or, you can't go to step two until the community was engaged in step one in a meaningful way. So we almost get to take the system that doesn't work for us and use it back against itself, so that we do build things that way going forward. I echo everything Amy said. I used to work in the federal government and would often tell people we need to figure out ways to use bureaucracy for our good. Some of the same systems and processes that may have seemed bureaucratic and a hindrance, we can figure out how to use to help advance some of the work, to enshrine some of the work, and to carry on some of the work you're pushing. So really, as we think about what decision gates might look like for the metaverse or for technology development
in general, what do those processes look like? As we look through the processes around funding organizations, investing in organizations, purchasing products, and developing products that work with systems that have already been developed: what do the policies look like for when people can and can't go forward? What does accountability look like? What does enforcing existing laws look like in a more digital world, or in a metaverse world? I think it's really about thinking through what systems are already in place and how we shift them a little bit so that we can use them to protect and advocate for the needs of communities. You know, one of the things I find interesting as you're talking is that we keep blending who is the consumer and who is the public. In fact, there's a question from someone anonymous: are tech consumers and the public community the same people? I think this is a great question, because one of my colleagues often says that when you use a lot of current technologies, you are the product, and the company's customer is its advertiser. How long can I keep you on the platform? That's what gives them the opportunity to make money by delivering ads. What's interesting about that perspective is that the technology isn't really designed for you; it's only designed to keep you on the platform, and so decisions you might make about your own productivity, or your own uses, become subordinate to the money machine, the way they've chosen to make their money. Do you think that if you unpacked that and tried to put the community first, or even in some cases individuals first, people first, to what extent would it have an impact, certainly on the population but also on the business? And how do we unpack who the customer really is?
I love what you're saying, because I see a direct parallel to social impact organizations. There's a temptation, even if we're not aware of it, to make sure the problem we are addressing continues, so that we stay in business and still have an organization, right? If we're really focused on the community, then we can get over that as an organization, and our end goal becomes: we are closed, there's no more need, whatever your mission is for, it's addressed. It's a success that you close the organization, because the issue no longer exists. In the same way, technology that is there to keep you coming back to the source is not valuing community; it's not there to meet your need. I wanted to name that parallel because I think when we find other mirrors, it helps us look back at the original example or question in a different way. And if we're focused on community, again: do community members need those ads? The argument is always, well, this way you have it for free, and the ads are helping us stay in business. But if it's really valuable, people might pay 99 cents, right? People might be willing to pay for the thing they do need and have it actually be accountable to them and not to the advertisers. But again, we are making assumptions about the community and selling something to them, even if it's free, instead of building with the community and actually meeting needs that exist. We're starting to run a little short on time, but just to augment what Amy said: tech consumers can be, and are, members of communities, and so it comes back to whether we are really centering those communities. And so really shifting, Dr.
Sweeney, as you pointed out, from consumers as the product to really centering the community: have we gone to see what their needs are, what they actually need, and are we designing around that? And just a really quick note, I would kick myself if I didn't say this: a reminder that we are all, as humans, in communities. I am part of multiple different communities: geographic communities, identity communities, families, friends, et cetera. People who build technology are also humans who are in communities, right? So part of what's really important to remember is that while we talk about social impact organizations, funders, technologists, and communities, we are all of those things at different moments in our day and different moments in our life. An organization becomes a funder when it says, hey, we carved out money in the budget to build this ourselves, right? Remembering that, instead of imagining a hard wall between each of those groups, helps us see that we are already operating in these ways; we're just not thinking about it and not seeing it. So it is not a vastly different reality we're trying to build toward; we have it, we're just maybe not realizing it right now. Just a couple of quick questions as we wrap up, very practical. Number one: do you have a discussion guide for your book, for those out there at workplaces and elsewhere who want to use it as a book club reader? Also, what advice would you give to folks who are working at tech companies and want to do the most good? On the first question: we have plans for a discussion guide, soon-ish to come, so look out for that. And then quickly, for folks in the tech world wanting to do this: start to interrogate the decisions you're making in your work today. Take yourself through the questions that we posed today and the questions that we've got in the book as well.
Think about which communities are interacting with or using the technology you're developing, and ask: are you centering them? Do you have ways to talk to them? And more. That's what I would suggest. And when the discussion guide is ready, we'll put it up for free on The Tech That Comes Next website. There's also an email sign-up there; we are not sending a bunch of emails, but that way we can let you know when the discussion guide goes up. And if you don't have the book yet, please know that there are 25 questions in every chapter, so you may not even need the discussion guide; those questions may be enough to get you talking for quite some time. Wonderful. It's been a real pleasure talking with you. I just want to remind everyone to make sure you leave this webinar today with a copy of the book; there's a link down in the corner that you can click to get your copy. What a real pleasure it's been to talk with you all today. Thank you so much. This has been so great.