Hello everyone and welcome to our last Wednesday webinar of the year from the Public Interest Technology University Network, or PIT-UN as we call it here at New America. My name is Alberto Rodriguez, I'm a Senior Program Manager at the New Practice Lab and the Public Interest Technology University Network, and I'll be facilitating this conversation today. As you know, PIT-UN unites colleges and universities committed to building the field of public interest technology and growing a new generation of civic-minded technologists. These webinars are just one of the ways we elevate the work of researchers to both our members and everyone interested. Emerging technologies are changing every aspect of our lives, from how we communicate with each other, particularly in this pandemic, to how we are shaping our future. For the last year, a team of researchers affiliated with Civic Hall and the Design Justice Network have been working to understand how these technologies might allow us to live with a degree of safety from COVID and other viruses, and how we can ensure that technologies do not unintentionally reinforce racial and structural inequalities. After an in-depth literature review, twenty-some field interviews, and a database of more than 250 relevant sources, they published the report Pathways Through the Portal, a field scan of emerging technologies in the public interest, which we're going to hear all about today. Our conversation will be really straightforward. First, we're going to hear from the authors about the main insights of the report and the recommendations for designing and using emerging technologies in the public sphere. They'll also be sharing some links in the Q&A, so keep posted on that. After that presentation, I'll be asking our panelists some questions sourced from you, our audience. So please join the conversation by making comments and adding questions in the Q&A, and at the end we'll try to address them.
Before we start, let me quickly introduce our authors and our panelists today. I'll start with Diana Nucera, aka Mother Cyborg. She's an artist, educator, and community organizer who explores innovative technology with communities most impacted by digital inequalities. Her specialty is developing popular education experiences, supported by dynamic documentation, that empower communities to use media technology as visionary tools. She has been working as a media artist and a technology educator for the past 16 years. Berhan Taye is a researcher who investigates the relationship between technology, society, and social justice, right up our alley here at PIT-UN. She is a senior policy analyst and global internet shutdowns lead at Access Now. Before that, she was a Ford-Mozilla Open Web Fellow with Research Action Design at the Open Technology Institute. Sasha Costanza-Chock is a researcher and designer known for their work on networked social movements, transformative media organizing, and design justice. Sasha is a research scientist at MIT and the author of two books and numerous journal articles, book chapters, and other research publications. Matt Stempeck is a senior researcher at Civic Hall and the curator of the Civic Tech Field Guide, a crowdsourced global directory of nearly 4,000 tech-for-public-good projects. He holds a Master of Science from MIT and a BA from the University of Maryland, College Park. And last but not least, Micah Sifry, president and co-founder of Civic Hall, editor-in-chief of its news site Civicist, and a longtime curator of the Personal Democracy Forum. He's the author and/or editor of numerous books on this intersection of technology and democracy, and I'm a great fan of his as well. So, let's go right into it. I'll leave our authors to walk us through the report. Diana, do you want to take it away? Sure. Thank you. Yes, we are all great fans of Micah.
He took the initiative to bring us all together along with Sasha, and this is a really amazing team, and we're excited to share with you the results of our field scan. This is looking into the potential of emerging technologies within the public interest. This report carries insights into how technologists, communities, artists, researchers, and educators are harnessing emerging technologies, from chatbots to robots, and we dug into what people are doing with these tools and what they've learned along the way. What this report does is share so many different perspectives from people working on many things. And if you're someone who really likes to be tactile with docs, I'm going to put the link to the report online in the chat, so feel free to go through it as we're talking. Writing this was pretty intense, because it was literally during the start of the pandemic, and in the midst of it we had this uprising for Black liberation. There was just so much happening. The title of the report was inspired by the author Arundhati Roy, who wrote about the pandemic being seen as a portal. To quote, she says it could be a gateway between one world and the next: "We can choose to walk through it, dragging the carcasses of our prejudice and hatred, our avarice, our data banks and dead ideas, our dead rivers and smoky skies behind us. Or we can walk through lightly, with little luggage, ready to imagine another world. And ready to fight for it." And so emerging technologies will clearly be important to our passage through this portal. The big questions that we're asking are: can we develop new tools that will allow us to live with a degree of safety from viruses? Can we ensure that the technologies we develop do not unintentionally reinforce anti-Blackness or any other forms of structural inequality? And which of many potential futures do we want to build together? Thank you, Diana. So, good to be here.
I have a little more background on how this report came together. In order to develop signposts to help guide us along this journey, Civic Hall, more than a year ago, with the generous support of the McGovern Foundation, gathered a group of researchers, technologists, and community organizers to explore the potential of emerging technologies serving the public interest. As everyone knows, there are a lot of critiques, a great many of them valid, and some of them from some of our colleagues who worked on this report. But we wanted to look at the potential for positive uses. For the purposes of this study we used "emerging technologies" as a broad umbrella term, and we shortened it to "emtech." As we talk through the report you'll hear that we paid particular attention to public interest uses of artificial intelligence, in particular machine learning, natural language processing, automated decision systems, and bots, as well as other tools like virtual reality, drones, remote sensing, and satellite imagery. And we wanted to focus primarily on the types of emtech that would be accessible to public interest organizations. In the end, the way we worked: our team of five researchers, starting last winter, developed a list of experts that we wanted to interview, and we ended up doing 24 in-depth field interviews with data scientists, technologists, artists, activists, researchers, policy advocates, and others. We also developed a database of 246 relevant examples of emtech being used in the public interest. And we found cause for caution and for optimism. We took that work and synthesized it down into seven key findings and recommendations, and we're going to share a summary of those here with you today. We hope that these findings can provide some insight into how and under what conditions emtech can really help advance the public interest, so that as we move together through the portal, we take only the baggage that we truly want and need.
The first of the principles that we synthesized from the work that we did is: do no harm. This is illustrated by a quote from an interview with John Callery, who's the vice president of technology at The Trevor Project, who told us in his interview: "I think it's so easy to create a machine learning model. It's completely different to make sure that it's fair and doesn't have harmful effects." He was sharing that thought in the context of his work at The Trevor Project on a $1.5 million Google Impact Challenge grant to develop the Trevor Project's omnichannel crisis intervention platform for LGBTQI+ youth. The Trevor Project, if you're not familiar with their work, is a support line and support space for LGBTQI+ youth, and they're using machine learning to prioritize incoming chats and calls from young people who may be at a higher risk of suicide, so that they can put those incoming chats first in the queue to reach trained counselors. The Trevor Project used UX research to inform the development of a product that wouldn't inadvertently cause harm, because if you're going to be making recommendations for people in a situation of potential self-harm, you really want to make sure that you're doing that in a way that doesn't systematically bias against some of the most at-risk youth. So particularly Black, brown, and indigenous LGBTQI+ youth might be at greater risk here, or lower-income youth, and you want to make sure your systems don't bias against them, right? And this is important because emtech does offer great potential for mitigating and reducing harms, but we also have a vast and rapidly growing body of scholarship, practitioner knowledge, and lived experience demonstrating that, unfortunately, powerful actors often use emtech to harm Black, indigenous, POC, LGBTQI+, disabled, immigrant, working-class, and other marginalized communities, whether or not those harms are intended.
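The kind of risk-based queue prioritization described here can be sketched in a few lines. This is a hypothetical illustration only, not The Trevor Project's actual system; the risk scores are placeholders standing in for whatever a trained model would output.

```python
import heapq
import itertools

# Hypothetical sketch: incoming chats are pushed with a model-assigned
# risk score, and counselors pop the highest-risk chat first.
class TriageQueue:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-break for equal scores

    def push(self, chat_id, risk_score):
        # heapq is a min-heap, so negate the score to pop highest risk first
        heapq.heappush(self._heap, (-risk_score, next(self._counter), chat_id))

    def pop(self):
        _, _, chat_id = heapq.heappop(self._heap)
        return chat_id

q = TriageQueue()
q.push("chat-a", 0.20)
q.push("chat-b", 0.90)
q.push("chat-c", 0.55)
print(q.pop())  # chat-b: the highest modeled risk reaches a counselor first
```

The fairness concern raised in the talk lives upstream of this queue: if the model systematically under-scores risk for some groups, the ordering itself becomes the harm, which is why auditing the scores matters more than the queue mechanics.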
Right, so it doesn't have to be intentional harm; it can still be harm. We therefore need widespread adoption of better design processes, independent audits, and stronger regulatory oversight. In the full report, you're going to find extensive recommendations for each of our key finding areas, and we have sections tailored for developers, funders, policymakers, and journalists, as well as cross-cutting recommendations. So today we're not going to share all of those with you; we'll just highlight a few. For example, in the area of finding one, do no harm, we recommend a more robust ecosystem to minimize harms, including red teams, vulnerability bounties, and independent audits to explore, test, and mitigate potential and real-world harms in emtech projects, both before and after launch. We recommend that people adopt the Design Justice Network principles and explore the consentful tech guidelines, and all of those, of course, are linked in the report, along with additional recommendations. Another key finding is understanding that context is key. This is illustrated by Dragana, who is the executive director of the Localization Lab. She says: "We went back to our entirely Western group of developers who work on a popular messaging app. And we said, look, there are other parts of the world; if you really design this for them, you'll find they're not using it in the same way as you thought, and there are things you might not have imagined." It's really important to understand that each place in which we live can access or use tech in very different ways, and an emerging technology in one place, or among one field or group, may look very different than in another. So community partnerships for context and localization are essential; this was a huge finding, because there are great disparities between countries as well as between different social sectors within each country.
So in terms of technology access, adoption, or use, in some contexts an online spreadsheet may be considered emtech, while in others emtech might mean artificial intelligence, satellite imagery, or an augmented reality tool. It's very different depending on where you're at and how you're using it. An example of context is a project that I had the honor of working on, the Equitable Internet Initiative, launched here in Detroit, Michigan, which actually was supported through the America Foundation at its roots. To me it's a good example of what it looks like to offer a community the skills they need to build their own internet infrastructure in places where commercial providers had abandoned them. The Equitable Internet Initiative worked with neighborhood leaders to ensure that more Detroit residents could leverage digital technologies for social and economic development. These are places that were redlined by technology, so residents were given the tools they needed to overcome that. Some of the recommendations from this section, as Sasha mentioned, are definitely within the report. A few: supporting the leadership of community-based organizations and individuals with deep contextual knowledge and lived experience is so important. And a lot of this is already out there within the More Than Code report; there's a link on the website so you can go through it. Hi everyone. So our third finding is that data is messy, and that might not be particularly surprising to those of you who've worked with data, but we mean messy in a very broad sense, not just poorly labeled. The artist and researcher Mimi Onuoha put it this way: "What does it mean for us to make the world into data? What happens when we do that, and what places get missed in that process?"
What Mimi is referring to here is that many of the emerging technologies we talk about in this report rely heavily on data, but data is always incomplete. It's often misleading, and it's difficult to maintain and protect. Data disparities lead to amplified inequalities. So we're missing a lot of the data that we might need to actually develop technologies in the public interest; data sets that are crucial to the work are often difficult, expensive, or just plain impossible to obtain. If you're looking for data about police violence or worker fatalities, you might just not be able to get it. Not to mention, it's fundamentally impossible for data to ever fully capture the infinite range and deep complexity of human emotions, cultural significance, and experiences. Data just isn't an accurate proxy for reality. As an example of data being really hard to obtain: as COVID-19 spread across the US this year, a lot of states weren't sharing data on COVID-19 infections and deaths disaggregated by race. So if you wanted to do an analysis of the racial aspects of the pandemic, which turned out to be really important, you simply couldn't. Data for Black Lives called for states to prepare and release this data, and over the course of the pandemic they've maintained a record of which states have shared it and which haven't, to try to make that data available. Lastly, a few recommendations on this one as well. As the report makes clear, we need significant resources to support good data stewardship. We should adopt a racial and gender equity framework specifically for data infrastructure. We should promote more transparency, consent, and accountability in how data is created, used, and shared. And we should develop mechanisms for public oversight of the use of emtech, for example at the municipal level. So our fourth finding was that community-led design is the way forward.
A researcher from Oxford University tells us that computer scientists are not always in dialogue with people who are social scientists or activists or anthropologists. He goes on to say that, for instance, if they're looking at antisemitism, it would make sense for them to go and speak to people in the Jewish community and to read the very long and well-established literature on antisemitism. But normally, when they're trying to design for a specific issue, they don't necessarily look at the literature that's already there, or even talk to the people who have worked on and experienced the issue. So community-led design, and the fact that we need to include communities, is really essential, and that's one of the key findings that really came out. Next slide. What we found in our conversations with our interviewees is that community-led design definitely is the way to go, because it produces more just, equitable, and sustainable emerging tech. These practices are moving into the mainstream, and that's one thing that we want to appreciate, but in order for them to remain in the mainstream we need more funding, training, and institutional commitment. One example that I am quite excited about, this one particularly, is an example from Mexico, where the Human Rights Data Analysis Group worked for a decade to apply statistical analysis to human rights cases. Among many other powerful projects, they were able to help activists find the graves of disappeared people in Mexico City. So when you really do bring in the community, it makes such a difference, and we do make real-life impact. One of the recommendations that stood out for us was the importance of adopting the principle of "nothing about us without us": don't write about us, don't tweet about us, don't try to solve our problems without including us from the get-go.
So this is one thing that came out: community technology practitioners are there within the community, and most of the time they've learned to work with tech because the community needs it. So it's really important to learn from and partner with key stakeholders, issue experts, and people with lived experience. These are really important things that came out of this conversation. And last but not least, adopt a human-centered policymaking approach. So it's not just about the technology, but also the policy that goes into making this technology real and applicable in the world. The fifth finding is that sometimes really snoozy is better than sexy, and I always want to note that Sasha came up with this one. Basically, what we saw with some of the interviewees we spoke with, and Diana has also touched on this, is that sometimes what organizations need is a tool that can really help them navigate the day-to-day admin, the mundane. Not necessarily an AI that's going to solve their other problems; they might just want a really good script that can help them track, for instance, who from their fundraising list is potentially going to fund their space. These are some of the really critical things that came out: admin tools for administrative issues. It's not blockchain, it's not AI, it's not anything fancy; it's really the most simple thing that most organizations want. Meeting them where they are and providing what they need is also really what providing useful emerging tech is about. Next slide.
One key example that we have here is the nonprofit AI readiness checklist, which basically provides a checklist for nonprofit organizations on how to navigate and how to apply AI. But in addition to providing that checklist, the question is: do you need this or not? If you don't, move on and find something that works for you. Again, a spreadsheet could be the most innovative thing that you actually need here. Some of the recommendations that came out of this conversation: organizations and people need support to really develop the capacity to implement tech for public interest work. It's not just about innovation hubs, it's not just about, you know, hackathons; it's about the real capacity of organizations, and community accountability as well. Being able to meet them where they are is really important. Create and maintain a strong, regularly updated, shared resource list to help navigate the space. The space changes quickly and new tools keep coming in, so being able to provide civil society organizations and public interest organizations with this list is really critical, so that they don't have to start from scratch. Last but not least, do not believe the hype. Use resources like the nonprofit AI readiness checklist and others to evaluate whether your organization has the internal capacity to develop the necessary data expertise and ethical oversight capability before jumping onto the bandwagon of the latest technology. And this is, I think, my favorite one: consider a cooperative for buying shared access to specialized tools.
So, you know, some organizations are small; they might not be able to afford these tools, and they don't have the expertise, so why not form a cooperative? That's one of the ideas that came up. Moving on to our sixth of the seven recommendations. We also found from talking to many of our expert practitioners that they really hunger for a stronger community of practice. This is a quote from Stephanie Dinkins, who is an independent artist who does a lot of cutting-edge work out of her own studio using tools like AI and machine learning. She talked a lot about the value of bringing people together, the power of support, and knowing that there are other folks out there to call upon, and that that's a rarity, and that the field would be strengthened were we able to grow that. This particular recommendation is about creating more opportunities for people from historically oppressed communities to be involved and to be given access to resources and leadership positions. Anti-Blackness specifically, and the diversity, equity, and inclusion challenges more generally that we see in the broader tech sector, also apply in the field of public interest emtech. This lack of diversity is undermining the field's potential, as projects too often reflect the homogenous experiences of team leadership, narrow problem framing, and data sets and models that reproduce existing inequalities. We're recommending more mechanisms like conferences, dedicated fellowships using the cohort model, and the creation and support of ongoing virtual and physical spaces, when it becomes safe to do that again, to strengthen the growing community of people who want to work on bringing emtech into use in public interest organizations. We have a whole bunch of featured examples of the kinds of organizations working at the intersection of emtech, the public interest, and racial and gender justice. I won't read the whole list to you here.
They're all included in the section of the Civic Tech Field Guide devoted to emerging technology. The recommendations for field building include rethinking the entire pipeline of how this is done. One very important recommendation is to support the development of emtech labs, initiatives, and organizations that are specifically led by Black people, indigenous people, and other people of color. We think there's a need in the field for a dedicated conference on emtech in the public interest, and we want to see more support for prototypes that are being created by activists and community organizations. And so, last but definitely not least, there was a kind of cross-cutting recommendation that applies to all of those that come before, which is that we need to seek strategies for institutional transformation. This is from Marnie Webb, who's the chief community impact officer at TechSoup, who says that nonprofits have to be actively involved in policy and development, contributing data, so that our community members and our view of the community get factored into all of those technologies, and so we don't end up ten years from now with tools making decisions without the people we serve. Those we talked to agree that institutional transformation will be fundamental to effective and accountable use of emtech in the public interest. That means shifts in policies and practices within companies, nonprofits, universities, and other organizations, as well as stronger regulation at various levels of government. Currently, the lack of understanding of emerging technologies in all those institutional spaces, together with generational gaps in representation in policy and decision-making circles, is allowing tech companies to set the tone for tech oversight and self-governance.
And, you know, the scandal that's unfolded over the last week, for example, with Timnit Gebru being fired as the co-lead of Google's ethical AI team, kind of demonstrates what happens when you allow tech companies to self-govern and self-regulate in the public space, right? So ultimately we can't rely on tech companies to self-govern in the public interest. We're going to need deep institutional transformation, policy changes, and greatly expanded legal and regulatory oversight to address algorithmic harms specifically and emtech more broadly. For example, the shadow report on the NYC automated decision system task force provided really detailed recommendations for policymakers, researchers, advocates, and the public around the complexities of evaluating the true risks and opportunities of government use of automated decision systems, the limitations of existing bureaucratic procedures, and the importance of engaging a variety of perspectives and experiences. And I apologize for the background noise; Zoom life, and someone just started cutting the grass across the street, of course, right as I'm speaking. Our recommendations for this final section include organizing a dedicated racial justice and tech fund that would be governed and led by Black people, indigenous people, and other POC. There's a need to gather and share diversity, equity, and inclusion data about emtech programs specifically, and it's not just gathering and sharing data; that includes making public commitments to equity targets with timelines. This is a best practice from the private sector, actually. We also think that we're not starting from scratch: there are a number of already existing policy frameworks that could help guide emtech in the public interest. We did highlight that shadow report on the NYC ADS task force, and we also highlight Mijente's tech policy toolkit, which they developed around the #NoTechForICE campaign.
Finally, well, not finally, because there are many more recommendations, but we also wanted to highlight that litigation is going to be a necessary tool for institutional transformation. A great example of what's happening in that space is AI Now's Litigating Algorithms report, for an overview of cases. So in addition to sharing the report itself, which we want to make sure you see, it was really important to us as a research team that if we were going to spend our time and other people's time gathering all these resources, we would also share those with you. That's why Micah and Diana pointed out the emtech pathways website; all of the additional resources we gathered in the process of creating this report are available as appendices on that website. So we'll drop a link in the chat, and there's also a special section of the Civic Tech Field Guide that we've dedicated to emerging technology. That section is at civictech.guide/emtech; that link is also in the chat. When you go to that page you can see the hundreds of emerging tech organizations, projects, tools you can use, conferences you can attend, fellowships you can apply for, all the resources that our brilliant interviewees shared with us. And like the rest of the Civic Tech Field Guide, this is a living collection, meaning that if it's your project that you see there, you can claim your page, and you can also contribute other relevant projects to this collection. So go and check that out. And speaking of our brilliant interviewees: we don't really have time to get into the detail that they shared with us, but we also published edited interview transcripts with 20 of the 24, along with brief summaries. For those of you listening who really want to go deep in learning about how different practitioners use emtech in their work, how they think about it, and how they got to where they are in the field, we really suggest you dive into those interviews. The transcripts are also available on the emtech pathways website.
And so we realize this is a very juicy report, and there are lots of recommendations and projects that we've highlighted here, so we hope that you take the time to explore the site. Once again, that's the emtech pathways website. We want to remind you that on our website we have specific recommendations for journalists, developers, funders, and policymakers, because we realize there's still a lot of work to do, and we all sort of have a lane in which to do this work; we wanted to support people in further expanding that with those specific recommendations. Alongside those, on our website you'll find a slide deck of the key findings, similar to what we shared today, in case you want to go on and share this information with others (please credit us); a resource guide; the interviews, as we have said; transcripts; and an annotated bibliography if you feel like diving a little deeper into this work. And like we said, dedicated recommendations. So thank you all so much for being here, because the first step is recognizing what the potential is and then gathering together to figure it out. So we'll sort of open it up to questions now. Thank you so much for that presentation and for the great work that you've done. I've read the report and it's quite juicy, as Diana said; it has a lot of information, and I urge everyone watching, and the people that will watch it on YouTube in the future, to go check that site. We have a couple of questions from our audience, so I'll go through them quickly, and I also have a couple of questions myself. I'm not directing these at anyone specifically, so, our panelists, just jump in as you can. From Jim Gray: he says many researchers consider a wide range of data to understand a phenomenon, be it quantitative or qualitative, and then combine sources and formats in some way before synthesizing a conclusion.
Does the panel have a particular approach to what counts as data, or how to combine various types? Yeah, I'll take a stab at this one. In terms of the methodology of this report, our data was a mix of qualitative, the 24 interviews we mentioned, but also a lot of desk research. We started with a wide collection of emerging tech projects pulled from the Civic Tech Field Guide, for example, where we already had 100 or so emerging tech projects, and then our interviewees and the in-depth interviews, the qualitative work, really fed back more quantitative data and gave us yet more examples. And we've talked about the resources, but we're sharing all those examples with you, so you can see what kind of work we're basing these recommendations on. Yeah, and I would also add to that, stepping back a little bit from our own report: the question of what counts as data is a really fundamental one. Our "data is messy" section really digs into that further. And I really recommend the recent work on data feminism by Catherine D'Ignazio and Lauren Klein, because they spend a lot of time talking and writing and thinking about what counts as data, in context. So, you know, lived experience is a type of data, which is extremely important and can fundamentally reshape the way that we think about emerging technologies and what they're useful for, but often it's not taken into account in technology product development life cycles. So there's a lot to say about that, but I'll put the link to Data Feminism in the chat also. Yeah, just to add to that: I think what's interesting about what Mimi shares in the report is the question of what happens when you do start to turn everything into data and what's missed, which is a continual question, and how that benefits our lives and controls our lives through the commodification of data.
So there's just a whole lot to work with as far as figuring out how best to move forward with data in the future. Thank you. I can see some of the questions coming in are about universities, as we are a university network, so I'm going to read one here from Matt McVeigh: what roles could universities and academia play in this work? Yeah, I'll start. I just started working with Cornell Tech on their public interest technology program, and for me the whole public interest technology movement is exciting, because I've spent a lot of years in technology for social good, which is great, but it's often applying whatever technology comes down the pipe from the companies and trying to use it for social good. Sometimes that's a win, and sometimes, as the report shows, it's not a win, depending on how the data is being collected, for example. But with public interest tech in universities, I see the next generations of technologists becoming much more informed as they actually build and shape the technology itself. So instead of just using whatever comes down the pipe from Silicon Valley, we can begin to have public-interest- and ethics-informed technologists building this from the start, aware of the issues that we talk about in this report. Thank you. We have another question here from Pablo Aguilera. He asks: what did you identify as the greatest barriers for underserved populations to participate in tech? Maybe Diana can help us with this one. Sure. I'll speak specifically to the work that I've done in Detroit, because I don't think there's a blanket issue within underserved communities, or a blanket solution.
So here in Detroit, of course, structural racism is a huge barrier, but then there's how it plays out: the redlining of neighborhoods through data, but also dark fiber within the city, leaving a lot of the city without internet, and then how credit and the economy play into people's access. There were so many moments when folks did not even have access to the internet within their neighborhoods because of the credit scores within their neighborhoods, and the foreclosure crisis added to and perpetuated that. So, just to give you context for the sort of area I was working in: the greatest barrier was folks feeling as if they deserved, or had the ability, to learn technology. And, in light of the previous question about the role of universities, I think some of it is taking it out of the university and putting it within the community, because there was so much potential within the groups I worked with, but they didn't actually feel comfortable in tech settings. So we had to remix the whole idea of a hackathon into a DiscoTech, a community-engaged space. We had to do quite a few things in order to present the tech as it would be presented any other way on the streets, with things folks could relate to. So the very first thing was essentially connecting people to why tech was relevant to them. I'm going to say the greatest barriers are education access and, honestly, the self-doubt within folks that has been perpetuated through structural racism. Yeah, just to add on to the excellent things Diana said: access is definitely an issue for the rest of the world, too, but one of the things we've noticed when we look outside of North America is that the majority of the world is not connected to the internet.
You know, using data that's already out there to decide what those communities need, while they don't exist online, becomes extremely problematic. There's also this idea that underserved communities only need commercial content, nothing beyond that. So there are a lot of issues with access and literacy, but then specifically with content and language barriers as well. We could go on and on. Yeah, that's definitely a rabbit hole we could go down. I'm going to shift a little to a question from Dylan Halburns. He asks: what do you think might be ways to incentivize tech companies to work with the emtech principles you outlined? And is there anything, akin to the renewable energy carbon tax, to jump-start more responsible and inclusive data practices? I'll start the ball rolling: regulation is an obvious one. I currently live in Berlin and have noticed how much more Europe regulates on the digital front, a little more in sync with where the tech is moving. The other one, short of regulation, is community pushback. We've seen in Toronto, with the stop-Sidewalk campaign, that communities can shut down projects that they see as not handling their data in a good way. There's also an anti-Ring campaign in the US against having Amazon cameras on everybody's door. So there's top-down regulation, but also bottom-up organizing. Perfect, thank you. I know there are some comments in the chat. I'll go ahead and ask another question, from Sonnenberg; sorry if I mispronounce that. Do you have any educational programs or curricula to train future professionals in this area? As this is something that Pitt UN and our universities are working on, did you find something like that you could share with us? We didn't gather that specific resource in this project.
But in the More Than Code report, there is specifically a list of educational programs and resources, and we link to that from within Pathways Through the Portal. Thank you. And, shamelessly adding the work from Pitt UN: we are also working on building community-based curricula within our member universities, so please stay tuned for that; we will share it. Yeah, sorry, also part of the report, which I believe is in there, is my Teaching Community Technology Handbook, a handbook specifically dedicated to folks within technology who want to teach in the community. It teaches you all about educational theories and how they fit into the current work of community organizing, including popular education and backwards design, all this stuff, and it gives you templates for actually writing your own curriculum. Sometimes you all are probably the experts as to what the tech is, but might just need some support in figuring out how to teach in an equitable way. Thank you. One last question from the audience, from Richard Connolly, who would love to know if there's a component of the report that addresses or touches on implications for educational technologies, especially those being used in K-12 contexts. I think the answer is no, and that is an extremely important area to engage, so we'd love to learn more. If you have a resource like that, you could share it with the hashtag #emtechpathways, which we're using across social media to generate conversation. Yeah, I was just going to say it's an excellent question, and it was beyond the scope of our study, because what we really wanted to focus on was the expectation that there were all these great tools that public interest organizations could start using, public interest organizations being the operative actor. And it's true that teachers and schools are in effect public interest organizations, but it's a whole other sector.
And I wouldn't be surprised to see a lot of the same concerns and patterns that we found, about hype, about reinforcing discrimination, and about surveillance capitalism, basically, cropping up in the educational field. But what we were imagining at the end of this was: for people who want to go into the field of developing this kind of technology, what are some important guideposts; and then also, for people who work, say, at a nonprofit organization focused on some kind of public good, what are examples where we see meaningful uses of emtech or tools that we think are positive? You know, one very big gap in this field right now is that there is not a really trusted third-party evaluator of the technologies on offer. For example, there may be a great tool that you can use to simplify transcription of interviews, but is that tool really secure? Is it really keeping your data private, et cetera? That's a gap in the field that needs to be filled. Yeah, sorry. So, last question, from a viewer in England: what would the panel say is the role of artists and creative folks in helping advance the work of developing equitable, beneficial emtech? You've already shared one great story. I'm very excited about this question, and thank you for bringing the role of artists into the conversation, because they often get overlooked. I believe the thing that art does that nothing else can do as well is that it allows us to investigate and present other perspectives without the pull of politics or the pull of relationships. These objects can just hold so much and then trigger things within us to think completely differently, which I think is very, very powerful. And when you put technology in the hands of artists, they start to discover things that we might not be looking for. So, I was talking to Stephanie Dinkins.
And she was really interesting, because she's a self-taught machine learning and artificial intelligence programmer. Her whole thing was: when she saw a robot that was prototyped as a Black woman, she wanted to talk to it, to understand why a Black woman, you know? And as she was talking to it, she realized: this woman is not Black. And then she started thinking about the information that's required to convey Blackness within technology, how colloquial that is, how connected to culture that is, and how it's different, you know, in Japan versus where you're at. So she got really into data around identity, and I don't think that emerged any other place outside of her work, which I think is super fascinating. And then there's also Mimi, who is looking at data sets. Some of her most-known work observes that there are simply data sets missing, of people who, if the data is missing, do they even exist, you know? So she created a whole set of files of missing data sets, physical evidence of the sort of blindness that we've had. Art does a really special thing: it investigates in a way that I don't think any other medium can. That's a long way of saying I think artists have the ability to push the boundaries of this emtech and then take it someplace further, and so it was really important for us to include artists as part of the cohort that we spoke with. Yeah. Thank you for asking. Thank you, Diana. I want to close out this conversation with a question addressed to all of you, and it's a little on the personal side. Most policymakers have serious constraints when they're trying to implement new technologies; they have limited resources. What recommendation would you personally think should be prioritized, or which one spoke the most to you, or came as a surprise when you went through the research? So, if I could jump in first.
Let me just say, slow down the, the, the amount of hype and expectation about the magical power of technology has it's everywhere. And so the very first thing is do not rush into this, the mistakes that have been made to date are particularly because of hype, powered by VC money, and in some cases foundation program officers who also wanted to show something off the shiny bright new thing is often not what you think it is. So I would say number one slowed out. Thank you. Matt, what would be the one that will spoke to you. That's related to me because snoozy over sexy is definitely something I think applies to the public sector and policymakers in general. I know a lot of people inside government that would much rather have like cell phone data coverage in their office, more than some crazy new AI feature, or cloud collaborative document writing, instead of some crazy new VR feature, right. So sometimes that means working with the IT departments that people try to do it and runs around. But I think a lot if we're thinking this the net public good, I think some of the not bleeding edge technology has the most potential. Thank you. What about you, Brian. I believe people not technology we've seen, you know, over and over again, governments rushing to, to employ a specific technology, you know, lock people out. I mean, here where I'm sitting out of Kenya like the government has collected 30 million people's biometric data without defining how they're going to use it, just because, you know, some private company has pushed in that direction for instance. So, believe people not technology and invest in people not technology. That sounds perfect Sasha. What will be yours. Two is just one. But I think that the scandal with team at Google last week really brings home for me the importance of stronger regulatory oversight to develop accountability and I think that that needs to happen at multiple levels. 
So, you know, municipalities have been passing bills, for example, to block the use of face surveillance by law enforcement. We have a federal bill around FRT (facial recognition technology) and biometrics that may have some chance of bipartisan support and of getting through with the new administration, but we're going to need a lot more than that. So bringing back the Office of Technology Assessment is a proposal that's been floated, and that would be a key mechanism to help us develop stronger leadership and oversight on the regulatory side. We're going to need that as we go forward with these tools. Thank you. Last but not least, Diana: which one spoke to you most strongly? You know, as far as this question goes to policymakers, I would say: quit passing the torch. This is all on you; we need these regulations. It's very clear how important a role policy plays in the future of technology, after doing this report and also after reading a lot about the role data plays within our economy. So I say: if you don't have the resources, educate yourself so that you can talk to your colleagues about doing this. One of the biggest gaps I found is that people just don't know what this stuff is. Policymakers aren't techies, and vice versa, and until we have that sort of beautiful hybrid of a person in power, we're going to be sort of stuck. So I say: don't pass the torch. We need regulations, and we need oversight. Well, with that being the last thing to say, I am going to close us off. Thank you all so much, and thank you to our panelists, not just for the work that you've done but also for taking an hour of your time to talk about this amazing process. And I urge everyone watching, whenever you see this later, to check out this report.
Again, you can check it out at emtechpathways.org, along with all of the links we've been sharing through the chat. So again, thank you all, and have a great rest of the year.