Hi, and good afternoon, good morning, good evening, everybody, from whichever part of the world you're joining us in. Thank you so much for joining us today for our Berkman Klein event on foresight and decolonial humanitarian tech ethics. It is fantastic to have you here. The question I wanted to pose to everybody, and why we're here today, is really to interrogate: how do we not lock people into future harm, future indebtedness, or future inequity?

Before we start, I want to acknowledge that I am joining this event from the ancestral Lenape homelands, and I recognize the long-standing significance of these lands for Lenape nations past and present, as well as for future generations. I want to also acknowledge the world that will exist beyond our lifetimes, and the people, animals, and nature that will thrive in it. I also recognize that the very act of holding events online, and access to technology itself, is denied to so many people around the world, and that there are many people who find themselves on the receiving end of these technologies without getting a say in how they are designed.

So thank you for joining us today as we try to interrogate a more just and equitable digital future for us all. My name is Aarathi Krishnan. I'm a fellow at the Berkman Klein Center, and the research I've been working on is focused very closely on this topic. I've been in the humanitarian sector for almost twenty years, and I've been working on humanitarian digital governance through my work with the Berkman Klein Center. On this topic, I've often argued that the humanitarian aid system perpetuates hierarchical, patriarchal, hegemonic views of what development and progress look like, ignoring other worldviews and, often, the underlying systemic and structural pillars of inequity and bias.
As the humanitarian aid system increasingly intersects with technology systems that are often developed in the context of Western capitalism, and in small pockets of privileged power, I have not been the only one who has raised concerns about the implications of the collision of these two systems, and importantly, what that means for those who are minoritized in the global south. How, then, do we design digital governance systems that speak to these complex, intertwined issues? Instead of merely looking at digital governance in terms of control, could we design different approaches to liberate ourselves, to liberate our digital futures, so that they are a space of safety and of humanity for those we are meant to support? Are there approaches through which we can design new forms of digital humanism?

I'm really thrilled to be able to explore two elements of that with our panelists and with all of you here today: how we integrate both foresight, the consideration of future impacts and future harms, and decoloniality into humanitarian digital governance systems. And I am thrilled to introduce our panelists for today, who will help us think through some of this. Anasuya, Sabelo, and Andrew are joining us today; I'm going to invite them to turn their cameras on so I can do their introductions.

Anasuya Sengupta is the co-director of Whose Knowledge?, an organization that works on reimagining the internet to be for all people. She has led initiatives in India and the USA, across the global south, and internationally for over twenty years to amplify marginalized voices in both virtual and physical worlds. Thank you, Anasuya, for joining us. Sabelo Mhlambi is the founder of Bantucracy, a public interest organization that focuses on Ubuntu ethics and technology.
He's a Technology and Human Rights Fellow at the Carr Center for Human Rights Policy and a fellow at the Berkman Klein Center as well. Sabelo's work is at the intersection of human rights, ethics, culture, and technology, and emphasizes global south perspectives in AI policy. Welcome, Sabelo. And finally, we're joined by Andrew Zolli, who oversees the sustainable development, humanitarian, and human rights impact portfolio at Planet Labs, which deploys the largest constellation of Earth-observing satellites in history; together, Planet's satellites image the entire surface of the Earth daily. Andrew also chairs Planet's internal AI and data ethics program, and he serves on the global Board of Directors of Human Rights Watch. Welcome, everybody, and thank you so much for joining us. We have just an hour today and very deep questions to interrogate, but I might open up really quickly to Anasuya, Andrew, and Sabelo for any opening remarks you want to make before I dive into the questions. Perhaps, Andrew, we might start with you, since your mic is unmuted.

Well, before we get started, I just want to say that over the course of my entire career I have taken enormous value and inspiration from the work of the Berkman Klein Center. And I have to say it is incredibly exciting that you, Aarathi, are leading this from there; the fact that this conversation is happening is another on a long list of data points where really important conversations are happening. For those of you who are listening, I'll just say a quick word or two of introduction and framing. I oversee the global humanitarian, human rights, and sustainable development applications of a technology which was birthed...
If we think about the tacit, conceptually biased frames of development, we have created unipolar models where we have hyper-developed, and I put this all in enormous air quotes, technical societies at one pole, and then a whole array of communities at the other. A pole, that is. Of course, one of the most important things is that this concept needs to go away; it needs to die. But I come from a place at the very edges of that Western, elite, highly financed world, where we are building technologies that have extraordinary global potential and are producing terabytes of data a day about the world. And I think of data as unrefined social power. If you're producing large amounts of data, you're producing large amounts of tacit social power. The question then is: what are the cultural frames, what are the conceptual frames, what are the respectful relationships, what are the kinds of partnerships and structures in which the full liberation potential of that data can be made fully manifest? It certainly can't be made fully manifest if it's just a bunch of Silicon Valley types at one end making decisions about it.

We are actively working on how we think about creating those kinds of governance structures, both the explicit ones and the new tacit norms around them. I'm sure we'll explore this much more in the conversation to come, but the challenges in doing that are not just about distributing assets; they're also about conceiving of all of the social relationships, the power relationships, the kinds of imbalances that exist, and actively working, to the best of our ability, to overcome them. We're not perfect in that, no more than anyone is, but I'm very excited to hear what people have to say about what we're doing, and to reflect on it with you.

Wonderful. Thank you so much, Andrew.
Anasuya, I might open up to you, if there are any quick introductory comments you want to make, and then I'll go to Sabelo.

Just to say that I'm very conscious of the fact that I'm coming into this conversation with my head and my heart in India, and also in other parts of the global south. Whose Knowledge? is a feminist collective, co-led with accompaniers from Brazil, from Uruguay, from Ghana, from different parts of the world, and all of us at this moment are very much on the fragile end of our own emotional beings, but even more so bearing witness to, to put it as one might, one of the greatest humanitarian crises of our times, one that, as with many humanitarian crises, is humanly engineered in ways that make it worse. So I will no doubt bring very embodied experiences around this back as we talk. But it is important, I think, for me to say this for us at Whose Knowledge?: I love that Andrew put air quotes around development. My background is in development studies, and throughout my career one of the things I've done is put scare quotes around development, recognizing that what we want for the world is justice. And justice for us means centering those who have been marginalized by structures of power and privilege, throughout history and ongoing. And they are, as you said, Aarathi, the minoritized majority of the world. I'll stop there and come back as we talk more.

Yes, and I also just want to acknowledge that, particularly with everything happening in India right now, this is a difficult time for a lot of our Indian brothers and sisters in the diaspora, and for yourself, and we're very grateful for the graciousness of your time today. Sabelo, thank you so much for joining; over to you, if there are any opening comments you want to make before I jump into the questions.

All right. It's such a pleasure to be here with you all.
It's wonderful to see some familiar faces and to be back at my intellectual home at the Berkman Klein Center. I also wanted to give a special shout-out to Stanford's Digital Civil Society Lab, where I'm also doing a fellowship at the moment; I'm thankful to the community there as well.

I just want to share some initial thoughts, and of course we can delve into this much, much more. When speaking about human rights, I like to ask myself: has the human part been answered? What makes us human, and what are the social, economic, and political structures that are necessary to make us feel human, to make us feel protected? If we look at the world as it is today, with the ongoing systematic injustices, racial injustice, the neocolonialism that Kwame Nkrumah wrote about back in 1965: has European, or maybe I should say Western, humanism been able to recognize the humanity of the non-European? We're living decades after these initial frameworks of human rights. Have we seen the benefit? Have we seen the preservation of the dignity of people, especially those who have been minoritized even though they make up the majority of the world? I think the answer is pretty self-explanatory: that has not quite been the case. So maybe we need to start thinking about what extensions or additions to the human rights frameworks can guarantee those things, can protect those who have been so left out, those whose humanity is still not recognized. What good are human rights frameworks if they prevent you from recognizing the humanity of others? I think that demonstrates a fundamental flaw in how we have come to understand these human rights frameworks, although there is a place for them.
I'm not saying we should throw the baby out with the bathwater, but I think there's a fundamental flaw there. And of course, if we look at history, we can see that Europeans have always gathered together to make doctrines and frameworks that are veiled as human rights, as always progressive, only to push forward their own economic interests. I could go on, but it's even troubling that some of the human rights organizations on the African continent have been tied to US expansion, whether through sponsorship from the CIA or other parties; even labor unions in the United States, like the AFL-CIO, worked with the CIA to destabilize labor movements across the global south. So I think it's an area that needs to be better explored. I'm glad to be here to think more about those issues, and to share an African conception, and I say African broadly, because it's such a diverse continent, trying to bring in different views shaped by the anti-colonial influences of the African continent and the current decolonial scholars who are talking about these issues. So thank you all; it's just a pleasure to be with you.

Thank you, Sabelo. That comment you just made, which has been echoed by Andrew and Anasuya: do we recognize the humanity of others in the technology futures we create? Where I sit, I don't see that we necessarily do. We use terms like beneficiaries, as if people are passively waiting for a handout of whatever harm, or ideas of progress, we want to hand out to them, as if we are benevolent, so to speak. Anasuya, I might want to start with you. You and I had a conversation a while ago where we started to edge around decolonizing, decolonization, decoloniality, and the differences between them.
You do a lot of work in this thinking, as well as in feminist approaches to it. What does decolonizing technology, and particularly, if you can speak to it, decolonizing humanitarian technology, mean and look like to you? How do you understand that?

Thanks, Aarathi. I'm trying to answer a complex question that takes a lifetime to answer, I think. For us at Whose Knowledge?, and in my embodied experience in the different worlds that I have been in, I think about decolonization firstly as really the true and deep recognition of the fact that there have been historical structures of power and privilege that have governed not just the resources that have been extracted from different regions and territories, but the ways that people think and act and are seen and perceived. As Sabelo said, colonization has been a process in which our very humanity has been questioned. And let's be clear: colonization was essentially a process from the global north, from Europe at the time, in which race was constructed, and thereby racism began, because race served as a rationale for the exploitation and extraction of resources from the global south: from Africa, from Asia, from Latin America and the Caribbean islands and the Pacific islands. It's important to recognize that that structure then leads to capitalism, and that capitalism leads to digital capitalism, which not only embodies and reflects the same structural inequities of history but exacerbates them in some ways, particularly because we imagine digital technologies to be more emancipatory and liberating. We assume an emancipation and a liberation because these technologies are seen as global, because they are seen as quick, because they are seen as having an influence different from other global infrastructures. Which is not true.
There is a speed and a reach that is different, but the same things were said about the telegraph, or the telephone, or the television, that we are now saying about the internet. So for us, in terms of decolonization and a feminist, decolonized recognition of ourselves and our lives and our futures, we have to start with a critical understanding of power: who holds it, who doesn't hold it, who is seen, who is not seen, who is deliberately unseen, invisibilized, undermined, or exploited. And that for us is the ongoing process of a dynamic uncovering of power and of colonization. It is dynamic because our positionalities are not static. I may be a brown woman who is often the only brown woman in a tech conversation. But at the same time, if I'm in a conversation with Indians, I am so-called upper caste; again, I will put scare quotes around that. I am Savarna. I have caste privilege. And that matters in a context in which the caste system continues to be, I think, the deepest social structure of oppression, one that its own inhabitants, its own oppressors, have refused to identify and reflect upon. So power is positional, it is dynamic, and unpacking that power and unpacking its historicity is, I think, deeply critical to the ways in which we understand feminist, decolonized presents, as well as possible futures.

Thank you, Anasuya. I recall so many conversations where we talk about equity and inclusion, and we don't talk enough about class, and we particularly don't talk enough about caste and tribe and how they play out in our positionality of power, in the decisions that we make, and in how we think about the world. Andrew, I'm going to come to you to maybe respond to that, because I know this is a topic you talk a lot about: how technology infuses capitalist modes of power. So I just want to come to you first, and then Sabelo as well.
Well, I think Anasuya used the most important term, which is, I think, the starting point of all of this: power, power and position. And I think about the reified and outdated concept we referenced with air quotes and scare quotes at the beginning of this discussion, the premise that we have this developmental framework, and, I'm going to speak candidly, if imprecisely, so please presume good intent: we position, in this unipolar world, a world in which at one end we have poor, brown, place-based, traditional (just assume the scare quotes run down the list), agro-economical societies, and at the other end we have developed, technical, urban, mobile, white structures. So then what we do is assume that there's one dimension that moves everyone along, that there's an intrinsic burden of responsibility to move people from here to there, if you're standing here. And among the countless reasons why this is, you know, a pile of hot garbage as a concept: not only does it do all this othering from the point of view of the developed, it creates and reifies power imbalances, often in the name of development. I have to control you to move you, which is absurd. And often the technologies that we're talking about are used as the instruments of control to move people, purportedly, across some spectrum, without recognizing the enormous plurality of landing destinations and waypoints along the way, and that all of these societies, including the ones over here, live in a period and a place of greater dynamic disequilibrium, healthy disequilibrium, but also, you know, with the right to self-determination. And so that's one of the many problems here.
I'll just point out that there is also a problem for the people who live over here on the developed side, which is that if you live within two kilometers of a Walmart, and you have two cars and a big house, and you've checked all the boxes, apparently history is done with you. There's no imaginary for where these people need to go, and of course none of this is sustainable. The whole system, the whole edifice, needs to change. So there are a few places I think we need to go that represent potential points where we can illuminate the soft underbelly of these ideas and eviscerate them. One of them is that we have to fundamentally attack the idea that these technologies are neutral. Their presumed neutrality is an instrument that hides the agendas that define them. I strongly disbelieve the idea that any technology is neutral. Technologies are products of agendas; they create affordances around who can use them, who can access them, what's made easy and what's not made easy. All of those represent crystallized ethical principles and power relationships, and we need to go at them. Then I think there's a process, and I say this as someone who lives over here and is trying to think about how to take some of these very powerful tools and rethink how we all use them, access them, invent them, redesign them, rethink them. One of the principles that I think is really critical, one of the places where we start, is with this sense of subsidiarity. It's an old idea from Catholic social teaching: put the tools as close to the context of use as possible, and no further away. So we need to stop with the implicit idea of distributing these tools; we need to put the manufacture of them in the right places. A second principle, and this is one where we've really focused our work, is around the creation of digital public goods.
That is to say, the creation of tools that can belong to wide publics. We spend a lot of our time building them, but then also repositioning who owns them, so that they don't come from a Silicon Valley firm but live in structures that have better, more inclusive governance, and I don't mean inclusive in the Western regime; I mean inclusive of all of the publics that might have an interest in them. For instance, we have a huge program where we're monitoring deforestation around the tropics, but it's not data that comes from us; it's passed through institutions that have different governance regimes. Then I think a third part is to really carefully and aggressively attend to the ecosystems of participation, the architectures of participation, and design them in a different way. And then, fundamentally, the other thing we have to do is build new networks of trust. All of the othering, this idea of some continuity, these networks of distribution, hide the fact that we're not actually talking to each other all that often. Often the organizations that want to do good in the world in a humanitarian context land the UFO in the front yard, walk out, and say, we have the answer. And then, as soon as they've decided that the context is done, they walk back into the UFO and fly off, and those systems become extractive. So we have to weave networks of mutuality and solidarity and trust, and that is a collective exercise. There's a lot of stuff out there, but those are the things we're thinking about, and how we crystallize them into ethical principles that actually drive and inform our work in a significant way; we're in the gritty work of doing that right now.

Thank you so much, Andrew. We are driven by philosophies and ideologies of justice, equity, and liberation.
And the grittiness of the grappling is: how do we convert that philosophy, that ideology, and that good intent into what actually translates and will shift systems? What are the policy interventions? How do we actually shift different forms of governance systems? I want to come back to the point you made about inclusive governance, which I think is interesting; in many of the communities I come from, governance is understood very differently, and the weight put into community is a lot more than the weight put into control. But I want to shift to Sabelo here, because I'd love to hear Sabelo's reflections on what Anasuya and Andrew were saying. Sabelo, you've done a lot of work in bringing in different types of thinking, ideologies, and philosophies, particularly the Ubuntu approach, to how we think about ethics, whose ethics, for what purpose, and why, particularly in the human rights context. So I would love to hand over to you to hear some of your thoughts and reflections here.

Thank you so much. I wanted to carry on from Andrew's reflections, in particular the concept of the developed world seeing itself in a sort of final state of development, or even achievement, such that it doesn't need other human rights forms to extend its thinking, to extend how it recognizes other humans. I think of a quote by a South African poet who also taught at UCLA. He writes that technological development does not necessarily make us more ethical or more morally inclined, or neutral in how we approach the world. And the institutions that we have here in the developed world still exclude racialized communities. It makes you wonder: what is developed about all of this? Is science, is technology, the mark of human achievement?
And I think the danger with the phrase the developed world is that it leads to this paternalism toward the global south: that it needs to go show them how to be human, because they've never had human rights systems in their parts of the world, when in fact we know that some of the earliest fragments of human rights were developed on the African continent, and in other parts of the world as well. It's a troubling force of Western exceptionalism, this idea that we can parent others; from manifest destiny onward, the history has just always been one of paternalism. Now, this is the problem I encountered a few years ago, when I left the tech industry to think more about why we've been building the tools we're building, what problems we are solving, who's benefiting from the solutions, and who's framing the problems that have to be solved in the first place. This of course took me to the Berkman Klein Center, and while I was there I began to wonder: we're talking about ethics and AI, but it almost seems as if we all assume that we're talking about Western ethical systems, Western philosophy, as the foundation. And I saw a sort of discrepancy there, in that some of the ideas developed by these early European and US philosophers were the ones used to justify slavery, justify colonization, justify imperialism. And now we're going to turn to those ideas to try to liberate us from the effects of those very philosophies? Can we not find other systems that are designed around the aims we're aiming for? Can we find other ethical systems that are designed around a more inclusive definition of what it means to be a human being?
And so this led me naturally to the Ubuntu framework, and maybe I was biased, because I grew up with the Ubuntu framework; I'm from the Nguni people, where ubuntu means to be human, so it's a philosophy I was always familiar with. I began to explore what a more widened approach to personhood would actually mean for the development of better principles and better practices around protecting people when it comes to technology. And I think the difference is quite vast. One of the major criticisms I've always had of some of the ethical frameworks we use for technology, and even of the conversations around human rights in Western contexts, is that they fail to talk about reparations, to talk about restoration, restorative justice. How do you just overlook the past and then say, well, now let's do better, when you have the means to address the past? To fail to include restoration or reparations in these major frameworks, to me, means they're dead on arrival. You cannot say we'll do better without admitting guilt, without trying to fix what happened, without trying to address what happened. And we find that when we go to other parts of the world, restoration is at the center of the ethical systems: restoring others, reparations, even giving restoration when you're not the oppressor, as we've seen famously in the case of South Africa, although it's still an ongoing process, even extending restoration to your own oppressor. And I think that absence actually undermines some of the solutions we can propose to try to address the negative effects of technology, or even the systems around creating technology. So my small contribution, the effort I've been trying to make, is to work with others to suggest that perhaps we can find better ways to ensure the protection we're talking about.
If we're able to explore what other conceptions of human rights, of being human, contain that we can use, especially here in the so-called developed world, which has not developed its own ethical maturity, then what can we do, even in these parts of the world, to better protect those we've racialized within the United States and within Europe, and those who are still excluded even within the global south? I think once we do that, we can start to have more systemic change, not only within the tech companies, the tech culture, the venture funding, but perhaps even broader societal change that is grounded in truly acknowledging and truly accepting the human dignity of others.

My mind is racing. You're right that we don't talk about reparations, and in the humanitarian system, particularly over the last year, there has been so much conversation around the reform of the humanitarian system. But the thing that I've always worked through, or tried to understand and unpack, is that we still assume the centrality of our own role in whatever reimagination we mean, and none of our efforts really think about how we do any kind of reparations, any kind of giving back, because giving back assumes that there is control we must let go of. I'm very conscious of time; we could all go on for much, much longer, and we want to give some time to our audience, and there are a few questions coming in already. But I want to come back to questions around harm and harm absorption, and I want to talk about the incentives for different types of governance.
This is my translation of: okay, this is what we're thinking about, but from an institutional perspective, how do we do this? Anasuya, I want to come to you here, because certainly in our world, in the humanitarian development space, I can definitely say we don't necessarily do an analysis of the systems of harm, and the current harms, that might result from any interventions we design, let alone technology interventions. We might do it from a policy perspective or a philosophical perspective, but when it comes down to who is deploying and designing that technology, there's a gap. So I want to ask you, and I'm opening this up to Andrew and Sabelo as well: what types of harms, and here I want to focus on future harms, must be designed out of these systems? Because often we are firefighting the problem in front of us today, designing for how we solve the issue we see today, and not necessarily thinking about what could happen as a result. And the second part of this question, which I want to open up to both Andrew and Sabelo: how do we incentivize organizations, and actually, Andrew, because of your work designing an ethics program internally for Planet, I'd be curious to hear your point of view, how do we incentivize institutions to absorb more of that harm, rather than not thinking about anything that goes beyond the institution's normal legal impunity, the attitude of let's not worry about what's going to happen to the end user here? That's generally what happens. So how do we incentivize a greater absorption of the responsibility for harm by institutions, rather than pushing it off to communities, end users, and minoritized folks? But to start with: what are the current and future harms that must be designed out of these systems? Anasuya, I'd love to hear your thoughts on that.
Thanks. I just want to start by acknowledging what Sabelo said, in the spirit of self and collective reflection. I would just like all of our 144 participants, how wonderful is that, to reflect on this question: when you hear the word Ubuntu, do you think of a free and open source software, or do you think of a South African philosophy of humanity? Just that, I think, will give us pause and reflection on some of the critical questions you're asking us today, in terms of the epistemics, in terms of who we center and who we decenter. And so, when you ask the question around harms, I have to start in a different place to answer it, because a couple of things happen to us when we are in spaces of crisis, as well as in spaces of technology. One is that, as Andrew mentioned earlier, there's an assumption of neutrality: an assumption of neutrality in the humanitarian sector and an assumption of neutrality around tech, and how problematic that notion of neutrality is in both cases. But the real problem, I think, is in starting with them as potential solutions to a problem we have not articulated. So let me start by asking: what is the vision of the world that we seek? What are the just, equitable, decolonized, feminist futures that we seek? And then reverse-engineer to say, how do we get there? The answer to that might be very different than if I start from "what are the harms?"
And the reason for that, I think, comes down to two or three things. The first is this: if we were to start by seeking a just, equitable, feminist, decolonized future for the world, one based around well-being, and based around not just the centering of humanity but the centering of the earth, a biocentric model, because that too, I think, is a deep, deep issue of the humanitarian sector that has not yet been questioned or challenged adequately, then it may require the decentering, the stepping back, of the very roles and responsibilities that people in the humanitarian sector have taken upon themselves. It is frightening, right? It is deeply frightening; it undermines all the systems and processes that have been built. And the question then to ask, whether in the humanitarian sector or in the tech sector, is: are we building systems to justify our own existence and our own time in them, to justify our own living from them, rather than seeking the outcome that we want for the world we want? The second piece is that, with the decentering, what can you do to think about other forms of accountability and responsibility, exactly as Sabelo said, which are also deeply discomforting but transformational, like reparation? For instance, again to take the example of history: a feminist economist recently did an analysis showing that over the 250 years of Britain's colonization of India, 45 trillion dollars moved from India to Britain. That means Britain didn't develop India, which is the classic, classic trope; if anything, India developed Britain. And so what we might call a possible Afrofuture is indeed a possible Afro-past, right, based on the histories of structural colonization and capitalism.
So, what happens when you decenter yourself, when you think about your own reasons for being and what you do, and what happens when you rethink and redesign those reasons for being as focused on reparation and justice, rather than on existing for the sake of existing? Right. And I'll stop there.

Anasuya, that analogy and comment are incredibly powerful for us, because you are right. That's amazing. Okay, Andrew, I could just ask you to keep your response super quick, and then I'll go to Sabelo as well: how do we incentivize organizations, and how are you thinking about this as you design an ethics program for a technology company in Silicon Valley working on humanitarian issues?

Yeah, I'm going to speak quickly and try to cram it in. First of all, what we hope all 140-something of us have discovered is that we need about four hours for this discussion. The tributaries that Anasuya and Sabelo have opened are just amazing. I want to say one thing about them, and then I'm going to pivot to answer the question you asked. We're building technologies that are built by people who have been steeped in a set of tacit assumptions that are the foundational structures of the West: assumptions about the relationship between the individual and the whole; about the centrality of consumption, of symbolic and physical material, to the creation of one's being; a relationship between humanity and the natural world that is largely extractive; a focus on interrelationships between beings that is largely transactional; and a social order that is predicated on dominance.
So the technologies we're talking about here come from people who have enormous blind spots around the relationship of the individual to the whole, the role of consumption, the relationship between humanity and the natural world, and how things happen, which is largely through this idea of transactions as opposed to relationships. And in an environment like that, where all those relationships are provisional, you must have dominance in order to have longevity, because transactions are short and the arcs of dominance are long. Many of the technologies that we create, we bring to communities in some spirit of help, and this list of psychological phenomena and tacit biases is how you get people who have built dating applications for their pets and would like to repurpose them for humanitarian applications: they see, "Well, I built this thing and it does transactions well, so now I'm going to take it over here and do transactions well, because that's what the world is made of." You asked this question about how you reverse the story. I want to say that this is one of the great challenges here. We have an incredibly robust, really deep ethics practice, and we are genuinely struggling with all of these ethical dilemmas in my organization. And because we feel the dilemmas, we are actually pretty quiet about the work itself, some of it for practical reasons, and some of it because we don't want to do virtue signaling, and because we don't want to draw attention to the inevitable mistakes, the worry that we'll say one thing and then end up with something that's hard to explain, and all of that swirling around. So we genuinely don't talk about this enough, and I'm happy to share a little bit of it.
Really, this is among the first times I've ever talked about this work, but it's a huge part of my daily work. So, and here I want to reflect on what Anasuya said a moment ago, we root our ethical system in what we think of as the foundational principle of planetary ethics, which is a universal obligation, we recognize it and think it's universal, to protect the capacity for life to flourish on the earth, now and for future generations. From that principle we derive a series of actions and a series of subsidiary principles, and then we build processes to support the application of those principles to practical decisions, like: should we give these tools, which are very powerful, to already powerful people? For instance, there's a lot of fetishization in the technology community about making things open. But if you make things blindly open in a society in which you have a dominant group with lots of social power, and a much larger group of people with relatively limited social power, and you just throw open the gates, what you do is take the already positionally advantaged and dramatically accelerate the value they can extract from their assets; you might marginally improve the other group, but you've increased the net inequality between them. So we think about these issues of, say, disproportionate empowerment and the reduction of harm. And the last thing I want to say, because it was in your question, is about the reduction of harm. There are two things to say. On the one hand, we want to avoid the obvious ways in which these very powerful tools might be used to create harm by ensuring, (a), that we keep them out of the hands of actors where we are worried about the use case or about the position of that actor.
And (b), we ensure that there's an ecosystem, so that journalists, human rights organizations, and all the other actors that might act as a countervailing force also have access to these tools and the capacity to use them. The third thing is to have some humility about our ability to assess the capability for harm, which is to say that we don't know all the harms that might be created, and also that our assumptions about harm might lead us, as a small group of Silicon Valley types, to make decisions about other people and about what they might or might not do, which itself reifies those power imbalances. So we have to be really careful about what we decide and what we don't. The last thing I'll say is that these issues are not just ones of policy; they're ones of product, because you have to engineer these affordances into the actual technologies themselves so that, among other things, you can make decisions about them. Many of the technologies that we put into the humanitarian sector don't have the structures of governance built into them that would allow you to make thoughtful and nuanced decisions about them. It's just sort of, "Well, now it's up to you; we've shipped it, and it's someone else's problem to make decisions." I'll stop there, but it's very hard to talk about global ethics and all of these principles in a couple of minutes.

It really is, and I really appreciate your comment about the humility needed even to assess the impacts of harm. I guess that's where, in the research we're looking at, you can't do that as just one homogeneous group of people, and notions of decoloniality, even in that thinking, might help. I'm going to open up to questions from our audience, and we're going to go five minutes over, so we'll finish at 12:05
Eastern time, just to make sure we do the questions some justice. Sabelo, we've got a question for you: "Sabelo highlights the human issue. I would also consider the issues being discussed as bound up with an inherent speciesism," I'm not sure I'm pronouncing that correctly, "an interesting emerging technology that challenges this is human augmentation, whereby the taxonomical, e.g. social and phenomenological, may well blur the human/non-human distinction. How does this change the current anthropocentric starting point of a technological ethic?" I think I understand that question, but Sabelo, do you want me to repeat it?

I think I understand it too, and I apologize in advance if I miss some of the details, but it seems to hint that maybe we have these inherent biological tendencies to create hierarchy, which then leads to oppressive systems. If that's the case, I'm not going to argue against it. I would just say that that's why we develop ethical systems: to help us better understand that maybe the natural stance is "might is right," to use our power naturally, but we have to overcome that and find the best system of organization, one that ensures we all have a fair chance at life and adequate protections to live meaningful and valuable lives.

Thank you, Sabelo. Andrew and Anasuya, did either of you want to take that as well?

I'll add that the principle I just described is to have solidarity with the community of life. The first principle of planetary ethics is not to ensure the flourishing of human life; it is to ensure the flourishing of the capacity of the planet to endure and to support the flourishing of all life, and the absence of the anthropocentric distinction there is deeply intentional. We use these tools as much on issues of protecting the larger community of life as we do within the human community, within the human family.

Can I add?
Sorry, the one thing I'll add, which might bring this home quite literally, is that in many of the indigenous communities I work with, there is no word for "nature" in their languages, because there is no separation of the human from what the Western world understands as nature, because of the significant interconnectedness. And that interconnectedness, in a sense, is actually at the root, I think, of many of the ways we can reimagine and redesign our current lives as well as our future lives.

Thank you. Next question. I think I'm going to pose this to Anasuya, but I wonder if the rest of you have ideas here too. "Could the speakers mention examples of platforms or technologies that are managing power and privilege better, and that go beyond Western-centric approaches?" I want to say Whose Knowledge? is definitely one of those platforms, but Anasuya, any other examples that you can think of, and maybe a little bit about what Whose Knowledge? does, would be great.

Whose Knowledge? is a global multilingual campaign, though I've started questioning the term "global," so let me just say translocal, with a global connective tissue. It's a campaign that looks to center the knowledges of marginalized communities online. As I said before, we call ourselves the minoritized majority, to remind the world that we are the majority of the world, whether that is as women, as black and brown folks, as indigenous and queer folks, or as all of us from the global south. I think the key elements and design principles of platforms that are trying to be decolonized are these, though I will admit that even as someone who embodies multiple systems of knowledge, I have to watch my own colonized mind, because it is such a slippery slope, it is so internalized.
One is that the design and leadership of the communities we serve, and the communities we come from, are centered in the way we think about how that platform or digital space functions and who it functions for. We are not designing it for a funder; we are not designing it for a global north audience; we are thinking about our communities. That is where multilinguality, for instance, is critical, even though it's still such a difficult thing for us to achieve, so that's one example. The other two things I will say are part of this as design principles. What we try to avoid is the recency problem, because so much of the digital is about the recent, digital content is so much more easily about the recent, and we can forget history. That is part of what Silicon Valley does: it decontextualizes, it dehistoricizes, right? And so do all systems of oppression: they dehistoricize and decontextualize. So we try to push against our own tendencies toward recency and look to archive histories of different kinds, with different communities. The other thing is to push against the danger of proximity. That, too, is a really key crisis of the humanitarian sector and of the technological sector: we care for those who are closest to us; those who are furthest from us, we couldn't give a damn about, even if we pretend we do. Twitter can ban President Trump, because the optics of banning President Trump in the States are significant, but Facebook will take down posts about Narendra Modi in India; Facebook will hide #ResignModi campaigns. Over the last two weeks, Facebook took down posts on Instagram and on Facebook, including the hashtag #ResignModi. It's possibly the first time that Facebook has actually taken down calls for the resignation of a democratically elected leader anywhere in the world.
Facebook doesn't care about Modi, right, because they don't care about India other than as Facebook's largest market. The optics are too far away for Silicon Valley to care about, unlike the optics of the American public. So part of our really important push is to challenge recency and proximity. And the most important thing that we can do, and that we try to do, is to center the imaginations of our communities, because safety and security is a low bar, and it's terrifying that we still have to struggle for that low bar. I want an internet of joy for my peoples. How and when do I get that? How and when do I get a world of joy for my peoples? If those who are privileged are allowed to have joy, why should not we?

I'm pausing because, yes: joy. How do we design for joy? How do we design for spaces in which we all flourish? If some communities can design for joy, why do we design for people in other spaces merely to survive? And I know, with all of you, we've had lots of conversations about this. There are loads of questions that we can't get to; I'm so sorry, our amazing attendees. I see this as the start of a conversation for all of us. I don't believe, and I don't want, the work I'm doing with Berkman, this approach, this framework, to ever be static; it has to be emergent, because of the complexities we are facing. These conversations inform me and teach me so much. Sabelo, Anasuya, Andrew, you're all dear, dear friends, and I have learned so much from you; you've informed my thinking and all of our collective thinking here today. I'm five minutes over, so I am going to close us out, but I want to close with this point: let us design for joy. As opposed to just thinking about equity and justice, let's think about how joy can be more easily accessible to everybody, not just survival.
Any last thoughts or comments from Sabelo, Anasuya, or Andrew before we say goodbye?

A word of thanks to all three of you. As ever, these interactions have been amazing; you've been amazing teachers for the last hour, so thank you very much.

Thank you all for involving me in this conversation. It's great to be in community with you all and with the audience as well.

And everybody, thank you so much for joining us, for taking an hour out of your Fridays to be with us today. This is the start of many, many more conversations. We often throw around the word "diversity," and globalize it to mean diversity and inclusion; it's not just that. It's the centering, the decentering. Foresight cannot just be a northern hegemonic process; decoloniality has to be infused through it. Governance doesn't sit by itself; all of this influences how we think. The complexities of our time require emergence; they require radical hope and radical joy. Thank you from the bottom of my heart; I am in gratitude for so many of your graceful teachings today. Thank you for joining us, everybody, and please stay in touch and get in contact with all of us if you want to learn more and be part of this journey. Thank you. Thank you.