Hello everyone. Thank you for joining us for our event today, titled Human Rights in the Digital Age. My name is Gordon LaForge. I'm a Senior Policy Analyst in New America's Planetary Politics Initiative, which is thrilled to be hosting this conversation today. The subject we're covering is urgent. An increasingly prominent feature of modern life is the pervasive data collection and surveillance carried out by corporations and states. It feels mundane and familiar, and I think that's in part because it has crept gradually into our lives. I always think that if you were living in 1993 and then suddenly time-traveled forward to 2023, you would be appalled and outraged by just how pervasive it is, and how privacy, autonomy, and dignity are all being undermined by this system of big data collection and surveillance capitalism. Planetary Politics has done research into some of these issues, which we've captured in a landscaping report called Governing the Digital Future. You can check it out on our website. The other impetus for our event today is the release of a new book titled We the Data by Wendy Wong, one of our speakers. Wendy is Professor of Political Science and Principal's Research Chair at the University of British Columbia, Okanagan. We are also joined by Michael Zelenko, the Executive Editor of Rest of World, which is a truly excellent online publication that covers the impacts of technology around the world, especially outside of the Western bubble. If you don't read it, I strongly suggest that you start. And last but certainly not least, jumping on in one moment will be Bulelani Jili, who is a Meta Research PhD Fellow at Harvard University and a prolific researcher and writer on Africa-China relations, cybersecurity and surveillance, as well as internet policy and other subjects. So I want to thank you all again for being here. Here's how we're going to proceed: we'll start by turning it over to Wendy. 
She's going to talk about her book to set up the conversation and introduce the issues in it. We'll then turn to Michael and Bulelani for their thoughts and reactions to what Wendy says. Then we'll proceed with a conversation in which I'll prompt some questions, and at the end we'll turn it over to the audience for theirs. Welcome, Bulelani — you've joined just as we finished our introductions. Throughout the event, I encourage everyone in the audience to send your questions at any time using the function on your screen. We only have an hour for this event today, so let's jump right in. Wendy, please take it away. I want to thank Gordon and Heela Russell for setting up this event, and also Bulelani and Michael — I'm really looking forward to hearing what you have to say, and to our conversation around these really important issues that Gordon was raising. So I'm a political scientist, but in this talk, and in the book, I'm really trying to engage more than just my own discipline. In fact, the book is written with an interdisciplinary and policy-oriented stance. It was recently published by MIT Press, and it has the same name as this talk: We the Data: Human Rights in the Digital Age. In that book, the broad question I'm asking, at the very top level, is why it's important to talk about tech through the lens of human rights, and vice versa. To give you a bit of context, this project actually came about right when the pandemic hit — just a few months before, really. 
And what really enabled me to think this project through was the fact that, like everybody else, my family and I were suddenly thrust into this world of digital interactions, where things we would ordinarily have done face-to-face — work and all sorts of things that would have taken place in person — all of a sudden had to happen on digital platforms. And so it gave me a great reflection point, a live play-by-play of how pervasive digital technologies are, how much our lives have actually changed because of them, and how much of our lives have become data as a result of that pervasiveness. So this is a book about AI and human rights that approaches them through the lens of data about people — data from all of us. In the short time we have today, I'm going to give a very broad overview of the book in anticipation of what I think is going to be a great conversation. I'd like to start at the intersection of Amazon.com, basketball legend Shaquille O'Neal, and the reality TV show Shark Tank to talk about data and human rights — which I know is not exactly where most people start, but it is where I start, because I want to talk about the Ring doorbell. This is actually where these three entities come together. Amazon.com owns the Ring company, which was initially featured on Shark Tank and whose fortunes were turned around by, among other things, Shaq's willing endorsement in a series of commercials. I'm sure many people here today have encountered Ring or other video doorbells with names like Google Nest or Arlo or Blink, another Amazon company. You may own one — about 16% of American households do, and the numbers will have changed since then because of the pandemic sales surge, especially here in North America. 
So the Ring, like other video doorbells, offers people the ability to monitor and record the area around their doors, or wherever they decide to place these devices. They create audio and visual recordings, and there's a two-way mic so people can talk to whoever's standing in front of the Ring. It's really handy, too, because you can link a bunch of Ring products together — they sell lights and other cameras — so you can link a whole security system around your property. What this effectively does is change the terms of who gets to surveil. In the contemporary age, we might be accustomed to — we may even have stopped noticing — things like closed-circuit TV monitoring in public spaces by governments or by private entities, say in a mall. We're used to seeing cameras. But I think Ring products take it to the next level: these are surveillance products on people's private homes that face public spaces. We're talking about streets, where people walk and drive and otherwise go through their daily lives. Voices and images are being captured in Ring snippets, which then creates audio and visual data for analysis. The Ring company has actually gotten into some controversy over agreements it has with law enforcement — thousands of law enforcement agencies — to share Ring camera data without the knowledge or consent of the actual owners of the devices that collected those data. So there's some controversy there. Also, by merit of a paid subscription, people who own doorbells or other Ring devices can download video and audio captured through their devices. Sometimes these things work in unanticipated ways, of course, as many of these things do. 
There have been a number of high-profile, documented cases where hackers have hijacked the cameras and two-way microphones on these devices to harass the elderly, to stalk children, or to otherwise target unsuspecting users of these products. More generally, as pedestrians or drivers going about our everyday lives, we're probably unaware of these devices as we move through neighborhoods. We're definitely not consenting to our activities, however mundane or harmless, being captured on Ring. And doorbell owners are not always consenting to how the footage from their devices is being used. I would say that, practically speaking, because of the Ring's many functions, it's actually quite difficult to obtain consent for all the different capacities that just one kind of device can have. And the Ring is just one of many so-called "smart" objects recording and sharing data about us with various entities, be they corporate or government, because those entities have an interest in collecting data about our activities. These days we hear a lot about data being taken from people, data about our activities. We're often confronted with news about companies using data in ways we find troubling — tracking our behavior across the internet, or even using us as free fodder for generative AI like GPT. And perhaps there's a feeling among many people that we can't help but fall victim to these incidents, because our lives are structured and organized around smartphones and other internet-linked devices for our day-to-day activities. We engage with digital technologies whether we like it or not. If you live in a big city and tap into public transit, or patronize stores that use loyalty programs — and some even use facial recognition technology these days — you have your purchasing history analyzed and tracked through websites and apps. 
And those are just some of the ways. Even the medium of our interaction today, the Zoom platform, is a digital meeting platform. In a lot of ways, then, our lives are datafied, which is to say that behaviors revealing our innermost thoughts and personal activities have become digital data. Another way to say this is that a big part of our lives has been made into data. And it's important to flag here that when we think about datafication, we are often told that data are valuable because they're commodities — something to be bought and sold, like oil. Or we're told data are byproducts: dust or detritus or exhaust. And we're often told that if we're not paying, we're the product. All of these are market relationships — market-driven ways to think about the relationship between human beings and the data generated about our activities. But I really want to push us to think about data not as part of these market transactions, but as part of us — part of what it means to be human, part and parcel of human life and the human experience. And in so doing, I want to think about data as subject to the globally endorsed framework of human rights, but in a very specific way. Now, I don't have time to go through all of this today. There are five main takeaways from the book, which I'd be happy to talk about more in conversation with Michael and Bulelani and in the Q&A, but I will be talking about two: data stickiness, and why we need human rights now more than ever. When I talk about data stickiness, I'm really talking about data as though it's gum — gum sticking to the bottom of your shoe. We've all had the experience of unknowingly stepping on some gum and getting it stuck on your shoe, and it's really hard to get it off. 
So similarly, we can think about data generation — the creation of data around our activities — as a process where it's relatively easy and seamless to have data created about your activities and your choices, but very difficult to get rid of those data once they exist. Often we don't even know that it's being done, which makes it different from gum. But in the same way that gum is sticky, once created, data are pretty hard to get rid of. Let me explain why that is through four different dimensions. One thing I will say before I get to those dimensions: we all know that digital data are easily copied and transferred. That's what makes them valuable and useful for us. But that's not the reason they're sticky. There are four other ways in which data about people — the process of datafication — are actually quite sticky. The first is that the great majority of data being taken about people's activities are quite mundane. They're not extraordinary, remarkable, or notable. Think about the types of data collected on a routine basis about your life: the number of steps you take, what you ordered on Amazon, your commute pattern dropping off the kids and going to work. These are all little things taken from our devices, or about us, that we don't necessarily have the capacity to change. It's mundane, everyday stuff that we can't really avoid doing. The second reason data are sticky is that they're linked. This refers back to the idea that digital data are easily copied and transferred: data don't just stay nicely in a single data set. In fact, once data are created, they're often broken up and sold and traded on the market. And so there are lots of links between the various data sets out there about our activities. 
And this doesn't have inherently positive or negative effects — we can talk about that more later on. The third reason data are sticky is that they're effectively forever. Once data are created, it's actually very difficult to verify their deletion, even after we do our best to close accounts or ask that data be deleted. So it's best to assume that data are out there and effectively immortal. And finally, data are sticky because they're co-created. I want to dwell on this a little longer, because I think it's a really important idea that connects datafication directly to human rights in a very problematic way. Data are co-created because data about people require a data source and a data collector. We're all sources of data, and data collectors have an interest in creating data about certain types of activities we participate in. Without data sources or data collectors, there are no data: there's no one to collect from and no one to do the collecting. We need both parties, which is why data are co-created. And this is a practical problem for thinking about property in human rights terms, especially about whose data they are. On the one hand, you could say data sources have a legitimate claim, because the data are about them. But you could also see why data collectors have a claim, because they created the process by which those data are made. So there's a data source-data collector relationship that's really important for human rights. There's also a need to grapple with the fact that many data are collective in nature. While we all individually experience data collection, the data are only valuable in the aggregate, when data collectors and other analysts can draw inferences from, quote, "people like you" — you and people like you. 
And so even though data come from individuals, they're actually used to draw boundaries between different groups — to create different collectivities that we're not even aware of. There's a high likelihood that you affect someone else's experience without ever knowing when or how. So it's important to remember that when data are created, they're not created in isolation; they have both collective and individual implications. Now I want to turn to human rights and why they matter. As political scientist Jack Donnelly notes, human rights are about human potential — not who we are now, but what we might become. Many of you might be familiar with the story of how human rights came about at the international level after World War II. The big document that came out of that moment was, of course, the Universal Declaration of Human Rights in 1948, which set out 30 articles establishing international human rights as universal, unconditional, and interdependent. Human rights are also entitlements that states are responsible for providing or protecting. Some of the big examples many people are familiar with are freedom of expression and freedom from torture, but there's also the right to education, the right to choose a marriage partner, and the right to a fair trial. And there are lots of different documents that protect human rights now, at the UN level and at the regional level. I don't want to get into all of that, because I actually think what's important is the logic of the UDHR. The framers of the UDHR came to the table, after negotiating for years over what was important, with the idea that human rights and the UDHR fit into what was called a portico. This portico is like a Greek temple: there's a rationale for the different types of rights that are listed. 
And they came up with rough categories for how to think about human rights, with the collective human rights project as the roof of this building. So there's a real way to see how things fit together: freedom of expression and the right to choose a marriage partner are actually quite different types of rights, but they all fit in the same document. What I really think is important about this document is the foundation — the foundational values of human rights. In other words, what sustains human rights are the four values at the base of the temple. I encourage people, in the book and in my thinking about this, not to take the individual human rights that already exist as the starting point for thinking about how we might use human rights in a world of datafication. There are lots of different ways we could do that. But the values of human rights really matter, because they explain what all these documents and all these articles are for. So in the book I focus on four values, though in 1948 the framers used slightly different terminology. Instead of liberty, I use the word autonomy — acting and making choices freely — because that points to one's agency. I talk about community instead of brotherhood, which I think is a nice genderless way to think about sociality and membership. I also focus on dignity, which is about the worth of a person in terms of how they feel and how they're treated. And finally, equality is about the desire to be treated without discrimination, in accordance with a common baseline that we all share. It's important to note that these are the very issues coming forth in a lot of the debates about AI. It's not that dignity, equality, autonomy, and community are no longer relevant. 
It's that now we have to think about how to enact, exercise, and realize these values in a world where we're not just worried about physical detention and physical harm — the world the UDHR framers were working with. I think it's really important to recognize that datafication is a change in human life, but that data and the digital are not separate from physical human life. In fact, they're quite intertwined, and I think that's really the issue we're all trying to grapple with today. To conclude: I don't think human rights are a silver bullet for the challenges datafication poses to humanity, but I think centering the values of autonomy, community, dignity, and equality will start us down a path that's globally established and also important for thinking about regulation of AI and data more generally. This really speaks to how we can balance the scales of power that Gordon referred to in his opening remarks — thinking about how data sources might have more recourse than they currently do in our relationship with data collectors. It's also really important to grapple with the collective nature of data, and how individual choices we make can have implications for many others who may not agree with, or even know about, what we're doing. I just think about this at Christmastime, when there are advertisements for DNA kits like 23andMe. DNA data are inherently shared: if one person chooses to use one of these kits, there's information about all of their relatives in the outputs they receive. I think that's a really important point — many digital data are actually collective in nature and in their implications. So I'll end my comments there. Thank you for listening. 
Thank you, Wendy — that was a fascinating setup and overview, and I appreciate that there's so much more in your book that you couldn't cover in the talk, so I encourage all of you listening to go out and buy it: We the Data, from MIT Press. I want to turn to Michael first and then Bulelani afterwards, just to offer any remarks, comments, or thoughts you have on the issues and topics that Wendy raised. So, Michael, over to you. Wendy, thank you so much for that, and Gordon, thank you so much for inviting me to this conversation. Combining Gordon's opening remarks with Wendy's presentation, I'm thinking about the two ways we got into our current state. One is rewinding the clock 30 years and being shocked by where we are now: an incremental, step-by-step, year-by-year process where it feels like a little more data is being collected from us each year. It's a normalization — a frog in a slowly boiling pot of water — to the point where I'm curious to hear from Wendy how we make people reopen their eyes to the state of data collection and the importance of it. And I'd compare that slowly boiling pot with data collection happening outside of the West, where new systems are being introduced very quickly that collect lots of data, and a large swath of the population doesn't actually know what the data is for — they seem largely unconcerned with that data collection. I was actually just speaking with a deputy editor here at Rest of World who said that in India, a customer would give their medical records to a grocer if asked. There's no sense of ownership over that data, or sense of value in terms of what that data actually means. And so I wonder, for both those 
communities — the one that's grown totally accustomed to 30 years of data collection, and the ones facing mass data collection all at once — how are you, Wendy, thinking about signaling the importance of that data collection to them and opening their eyes to it? I think you talk about this in the framework of data literacy, but I'm wondering if you can speak more to that. That's great, thanks, Michael. Yeah, Wendy, do you want to go ahead and respond to that, and then we'll go to Bulelani right afterwards? Sure. Thank you, Michael, for a very easy question to answer — I mean, this is a huge question. My history as an academic is that I looked at collective action; I studied NGOs working in the human rights space for a very long time, so I've spent a lot of time thinking about how people get together around issues that seem intractable to change. I think part of it is talking about it. There's a normalization of data collection, but I also think there's a sort of non-realization of how pervasive these practices are. A lot of times I give this talk and people say, "Oh, well, I've known about this for a long time," or "I don't have a smartphone." But then I'm like, but you use the internet, right? You have an email — maybe a Gmail account. Okay, then you're in the system. So part of it is understanding data collection, what it's for, and how it undergirds this whole system of AI. Part of the problem, actually, is that the debates around AI have existed largely disconnected from the debates around data collection, when in fact AI depends on data to generate the outputs we find valuable in society. Without big data, we wouldn't have a lot of the advances we have today with AI, and I think that really 
needs to be brought to the forefront: we should not just be talking about algorithms or computing resources. When we talk about AI, we need to talk about data, and because a lot of AI systems are being deployed in society using data taken from people's activities, we need to be really upfront, honest, and explicit about that relationship. I think that might generate some traction. But you did mention data literacy, and that is something I usually talk about with this book, because one of the things I realized when I got to the end of the book is that when a lot of people hear the word "data," they're either intimidated by it or they haven't had experience working with data. As a social scientist, I have a lot of experience working with data about people, taken directly from research subjects, so this is something I've thought about a lot, and it's really important to this idea of data literacy. It's not about making everyone a data scientist, or forcing them to become computer scientists working with lots and lots of data. The idea is really getting people to understand the choices behind data creation — that data collection and analysis involve a lot of different choices: the types of things we look at, the populations we're collecting from, the methods we use to collect those data, and how we understand certain types of indicators to symbolize some quality about a human being. Such an indicator is not a direct observation of someone's thinking, but it's an indication, a shortcut. These are decisions that data creators have to make that those of us who are data sources don't necessarily have to think about right now. But if we don't start thinking about these types of things, then we end up with this situation where 
people feel paralyzed, and I think that's a real concern. Thanks, Wendy. I want to return later to some of the points you raised regarding transparency, openness, and AI — we already have some questions coming in from the chat about the implications AI will have for this whole conversation — but before that, I want to turn it over to Bulelani to hear your thoughts. Please. Yeah, sure. Firstly, I'd like to thank everybody for being here, and I'd also specifically like to thank Wendy for a wonderful and thought-provoking book. I'll initiate my comments with some reflections on the book, and more immediately on some of the comments made in the presentation, and then I'll end with a question. Two things stood out to me as quite significant about the scholarly and public contribution the book makes. At one level, it's really about problematizing technical artifacts and how they are traditionally offered to the general public as a solve for traditional problems like security. Take, for example, the surveillance camera: a commodity made exchangeable in markets that is then presented to larger publics as a means to solve an obviously challenging and intimately felt problem around security. In reexamining this technical solve for traditional problems, we're being asked to reconsider our relationship with technologies — not as functional instruments, but really as political, cultural, and social artifacts. We're also then asked to think a bit about the data those technologies rely on to offer us the solution to our challenges. So in many ways it's an intervention that asks several questions about us, but also several questions about the 
technical artifact. One particular question I had right afterwards was: what is really the conceptual distance between datafication and commodification? How are they possibly interchangeable, and how are they possibly different? Because in many ways datafication is, to me, intimately connected with the commodification of things, where commodities are offered as solves — which in many ways then obfuscates other ways of thinking, other ways of addressing problems. Why go to a security camera for a sense of safety, and not think about other possible alternatives to questions of safety? So the question I had in particular was: how can we adequately square datafication with commodification? And I'll just leave it there. That is a very insightful and deep question. Wendy, I wonder if you have any thoughts about it — and Michael, also, if you have any reactions, go for it — but Wendy first. Sure. Thank you, Bulelani, for your very kind comments on the book. It seems like we're really on the same page in how we see the importance of technology: it's not a fix, a technological solution — there are so many social, political, cultural, and economic implications. And this is the concern I have with how data are talked about, because we tend to treat data as the commodity, as something that can be bought and sold. But because most of the data that I'm interested in, and that AI companies are interested in, are taken from people — their activities and their choices — this becomes problematic, because under the human rights framework people are not commodities. In fact, one of the most widely accepted customary international norms is the idea that slavery should not be allowed. If we treat human beings as commodities, this is what 
we're getting closer to: the idea that we can trade some elements of human beings on the market. And I think that is where human rights are really important to bring forth. I do think, as you point out, that we are dangerously close to treating data as commodities — we do treat data as commodities. There are efforts to remedy that through financial relations, such as paying people dividends for data collected about them, but again, that's a commodification of human behaviors and traits, and given the human rights values of autonomy and dignity, I actually think buying and selling data in this way is a fundamental violation of those values. So in that sense, I think you've struck at the very heart of the book, which is a rejection of thinking about data only as commodity, only in the economic sphere, when as human beings we have so many other spheres of interaction that we are engaged in. Michael, do you have any additional thoughts there? I don't — I mean, I think it's a really interesting conversation. The other framework I wonder about when we're talking about data is a political lens: data collected not necessarily by corporations but by government bodies, and how that data is then transformed into political power. That's something we've seen in some of our reporting, and I wonder if it offers a different avenue for understanding data — not just as a commodity, but as a political token that can be used. Bulelani, do you have any thoughts about that? I know a lot of your research is focused on how national governments, especially operating in foreign contexts, are generating and using data for political purposes. For sure, for sure. As you correctly pointed out, much of my work is very 
much interested in both technology being presented as a fix to traditional political problems, and in questions sitting at the intersections of historical inequalities at a global scale between the Global North and the global majority. In particular, I'm actually working now on a project that offers a genealogical account of the construction of the technological fix to economic inequalities that were earmarked at the end of the colonial encounter. By the colonial encounter here I'm specifically thinking about the end of the 20th-century colonial moment, and how the World Bank and other stakeholders constructed ICT infrastructure projects, telecommunication infrastructure, as a way to pursue development, i.e. as a way to address inequalities during the colonial moment and then afterwards. What is particularly interesting is trying to figure out, one, whether or not these technologies offer the promise of development, while simultaneously also thinking about the limitations of development, particularly development that is neoclassical in orientation, i.e.
that the market is the basis for offering solutions, and then, effectively, what is left behind, or forgotten, or obfuscated by that general commitment. One particular level that I have become more curious about is the kinds of resistances that you can see, at least from the ground, surrounding that. Many of the stakeholders that I engage with in Kenya in particular either said, you know, we need to effectively try and ban surveillance systems, or we need to effectively try and go completely off grid. But that general, and I think correct, political instinct runs against the narrative of, well, surveillance is here for your own security. Again, the question from them is: are we really safe, or are we being asked to further participate in a kind of default setting where mass surveillance is the only solution to political and political-stability questions in the context of Kenya?
Can I jump in here, actually? I mean, just in response to Bulalani and Michael, I do think this question of when states use data, is it different, is it still a market relationship, matters, and I think it's important to think about who is providing the state with these data. Many states just outsource the data collection to private corporations, like NSO Group, right, with the famous Pegasus app. I'm thinking in my own work on facial recognition: Clearview AI is a hugely important player for many governments, because it provides this massive database of facial data for police and other agencies to use. So on the one hand, yes, governments may not be commodifying people's data in the same way companies do, but they certainly are also engaged in a sort of separation of human beings under their jurisdiction from the data that come from those human beings. So they're objectifying, right? We're being objectified, and then there's this market relationship that we focused on in terms of commodities. But I think that's a really good question and distinction. Sure, yeah, could I quickly jump on to that also? And then, you know, it's really up to Gordon. I definitely do think it's interesting to think a bit about datafication and commodification. Speaking directly to my own work, what is interesting is specifically thinking about the state and its atrophy over time, at least in the context of low-income countries, where effectively it relies on the private sector to offer solutions to traditional problems that were initially in the purview of the state. So it begins to effectively outsource policing functions to a large corporate actor, and the corporate actor's interests are not necessarily always
directly aligned with the state's. The state itself might say, we're interested in offering better policing services, while the company it's partnered with, in the context of Kenya for example, is more interested in extracting as much data as possible for commercial products that it will offer in the future to the citizenry as customers. So in many ways, what I've become more curious about is how we understand these outsourcings. Is the analytical aperture of the private/public divide useful, or should we be thinking of more hybridized states? Similarly with the relationship between the citizen and the consumer: how does that distinction collapse, specifically when you're experiencing your rights only through consumption and not through other purviews? Having a more nuanced language about those kinds of bifurcations, or let's now call them entanglements, has become more important to me in my own writing, and I think in many ways your work also tries to speak to that. I really like that distinction you raised, Bulalani, and I often think about the distinction between rights and obligations as citizens, and how we have, to a large degree, abrogated any sense of obligation and come to see ourselves as consumers and as takers, whether with governments or with corporations. I want to return to something that Michael brought up earlier and that, Bulalani, you've been mentioning, which is that our conversation here has been focusing primarily on the Western world and on how perceptions of datafication are occurring here. I know the conversation in the US and Europe really focuses on data privacy and data protection, but Michael brought up the example earlier of the Indian citizen who
goes to the grocer and is completely happy sharing medical records. I wonder if you could talk more about what variations you see in perceptions around the world with regard to data, how much this co-creation and this giving up of data is a problem or not, and what objectives people have. How does the rest of the world think about data and human rights? I think it's a really interesting question, and I don't think I can answer it for countless numbers of governments and communities, but what we see in terms of action is that, in the last few years, more GDPR-style bills are being rolled out country by country, to various effect. Oftentimes these are bills that govern what data can be shared across international boundaries and across different organizations, and you are seeing real progress there. To take a little more of a critical lens to it, the question is often about enforcement: how many of these laws are actually enforced? Then there's certainly a data-literacy component, in terms of how important these laws are to the actual populations there, right, whether they're aware of them, whether they care about them. And then also, why are these laws being put into effect? One reason is good governance, dictating how data is transferred. A second, for a lot of what we're seeing, is that it's a way for a political body to leverage power over a large tech company that it may otherwise have no control over, right? It can be viewed in the public space as asserting power. Whether or not that power is actually put into action is a different question. A lot of these laws have just come into effect in the last two to three years, so I think it's very early to say how it's going to play out. It's certainly a fractured legislative framework that is going to
create huge headaches for companies who have to navigate it country by country. But I do want to pair that conversation about data collection and legislation with this: while we are seeing more and more government oversight of how this data is shared, at the same time we're also seeing governments collect more and more data on their citizens. A lot of the time that data is not stored in very safe ways, and it ends up being leaked. So it's a hand-in-hand parallel process: we're seeing this legislation of how data is shared by private parties, which I don't want to call performative because I think it's really valuable, but at the same time you're also seeing government bodies collecting mass troves of data, not really governing themselves, and actually exposing that data in really unsafe and harmful ways. So it's an interesting moment where you're seeing both things happen at once. That's great, thanks, Michael. Bulalani, Wendy, do you have any reactions to that? Very quickly for me, I just want to say, I think you're absolutely right, Michael, and this is part of the issue. Part of the problem with GDPR-like documents is that they presume data ought to be collected, and then we have to do something about it, keep it safe. Using a human rights lens, we can actually illuminate some types of data collection and maybe outlaw certain types of data that are too revealing of people. And I should really quickly also say that GDPR-style regulations differentiate between identifiable and de-identified or anonymized data, and there's so much data out there now that effectively this distinction is no longer protecting anybody. We find ourselves in so many data sets that, even if a data set is missing an identifier, people can cross-reference across data to re-identify who we are, if they so chose to do so. So
that's really a problem with a lot of the legislation out there. Yeah, what I can say to this is that in my own work, but also in some of the advocacy that I've been engaging with, what is particularly striking to me is the general presumption that instruments like GDPR, which take a privacy approach, presume that privacy can be the linchpin that holds other rights together. That general prejudgment prioritizes privacy precisely so that it can manage the further collection of human data, and in many ways the underlying prejudgment is that more data is better than no data, or that more data is in line with commercial interests and therefore must take priority over other kinds of interests. To me, that prejudgment surrounding privacy simply has to be further problematized, particularly in terms of how it doesn't allow other kinds of rights to be prioritized. Because, for me, I'd be more interested in hearing about, say, rights to use of land, which is a clear issue in the context of Kenya, where indigenous communities are being displaced for the purposes of building another kind of state project or another kind of commercial venture. So in many ways, when we're having conversations about datafication, thinking about privacy both as an important instrument and as a limited instrument is something that needs to be further thought out. I want to turn to a question that we have in the chat. We're running short on time, with six minutes remaining, but there's a question here from the audience asking: how can consumers and governments effectively influence big tech companies so that they do not base their business models on the exploitation of consumer data? And I think,
bigger picture, this is a component of a question that we've been talking about and dancing around a lot, which is the big power imbalance between those who are collecting the data and us, the source of the data. I really like Wendy's clarification of the co-creation process, but as she notes, and as this question speaks to, there's a big imbalance in the amount of power one party has versus the other. So first I'll ask Wendy: how do you think of that question very concretely? How can consumers and governments influence the business models of these big tech companies? Yeah, I mean, it's a pretty long slog, but in talking about the book to various audiences, one thing that I think has not been part of legislators' toolkit is incentives. There's a lot of punishment when big tech companies do something bad: when they don't share the data that they have, or they don't take down content quickly enough, they get fined. One other way we could push companies to do better is to incentivize them, right? There's a growing idea about data minimization, which is the idea that you don't collect all the data you could possibly collect; you collect the data you need for your specific purposes. Why not incentivize companies to develop a process around that, or develop business models around data minimization? The US has done this with EVs, encouraging carmakers to create more EVs as opposed to internal combustion engines. I don't see why we can't do a similar incentivization with AI companies, to get them to do better, to respect human rights. And I cannot agree with you more about this excessive focus on privacy at the expense of other rights. Privacy is just one of many, many rights we have; it's an important one, but it's not the only one affected by datafication, and I would say some conversations are actually
being hijacked because they've become excessively about privacy, something like facial recognition, and not about dignity or autonomy, which I think are really what's at stake with regard to facial data. So, we have three minutes left, and I want to go to each of you for a final word or comment. Maybe one way to frame those final comments is to focus on where you think we're headed, and what, if anything, makes you optimistic that we will start to, or could possibly, see human rights play a larger role in how the data economy operates and how this process of datafication occurs. First let's go to Michael, then Bulalani, and then Wendy, who will get the last word. Thank you, and thanks again for having me. I think optimism is sometimes hard to come by, but I will say that, even though I can be critical, I do think this turn towards legislation is a good first step, right? It's a reckoning with the fact that this does need to be governed, that it needs to be dealt with in a serious way, that it's a conversation that needs to happen. The way it's happening is certainly imperfect, but I think it does raise general consciousness that this is a pressing concern, and I hope, step by step, we can bring the general population around, because some of this conversation is very, very high-level. I hope at some point we can figure out a way to bring the general population around to the fact that your data, to Wendy's point, is not just a commodity; it is who you are. I think that's a fundamental right, and I think we can get there.
Yeah, you know, I always tell myself that optimism is earned by your work, and so I'm always very much optimistic, at least about some of the conversations that I'm involved in, and I'll just mention two. At one level, I've been fortunate enough to be in conversations in Kenya, Ethiopia, and South Africa where people are talking about effectively trying to bring African epistemic traditions to constructions of datafication and data governance frameworks, as a way to somewhat disrupt GDPR's privileging of privacy, and saying that there are other rights and other values that should be expressed at this kind of digital inflection point, and that these frameworks should reflect the diversity, both cultural and epistemic, of the communities that they're hoping to govern. Another one is really about getting beyond, at least in my work and my communities, this kind of bifurcated line of either you give me your data or you get no service. People are starting to realize that's a false-choice model, that you actually have to disrupt it to offer other ways of engaging both with data and with the possibilities of people getting the services that they actually really need.
Yeah, just to add to what has already been said: what gives me optimism, in a space where it is often difficult to be optimistic, is this conversation, this talk with Bulalani and Michael, and thinking about the things that they're doing, right? There's no one fix for this. I think it's realizing how important data sources are to the creation of AI. In the book I talk about this as data stakeholdership: we're not subjects, we're stakeholders, and I think that word has come out of other people's mouths today too. What gives me hope is disrupting this narrative that AI is for good, that AI will improve humanity, that data are necessary for the functioning of our society going forward. The fact is, not all data are necessary for improving social outcomes, and so I think it's really incumbent on all of us to engage in that conversation of what we want to do, what we want to give up, what we want to take advantage of. And hopefully we can have this conversation in multiple contexts, so that it's appropriate for all the different societies in the world that are affected by AI, which is basically all of us. Thank you, Wendy, that's a great note to end on. I want to thank the audience. Thank you all for joining us today. Thank you, Bulalani, thank you, Michael, thank you, Wendy. Everyone, you can go purchase We the Data, go read Rest of World online, and you can just Google Bulalani and you'll see all the great work that he puts out. We thank you all for joining us, and have a great day.