Byron, are we on? Hello? Hello? Hello, everybody. Gather around. Welcome to Free Open Shared, a conversation about privacy in Asia with our friend Malavika Jayaram. This is our series of conversations about policy, collaboration, and knowledge here at the Wikimedia Foundation. Thanks for coming out tonight. It's great to see a lot of familiar faces, some new ones as well. This is actually a rather young series of talks — it's only the second evening, so we're happy to see so many of you here. This is me. Please tag your thoughts and questions tonight with the hashtag #freeideas; we'll go through those as well during the Q&A later. I wanted to take the opportunity to talk to you for a minute or two about some of the work that we do here at the Wikimedia Foundation. Our mission is to empower people to collect educational content and to share it under free licenses globally. For us, the tool to do this — or one of them — is something that you are probably very familiar with: Wikipedia. Around Wikipedia, we think in three spheres of policy. We think about what is best for our users and the platforms when we think about how content can be on Wikipedia. We think about copyright. We think about how we can neutrally host content, so we think about intermediary liability protections. We think about access to knowledge and how we can make sure freedom of expression is protected. Finally, but very important tonight, we also think about the privacy of our users. Privacy. We think of it as the intellectual foundation of knowledge. We think of it as our users' right to read freely, to research freely, and to contribute to Wikipedia without having somebody spy on them or look over their shoulders — be it governments, be it companies and corporations. As communications move online, as everything is interconnected, conversations about privacy are getting more important. They're more important than ever.
When we work on privacy, we're proud to say we have a strong privacy policy for our users, but we also have a strong privacy policy for our donors. We have enabled HTTPS to make sure that people can contribute to Wikipedia — edit, and also read — without anybody interfering. We resist requests for information about our users, thereby protecting their right to privacy directly. But these conversations about privacy remain very important, and I noticed just today that they're an everyday thing. When I sent an email about this event, an invite to an email list, I got a message back: well, this is a public archive, your email there will be permanently stored, your email address will be visible outside. One of the recipients on one list actually sent me an email back: I don't really use an ID, I don't have an ID and I don't want to have one — can I still come in? I had to, unfortunately, reply that there's a building policy that you have to show ID. Conversations and perspectives on privacy really differ. They differ not only within the U.S.; there are huge differences between here and the EU, but also worldwide. And to talk to us about this — how privacy, identity, and the concepts and questions around these are important in India and how they evolve there — I'm happy to introduce Malavika Jayaram. She's the executive director of the Digital Asia Hub in Hong Kong, a think tank that does research around Internet and society: how questions like privacy actually affect people every day in their lives as they communicate online, as they read online, as they use the Internet for their everyday needs. Malavika has previously been a fellow at the Berkman Klein Center at Harvard University, and she's on the advisory board of the Electronic Privacy Information Center. But the greatest thing that I found in the bio that she sent over is that she taught India's first course on information technology and law — in 1997.
And so it's my great pleasure to introduce my friend Malavika Jayaram for her talk about privacy in Asia and the larger Asian context. Thank you so much. Thank you so much. I'm so happy to be here. You people take your editing very seriously — there are marker pens inside the toilet rolls in the washrooms here. Love it. You people really walk the walk. So I'm really happy to be here all the way from Hong Kong, and I'm just going to kick straight off. I gave you this sort of partly subversive topic because this is something I deal with every single day. This is my life. This is what I do about 20 times a day with everyone about privacy. So just to give you a little backstory: I was a practicing lawyer for — for too long; 15 years, 17, any number of years is too long — in London. And when I moved back to India, the first thing that happened when I got a SIM card was that within about 10 seconds of getting a new SIM, I was getting spam: everything from, you know, horoscopes to matrimonial proposals to real estate to all kinds of things. And I thought, my parents don't have my phone number, but you people do. That's kind of interesting. And then I found that a few weeks later, I was getting a lot of information relating to health — medical insurance, hospital information, all kinds of stuff, homeopathy and other, you know, pseudo-stuff. And I was kind of wondering, you know, why this sudden uptick in health-related stuff. And I realized I had just done blood tests. And I thought, no, it's correlation, not causation — I knew that line. And then I was driving by the clinic where I'd done my tests, and I was waiting at the traffic light, and I noticed that right across the street was a data broker. And I thought, seriously, it's that easy? You have a friend, you just walk across the street, you get the list of everyone who enrolled that day, all their phone numbers — like, is it really that easy to violate someone's privacy?
And having come back from England, where we have data protection law, we have privacy rules, we have, you know, an information commissioner, I thought, wait, I'm going to complain. And then I thought: complain to whom? On the basis of what? And this was in 2006. So I was sort of thinking, there really isn't a framework where I can complain. And I started talking to people, doing my little, you know, vox pop down the street, talking to friends, asking how they felt about this. And this was around the same time that India was launching this huge biometric identity project. And I thought, this is all of my fears on steroids — you know, what is going to happen with this? And, you know, being this sort of civil rights advocate, I thought there'll be people on the street, things will be burnt, it'll be like Berkeley, it'll be great; this is never going to happen. And I realized none of that was happening. And I talked to friends and they'd say, oh, you've been away too long. You know, privacy is a Western thing. We just don't actually do that. You've become white. And I thought, really? Indians aren't private? So people would keep telling me: Indians aren't private. We're not an individualistic society. It's a Western construct. It's some crazy Enlightenment idea that doesn't actually stick here. The basic unit here is the family, not the individual. Privacy is selfish. It's, you know, all kinds of things. And I think this is one of those things where — I guess many PhDs are born of, you know, weird reasons, but mine was born out of rage. I didn't actually finish it, because rage is not a very constructive thing. But it was one of those things where I thought, am I — because people would keep saying, you're the deviant, you know, you're the weird one, you're the outlier, that you care about privacy and the whole country doesn't.
And I thought, really, you can say that with absolute conviction about more than a billion people — like one attribute applies to everyone? I think it was John Kenneth Galbraith who said that anything you can say about India, the opposite is equally true. And I thought, seriously: I'm either going to self-combust with rage every time my privacy is violated, or I could channel it constructively into a PhD that nobody will read. Yes, I know what I'm going to do. So — I seem to have clicked through this while talking anyway — these are some of the tropes I kept hearing, and I'm sort of expanding them now that I live in Hong Kong, because this seems to apply; it's a nice template that works across Asia. You know: Indians, Chinese, Thai, anyone — they don't care about privacy. It's a cultural thing. And there's this idea, as it gets tagged to different projects, that something's better than nothing. Whether it's an identity project, whether it's biometrics, whether it's chip-and-PIN, whatever your favorite privacy-invading technology is — it's better than nothing. We're a developing country, we're poor. Having something is okay. It's a good start. It's good enough for the developing world. It's good enough for poor people. They don't know any better. What privacy do they have now? And this is something I heard from both sides. It wasn't just the people promoting and implementing the technology; it was the recipients saying, well, what dignity or privacy do I have now? What do you think you're trying to take away from me, or that you're trying to protect? Because I don't have privacy — but if I could get food on the table, that would be a win. Or this idea that privacy is a luxury — and that's something we've come across here; people like Julia Angwin have written about this — you know, will we pay more for privacy, or will we just take the 50 cents off the Big Mac and hand over all our data?
And this was a trope that kept coming up: whatever your favorite techno-utopian dream is, it will solve whatever big third-world problem — and it's all fine, because it's some neoliberal idea of what the third world needs. Or it's a trade-off, unbalanced, but it's free, so it's okay. So these are the sorts of things I dealt with in all of my research and all of the projects I was working on. And this whole formulation of "if it's free, you're the product"? Not really a big thing there. But if you try to explain it, it's usually the weakest in the chain, the poorest people, who bear the cost of the system and bear the greatest risks, because they have the least knowledge about this. They're illiterate in both senses: either the products and services are not in a language they understand, or they might not be literate at all in any language. They might be a victim of the digital divide, even though that's an old word. They might not have access or connectivity. And suddenly, when you digitize everything and say this is all new and shiny and great — do they know how to navigate those systems? Is it still going to exclude them, even in this attempt to include them? And there are information asymmetries. We have them here; why wouldn't they exist there, and be much more amplified? And can they negotiate? No, they can't. I mean, I'm a lawyer, and I can't go to Apple and say, I don't like Clause 6.2 of iTunes, could you just change it? I can't do that. There's absolutely no way they can push back on the IMF or the World Bank or any other aid organization. So these power differentials are even worse in the developing world. So — you can see I did my slides about five minutes ago — paradoxes that I kept coming up against. And it was really interesting to me, because I came across some very weird dichotomies.
A lot of the projects that I was researching and looking at seemed very focused on this idea of stability: that populations are amorphous, people's identities are fluid, and we need to fix this. We need stability, because you can't have a system — you can't be a real country — unless you have very stable systems, stable databases, you know, de-duplicated, background-checked, scrubbed, clean, no dirty data, no ghosts in the system, no duplicates. So there's this idea of stability, embedded in countries where sometimes fluidity and being ambiguous is the only thing that saves you. You find the cracks in the system. You find ways to negotiate and navigate systems. You're functioning in a very precarious space, where the fact of your ambiguity is sometimes the only thing that saves you from vicious governments and, you know, allows you to be dissident and buck the system in different ways. But there's this very "seeing like a state" kind of idea of fixing you in a system and grasping you in some way — and Foucault has said all kinds of things about it that I will not bore you with. So there's this idea of fixation, or even a fixation with fixation. And this other dichotomy I kept coming up against: on the back of this idea of inclusion — you know, this great neoliberal idea of inclusion — you actually end up excluding people. In your attempt to include them, you end up throwing them out of systems in all kinds of ways. So I'll give you one example — maybe it's a bad example for this — this idea that once you digitize records, you've included people in the system, you've made them part of the financial system; but have you excluded them in other ways? So, a couple of examples. One was when the biometric identity project was being rolled out in India and they did field studies after the fact. Of course — why would you do them before? Why pilot or test something? So, you know, Silicon Valley, you're to blame.
This whole beta-testing, tweak-on-the-go thing — it's your fault we end up with these things, because we're supposed to fail faster, and we do. So one of the projects that was actually looking at how this worked in practice involved a particular kind of scheme, a rural employment guarantee, where people — farmers and laborers — could get a minimum amount of work. And they found that in the old days, when they got paid in cash at the end of the day on a Friday, in front of a group of people — you know, someone handed out all the money for the work they did — people actually showed up for work. The second you had a biometric identity, which meant that you were financially included and could get this, you know, no-frills bank account, and money could seamlessly get transferred into your account, people didn't show up for work. And they said, this is really weird: we're giving them jobs, we're giving them employment, but the numbers are falling. And when researchers like Nikhil Dey and Aruna Roy looked at it, they found this completely weird, unintended factor that, had they done a pilot, they would have figured out — which was that when you had the social stigma of being seen to be a slacker, or the pride of being seen as a good worker, amongst your peers in your social circle, you showed up for work and you did really well. But the second that public act of payment — you get 20, you get 40, you get 60 — and that sort of chest-puffing "I did really well this week" goes away, you don't bother showing up for work; you take the 20 and you drink it away. So that was something — the social aspect of how technology gets mediated — that just didn't figure. In trying to include people in the financial system, you probably excluded them from work in the first place. So that's just one example of how this can work.
And then there's this idea that you were somehow institutionalizing fairness — making it sanitary and objective, with all these wonderful criteria, these great technical systems meant to replace venal, corruptible people. The human is corruptible and can be influenced; the machine? No, the machine is always right. Set that trope against the algorithmic bias we're all now grappling with — the code and values that perpetuate societal inequality and unfairness — and that's not even a conversation there. It's beginning now, but it's not really a big thing. But there is this whole body of literature on structural violence, on how the state can actually enforce and be a medium of violence against the people it's supposed to govern. So if you read James C. Scott, you may want to try Akhil Gupta; he and other people have written about this, about the state as a perpetrator of violence against human security. And again, there's this other paradox of individual versus collective, where privacy was seen as a very individual value, but all the benefits of these programs were gauged at a societal level. So there was that kind of, you know, parallax error, if you like. And who knows what else I was trying to say there. So here's a quote from one project that I want to use as an example. Tell me where you think this comes from: "Allow the trustworthy to roam everywhere." Sorry? The Chinese what? Cybersecurity law? Close. Keep going. Yeah. Yeah. So this is the description from the 2014 policy document about the Chinese social credit system. And I think a lot of people have heard about this, if for no other reason than because they are terrified of the friends they keep, right? The idea that you could be judged and put in a box — not just for the things you do, which, you know, we're sort of used to: our behavior determines things, and things get extrapolated from the stuff we do online, and we leave digital traces, shadows, blah blah; that we're used to.
But this idea that you get judged, even penalized, for stuff your network is doing — that your credit rating might fall because of what your friend is doing — that's a totally new, dystopian, crazy thing. When I've used this quote before, a lot of people thought that's where it came from. And it's not — it's not Black Mirror. It could be, and I think there's a lot of fear that it might turn into that. So this is one of the articles talking about it, from NPR — long may it live, while it's still around: "What's your public credit score? The Shanghai government can tell you." And there are a whole bunch of articles I could have used as a screenshot. The reason this one particularly resonated for me is because of the parallels — and this is also part of the parallels with the Indian biometric project — where these are the official posters. Ben has seen this; a couple of other people have seen this. The one on the right — and this is my interior hell — says: "Who are you? We have the answer." And I'm thinking, wait, what? So I don't exist until I'm in your database? This also feeds into the development narrative of: if you're not counted, or if you can't be counted, you don't count. It's very much census, documentation, inclusion in systems, microcredit, payment systems, mobile phones — all of these things function on the assumption that as soon as we can count you and you're inside the database, you exist, you have rights, you have entitlements. Stay outside of it? Sorry. Right. So this "Who are you? We have the answer" — like, come to us and we'll give you an identity — is one of those things that conflates identification with identity, which I find deeply problematic. And then there's this idea that the Shanghai government will tell you who you are and what your public credit score is. So the parallels — this is a paper that should get written, but I haven't written it yet. One of you who's listening will probably write it before me.
Are you writing it? Oh, good. One less thing to do. Awesome. Okay, I can do a brain dump of everything that's in my head and you can add it. Okay. Cool. My people will call your people. Okay. So the parallels between these were fascinating. You can leave now, because you know all this stuff. No, just kidding. No. So, just some other things from the social credit system — no, only because she might show me up as not knowing what I'm talking about. There are lots of oxymorons here, and cool buzzwords. "A social credit system is an important component of the socialist market economy system" — spot the odd pairing of words there. Its inherent requirement: "establishing the idea of a sincerity culture." Interesting — not what I join a social network for, or even expect the experience to provide me with. "Carrying forward sincerity and traditional values." It will forge a "public opinion environment" — Black Mirror — in which "trust-keeping is glorious." I mean, I'm sorry, but I can't read the word "glorious" without thinking of Borat and, you know, "make Kazakhstan glorious" sort of thing. But yes, trust-keeping is glorious. And it warns that the new system will reward those who report acts of breach of trust. Neighborhood watch? Stasi, anyone? The hubris of this is quite staggering. And, you know, it obviously goes way, way beyond either just the social or just credit. Some of the red flags for me — and Citizen Lab has just written a great paper about this, about five days ago. Did you write that? Oh my God. Okay, I know your name. I don't know where you live yet, but three points of data is all it takes, as Latanya Sweeney has shown. So it can be done. So, one of the red flags: the black-box algorithms have been labeled as trade secrets in the program. So good luck trying to find out what it all means. And government blacklists are being shared with private companies.
So all of the creepy blacklists that the government might maintain are being freely shared with the eight companies that are doing the pilots right now — which of course raises this question of reciprocity: if the information flow is going in one direction, what's to say it's not going in the other direction, from the social credit systems back to the government? It has this sort of unidirectional idea of trust, which, again, I think is very similar to the Indian system: trust is important, but only in one direction — we need to be able to trust users, because they're awful; but trust governments, they're great, no need for transparency, it's all good. They're very closed systems. And there's this whole question of how much weighting is given to data generated by the platform versus any other sources of data: does the provider of the social credit system actually weigh its own data and privilege it above anyone else's? And there's the lack of context in which this is done. I think one of the examples — in that paper, or maybe somewhere else — is about the woman who tries to use public transport using her son's public transport card and is branded a thief. If you had the context, that would not count as theft in the lawyer's sense of it; but without the context, the judgment is very, very different. And I also find this idea of gamification kind of creepy — this idea that it's playful, it's fun, it's delivered in-app, it's delivered by Alibaba, so it must be fun. It's Sesame — open sesame — it's all part of the whole thousand-and-one-nights fantasy. Does it normalize things like this? If it were an official government system, with biometrics and all kinds of things that signaled that this was something state-derived, maybe you would treat it with a little more respect, or fear — which I think is the correct response. But if it's gamified and it's provided by your social network, do you think it's banal? It's cool? It's not something to be scared of?
So these are the five dimensions: credit history, behavioral habits, ability to pay off debts, personal information, social networks. And all of these are really ambiguous, wide-open terms which are not really defined. So there's this idea that someone who plays video games for 10 hours a day would be considered an idle person — I'm like, my best friends play video games for 10 hours a day and are very productive members of society. And someone who frequently buys diapers would probably be considered a parent. I would hope so — in Asia you can't tell; people have vending machines for underwear, you just don't know. And some parents I know are the most irresponsible people. So these buckets are totally arbitrary. And Sesame Credit is one of the eight companies providing it — it's one of the implementations. They're very open about their links with the Chinese government; it's sort of like a nice stamp of approval. They work closely with all of the usual suspects. But apart from that, it's fine. Really, it is. So, jumping to another part of the world, as I said I would: this picture is from a whole series called "Bureaucratics" by the photographer Jan Banning, which was intended to visualize bureaucracy around the world — do I have that picture? Yeah, here you go. So this is what bureaucracy looks like around the world, and that picture was one of the entries for India. And I've shown it to many people, asking what they think. Some people say it looks like my office. But to me, what this looks like — and people always laugh when I say this — is big data. Because we forget that for a lot of the world, this is big data. It's not all shiny blue Matrix numbers trickling down the screen, with an ad in the corner that tells you more about my browsing habits than you need to know.
But when you think of big data, that's the image you get: it's sterile, it's antiseptic, it's clean, it's organized, it's structured, it's all kinds of wonderful things. It's not this — not this grumpy-looking woman thinking she'd rather be anywhere than here. But it is this, right? And the other thing I find really interesting in the difference between these two imaginations of data — anyone? Yeah. They're both really messy. Yeah. Anything else? Anything different? Apart from the numbers — this is true, yeah. Yeah. It's also messy. Yeah. Anything else? Yeah — scrutability, interpretability. Yeah. What's different? Yeah, that's the thing that really strikes me, right? This formulation sort of assumes there's no human involved anywhere. It's not about humans. It's not built by them. It's not for them. Whereas this one — for all its flaws, all its messiness and chaos, and, you know, the completely idiosyncratic filing system that exists in her head, and God help you if you try to find her and she's on her break, or away for two weeks for someone's wedding, like, good luck — there is still something essentially humane about this view of data for me. Because she's going to help you work your way around the system. She'll help you navigate and say: don't fill in that form, fill out this one. You're actually not eligible for this, but just say this instead and we'll make you, you know, eligible. There's someone to hand-hold you through processes that might seem completely opaque and weird and scary — you're a villager who doesn't know what this whole thing means — and they can help you navigate the completely impenetrable, inscrutable bureaucracy. But the other one, as anyone who's ever tried to call their bank knows, is a maze of "press one for, you know, rage." So — Travis Hall is a friend of mine who's now gone to the dark side and works for the government.
But back when he was a good PhD student, he wrote this great thesis about biometrics across three different countries and use cases, and one of them was the Indian system. And there's a great paragraph, his lead-in to that chapter, where he talks about how identification can lead to social sorting, but how it's often the drive for sorting that reflexively comes back to identification. And he talks about how this is intensified by India's democratic and socialist composition, right? The idea of democracy necessitates that you identify your voters, so they have a say and can participate in government. But the socialism part means there are welfare elements, there are entitlements — you need to treat people differently to be truly socialist. So the democratic and socialist attributes at play end up requiring a very close relationship between the state and its people, and this is where identity comes in. And this next one is too long for anyone to read, but it'll be somewhere on the interwebs, cached forever and ever in an infinite loop. This is from a wonderful paper by Ursula Rao and Graham Greenleaf called "Subverting ID from above and below: the uncertain shaping of India's new instrument of e-governance." It's a few years old now, but it's still very, very relevant. In this and the next slide, I've put these two paragraphs in full, because there's this idea that this system of identification is shiny and new and going to replace the messy, chaotic, unstructured system. But if you look at identity as a process and a practice and a negotiation, as opposed to a system, you'll find that the actual implementation of the project keeps running up against not just privacy risks and harms and the whole surveillance infrastructure being brought to bear on welfare.
It also runs up against the fact that it doesn't even work: the very people you're trying to include are being excluded because of their difference, right? So who do you want to pick up in the system? Not the middle class, who have 20 forms of ID. Not the rich people, because they're above the law. It's really the poor, undocumented people, the marginalized, who don't have any other form of ID because they were born in villages where we don't have a good system of registering births and deaths. They don't go to school. They're not registered. They have no health insurance. They're not fixed anywhere in the system. But they're the ones who really need the ID, because they don't have any. And yet they're the ones who — because they're manual laborers, because they have disabilities — can't actually be registered in the system. So Ursula — this is part of her ethnographic work, where she goes to see how the process is being implemented and what kind of violence it's inflicting on people — talks about how their fingers were found to be wanting, right? So you're flawed. It's not the system; the system's shiny and perfect. You, the human, are flawed. You don't quite fit. They'd try to get a perfect set of 10 fingerprints, and it would fail; the machines couldn't identify the unique contours of fingers damaged during a harsh life on the streets. And these are the people you want to include, right? Then began a struggle against dust — surprise, India's a dusty country — dust that had settled into the skin of manual laborers from decades of work. A wet towel was passed from person to person. "Rub your hands strongly," technicians would repeat, at times performing the procedure up to five times to get a detailed reading. It improved the success rate — I mean, the false positives and false negatives are still not within any sort of acceptable bounds — but it can't help you if you've lost fingers or fingertips.
Then they were told to wait for a specially authorised enroller. This person never showed up — this is like the Waiting for Godot of the identification world, you know? So there are all these ways in which you get excluded from the system. And the second thing was that gender posed a completely different challenge. I mean, there's one issue which often comes up with race in the biometric and identity context, which is that biometric systems are notorious for not registering darker skin and darker eyes; the error rates are way different than they are for white skin and lighter eyes. So of course, what do you do? You know, inflict it on a country full of dark people with dark eyes. What could go wrong? But it wasn't even that issue. It wasn't the racial element; it was actually a social thing. Women had little issue with fingerprints — their problem was less physical and more habitual. They couldn't get the photograph and the iris scan to match, because they're so used to looking down. They're used to being deferential, used to not making eye contact, right? A lot of these rural women also had to veil their faces. So faced with this idea of actually looking straight into a scanner, their bodies resisted the humiliating intrusion by blinking and producing streams of tears. So here you have all these government enrollers keeping boxes of Kleenex everywhere to deal with this. And they raised heads and pulled chins up and did all kinds of things to discipline nervous eyelids and to help the process along. Or the younger, unmarried sister-in-law was called to cover up for a freshly married, shy daughter-in-law who couldn't be entrusted to the crude hands of a non-related male. So you have all of these issues that, when you're formulating this techno-utopian wet dream, you're not thinking of. You don't think this is how it's going to be implemented in the field. So that paper is really worth reading.
So this makes me ask this question: are we moving from this sort of, you know, David Lyon, surveillance studies idea of the panoptic sort to a panoptic "sort of"? It might work. It might not. Some people get included, excluded, whatever. So this was the other piece, that Kevin Donovan and Carly Nyst wrote about privacy for the other five billion, where they were talking about, with limited resources, when all of these things are shown to fail, or at least, you know, not work very effectively, why do we support these systems, in voting systems and microcredit and all kinds of systems? Why do we use these when the offline, traditional systems might do just well enough? And they raise this really important point about how it actually serves a totally different meta function of obfuscating bureaucracy and obfuscating processes, where information is passed through proprietary applications and technologies (so yay Wikimedia, yay open source) that are closed to public scrutiny and audit. So it serves this completely different purpose, even if it fails at doing the thing it was meant to do. And another book I would recommend is Shoshana Magnet's book, When Biometrics Fail (she's at the University of Ottawa), about why they institutionally fail, and that that's by design. And it doesn't matter, because security theater is alive and well. So some of the patterns that I've been, oh, I've also mentioned security theater before. At least I'm consistent. So I've been looking at how technologies of surveillance, when you add them to these kinds of big social problems, especially big social problems at scale that you're trying to solve, you're going to end up with the welfare industrial complex. It's the same vendors. It's the same sort of, you know, tropes about, you know, efficiency versus humanity. And I sort of think of this like a Maslow's hierarchy of human rights: when you try and talk about privacy and security, people go, I don't have food, clothing, shelter, education.
Like, you know, what do I care about privacy? It's so far down my list of needs that I don't even know what that means. You know, hence my first slide. It's like, I don't think you know what I'm talking about. And the spoiler is that most things trump privacy: if you look at it on a sort of human rights scale, every other right seems to come first and is more immediate and urgent. Security theater is alive and well, because it actually doesn't matter if a lot of these systems fail; they're seen to be working. They're seen to be more efficient. It's like anyone who's ever encountered the TSA, you know. But if you try and critique any of this, if you even ask questions, if you're sort of critiquing, you know, this Morozovian formulation of solutionism or this techno-utopian narrative, you're the Luddite. You're anti-progress, you're anti-national. Why don't you want India to do well? You know, you're elite. You're educated. What about these poor people? You know, you don't want a tea seller to be the prime minister? I do. Maybe just not that one. You know, you're pro-corruption. Like, you want the status quo to remain. You don't want things to change. Transparency is great. And when you have a law or policy vacuum, it's going to amplify all of these problems that I've been talking about. You could do good with these projects, but are you also doing harm in the same process? So one thing I've been trying to stress is how privacy is actually a collective value. And now there's a lot of great research on group privacy, on the idea of demographics and categories being as deserving of privacy as the individual.
And this is especially a fallout of the sort of big data universe and the database nation, where even if our own individual data is not at stake, our own particular identifiable information is not at stake, by being a member of a class, whatever that group might be, we're being judged as if it was individual. We're being judged by the behavior of everyone around us, and the demographics will dictate what we're entitled to or not. If I'm a woman, I might not get certain kinds of mortgages. I may be entitled to certain things or not. Or, like, you know, you look for jobs, and Google might show high-paying jobs, CEO level and beyond, $250,000 and above, only to men or disproportionately to men, which they've now fixed. But Carnegie Mellon research showed that that was a real thing. So privacy as a collective value is something I've been trying to stress, which is one way of bucking this idea that it's an individualist Western idea and it doesn't apply to us. So I think group privacy is really huge in Asia, or it should be really huge, for that reason. And linking it to other values, because I think if you say just privacy, a lot of people hear secrecy, and they think that's a bad thing. You know, let people see my Netflix viewing history, what's going to happen? I have nothing to hide, nothing to fear, right, that kind of trope. But even if you don't care about privacy in its own right, which a lot of people in Asia might not, if you sort of join the dots and explain that without privacy, anonymity, agency and autonomy, they might actually lose other things that they do care about, like free speech and assembly. If you had to identify yourself every time you posted something online, Muslim women might never say anything. People might never do searches online to explore sexuality.
They might not try and meet up with other gay people in the region, because they think they're going to be branded, you know, and it's going to discriminate against them in some way, going to come back to bite them. And the one thing that I will say is, people treat poor people in developing countries as being really unsophisticated, naive users. They're not, because they have histories of oppression and being treated badly. They're the most suspicious people. So for example, one of the strangest conferences I've been to, a really wonderful conference, was when this ID project was launched. Maybe this was 2009 or '10 that this conference happened, in Bangalore, India, where the local LGBTQ community decided to have a meeting, saying, this project actually allows you to self-identify as male, female, or T, for transgender. Never done before in India at that point. But they were not, you know, about to drink the Kool-Aid. They were like, wait, are we now on a list somewhere? Yes, it's inclusive. Yes, it's progressive. It's liberal. It's wonderful. It's a great thing for India to do, if implemented well. But we're not so sure. So I was the only lawyer in the room, discussing it with people who had been bused in from rural areas. So it wasn't some hipster LGBTQ convention. It was actually poor people who were still very unsure about coming out, didn't quite know what it meant, were wondering if, by getting into this bus and being brought unmasked to a city, they were being bused somewhere not good. And it was this wonderful conference where people were asking me, as the single lawyer in the room, really super weird questions, with such trust. They were saying things like, so as long as I didn't have to identify as male or female or anything in the system, as long as I didn't have to be fixed in a register, I could be anything depending on what dress I felt like wearing on the day, or who I was going to meet, or depending on the context.
But suddenly you're asking me to be fixed and stable in a system for all time as being one of these things. So, as the oldest son in a Hindu Brahmin family, I inherited three pieces of land. What happens to my property rights if I self-identify as female or transgender? Do I lose my land? And they were like, so the government has thought about this, yes? Where's the paragraph that refers to these kinds of rights? And I'm like, no, it hasn't. Or all kinds of other questions came up, saying that in the process of even registering, they'd had to go through full body searches, because really sick policemen had decided that they didn't believe they were men or women and wanted to actually do a full frontal examination. So even the process of being included in the system is a really undignified, violating process. So it was just something where I was trying to say, look, you may not care about privacy itself, but you may care about inclusion. You may care about free speech. You may care about all kinds of other values that are violated if your privacy is not maintained. So this is something we're trying to do. And so what do I want, for Halloween, Christmas, anything, Chinese New Year? I would love to reframe the debate. And it's something I'm doing, even though it doesn't sit entirely well with me, especially this idea of showing the business case, or the sort of commercial reasons why privacy matters, or why it's a competitive advantage, or why it's good for business. It's not my natural inclination as a lawyer to do that. But I think that when you're dealing with newly capitalizing countries and people trying to come up the value chain, it's actually a very compelling narrative, when the human rights discourse has left them wanting. They haven't benefited. They've been screwed over and over again. Maybe the business case will actually bring them to a more inclusive, just space. And, you know, maybe showing them the money will actually help.
So it's something that I've sort of had to adapt to. So I kind of like the idea of reframing it. I'm trying to do a lot of work to make privacy cool and banal and ubiquitous, because it's kind of like, if you're the only one using Tor, then you're the target. You're the weirdo. But if everyone's using it, you're lost in that sea. So if everyone's using privacy-preserving technology, which, thanks to certain recent developments, everyone's on Signal. So, trying to give it some kind of a makeover. And I should have mentioned this earlier, but when people keep saying it's not an Asian value, I've sort of gone back to various scriptures and foundational documents in different cultures, and with other people, because I can't be an expert on every country. But Jill Bronfman at Hastings has done a really cool paper with Tiffany Li and others about saving face as a formulation for Chinese privacy law, saying maybe it's not so much about secrecy; maybe privacy actually preserves societal spaces and, you know, your role within a community. It helps you maintain that, and it actually saves face. So there are different ways. Or, like, in India, if you look back at some old scriptures, there are very clear norms within, say, the design of buildings, architectural guidelines about where your windows should be situated so you're not looking at your neighbor. There are things in the Bible about not coveting your neighbor's wife.
So there are other sorts of sources. Yes, there's the UN human rights framework, and there's European law and the fair information practices and all kinds of other wonderful enabling charters, but I think if we're able to reframe it in ways that have local cultural salience, linking it back to things that are culturally relevant and that stick, we have a much better chance of actually making it useful and relevant. And we should be looking at viable alternatives, because I think we've seen that for a long time people just thought, well, data's there, it's up for grabs, we can take it, it doesn't belong to anyone. It might derive from people, it might be about them, but there's no sense of ownership or property interest. Slowly the community has moved towards saying maybe we need to ask them, but of course consent, as we all know, has become meaningless. If you want to do something quickly, you're going to click through anything. Can I have your unborn child? Yes, yes, have all of them. So consent is broken, and especially when you're dealing with people for whom informed consent is moot, what does that even mean? So we need to look at ways that go beyond consent, that revise the use of defaults. Like, how do you bake privacy and security into the infrastructure of these systems? How do you make that the default position, rather than expecting people to have a PhD in how to navigate privacy policies to then choose privacy settings? People like the Tactical Technology Collective in Berlin have done great work on visualizing this and giving you alternatives, with Me and My Shadow, and with their exhibition The Glass Room in New York. They've really helped popularize this debate, and we're hoping to do more stuff with them in Asia, and sort of recalibrate and keep doing this till someone's listening. So this is my last slide. So what do I think of in terms of the future of privacy in Asia?
I think the competitive advantage does help, and I mean not just in terms of companies competing to provide the next Blackphone or the next Lavabit, but even in terms of countries being more privacy-preserving and having better data protection laws, especially with the new GDPR that's come out in Europe. A lot of countries are embarking on a sort of navel-gazing, saying, do we need to bring our laws up to date, make them more relevant and fit for purpose, especially with big data, with artificial intelligence, with automated processing? Privacy by default and privacy by design, I think that's becoming more and more relevant. More finely grained ideas of ownership and access. I think this whole decentralization trope of, should we just give users control of their data? And I think framing really matters here, because if you frame data as the new gold, the new oil, the new whatever, you're going to see value in it. But if you start thinking of it as data as kryptonite, data as a liability, then the more data you have, the more harm there is when it's breached, and the more you as a company or a government are on the hook and responsible for data breaches and hacking, which we're seeing every day. So maybe actually devolving it back to users helps align interests, where users want control, and governments and companies will also want to get rid of control as they're made to see that with great power comes great responsibility, says Spider-Man. What about the role of labels? You know, Creative Commons has really reformed the copyright landscape and made something very, very abstract very accessible. Can we do that with privacy? There's been a lot of work on privacy seals in the past, not all of them effective, but, you know, like with fair trade, non-GMO, organic labels, can we actually show which products and services are privacy-preserving or privacy-unfriendly?
And this idea of a data commons. I think one thing I keep coming up against is people saying, you're so selfish, why wouldn't you contribute your data if it helps solve cancer, or if it helps, you know, with flu trends? We know the Google Flu Trends thing didn't really work, but, you know, what if it helps larger societal goals? Aren't you being selfish by locking down your data? But maybe there's more work you can do, you know, with responsible data and different ethical norms, to actually allow some kinds of data to be in the commons on certain terms, where it can be utilized for the public good. And look at different alternative models: you can either hide by locking down data, or you can hide by putting so much junk out there that it's meaningless, like Helen Nissenbaum and Solon Barocas's great work on browser plugins that, every time you do a search, will also do like 700 other searches that are for totally different things. So good luck trying to make any sense of what you actually want. Rethinking this whole business model of advertising, which is basically a surveillance model, and ways to disrupt data flows between governments and companies, which, as we've seen with the social credit system and the Indian ID system as two use cases, are already blurred and getting increasingly blurred. So we may still end up with the idea of privacy as a luxury, but you guys are great at disrupting everything, so I hope you can give us some ideas on what we could be doing differently. And I'm trying to catalog what's being done differently in Asia to see if that can help inform what we're doing here, and sort of keep the hope machine running. So thank you so much for listening, and happy to take questions. Yeah, you need to keep it on, because you have to answer a couple of questions, I think.
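The obfuscation approach described in the talk, drowning a real query in a stream of decoys the way plugins like TrackMeNot do, can be sketched in a few lines. This is only a toy illustration, not TrackMeNot's actual implementation: the decoy pool, the `obfuscate` function, and the batch size are invented for the example, and real tools draw decoys from evolving word lists and add timing jitter so the noise looks human.

```python
import random

# Toy decoy pool, invented for illustration. A real plugin like TrackMeNot
# seeds its decoys from RSS feeds and popular-query lists so they look
# like plausible human searches.
DECOY_POOL = [
    "weather tomorrow", "banana bread recipe", "used bicycles for sale",
    "train timetable", "movie showtimes", "history of tea",
    "python list comprehension", "cheap flights", "garden pests",
    "how tall is mount everest",
]

def obfuscate(real_query, n_decoys=7, seed=None):
    """Mix one real query into a shuffled batch of decoys so an observer
    of the outgoing search stream cannot tell which query was intended."""
    rng = random.Random(seed)
    batch = rng.sample(DECOY_POOL, n_decoys) + [real_query]
    rng.shuffle(batch)  # the real query's position reveals nothing
    return batch

queries = obfuscate("sensitive health question", n_decoys=7, seed=42)
print(len(queries))  # 8 queries leave the machine; only 1 is real
```

Issuing 700 decoys per real search, as described above, is the same pattern with a bigger pool: the privacy comes not from hiding the query but from making the record of queries worthless to a profiler.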
Thanks so much for outlining all of the things that may go wrong with privacy, but also giving us some hope for the future of privacy in the end, I guess. I think at the end of the day, policies around privacy, as you just described, have to work for the people, and they have to be workable for the new people who come online as well, right? And this is probably something that we think about here as well as we reach new audiences. And with that, I want to open the floor for questions here. And I think we've got about 15, 20 minutes for this. Yeah, thank you for the very enlightening slides. In your earlier slides, you mentioned privacy issues between Indian and Chinese culture. I come from India, and so I think that was a wonderful slide, the one about how they don't see privacy as a right. Since you're an attorney and a PhD candidate, I want to ask you a question: would you consider privacy a right or a privilege? You people don't make this easy. I think it's a right, which isn't to exclude it being a privilege. I think it can be both. I veer towards the right side of the spectrum, because I have an overdeveloped sense of justice. But that's just me. I think it's a right because the data relates to you. And I think it's not just about data; we tend to talk only about data, the sort of territory of informational privacy, but there are all kinds of other ideas of privacy, like spatial privacy and physical privacy, bodily privacy, and I think those are rights. The sanctity of your own self, your body, your intellectual privacy, I think those are rights. More questions? Hi, here in the back. Sorry, I'm standing like... Hidden by pillars. Worst sight line ever. A couple of questions. One, as an interloper here at the Wikimedia Foundation.
I'm a librarian, and I'm wondering, in Asia, whether it's India, China, Japan, Singapore, Indonesia, etc., do you have intermediaries that are trying to educate people about privacy the way American public librarians are trying to do here? Or, I mean, is it a function of the school system? Or how do people learn about their rights if they're just sort of going along on a day-to-day basis, deciding which apps to use, which services to sign up with, etc.? That's a great question. Libraries don't have such a big role in this. As far as I know, I haven't really come across good work being done with libraries as a sort of focus of the community, but there are a lot of NGOs and civil society actors who play that role. Not enough, because, like I was trying to say, privacy is not such a big issue for a lot of people. But one thing that's been interesting is a lot of people working on accountability and transparency have done really great work in being intermediaries when it comes to filing the equivalent of FOIA requests under the right-to-information framework. They've come up with really good ways to bake in privacy, because in a lot of countries the privacy authorities and the information authorities are the same office, or it's the same person doing both roles. The right-to-information movement has actually been very good at baking it in, because they realize that the people filing the requests are often being harmed. Like, if a journalist is asking questions about expenditure on a public project, how much was spent and which vendors were looked at for building this dam or this new metro station, those people, when they ask inconvenient questions and file too many of these requests, are either being threatened or actually physically harmed, and there are a few people who have been killed as a result of this.
So they've actually realized the value of anonymity in making these requests and have come up with tools where they disintermediate and decentralize the process, such that a team of about seven or eight different people will file different parts of the application and fill them in, and nobody knows who the rest are. So this is sort of a low-tech version of Tor, like different people handling little bits of the packet stream. So they've done really great work, and in the process of getting information requests filed, they've actually explained why privacy matters and how it actually helps with human security. So there are those kinds of intermediaries that are serving as sort of brokers for this conversation. There are a lot of civil society actors who are doing the usual sort of hackathons, cryptoparties, you know, digital security training for journalists and for activists. There are a few, not enough. I really like that you've mentioned libraries, because they could play a huge role in this, but increasingly people aren't going to libraries. They're getting defunded like everywhere else, and they're no longer centers of local activity. So did you say you had more than one question? Does anyone else have a question? I'm willing to put it aside. So this may be common knowledge, so excuse my ignorance, but have there been any major data breaches in Asian countries, and were the reactions to these breaches different in, say, South Korea, Japan, China, India, for example? Is there, like, a noticeably different public discourse about data breaches? There are a lot of data breaches.
They're not on the scale of, say, the US government employees hack, but there are a lot of small instances, and it's very much in the public domain how easy it is to actually get all of this data, this idea of gray markets in identity where you can buy all of this information. Even with the social credit system, some of this information might be leaking out, and it's not just this idea of breaches but also this idea of gaming the system, by buying likes and by buying other ways of changing your profile. Even with the Indian identity project, you can get fake IDs, you can get data about these people very easily, and again it's one of those horrors of decentralization in that system: because of the multiplicity of actors who are doing the enrolling, who are doing the scrubbing, the deduplication, and then feeding it into the, you know, the stream of the system, all of them are multiple points of failure, and they're leaking the data quite easily. So there are instances of it. I haven't actually done a survey across countries, but I'm sure someone has. I can actually look it up and send you some information from our network. You mentioned China, Shanghai, I guess India and one other country. What about the rest of the world as far as privacy is concerned, or the rest of Asia? Is it all this way, or are these three countries just the exception? On identity, I meant to actually put up this big table of all the different countries that have embarked upon systems like this. Graham Greenleaf has done a great analysis of this. So identity schemes, as one example, are rampant across all of Asia. Most countries have a system. Some, you know, the attributes are different: some have fingerprints, some don't; some are a card, some are not; some are physical artifacts, some are just a number, like the social security number.
So something like an identification system is quite common across the region. So that's very common, but on the different issues relating to how they look at and treat privacy, there are some things that are very similar, some things have local, regional flavors, and there are different levels of sophistication around their legal systems as well. So Japan and Korea have pretty robust and strong data protection laws. India doesn't have a very strong system; it doesn't have a dedicated data protection law, kind of like the US, where protection comes from sectoral protections. You know, it could come from consumer credit, it could come from health regulations. So it's sort of a patchwork. Malaysia has some. So they vary, and I think the levels of sophistication around thinking through these issues also vary. But I think on the whole, if I had to make a few generalizations, this idea of data belonging to people, and you having a property interest in it, or having the need to control where it goes and who gets to see it, that's not a very evolved discourse. People don't see themselves necessarily as having rights in the data, and that sort of level of understanding around how that data, when it's recombined with other pieces of data, can form a very rich picture that can actually affect your opportunities and chances, that's not a very well understood phenomenon. Hi. Hey, I have a question about private actors, in terms of product development. You know, there are a lot of interesting technologies and mobile apps out there right now with respect to messaging apps like Line and WeChat. And I feel like the way they approach privacy, at least for WeChat, was very different. It's not that you don't have a friend circle; I mean, you have friends, but you can't see other people's comments unless you're friends with them. Whereas Facebook has a little bit different kind of a privacy model.
I was wondering, for those kinds of technologies, do you think that's affected by how the companies view privacy in those cultures? And, you know, if Facebook wants to develop better products, or Wikimedia wants to develop better products, is that something they have to consider, having a different privacy model? I think one thing I am struck by with things like WeChat is just how comprehensive they are. Something like Facebook you still think of as essentially a social network. Yes, you're doing all kinds of things with it, but you're not buying things in Facebook. You're not ordering things, and you're not sending people money and, you know, placing e-commerce transactions. So that idea of WeChat as essentially a one-stop shop for everything, or like Taobao villages, that whole idea is, I think, way more advanced. And one thing I keep hearing from people in China, for example, is when people talk about censorship, they push back and say, oh, but censorship has had this unintended consequence, that big companies like Google and Facebook couldn't enter the market and provide their products and services.
Local innovation has filled the gap, nature abhors a vacuum kind of thing, and they've actually met that challenge in a totally different way. Chinese apps and services are actually way more sophisticated than anything you get in the West. Like, you could literally do everything and never leave the app, use your phone for every single thing. And I think the other different thing is how it's trickled down. It's not an elite phenomenon in urban concentrations; it's something where, you know, the local guy selling lychees and rambutans in a village in the middle of nowhere is accepting payments on his phone. So that trickling down of these services is, I think, what's really huge and different. And for someone to now come and compete with that is a really big uphill challenge. Or even, like, you know, Uber sort of conceded to Didi and was like, okay, we're out of here. So I think competing with local products is going to be a little difficult, but I'd be interested to see how that plays out. And you asked about their views on privacy and how that actually affects it. I think historically they've just assumed they didn't have to care, but now, with this whole sort of privacy-policy-washing and corporate social responsibility and other reasons why people are being pushed to do this, I think they're getting a little more socially aware in terms of how they compete on these grounds. And trust, as we showed with the social credit system, is a very fundamental part of that. And I find it fascinating. Something like the social credit thing, it's not just that, yes, it's providing a credit rating system in countries where you don't have things like Experian, you know, sorting out the market, but that it's being linked to things like dating, and people are very comfortable with the idea. Like, I think some of the quotes from, you know, young Chinese
people are, yes, of course it matters that a guy is good-looking, but what is his bank balance? So they're being plugged into dating apps, right? So the social credit thing plus dating apps is like, ah. But, um, yeah, so I think these different norms around it do play a role. Hi there. I wanted to ask about your challenge, your prompt, to make privacy cool again. And I wanted to ask, what's uncool about privacy now? Like, where are we starting? Okay, baselines. Um, I think what's uncool is people say millennials don't get it, young people don't care. The biggest argument is, you're on Facebook, what privacy do you have left, right? This idea that when information is already out there, what are you taking back? So to them, it's not so much that privacy is uncool. I mean, I want to make it cool as a sort of incentive, not saying that uncool is a deterrent. But more than uncool, people think it's anachronistic. People think that ship has sailed. Like, what are you trying to protect? You know, the horse has bolted. So it's this sense that young people don't care, which is wrong; danah boyd and everybody else have been writing about it. And it's sort of like, if young people don't care, why is Snapchat a thing, right? Of course they care. Maybe they're navigating privacy in different ways. And, like, I always ask kids in Asia, when they tell me that, you know, they don't care about privacy because their whole lives are being lived and played out online, they're like, we don't care. And I'm like, do your parents know you're dating a Muslim? And they're like, ah, no. Or, do they know you're gay? Have you had that conversation with your parents? They're like, of course not, I'm not losing my inheritance. So I'm like, well, you have some secrets, and they're okay. Like, everyone has. And Daniel Solove, I think he was so annoyed by this nothing-to-hide, nothing-to-fear kind of logic that he put out this challenge to people on his blog, saying, give me really good
responses to this "if you have nothing to hide, you have nothing to fear" question. And people wrote in with all kinds of things, like, if you have nothing to hide, do leave the house, you know; or, you need a life; or, can I have your credit card number? So, you know, there are a lot of really good responses. But I think it's this sense that it's not a value anymore, it's not something people care about, it's old-fashioned, it's so yesterday and passé, that it has no salience in a life where everything's out there. And I think Joseph Turow and some others at Penn, at Annenberg, actually framed it a little differently. They said it's not so much that people don't care about privacy. Like, if you do all the behavioral analysis, if you look at what people do versus what they say, you would think people don't care about privacy. But their explanation for that was not that people don't care, but that they feel helpless in the face of negotiating this. They're like, how do I even get offline? How do I unwind it? How do I take it back? How do I negotiate these terms that I don't have the bargaining power to change? So they said it's actually a feeling of futility and helplessness, as opposed to a lack of caring or a lack of it being a value that's relevant anymore. So I think there are different ways you can look at it, but I think making it cool, I'm thinking of it as a positive thing, because if you frame it as a human right, some people care, a lot of people don't. But it's sort of like Jillian York from the EFF: when she and, I think, Jacob Appelbaum gave a talk at the Chaos Communication Congress in Germany, they were saying, think of it in terms of digital hygiene. You wouldn't have sex without a condom; why would you communicate with someone without encryption, right? So they were trying to say it's the most banal, obvious thing. Like, why wouldn't you think of encryption as a digital
condom, right? So they weren't trying to say it was uncool; they were trying to say that maybe there are easier frames than saying it's a human right and we must uphold it. It's just saying: dude, it's common sense, why wouldn't you do it? So when I say I want to make it cool, I want to make it a fun thing to compete on. I actually think hipsters play a huge role in this: you want vinyl, you want to go back to a world of fixies, so why can't you also use pen and paper and not put everything in the cloud? You can come at it through pop culture and all kinds of other ways, which actually makes it relevant to people and part of their daily practice.

Not everyone wants to sit and think about this. If you look at the research, like Aleecia McDonald and Lorrie Cranor's paper on the cost of reading privacy policies, they actually quantified how long it would take, I think it was something like 83 days a year for each of us, every year, just to read privacy policies and understand the terms, for an average user, not a super nerd, and the huge opportunity cost to the economy of people reading privacy policies instead of actually doing something. So I'm trying to say: let's move away from having to understand how the sausage gets made, and just like the sausage for its own sake. Terrible analogy.

I think I have an ally there. So we have time for one more quick question.

The question I was going to ask was: have you seen any models where data is shared and then the usage is what's monitored and controlled, versus the data sharing itself being monitored and controlled? A lot of people at this point are struggling with the idea of what data they can or cannot control the leakage or sharing of, but the question becomes what it's used for, whether it's used for credit scores, or for giving them access to privileges and discounts,
or for identity management. Are there any good models you've seen in Asia that have been interesting, that most of us out here probably haven't heard of?

I haven't actually come across them in Asia so much. There's that whole layer of data vaults and data brokers, where you can contribute your data and it's held by a trusted fiduciary, and then you can pick preferences saying, I'm happy for my data to be used for cancer research, but not for any other kind of stuff, not for genetics. So you can pick and choose, you have a menu of options, and they can navigate that and sometimes even get you money for participating in surveys and research, instead of the sort of predatory, exploitative model of "it's all ours and we can do what we want with it because it's feeding the machine." So there are good data broker models, and there are good distributed systems, anonymous credentials and other technologies where the user has control and can decide. Things like U-Prove, which Microsoft bought and then never did anything with. There are really great approaches where it's about the credentials and attributes rather than the personal data, because that's usually what a lot of companies are interested in: not just your name and where you live, but your name, where you live, and what that says about what you might buy tomorrow. I actually don't know about any Asia-specific products in this space, but thank you for asking. I should check; more work, maybe a PhD, about that. I'm good at not finishing PhDs, so happy to have three more.

Anyway, thank you so much. Thank you for taking the time and coming out here to talk to us about privacy. Thank you all for coming out tonight. We're planning the next event in, I guess, about three months from now; we're doing this on a quarterly basis, and we'll make sure to let you know about that. Thanks so much.
Have a good night.