Welcome to INET's webinar, "COVID-19 and Surveillance Technology." It is a topic that I believe is on many of our minds, as technology plays an ever greater role in our lives in the wake of the crisis. Our speaker today is Bruce Schneier, who may be familiar to some of you from his blog Schneier on Security and his newsletter, Crypto-Gram, both of which are very widely read, contributing to his reputation as a "security guru," in the words of The Economist. Bruce is a fellow at the Berkman Klein Center for Internet and Society at Harvard University, a lecturer in public policy at the Harvard Kennedy School, and a board member of the Electronic Frontier Foundation. He has written over a dozen books, the most recent of which is Click Here to Kill Everybody. We're delighted that Bruce can be with us today to help us make sense of some of the issues around privacy and security, and what to some of us seems like a pretty stark trade-off: between the clear benefits of using technology to navigate the issues raised by the pandemic, and the threat of a shift closer to a surveillance society as expectations of privacy are slowly eroded. The trade-off is perhaps starkest when we consider digital contact tracing, which has been used extensively in several countries, sometimes with fairly extreme enforcement approaches like GPS tracking bracelets that inform health workers when someone steps outside a quarantine zone, so they can be told to return. The apps being discussed in the US, and already being used in parts of Europe, are of course supposed to be less invasive and opt-in. But even with the supposed safeguards that will be put into place, there is some question as to whether the benefits of these apps are sufficient to outweigh the costs when we consider factors like asymptomatic transmission. Bruce will talk about this and the broader issue of surveillance and privacy for about 15 or 20 minutes, and then we'll open it up for questions.
At the bottom of your Zoom screen, you will see a Q&A icon. You can type in your questions there, and we'll get to as many of them as we can in the time we have. So Bruce, over to you.

Thank you, and thanks for showing up virtually. This is our new normal and we're making the best of it. I want to talk about privacy, security, and contact tracing, both specifically and generally. There are a lot of issues here and it's worth teasing apart what they are. I've written about contact tracing less about privacy and more about efficacy, because the first question for any of these electronic systems is: is it any good? Is it effective? Is it going to do anything positive? And I maintain that contact tracing apps are effectively useless, and that the fact that we're considering them is really just tech solutionism and doesn't reflect any actual defense against the pandemic. I want to walk you through the reasoning. Contact tracing apps are very much like an authentication system: they're trying to figure out whether something is true or not. With any of these systems, you look at accuracy in two ways: in terms of false positives and in terms of false negatives. So imagine an ATM. You have an ATM card, and you look at its security from both of those perspectives. A false positive: can someone else get money out of my account? That would be the ATM mistaking someone else for me. If it can do that, it fails as a system. But there's another failure mode that's actually even more important, and that's a false negative: when I go up to my ATM, will I be denied my money? That will actually happen more often, just because the number of legitimate requests vastly outnumbers the number of fraudulent requests. And in any of these systems, an ATM, the fingerprint ID on my iPhone, there are false positives and false negatives. Can someone else open my iPhone with their finger, pretending to be me?
And is there a point where I can't open my iPhone because the phone doesn't recognize that I am me? So with that model, let's talk about contact tracing. The app basically knows where you are, knows who's near you, and registers contacts. I'm going to make this up: a contact is more than 10 minutes at less than six feet. That's what the system will register. We can build in different privacy safeguards; we can decide where that data goes and who gets to see it. All that aside, it's registering what it defines as a contact. So let's think about the false positives: registered contacts that aren't actually a concern. One source is inaccuracy, in both the geographical location of the phone and the proximity sensing. The phone actually uses three systems for this. There's the cell system. Very coarse; it tells you what cell you're in, and it has to work, otherwise this phone can't ring, because it doesn't know I'm here. There's the more accurate GPS system, which you've used if you've used any kind of navigation. It's pretty accurate, but it makes mistakes. I play Pokémon Go on this phone; I know how GPS drift works, and I see it all the time. And lastly there's Bluetooth, the system you use to communicate with other objects close by. There's no location, but signal strength is a proxy for proximity, and that's what these apps use. All of those have error rates. There are a lot of times this will register that you're close to somebody when you're actually not. Another source is that the phones don't know about extenuating circumstances: times when I'm less than two feet from somebody else for a period of eight hours, but we're separated by a hotel room wall or a glass partition. I could be in a car and someone could be outside. There are lots of times when we are in close proximity, but it doesn't matter for disease transmission.
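The "signal strength as a proxy for proximity" problem can be made concrete with a toy calculation. This is a minimal sketch using a standard log-distance path-loss model; the constants (assumed RSSI at one meter, path-loss exponent) are illustrative assumptions, and in real deployments they vary with walls, bodies, pockets, and phone orientation, which is exactly the error source being described.

```python
import math

def estimate_distance_m(rssi_dbm, measured_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate distance from a Bluetooth RSSI reading using a
    log-distance path-loss model.

    measured_power_dbm: assumed RSSI at 1 m (device-specific; illustrative).
    path_loss_exponent: ~2.0 in free space; higher indoors with obstructions.
    """
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same RSSI reading maps to very different distance estimates depending
# on which environment you assume, so any fixed "six feet" threshold will
# misclassify some encounters in both directions.
rssi = -75
print(estimate_distance_m(rssi, path_loss_exponent=2.0))  # free-space assumption
print(estimate_distance_m(rssi, path_loss_exponent=3.5))  # obstructed-indoor assumption
```

With the free-space exponent the reading above suggests roughly six meters; with an indoor exponent, under three. An app can only pick one calibration, so some fraction of its contact decisions are necessarily wrong.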
The other source of false positives is that there are contacts that don't result in transmission. It is not the case that less than six feet for more than 10 minutes equals disease transmission; it equals a possibility of transmission. And we're actually much more sophisticated about how the disease transfers now than we were even a few months ago. We know that airflow matters, that the ambient amount of particles in the room matters, that inside versus outside matters, that whether you're speaking or singing matters, that whether you're wearing a mask matters. None of that is taken into account. So this thing registering a contact doesn't mean I have the disease. Then there are the false negatives: me getting the disease without the app registering a contact. We have the same inaccuracy problem; the phone doesn't always know where I am and who I'm near. Not everyone has the app: even Singapore had only about a 20% adoption rate, and you need 80 or 90 percent to make this a useful system. Lots of people you're going to be near don't have the app. And then there are transmissions without a contact. We've all watched the animated videos of a sneeze and how far the particles travel; it's way more than six feet. So there will be lots of times you get the disease without this phone registering a contact. Given all that error, here's the question. I have the app, I go out grocery shopping, I come back, and this thing beeps. Does that mean I have the disease? No. Does it mean I should quarantine? Probably not. It doesn't mean anything useful. Similarly, I go out and this thing doesn't beep. Does that mean I'm safe? Kind of, but not really. And the most important thing we have in this pandemic is trust, and we'd be squandering that trust on an app that doesn't work. Within three days there will be Twitter posts: this thing didn't work, it told me I was sick and I wasn't, it didn't register that I was exposed.
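Both failure modes above can be sketched with toy numbers. Everything here is an illustrative assumption rather than measured data: the coverage function reflects that both parties of a contact must be running the app, and the Bayes' rule calculation shows why a beep carries little information when genuinely risky encounters are a small fraction of everything the app registers as a "contact."

```python
def contact_coverage(adoption_rate):
    """Fraction of pairwise contacts the app can even observe:
    both people must have it installed and running."""
    return adoption_rate ** 2

def alert_ppv(risky_fraction, sensitivity, false_alert_rate):
    """P(genuinely risky exposure | the app beeps), via Bayes' rule.
    All three inputs are illustrative assumptions."""
    true_alerts = risky_fraction * sensitivity
    false_alerts = (1 - risky_fraction) * false_alert_rate
    return true_alerts / (true_alerts + false_alerts)

# Singapore-level adoption vs. the 80% target:
print(f"{contact_coverage(0.20):.0%} of contacts observable")
print(f"{contact_coverage(0.80):.0%} of contacts observable")

# If 1% of registered contacts are truly risky, alerts catch 90% of those,
# and 10% of non-risky contacts still trigger an alert:
print(f"{alert_ppv(0.01, 0.90, 0.10):.1%} of beeps reflect real risk")
```

At 20% adoption only about 4% of contacts are even visible to the system, and under the assumed error rates more than nine out of ten beeps are false alarms, which is the "doesn't mean anything useful" point in quantitative form.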
So there's no value in that. There is value in contact tracing: where it's done effectively (South Korea, the Boston area), it's manual. It involves trust, it involves interviews, it involves people. It is not an app. There's a lot of tech solutionism here. So let's put that aside and think about where we might want to use technology. There are three things I hear about when I hear about tech and COVID. The first is contact tracing. The second is aggregate statistics, which seem to be very valuable. There's a company that makes internet-connected fever thermometers, and they've been posting hotspots of fever around the country based on their aggregate data. Really interesting. We see charts on how much people are staying indoors, based on how they're using Apple Maps for driving or walking directions. We can know a lot about general movements: very anonymous, very valuable. The third thing you hear talked about, under a lot of names, is immunity passports. Is there some sort of digital document I will need to get into a movie theater, a restaurant, a dance club, that proves I have immunity? We haven't seen those yet, but they're being talked about. In a sense they're no different from credentials we have today. Right now I have a driver's license, which is the age verification card I need to get into a bar. These are similar things; I don't think the mechanism will be any different. When you think about any of these systems, there are a bunch of general privacy principles that we apply. Is it effective? And more importantly, is it proportional? We're in a global health crisis; extreme trade-offs that we wouldn't make in normal times, we might make today. If contact tracing via app were effective, we would all probably say, yeah, that's a good idea.
Now, today; not last year, maybe not next year, but now. So: is it effective, and is it proportional? And then we want to see six more things. We want to see consent: does the user consent to it? We want to see minimization: is the data minimized to the extent necessary? Is it secure: is the data secure? Is the system transparent: do we know what it's doing and how it's working? Is the system biased: does it disadvantage some group one way or the other? And lastly, is it temporary: is it something we can turn off when the crisis is over? Those six things, or eight if you count efficacy and proportionality, are how we should evaluate any of these systems. That's not just true here with COVID; it's true everywhere. Now, a lot of people are writing about how contact tracing can undermine privacy by putting us all under surveillance. But we're already under surveillance. This device knows where everyone is. It knows when you wake up and when you go to sleep, because it's the first and last thing you look at. It knows who you sleep with. Google knows your innermost hopes, fears, and dreams, because you search on them. Google already knows what kind of porn everyone, everyone, likes. These apps aren't a difference in kind; they're just a new usage. Surveillance is the business model of the internet. There's no change here: you're adding one app to the 50 on your phone that already track you. So what? But I want to close with an interesting interplay that I think is increasingly important. Data has value to us as a society and to us individually. Contact tracing (let's assume it's effective) asks you to make this choice: your data, aggregated with everyone else's, will help us fight the disease. Yet your location data, who you talk to, who you're with, is incredibly private and personal. That's a balance, and a similar balance appears in a lot of other places.
When I used to drive places, I used Google Maps, which would route me around traffic. That worked because everyone who uses Google Maps is under surveillance; that's how the system knows where the traffic is. The group value is that we all get where we want to go faster. The individual cost is our location information. Now think of Facebook. Facebook will give you free access to talk to your friends, have your communities, do all that cool stuff; yet they want to know everything about you to target advertising. That's the trade-off. More generally, in health, I think there's enormous value in taking all of our medical data, putting it in one database, and letting researchers at it. The benefits would be enormous. At the same time, it's incredibly personal, intimate health data. In all these cases, the same question: how do I balance the value to me as a member of the group against the value to me individually? And there's no one answer; it's very domain specific. Take Google Maps; I can make this up. The data is only valuable for 10 minutes, so you don't save it. You don't need to know who it is, so it's all anonymous. And you don't need all the data, so you sample one in ten cars. Done; I've kind of just solved that one. Medical data? That's not going to work. I need everybody's data, by name, forever, in a database. Everything. So I can't use any of those solutions; I have to do something else. I secure it very, very strongly. I only allow queries that have been pre-approved under strict guidelines and contractual obligations, where research questions are approved in advance and only aggregate data is provided in return, with some rules about unmasking. I'm kind of just making this up, but it'll be a wholly different set of trade-offs.
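The three minimization rules just described for traffic-style data (short retention, anonymization, sampling) can be sketched in a few lines. This is a hypothetical illustration, not any real pipeline; the field names and parameters are assumptions.

```python
import random

def minimize(samples, now, ttl_seconds=600, sample_rate=0.1, rng=None):
    """Sketch of data minimization for traffic-style location reports:
    keep records for ~10 minutes, keep a random one-in-ten sample,
    and strip every identifier."""
    rng = rng or random.Random()
    kept = []
    for s in samples:
        if now - s["timestamp"] > ttl_seconds:
            continue  # the data is only valuable for ~10 minutes
        if rng.random() > sample_rate:
            continue  # sampling one in ten is enough for congestion estimates
        # keep position and time only -- no device or account identifier
        kept.append({"lat": s["lat"], "lon": s["lon"], "timestamp": s["timestamp"]})
    return kept
```

The same function obviously cannot serve the medical-research case, where records must be complete, named, and permanent; that's why the safeguards there have to move from the data itself to access controls on queries.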
And it'll be yet a third set for contact tracing, and maybe a fourth set for free internet services in exchange for advertising, and a fifth set when we talk about data the police might want to stop crimes and terrorism. This, to me, is a fundamental question about data in this century that we haven't explicitly answered. Right now we allow for-profit corporations to do what they want, to figure out the trade-off that works for their near-term profitability, and we kind of accept it. But I'd like us to think about it more deliberately and make decisions about it more deliberately. And that's what COVID, among other things, is bubbling up. We'll see how that goes. Questions?

Thank you, Bruce. That was a really good overview of some of the issues, and I'd love to delve a little deeper into some of them. The point that I hear coming from you loud and clear is that we actually need a bigger, more federal approach, let's say, to thinking about how we feel about data privacy, and the problem is that this hasn't been handled at the level it needs to be handled. Given that we are where we are, we are looking, as you point out, at very specific trade-offs in each of these places. So I want to drill down a little deeper into the six issues you mentioned, starting with what was actually last on your list: efficacy, which was the first thing you started with. You say it doesn't seem like this would be very effective. But there are studies that have come out. Oxford University's Big Data Institute found that a coronavirus outbreak in a city of one million people is halted if 80% of all smartphone users use a tracking system. And, you know, there are caveats: in their model they assume the elderly self-isolate en masse. But since what we really need is to get the R0 below one, we really just need to shift things a little bit.
And no one, I think, is talking about digital contact tracing in and of itself; we're all talking about using it in conjunction with the manual system. So if you have this army of contact tracers that Cuomo talks about, we're still going to need that. The question is: is it helpful to have this as an added little thing, that little extra bonus, which potentially pushes the R0 number down far enough that we can actually get this thing moving in the right direction? And the issue we're going to run into: obviously there's this idea of 80% of people choosing to download the app, and you say even in Singapore it's only 20%. But if we can actually address some of the privacy and security issues, which I'd like to go into a little further, isn't that the key to getting people to adopt this? You say trust is the most important piece. Isn't that the key to building trust, so we can actually get people to comply with a system that allows us to move this in the right direction?

I think the key, the thing we need without which this is all a waste of time, is ubiquitous, fast, cheap, accurate testing. Without that, none of this matters. With that, a lot of things are good enough. So sure, a digital app, if you convey its limitations and its problems, if you don't treat it as the magic thing that will keep you safe, could easily be part of a more extensive manual, human-based system, because it would be more trusted. But the key is testing. The thing we need to make this work is testing. And, you know, I'm actually not worried about the privacy of these apps. The Apple-Google system is fine; it's got a lot of privacy protections. What it actually does is push all the decisions about who gets the data, how it's dealt with, and whether it's made public up to whoever puts the system together. They're just building an API.
They actually punted on all the hard questions, because they didn't want to be responsible for them. Different countries will have different rules about centralization and decentralization, about whether it's voluntary or mandatory, about how to make it work. We can do that; the tech is actually not the hard part here. The hard part isn't the tech, it's the testing, and it really is getting societies to take this seriously and work together. So sure, an app can be part of the solution, as long as you understand what it can and can't do and don't expect more of it. Singapore has dumped their app, because it wasn't actually doing anything valuable. I think by the time we get a system like this in place and scaled (this is a one- or two-year project, not something that's done in a week), it'll be too late. You're not going to see this actually mattering. But if we had testing: we're trying to teach at Harvard in September, trying to figure out if the school can open up, and basically dorms are the same thing as cruise ships. It really seems hard. But if you can test the entire student body once a week, maybe you can do it. And if you can't, there's probably no way you can. An app isn't going to fix any of that.

So then manual, or digital, tracing also is not likely to be helpful?

Manual is more helpful; manual involves interviews. And there are some really, really interesting studies I've seen, reports where they're looking at transmission, because we know a lot more about transmission now. There was a restaurant, I think in Hong Kong, where they mapped who was sitting where and where the person was who had the disease. That's how we know the airflow of the air conditioning mattered a lot. There was a call center where one person was sick, and whether people got it depended on where they were in the room.
A church, a choir: we know a lot more, and it all came through contact tracing. Apps don't give you that level of detail, that fine-grained knowledge.

But the app at scale is basically something to tell you, as an individual, that you are now at risk. That's what the app does.

And the app can't do that, unfortunately, because the error rates are too high in both directions. And this is hard; we always run into this. When we want to tell the general public to do something, it has to be actionable, immediate, and useful. Wash your hands: I can tell you to do that, and that works. Wear a mask: I can tell you to do that. But "if the app beeps, quarantine for two weeks"? Not a chance, unless that thing is dead-on accurate.

So going back to the six things you mentioned: one of the things you brought up was the issue of bias. There's obviously a huge issue with respect to inequality in terms of who's actually being affected by the disease, and increasing discussion as to how any kind of digital approach is just going to exacerbate it. I don't know if it's true, but rumors are that Android-based systems aren't as effective with this; do you know about that? And there is definitely a class divide there, in that the wealthier have iPhones. So is this going to exacerbate the inequality issues we're already seeing?

There are a lot of inequalities around this disease, because the comorbidity conditions are so important. If you've had bad health care all your life, you're more likely to get sicker and die. If you've had bad nutrition, if you're obese, if you're diabetic, if you have a heart condition, this is more likely to affect you.
And I think we're seeing that in the United States along racial divides, and along national divides. Germany is surviving this surprisingly well, and probably in five years we're going to understand what it is about the German situation that makes people more likely to survive. Oddly, smokers seem more likely to survive; we don't know why yet. So yes, I think we have to worry a lot about how this affects different groups in different ways, not just with the app but in general. It's really exposing the inequities in the American health care system.

Right. You're pointing to the fact that we have not answered these questions more broadly in terms of controlling privacy. And you say it's not a problem in terms of the technology with respect to Google and Apple, and that we can tell them to delete the data anytime we want. But as a political economy question, we all understand that much of the privacy question is being driven by the ad revenue question and the fact that we have these tech monopolies. So from a political economy perspective, how do we think about the issue that we're handing over the potential to do this to Google and Apple, handing over the ability to collect yet more personal data to these companies? How do we think about putting into place some kind of regulation that says the data will be deleted at such-and-such a time, that there are these restrictions? And maybe not even Google and Apple: how do we control the fact that this data could then go over to governments, and we've just handed over yet more of our private data to them?

I don't know about "yet more." This has been happening for the past 10 years. Nothing is new. This data has been collected, this data has been used against you, this data has been handed over to governments, well before COVID.
I don't see any change. And yes, this is a huge issue. Shoshana Zuboff has a book this thick called The Age of Surveillance Capitalism, published well before COVID. On a scale of one to 10, this was already a 9.5, and maybe now it's a 9.5001. But yes, right now we're living in a world where these systems are designed for the near-term financial benefit of large multinational corporations. That's the way the system works. I don't like it; I think we shouldn't like it; and we should try to fix it. How is going to be hard. In the United States there's no appetite for any real regulation that would offend the wealthy in any way, shape, or form; it's not going to happen here. I'm looking more towards Europe right now. They seem to be the regulatory superpower on the planet, and they are doing a little bit to try to increase privacy, though not very much. But these are important questions: are these business models moral? Do we want surveillance capitalism to be the driving form of capitalism in this century? Do we want to give up all of our data to be used against our interests? Because it is being used to manipulate us. That's what advertising is: for corporate needs, and increasingly for political needs. But I don't see the contact tracing apps as adding to that. The protocol is good. The iPhone has a find-my-phone feature where, if you've lost your phone, other phones nearby will look for it. That's an extraordinarily well designed system that protects privacy at every level; there isn't some big database of phones and where they are. We can do the math; we can build in the privacy. And yes, there'd be one private app among the 50 apps on your phone that are tracking your location and are not private. So, you know, so what?
Already in the US, we have governments querying these databases for law enforcement, for counterterrorism, for lots of applications. Already China is using location data from phones in Hong Kong to track protesters. That has nothing to do with COVID. COVID doesn't make it worse; COVID doesn't make it more likely. It's already happening, it's already a big deal, and COVID just means people are thinking about it more.

Okay, let me turn to the questions coming in. If you're just joining us and weren't here at the beginning: there is a panel at the bottom that says Q&A, so feel free to type in any questions you might have for Bruce. We have a question here asking about the state of the specific apps. While you say they might not be very helpful, there is a question as to where we are with the Google and Apple app, and then something about the Bloomberg contact tracing initiative in New York. Tim is asking: is the Bloomberg contact tracing initiative in New York app-based? Any thoughts on what's being done there?

I don't know what's being done in New York; sorry, I can't comment on that. As to where the Apple-Google system is: it's sort of interesting. They're building an API. They're not building an app; they're not building a system. They are building a set of software tools that you, as a country or a state or a city, can use to build a system. I know lots of groups are looking at it, and nothing has been fielded yet. There was a Singapore system, which has been abandoned. There was an Israeli system; I can't tell how much it's being used, if it's being used. Stuff was happening in China, Germany is thinking about something, lots of people are piloting and thinking, but I don't know how much of it is being done on a broad scale and trying to be effective.
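The decentralized design behind that API can be sketched in a few lines. This is a simplified illustration only: the actual Google/Apple Exposure Notification specification derives rolling proximity identifiers with HKDF and AES, not the bare hash used here, and the key sizes and interval counts below are stand-ins.

```python
import hashlib
import secrets

def new_daily_key():
    """Each phone generates a fresh random key every day."""
    return secrets.token_bytes(16)

def rolling_id(daily_key, interval):
    """Derive the short-lived identifier broadcast over Bluetooth.
    (Simplified: the real protocol uses HKDF/AES, not a bare hash.)"""
    return hashlib.sha256(daily_key + interval.to_bytes(4, "big")).digest()[:16]

# Bob's phone stores the rolling IDs it hears; it never learns who sent them.
alice_key = new_daily_key()
heard_by_bob = {rolling_id(alice_key, i) for i in (41, 42, 43)}

# If Alice tests positive, only her daily keys are published. Every phone
# re-derives her rolling IDs locally and checks for a match, so there is
# no central database of who met whom.
exposed = any(rolling_id(alice_key, i) in heard_by_bob for i in range(144))
print(exposed)  # prints True
```

Notice what the design deliberately leaves out: who gets notified, what they're told to do, and how health authorities verify a positive test all live above the API, which is exactly the "they punted on the hard questions" point.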
And you know, there was another point I wanted to make; I've lost it. It'll come to me later.

To keep us going: it does seem like there are a number of countries experimenting with this, and I think many of them are using Singapore's original app and building variations on it, a number of European countries for example. But my question is: are they finding that these are really ineffective? Do we know anything about what's coming from those digital apps in countries that have already implemented them?

I haven't seen a lot of information on efficacy; we just know Singapore dumped theirs. And that's the point I lost: there's another area where apps are being used, and that's to monitor compliance. You land in New Zealand, you land in Hawaii, you go into Israel, and Australia has some of this: you are supposed to quarantine. One of the ways that can be monitored is through an app. It's kind of sloppy, because you can just leave your phone at home; it only catches the stupid. But a lot of people are stupid. So that is another area, and it could be effective, though we'd have to talk about the civil liberties of it. Effectively it's a sloppy house-arrest ankle bracelet, except you're expected to hold it rather than have it attached to you. Same kind of idea. So again, we can do the tech; the issues tend to be the policy above it. Immunity passports: I can build you some kind of digital credential that's tied to you, that you can't give to somebody else, that has your picture on it or something. You can show it and say, look: basically a digital driver's license. But we have to decide whether we want a society with that kind of two-tier have-and-have-not, the safe and the unclean. It does sound kind of icky.
But you could imagine saying: yes, we want to be able to reopen these businesses, and the only way we can do it, the only way we can have a concert hall in 2021, is with an immunity passport; there's just no way to do it safely otherwise. And then we decide whether we want to make that very extreme privacy and liberty trade-off. I don't know what the answer is, but it's not something I want to do lightly. I want to really have a national conversation about it before I just say, sure, that's a great idea.

Well, that's exactly it: how do we get to the point where we can have a national conversation about this? Now that it's our health at stake, people are much more likely to be willing to give up some amount of privacy, but then the question is whether there's a way to dial this back once we have.

That's a question I would throw back at you: how do we have a national conversation about pretty much anything? The problem is bigger than this. I mean, we have stacked up some pretty serious issues in society, in the United States and in other countries, that really need intelligent, thoughtful, calm national dialogue, and we can't even talk about wearing a mask without it becoming a political statement, which is crazy.

Right, I mean, clearly it's out of control. But there is this issue, since we're talking about the tech in particular, that we really do need a policy at a federal level that thinks about our privacy. And how do we roll this back? Because when you talk about the issue of trust: how much of the trust issue depends on whether we believe that, at the end of the day, we're going to be able to get back to someplace before this, or are we just letting our privacies erode on a day-by-day basis?
We're letting them erode. Many countries have privacy commissioners; the United States does not. But big tech does not want this. Big tech basically wants all of our data, all the time, because it's an incredible wealth-generation machine, and we have a lot of trouble as a country enacting policies that go against incredible wealth-generating machines. My belief is that if you fast-forward 20 years, we will look at surveillance capitalism the way we look today at the business model of sending six-year-olds up chimneys to clean them. Sure, it was a great business, but it was fundamentally immoral and we stopped doing it. It's going to take a lot to get from here to there; it's not something that's going to happen in the next couple of years. I really believe it's going to be the younger generation. When they start coming into power, coming into politics, they'll be able to make these hard trade-offs; we as a generation aren't capable of it. You know: climate change. I live in Minneapolis. Right now there are protests and riots in my city, because four policemen basically executed a Black man who, we're pretty sure, didn't actually do anything. These are the sorts of problems we have that require dialogue. The militarization of police: that's an important one, more important than this one, and we can't have that dialogue. So, you know, this is my issue, I'm very sympathetic, I want this to work, but a lot of me is kind of like: get in line.

Well, my fear about looking to the younger generation on this particular issue is that I feel like they've already, in some sense, accepted the trade-off, and been willing to give up so much of their privacy to live their lives online, that they're less likely to be upset about it than those of us who still have some expectation of privacy.

That's a common myth. It is a myth, the whole "young people don't care about privacy" thing.
What we know about young people, from survey research to deep-dive sociological studies, says that is not true. Oh, okay. Young people think about privacy differently. For us, things are private by default and being public takes effort; for them, things are public by default and being private takes effort. And they are much more sophisticated and nuanced about their privacy. They understand it to a degree that we tend not to. And to them, privacy often matters with respect to their peers, their parents, their teachers. And often, like many of us, they feel powerless. This is where the nature of the social networks matters. What's important to us as human beings is communication. We are having this seminar on Zoom because we want to communicate with each other. Zoom has a lot of privacy issues, a lot of security issues; they've gotten better. But through all of this, what was way more important was setting up a Zoom call with friends and having virtual dinner together. That's what mattered. We are all such social creatures, and these platforms play on that, and prey on that, to take advantage of us. It's not that young people don't care about their privacy. They care, they do what they can, and they understand how powerless they are. Fair point. I have a question here from Nylo Olivero, saying: instead of privacy, shouldn't transparency be the goal for all, including disclosure of companies' actions and capital movements? Why do we need privacy? Let's be aware of everything, as long as it's shared broadly. So, we'll assume the person who asked that question is wearing clothes, and didn't type his sexual fantasies into the Q&A, and actually does want some privacy too. So let's get rid of that "do away with privacy" nonsense, because no one actually believes it. 
Even, I'm blanking on his name, the CEO of Google who said you have zero privacy, get over it, Eric Schmidt: he still keeps a lot of things private. So: privacy matters for power. Here I'm going to give the real explanation of the notion that transparency should rule and we should all know everything. There are, I guess, two equilibrium states: everything is private, or everything is public. So why pick one and not the other, why pick transparency over privacy? The difference is the power level. Think of a government or a corporation up here, and individuals down here. Privacy over your data increases your power. So privacy for individuals increases their power; there's less of a differential with the government, and there is more liberty. Transparency for individuals lowers individual power; there's a greater differential, so liberty is decreased. Okay, but watch what happens the other way. Secrecy, or privacy, for corporations and governments increases their power; again, the differential is increased, and there is less liberty. Transparency for corporations or governments decreases their power, and there is more liberty. So you tend to want transparency for the powerful, governments and corporations, and privacy for the individual. That is what is best for liberty. I have a question here from Dark Carmichael: how close are we to big tech and government merging into a single big database system, a complete monopoly? I don't know, five years ago? You don't need a single database. You just query each other. We learned from the Snowden documents that the NSA is breaking into corporate databases all the fricking time and getting data. And we know they were using national security letters to get databases; they got the entire Verizon record of who made cell phone calls to whom. So you're already seeing this public-private surveillance partnership. You're never going to see one database; that doesn't make any sense. 
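As a toy illustration, the power-differential argument above can be put into a few lines of code. This is my own construction, not anything from the talk, and all the numbers are arbitrary; the only point is the structure: liberty tracks the power gap, privacy adds power to whoever holds it, and the gap is smallest when the powerful are transparent and individuals are private.

```python
# Toy model of the privacy/power argument. Assumptions (mine, not the
# speaker's): institutions start with more baseline power than individuals,
# privacy/secrecy grants a fixed power bonus, and "liberty" is just the
# negated power gap, so a smaller gap means a higher liberty score.

INSTITUTION_BASE = 10  # governments and corporations start out more powerful
INDIVIDUAL_BASE = 1

def power(base, has_privacy):
    # Privacy (secrecy about one's own data) is worth a flat bonus here.
    return base + (5 if has_privacy else 0)

def liberty(institution_private, individual_private):
    gap = power(INSTITUTION_BASE, institution_private) - \
          power(INDIVIDUAL_BASE, individual_private)
    return -gap  # higher score means a smaller power differential

# Compare all four combinations of who gets privacy.
for inst_p in (True, False):
    for indiv_p in (True, False):
        print(f"institution private={inst_p}, individual private={indiv_p}"
              f" -> liberty score {liberty(inst_p, indiv_p)}")
```

Under these assumptions, the best-scoring combination is a transparent institution plus a private individual, matching the conclusion in the talk; the worst is the reverse.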
You're going to see lots and lots of different databases, all using each other. So we already have this public-private surveillance partnership, the merging of government and corporate interests: the United States under our rules, China under their rules, and likewise Europe, Russia, and all the other countries. This is not something that is coming; it is something that is here. It just doesn't look the way it's depicted in science fiction. You know, there is a question of how much we as a populace have a say in this, because if I think about India: I actually have a question here from Ishita Mukhopad from India, who says we have the Aarogya Setu app in India; the app generates data, and the data is shared with private health providers, who use it as a market survey without the knowledge of the consumers. My understanding of that app is that it had 50 million users within the first 13 days of its release, so there was huge adoption in India. And India was already seeing problems with the ID system it was using before this, which was supposed to help everyone access social services, and there have been tremendous privacy and civic violations as a result. So we're seeing countries where technology is being used in really its most extreme dystopian form. Yeah, I mean, Aadhaar, I think, was a problem from the beginning. The promise makes sense, and the government sold it on the promise: we just need to know who our population is, how to get them benefits, how to understand them. But actually rolling that out in a country as large and diverse as India had huge problems, and I think we're seeing the fraying. In any of these systems it's always the edge cases that cause the problems, but they're the things that matter. 
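The point that no single merged database is needed, because separate databases keyed on a shared identifier can simply query each other on demand, can be sketched in a few lines. Everything below is hypothetical illustrative data of my own; the point is only the join-on-demand structure.

```python
# Hypothetical, separately held databases, each keyed on the same
# identifier (here a phone number). No one holds the merged profile;
# it is assembled only when someone asks.

telecom = {"+1-555-0100": {"calls_to": ["+1-555-0199"]}}
dmv = {"+1-555-0100": {"name": "Alice Example", "address": "12 Main St"}}
purchases = {"+1-555-0100": {"last_purchase": "running shoes"}}

def federated_profile(key, sources):
    """Query each independent database and merge whatever each returns."""
    profile = {}
    for source in sources:
        profile.update(source.get(key, {}))
    return profile

# One on-demand profile, assembled from three separate databases.
print(federated_profile("+1-555-0100", [telecom, dmv, purchases]))
```

The "single big database" of science fiction never has to exist: the same profile falls out of a loop over independently held records.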
So yes, India has Aadhaar, and in the United States we really have driver's licenses, and they are linked. So again, it's not one big database; it's fifty-some individual databases that can be queried together and brought together. China is building its own sort of national system, the social credit score, which tries to bring a bunch of disparate data streams into one large decision-making process. I think of this in three different steps. In the United States there is a lot of conversation right now about face recognition: cameras that will automatically recognize faces. They do that through databases of tagged photos. So we have provided Facebook with an enormous resource to violate our privacy, by tagging all of the photos we give them. These systems can be used to put a camera on a street, watch who walks by, and attach names to people. There is talk in the United States about banning this; Cambridge has banned it, and I think Oakland, in California, has banned its use for law enforcement. But it really is a much more general problem, and that is identification. It could be your face. It could be the MAC address of your phone. It could be the way you walk. It could be your voice. It could be a retina scan grabbed with a very high-resolution camera. There are lots of ways you can be identified without your knowledge or consent. So first there's identification, in whatever form. Then there's correlation: taking that identity data and running it through databases and learning more about you, whether that's your arrest record, your credit card purchase history, your Google Maps information about where you go, maybe information about your finances, or some combination thereof. And finally there's discrimination, by which I mean a decision is made about you and you are now treated differently. As a use case: you walk into a department store. 
I guess, when you can walk into department stores once again. The system knows who you are somehow. It grabs information about your buying habits and your income level, and the information is given to the salespeople, who treat you well or poorly based on it. That is a use case. Is that fair? Do we think that's just? How different is it from right now, when I go to the airport, back when I went to the airport? I used to fly 280,000 miles a year. I had top-tier status on Delta. I assure you my flying experience was different from everybody else's in that airport. Do we think that is fair? This happens at Disney World: you can buy a more expensive pass that allows you to cut the line. Is that fair and just? Identification, correlation, discrimination. The technologies matter, but it's that flow that really matters. And as a society we need to decide what we want. Companies want to discriminate; they want to devote their resources to the more profitable customers. And that's going to fall along income lines, obviously, but also gender lines, racial lines, pretty much every demographic line; there will be a difference. And some of those are probably illegal. The most dystopian version of this I've heard is that once they have facial recognition down to this extent, and they have access to all your contacts, they can take faces that are familiar to you and merge them together, so that you don't actually recognize the composite, but it taps into your psychological perceptions, because you tend to react more positively to people who are familiar. So they have a way to sell you advertising with composite faces that can actually make you feel positively disposed. As far as you know, is that only a research study? 
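The identification, correlation, discrimination flow described above can be sketched as a tiny pipeline. This is a hypothetical illustration of the three-stage structure only; the databases, names, and decision rule are all invented for the example.

```python
# Sketch of the three-stage flow: identification -> correlation ->
# discrimination. All data here is made up; the point is that each stage
# is a separate, composable step, and the technologies plugged into each
# stage (face, MAC address, gait, voice) are interchangeable.

FACE_DB = {"face-hash-123": "alice"}            # stage 1: tagged-photo database
CREDIT_DB = {"alice": {"income_tier": "high"}}  # stage 2: one correlation source
PURCHASE_DB = {"alice": {"avg_spend": 340}}     # stage 2: another source

def identify(sensor_reading):
    # Stage 1: attach a name to a face (or a MAC address, a gait, a voice).
    return FACE_DB.get(sensor_reading)

def correlate(person):
    # Stage 2: run the identity through whatever databases are on hand.
    record = {}
    for db in (CREDIT_DB, PURCHASE_DB):
        record.update(db.get(person, {}))
    return record

def discriminate(record):
    # Stage 3: make a decision, and treat the person differently.
    if record.get("income_tier") == "high":
        return "send a senior salesperson"
    return "ignore"

person = identify("face-hash-123")
print(discriminate(correlate(person)))
```

The camera at the door is only stage 1; the treatment you receive is driven by the whole flow, which is why regulating face recognition alone addresses only one interchangeable input.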
Yes, the results are real: if I can take a composite image of people you know, people like you, or even of you and someone who looks sort of like you, you won't recognize the person, but you will feel more positive towards them when you see them, either in a static ad or a video ad, and you'll be more likely to buy the thing. That is within the realm of technology; I can deliver web pages that do that individually. I don't know if it has been done. I believe it would be legal to do. Yes, that is deep psychological manipulation, based on this identification, correlation, discrimination. Right. So, apropos of that, there's a question here that I think really gets to the heart of the issue: it's easy to feel powerless about how my data is already used. As we move into the next phase of data being collected, what can a person outside of tech, government, and law do to ensure that our privacy is considered when new policies are being put into place? Make this a political issue. Say more, please. Unfortunately, the era of tech things you can do is almost over. Your data is not in your hands. Your email is stored by Google or Apple or somebody else. Your photos are stored by Flickr or SmugMug or somebody else. Your conversations are on Facebook and WhatsApp and YouTube. So a lot of our data isn't in our hands, and our financial information the credit companies have. There's not much we can do individually. We don't control our data anymore; it's in the cloud, controlled by corporations. What we can do is agitate for better policy. The thing that hurts us the most is that this has never been a political issue, except a little bit around the edges. When this becomes a major political issue, when people campaign on this issue, then we'll get some change. Until then we're not likely to get much. But there are exceptions. California passed a data privacy law. 
It's not great, but it's something, and there was political pressure behind it. Making political noise is the most important thing we can do. It's not a tech trick. There are tech tricks, but they're all around the edges. I can give you advice like: don't have a credit card, don't have an email address. That's stupid advice; people living in the first half of the 21st century need those tools. This gets back to the powerlessness point I made earlier. It's not that people think Facebook isn't spying on them. They know Facebook is spying on them. But that's how they talk to their friends, that's how they see their relatives, that's how they're human. So they understand they're powerless, and they accept it because they have no choice. Well, Bruce, thank you for taking the time to share your ideas on this issue with us. What's disturbing is that, once again, I feel like what we're seeing is that the issues around COVID are simply highlighting issues we've been dealing with for a long time: around privacy, and the political economy issues we're dealing with around big tech and monopoly. And there is going to be this question, I think, as we look at contact tracing, because I do think we're very likely to have these digital apps put into place. They're already being used in many countries, and we're very likely to see something used in this country, so I think we're not going to be able to stop it. Maybe it's a tiny bit of privacy lost over and above what we've already lost, but I do think it's this constant erosion that everything about the pandemic seems to be pushing us towards. And the fact that most of us are now having our meetings and conversations over Zoom means there's just that much more of our personal, private lives that is now accessible to these companies. 
So we're left once again with the big political economy questions, which we don't know how to handle except by trying to address them through politics. Yeah, I think this crisis is exacerbating a lot of things: it's exacerbating the problems with our health care system, the problems with income inequality, with our policies that favor big corporations over small businesses, with our centralized food production, with the idea that the market will squeeze all inefficiencies out of a system in the name of profit. And it turns out those inefficiencies were a security mechanism that we've now lost. So a lot of things this crisis is really exacerbating and highlighting. And this is a singular moment, something the likes of which we've seen few times in our country previously: the American Revolution, the American Civil War, the Great Depression. There will be enormous potential for social change, and I think we need to seize it. Very important words, I think, for all of us. We have a lot of young people who tune in to INET events, and I think this is really something they need to take and run with; this really is a call to action for all of us. Thank you so much for joining us and for taking the time. And thank you to all of you who tuned in. We have this more or less running as a regular webinar series now, so our next webinar is going to be on June 11. We have Dani Rodrik talking to us about issues around globalization in the age of COVID, which of course is another critical issue, because we're seeing the rise of ethno-nationalism and isolationism. In part, I think, that is once again an exacerbation of issues we were dealing with previously, where there was a backlash against globalization, and the pandemic has just highlighted many of the issues underneath it. So I hope you will join us for that; you can register on the website. 
So thank you all and thank you Bruce. Thank you. Bye bye.