All right, hello everyone, and welcome to Public Knowledge's second webinar in our tech policy webinar series: How Congress Can Protect Privacy During a Pandemic. I'm pleased to be joined by Gennie Gebhart, Associate Director of Research at the Electronic Frontier Foundation; Stacey Gray, Senior Counsel at the Future of Privacy Forum; Gaurav Laroia, Senior Policy Counsel at Free Press; and Eric Null, U.S. Policy Manager at Access Now. We'll be talking about possible privacy protections that could appear in the next stimulus bill and takeaways from the Senate Commerce Committee's paper hearing that happened this morning. But first I want to take a couple of steps back and discuss how privacy has been talked about in the press in relation to the COVID-19 pandemic. So, Gaurav, the media has taken a framing that privacy is inherently at odds with public health. Do you agree that that's the right framing to be using? If not, what's the media missing?

Yeah. So, you know, hey Sarah, thanks to everyone joining, and thanks to Public Knowledge for putting this together. Sarah, you're right to set up the frame in this way. Unsurprisingly, I think this easy and false dichotomy has again popped up in popular discussions of this issue, and I think it obscures far more than it illuminates. You know, there's never been that kind of one-to-one trade-off between privacy and security, and that's just as true now, when we're talking about the use of public and personal information to combat a once-in-a-hundred-year health crisis. And obviously data on a whole slew of metrics is going to be essential to any kind of response. When it comes to the involvement of tech companies, I have the following thoughts. The first is that tech and big data are tools, not a strategy, and I think this should actually be a moment of humility for the tech companies. There are a lot of, I think, sort of wild proposals being put out there.
And I think they should actually take a backseat to the science and the epidemiologists, rather than assume that some kind of tech exceptionalism can magically solve this crisis. Along those lines, necessity and efficacy are where this conversation should begin, and the fact that this is going to be a long and hard slog really needs to sink in. We can't let our eagerness for this crisis to end make any of us, policymakers or members of the public, susceptible to snake oil and other purported magic bullets. And then, really importantly, existing inequities have been making this crisis far worse, and they also make these kinds of technological solutions far more difficult. The virus itself may not discriminate against people, but comorbidities and access to quality care are racially correlated. The way that fits into the privacy conversation is that trust in tech companies has significantly decreased over the past few years, and trust in government is really abysmal. According to Pew, we're looking at around 7% trust in government among African Americans and, I believe, around 17% among white folks. Many of the tech solutions being kicked around are meant to shore up test-and-trace programs, and for the trace part of that to work, you need the voluntary cooperation of countless individuals in this country. So I think we have to be cognizant of what we're asking, in that case, of the marginalized communities that are the hardest hit in this pandemic. They're already under intense law enforcement scrutiny, and how many of these communities are going to be willing to acquiesce to location and contact tracking? I think folks are going to be concerned that this information will in fact have law enforcement consequences. People are concerned it's going to imperil access to government benefits or be tied to them in some way.
And I think it's going to have possible effects on continued employment in, you know, one of the few remaining open economic sectors. So that's where privacy comes in: it's not a trade-off; it's in fact directly related to the success of this kind of technological intervention. People are either not going to use this technology, or they'll actively subvert it, if they can't trust it. And if the epidemiologists do in fact say that these technologies are useful, the kinds of safeguards and measures that colleagues on this call and elsewhere have written and talked about are going to be integral to making sure they work. So, as you said, Sarah, the frame is not only wrong but, I think, detrimental to the successful deployment, if needed, of these kinds of solutions that could possibly save lives.

And I just want to remind everyone who's attending that we have a Q&A function in this webinar, so as questions come up for you, please submit them in the Q&A and I'll make sure to work them into the conversation as we keep going. So, Eric, the United States is not the only country dealing with this pandemic. What courses of action have other countries taken, especially around tech usage, to either track and trace or provide screening tools, similar to the number of proposals we've seen here in the U.S.? And what are some lessons we can learn from the international community?

Yeah, thanks, Sarah, and thanks for having me, to everyone joining, and for giving me an excuse to change out of my pajamas. As Sarah said, I'm U.S. Policy Manager at Access Now, which is an international human and digital rights organization with offices all over the world. I'm going to try to cover a lot of ground quickly here. If I drone on too long, feel free to flag me and I can try to speed up. Essentially, there are two big buckets I would place responses throughout the world into.
One of them is health data tracking, which covers symptoms, testing data, pre-existing conditions, that sort of stuff. The second large bucket is geolocation tracking, however that manifests. Just as a general point: when I talk about a specific country, it is usually a specific region in that country, but for simplicity's sake I'm going to just list the country so I don't have to get super detailed. Happy to provide details to anyone who has further follow-ups or questions.

So in terms of health data tracking, we're seeing a variety of things: thermal scanning technology, facial recognition technology to determine when people are experiencing symptoms. We're seeing that at the king's palace in Malaysia, for instance. South Korea requires incoming travelers to install an app on their phones that is designed to facilitate the reporting of their health to the state. In the U.S., we've seen physical screenings of passengers on flights from India, from Italy, and from South Korea. In Argentina, they have a crowdsourced website for collecting information on flights with passengers who tested positive for COVID-19. We've also seen a variety of public notification, or what I might call public shaming. We've seen India release PDFs of people who are infected, listing names, addresses, and travel history, and they also put notices outside of quarantined homes. We've seen police forces release videos of officers punishing quarantine violators, and there's certainly been plenty of backlash to that. In Bosnia, we've seen authorities release the names and residences of people who broke quarantine, and in Montenegro we've seen them publish the personal data of people who were forced into isolation. So there's a variety of health data tracking happening throughout the world.

The second bucket is geolocation tracking. This comes in several different flavors.
A lot of countries track location to enforce quarantines. We've seen that in Poland, which uses an app that requires daily selfies to be taken within 20 minutes of being contacted by the police; if you don't respond within 20 minutes, they come to your house. In Russia, we're seeing the use of facial recognition technology to recognize people who are supposed to be in quarantine but are out in public. In Argentina, we're also seeing the use of geolocation technology to find quarantine violators, and they are proposing to use telecom data. In the EU, eight telecom companies volunteered to start sharing geolocation data with the European Commission. Interestingly, we saw that India will stamp the hands of travelers coming into the country to show the date through which they are supposed to remain in quarantine. There's also a variety of surveillance equipment used to enforce quarantine. For instance, in Mexico and in the United Kingdom's Derbyshire, they are using drones to enforce quarantine. In Australia and in Hong Kong, they're using in-home electronic monitoring devices; in Hong Kong, that's specifically wristbands for people in quarantine, to make sure they're staying in quarantine. And then, last but not least, is contact tracing. We see this in tons of countries, from South Korea to Canada, Armenia, Germany, Singapore, all over the world, as well as proposals and apps to do it in the U.S. I will note that South Korea in particular sent out emergency alerts with personal information about people who tested positive. The alerts didn't include names, but they listed where the person was, at what time, and which region they were from. It was actually pretty easy to re-identify some of those people, and it has led to pretty serious stigmatization, judgment, and ridicule online.
I did, however, save some of the most awful ideas for the end. One is obviously extreme violence against citizens. A South African officer shot and killed a man for violating lockdown. A Kenyan police officer shot and killed a 13-year-old for violating lockdown. Indian officials are spraying migrant workers with bleach to disinfect them, which causes serious health problems. And in Kenya, they also fired tear gas at ferry commuters and hit people with batons. In Tunisia, they've deployed a robocop called PGuard, a rolling robot that is basically designed to find people who are violating their quarantine; it will find people and apparently say, "What are you doing? Show me your ID. Don't you know there's a lockdown?", which seems very bizarre. Albania has deployed its army to enforce a 40-hour stay-at-home order and encouraged army personnel to use force if they find violators. And then, I think my personal favorite: Peru and Panama are actually quarantining men and women on different days of the week. Men are quarantined on Monday, Wednesday, and Friday; women are quarantined on Tuesday, Thursday, and Saturday; and everyone's quarantined on Sunday.

So obviously we have a lot of different responses. I think one lesson we can learn is that if you have a robust pandemic response, with early action and quarantine and testing for everyone, you actually don't need to rely as much on surveillance as some of the later-acting countries do. We've seen countries like Singapore, South Korea, and Vietnam handle the outbreak relatively well, all things considered. They have been preparing for pandemics since 2003, when SARS happened, and 2012, when MERS happened, back before a lot of this tech existed. So I think there's a lot more that could have been done on the front end that would have reduced the need to use this technology now.
But we're beyond that point now, so we sort of have to see what we can do with what we have. Another thing is that governments need to understand their people better. While a lot of us are lucky that we can work from home, mostly thanks to technology, a lot of people can't, and for them it's a decision between staying at home and dying of hunger, or going out to work and potentially getting infected and infecting other people. Government responses need to more adequately address that issue. I'll skip over some stuff because I'm way over time. I would just say that whatever governments are doing, I'm hoping they are tracking the data they are collecting and retaining for this purpose and figuring out the best thing to do with that data afterward. Most likely that is to delete it, or at least provide it only for health research and that sort of thing. So with that, I will stop. Thank you.

So I want to turn to the U.S. specifically and how some tech companies have been responding to the current crisis. One thing we've been seeing, Gennie, is that more and more companies are offering what I'm calling COVID-19 applications, from symptom trackers to screening tools that tell you where available testing centers are. One of the most reported on is Google's tool from Verily, or Alphabet's tool from Verily, I should say. What are some of the concerns you might have around tech companies acquiring this health data and how they're using it, if you could walk us through that?

So we have three main concerns here. The first set of concerns definitely applies to Verily, of Alphabet and/or Google; applies to Apple's symptom screening website; and applies to one of the newest things we've seen, this Monday: Facebook prompting users to fill out a symptom survey that would go to researchers at Carnegie Mellon.
All of these are collecting symptom information; I think Verily is doing the most, with both screening and then directing people to, and conducting, testing. My first big concern about that whole group of services is the possibility of repurposing information for what these companies' actual businesses are. Facebook is not in the business of directing people to happy, friendly researchers' symptom surveys. Alphabet is in a lot of businesses, but it's best known for its association with Google, the world's biggest advertising company. With Verily, for example, in order to use the site and have any access to testing, you have to have a Google account. In Verily's various statements and FAQs, they say this is because they need it for verification. But there are a lot of ways to verify people and their identities that are not Google accounts. So conditioning access to what is a critical health service right now on having, or being willing to create, an account with one of the world's biggest advertising companies: I think that's a big problem that speaks for itself.

I'm also concerned about things like Facebook's symptom survey, which for some U.S. users is going to live at the top of their newsfeed; it's called a "quick prompt" in Facebook-speak. It's not clear how that does or does not interact with how Facebook is going to serve you content on your newsfeed, or how it's going to weight ads on your newsfeed. We've seen Facebook have very porous barriers between, for example, security information and advertising. And we haven't seen any assurances from Facebook that your interactions with, or your answers to, the survey will have precisely no influence on how Facebook is trying to get you to click on things and keep you on the site. It's really not clear that that symptom survey, or how you interact with it or don't, won't influence the rest of your Facebook experience. So that's a place where I also have a lot of questions.
So that's the first set of concerns: that there will be some kind of repurposing of information, using the offering of these testing services to, for example, get more accounts (whether or not that's the intent, that's what's happening; that's the incentive), or unintended interactions with other tracking, advertising, and social products, like Facebook's newsfeed.

My second set of concerns is about unclear barriers for sharing with government authorities, which of course includes public health authorities for Verily. There are a ton of questions about this. Verily's public-facing statements talk about "other federal, state, and local health authorities" as a whole lump, and about how Verily's policies may not govern data sharing or any kind of data flows with them. But lumping those together as one undifferentiated mass is just not acceptable, especially for something this high-stakes. We have so many questions. What is Verily's relationship with the U.S. government, and with what parts of it? Is there any chance (and this should be a slam-dunk question) that ICE, for example, would have access to any data under any circumstances? There's not much that's clear here, and like I said, a lot of these questions should be crystal clear. They should be gimmes for the companies putting this together, and they should be top of mind. They shouldn't be an afterthought, and they shouldn't be something we have to dig for. Verily is also, right now, operating in a few counties in California, and it has failed to really make clear what its relationship is with the California Department of Public Health. At the very least, you'd want to see an MOU, a memorandum of understanding, outlining precisely how that relationship will work and how data will flow between Verily, Google, Alphabet, and other authorities.
So it's a lot of hand-waving that we've seen from Verily publicly, and for something that is the only testing available for a lot of people in a very populous state, and that might expand further, these are just the simple baseline questions that we need answered.

Finally, my third group of concerns isn't about Facebook or Apple or Google. My third group of concerns is about NSO Group and Palantir and any number of nameless data brokers who may be using this moment as an opportunity to sort of COVID-wash their reputations. To a lot of people on this call, these are some of the biggest, baddest names on the most evil side of our surveillance economy. And this seems like an opportunity for NSO Group and Palantir and various data brokers to say, to those who are not already familiar with their reputations, "Hey, aren't you glad we've been collecting this information on you all along? Aren't you glad we have it now, when you need it?" I think that's the last thing you want to come away from this crisis with. But depending on how the media covers it, even beginning with that false choice between privacy and physical health, that kind of frame is going to play into these COVID-washing concerns.

Thank you. So the other big thing we've been seeing has been around geolocation tracking. Stacey, a lot of companies have come out with reports about whether or not people are social distancing and how it's working, and this is all based on geolocation data. Again, I'm going to use Google's initiative as an example because it's been well publicized, but there have been countless others. So what privacy issues arise from reports like this, and how do you think Google specifically handled those concerns when publishing their reports? Sure.
Thanks, Sarah, and it's so good to be here with these really strong privacy advocates. The Future of Privacy Forum is a think tank; we work with both privacy advocates and industry privacy professionals. Geolocation is such an interesting issue because, in the last couple of years, the trend has been that people are becoming much, much more aware of exactly how much data is already out there, already available, that can be tapped into or not, and it's raising a lot of very important privacy questions that have been around for years but not really in the public eye.

So, to the first question: what are the privacy issues? In fact, in the Q&A, the first attendee asked, "Wouldn't that raise privacy concerns?" Definitely. The first thing is that location data varies pretty widely in who holds it, how it was collected, its quality (which, Gaurav, you mentioned at the start of this, and which I think is probably the most important issue here), and how identifiable it is. Telcos, phone carriers, for example, hold subscriber data, so they have fairly accurate location information based on the locations of the cell towers your device connects to over time. And because it's tied to your account, it's pretty identifiable; it's tied to your name and your identity. Then there are mobile operating systems, providers of major tech services like Google and Apple, and manufacturers of devices, which hold a whole lot of similarly, usually pretty identifiable data. There's also a very large world of app developers and app partners who collect location data from apps that ask for your location: weather apps, rideshare apps, navigation apps, anything you might have installed on your phone that asks for your permission to access geolocation data. Right now, in part because of the lack of a federal privacy law in the United States:
The norm is that that data is typically shared with at least one, if not several, app developer partners, usually through a software development kit, or SDK. Those partners use it for lots of different reasons: sometimes it's just to monetize or support a free app, sometimes it's to provide behavioral advertising. A lot of the time the advertising and marketing use cases are a little less direct; it's not about serving you a local ad, maybe it's just about measuring whether you've been to a store or not, for example. That world of data is usually not tied to a name but to some kind of device identifier. That said, it's almost always pretty easy to re-identify it to a person if you have enough data held over enough time, because our patterns and our behaviors reveal who we are. Think about just two locations that you go to every day, home and work (or right now, just home); home and work are generally enough to identify most people. Location data is also sensitive because, aside from revealing our identities, it's very revealing of who we are: our patterns, where we go every day, our religious beliefs if we're going to church, for example, and what we're interested in. All of that is of course what makes it so valuable for advertising and marketing, but it's also what makes it highly, highly sensitive. It's considered sensitive data under the Federal Trade Commission's jurisdiction, and it's considered sensitive data in the EU. Usually that means it has to be acquired with some kind of consent, and I think we obviously need stronger norms around that. So, yes, there are major privacy issues there, but I think the more important question is probably quality, which is kind of where we started. Part of my testimony to the committee today had to do with helping people understand, based on the different sources of data, the company involved, and how they collected the data.
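The point above, that a home-and-work pair is usually enough to identify someone, is easy to demonstrate. Here is a minimal illustrative sketch; the device ID, coordinates, and hour thresholds are all invented for illustration, not drawn from any real data set:

```python
from collections import Counter

# Hypothetical pings: (device_id, hour_of_day, rounded_lat, rounded_lon).
# All values are invented for illustration.
pings = [
    ("dev42", 2, 40.71, -74.01),   # overnight hours -> likely home
    ("dev42", 23, 40.71, -74.01),
    ("dev42", 1, 40.71, -74.01),
    ("dev42", 10, 40.75, -73.99),  # working hours -> likely workplace
    ("dev42", 11, 40.75, -73.99),
    ("dev42", 15, 40.75, -73.99),
]

def infer_home_work(pings, device_id):
    """Guess home as the most common overnight location and work as the
    most common working-hours location for one device."""
    night = Counter((lat, lon) for d, h, lat, lon in pings
                    if d == device_id and (h >= 22 or h <= 5))
    day = Counter((lat, lon) for d, h, lat, lon in pings
                  if d == device_id and 9 <= h <= 17)
    return night.most_common(1)[0][0], day.most_common(1)[0][0]

home, work = infer_home_work(pings, "dev42")
print("home:", home, "work:", work)
```

Even this trivial frequency count recovers a likely home/work pair from "anonymous" timestamped pings, which is why a device identifier alone is weak protection.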
There's pretty high variety in how accurate the data is, how precise it is, and in the volume of data at issue, which goes to the question of representativeness. This gets, I think, to the Google example too, but the first thing is that you have to look at precision and accuracy. Precision is basically the number of digits after the decimal point in a latitude and longitude. Are we looking at city level? Street level? Three meters, two meters, two centimeters? Some location data collected from apps can be highly, highly precise, depending on how it was collected and using what signals. Even if it's really highly precise, though, it might be totally inaccurate. We know that SDK data and advertising and marketing data is often not very accurate, and we know this in part because, in all fairness to advertisers, it doesn't need to be very accurate. Anyone who's gotten behavioral ads for stuff they don't care about knows that there's not a whole lot of accuracy that ultimately matters in the advertising space in some cases. But also, we've seen over the years that as Apple and Google have created tighter and tighter controls over location data through the operating system settings, the amount of location data in the market hasn't really changed, which tells you something: the supply of good data is decreasing, but the amount available hasn't really changed. So there's almost certainly a quality issue in a lot of the advertising location data available. And if it's not useful, the first answer should be: don't use it. It has to be useful to address the public health need, and we have to rely on public health experts to tell us what is useful. So companies have to, I think, be really upfront about the sources of the data, how it was collected, and what it represents. So, to get to the Google example, and I'll just end here and we can take it back to you, Sarah: I think they did a couple of things really well.
The first is that it's not individualized data; it's aggregate. They put it through robust differential privacy mechanisms that they've opened up to scrutiny from the de-identification experts (which is not me, but the de-identification experts in the world). They've made it public rather than sharing it with government entities secretly, which I think is a good, helpful transparency mechanism. And it only shows aggregate trends. We know that in some cases aggregate trends can still reveal things; I think we saw this with the company Strava a couple of years ago accidentally revealing military bases. But for the most part, if you take into account rural communities and small communities, you can account for that, so they did that really, really well. The one underlying question I haven't seen addressed is this: even if you assume that Google, because of the way it's able to collect data through multiple sensors on a phone, actually does have much more precise and accurate data than, say, the telcos, it may or may not in fact be representative of all the populations you want to study. This is where public health experts are just going to have to take it with a grain of salt and tell them what they need, what is actually useful, and what is not. We know, for instance, that depending on who's using different services, you're going to capture different communities; there are lots of well-known studies about how iPhone users are much more affluent, for example. Android and Google users, and users of Google services, represent some portion of the population; the question is just how representative it is. So I'll stop there. A lot of these issues could have been avoided if we had a federal privacy law, not just for the protections around geolocation data but also just to set the rules.
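Google's actual mobility-report mechanism is considerably more sophisticated, but the core idea of differential privacy mentioned above can be sketched in a few lines: add random noise, calibrated to a privacy parameter epsilon, to each aggregate count before release, so no single person's presence or absence meaningfully changes the published number. The epsilon values below are illustrative assumptions, not Google's parameters:

```python
import math
import random

def laplace_noise(scale):
    # Draw from Laplace(0, scale) via inverse-CDF sampling.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def dp_count(true_count, epsilon):
    """Release a count with Laplace noise. A count query has sensitivity 1
    (one person changes it by at most 1), so the noise scale is 1/epsilon:
    smaller epsilon means more noise and stronger privacy."""
    return true_count + laplace_noise(1.0 / epsilon)

random.seed(0)
print(dp_count(100, epsilon=0.1))    # strong privacy: visibly noisy
print(dp_count(100, epsilon=100.0))  # weak privacy: nearly exact
```

The design choice this illustrates is exactly the trade-off Stacey describes: aggregate, noised releases can still show population trends while making individual-level inferences (the Strava-style failures) much harder, especially for small or rural communities.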
Because ideally, public health is obviously a good use case, but ideally you have a federal privacy law that protects privacy and establishes the rules for what companies can share (ethically, responsibly, morally, legally) and what they can't. Part of the reason the EU has been so quick to respond to this, in comparison to us, is because they have a framework of rules. U.S. companies are left with a lot of uncertainty about what they're allowed to do, so everyone's figuring it out.

So that's actually a perfect segue into my next question, which I'll give to Gaurav, but please jump in if you all have feelings. I've read all the witness statements, and I've read the chair's and the ranking member's statements, and the chair, as well as six of the seven witnesses, as Stacey mentioned, have all said we need a federal privacy law. Does that mean that once we get back to regular order and the Senate's in session, we're going to see this move through Commerce, now that we've all agreed we need it?

Well, the question is when we get back to normal and pass a federal law; who knows when that is. Let me say this optimistically: as Stacey said, and others have said, and many of the witnesses at the paper hearing, this crisis shows why having that kind of framework in place is important. As I said earlier regarding the trust issue, we are now having a conversation about what kinds of guardrails need to be in place for this information to be used, who should be leading, and what kinds of data should be used when it comes to a public health crisis. It would have been great if this conversation had been resolved, you know, as recently as last year, but we are still in the middle of figuring it out. And to be honest, frankly, I'm going to put some of the agency for this at the feet of industry.
I think there is an emerging consensus on what federal privacy legislation needs to look like. In the context of that discussion there are some sticking points, but we're now seeing that industry itself is going to be hamstrung in participating in solving this crisis. Like Stacey said, they've created a situation where there's a lot of uncertainty, where advocates are, I think rightfully, worried about how this information is going to be used, how long this information is going to live, and what it means going forward. Their ability to participate meaningfully in solving this crisis is hamstrung because we haven't gotten those rules done yet. It would be great if we could realize that this is of paramount importance, because, God forbid, something like this happens again; by the next crisis, this should be a solved problem. As far as timing: hopefully, once things are back to normal, we can all get back to the table, have some proper lessons learned, and move forward on protecting these kinds of information and defining the acceptable use cases from this situation.

So, Stacey, I want to go back to you for a minute, since you were a witness at this hearing. Both the chair and the ranking member, Senator Cantwell, mentioned in their statements that data is critical for fighting the coronavirus. However, they highlighted different uses of data, which I thought was kind of interesting. Senator Wicker chose to highlight company tools that have measured social distancing or created screening tools, while Senator Cantwell specifically highlighted the work the State of Washington has done with creating testing. So what sort of insight does this give you about how the committee thinks about data use and the broader conversation about acceptable data use?
Yeah, so the case studies are not surprising; Senator Cantwell hails from the State of Washington, and I think both the chair and the ranking member have been very supportive of beneficial use cases. Before this really started, we'd been having a lot of conversations about the right mechanisms for enabling data sharing for scientific research generally, when we're not talking about product development, advertising, or anything else. There can be a tension between genuine scientific research using big data, which is increasingly the norm (you're seeing a lot of real-world studies combining big data from commercial sources with HIPAA data), and traditional privacy rules in some cases. Take the right to request deletion of your data: if deletion of data, or an opt-out, is going to interfere with the research being of any quality, then there's a tension. Sometimes people have opted into the use of certain data but didn't opt into the use of that data for generalizable research. Being able to get consent for secondary uses is another point of tension in using data for research. It's the same question we're addressing now in a public health context, but this is an area where the chair and the ranking member actually had a lot of agreement. The assumption we were seeing in COPRA and in the discussion draft involved the use of ethical, independent review boards to oversee this process when companies genuinely wanted to use data for socially beneficial research and found that there was an insurmountable tension between that and privacy norms.
What are the right norms there? Ethical review boards can act as a kind of commercial version of an IRB, which we're all used to in the academic university setting: you get experts who can ensure the data is de-identified, or at least pseudonymized, to a certain extent, so that you're minimizing the risk of identifying or revealing people, and who can take other precautions to make sure the research is quality and not driven only by profit motives. So I think that was an area of a lot of consensus for them. What we saw before all this started was that discussions over a federal privacy law had really broken down — not around the substance of what the privacy norms should be, but around enforcement and the extent to which a federal law ought to preempt state laws. Really important issues, but a lot of consensus on the underlying substance.

Thank you for that. I want to turn to a question from the Q&A that relates to this really well. For those of you who don't know, Kinsa Health is a smart thermometer company; they were also invited as a witness for this paper hearing. They have published a heat map of fever data that they believe accurately predicts where COVID outbreaks will occur. So Jenny, I want to throw this to you, because you've been looking at this intersection of health data and how it's used by private companies. What privacy concerns have you seen with this approach of heat-mapping fevers to predict COVID outbreaks?

I think we're seeing a couple different kinds of heat maps, or aggregated maps. On the one hand, it is best to aggregate that kind of information — whether it's fevers and health data or location data, you cannot anonymize it individually; it's just not going to work. It can so easily be re-identified.
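To make the aggregation point a bit more concrete, here's a minimal sketch of one common safeguard for published heat maps: small-cell suppression, where a region's count is only released when enough individuals contribute to it. Everything here — the regions, readings, and threshold — is hypothetical, invented purely for illustration, not taken from any real product:

```python
from collections import Counter

# Hypothetical per-device fever readings, tagged only with a coarse region.
readings = [("region_a", 101.2), ("region_a", 100.8), ("region_a", 101.5),
            ("region_a", 102.0), ("region_a", 100.9), ("region_b", 101.1)]

K = 5  # suppress any cell with fewer than K contributors

counts = Counter(region for region, _temp in readings)
heatmap = {region: n for region, n in counts.items() if n >= K}
print(heatmap)  # region_b's lone reading is suppressed, not published
```

This raises the bar for re-identification, but as the discussion notes, aggregate releases can still leak through proxies and background knowledge.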
So that's a slightly shruggy plus to it. With these kinds of heat maps, unintended consequences are the number one thing — they often are, but especially here. The first thing I thought of when I saw those Kinsa headlines was Strava's heat maps a couple years back that ended up exposing classified military base locations. Strava thought they were putting out "here's where cool runs happen," but it was actually a proxy for wealthy Western people who have smartwatches and use Strava, and folks on Twitter figured that out, and it was a disaster. So I wonder with Kinsa: yes, they're sharing heat maps of trends in fevers and temperatures, but what is it a proxy for? What kind of person might have a Kinsa thermometer, and what unexpected things might we learn about where they tend to be, where they gather? With Strava we learned where military bases were; I'm not sure what we could learn from Kinsa — I haven't tried to approach it adversarially or stress-test it — but that's the number one thing I'm worried about: what is it going to tell us that isn't fevers?

Gaurav? Yeah, another thing to add there is, as you said, there's a self-selected population that's going to be using this particular device. We know COVID-19 is disproportionately affecting poor and marginalized communities, so people looking at this information shouldn't determine where to divert resources based solely on this very convenient data set that includes a ton of biases. I think that is the limitation of using this kind of information, and part of the reason a lot of traditional methods are going to win the day here, instead of relying on "great, we're sitting on this data set, let's hand it to the government and build a strategy around that."

So I want to pull back a little bit, and Eric, I think your expertise would be useful here.
Again, we're just seeing this urge to create a tech solution, whether it's predicting where outbreaks will occur or predicting who may have caught it from someone — these contact tracing apps. Is there somewhere that's done this really well that we could use as a model? Whether it's the EU, South Korea, or somewhere else, has anyone figured out how to preserve privacy while also doing the sort of work that public health experts have said is going to be useful?

Yeah, so I can't necessarily speak to what public health officials think about certain things, but I know there have been a variety of approaches, particularly around telecom data and app data, which Stacey went into in detail earlier, so I won't. But essentially, a lot of that data is not specific enough to actually do contact tracing — it doesn't get you close enough to other people, so it's just going to create mass panic rather than actually make people safer. Beyond that issue, it sort of depends on what you think of as done well. Vietnam, I believe, traced not only the first-degree people you've come into contact with but also second, third, and fourth degree, and because so many people who are asymptomatic can spread the disease, reaching out to more than just the immediate contacts seems to have worked for them. I don't know the privacy details of that plan in great depth, but it seems like collecting second-, third-, and fourth-degree contacts would raise some pretty serious privacy concerns if that data wasn't handled really well.
I think the only technology I've seen that at least at first blush makes sense — and Jenny can probably speak more authoritatively on this — is the use of Bluetooth, because it is a physically limited technology. Its range is still more than the six feet we ideally want for measuring transmission, but it at least has a limited radius, which provides more useful data than GPS data that can be extremely inaccurate or telecom data that isn't granular enough.

So Jenny, for audience members who may not know about these Bluetooth proposals for contact tracing, what exactly is that? Yeah, as Eric alluded to: instead of using GPS data, which is extremely sensitive location data and not as accurate as we'd like, and instead of using cell tower data, which has similar problems, there are proposals to use Bluetooth — the technology that finds your wireless mouse, pairs your headphones, connects to your larger screen. The idea is an app, ideally opt-in, that would use Bluetooth to sense other apps that are opted in, hopefully at the six feet or whatever the epidemiologically meaningful distance is — no more, no less. Then, say I use this app, I walk around one day, I figure out I have coronavirus symptoms or I've tested positive, and I let the app know. Somehow, automatically and in a privacy-preserving way, it notifies people who were within some distance of me. There are a lot of variations on that basic proposal relying on Bluetooth technology. Narrowing it down to Bluetooth is one thing for accuracy and privacy preservation: Bluetooth will not say where you were on a map; it will say you were close to other Bluetooth sensors. That's one thing.
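The decentralized version of that basic proposal can be sketched in a few lines. This is a toy simulation, not any real protocol or app — the `Device` class, its methods, and the token scheme are all invented for illustration, and real systems add rotation schedules, cryptographic derivation, and distance estimation on top:

```python
import secrets

class Device:
    """Toy model of one phone running a decentralized proximity app."""
    def __init__(self):
        # Random rotating identifiers: nothing here links to a name or number.
        self.my_tokens = [secrets.token_hex(16) for _ in range(14)]
        self.heard = set()  # opaque tokens overheard from nearby devices

    def broadcast(self, day):
        return self.my_tokens[day]

    def listen(self, token):
        # Store only the token itself, never who or where it came from.
        self.heard.add(token)

    def check_exposure(self, published_tokens):
        # Diagnosed users publish their own tokens; matching happens locally.
        return bool(self.heard & set(published_tokens))

# Two phones pass within Bluetooth range on day 3; a third never does.
alice, bob, carol = Device(), Device(), Device()
bob.listen(alice.broadcast(3))

# Alice tests positive and publishes her tokens.
print(bob.check_exposure(alice.my_tokens))    # True: bob was nearby
print(carol.check_exposure(alice.my_tokens))  # False: carol was not
```

The point of the design is the one made above: no party ever assembles a map of who met whom, because matching happens on each phone against a public list of opaque tokens.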
But the implementation of these matters a lot, and I know we're running short on time, so we can get into it more later. It's a big thing technically to not be able to connect those Bluetooth identifiers to phone numbers or real names; it's a big thing to wall that data off from different parties; and the biggest unanswered question for me is how we ensure that the use of this technology and the retention of this data end when the crisis is over. That opens a can of worms: when is the crisis over? During what phase of the crisis is this most useful, and how can we define and identify that, and use policy and technical safeguards to ensure this does not become a new kind of surveillance power that governments or companies have adopted? Mass surveillance didn't end after 9/11, and we don't want a similar situation here.

Thank you for that. I've been seeing a theme in some of the Q&A, and I'm going to take moderator's privilege to frame this question the way I'd like to: how is the pandemic going to change the privacy debate going forward? Is it going to spur states to pass privacy legislation? Is it going to change the conversation at the federal level? If you could look into a crystal ball, where will we be 18 months to two years from now — what does this mean for the privacy conversation in the US? I'll start with Stacey and then go around.

Yeah, I think it changes the conversation in two ways. One is that we are clearly seeing the inadequacy of models like the California Consumer Privacy Act. Not to say that I think the CCPA is a bad thing — I just think it's not enough. The CCPA is at its heart basically an opt-out bill: it doesn't place any limits on how data is collected in the first place — it's not going to let you opt out of collection — and aside from the opt-out, it doesn't place any limits on how that data can be used, shared, or sold, right?
It doesn't have any protections for location data or other sensitive data like health data, aside from that opt-out. So it's a good start, not a bad thing by any means, but I think wholly inadequate to the current situation we're in. Advocates kind of knew that, but now we're all seeing it in a sharpened light. And the second is — and maybe it's a good thing — I think we'll see a more sophisticated conversation now around what data we want shared for public health, for crises, for scientific research, and what the right protections for that are. There are going to be some hard lines in the sand where privacy advocates will always say there are no situations, ever, under any circumstances, where certain types of data should be shared with the government. But aside from maybe a few narrow cases, I think the more subtle question will be: what are the right protections when we do acknowledge that there is a good thing that needs to happen, particularly in a crisis? We've seen, for instance in Peter Swire's really excellent article in Lawfare on lessons learned from 9/11, that it's very important to establish legal authority up front and avoid different people having different interpretations of what the government's legal authority looks like, and to establish purpose limitation up front, rather than giving in to the temptation of the government saying, "Just give me all the data and we'll figure out what's useful."
You have to figure out what's useful and set a very clear, granular, specific purpose for the data up front, and then have an exit strategy afterwards — meaning you have to establish the rules for whether data is going to be retained, whether it's going to be used for other purposes, whether it's going to be used for law enforcement down the line. These are all really important underlying questions that we haven't even really begun to address, so I think we'll see a much more nuanced conversation around that, and that's a good thing.

Jenny, how about you — what's your prediction? I think one of the main ways this will change the privacy debate is actually on enforcement. A privacy law is only as good as its enforcement, and one of the big priorities for me and for EFF is non-discrimination, which means you won't be punished, or charged a higher price for services, for exercising your privacy rights. A kind of analog to that, which I think we're seeing in different places with coronavirus, is a sort of discrimination: you have to have downloaded this app to enter a public place; in China you have to show your green QR code, proving your health status, to enter certain places. There are obviously different policy questions when it's public health versus privacy versus law enforcement. But I think that idea of non-discrimination — this extremely boring word that makes you want to fall asleep — is coming alive for a lot more people very quickly. And as we investigate what these different contact tracing apps are, how they differ, how they're implemented differently, one of the big guardrails we're looking for is that having downloaded the app, or having given it certain information, must not be a condition for entering a public place or applying for a job. There are a lot of dystopian realities that seem right around the corner.
As far as discriminating based on information you've made available specifically for this public health case, and balancing that with the imperative of adoption and the imperative of complete information for public health officials — I think that's really hard. But a key thing, I believe, is that if we mandate or compel installing a contact tracing app, or sharing certain information with it, or having it on all the time, I fear the relationship that would create between the public and health officials, between the public and health resources. It would set up the wrong incentives around seeking care or seeking testing. So in all those ways, this idea of non-discrimination has taken on new meaning for people, and I'm interested to see how that plays into the privacy debate.

Gaurav, how about you next? I think I had, perhaps two seconds ago, a slightly more optimistic take on where privacy legislation is going next, but the points you both made are really insightful. And it hits this chord: we have sequestered the commercial data questions from the government sharing questions, while Europe has attacked both things at once. What this crisis is showing is that we cannot, for mere political expedience's sake, continue to divide those two things. We could be staring down a world where your application for government benefits is predicated on your actually having installed an app, so we can't say "this is the commercial privacy bucket, this is the law enforcement and government sharing bucket." Not to say this is necessarily a pessimistic view, but it certainly complicates the question, where now, through necessity and through our experience with this crisis, we just have to consider both of these questions at the same time. That is going to add at least another layer of serious thought that the public, advocates, and the government have to work through when addressing this.
Eric, I'll give you the last word on this — what are your predictions? Yeah, so in the same way the Cambridge Analytica scandal brought privacy into the limelight three years ago or so, this will do something similar, at least with respect to the data that is obviously relevant now: health data and geolocation tracking. It's going to place more emphasis on how much of that data is collected, how long it's retained, and what the general policies around that data are. As Jenny and Stacey said, lots of companies already had this data, and now it's just being used for public health purposes rather than some other purpose. So I guess my prediction is that hopefully people will see this and be more knowledgeable, but also hopefully a little more enraged about how much data is being collected on them, and will call for it to stop or slow down, or whatever they prefer.

So I'm going to pose a question to the entire group. If you could wave a magic wand and make sure a specific privacy provision was inserted into the next COVID stimulus relief bill — and I'm taking comprehensive federal privacy legislation off the table, because I think that's the easy answer — what is the one thing you'd really hope Congress could put in to meaningfully protect data in this crisis? I'll start with Jenny.

Yeah, my answer is actually the same as my top priority for a comprehensive privacy bill, because I think at the end of the day — privacy right now versus privacy one year ago — the priorities are similar. It's another enforcement one: it's a private right of action.
That's absolutely one of our highest priorities. Again, whatever provisions we have are only as good as how they're enforced, and Americans should be able to have their day in court without waiting for regulators to get their act together, especially when we see privacy violations on a different scale. If a company is found to be violating your privacy rights enshrined in whatever law, you should have the ability to take action on that.

Thank you. Eric, how about your one ask? I really agree on the private right of action, but to put it in the substantive rather than the enforcement bucket: my waving of the magic wand would be strong data minimization requirements that would limit both collection and retention of data. Part of what I've been talking about today, and previously, is the ability of these companies to collect and retain this data and now potentially use it for secondary purposes, so I'd add use limitations on top of that. I think that's going to be a major issue — seeing what governments and companies do with the data they're using for responding to COVID-19 three, four, ten years from now. There are a variety of harms likely to come about unless we can limit the data's collection and retention now.

Stacey, I'll let you go next. Like Eric, I think my one thing would be some attention paid to collection limitation.
That looks like a lot of things: data minimization principles, retention principles, privacy by design and security by design. These are aspects of European law that haven't really made it into the limelight yet in the United States. Privacy has to be built in from the very beginning of how a product or service is designed for us to get really substantive protections; we rely way too much on notice and choice and user control, and not enough attention is paid to the design of products. The effect is exactly what you're seeing right now with Zoom, where they left privacy as an afterthought — they never really figured it out from the beginning, and now, because of all the attention and heightened scrutiny, they're suddenly having to redesign and figure everything out. And they're not alone; that's almost all companies that aren't major tech players — they're not thinking about it from the beginning.

So, as was said earlier, there's actually huge agreement among these advocates, and even in much of industry, about what actual privacy legislation needs to look like. I'm with EFF on the need for robust enforcement; there needs to be data minimization; there need to be collection limits. In the context of this emergency — let's say the next COVID relief bill — my magic wand would definitely be a time limitation on whatever we decide next. People are desperate to get out of their houses and out of this situation, and I think legislating in this mindset is not necessarily going to produce policy we're going to be happy with three years down the line. As Jenny said, we're still living under the shadow of post-9/11 national security laws and other kinds of data collection that we've only barely been able to push back on and are constantly still fighting about.
I think emergency legislation should be considered in that light: let's put a year-long policy in place and then reevaluate — if we have to do such a thing at all — when we can put this particular situation behind us, so we don't live under the shadow of COVID-19 for the next 20 years.

Well, thank you so much for that. I really appreciate all four of you joining me today; this was a really insightful conversation, and I learned a ton, and I hope everyone who attended learned a lot as well. These panelists are all on Twitter, and hopefully Meredith, who's running our Public Knowledge account, will throw their Twitter links in the chat, so you can talk to them more. I know I didn't get to all of the Q&A questions, which were great, but unfortunately we only have about an hour. I want to give my panelists a chance, if you weren't able to say something you were really hoping to say, to say it now — for our panel or for our attendees, what you'd like them to walk away with, how you'd like them to be thinking about this going forward, a last five-to-ten-second takeaway. I'll start with Eric.

My message would be: stay home, and stay safe and healthy. Jenny? Along with that, wash your hands. Yeah, I think we covered everything — thank you so much, Sarah, and everyone for coming. I think we covered a lot of ground, and like you said, always happy to talk on Twitter; we are extremely online these days. Stacey? I'll just say, a reminder to everybody who's new to this issue: none of this is new. We've been doing this for 50 years in the United States, starting with the FIPPs that led to the Privacy Act of 1974, which almost applied to private companies — to Gaurav's point about tackling government and companies. This has been going on for a very long time.
This was a great conversation. Right, we're all extremely available — if anyone has questions out there, hit us up. On federal privacy legislation, CDT, you all at Public Knowledge, Access Now, and EFF, I'm sure, have all released really good model bills or principles, and as Stacey said, we've all been talking about this for a long time, so please look at those resources. And thank you all for joining us here at Public Knowledge.

Please keep following this issue — as you can tell from this conversation, we all very much like talking about it. We look forward to continuing the discussion; I'm going to be on Twitter talking a ton about what I've read in the statements and witness testimony, and hopefully about the questions when they come out. There are lots of great privacy people on Twitter as well, so if you want to keep the conversation going, definitely go there. All right, thank you all so much. I believe we'll be ending the webinar now — Meredith can do that — and we'll make sure to follow up via email with the links from the chat to everyone who attended.