Okay, I think we can get started. So good afternoon, everyone, and thank you so much for coming out and joining us for this session. My name is Ben Green. I'm a postdoctoral scholar in the Michigan Society of Fellows, an assistant professor in the Ford School, and a faculty affiliate of the Science, Technology, and Public Policy Program at the University of Michigan. The Science, Technology, and Public Policy Program, also known as STPP, is an interdisciplinary university-wide program dedicated to training students, conducting cutting-edge research, and informing the public and policymakers on issues at the intersection of technology, science, ethics, society, and public policy. One of the few silver linings of the pandemic is that people can join our events from anywhere. So for those of you interested in learning more about our STPP program, you can do so at our website, stpp.fordschool.umich.edu. Before I introduce today's speaker, I wanna make a couple of announcements. First, for those of you interested in the STPP Graduate Certificate Program, the next application deadline is in the new year, on March 1st. Second, our next STPP webinar will be on Monday, February 8th at 4 p.m. I, Ben Green, will be giving a talk called Algorithmic Realism: Expanding the Boundaries of Algorithmic Thought. If you're interested in attending, you can register at the link in the chat. I'm very excited about today's event, Digital Contact Tracing: An Unlikely Policy Story, and to welcome our guest, Erin Simpson. Ms. Simpson is the Associate Director of the Center for American Progress, where she is working towards an open, generative, rights-respecting internet and effective democratic regulatory infrastructure.
Simpson served as the civil society lead for the Computational Propaganda Research Project at the University of Oxford, where she supported international civil and human rights leaders in preparing for disinformation and advocated for improved platform regulation in the European Union, United Kingdom, and United States. She was the founding director of programs at Civic Hall Labs and a Microsoft Tech Fellow. A Marshall Scholar and a Truman Scholar, Simpson holds degrees from the University of Chicago and the Oxford Internet Institute at the University of Oxford. This afternoon, Ms. Simpson and I will have a conversation for about 30 minutes and then we'll open up the floor for questions in the second half of the hour. So please submit your questions through the Q&A function in Zoom throughout, and we'll bring in some of those towards the second half of the hour. So Erin, thank you so much for joining us today. I'm really excited to be able to have this conversation with you. Ben, thank you so much for having me. Thrilled to be joining you for the hour. Great, awesome. Glad you don't have to make the trip up to our cold Ann Arbor days. So that's a good bonus for you. So just to get us started off and get everyone on the same page, can you provide a brief description of what digital contact tracing is and what exposure notification systems are, how they work, and why they are so relevant in the midst of this COVID-19 pandemic? Yes, and I'll take each of those in turn. Before I dive in, I just wanna make one note, which is just sort of to honor those affected and acknowledge that this is not an abstract story, even though we're having a conversation. This is hard to talk about. People are suffering. The pandemic is ongoing. And indeed, it's so much worse than when I proposed this topic in June.
So even though we're talking about how exposure notification and digital contact tracing have played out in policy spaces, I just wanted to say that I hope our focus here is productive, not just as an intellectual exercise, but in helping catalyze some reflection and understanding in order to address the pandemic more effectively in the future. So just thanks and solidarity and support to everyone navigating this, and just to acknowledge up top that this is about an ongoing crisis. And so to your excellent question: contact tracing itself, and I'm not a public health expert, so I speak with the utmost humility about the whole thing, but the traditional infectious-disease-trained public health contact tracers, they're gonna call you up and interview you if you test positive for an infectious disease. Ask you where you've been and who you've seen. And then they're gonna get in touch with everyone you've been in close contact with to help them get care or isolate as appropriate. So the idea is you're slowing the spread of disease by tracing its spread and helping those people seek care. And digital contact tracing was this idea that if we harnessed mobile phone capabilities, we couldn't replicate the process of public health contact tracing, but there was something we could add to the public health process by reducing the time between exposure and isolation. So if you think back to, what did we know about the coronavirus in March? It's an airborne pathogen, with high levels of spread from asymptomatic people, who can be infectious for up to two weeks before they may or may not show symptoms. And all of that together meant that you're looking at a pathogen that's spreading really quickly. And early on, our public health team was putting together recommendations around contact tracing, the vaccine, the whole lot. And they're saying, look, there are some real limits here.
No matter how much we asked states to staff up, and you might remember the Johns Hopkins Center for Health Security saying we need to hire 100,000 contact tracers this summer. And some states met the bars that were set. So incredible infrastructure investment. But even with that, there were gonna be some gaps. So one of them is speed, it just takes some time. One of them is capacity. Once we enter levels of community spread, it's really hard to do contact tracing. You don't even know where you got it. And then there's an issue of anonymity. So the city bus problem is how we've sort of shorthanded it. You're on a city bus and you don't know anyone else there. And so thinking about those gaps, in addition to all of the other measures, right? So mass testing, social and financial support to self-isolate, contact tracing, more resources, the whole kit and caboodle, people in the tech policy field were having a conversation of, is there anything else we could do, because our public health experts are telling us that manual contact tracing alone isn't gonna do it. And we're looking to places abroad that have explored a lot of different ways to mount a digitally capable public health response. And then your last point, so exposure notification applications grew out of this question of digital contact tracing. And when I talk about exposure notification systems today, I'm gonna be talking about the Apple and Google supported exposure notification system specifically. So this is a voluntary, privacy-preserving, decentralized system using Bluetooth Low Energy technology. In the shortest version of how it works, my phone exchanges temporary anonymous IDs with nearby phones that also have it enabled. So my phone's gonna log those IDs from the phones around me, regularly pull down the list of anonymous temporary IDs associated with positive tests, match them against my own phone's log privately, and let me know if my private log of temporary IDs actually contains an anonymous ID that's tested positive.
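(A minimal Python sketch of the local-matching idea Simpson describes here. This is an illustration only: the real Apple/Google system derives rolling proximity identifiers cryptographically and runs at the operating-system level, so the class names and random-token scheme below are hypothetical simplifications.)

```python
import secrets

def new_temp_id() -> str:
    # Each phone periodically generates a random, unlinkable temporary ID
    return secrets.token_hex(16)

class Phone:
    def __init__(self):
        self.my_ids = []           # IDs this phone has broadcast
        self.observed_ids = set()  # IDs heard from nearby phones via Bluetooth

    def broadcast(self) -> str:
        tid = new_temp_id()
        self.my_ids.append(tid)
        return tid

    def hear(self, tid: str):
        self.observed_ids.add(tid)

    def report_positive(self, published_ids: set):
        # On a positive test, the user consents to publish their own IDs;
        # the server only ever sees these anonymous numbers
        published_ids.update(self.my_ids)

    def check_exposure(self, published_ids: set) -> bool:
        # Matching happens locally on the phone: the download is just a
        # list of anonymous IDs, never the user's contact history
        return bool(self.observed_ids & published_ids)

# Two phones near each other exchange IDs; a third was never nearby
alice, bob, carol = Phone(), Phone(), Phone()
bob.hear(alice.broadcast())

published = set()
alice.report_positive(published)

print(bob.check_exposure(published))    # True: Bob was near Alice
print(carol.check_exposure(published))  # False: Carol never saw Alice's IDs
```

Note the key property Simpson emphasizes: there is no central log of who saw whom, only a published list of anonymous IDs from consenting positive cases, and each phone does its own matching.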
And so then that application can ask me to seek testing or self-isolate. And then if I do end up testing positive, I can share that I've tested positive, and the cycle starts over, with other people pulling those numbers down and getting notified privately. And so the name "digital contact tracing" was a little misleading. This is not a version of what public health workers were doing at all. It was a tool that individuals were using to protect themselves and their community. And my team was involved really since the beginning, sort of trying to support this process in the United States, trying to help lift up recommendations from great university teams and other nations who were ahead of us, and then helping draft privacy legislation at a federal level around all of this. And so as of November 24th, 16 states and territories plus DC are supporting exposure notifications. So in just a short amount of time, it's gone from a twinkle in the eye to very much real. Wow. So this is, as many of the responses have been this year, a rapid-paced effort. And I particularly like your emphasis on this being in addition to everything else. So often we wanna talk about, here's this technology, it can solve this problem, that alone is going to do it. And I appreciate the emphasis that this is just one piece of the puzzle in addition to everything else. So you gave us a little bit of a preview of what the actual infrastructure is when you talk about exposure notification, and the number of states that have already started implementing something. So let's go back to, you know, around the spring, when the pandemic was starting and these conversations were starting and we were trying to figure out how to do some of this. What were some of the pressing technical questions, but also the social, political, and policy questions, that had to be asked and addressed before we could actually get something going?
In March, I think just about everything was a question. I'll give you my top five categories there. So first, there was a question about Bluetooth at all. Most people are familiar with Bluetooth as, like, having their headphones not connect. And it was not at all designed to measure distance, especially between two, only two, points. So Bluetooth questions. Second, there were crypto questions. Could we get to a privacy design that's gonna be good enough? Third, there were interoperability questions, both between Apple and Google, who had Bluetooth handshakes working a little differently, and then also, how are we gonna do this between states? In the U.S. we have a really decentralized public health system, at the state or sometimes at the county level. And so how are we gonna push a system that actually works with people moving around? Fourth, testing integration and verification. People were really freaked out early on that you could abuse the system with a false positive. I say I tested positive, I didn't, I really mess up my neighbor's day, et cetera, et cetera. So testing integration with state systems. And fifth, calibration of the epidemiological algorithm. So we're still learning about the coronavirus, but how close, for how long, did I need to be to you? Was it six feet? How do we measure six feet in Bluetooth terms? So there was a lot going on there, in open-source hubs and academic teams around the world sort of trying to work on those questions. And then simultaneously, as we were looking at this from our specific U.S. context, we were thinking about, first, surveillance. So we have a history of expanding government surveillance powers during crises in this country. So thinking about the Patriot Act after 9/11. And we also have really disproportionate surveillance targeting certain communities. So thinking about the HIV/AIDS crisis and public health surveillance of HIV/AIDS during the 80s. Thinking about federal surveillance of civil rights protests during the 60s.
There's a really long history of distrust, sort of rightfully earned, of the government. And so we're thinking about surveillance powers, and we're thinking about how that history comes to bear on public trust. Two, I mean, just broader civil rights and equity concerns. So the disparate impact of the virus was already a problem for communities of color, for the disability community, and for working people and people without healthcare. And so are we designing a system that's helping to ameliorate those disparate impacts? Or are we using our technology to just compound those things? Is this only gonna work for white teenagers who have iPhone 7s, et cetera, et cetera? And there were sort of traditional civil rights concerns, about public accommodation. Could you be forced into using these? Could I be forced to show you that I don't have an exposure before I head into work or something? How could they be used in that way? And then there were also, you know, governance and oversight concerns, and concerns about private company involvement more broadly. So we had a lot to deal with. And on top of that, we had the number one political concern of just no federal leadership in this space, and not really a desire to solve the pandemic or be working on the pandemic. So there were a lot of questions in the beginning. So it seems like you had your hands pretty full. So I think, you know, one of the major questions that people started thinking about here was privacy, both from the perspective of the technical infrastructure and design of the system, but also how it would connect to systems of surveillance, particularly during a summer when we were in lockdown, trying to build these systems and thinking about them, but also seeing mass protests across the US for the Black Lives Matter movement against police brutality and police surveillance.
So how did these policy and technical elements interact, and how did the policy questions and the concerns about privacy shape the technical debates that you or other researchers were having, in guiding the types of systems that ended up being designed? Yeah, that's a great question. And you draw really the perfect contrast: like, are we really gonna ask people to let us track them when they're trying to go to protests and being, you know, tracked by the police in all kinds of different ways? The concern was really manifest when there was an official in Minnesota who, in response to the protests against the police killing of George Floyd, was saying, oh yeah, we're contact tracing all the protesters, we're gonna figure out where they've come from, we do it just like we're contact tracing the coronavirus. And I mean, that encapsulates exactly some of the historical fears and the present fears: well, this is why, you know, we don't know if we wanna be digitally enabling the public health response. So our team was deeply concerned about that, working with our race and ethnicity team and our disability justice team from the beginning. You know, even under tough circumstances, we knew that the risks around surveillance, and around law enforcement or ICE access to this data, were really high. With this kind of data, you know, you could recreate the social graph, essentially: where I've been and who I've seen, even if you're not using GPS data, even with Bluetooth. So we pushed from very early on to go to a decentralized system. And so this is, you know, we're never federating all the codes that all the phones have seen at once. We're, you know, we're pulling things to our phone and we're matching there. And that meant a lot of things, technically and for public health. And so technically, you know, we were looking at the Decentralized Privacy-Preserving Proximity Tracing project, DP-3T, which was a big open-source collaboration out of Europe led by Dr.
Carmela Troncoso. And that was really the first thing we'd seen that said, oh, this could work in a totally decentralized way. And the beauty of that is there isn't a centralized database to subpoena if you're trying to arrest someone, and there isn't a centralized database to hack into in terms of a security risk. You're just getting a bunch of random numbers that aren't gonna mean anything to you. And so that doesn't completely eliminate privacy risk, but that was the first hurdle for us in terms of even considering something. Especially during the federal administration we had and everything that was going on, we had to have something that was safe for people. And the burden that then fell on us was to really argue for that in a powerful way to public health authorities, who were the decision makers here. And so we're first arguing that there's not a tension between privacy and public health, because at the end of the day, these should be voluntary systems, and it would be really hard to make them mandatory. And so the only way these can succeed is to make something worthy of trust, because people are gonna have to want to download it. The only way to do that is to do this hyper-privacy-sensitive, anonymous, decentralized design. And so that sounds really good. But then you get into these conversations deep with public health officials. And rightly so, they're managing in a crisis and they're saying, we don't have eyes on the problem. We don't have the data we need. You're telling me we're gonna invest in an entire technical system around public health data, and we're not gonna be able to have a dashboard or useful metrics. And also, you can't tell me if it's gonna work. And so there were real tensions there. And I don't mean to be dismissive of those. I think coming from the tech policy side, it was easy for us to say, yes, we only want this decentralized system, of course.
But then we really had to argue that we do think that's best for public health, because there's just no other way. And we understand that that's hard to invest in right now, because your teams are scrambling for resources and it would be great to have more data. And a lot of folks came into this conversation thinking about South Korea's system of public health, which is extremely extensive, developed during SARS and MERS: a system for federating financial data, transportation data, all kinds of things, so they have a really comprehensive view. And so it was hard to walk into those conversations having that be people's point of reference sometimes in terms of what they hoped they were getting from these systems. Well, I think it's a great example of how our broader systems of governance, and particularly these histories of policing and surveillance, really shape what we can and can't do when it comes to data collection. As someone who studies smart cities, there are so many cases where data has been collected, supposedly to curb congestion or improve public health, and there's just a natural gravity where it gets pulled into being used to track protesters or investigate cases, such that for data that in a certain system could potentially be useful, the fears of it being used for racially unjust surveillance and enforcement mean that it's not worth the risk. So it's an interesting case of how maybe in a different system we could collect some of this data in a different way, but given the reality of our present city and local governments, and how these data collection and sharing systems work, suddenly that becomes very difficult, if not impossible, to be worth it. And so you mentioned working on, as one of your other major efforts, privacy legislation.
So you were having these conversations, you had a system in mind. How did you ultimately convince legislators that this was the right way to go? What did the process of drafting that legislation look like, and where does that legislation stand? That legislation stands introduced, in committee, and not passed. But legislators were concerned early on about these systems. I mean, they're concerned about the public health crisis, and a lot of our sort of privacy champions in Congress were also concerned about coronavirus privacy. And while I don't lose sleep over the exposure notification systems that are popular now, which state public health systems run and Apple and Google run, there's a proliferation of other systems, private systems, systems that universities are requiring, systems that workplaces are requiring, that are extremely privacy-invasive. And so the idea legislatively was to lay down a baseline to make all systems safe, even if you were departing from our desired design. And so our congressional engagement sort of worked on two tracks. I mean, first, we worked with a broad coalition of other civil rights and advocacy organizations, in partnership with The Leadership Conference and others, to draft a civil rights principles for coronavirus letter for the Senate. And so just laying down really the technology-in-use problems, the non-discrimination and public accommodation protections we wanted, and the voluntariness and the purpose limitation especially. And so at that federal level, you're thinking about, how do we restrict this purpose to public health? How far can we legally go to protect it from ever getting to ICE or law enforcement or intelligence communities, places where we don't want this data to go, because we're collecting it for public health and we've got issues with sharing it.
And so we put forward that letter, and those coalition processes are tough, and they're great, because when you're able to say all of these groups are behind these principles, that's powerful. And then there were two Senate teams who were working on different sort of scopes of coronavirus privacy legislation. If we had federal privacy legislation in the US, none of this would have been necessary. We would have had great baselines already. But instead, we were working with a lot of really smart Senate staffers from Senator Blumenthal's office or Cantwell's office or Markey's office, just on trying to establish what the right baseline is. What are employers gonna try to do? What are the tough edge cases where there might be legitimate reasons you would need something to happen that broadly you're really worried about? And so really just pushing for purpose limitation, voluntariness, and non-discrimination, so that even if somebody goes out and makes a horribly privacy-invasive app, it can't be abused in the ways we're most worried about it being abused. Of course, that legislation doesn't move forward, because we don't have the power in the Senate to move things forward right now. Mm-hmm. Okay. And so what's on the horizon, then, as we have a transition happening at the presidential level and the state of Congress potentially up for grabs next month? Where do you see that legislation or other efforts around exposure notifications going politically as we move into a new administration? Yeah, I won't speculate on what the new administration will do or what Congress will do, but I know what I hope will happen. And I think that's what many people hope will happen regardless of your party affiliation, which is just a coordinated, serious federal response to the pandemic.
As I said at the beginning, digital contact tracing is kind of the last thing on top of deep investment in our healthcare infrastructure and testing and tracing and the vaccine and financial supports so that people can actually stay home safely, and everything else. And then yes, we would love to see federal support behind what has thus far been really scrappy state support to get these systems set up and working, with Apple and Google getting things set up. We've seen a lot of success in other places that have federal support, say Germany. I think they've now reached 20 million users, up from 18 and a half million earlier this fall. So 20 million users, it's amazing. They've had over 5,000 people get notifications that they were exposed and get tested who wouldn't have otherwise. So I think, you know, I'm just looking forward to an actual serious federal policy response from the administration, and hopefully Congress will move on coronavirus relief as well, because no state is gonna be able to spend any time on this, even if the cost is negligible, if they don't have the bread-and-butter funding to federate their county-level contact tracer databases, which is something a lot of states had to work on for months early this summer. So plenty to do. Yeah, very helpful for 2021 and understanding that that'll be a serious issue. Great, so we'll be jumping to audience questions soon. So just a reminder to put those into the Q&A, and we'll try to get as many of those in. In just maybe about five minutes or so we'll make that transition. So Erin, one thing you've mentioned a couple of times is the role of Apple and Google in all of this. So what did that look like? And, you know, how do you feel about the role that they've been playing in this whole process? Oh, it's so complicated.
I mean, grateful that they made a responsible decision to really embrace privacy protections around this system and then make it available worldwide, under what is basically as good a design as we could have hoped for. It's so rare that you put out a policy prescription and it actually comes to fruition exactly as you hoped, but it was truly sort of one of the best systems we could have imagined from the open-source work that was going on. And so credit where credit is due to the teams at Apple and Google, although the initial announcement was a bit of a surprise. Getting into the fray, taking a lot of feedback on the spec really quickly, being really communicative with the different technologist teams who were working on this and with different states, making changes early on. You know, they've really evolved to make it super easy for state adoption. And so I credit them with all of that. Also, you know, I shouldn't have to feel grateful that they're making responsible privacy decisions there. We should have public regulatory baselines that make sure we don't have to hope that Apple and Google do the right thing; they would have to do it. And, you know, there wasn't a lot of public involvement. Our federal leadership was out to lunch, obviously golfing, and these two companies made the de facto international standard for contact tracing kind of immediately. And I'm glad that it's good. But I don't know what will happen next time. And it reflects poorly on our public infrastructure that we weren't out front and that we didn't have something ready, something that we were leading with or engaging with as other nations were. It's just sad. And so it's an interesting example of the power of that mobile operating system duopoly. It's a really pro-social use of it, I think. And I, you know, give those teams so much credit.
And it also, I think, sits in a kind of disturbing, eerie way in the back of all of our heads, in terms of just the immense power, even if used for good, that is held there. Right, right. How much we are left up to the decisions of these giant unelected companies to make really massive public health decisions in terms of privacy, health, all of that. So yeah, turning to the future, right? We've been maybe nine months in the real thick of it in the United States. As you think about the next nine to 12 months, we're obviously in the middle of a really bad wave, and even if there's some promising news about vaccines, it's gonna take a long time until things are back to normal. And now we actually have these exposure notification systems starting to really be rolled out, come online, and we know a little bit more about them. So what do you see their role being over this next year? And what's maybe your greatest hope and your greatest fear about what will happen with these systems as we navigate really this next phase or two of the pandemic? Well, I think after all of the other public health interventions I've listed off, I really do think, at the back of the line, this is another tool to help people protect themselves and protect their communities and add another layer of protection. People are talking about the Swiss cheese model in terms of the public health interventions we need. I think this is another slice of Swiss cheese that can maybe help with some of those speed gaps and those anonymity gaps that we talked about. Especially once we leave community spread, there's an opportunity, theoretically, to help us reopen more safely and to curb outbreaks more quickly, because people are reducing that time between getting infected and self-isolating. My greatest hope is that that happens. That would be my greatest hope.
And my greatest fear, especially because legislation hasn't passed on this, and I don't lose sleep over the main Apple-Google exposure notification system, but I do over others. My greatest fear is that civil liberties abuses come to fruition, either from a surveillance perspective or a public accommodations perspective, and history's full of those examples. So I think that's a justified fear. Absolutely. Okay, so let's jump into some of the questions from the audience. So to get started: you mentioned the absence of privacy legislation in the United States. How does the approach to digital contact tracing in the US contrast with countries covered by the General Data Protection Regulation, or GDPR, in the European Union? Were digital contact tracing programs even possible there, and what do they look like? Yeah, that's a great question. So we got to talk to Germany's team a couple of times over the summer while they were developing theirs, and they described their challenge as 75% communication and 25% technical. And their communication was working a lot with their government privacy agency, in addition to government health agencies, et cetera, because they had to get GDPR approval off the bat. They had to make sure their design was in compliance, and they did that in advance. And so they set up validators within the government and outside of the government, with some validators in the open-source community and the privacy community. And they tried to get everyone on board beforehand, because public trust is so fragile in these things that they didn't want it to be ruined. And so yes, systems are possible under GDPR, and even really successful. I think Germany's is one of the most widely used at this point. And they'll also be federating their privacy-preserving exposure notification applications very soon across Europe. And so not a barrier.
And indeed, I think probably an accelerant, just because if you do have baseline privacy protections, and that's something you're accustomed to as a public right, maybe you have a little less trepidation about your privacy concerns, unlike in the US, where it's just normal corporate practice to have horrible data practices. The bar is so low, like, why would we trust anything after how we're treated every day? And so, yeah, that's a good question. If you find any issues with that, you should let me know. And on that theme, as you look across globally at other examples, what are some of the other types of systems that other countries are running, whether in the EU or in Asia, that maybe are quite different due to the nature of how their laws and institutions are set up? Yeah, I should say first, a lot of countries are on the exposure notification system by Apple and Google now, just because it's a near-duopoly of mobile operating systems globally. So that's become really easy to roll out. But before that, there were several countries that were real leaders around, okay, we're gonna use technology as a part of public health. I mentioned South Korea and the really detailed system they had there in terms of federating the financial records, transportation records, this, that, and the other thing. They got a lot of attention for letting people know when there was an exposure near them, which I think struck a lot of people the wrong way here, but it's really hard to make comparisons. Obviously China had a very sophisticated system of tracking and monitoring, and they had, like, the codes that you had to show to get into public buildings, so sketching out some of what we would perceive as public accommodations concerns. Singapore deserves a lot of credit for going out with Bluetooth early. So they were really trying to figure out whether the Bluetooth thing could work. And they actually had teams of professional contact tracers on the back end.
So it wasn't this distributed automatic system where you had to upload things yourself. They were doing a lot of that, and they were sharing their lessons, which I think was really appreciated. And there's a lot of countries that are exploring different systems. And at some point, all of these teams are really busy, so they're not doing a lot of publishing for the rest of the policy or academic community on how things are going, but I'm really looking forward to seeing the deep dives in terms of what worked and what didn't. Yeah, we'll have some really interesting retrospectives after the fact, as we can analyze both the different types of systems and the different types of institutions, and how they made these different systems possible or effective. And the app that might work in South Korea is very different than the app that might work in the United States. And there's really interesting comparisons to draw there. People were so concerned, I think, about rights or the privacy stuff, and that's true. But the most boring consideration is just, oh yeah, they have a very effective, federated, robust, centralized public health system, and we have county-level public health administration, and there are more than 3,000 counties in the United States. So it's just technically not possible. Like, the first concern I have is, oh, it's mundane, but there's no way. We can't even get close. So yeah, hopefully there's a lot we can learn going forward. Right, yeah, absolutely. So one of the questions is about how employers are maybe using digital contact tracing apps for their employees. So are there examples of cases where this is happening? Are employees forced to download these apps? And what kinds of civil liberties concerns arise? And how do the apps that employers are using for their workers compare, when it comes to equity, privacy, security, and so on, to the Google Apple system that you've been talking about?
Yeah, it's a walking civil liberties disaster, I think it's pretty safe to say. I mean, just to sketch it out: some of these apps are location-based, so your employer knows your locations and where you've been, or who you're associating with. There's no oversight on these applications. So one, we don't even know if they work, and they could be abused to discriminate in other ways. And two, they could be abused to discriminate on whether or not you have coronavirus, or have been exposed to someone with coronavirus, et cetera. And so it's easy to imagine how challenging that could be, and how that's disproportionately gonna be used against low-income workers and against people of color, because that is how our system is set up when we do manipulative surveillance programs. And so, you know, our civil rights letter and our privacy legislation would have made it really clear that the systems have to be voluntary and that you can't abuse them in that way. And, you know, it's still a concern even then; you just have the ability to bring litigation to challenge that and get redress. So it's an open issue now. We don't have eyes on a lot of examples. There are some great journalists who are looking into this really closely, especially thinking about warehouse workers and the ways in which they're already surveilled and how this will be sort of meshed into those systems. There's some troubling university examples, in terms of universities mandating apps to be downloaded, and then having those apps be super privacy-invasive, or just not very effective, or not active. And so hopefully the next round of legislation can address those issues by putting down a baseline which those systems have to meet, technically and use-wise, and giving them some real civil liberties oversight.
And I think that'll probably be a priority, given just sort of how the new administration is talking about the crisis of the pandemic and the crisis of racial justice. You know, this is a crisis in both senses. So I'm optimistic that that will be addressed. Right, because what this raises is that it's not just 3,000 or more counties. It's also a bunch of employers who might be rolling out systems. It's a bunch of universities, which are one of, I think, the main other types of institutions that are rolling out various types of systems. And right, I think you can imagine both which types of employers are more likely to have these systems, and, given the racial differences in terms of how hard the pandemic is hitting Black and brown communities, if it raises these sort of issues of public access and being able to go to work, what's going to happen, and who's gonna be able to show up to work, or who's gonna be fired; any of those sorts of questions are really severe. Yeah, yeah, the economic crisis as well. And just to say sort of an interesting policy note there: the strongest policies are just straight prohibitions, you can just ban it. And as soon as you start tinkering around the edges or making exceptions, those can be exploited. And so the trick for us, as we were talking about this rule legislatively, was, can we just ban it, or do we have to make exceptions? Because you can think of, okay, well, what's the edge case where an employer should be able to say, well, if you don't have a negative test, you're not allowed to come into work or something? You know, is it medical workers, or is there something there? And that can get really complicated, because it's easy to raise those hypotheticals and really quickly dilute a well-intentioned, strong rule into something that's easily exploited. Right.
And so a lot of this clearly then relies on voluntary use. Clearly there's a lot of reliance on trust here in a lot of directions, but particularly, ultimately, trust among people to download this app and have it on their phone. So one question would be, do we have any demographic information in terms of who's likely to download and use these apps, what the uptake is generally, and how this might be ranging across populations, across any number of demographic categories? Well, the answer is we don't, because we designed a system that wouldn't collect any of that information. You have to study that kind of information completely outside of the system you've built. So, you know, all people are getting is downloads, monthly active users, codes validated for positive tests, and codes shared. That's pretty spartan. So the German team actually is doing a public health study on top of it, where they're at testing sites asking people if they got the notification and that's why they came in. So that's kind of the level we're at right now. They are going to be publishing some of those conclusions, which we're looking forward to. But that being said, from the beginning, since this is 75% a communications challenge, people were thinking about, okay, well, who's least likely to trust us, the government, and thinking proactively about the disability concerns, the accessibility concerns, the multilingual concerns, and concerns about people, say older adults, who maybe are not as familiar with some of this technology. And so a lot of places did dedicated campaigns. Colorado described theirs as a Netflix-style launch, where for like three days they tried to get it everywhere they could get it, and then they launched it.
But they also talked a lot about a program reaching out to clergy and faith leaders, and trying to reach older populations and populations of color who might not be ready to trust the public health system, based on really reasonable examples from our history. And so I think people are being sensitive about it. Like, they know this is a communications issue. They know they need multilingual communication; anywhere they think there's gonna be less use, that's where they have to be proactive. And so at least we've seen, I think, a lot of thought go into that thus far. And how does this ultimately play out in terms of the efficacy of the system? I mean, certainly the more people you have who are using it or have it on their phones, you imagine the more helpful it'll actually be. I mean, what are the target numbers that you shoot for? How do you think about what success would look like, and how to make sure that we're able to hit those numbers, given the voluntariness and the messaging and the trust challenges? Yes, well, as we argued, the only way you're gonna get to that road of trust is through privacy and making it voluntary, which is of course the paradox there, because there is more to do and there is more to commit to. But there was an Oxford team that put out a study saying 60% is what they thought you needed for the system to be epidemiologically meaningful, to curb and break transmission on its own. And they revised that this summer to say, even if you get 15% of the population, you're gonna be starting to see meaningful public health reductions in spread, which would be incredible. So 15% is a much more achievable target, I think. And to be clear, any reduction in spread is totally worth it. And so when states are starting to publish numbers of how many testing codes have been claimed, even if it's only a couple hundred people, we don't know what would have happened otherwise, but this is spreading exponentially.
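For intuition on why even 15% adoption can matter, here is a toy calculation (my own illustration, not from the Oxford study): an exposure notification only fires when both people in a contact have the app, so the share of transmission pairs the system can see scales roughly with the square of adoption, and any resulting cut in transmission compounds across generations of spread. The 50% quarantine-compliance figure and the reproduction number of 1.3 below are purely hypothetical assumptions.

```python
# Toy model of exposure notification impact (illustrative assumptions only).

def pairs_covered(adoption: float) -> float:
    """Fraction of contact pairs where BOTH parties run the app."""
    return adoption ** 2

def cases_after(generations: int, r: float, reduction: float) -> float:
    """Cases after n generations starting from 1 case, with each
    generation's effective reproduction number cut by `reduction`."""
    return (r * (1 - reduction)) ** generations

for adoption in (0.15, 0.60):
    covered = pairs_covered(adoption)
    # Hypothetical assumption: half of the covered pairs actually avert
    # onward spread (the notified person quarantines in time).
    averted = 0.5 * covered
    baseline = cases_after(10, 1.3, 0.0)
    with_app = cases_after(10, 1.3, averted)
    print(f"adoption={adoption:.0%}: pairs covered={covered:.1%}, "
          f"cases cut by {1 - with_app / baseline:.0%} over 10 generations")
```

Under these made-up numbers, 60% adoption covers 36% of contact pairs while 15% covers only about 2%, yet even that small coverage compounds into a meaningful reduction over successive generations of exponential spread.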
And so I think there is a lot of interest there. Early numbers from states that are releasing this fall have been great. Colorado reached almost 17% of the population in the first week, which was incredible. Germany is at almost 20 million people now. So that's a significant part of the population that is over 18 and has a cell phone. And even if you're not participating, if we're able to reduce community spread in any way, that does benefit everyone. And so we're not worried that the benefits only accrue to people who are participating. We do think benefits can accrue broadly; we're all trying to spread it less. But yeah, it's a lot to fit into public health messaging, frankly, because the more important things are the guidelines we've all been told, in terms of social distancing, washing your hands, staying home. And those are being updated, so there's a lot to keep up with already; the challenge for teams working on exposure notification has been to cut through. But even low numbers, I think, we're excited about. Right, every case reduced helps, both for yourself and for the people that you could be infecting down the line, who may not have ever had the app or heard of it. Exactly. Right, so when we're talking about this conversation of trust and privacy, it sounds a lot like how we think about privacy in the digital world: you have to sign up, you click the box, you give consent for whatever the service or app might involve. But of course, true consent really requires knowledge of the full risks and benefits of consenting to the technology, and in many ways, the benefits and harms in this case are unknown. It's hard to know what that will actually look like.
And so do you think this is a flaw in these systems, and how can we think about really making sure that the trust and the consent that people are giving when they sign up is robust, and that they're really educated about what they're doing? Or do you think that's not even maybe the right framing for how we think about trust in this setting? No, I think that's a very astute question. Consent-based models in privacy legislation broadly are really limited, just in terms of the burden they put on individuals, when really it's a collective or a public burden that we should be taking on. In this instance, because we pushed for voluntary and decentralized, you're choosing to put a lot of onus on the user. And you heard my best attempt at the short version up top of what this does. So you try to put that into as plain language as you can for people; I think there are some user experience content writers out there who've been doing an amazing job, and a much better job than I did at giving it a shot. But in this case, for these teams, the best practices in the private sector are not enough. And so a lot of these apps will have sort of extensive walkthrough systems that try to give you the best shot. And yeah, this person raises great concerns. Like, do we really know the risks? Do we really know the efficacy? No, there's some uncertainty here. But also, we're not measuring those in a vacuum. We're measuring them against the risks and the uncertainty around the pandemic. And so even when my tech policy brain was like, I'm not sure, that is a limitation, it's only with the broader context of saying, but we're doing this. We have tried to poke holes in these systems as much as we can. We pushed for the strongest possible stuff. We're hoping to get a regulatory baseline soon. We're pretty confident relative to the risk in the emergency here. So I don't know if we've got it right. I don't know if we'll ever know.
Hopefully there'll be some great criticism in the coming year, as things subside. But yeah, it was an interesting example of just trying to make that calculation on limited information. Yeah. So as you were going through this process, one thing I'm curious about is what it looked like to have all of the different teams, both in your organization and beyond, really working together. You mentioned a bunch of civil rights groups and racial justice groups. You have your technology and public health teams. What was it like trying, at rapid pace, to work across this incredible range of disciplines and organizations? And are there any particular conversations or insights that particularly surprised you, where you said, wow, it's so important that we had not just the engineers in the room, but also this other group, because I never would have thought of this factor or this concern? So what was that actual working process like? Yeah, I think it really goes both ways. I remember first, we had come up with sort of the best possible system we could, technically, and then talked about it governance-wise; and again, not us coming up with it, but other teams coming with brilliant ideas and then us trying to recommend those. Doing a meeting with civil rights lawyers and a labor policy analyst, asking, what are the worker surveillance problems? How could this be abused? And just having people be able to come up with so many instances right off the bat. Even if you design a privacy-perfect system, the cryptography is beautiful, there's no way anything could go wrong, it's zero trust, it can totally be abused as soon as it's out there. And there are so many disturbing ways that people brought up, a lot of them with historical examples, in which these could be abused in practice, even if what's on your phone is really, really strong.
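To make "what's on your phone" concrete, here is a highly simplified sketch of the decentralized design behind DP-3T and the Apple/Google system. This is my own toy illustration, not the real protocol: the actual systems derive rotating Bluetooth identifiers cryptographically from daily keys, while this sketch just uses random tokens to show the data flow, in which nothing leaves your device unless you test positive and consent, and matching happens locally.

```python
# Toy sketch of decentralized exposure notification (not the real protocol).
import secrets

class Phone:
    def __init__(self):
        self.own_tokens = []       # identifiers we have broadcast
        self.heard_tokens = set()  # identifiers overheard nearby via Bluetooth

    def broadcast(self) -> str:
        # Real systems rotate cryptographically derived identifiers;
        # here we just mint a random token.
        token = secrets.token_hex(16)
        self.own_tokens.append(token)
        return token

    def hear(self, token: str):
        self.heard_tokens.add(token)

    def report_positive(self, server: list):
        # Upload only our OWN identifiers, with consent, after a positive test.
        server.extend(self.own_tokens)

    def check_exposure(self, server: list) -> bool:
        # Download the published identifiers and match locally, on-device.
        return any(t in self.heard_tokens for t in server)

# Two phones near each other exchange identifiers over Bluetooth.
alice, bob, server = Phone(), Phone(), []
bob.hear(alice.broadcast())
alice.report_positive(server)       # Alice tests positive and consents to upload
print(bob.check_exposure(server))   # Bob learns of his exposure; prints True
```

The design choice the speakers keep returning to is visible here: the server only ever sees tokens from consenting positive cases, never location data or the social graph of who met whom.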
And then honestly, just that initial calculation with our health policy team, and having them lay it out for us in terms of, we might need this. This could get really, really bad. We're trying to get all options on the table. Your gut reaction, I think, to the first idea of this, you're gonna make an app for coronavirus tracking, is like, ah, the tech solutionism, the surveillance, this is gonna be horrible. And I think we had to come to it a little bit reluctantly, honestly, just because we kept getting challenged to say, yeah, well, surveillance disproportionately affects Black and brown communities we really care about, and the pandemic is disproportionately affecting Black and brown communities we really care about. And so we need to really be real here about what we think the risks are. And trying to talk through uncertainty and risks that are that high-stakes, in tension, it was great to be able to do that with an interdisciplinary perspective instead of just the tech policy perspective. Because were I working on just the tech policy team, I think it would have been easier to just continue putting out, here are all of my concerns. And instead I had to put out, here's how I think we can overcome these concerns, based on the emergency that is presented to us. And that was really interesting. I mean, it was a challenge. Yeah, it was a challenging time. So what's interesting here is that it's almost the opposite of how you imagine almost 99% of these types of situations going, because it's actually you, as the tech team, that's reluctant because you're thinking about a lot of the risks, and it sounds like it's a lot of the public health and other groups who are recognizing the risks of not doing something.
In a nice way, if you have everyone concerned about all of the types of risks, you might actually be able to do something effective, rather than being very gung-ho and thinking that, well, this will be amazing. It's only by having everyone raise all the possible risks. But in this case, it was actually the tech folks who were the most reluctant to build something, which is really interesting. Well, it's great to work on a progressive technology policy team. It's refreshing employment, yeah. So as these systems are rolled out, one question then is, how effective are they at different points in community spread, or at different points in a wave? Is it better when community transmission rates are high, or is it more effective when things are more under control? Does it differ across different levels of spread in the pandemic? Yeah. In my humble public health understanding here, based on traditional contact tracing, there's a point in community spread at which there's just too much to trace. And it's too difficult, and you're behind, and it's a lot. And I think some of those problems hold with digital contact tracing as well. Not that you can't have really speedy notifications, but the other parts of the system are overburdened. So the testing is overburdened, or the health system itself is overburdened, and so you're not able to actually follow through on getting tested and self-isolating, and that's what makes it work. The knowledge is good in and of itself, but really you need to get tested and you need to stay home. So my understanding is that when we're out of community spread, or when we're leaving it, this could be of more use, just because the public health systems are in place and would be able to handle it. But also, say, once we're reopening, the hope is that this could help catch an outbreak earlier.
If we're still thinking about a two-week delay, and I know we're thinking about it a little shorter now, just thinking about how quickly an outbreak could spread within those days is really scary. And so if we all are using exposure notification and participating in public health contact tracing, there's a chance that, together, public health contact tracers can be strategic about finding the places that are really the super-spreaders, or where people are gathering, and the exposure notification systems kind of catch the long tail and encourage people to be staying home. Also, this doesn't really work for healthcare workers, who are in contact constantly. We should always be asterisking that that's one population for which it's probably not great. So as you think about the different dynamics for developing these systems and the rules, how do you think of the dynamics between different groups, and which are the ones that you think are particularly powerful, across the federal government, technology companies, advocacy groups, political parties? How are they all fitting together? And as the rest of us are thinking about ways to know what's going on, or actually support and promote or advocate for better outcomes, who should we be looking to? Who should we be calling or talking to or putting pressure on, across these groups, to help us get to better outcomes? Well, in terms of the first question, I think about having the administrative state fully activated to respond to the pandemic, and having federal leadership; I think a lot of states are ready for federal funding and federal leadership around these issues. So we know the incoming administration is taking that seriously. And so looking to that, and looking to effective public health messaging: the CDC has been doing public health modernization efforts for many years now, but those are still pretty divorced and separate from the kinds of systems we're talking about when it comes to exposure notification, realistically.
So it'll be really interesting to see how that conversation moves forward. In Michigan, as of last month, you have an exposure notification application. So I encourage you all to go ahead and stress test that, look under the hood, call the team, and definitely give them feedback. There's a really hard-working digital services team somewhere working on this, and I have good hopes for it. In terms of people to follow and related work, you can volunteer with the US Digital Response if you're a technologist or public health person who wants to lend their skills. Linux Foundation Public Health has been doing incredible work on this. They absorbed the TCN (Temporary Contact Numbers) Coalition, which includes a lot of the US groups leading work on this, and they're great. Obviously a lot of this started on GitHub and it's still continuing there. So you can go look at DP-3T and their initial ethics review and try to figure out how this developed for yourself. But I'm really excited for policy leadership on this, and obviously we're talking about continued CARES packages, continued coronavirus relief packages. And eventually, I think we'd like to see this acknowledged and supported at the federal level. And so maybe just one quick last question: you mentioned support through the CARES package. How can we ensure that these systems can ultimately be effective, so that people have the proper mechanisms to isolate, to get tested, all of that? I mean, that seems incredibly essential, that people actually be able to do something when they get that notification that they may have been exposed. Yeah, people can't actually stay home if they need money to pay rent to stay in their homes. And so given that we don't have financial support, I would rather have financial support and free testing and really affordable healthcare, for people to be able to do that. And so we should be pushing all of those things.
I would support a CARES package with all of those things and no technology support 100 times out of 100. And there are ways to upgrade the systems as well. So Colorado has a great thing where, if you have an exposure notification alert that you may have been exposed, that is sufficient for you and your employer to receive benefits, for you to get paid to stay home. And so programs like that, that are integrating these with the social and financial support that's needed to actually have people stay home, which is the point, that's great. And that's really encouraging to see, and I hope I see more of it, because that's what people ultimately need. The app is useless without the social and monetary support to stay home. Wow, well, that brings us to time perfectly. Erin, thank you so much. You've given us a lot to think about and a lot to do moving forward at these interesting intersections of technology and public policy. So I really wanna thank you for that. For everyone else, just two final reminders: as I said at the top, if you're interested in the Graduate Certificate Program for STPP, the next application deadline is March 1st, and our next STPP webinar will be on Monday, February 8th at 4pm. Information about both of those is in the chat. So thank you all, thank you again, Erin, and everyone have a great afternoon. Thank you so much, thanks everyone, and thanks to Molly, Sue, Jen, Amanda and of course, Shabita. Stay safe.