Hello, I'm Dazza Greenwood of law.mit.edu, and this panel discussion is with some of the key authors and contributors to the MIT Contact Tracing Privacy Principles. We're going to unpack some of the commentary to help better understand what these privacy principles are and to highlight some of the key issues and options that they reveal. So first up, I'd like to introduce my colleague, Brian Wilson, who is the editor-in-chief of the MIT Computational Law Report, through which we have published the first draft, and will publish subsequent drafts, of the MIT Contact Tracing Privacy Principles. And Brian, could you put a little color on that, please?

Sure thing. We started these Contact Tracing Privacy Principles back in March, and our goal was to develop a way for people who were interested in understanding the tensions around contact tracing, especially with regard to privacy rights and epidemiological effect, to get involved, understand, and develop some best practices around what we're doing, where we should go, and so on and so forth. And so, as a publication that's interested in convening key stakeholders, producing content, and having conversations, we thought a great way to do this was through drafting some principles, holding these ongoing conversations, and coming up with richer and richer materials. So we started by first drafting a series of these privacy principles, and that was our version 0.1. In the expanded version, 0.2, we then went on to add a little bit of commentary, and this is where this stage of the principles comes in. We want the commentary to really inform the thinking behind how we developed these principles, and to set forth a way that people can contribute and provide feedback. And so that's most of the background about where we've been, and I'm really excited to get to where we're going.
And as you'll see from the great speakers on this call, we're going in some really interesting directions, and we've put a lot of really good thought into this. I'm excited for them to get a chance to introduce themselves and talk about the great work that they've been doing.

Indeed. Thank you very much, Brian. Hold on a minute, speaker view, I may have to manually switch the screen here. Thank you, Brian. And so, speaking of our speakers, we are joined by, I think in order of their remarks, Ajinta, who's the founder and CEO of ClinIQ Health, a digital health platform that has spun out Virus IQ, a public health project providing a digital screening platform for virus disease detection and prevention. So that couldn't be more relevant today. Ajinta is a physician innovator with 20-plus years' experience in medicine across the UK, Australia, and US health care systems, who is passionate about health technology innovations across medical devices, drug discovery, and digital health. With a special interest in artificial intelligence and blockchain technologies, Ajinta is an advocate for human rights, meritocracy, diversity, and inclusion. I will introduce our other two speakers in turn, right after Ajinta has an opportunity to talk to us a little bit about one of our most important principles, identity control. And if it wasn't obvious, you can see the entire list of principles at law.mit.edu, as Brian said. We're going to do a bit of a deeper dive through commentary and panel discussion to highlight some of them. And the first one is identity control. So Ajinta, thank you for joining us today, and thank you for all your contributions to this open, collaborative drafting process that we've been convening at MIT. Won't you please share some of your thoughts on this principle?

Yes, thank you, Dazza. Thank you to the group for this important work that's being done. So I think I was going to touch on two limbs of identity control.
So, the right to control and informed consent. When we talk about the right to control, some of the topics will be around de-identification, anonymization, and pseudonymization of data. One thing that's important to make clear is that identifiability is really a continuum. You have the continuum from fully identifiable data, where the person's name and the record sit together in an easily correlatable manner, to fully anonymized data, where it's almost impossible to re-identify that individual. There's a lot of debate around re-identification and the ability to do so, particularly given current innovations. And there's also another way to classify personal data de-identification: pseudonymization, where the personal identifiers are kept sufficiently separate from the medical record, and it would be quite difficult to put those two records together unless it was intended to be done as such. There's also a common theme around recent contact tracing innovations: that we need to move towards some sort of decentralization, where data use is minimized and the purpose of that data use is limited. So there are various apps around the world using things like the DP3T protocol. That really places greater control with the individual, and the individual then is able to permit sharing of their data, as opposed to having to pass over control to a third-party entity. When we talk about informed consent, this takes a lot from medical practice. We talk about materiality. This is where consent has to be given and the individual has to be informed of anything that is material to them. So the test is materiality: where a person may consider something quite important to them, that needs to be disclosed to them.
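The pseudonymization idea described here, keeping direct identifiers sufficiently separate from the health record so that re-linking is only possible deliberately, can be sketched roughly as follows. This is a minimal illustration, not the design of any real contact tracing app; the field names and the "vault" structure are assumptions for the sake of the example.

```python
import secrets

# Direct identifiers are held in a separate, access-controlled store.
# The record that circulates carries only a random pseudonym.
identity_vault = {}  # pseudonym -> direct identifiers

def pseudonymize(record: dict) -> dict:
    """Strip direct identifiers out of a record and key it by a random pseudonym."""
    pseudonym = secrets.token_hex(16)
    identity_vault[pseudonym] = {
        "name": record.pop("name"),
        "phone": record.pop("phone"),
    }
    record["pseudonym"] = pseudonym
    return record

def reidentify(record: dict) -> dict:
    """Deliberate re-linking, possible only with access to the separate vault."""
    return {**record, **identity_vault[record["pseudonym"]]}

exposure = pseudonymize({"name": "Jane Doe", "phone": "555-0100",
                         "exposure_date": "2020-05-01", "risk": "high"})
assert "name" not in exposure  # the shared record carries no direct identifiers
```

The point of the sketch is the separation itself: whoever holds the exposure record alone cannot identify the person, which is the middle region of the identifiability continuum described above.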
And so I think with a digital tool such as these apps, where consent is taken, say, at registration, and then that consent remains quite static and irreversible, one could say that there might be breaches in terms of informed consent. What we think, and there's a body of evidence around dynamic informed consent, is that consent, even within these digital tools, should be granular and should be changeable after the fact; the GDPR also states that people should be able to delete their record as and when they wish. Dynamic informed consent would enable people to delete elements of their record, not necessarily all of it, or revoke their consent at any point in time. So that's just some of the topics that I wanted to touch on.

I was going to interrupt you from time to time with questions, but that was so clear, I think it really spoke for itself. So I guess, well, I do have one thing, I suppose, which is: for informed consent in the context of, say, a contact tracing application, like a consumer application on a mobile phone, how could that actually be accomplished in a legitimate way? Currently, the screens of dense legal information don't really seem to inform; they're more of a facade. What would be a design pattern, or some example, of how informed consent could be legitimately applied in the context of a contact tracing app on a smartphone?

Yeah, good question. There are various tools that have implemented designs like this; an Australian genomic centre has done it, and Virus IQ has implemented a design for informed consent. So what we have is a privacy notice, which you're welcome to scroll through and then press "I agree" at the end, which commits you to that privacy notice. And then you go on to the next page, which actually defines a granular consent.
So it asks you whether you want to receive notifications, whether you want to opt in to share data for research purposes, whether you want to opt in to share data for various other reasons. And it also has a clear statement at the bottom which says that this consent can be changed at any time, and should you wish to delete your record at any time, you can. So then the user can go into their profile and change that consent at any time. And should the platform change, or data usage change, then the person receives a notification with a new consent decision that they may have to make. That would be considered dynamic informed consent. The information has to be provided in plain English. So what we've opted to do is give brief information and then have the quite lengthy, detailed information page that they can go to on the website, as opposed to in the app itself. So I think that's one way of doing things. You can see, if you go on to European websites such as Bloomberg, particularly for advertising and marketing purposes, they have granular informed consent and a switch tab which says, do I consent to this particular activity or not, such as cookies, control, et cetera. A lot of this comes from the GDPR as well.

Perfect. I'm getting a little bit of feedback, so maybe if you're not speaking, if you can mute. The GDPR benchmark is, of course, a very solid standard, and it's increasingly becoming a global standard now, well in excess of the low bar that I was alluding to with the click-through contract that's too characteristic in other parts of the world. And then I also heard you really referencing your rich background in medicine and in healthcare. And the consent paradigm in healthcare is also a higher bar.
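The dynamic, granular consent model described here, one switch per purpose, revocable at any time, with a fresh consent decision prompted when data usage changes, could be sketched like this. The purpose names and the version-bump mechanism are illustrative assumptions, not taken from Virus IQ or any particular app.

```python
from datetime import datetime, timezone

class ConsentRecord:
    """A sketch of dynamic informed consent: granular, revocable, re-promptable."""

    POLICY_VERSION = 2  # bumped whenever the privacy notice or data usage changes

    def __init__(self):
        self.version = self.POLICY_VERSION
        # one independent switch per purpose, all off by default (opt-in)
        self.purposes = {"notifications": False, "research_sharing": False}
        self.history = []  # timestamped audit trail of every change

    def set(self, purpose: str, granted: bool) -> None:
        """Flip a single switch; every change is recorded with a timestamp."""
        self.purposes[purpose] = granted
        self.history.append((datetime.now(timezone.utc), purpose, granted))

    def revoke_all(self) -> None:
        """Revocation at any time, after the fact."""
        for purpose in self.purposes:
            self.set(purpose, False)

    def needs_reconsent(self) -> bool:
        """True when the policy changed after this consent was given."""
        return self.version < self.POLICY_VERSION

c = ConsentRecord()
c.set("research_sharing", True)   # user opts in to one purpose only
c.revoke_all()                    # and may change their mind later
```

The design choice worth noting is that consent is a living record with an audit trail, not a one-time boolean captured at registration.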
And so it sounds like this would be what we're really saying is appropriate for how to treat the consent aspect of a contact tracing app. So thank you very much, Ajinta. That was very helpful. And let us move on. Our next speaker, and also a contributor to the MIT Contact Tracing Privacy Principles, is Mark Potkowitz. Mark is an attorney and director of the Legal Innovation Center at Ulster University in Belfast, Northern Ireland. Mark has a background in federal legislative policy in Washington, D.C., United States, where he worked on a range of matters related to privacy, technology, national security, intelligence, and defense. Mark teaches in undergraduate and graduate law programs at Ulster University and is the course director for the new Corporate Law and Computing LLM and MSc. And it's not on his formal bio here, but I also might mention that he was a student at Brooklyn Law School and is one of the leaders of our favorite civic group, Legal Hackers. So Mark, welcome. And I believe you wanted to help unpack some of the complexity behind this principle of proportionality: that the risk contact tracing apps pose to privacy ought to be proportional to their effectiveness at containing COVID-19.

Oh, thank you very much, Dazza. And thank you for having me here and speaking on this wonderful panel that we've put together. This is a complicated issue, and everybody should feel free to jump in and ask me to explain if I'm using terms of art that might not be things that you've commonly been exposed to in the past. This issue of proportionality is complex because whenever it comes to things like fundamental liberties, like privacy, we need to take on balance what it is that we're sacrificing and the benefit that we're drawing from that sacrifice. So often when courts look to these issues, and I'm speaking of this from a U.S.
perspective, you will have some sort of balancing test where a number of factors are used to determine what the outcome should be based on certain fundamental principles. So here, when it comes to contact tracing and the privacy risks, we really need to consider what the public good is versus the potential private harm. As I've discussed in other places, we have certain relationships that we cherish in the United States between people, like lawyers and their clients, like priests and their penitents. In some jurisdictions, in most jurisdictions, in fact, there's an obligation for doctors to observe doctor-patient confidentiality. Independent agencies of the federal government have certain privacy protections, as do members of Congress. The list goes on, and when you look at the privacy implications of contact tracing apps, you could certainly see a circumstance where relationships between people are jeopardized based on the knowledge of them coming into contact with one another, such as lawyers and their clients. Certainly, you can imagine a circumstance in which the CEO of a publicly traded company routinely begins to meet with a lawyer at a firm known for doing bankruptcies, and how that could have significant implications. You could certainly have an instance where the CEO of a company is regularly going to see a doctor who is a well-known oncologist, and that could have significant consequences. You could certainly see other instances in which people under witness protection, or police informants, or other people in different circumstances are put into compromising positions because some of their data could potentially be collected or improperly shared by a contact tracing authority. A lot of these concerns are not hypothetical.
Under the metadata program revealed by Edward Snowden, with respect to the National Security Agency collecting Verizon business metadata, we saw drastic and wide violations of privacy in all of the types of relationships that I've discussed so far. You could certainly see an instance where somebody may try to weaponize a contact tracing solution to tag protesters or dissidents in a particular jurisdiction, to map out a network of how they interact with one another through false reporting and contact tracing. There are certainly risks of people with access to the data improperly sharing it, either by accident or in some act of negligence. And as we've seen in the past, there are instances of something called parallel construction, in which a law enforcement agency has access to information that they cannot get admitted in court, and will find some other reason to stumble upon something like a drug deal or a robbery, when there's no reason that a law enforcement officer would be getting a cup of coffee at four o'clock in the morning a mile away in the opposite direction from where he or she lives. So while these are the more drastic examples, and some people may think they're hyperbolic, we have a lot of documented cases of these types of things being very, very real. So this isn't to say that contact tracing is fundamentally flawed on principle. And this goes back to the idea of proportionality. What is important is that we have the technology in place that accounts for these types of risks, and that we have legislative, regulatory, or statutory measures that help correct for the potential abuse of these types of programs, whether deliberate or through carelessness. I could go on for a while about additional examples, speaking to U.S. case law. But there is one...

An entire lecture, that's for sure. And I know that to be true.
But I would like to ask if you could focus on one specific thing, which is, I guess I'd say, that asset that we get from the common law of judicial balancing tests, which you were starting to allude to. And I'd like to hear more about that from you. You just spent the first few minutes talking about how there are risks. So we've got risk on one side of the balance. And on the other side of the balance, at least as we have construed it in the MIT Contact Tracing Privacy Principles, is the justifying effectiveness of the app at combating COVID-19, in this case. So, just to be very blunt: an app that has no effectiveness at combating COVID-19 would justify no risk to privacy. An app that could eradicate COVID-19 tomorrow may justify a greater risk to privacy. Those are the poles, put quickly, and then there's some curve in the middle there.

Sure. It ought to be proportional.

So this concept of a judicial balancing test is one way we can find precedent, and lawyers love precedent, that perhaps we can apply to this new scenario. And I'd love to hear you zero in on that.

Sure. So when we talk about balancing tests, there are various jurisprudential approaches that judges take. Some abhor balancing tests because they think it's an opportunity for a judge to substitute his or her own opinion in lieu of the law. And I'm sorry to jump in, Dazza, you had mentioned something before about the camera switching, so I can only see you.

Oh, you can? Oh, that's funny because I'm... How about now?

I'm still only seeing you.

Maybe I inadvertently clicked on something. That's all right, since we're doing a live production here. Others? I can see you, Mark.

Okay, fantastic. So in that case, I apologize, and I will go back to the explanation that I was doing before. I just wanted to make sure, because with the transatlantic discussion, I wanted to be certain there wasn't some issue with the technology.
So, as I was speaking in response to your inquiry about balancing tests, I was saying that there are varying jurisprudential approaches to balancing tests. The idea is that in law, we don't always like to say that this is strictly how things happen. David Hume made the distinction between the is and the ought. And I think it was Lyndon Johnson who said, give me a one-armed economist so that they can't say "on the one hand, but then on the other." And I think that a lot of law is "on the one hand, but then on the other." So when it comes to certain types of legal doctrines, for instance lawyer-client privilege, the presence of a third party in the conversation eradicates lawyer-client privilege, full stop. So if I was meeting a client in a public restaurant, that would be terribly bad practice, because that conversation would not be protected by lawyer-client privilege unless I was in a secluded area. However, there are a lot of other instances where a totality-of-the-circumstances approach is taken, or judges will weigh a varying number of factors in making their determination. And depending on the nature of the matter at the bench, they will apply different balancing tests. So when it comes to privacy in the United States, we initially had a theory in which only a physical trespass on property would be deemed a violation of privacy. For those law nerds out there, this was a 1928 Supreme Court case called Olmstead v. United States. Fast forward to 1967, and there was a case called Katz. Whereas in Olmstead a telephone tap was placed miles away from the house, and the Supreme Court, Chief Justice Taft writing, said there was no trespass and therefore no privacy violation under the Fourth Amendment, in Katz what the government did was put a tap on the outside of a phone booth. Since they didn't put it inside, they thought they were covered. They said, no, no, no, we've put it on the outside.
And so there was no trespass inside this phone booth. Well, the court said, no: when Katz went into the phone booth and closed the door, he took steps to secure his privacy. He took steps to close himself off to the world, and therefore, the court held, the Fourth Amendment protects people, not places. And this is the Fourth Amendment to the U.S. Constitution: the right to be secure in our persons, houses, papers, and effects from unreasonable searches and seizures. So this Fourth Amendment...

Spotlight that word, unreasonable. How do you decide what's unreasonable? It's a balancing test.

Right. And so the pendulum, in a sense, was moving on what we think is unreasonable, from this 1928 decision to Katz, where the government said, well, we put it on the outside of the phone booth, and the court said, no, no, that's not the test. It was actually, I believe, Justice John Marshall Harlan II who set out in his concurrence the reasonable expectation of privacy, which ties into this magic phrase that we lawyers love: the reasonable person, the reasonable expectation. So when it comes to these types of contact tracing solutions, we need to think about this idea of what is reasonable. And this goes back to the notion of the balancing test. What do we think is the benefit, and what do we think we are willing to pay for that benefit? And when I say pay, I don't mean currency; I mean pay as in our privacy and personal data. So it is really challenging, especially when we don't necessarily have the legislative framework in place to provide the types of protections we see in other areas. So without going down the path of different standards of guilt when it comes to mens rea, what we will say, in short, is that a lot of legislation gives a high degree of protection to government officials and government programs when they act carelessly or recklessly.
In other words, the standard for getting improperly turned-over evidence suppressed in these types of cases, or for liability to attach to government officials, is often something higher than recklessness. Often someone must act knowingly, in fact doing the act deliberately, knowing it will cause a particular thing, or willfully, doing something with the intent to cause a particular thing to happen. Right. So we need to be careful about who has this data. Qualified immunity has come into the public spotlight a lot lately, and Section 1983 as well, on the federal scale.

Yes, absolutely.

And so, just to start to wrap, because we do have another speaker: there are several factors that a judge might take into account in weighing whether, for example, a search was reasonable or unreasonable, you know, plain view and these other things that you were alluding to. And then we are really opening the question to ask, we're not concluding it, but we're saying one of the principles is that there ought to be some factors that are objectively measurable that can demonstrate the effectiveness of a contact tracing app, in order to justify whatever the countervailing risk to privacy is from use of that app, or arising downstream from use of the app's data. And one might imagine, and I'm just brainstorming here with you, but, like, hospitalizations. Deaths is a measure that we're already using, attributing cause of death to COVID-19. You know, the propagation rate of a spread in a hot zone, through testing, could be a measure. We can look to epidemiology and public health for classical measures of the effectiveness of countermeasures for containing and ultimately eradicating epidemics and pandemics. So does that sound about right for how we might translate these jurisprudential principles into this context of proportionality for a contact tracing app?
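The objectively measurable factors brainstormed here, hospitalizations, deaths attributed to COVID-19, propagation rate, could in principle be combined into a simple proportionality check: no demonstrated effectiveness justifies no privacy risk, and greater demonstrated effectiveness can justify greater risk. The following is a toy illustration of that shape, not a real legal test; the metric names, scales, and the averaging rule are all invented for the example.

```python
def proportionality_ok(effectiveness: dict, privacy_risk: float) -> bool:
    """Toy proportionality check: does measured benefit justify privacy risk?

    effectiveness: scores in [0, 1] for objectively measurable factors,
    e.g. reduction in propagation rate, hospitalizations averted.
    privacy_risk: an aggregate privacy-risk score in [0, 1].
    """
    if not effectiveness:
        return False  # no demonstrated effectiveness justifies no risk at all
    benefit = sum(effectiveness.values()) / len(effectiveness)
    return benefit >= privacy_risk

# An app with no demonstrated effect cannot justify any privacy risk:
print(proportionality_ok({}, privacy_risk=0.1))
# A highly effective app can justify a moderate risk:
print(proportionality_ok({"r_reduction": 0.9, "hosp_averted": 0.8}, 0.5))
```

The real balancing test would of course be a matter of judgment over many factors, as Mark describes; the sketch only captures the monotonic relationship the principle asserts between effectiveness and justifiable risk.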
They're very well put, Dazza. I think the one thing I would like to add with respect to this whole notion is that I don't mean to seem very negative; I come from a background in which we did see a lot of abuse of programs that were designed to surveil, not designed for a public health purpose. So perhaps I'm not being a skeptic but a bit of a cynic. And I should encourage everybody, when it comes to these things, to be a skeptic in the sense that they should want more information and demonstrable facts, rather than just presuming the worst. I would say, as a lawyer, I'm trained to presume the worst and prepare for the worst, which is why my comments seem to be very distrustful and negative. I don't mean them to be that way. I'm just accustomed to preparing clients for the potentiality of something going horribly wrong, so that when things don't go horribly wrong, they're pleasantly surprised.

Right. Yeah, well said. And in that spirit, you know, MIT is an engineering school. We prize innovation. It's all about taking risks sometimes in order to make the world better. But the more we can engineer into the new innovation, in this case contact tracing technologies, ways to solve for all of the requirements, and some of them are legal requirements, not all technical or business requirements, the better we think that innovation is. So we want innovation, not devastation, from something new. And thank you for highlighting that spirit. So now we move on to Ryan Carrier. Ryan is executive director of ForHumanity. Ryan founded ForHumanity after a 25-year career in finance. Ryan's global business experience, risk management expertise, and unique perspective on how to manage risk led him to launch the nonprofit entity ForHumanity. And he's done this personally, and I can say from our collaboration it's definitely a passion.
Ryan helped lead the effort to create independent audit and governance of contact tracing as a trust enhancement initiative to aid contact tracing authorities around the world. And this has been a perspective that we've been very glad and grateful that Ryan has shared with our volunteer expert drafting team that has been working on iterating these MIT Contact Tracing Privacy Principles. Ryan is going to provide some commentary and some depth on two of the privacy principles, transparency and accountability, which are of course connected. So Ryan, thank you for joining this panel and for your contributions to the privacy principles. Take it away.

Well, it's been my pleasure, and thanks to you, Dazza, and the rest of the panelists, and all the team on the drafting side that have done such great work. It's really valuable, because we come from the perspective, and it's kind of the thread from Ajinta to Mark to here: before any of this matters about contact tracing, we have to get usage. We have to get trust in the system for us to reach a certain level of adoption; according to Brookings, it's about 60% adoption of contact tracing technology before it even matters. And unfortunately, all around the world, we are short of that level. And so a big part of all of this work and all of these principles is: how can we raise that trust level from the perspective of the public?
And so the two areas that I've been focused on, or that we've been focused on, are these. First, independent governance: the idea that there is an entity or a body that exists on behalf of the traced, the people who will be examined in this process, who don't really have a voice. Unfortunately, those who are in authority, the contact tracing entities or the technology companies, have a belief, and I think, unfortunately, it's a mistaken belief. There's a trust gap here between their trustworthiness, just in general, it's not necessarily even on this matter, and the public's perception of it. And unfortunately, the public's perception is really all that matters, and that's why we've seen such low adoption rates and such challenges. So the theory, or the idea, here is that independent governance can be and represent that voice, where the mission is solely to look after the interests of the population of the contact traced. It exists and it works because we do believe that contact tracing is a valuable tool of public health, plain and simple; we've seen proof of it over decades, and we want to find a way to enhance that trust so that more people are comfortable with it. Obviously, informed consent and good proportionality, as Ajinta and Mark have talked about, are steps in the right direction, but unfortunately we probably need more. And so the idea of a group that's looking after the interests of the traced, to help meet and bridge that trust gap, is key, quite frankly. And so it would be reasonable also to ask: well, if you've got independent governance, what information are they governing on, or how are they adjudicating the value, or the trustworthiness, or even the process of these contact tracing systems? And the answer to that is independent audit. And so what we've constructed with the ForHumanity fellows, which are experts from all around the world who have poured in on the
audit, is this idea of getting complete transparency, and thus, when you have complete transparency, you have complete accountability. The idea is that you can look, question by question, there are 238 questions, and there will continue to be more as we evaluate and identify the information that needs to be uncovered, on behalf of this board of governance, to fairly understand and to work with the contact tracing authorities, so that they do what they say they're going to do and meet the thresholds of trust that are required for adoption. And so we think about the audit in five general ways, and then there are a couple of elements that relate directly to the technology. We look at it from the perspective of ethics; bias, or equity, eliminating bias is what we really want to say; privacy, of course; trust, which is a bit of a catch-all, that's really things like disclosure and transparency, but also control and safety; and then finally cybersecurity, or security in general. Those are the five primary principles, but then we also look at the technology itself. We look at the launch protocol, how these things start, and an important element is also the expiry: when does this go away, when do we need to stop contact tracing? And very importantly, as Ajinta referred to at the very beginning, where does this data go and what gets done with it? And importantly, if I've been traced and provided some of that data, did I consent to that data and that usage? A further example of that informed consent is that we don't know when the pandemic is going to end. We don't know when we will no longer be contact tracing, and that's okay. So it's okay not to throw a date out there, August 31st, 2020, we're shutting it all down; that's silly, that's not an appropriate approach. What's appropriate is to say: when we no longer need it from a pandemic point of view. And thus, as we move along, if this, let's say, God forbid, tracks for another 18 months, which isn't out of the question, you know, it's
not crazy talk, right? Maybe we need to go back and seek further informed consent to continue to use the contact tracing system and acquire this data, because maybe we only asked for it for a three-, four-, or six-month period. And so it's examples like that where we are asking those detailed questions of the contact tracing authority and getting documented, provable evidence of how they are implementing each of these steps along the way. That gives us the confidence to fairly adjudicate the process. And remember, the point is not to take this information and run to the press and say, look how bad this is; that doesn't help anybody, that doesn't enhance any trust, that doesn't allow the contact tracing technology or system to be trustworthy. The point is to work with the authorities, to find the best practices, and to achieve the things that will allow for that trustworthiness, for that trust from the population being traced. And this isn't just a national or state or provincial thing. Interestingly, many of the discussions that I've heard have been that contact tracing may become bigger and more relevant at the corporate level, where I think some of these issues become even bigger and more worrisome, and thus having this infrastructure where we can enhance trust is, I think, vital to making and keeping that trust. The mission here is very simple: contact tracing is a valuable tool of public health. Let's wrap it in bubble wrap, belts and suspenders, and just let it do its thing. If you want to get other data, if you want to track other things, do it in another fashion. Let's keep contact tracing for public health, and let's build confidence in that.

Hear, hear. So: independent governance to manage and be the responsible party for independent audit, and independent audit to ensure trustworthiness, to, among other things, create the conditions for broad adoption, which is a prerequisite for the technologies to be useful and effective, and also to avoid abuses, you know,
hence to be trustworthy. One little thing I wanted to follow up on, the example you gave, which I hadn't known of before: one way to contain consent in time, to make sure that it's active only so long as needed, might be to have it automatically expire after, say, 18 months. So, just to see if I understand that correctly: if the pandemic were to continue for 20 months, and let's say the WHO, and in the U.S. our HHS and others, were to declare the public health emergency over in, say, 20 months, then is the idea that people would need to re-consent after 18 months? Is expiring it prior to the objective end of the pandemic really proportional? I'm just wondering, because there will be a lot of attrition any time you ask for a user action; and yet the pandemic's not over, and if there hasn't been a material change in the terms or anything else, I wonder whether, when you weigh all the equities and the pros and the cons, that's really pushing in the right direction for trustworthiness and mass adoption and effectiveness. It's actually a great question, with some very cool layers to it. So let's start with the first, which, from the perspective of the audit and governance, is ensuring that you don't have even a conflict of interest at the authority level. Is the authority also the one who calls the pandemic or epidemic, and thus inherently conflicted? And to be fair, that is a genuine concern: if this is just government, and they're the only ones making the call, not relying on an outside authority like the WHO, well then why not just always have a pandemic, always have it in place, and never have to worry about the expiry? That is a concern, and it hurts trust. And then, to take it to your point: let's go with the assumption that the contact tracing system will be in place for as long as it is needed. That's wrapped up in the same theory of, well, it's just going to go on and on forever, and I don't trust the authority to act accordingly. So if we're building up trust, just the fact that you've shown a willingness to end this, and, in conjunction with ending it, to delete properly, to say we no longer need this data and we are getting rid of it, and getting rid of it properly, and then to do what you say you're going to do, which doesn't always happen, as we well know, that will also enhance trust. You're absolutely right that it comes at the risk of having to re-opt in, but it is definitely considered. I'm pretty comfortable, though I won't speak for my colleagues, with a layered approach to consent, specific to how I'm being treated and what the new question is. And look, 18 months versus 20 months: yes, there are going to be a lot of people saying, aren't we done? But if we're talking about six-month intervals in an 18- or 24-month cycle, then I think you get used to the process, you kind of embrace it, and you're more along the lines of: I'm glad they're thinking about it this way, and I feel confident that when the time is up they'll do what they say they'll do, because they continually ask me to stick with it. So you're right, I do think there's attrition, and I think there's concern about these layers of consent, but I think it's valuable from that trust enhancement perspective. Got it, okay, good, thank you. So there you have it, and that's some of the quality of thought which has gone into the MIT Contact Tracing Privacy Principles and the commentary which we're currently developing, of which you've all gotten a nice sneak preview just now. So let's wrap up this panel discussion with some closing thoughts from each of the panelists, and then perhaps Brian can tell us more about what we can expect behind the curtain for a rollout of the content and how you
can get involved. So, first things first, maybe we can do it in the order that you spoke. Ajinta, would you share any... oops, I didn't switch the camera... can you share any final thoughts with us, please? Yes. Well, I wanted to make one comment, actually, in terms of what Ryan was talking about with contact tracing and the duration of time that the data needs to be kept. In terms of a disease process like any virus, you'd like to keep the contact tracing practice ongoing beyond the pandemic requirement, because this virus is endemic now, so it will persist beyond the pandemic titling, or that period that's determined to be a pandemic. So we need to keep the... I think you're the only health professional among us; could you just take a moment to describe to me, and therefore probably many others, what do you mean by it's now endemic? All right, so what I mean by endemic is that the virus is here to stay. The virus has now entrenched itself in our environment, and it's going to keep circulating. As with SARS, and even H1N1, even last season we've had cases of H1N1, we haven't managed to eradicate these viruses, and now that it's persisted, it will remain in our ecosystem, warranting continued contact tracing. How long we hold the data, though, is up for debate. My view would be that you would not need to hold the data beyond 30 days on any one user. I say that because the diagnostic test, or the screening test result that you've incorporated in your contact tracing, is no longer valid after 30 days: that one person has now technically recovered, so they're no longer an index case and no longer contagious, and they would need a new test to determine whether they remain contagious. So really you don't need to keep anyone's data beyond a certain time period, which is actually well within the pandemic zone. So in terms of proportionality and privacy principles, you could make a very good argument that we don't need to keep data even until the end of the pandemic period; we just need to cycle the data through. And this goes back to my closing, so I will make my closing thought now without taking too much more time: we need innovations that balance both data minimisation and the purpose requirement with an emphasis on user agency and interoperability, to support the sharing of data in ways that benefit the user. It is in the user's interest to share it for research, say, for epidemiologists; it may not be in the user's interest to share it with, say, the local restaurant that they've gone to, for one instance, et cetera. So that's my closing thought. Thank you. Thank you, Ajinta, and we're so fortunate to have you, with your background, working with us on this. I believe our next speaker was Mark; did you have any closing thoughts you'd like to share? Well, I agree with everything that's been said so far. As a lawyer and not a medical professional, I'm not going to try to wander off-piste into the requirements with respect to medical data, although it does seem to me that there is a temptation to overemphasise a lot of the potential drawbacks and to minimise the potential benefits. I was recently on a panel discussion about the issues of COVID-19 contact tracing and privacy, and the question that was asked of every panelist was: would you download what was then the active beta, the test version, of the UK National Health Service app onto your cell phone? And there were a lot of questions there. The majority of us have more than one cell phone, and our answer was, yes, I would probably download it on one of my devices, which goes to show that there are these competing elements, because obviously if I download it on a device that I don't carry with me, it is not doing the thing that it's supposed to be doing, right? I do it on a developer device that I keep in a Faraday cage in the basement. Exactly. So I still think that we are
very conflicted with this, and while there is a lot of potential benefit, we need to be cognizant of that and go into this with eyes open, so that we can all make informed decisions with respect to our informed consent. Hear, hear. Thank you, Mark. And Ryan, any closing thoughts you care to share? Sure. First and foremost, it's to agree with my colleagues: they're absolutely right, and all of that is vital. And just to remind those who will be watching this out there, there's actually something that has to happen first, and that is that people have to trust the contact tracing technology and use it, and we need to take whatever steps are needed to reach that threshold of usage, because otherwise there is literally no point. It might come at the sacrifice of some things; it might come at the sacrifice of large pools of aggregated data. The UK suggested, before they scrapped it, that they were going to keep the data for 20 years, and there was a general uproar in the community, because that's the exact opposite of the direction best practices were headed, which is: how do we get rid of this data quickly? So there's that temptation to always want this data. It might be great data, but if it keeps people from signing up, you'd never get the data anyway. That proportionality that Mark spoke about is absolutely key. We have to get people using this tool, trusting it, taking it with them, turning it on, and keeping the Bluetooth on, or whatever technology we're using; and if none of that happens, then honestly, unfortunately, this whole conversation is irrelevant. So all the steps we can take to achieve that, that's the path we have to start on. Perfect, thank you, Ryan, and good luck with the recent launch of your initiative for humanity. Thank you. So, Brian Wilson... oh, let me switch the camera... Brian Wilson, our erstwhile editor-in-chief of the MIT Computational Law Report, through which MIT has published the Contact Tracing Privacy Principles: do you have any closing thoughts, or
perhaps opening opportunities for others who may want to contribute? Sure. I think I have a little bit of both. The closing thought would be that this quick overview we've just gone through, with regard to the commentary on the Contact Tracing Privacy Principles and how much nuance there is to this discussion from a lot of different angles, perfectly encapsulates what goes on in the meetings we have, where we hold these collaborative drafting sessions and get feedback from people with a bunch of different backgrounds doing a lot of different things. So hopefully this is something that people find interesting and appealing, and if so, there is a way to actually get involved. One of the things we're looking to do further down the roadmap is to really build up this commentary and keep that continuous process of improvement with these principles going, so that even down the road they can still be useful, still be relevant, and still take into account the context that's evolving globally, and even, to some extent, in more localized ways where people have expertise they can specifically contribute. So I would love to encourage people to join up with us at law.mit.edu; there's information there about how you can get in contact with us and how you can get involved in projects like these. And I would encourage anybody who's interested, from any sort of background, whether it's law, computer science, epidemiology, public health, or even things like design, user experience, art, and storytelling: those are all pieces that need to be baked into this recipe in order for us to wind up with something that is widely digestible, widely usable, and that gets to that threshold everybody touched on, where there are enough people using this for it to be useful. So that's the call I have for everybody out there watching this, and thanks so much for bringing all these people together and facilitating something that's good and useful. Thank you, Brian. So, in case you missed it: if you want to get involved, it's law.mit.edu, and then just click on Contact Us, right on the top menu bar. Fill out the form, and you can get involved if you'd like to provide questions or comments or other resources on our MIT Contact Tracing Privacy Principles. You'll see, near the top of the page, under our special release of materials on COVID-19 and countermeasures, that the first article is the Privacy Principles. If you click on that, we've got a link at the top of the page to a form where you can comment on what you've heard today, or on the principles, the commentary, or the assessment framework: any questions, comments, contributions, or ideas (we love ideas) that you'd like to share. You can do so there under Creative Commons, and that gives us the opportunity to openly share it with the world as we curate everything we're collecting. So, as Brian said, I want to thank everybody for your contributions. You three panelists have been really important to our joint effort at developing some thought leadership and making it available now, when it can still really make a difference in combating COVID-19. So with that, thank you very much, and this panel is adjourned.