Yeah, hey, and welcome back to the c-base channel. It's once more about corona, but this time it's about the corona apps — data protection in decentralized corona apps — and Kirsten and Rainer will give a talk about looking beyond the code. As always, you can ask questions in the chat, which we will answer later on. And now, have fun and lots of information with Rainer and Kirsten. Go ahead.

Thanks a lot. So, welcome to our talk, "Data Protection of Decentralized Corona Apps: Looking Beyond the Code". This is a talk specifically about data protection and not so much about IT security, as we have already heard a lot of interesting talks on that. This is the schedule for today, and we will start with an introduction of ourselves. Kirsten, maybe you can say a few words about yourself.

Thank you so much for having us today. My name is Kirsten. I'm a data protection expert and I work for a German DPA. I have a legal background, I tweet under privacy DE, and I was part of the team that wrote a draft data protection impact assessment for the German exposure notification app in April. Today I will present my private opinion. Thank you.

My name is Rainer Rehak.
I have a background in computer science and philosophy, and I am a researcher at the Weizenbaum Institute for the Networked Society. I'm active in the forum of computer professionals for peace and social responsibility, and my fields of interest are data protection and IT security, as well as government hacking. I was part of the team, together with Kirsten, that created the data protection impact assessment in April.

Now we will give a short overview of the talk. First we will start with the question of what data protection actually is — quite an important question when talking about data protection, to understand the basic subject. Then we will learn some details about the German corona app, which is actually an exposure notification app. We will learn a bit about what a data protection impact assessment is, about our findings, and about what was missing in the official data protection impact assessment. Finally, we will have some time for open questions. But first, let us start with: what is data protection, Rainer?

So first, we would like to differentiate between data protection, data protection law, and IT security. Data protection itself is a research area in social science; the goal there is to avoid unwanted consequences of data processing. Data protection law is the legal field that tries to implement that — the legal form of it. And we all know that sometimes laws don't really protect what they're intended to protect, so there is a difference between data protection and data protection law. The third thing that needs to be differentiated is IT security, which is a research area in computer science, mainly concerned with the confidentiality, integrity, and availability of data and services.
And here we already see a first difference: data protection protects the people who are subject to data processing, whereas IT security protects the data and systems of the people operating the systems.

The current data protection law is the GDPR, the General Data Protection Regulation, and it protects the fundamental rights of people — see Article 1, which I've put up there: the regulation lays down rules relating to the protection of natural persons, and it protects the fundamental rights and freedoms of natural persons. Generally, in data protection law — in the GDPR — the main questions are: the processing itself, the purpose, the risks involved, and who the controller is, so who is responsible, and such things.

One thing that needs to be mentioned, especially when we talk about any kind of processing, are the questions of suitability, necessity, and proportionality. For all data processing it's important to ask: is this data processing suitable to reach the aim? Is there a causal relationship? We all know there are certain applications that are intended to reach a certain goal, but — speaking as tech people, though not everyone listening or watching here might be a tech person — sometimes it's just not suitable.
So from the perspective of data protection, this would then not be legal processing, because it doesn't causally relate to the aim. Necessity means: if there is any less intrusive measure to reach the same goal, then the proposed one is not allowed, because a less infringing one would be available. And finally, proportionality: the goals that can be reached have to be weighed against the rights that are infringed; they need to be put in context and in relation to each other. This kind of evaluation has to be done for all processing.

So right now we see the first problem with exposure notification as done by the Corona-Warn-App: we don't really know how well it actually works, which makes it difficult to carry out this kind of evaluation. But if we set that aside for a moment, it's still very interesting — and this is what we will continue to do now — to evaluate the risks of the actual usage. And there's also a specific tool for this, so it's not just people sitting around a table thinking about the risks.

[The stream briefly cuts out.] Is that okay again? All right — I feel a bit like an aeroplane that has been refuelled in mid-flight. Should I continue? We'll probably skip this part in the recording. Okay, well, that was quick.

So, as I said, it's necessary to analyze all this processing, and for this kind of analysis there's a specific tool in the GDPR, which we will hear about later from Kirsten. But before we talk about the tool, we want to talk about the corona app itself, just to outline the general functionality for everyone who's not familiar with it. This will be rather quick, but it's important for people who are not constantly dealing with the topic. So, what is the Corona-Warn-App?
It's a voluntary, transnational, Bluetooth Low Energy proximity tracing app. It's the same technology that is being used, for example, by the apps in Ireland, in Japan, in Italy, etc. The purpose of this app is to break infection chains by warning about possibly infectious exposures. So it tries to trace contacts between people, and if one of the persons is tested positive, then the other one — the contact — can be warned so they can self-quarantine. The whole system — although we always talk about "the corona app" — consists of an app and a server, which is important when looking at it from the data protection perspective. There are more components, but for reasons of time we're not focusing on those.

Within the app there are two buckets — let's say storages. One bucket is for the Bluetooth beacons that are sent out by the app itself, and the other bucket is for the Bluetooth beacons received from other apps. Those Bluetooth beacons are, let's say, pseudo-random, and they change over time, but that should not be the topic right now. The interesting aspect is: upon a positive test of one person, the Bluetooth beacons that have been sent — those in the first bucket — are uploaded to the server. All the other apps regularly download all those beacons from the server, and they can check against their second bucket whether they have seen any of the infected person's beacons. That's the idea: the warning is calculated decentrally, on the apps on the mobile phones themselves. And if there's a warning — if the app detects that you have been close to someone who later tested positive — then you should self-quarantine. That's the basic idea. As we see, this is a, let's say, transnational effort for contact tracing.
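The two-bucket matching just described can be sketched very roughly in code. This is a deliberately simplified illustration, not the Corona-Warn-App's actual protocol: the real system derives rotating identifiers from daily keys inside the Google/Apple framework and applies risk scoring, whereas here we just use random tokens, and the class and method names are our own invention.

```python
import secrets

class ExposureApp:
    """Toy sketch of decentralized exposure matching with two buckets."""

    def __init__(self):
        self.sent = []      # bucket 1: beacons this app has broadcast
        self.received = []  # bucket 2: beacons heard from nearby apps

    def broadcast(self):
        # Generate a fresh pseudo-random beacon and remember it locally.
        token = secrets.token_hex(16)
        self.sent.append(token)
        return token

    def hear(self, token):
        # Store a beacon received over Bluetooth from a nearby phone.
        self.received.append(token)

    def check_exposure(self, published_tokens):
        # Matching happens locally on the phone: intersect the tokens
        # published by the server with our own "received" bucket.
        return bool(set(self.received) & set(published_tokens))

# Usage: Alice and Bob meet; Alice later tests positive and uploads
# her sent beacons; Bob's app detects the match decentrally.
alice, bob, carol = ExposureApp(), ExposureApp(), ExposureApp()
bob.hear(alice.broadcast())          # Bob's phone was near Alice's
server = list(alice.sent)            # uploaded after Alice's positive test
print(bob.check_exposure(server))    # True  -> Bob is warned
print(carol.check_exposure(server))  # False -> Carol was never nearby
```

The key property this sketch shows is that the server only ever publishes opaque tokens; the decision "I was exposed" is computed on each phone, never centrally.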
This is a new technology, and it's simply difficult to assess. As I've mentioned before, there is a tool to do this in a structured way. So now Kirsten will tell us what a data protection impact assessment is and what we can do with it.

Thank you. The data protection impact assessment is a legal requirement resulting from Article 35 of the General Data Protection Regulation, which describes in which situations a data protection impact assessment must be conducted, as well as the obligations a controller has to comply with whenever one becomes necessary. It needs to be conducted prior to processing, to assess the impact on fundamental rights of new types of processing technologies, of large amounts of data, or of large-scale processing of special categories of data — like the health data collected by the exposure notification apps. The aim of a DPIA, as we abbreviate it, is to detect problematic collection and linkage of information and to prevent it by implementing appropriate measures — or to prevent the processing as such, if it bears the risk of resulting in profiling or other high risks for the fundamental rights of natural persons. Profiling and its consequences constitute grave infringements of fundamental rights, and it is therefore quite important to find all kinds of processing that may lead to such profiling and such risks, and to allocate these risks properly.
It is important not to focus only on the app, but on the whole process — including, of course, the interfaces that an app uses, but also the processing operations that are very closely connected to it and that are the only reason it makes sense to have such a notification app at all. The main problem with the official German data protection impact assessment was that it basically focuses on the app and the server and leaves out very important aspects of the processing. One example is the Google/Apple Exposure Notification framework: to this day we don't know exactly what is going on when this feature is included and used in the app. So this is quite a shortcoming.

Another main focus of a data protection impact assessment is that the controller may himself be the main attacker — not only third or external persons who might attack the processing operation. So when you conduct a data protection impact assessment, you always have to ask: does the controller actually need this kind of data, is it necessary, and does he have the appropriate technical and organizational measures in place to ensure that he only processes what he actually needs in order to pursue the main objectives of the processing?

Actually, there are not many data protection impact assessments around so far. In Germany — I don't know if it was a consequence of that — shortly before the corona app was published,
it was clear that there was no public data protection impact assessment and that data subjects had not been included in the process of the assessment, which is a prerequisite under Article 35. So in April we decided to go ahead and analyze what was known at that time about the app, and to provide what you might call a draft data protection impact assessment — which then actually served as background for the official impact assessment done by the government as the controller of the whole processing.

One of our findings in that data protection impact assessment was that there is quite a difference in terms of voluntariness, depending on the preconditions under which you voluntarily use an app. There is also the question of voluntariness in terms of the legal basis, because "voluntary" doesn't necessarily mean consent as the legal basis. Consent is one of the legal bases provided by the GDPR and could be used, but there is also the option of voluntary use based on a specific law, and this would have had a number of positive consequences compared to consent. According to data protection law, a legal basis is always required when you process personal data, and we found that consent is not an appropriate one here, because it requires each and every individual user to consent freely and in an informed way for the consent to be legally valid. As you can guess, it is quite a difficult task to make sure that every user of the app actually chooses to use it freely and in an informed way. So it would have been much better if the government had decided to provide a specific law for the voluntary use, and also to lay down the purposes — to specify them, to center them on contact tracing exclusively, and to
preclude other uses, such as using the app as an entrance token for supermarkets, concerts, workplaces, and so on. Actually, today we see the same discussion concerning vaccination, where having already received the vaccine may be used as an entrance token. So for the Corona-Warn-App as well, we believe a law should have been in place to govern this. It could actually also have worked towards much higher acceptance and proper usage of the app.

Another one of our big findings — and this is something that has also been discussed in public — is the question of whether the app works on an anonymous basis. In our data protection impact assessment we show that the app indeed processes personal health data, and that generally no data on a personally used smartphone is anonymous. So we always have to deal with data protection issues in this regard, and we shouldn't be talking about an anonymous use of the app. That's basically why, in data protection, we cannot only look at the code used for such processing, but have to take the whole processing context into consideration. Well, Rainer — how about anonymization?

Yes, I will take over with the last two points. We have now seen that the app itself is not anonymous, but that doesn't mean that parts of the data, within parts of the processing, cannot be anonymized. The idea here is: as long as the data is on the mobile phones, it is personal data. You can imagine: if I can read out all those tokens, I can then compare them with the tokens that may be saved on the server.
I could theoretically make that combination and then say, for a phone I took from someone, whether that person is actually infected or not. So that shows that the data connection is there. But this doesn't mean that, on the way of uploading those Bluetooth beacons to the server, it would not be possible to anonymize them — though that is a difficult process. It is possible that the personal data on the phones is uploaded, goes through an anonymization process, and is then anonymous on the server. This is exactly the data protection view, where you would say: there is a part of the process where the same data that is personal data on the phone is anonymous data once it's on the server — but the process of how it gets there is very, very important.

As you might think now: after you have a positive test, you upload your Bluetooth beacons via the app to the server — and of course the server receives your IP address. So at this point it is of course possible to connect those uploaded beacons to an individual via the IP address, and an IP address is personal data. So what do Telekom and SAP say at this point? What they say in the data protection impact assessment is: yes, for a vanishingly small amount of time,
we do have this connection, but we don't actually save it — trust us, we don't save it. And here we see again the main difference: if you see the controller as the attacker, then of course you should not trust the attacker saying "of course I'm not saving this". That's the main problem. But this risk can be mitigated — and it should be mitigated. One way we suggest in the data protection impact assessment would be this: the server that stores the data sits within a network, but the entrance node to that network is operated by a different organization, and the main task this other organization has is to strip the IP addresses and then forward the data to the servers of, let's say, the RKI. This would be an organizational solution, not a technical one, but it is still a protection for the data subjects — and that's why (hence the title of the talk) you can't really see this in the code, because it would be part of the network design. So we always have to look at the whole processing. At the same time, it would be nice, as an additional protective measure, to have a law that forbids law enforcement or intelligence agencies from going to those organizations and saying: could you just keep those logs for a short time, because we need this information for whatever reason?
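The organizational measure described here — a separate entry node that strips IP addresses before forwarding uploads — can be illustrated with a minimal sketch. The function name and the dict shape of the request are our own illustration, not the actual upload interface:

```python
def strip_ip_entry_node(request):
    """Entry node run by an organization separate from the server operator.

    The node necessarily sees the uploader's IP address, but it forwards
    only the payload to the health authority's server, so the server
    operator can no longer link uploaded beacons to an individual.
    `request` is a hypothetical dict: {'ip': ..., 'beacons': [...]}.
    """
    # Copy only the payload; the IP address is deliberately dropped here
    # and, crucially, never logged.
    return {"beacons": list(request["beacons"])}

# Usage: what the entry node sees vs. what the server receives.
upload = {"ip": "198.51.100.23", "beacons": ["a1b2c3", "d4e5f6"]}
forwarded = strip_ip_entry_node(upload)
print(forwarded)  # {'beacons': ['a1b2c3', 'd4e5f6']}
```

The protection comes from the organizational split, not the code: neither party alone can link IP addresses to beacons, so both would have to collude (or be compelled) to de-anonymize an upload.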
So it would be a good idea to have a law that prohibits this kind of change of purpose. There we see that it is possible to anonymize those Bluetooth beacons while they are uploaded, but it has to be done properly, and it should not — as in the official data protection impact assessment — be brushed away by saying "well, we just don't save this information". We all know mistakes happen, and that's the least dangerous scenario; malicious intent would be something else entirely.

The last point we found is, as Kirsten already mentioned, the Exposure Notification framework and the relation to Apple and Google. That's a really crucial aspect. We would not say you should not use it, but it should be honestly stated that there is a risk, and that this risk has to be mitigated. The official data protection impact assessment says: well, we can't access the source code, so it's not solvable, so it's not a problem. No — the task of an impact assessment would be to say: there is a risk, and because of the closed source code it is not possible to mitigate this risk at this point. It's okay if something cannot be solved, but it should be stated as such. And that's a big problem: if we just say everything is the way it is, we can't change it, and so everything is all right — then why do we do data protection at all?
And yes, it would certainly be possible to ask whether data protection agencies should be able to audit the code of Apple and Google, or whether other ways could be found. Since we spent several tens of millions on this app, it's frankly quite surprising that it was the team behind the microG project that presented a free implementation of the ENF, at least for Android — so suddenly people without any Google connection can use the app on their Android phones. And I was just thinking: if they understood how crucial this role is, at least you could rescue the Android users in this respect. There are many other ways as well, but this would be a consequence of taking data protection really seriously. And this is not some fancy magic; this is just data protection done seriously. So maybe at this point: props to the microG project — and apart from reading our data protection impact assessment, donate to those folks to actually make this app better and more broadly usable.

Okay, so now Kirsten will continue with some of the open questions.

Yes, thanks. Well, apart from the technological issues, we believe the proportionality of the processing should still — or again — be discussed. What is the added value of the app for contact tracing?
We don't know, up to this point; it still seems an experiment. If we look at other countries, especially in the Asia-Pacific area, we see that contact tracing by an app has actually not been a major tool in fighting the pandemic. We should also take a closer look at the side effects of surveillance — the habituation of the population to a surveillance tool. I believe this is a really big problem which hasn't been considered at all: using a smartphone to be traced, knowing that somebody is looking over your shoulder at whom you are meeting — even though it happens only in a beacon kind of way, it is still something that affects your mind. And the fact that other smartphone apps legally spy on users — an argument that is always brought forward — does not justify this app doing so for the purpose of contact tracing. It's not a no-go, but we believe that proportionality has to be very, very carefully considered, and that has not been done properly so far. Well, that's it from my side.

Yes, and I'll just take on the next points — with maybe one comment first: when we refer to the corona approach in countries like Taiwan, which is not some mega surveillance state but a democratic country — they don't even have an app, right?

Okay, but coming back: even if we now have the app and we all want to see it flourish, the question is why the development is still so deficient. Where is the CrowdNotifier functionality? The draft is already there; the source code is nearly written. The idea is to have another data field for crowds: whenever you go to a restaurant, you could maybe scan a QR code — there is a possible way of actually implementing this in a data-protection-friendly way. Why hasn't the microG approach become part of the official app?
So that once the app finds you don't have Google services installed, it would use this functionality. You know — where did all the millions go? The other point, as mentioned before: where are the anonymous uploads? Why isn't that done properly, if it could be solved, or at least mitigated, by a more or less simple network configuration? And finally: where is the law that prohibits any other use of the data?

To close, we would like to mention that this exposure notification can also be seen as a tech-based distraction from the societal problems that lie beneath. We have too little testing; we lack staff in health agencies and in hospitals; we have bad political rules. The result right now is, let's say, the head of the Robert Koch Institute just appealing to people to do other things — and I'm asking: if people stick to the rules that are being given out and we are still heading into catastrophe, then the rules are just bad. Appealing shouldn't be part of a political strategy. The reporting infrastructure is still not fully in place; we lack checks on quarantined people after traveling or during quarantine. All those things, I think, have to be kept in mind — especially when the discussion comes up again and people who know nothing about data protection start screaming that we should loosen data protection. First, they don't understand anything about data protection, and second, they obviously don't understand anything about the problem
we are trying to solve here, and the societal and political background. So this is usually the point we'd like to close with: if you want to solve your societal problems with technology, you have understood neither technology nor society. Okay, thanks a lot — and now we're here for questions, and maybe remarks, if you want, and some links.

Yeah, first of all, thank you very much, Kirsten and Rainer, for this talk. I guess the corona app will follow us through the next year — it will certainly follow me. So do you think there will be major changes, or only small adjustments? What's your guess, what's your wish, what's your shout-out to the programmers and the users? Kirsten, maybe you want to answer first?

Well, I think it's awfully quiet around at least the German notification app. I would have expected them to put much more effort into making the app more usable and user-friendly, solving some of the irritations that pop up occasionally, and also looking into the privacy issues a bit more, to gain more trust and more acceptance.

How about you, Rainer?

Yeah, well, I would expect to see only small steps, because to me it seems that Telekom and SAP took their chance, and I don't know whether the people who negotiated those deals put any further development into the contracts. So I guess, in normal economic thinking, it would be natural for Telekom and SAP to now just say: well, if you want more functions, you have to sign another contract — and we know how much you need it. So I think we'll only see small steps. I mean, the "revolution" that has been rolled out right now is the diary, where you can write down when you have met someone. So I think we'll see small steps, but there are many good ideas already floating around, and I hope to see them implemented.
I don't know — maybe again we'll have to rely on the open source and free software community to implement that. Maybe there will be a fork. No idea.

All right, there are no questions in the chat right now, but is there anything else you want to say to our viewers?

Yeah. First, everyone should stay as healthy as possible. When we discuss those apps and those technical solutions, I think it's always important to see that you can criticize something without being totally against it. Even within our group of, let's say, six researchers who built this data protection impact assessment, we didn't all have the same opinion about all those things. So I can say the app should be much better and that there are still major deficits — and I'm still using it. It's not impossible to hold exactly that opinion; other people might criticize it and not use it. This differentiated approach is, I think, very important, because many of the problems we're trying to solve right now — like the lack of staff in the health agencies or the hospitals — are nothing we can solve within two months. So maybe the app can be one small part, and it is possible to say that in this emergency moment it might be an okay idea, while not forgetting that we still have a lot of homework to do. This kind of differentiation is, at least for me, quite important.

Yeah — we're from the dystopian future society here, so the question is: for the next big thing that comes, the next virus, do you think this is a good base we're working from, where only little details are missing? I mean, I'm just talking about the app, of course — masks and all the rest still have to be done, no question about that. But maybe the next virus works a little differently. Do we have an open source base to build the app for the next plague?
Well, I think we're well equipped now — but I think it's also important to see, and I don't know, Kirsten, if you want to say something too, just maybe three sentences from me first: I think it's still not too late to try to avoid the next plague. So maybe we intrude into fewer habitats of animals than we have before —

Avoiding problems is always the best thing: probably not even starting them.

Yes, exactly. So maybe we treat our resources more fairly, and we treat them like we need them — because we actually do need them. Those big pictures we should keep in mind. I mean, right now everyone is in emergency mode, with thousands of people dying, but I think there's quite a connection between the climate breakdown and global warming questions and these pandemic issues. But maybe, Kirsten, you want to say something to this as well?

Well, I think it's always important to take a step back and look at the broader picture. With such an app, and maybe the next pandemic on the horizon — which we all hope is not the scenario we will be facing — it is important to make sure that if we use technical tools for contact tracing, they go away after the pandemic: that we have measures in place to make them actually go away. And also, in the meantime, to prepare technical solutions in a privacy-friendly way — in a way that doesn't infringe on fundamental rights and that cannot be misused, even if there is, for example, a change in government. So it should be something that the user can switch off, or that at least cannot easily be reintroduced — not updated, not made mandatory.
I think these are quite important questions, and in our talk we have mentioned a couple of societal issues that we will have to discuss in the future, with or without a pandemic. Questions of surveillance, I think, are questions that have to be discussed in a free society — in a society that is based on freedom.

Okay, so these are basically the demands that the hackers have had since, I don't know when: keep it decentralized, keep it anonymized, make sure it doesn't get data that it doesn't need. And I have a suggestion for the dystopian use of this app: I think after this pandemic is over, it will be called the loneliness app, and it will trace your contacts and warn you if you have too little contact.

That's very nice. Yeah. Okay, so thank you, Kirsten and Rainer — thank you so much. We'll do a little change in the setup, and then we'll repeat a talk from Peggy Salop about audibles, which will be on at 16:30, as far as I know. So, see you back soon — see you in a bit!