Our next talk is titled "Setting the course: which digital world are we going to live in?" The Federal Data Protection Commissioner Ulrich Kelber will talk about the consequences for our future of things like digital surveillance by private and public institutions. Ulrich Kelber has been the Federal Commissioner for Data Protection in Germany since January of this year. He is a computer scientist by training, was a member of the German Parliament, and before his current post he was Parliamentary State Secretary in the Justice Ministry. Please welcome him with a warm round of applause.

Right, a very good day from me as well. I was very glad to receive the invitation, and of course I'm a bit embarrassed that this is the first time I am at Congress in the twenty years that I was a member of Parliament. My family went on strike: after all the weekends that went into work, spending the days around Christmas on digital issues as well was impossible. But once I changed jobs in January, into a job that fits all this much better, I was able to get at least one day. And I hope that in the coming years, with an overnight stay, I can get a bit more of a taste of what other people here have to say and to show. For about a year now I have been working in my new function as Data Protection Commissioner, and I have tried to get into the depth of the issues and take up individual topics: things that have been added, such as the enforcement of the General Data Protection Regulation and the accompanying directive that had to be implemented in national law, rules on scoring and profiling, the specific legislation on surveillance that comes up almost every week, and, in the one and a half years since the GDPR came into force, the question of how to develop it further.
All of this happens against the background of a development in which courses are being set, by regulation and by non-regulation, for technical processes and new IT systems that will have massive consequences for our society and its order. I am concerned with digital surveillance, whether by private or state institutions, and particularly with the outright historic race for ever more powers for the security services. And I am convinced, truly convinced, that we can set the course in such a way that the digital society remains a liberal and free society. But that is by no means ensured. We keep seeing that technical and economic aspects are pushed into the foreground and that it is difficult to get through with political and data protection arguments. Whenever there is a debate about the courses being set in European digital policy, it is those technical and economic aspects that come first. People point to the digital economies of the US and China, which are indeed further developed in certain sectors, and to what is rather fuzzily termed artificial intelligence, and claim their head start is something we will never be able to catch up with. That claim is then used to dictate how debates about the use of AI, algorithmic systems and large databases should be held. If you look a bit closer, it is indeed the case that in the US and China certain sectors have reached a different phase of technological development, and far more private and state money is being invested. And one can tell that both systems have set themselves on a particular course regarding the use of IT systems. In China a surveillance system is under development that, I would say, only a police state can fully unfold. That has been an issue in recent months: we have seen what is happening to the Uyghurs in the west of China.
We have a situation there in which all kinds of behavior is checked digitally, and people outside these halls cannot really grasp how deeply the digital sphere is now entangled with everyday life, so that behavior outside the actual digital sphere is also registered and a certain pressure is exerted. There are rewards for those who behave in accordance with the system, and those who do not are denied certain options in life, be it education or mobility. It is interesting who sets those standards, and how those standards keep changing. Above all, such a system produces the feeling that you can no longer assess whether you are being watched, whether your behavior is being interpreted, stored and judged by others, and that there is no real way of withdrawing into a private sphere that is not monitored. That changes how people live; peaceful, private behavior changes. In Europe this is not yet so prevalent. But there are interesting studies, particularly among younger people, and we see that they are more in agreement with systems like this: systems in which certain ways of behavior are sanctioned, sanctions that the large majority does not actually want, neither from the state nor from private enterprises. So it is neither the Chinese nor the American model as a whole; but certain aspects are indeed arriving here. In the security sector in the US and China, in the cooperation between private and public sectors, it starts there. And I believe this surveillance poison is dripping into our society here, too, every day. Surveillance is carried out because it can be carried out; evaluations are made because the data can be collected, because data can be stored in ever smaller devices, ever faster. Things that were not done before, because the effort was too large, can now be done in passing. And where can this poison be felt?
It can be felt in business models that are given precedence in political debates over people's privacy. Comfort is exchanged for data in social behavior as well, and security is given precedence over privacy; the question asked is what can be done with new powers, new intrusive means. We see the evaluation of behavior and of convictions: if you look at certain data processing, and at how certain legislation raises the question of who supports something, who speaks out in favor of something, that sets norms which are then evaluated, in the processing of data, by state and private institutions, in the European Union too. How we shape this, how we talk about an ePrivacy Regulation, is one of those courses we are setting, and that is what I meant with the title of my talk. At the center of the debate, the buzzword is always artificial intelligence. Of course there are enough people here who could discuss what that term even means for hours, but let's not do that now; I am talking about AI and algorithmic systems in a broad sense. Those of us concerned with data protection have realized that this changes our work, but it also has great effects on the regulations being enforced. So in the spring of this year we adopted a declaration at Hambach Castle. We chose this location very deliberately: the Hambach Festival of 1832 put civil rights in the foreground, and it was an international festival, with French and Polish delegates among others. In the declaration we tried to define basic principles: that people must not be made mere objects of such systems; that the purpose for which data is collected is kept in mind; transparency and the ability to explain AI systems; the prevention of discrimination; and the basic principle of data minimization.
These points come up constantly in debates with tech people, because even people of very liberal conviction refuse to take on this responsibility when developing systems. We get pushback that we are just causing extra work for them and putting them under general suspicion. But it is our position that whoever implements these systems needs to ask themselves: can the area I am working in have effects on people and their rights, and what do I need to keep in mind when developing this technology? We have also talked about responsibility. Every system needs somebody who is responsible, who can be addressed in cases of discrimination and so on; that responsibility cannot be allowed to disappear in a complex network of complex systems. And if you come at this from an economic perspective, it is really absurd: here in Germany and in Europe, where we have experience with data protection and hopefully also with IT security, where we have laws that give us higher quality standards than other parts of the world, the economy whines about these regulations, while in the USA companies advertise their services by claiming to be better than what regulation requires: "we have done this from the start." If we behaved this way with cars and machine tools, Germany would be a very poor business location. I don't know why, according to the economic lobby, we should try to become ever worse in this regard. It would be good for our IT security and data protection if we set higher standards here. I do have one wish, and it is the same for the economy, for technology and for the companies. "In order to optimize our processes", this is how the reasoning goes, "we need more high-quality data to use and analyze." I am not against this on principle, because there are many processes where I myself can see that responsible access to the right data has potential for optimization and for making society better.
But this process needs to be organized in such a fashion that personal privacy rights and informational self-determination are not infringed but kept intact, and that the technical possibilities are used to enhance informational self-determination in places where we haven't had it in the past. This debate needs to be had, and this is what the German data protection authorities have tried to push with the Hambach Declaration; the European Data Protection Board has followed along. I think the regulation of AI and algorithmic systems needs to happen where rights are in danger, the rights of the individual or of society as a whole. Some people say data protection is overreaching if it does not only consider the rights of the individual but also society as a whole. But there are many instances where we cannot look at just one or the other; we need both. For instance, if I accept that people may share certain data as they wish, then the person who does not provide the data is left out. So in the case of very sensitive data we need a general prohibition on collecting it; it cannot be left purely to individual decision. And the idea that technological developments and their use can be regulated is nothing new. We have always done it this way over the last two or three hundred years, ever since we have had a somewhat developed civil society and political debate: we did it with steam boilers, we did it with cars and mobile phones. So what should keep us from conducting this debate in a targeted and practical way for software systems? One example, and this is what the Data Ethics Commission pointed out: there has to be the option, as a last resort, of prohibiting algorithmic systems with a high potential for damage.
Now, this quite natural demand, prohibition only in cases of grave harm, was criticized from the business side, with arguments about the role of the state and its relationship to business. But I can say that certain devices are not permitted; the sale of organs is not permitted; child labor is not permitted. So why should I not be able to say that certain uses of software are regulated, up to and including prohibition? We are talking about a small peak of an unbelievably innovative market: 99.9 percent of all systems coming onto the market would never be affected by supervision or regulation. But that tip is what we should look at. I will not even dwell on the examples of autonomous weapon systems or high-frequency trading. Even with personal profiling, and the question of who such profiles are made accessible to, there is a high potential for damage, and the option of prohibition must be part of regulation there. We also struggle to uphold the principle of data minimization; that is what it is called in the General Data Protection Regulation, it used to be called data frugality. This principle must not be abolished. It is under attack from all corners: from technologists, from the economy, from politics, in speeches by economics ministers and the German chancellor. And I think there is a lack of understanding of what the term means. It does not mean "throw away all the data that you have", data poverty, so to speak. It is about not having the right to collect data that you do not need to deliver the services or products you are going to deliver, because those data are none of your business. And there are situations where you may only keep using certain data, and please do not cry out, I am being very generic here: data gained from analysis after anonymization. I know there is no complete anonymization. But there is anonymization that is appropriate for a given situation.
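As an illustration of anonymization that is "appropriate for a given situation", here is a deliberately minimal sketch: results are only reported for groups above a threshold, so small, potentially identifying groups never appear in the output. The postcode example and the threshold of five are invented for illustration; real anonymization requires a case-by-case risk assessment.

```python
from collections import Counter

def aggregate_with_threshold(records, key_fn, k=5):
    """Group records by a coarse key and suppress groups smaller than k.

    A toy version of threshold-based aggregation: groups so small that
    they could identify individuals are dropped from the result entirely.
    """
    counts = Counter(key_fn(r) for r in records)
    return {key: n for key, n in counts.items() if n >= k}

# Hypothetical usage: count users per postcode *district* (first two
# digits), never per full postcode, and drop districts with under 5 users.
users = [{"postcode": "53111"}] * 6 + [{"postcode": "10115"}] * 2
stats = aggregate_with_threshold(users, lambda u: u["postcode"][:2], k=5)
# stats == {"53": 6}; the two users in district "10" are suppressed
```

The two design levers, coarsening the key and suppressing small groups, are exactly what "appropriate for the situation" means in practice: both are tuned to the re-identification risk of the data at hand.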
What is appropriate depends on the effort that re-identification would take: there is a difference between ordinary data and, say, biological data, where you do not know at all which other databases could be combined with it to re-personalize the data. And we do believe that technological developments such as decentralized learning open up opportunities to use data in AI that can optimize processes. We want an approach where we do not build up huge data lakes in which, at some point, someone looks to see what is possible, and where abuse can only be penalized after it has been detected, provided you can find a court that agrees. Because whenever I, as an authority, decide to sanction an agency or a business, a fine perhaps or a prohibition, I have to expect to be taken to court. I could lose that case, and it can take a long time; it is not up to me alone what practice will look like in the future. So we need clear regulations in this area. This is no block on innovation if you include these considerations from the very beginning. When I talk to businesses, I keep asking them: do you believe that you will be a second Silicon Valley or a second China, and be successful that way, or are you just a cheap copy? Wouldn't it be better to have our own say in IT, an understanding of AI modeled on European values, and to seek support with that in Europe, but also among people elsewhere in the world who hold other values and would like to see them reflected in their security systems and in the use of their private data, other than what is happening in their countries? That was the idea of the GDPR: Europeans should be protected regardless of where a business is located. And the idea was also that this should be exported. We see successes: other countries are setting up data protection laws on the model of the GDPR.
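On the "decentralized learning" mentioned above: the core idea is that only aggregate summaries, never the raw records, leave each site. A deliberately tiny sketch, computing a mean across sites without pooling the data; real federated learning exchanges model parameters rather than means, but the privacy principle is the same. The hospital example is invented for illustration.

```python
def local_summary(data):
    # Each site computes a summary of its own data; raw records stay local.
    return sum(data) / len(data), len(data)

def federated_mean(site_summaries):
    # The aggregator only ever sees (mean, count) pairs, never raw data.
    total = sum(mean * n for mean, n in site_summaries)
    count = sum(n for _, n in site_summaries)
    return total / count

site_a = [2.0, 4.0]   # stays at hospital A
site_b = [6.0]        # stays at hospital B
summaries = [local_summary(site_a), local_summary(site_b)]
overall = federated_mean(summaries)   # 4.0, computed without pooling data
```

The point is architectural, not mathematical: the data lake never exists, so it can never be repurposed or abused.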
We are in exchange with American colleagues and with China, and Europe, I think, should be very confident in accepting this kind of competition. We are up against large internet companies, and I regret every day that the large, obvious violations by the large companies have not yet led to decisions at the European level. Whenever I meet the other data protection commissioners in Europe, I urge that our colleagues in Ireland and Luxembourg move these cases forward. You cannot actually hide behind one authority, since a majority decision is always involved. But as for the large providers from the US, and increasingly those from China and Russia, and others will come: their defaults still mean that data is collected without reservation. They collect these data while trying to appear conformant with the rules. The kind of technical expertise present in this room is not present outside, and it should not have to be; it should not be necessary. If you do not know the crucial settings, you will keep losing data to these large services. And even those who try to avoid these data collectors are entrapped by them and lose their data to them, when data is collected from third parties too, through tracking tools, plug-ins and software development kits. Apart from an attempt by the German Federal Cartel Office, which is not bound to the same rules as I am, nothing was done to stop this. And to this day I cannot understand the decision of the court in Düsseldorf, the capital of the federal state of North Rhine-Westphalia; I think it paid too little regard to the changes in competition law and to the European law involved. So we need enforcement of the existing rules. We need certification and labeling, so that when you see a product you know the rules are being adhered to and you can build further services on it.
And of course we also need digital literacy in the population, up to a point: people need to understand what they are letting themselves in for in the digital society. And we should strengthen the ways technology itself can strengthen privacy. PIMS and PETs, I think, are important to introduce in a form that can be relied upon, because if the large companies ignore protection standards, such as the Do Not Track standard, then that is simply begging for state regulation: if you do not keep to the standards, you are asking to be regulated by the state. And I believe that with the ePrivacy Regulation, this regulation should come. Usually, as the head of a data protection authority, I should not be the one calling for more exchange of data, and interoperability does entail exchanging data. But today I see a market with a few big players, and the entry barriers for new players need to be lowered. So we need interoperability for the important services in the market, and we need regulations for this, specifically in social media and messenger systems. There one could take steps to develop European alternatives, ideally on an open-source basis. I hope the German authorities will follow the French example; that alone would be a couple of million users in central Europe, a counterweight to the American big players, established here in Europe. We want the same rules for all data processing. This also applies to the security agencies, and those on the federal level are under my supervision. Since 9/11, new laws have come into existence in Germany at a really, really high speed, and they make it very difficult to follow what exactly the state of data collection is. There has never been an evaluation of these laws, of whether they have been successful or not. We see that new powers are used and data is collected, but not evaluated.
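Coming back to the Do Not Track standard mentioned a moment ago: at the HTTP level it is nothing more than a request header that politely asks not to be tracked, which is exactly why ignoring it is so easy. A minimal sketch in Python; the URL is a placeholder, and Sec-GPC is the newer Global Privacy Control signal that plays the same role.

```python
import urllib.request

def privacy_signal_headers():
    # DNT is the classic Do Not Track header; Sec-GPC carries the newer
    # Global Privacy Control signal. Both merely ask: the server is free
    # to ignore them, which is the enforcement gap described above.
    return {"DNT": "1", "Sec-GPC": "1"}

req = urllib.request.Request("https://example.org/",
                             headers=privacy_signal_headers())
# urllib.request.urlopen(req)  # would send the request with both signals
```

This is why the talk argues that voluntary standards without legal backing, such as an ePrivacy Regulation, have no teeth: the signal costs one header line to send and zero effort to discard.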
There are problematic data collections that even the security services themselves consider superfluous. At the same time there are unauthorized accesses to these data. You have read about the Berlin police, where we had an incident like this: no data has been deleted from their systems since 2010, even if you were just the victim of or a witness to a crime. These data are not made accessible to commissions of inquiry, but they are available to all police personnel. A protocol is written, but you do not even have to give a reason why you want access to the data. So it is of course no wonder that data about celebrities gets accessed, or data about the colleagues and neighbors of people with access rights. As long as this is the state of things, it is important to have a pause in security legislation, where we pass no new laws but instead examine where the existing laws overreach. We must stop infringing on people's rights; we need to evaluate whether these laws actually do their job. We need this more than we need new laws. Instead, we should strengthen the laws on data protection. The GDPR is a beacon, but it has its limits, and technology is developing faster than the GDPR can follow; in essence it is still at the level of 1995. Some of you here probably weren't even born in 1995, and in technology terms that is practically the Stone Age. We need to update this, because we need to establish confidence in digitalization. I don't think data is the most important resource of the 21st century; it is trust. Trust in a complex world where I do not know where my data is stored or who is using it, and where I simply cannot check every single product and every single collection myself.
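The Berlin police example shows what a mandatory-reason rule would change: the protocol described above records lookups but demands no justification. Here is a toy sketch of the opposite design, where every lookup must carry a documented reason or is refused outright. All names, fields and the log format are invented for illustration.

```python
import datetime

AUDIT_LOG = []

def query_record(database, person_id, officer, reason):
    """Refuse any lookup without a stated reason, and audit every access."""
    if not reason or not reason.strip():
        raise PermissionError("a documented reason is required for access")
    AUDIT_LOG.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "who": officer,     # who looked the record up
        "whom": person_id,  # whose record was accessed
        "why": reason,      # the justification, now mandatory
    })
    return database.get(person_id)

db = {"p1": {"name": "Jane Doe", "role": "witness"}}
record = query_record(db, "p1", officer="badge-4711", reason="case 2019/123")
```

A reason field does not by itself prevent abuse, but it makes every lookup attributable and reviewable, which is precisely what the incidents involving celebrities and neighbors lacked.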
I need a system where I can trust a service provider, whether the state or a private entity, because they have proven that they adhere to certain standards, because there is independent supervision, and because there are penalties if they violate those standards with their product. If we don't do this, and do it now, there are two consequences: some people will fight digitalization wherever possible, and others will simply not join it. We do not want fatalism where people say "I can't do anything anyway", and we do not want people to abstain. We want digitalization to mean social innovation, not just technological innovation. Which is why we are working on the areas of scoring and profiling, which, with the assistance of tracking, try to categorize people, put them in boxes, divide them into statistical groups. We need certain adjustments there, and small enterprises, startups and individual developers should be given means to cope with the documentation obligations; where the protection level can be raised in other ways, perhaps we help through guidelines and support. And today, the person responsible in terms of data protection law is the one using a system with citizens' data; the makers of the system should be brought into responsibility much more. There have been hard debates during Congress along those lines, and investigative documents have been published. Since I have been standing here, people have approached me about the same problems: what about the use of Windows 10 in my lawyer's practice, in my authority? Saying that the person who uses a system is the one responsible under data protection law is, in the ever more complex world we live in, no longer really feasible. Things have to move on. So there is a lot to do.
The German parliament has kindly increased our budget and given us more staff; thanks to the parliamentarians present. We will have to invest more in data protection and data security, and we are hiring. To everyone in this room: we need you as supporters on the issue, we need you as constructive critics, we need you as people who set examples of data-protection-friendly innovation, and we would like to have many of you as co-workers. It is nice to be on the light side of the Force. Come to the Data Protection Authority. And now I'm looking forward to your questions.

OK, now we come to the questions. First a question from the internet; after that it's Arne Semsrott from FragDenStaat. You are also responsible for freedom of information. Do you have an idea why agencies that want ever more powers have a problem actually giving the information they hold to the public?

These agencies could set good examples. When I started in the Justice Ministry, we decided after a couple of weeks that all the statements and comments sent to the ministry, and we are constantly asked for comments by organizations, the press and so on, should be made public. The volume hasn't decreased in the slightest, but everybody can see who made which comments between the first draft of a law and its eventual passing, and that establishes trust. Some ministries only did this after they were forced to by the courts, and some then went back on it. Admittedly, freedom of information can be very exhausting for the head of a ministry, and not just because something inconvenient is published. You need to pay attention that the comments you write onto a document contain no insults, that you word them politely. And it can be a lot of work. I don't like it either when I receive a complaint and, two weeks later, somebody petitions for all the information about it, and then they get all the email traffic.
We write back that we have asked for a comment, then people apply for the entire email traffic, and instead of dealing with ten complaints we can deal with just one. We're the good guys here, and because we stand for freedom of information we are happy to do it. But when I approach a minister, they all know how many of their people are tasked with reacting to these freedom-of-information requests. Ministries, as a general rule, are not fans, and I fear we will need to do a lot of convincing before we get better freedom of information laws at the federal level.

OK, the next question. Your role is to oversee what the secret services and the police do with data. What do you think is missing in their oversight?

In the area of collecting and evaluating data, with these additional posts, if we manage to hire the people, we will have the resources to keep the level of control high. We will have to work on how technological changes, including changes in the services' own systems, change our supervisory duties; it is, of course, an entirely different thing to audit a modern database than some ancient system. We will need to look at the services' cooperation with services abroad; there are gaps in oversight there. We are trying to work with other supervisory agencies at the European level and keep in contact with them. And I would really like the law-enforcement directive, which applies to the state, to be fully implemented in Germany; it has not been. We have powers of prohibition towards the federal police authority, but towards other agencies we do not have these powers. In part this is against European law; and where European law does not demand it, it would still be sensible. I could then approach an agency in a specific case and forbid the use of data or demand its deletion.
And the service would then have to go to court and argue why my order should be rescinded. I think this would be the better arrangement for our supervision, and it would create greater acceptance of the services themselves. But there is no majority in the legislature for it.

Next question. Hello. You said that we need digital literacy, but I see that politically we are currently moving in the opposite direction, and that the state is more interested in the economic side of it.

Well, by digital literacy I meant that people have a general understanding of digital processes: what happens, what happens with my data, how do algorithmic systems work? That is, of course, a task for education as a whole. But 80 percent of the population are done with school, so we will have to find other channels to reach them. Part of this is the demand for layered information: first a simple explanation of the process, so that people understand what is happening. What you are addressing is the question of digital sovereignty and of knowing what the state is allowed to do: whether there is a legal basis to collect data, to make things compulsory just because they think they won't get volunteers. The legislature has to keep appropriateness and proportionality in mind, and that is what we as an agency always watch for. The second thing is what I was talking about: technical measures that raise the level of protection. Take the modernization of registers in Germany. Please don't do it with a single unique identifier, which is against the Constitution; there is another way. But a data cockpit, where people can see, apart from the high-level security agencies, which state agency has what data about them, who has accessed it and who has shared it: that is possible today. That would be something where people actually benefit from a modernization of the state's digital infrastructure.

Next question.
You said one way forward would be better products for data security, certified in some way. Are there specific things you can point to that already exist, or that are coming?

In standardization we have not come very far, but for development models, especially for algorithmic systems and AI, we are going to see extended standardization: models for how to consider certain consequences during development, how data provision should look, how data quality control should look. In certification we are one step further. I am not talking about certificates of excellence that go beyond the legal framework; we should make it possible to have a product certified as fulfilling the legal regulations. The next person to use it can then see the certificate and know: if I use this as intended, I am conforming to the GDPR, and anything extra I do is my own responsibility. The European Data Protection Board has passed guidelines and permitted the member states to implement their own schemes. I hope that we, the German data protection authorities, will agree on a German implementation next month. Then we could perhaps see the first certifications, and the first institutions issuing them, next year. I am hoping for such a certificate in the area of cloud computing, for instance.

OK, next question. Thank you for this talk, and for being here at Congress this year. I have been responsible for IT management for some years, and I also had to implement several GDPR measures. So my question is: when will we see results against the big companies like Google, Facebook, Amazon and so forth, and maybe also fines for these big operators?
The pressure is high, especially on the two national data protection agencies responsible for these big internet companies. A quick aside on European data protection law: it is a one-stop shop. The national agency of the country where a company's European headquarters are located is the responsible one. For Microsoft, Apple and Facebook that is Ireland; for Amazon it is Luxembourg. These are rather small data protection agencies, with very economy-friendly national legislation and high procedural hurdles before you can penalize a company. If my Irish colleague talked about cases the way I do in Germany, she would fail in court: the judge would call her biased, no matter whether she herself conducted the examination or somebody else did. But it has been 600 days, and for these data protection violations we want results. We then discuss the drafts among ourselves and see if we can come to an agreement; if we cannot, there is a majority vote on the official view of the European Data Protection Board, and that is the decision that is made. So the danger that we in Germany decide things entirely on our own does not exist in data protection, unlike in some other sectors. But all this takes time. I offered the Irish that we could take on one of these cases, and they have suggested cases, but there has been no answer yet. The Irish government has given its data protection agency a very small budget, just a quarter of what my colleague applied for, about 5 million euros a year. And this from the state that is involved in a court case over some 13 billion euros in taxes from Apple. Of course, we in Germany also know well who protects the automotive industry. We do what we can, but there is really not much more I can tell you about this.

OK, next question from the internet: how can we hackers help you to improve the situation regarding data collection?
Well, I believe there was successful lobbying that caused people in top positions in politics to believe that data minimization equals data poverty: we would have to throw away data, and we would then never be able to catch up with the US and China. This very simple rule was implanted into top politicians' heads. And at the digital summit, shortly before the GDPR was passed with the votes of the German government, I saw the Chancellor stand up and say we have to give up the principle of data minimization, which then, a month later, became binding European and thus German law. That is something you rarely experience. So we will have to promote quite intensively the view that this is simply a wrong perception. And I think we should promote a narrative like: data that is not related to my business is none of my business. That is a moral decision we need to make. Of course it would be more attractive to operate coal-fired power stations with child labor, because prices would go down and we might be able to compete with, say, Colombia, but we made a moral decision not to do this. And the same kind of decisions we should make regarding the use of data. Next question. Hello, Mr. Kelber. You said yourself that some laws are not enough, and for me it's the same; I think the law is a toothless tiger. So do you see any alternative courses of action, for example the provision of alternative infrastructure, such as cloud infrastructure, or publicly supported projects? What are the options in politics? Well, I think we should not stop doing the one thing in order to do the other. I'm all for support and promotion, and I want to improve the situation. We were talking about the messenger example: there are many very relevant, sensitive data that are being collected by messenger services.
And in an exclusive, monopoly-like situation, I find that one would even need a significant instrument like forced break-ups. If one third of all text messages are now sent via these messenger platforms, then these platforms have the respective obligations to fulfill. And that's why I keep promoting this: the German administration should set an example in IT security itself. We have to start at the federal level and offer a messenger system that is compatible with data protection, and I think we could only do this on an open-source basis. Our large neighbor is already doing this, with Matrix, and we should see whether we can do it under the same conditions as in France. And we should see that it's absurd that people install 15 different apps for rental bikes and rental scooters, but supposedly should not install a second messenger with which they can communicate with me without leaking data. That is a certain blockade in thinking that we have to overcome. OK, next question. How do you sensitize non-IT people so that they take an interest in their own data security? Talk to them. They want to be talked to; politicians want to be talked to, at least, and also those that are not in net politics. Regarding the population, the citizens, I think that beyond the annual scandal that raises short-term interest but doesn't lead to any action, we would need to present this in a way that they can recognize certain pictures or patterns, so that they have easy options. And to people that have no previous technical education, I like to translate what's happening in the digital arena into their everyday analog world. Tracking, for instance: imagine you go to the pedestrian zone in Bonn, someone starts walking behind you, writes down which shops you stop at and who you talk to, and that person would then offer you a coffee for free.
But before you take the cup in your hand, the person says: oh, now I'd like to know which five people you've met previously. Or imagine you were compelled to stop using envelopes and only write postcards. If you transfer this into the everyday life that they know, then one or the other of these people will wake up. When I was still a party politician, my idea was to set up a stall when the Chancellery opened its doors, where people would get something for free, but only if I were allowed to look through their mobile phones. That is not something I could do anymore; I wouldn't dare to. Next question. Thanks for this talk. We can see here at Congress that even where we have a lot of security, there are still loopholes the majority of the time, even if nothing was done terribly wrong. So there's a question: how do you see the trade-off between data minimization and data security? Where is minimization more important, and where is relying on security acceptable? Well, I believe that data minimization excludes a lot of security problems, but I think you're talking about the fact that certain data has to be collected to recognize certain attack vectors; is that what you meant? No, it was that everywhere we can see that some company or foreign agency is grabbing data. Well, then data minimization is part of a data security concept, and so is the distributed nature of data storage. Here is a question that I had expected: how about digital health care? There is data where it would be unethical not to use it. But what, then, are the security measures? How can I reach the level of security that I had in the analog world, where there have, of course, also been a lot of violations? How can I improve it, and where is the data so sensitive that I have to use extra measures to ensure the standards? In that regard, many thanks to Congress for uncovering the security problems with the e-card, the German electronic health card.
You can be sure that this will be raised in my top-level meetings next year. Next question. Mr. Kelber, I'm glad that you're here. You have very ambitious goals with regard to bad algorithms and the trust we need to build, and you also said something about certification possibilities. Now I just want to point out that the diesel engines from our car manufacturers were all certified as well. As a reminder: I guess the security of an algorithm cannot lie in not knowing how the algorithm functions. So do you see any alternative to open source to reach your goal of preventing bad algorithms, or of increasing trust? Is that possible with proprietary software? Well, that is a topic for a talk in itself. I think there are areas where only by making the algorithm public can it be part of a criticality review; look at the summary of the recommendations from the Data Ethics Commission, with its pyramid of algorithmic systems, where the second level from the top covered exactly this. And I believe that we want to look at certain points in such systems, and of course I'm aware that these change quickly, so there is no single point of inspection. Refusal by companies and authorities is something I will not accept: I have the power to look, and I will not hesitate to enforce that. And if something that is used does not reach the level that is required for certification, then certificates cannot be issued. This already came up when we were involved in the coalition negotiations after the last election, and of course the competencies to look into those systems first have to be developed: I cannot employ 20 experts, the agency for IT security another 40, and every other agency its own.
There are not so many such people on the market, so we would have to have one institution, part of a network with science and civil society, to look at such developments; then we would get a bit closer to what you were saying. Next question from the internet: should we act responsibly when filing freedom of information requests with agencies? Well, yes, just like with every instrument, you should only use it if you are convinced that it is right. But that's not all: I've seen freedom of information requests where I myself said that doesn't really make sense, but then it turned out that something sensible did come out, and it was good that someone asked. And conversely, I have seen freedom of information requests where people only wanted to find out the state of knowledge in investigations into organized crime. So the spectrum is very large. You decide for yourself what you request; what is allowed, and what must be responded to and what does not have to be, is then examined. We still have an ombudsperson function here: if a reply is refused, then we will take another look and judge it for ourselves. Next question. Hello. Thanks for being here, thanks for your talk, and a very special thanks for raising this at the European level as well. To my question: you talked about the importance of data security and privacy in the social and public sphere. Do you see problems with the interests of companies and the way they behave, in that they try to get every possible piece of data? For example, in my city there is a ride-providing company that... So the problem is that in a public tender, a supplier is excluded because they are more expensive than another supplier, but the cheaper offer would involve the leaking of more data. I don't know, I'd have to check. Thank you. Taking away new ideas is one of the reasons I came, but I can't say anything on that.
Of course, the next reform of competition law, the law against restraints of competition, is coming up, and we are having some preliminary consultations. But up to now this hasn't been regarded as relevant enough for me to be informed; maybe it was handled at a lower level. Thank you, Mr. Kelber, for patiently answering the questions, and thank you for your talk. Unfortunately we can't take all the questions, because we are running out of time.