We will be translating this talk live from German into English. Hello, I'm glad you made it and are willing to sit in this famous heat for half an hour. I'd be happy if you also stayed for the talk afterwards, and I hope you survive it, because I find it terribly hot in here. I'm very happy that so many people braved the heat to sit in this talk. So, "Political Solutions for Technical Problems" is the title of today's talk. It is based on two expert reports I wrote for the German government: one for the Digital Agenda, about IT security and what can be done about it, and a second one for the Interior Committee, basically telling them that what they are trying to do will not work. So, to begin: if I say I want to offer political solutions for technical problems, we first have to talk about what the technical problems are. What do we actually want to solve in the first place? We all know the cyber. This is one of the first Google Image results; I think everyone has seen it. The cyber is big, and the cyber has also made us very unhappy. That is something we must counter, or want to counter, because it would benefit our society if we tackled these problems. So, I took a look at the Symantec Internet Security Threat Report for 2015. It reports approximately 6,500 vulnerabilities in IT systems every year, and approximately two zero-days per month on average. Zero-day means that the vulnerability exists in production systems and only becomes known because attackers exploit it. So, that is something nobody knew about before the attackers exploited this weakness.
And only after there was damage was humanity informed about the vulnerability in this software system. The best candidate is the Adobe Flash Player, where I really ask myself how they get away with this "low" number of vulnerabilities per day. And finally, I think the most disconcerting part of the Symantec study is that the top 5 zero-days in 2015 took 59 days on average before they were fixed. So, users were vulnerable for a combined total of about 295 days in the year 2015. And I think you can say with good reason that it would be nice to change these numbers: obviously you want to increase the first number, because you want to discover all the vulnerabilities; the number of zero-days being exploited in the wild should be lowered; and of course, the time to first fix should be reduced massively.

2014 was also the year of politically and technologically very painful security bugs. We had goto fail from Apple: the vulnerability in the TLS library of iOS and Mac OS X was one and a half years old before it was found. So for one and a half years people were vulnerable. It was fixed on the same day for iOS and five days later for the Mac, and that deserves the award for the second-fastest-pushed update Apple ever released — only the U2 album that people got for free a few years ago reached people faster than that. We had Heartbleed; everybody knows about it. It was two years old when it was discovered, so theoretically we were vulnerable for two years, with no or very few chances to see whether any such attacks had happened before. The fix was released on the same day; it was published with knowledge of the exploit. Special feature: the first bug that had a logo, before that became cool. I think that's worth a round of applause. And Shellshock, the vulnerability in Bash: it was 25 years old before it was discovered. As far as I know, that is the oldest CVSS 10 vulnerability that exists. It's older than the IP stack that Windows uses.
So, yeah, it's probably among the oldest bugs, because Windows' IP stack is also in the CVSS 10 area. So, we have problems. And it turns out that the oft-proclaimed self-healing powers of open source are not working as well as we wish. All of these examples — OpenSSL, Apple's TLS implementation, Bash — are open source programs. And everybody says open source is much, much more secure, because everybody can look at it, and given enough eyeballs, all bugs are shallow. But reality is different. I think we just have to admit to ourselves that open source quality assurance still has deficits. And the economic incentives that should make software more secure have also failed us. Someone took the victims of Heartbleed and correlated them with their yearly revenue: all of these companies would totally have had the money and power to audit OpenSSL, or to pay someone to make the library more secure. In the meantime, Facebook has created their own TLS library. We can see yearly revenues that are definitely relevant here, and all of these companies are protected, amongst other things, by OpenSSL, by open source products. But none of them audited it, and I think that's very sad. There's a social dilemma at play here: if I invest in an open source software, I have to spend money, but at the same time everybody profits from that, including those who don't invest. And of course that leads to a perverse incentive: OpenSSL is the industry standard, everyone's in the same boat, which means nobody can blame me if I get a security issue there. So why should I invest? My incentive is to just wait for everyone else to do it and profit from their effort. So, let's get to the political solutions — what I was talking about at the beginning of last year regarding the Digital Agenda of the German government.
I think the following political solutions would be appropriate to ensure that we all have safer computer systems — not only as citizens, but also the companies whose rights the German government is trying to protect in this context. But where are the political levers for these kinds of technical problems at all? We have real bugs — programming mistakes, buffer overflows — where code quality is simply lacking, something you can exploit on that level. And we also have the concept of a backdoor: maybe not quite an unintentional mistake, but a remote access possibility, which might or might not have been built in by design. That's a systemic kind of flaw which you can target; take the German secure mail program De-Mail as an example. But these aren't really political errors per se. That's why I'd like to introduce an expanded model with additional layers: above layer seven we have the layer eight problem, the user; above that, the organizational level; and the top level is politics. At the political level, this is really a problem of competence. At the same time, politicians are under pressure to finally do something, and there's a lot of actionism going on — we have to do something now, at last. But it's a kind of security theater, because the effectiveness of the approach isn't really apparent. Think of airport security, where we all have to leave our toothpaste behind: we think, oh yes, they're taking care of security, but it's a workaround rather than addressing the problem itself. On the layer below, at the organizational level, the focus is on company interests. Why do IT security at all? The users themselves are usually not decisive in that decision process. So we have a dysfunctional KPI — a three-letter acronym for Key Performance Indicator.
This is what you agree on with your consultants, with your McKinsey advisors, to be able to measure your own progress over, say, the last year. And what they say is, of course: find more errors in a shorter time. But these kinds of performance reviews ensure that people spend the least possible money in the shortest possible time. So basically it's a certification just to make sure that you can keep going and can't be sued for negligence — you've covered your ass, done your job well enough. If you take companies like Microsoft, that adds yet another level: doing something interesting within such an organization is quite difficult. And so, my conclusion: the social dilemma that arises out of this is that no one is as good as they perhaps could be, because it's not worth investing in this mutual gain. The vulnerability might sit at the application level or at the presentation level, and these measures are only indirect ways of targeting the level where the weakness, the vulnerability, actually is. And there's an open question: will these seals — the German security certificates — make coders write better code, or will they prevent hackers from hacking? A friend of mine is an auditor for one of these schemes. He said something like: if I sell bad meat because I'm a vegetarian, I can still have my fish-and-chips shop certified through some strange rule-breaking or rule-bending — I didn't quite catch the anecdote. But regarding software quality, we also have the infrastructure level. The question is how the infrastructure needs to be built so that we can enforce these standards, and decentralization — these are all points that only become necessary once the first level, software quality, has already failed you.
That can be addressed proactively: independent audits, bug bounty prizes, and also liability for bad code. If systems ship defective, you could raise liability issues and look at what kinds of bugs a product is already delivered with. These political ideas, if realized, bring some kind of accountability for how these standards are enforced at all. And again, in Germany we have the situation that data retention is being reintroduced. In this situation I would have liked to be able to say, after a year: well, have we profited, have we benefited from data retention? No, there's no indicator for that — so we have to drop it and try another way. Instead of that — you see, I'm a consultant, I have these three boxes — now I'm going to dive into the deep end. Proactivity means you want to be faster than the attackers. The entire security industry annoys me here, especially with its products: the entire prevention business is always trailing behind. A new vulnerability — now we have to create a new patch; oh, a new virus — we have to update our signatures. The entire area of prevention only thinks about known vulnerabilities, not about how to increase the number of known vulnerabilities and how to be faster than the attacker. (Right, there's a slide missing here — maybe not.) So I wonder how we can increase software quality today, especially in open source software. One thing I think would be cool is what's called bug bounties. Bug bounties are very sensible from an economic perspective: you put out one high reward and say, okay, remote code execution in OpenSSL — if you find such a bug, we give you $250,000 or so. And then you just slowly increase the reward.
And then you hope that, at some point, several researchers are working on this, and ideally you don't actually have to pay out for a long time. For security researchers that's maybe not the best deal, but economically speaking, for security, bounties are very interesting. The problem is that today's bug bounty programs just don't award high enough amounts. There's this company called Vupen, and they just buy exploits. So with Google's bug bounty program, people turn up and show them vulnerabilities, but don't actually collect the prize, because that would force them to reveal their technique. Instead of cashing in the reward, they just say: we'll show you that we can do it — we actually have another buyer who will pay more. And of course that's the worst thing that can happen in these kinds of competitions. I think the only thing we can do as a society is to make sure that, as civil society and as the state, we are in a position to pay researchers the prizes they want for their work, to dry up the black market — a market that is also being fueled by state security agencies. And then it's a question of liability. Security promises are not being kept. I always forget who said it, but there's this beautiful quote: there are only two kinds of suppliers in our society that can operate without any liability — drug dealers and software manufacturers. And both of them call their customers "users". So when promises aren't kept, or patches come too late — like those 59 days I mentioned — then at least for proprietary products and obvious carelessness, you could try to create political pressure for liability, which would actually incentivize people, and companies especially, to make their software more secure. After 20 years of IT products and more than a decade of security problems, we still find several thousand vulnerabilities per year. Why?
Manufacturers want to reduce costs and maximize profit. Users cannot see security features; they don't understand or check them — that's why they go for the nice certificates. And in the end, security is often an afterthought. People often ask us: well, is this secure? And you answer: what's the attack you want to defend against? That's something we need to educate people about. I think the only way to actually tackle this problem is to create more legislation and stronger marketplace rules. For example, standards should demand decentralized infrastructure, strong standards, and end-to-end encryption. Of course you can just codify these demands, but you can only enforce them if you're competent enough, and if you don't make the mistake of giving suppliers the choice to just write their own enforcement rules. So, next we should take a look at the Ministry of the Interior. Thomas de Maizière, the German Minister of the Interior — probably the last thing he wants in his life is for our IT systems to become more secure. Look at what he's in control of. There's the police, who basically want to push forward data retention. There's the Federal Office for the Protection of the Constitution, the domestic intelligence service, who are mostly busy spying on journalists, shredding old files, and supporting Nazi structures — so the last thing they want is for our IT to become more secure. Then there's the BND, the German foreign intelligence service, who recently offered four and a half million euros to buy zero-days; they definitely don't want us to know what they have, so that we can't patch against what they paid so much money for. And finally there's the Federal Office for Information Security, the BSI, who participate in developing Trojans for the government.
And they should now be responsible for eliminating the zero-days that their colleagues at the BND bought earlier. I don't buy that, and neither should you. So our demand as the CCC is to take the BSI out from under that umbrella — while the other agencies remain under the Ministry of the Interior — so that it can finally do what its job actually is: protecting us from the people who attack us or want to attack us. So much for our suggestions. And now, what we were confronted with as a result: I had a letter in my letterbox. What happened there? Well, that was exactly how I felt — a sort of blackout moment. Laws, in a way, are published as a sort of diff file: you amend one paragraph here and there. So maybe they have learnt something from our community in that respect. We have five areas here that set the minimum standards — from the legal side, the least common denominator. First, if the BSI, the government's IT security office, works with non-government companies, certain regulations have to be observed, and there is a much lower threshold for accessing data. There are arrangements to exchange information about standards, and these are enforced in one-on-one contracts. Then we have the Federal Criminal Police, which is now becoming the central cyber-crime monitoring unit. And there was the discussion about IT products — software, basically — being traded like arms under international trade regulations. The interesting thing here is the registry, the need to register your product: basically, the ministry defines what counts as critical infrastructure and just keeps a list, with lowered standards. But of course the providers and enterprises have the right to make suggestions — because how on earth should the German office for IT security know what kind of security standards companies might need?
Well, what these companies that want this certification have to do is go through a two-year audit cycle and report security incidents. But that can be done anonymously — unless the product is entirely compromised, in which case they have to name specific contact persons. So it's like: once the first nuclear reactor is hacked, we might find out a bit later through this reporting obligation. That's the most immediate channel. Every two years we get a sort of delayed image of what's going on — and you can imagine how fast-moving even a yearly cycle is in this context. So let's look at the security standards that are to become obligatory. Basically, the companies write these security concepts themselves. Take energy as an example: they have industry associations and lobby groups, they have a formal right of suggestion in the legislative process, and an expert reads the evaluation and says, well, that sounds like you could do it this way. And in the BSI, the IT Security Office, I don't think much is known about these proceedings, so everyone just convenes, each bringing their own stack of security concepts — which vary widely in standard, not only in formatting but in writing — and it's literally a matter of finding the least common denominator. So — that's the box at the back here, the combination of both — they really have two options. What I said to them is: one option is that you read all of them and take the best concept, and everyone adapts to that — meaning most will have to actively change their already existing procedures. The other option is to take the smallest common denominator — a broader concept that everyone can achieve within the next two years without real change.
If you look at which is the simpler process, it's quite clear what people will choose: the box on the right — the least effort, the path of least resistance. Then we have the telemedia services, which now have to be secure. Telemedia services are content and hosting providers, and they are now forced to take organizational and technical precautions to protect their systems, components and processes — and supposedly to provide appropriate authentication methods, but that was removed from the amended law. They may now store user data to debug and solve issues, which takes paragraph 15 of the Telemedia Act and makes it analogous to paragraph 100 of the Telecommunications Act, which allows providers to store user data — and there isn't even a time limit on this. That's the kind of data that gets used when you receive a subpoena for downloading files: obviously, if the data exists, a civil-law process can be used to subpoena it for other purposes. So obviously this is counterproductive, because the best way to get data security is to not store data that you don't want to risk getting hacked. And that's the point — we know the problem: as soon as you have the data, everybody comes and wants access to it. I've compared it to EU Directive 2006/24/EC: basically, this is a data retention law that isn't even limited to criminal prosecution. And then there is this beautiful part where the Federal Criminal Police gets jurisdiction over cyberspace, where you can see how many and which laws there are that involve computer crime: unauthorized access to data, the preparation of such acts, computer fraud, manipulation of data, and computer sabotage. And of course, the argument goes, if you write pentesting tools or proof-of-concept code for vulnerabilities, then you've helped other people commit crimes and are thus also liable.
That's the kind of grey area we operate in here in Germany, and that's what we as the CCC are trying to get rid of, because IT security can only be achieved by reducing IT insecurity — and for that we need research. With all of these cases, first you go to your local police; far-reaching cases get passed on to the state level; and if they have a countrywide or larger impact, they get passed on to the Federal Criminal Office. It's hard to imagine a structure further removed from democratic oversight. So, to summarize: our first criticism concerns the protection of end users. The protection of end users doesn't even appear in this entire new law. Companies know how to protect their own interests very well; they know exactly which data matter for their yearly revenue and which don't. User data — especially at telemedia or telecommunications providers — just sits there, and they don't really care about it, because recent years have shown that when there is a breach and user data are stolen, there are zero consequences. Customers don't leave. Customers can't demand punitive damages. There are a few days of media coverage; half a year later you launch some security-transparency offensive, sell people your new messenger and tell them it's now secure, and there's no damage to your image at all. I think that's a joke, and we absolutely need concrete measures for end-user protection. Secondly, bureaucracy in IT security: instead of bureaucracy, I wish for a proactive approach. I want IT security to be created through competition — corporations competing with each other, not through bureaucracy. One way to not get that is to give corporations the right to suggest their own procedures, because obviously corporations will never, of their own volition, suggest anything they would have trouble fulfilling.
So you would always end up with the absolute smallest thing they can get away with that is not obviously ridiculous. I'm looking forward to corporations claiming to exceed the security standard demanded by law. Then, of course, we demand privacy: the less data is collected, the less data can be hacked — and here we can see that data is being collected for completely different reasons. And lastly, we demand the independence of the Federal Office for Information Security, the BSI, which needs to become its own agency. Right now they are the only ones who know anything about security, and currently the German secret services are their colleagues, and of course they will share that stuff. That's why I don't think it's a good idea to supply the BSI with information about security vulnerabilities: they will just leak them to the secret services, who are potential attackers. So, in theory a good idea, but they should not be under the influence of the Ministry of the Interior. So that's it, thank you.