It is my pleasure to announce our next invited speaker, Bart Preneel from KU Leuven. Bart is best known for his work on symmetric cryptography: analyzing, cryptanalyzing and constructing many schemes. In fact, there will be a talk later today on his work on PMAC. Bart has been president of the IACR in the past, and he will now tell us about the future of cryptography. So please welcome Bart.

Good morning everybody. I would like to thank the IACR Board for inviting me to give this lecture. If you look at the list of previous distinguished lectures, it is a great honor: these are people I really admire, they are my role models, and I will do my best to follow in their footsteps. I also would like to thank all of you, because I am not a morning person, so I know how difficult it can be to get up in the morning, and I am very happy that you actually woke up and came to this talk. When I was invited I was really thinking I should talk about hash functions, a topic near and dear to my heart; I have been working on them for more than 25 years. But after seeing Phil Rogaway's distinguished lecture at Asiacrypt, I decided I would speak about the future of cryptography and give a less technical talk, so there will be no equations on the slides and very little technical information.

So, who has an iPhone here in the room? Good. There is actually a great advantage now, after the Snowden revelations: we cryptographers don't have to make our own slides anymore. The NSA makes them for us. So, who knew in 1984 that this would be Big Brother, and the zombies would be paying customers? Maybe I should emphasize this: the NSA calls iPhone users zombies who pay for their own surveillance. And by the way, it is the same for Android users, and even for Windows Phone users. So another way to summarize the Snowden revelations is what the NSA does: collect it all, know it all, and exploit it all.
And maybe about the knowing part: the collection we knew a lot about, but I think the exploitation was probably something that surprised us. So what I will do in this talk is go into a bit more detail on the Snowden revelations and try to give you an update. It will not be as long as some of the other talks I have given on this, just a couple of slides to give you an idea. I will particularly talk about how the NSA goes after crypto, because as cryptographers I think it is very important for us to understand how our crypto is being undermined. Then I will speak about the end of crypto. There was actually a talk at Crypto by Zittrain a couple of years ago with the same title, and in my talk it means the same thing: it does not mean crypto is over, it means, what is the goal of crypto? And then I will discuss challenges for research.

So if you look at the NSA parking lot, or you visit the NSA museum and read the books that were available there, you should not be so surprised about the surveillance capabilities. But still, I think most of us were surprised by the massive scale and impact, and by how competent these people are at doing what they are tasked to do, which is spying on everybody. They are very sophisticated, both technically and organizationally. They actually do something we teach our students: defense in depth. Well, the NSA has offense in depth. We found out that the NSA had at least three independent ways to get access to Google data. So even if one gets blocked, by technology or by a law, they still had two others left. And maybe there are others we don't know about even today. Of course, because Snowden was working for the NSA, we know more about the NSA and GCHQ, but many countries collaborated. And there is an economy-of-scale element here: as a small nation, you cannot run such an operation by yourself. So you give your data to the NSA, and in exchange you actually get something back, some questions you can answer for your own purposes.
And this is typically governed by secret treaties, so we don't know much about it, and we would need another Snowden if we really want to find out what happened. Maybe more surprising for most people is that industry was in some way collaborating, through a mechanism we were not aware of: national security letters, by which a company can be ordered to hand over certain data or certain keys. And they cannot talk about it; if they talk about it, the CEO goes to jail, so he will make sure nobody talks about it. Then of course we also learned about the undermining of cryptographic standards, and at the same time of the credibility of NIST. But maybe the most spectacular thing is what is called active defense. In fact, if you go to trade shows after the Snowden revelations, many companies have already copied the term and tell you: we offer you the same thing, active defense. But in NSA-speak, active defense means hacking everybody. That's what it means. I guess many of you have heard about QUANTUMINSERT. It has nothing to do with quantum computers; it is a clever way to get access to somebody's machine. If somebody goes to Google or Facebook, you, as the NSA, send the answer before Google or Facebook does. By being faster, you can make the client believe he is at Google or Facebook, and you can then send him malware and take control of his machine. It is actually quite easy to detect this attack, because a bit later the real google.com answer will arrive, so it is not clear whether it is still used, but at least it has been very effective. There is even a whole QUANTUM program, nothing to do with quantum technologies, and I don't want you to read everything. But the bottom line is here: the propagation delay from tip to target determines the success rate of the attack. Less latency means more success. So the NSA takes over your machine by being faster than Google or faster than Facebook.
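As an aside, the detection idea just mentioned (the real answer arrives a bit later) can be sketched in a few lines. This is a hypothetical illustration, not a real packet-capture tool: it assumes responses have already been parsed into dictionaries, and it flags a TCP sequence number that was answered with two different payloads, which is the trace a QUANTUMINSERT-style race leaves behind (a genuine retransmission would repeat the same bytes).

```python
# Hypothetical sketch: spotting a QUANTUMINSERT-style race by looking for
# duplicate responses that share a TCP sequence number but differ in payload.
# The packet fields and the sample data below are illustrative assumptions.

from collections import defaultdict

def find_duplicate_responses(responses):
    """Group responses by (src, dst, seq); flag groups whose payloads differ."""
    by_key = defaultdict(set)
    for pkt in responses:
        key = (pkt["src"], pkt["dst"], pkt["seq"])
        by_key[key].add(pkt["payload"])
    # A real retransmission repeats the same bytes; two *different* payloads
    # for one sequence number suggest an injected answer won a race.
    return [key for key, payloads in by_key.items() if len(payloads) > 1]

responses = [
    {"src": "198.51.100.7", "dst": "10.0.0.5", "seq": 1000,
     "payload": b"302 redirect to malware host"},
    {"src": "198.51.100.7", "dst": "10.0.0.5", "seq": 1000,
     "payload": b"200 OK real page"},
    {"src": "198.51.100.7", "dst": "10.0.0.5", "seq": 2000,
     "payload": b"200 OK"},
]
print(find_duplicate_responses(responses))
# -> [('198.51.100.7', '10.0.0.5', 1000)]
```

In practice this heuristic is exactly what researchers proposed after the revelations: passively compare all answers to the same request and alert on divergence.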
Of course, once they do this, they use zero-days to take control of your machine. And if you never put your machine on the internet, they put in chips during transportation; supply-chain subversion has also been quite popular. So if you read the early protocol papers by Dolev and Yao, they introduced the Dolev-Yao attacker, and it was a nice device for our papers. But I think what we learned from the Snowden documents is that the Dolev-Yao opponent exists. There is somebody controlling all our communications, and it is not just a game; it is actually a reality. Of course, after the documents it is no longer deniable, and we also know the oversight has been very weak.

When I attended a security protocols workshop in Cambridge in the early nineties, Bob Morris gave a talk. He was at that time, I believe, the head of cryptography at the NSA, and also the father of the guy who wrote the internet worm. He gave a very interesting talk, and in retrospect I should have paid even more attention. One thing he said is rule number one of cryptanalysis: look for plaintext, because people actually don't encrypt. And where do you find plaintext? Well, we are all addicted now to cloud services; we put all our data in the cloud. The PRISM program, revealed in the first week of the Snowden revelations in June 2013, showed that the NSA had in some way access to the cloud data of all those companies. We still don't know exactly how that access works: is it real time, and what is the capacity? But we know that the NSA does have access to all the cloud data which we all put there. The other very famous slide is Upstream. Of course, we have known for a long time that the NSA and others are tapping all communications; the Echelon investigation by the European Parliament is an example of this. But this slide is particularly interesting: it tells the analyst, you should use both.
So you not only use the terabits of data coming in upstream, but you should also use PRISM and combine the results. This slide is informative because it shows which companies started collaborating when. I don't think the dates really matter; the fact that they are all there is more important. This slide, of course, must have upset Google a lot. It shows one way the NSA had access to Google's data, and I guess Google was particularly upset about this smiley here. What actually happened is that Google was encrypting between the user and the Google front end, but the connections in the backbone, in the back office, were eavesdropped, not by the NSA but by GCHQ, under the MUSCULAR program. You can bet that the people who sell encryption boxes have done good business in the last years encrypting those links. So that is probably something that has been blocked off.

The third thing is metadata; in NSA-speak, DNR, Dial Number Recognition, from the old phone days. What is metadata? Metadata is not the plaintext, but which websites you visit, MAC addresses, IP addresses, location information. It allows one to map networks and to reveal social relations. And by the way, if you brought your phone (I brought mine), there are probably other people here who are on a watch list. By being in this room for an hour, you probably end up on another list, just based on this technology. We should have told you to leave your phone in your hotel room; you should not take your phone to the sessions when you go to crypto conferences. In the first week of the Snowden revelations it was leaked that the NSA was also spying on Verizon customers by collecting their metadata. This made US citizens very upset, because the NSA is paid to spy on everybody else except the Americans.
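To see why metadata alone is so powerful, here is a small hypothetical sketch: from bare (caller, callee) records, with no content at all, one can already weight relationships and find shared contacts. All names and records below are invented for illustration.

```python
# Toy sketch of how call metadata maps social networks: (caller, callee)
# pairs alone let us rank contacts and find common intermediaries.
# The sample records are invented.

from collections import Counter

def contact_graph(records):
    """Build an undirected, weighted contact graph from call records."""
    edges = Counter()
    for caller, callee in records:
        edges[frozenset((caller, callee))] += 1  # edge weight = call count
    return edges

def common_contacts(records, a, b):
    """People who talked to both a and b: candidate intermediaries."""
    neighbours = {}
    for caller, callee in records:
        neighbours.setdefault(caller, set()).add(callee)
        neighbours.setdefault(callee, set()).add(caller)
    return neighbours.get(a, set()) & neighbours.get(b, set())

records = [("alice", "bob"), ("alice", "bob"), ("bob", "carol"),
           ("alice", "dave"), ("dave", "carol")]
print(sorted(common_contacts(records, "alice", "carol")))
# -> ['bob', 'dave']
```

Scaled up to billions of records, with locations and timestamps added, this is the kind of analysis the talk describes: no content needed to reveal who matters to whom.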
And so in fact the NSA overstepped its rules and was also collecting metadata on people in the US and on US citizens, which was not allowed. So a new law, the USA FREEDOM Act, was passed last summer and became effective in November 2015. The NSA is no longer allowed to bulk-collect telephony data; this data now stays with the telcos, and the NSA has to go to the FISA court to get access to it. But be very careful: it is only telephony data. For your internet data it is still not specified what is happening. Is it collected or not? I don't know the answer, but I have a guess. Now, Europeans were of course the first to point fingers at the NSA and GCHQ, but ten years ago our parliament in Europe voted the Data Retention Directive, which forces all telco operators to collect exactly the same data. Somebody went to court about this, and the highest European court, the European Court of Justice, ruled about two years ago that collecting all metadata about everybody is completely disproportionate and actually violates fundamental rights. What was the response of the countries? Within a few months the UK had a new law, and all the other countries shrugged their shoulders and just moved on. The highest court says something inconvenient? Just ignore it. And if citizens actually get annoyed, just vote in a new law, and then a procedure of two or three years is needed to strike that down again. So our governments are extremely cynical in their response to this. Obama of course felt the criticism about the Verizon case, so he went public and said: we are not listening to your phone calls. This was addressed to the US citizens, of course; it is only the metadata. Unfortunately, a few months later General Hayden gave a speech in which he said: we kill people based on metadata. Now, you can go read the speech, and a bit further on he says: but that is not what we do with this metadata. That gives really big comfort to me.
Okay, so we had PRISM, we had the communications, we had the metadata; the fourth thing is the client systems. I already mentioned the hacking, the exploiting of devices. At Christmas 2014 we got from Der Spiegel a very nice present: a whole catalog from what the NSA calls Tailored Access Operations, or TAO. A lot of devices that can be implanted on motherboards, in PCs or in routers, to remotely eavesdrop on devices that never touch the internet. You can find the complete specs, how they communicate with them, how much they cost and how many are produced. If you are a hacker, it is really fascinating reading. These guys really had good tools, and they probably have even better tools now. So once you have all this stuff, what can you do with it? Well, it goes into a system called TEMPORA for GCHQ, or XKeyscore DeepDive for the NSA, and you can ask very interesting questions. For example: I found one phone number in the case of a terrorist attack; find all the devices of this person, their MAC addresses, their IP addresses, their location and who they talk to. It is incredibly powerful for tracing the network after an attack, and we have seen in the recent attacks that the police were in general very quick in finding the other people in the network, who supplied it, and so on. You could also, if you want to hack, say, Belgacom in Belgium because they have access to interesting data, ask: give me all Microsoft Excel sheets containing MAC addresses in Belgium, because system administrators tend to collect MAC addresses in Excel sheets. You could also say: give me all exploitable machines in Panama, if you want to get some interesting information there, because every time a machine in Panama does an update and it fails, the NSA finds out before Microsoft does. Or you can say: give me everybody in Austria who happens to speak French and uses tools like OTR or Signal.
And you can get all the information on those people. The NSA and GCHQ call this targeted surveillance. But they first collect everything about everybody, and then they say: because we only ask targeted questions, and believe us, we are honest, this is actually fine. This system is very attractive, and what we learned from the Snowden documents is that the German national secret service, the Bundesnachrichtendienst, was actually spying on European but also German companies and German citizens on behalf of the NSA and GCHQ, and in exchange they got access to those TEMPORA-like systems. So our secret services are so desperate to get this data that they are even willing to spy on their own companies and damage their own economies, just to get access to those systems.

Okay, so in the society we have created today, we have a balance between industry, government and the users. In general, the government has to protect users against overreach by industry, and we have democratic means to protect against overreach by government. In the US, people tend to be more worried about their government; in Europe, people tend on average to be more worried about industry. Now, of course, technology has changed industry a lot. If you go to any big computer science event, or if you look at whom computer science departments are hiring, it is big data people. We are collecting all the data, and much of this data comes from IoT sensors: according to Cisco there will be 50 billion sensors on the internet in 2020. So there is lots of data from lots of sensors which are everywhere: in our cars, in our rooms, in our devices, probably in our bodies. And then, of course, we will analyze these things. And then, of course, we also have the business model of industry, which is advertising. You never pay Google or Facebook or all the other companies, because they sell your data to a whole series of companies.
It is a complex ecosystem, with not only the big players but also many smaller players who trade your data in exchange for showing you the best advertising, and they get advertising revenue from this. So I think industry, for financial reasons and to give you nice services, is actually overreaching. A few years ago we heard that data is the new oil, and what I heard a couple of weeks ago was that data mining will give us rocket fuel. Now, from oil we got pollution, and I think if you deal with rocket fuel you should be regulated, because these things may blow up. So in general what we need, I think, is strong control by government. But what we have seen, through the PRISM program, is that the government actually gets access to all this data and loses its incentive to regulate, because it wants the data too. So in some sense the balance of power between all the players, which had been created carefully over the last hundred years in Western societies, has been completely distorted by technology. In fact, this concept of complete surveillance is not so new. Jeremy Bentham designed prisons in the late 18th century, called the Panopticon, where you could watch all prisoners all the time; such a prison was in fact built in London, and fortunately destroyed. He described an environment in which people would be completely surveilled. And of course, we know surveillance can give rise to discrimination, because you know much more about people. It makes people scared. It leads to conformism: if we know we are always surveilled, we will start behaving differently. And if you look at progress in history, it has very often come from people who behave differently and have original ideas. Of course, it is also a very powerful tool if you want to oppress or abuse people, and if you understand power. So the question we face is: can we control this power?
Should we build such a thing in the first place? I think the reality is that today it has been built. And the second question is: can we defend against it, or can we at least try to control it more? So the main messages are these. There is an economy-of-scale issue. You should not be worried that Austria can do this; these smaller countries will do it at a smaller scale, but only a few nations have the resources, the means and the access to data to do this. All the smaller nations will bind together with the bigger nations to get access, and this happens, as I mentioned, I believe under secret treaties. Don't believe the story that the surveillance infrastructure is passive. It is about hacking everything and everybody. This means that if you are on the path between the surveillance people and their target, they will hack you. There was a technician at Belgacom who happened to sit between GCHQ and a certain network. His device was hacked. The guy had done nothing wrong; he was not a terrorist; he was just on the path between A and B. That was enough for them to take control of his device. I think if you worry about COMSEC, like what the quantum crypto people are doing, you have probably missed the point. The problem is not the communication; the real problem is in the end systems. How can we control those? And I think this is the question for our community: how will we help make computer systems more secure, rather than the old problem we have always worked on, which is how to secure communications? And of course we are going to build a society in which there will be robots and sensors everywhere, and our governments, rather than helping us to secure it, are actually hacking it and keeping the vulnerabilities to themselves for their own interests. And maybe a final comment: it is very interesting that in European legislation, human rights are universal. Of course, if the highest court says they are violated, our politicians just shrug. But in the U.S. the
situation is even different. In the U.S. there is a firm belief that human rights stop at the U.S. border: if you are not a U.S. citizen, or if you are not on U.S. soil, you have no human rights, and it is completely legitimate to surveil you, even though scholars believe, and it is on their agenda, that mass surveillance is not acceptable and actually goes against human rights.

So now about crypto. A few months into the Snowden revelations, in September 2013, the New York Times described the BULLRUN program, the whole bag of tricks being used to go after crypto. I think that is when cryptographers really started paying attention to what Snowden had to say. Of course, if people do use cryptography, the easiest way to get access is to just come in and ask for the key. And there is, I would say, substantial evidence that this in fact happened. If a large player has an SSL private key that can decrypt all the session keys, it is likely and natural that the NSA would go and ask for that key. From then on they can read all communications and decrypt them as fast as Google can. Because of the secrecy mechanism it is very hard to get direct evidence, because if people show the letter, they go to jail, and people don't like to go to jail. There are cases like Lavabit, the email provider used by Snowden, which had a rather centralized architecture; the owner shut down his operation rather than giving up the key. There are some other cases, and as I said there is no hard evidence, but if people offering a successful service suddenly stop it, you can suspect that something may have happened. Of course, if you are a billion-dollar corporation and you get such a letter, you are not going to shut down overnight, right? But smaller companies tend to shut down, or to give up support for part of their products.
Of course, you can also try to find the private key somehow. On Monday you heard a beautiful lecture on attacks on TLS, and Logjam is one of the attacks showing that the legacy 512-bit export crypto from the 1990s can even today be used to break new TLS connections. If you read the paper carefully, you also see they make an estimate for 1024-bit keys: a large number of servers on the internet today still use 1024-bit keys, and these can probably be broken quite quickly by the NSA using this kind of mechanism. Also, the fact that GCHQ has a program called FLYING PIG, in which they collect all kinds of information on SSL and TLS servers, shows you that we are really not in very good shape there. TLS is the best-studied protocol, but in practice it is probably also the most exploited protocol.

If you can't get the private key, why not replace the public key? Because in the end, this cryptography is about the authenticity of public keys. For this we built an infrastructure: ten to twelve million servers, and a CA infrastructure, which I think is not a big success, although many people made a lot of money with it. Every device trusts about 600 of those keys, and it is claimed there are only about 60 companies behind them; I don't know whether that makes a difference. There have been several incidents reported, and one very good thing was the effort by Google to spot this better and use Chrome to detect these kinds of things. But we have companies being hacked, governments spoofing certificates, and companies selling products for impersonation, for adding additional root keys in your company so you can spy on the Facebook or Gmail usage of your employees; these are actually commercial products. So given the many attacks, I don't think we are in very good shape here.
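One mitigation often discussed for this oversized trust store is certificate pinning: instead of accepting anything signed by any of the roughly 600 trusted CA keys, the client remembers the hash of the one certificate (or public key) it expects and refuses everything else. A minimal hypothetical sketch, with dummy byte strings standing in for real DER-encoded certificates:

```python
# Minimal certificate-pinning sketch. The byte strings below are dummy
# placeholders for DER-encoded certificates, not real ones.

import hashlib
import hmac

def pin_of(cert_der: bytes) -> str:
    """The pin is simply the SHA-256 digest of the certificate bytes."""
    return hashlib.sha256(cert_der).hexdigest()

def check_pin(cert_der: bytes, expected_pin: str) -> bool:
    # compare_digest avoids leaking the mismatch position via timing.
    return hmac.compare_digest(pin_of(cert_der), expected_pin)

good_cert = b"\x30\x82 dummy DER for the real server certificate"
rogue_cert = b"\x30\x82 dummy DER signed by a compromised CA"

pinned = pin_of(good_cert)            # stored on first contact / shipped in app
print(check_pin(good_cert, pinned))   # True
print(check_pin(rogue_cert, pinned))  # False: a valid CA signature is not enough
```

The point of the sketch is the last line: a rogue certificate with a perfectly valid signature from a compromised or coerced CA still fails, because the pin binds the client to one specific key rather than to the whole CA ecosystem.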
There is some research and there are some good results on how to fix this, but it has not happened in this community, and there is also the very big problem of getting those fixes deployed, because you go up against vested economic interests. On the cryptanalysis side, there was very nice work by Marc Stevens on Flame. Flame is malware that used a new MD5 attack to falsify certificates and attack Microsoft's update mechanism. There was also a bright light on the horizon: at least there was a grassroots attempt, Let's Encrypt, to get rid of all these companies and have an open initiative to certify public keys. Let's hope that we can do something better, but there is definitely a need for more research. It so happens that this research is published more at CCS than at Eurocrypt, Crypto or Asiacrypt.

Of course, another way to get at the key is to make sure that the key comes out of a random number generator with a trapdoor. This is the ultimate dream, because if you can trapdoor the mechanism that goes from a seed to a key, then by looking at some public data, like the nonces which are always sent in TLS or IPsec, random numbers coming out of the same machine, you can, with the trapdoor, compute the key which will come out of the random number generator. It is almost too good to be true, but it actually happened, as we know. Dual EC was one of four algorithms proposed by the NSA together with some others; it went first to ANSI and ISO, and was then standardized by NIST. Then people started looking at it. They found many mistakes, or deviations, before the publication, as in the papers by Berry Schoenmakers and by Kristian Gjøsteen. Then, in 2007, after publication, Dan Shumow gave his famous talk at the Crypto rump session, where he showed that this generator could be backdoored. And what happened?
Nothing. In fact, NIST didn't do anything, and people kind of shrugged. Our community shrugged too, but I think we all thought that NIST would fix this. NIST didn't. Also, I was not so worried, because there were four algorithms, and Dual EC was a hundred times slower than all the rest. Which person would actually take something based on an elliptic curve that was a hundred times slower and harder to implement, and use it? Well, in November 2013 we found out that some people were actually using it. And to make sure that you don't use non-backdoored parameters, the standard says in the appendix: the security of Dual EC requires that the points P and Q be properly generated; to avoid using potentially weak points, the points specified in Appendix A.1 should be used. I don't know whether you like irony, but I do, and I think this is really funny. So of course, when the New York Times revealed this program, people started looking back and thinking about that rump session talk. NIST responded by withdrawing the standard; there was a new draft out, I think, in the past weeks. But then it was leaked by Reuters that the BSAFE library had made Dual EC the default algorithm. And then last Christmas we got another nice present: Juniper announced problems related to Dual EC. We knew they were using Dual EC, but they had claimed there were no problems. Careful analysis then showed that through several carefully crafted bugs, the Dual EC output would be exposed and exploitable, and the whole protocol had been changed to make the exploit work. But then, even more funnily, in 2012 somebody changed the P and Q points, the backdoor parameters. The people who wrote the Keys Under Doormats paper predicted that backdoors could be taken over by others; well, this has actually happened in real life.
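The trapdoor mechanism behind all of this can be illustrated with a toy discrete-log analogy. This is a hypothetical sketch, not the real standard: it uses exponentiation modulo a small prime in place of Dual EC's elliptic-curve point multiplication, ignores the output truncation of the real generator, and all values are made up. The point is that whoever knows the secret d with P = Q^d can recover the generator's state from a single output (for example a nonce) and predict everything that follows; and whoever replaces P and Q owns the backdoor.

```python
# Toy analogy of the Dual EC trapdoor, using exponentiation mod a small
# prime in place of elliptic-curve point multiplication. All numbers are
# tiny illustration values, nothing from the real standard.

p = 2**31 - 1          # small prime "group"; the real thing uses NIST curves
Q = 7                  # public base point (generator stand-in)
d = 123456             # the designer's secret trapdoor
P = pow(Q, d, p)       # published as an "innocently chosen" second point

def drbg_step(state):
    output = pow(Q, state, p)      # what the generator emits (e.g. a nonce)
    next_state = pow(P, state, p)  # internal state update
    return output, next_state

def attacker_predict(observed_output):
    # Since P = Q^d, next_state = P^state = (Q^state)^d = observed_output^d.
    recovered_state = pow(observed_output, d, p)
    return drbg_step(recovered_state)[0]   # predict the *next* output

s0 = 98765
out1, s1 = drbg_step(s0)
out2, _ = drbg_step(s1)
print(attacker_predict(out1) == out2)   # True: one output reveals the future
```

Replacing P and Q with your own pair, as happened in the Juniper code in 2012, simply swaps in a new d known only to you: the backdoor keeps working, just for a different owner.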
Of course we don't know who did this: was it somebody else at the NSA, was it the FBI doing it to the NSA, or the CIA, or was it a foreign nation? We still don't know. I can recommend two recent papers, great research on how to exploit this thing and on how this Juniper story happened; it is a fascinating history. All of this was actually anticipated by Moti Yung and Adam Young, who published in the late 90s a set of very interesting papers at Crypto on what they called kleptography, showing that once you trust somebody to give you a cryptographic library, that person can seriously damage your security interests. So you should never trust any cryptographic library that comes from somebody else; that was the main message. Or you should audit cryptographic libraries. They published a book, and rumor has it that they inspired the NSA to go ahead with their Dual EC coup. The book itself is called not kleptography but cryptovirology, but it is about the same thing.

So the bottom line is that even if we have secure connections with SSL, IPsec, SSH and so on, the NSA can exploit a large number of them in real time. We don't know how it works. We know some numbers from five years ago: about 20,000 per hour. We know that when the NSA intercepts ciphertext, it always goes to Utah, to the long-term storage, in case they can ever decrypt it in the future. But it turns out that they also have a connection to a system called LONGHAUL, with high-performance computing resources, where they can decrypt some things in real time and inject the result immediately into TEMPORA or XKeyscore DeepDive. We still don't know how it works, but we have some ideas; the Dual EC exploitation could be part of this. So the bottom line is that cryptography has been undermined, with weak implementations.
Think also of the Heartbleed case, another example. There is going after keys, which is not something cryptographers can do much about; well, I'll come back to this. There is undermining standards, which is something we can do something about. And maybe also cryptanalysis, although we don't know too much about the cryptanalysis divisions: Snowden had no access to those divisions, unfortunately. I would really have loved to see their internal documents leaked as well, but this was not to be. What else is in the interest of the NSA? IPsec is 48 documents. This is perfect: IPsec is even harder to implement than TLS, so probably there are even more mistakes and more problems, and we have seen far fewer papers on IPsec. This is maybe a nice research agenda for the next five years: find all the bugs there too. Export controls are still biting back at us. And of course, hardware backdoors: if you can undermine the hardware, if you can get in at the lowest level, you can always undermine security. And if you can't win anymore, you send your friends into the field, and you have law enforcement say: we are going dark, please stop the encryption, because it is making it impossible for us to do our work. There was a document, and I don't know whether you should believe it, but it is an interesting figure: some sources in the US government believed that the Snowden documents have advanced the use of encryption by about seven years. So after the revelations people started using encryption, and some companies started cleaning up their act, so this has done a lot to get more crypto out there.

But let's look at crypto today. When I started in crypto, in the late 80s, I don't think there were more than a few million crypto boxes out there. There were some government systems we knew nothing about, and there was the financial sector.
They had some ATMs and probably some internal crypto, but most of the crypto was in hardware, and a typical crypto box cost about 10,000 euro. So there were definitely not billions of devices out there. Today we actually have a very big success, because crypto is everywhere, in all our pockets, although most of that crypto is there to protect industry. I counted about 18 billion devices, and I guess I should do more work on this. The biggest deployment is bank cards. You can argue that a bank card protects the bank against you, but to some extent it also protects you against hackers. So I don't give it a red score but an orange score, somewhere halfway: it is somewhat good for the user, but it is in particular good for the bank, and this bank card is always watching what you do. Then you have access cards; these really control access to buildings, so I don't think the user himself benefits much; it is more for the organization. You have the updating of devices. There you can also argue it is good for the user that his device is updated correctly, because people try to undermine updates to attack you. But of course it means that if your device is updated, you are also under the control of the entity that updates you. In some sense a secure update mechanism is a blessing, but it is also a curse, because it means you are now controlled by somebody else. Then, where the user really is the enemy, is content protection: pay TV, PlayStations, Blu-ray, DVD, all these settings. The user is the enemy, and crypto is actually used against the user, so it is not benefiting the user. Or you can say it does benefit him, because otherwise he couldn't watch the videos; but at least in the security model the user is the enemy, not Alice or Bob. Identity cards are again a mixed case, and then you have the readers for access control and the bank terminals. So there is a large deployment, but not much of it helps the user. So what does help the user?
Phil's presentation actually spoke about cryptography for security, cryptography for privacy and cryptography for cryptography. So he has a slightly different division. I think security and privacy are intertwined, and very often you can't have security without privacy and vice versa, so I'm not sure that's the right division. What I call security is protecting user data. The biggest deployment is mobile communication: about six billion devices that have encryption. But the bad news is it's never end to end, at least not if the telco provides it. It always stops somewhere in the network, so it goes in the clear over the fixed network. Then you have the SSL infrastructure: several billion client systems and 10 to 12 million servers. I spoke about it; it's of course a fantastic infrastructure, but it's also a fantastic way to exploit, especially the public-key infrastructure behind it. Even if we fix the security, which we should do, we also have to fix the PKI, because if you don't, it remains a means for massive system exploitation. Then we have the more interesting cases. Take IPsec: I don't know the numbers, I'm not sure; I guess it's more corporate use, and it's very hard to get numbers on those. Another interesting case is encrypted hard disks. One in two hard disks today is encrypted, but most of those hard disks, if you take them out of your machine and put them in a different machine, still work, because the vendors don't want to do the key management; you have to pay extra for that. So now you have crypto costing you money, slowing you down, consuming power, but not giving you any security, because the key management has not been installed. I would say it's a mixed case: if you want to do it properly you can, but the average user is not helped by it. Then we have Skype, a very interesting case. Skype was distributed, peer to peer, European. It was not transparent; we couldn't know how it works.
Skype was acquired by Microsoft for eight and a half billion dollars in 2011. Within a few months the NSA was able to decrypt Skype-to-Skype calls. Is there a connection? I will not decide; this is for you to decide. But I think it's a problem, and it's still not open either. WhatsApp actually is based on open technologies. There are one billion users. It's based on Open Whisper Systems, but is it secure? I guess it's something for us to find out. But if WhatsApp is secure, it's the first secure crypto we deliver to more than one billion users, in the mass market. Okay, I guess it's something we should be looking at. So this slide summarizes the same thing in text, and I don't agree with everything in the article, but I think it's a very interesting article on the same topic. So there is crypto that actually works. This is also from the Snowden documents; it says OTR encrypted, no decrypt available. OTR is the Off-the-Record system. So what seems to be okay: TrueCrypt, although development has stopped and there are new forks; GPG; Tor. The NSA says Tor stinks, although we have documents from a couple of years ago saying they had problems with it. I would think that they've been investing heavily in de-anonymizing Tor, so I would not be so confident that using Tor makes you more secure. I mean, I hope it does, but you should be careful. There is ZRTP, and as we learned on Monday it's actually vulnerable to downgrade problems, so more work is needed there. What do these solutions have in common? They use standard crypto, which is RSA, Diffie-Hellman, elliptic curves with large keys. They're open source. They give end-to-end security. And they have few users; by few I mean at most millions.
In fact, if you look at the whole crypto battle, we've actually lost big, because in 30 years crypto went from a few million devices to about 30 billion, but we managed to make it impossible for the average user, the billions of users, to actually have secure communication or secure protection of data. So I think as a community we've failed very big. It's not all our fault; there are many other causes at play, but I think we should be aware of this and we should think: this is our problem too. So what about crypto research? As you heard on Monday, there is still surprisingly a lot of work to be done on secure channels. It's kind of the old problem, the problem I mentioned as not the most important one, although I think it's still very important. It's actually amazing that we are still working on this. I mean, in the last 10 years I think we've discovered much more about secure channels than in the rest of the history of cryptography. But there is still a lot to be done; for example, on Monday you heard about downgrade attacks and how to deal with secure negotiation. There is the CAESAR competition, giving us better performance for authenticated encryption. There is forward secrecy. For example, one of the things Google did after the Snowden revelations was switch from RSA, with a private key that can be requested with a national security letter, to Diffie-Hellman, where there are no long-term secrets in the hands of Google. In fact, one of my students went to Google, I think it must have been around 2010, and her summer job was to speed up elliptic curve Diffie-Hellman. She made it go from two or three times slower than RSA to only 30% slower than RSA. And Google said: this is very nice, but do you really think we're going to slow down our service by 30% for this scheme? She was hired anyway by Google. But only after Snowden did Google decide they could pay the price for moving to Diffie-Hellman and having forward secrecy.
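The forward-secrecy idea above can be sketched with a toy ephemeral Diffie-Hellman exchange. This is a minimal illustration with parameters and helper names of my own choosing, not the elliptic-curve implementation Google deployed: the point is only that each session uses fresh one-time keys, so there is no long-term private key whose later compromise reveals past traffic.

```python
import secrets

# Toy Diffie-Hellman over a prime field. P = 2**255 - 19 is prime, but these
# parameters are for illustration only, not a vetted DH group.
P = 2**255 - 19
G = 2

def ephemeral_keypair():
    """Fresh secret exponent and public value, used for one session only."""
    x = secrets.randbelow(P - 2) + 1
    return x, pow(G, x, P)

def session_secret():
    """Both sides use one-time keys; discarding them gives forward secrecy."""
    a, pub_a = ephemeral_keypair()     # client side
    b, pub_b = ephemeral_keypair()     # server side
    shared = pow(pub_b, a, P)
    assert shared == pow(pub_a, b, P)  # both sides derive the same value
    return shared  # a and b go out of scope: nothing left to seize or subpoena

# Each session gets an independent secret; compromising the server later
# reveals nothing about earlier sessions, unlike RSA key transport.
s1 = session_secret()
s2 = session_secret()
```

With RSA key transport, by contrast, one private key decrypts every recorded session, which is exactly what a national security letter can demand.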
If you look at the protocols: there is DNS security, but in fact it doesn't work that well. For BGP they're still discussing routing security; this is still not done. And all the other things, we should have them secure by default. If you want to go for it, you should go to the IETF, and I guess it's going to be a long and difficult battle. You can do other things: Adrian Perrig at ETH Zurich is just designing everything from scratch again. He's looking at: if we would build the internet today, given the threat models we have today, how would we do this? I recommend you look at this work, it's really interesting. It's actually not impossible to do. Of course, from a deployment point of view it can only be done as an overlay, but it's not impossible; I think that's the message I want to give. Metadata is a very big problem, and I think this community has kind of abandoned it and left it to the PETs community. It's actually a much harder problem than encryption, because with metadata protection you never get nice proofs. You need to have models about communication systems, about communication statistics and so on. So it's very difficult research. I think Tor is a good thing, but we need many more systems, many more approaches, and I think we should look at the problem more. In particular location privacy: there are hundreds of papers, but there is zero deployment. There is very little interest in it, but also very few solutions that have actually been tried and work. One thing we should have learned: we teach our students that cryptography is used to reduce secrets to keys. But we don't teach our students that if you put all secrets into one key, then somebody will come and get this key.
So we should actually think about this. I think Yvo Desmedt was one of the first to look at some of these problems, in group cryptography, about sharing keys, and we had research from the 90s about this. But as far as I know it's only been deployed by some CAs and for the top level of DNS; threshold cryptography in general is not deployed. I can even tell you a secret: for the Helios elections of the IACR we don't use threshold cryptography. There are three trustees, and if one of them loses their key, then we have to do the election again, because threshold cryptography is too hard to implement. I think there we can actually do something, because it's not so hard to implement; it's just a matter of going for it. We have technologies from the 80s and 90s with which we can protect against these things, but we should also make sure that they get used. The harder problem is computer security, of course, and if crypto is in a bad state, computer security is probably in an even worse state. We have a lot of legacy. We have more and more zero days, a serious problem; think of Heartbleed or Shellshock. And what's not been reported so much is that the US government probably had access to Heartbleed before all of us, so they could first use it for a while to exploit, before it was actually released. Our devices need continuous updates, and in fact updates can depend on your IP address, your MAC address, your country and other things. So the entity that updates your device also controls your security and privacy completely. We don't have very good defense technologies. I think the industry now has more or less given up, and they say: we're going to go for security by big data. We'll watch everything and everyone all the time, and those who behave badly, we will find them. I'm very skeptical.
I mean, the watching will work; whether they will find the guys who behave badly, I'm very skeptical, but some people will make lots of money with this. We don't think about human factors. Supply chain: this is the famous picture where NSA employees are adding chips to Cisco routers. So I know companies that go pick up their devices in person at the supplier to avoid these kinds of attacks. So it's in a pretty bad state. I guess the only thing we can do quite well is protecting data at rest. We have encryption; depending on who you trust, you may deploy BitLocker or TrueCrypt, in the cloud too. In principle we can do it, but it's not done. And the big problem, of course, is the key management; we don't work enough on this. But what if you want to compute on data? Let's take a step back. I think the real issue is: architecture is politics. The way our IT infrastructure has evolved, of course it has evolved gradually, and for several reasons. But in the end, the way it's being done is to create control. It can be control by governments, or it can be financial control, or control by monopolies or de facto oligopolies. We should not think, as technologists or scientists, that the internet and the cloud and these things happened just by coincidence or because they're the best solution. There is some steering behind this, and the fact that Skype was acquired is a good example: Skype was distributed, but Skype had to be taken under control. That has privacy implications but also security implications. The other thing is, I think we should stop collecting big data. As a community we should actually advocate this: we should stop doing this, because big data yields big breaches; think of it as pollution. People still don't think about the consequences, but there are so many breaches today that even large companies cannot guarantee that their data will not be breached.
And breaches are not only a privacy problem, they're also a security problem. Think of the Office of Personnel Management: everybody in the US with a security clearance had their data breached. This is a major security problem for the US. So we should really be very careful about this. We collect massive data in the hope of making rocket fuel and getting some short-term benefits; in the long term we may destroy our ecosystem. It's very similar to what we've been doing with global warming and oil and CO2. We should be very careful about this. So, interestingly, I don't want to spend too much time on the legal side, but I think we have to talk to politicians and lawyers too; it's not only a technology problem. I think it's very naive to believe we can solve it with technology alone. So last October, the European Court of Justice... actually, fittingly, I can give this talk here, because Max Schrems is a student in Vienna, a very smart student. I recommend you look at his lectures; for example, his CCC lecture is brilliant. He actually brought down the Safe Harbor agreement by complaining to the Irish privacy commissioner about PRISM, saying: my personal data is actually being abused by the NSA, and you should do something about this. And the highest European court actually said: it's correct, Safe Harbor has to be stopped. Of course, the politicians are now working on Privacy Shield, and I'm not sure Privacy Shield will be better, but at least there is a small victory. But I went to a conference with lawyers and politicians last January, and the European bureaucrats said: yes, the highest court has spoken about this, but in the end it's a political problem. So again, what the court says doesn't matter: if it goes to the core of our society, to political power, then the highest courts, even in Europe, are being ignored.
So maybe a bright spot on the horizon is that last week the General Data Protection Regulation got approved, and in fact it will come into force two years from now, because it makes such a big change that industry gets two years' time to adjust. It would be a topic for a whole lecture, but essentially it embodies something like privacy by design, and lawyers have already kind of been bending this thing. But I think it's a unique opportunity for cryptography to get some of its techniques used, because the privacy-by-design principle says that if you want to achieve a certain goal, you should do it in the most privacy-friendly way. You should not collect more data than you actually need for the service. The fines are now serious, up to 4% of global turnover. Of course, there is an exception for national security; apparently this doesn't seem to be relevant, so the fact that mass surveillance violates human rights cannot be dealt with by privacy legislation. What we should do, I think, is keep data local. Somehow the industry has convinced us that this device, which has more power than a Cray from the 80s, is not smart enough to tell somebody which advertisements I want to see. And somehow we all believe this. This device knows a lot about me, and it could perfectly well tell which advertisements I want to see without giving away any data about myself, except for that choice. I think that's a fundamental point. Of course, you say: I want backup. Well, we can create local backup infrastructures in our homes, with our friends. We can use threshold systems for this. It's actually possible to achieve the same functionality, maybe at a slightly higher cost, without putting all this data at risk in a central place, and we would be protected at the same time against industry and against governments.
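The threshold idea mentioned here, splitting a backup key among friends so that any k of them together can restore it while fewer learn nothing, is the classic Shamir scheme from 1979. A minimal sketch over a prime field; the field size and helper names are my own choices for illustration:

```python
import secrets

P = 2**127 - 1  # a Mersenne prime, big enough for a toy 126-bit secret

def split(secret, n, k):
    """Split `secret` into n shares; any k of them reconstruct it (Shamir)."""
    # Random polynomial of degree k-1 with the secret as constant term.
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

key = secrets.randbelow(P)
shares = split(key, n=5, k=3)
assert reconstruct(shares[:3]) == key  # any 3 of the 5 trustees suffice
assert reconstruct(shares[2:]) == key  # a different 3 work just as well
```

So losing one trustee's share, the Helios failure mode described above, would not force a rerun: any three of five shares rebuild the key, while two shares reveal nothing about it.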
And of course, there are many cryptographers in front of me, so you all know this can be done for many applications using zero knowledge, oblivious transfer, private information retrieval. Maybe the other stuff is a bit more challenging to do very efficiently, but we should work on this. At least we should show what is possible today. It's a very difficult sell though, because if you collect everything, then in general there is more control. The government and industry will prefer to have more data, because they see other uses for it, and some applications work better with central data, especially fraud detection in some cases. If everything is local, fraud detection becomes a lot harder. So we have to think about this. But in general, as cryptographers, we face the problem that the broader public does not understand what our crypto magic can achieve. In spite of the paper by Jean-Jacques Quisquater and Louis Guillou and Tom Berson and others on how to explain zero knowledge to our children, we still haven't managed to explain zero knowledge to our politicians or to our lawyers. Okay, I'll skip the example I wanted to give. Of course, for some data there are good reasons to centralize information. An example is genomics research. There is now a trend in the medical sciences to characterize individual cancers and develop treatments based on individual cancers. It's clear that you cannot find which treatment works best for each cancer if everybody insists on keeping their own data to themselves. You need to pool data there, because otherwise you will not be able to find it. So this is clearly a case where big data is needed. We have some technologies which are imperfect, like pseudonymization; we know that it kind of sucks, but it's something we can look at. There is differential privacy; it has its shortcomings, but it can be used.
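Differential privacy, mentioned just above, can be illustrated with the textbook Laplace mechanism: answer an aggregate query with noise scaled to sensitivity divided by epsilon, so no individual record changes the answer much. This is a toy sketch with invented data and helper names, not a production mechanism:

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) by inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon):
    """A counting query has sensitivity 1: adding or removing one person
    changes the true count by at most 1, so noise scale is 1/epsilon."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Toy cohort: how many patients are 65 or older? Smaller epsilon means
# stronger privacy and a noisier answer.
random.seed(1)  # only so the demo is repeatable
data = [{"age": a} for a in range(100)]
answer = private_count(data, lambda r: r["age"] >= 65, epsilon=0.5)
```

The released `answer` is close to the true count of 35 but deliberately perturbed, which is exactly the shortcoming and the strength he alludes to: accuracy is traded for a provable privacy bound.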
There is work on search and processing of encrypted data, and this is one way of doing it, but you should be aware that in genomics people speak about gigabytes of data, and it's not clear we're ready for those kinds of scales. I think we also need to think of cryptographic techniques that give better governance. Once this data is there, we should see who accessed the data, who did what with the data. We should invent cryptographic mechanisms to protect this data better and to make sure that if people do certain operations, they're somehow logged, and that you can actually find out who did what at which time. So I think it's a fascinating research topic. I still think we should favor local data. We should not say: oh, we can encrypt everything, put everything in the cloud, because we have all this fantastic fully homomorphic encryption, so now go to the cloud. I think there is actually a very big risk that in our enthusiasm for FHE we forget to point out the risks of data in the cloud, and we all know that not all the data in the cloud will be encrypted with FHE in the next 50 years. We should actually start our papers with that; that would be really honest. Because in fact, if you go to industry cloud conferences and you raise the privacy issue, then these guys, if they can pronounce it, say: fully homomorphic encryption. So we should not oversell our solutions, but of course we should work on them. Okay, the next point is transparency. If you think about all these things, about control, about backdoors, about updates specifically for people in a certain country, you really prefer open systems. Of course there have been big benefits, and we have fantastic systems created as closed source, but still, I think as a society we should go to open systems, because what we want is transparency and security.
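The governance idea above, logging who did what with the data so that tampering with the history is detectable, can be sketched as a hash chain, the simplest tamper-evident log. The function names are mine and this is a minimal illustration; real systems of this kind (Certificate Transparency, for instance) use Merkle trees and signatures on top:

```python
import hashlib
import json

def append(log, entry):
    """Chain each record to the previous one; rewriting history breaks the chain."""
    prev = log[-1]["digest"] if log else "0" * 64
    digest = hashlib.sha256(
        (prev + json.dumps(entry, sort_keys=True)).encode()
    ).hexdigest()
    log.append({"entry": entry, "prev": prev, "digest": digest})

def verify(log):
    """Recompute every link; any edited, reordered or dropped record is caught."""
    prev = "0" * 64
    for rec in log:
        digest = hashlib.sha256(
            (prev + json.dumps(rec["entry"], sort_keys=True)).encode()
        ).hexdigest()
        if rec["prev"] != prev or rec["digest"] != digest:
            return False
        prev = rec["digest"]
    return True

log = []
append(log, {"who": "alice", "action": "read", "record": 17})
append(log, {"who": "bob", "action": "update", "record": 17})
assert verify(log)
log[0]["entry"]["who"] = "mallory"  # tamper with the history...
assert not verify(log)              # ...and verification fails
```

This only makes tampering evident to an auditor who checks the chain; access control and who gets to run the audit are separate governance questions.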
Of course, open source so far has had quite bad governance, and Heartbleed is an example of this. But in general, as a principle, what we want is that the powerful, which is the governments and the large corporations, should be transparent in what they do, and the small people should be able to have secrets. I think what we have today is the other way around. In fact, the big companies are completely opaque about what they do with our data, and so are the governments; I mean, if there had been no Snowden, we would not even know what Tempora is and what it could do. But we as individual citizens have no protection. I don't want to start a debate about open source licensing and whether closed source can be a solution in certain cases; there can also be other mechanisms, so I'm not saying you always have to give away your code. But we should think about this more, advocate this more, and try to provide solutions. So in December, at Asiacrypt, Phil spoke about academic freedom, and I really recommend that you listen to his talk and read the essay which he wrote. Of course, as academics we are free to choose which problems we work on, but there is some pressure even in academia, because of the publish-or-perish problem, or because you want to make impact, so you want to work on the topic everybody's working on, because then you'll get more citations. And of course in industry you have even less freedom. On the other hand, I think it's very hard in cryptography to predict what will be valuable, and even harder to predict what's going to be valuable to society. On the other hand, if you're honest with yourself, you can probably tell about certain of your results: I know this is crypto for crypto, as Phil calls it. But in general it's very hard to make the distinction; this is where I maybe disagree a bit with Phil.
For example, if you think of the public key concept and Bitcoin or Zerocoin, these are fantastic concepts, but it was very hard to foresee that people would actually create ransomware with them. That's a good example of good crypto being used for bad, and I think a lot of good crypto can be used for bad things too. Identity-based encryption is another example: it's not a surprise that GCHQ is a big fan of identity-based encryption, because of the backdoor functionality. But maybe for my IoT devices at home I want IBE, maybe. There I want the backdoor, because they're my devices. So I think it's very difficult to decide in advance what is going to be risky or not, and I guess we can also use a quote of Einstein if we want to be free in some sense. There is also the big gap between theory and practice, and some people have been arguing it has been growing, although you should be aware that when RSA was invented it couldn't be implemented, and I think it took about 20 years before there were large-scale implementations. Of course, some people say nothing is more practical than a good theory, and RSA theory is important, at least in theory. So in crypto we have assumptions and primitives, like AES, or factoring is hard, or your favorite learning-with-errors assumption. We have modes, like GCM or SIV or something like this; protocols, like the authenticated key establishment in TLS; then standards like TLS or IPsec; you have specs, which have state machines; and then you have code. And we've been very good in the research agenda at building up from the lower layers with proofs to the upper layers. I will not go into the details about models, although I think we should be very careful with our models. And on Monday something really struck me: a speaker said, I was really happy because we could even make them change the protocol so our proof would be better.
If it's a small tweak, I'm also happy for him, but your model only covers part of the system, and if you're going to overrule engineering intuition or common sense to make your proof work, I think this is very dangerous, because the proof only covers a small view or part of the model. In fact, you should be very careful if you start changing things to make proofs work. Something that stuck with me from when Oded Goldreich gave a talk at Crypto in the 90s: he said, my father said it's possible to build a cabin without foundations, but not a long-lasting building. So we should actually look more at foundations. But if you look at what we've been doing here, it's not the foundations; this is actually the next layer. We've been building the walls, but not working on the foundations. So which problems are hard? Here I will quote Jim Massey, who said: a hard problem is what nobody works on. So in fact, how many of us work on hard problems? It's very hard to get publications on them, and people will say, well, you didn't make much progress. And if you build on your own earlier work, it's much easier to have success with your next paper than if you work on a hard problem, such as the hard problems on primitives. At the upper layer, we saw some very nice examples on Monday that with automation you can actually stretch what we do in provable security up to the implementation, which I think is where we should go. It's a nice first step; that being said, I think much more needs to be done there, because it makes no sense to have 40-page crypto proofs and then somebody who doesn't understand them making buggy implementations; then I think we have lost. So bridging proofs and implementations, I think, is a very good development. The other thing is the crypto lifecycle. We have focused mostly on design, and now there is this kleptography issue, this backdooring. So can we actually counter kleptography, as Moti Yung calls it; can we limit the power of people who tamper with our devices?
Okay, this I think is very interesting: how to kill covert channels, how to kill these things. But if you look at devices, they have a whole lifecycle, and as cryptographers we tend to stick just to the crypto design. We don't look enough at all the other phases of the lifecycle and at what we can do there with cryptographic means to make things more secure. I think there are a lot of things to be explored. So I won't say too much about concrete suggestions, but: try to understand real problems. Try to look at deployed systems, standardized systems. I mean, the conference organizers were giving you all these gadgets and other things; it would be really interesting to look at all this stuff and see whether the crypto in there makes sense and whether the implementation is actually okay. And if you can't do it, ask your students; they're very often smarter than you, so maybe they can do it for you. I also think we should not only write papers but also develop solutions. So to conclude, I want to say a few things about the FBI versus Apple case. I think it's a very good opportunity for all of you to get involved in the debate. I don't have time to go into the details, but I want to make a few comments. First, the FBI had the metadata. This was never in the press, but they had it already. Second, a zero day was used, and as far as I know this zero day has not been made public. So the government is stockpiling zero days and making all of you less secure for their private interest. And the third thing is that it is well possible, I don't know whether this has happened, that a national security letter has been sent to Apple saying: for everybody outside the US, please put this backdoor in the system. So in the US you may be okay, you may be protected, but if you are outside the US, this may be problematic.
So in general, if you join the debate, and in particular assume you would be successful, assume that 10 years from now the distinguished lecturer can stand here and say: we have really won, there are now 100 billion crypto devices that secure all the user data. Then you will get the question: but what about law enforcement access? What about security versus privacy? I recommend that you read this paper here, which actually says that this is the wrong framing of the debate. In fact, privacy is a security property, so if you put them in a balance, that is completely the wrong debate. Also, privacy is multi-dimensional: in this debate people always speak about personal privacy, but they never speak about privacy as a public good, and privacy is essential for society to work as we know it. Of course, I would also argue that the intelligence agencies have abused their power, have overreached and have tilted the balance anyway, so we should now work on the other side, the user side, and this may sometimes damage what the police can do. But on the other hand, it seems to be kind of a taboo to work on law enforcement access, and I think we should break this taboo. We should not say it's impossible; I think we should think about it, at least write papers: how can we do this better? In the first instance, I think, we should now secure, because we actually have a backlog; but I don't think it should be a forbidden question to ask: imagine we had perfect channels and perfectly secure devices, and there are some cases where government may need access, how would we do this? In an auditable way, in a controllable way, in a limited way? We actually have no answers there either.
So, time to stop here. I think what we learned from the Snowden documents is that the threat models which we kind of didn't believe are actually reality; we have very strong opponents. We should think more about system security and network security. It's not about twiddling: it's very nice to improve our current protocols, but what we really should be doing, I think, is rethink architectures and build distributed architectures. That's a very different approach than just trying to fine-tune what we have. And we should also help to build technologies. As a research community we have a responsibility to do good research, but we also have a responsibility to make sure that some of this actually gets used. And I think we should also engage in the public debate. If we cryptographers don't do it... at least we understand some of the issues, so I think we should engage and help people understand the issues. So I would say it's all about choices, choices you make, and the future of cryptography, that's you. Thank you very much for your attention.

So thanks for this talk. Do you have any questions?

Thanks for the talk, Bart. You mentioned transparency as one of the goals, and I guess there's a big question: how do we actually deliver transparency? We can't expect users of various devices and so forth to actually know the cryptography, even if we understand the cryptography, and I think transparency also involves understanding what the threats are and so forth. So can you say something more about transparency and how we will deliver that?
The problem is that today we have fairly little transparency. I mean, to give an example, there are several examples, like the Juniper case: Juniper didn't disclose all these things, it was actually people who spent their Christmas vacation reverse engineering all the NetScreen code to find out. So if there had been transparency, at least experts would have had the chance to look at this, and these problems could have been spotted earlier. I think the other case where the importance of transparency shows is the Volkswagen case, which is much better understood by the broader public: you would want transparency in the software there. Of course that doesn't mean that you want every car owner to hack or even read their own software, but at least society should organize mechanisms so that there is auditing and there are incentives. I think as a government you actually have money, so you can spend resources: you can make offers to companies or academic researchers, you can say, here is a grant, please audit certain pieces of code. That's another way to get transparency. So it doesn't have to be the user himself understanding it, but at least it should be possible in principle. And in that sense, if you have this model, it's not necessary that the code be revealed to everybody, so you could keep some of the business models with code which is not open, but you could still have transparency about what is happening.

When we think about security against large organizations such as the NSA, we tend to stick to the old metrics of how much time and how much memory is required to break a particular security mechanism. In my opinion we should think about different criteria. The NSA can throw a lot of computing power at problems; where it is really limited, in my opinion, is in the number of good cryptanalysts it has, in the number of technicians, in the number of hackers. You're
talking about several hundred cryptanalysts; you're talking about several thousand people who are really top-level hackers. So we've been working on gradually improving everything we do. If you look at TLS, for example, it's a protocol which has been around for a very long time, and we keep twiddling it and making sure that the next version stops a particular attack. In my opinion we would have been much better off if we had completely replaced our protocols. I know that it's difficult and there are limitations everywhere, but instead of trying to improve things and get rid of one more bug at a time, we should make agility much faster. So we might have new protocols which have bugs in them, because we are not going to spend as much time on getting rid of all the bugs, but we are going to swamp the NSA with its limited human resources, rather than its almost unlimited computing resources. It's related to the question of, for example, the monoculture: a very large fraction of the computers are using Windows, and Windows is evolving, so with a limited effort the NSA can understand how it works. If each company in the world was using slightly different kinds of algorithms and data structures for its internal communication, life for the NSA would have been much harder, even though a certain percentage of those are going to have hidden bugs in them. So I think that we are wrong in trying to build a very small number of protocols which we try to make as secure as possible; I think that instead we should move at a much faster rate in shaking up and changing things.

I think that's a good point, and some researchers in computer security have looked at diversity and biological mechanisms. I think there is one thing that may break it, which is automation: if the NSA can write tools to analyze protocols, then the question is whether we will have tools to design them while they have tools to analyze them, and then I think it becomes a war of software against software, and
maybe we have a more equal battle there. But I think this is what's going to happen: it's going to be automated, and then we'll see who has the best automation tools.

One last question before the coffee break. On the subject of automated exploitation: I believe that there is already a DARPA grant to do research on automated detection and exploitation of bugs using machine learning, so it's unclear that just making more systems would have any impact. Thanks.

So thanks, Bart, and the program is continuing in 17 minutes, at 10:40, here and downstairs. See you there. Thank you.