So today it is our pleasure to welcome Gilles Barthe, our first invited speaker for the Eurocrypt conference. Gilles received his PhD in mathematics from the University of Manchester in 1993. He has been a professor at the IMDEA Software Institute in Spain since 2008, and since joining IMDEA his research has focused on building foundations and tools for verifying cryptographic constructions and differentially private computations. Today he will speak about advances in computer-aided cryptography.

Good morning, and thanks a lot for the invitation; I'm very honored to be here. Maybe for the French people in the audience this place is a bit special: it is actually well known in France for political meetings, and I think one of the candidates in the presidential election held a meeting here last week. Luckily for you I won't be talking about politics today, but it's still very nice to be in this room.

We actually came to be interested in cryptography a bit by luck. I was attending a meeting in Estonia in 2000, presenting work on logic, and there was a course by Claus-Peter Schnorr about the generic group model. I attended the course, found it very exciting, and started to work on this a bit later. But what really got us started seriously on cryptography, and I'm very thankful to the authors for this, are two papers published around 2005: one by Shai Halevi and one by Mihir Bellare and Phil Rogaway, and to some extent also work by Victor Shoup. In particular there is this very beautiful and insightful paper by Shai suggesting what a tool for building and verifying cryptographic proofs could look like. I don't think this paper was ever formally published, but it has served us as a source of inspiration for many, many years, and we are very grateful for it.

As I said, we started this work around 2006, trying to understand crypto in our own corner. We started from very, very far; we didn't have any background, and gradually a lot of cryptographers answered all our silly questions, for which I'm also very thankful. Something else that was very important for us was the CRYPTO 2011 call for papers, which I hope you can read here: it explicitly welcomed topics that don't routinely appear at CRYPTO, including cryptographic work in the style of the CSF symposium, CSF being the Computer Security Foundations symposium, which is about applying formal methods and logic to security problems. We were motivated by this call, we submitted a paper, and we were very lucky and very happy to get in. If it were not for that call for papers we would never have dared to send our work to a crypto conference; we assumed only people doing serious work published there, and maybe it was not for us. So we were very grateful to be given this opportunity, and it has been a great pleasure to interact with the crypto community ever since.

I keep saying "we" without giving names: this is work with many people, mainly in Spain, France, and Portugal, and also many other collaborators. I've been a bad guy and didn't put their names on the first slide, but there are many of them.
Okay, good. The work we were doing initially was focused on the problem set out in Shai's paper: coming up with tools that help cryptographers verify that their provable-security proofs are correct. Over the years we have broadened the kind of work we do, trying to seize every opportunity to apply what we know, which is programming languages and program verification, in the setting of cryptography. If you try to give a definition of computer-aided cryptography, you can think of it as a very broad field where you develop tool-assisted methods for designing, analyzing, and implementing cryptographic constructions, both primitives and protocols. The methods we try to develop are principled: they build on a mathematically rigorous approach, taking inspiration from logic, which is where we come from.

There are many goals one could consider. For a long time, people applying formal methods to cryptography worked in the so-called symbolic model. The symbolic model is very useful because it supports automated tools, and these automated tools have been very good at finding very subtle flaws in protocols; this has traditionally been one of the main focuses of computer-aided cryptography. Personally, though, beyond finding attacks I'm also interested in building proofs, and there has been a lot of work on building independently verifiable proofs of computational security. The basic idea is that if you have tools which can check your proofs automatically, you break the symmetry between building a proof and checking a proof, so there are fewer opportunities for flaws in the proof, because the tool checks it for you. A more recent goal we have been looking at is verified implementations. Coming from programming languages and verification, you might think this would be the first thing we would look at, but somewhat ironically it came later. Finally, you can hope that by using tools you will not only improve or fix what cryptographers have been doing, but bring your own contribution and come up with, for example, a new cryptographic design or a better implementation. I hope to give you some examples where using tools and principled methods actually sheds light on problems that were not well solved before, bringing forward the state of the art in cryptography.

For all this we build on formal methods, an area which has been very active for the last 40 or 50 years. It is a huge field focusing on many different aspects. The main goal has been to prove that programs are correct, that they achieve their stated purpose; this is what program verification is about. There is also a lot of work on program analysis, making sure programs are safe, for example free of memory errors during execution, which can be a serious cause of attacks. There is a lot of work on compilation, optimizing implementations. And there is a newer line of work on program synthesis, which is about automatically generating programs that meet a certain purpose.
For example, I'm very excited about applications of program synthesis to cryptography; I will say a few words about this during my talk. Essentially we have a huge corpus of techniques developed in a slightly different setting, and one thing I find very exciting is taking this corpus and applying it to cryptography. At least for me it has been a great source of fun, and there are lots of very exciting problems we have been able to look at through this lens.

The potential benefits go in both directions. If you can develop good formal methods for cryptography, you can expect higher assurance, which was the goal originally stated in Shai's paper, but I also believe it will be very useful to narrow the gap between provable security and crypto engineering. There is a trend, which you can mostly put under the banner of real-world crypto, where Kenny and other people have been trying to come up with security definitions that match the reality of implementations more closely. As you do this, things become more and more complex, and eventually, and this is my belief, people might disagree, formal tools give you the bookkeeping and the means to tame the complexity of proofs. They will really be necessary if you want to bridge the gap between security proofs and low-level implementations, since crypto engineering worries about, say, assembly-level implementations, and I don't think you can go the whole way between the two without tools. As I mentioned, you can also expect new proof techniques. On the other hand, this is the propaganda part, "guys, I will save you", and I don't know how much I believe in it; but applying formal methods to cryptography has also been a great source of examples for us, very challenging and very nice problems to think about, and it helped us develop new theory for programming-language verification. So we get a lot from this too: a lot of fun, and new and interesting theoretical challenges.

Here is a long-term goal of computer-aided cryptography. I will not bet on what I mean by "long-term": my supervisor, who was looking at formal verification of mathematics, made a bet with me many years ago about when mathematicians would be using proof assistants to verify their theorems, and I think he is on his way to losing it; there are 25 more years to go, but I don't think he will make it. So I won't make any bet, but here is where we could ideally try to go. Consider a reductionist proof showing that an assembly-level implementation is secure: if you have an adversary that breaks the assembly code, then you can build an adversary that breaks the design. Provable security shows there is no adversary that breaks the algorithm, so you would be done, except that in the middle you have to throw in two assumptions, which will be handled using programming-language techniques. The first is that the assembly code is of good quality; I'm not going to specify this fully, but let's say it has to be safe, meaning it respects a programming discipline that prevents bad things from happening, and it has to be leakage resistant. I will give an example of what I mean by leakage resistance; it is by no means the only possible definition. The second assumption is that the assembly code correctly implements the algorithm: the assembly code meets its intent.
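To make the shape of this concrete, here is one way to write the intended end-to-end statement. This is my own schematic rendering, not notation from the talk; safe, leakfree, and the semantic equality stand in for the two programming-language assumptions just described.

```latex
% Schematic end-to-end reduction (hedged, my notation): under the two
% programming-language assumptions, an adversary against the assembly code
% yields an adversary against the algorithm with at least the same advantage.
\forall \mathcal{A}.\;
  \mathsf{Adv}^{\mathrm{asm}}(\mathcal{A})
  \;\le\; \mathsf{Adv}^{\mathrm{alg}}(\mathcal{B}(\mathcal{A}))
\quad\text{assuming}\quad
  \mathrm{safe}(\mathsf{asm}) \;\wedge\; \mathrm{leakfree}(\mathsf{asm})
  \;\wedge\; \llbracket \mathsf{asm} \rrbracket = \llbracket \mathsf{alg} \rrbracket
```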
If you manage to do this, that is what I meant by closing the gap between provable security and implementation. For the first assumption, that the assembly code is safe and leakage resistant, you can use program analysis, which is exactly about establishing this kind of property. For the second, that the assembly code correctly implements the algorithm, you can use program verification and also verified compilation, a line of research which shows that compilers preserve the functionality of programs. So it would be really cool to get there, but there are many challenges, and it's not for tomorrow.

The first challenge is building models. There is a problem here which has nothing to do with crypto: building models of execution platforms is a very big challenge. If you really try to give a semantics to x86 or ARM, it is a big mess; merely specifying the functional behavior, "I give you an x86 program, you run it on the platform, which result do you expect", is already fairly hard. Second, for leakage resistance you have to build models of leakage, which is extra hard; some people are working on this, but it needs more research. You also need to come up with a realistic model of the adversary. This is something real-world crypto tries to do to a certain extent, but there is more work to be done, both at the algorithmic level, which is essentially where real-world crypto is looking, and on what it means to be an adversary executing on the platform. There has been some work in this direction, for example considering a virtualized platform where the adversary executes in some other partition, but it is very preliminary. A second problem is that if you want to verify implementations, you need candidate libraries to verify: good, efficient code that is safe and leakage resistant. This is also hard to come by today; I'm not aware of everything going on in crypto libraries, but there is no immediate candidate for verification, these libraries still need to be built. And of course this is a huge challenge for formal methods: we need to develop new theory, and there is a lot of engineering to make all the pieces fit together.

On the other hand, the situation is not hopeless, because a lot of work has been going on in this area. As I mentioned, maybe somewhat ironically, most of it has focused on security rather than on the areas where formal methods have traditionally been applied. There is a lot of work on building tools for checking security in the symbolic and the computational model.
We have been working on a tool called EasyCrypt, which I will describe shortly, but there are many other tools. There is an increasing number of tools that focus on side-channel analysis, and I'm mentioning two kinds here: tools focusing on constant-time verification, constant-time cryptography, which tries to avoid cache-based timing side channels, and tools focusing on masked implementations, that is, protection against DPA attacks. I will present briefly two of these tools as well. There is also work on analyzing code for safety, such as the TrustInSoft analyzer, which has been applied to some TLS implementations. There is work on functional correctness, which is also very important for security: any time you have a very stupid bug, like a carry bug, it can lead to an attack. There are tools there too: Cryptol, developed by Galois; a number of tools that leverage proof assistants and certified compilers, in particular the CompCert/VST toolchain, where CompCert is developed at Inria and VST at Princeton; and the tool gfverif. There is also a parallel line of work on principled implementations, such as qhasm and BoringSSL, and a lot of work, I think mostly at Johns Hopkins University, on mixing formal methods and cryptographic engineering, with many tools developed and published at CCS over the years. So there is a lot going on. I think this work has led to very promising results; I would not say that any of the problems has been solved in isolation, but we are going in the right direction.

What is slightly newer is integrating these tools, and there have been a few case studies showing that, to a certain extent, you can actually mix these directions. Last year we did some work with my friends in Portugal and a colleague in Spain where we looked at an implementation of a small component of TLS, MEE-CBC. We built our own implementation, both because it makes verification simpler and because it is not clear there is a good implementation available to start from. We looked at all the aspects: we had a security proof in EasyCrypt, in the standard style of provable security; we had a C implementation, which we showed functionally equivalent to our EasyCrypt specification (a toy analogue of this step is sketched below); we compiled the C implementation using CompCert, which is what gave us the result I mentioned, that the generated assembly code has the same behavior as what we had in EasyCrypt; and we had a type system for verifying that the implementation is cryptographically constant time, which is how we handled the side-channel analysis. It is a very small example, but it shows that everything can be done together, at least at this scale. There are other examples: we had previously done some work on PKCS; people at Princeton and Harvard have done work on HMAC, although in that case they don't look at side channels; there is HACL*, being done here in Paris at Inria, which focuses on functional correctness and side-channel resistance; and there is the work on TLS. So there is a lot of work trying to combine several of these aspects. In the end it is a long-term goal, but I hope it is not a crazily long-term one.
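Coming back to the functional-correctness step in the MEE-CBC pipeline above: the real work proves spec/implementation equivalence in EasyCrypt, but a minimal testing analogue of the same functional statement looks like this. The spec_select/impl_select names and the branch-free selection trick are mine, purely for illustration.

```python
import random

def spec_select(bit, a, b):
    # "Specification": clear and direct, but branches on the secret bit.
    return a if bit else b

def impl_select(bit, a, b):
    # "Implementation": branch-free 32-bit selection via a mask.
    mask = -bit & 0xFFFFFFFF              # all-ones if bit == 1, else zero
    return (a & mask) | (b & ~mask & 0xFFFFFFFF)

# The pipeline described above *proves* this equivalence; here we merely
# sample it, as a sanity check of the same statement.
for _ in range(10_000):
    bit = random.randint(0, 1)
    a, b = random.getrandbits(32), random.getrandbits(32)
    assert impl_select(bit, a, b) == spec_select(bit, a, b)
```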
The tool we use for the first step, the proof, is called EasyCrypt, and you can think of it as a domain-specific proof assistant. There are lots of tools for reasoning about mathematics and about programs; we built our own because we wanted proof goals tailored to reductionist proofs. Reductionist proofs have a different flavor from standard program verification: in program verification you reason about one program at a time, while in a reductionist, game-hopping proof you generally have two probabilistic experiments and the goal is to relate them. This has become known as relational verification: you reason about two programs at the same time, and the programs are probabilistic, which is also a bit new from the point of view of program verification. So we decided it would be better to come up with our own tool.

The tool supports many common proof techniques used in crypto, especially in the code-based game-playing approach: bridging steps, failure events, hybrid arguments, eager sampling, and so on. The key philosophy in what we are trying to do is that the proofs you are doing are not that easy, so it is difficult to imagine that one day a single push-button tool will give fully automated proofs for any statement you give it. At this point it is very important for the user to have good control over the tool and to guide it in building the proof. So we take from both modes of doing proofs: we have something inspired by Coq and SSReflect, the proof assistant that has been used for the biggest examples of formalized mathematics, for example the Feit-Thompson theorem, a big chunk of mathematics, and before that the four-color theorem; and we have automation backends to SMT solvers and computer algebra systems, which are very useful.

How did we manage to build a tool able to deal with this? It is not so easy, and we were quite ignorant, but it turns out that in the end what we needed was to take inspiration from something that arises quite frequently in the analysis of Markov chains, for example to prove rapid mixing: probabilistic couplings. The basic idea of a probabilistic coupling is that you have two probabilistic processes and you try to establish a relationship between them, which is exactly the situation we are interested in. You can think of the two processes as having independent randomness, but if you want to say something meaningful about the relationship between them, you have to say something about how the randomness in the two processes ties together. That is exactly the idea of a coupling: instead of seeing the two probabilistic processes independently, you build a single probabilistic process that emulates the behavior of both, and every time you have a random sampling in each process, you build a single random sampling in the product process that ties together how the two samplings behave.
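Here is a minimal sketch of that idea in code, on a one-time-pad-flavored example of my own choosing: two uniform distributions are coupled along the bijection x -> x XOR m, and computing the two marginals recovers each original distribution.

```python
from fractions import Fraction

m = 0b10                      # a fixed "message"; any value works
support = range(4)            # the space {0,1}^2

# The coupling: a single distribution on pairs, supported on {(x, x XOR m)}.
coupling = {(x, x ^ m): Fraction(1, 4) for x in support}

# First and second marginals of the coupling.
mu1 = {a: sum(p for (x, _), p in coupling.items() if x == a) for a in support}
mu2 = {b: sum(p for (_, y), p in coupling.items() if y == b) for b in support}

# Both marginals are uniform: the coupling witnesses that XORing with m
# leaves the uniform distribution unchanged (a bridging step).
assert mu1 == {a: Fraction(1, 4) for a in support}
assert mu2 == {b: Fraction(1, 4) for b in support}
```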
Okay, so here is what the definition of a coupling is actually doing. You have two distributions; we are lucky that in the style of crypto we are looking at, we deal with discrete distributions, in fact sub-distributions, so things are a bit simpler. You have mu1 and mu2, and a coupling is just a distribution over the product space such that when you take the first marginal you get the first distribution, and when you take the second marginal you get the second distribution. Then there is the notion of an R-coupling: R is the relation you want to establish, and it imposes a further constraint on the coupling you have to come up with.

This notion is very useful for cryptography, because different choices of the relation R formalize different steps in crypto proofs. For a bridging step, R is equality, and you can show that the probability of the same event is the same in two different games. For a failure event, the relation says that the two results are equal provided some bad event F does not happen, and then you can bound the difference in probability of an event A in the two games by the probability of F. This is a slightly different statement from what you usually see in crypto papers: because we work with sub-distributions, we get the maximum of the two probabilities of F rather than just one of the two. And for a reduction, R is just an implication between the two winning events. So what I'm trying to show here is that couplings are a great tool to formalize a lot of the reasoning in crypto.
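In symbols, here is my reconstruction of the definitions just sketched, for discrete sub-distributions, with pi_1 and pi_2 the two projections:

```latex
% Coupling: \mu couples \mu_1 and \mu_2 iff its marginals are \mu_1, \mu_2.
\mu \in \mathcal{D}(A \times B) \text{ couples } \mu_1, \mu_2
  \;\iff\; \pi_1(\mu) = \mu_1 \;\wedge\; \pi_2(\mu) = \mu_2
% R-coupling: additionally, the support lies inside the relation R.
\qquad \mathrm{supp}(\mu) \subseteq R

% Consequences used in proofs (hedged rendering of the three cases above):
% bridging (R is equality):   \Pr_{G_1}[A] = \Pr_{G_2}[A]
% failure event F:            |\Pr_{G_1}[A] - \Pr_{G_2}[A]|
%                               \le \max(\Pr_{G_1}[F], \Pr_{G_2}[F])
% reduction (R implies win):  \Pr_{G_1}[\mathrm{win}_1] \le \Pr_{G_2}[\mathrm{win}_2]
```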
One of the questions we had: initially we did not realize that what we were doing was couplings; we had the definition but had not made the connection. Two years ago I was working with a student at the University of Pennsylvania, Justin Hsu, who is actually more interested in differential privacy, and he came up with this connection; I'm still very excited about it. So the question is whether recognizing that what we are doing is probabilistic coupling is actually useful. As far as I can see there has been some prior work using couplings in crypto, a few papers, but it is not widespread, and the way they use couplings is different, so I don't have the answer as to whether it is a very useful insight. One thing which is sure is that for us it was the key to building a scalable verification infrastructure: we don't have to reason about probabilities when we do the game hopping, we just establish the coupling and deal with probabilities later. It is also very useful for generalizations. We have been doing a lot of work on differential privacy, and observing the relationship with couplings, we very recently came up with the right notion of coupling for that setting, a notion of approximate coupling, and we proved a version of a theorem called Strassen's theorem, which really tells us we have the right notion.

Also, some years ago, with Dominique Unruh, we got very excited about extending our work to quantum crypto, and we thought we had done the hardest part of the work because we had found a great name for the tool: it would be called qEasyCrypt. The problem is that when we tried to extend lifting to the quantum setting, it was not so clear what the definition should be. But now we know there is this connection with probabilistic couplings and optimal transport, there is some work on quantum optimal transport, and people have been coming up with notions of coupling in that setting, so we can try to see whether that works. For us, I think the connection is very useful.

So what we do is a code-based approach to probabilistic couplings. The language is very close to what Mihir and Phil have in the 2006 paper, and we deal with the game-playing technique using a relational Hoare logic, where P and Q are relations on states. We don't talk about probabilities, so in this sense it looks very much like standard verification; once this is done, the results I showed you earlier give us probabilistic inequalities. To conclude, you just have to bundle together these probabilistic inequalities and also resolve some probability bounds for certain events, and for this we have a logic that helps us give upper bounds on the probability of events in a game. This is traditionally where a lot of the work on verification of probabilistic programs went, but there are still many limitations: for example, concentration bounds or reasoning about independence are not handled so well; this is an ongoing research area. Then, of course, we need to bound the execution time of the constructed adversaries. In principle this should not be too hard, but we have been very lazy and have not really built tool support for it, although there is an area of theoretical computer science called implicit complexity theory where people have developed type systems for this; we could try to use their work, but we never did.

Okay, so very quickly, without going into much detail, this is what the proof rules look like. You have a big program, you apply some rules, and you get smaller programs; as in Hoare logic or program verification, you build your proof backwards. We have a rule for random assignment, which is where the magic goes: what we are doing there is building a special kind of coupling, which we call a bijective coupling, and just with this we can go a long way.

We have done a lot of examples. One of the latest is building a verified platform for secure two-party computation based on garbled circuits; we had a small project with NIST on verifying indifferentiability from a random oracle for SHA-3; and my colleagues recently completed a proof of privacy for an e-voting system. Doing these proofs, mechanizing them, we found some subtle points. On the other hand, it is true that these interactive tools are time-consuming and difficult to use. Back in 2011, when our paper was accepted, we said this was "for the working cryptographer"; I have to say we were a bit too optimistic. If a working cryptographer starts using our tool at this point, maybe they will stop being a working cryptographer fairly soon, which was not our intention.
But my friend Manuel Barbosa keeps suggesting there could also be a lightweight approach, where you use our tool just to write your probabilistic experiments and the sequence of games, without doing the proofs. This would already be very useful: there would be no type-checking errors in your programs, there would be some structure, and you would catch some bugs early. So that would be valuable, but in the long run what you want is more abstraction and more automation.

The problem with automation is that it is somewhat problem-specific. We have done a number of works on highly automated proofs, and I will go through this quickly, but the basic idea is that the proofs we do in relational Hoare logic are at the wrong level of abstraction. When cryptographers do proofs they use high-level principles, and this is what should be automated, because otherwise people get lost trying to use our tool. I think Jonathan Katz had a very nice saying about EasyCrypt, that there was some kind of impedance mismatch when you tried to use the tool. So it is really our job to go up to the level of abstraction at which you work. A basic observation is that many of the principles being used consist of two parts: first coming up with some information, which is where the smartness comes in, and then checking that it is the right information. For example, for a reduction, you give a construction of the adversary, and then one has to check that it really simulates well. The second part can be automated in a principled way, and for the first part you use heuristics. This works pretty well in practice: we built a tool for Diffie-Hellman-based cryptography, for example, and applied it to many examples in pairing-based cryptography. One thing that is very hard is to come up with a sufficiently rich set of high-level rules, and that is something one could work on. But something I find very exciting is that sometimes, instead of heuristics, people have been building decision procedures, that is, algorithms that will return the right answer if it exists. There was some work by Utla and Roy in 2012, and there was also a paper at Crypto last year by Carmer and Rosulek; in fact Crypto last year had, for the first time, a special session on automated tools. We were not in that session, but it was very good news for us.

We also did a lot of work on automated proofs in the random oracle model. One of the first examples we did was formalizing OAEP; it took us six months, which is an enormous amount of time, and then we tried to understand whether we could automate this. We built a tool called ZooCrypt that was able to do it. What we did there, and I think this is a general principle, is that it is very good to build two tools: one very good at finding attacks, and one very good at finding proofs. If you are going to look at lots of schemes, you run the attack-finding tool first, because most of the candidates you come up with are junk, and then on the ones that are left you run the proof search.
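As a sketch of that workflow, with find_attack and find_proof as hypothetical stand-ins for the two analyses (not real ZooCrypt APIs):

```python
def triage(schemes, find_attack, find_proof):
    """Run the cheap attack search first, then proof search on survivors."""
    proven, broken, gray = [], [], []
    for s in schemes:
        if find_attack(s):        # most machine-generated candidates are junk
            broken.append(s)
        elif find_proof(s):       # expensive proof search, only on survivors
            proven.append(s)
        else:
            gray.append(s)        # the gray zone: left for humans
    return proven, broken, gray
```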
We had very good coverage rates: for CPA security the coverage was almost full; it was less good for CCA security. Then you have this gray zone, and for the gray zone we had a very nice oracle called David Pointcheval, and it worked very well: he actually came up with a new scheme that was on our gray list, called ZAEP, for which he could prove CCA security for RSA with small exponents. I have to say that for ZAEP the proof was mechanized using EasyCrypt, but all the smartness came from David, not from us. The last thing, which I won't say too much about, is that we also worked on automated proofs in the generic group model, where we got pretty nice results; the latest work, with some people at ENS, tries to apply this to attribute-based encryption. So there are lots of opportunities to get automation in different models.

Okay, so let me now go to the second part, about side channels. We have been pursuing two lines of work: one on constant-time cryptography and one on masked implementations. It is a well-known fact that when you want to break crypto, you don't try to break the math, you try the implementation, and there is plenty of evidence that timing attacks can be a disaster: they help you recover the key, and more or less they work remotely. So you had really better not have stupid timing attacks in your implementation. The gold standard for protecting, to the best possible extent, against cache-based timing attacks is cryptographic constant time, which requires that the control flow and the memory accesses be independent of secrets. It looks like a pretty simple thing to achieve, but it is actually very hard. We have a funny story about this: while devising tools for checking constant time about a year ago, we looked at the s2n implementation of TLS by Amazon and found a timing attack there. They told us, "guys, you are too late, these guys in London, Martin Albrecht and Kenny Paterson, already found an attack, and we fixed it"; but it turned out that what we attacked was their fixed implementation. So if you are a programming-languages person and you say "constant time is not such an interesting property, it is very easy to achieve", it turns out it is not, and it is very useful to have tools.

More or less, you can get a great tool by being extremely lazy. We realized that there is work done in the context of compiler verification, by Zaks and Pnueli in 2008, tailored to checking that compiler optimizations are correct, and it turns out that what they propose is sound and relatively complete for verifying program-counter security. The way it works is that you start from one program and you build another program that emulates two runs of it: you run the program twice in lockstep, and every time you reach a branching statement you check that the two executions take the same branch, and every time you do a memory access you check that it is made at the same location. This is beautiful: it is sound and relatively complete, meaning that if the transformed program does not raise any assertion failure, then the original program is constant time, and conversely.
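Here is a hand-made miniature of that product construction, on a toy early-exit comparison of my own choosing, not from the talk: the product runs two copies with the same public input and arbitrary secrets, and asserts that every branch condition agrees across the two runs.

```python
def compare(secret, guess):
    # Original code: early exit makes running time depend on the secret.
    for i in range(len(guess)):
        if secret[i] != guess[i]:
            return False
    return True

def product_compare(sec1, sec2, guess):
    # Two runs of compare() in lockstep, sharing the public input `guess`.
    # An assertion failure means control flow can diverge depending on the
    # secret, i.e. compare() is not constant time.
    for i in range(len(guess)):
        c1, c2 = sec1[i] != guess[i], sec2[i] != guess[i]
        assert c1 == c2, "branch depends on a secret"
        if c1:
            return False
    return True

product_compare(b"aa", b"ab", b"ab")  # raises: compare() is not constant time
```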
The approach supports advanced notions of constant time, for example allowing some public outputs, and with a very thin layer of code we were able to implement it on top of an existing verification framework for LLVM called SMACK and apply it to many libraries, which was really nice. We have ongoing work trying to extend this to vector instructions, and we have also applied it to SUPERCOP. But because in SUPERCOP you don't know, or at least we don't know, whether the implementations are meant to be constant time or not, it would be nice to have counterexample generation: if our tool says an implementation is not constant time, we would like to produce concrete values on which we can run it and show that it is not constant time. That is what we are trying to do.

We are also doing a lot of work on differential power analysis, which I got extremely excited about as well; well, basically I get excited about everything, so I should stop saying it. There is a lot of work, some of it in the CHES community, and you also get a lot of papers here, on building masked implementations to make programs resistant against DPA. There are two models: the threshold probing model, introduced by Ishai, Sahai, and Wagner in 2003, which is a beautiful theoretical model for analyzing masked implementations, and the more practical noisy leakage model, introduced later. There is a beautiful result at Eurocrypt 2014 showing that the two models are equivalent, so you get the best of both worlds: you have the theoretical model, which you can use for verification, and you get good guarantees.

So we decided to try to understand whether we could apply our techniques to masked implementations in the threshold probing model. We did not get the idea just by ourselves, as there has been some prior work: there is a very nice paper at CHES 2012 that uses language-based techniques to analyze masked implementations; there was a follow-up the year after with a tool called Sleuth, which I believe was also presented at CHES 2013; and there were formal-methods people who published at CAV 2014. This was very exciting work for us, because we realized that what masking security delivers is exactly an instance of something people have been considering in our community. Their work was limited to low orders and does not compose well, so we wanted to see whether we could do better by using more advanced techniques.

Here is the idea of probing security. I'm not going to go into the details, but very often cryptographic notions of security are stated in a simulation-based style. We don't state our notions that way; we state them as what is called 2-safety, meaning we consider two executions of the program, we assume the inputs are related, and we look at the relationship between the outputs. It looks very different, but in practice it isn't. To convince yourself, consider this very simplified case: you have a function f that takes two arguments. A cryptographer would say: there exists a function g that only takes the second argument, such that for every two elements a1 and a2, f(a1, a2) = g(a2). For us, we think in terms of two runs: you give us a1, a1', and a2, and f(a1, a2) and f(a1', a2) have the same result.
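Spelled out, the toy equivalence relating the simulation-based statement to its two-run, 2-safety counterpart:

```latex
\exists g.\ \forall a_1, a_2.\; f(a_1, a_2) = g(a_2)
\quad\iff\quad
\forall a_1, a_1', a_2.\; f(a_1, a_2) = f(a_1', a_2)
```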
So these are two different formulations, but they end up being equivalent, and this is what allows us to take your definitions and move them to our world. Using this, we built a tool based on probabilistic non-interference. The basic idea is that for masking security you have to look at every possible set of t observations, and the number of such sets grows very, very fast, so when you build a tool you have to find a way to deal with lots of these sets at the same time. What we did is that instead of looking at individual sets of t observations, we looked at very large sets satisfying a certain criterion, while making sure we were still covering everything. In theory there is no reason why this should run better than verifying all the sets, but in practice it worked much better, and we were able to analyze a lot of implementations. For example, for this S-box with third-order masking, you would have to verify probabilistic non-interference two billion times, which is not a very good idea; our technique does much better.

The last thing we did, presented at CCS last year, was building full masked implementations. The first work looked at the masking security of one gadget, but what you really want is masking security for a full implementation. So we introduced a new security notion which is compositional; I will show on my next slides that there is a problem with composing masked gadgets, and we have a new notion that makes things compose, meaning you can analyze small components and still be sure the full implementation is fine. This is fully automated: we have a type system that ensures the implementations are secure. One nice thing about having full automation and a type system concerns refreshing gadgets, which are something very costly in masked implementations: refreshing gadgets don't change the functionality, but they increase security, and this has a cost. With a type system, you can let the type system tell you "you should put a refreshing gadget here" or "you don't need one there", so you get much more efficient implementations. We used it to mask a lot of implementations, AES, Keccak, Simon, Speck, and so on, and the generated code is reasonably fast; for example, if you mask AES at order seven, it is about a hundred times slower than an unmasked implementation. I don't know whether people are interested in masking at order seven, but we have a tool where we just give the order we want, so we tried it.
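To make "gadget" and "refreshing" concrete, here is a minimal first-order Boolean-masking sketch in the ISW style; the share ordering and the helper names are mine, and a real tool would verify the probing security of exactly this kind of code.

```python
import secrets

def share(x):
    # Split bit x into 2 shares: x = s[0] XOR s[1], with s[0] uniform.
    r = secrets.randbits(1)
    return [r, x ^ r]

def unshare(s):
    return s[0] ^ s[1]

def refresh(a):
    # Refreshing gadget: fresh randomness, same underlying value.
    r = secrets.randbits(1)
    return [a[0] ^ r, a[1] ^ r]

def masked_and(a, b):
    # First-order ISW multiplication: computes shares of (a AND b) without
    # any single intermediate value revealing both unmasked inputs.
    r = secrets.randbits(1)
    c0 = (a[0] & b[0]) ^ r
    c1 = (a[1] & b[1]) ^ (r ^ (a[0] & b[1]) ^ (a[1] & b[0]))
    return [c0, c1]

# Functionality check: the gadget implements AND on the unmasked values.
for x in (0, 1):
    for y in (0, 1):
        assert unshare(masked_and(share(x), share(y))) == x & y
```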
I don't think I still have 20 minutes left, probably more like 10, so let me go very quickly over the problem with composition, using the simulation-based notion. You have a composition of gadgets, and the adversary makes at most t observations, split across the different gadgets. You know that each gadget achieves probing security, and you would like to show that the composition achieves probing security. But with the standard definition, you essentially have to carry all these observation counts back up through the simulation-based proof: you have t3 observations here, and if you just use the fact that the first gadget is non-interfering, you get t2 + t3 here, and when you arrive at t1 you get t1 + t2 + 2t3, so you are already unable to proceed. This observation is not ours: when you take non-interference, or masking security as standardly stated, composition does not work well.

What we propose is a notion of strong non-interference. The basic idea is to distinguish between internal observations and output observations, and in the notion of simulation you need only as many inputs as the number of internal observations. You can think of a strongly non-interfering gadget as a kind of barrier: everything that has been observed up to that point is lost once you cross it. Fortunately, this is not a completely crazy notion, because a lot of gadgets from the literature, not all of them obviously, already achieve strong non-interference. So once you have strong non-interference, take the same setting and put in a refreshing gadget which is strongly non-interfering, so that only its internal observations count. Going up, the t3 observations give you t2 + t3 here, but at the refreshing gadget you get only tR, because the t3 no longer count, thanks to strong non-interference. When you arrive at a1 you get t1 + t2 + t3 + tR, and at the top you get t0 + t1 + t2 + t3 + tR, which is at most t, which is what you wanted, so you are done. This is how you achieve compositionality with strong non-interference.
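To pin down the two notions used in this argument, here is my hedged reconstruction of the definitions, where t is the masking order and "simulatable" means simulatable without knowing the secrets:

```latex
% t-NI (plain non-interference):
\text{$t$-NI:}\;\; \text{any } d \le t \text{ probes are simulatable from at most }
  d \text{ shares of each input}
% t-SNI (strong non-interference): probes on output shares cost nothing on
% the inputs, which is what makes an SNI gadget act as the "barrier" above.
\text{$t$-SNI:}\;\; \text{any } d_{\mathrm{int}} + d_{\mathrm{out}} \le t
  \text{ probes (internal/output) are simulatable from at most }
  d_{\mathrm{int}} \text{ shares of each input}
```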
I think this was very cool: we have this nice notion, and this is one of the examples where I think we can advance the state of the art ourselves; I hope people like strong non-interference. That was very nice for us: we don't feel like donkeys following smart cryptographers for 15 years, maybe we have something to say.

There is further work to be done, but in general one more thing that makes me very excited is this: if you are a programming-languages person, really the first thing you should have been looking at is not reductionist proofs but information-theoretic security; it is the closest to what we know how to do, and I believe there are a lot of opportunities to apply our techniques there. For example, it would be interesting to look at MPC and see whether our techniques apply; I think there is a lot of potential. Another thing I find exciting is that our techniques also apply to active attacks, like fault injection. In software engineering, people look at the problem of program repair: you have a program that doesn't do what you want, and you have automated methods to fix it. Fault injection can be seen as adversarial program repair: you have a program, you cannot break it, so you inject faults so that you can break it. You can transfer this program-repair machinery to the setting of crypto and come up with new and interesting attacks.

So, to wrap up: what we have been doing, or what we hope we have been doing, is building useful foundations and tools for achieving high-assurance crypto, looking at provable security and practical crypto, but also at reducing the gap between the two. There are a lot of very exciting directions, so I hope we still have fun for another 15 years. Directions I'm very interested in include adding more automation and looking at different application domains. With a number of people we are also carrying these methods to high-speed implementations: there is this idea of developing a programming language called Jasmin, which looks very much like qhasm but is certified, and that would be very interesting. I already mentioned language-based methods for information-theoretic security. And, though I didn't talk much about it, I also think automated synthesis of crypto constructions is something very exciting, and there has been some great work on it: there was a CCS paper, preceded by an earlier paper by the same group at Maryland, that looked at authenticated encryption and came up with lots of great constructions fully automatically, which I found amazing; I think it got the best paper award at CCS. There is also the Crypto paper from last year which I already mentioned. And I'm very hopeful we can do something for quantum cryptography, because we have a great name for the tool; now we just have to do the research. Thanks for your time.

We have time for questions. Question: if I wanted to verify my side-channel-protected implementation using your tool maskVerif, what would the complexity be for me as a cryptographer? Do I have to implement the algorithm in a special programming language, and what is the learning curve? Answer: right now, and I don't know whether this is the best choice, our tool works on C programs. It's probably not the best choice if you want to do something industrial, but it is written in a comfortable subset of C, and it should not be too difficult to write programs in that fragment. We have plans to slowly extend the language, but I don't think it is extremely painful; we have much more painful tools to use, if you are interested. Question: when you talk about the relational decomposition, do you include the platform, ARM or x86? You were saying the platform was very difficult to include; is it included? Answer: if you mean the verification work in this particular example, the MEE-CBC one, we did not. We generate assembly-level code, and our adversary is just an adversary that has access to the trace; for us the trace is a sequence of program points and memory accesses. In this particular work we don't look at justifying this adversary model, but we have independent work with colleagues in Uruguay, and I forgot to mention that some of this work is also with people in Uruguay, where we show that this model is meaningful for a certain kind of virtualization platform. This has been formalized
in the Coq proof assistant, building on top of CompCert, and it is quite a big development.