So the first paper is on cover and decomposition index calculus on elliptic curves made practical, with an application to a previously unreachable curve over F_{p^6}, by Antoine Joux and Vanessa Vitse. So this is quite a long title, sorry about that.

During this talk I am going to present a new practical attack on the elliptic curve discrete logarithm problem, and give an application to a curve defined over a sextic extension, to see how efficient this attack can be. Let me first begin with some reminders about the ECDLP and the major existing attacks. The definition of the DLP is the following: you are given a group G and two elements g and h in this group, and the goal is to find an integer x, when it exists, such that h is equal to g to the x.

The difficulty of this problem is clearly related to the group G. If you do not have any information on this group, you can perform what we call a generic attack, which means that you consider the group law as a black box; then the best complexity you can hope for is in the square root of the size of the group G. If you consider a specific instantiation of the group G, you may have more efficient attacks. For example, if G is a subgroup of the multiplicative group of a finite field, you can perform an index calculus method, which has a subexponential complexity in L(1/3), where the function L interpolates between polynomial complexity and exponential complexity. Another example is to take a subgroup of the Jacobian of a curve defined over a finite field F_q: again, you can use an index calculus method that is faster than the generic attacks when the genus g of the curve is greater than two.

For elliptic curves the situation is a bit different. I state here the discrete logarithm problem in additive notation, and I give a sketch of the group law on this
figure here. So you can see that this group law is quite easy to compute: you just use equations of lines. But it is sufficiently intricate that the DLP is very hard; indeed, you usually only have generic attacks against this problem.

For the definition field of the curve you have a choice. You can take a prime field: in this context you have excellent security, but the difficulty can be the arithmetic; if you want faster arithmetic, it might be difficult in this context to implement it efficiently in hardware. So if you want to make things faster, you might want to consider extension fields, especially when p is equal to two or fits into a machine word. However, you have to be aware that in this context these curves are potentially vulnerable to index calculus methods.

As we have seen in these examples, index calculus methods are the best way to try to attack the ECDLP faster than generic attacks, so I sketch here the basic outline of this method, since it will be very important for the rest of my talk. The idea is to first define a factor base, which is composed of some points of the group G. Then you have two major stages. The first one is the relation search, where you consider random combinations of the element G that generates the group and of the target point H whose discrete logarithm you want to compute, and you try to decompose them into the factor base. Once you have collected enough relations of this form, the idea is to combine them so as to cancel the right-hand side, and then deduce the DL of H in base G.
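To make this relation-combination step concrete, here is a toy sketch in Python. It works in the additive group Z/101Z, where the DLP is of course trivial; the point is only to show the bookkeeping. All the relations, factor-base elements and coefficients below are invented for the example.

```python
# Toy illustration of the "combine relations" step of index calculus.
# Relations of the form
#     a_i*G + b_i*H = c_i1*F1 + c_i2*F2   (F1, F2: factor base)
# are stacked, and a vector v in the left kernel of the (c_ij) block
# cancels the factor-base side, leaving (v.a)*G + (v.b)*H = 0, hence
# x = -(v.a)/(v.b) mod 101. All numbers are made up for the example.

N = 101                          # prime group order
G, x_secret = 5, 17              # H = x*G; x_secret is what we recover
H = x_secret * G % N             # = 85

F1, F2 = 2, 3                    # factor base
A = [1, 3, 4]                    # coefficients of G in the relations
B = [2, 1, 5]                    # coefficients of H
C = [[1, 24], [2, 32], [4, 11]]  # factor-base coefficients (checked by hand)

def nullspace_vector(M, p):
    """One nonzero v with M v = 0 mod p (assumes more columns than rows)."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    pivots, r = {}, 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if M[i][c] % p), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        inv = pow(M[r][c], -1, p)               # modular inverse (Python 3.8+)
        M[r] = [e * inv % p for e in M[r]]
        for i in range(rows):
            if i != r and M[i][c] % p:
                f = M[i][c]
                M[i] = [(e - f * M[r][j]) % p for j, e in enumerate(M[i])]
        pivots[c], r = r, r + 1
    free = next(c for c in range(cols) if c not in pivots)
    v = [0] * cols
    v[free] = 1
    for c, row_i in pivots.items():
        v[c] = -M[row_i][free] % p
    return v

# Left kernel of C = nullspace of C transposed.
Ct = [[C[i][j] for i in range(len(C))] for j in range(len(C[0]))]
v = nullspace_vector(Ct, N)
va = sum(vi * ai for vi, ai in zip(v, A)) % N
vb = sum(vi * bi for vi, bi in zip(v, B)) % N
x = -va * pow(vb, -1, N) % N     # (v.a)G + (v.b)H = 0  =>  x = -v.a / v.b
```

In the real attack the matrix has millions of rows and the kernel vector has to be found with sparse techniques, which is exactly the linear algebra stage discussed next.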
This can be reformulated by saying that, in fact, we are computing a vector in the kernel of the matrix formed by the coefficients of these relations. So the second stage of the index calculus method is the linear algebra step, where we try to compute an element in the kernel of a huge matrix.

There are two difficulties when you try to run such an index calculus method. The first one is on the practical side, with the linear algebra: the matrices involved are very, very large, usually with millions of columns. Fortunately they are very sparse, with usually only ten or so non-zero coefficients per row, so we can use sparse linear algebra algorithms. The difficulty is to distribute these algorithms: whether you use the Lanczos or the Wiedemann algorithm, they are usually hard to distribute. The second difficulty is a more theoretical one and concerns the search for relations. For example, if you consider an elliptic curve defined over a prime field, we do not know any method yet to compute the relations. If you consider an elliptic curve defined over an extension field, you have two existing methods in this context. The first one consists of transferring the problem to another group, the group of F_q-rational points of the Jacobian of a curve C: you use a Weil descent technique to transfer the DLP to this group and then perform an index calculus method there. The second method is to try a direct decomposition on the elliptic curve, using Gaudry and Diem's approach. As they are really important for this talk, I will detail these two methods now.

The first one, Weil descent, consists of first taking the Weil restriction of the elliptic curve E, which is defined over an extension field F_{q^n}. This is just the abelian variety whose equations are obtained by considering the equation of E over F_{q^n} seen as a vector space over F_q. So you have n equations defining this Weil restriction, and then what we do is consider a curve C which is included in this Weil restriction and which is defined over F_q. Somehow this inclusion is equivalent to saying that we are considering a morphism from the F_{q^n}-rational points of C to the F_{q^n}-rational points of E. Once you have this cover map, you do the transfer by considering the following applications: first the pullback of the morphism, to go to the Jacobian of C defined over F_{q^n}, and then the trace map with respect to the extension F_{q^n} over F_q, to transfer the DL into the Jacobian of C defined over F_q. Then you perform an index calculus method on this last group, and you can do this in two ways: either you consider a hyperelliptic curve C with a small genus and you can use Gaudry's method, or the curve has a small-degree plane model and you can use Diem's method. In any case you need a curve with a small genus g, and the main difficulty in this approach is to find such a curve.

To do that we can use the GHS construction, which is due to Gaudry, Hess and Smart in the binary case, and which has been generalized by Diem to the odd-characteristic case. The idea of this method is, given an elliptic curve E defined over F_{q^n} and an equation of this curve, to construct the curve C defined over F_q and the corresponding morphism from C to E. I will not detail the method since it is quite complicated, but I would like to mention that for most elliptic curves the genus you obtain with this construction is of the order of 2^n.
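Written out, and in the notation just used, the Weil-descent transfer composes the two maps below. This is only a sketch of the idea, with π denoting the cover map from C to E given by the construction:

```latex
% Transfer of the DLP from E to the Jacobian of C (sketch).
% pi : C -> E is the cover coming from the GHS construction.
\[
  E(\mathbb{F}_{q^n}) \xrightarrow{\ \pi^{*}\ }
  \operatorname{Jac}_{C}(\mathbb{F}_{q^n})
  \xrightarrow{\ \operatorname{Tr}_{\mathbb{F}_{q^n}/\mathbb{F}_q}\ }
  \operatorname{Jac}_{C}(\mathbb{F}_q)
\]
% A DLP instance (P, xP) on E is mapped to (T(P), x T(P)) where
% T is the composition of the trace with the pullback, so the
% logarithm x is preserved as long as T does not annihilate the
% subgroup generated by P.
```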
So clearly the index calculus methods that you can perform on the Jacobian of C are usually slower than the direct generic attacks performed on the elliptic curve. You can nevertheless use an isogeny walk to extend the reach of this attack, a method proposed by Galbraith, Hess and Smart, but still this approach is usually efficient only for a few fields.

If this does not work, you can try to perform a decomposition attack directly on the elliptic curve, as was proposed by Gaudry and Diem. Here you do not do any transfer, and since the method also works in the more general case of Jacobians of curves, I will present it in the context of the Jacobian of a hyperelliptic curve H defined over F_{q^n}; usually you apply this to elliptic curves, but I present it in the hyperelliptic setting. The idea here is to define the factor base by considering the divisors of the Jacobian of H over F_{q^n} which have this form: they are equivalent to a point Q whose x-coordinate lies in F_q. In the elliptic case, this means that you are considering the points having an x-coordinate in F_q. Then you perform the relation search step.
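To make this factor base concrete, here is a small Python sketch for a toy curve over a quadratic extension, the simplest case n = 2. The prime p, the non-square t and the curve coefficients a, b are all invented for illustration; the factor base consists of the points whose x-coordinate lies in the subfield F_p.

```python
# Toy factor base for a decomposition attack: curve over F_{p^2},
# factor base = points with x-coordinate in the subfield F_p.
# Elements of F_{p^2} = F_p(w), w^2 = t, are pairs (u0, u1) = u0 + u1*w.
# All parameters below are made up for the example.

p, t = 11, 2                      # t = 2 is a non-square mod 11

def fadd(u, v):
    return ((u[0] + v[0]) % p, (u[1] + v[1]) % p)

def fmul(u, v):                   # (u0 + u1 w)(v0 + v1 w), using w^2 = t
    return ((u[0] * v[0] + t * u[1] * v[1]) % p,
            (u[0] * v[1] + u[1] * v[0]) % p)

def fpow(u, e):                   # square-and-multiply in F_{p^2}
    r = (1, 0)
    while e:
        if e & 1:
            r = fmul(r, u)
        u, e = fmul(u, u), e >> 1
    return r

def is_square(u):                 # Euler criterion: u^((p^2 - 1)/2) == 1
    return u == (0, 0) or fpow(u, (p * p - 1) // 2) == (1, 0)

a, b = (3, 1), (5, 2)             # E: y^2 = x^3 + a*x + b over F_{p^2}

def rhs(x):                       # x^3 + a*x + b
    return fadd(fmul(fmul(x, x), x), fadd(fmul(a, x), b))

# Factor base: x in the subfield F_p such that rhs(x) is a square,
# i.e. such that (x, y) is an affine point of E for some y in F_{p^2}.
factor_base_x = [x0 for x0 in range(p) if is_square(rhs((x0, 0)))]
```

Heuristically about half of the p candidate abscissas survive the square test, so the factor base has roughly p points (counting the two opposite values of y), which matches the size of about q stated in the talk.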
So you try to decompose a given arbitrary divisor as a sum of ng elements of the factor base, where ng here is precisely the dimension of the Weil restriction of the Jacobian of H. If you look at the asymptotic complexity, you have a relation search step whose complexity is in O(q), and a linear algebra step whose complexity is in O(q^2). You can rebalance things a little by using the double large prime variation, and then you get an asymptotic complexity in q^(2 - 2/ng) as q goes to infinity with n and g fixed.

The advantage of this method is that all curves are equally weak under this attack, which is not the case for the GHS attack. But the decomposition here is the main problem, because each time you try to decompose a given divisor, you have to solve a multivariate polynomial system over a finite field, which is quite difficult in practice. So let us look a bit at the details of this decomposition step. To try to get a decomposition of D into ng elements of the factor base, we use the approach of Nagao, which consists of looking instead for a function f whose divisor is equal to this expression, which means that we are trying to find f in the Riemann-Roch space defined by this divisor: ng times the point at infinity minus D. This reformulation can in fact be seen as a parametrization, by the Riemann-Roch space, of the set of decompositions of the divisor D. Once you try to find the function f, you can see that, as you need the x-coordinates of the points Q to lie in F_q, you will have to solve a quadratic polynomial system defined over F_q, composed of this number of variables and this number of equations, just expressing the fact that the x-coordinates of the points Q are in F_q. This is clearly a zero-dimensional quadratic polynomial system, and we can estimate the complexity of its resolution by saying that it is at least polynomial
in the degree of the zero-dimensional ideal, which is exponential in ng. So in practice the resolution of this polynomial system is really difficult, and mainly we are able to get a resolution only when n and g are smaller than 3, or possibly, if we consider the elliptic case using summation polynomials, when n is less than 5. And as you need to collect about q relations for the linear algebra step, you will have to compute about ng! times q decompositions, so to solve that many polynomial systems, or even more if you perform the double large prime variation I mentioned before. So the bottleneck of this approach is clearly that the relation search is too slow for a practical resolution of the DLP.

During the second part of the talk, I will present our new index calculus method. The first ingredient of this method is clearly to try to improve the situation with the relation search. What we do here is consider relations involving only elements of the factor base, so relations of this form. If you make some heuristics about how many such relations exist, you can see that there are about q^(m - ng)/m!, so we need to fix the parameter m to ng + 2 to have enough relations to perform the linear algebra step. This is in fact quite similar to what is done in the classical index calculus methods in finite fields, for example the number field sieve or the function field sieve; or, if you are familiar with Diem's index calculus method for small-degree plane curves, it is also the same idea. Of course, you have to modify the index calculus method in this context: during the linear algebra step, what you compute is just the DL of the elements of the factor base, up to a constant now. So to compute an individual discrete log, you will have to perform at least two Nagao-style decompositions to fix this constant. So
this is what we call the descent phase.

If we now compare the decomposition search in Nagao's approach and in our variant, we see that in Nagao's approach we have many quadratic polynomial systems to solve, which are zero-dimensional, whereas in our variant we have only one polynomial system, which is underdetermined, with two more variables than the number of equations. What is better with our approach is that if you are able to compute a nice set of generators of the ideal of this polynomial system, you might be able to compute the relations faster. The idea is to compute a lexicographic Gröbner basis, hoping that the specialization of the two extra variables that we have yields an easy-to-solve system.

Things go very well when we consider the special case where the curve is defined over a quadratic extension field. We consider the odd-characteristic case here, so that we can define the extension as F_q(omega), where omega squared is a non-square. In this context, what is nice is that you have an additional structure on the equations: the polynomials involved at each decomposition are multi-homogeneous, in fact they are even bilinear. And if you consider the variables with respect to the first block of this bidegree, you get a one-dimensional variety. So in practice we modify the decomposition method by not specializing two variables, but instead specializing the first block of variables, so that all the remaining variables now lie in a one-dimensional vector space. This is really easy to solve, of course. We will see in a minute that we can further improve the situation by using a sieving technique. In practice this is much faster than Nagao's approach: for example, in the case where n is equal to 2 and the genus is equal to 3, on a 150-bit curve we are about 1000 times faster than Nagao's approach. Okay, so let's now detail a
bit the sieving technique. The point is that after the decomposition method, what you get is a polynomial whose roots are supposed to be the x-coordinates of the points Q in the decomposition of your divisor D. So you still need to find the roots of this polynomial and to test whether it splits. To do that, we in fact prefer to modify the decomposition stage: during the specialization of the first block of variables, we can express all the remaining variables in terms of the last unknown, lambda. The polynomial F is then a polynomial in the variables x and lambda; it has degree 2 in lambda and degree 2g + 2 in x, corresponding to the 2g + 2 points in the decomposition of D. So instead of specializing lambda, we do an enumeration in x over F_q and look at the corresponding values of lambda, which are easy to compute because F has degree 2 in lambda. What we then do is store in a table a counter associated to each value of lambda, and after the enumeration in x we look for the counters which have reached the value 2g + 2, because we know that for these values of lambda the polynomial F splits. This is a classical trick used in sieving techniques. In particular, this method is well adapted to the double large prime variation mentioned before, since in this context you only have to sieve with respect to the values of x corresponding to the small primes.

So this was the first ingredient, to improve the decomposition computation. The second ingredient is to use a combined attack, the GHS together with the decomposition attack. We consider the context of an elliptic curve defined over F_{q^n}, where the finite field is such that the GHS does not provide any nice covering curve: typically the genus of C is too large to perform an efficient index calculus method, and the degree n of the extension is too large, namely greater
than four, to allow a practical decomposition attack. So in this context we are a bit stuck. What we can do is to suppose that the degree n is composite: if n is composite, then we can apply both approaches. First the GHS on the sub-extension F_{q^n} over F_{q^d}, and then transfer the DL to the Jacobian of the curve C defined over F_{q^d}; then we perform the decomposition attack on this Jacobian of C, with respect this time to the base field F_q. Note that a composite degree is not artificial, since it has been proposed for optimal extension fields.

So we can have a look at the case where n is equal to 6, the sextic extension case, which occurs for optimal extension fields. Our target in this context is to consider a curve E which has a genus-3 hyperelliptic cover. This occurs directly for about q^4 curves, up to isomorphism, and if you allow yourself to use an isogeny walk, you can consider that it is true for almost all curves having a cardinality divisible by four. In this context, if you try to apply the GHS attack or the decomposition attack alone, you get something which is not efficient: the best you can get with the GHS attack is a genus-9 cover, and this occurs rarely, only for about q^3 curves, so index calculus methods in this context are usually slower; and as I said before, since the degree of the extension is six here, the direct decomposition will fail.

We can give some estimations, based on computations done with the Magma software, in the context where the curve E has a prime-order subgroup of one hundred and sixty bits. If you try to perform a generic attack against this curve, you have an asymptotic complexity which is in p^3, and if you try to extrapolate the timings that you can obtain in this context, you see that the global attack would take 50 trillion years.
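The orders of magnitude behind these estimates are easy to reproduce. The sketch below just converts the asymptotic exponents quoted in the talk into rough bit-counts for p of about 27 bits (so that p^6 has roughly 160 bits); constants and logarithmic factors are ignored, so the numbers are purely indicative.

```python
# Rough bit-complexity of the attacks discussed in the talk, for a
# curve over F_{p^6} with a ~160-bit prime-order subgroup, i.e. p of
# about 27 bits. Asymptotic exponents only; all constants ignored.

p_bits = 160 / 6                     # about 26.7 bits for p

attacks = {
    "generic (sqrt of group size)": 3,        # p^3
    "GHS transfer (best case)":     7 / 4,    # p^(7/4)
    "cover + decomposition":        5 / 3,    # p^(5/3)
}

estimates = {name: exp * p_bits for name, exp in attacks.items()}
for name, bits in estimates.items():
    print(f"{name}: about 2^{bits:.0f} group operations")
```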
So clearly unreachable. Now, if you try to perform a decomposition attack or the GHS attack, you have two choices. You can consider F_{p^2} as the base field: in this context, you can see that memory is the bottleneck, since you have to store about p^2 elements, so this is clearly unreachable with p of 27-bit size. Or you can try to use F_p as the base field: in this context, as I already said, the decomposition is intractable, and the GHS attack in the best case allows you to perform an attack with an asymptotic complexity in p^(7/4); in practice this would take about 2500 years, so this is not so bad, but it occurs very rarely. Now, you can try to perform our cover and decomposition attack. Here the asymptotic complexity is nicer: it is in p^(5/3), of course in the case where the curve admits the GHS hyperelliptic cover. The estimation here is about 750 years using the standard decomposition style, or even 300 years using the modified relation search. So this is already much better.

But these are clearly just estimations, so we can now have a concrete example, a real-size example I mean. We consider an elliptic curve which belongs to the family of curves admitting a genus-3 hyperelliptic cover, so it has this form, and we took a curve whose cardinality is four times a prime number of 148 bits. If you perform the GHS attack against this curve, you get a cover of genus 33: clearly unreachable. Now, with our method, we were able to get a complete resolution of the DLP in about one month.

Let me detail the timings here. During the relation search, we first have to compute the lexicographic Gröbner basis; this took about three seconds, really fast. Just to mention: if you try to compute the lexicographic Gröbner basis of a random quadratic system with the same number of equations and variables, you are not able to do the computation. So this is really fast.
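The counter-based sieve described a moment ago is easy to demonstrate on a toy example. In the sketch below, q, g and the bivariate polynomial F are all synthetic (F is built as a product of two explicit factors so that its degree-2 fibres in lambda are trivial to enumerate); only the mechanics of the counters reflect the talk.

```python
# Toy version of the sieving trick. The real F(x, lam) coming from the
# decomposition step has degree 2 in lam and degree 2g+2 in x; instead
# of specializing lam and factoring in x, we enumerate x over F_q,
# record the (at most two) matching lam values in a table of counters,
# and keep the lam whose counter reaches 2g+2: for those, F(x, lam)
# splits completely over F_q. The F used here is synthetic, not a
# real decomposition polynomial.

q = 31
g = 1                            # a split F then has 2g + 2 = 4 roots in x

# Synthetic F(x, lam) = (lam - x^2)(lam - x^2 - 1): degree 2 in lam,
# degree 4 in x, and for each x the fibre in lam is {x^2, x^2 + 1}.
def lam_roots(x):
    return [x * x % q, (x * x + 1) % q]

counters = [0] * q
for x in range(q):               # enumeration in x instead of lam
    for lam in lam_roots(x):
        counters[lam] += 1

# lam values for which F(., lam) splits into 2g+2 = 4 roots over F_q
split_lams = [lam for lam, c in enumerate(counters) if c == 2 * g + 2]
```

For example, counters[5] reaches 4 here because both 5 and 4 are non-zero squares modulo 31, so F(x, 5) has the four roots 6, 25, 2 and 29.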
Then you have the sieving step. Here we have computed all the possible relations, which means about p^2/(2 times 8!) relations, and this took about 62 hours using 124 cores. So, as already said, this is 1000 times faster than Nagao's approach. Then we have the linear algebra step. As we have many more relations than needed, we first use structured Gaussian elimination to reduce the number of columns by a factor of five; this took about 25 hours using 32 cores. Then comes the main step, the Lanczos algorithm step, which took about one month using 64 cores. At this point I would like to mention that we somehow cannot improve this step much more while using the Lanczos algorithm, because we have to broadcast about 200 megabytes of data at each round, which is somehow the bottleneck for parallelization. Maybe Antoine can say more on this during his talk in one hour. And finally, you still have to do the descent phase, where you have to compute an individual logarithm; this is really immediate, since it took about 14 seconds for one point.

To finish, I would like to see a bit how we can scale the timings obtained with Magma before, using all the experiments that we have done on different curves. We have three examples here: a curve of 136 bits, one of 142 bits, and the last case I have presented. You can see that the sieving time is each time about five times longer, and the Lanczos time is about three times longer, so that, extrapolating these values, we can say that completely breaking the DLP over a 160-bit curve would take about 200 CPU-years, so better than the one I announced on the previous slides. And this ends my talk. Thank you for your attention.

Thanks Vanessa for the great talk. Is there any question or any remark? We have a lot of time. All right, let's start here.

In most index calculus algorithms, finding additional discrete logs is much easier than finding the first one.
Is it also the case in your algorithm?

Could you repeat the beginning of the question, please?

In many index calculus algorithms, it is easier to find additional discrete logs after you have done the hard work and found the first discrete log and calculated...

Sorry... yeah, okay, okay, sorry, I didn't get the question.

Is there any other question? Nigel is waving, but anybody else? Nigel.

Hi. Very impressive timings. Do you have any feeling how this could be extended to characteristic two?

Yeah, actually it's a good question. So we have done some experiments as well in characteristic two, and this can also be applied in this context, but you have some specific situations in characteristic two, also with the summation polynomials. So we are currently working on this case to improve the situation; maybe direct decomposition is also a good approach in this context.

Any other questions? Please stand up and wave clearly, it's hard to see anybody. Okay, Vanessa, thanks again for the nice talk.