So, if this thing works... yes, it does. Thank you for the introduction. In fiction, in the 1960s, there was a James Bond film called From Russia with Love, a very appropriate title. Even better, the core of the scenario was about trying to recover a Lektor. The Lektor was a Soviet (at the time, not Russian) encryption device. There is a scene where James Bond listens to a description of this device: it has rotors and everything, all this old-fashioned stuff. And you have these cryptographers, from MI5 or MI6, I never remember which, who are trying to use this information to figure out how the Lektor works. Reality is a bit less impressive. Instead of sending agents overseas to have them listen to pretty KGB agents, you look at a PDF file. And in this PDF file, you have this kind of stuff. You just look at it, you squint for several months, and eventually, hopefully, you get some results. This is pi, the S-box of the two most recent Russian standards: a block cipher called Kuznyechik, which means "grasshopper", and a hash function called Streebog. Both use the same S-box. It operates on 8 bits, and it is a permutation. Its differential and linear properties are fine; they are a bit better than what you would expect from a random permutation on 8 bits. But what we don't know is how it was made. The only thing you need to implement an S-box is its lookup table, and that is the only information its designers provided. This obviously raises the question: how did they design it? So first, I'm going to give a brief reminder of what we knew about this S-box up to that point: some results I established with my then colleagues in Luxembourg, and also the tiny bits of information provided by the designers. Then I'm going to present what I claim to be its actual structure, that is, the one its designers put there on purpose. And finally, I will explain why I don't like that structure.
In the first place, we found this decomposition of pi. That was when I was in Luxembourg; I worked on this with Alex Biryukov and Aleksei Udovenko. Using techniques I'm not going to go into, we managed to rewrite the permutation in this way. It's ugly, let's not beat around the bush, it is very ugly, but it helps with hardware implementation. When we ended up with this first decomposition, we were mostly puzzled, also because this component has the worst differential uniformity you could imagine. This part I call t, and this part I call u; in what follows, when I talk about the TU decomposition, this is the decomposition I mean. We were not very happy with it, so we made a bit of a detour. We had a look at an S-box from Belarus and realized that it was built like a logarithm, and that it actually looked a lot like the Russian one. We also established that the Russian one could be written as a kind of logarithm: a discrete logarithm composed with a weird permutation which is extremely weak from a cryptographic standpoint. So it's not something you would expect from a random permutation, but it's really not satisfactory either. We knew there was something to be found, but we couldn't find it. We published this at ToSC as well, and in that ToSC paper we wrote this in our conclusion; the important parts are in red. We think it is more likely that each of these decompositions, so the TU decomposition and the log-based one, is a consequence of a stronger algebraic structure. Still, this master decomposition, from which the others would follow, remained elusive. At that time, we thought the only way we could recover it would be if the designers of these algorithms, so basically the Russian secret services, told us. The little information they did provide, they provided because they are trying to convince people to standardize these algorithms. Streebog is already an ISO standard.
Kuznyechik is being considered for standardization by ISO, and both of them are RFCs, so their designers do try to advertise them. In one such talk, one of the designers of Kuznyechik explained how they built their S-box. Here we have two successive slides; they are identical, except that on the second one a box is highlighted in red, which I think means it's the important part. It's about how they built their S-box. The first box, here and here, is about a possible way of building S-boxes by selecting them from known classes. This gives you the best cryptographic properties, meaning differential uniformity and linearity, but it has an obvious analytical structure. You can use, for instance, the finite field inversion, which is indeed a perfectly valid way of building S-boxes; that's how it's done in the AES. The second box, the one that turns red, is about a random search with a given limit on the parameters. It's not clear to me what this limit might be. Such S-boxes do not have optimal properties, they don't have the best differential and linear properties, but they also do not have a pronounced analytical structure, which, again, is true, and people do use such S-boxes for exactly those reasons. So there is nothing wrong here. The part I personally find quite funny is that they rejected the first option because it has an obvious analytical structure and preferred the second one because it does not have a pronounced analytical structure. Well, what I'm going to show is that their S-box has an extremely pronounced analytical structure. Cryptographers have tried to discuss this with the designers, because they do go to conferences sometimes. Markku-Juhani Saarinen and his co-author on WHIRLBOB, which was a CAESAR candidate based on Streebog, had a short discussion with the designers of these algorithms.
And the designers said that they had used some randomization involving various building blocks, and that this was seen as an effective countermeasure against yet-unknown attacks, which is coherent with the content of the previous slide. At ISO, what they said is that they did not use the TU decomposition to design their S-box, which I think is true; that their aim was to obtain the best possible differential and linear properties from some random search, which they do not explain; and they also said that before the SHA-3 competition, people didn't really care about the origin of parameters, so the Streebog designers didn't care either, and they don't feel too bad about the fact that they lost the generation algorithm. So at that point, they claim to have lost the generation algorithm, and that they don't really mind, because at the time they designed it, people didn't really care about the origin of parameters. This is not true. Even in 1985, people were already designing block ciphers with very careful explanations of the generation process for their S-boxes, so I'm a bit puzzled by this statement. Then, at a small workshop called CrossFyre, a Russian cryptographer presented Kuznyechik. Unfortunately for them, Maria was the session chair and could ask some questions about the design process. They avoided the question and eventually said that they had used the TU decomposition to design their S-box, which, no, they didn't, I think. So at that point, we have two decompositions we are not really happy with, some contradictory statements by the designers, and also some really wild statements by the designers, like this one; well, I'm supposed to be polite. So now I'm going to present what I found and what I described in this ToSC paper, which is what I think is the actual structure. Before I go into how I found it, just a bit of math.
The finite field with 2^(2m) elements can be written in this way, except for zero: every nonzero element is a power of a generator alpha of the multiplicative group. That's just the finite field logarithm. Because of this, you can partition the field as follows: you take the union, over all i, of alpha^i times the multiplicative group of the subfield with 2^m elements. This is a multiplicative partition. If you put aside the case i = 0, which, together with zero, gives the subfield, you get a description of the finite field as a union of vector spaces; well, not quite vector spaces, since these sets don't contain zero, but almost. Since the subfield is a vector space of dimension m inside the field, which has dimension 2m, you can also find another subspace of the same dimension such that the two together span the whole field. So you can also write the finite field as the union, over all these w's, of the additive cosets w + subfield. If you then put aside the elements where the subfield part is zero, you end up with a second partition of the field. What's important is that in both cases, you get a partition into one set of size 2^m and 2^m sets of size 2^m - 1, and everything is linear. This matters because pi interacts with these partitions. First, some words about how I actually found this structure. For reasons completely unrelated to any of this, I worked on an algorithm which looks for vector spaces in a set of elements. One nice application of this algorithm is that it lets you look for affine spaces of a given dimension that go through a permutation and are mapped to other affine spaces: you have an affine space in the input, and when you apply the permutation, you get an affine space in the output, and you can look for all of those. I was very happy, because I thought I had a nice way to test this algorithm: I knew that in pi, there was one such transition.
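Both partitions can be checked concretely. Below is a minimal sketch in Python using the small case m = 2, that is, GF(2^4) with subfield GF(2^2), rather than the GF(2^8)/GF(2^4) pair relevant to pi. The modulus x^4 + x + 1, the generator alpha = x, and the complement Lambda are choices made for this illustration, not anything taken from the standards.

```python
# Toy illustration of the two partitions: GF(2^4) with subfield GF(2^2).

MOD = 0b10011  # x^4 + x + 1, a primitive polynomial for GF(2^4)

def gf16_mul(a, b):
    """Carry-less multiplication modulo x^4 + x + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:
            a ^= MOD
        b >>= 1
    return r

# Powers of the generator alpha = x (= 2).
alpha_pow = [1]
for _ in range(14):
    alpha_pow.append(gf16_mul(alpha_pow[-1], 2))

# The subfield GF(4) = {0, 1, alpha^5, alpha^10} and its nonzero elements.
subfield = {0, 1, alpha_pow[5], alpha_pow[10]}
sub_star = subfield - {0}

# Multiplicative partition: the subfield, plus the cosets alpha^j * GF(4)*.
mult_parts = [subfield] + [
    {gf16_mul(alpha_pow[j], s) for s in sub_star} for j in range(1, 5)
]

# Additive partition: a complement Lambda of GF(4), plus the sets w + GF(4)*.
Lam = {0, 2, 8, 10}  # spanned by {x, x^3}; meets GF(4) only in 0
add_parts = [Lam] + [{w ^ s for s in sub_star} for w in sorted(Lam)]

# Both are partitions of GF(16): one part of size 2^m = 4,
# and 2^m = 4 parts of size 2^m - 1 = 3, just as in the talk.
for parts in (mult_parts, add_parts):
    assert sorted(len(p) for p in parts) == [3, 3, 3, 3, 4]
    assert set().union(*parts) == set(range(16))
```

The same construction scales to GF(2^8) with subfield GF(2^4): one part of size 16 and sixteen parts of size 15.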
If you set this branch to 0... this is a multiplexer, which selects the output of this component when this one is equal to 0, and of this one otherwise. So basically, when this input takes all possible values, here you get a vector space of dimension 4, and here a vector space of dimension 4. So I was expecting to find one such transition. But actually, I found two. By then I had completely forgotten about this box and had moved on with my life, but I shouldn't have. Now I was back working on pi. And what I realized is that this second pattern could be generalized by looking not at actual vector spaces, but at "almost" spaces with the 0 removed. If you fix a value here, then when this input takes all possible values except 0, you get a vector space at the top, because multiplication by a constant is linear, and at the bottom you get, well, an affine space. And there are 16 such weird transitions. So this is the situation: we have the finite field, and its image, which is the finite field itself; and you have one vector space which is sent to some affine space, and another vector space which is sent to another affine space. That's what I had found with my algorithm. And these two spaces together span the whole finite field, and the same holds here. When I realized this, I felt really stupid, because I had been looking at pi for three years at that point; I knew that this set and this set existed and interacted with the S-box, and I had never realized that they were the subfield. So when I saw it, I banged my head against the wall and then worked further. And then I realized that this other vector space in the input was a multiplicative coset of the subfield, and that in the output, you had something in direct sum with the subfield.
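The kind of search described above can be imitated naively for small permutations. This is not the vector-space search algorithm from the talk, which is far more clever; it is just a brute-force stand-in for 4-bit permutations, and `affine_transitions` is a hypothetical helper name.

```python
from itertools import combinations

def is_linear_subspace(S):
    """A set is a linear subspace iff it contains 0 and is closed under XOR."""
    return 0 in S and all(a ^ b in S for a in S for b in S)

def is_affine_subspace(S):
    """An affine subspace is a linear subspace translated by any of its points."""
    p = min(S)
    return is_linear_subspace({x ^ p for x in S})

def affine_transitions(perm, n=4):
    """Brute force: which 2-dimensional linear subspaces of F_2^n are
    mapped by `perm` to affine subspaces?"""
    found = set()
    for a, b in combinations(range(1, 1 << n), 2):
        V = frozenset({0, a, b, a ^ b})
        if len(V) == 4 and is_affine_subspace({perm[x] for x in V}):
            found.add(V)
    return found

# Sanity check on the identity permutation: every one of the 35
# two-dimensional subspaces of F_2^4 is trivially mapped to itself.
identity = list(range(16))
assert len(affine_transitions(identity)) == 35
```

For a random 4-bit permutation, far fewer such transitions survive, which is exactly why finding unexpected ones in pi is meaningful.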
And when you look at the other multiplicative cosets of the subfield in the input, and the additive cosets of the subfield in the output, pi maps one to the other. So pi maps the partition of the field into multiplicative cosets of the subfield to its partition into additive cosets of the subfield. The "random" S-box pi does this. Actually, you can write it in this way. This is what I called a TKlog, because the designers of Kuznyechik are the TC26, and it is a kind of logarithm. You need an affine function, a small permutation s of the exponents, the subfield and the generator, and then it works like this. I'm not going to go into too much detail. Such permutations, and pi in particular, always satisfy some set equalities, namely these. What is important is that there is a kind of separation property: if your input is in a given multiplicative coset, the output will always be in the same additive coset, because when you change j here, you change the j here, which only changes where you are inside that set; the additive coset itself doesn't change. At the same time, if you fix j and change the multiplicative coset, you stay at the same spot inside the additive coset; you just change which additive coset you are in. If s depended on i somehow, you would still have this coset-to-coset property; but that's not the case. So pi does not just map multiplicative cosets to additive cosets, it is even simpler than that. It could do this in a more complicated way; it doesn't. We can also prove, and I'm not going to go over that, that this TKlog structure explains both of the previous decompositions. The relationship between the TU decomposition and the log-based decomposition was extremely unclear to us; we were really puzzled by it, and this decomposition explains it. Now, why don't I like this? For that, I have to tell you about the work of someone else, namely Arnaud Bannier. My apologies for the pronunciation.
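To make the separation property concrete without reproducing the actual TKlog formula, here is a toy permutation of GF(2^4), built by hand so that multiplicative cosets of the subfield GF(2^2) go to additive cosets, with the coset index and the position inside the coset handled independently. All parameter choices are illustrative; this is not pi, only a permutation with the same kind of coset-to-coset behaviour.

```python
MOD = 0b10011  # x^4 + x + 1 defining GF(2^4); alpha = x is primitive

def mul(a, b):
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:
            a ^= MOD
        b >>= 1
    return r

alpha = [1]
for _ in range(14):
    alpha.append(mul(alpha[-1], 2))

subfield = [0, 1, 6, 7]   # GF(4) inside GF(16)
sub_star = [1, 6, 7]      # = {alpha^0, alpha^5, alpha^10}
Lam = [0, 2, 8, 10]       # a complement of GF(4): GF(16) = Lam (+) GF(4)

# Build the permutation:
#  - the subfield is sent to Lam (an arbitrary bijection),
#  - the element alpha^(j + 5k) of the j-th multiplicative coset
#    (j = 1..4, k = 0..2) is sent to Lam[j-1] XOR g(k): the coset index j
#    only selects the additive coset, and the position k only selects
#    the spot inside it -- the separation property from the talk.
pi = {}
for s, w in zip(subfield, Lam):
    pi[s] = w
g = sub_star  # bijection k -> GF(4)*
for j in range(1, 5):
    for k in range(3):
        pi[alpha[j + 5 * k]] = Lam[j - 1] ^ g[k]

assert sorted(pi) == sorted(pi.values()) == list(range(16))  # a permutation

# Every multiplicative coset alpha^j * GF(4)* lands inside a single
# additive coset w XOR GF(4).
cosets = [set(w ^ x for x in subfield) for w in Lam]
for j in range(1, 5):
    images = {pi[mul(alpha[j], s)] for s in sub_star}
    assert any(images <= c for c in cosets)
```

As in the talk, making the mapping of positions (here `g`) depend on the coset index `j` would preserve the coset-to-coset property while making the permutation more complicated; pi does not do that.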
In his PhD thesis, he introduced a way to build backdoors into a block cipher, generalizing earlier work by Kenny Paterson on imprimitive trapdoor S-boxes that you could put into the DES. What he proved is that if you want a backdoor where you have a partition of the input space into affine spaces and a partition of the output space into affine spaces, and you want this partition to be preserved, so that two elements in the same V_i always end up in the same W_i, then you need this kind of property at the S-box level: the S-box has to map a partition of the space into affine subspaces to a partition of the space into affine subspaces. If you draw it with boxes, as I always do, again with a T and a U, this is what you need. So if you want to build a backdoor of this specific type, you need an S-box which looks like this. In particular, written formally, it means you need an S-box which maps additive cosets of a subspace to additive cosets of a subspace. That's what he established: the S-box has to map additive cosets to additive cosets. That's not quite what we have here; pi maps multiplicative cosets to additive cosets. But it's not that simple either, because when you look specifically at Streebog, the linear layer interacts with both the additive cosets and the multiplicative cosets of the subfield. And that's something else the designers did not explain. Inside Streebog there is a small internal block cipher, and this internal block cipher uses a MixColumns-like operation, specified as a binary matrix. This is the specification. If you draw the binary matrix as a picture, this is what you get, and there are some obvious patterns here. It's actually just an 8-by-8 matrix of field elements. Why didn't they say so? I don't know, because there is nothing wrong with that.
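The "additive cosets to additive cosets" condition is easy to test mechanically for a lookup table. Below is a sketch of such a checker; the helper names are hypothetical, and the PRESENT S-box is used only as a convenient, well-known 4-bit example that does not have the property for the chosen subspace.

```python
def subspace_spanned(image_set):
    """If image_set is an affine subspace, return the linear subspace it
    is a coset of; otherwise return None."""
    p = min(image_set)
    D = {x ^ p for x in image_set}
    if all(a ^ b in D for a in D for b in D):
        return frozenset(D)
    return None

def preserves_cosets(sbox, V):
    """Bannier-style condition: does `sbox` map every additive coset of
    the linear subspace V to an additive coset of one common subspace W?
    Returns W if so, None otherwise."""
    n = len(sbox)
    seen, W = set(), None
    for u in range(n):
        if u in seen:
            continue
        coset = {u ^ v for v in V}
        seen |= coset
        D = subspace_spanned({sbox[x] for x in coset})
        if D is None or (W is not None and D != W):
            return None
        W = D
    return W

# The identity trivially preserves cosets of any subspace ...
identity = list(range(16))
assert preserves_cosets(identity, {0, 1, 2, 3}) == frozenset({0, 1, 2, 3})

# ... while e.g. the PRESENT S-box does not preserve cosets of {0,1,2,3}:
# its image of {0,1,2,3} is {0xC,5,6,0xB}, which is not an affine space.
PRESENT = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]
assert preserves_cosets(PRESENT, {0, 1, 2, 3}) is None
```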
What's funnier is that it's defined over the same field, with the same polynomial, as pi. So take a vector of this shape: just one element x, which is in the subfield, and everything else equal to 0. When you apply the binary matrix to this vector, and x loops over the subfield, each cell of the output vector iterates over a multiplicative coset of the subfield. So, two open problems, pretty obvious ones. First, how was L built? It's MDS, but what else can we say? I mean, this coefficient is equal to this one, and this one is equal to this one; can that tell us what kind of structure it has? And obviously, can we leverage these properties to actually attack Streebog, or Kuznyechik? In the case of Kuznyechik, you have a similar situation, except that there they were kind enough to actually provide the matrix, and the field is defined by a different polynomial, so you don't have this kind of interaction in Kuznyechik. But you still have the same S-box, with the same very strong algebraic structure, which is supposed to be random. Now, some natural questions you may ask yourselves after seeing this. Here is one I get asked a lot: isn't it always possible to find a decomposition of a permutation? The answer is simply no. If you generate a permutation at random and give it to me, I won't be able to say anything other than that it looks random. If we can actually find a structure, it's a strong indication that the design process was not random. Is there anything wrong with log-based S-boxes, since other people use them? There is nothing wrong with using a logarithm, but that's not what's happening here. It's not just a logarithm: it's a logarithm which maps the field to itself, not to the integers.
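The coset-iteration phenomenon boils down to a one-line fact: for any nonzero field element c, the set c * GF(2^m)* is a multiplicative coset of GF(2^m)*. Here is a toy check in GF(2^4), standing in for the GF(2^8) of Streebog; the modulus and generator are choices made for this sketch.

```python
MOD = 0b10011  # x^4 + x + 1; GF(2^4) stands in for Streebog's GF(2^8)

def mul(a, b):
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0b10000:
            a ^= MOD
        b >>= 1
    return r

alpha = [1]
for _ in range(14):
    alpha.append(mul(alpha[-1], 2))

sub_star = {1, 6, 7}  # GF(4)* = {alpha^0, alpha^5, alpha^10}

# For any fixed nonzero matrix coefficient c, letting x run over the
# subfield's nonzero elements makes c*x run over the multiplicative
# coset c * GF(4)*, which is alpha^j * GF(4)* for some j.
cosets = [{mul(alpha[j], s) for s in sub_star} for j in range(5)]
for c in range(1, 16):
    image = {mul(c, x) for x in sub_star}
    assert image in cosets
```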
It can be seen as the composition of a logarithm with something that sends the integers back onto the finite field. And, in the case of Streebog, it interacts in a very non-trivial way with the linear layer. That's why I don't like it. Then there is: "this is the third decomposition you have found; won't there be a fourth or a fifth one at some point?" To answer this one, I need a bit of combinatorics. There are roughly 2^1684 8-bit permutations. There are 2^83 TKlogs. And to put this number into perspective, there are about 2^70 affine permutations. So if someone gave you an affine permutation and told you that they had generated it with a random permutation generator, you would probably not believe them; you would believe there is some sort of bias in their generation algorithm. It's the same here. That's why I claim that the presence of this structure was deliberate, and that it is what the designers actually intended. If you want to design an S-box with properties really similar to those of pi, the generation algorithm is very simple: you pick a TKlog at random; you check whether its linearity and differential uniformity are the best possible for a TKlog; if so, you output it and you're happy; if not, you generate another one. It finishes pretty quickly: you only need to generate about 2^11 random TKlogs before you find one. The result of such an algorithm will really look like pi. However, it's never better than a regular logarithm. If you just take a plain discrete log, the differential uniformity and the linearity will be the same, and you will actually have fewer of the maximal coefficients in the DDT and LAT. So the plain log is an epsilon better; pi is an epsilon worse than a discrete log. The reason to use a TKlog instead of a logarithm is therefore not an improvement of the cryptographic properties.
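The first and last of these counts are easy to reproduce; the 2^83 count of TKlogs requires the parameter counting from the paper, so it is not re-derived here.

```python
import math

# log2 of the number of 8-bit permutations: log2(256!)
log2_perms = math.lgamma(257) / math.log(2)  # lgamma(n+1) = ln(n!)
assert 1683 < log2_perms < 1685              # "2 to the 1684, roughly"

# log2 of the number of affine permutations of F_2^8:
# x -> Mx + c with M in GL(8, F_2) and c an arbitrary 8-bit constant.
gl_order = 1
for i in range(8):
    gl_order *= (2**8 - 2**i)
log2_affine = math.log2(gl_order * 2**8)
assert 70 < log2_affine < 71                 # "about 2 to the 70"
```

With 2^83 TKlogs among 2^1684 permutations, hitting one by accident is about as plausible as a "random" generator outputting an affine permutation.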
And so, in conclusion: if you want to talk about the decomposition of this S-box with friends who might not be cryptographers, I have written a note intended to popularize this result; it's at this address. I claim that the TKlog structure in pi, and I hope I have convinced you of this, was a deliberate choice by its designers. That's what they intended to do. Why? I don't know, but that's what they intended. And it really looks like a structure which is known to allow a backdoor. I have not found an attack against Streebog using this property. To be clear: I have not found such an attack; I'm not saying there is none; I have not found one. However, until the designers of Streebog and Kuznyechik properly explain how their random generation process could output an S-box mapping cosets of the subfield to cosets of the subfield, in the same field as the one used to build the linear layer of their hash function, and why that might be a good thing, I don't think we should use these algorithms, and I don't think we should standardize them. I'm looking at ISO when I say this, at the representatives of ISO: I know that next week they are going to discuss standardizing Kuznyechik, and I think it's a horrible idea. Thank you. Questions or comments? Yes, here. Thank you. Do you think there is something to learn here about invariants, maybe new variants of invariant subspace attacks exploiting these kinds of properties? If I were to try to exploit these properties, that's the way I would go, yes. It's really reminiscent of these invariant properties. But this kind of partition doesn't remind me of any of the instances of the attacks that Gregor presented on Monday, so it might be a new variant, or... Yes, that's the way I see it; I agree. Thank you for your talk. You indicated that it's easy to find the TKlogs with the best differential and linear properties in this space.
How did you find the bound for these? Experimentally. Maybe there is a way to obtain it with a proper pencil-and-paper argument; I didn't do that. I generated a lot of them, and I saw that they could not improve on the bounds. Actually, I have my experimental results here. This is not just the differential uniformity and the linearity; it's a bit finer-grained. The higher you get on this axis, the less likely your linear properties are for a random permutation, and in particular, the lower your linearity is. It's finer-grained because it also counts the number of occurrences of the maximum coefficient. And here is the same for the differential uniformity. When you generate TKlogs, you usually end up here, under my pointer, and you have some cases, corresponding to about 1,000 of them, which end up around this spot. But you would need to go much higher than this to get something with a lower linearity, and much further to get something with a lower differential uniformity. So maybe there is one specific combination of the two components that define a TKlog, the s and the kappa, one specific instance which lowers these quantities; but with just a randomized search, you're not going to find it. Any other questions or comments? Okay, if not, let's thank Léo again.
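For completeness, here is how the two headline quantities from that experiment, differential uniformity and linearity, are computed for a lookup table. Since pi's table is not reproduced in this talk, the well-studied 4-bit PRESENT S-box serves as the example; its optimal values (4 and 8) are well known.

```python
def ddt_uniformity(sbox):
    """Differential uniformity: over all nonzero input differences a and
    all output differences b, the max count of x with S(x)^S(x^a) == b."""
    n = len(sbox)
    best = 0
    for a in range(1, n):
        counts = [0] * n
        for x in range(n):
            counts[sbox[x] ^ sbox[x ^ a]] += 1
        best = max(best, max(counts))
    return best

def linearity(sbox):
    """Max absolute Walsh coefficient over nonzero output masks b."""
    n = len(sbox)
    best = 0
    for a in range(n):
        for b in range(1, n):
            # parity(a.x XOR b.S(x)) = parity((a & x) ^ (b & S(x)))
            w = sum((-1) ** (bin((a & x) ^ (b & sbox[x])).count('1') & 1)
                    for x in range(n))
            best = max(best, abs(w))
    return best

# The 4-bit PRESENT S-box, a standard example of an "optimal" 4-bit
# permutation: differential uniformity 4, linearity 8.
PRESENT = [0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD, 3, 0xE, 0xF, 8, 4, 7, 1, 2]
assert ddt_uniformity(PRESENT) == 4
assert linearity(PRESENT) == 8
```

A finer-grained comparison, like the one on the slide, would additionally count how many DDT and LAT entries attain these maxima.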