Thank you very much for listening to my talk. I am Xinxing Gong, and I'd like to talk about our paper, Resistance of SNOW-V against Fast Correlation Attacks. SNOW-V is a new member of the SNOW family of stream ciphers. In this paper, we study the resistance of SNOW-V against bitwise fast correlation attacks by constructing bitwise linear approximations. First, we propose and summarize some constant-time algorithms, using slice-like techniques, to compute the bitwise linear approximations of certain types of composition functions. Then, using these algorithms, we find a number of stronger linear approximations for the FSMs of two variants of SNOW-V given in the design document, and thus improve the fast correlation attacks on them. Finally, we study the security of a closer variant of SNOW-V and derive many mask tuples yielding bitwise linear approximations of the FSM with high correlations, and mount fast correlation attacks accordingly. Note that none of our attacks threatens the security of SNOW-V. The talk consists of seven parts.

In the first part, I'd like to talk about the motivation. SNOW-V has two predecessors, SNOW 2.0 and SNOW 3G, which use the classic LFSR-FSM structure. Many works have studied linear attacks on SNOW 2.0 and SNOW 3G, most of which are based on bitwise linear approximations. At CRYPTO 2015, Zhang et al. achieved improvements over the previous attacks on SNOW 2.0 by building two-round large-unit linear approximations of the FSM. Inspired by this work, at FSE 2020 Yang et al. launched a fast correlation attack on SNOW 3G by building three-round large-unit linear approximations of the FSM. These results give the impression that large-unit approximations lead to larger SEI and also to better attacks. In the design document, the designers studied linear attacks on two variants of SNOW-V by directly using large-unit or bytewise linear approximations. So the question is: how do bitwise linear approximations work for SNOW-V?
Before describing the main work we've done, we introduce some concepts used in this paper. Definition 1 describes the correlation of a Boolean function. Correlation is often used to evaluate the efficiency of bitwise linear approximations in linear attacks. Definition 2 describes the correlation of a vectorial Boolean function under any given input and output masks. For an m-bit vectorial Boolean function F, with the probability distribution of its output, the SEI is defined as in Definition 3; it measures the distance between the target distribution and the uniform distribution. In particular, for m = 1, the SEI of F equals the squared correlation of F. Note that the SEI of a distribution is used to evaluate the efficiency of large-unit linear approximations in linear attacks.

For the targeted cipher SNOW-V, we only give the description of the internal-state updating functions in the keystream generation phase. SNOW-V follows the LFSR-FSM structure. The FSM consists of three registers. The keystream output and the FSM update are as shown. Here, sigma is a bytewise permutation, and AES^R denotes one AES encryption round function with the round-key constant being 0.

Next, we introduce the work we've done. The first part is the computation of bitwise linear approximations of certain types of nonlinear functions, which is used in the cryptanalysis of SNOW-V. Here, we summarize five types of functions composed of basic operations such as modular addition, XOR, permutation, and S-boxes, and provide algorithms to efficiently compute their linear approximations. A Type-1 function is constructed from several parallel small S-boxes. For any given input and output masks, Correlation (1) can be easily obtained through lookups in the linear approximation tables (LATs) of these S-boxes. Next, the Type-2 function. A Type-2 function is the addition modulo 2^n of several inputs.
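As a small illustration of Definitions 1-3 and the Type-1 case (a sketch only: the 3-bit S-box and the masks below are arbitrary choices, not values from the paper), the correlation of a masked approximation and its SEI can be computed by brute force, and the correlation of a parallel S-box layer factors into per-box LAT entries:

```python
def parity(x: int) -> int:
    """Parity (XOR) of the bits of x."""
    return bin(x).count("1") & 1

def correlation(f, n, a, b):
    """Definition-2 style correlation: 2*Pr[a.x = b.f(x)] - 1 over n-bit x."""
    agree = sum(parity(a & x) == parity(b & f(x)) for x in range(2 ** n))
    return 2 * agree / 2 ** n - 1

def sei_of_bit(f, n, a, b):
    """Definition-3 SEI for m = 1: the 1-bit value a.x ^ b.f(x)."""
    p1 = sum(parity(a & x) ^ parity(b & f(x)) for x in range(2 ** n)) / 2 ** n
    # SEI = 2^m * sum over values of (p - 2^-m)^2, here with m = 1.
    return 2 * ((1 - p1 - 0.5) ** 2 + (p1 - 0.5) ** 2)

SBOX = [0, 3, 5, 6, 7, 4, 2, 1]       # arbitrary 3-bit S-box, illustration only
f = lambda x: SBOX[x]

# For m = 1, the SEI equals the squared correlation.
c = correlation(f, 3, 0b001, 0b010)
assert abs(sei_of_bit(f, 3, 0b001, 0b010) - c * c) < 1e-12

# Type 1: two parallel copies of the S-box. The correlation under a mask
# pair equals the product of the per-box correlations (two LAT lookups).
g = lambda x: SBOX[x & 7] | (SBOX[x >> 3] << 3)
a, b = 0b011001, 0b110010
c_parallel = correlation(g, 6, a, b)
c_product = correlation(f, 3, a & 7, b & 7) * correlation(f, 3, a >> 3, b >> 3)
assert abs(c_parallel - c_product) < 1e-12
```

The factoring in the last step is exactly why Type-1 correlations reduce to a handful of LAT lookups.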
In the literature [NW06], the authors showed that for any given input and output masks, Correlation (2) can be obtained by several matrix multiplications using some precomputed matrices; this is a constant-time procedure. The method of [NW06] is given in Theorem 2; we skip the details.

Next, the Type-3 function. A Type-3 function is a composition of a Type-1 function and modular addition, which is at the core of SNOW 2.0 and SNOW 3G. The literature [GZ20] provides a constant-time algorithm to compute Correlation (3) for any given input and output masks. The general idea is to divide the n-bit values into d smaller pieces according to the specific structure of the S-boxes, and then precompute and store some useful matrices, independent of the input and output masks, using Algorithm 1 as shown here. Using these precomputed matrices, the correlation under an arbitrary bitwise mask tuple can be derived by a few matrix multiplications of small size, as shown in Theorem 3.

Next, the Type-4 function. A Type-4 function H is composed of additions modulo 2^m and a permutation P. We show how to exactly compute Correlation (4) in constant time by adapting the techniques of [GZ20] and [NW06]. Looking at the definition of H, thanks to the permutation P, the last modular addition can be divided into d parallel modular additions. Thus H actually falls into the Type-3 category and can be handled with the method of [GZ20]. That is, for any given masks, Correlation (4) can be obtained by a few matrix multiplications of small size, as shown here. In this equation, the matrices C_v are of size 2 x 2. Note that all possible matrices C_v can in turn be derived by matrix multiplications using the slice-like technique of [NW06], as shown in Theorem 1; we skip the details.

The last one is the Type-5 function, which is quite similar to Type 4 but works on different sizes with a different permutation.
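The matrix methods themselves are beyond the scope of a slide, but the splitting idea behind Types 3-5 can be sanity-checked numerically. The sketch below uses arbitrary 4-bit masks (not values from the paper): once a permutation has split a wide modular addition into independent narrow ones, the correlation of the parallel block is the product of the per-branch Type-2 correlations.

```python
def parity(x: int) -> int:
    return bin(x).count("1") & 1

def cor_add(n, u, v, w):
    """Type-2 correlation (two inputs): w.((x+y) mod 2^n) ^ u.x ^ v.y."""
    agree = sum(
        (parity(u & x) ^ parity(v & y)) == parity(w & ((x + y) % 2 ** n))
        for x in range(2 ** n) for y in range(2 ** n)
    )
    return 2 * agree / 4 ** n - 1

def add_split(x, y, n):
    """Two independent n-bit additions packed into one 2n-bit word."""
    mask = 2 ** n - 1
    lo = ((x & mask) + (y & mask)) & mask
    hi = ((x >> n) + (y >> n)) & mask
    return lo | (hi << n)

# Arbitrary masks for the two 4-bit branches (illustration only).
u1, v1, w1 = 0b1011, 0b0110, 0b1101
u2, v2, w2 = 0b0011, 0b1110, 0b0101
U, V, W = u1 | (u2 << 4), v1 | (v2 << 4), w1 | (w2 << 4)

# Correlation of the split 8-bit block, by brute force over all inputs.
agree = sum(
    (parity(U & x) ^ parity(V & y)) == parity(W & add_split(x, y, 4))
    for x in range(256) for y in range(256)
)
c_split = 2 * agree / 256 ** 2 - 1

# It equals the product of the two branch correlations.
assert abs(c_split - cor_add(4, u1, v1, w1) * cor_add(4, u2, v2, w2)) < 1e-12
```

In the paper's setting, the per-branch correlations are not brute-forced but obtained from the precomputed matrices, which is what makes the whole procedure constant-time.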
This type of function plays an important role in analyzing the bitwise linear approximations of SNOW-V. With the precomputed matrices C_v for the Type-4 function, Correlation (5) for any given masks can be obtained by adapting the computation for Type 4 to our case, again by a few matrix multiplications of small size, which is also a constant-time procedure.

After introducing the five types of nonlinear functions, we next present the applications of the above computation algorithms to some variants of SNOW-V. The first one is SNOW-V_sigma0. The left figure shows the FSM part of SNOW-V_sigma0. In SNOW-V_sigma0, the permutation sigma used in the FSM update is taken to be the identity; that is, there is no bytewise permutation. We first study the bitwise linear approximations of the FSM using the previous algorithms. For the three-round FSM, the output bits can be represented as a function of the internal-state bits. With the variables and the F-function described above, we consider applying the linear masks (Phi, Gamma, Lambda) to the keystream words at three consecutive time instants, respectively, and then cancel out the nonlinear contributions by decomposing the whole noise into four sub-noises N_A, N_B, N_C, and N_D. Accordingly, the bitwise linear approximations of the FSM of SNOW-V_sigma0 have the following form, and the correlation under any given mask tuple (Phi, Gamma, Lambda) is obtained according to the Piling-up Lemma. What we have to do is find (Phi, Gamma, Lambda) such that the corresponding correlation is as large as possible. To this end, we need to compute the correlations of the sub-noises for given masks. First, the computation of the correlations of N_A and N_B. Note that N_A and N_B have the same form but different input and output linear masks. From their expressions, a certain function, call it G, is derived. Based on this function, we define a Type-3 function as follows.
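The Piling-up step above (the total correlation is the product of the sub-noise correlations, for independent sub-noises) can be checked on toy stand-ins for N_A, N_B, N_C, N_D. The threshold functions below are arbitrary, chosen only to give four independent biased bits:

```python
# Four toy 1-bit noises over independent 4-bit inputs; thresholds arbitrary.
THRESHOLDS = [3, 5, 6, 9]
noise = [lambda x, t=t: int(x < t) for t in THRESHOLDS]

def cor_bit(f, n):
    """Correlation of a 1-bit noise with zero: 1 - 2*Pr[f(x) = 1]."""
    return 1 - 2 * sum(f(x) for x in range(2 ** n)) / 2 ** n

# Piling-up Lemma: correlation of the XOR of independent noises = product.
expected = 1.0
for f in noise:
    expected *= cor_bit(f, 4)

# Brute force the XOR of the four noises over all input combinations.
ones = sum(
    noise[0](a) ^ noise[1](b) ^ noise[2](c) ^ noise[3](d)
    for a in range(16) for b in range(16)
    for c in range(16) for d in range(16)
)
total = 1 - 2 * ones / 16 ** 4
assert abs(total - expected) < 1e-12
```

This is why the search problem reduces to computing the four sub-noise correlations quickly and then multiplying them.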
We verify that G can be expressed as four Type-3 functions in parallel. Based on this, we deduce that the correlations of N_A and N_B can be computed as products of Correlation (3) under the corresponding partial masks. Since Correlation (3) for an arbitrary mask tuple can be obtained by four matrix multiplications of small size, the correlations of N_A and N_B can be obtained with constant time complexity.

Next, the computation of the correlation of N_C. Similarly, a certain function, denoted F, is derived from the expression of N_C; it is a parallel application of four Type-2 functions, namely additions modulo 2^32 with three inputs. The correlation of N_C is then computed as a product of Correlation (2) under the corresponding partial masks. Since Correlation (2) for any given masks can be obtained by 32 matrix multiplications of small size, the correlation of N_C can be obtained with constant time complexity.

For the noise N_D, note that SubBytes is an application of 16 AES S-boxes and can also be viewed as four parallel groups of four AES S-boxes. The correlation of N_D can then be obtained through 16 LAT lookups, which is of course a constant-time procedure.

With these constant-time algorithms for computing the correlations of the four sub-noises, we can carry out a wide-range search for mask tuples (Phi, Gamma, Lambda) that yield high correlations. In the paper, we used a search strategy that attempts to find some potential linear masks based on several observations. We skip the details and give only the search results: our best result has correlation 2^-18.67, and the SEI is 2^-37.3. Following the general procedure of fast correlation attacks, we propose an attack using the linear approximations in the table. Now we compare our results, based on bitwise linear approximations, with those given in the design document.
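As a quick arithmetic check of these reported figures: for a 1-bit (bitwise) approximation, the SEI is the squared correlation, so a correlation of 2^-18.67 gives an SEI of 2^-37.34, which matches the rounded 2^-37.3 on the slide.

```python
import math

c = 2.0 ** -18.67   # best reported correlation for SNOW-V_sigma0
sei = c * c         # m = 1, so SEI equals the squared correlation
assert abs(math.log2(sei) - (-37.34)) < 1e-9
```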
Our best linear approximation has SEI 2^-37.3, which is much larger than that of the best one in the design document, with SEI 2^-58.7. Using these stronger approximations, we naturally improve the fast correlation attack.

Next, we introduce the bitwise linear approximations of a new variant of SNOW-V, with one 32-bit adder and one 8-bit adder, where the permutation sigma is used as proposed. This figure shows the FSM part. Only the 32-bit adder used for updating the R1 register is replaced by 8-bit adders, while everything else remains identical. We first study the bitwise linear approximations of the FSM using the previous algorithms. As before, we consider applying the linear masks (Phi, Gamma, Lambda) to the keystream words at three consecutive time instants, respectively, then cancel out the nonlinear contributions by decomposing the whole noise into four sub-noises, and finally obtain the bitwise linear approximations of the three-round FSM. Note that here the noises N_A and N_B are the same as in the analysis of SNOW-V_sigma0, while N_C-bar and N_D-bar are newly introduced. Thus, we need to compute the correlations of N_C-bar and N_D-bar. N_C-bar is exactly the noise introduced by the bitwise linear approximation of the Type-5 function introduced before. Then the correlation of N_C-bar for any given masks can be obtained by a few matrix multiplications, which is again a constant-time procedure. As for N_D-bar, its correlation can be obtained through 16 LAT lookups. To sum up, the correlations of all four sub-noises can be obtained with constant time complexity. As before, we use a strategy based on some observations to search for the linear masks. Our best results are listed in this table. With these approximations, we also propose a fast correlation attack; we skip the details.

Finally, we give a brief study of another variant of SNOW-V, whose bitwise linear approximations were studied in the design document. This is the FSM part.
The two 32-bit adders are both replaced by 8-bit adders. We sketch some ideas on how to find good bitwise linear approximations, and list the best results we found. Compared with the results based on bitwise linear approximations in the design document, we increase the bias of the linear approximations considerably.

Finally, I'd like to conclude this presentation. In this paper, we present a number of stronger linear approximations for the FSMs of several variants of SNOW-V, and also propose improved attacks accordingly. Although none of our attacks threatens the security of SNOW-V, we shed new light on the structure of SNOW-like stream ciphers and on bitwise linear approximation attacks. That's all of my presentation. Thanks for listening.