Okay, welcome to session H, Message Authentication. In this session we have two papers. The first paper is on weak keys and forgery attacks against polynomial-based MAC schemes, by Gordon Procter and Carlos Cid. This paper has been selected as the best paper of FSE 2013, and the award ceremony will take place at the dinner. So please give us your talk.

Thank you very much for the introduction. This is joint work with Carlos Cid. I'm going to start by going through the main contributions. We study the underlying algebraic structure of these MACs and hashes, and this leads us on to a generalised forgery attack that extends the work of Saarinen from FSE last year, gives us a common language to describe all of the existing attacks against GCM, and also gives us a length extension. And there's another consequence: we get some new weak-key classes for polynomial hash constructions and for these MAC constructions — almost every subset of the key space is a weak-key class. So based on these three points, the talk splits into these sections: the introduction of the background and the algebraic structure, then a section on forgeries and a section on weak keys.

So we're going to assume that we've got some message, which will probably contain some ciphertext, maybe some associated authenticated data, probably the message length. And we're going to pick some field and pad our message up so that each message block is just an element of the field. And then our hash function family is going to take a string of field elements and give us back just one field element. And we'll index these hash functions with H, some hash key — that's, again, just another field element. And the common description for these hash functions is: we take our message and use that to define a polynomial — that's this bit here, so each of the m_i is just a coefficient of the polynomial — and we take this polynomial and the hash key, and we evaluate the polynomial at the hash key to get the hash.
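To make that construction concrete, here is a minimal sketch of a polynomial-evaluation hash. The small prime field is my own illustrative assumption chosen for readability; the real schemes use much larger fields, such as GF(2^128) in GCM or GF(2^130 − 5) in Poly1305.

```python
# Toy polynomial-evaluation hash over GF(p). The prime below is an
# illustrative assumption, NOT one used by any real scheme.
P = 2**13 - 1  # a small prime, so the arithmetic stays readable

def poly_hash(msg_blocks, h):
    """g_H(M) = m_1*H^m + m_2*H^(m-1) + ... + m_m*H  (no constant term),
    evaluated by Horner's rule."""
    acc = 0
    for block in msg_blocks:
        acc = (acc + block) * h % P
    return acc
```

Note that the hash polynomial has no constant term: a message of all-zero blocks hashes to zero no matter what the key is, which is exactly the structural fact the rest of the talk leans on.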
We've got these good collision probabilities for these universal and almost-XOR-universal hash function families. And from this family, we can build a MAC: encrypt the output of the hash somehow — maybe encrypt the nonce, that's this bit, and add that on — or just push the whole thing through a block cipher like that. But for the sake of this talk, we don't really care which one of these you do, because in each case, if this bit doesn't change — if the output of the hash function is the same — then the tag given back out at the end is also the same. That's how we get a MAC forgery.

So a few real examples. We've got GCM, the one that Dan was just talking about, from McGrew and Viega in 2005. They pick F_{2^128} and derive the hash key from the block cipher key, just by encrypting zero. And there's this additive encryption: encrypt the nonce and add that on. Dan's Poly1305: again, a slightly different field, no relation between the hash key and the block cipher key, but just the same additive encryption. CWC: another different field, another different method for deriving the hash key, and here it does both — it encrypts the hash and then adds on the encrypted nonce. Again, we don't really care about this, because if you've got a hash collision, then you get a MAC forgery. And Saarinen's Sophie Germain Counter Mode, which he proposed after his cycling attacks on GCM last year, picks this field down here using a Sophie Germain prime, and the idea behind this is that it gives you some extra protection against those cycling attacks — I'll give a quick description of them later. Otherwise the MAC is essentially the same as GCM.

This is what GCM's MAC looks like. Our message is along the top here: we've got some authenticated data, some counter-mode encrypted ciphertext, and then the length of the message on the end. And this multiply-and-XOR, multiply-and-XOR is the evaluation of the polynomial, and on the right-hand side is the encryption that gives us the MAC.
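The additive construction can be sketched like this, reusing the toy field from before. The keyed function standing in for the block cipher is purely an illustrative assumption — it is not a real or secure cipher — but it shows why a hash collision immediately gives a tag collision.

```python
P = 2**13 - 1  # toy prime field, an illustrative assumption

def poly_hash(msg_blocks, h):
    acc = 0
    for block in msg_blocks:
        acc = (acc + block) * h % P
    return acc

def toy_prf(key, nonce):
    # stand-in for E_K(nonce); NOT a real cipher, just a keyed function
    return (key * 2654435761 + nonce * 40503) % P

def mac(h, key, nonce, msg_blocks):
    # additive (Wegman-Carter style) MAC: tag = g_H(M) + E_K(N)
    return (poly_hash(msg_blocks, h) + toy_prf(key, nonce)) % P
```

The point from the talk is visible directly: two distinct messages with the same hash get the same tag under the same nonce, whichever encryption layer sits on top.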
So before we can talk about these forgeries, we first need to decide what we're going to let our adversary do. We can say that he can ask for the tag of any nonce and message, and we'll tell him the valid tag as long as he doesn't repeat nonces. And he's also allowed to ask verification queries on tuples that he picks, and we'll tell him whether the tag is valid. So his aim is to find some nonce and message and tag that's valid without just asking for it. And one way that he could do that is to start with some nonce and some message, get the valid tag for it, find a hash collision, and then just substitute M′ for M, and there's a valid tuple.

And so what we suggest is that if H is our unknown hash key, we're going to suppose that we can write down some polynomial — a forgery polynomial — q. It hasn't got a constant term — there's no q_0 — because there's no constant term in the hash polynomial. And let's just suppose for the moment that we know that q(H) is zero. So the first line here is just the definition of the hash of the message. And because q(H) is zero, we can just add it on here, and that doesn't change anything. If we collect all the like terms together — each m_i with the corresponding q_i — well, this bit here is just the hash of some other message, M plus q. And so what we've done is, starting from a polynomial q where we know that q(H) is zero, we've got a hash collision, and we said earlier that a hash collision gives us a MAC forgery. So if we know that the nonce, message, and tag is a valid tuple, then it's valid for this new message precisely when this forgery polynomial is zero at H. And — a simple remark — that's the same as saying that the hash key is in the set of roots of this polynomial. And that's the description we're going to use for the weak keys in the last section. So how do we choose our forgery polynomial?
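Here is that argument in code form, in the same toy field. The key and the polynomial q are hand-picked assumptions so the identity is visible; in a real attack H is hidden and the attacker only hopes that q(H) = 0.

```python
P = 2**13 - 1  # toy prime field, an illustrative assumption

def poly_hash(msg_blocks, h):
    # g_H(M) = m_1*h^m + ... + m_m*h, Horner's rule
    acc = 0
    for block in msg_blocks:
        acc = (acc + block) * h % P
    return acc

H = 1234            # the secret hash key (unknown to a real attacker)
msg = [11, 22, 33]

# A forgery polynomial with q(H) = 0 and no constant term:
# q(x) = x^3 - H*x^2, written block-wise to line up with the hash,
# highest-degree coefficient first.
q_blocks = [1, (-H) % P, 0]

# The forged message: add q's coefficients block-wise to M.
forged = [(m + c) % P for m, c in zip(msg, q_blocks)]
```

Because the hash of the forged message is the hash of the original plus q(H), the two hashes agree exactly when H is a root of q.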
Well, in general it's difficult, because if we don't know what H is, then we don't know whether q(H) is zero. So we can just sort of cross our fingers and hope, and say that there's some number of roots of q, and our forgery is successful if the hash key is one of those, and there are |F| possible hash keys — so there's our success probability. And this suggests that what we would like is for q(x) to have lots of roots, so we get a good forgery probability. So we pick a forgery polynomial of high degree and with no repeated roots, because that's the best you can do. So the naive approach — the 'stupid method' for writing down your forgery polynomial — is to just pick some subset of the key space and then multiply together all of these linear factors, and that product is going to be zero exactly when the hash key is in this set. It may pick up a constant term, but that's easy to deal with — multiply through by x, say — so that's not so important. So we've written down a sort of stupid method for finding our polynomial. Actually, all of the existing attacks against GCM can be described in terms of the forgery polynomials that they're using. So Ferguson's attack on GCM, if we use short tags, relies on using linearized polynomials and squaring in this field of characteristic two.
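The 'stupid method' can be sketched directly: pick a guess set D, multiply the linear factors together, and multiply through by x to kill the constant term. The field and guess set here are illustrative assumptions.

```python
P = 2**13 - 1  # toy prime field, an illustrative assumption

def poly_mul(a, b):
    # multiply two polynomials given as coefficient lists, lowest degree first
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

def forgery_poly(D):
    """q(x) = x * prod_{d in D} (x - d): zero exactly on D (and at 0),
    and the leading factor x removes the constant term."""
    q = [0, 1]                           # the polynomial x
    for d in D:
        q = poly_mul(q, [(-d) % P, 1])   # times (x - d)
    return q

def eval_poly(q, x):
    return sum(c * pow(x, i, P) for i, c in enumerate(q)) % P
```

The success probability of the resulting forgery is |D ∪ {0}| over the field size, which is why you want the forged message — and hence D — as long as the scheme allows.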
So q(x) looks a bit like this polynomial: there might be some different coefficients, but it still has the same linearized structure. This is nice because you can keep track of the field elements that are still possible hash keys using a matrix. Joux's attack on GCM, if we repeat nonces: for this one, we need to have two tuples that were both valid with the same nonce, and because in GCM we XOR on this encrypted nonce, if we XOR the tags together over here, that's the same as XORing the hashes together. And then we can collect that all together onto one side, and this hash here is just some polynomial in H, and the XOR of the tags is just some constant field element. So if we multiply it all through by H, then we've got another forgery polynomial, and the attack is going to be successful if the actual hash key is a root of this polynomial.

Saarinen, at FSE last year: what we should really be doing is looking for subgroups of the multiplicative group of the field, because in that case there's some power t such that when we raise the hash key to that power, we get back the one. So we've got another stupid bullet point: if this is true, we can multiply both sides by H and get this, and collect it all on one side, and that's a polynomial in H that's zero if and only if our attack is successful. So we cross our fingers and hope that our hash key is an element of this subgroup. This first line here is just the definition of the hash, and then we know that H^(t+1) is equal to H — we're hoping it is — and so there's no problem swapping m_1 and m_(t+1). So we've got a new message with the same hash. And the suggested fix was to pick a field where there aren't so many subgroups, because then we're less likely to have this hold.

Maybe we'd like a bit more control over the message that we're going to forge — we can predict what's going on. We know that in the forged message, each block is just going to be the original block plus the corresponding coefficient of the forgery polynomial. So if it's in the original message
it's authenticated data, then we know what that is, and it's in the clear, so we know what's going to be in the forged message. And if it's counter-mode encrypted ciphertext, and the characteristic of the field we're evaluating in is two, then addition is just XOR, so adding something onto the ciphertext is the same as adding it onto the plaintext. We can do a bit better than this, because multiplying our forgery polynomial here by some alpha in the field doesn't change its roots, and now if we forge with alpha·q rather than with q, m_i becomes m_i plus alpha·q_i. And we can pick any alpha that we want, so it means that for one message block, we can choose our alpha so that that message block has the differential that we would like.

And this is how we get the length extension. So in GCM, your message looks a bit like this: one block is the length of the associated data and the ciphertext, and then the rest of the message follows. And we only use the length to compute the hash — we don't actually send it with the rest of the message. And what we suggest is: you pick your forgery polynomial — you're going to use the stupid method from earlier, or one of the existing forgery polynomials from another attack. We need to know what the value of the length field is in the valid message, and you can see how long the message is, so we can work out what the length field should be, as long as we can tell where the associated data ends and the ciphertext starts. And we need to know what length we want this field to encode: that's the length of m plus alpha·q, and that's easy, because we know the length of m, and we know what q is because we chose it — that's not a problem. And we just need to pick the right alpha, so that when we forge, the length field of m becomes the length field of m plus alpha·q. So you should make sure you get the right alpha.

So why is this a good thing? Well, with a cycling attack, the best you can do is to have a forgery probability of m over the size of
the field, where m is the length of the message that you're given in the beginning — the message length bounds the degree of q. So with this, we can increase the length of the message and get a better success probability from a much shorter valid message. If you imagine having a field with subgroups of size 3 and 5: if you've got a 4-block message, you can't use the subgroup of size 5, you can only use the subgroup of size 3, so you've kind of wasted one of the message blocks that you did have. But now we can get much better success probabilities, and we don't care how long the original valid message is: in this case, we've got a success probability of the maximum message length over the size of the key space, where that's the maximum permissible message length that your construction operates on. This is described in the original security proofs — it's in the Poly1305 paper — but this is the attack that realises it.

Weak keys. There have been a few papers today that have talked about weak keys. The definition we're taking is from Handschuh and Preneel in 2008. Let's say that a set of keys is a weak-key class if something unexpected happens — in this case, if the forgery probability is higher than you would otherwise expect it to be, if we use one of these keys — and if we can detect whether one of those keys is being used. We can't just test them one at a time — 'is it the first one, is it the second one, is it the third one' — our detection has to be more efficient than that, so it has to use fewer verification queries than the size of the class.

So the weak-key classes that were previously known: if the key is 0, then the hash of the message never changes, because your polynomial is just m_1 times 0 plus m_2 times 0 plus m_3 times 0, so you can never get anything different out of your hash. And Saarinen's subgroups: we can detect whether the key is in one of these subgroups, because if our forgery is successful, then our key is in that subgroup. And what we show is that actually there are loads of these weak-key classes. So a set of keys is weak if there's
three or more elements in it, or if the set is the zero key together with one other key. And we do this by testing whether you can use the stupid method for constructing your forgery polynomial: first test whether H is in D union {0}, and then maybe we need to rule out whether the key is 0.

So what does this mean? Well, these are properties of all polynomial hashes — it's not something specific to GCM — and there's nothing here that particularly relates to the characteristic-two field, so Sophie Germain Counter Mode isn't much better in this respect than GCM, though it does protect against some of the nicest methods of just writing down good forgery polynomials. There are a few more details about that in the paper.

I think it's fairly well known that the maximum message length is important for these constructions, and what we've demonstrated is that it is the maximum message length — not the length of the original message — that really matters, and besides that, the size of the field: those are the two things that appear in the forgery probability. All of these polynomial-evaluation hashes have lots and lots of weak keys, and maybe it's better to talk about this as an unavoidable property of the construction rather than counting the weak-key classes. The key question is whether having all of these weak-key classes makes the scheme any weaker. In the case of GCM, it's not such a problem, because the parameters are chosen quite well, but maybe if you were to try and change GCM to make it faster and lighter and smaller, this is something that you probably should be aware of. That's me — thanks very much.
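The weak-key detection just described can be sketched end to end: one forgery built from a guess set D tells you whether the key lies in D ∪ {0}. The field and guess sets below are illustrative assumptions, and the secret key is passed in directly only to simulate what would be a single verification query in a real attack.

```python
P = 2**13 - 1  # toy prime field, an illustrative assumption

def poly_hash(msg_blocks, h):
    acc = 0
    for block in msg_blocks:
        acc = (acc + block) * h % P
    return acc

def poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

def key_in_set(D, H):
    """One detection test: build q(x) = x * prod_{d in D} (x - d) and check
    whether the corresponding forgery verifies. True iff H is in D union {0}."""
    q = [0, 1]
    for d in D:
        q = poly_mul(q, [(-d) % P, 1])
    blocks = list(reversed(q[1:]))   # highest-degree coefficient first
    msg = [0] * len(blocks)          # any message of the right length works
    forged = [(m + c) % P for m, c in zip(msg, blocks)]
    return poly_hash(forged, H) == poly_hash(msg, H)
```

Ruling out H = 0 afterwards takes at most one more query, which is why sets of three or more keys — or the pair {0, K} — are detectable more cheaply than trying each candidate key in turn.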