In the previous video, we introduced the notion of a transposition factorization of a permutation, and we mentioned that the factorization is not unique: there's a lot of variety in how a permutation can be factored. So what can we expect to be consistent when we factor a permutation into transpositions? The answer is that we can expect the parity to be constant. That is, the number of transpositions in the factorizations will always be even, or will always be odd. It is impossible to factor the same permutation using both an odd number of transpositions and an even number. So let's see the proof of Theorem 5.11: every permutation can be expressed as a product of an even number of transpositions or an odd number of transpositions, but not both. That last phrase, "not both," is critical, because if you omitted it, the statement would be obvious: the number of transpositions in a factorization is a natural number, and every natural number is either even or odd. So we really do need the "not both." Why can't we have factorizations of the same permutation with both an even and an odd number of transpositions? There are a lot of different proofs of this. Myself, as an algebraist, there's a lot about algebra I really like. I love groups. I also love associative rings, though sometimes I tempt myself with non-associative algebras. Algebraic combinatorics is definitely the branch of mathematics I feel most attached to, but I'm also very attached to representation theory, particularly the representation theory of finite groups. So when I was deciding how best to present this proof, I could not help myself, and I had to use the representation-theoretic proof of this result, because it illustrates the power of representation theory.
Representation theory is a part of group theory, and of abstract algebra in general, in which we take abstract groups and represent them using concrete groups, and a lot of benefit can come from that. Basically, we're going to introduce linear algebra into a group theory problem, so this will require a little review of linear algebra; I hope that's not too much for us. Remember that R^n is our vector space consisting of column vectors with n real entries, and e_i is the vector with a 1 in the i-th position and 0s everywhere else. You can think of e_i as the i-th column of the identity matrix: the identity matrix I_n is the matrix with 1s along the diagonal and 0s everywhere else, and its i-th column (or i-th row, if you prefer) is exactly e_i. So we can write the identity matrix as the matrix whose column vectors are the e_i's: e_1 is the first column, e_2 is the second column, and you get the gist. That's an important thing to remember. So that's our review of linear algebra notation; now we want to talk about permutations. Let σ be a permutation in S_n. Notice the n is the same: our vectors and matrices match the number of letters in S_n. Given our permutation σ, we're going to create a matrix associated to it, called a permutation matrix, written [σ]. The idea behind the brackets is that we often write matrices with brackets around them.
So [σ] should make us think: it's a matrix that has something to do with σ. Specifically, [σ] is the matrix we get by permuting the columns of the identity matrix according to σ. Let's do an example. Working in S_3, take σ to be the cycle (1 2 3). Then the first column of the identity gets replaced by the second, and so on: [σ] has columns e_2, e_3, and e_1, which written out column by column is (0, 1, 0)^T, then (0, 0, 1)^T, then (1, 0, 0)^T. That's our first example. Now another: if τ is the 2-cycle (1 2), then the permutation matrix associated to τ has columns e_2, then e_1, and then e_3, since e_3 is left fixed. Column by column, that's (0, 1, 0)^T, (1, 0, 0)^T, and (0, 0, 1)^T. Those are some examples of the permutation matrices we want to talk about. Now, it's a fact from linear algebra that interchanging any two columns (or any two rows) of a square matrix changes the sign of the determinant. When you learn about determinants, you learn how row operations affect the determinant of a matrix: the replacement operation, where you replace a row by the sum of that row and a multiple of another, doesn't change the determinant; swapping two rows changes the determinant by a minus sign; and scaling a row scales the determinant by that same scalar. That's how it's taught for row operations, but since the determinant is unchanged by taking the transpose, you can also interchange columns and the same rules apply.
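If you want to play with these matrices yourself, here's a minimal Python sketch. The helper name `perm_matrix` and the 0-indexed list convention (a permutation is a list where position i holds the image of i) are my own choices, not anything from the lecture:

```python
def perm_matrix(sigma):
    """Build [sigma]: column c of the identity becomes e_{sigma(c)},
    so entry (r, c) is 1 exactly when r == sigma[c].
    sigma is a 0-indexed list with sigma[i] = image of i."""
    n = len(sigma)
    return [[1 if sigma[c] == r else 0 for c in range(n)] for r in range(n)]

# sigma = (1 2 3) as a 0-indexed mapping: 0 -> 1, 1 -> 2, 2 -> 0,
# so the columns of [sigma] are e_2, e_3, e_1 (in 1-indexed terms).
sigma = [1, 2, 0]
for row in perm_matrix(sigma):
    print(row)
# [0, 0, 1]
# [1, 0, 0]
# [0, 1, 0]

# tau = (1 2): columns e_2, e_1, e_3 (e_3 is left fixed).
tau = [1, 0, 2]
for row in perm_matrix(tau):
    print(row)
# [0, 1, 0]
# [1, 0, 0]
# [0, 0, 1]
```

Reading the printed rows, the columns of each matrix are exactly the e_i's described in the examples above.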
So what does that tell us here? What's significant is that the determinant of the identity matrix, since it's a diagonal matrix, is obviously the product of all those 1s: you get 1^n, which is 1. So the determinant of the identity matrix is 1, and if we start permuting columns of the identity, which is exactly what a permutation matrix does, each swap flips the sign, so the determinant of a permutation matrix has to be +1 or -1. Now be aware that the determinant of a specific matrix cannot be both +1 and -1; it's one or the other. This dichotomy is what we're going to exploit to show that each permutation has either only odd transposition factorizations or only even ones. Think about the following: a single transposition is just a single swap of two columns, with everything else in the right position. So if you take the determinant of the matrix coming from a 2-cycle, our linear algebra facts tell us it equals -1. So counting 2-cycles helps us with the determinant. The other thing to know about the determinant in this situation is that it's multiplicative: det(AB) = det(A) det(B). So if you have a factorization of a matrix, you can take the determinant of each factor individually. And since composing permutations corresponds to multiplying their permutation matrices, a transposition factorization of σ gives a matrix factorization of [σ]. Since each transposition's matrix has determinant -1, we see that the determinant of [σ] is (-1)^k, where k is the number of transpositions in the factorization; that's critical. So if you have an odd number of transposition factors, you get -1 raised to an odd power, which is -1.
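Here's a short Python sketch checking these determinant facts on the examples from earlier. The helper names (`perm_matrix`, `det`, `matmul`) and the 0-indexed permutation convention are my own; the determinant uses plain cofactor expansion, which is fine for small matrices:

```python
def perm_matrix(sigma):
    """[sigma]: entry (r, c) is 1 exactly when r == sigma[c]."""
    n = len(sigma)
    return [[1 if sigma[c] == r else 0 for c in range(n)] for r in range(n)]

def det(m):
    """Determinant via cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    total = 0
    for c, entry in enumerate(m[0]):
        if entry:
            minor = [row[:c] + row[c + 1:] for row in m[1:]]
            total += (-1) ** c * entry * det(minor)
    return total

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

tau = [1, 0, 2]      # the transposition (1 2), 0-indexed
sigma = [1, 2, 0]    # the 3-cycle (1 2 3), 0-indexed

print(det(perm_matrix(tau)))    # -1: one column swap flips the sign
print(det(perm_matrix(sigma)))  # 1: two swaps, (-1)^2

# The determinant is multiplicative: det(AB) = det(A) det(B).
lhs = det(matmul(perm_matrix(sigma), perm_matrix(tau)))
rhs = det(perm_matrix(sigma)) * det(perm_matrix(tau))
print(lhs == rhs)  # True
```

Any 2-cycle's matrix comes out to -1, and the product rule is what turns a k-transposition factorization into the value (-1)^k.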
But if you have an even number of transposition factors in your product, then you get -1 raised to an even power, which is 1. And since the determinant cannot be both 1 and -1, it has to be one or the other: either every factorization uses an odd number of transpositions, or every factorization uses an even number. The exact number of transpositions doesn't matter, because every even count gives 1 and every odd count gives -1. Now that we've proven our theorem, this leads to our definition. We say that a permutation is even if it can be expressed as a product of an even number of transpositions; it doesn't matter how many, just an even amount. And we say that a permutation is odd if it can be expressed as a product of an odd number of transpositions. So there are just two types of permutations: even permutations and odd permutations. I really like the proof we just went through, with the matrices and the determinants. It's really cool to use all this linear algebra we've seen before, but it also shows you the power of what I've been calling representation theory: by representing permutations not as abstract functions but as concrete matrices, we can utilize the strength of linear algebra to prove things in group theory. This idea of representation theory is a very powerful tool that shows up throughout mathematics, and occasionally it shows up in this class as well.
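To see the theorem in action, here's a small Python sketch that builds the same 3-cycle from two different transposition factorizations, one with 2 factors and one with 4, and checks that both counts have the same parity. The helper names and the 0-indexed list convention for permutations are my own:

```python
def compose(f, g):
    """(f o g)(i) = f(g(i)); permutations as 0-indexed lists."""
    return [f[g[i]] for i in range(len(f))]

def transposition(n, i, j):
    """The 2-cycle swapping i and j in S_n (0-indexed)."""
    t = list(range(n))
    t[i], t[j] = t[j], t[i]
    return t

def product(factors, n):
    """Compose a list of permutations left to right
    (the rightmost factor is applied first)."""
    result = list(range(n))
    for t in factors:
        result = compose(result, t)
    return result

n = 3
# Two different transposition factorizations of the 3-cycle (1 2 3):
f1 = [transposition(n, 0, 2), transposition(n, 0, 1)]   # 2 factors
f2 = [transposition(n, 0, 1), transposition(n, 0, 2),
      transposition(n, 0, 1), transposition(n, 0, 2)]   # 4 factors

print(product(f1, n) == product(f2, n))    # True: same permutation
print((-1) ** len(f1), (-1) ** len(f2))    # 1 1: same parity, so even
```

The factorizations look nothing alike, but both lengths are even, exactly as the theorem demands: the 3-cycle is an even permutation.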