The second type of elementary matrix I want to talk about here is the interchange matrix. Remember, the interchange elementary matrices are those where we interchange two rows of the identity matrix and leave everything else the same. A permutation matrix is what we could call the interchange elementary matrix 2.0: instead of interchanging just two rows of the identity, we can rearrange any combination of rows. So a permutation matrix is a square matrix formed by a rearrangement of the rows of the identity. We can see an example of such a thing right here. Take this permutation matrix, whose rows are 0, 1, 0, 0; then 0, 0, 0, 1; then 1, 0, 0, 0; and then 0, 0, 1, 0. Notice that the first row, 0, 1, 0, 0, was originally the second row of the identity matrix. The second row, 0, 0, 0, 1, was originally the fourth row of the identity matrix. The third row, 1, 0, 0, 0, was originally the first row, and the last row, 0, 0, 1, 0, was originally the third row. So we can think of it in the following way: in place of the first row we put the second row of the identity, in place of the second row we put the fourth, in place of the third we put the first, and in place of the fourth we put the third. That's what we mean by a permutation: things got switched around. One became two, two became four, three became one, and four became three. This is a generalization of the idea of interchange, because with an interchange matrix (you can see three interchange matrices right here on the screen), you only swap two rows. The first one, reading left to right, interchanges rows one and three.
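The construction described above can be sketched in NumPy (the variable names here are mine, just for illustration): we build the permutation matrix from the lecture by pulling out the rows of the identity in the stated order.

```python
import numpy as np

# Build the 4x4 permutation matrix from the lecture by rearranging
# the rows of the identity matrix.
I = np.eye(4, dtype=int)

# Row 1 of P is row 2 of I, row 2 is row 4, row 3 is row 1, row 4 is row 3
# (using 0-indexed row selection below: rows 1, 3, 0, 2 of I).
P = I[[1, 3, 0, 2]]

print(P)
# [[0 1 0 0]
#  [0 0 0 1]
#  [1 0 0 0]
#  [0 0 1 0]]
```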
Rows two and four are left alone; those are the genuine second and fourth rows of the identity matrix. But the first row became the third one and the third row became the first. That's the interchange: you only swap two of them. Here's another example. In this next matrix, the first row is still the first row, the second row became the third, the third row became the second, and the fourth row is left fixed. So the second and third rows are the ones that got swapped. With an interchange, you only swap two rows. And lastly, in the third one, the first and second rows are left alone, and the third and fourth rows get swapped. You can spot the swaps because in each case the ones are no longer on the diagonal where they're supposed to be. That's how we can identify the interchange matrices here. Now, going back to the original matrix, this permutation switched up a bunch of the rows. It can be shown that any permutation of objects can be accomplished by a sequence of transpositions, that is, a sequence of interchanges: we can rearrange any collection of objects by just swapping two things at a time. This implies that a permutation matrix is actually a product of elementary matrices of interchange type. As such, a permutation matrix is always nonsingular. And in fact, for a permutation matrix, just like for interchange matrices, the transpose of the matrix is actually its inverse, because if we swap things around and then swap them back to where they came from, we get the identity matrix.
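These two claims, that a permutation matrix is nonsingular and that its transpose is its inverse, can be checked numerically in a short sketch (again, the setup here is mine, using the lecture's 4x4 example):

```python
import numpy as np

# The permutation matrix from the lecture.
P = np.array([[0, 1, 0, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 0, 1, 0]])

# Transposing swaps the rows back to where they came from,
# so P^T P recovers the identity: P^T is the inverse of P.
print(np.array_equal(P.T @ P, np.eye(4)))   # True

# The determinant of a permutation matrix is +1 or -1,
# never 0, so P is nonsingular. Here it is -1.
print(round(np.linalg.det(P)))
```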
Since any interchange elementary matrix has exactly two rows swapped, it can itself be viewed as a permutation matrix under this generalization. And similar to how interchange elementary matrices multiply, if A is a matrix, then P times A is the matrix where the rows of A are rearranged in the same way that the rows of I were rearranged to form P. That is, PA is the matrix that permutes the rows of A according to the permutation P. So you have a permutation P, the one where things got swapped around: instead of the first row you have the second row, instead of the second row you have the fourth row, and so on. Let's do a concrete example with this exact P. We'll take the matrix A whose rows are one, two, three, four; five, six, seven, eight; and then, being clever here, four, three, two, one; and six, seven, eight, five. If we multiply this through, what happens when you multiply by the first row of P? Since it's 0, 1, 0, 0, it grabs the second row of A, and you get five, six, seven, eight. When you multiply by the second row of P, which is 0, 0, 0, 1, it grabs the fourth row, and you get six, seven, eight, five. And when you multiply by the third row of P, which is 1, 0, 0, 0, you grab the first row of A, giving you one, two, three, four.
And then when you multiply by the fourth row of P, it grabs the third row of A, giving you four, three, two, one. So multiplying by the permutation matrix scrambles the rows of A the same way you correspondingly scrambled the rows of the identity: the second row of A comes first, then the fourth, then the first, then the third, just like we saw earlier when we talked about the permutation associated to the matrix. Now, if you multiply the other way around and take A times P, the effect is to permute the columns of A. So be aware: multiplying on the left affects the rows; multiplying on the right affects the columns. A few more facts. If you multiply two permutation matrices together, you always get a permutation matrix. A permutation matrix can always be factored as a product of transpositions; we mentioned that already. And its inverse is its transpose, repeating what we said before. So let's break this matrix apart. We had that tableau on the screen earlier; I erased it, so I'd better write it again: one went to two, two went to four, three went to one, and four went to three under this permutation. We can break this up into three transpositions, one swap at a time, and factor it. This is the one that switches one and three, this is the one that switches two and three, and this is the one that switches three and four. I want you to convince yourself that when you multiply these together, you end up with this permutation. Now, figuring out how to factor a permutation matrix in general gets a little more complicated, but if you're familiar with permutation multiplication, you can use those same techniques to get the factorization for these permutation matrices.
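The whole worked example above, row permutation by PA, column permutation by AP, and the factorization into the three transpositions named in the lecture, can be sketched as follows (the helper function `swap` is my own, purely illustrative):

```python
import numpy as np

# The permutation matrix and the example matrix A from the lecture.
P = np.array([[0, 1, 0, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 0, 1, 0]])
A = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8],
              [4, 3, 2, 1],
              [6, 7, 8, 5]])

# Left multiplication permutes the ROWS of A the same way the rows
# of I were permuted to form P: second, fourth, first, third.
print(P @ A)
# [[5 6 7 8]
#  [6 7 8 5]
#  [1 2 3 4]
#  [4 3 2 1]]

# Right multiplication permutes the COLUMNS of A instead.
print(A @ P)

def swap(i, j, n=4):
    """Interchange elementary matrix swapping rows i and j (1-indexed)."""
    E = np.eye(n, dtype=int)
    E[[i - 1, j - 1]] = E[[j - 1, i - 1]]
    return E

# The factorization from the lecture: swap rows 1 & 3, then 2 & 3,
# then 3 & 4, multiplied left to right, reproduces P.
print(np.array_equal(swap(1, 3) @ swap(2, 3) @ swap(3, 4), P))   # True
```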
And that's gonna be our introduction to permutation matrices.