So, we need some notation. Lambda is an eigenvalue of A; A_lambda is the algebraic multiplicity of lambda; k is the size of the largest Jordan block corresponding to lambda; n_i is the number of Jordan blocks of size i corresponding to lambda; and r_j is the rank of (A - lambda I)^j, for j = 1, 2, and so on. Okay, so this is some notation, and for the moment just bear with me; I will outline the procedure and then you will see why we need all this notation. So, the following proposition. Yeah? Sir, can a particular eigenvalue have Jordan blocks of different sizes? Yes. Okay, sir. You could have multiple Jordan blocks associated with a given eigenvalue, and it is not necessary that all the blocks associated with that eigenvalue be of the same size. The easiest way to see things like this is to actually write out some Jordan matrices; such a matrix is already in Jordan form, so it is already similar to a Jordan matrix. For example, I can write the 2 cross 2 Jordan block with 2's on the diagonal and a 1 above, associated with the eigenvalue 2, then put one more 1 cross 1 block for the eigenvalue 2 below it, and fill in zeros everywhere else. This gives a 3 cross 3 matrix which is already in Jordan form. It has only one distinct eigenvalue, and that eigenvalue equals 2. Corresponding to the eigenvalue 2 there are two Jordan blocks: the first is of size 2 cross 2, the second of size 1 cross 1. So, the algebraic multiplicity of the eigenvalue 2 is 3; it occurs three times as a root of the characteristic polynomial. And the geometric multiplicity of the eigenvalue 2 is 2: you can find two linearly independent eigenvectors corresponding to the eigenvalue 2.
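To make this concrete, here is a small numerical check of the example just described. This is my own numpy sketch, not part of the lecture; the matrix is the 3 cross 3 Jordan matrix from the example.

```python
import numpy as np

# The 3x3 Jordan matrix from the example: a 2x2 Jordan block and a
# 1x1 block, both for the eigenvalue 2.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])

lam = 2.0
n = A.shape[0]

# Algebraic multiplicity: how many times lam occurs as a root of the
# characteristic polynomial (all three eigenvalues here are 2).
alg_mult = int(np.sum(np.isclose(np.linalg.eigvals(A), lam)))

# Geometric multiplicity: dimension of the eigenspace of lam, i.e.
# n minus the rank of A - lam*I.
geo_mult = n - int(np.linalg.matrix_rank(A - lam * np.eye(n)))

print(alg_mult, geo_mult)  # 3 2
```

As expected, the algebraic multiplicity is 3 while there are only two independent eigenvectors, one per Jordan block.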
So, you could take this matrix and try to find a basis for the eigenspace of the eigenvalue 2, and the nice thing about these Jordan blocks is that if you just try it for a couple of matrices, you will realize that you can write it out quite easily. Hello, sir. Yeah. Sir, previously you told us that once we know the algebraic and geometric multiplicities, we can directly write the Jordan form. Yes. But the sizes are not necessarily the same. Say the algebraic multiplicity is 4 and the geometric multiplicity is 2. Right, so the thing is that there could be multiple blocks here. If the algebraic multiplicity is 4, one block could be 3 cross 3 and the other 1 cross 1. Or you could have two blocks which are both of size 2 cross 2. Yes. Yeah. So, the procedure that I am going to describe will help you figure out exactly which case it is. I agree with you that it is not sufficient to know the algebraic and geometric multiplicity of every eigenvalue; you also need to know the sizes of the blocks. Okay. And that is actually where this n_i will enter the picture. You need to know all of these to write out the Jordan canonical form, and we will outline a procedure to determine all of them. Okay. So, here is a proposition which will tell us how to determine the Jordan canonical form. It has several parts. Point one is that A_lambda = n_1 + 2 n_2 + ... + k n_k. Okay. Now, I must point out that all these definitions are for a particular eigenvalue. I am fixing an eigenvalue lambda of A. For that eigenvalue, A_lambda denotes the algebraic multiplicity of lambda, and k is the size of the largest block corresponding to lambda. Strictly, I should be writing k_lambda here, but just to keep the notation light, I am calling it k. Keep in mind that k is going to be different for different eigenvalues of A.
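The ambiguity raised in this exchange (algebraic multiplicity 4, geometric multiplicity 2, but two possible block structures) can be seen numerically. The following is my own illustrative sketch, not from the lecture; the ranks of powers of J - lambda I are what tell the two cases apart:

```python
import numpy as np

# Two Jordan matrices for the eigenvalue 2, both with algebraic
# multiplicity 4 and geometric multiplicity 2 (two blocks each),
# but with different block sizes: 3+1 versus 2+2.
J31 = np.array([[2, 1, 0, 0],
                [0, 2, 1, 0],
                [0, 0, 2, 0],
                [0, 0, 0, 2]], dtype=float)  # blocks of size 3 and 1
J22 = np.array([[2, 1, 0, 0],
                [0, 2, 0, 0],
                [0, 0, 2, 1],
                [0, 0, 0, 2]], dtype=float)  # two blocks of size 2

lam, n = 2.0, 4
results = []
for J in (J31, J22):
    N = J - lam * np.eye(n)
    r1 = int(np.linalg.matrix_rank(N))      # same for both: 2
    r2 = int(np.linalg.matrix_rank(N @ N))  # differs: 1 versus 0
    results.append((r1, r2))

print(results)  # [(2, 1), (2, 0)]
```

The first rank agrees for both matrices (so the geometric multiplicity, n - r1 = 2, cannot distinguish them), but the rank of the square does; this is exactly what the r_j's in the proposition capture.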
Similarly, n_i is the number of Jordan blocks of size i corresponding to lambda. Ideally, I should be writing n_{i, lambda}, but just to keep the notation light I am calling it n_i; keep in mind that it is associated with a particular eigenvalue. Similarly, r_j is the rank of (A - lambda I)^j, for j = 1, 2, and so on, and this also depends on the eigenvalue lambda that I am fixing here. Ideally, I should be using r_{j, lambda}, but to keep the notation light, again, I am omitting the lambda. So, point one is not difficult to see. There are n_1 blocks of size 1 corresponding to lambda, n_2 blocks of size 2 corresponding to lambda, and so on, up to n_k blocks of size k, where k is the size of the largest block; a block of size i contributes i to the multiplicity. So, if you take the sum of all these contributions, i times n_i, that must equal the algebraic multiplicity of lambda. The second point is that r_j = n - A_lambda for j greater than or equal to k, and r_j is strictly greater than n - A_lambda for j less than k. What that means is that if I start at j = 1 and look at the rank of A - lambda I, I will get some number strictly bigger than n - A_lambda. If I take j = 2, again I get a number strictly bigger than n - A_lambda. But when I hit k, r_j becomes equal to n - A_lambda. So, it starts at a number bigger than n - A_lambda and keeps decreasing as I take higher and higher powers; at j = k it hits n - A_lambda, and then it stays there. We will discuss this more later, but for now, just keep in mind that r_j is a decreasing sequence: it starts somewhere, keeps decreasing until it hits n - A_lambda, and stays equal to n - A_lambda for all j bigger than or equal to k. So, point three.
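The behavior of the sequence r_j just described can be checked directly. This is my own numpy sketch, using a 4 cross 4 example with one 3 cross 3 block and one 1 cross 1 block (so n = 4, A_lambda = 4, n - A_lambda = 0, and k = 3):

```python
import numpy as np

# One 3x3 Jordan block and one 1x1 block, both for lam = 2.
A = np.array([[2, 1, 0, 0],
              [0, 2, 1, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 2]], dtype=float)
lam, n = 2.0, 4

# r_j = rank((A - lam I)^j) for j = 1, 2, 3, 4.
B = A - lam * np.eye(n)
ranks = [int(np.linalg.matrix_rank(np.linalg.matrix_power(B, j)))
         for j in range(1, 5)]

print(ranks)  # [2, 1, 0, 0]
```

The sequence strictly decreases while j < k and then stabilizes at n - A_lambda = 0 from j = k = 3 onwards, exactly as point 2 states.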
You can actually say exactly what r_j will be for j less than k, and that is this third point here. So, r_k = n - A_lambda, and r_{k-1} = n_k + (n - A_lambda). Here n_k is the number of Jordan blocks of size k, and k is the size of the largest block corresponding to lambda. So, r_{k-1} is strictly bigger than n - A_lambda; it is bigger by exactly this value n_k. Next, r_{k-2} = 2 n_k + n_{k-1} + (n - A_lambda). Now, n_k is always at least equal to 1, because by definition, when I say k is the size of the largest block, I mean that there must be at least one block of size k. But n_{k-1} need not be at least 1; it could even be equal to 0. Still, because of the 2 n_k term, r_{k-2} is strictly bigger than r_{k-1}, and so on. I will write one more to show you the pattern: r_{k-3} = 3 n_k + 2 n_{k-1} + n_{k-2} + (n - A_lambda), and so on, down to r_1 = (k-1) n_k + (k-2) n_{k-1} + ... + 2 n_3 + n_2 + (n - A_lambda). The proof of this proposition is a bit detailed; I may do it in the next class. But the way it goes is this: the proof proceeds by looking at the powers (J - lambda I)^j. The Jordan form J has all the eigenvalues of the matrix A along its diagonal. So, when I form J - lambda I, it kills the diagonal entries where this particular eigenvalue appears, while all the others keep some nonzero value along the diagonal. And wherever the diagonal entry has become 0, those are nilpotent Jordan blocks, and when I start taking higher and higher powers, those blocks start becoming equal to 0.
And so, basically we exploit the fact that if A is similar to J, then A - lambda I is similar to J - lambda I, which in turn implies that (A - lambda I)^j is similar to (J - lambda I)^j, and so their ranks are equal. These are the essential ideas of the proof; maybe next time I will walk you through it in full. For now I want to tell you how this proposition can be used to determine the Jordan canonical form. So, given A, the first step is to find A_lambda, the algebraic multiplicity of every eigenvalue of the matrix A; for this you need to solve the characteristic polynomial. Then find r_j = rank of (A - lambda I)^j for every j and for every lambda. Again, this might seem like a lot of work because you have to go over every j, but keep in mind that there is some number k beyond which this rank stops changing: it becomes n - A_lambda and stays there; it will not change after that. So, you just need to keep going until you see that the rank has stopped decreasing. Once you do that, you can find k, which is the least j such that r_j = n - A_lambda. That is the largest power to which you ever need to raise A - lambda I: once r_j equals n - A_lambda, any higher power will also have rank n - A_lambda. This is also done for every lambda. Then use point 3 of the proposition to find n_k, n_{k-1}, and so on, down to n_2. So, if I can scroll up here: we know r_k = n - A_lambda, and r_{k-1} is what we just determined by finding the rank of (A - lambda I)^{k-1}, which equals n_k + (n - A_lambda). So, we know r_{k-1} and we know n - A_lambda, hence we can find what n_k is, and once we know what n_k is, we can substitute that in here.
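The whole procedure (compute the r_j's until they stabilize, then solve for the n_i's) can be sketched in code. This is my own sketch, not from the lecture; instead of the step-by-step back-substitution, it uses the equivalent closed form n_i = r_{i-1} - 2 r_i + r_{i+1}, which follows from the formulas in point 3 once you set r_0 = n:

```python
import numpy as np

def jordan_block_counts(A, lam, tol=1e-10):
    """For the eigenvalue lam of A, return {block size i: n_i}."""
    n = A.shape[0]
    B = A - lam * np.eye(n)
    # r_0 = n; compute r_1, r_2, ... until the rank stops decreasing,
    # which happens exactly when j reaches k.
    r = [n]
    M = np.eye(n)
    while True:
        M = M @ B
        r.append(int(np.linalg.matrix_rank(M, tol=tol)))
        if r[-1] == r[-2]:
            break
    k = len(r) - 2  # the last j at which the rank was still dropping
    # Closed form implied by point 3: n_i = r_{i-1} - 2 r_i + r_{i+1}.
    return {i: r[i - 1] - 2 * r[i] + r[i + 1] for i in range(1, k + 1)}

# Example: one 3x3 block and one 1x1 block for lam = 2.
A = np.array([[2, 1, 0, 0],
              [0, 2, 1, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 2]], dtype=float)
print(jordan_block_counts(A, 2.0))  # {1: 1, 2: 0, 3: 1}
```

Running this once per eigenvalue gives the number of blocks of every size, which, as described next, determines the Jordan form completely.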
We know r_{k-2} and we know n - A_lambda, so we can determine n_{k-1}, and so on, all the way down to determining n_2 from the last equation. Then you go back to point number 1, which says A_lambda equals all of this. I know n_2, n_3, up to n_k, and I know A_lambda, so I can find out what n_1 is. So, then I have determined the number of blocks of each size for every eigenvalue, and the Jordan form is completely determined. Now, one of the uses of this Jordan canonical form, as I mentioned long ago, is that you can show that any matrix is similar to its transpose. I will call it a result: any A is similar to its transpose. So, how do we use the Jordan form theorem to show this? First we note a fact about every Jordan block. Hello, sir. Yeah. So, sir, for computing the Jordan canonical form, first we have to calculate the eigenvalues using the characteristic equation and find the algebraic multiplicities. Correct. Okay, then we can use the proposition to calculate. Then we can use? This proposition. Yeah, exactly. The first step in the procedure is to find the roots of the characteristic polynomial and from that determine the algebraic multiplicity of every eigenvalue. Then, corresponding to each eigenvalue, you have to find these r_j's, you have to find k, and you have to find n_k all the way down to n_2 and n_1, and that's it. That's all you need to write out the Jordan canonical form. Yeah. Okay, sir. So, to see this result, basically I take the matrix with 1's along the anti-diagonal and 0's everywhere else. Now, an interesting thing about this matrix: what is its inverse? This matrix is actually its own inverse. Okay, and you can check that the same thing holds even if you take this matrix of order n, or whatever order you like. So, this is actually a permutation matrix: it basically flips the order of the entries of a vector.
So, if I take the matrix with rows 0, 1 and 1, 0 times the vector x1, x2, I get the vector x2, x1. Or, to make it a little clearer, if I take the 3 cross 3 matrix with rows 0, 0, 1; 0, 1, 0; 1, 0, 0 times x1, x2, x3, what I get is the vector x3, x2, x1. Okay, it flips the entries about a mirror point in the middle. If there is an even number of entries, then from x1, x2, x3, x4 you will get x4, x3, x2, x1, like that. So, it is a permutation matrix; it permutes the entries of a vector. And one property of this particular permutation matrix is that it is its own inverse. Now, if I conjugate a Jordan block by this matrix with 1's along the anti-diagonal, multiplying by it on both sides, what I get is the transpose of that block; this is something you can verify by hand by multiplying these matrices together. Applying this block by block, every Jordan matrix J is similar to its transpose J^T. So, thus, if A = S J S^{-1} is the Jordan canonical form of A, then A is similar to J, J is similar to J^T, and J^T is similar to A^T, which equals (S^T)^{-1} J^T S^T; this is just the transpose of A = S J S^{-1}. So, that means that A is similar to A^T. Okay, as a consequence, basically any matrix is similar to its transpose. And like I mentioned, this is one of those results which again is very difficult to explain intuitively: why should you be able to find an invertible matrix S such that S^{-1} A S gives you A^T, and this for any matrix A? One of the implications is that A and A^T have the same rank, because similar matrices have the same rank. That is also another way of seeing why the row rank of a matrix must be equal to its column rank. I mean, this is, yeah.
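The two key facts about the flip matrix (it is its own inverse, and conjugating a Jordan block by it gives the transpose) are easy to verify numerically. A small sketch of my own:

```python
import numpy as np

# The flip (reversal) matrix K: 1's along the anti-diagonal.
n = 3
K = np.fliplr(np.eye(n))

# K is its own inverse: K @ K = I.
print(np.allclose(K @ K, np.eye(n)))  # True

# Conjugating a Jordan block by K transposes it: K @ J @ K = J^T.
# (Since K = K^{-1}, this is a similarity transformation.)
J = np.array([[2, 1, 0],
              [0, 2, 1],
              [0, 0, 2]], dtype=float)
print(np.allclose(K @ J @ K, J.T))  # True
```

Applying this block by block shows that any Jordan matrix is similar to its transpose, which is the step used in the argument above.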
Sir, row rank equal to column rank was also not very intuitive, in that sense, as you mentioned. I couldn't hear you very well. Sir, for a matrix, row rank equal to column rank was also not quite intuitive, in that sense, as you mentioned. So, in this case the non-intuitive things are linked together. Yeah. So, at the time we did not give a proof of why the row rank must be equal to the column rank. One thing I will point out is that if you go back and carefully look at our development till now, or at least the development of the Jordan canonical form and the prerequisites needed to determine it, the point is that we have not used the fact that row rank equals column rank to come up with the Jordan canonical form. And as a consequence, it is a valid thing to say that one corollary of the result we just put down is that the row rank equals the column rank. So, this is one way to prove that the row rank equals the column rank. If in our development so far we had already used the fact that row rank equals column rank to come up with the Jordan form theorem, then this would not be a valid proof of row rank equals column rank, because you cannot prove something by assuming it is true, doing a whole bunch of steps, and then coming back and showing that it is true. So, that is not the case here. And basically, rank and eigenvalues are similarity-invariant properties, and so A and A^T have the same rank. Now, yeah. So, how do we check, like formally prove, that A and A^T have the same rank? There are other ways to show it also, but one way is to say that A and A^T are similar, and rank is a similarity-invariant property: any two similar matrices have the same eigenvalues and the same rank. Therefore, since A and A^T are similar, they must have the same rank. No, sir, I'm sorry, sir.
I meant to say: how do we show that two similar matrices have the same rank? So, these are things we have already discussed; you should just go back and look at your notes. But briefly: if B = S^{-1} A S, then B is obtained from A by multiplying on either side by invertible matrices, and multiplying by an invertible matrix does not change the rank. So, two similar matrices have the same rank. Okay, now, there is one other result I want to state, which requires another definition. I will maybe state the result, and then next time we will prove it. So, the point is like this. If P of t is a polynomial... okay, let me do the following just to keep it a little more organized. I will go up here and call the earlier statement result one. Now, here is result two; these are some uses of the Jordan canonical form. So, if P of t is a polynomial, then P of A commutes with A. This is an obvious but useful fact. What about the other way? That is, if A and B commute, can we write B = P of A for some polynomial P? We have seen that P of A commutes with A, for any polynomial. So, can I write a matrix that commutes with A as a polynomial in A? That is the question; that is the converse of this statement here. The answer is that this is not always true, okay, not in general, and the Jordan canonical form allows us to answer when it will be possible to write B = P of A. So, for example, just to show why it is not always true, take A equal to the identity matrix. Now, every matrix commutes with the identity matrix: if I take any other matrix B, B times I is the same as I times B.
But if I take any polynomial P of t and compute P of I, this is going to be P of 1 times the identity matrix: the polynomial evaluated at 1, times the identity. So, using polynomials in I, we can only generate matrices of the form alpha times the identity matrix. Hence it is not always possible to find a polynomial P such that B = P of A; so that is clear. But now the question is: when will it be possible to write a matrix that commutes with A as P of A? We are out of time for today, and we will need to introduce one other definition, that of what is known as a non-derogatory matrix. A matrix is non-derogatory if every eigenvalue has geometric multiplicity equal to 1, meaning that each distinct eigenvalue has only one Jordan block involving it. And under that condition, we will see the result about writing a matrix B that commutes with A as a polynomial in A in the next class. So, that's it for today, and we will continue on Monday.
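As a closing illustration of result two, here is a small numpy sketch of my own, using the polynomial p(t) = t^2 + 3t + 2 purely for demonstration: a polynomial in A always commutes with A, but for A = I every polynomial collapses to a multiple of the identity, so not every matrix that commutes with I can be a polynomial in I.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# p(A) for p(t) = t^2 + 3t + 2 always commutes with A.
pA = A @ A + 3 * A + 2 * np.eye(3)
print(np.allclose(A @ pA, pA @ A))  # True

# But for A = I, any polynomial in A is just p(1) * I, while the
# matrices that commute with I are ALL matrices.
I = np.eye(2)
pI = I @ I + 3 * I + 2 * I  # p(I)
print(np.allclose(pI, (1**2 + 3 * 1 + 2) * np.eye(2)))  # True: p(1) = 6
```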