So the last time we were looking at the Jordan form and we discussed some of its properties. For example, it can be used to show that any matrix A is similar to its own transpose. So today we will discuss just a few more properties of the Jordan form, then link it up to convergence of matrices, and also start some discussion on polynomials and matrices. I briefly mentioned things like the minimal polynomial of a matrix; we'll discuss that some more today. Okay, so at the end of the previous class, we were discussing the following point: if P of t is any polynomial, then if I compute P of A, I get a matrix that commutes with A. That's a trivial but useful fact. Now what about the converse? That is, if I'm given a matrix B that commutes with A, can I write B as some polynomial function of A? And the answer, we saw, is that in general it is not possible. We took the example of the identity matrix and showed that it will generally not work. But we can give a more refined answer to the question by considering the following definition. A matrix A is called non-derogatory if every eigenvalue of A has geometric multiplicity equal to 1. What that means is that each distinct eigenvalue has only one Jordan block involving it. Okay, so with this definition, we have the following result. Let A in C^{n x n} be non-derogatory. Then a matrix B in C^{n x n} commutes with A if and only if there exists a polynomial P of degree at most n - 1 such that B = P(A). One direction of the proof is very simple: if there exists a polynomial P such that B = P(A), then it is clear that B and A commute. We saw that already; it is just a consequence of the fact stated above, that if P of t is a polynomial then P(A) commutes with A. So what we need to prove is really just the converse: that if B commutes with A, then there must exist a polynomial P of degree at most n - 1 such that B = P(A).
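Just to make the easy direction concrete, here is a small numerical sketch (my own illustration, not from the lecture, with an arbitrarily chosen matrix A and polynomial p(t) = 2 + 3t + t^2) checking that p(A) commutes with A:

```python
import numpy as np

# An arbitrary (illustrative) 3x3 matrix A.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# p(A) for p(t) = 2 + 3t + t^2, i.e. p(A) = 2I + 3A + A^2.
P_A = 2 * np.eye(3) + 3 * A + A @ A

# Since p(A) is built from powers of A, it commutes with A:
# A p(A) - p(A) A should be the zero matrix.
commutator = A @ P_A - P_A @ A
print(np.allclose(commutator, np.zeros((3, 3))))  # True
```

The same check works for any polynomial, since each power A^k commutes with A.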
So let us do that. Just for the sake of completeness, I note again that if B = P(A), then B commutes with A; so we need to show the converse. Okay, so we again start with the Jordan canonical form. Let A = S J S^{-1}, where J is the Jordan canonical form of this matrix A. What we need to show is that if B commutes with A, then there exists a polynomial P of degree at most n - 1 such that B = P(A). So basically the starting point is A B = B A. If I substitute for A, this gives S J S^{-1} B = B S J S^{-1}. Since S is an invertible matrix, I can multiply by S^{-1} on the left and by S on the right, and what I get is J (S^{-1} B S) = (S^{-1} B S) J, where I am just putting brackets to show that S^{-1} B S and J commute. So if A and B commute, then S^{-1} B S commutes with J. So basically, if we can show that S^{-1} B S, which is some other matrix, is equal to P(J), then B will be equal to S P(J) S^{-1}. And if you consider the polynomial expansion, you can see that this S and S^{-1} can be pulled inside the polynomial, so we can write this as P(S J S^{-1}), which is equal to P(A). So whatever polynomial we find connecting S^{-1} B S to J is the same polynomial that connects B to A. So the problem then reduces to assuming that the matrix A we had considered earlier is actually this Jordan matrix, and the matrix B we had considered is this S^{-1} B S matrix here. Okay so. Sir? Yeah. Sir, could you repeat how we went from B = S P(J) S^{-1} down to P(A)? So see, P(J) in general... you don't have to pay attention to the exact coefficients, I'm just giving you an illustration here.
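Here is a small numerical check of the reduction step just described (an editor-added sketch with arbitrarily chosen S, J, and a B that commutes with A): if B commutes with A = S J S^{-1}, then S^{-1} B S commutes with J.

```python
import numpy as np

# Illustrative choices: J a 2x2 Jordan block, S an invertible matrix.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])
S = np.array([[1.0, 0.0],
              [1.0, 1.0]])
S_inv = np.linalg.inv(S)
A = S @ J @ S_inv

# Take B to be a polynomial in A (so it certainly commutes with A).
B = 3 * np.eye(2) + 2 * A

# The transformed matrix S^{-1} B S should then commute with J.
C = S_inv @ B @ S

print(np.allclose(A @ B, B @ A))  # True: B commutes with A
print(np.allclose(J @ C, C @ J))  # True: S^{-1} B S commutes with J
```

This is exactly the conjugation argument from the lecture, done with numbers instead of symbols.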
So I can write P(J) as some a0 times the identity, plus a1 times J, plus a2 J squared, and so on; we've said the degree is at most n - 1, so the last term is a_{n-1} J^{n-1}. So if I form S P(J) S^{-1}, that is equal to a0 S I S^{-1} + a1 S J S^{-1} + a2 (S J S^{-1})(S J S^{-1}) + ..., because I can insert S^{-1} S between consecutive factors of J. So you see what happens: if I call S J S^{-1} the matrix A, then the first term is just the identity, the next is the matrix A, the next two factors together give A squared, and similarly at the end I'll get A^{n-1}. So this is nothing but P(A). That's all I'm trying to say there. Okay, so what we've shown is that it's okay to assume A is a Jordan matrix and proceed, because if we can show that whenever a Jordan matrix commutes with B we can write B as a polynomial of that Jordan matrix, then even if A were not a Jordan matrix we know how to write B as a polynomial of A: it's the same polynomial that will work. So now what we need to show is that if B J = J B, then B can be written as some polynomial of this J. Now we use the fact that A is non-derogatory, which is what we assumed in the statement of the theorem. Since A is non-derogatory and is already in Jordan form, we can write A as a block diagonal matrix with blocks J_{n1}(lambda_1), ..., J_{nk}(lambda_k) on the diagonal and zeros everywhere else, where these lambdas are distinct. So basically what we mean by non-derogatory is that each lambda occurs in only one Jordan block; the geometric multiplicity of every eigenvalue equals 1, where lambda_1 through lambda_k are, just to be clear, the distinct eigenvalues of A. Now, this is a certain partition of an n x n matrix.
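The identity S P(J) S^{-1} = P(S J S^{-1}) that this telescoping argument establishes can also be checked numerically. This is my own sketch, with an arbitrarily chosen invertible S, a 3x3 Jordan block J, and the polynomial p(t) = 1 + 2t + t^2:

```python
import numpy as np

# A 3x3 Jordan block J and a fixed invertible S (det(S) = 9).
J = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 3.0]])
S = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [1.0, 0.0, 2.0]])
S_inv = np.linalg.inv(S)
A = S @ J @ S_inv

def p(M):
    """Apply the polynomial p(t) = 1 + 2t + t^2 to a square matrix M."""
    n = M.shape[0]
    return np.eye(n) + 2 * M + M @ M

# Pulling S and S^{-1} inside the polynomial:
# S p(J) S^{-1} should equal p(S J S^{-1}) = p(A).
lhs = S @ p(J) @ S_inv
rhs = p(A)
print(np.allclose(lhs, rhs))  # True
```

The inserted S^{-1} S pairs cancel between consecutive powers, which is exactly why the same polynomial works for A and for J.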
The first diagonal block is of size n1 x n1, the next diagonal block is of size n2 x n2, and the last diagonal block is of size nk x nk. I'll consider the same partition on B, with blocks B_{ij} partitioned according to A. Now, A B = B A is what we are given: B commutes with this Jordan form matrix. So if I consider the off-diagonal blocks of A B - B A = 0, they are of the form J_{ni}(lambda_i) B_{ij} - B_{ij} J_{nj}(lambda_j). All you have to do is work out the block product. I'll just write it in short: the block diagonal matrix with J_1, ..., J_k times the block matrix with entries B_{11} through B_{1k} in the first row down to B_{k1} through B_{kk}, minus that same block matrix times the block diagonal matrix with J_1, ..., J_k, and then look at the (i, j)th block; that is what this expression reduces to. So you can see that is the case. This off-diagonal block is also equal to the zero matrix, because A B - B A = 0. And since these lambdas are assumed to be distinct, it can be shown that this being equal to 0 implies B_{ij} = 0 for i not equal to j. This is a little exercise, but I'll indicate to you how one arrives at it. For example, consider just 2 x 2 blocks, and for ease of notation I'll write lambda_1 and lambda_2 instead of lambda_i and lambda_j. So the first Jordan block, with lambda_1 on the diagonal and a 1 above it, times the corresponding block of B, which has entries B_{11}, B_{12}, B_{21}, B_{22}, minus this same block matrix times the second Jordan block associated with lambda_2, which is lambda_2, 1 on top and 0, lambda_2 below. And lambda_1 is different from lambda_2.
So if I expand this out, what I get in the first product is lambda_1 B_{11} + B_{21} and lambda_1 B_{12} + B_{22} in the first row, and lambda_1 B_{21} and lambda_1 B_{22} in the second row; and in the subtracted product, lambda_2 B_{11} and B_{11} + lambda_2 B_{12} in the first row, and lambda_2 B_{21} and B_{21} + lambda_2 B_{22} in the second row. And this difference should be equal to 0. Collecting terms, the (1, 1) entry is (lambda_1 - lambda_2) B_{11} + B_{21}; the (1, 2) entry is (lambda_1 - lambda_2) B_{12} + B_{22} - B_{11}; the (2, 1) entry is (lambda_1 - lambda_2) B_{21}; and the (2, 2) entry is (lambda_1 - lambda_2) B_{22} - B_{21}; all equal to 0. Note the minus B_{11} in the (1, 2) entry, which a student rightly pointed out; it doesn't change the conclusion, only the order of the deductions. So first, from the (2, 1) entry, (lambda_1 - lambda_2) B_{21} = 0 and lambda_1 is not equal to lambda_2, so B_{21} = 0. Plugging that into the (2, 2) entry, (lambda_1 - lambda_2) B_{22} = 0, so B_{22} = 0. Then from the (1, 1) entry, since B_{21} = 0 and lambda_1 - lambda_2 is non-zero, B_{11} = 0. And finally, from the (1, 2) entry, now that B_{22} and B_{11} are both 0, B_{12} = 0. So all the entries of this block are 0; it is the all-zero matrix. Okay, so by a similar argument, applied to the slightly more general case of blocks of sizes n_i and n_j, you can show that this expression being 0 implies that all the entries of the block B_{ij} must be 0. So B_{ij} = 0 for i not equal to j, which implies B is also a block diagonal matrix with the same block structure as J. Okay, so B is a block diagonal matrix, and I can write B accordingly. Now, from the commutativity assumption again: we haven't yet used the i = j part of the commutativity.
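The general version of this little exercise says that J_{ni}(lambda_i) X = X J_{nj}(lambda_j) with lambda_i ≠ lambda_j forces X = 0. One way to see this numerically (my own sketch, using the standard vec/Kronecker form of a Sylvester-type equation, with the illustrative eigenvalues 1 and 4) is to check that the linear map X ↦ J_1 X - X J_2 has a trivial null space:

```python
import numpy as np

# Two 2x2 Jordan blocks with distinct eigenvalues lambda_1 = 1, lambda_2 = 4.
J1 = np.array([[1.0, 1.0],
               [0.0, 1.0]])
J2 = np.array([[4.0, 1.0],
               [0.0, 4.0]])

# J1 X - X J2 = 0 written on vec(X) (column stacking):
# (I kron J1 - J2^T kron I) vec(X) = 0.
M = np.kron(np.eye(2), J1) - np.kron(J2.T, np.eye(2))

# The eigenvalues of M are the differences lambda_1 - lambda_2 = -3,
# all non-zero, so M is invertible and X = 0 is the only solution.
print(np.linalg.matrix_rank(M))  # 4 (full rank, trivial null space)
```

With lambda_1 = lambda_2 the rank would drop, which is precisely why distinctness of the eigenvalues is needed to kill the off-diagonal blocks.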
So this is the commutativity assumption, blockwise: B_i J_{ni}(lambda_i) = J_{ni}(lambda_i) B_i, and this is true for i = 1 to k. Okay, and now we'll use the form of the Jordan blocks. We can write J_{ni}(lambda_i) = lambda_i I + N_i, where N_i is the nilpotent matrix with zeros on the diagonal, ones on the first superdiagonal, and zeros everywhere else. So this is the form of the Jordan block, and it is of size n_i x n_i. Now, the identity matrix commutes with anything, so saying that B_i commutes with the Jordan block really means that B_i commutes with this N_i matrix: B_i N_i = N_i B_i, for i = 1 to k. Now this in turn implies that B_i actually has a specific form: it's not just some non-zero block, it is what is called an upper triangular Toeplitz matrix. This is also something that you can show. B_i has some b_1^{(i)} everywhere on the diagonal, b_2^{(i)} on the first superdiagonal, b_3^{(i)} on the next superdiagonal, and so on up to b_{ni}^{(i)} at the top right corner, with zeros below the diagonal; it is of size n_i x n_i. I'll call this form (star) for later use. Again, just for illustration purposes, so you see that I'm not being unreasonable here, consider the 2 x 2 case. If I take B with entries b_{11}, b_{12}, b_{21}, b_{22} and multiply it with the 2 x 2 nilpotent block N, which is 0, 1 on top and 0, 0 below, this is supposed to be equal to N times the same matrix B. If I execute these multiplications, B N has first column 0, 0 and second column b_{11}, b_{21}, while N B has first row b_{21}, b_{22} and second row 0, 0. So if I equate the entries, I see that b_{21} = 0 and b_{11} = b_{22}: the diagonal entries are equal, the subdiagonal entry is zero, and b_{12} can be anything. So B is of the form b_{11} on the diagonal, b_{12} above, and 0 below, which is exactly this upper triangular Toeplitz form. So each B_i has this kind of form.
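To see this form concretely, here is a small sketch of my own (arbitrary coefficients 5, -2, 7 for the 3 x 3 case): an upper triangular Toeplitz matrix is a polynomial in the nilpotent block N, and it therefore commutes with N.

```python
import numpy as np

# Nilpotent part N of a 3x3 Jordan block: ones on the first superdiagonal.
N = np.diag([1.0, 1.0], k=1)

# An upper triangular Toeplitz matrix b1*I + b2*N + b3*N^2:
# b1 = 5 on the diagonal, b2 = -2 on the first superdiagonal,
# b3 = 7 in the top right corner, zeros below the diagonal.
b1, b2, b3 = 5.0, -2.0, 7.0
B = b1 * np.eye(3) + b2 * N + b3 * (N @ N)

# Such a B commutes with N, hence with the full Jordan block lambda*I + N.
print(np.allclose(B @ N, N @ B))  # True
```

The converse direction, that B N = N B forces this Toeplitz pattern, is the entry-by-entry computation done above for the 2 x 2 case.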