Another way of decomposing a matrix is through the QR decomposition, and it is very useful in many scenarios. The QR decomposition theorem says the following. Suppose you are given a matrix A of size n × m, so it need not be square, with n ≥ m; it is a tall matrix. Then there exists a Q ∈ ℂ^{n×m} with orthonormal columns and an upper triangular R of size m × m such that A = QR. If m = n, then Q is unitary: of course, it has orthonormal columns and it is square, so it must be unitary. The last part is that if, in addition, A is non-singular, then R can be chosen such that its diagonal entries are all strictly positive, meaning they are real and positive. Keep in mind that you cannot compare a complex number to 0 and ask whether it is greater or less than 0; you cannot order complex numbers, but on the real line you can order things. So when I say R can be chosen such that all its diagonal entries are greater than 0, what I really mean is that I can choose R such that all its diagonal entries are real and positive. In this case, Q and R are unique.

I won't prove this theorem, but it is actually a direct consequence of the Gram-Schmidt orthogonalization process. Essentially, all you do is run Gram-Schmidt on the columns of A, and you will see that the coefficients you compute along the way can be arranged in the form of an upper triangular matrix R. That's how it goes, so I won't write the proof out.

One important use of the QR decomposition is to calculate eigenvalues. Recall that the way to find eigenvalues is to first write out the characteristic polynomial and then find the roots of this nth-order polynomial; for a general n × n matrix there is no simple procedure to find the roots when n > 2. For n = 2 it is a quadratic, and we can write the roots of a quadratic in closed form, but for n > 2 we cannot write the roots in closed form and have to use some numerical root-finding algorithm. The procedure below is called the QR algorithm, and it is useful for finding the eigenvalues of a matrix. Keep in mind that the QR decomposition by itself does not reveal the eigenvalues of a matrix; in particular, the diagonal entries of R are not the eigenvalues of A. But we can use this decomposition, in this algorithm, to find the eigenvalues of A.

So let the matrix A = A_0 ∈ ℂ^{n×n} be given. It is a square matrix now, because I am trying to show you how the QR decomposition can be used to find eigenvalues, and eigenvalues are defined only for square matrices. First compute a QR decomposition of A_0; I will state that as: write A_0 = Q_0 R_0. Then compute the matrix A_1 = R_0 Q_0; all I am doing is taking the factors of the QR decomposition I just computed and multiplying them in the reversed order R_0 Q_0. Now compute the QR decomposition of this A_1, say A_1 = Q_1 R_1, which is another QR decomposition step, and then compute A_2 = R_1 Q_1, and so on. The pattern at the k-th step is: write A_k = Q_k R_k (a QR decomposition step), and then compute A_{k+1} = R_k Q_k. This is the algorithm.
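Here is a minimal NumPy sketch of this iteration. The function name qr_algorithm and the fixed iteration count are illustrative choices; a practical implementation would add shifts and a proper convergence test.

```python
import numpy as np

def qr_algorithm(A, num_iters=500):
    """Unshifted QR iteration: A_{k+1} = R_k Q_k (illustrative sketch)."""
    Ak = np.array(A, dtype=complex)
    for _ in range(num_iters):
        Q, R = np.linalg.qr(Ak)  # QR step: A_k = Q_k R_k
        Ak = R @ Q               # reversed product: A_{k+1} = R_k Q_k
    return Ak

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
Ak = qr_algorithm(A)
print(np.diag(Ak))           # approximate eigenvalues of A
print(np.linalg.eigvals(A))  # reference values for comparison
```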
Before we proceed, one claim is that A_k is unitarily equivalent to A_0. That is easy to show. For example, A_1 = R_0 Q_0, so if I consider Q_0 A_1, that is Q_0 R_0 Q_0 = A_0 Q_0. And Q_0 is square with orthonormal columns, so it is unitary, which means Q_0 A_1 Q_0^H = A_0 Q_0 Q_0^H = A_0; that is, A_1 is unitarily equivalent to A_0, and so on, so A_k is unitarily equivalent to A_0. The algorithm therefore gives you a sequence of matrices that are all unitarily equivalent to A_0.

What one can show — again, I am not going to prove it here; it is one of the properties of this algorithm — is that under certain conditions, e.g. when the eigenvalues of A_0 have distinct absolute values, the QR iterates A_k converge to an upper triangular matrix as k → ∞. Since unitary equivalence is in particular a similarity, this upper triangular limit has the same eigenvalues as A_0, and the eigenvalues of an upper triangular matrix are its diagonal entries; so the diagonal entries of A_k, as k → ∞, are the eigenvalues of A_0. This is one more numerical recipe one can use to find the eigenvalues of a matrix A.

Now that we have started discussing factorizations, we will discuss what are known as canonical forms. These are basically forms — processes, really — by which we reduce a matrix down to a simpler form. The motivation is this basic question: when are two matrices similar? We know that similar matrices have the same trace, the same determinant, the same eigenvalues, the same characteristic polynomial. But it is possible to find matrices that are not similar to each other yet have the same trace, determinant, eigenvalues and characteristic polynomial (the small check after this passage gives the standard example). So it is still not clear how we would verify that two matrices are actually similar. If you can find a matrix S such that S^{-1} A S = B, then great, you are lucky: you found this matrix, so you know that A and B are similar. But if you are not able to find that matrix, how do you prove or disprove that the two matrices are similar?

One possible approach to determine similarity is to try to reduce both matrices down to some simple form, for example a diagonal form, and then see whether these diagonal forms are the same up to, possibly, a permutation of the diagonal entries. If you are able to reduce both matrices down to a diagonal form and check that the two diagonal forms are the same, then you know that the two matrices are similar. This is what we call canonical forms: reducing a matrix down to its simplest form, which then allows us to test for properties like similarity. So if you could reduce all matrices to diagonal matrices, that would work. But the problem is that not every matrix is diagonalizable, so we have an existence problem: if two matrices are both non-diagonalizable, it is difficult to know whether those matrices are similar or not.
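Here is a small NumPy check of the standard example behind the claim above: the 2 × 2 nilpotent Jordan block and the zero matrix share trace, determinant, eigenvalues and characteristic polynomial, yet they are not similar, since S^{-1} 0 S = 0 for every invertible S.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.zeros((2, 2))

# Identical similarity invariants...
print(np.trace(A), np.trace(B))                    # 0.0 0.0
print(np.linalg.det(A), np.linalg.det(B))          # 0.0 0.0
print(np.linalg.eigvals(A), np.linalg.eigvals(B))  # [0. 0.] [0. 0.]

# ...yet A and B are not similar: the only matrix similar to the zero
# matrix is itself. Rank, which similarity preserves, also differs.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # 1 0
```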
Now, an alternative could be to use Schur's theorem, which allows you to reduce a matrix to an upper triangular form, and then try to compare these upper triangular forms. But in the upper triangular form you obtain from Schur's theorem, the diagonal entries can potentially appear in any order, and two upper triangular matrices with the same diagonal entries but different off-diagonal entries can still be similar. So essentially Schur's theorem is insufficient to determine whether or not two matrices are similar.

Now, if we search for an upper triangular form that is as close to being diagonal as possible but is still attainable for every matrix, the form we arrive at is called the Jordan canonical form. The Jordan canonical form involves a set of almost-diagonal matrices, called Jordan matrices, and in fact, if the matrix is diagonalizable, the Jordan canonical form returns a diagonal matrix; so in some sense it is a generalization of diagonalizability. These Jordan matrices include the diagonal matrices, and the punch line is that every equivalence class of square complex matrices under similarity includes a Jordan matrix, and any two Jordan matrices in the same equivalence class are the same in a very trivial way: we will be able to look at the Jordan forms and say, yes, these are the same, or, these are different.

So the main result we will discuss next is that every complex square matrix is similar to an essentially unique Jordan matrix. By essentially unique I mean the following: these Jordan matrices have a block diagonal structure, the blocks are called Jordan blocks, and the only freedom allowed is a permutation of those blocks; other than that, the Jordan matrix is unique. This is the Jordan canonical form, which reduces a matrix to an almost-diagonal matrix called a Jordan matrix. And, as I said — let me write that down, it is an important point — the Jordan matrices of a pair of matrices in the same similarity equivalence class are the same in a trivial way, meaning that only the blocks in the block diagonal structure may be exchanged; otherwise they are the same. So it is very easy to check whether two Jordan matrices are the same or different.

In order to talk about the Jordan canonical form, I need to introduce a couple of definitions. The first is a nilpotent matrix: A ∈ ℂ^{n×n} is said to be nilpotent if A^k = 0 for some positive integer k, and the smallest positive k for which this happens is called the index of the nilpotent matrix. Of course, if A^k = 0, then A^{k+1}, A^{k+2} and so on are all equal to 0. For example, if A = [0 1; 0 0], then A^2 is the all-zero matrix, so we say A is nilpotent of index 2. More generally, if N is the n × n matrix with ones on the superdiagonal and zeros elsewhere, then N^n = 0 while N^{n-1} ≠ 0, meaning that N is nilpotent of index n. The Jordan canonical form theorem later will say that every matrix is similar to a matrix of the form D + N, where D is a diagonal matrix and N is a nilpotent matrix; that is what we are going towards.

A question from the class: what is the superdiagonal? The superdiagonal entries are the entries just above the main diagonal; similarly, the entries just below the main diagonal form the subdiagonal.
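A quick NumPy check of this nilpotency claim for n = 4; each multiplication pushes the ones one superdiagonal higher, until the n-th power vanishes.

```python
import numpy as np

n = 4
# n x n matrix with ones on the superdiagonal, zeros elsewhere
N = np.diag(np.ones(n - 1), 1)

P = np.eye(n)
for k in range(1, n + 1):
    P = P @ N
    print(f"N^{k} is zero: {np.allclose(P, 0)}")
# Prints False for k = 1, 2, 3 and True for k = 4:
# N is nilpotent of index n.
```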
If you have ones on the superdiagonal and you square the matrix — you can compute it by hand, it is easy — the ones move to the second superdiagonal. If you take the third power, they move to the third superdiagonal, then the fourth, the fifth, and so on; eventually you get a matrix that is all zeros except the top-right entry, which equals 1, and multiplying by the matrix one more time gets rid of everything and you get the zero matrix.

The k × k Jordan block J(λ), with λ a complex number, is the following matrix: it has λ on the diagonal, ones on the first superdiagonal, and zeros everywhere else, and it is of size k × k. This matrix is called the k × k Jordan block with eigenvalue λ, and of course when k = 1, J(λ) is just equal to λ. We will also sometimes write J_k(λ) when we want to indicate the size of the matrix; we will use both notations, and hopefully that won't be too confusing — here I am not using the subscript k. So J_1(λ), or J(λ) in the 1 × 1 case, is just the scalar λ, and in the k × k case I will write it this way: J_k(λ) = λ I_k + N, where I_k is the k × k identity matrix and N is a nilpotent matrix of index k, namely the all-zero matrix with ones on the superdiagonal. (A small construction along these lines is sketched below.)

We have the following theorem which, since we have only one minute left, I will leave for next time; but to pique your interest a little: the Jordan form theorem will say that every A ∈ ℂ^{n×n} is similar to a block diagonal matrix diag(J_{n_1}(λ_1), ..., J_{n_r}(λ_r)) — a Jordan matrix — where each J_{n_i}(λ_i) is a Jordan block of size n_i × n_i corresponding to an eigenvalue λ_i of A. This is what we will state and prove in the next class; we will stop here for today.
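As a concrete illustration of J_k(λ) = λ I_k + N, here is a minimal NumPy construction; the helper name jordan_block is just for illustration.

```python
import numpy as np

def jordan_block(lam, k):
    """Build the k x k Jordan block J_k(lam) = lam * I_k + N."""
    N = np.diag(np.ones(k - 1), 1)  # nilpotent part: ones on the superdiagonal
    return lam * np.eye(k) + N

J = jordan_block(3.0, 4)
print(J)
# The nilpotent part N = J - 3 I has index 4: N^3 != 0 but N^4 = 0.
N = J - 3.0 * np.eye(4)
print(np.allclose(np.linalg.matrix_power(N, 3), 0))  # False
print(np.allclose(np.linalg.matrix_power(N, 4), 0))  # True
```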