Hi everyone, we're still in section 6.3 about diagonalization in the textbook Linear Algebra Done Openly, and now we're actually ready to define what this word "diagonalizable" even means. As usual in this chapter, we have a square matrix, so say A is an n by n matrix. We say that A is diagonalizable if it is similar to a diagonal matrix, which specifically means that there exists some invertible matrix P and some diagonal matrix D such that P D P inverse is equal to A. We had talked about similar matrices previously; you can look at the link if you want to see more about those. But a matrix is diagonalizable if it is similar to a diagonal matrix. Let's look at an example of such a thing. Take this 2 by 2 matrix A right here: it'll be the matrix with rows 7, 2 and negative 4, 1. We'll use the non-singular matrix P with rows 1, 1 and negative 1, negative 2. This is in fact a non-singular matrix; you can see its proposed inverse right here, with rows 2, 1 and negative 1, negative 1, and I can verify for us that these are in fact inverses of each other. It's not too hard of a calculation; don't forget the negative signs there in 2, 1, negative 1, negative 1. If we go through the multiplication, we get 2 minus 1, which is 1; then 1 minus 1, which is 0; then negative 2 plus 2, which is 0; and finally negative 1 plus 2, which is 1. That clearly gives the 2 by 2 identity, so these are in fact inverses of each other. Now compare this: we have this matrix P, and then I have this proposed diagonal matrix D.
D is the matrix with rows 5, 0 and 0, 3. I claim that A can be factored as the matrix P times D times P inverse, which you see right here. Now remember, when you multiply by a diagonal matrix on the left, you scale each of the rows by the corresponding diagonal value. So if you take this product, D times P inverse, the consequence is that you scale the first row of P inverse by 5, giving 10 and 5, and you scale the second row by 3, giving negative 3 and negative 3. So multiplication by a diagonal matrix is pretty quick and slick, and this matrix right here is none other than D P inverse. Then with the remaining two matrices you just do the matrix multiplication. First row times first column: you get 10 minus 3, which is 7. First row, second column: you get 5 minus 3, which is 2. Second row, first column: you get negative 10 plus 6, which is negative 4. And second row, second column: you get negative 5 plus 6, which is 1. So you see the product right here. Now, I did want to mention that you can go the other way around if you wanted to and multiply the other pair first. If the diagonal matrix is on the right, you still do scaling, but you scale the columns: the first column by 5 and the second column by 3. Either order is okay. So this verifies that we do in fact have a diagonalization. This is the real McCoy right here: this factorization works, and it shows that A is similar to a diagonal matrix. Now, why would someone be interested in a diagonalization?
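If you want to double-check this factorization on a computer, here's a quick sketch using NumPy (the variable names are mine, not from the lecture):

```python
import numpy as np

# The matrices from the example: A, P, and the diagonal matrix D.
A = np.array([[7.0, 2.0],
              [-4.0, 1.0]])
P = np.array([[1.0, 1.0],
              [-1.0, -2.0]])
D = np.diag([5.0, 3.0])

P_inv = np.linalg.inv(P)       # should come out to rows 2, 1 and -1, -1
reconstructed = P @ D @ P_inv  # the product P D P^{-1}

print(np.allclose(reconstructed, A))  # True: A = P D P^{-1}
```

This is just a numerical confirmation of the hand calculation above; the `@` operator is NumPy's matrix multiplication.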
There are actually a lot of benefits. I'll give you one quick application of diagonalization that can help with a sort of numerical analysis of matrices. Suppose you were to square the matrix A, the same matrix from the previous slide. Since A is equal to P D P inverse, you could actually use that: A squared is P D P inverse times P D P inverse. But you'll notice that the P inverse and the P that show up in the middle cancel out, and you end up with D squared in the middle, so A squared equals P D squared P inverse. Notice this shows you that if A is similar to D, then A squared is similar to D squared, and this is actually a common fact: if A is similar to B, then A squared is similar to B squared, A cubed is similar to B cubed, and all of the powers will be similar as well, by this same type of calculation. So why would we care about this?
Well, the thing is, squaring a diagonal matrix is a whole lot easier than squaring a regular matrix, because it's diagonal. To square a diagonal matrix, you just square the diagonal entries: 5 squared and 3 squared, which are of course 25 and 9. That's not so bad to do. Then once you've squared D, you multiply by P and by P inverse, and since D squared is still a diagonal matrix, one of those products just means scaling rows (or scaling columns, whichever direction you prefer). So if you have a diagonalization in hand, the difficulty of computing the square really just comes down to squaring real numbers and scaling, which is not so bad. It really feels like one matrix product with a couple of multiplications along the way. Now, if you're just squaring, that might not seem like a huge benefit. But what if you have a large power of A? What if the k here is, say, the 17th power? If you wanted to do A times A times A, 17 factors of A, that's going to add up: you'd have to do 16 matrix multiplications. But if you have a diagonalization in hand, you only have to do exponents of real numbers, which is not so bad, and then scale the rows of a matrix, also not so bad, and so you actually only have to do one full-blown matrix multiplication. If you think of matrix multiplication as an expensive operation, it turns out that having a diagonalization can dramatically simplify computing powers of matrices.
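Here's a small sketch of that idea in NumPy: compute A to the 17th power the fast way, as P D^17 P inverse, and compare it against repeated matrix multiplication (the names `A_k_fast` and `A_k_slow` are my own):

```python
import numpy as np

A = np.array([[7.0, 2.0], [-4.0, 1.0]])
P = np.array([[1.0, 1.0], [-1.0, -2.0]])
P_inv = np.linalg.inv(P)

k = 17
# With the diagonalization, A^k = P D^k P^{-1}, and D^k just means
# raising each diagonal entry (5 and 3) to the k-th power.
Dk = np.diag([5.0**k, 3.0**k])
A_k_fast = P @ Dk @ P_inv

# Compare against repeated matrix multiplication of A with itself.
A_k_slow = np.linalg.matrix_power(A, k)
print(np.allclose(A_k_fast, A_k_slow))  # True
```

The fast route does two real-number exponentiations, one row scaling, and one genuine matrix product, exactly as described above.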
This is of course only if you have a diagonalization in hand, and this is just one sort of numerical benefit; there are other benefits, both computational and theoretical, of diagonalizations. But let's try to explore a little bit how one would find a diagonalization of a matrix. I want you to notice the following observations. We had our original matrix A, which, if you forgot it already (maybe because you saw a butterfly fly in front of your screen or something), was the matrix with rows 7, 2 and negative 4, 1, and the matrix P, again, was the matrix with rows 1, 1 and negative 1, negative 2. Focus on the first column of P. If we have a diagonalization, A equals P D P inverse, that's the same thing as saying A P is equal to P D, and we can focus on single columns. If you take the first column of P and call it x, then this tells you that A x is that column x scaled by some number, which we might call lambda: A x equals lambda x. So if this diagonalization happens, it turns out that the i-th column of P is going to be an eigenvector of the matrix A, and we can see this for our specific example.
If you take A times the vector 1, negative 1 and work out the details, the first row 7, 2 dotted with 1, negative 1 gives 5, and the second row negative 4, 1 dotted with 1, negative 1 gives negative 4 minus 1, which is negative 5. So we get 5 and negative 5, and if you factor out the 5 you're left with 1 and negative 1. This is of course 5 times that vector x, where we computed A times x. So the first column of P is an eigenvector. Same thing with the second column, 1, negative 2; let's call that vector y this time. If we take A times y, going through the multiplication, 7, 2 times 1, negative 2 gives 7 minus 4, which is 3, and the second row times the column gives negative 4 minus 2, which is negative 6. Factoring out the 3, you're left with 3 times 1, negative 2, and that's of course 3 times y. So we can see that the two columns of P are eigenvectors for this matrix A. And in fact, 5 and 3, huh, that looks familiar if you look at the diagonal matrix D. Was it not 5, 0, 0, 3? 5 and 3 are the diagonal entries of this diagonal matrix D. And remember, for a triangular matrix, and diagonal matrices are triangular (they're both upper and lower triangular), the diagonal entries are the eigenvalues; so 5 and 3 are the eigenvalues of the matrix D. And when matrices are similar, they have the same eigenvalues; we had talked about that before as well.
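The two eigenvector checks just done by hand can be sketched numerically like this (calling the columns `x` and `y`, as in the lecture):

```python
import numpy as np

A = np.array([[7.0, 2.0], [-4.0, 1.0]])
x = np.array([1.0, -1.0])   # first column of P
y = np.array([1.0, -2.0])   # second column of P

# A x should equal 5 x, and A y should equal 3 y.
print(A @ x)  # [ 5. -5.], which is 5 times x
print(A @ y)  # [ 3. -6.], which is 3 times y
```

So x is an eigenvector with eigenvalue 5 and y is an eigenvector with eigenvalue 3, matching the diagonal entries of D.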
They have the same eigenvalues, and therefore, if A is similar to a diagonal matrix, it'll have the same eigenvalues as that diagonal matrix, and those eigenvalues will be just the diagonal entries. So I want to summarize the principles we talked about in this specific example, because this works in greater generality. If we have a matrix A which is n by n, that matrix A will be diagonalizable if and only if A has n linearly independent eigenvectors. That set of n linearly independent eigenvectors is called an eigenbasis for the vector space. So it'll be diagonalizable if and only if you have enough independent eigenvectors. Okay, and if you have an eigenbasis, then you can construct your diagonalization in the following way. D will be the diagonal matrix whose diagonal entries are the eigenvalues of A: you just put the eigenvalues down the diagonal of D. Then the matrix P will be the matrix whose first column is x1, whose second column is x2, whose third column is x3, and so on up to the n-th column xn, where we have the property that A x1 is equal to lambda 1 times x1, so x1 is an eigenvector and lambda 1 is its eigenvalue; A x2 equals lambda 2 times x2, so x2 is an eigenvector of A whose eigenvalue is lambda 2; and you proceed in this pattern, ending with A xn equals lambda n times xn. So you can construct this matrix P using your eigenbasis. That's what this is right here.
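This recipe, eigenvalues down the diagonal of D and the matching eigenvectors as the columns of P, is exactly what NumPy's `np.linalg.eig` hands you. A sketch (note that `eig` scales each eigenvector to unit length, so its P won't literally match the 1, 1, negative 1, negative 2 matrix from the lecture, but any eigenbasis works):

```python
import numpy as np

A = np.array([[7.0, 2.0], [-4.0, 1.0]])

# eig returns the eigenvalues and a matrix whose i-th column is an
# eigenvector for the i-th eigenvalue -- the same correspondence as above.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# Reassemble P D P^{-1} and confirm we recover A.
print(np.allclose(P @ D @ np.linalg.inv(P), A))  # True
```

The eigenvalues come back as 5 and 3 (in some order), matching the diagonal matrix D from our example.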
You have your eigenbasis forming the columns of P, and you have your eigenvalues down the diagonal. You have to go in the same order: the eigenvalues and eigenvectors have to correspond to each other. And if you have this eigenbasis, you can construct this non-singular matrix P, and that's how we can form this diagonalization. All right, some things I want to mention here to make connections to things we've talked about in the past. If you have an eigenbasis B for the matrix A, and if E is just your standard basis for F^n, so this consists of e1, e2, up to en, then this matrix P is none other than the change of basis matrix where you switch from the eigenbasis to the standard basis, and hence its inverse will be the change of basis matrix from the standard basis to the eigenbasis. Make a connection there. Another nice result here is that if a matrix has n distinct eigenvalues, then it's always diagonalizable. That's because we saw earlier that for distinct eigenvalues, the eigenvectors will automatically be linearly independent. So if you have n distinct eigenvalues, you'll have n one-dimensional eigenspaces whose eigenvectors are all pairwise independent of each other, and so you always get an eigenbasis that way. The only time that you're not diagonalizable is if you do not have enough eigenvectors, and that could only happen if you have repeated eigenvalues. Now, that's not to say that if your eigenvalues are repeated you can't have an eigenbasis. You certainly can; we'll actually see that in the next example coming up. But you might not have enough eigenvectors if there are repeated eigenvalues.
Basically, the issue here is that you're going to be diagonalizable if and only if, for each eigenvalue, the geometric multiplicity, which is the dimension of the eigenspace, is equal to the algebraic multiplicity, which is the number of times the eigenvalue shows up as a root of the characteristic polynomial. We want this to happen for each eigenvalue; that's exactly the condition that will guarantee we have an eigenbasis and hence that we're diagonalizable. All right, so in the last part of 6.3, which we'll watch in just a moment, we'll actually see an example of how you can diagonalize a matrix. It's kind of an extensive problem, so we're going to do it all at once; maybe go to the bathroom beforehand, because it takes a little bit to do.
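To make the multiplicity condition concrete, here's a small check on a matrix that fails it. This example matrix B is my own, not from the lecture: it's upper triangular with 2 on the diagonal twice, so the eigenvalue 2 has algebraic multiplicity 2, but its eigenspace turns out to be only one-dimensional.

```python
import numpy as np

# A matrix with a repeated eigenvalue: 2 appears twice in the
# characteristic polynomial (algebraic multiplicity 2).
B = np.array([[2.0, 1.0],
              [0.0, 2.0]])

eigenvalues, vectors = np.linalg.eig(B)
print(eigenvalues)  # [2. 2.]

# The eigenspace for 2 is the null space of B - 2I.  B - 2I has rank 1,
# so that null space is 2 - 1 = 1 dimensional: geometric multiplicity 1.
geometric_multiplicity = 2 - np.linalg.matrix_rank(B - 2 * np.eye(2))
print(geometric_multiplicity)  # 1
```

Since 1 is less than 2, there is no eigenbasis here, and this B is not diagonalizable, exactly the failure mode described above.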