Hi everyone, we're continuing our discussion of Section 6.2, "The Characteristic Polynomial," from the textbook Linear Algebra Done Openly, and in this final part of Section 6.2 I want to talk about the idea of similarity. This notion starts to explain why, among many reasons, we're interested in eigenvalues and eigenvectors in the first place. So imagine we have two square matrices A and B, both n by n. We say the two matrices are similar if there exists some non-singular n-by-n matrix P such that P A P^-1 = B. That is, we can factor the matrix B with A as a factor, sandwiched between these two non-singular factors. Of course, non-singularity is required for the matrix P; otherwise P^-1 doesn't exist. The matrices A and B themselves, however, do not have to be non-singular; they could be singular matrices. I do want to mention that this equation can be rewritten as P A = B P, but one has to be very careful about the order of operations: you can't just commute the matrices willy-nilly. So let me give you an example of similar matrices. Take the matrix A, this 3-by-3 matrix:

A = [  1   0  -7 ]
    [  5   1   2 ]
    [ -4   2   0 ]

and take this other 3-by-3 matrix B:

B = [ 15  -18  -2 ]
    [ 17  -17  -4 ]
    [  7  -22   4 ]

I claim that these two matrices are similar to each other. How do we know that? Well, to check that two matrices are similar, you have to choose this connecting matrix P, which must be non-singular, and it kind of puts them together. So let's say that P is this 3-by-3 matrix:

P = [ 1  -2  -1 ]
    [ 1  -1   0 ]
    [ 1  -4  -2 ]

This is a non-singular matrix.
In fact, I've provided its inverse matrix right here as well:

P^-1 = [  2   0  -1 ]
       [  2  -1  -1 ]
       [ -3   2   1 ]

If you want, pause the video and multiply P with P^-1, and you'll see that this gives you the 3-by-3 identity, so this is the real McCoy: a genuinely non-singular matrix. So, putting these together now, let's take P times A times P^-1. I'm not writing all the details out here, because it takes a little while to write them out, but you can see them right here. Let's first consider A P^-1. If you multiply those together, the first row with the first column gives 2 + 0 + 21 = 23. Continuing on with the first row and the second column, we end up with 0 + 0 - 14 = -14. Carry out all of the details and you end up with this matrix:

A P^-1 = [ 23  -14  -8 ]
         [  6    3  -4 ]
         [ -4   -2   2 ]

Again, pause the video if you want to double-check these calculations yourself. Then, when you multiply P by this matrix, take the first row (1, -2, -1) times the first column: we get 23 - 12 + 4 = 15. First row, second column, we know how to do this by now: -14 - 6 + 2 = -18. And if you go through this whole matrix multiplication, voila, you see that we've recaptured the matrix B: 15, -18, -2, and so on. So this shows us that A and B are similar, which we often denote A ~ B. They're similar to each other, so there's a relationship; these matrices are connected in some way.
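If you'd rather not grind through the arithmetic by hand, the check above is easy to run numerically. Here's a minimal sketch using NumPy (my own illustration, not part of the lecture), with the matrices A, B, and P from the example:

```python
import numpy as np

# Matrices from the worked example above.
A = np.array([[ 1,  0, -7],
              [ 5,  1,  2],
              [-4,  2,  0]])
B = np.array([[15, -18, -2],
              [17, -17, -4],
              [ 7, -22,  4]])
P = np.array([[1, -2, -1],
              [1, -1,  0],
              [1, -4, -2]])

P_inv = np.linalg.inv(P)              # exists because P is non-singular
print(np.allclose(P @ A @ P_inv, B))  # True: P A P^-1 recovers B
```

The same check works for any candidate connecting matrix: compute P A P^-1 and compare it entrywise to B.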
This matrix P acts as the connector, and A and B are related to each other. The relationship that matters is this theorem: if A and B are two similar matrices, then they have the same characteristic polynomial. And if they have the same characteristic polynomial, they have the same eigenvalues, and those eigenvalues show up with the same algebraic multiplicities. This is actually a very, very useful result. The idea comes from looking at the characteristic polynomial of A, det(A - λI). You can multiply it by det(P) and det(P^-1):

det(A - λI) = det(P) · det(A - λI) · det(P^-1)

I should note here that det(P) and det(P^-1) are reciprocals: when you multiply them together, you get the number one, which is why equality is preserved. Then, by the multiplicative property of determinants, this is the same thing as

det( P (A - λI) P^-1 ) = det( P A P^-1 - P (λI) P^-1 ).

Now, P A P^-1 is none other than the matrix B, since A and B are similar. And λI commutes with any matrix, so in the second term you can bring it past P, and P P^-1 collapses to the identity, leaving just λI. So this whole thing simplifies to det(B - λI): if two matrices are similar, they have the exact same characteristic polynomial. Now, it doesn't work the other way around: if two matrices have the same characteristic polynomial, that does not necessarily mean they are similar.
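As a quick numerical sanity check on this theorem (again just a NumPy sketch of my own, not from the lecture): np.poly applied to a square matrix returns the coefficients of its characteristic polynomial, so we can compare the coefficients for the A and B from the example directly.

```python
import numpy as np

A = np.array([[ 1,  0, -7],
              [ 5,  1,  2],
              [-4,  2,  0]])
B = np.array([[15, -18, -2],
              [17, -17, -4],
              [ 7, -22,  4]])

# np.poly uses the convention det(lambda*I - A); since A and B are similar,
# the coefficient lists should match up to floating-point error.
print(np.allclose(np.poly(A), np.poly(B)))  # True
```

Equal coefficient lists mean identical characteristic polynomials, hence identical eigenvalues with identical algebraic multiplicities.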
There are other things to be considered there. But similar matrices will have the same characteristic polynomial, which means they will have the exact same eigenvalues, which is very nice. I do want to caution you, though: they will not necessarily have the same eigenvectors. The eigenvectors might change as you go from one similar matrix to another, but the eigenvalues will be the same, and it turns out the change of eigenvectors has a lot to do with this connecting matrix P. We'll talk more about this in the next section of the textbook, on diagonalization. Now, because the eigenvalues are the same, some other things are going to be the same about the matrices as well. If two matrices are similar, they have the same eigenvalues, like we said, showing up with the same algebraic multiplicities, and with the same geometric multiplicities as well. But they'll also have the same determinant. The main reason behind that is a very interesting observation: the determinant of a matrix, say det(A), is just the characteristic polynomial evaluated at zero. And if you evaluate the characteristic polynomial at zero, you get the constant term of that polynomial, and the constant term of det(A - λI) equals the product of its roots. So the determinant is equal to the product of the eigenvalues. Determinants and eigenvalues are very deeply connected to each other, which is pretty cool.
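That determinant observation is also easy to test numerically; here's a small NumPy sketch (my own illustration) using the matrix A from earlier:

```python
import numpy as np

A = np.array([[ 1,  0, -7],
              [ 5,  1,  2],
              [-4,  2,  0]])

eigenvalues = np.linalg.eigvals(A)   # may come back complex
# det(A) equals the product of the eigenvalues, up to floating-point error.
print(np.isclose(np.prod(eigenvalues), np.linalg.det(A)))  # True
```

Complex eigenvalues, if any, come in conjugate pairs for a real matrix, so their product is (numerically) real and matches the determinant.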
Another thing that's even more stellar: if two matrices are similar, they'll have the same trace. Remember, the trace is the sum of the diagonal entries of a square matrix. This one is a little bit harder to convince yourself of, but the trace is actually equal to the sum of the eigenvalues of the matrix. So if they have the same eigenvalues, they'll have the same trace. And yeah, that's kind of a phenomenal observation. These things we're talking about (eigenvalues, determinants, traces) are all something called invariants: quantities, properties of a matrix, that do not vary if you switch from one matrix to a similar matrix. So if you want to show that two matrices are not similar, what you want to show is that they differ on an invariant, like their eigenvalues, their determinants, or their trace. The trace is one of the easiest ones to compute, so I often look at that one first. Take this next example pair of matrices A and B here on screen: the trace of A works out to 1 - 2 = -1, while the trace of B turns out to be 3 + 2 = 5. And since the two matrices have different traces, -1 over here and 5 over here, that implies these matrices are not similar. So look at the traces of the matrices to see whether they're not similar; look at the determinant; look at the eigenvalues. Some other invariants: similar matrices will have the same rank. Rank is an invariant of similarity.
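These invariant checks are one-liners in code. Here's a hedged NumPy sketch (the 2-by-2 matrices C and D are hypothetical stand-ins for the on-screen example, chosen only so their traces are -1 and 5):

```python
import numpy as np

A = np.array([[ 1,  0, -7],
              [ 5,  1,  2],
              [-4,  2,  0]])
B = np.array([[15, -18, -2],
              [17, -17, -4],
              [ 7, -22,  4]])

# The trace equals the sum of the eigenvalues.
print(np.isclose(np.sum(np.linalg.eigvals(A)), np.trace(A)))  # True

# Invariant test: different traces rule out similarity immediately.
C = np.array([[1, 0], [0, -2]])   # trace -1 (hypothetical example)
D = np.array([[3, 0], [0,  2]])   # trace  5
print(np.trace(C) != np.trace(D))  # True, so C and D cannot be similar

# Rank is another invariant: similar matrices always share it.
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B))  # True
```

Note the direction of the logic: differing on any invariant proves non-similarity, but matching on every invariant you happen to check does not prove similarity.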
This also means they'll have the same nullity; you can look at the nullity of the matrix. Those are some other invariants you should be aware of. So in the homework, for example, if you're trying to show that two matrices are not similar, you look at invariants and check that the matrices differ on one of them. And this is a very different thing from how we showed that two matrices were similar. Like we had over here: two matrices are similar if you can find this connecting matrix. But what if we can't find a connecting matrix? Is that because it doesn't exist, or maybe because we're bad at finding it? What about Bigfoot? For all I know, Bigfoot is still running around the forests of California somewhere. People have been looking for Bigfoot and haven't found him, but maybe Bigfoot is a warlock who uses magic to conceal his location. I don't know; there could be reasons why we can't find him. Just because we can't find something doesn't mean it doesn't exist; it just means that maybe we're bad at finding it. So we need a different argument for why two matrices are not similar, and this idea of invariants is the strategy to use: showing that two matrices vary on an invariant means they're not similar, as the name "invariant" seems to suggest. And that actually brings us to the end of our discussion here in 6.2 about characteristic polynomials. In this part of the video we talked about similarity of matrices and such. I do hope you've been enjoying these videos and learning a lot. If you'd like to see more, feel free to hit the like button or the subscribe button, and leave a comment if you have any questions. Stay tuned for Section 6.3 about diagonalization, which will talk some more about this notion of similarity. I will see you all then. Take care, everyone!