Okay, so last time we were looking at eigenvalues and eigenvectors. The basic equation is Ax = λx, and we need a solution with x ≠ 0. You also saw that, by definition, these occur in pairs: an eigenvalue has an associated eigenvector. Further, if Ax = λx, then (A − λI)x = 0, which is a homogeneous set of equations, which means that A − λI is a singular matrix, which in turn means that its determinant must be zero. So from that we deduce that λ is an eigenvalue of A if and only if det(A − λI) = 0. We also defined σ(A) to be the set of eigenvalues of A; this is called the spectrum, and therefore A is singular if and only if 0 is in the spectrum of A. And we defined the characteristic polynomial p_A(t) = det(tI − A), a polynomial of degree n, which always has n roots counting multiplicities; these n roots are the eigenvalues of A. So we have a procedure for finding eigenvalues: we first find the roots of the characteristic polynomial, which gives us the λ_i's, and then we find eigenvectors by finding the null space of A − λ_i I. This procedure works reasonably well for small-dimensional systems, but for very large matrices you will have to use other numerical methods to find eigenvalues and eigenvectors. So that's a short recap of what we saw in the previous class. Today we will continue this discussion, and we will also discuss one very important concept called similarity. In the last class I stated the result that if a matrix has distinct eigenvalues, then the associated eigenvectors are linearly independent. Here is a lemma that essentially makes this point precise.
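That recap procedure can be sketched numerically. Here is a minimal NumPy example (the matrix A below is an assumed illustration, not one from the lecture): NumPy finds the eigenvalues, i.e. the roots of the characteristic polynomial det(tI − A), and we check that each returned eigenvector x satisfies Ax = λx.

```python
import numpy as np

# An assumed 2x2 example (not from the lecture): A has characteristic
# polynomial t^2 - 5t + 4 = (t - 1)(t - 4), so its eigenvalues are 1 and 4.
A = np.array([[2.0, 2.0],
              [1.0, 3.0]])

# Coefficients of the characteristic polynomial det(tI - A), highest degree first.
coeffs = np.poly(A)
print(coeffs)  # approximately [1, -5, 4], i.e. t^2 - 5t + 4

# Step 1: the eigenvalues are the roots of that polynomial.
eigvals, eigvecs = np.linalg.eig(A)

# Step 2: each column of eigvecs lies in the null space of A - lambda*I,
# i.e. it is a nonzero x satisfying A x = lambda x.
for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ x, lam * x)

print(sorted(eigvals.real.tolist()))  # approximately [1.0, 4.0]
```

For large matrices, this is essentially what the library does for you with iterative numerical methods, rather than literally forming and factoring the characteristic polynomial.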
So suppose λ_1, ..., λ_k are distinct eigenvalues of this matrix A, meaning that no two of them are equal. Corresponding to each distinct eigenvalue there is at least one nonzero vector which is an eigenvector; so suppose x_i is an eigenvector associated with λ_i, for i = 1 to k. The lemma says that these k vectors x_1, ..., x_k form a linearly independent set. So let's show this. It is a somewhat interesting proof, so I thought I would go through it with you. The proof is by contradiction. Suppose it is not true, and instead these k eigenvectors are actually a linearly dependent set. Then there is a non-trivial linear combination of these k vectors which gives the zero vector, and in fact one can find a minimal such combination: there is a linear combination with the least number of nonzero coefficients, say r of them, which yields the zero vector. Let's write that as α_1 x_1 + ... + α_r x_r = 0. Here I have assumed that it is the first r vectors that appear, but that's okay, because I can always renumber the vectors if necessary. The point is that α_i ≠ 0 for i = 1 to r, and moreover r ≥ 2: since each x_i ≠ 0, a single term α_i x_i with α_i ≠ 0 cannot be the zero vector, so any non-trivial combination giving zero must use at least two vectors. So all we do now is pre-multiply this equation by A.
So A(α_1 x_1 + ... + α_r x_r) = 0, because A times the zero vector is just zero. But the left-hand side is α_1 A x_1 + ... + α_r A x_r, and since A x_i = λ_i x_i, this means

α_1 λ_1 x_1 + ... + α_r λ_r x_r = 0.

Call this equation 2, and call the original combination α_1 x_1 + ... + α_r x_r = 0 equation 1. Now multiply equation 1 by λ_r and subtract equation 2. The right-hand side remains the zero vector, and on the left the last terms, λ_r α_r x_r in each, cancel, so we get

α_1 (λ_r − λ_1) x_1 + α_2 (λ_r − λ_2) x_2 + ... + α_{r-1} (λ_r − λ_{r-1}) x_{r-1} = 0.

Since the λ's are distinct, all of these coefficients remain nonzero. But now we have found a non-trivial linear combination involving only r − 1 of these vectors, whereas we started with the assumption that α_1 x_1 + ... + α_r x_r = 0 uses the least number of nonzero coefficients needed to get the zero vector. This new combination has fewer than r nonzero coefficients, which is a contradiction.

Okay, so now we can move on to another topic, which is that of similarity. We will start by defining what this is. A matrix B is said to be similar to a matrix A in C^{n×n} if there exists a nonsingular S in C^{n×n} such that B = S^{-1} A S. This transformation, taking A to S^{-1} A S, is called a similarity transform, and what it really represents is a change of basis of a linear transformation. Suppose S represents a change-of-basis matrix, and we are given a set of linear equations, say y = A x. If I represent x in the new basis as x = S z, where z is the coordinate vector of x in the new basis, then y = A x = A S z, and this y is still in the old coordinate system. If I want to transform it back to the new coordinate system, I have to multiply y by S^{-1}: w = S^{-1} y = S^{-1} A S z, where w is the coordinate vector of y in the new coordinate system. So S^{-1} A S represents the same linear transformation as A, but in a different basis, or a different coordinate system. That's one way to think about the similarity transform. We will also use the notation B ~ A to say that B is similar to A, and the matrix S is called the similarity matrix.

Similarity is actually what is called an equivalence relation. What we mean by that is the following. It is reflexive, which means that A is similar to A; of course, I can write A as I^{-1} A I, where I is the identity matrix. It is symmetric, meaning that if B is similar to A, then A is similar to B: if B = S^{-1} A S, I can write A = S B S^{-1}, so there is another matrix, call it T = S^{-1}, such that A = T^{-1} B T. And finally it is transitive, meaning that if C is similar to B and B is similar to A, then C is similar to A. Now, what an equivalence relation does (and this is in fact true of any equivalence relation on C^{n×n}, not only this particular one) is partition C^{n×n} into equivalence classes. Any pair of matrices in the same equivalence class are similar to each other, and any pair of matrices coming from different equivalence classes are not similar to each other: you cannot find an S such that B = S^{-1} A S.

So you can ask: what properties do two matrices in a given equivalence class share? In fact they share many properties, and this is what we are going to study in some detail. The first result is that they share the characteristic polynomial: if B is similar to A, then p_B(t) = p_A(t). This is very easy to show; it is essentially a two-line proof. By definition, p_B(t) = det(tI − B). The identity matrix is S^{-1} S, where S is the similarity matrix that takes A to B, and B = S^{-1} A S, so p_B(t) = det(t S^{-1} S − S^{-1} A S). Now I can pull S^{-1} out on the left and S out on the right, so this equals det(S^{-1} (tI − A) S). But we know that det(AB) = det(A) det(B), so this equals det(S^{-1}) det(tI − A) det(S). Since det(S^{-1}) = 1/det(S), these two factors cancel, and we are left with det(tI − A), which is p_A(t). A corollary is that if A and B are similar matrices, then they have the same eigenvalues, counting multiplicities: not only do they have the same distinct eigenvalues, but the number of times a value appears as an eigenvalue of A is the same as the number of times it appears as an eigenvalue of B.

So now the question is: is the converse true? If two matrices have the same eigenvalues, counting multiplicities, will they be similar? (One student says yes; another disagrees: equal eigenvalues give the same characteristic polynomial, so det(tI − A) = det(tI − B) for all t, but that by itself does not produce a nonsingular S with B = S^{-1} A S.) Here is a very simple argument that the answer is no. Consider the matrices [0 0; 0 0] and [0 1; 0 0]. These matrices are not equal. What are their eigenvalues? One thing you can keep in mind is that if a matrix is triangular, then the diagonal entries of the matrix are its eigenvalues. That is easy to show; try it out for yourself and convince yourself it is true. So for upper triangular matrices like these, you can just read off the diagonal entries, and both matrices have 0, 0 as their two eigenvalues. But they are not similar. Why not? If there were a nonsingular S such that S^{-1} [0 1; 0 0] S equalled the all-zero matrix, you could simply pre-multiply by S and post-multiply by S^{-1} and get the absurdity that [0 1; 0 0] equals the all-zero matrix. So it is not possible that these two matrices are similar, although they have the same eigenvalues, and the answer to the question is no.

(A student asks: sir, you said that if B is similar to A, we write B = S^{-1} A S; pre-multiplying by S and post-multiplying by S^{-1} gives A = S B S^{-1}. But we also know that if B is similar to A then A is similar to B, so from there can we write A = S^{-1} B S?) No, not through the same similarity matrix; that is important. What you said first is correct: if B = S^{-1} A S, I can also write A = S B S^{-1}. But S and S^{-1} are not the same matrix, so I write this as A = T^{-1} B T, where T = S^{-1}. When we say two matrices are similar, the matrix S depends on the direction in which we execute the similarity transform. So when we say B is similar to A, we mean there exists an invertible S such that B = S^{-1} A S. Of course, this also means that A is similar to B, i.e. there is a matrix T such that A = T^{-1} B T, but that T is not the same as S; here T equals S^{-1}. In fact, this matrix S may not be unique; we will see later that there can be many different S's such that B = S^{-1} A S.

(Another student asks: if two matrices have the same eigenvalues, and those eigenvalues are all nonzero, will they always be similar?) That is something to think about; we will see many more results coming up, and then the answer will become obvious. But maybe I can answer this question right away with an example. Consider the identity matrix [1 0; 0 1] and the matrix [1 1; 0 1]. Clearly these two matrices have the same eigenvalues, both equal to 1. But it is also not possible that there is a matrix S such that the 2×2 identity matrix equals S^{-1} [1 1; 0 1] S: if such an S existed, pre-multiplying and post-multiplying by S and S^{-1} would give [1 1; 0 1] equal to the identity matrix, which is not true. So it is possible that the eigenvalues are nonzero and the same, and yet the matrices are not similar. But what if the eigenvalues were nonzero and distinct? That we will see.

Finally, a direct consequence: similar matrices also have the same rank. Since similar matrices have the same characteristic polynomial, they have the same number of nonzero eigenvalues counting multiplicities, and you might be tempted to conclude the rank from that. Be careful, though: the number of nonzero eigenvalues equals the rank only for special matrices, and it can be smaller in general, as [0 1; 0 0] shows, which has rank 1 but both eigenvalues zero. The cleaner way to think about it is this: if B = S^{-1} A S, then left or right multiplication by a nonsingular matrix does not change the rank of a matrix, so B has the same rank as A. So let me just write it down: similar matrices have the same eigenvalues, counting multiplicities, and similar matrices also have the same rank.
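Both closing results can be checked numerically. The sketch below uses an assumed 3×3 example A and a random nonsingular S (neither taken from the lecture): it verifies that B = S^{-1} A S has the same characteristic polynomial and the same rank as A, and that the two non-similar matrices from the counterexample share the eigenvalues 0, 0 even though their ranks already differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# An assumed upper triangular example: eigenvalues 2, 0, 3, rank 2.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 3.0]])

# A random S; a random Gaussian matrix is nonsingular with probability 1.
S = rng.standard_normal((3, 3))
assert abs(np.linalg.det(S)) > 1e-9

B = np.linalg.inv(S) @ A @ S  # B = S^{-1} A S, so B ~ A

# Same characteristic polynomial det(tI - .), hence same eigenvalues
# counting multiplicities.
assert np.allclose(np.poly(A), np.poly(B))

# Same rank: multiplying by nonsingular matrices preserves rank.
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B) == 2

# The counterexample: same eigenvalues (0, 0), yet not similar.
Z = np.zeros((2, 2))
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(np.poly(Z), np.poly(N))  # identical characteristic polynomials
# Ranks differ, so no nonsingular S with N = S^{-1} Z S can exist.
assert np.linalg.matrix_rank(Z) == 0 and np.linalg.matrix_rank(N) == 1
```

Note that the rank check on Z and N also illustrates the caveat above: N has rank 1 but no nonzero eigenvalues, so rank is not simply the count of nonzero eigenvalues for every matrix.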