In the next few lectures we will discuss the notion of eigenvalues, eigenvectors, and so on. The motivation comes from the following problem. Let me write down the topic first: eigenvalues and eigenvectors of linear transformations.

Let us say we have an operator T on V, a finite dimensional vector space. We know how to write down the matrix of a linear operator corresponding to a basis. The question is: can we obtain a basis B such that the matrix of T relative to this basis, [T]_B (we use this short notation), is simple? Now we need to interpret the word "simple" and make it precise. The simplest matrices, apart from the scalar matrices (a matrix is called a scalar matrix if it is k times the identity), are the diagonal matrices. So to begin with we ask: can we find a basis B of V such that the matrix of T relative to that basis is a diagonal matrix? This is one interpretation of what a simple operator is.

So let us ask the question precisely. Does there exist a basis B of V such that the matrix of T relative to that basis is a diagonal matrix D = diag(α_11, α_22, ..., α_nn)? That is, the diagonal entries are the numbers α_ii and all other entries are 0. It is quite possible that some of the α_ii are also 0, but that does not matter; what I am imposing is that the off-diagonal entries of the matrix of T relative to the fixed basis B are 0. If a linear operator T has this property, then we say that T is diagonalizable.

The question is: are all operators on finite dimensional vector spaces diagonalizable? Before answering this, let us see what it means and what the advantage is. Suppose T is diagonalizable, and denote the basis B by u_1, u_2, ..., u_n, where n is the dimension of the space V. Then T u_1 = α_11 u_1, T u_2 = α_22 u_2, and so on; I have the equations T u_i = α_ii u_i. We know that the action of T on a basis describes T completely, so these equations describe T completely.
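As a small illustration of how the matrix of one and the same operator can be diagonal relative to one basis and not relative to another, here is a minimal numpy sketch. The operator and the second basis are assumed examples (not from the lecture), and I use the standard change-of-basis fact that the matrix of T relative to a basis whose vectors form the columns of P is P⁻¹AP, where A is the matrix in the standard basis.

```python
import numpy as np

# Matrix of an operator T on R^2 relative to the standard basis (an assumed example).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # not diagonal in the standard basis

# Another basis of R^2: u1 = (1, 1) and u2 = (1, -1), written as the columns of P.
P = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# Matrix of T relative to the basis {u1, u2}: P^{-1} A P.
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))              # diag(3, 1): T u1 = 3 u1 and T u2 = 1 u2
```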
Let me now give the definition of an eigenvalue. Let T belong to L(V), and let F be the underlying field over which V is defined; for us F is R or C. A number λ in F is called an eigenvalue of T (also called a characteristic value; there are other names, such as latent value) if there exists x ≠ 0 in V such that T x = λ x. Let me emphasize that this x must not be 0; that condition goes along with the equation T x = λ x. The question is whether this equation has a solution for some nonzero x and some number λ, where λ must come from the underlying field: if it is a real vector space, I demand that λ be a real number; if the field is the rationals, λ must be a rational number. Any such x satisfying the above is called an eigenvector (or a characteristic vector, or a latent vector) corresponding to the eigenvalue λ.

Question: do all linear transformations have eigenvalues? Before we look at an example, can you see that this is really solving a homogeneous equation? It is (T − λI)x = 0; and if I look at the matrix A of T relative to some basis, then (T − λI)x = 0 is equivalent to (A − λI)x = 0.

Let us look at a first example: the rotation transformation. Remember the equation T x = λ x: the operator T acts on x, and the resultant vector must be along the direction of x, a multiple of x. Will a rotation operator have an eigenvalue? The rotation matrix is [cos θ, −sin θ; sin θ, cos θ]; let us say the angle θ lies strictly between 0 and 180 degrees. Does it have an eigenvector? In particular, define T from R² to R² by T(x_1, x_2) = (−x_2, x_1); this corresponds to θ = 90 degrees. Does this have an eigenvalue?

For this problem, let us work directly from the definition. I want λ x = T x, that is, (λ x_1, λ x_2) = (−x_2, x_1). Let us first remove the case λ = 0: if λ = 0, can you see that x = 0? But I am seeking an eigenvector, so x cannot be 0, and hence λ cannot be 0. Now look at the two equations, λ x_1 = −x_2 and λ x_2 = x_1. From the second one, x_1 = λ x_2, and from the first, x_2 = −λ x_1; substituting, x_1 = λ(−λ x_1) = −λ² x_1, that is, (1 + λ²) x_1 = 0. We are seeking λ a real number, so 1 + λ² ≠ 0, and it follows that x_1 = 0. Go back to the equation x_1 = λ x_2: since λ ≠ 0 and x_1 = 0, we get x_2 = 0, so x = 0. So in either case there is no nonzero vector x that satisfies T x = λ x.
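As a quick numerical check of this conclusion, here is a minimal numpy sketch (my addition, not part of the lecture): the computed eigenvalues of the 90-degree rotation matrix are ±i, so there is indeed no real eigenvalue.

```python
import numpy as np

# Matrix of the rotation T(x1, x2) = (-x2, x1) relative to the standard basis.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)                   # [0.+1.j  0.-1.j]
print(np.isreal(eigenvalues).any())  # False: neither eigenvalue is real
```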
So this linear operator T does not have an eigenvalue, and the problem here is with the underlying field. The underlying field is R, and R is not algebraically closed. Recall from field theory that a field is said to be algebraically closed if every polynomial of degree n whose coefficients come from the field has precisely n zeros in the field, counted with multiplicity. Which polynomial fails to have a zero here? The polynomial p(t) = t² + 1 does not have a zero over R, so R is not algebraically closed. So in this case the fact that the operator T does not have an eigenvalue comes from a deficiency of the field. That is one possibility; we will look at the other possibility shortly.

But before that: for an operator on R², all of this can be done by hand, but for an operator on R³ it can get quite complicated. So we need to translate this problem into one about matrices. Tell me if this is okay: T − λI is not invertible if and only if A − λI is not invertible, where A is the matrix of T relative to a particular basis B. Do you remember that we have proved this before? If T is a linear transformation and A is the matrix of T, then T is invertible if and only if A is invertible; in fact, the matrix of T⁻¹ is the inverse of the matrix A. Remember that here I is the identity transformation, and I am looking at the matrix of T − λI relative to B; the matrix of the identity transformation relative to the single basis B is the identity matrix. (Recall that in general we write the matrix of a linear transformation relative to two bases; for an operator we work with a single basis. Relative to two different bases the matrix of the identity transformation need not be the identity matrix, but with respect to a single basis it is.) So T − λI is not invertible if and only if A − λI is not invertible.

Then the question boils down to this. I am seeking T x = λ x with x ≠ 0, which is the same as (T − λI)x = 0, that is, x lies in the null space of T − λI, which by this identification is the null space of A − λI. So I want (A − λI)x = 0 with x ≠ 0. Now the question is about homogeneous equations: given a matrix B, when does B x = 0 have a solution x ≠ 0? It does if and only if the row reduced echelon form of the coefficient matrix has at least one zero row (at least the last row is zero; there may be more zero rows). In other words, the matrix cannot be invertible: if A − λI were invertible, I could premultiply by its inverse and conclude that x = 0. So if I am seeking a nonzero x, then A − λI cannot be invertible, and in terms of determinants this means det(A − λI) = 0.

So the question is: does this equation have a solution? What kind of an equation is it? T having λ as an eigenvalue reduces to λ satisfying this equation. Expanding the determinant along, say, the first row or the first column, we see that it is a polynomial equation in λ.
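To see this translation at work on the rotation example, here is a minimal sympy sketch (my addition): forming det(A − λI) for the 90-degree rotation matrix gives exactly the polynomial λ² + 1 mentioned above, which has no real root, only ±i.

```python
import sympy as sp

lam = sp.symbols('lam')

# Matrix of the rotation T(x1, x2) = (-x2, x1) relative to the standard basis.
A = sp.Matrix([[0, -1],
               [1,  0]])

# The eigenvalue condition det(A - lam*I) = 0.
p = (A - lam * sp.eye(2)).det()
print(p)                                        # lam**2 + 1

print(sp.solve(p, lam))                         # [-I, I]: only complex roots
print(sp.solveset(p, lam, domain=sp.S.Reals))   # EmptySet: no real eigenvalue
```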
The determinant det(A − λI) is a polynomial in λ of degree n, and the coefficient of λⁿ is 1 or −1; it does not matter which, so up to sign it is what is called a monic polynomial, one whose highest-degree coefficient is 1. So we need to solve a polynomial equation of degree n. That polynomial is called the characteristic polynomial: writing p(λ) = det(A − λI), where det denotes the determinant, p is called the characteristic polynomial of A, and we observe that, up to sign, it is a monic polynomial of degree n. Solving such an equation is a difficult problem in numerical linear algebra, but we are going to do problems of size 3 by 3, so it should not be difficult.

Before we proceed to other examples, let us make one more observation. I am going to use some properties of determinants that I am sure you are aware of; for instance, the determinant of a product is the product of the determinants: det(AB) = det A · det B, where A and B are both square. What about matrices that are similar to each other, say two matrices A and B related by B = P A P⁻¹? For instance, if I write down the matrix of a linear transformation relative to two bases, say the matrix A with respect to one basis and the matrix B with respect to another, then A and B are related by exactly such an equation. Suppose B = P A P⁻¹; do you remember that det B = det A? Indeed, det B = det P · det A · det P⁻¹ because the determinant is multiplicative; these are numbers, so I can rearrange them, and det P · det P⁻¹ = det(P P⁻¹) = det I = 1, so det B = det A. Similar matrices have the same determinant.

Where do we need this property? In the following. Look at B − λI, where B = P A P⁻¹. I can write B − λI = P A P⁻¹ − λ P P⁻¹, writing the identity as P P⁻¹; now take P out to the left and P⁻¹ out to the right, so that B − λI = P (A − λI) P⁻¹. Apply determinants on both sides and use the same formula: det(B − λI) = det(A − λI). That is, if B is similar to A, then det(B − λI) = det(A − λI). Is it clear from this that A and B have the same eigenvalues? Similar matrices have the same eigenvalues.
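Here is a quick numerical sanity check of these two facts, as a minimal numpy sketch (the matrix A and the invertible P below are assumed examples, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# An assumed example matrix and an (almost surely) invertible change-of-basis matrix.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = rng.standard_normal((2, 2))

# B is similar to A.
B = P @ A @ np.linalg.inv(P)

# Similar matrices have the same determinant and the same eigenvalues.
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))      # True
print(np.allclose(np.sort(np.linalg.eigvals(A)),
                  np.sort(np.linalg.eigvals(B))))           # True: both have eigenvalues 2 and 3
```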
One of the advantages of this result is that the following function is well defined. I can define the determinant of a linear operator T itself: a function from the set of all linear operators on V to the underlying field, which I call the determinant. I define det T to be the determinant of the matrix of T relative to B, where B is any fixed basis. Is this well defined? In other words, if I change the basis, will I get a different value for the determinant? I will not, because when I change the basis the two matrices are related by a similarity transformation, and we have just seen that similar matrices have the same determinant. So this is well defined, and it allows us to pass from the determinant of a matrix to the determinant of a linear transformation.

But let us get back to matrices; that is what we want when we look at examples. So let us look at the eigenvalues and eigenvectors of matrices. Here is really our second example; remember that the first example was the linear transformation (the rotation) for which we tried to compute eigenvalues and found that it has none, the deficiency coming from the field. What happens in this example? Take the matrix A with rows (3, 1, −1), (2, 2, −1), (2, 2, 0), and let us calculate the eigenvalues and, if possible, the eigenvectors. To do that I must look at the polynomial equation det(A − λI) = 0. So let us do this quickly: to form A − λI, I subtract λ along the main diagonal, and then I want the determinant of the matrix with rows (3 − λ, 1, −1), (2, 2 − λ, −1), (2, 2, −λ). Let us expand it along the first row: we get (3 − λ)(λ² − 2λ + 2) − (2 − 2λ) − 2λ; please check the calculation. The −2λ and +2λ cancel, leaving −2, and the rest of the simplification gives (1 − λ)(λ − 2)²; please verify this. So the characteristic polynomial of this matrix A is, up to sign, (λ − 1)(λ − 2)², and the characteristic equation is (λ − 1)(λ − 2)² = 0. This time we have no problem as far as eigenvalues are concerned: λ = 1 is an eigenvalue with multiplicity 1, and λ = 2 is an eigenvalue with multiplicity 2.
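To double-check this factorization, here is a minimal sympy sketch (my addition, using the matrix A as written above):

```python
import sympy as sp

lam = sp.symbols('lam')

# The matrix from the example: rows (3, 1, -1), (2, 2, -1), (2, 2, 0).
A = sp.Matrix([[3, 1, -1],
               [2, 2, -1],
               [2, 2,  0]])

p = (A - lam * sp.eye(3)).det()
print(sp.expand(p))        # -lam**3 + 5*lam**2 - 8*lam + 4
print(sp.factor(p))        # -(lam - 1)*(lam - 2)**2, i.e. roots 1 (once) and 2 (twice)
print(sp.solve(p, lam))    # [1, 2]
```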
So apparently there is no problem as far as eigenvalues are concerned; what about eigenvectors? Let us take the case λ_1 = 1. We must solve (A − 1·I)x = 0, that is, the homogeneous system whose coefficient matrix A − I has rows (2, 1, −1), (2, 1, −1), (2, 2, −1), and I am seeking a nonzero x that satisfies it. I could do elementary row operations and so on, but I can observe quickly that the second row is the same as the first, so it can be removed; pushing it to the bottom, it in effect gives a zero row. I am left with two rows, (2, 1, −1) and (2, 2, −1), that is, the equations 2x_1 + x_2 − x_3 = 0 and 2x_1 + 2x_2 − x_3 = 0. Cancelling one from the other gives x_2 = 0, and the remaining single equation is 2x_1 − x_3 = 0. Any nonzero vector x satisfying these two conditions is an eigenvector corresponding to the eigenvalue 1. There are infinitely many, I agree, but there is precisely one linearly independent solution: this is really two equations in three unknowns, the dimension of the solution space is 1, x_2 is forced to be 0, and once I fix x_3 I can determine x_1 in terms of it. By the way, can I take x_3 to be 0? If x_3 = 0, it follows that x_1 = 0 as well, so I would get the zero vector; but I want a non-trivial solution, which is why I take x_3 nonzero. Taking the first coordinate to be 1, the third coordinate is 2 and the second is 0; I will call this eigenvector v_1 = (1, 0, 2). Any other vector satisfying Ax = 1·x is a multiple of this, because it is a homogeneous system and the solution space is one-dimensional (any multiple of a solution is again a solution).

What about the eigenvalue 2? I need to solve (A − 2I)x = 0 with x ≠ 0. Subtracting 2 along the diagonal, A − 2I has rows (1, 1, −1), (2, 0, −1), (2, 2, −2). I observe that the third row is a multiple of the first, so it gives a zero row. Once again I have two independent equations in three unknowns, x_1 + x_2 − x_3 = 0 and 2x_1 − x_3 = 0; you can do elementary row operations, but it is obvious that the rank is 2, so the nullity is 1, and I must fix one unknown and determine the other two in terms of it. I do a similar thing here: I cannot take x_3 = 0, because if x_3 = 0 the remaining 2 by 2 system in x_1, x_2 has an invertible coefficient matrix, and I would again get only 0. So I make a nonzero choice; let me take x_3 = 2 for convenience (x_3 = 1 works too, but gives fractions). Then 2x_1 = 2, so x_1 = 1, and x_1 + x_2 = 2 gives x_2 = 1. I will call the new vector obtained by solving this equation v_2 = (1, 1, 2); is it okay? You can check that A(1, 1, 2) = (2, 2, 4) = 2·(1, 1, 2). This is the only linearly independent solution of this equation; any other solution is a multiple of it.
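These null-space computations can be checked mechanically; here is a minimal sympy sketch (my addition), using the same matrix A:

```python
import sympy as sp

A = sp.Matrix([[3, 1, -1],
               [2, 2, -1],
               [2, 2,  0]])
I3 = sp.eye(3)

# Eigenvectors for lambda = 1: the null space of A - I is one-dimensional.
print((A - 1 * I3).nullspace())   # one basis vector, proportional to (1, 0, 2)

# Eigenvectors for lambda = 2: the null space of A - 2I is also one-dimensional.
print((A - 2 * I3).nullspace())   # one basis vector, proportional to (1, 1, 2)
```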
So this time, counting multiplicity, we have obtained three eigenvalues: 1 occurs once and 2 occurs twice (without counting multiplicity there are only two distinct eigenvalues). But even though there are three eigenvalues counted with multiplicity, I do not have three linearly independent eigenvectors. This is not a deficiency of the field; it is a deficiency of the operator, or of the matrix that we started with. In fact, even at this point you can verify, as an exercise, that this operator is not diagonalizable, precisely because it has only two linearly independent eigenvectors.

So let us see the connection between eigenvalues, eigenvectors and diagonalizability. Remember we said that T is diagonalizable if the matrix of T relative to some basis B is a diagonal matrix. Can you see that from this it follows that T is diagonalizable if and only if there is a basis B of V each of whose vectors is an eigenvector of T? The reason this is true is that the matrix of T relative to B being the diagonal matrix diag(α_11, ..., α_nn) goes along with the equations T u_i = α_ii u_i. Is this clear? What it means is that if T is diagonalizable, then each of the basis vectors u_1, u_2, ..., u_n is an eigenvector: they are basis vectors, so they are not 0, and they are eigenvectors corresponding to the eigenvalues α_ii, which are exactly the diagonal entries of that diagonal matrix. Conversely, if I have a basis u_1, u_2, ..., u_n each of whose vectors is an eigenvector, then I must have equations of the form T u_i = γ_i u_i for some scalars γ_i; writing down the matrix of T relative to that basis, I get the diagonal matrix diag(γ_1, ..., γ_n). So the two statements are equivalent.

Now go back to the previous example. There we have only two linearly independent eigenvectors, so there is no basis of R³ consisting of eigenvectors, and T is not diagonalizable: the matrix A is not diagonalizable, or in the language of transformations, the linear operator T induced by A is not diagonalizable. Is this clear? Non-diagonalizability can come from two factors: one is the underlying field, the other is the inherent nature of the transformation T. In the second example the problem is with the transformation itself. In the next lectures we will look at necessary and sufficient conditions for diagonalizability, and then at properties and examples. Let me stop here.
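As a final check of the claim that this matrix is not diagonalizable, here is a minimal sympy sketch (my addition; eigenvects and is_diagonalizable are standard sympy methods, not something introduced in the lecture):

```python
import sympy as sp

A = sp.Matrix([[3, 1, -1],
               [2, 2, -1],
               [2, 2,  0]])

# For each eigenvalue: (value, algebraic multiplicity, list of independent eigenvectors).
for value, multiplicity, vectors in A.eigenvects():
    print(value, multiplicity, len(vectors))
# 1 1 1
# 2 2 1   <- eigenvalue 2 has only one independent eigenvector

print(A.is_diagonalizable())   # False: there is no basis of R^3 made of eigenvectors of A
```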