So, we have now discussed a bit about eigenvalues. Let us switch focus to eigenvectors. We start with a small definition of something called the eigenspace. Let A be in C to the n cross n. Then, for a given lambda, the set of all vectors x such that A x = lambda x is called the eigenspace of A corresponding to lambda, which I will abbreviate as Espace(lambda). For example, if I take the 2 cross 2 identity matrix, it has only one eigenvalue, lambda = 1, and the eigenspace corresponding to lambda = 1 is the entire R2 plane. Note also that every nonzero x in the eigenspace of a particular eigenvalue lambda is an eigenvector of A corresponding to lambda. The eigenspace is obtained by finding the set of all solutions to (A - lambda I) x = 0; that is, Espace(lambda) is the null space of A - lambda I. Now, if lambda is not an eigenvalue of A, what can I say about the null space of A - lambda I? We started out saying lambda is one of the eigenvalues of A, that lambda belongs to sigma(A). But suppose lambda is not an eigenvalue of A. Then the null space contains only the zero vector, because A - lambda I is then a non-singular matrix, that is, its determinant is not equal to 0. By definition, the lambdas for which the determinant of A - lambda I equals 0 are exactly the eigenvalues of this matrix A, so for any lambda that is not an eigenvalue of A, A - lambda I is a non-singular matrix and therefore its null space contains only the zero vector, okay.
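As a small numerical sketch of this idea (the matrix here is my own example, not from the lecture): the dimension of the eigenspace of lambda can be computed as the number of near-zero singular values of A - lambda I, and a lambda that is not an eigenvalue gives a trivial null space.

```python
import numpy as np

# Example matrix (my own choice): upper triangular, eigenvalue 2 repeated.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0

# Eigenspace of lam = null space of (A - lam*I). The number of
# near-zero singular values of A - lam*I gives its dimension.
sv = np.linalg.svd(A - lam * np.eye(2), compute_uv=False)
geo_mult = int(np.sum(sv < 1e-10))
print(geo_mult)  # 1: only one independent eigenvector for lam = 2

# A value that is not an eigenvalue gives a nonsingular A - lam*I,
# whose null space holds only the zero vector.
sv2 = np.linalg.svd(A - 5.0 * np.eye(2), compute_uv=False)
k = int(np.sum(sv2 < 1e-10))
print(k)  # 0
```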
But if lambda is indeed an eigenvalue of A, then A - lambda I is singular, and therefore the null space will contain at least a one-dimensional subspace, okay. One more definition: the dimension of Espace(lambda) is called the geometric multiplicity of lambda. Geometrically speaking, what this is saying is: corresponding to the eigenvalue lambda, how many linearly independent eigenvectors can I find? That is called the geometric multiplicity of lambda. Of course, we also know that lambda is a zero of the characteristic polynomial of A, and the multiplicity of lambda as a zero of P_A(t) is called the algebraic multiplicity. So, the geometric multiplicity is basically the maximum number of linearly independent eigenvectors associated with the eigenvalue lambda. One fact which is not difficult to show is that the geometric multiplicity of lambda is always less than or equal to the algebraic multiplicity of lambda. For example, if I take the 2 cross 2 identity matrix, the algebraic multiplicity of the eigenvalue one is two, and the geometric multiplicity of the eigenvalue one is also two; they are equal in that case. The dimension of the eigenspace of the eigenvalue one is 2, and the entire two-dimensional space is spanned by the set of all linearly independent eigenvectors corresponding to that eigenvalue, okay. Now, if the matrix A has some eigenvalue lambda for which the geometric multiplicity of lambda is strictly less than the algebraic multiplicity of lambda, then we say that A is a defective matrix, okay. Otherwise, if the geometric multiplicity of lambda equals the algebraic multiplicity of lambda for all lambda in sigma(A), we say A is non-defective, okay. What this means operationally is that if A is non-defective, the geometric multiplicity is the same as the algebraic multiplicity for every eigenvalue. And we know that if we add up the algebraic multiplicities of all the eigenvalues of an n cross n matrix, we will always get n, because an nth order polynomial always has n zeros, counted with multiplicity.
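To make the defective versus non-defective distinction concrete, here is a small sketch with two matrices of my own choosing, each having the single eigenvalue 1 with algebraic multiplicity 2; the `geometric_multiplicity` helper is hypothetical, computed via rank-nullity.

```python
import numpy as np

# Both matrices have eigenvalue 1 with algebraic multiplicity 2.
I2 = np.eye(2)                 # identity: non-defective
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])     # Jordan block: defective

def geometric_multiplicity(A, lam, tol=1e-10):
    # Dimension of the null space of A - lam*I, via rank-nullity.
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

gm_identity = geometric_multiplicity(I2, 1.0)
gm_jordan = geometric_multiplicity(J, 1.0)
print(gm_identity)  # 2: equals the algebraic multiplicity
print(gm_jordan)    # 1 < 2: J is defective
```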
And so, if the geometric multiplicity equals the algebraic multiplicity for every eigenvalue of A, then the sum of the dimensions of the eigenspaces corresponding to all the eigenvalues is equal to n, which means that the matrix A has n linearly independent eigenvectors, and therefore it is diagonalizable. So we have that A is diagonalizable if and only if it is non-defective, okay. We have now defined eigenvectors and seen a property of them. Here is one more definition: a left eigenvector of A corresponding to lambda, where lambda is in the spectrum of A, is a nonzero vector y such that y Hermitian A = lambda y Hermitian. The eigenvectors we defined so far are also called right eigenvectors, because the eigenvector multiplies the matrix from the right. Okay, this definition allows us to state one result which is known as the principle of biorthogonality. It says that if A is in C to the n cross n, and lambda and mu belong to sigma(A), that is, they are both eigenvalues of A, and lambda is different from mu, then any left eigenvector of A corresponding to mu is orthogonal to any right eigenvector of A corresponding to lambda. What does this mean? It just means that if I take the inner product between a left eigenvector of A corresponding to mu and a right eigenvector of A corresponding to lambda, I will get 0. So basically, if I have two distinct eigenvalues of a matrix and I take the right eigenvectors corresponding to these two distinct eigenvalues, we know that they will be linearly independent, but they need not be orthogonal to each other. On the other hand, if I take a left eigenvector corresponding to one of the eigenvalues and a right eigenvector corresponding to the other eigenvalue, those two vectors will be orthogonal to each other. This has a very simple proof, which we show like this.
So, let y in C to the n be a left eigenvector of A corresponding to mu, and let x in C to the n be a right eigenvector of A corresponding to lambda. Consider y Hermitian A x. Since x is a right eigenvector of A corresponding to lambda, A x is the same as lambda x, so this equals y Hermitian times lambda x; and lambda is just a scalar, so I can pull it out to get lambda y Hermitian x. I can also use the fact that y is a left eigenvector of A corresponding to eigenvalue mu, so y Hermitian A x can be written as mu y Hermitian times x, which equals mu times y Hermitian x. So I have written y Hermitian A x in two different ways, as lambda y Hermitian x and as mu y Hermitian x, but lambda is not equal to mu. The only way these two can be equal is if y Hermitian x equals 0. So that means that x and y are orthogonal to each other. Another thing is to relate the eigenvectors of similar matrices. We know that similar matrices have the same eigenvalues, but how are the eigenvectors of similar matrices related to each other? That is the following result. Let A and B be matrices in C to the n cross n. Sir, corresponding to eigenvalues do we have... I mean, do all these properties hold for right eigenvectors also? Sorry, left eigenvectors. So, you are asking basically... okay, this is actually a good question. Let me answer it a little carefully. If you recall, we started with the equation A x = lambda x, and we said that this implies (A - lambda I) x = 0, which implies that lambda is an eigenvalue of A if and only if the determinant of A - lambda I equals 0, correct? And here x is a nonzero vector, which means that this matrix must become singular. So, for any eigenvalue of the matrix A, A - lambda I is going to become a singular matrix.
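The biorthogonality result above can be checked numerically. Here is a minimal sketch using an example matrix of my own choosing with distinct eigenvalues 2 and 3; left eigenvectors of A are obtained as right eigenvectors of A Hermitian with conjugated eigenvalues.

```python
import numpy as np

# Non-symmetric example matrix with distinct eigenvalues 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

w, V = np.linalg.eig(A)            # right eigenvectors: columns of V
wl, Y = np.linalg.eig(A.conj().T)  # A^H y = conj(mu) y  <=>  y^H A = mu y^H

# Pick the right eigenvector for lambda = 3 and the left
# eigenvector for mu = 2 (distinct eigenvalues).
x = V[:, np.argmin(np.abs(w - 3.0))]
y = Y[:, np.argmin(np.abs(wl.conj() - 2.0))]

# Principle of biorthogonality: y^H x = 0 since lambda != mu.
inner = np.vdot(y, x)  # vdot conjugates its first argument
print(abs(inner) < 1e-10)  # True
```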
I could have done the exact same thing by starting with y Hermitian A = mu y Hermitian, which means y Hermitian times (A - mu I) = 0, with y not equal to 0; this means the matrix A - mu I has linearly dependent rows, or in other words it is again a singular matrix. So mu is an eigenvalue of A if and only if the determinant of A - mu I equals 0. Basically, corresponding to any eigenvalue you will always have at least one nonzero left eigenvector and one nonzero right eigenvector. What the result above shows is that if these two eigenvectors correspond to distinct eigenvalues, they will be orthogonal to each other. Does that answer your question? Yes sir, but those vectors will not be related; I mean, they could be anything in the row space and column space. Correct. They are related in the sense that, for distinct eigenvalues, they are perpendicular to each other. But for the same lambda, if I look at the left eigenvector and the right eigenvector, they need not be related to each other. In particular, if A x = lambda x and I take the Hermitian of this, what I get is x Hermitian A Hermitian = lambda complex conjugate times x Hermitian. In other words, x Hermitian is a left eigenvector of A Hermitian corresponding to the eigenvalue lambda star. So the left and right eigenvectors are not directly related to each other; they are related through a Hermitian or transpose cross-relation. You cannot write a direct relation between them. Similarly, if I take A x = lambda x and take the transpose, I get x transpose A transpose = lambda x transpose, which means that x transpose is a left eigenvector of A transpose corresponding to lambda. It is not a left eigenvector of A corresponding to lambda.
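This transpose cross-relation, and the fact that left and right eigenvectors of A for the same eigenvalue need not be parallel, can be sketched like this (same example matrix as before, my own choice):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

w, V = np.linalg.eig(A)
x = V[:, np.argmin(np.abs(w - 2.0))]   # right eigenvector: A x = 2 x

# Transposing A x = lam x gives x^T A^T = lam x^T: x^T is a left
# eigenvector of A TRANSPOSE, not of A itself.
print(np.allclose(x @ A.T, 2.0 * x))   # True

# A left eigenvector of A itself for lam = 2 satisfies y^H A = 2 y^H;
# here y is proportional to [1, -1], while x is proportional to [1, 0].
y = np.array([1.0, -1.0])
print(np.allclose(y @ A, 2.0 * y))     # True

# x and y are not parallel: for a non-normal matrix, the left and
# right eigenvectors of the SAME eigenvalue need not be related.
cos = abs(np.vdot(x, y)) / (np.linalg.norm(x) * np.linalg.norm(y))
print(cos < 0.99)  # True
```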
Yeah, so related through transpose means: if x is a right eigenvector of A, then x transpose will be a left eigenvector of A transpose, and vice versa. Correct. But if I take a matrix A and a particular eigenvalue lambda, I know that A will have at least one left eigenvector, call it y. So there will be a y such that y Hermitian A = lambda y Hermitian, and there will be an x such that A x = lambda x, but that x and y are not really related to each other. Yes sir, I got it. So, now here is the relationship between eigenvectors of similar matrices. Basically, if x is an eigenvector of B corresponding to lambda, where lambda is an eigenvalue of the matrix B, and if B is similar to A through the similarity matrix S, that is, B = S inverse A S, then S x is an eigenvector of A corresponding to the same eigenvalue lambda. This allows you to compute the eigenvectors of similar matrices very easily. If you know the eigenvectors of a particular matrix, and you know another matrix is similar to that matrix, then by just multiplying the eigenvectors of the first matrix by S you can get hold of all the eigenvectors of the other matrix. The proof is practically one line. B = S inverse A S and B x = lambda x. I just substitute: S inverse A S x = lambda x, and now I left-multiply by S, which gives A times S x = S lambda x; but lambda is just a scalar, so I can take it out and write this as lambda times S x. Since x is not equal to 0 and S is non-singular, S x is not equal to 0, and so S x is an eigenvector of A corresponding to lambda. I have a few more remarks to make. For example, the eigenvalues of a real symmetric matrix or a complex Hermitian matrix are real. So, if you take a complex Hermitian matrix or a real symmetric matrix, it will always have real-valued eigenvalues.
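The similarity result above is easy to verify numerically. Here is a small sketch, with A and the nonsingular S chosen by me for illustration:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
S = np.array([[1.0, 2.0],
              [0.0, 1.0]])        # any nonsingular similarity matrix
B = np.linalg.inv(S) @ A @ S      # B is similar to A

w, X = np.linalg.eig(B)
lam = w[0]
x = X[:, 0]                       # B x = lam x

# Then S x is an eigenvector of A for the same eigenvalue lam.
ok = np.allclose(A @ (S @ x), lam * (S @ x))
print(ok)  # True
```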
Suppose I take A in C to the n cross n and it is Hermitian, so A Hermitian = A. If I consider A x = lambda x and pre-multiply by x Hermitian, I get x Hermitian A x = lambda times x Hermitian x. Now, x Hermitian x is real and positive because x is a nonzero vector. If I take the Hermitian of x Hermitian A x, which for a scalar is the same as taking its complex conjugate, the Hermitian goes inside and it becomes x Hermitian A Hermitian x; but A Hermitian = A, so this is the same as x Hermitian A x. Doing the same on the right-hand side, I get lambda star times x Hermitian x. From this you can already see that lambda star and lambda must be equal. Another way to say it: I am taking a number, taking its complex conjugate, and getting back the same number, which means that x Hermitian A x is real-valued; and x Hermitian x is also real-valued and positive. So lambda, which equals x Hermitian A x divided by x Hermitian x, is real-valued: if I take the ratio of two real-valued quantities, I cannot suddenly get a complex-valued quantity. Another property: let us take an operator norm, or induced norm, on C to the n cross n, and let lambda be any eigenvalue of A in C to the n cross n. Then mod lambda is less than or equal to the norm of A. We have seen this before already, that the spectral radius is a lower bound on any matrix norm you can define on A, and this is basically restating something we have already seen.
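A quick numerical sketch of the realness property, with a complex Hermitian example matrix of my own choosing; the Rayleigh quotient x Hermitian A x over x Hermitian x from the argument above is checked as well.

```python
import numpy as np

# Complex Hermitian example matrix: A^H = A.
A = np.array([[2.0, 1.0 + 1.0j],
              [1.0 - 1.0j, 3.0]])
assert np.allclose(A, A.conj().T)

# Despite A being complex, all its eigenvalues are real.
w = np.linalg.eigvals(A)
print(np.allclose(w.imag, 0.0))  # True

# The Rayleigh quotient x^H A x / x^H x is likewise real for any x.
x = np.array([1.0 + 2.0j, -1.0j])
r = (x.conj() @ A @ x) / (x.conj() @ x)
print(abs(r.imag) < 1e-12)  # True
```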
Looking at what we have done here, it is a small exercise to write out a short proof. If A v = lambda v, then the norm of A v equals mod lambda times the norm of v, where this vector norm is the one that induced the matrix norm. That means the norm of A v divided by the norm of v equals mod lambda, and this is true for this particular v, the eigenvector of A corresponding to lambda. By definition, the matrix norm of A is the maximum of a quantity like this over all v not equal to 0, and therefore, by definition of the matrix norm, mod lambda is less than or equal to the norm of A. So, basically, any operator norm gives an upper bound on the magnitude of the eigenvalues of the matrix A. I think that is all we have time for today.
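This bound can be checked numerically for a few common induced norms; the matrix below is my own example, and the norms used are the induced 1-norm (max column sum), 2-norm (largest singular value), and infinity-norm (max row sum).

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])  # example matrix; eigenvalues 2 and 3

eigs = np.linalg.eigvals(A)
spectral_radius = float(np.max(np.abs(eigs)))

# Every induced norm upper-bounds the spectral radius.
for ord_ in (1, 2, np.inf):
    assert spectral_radius <= np.linalg.norm(A, ord=ord_) + 1e-12

print(spectral_radius)  # 3.0
```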