It is a two-step procedure. Step one: solve the characteristic polynomial to find the eigenvalues λ_1, ..., λ_n. This can be done in closed form only for the 2 × 2 case; for larger dimensions you will have to solve it by hand for specific A matrices, or use a solver, a zero-finding algorithm, to find all the zeros of that polynomial equation. Step two: find the eigenvectors by finding the null space of A − λI. Of course, this is okay for small-dimensional systems, but if you have a very large-dimensional system, then the errors can accumulate. So this is not the most desirable way to compute eigenvalues and eigenvectors for very large-dimensional systems. Much later we will see some numerical techniques to find eigenvalues and eigenvectors.

Now, a couple of properties of these eigenvectors, but before that I want to say something about multiplicity. In this context, we consider multiplicity to be simply the number of times an eigenvalue occurs as a zero of the characteristic polynomial. But a more thorough way of looking at multiplicity is in terms of the derivatives of the polynomial, and whether a particular eigenvalue is also a zero of the derivative of the characteristic polynomial. Specifically, a polynomial p(t) has λ as a zero of multiplicity k (which is always ≥ 1) if and only if we can write p(t) in the form p(t) = (t − λ)^k q(t), where q(t) is a polynomial such that q(λ) ≠ 0. So if I take, for example, the derivative p'(t) = dp(t)/dt, this will be equal to k (t − λ)^(k−1) q(t) + (t − λ)^k q'(t). And from this representation, you can see clearly that p'(λ) will be equal to zero if and only if k is strictly more than 1.
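The two-step procedure above can be sketched numerically. This is my own illustrative example with a small 2 × 2 matrix, not from the lecture; in practice one would call a routine such as numpy.linalg.eig directly, for exactly the accuracy reasons mentioned above.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # small symmetric example; eigenvalues are 3 and 1

# Step 1: the eigenvalues are the zeros of the characteristic polynomial
# det(A - lambda*I) = 0.
coeffs = np.poly(A)         # coefficients of the characteristic polynomial of A
eigvals = np.roots(coeffs)  # its zeros, i.e. the eigenvalues

# Step 2: for each eigenvalue, an eigenvector spans the null space of A - lambda*I.
eigvecs = []
for lam in eigvals:
    M = A - lam * np.eye(A.shape[0])
    # Null space via SVD: the right-singular vector for the (near-)zero singular value.
    _, _, Vt = np.linalg.svd(M)
    eigvecs.append(Vt[-1])

# Sanity check: A v = lambda v for each pair.
for lam, v in zip(eigvals, eigvecs):
    assert np.allclose(A @ v, lam * v)
```

For large matrices this root-finding route is exactly the error-prone path the lecture warns about; the dedicated eigensolvers covered later avoid forming the characteristic polynomial at all.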
If k = 1, then k − 1 = 0 and the first term becomes k q(λ); the second term goes to zero, but that does not matter, because q(λ) ≠ 0 and k = 1, so the derivative is nonzero when k = 1. So for p'(λ) to be equal to zero, we need k > 1 in this representation. Similarly, if you take the second derivative for k > 1, you get p''(t) = k (k − 1) (t − λ)^(k−2) q(t) plus some other terms, which all carry a factor of the form (t − λ)^m with m ≥ k − 1. Once again, you can see that p''(λ) will be equal to zero if and only if k > 2: if k = 2, then the first term carries (t − λ)^0, which remains equal to 1 when you substitute t = λ, and k(k − 1) is nonzero and q(λ) is nonzero, so this whole term is nonzero even though the other terms go to zero. So p''(λ) = 0 if and only if k > 2, and so on. This calculation shows that λ is a zero of p(t) with multiplicity k if and only if p(λ) = p'(λ) = ⋯ = p^(k−1)(λ) = 0 and p^(k)(λ) ≠ 0. So this is how you get a more precise definition of the multiplicity of eigenvalues.

Now, a couple of properties of eigenvectors. The first property is that if λ_1 through λ_n are distinct, that is, no two of them are identical, then there exists a linearly independent set of n eigenvectors.
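The derivative test can be checked numerically. Here is a small sketch with an illustrative polynomial of my own choosing (not from the lecture): p(t) = (t − 2)^3 (t + 1), so λ = 2 is a zero of multiplicity k = 3, and p together with its first k − 1 derivatives should vanish at λ while the k-th derivative should not.

```python
import numpy as np

# p(t) = (t - 2)^3 (t + 1): lambda = 2 is a zero of multiplicity k = 3.
p = np.poly([2, 2, 2, -1])      # polynomial coefficients from the roots 2, 2, 2, -1

lam, k = 2.0, 3

# p and its first k - 1 derivatives vanish at lambda ...
for j in range(k):
    d = np.polyder(p, j)        # j-th derivative of the polynomial
    assert abs(np.polyval(d, lam)) < 1e-9

# ... while the k-th derivative does not.
assert abs(np.polyval(np.polyder(p, k), lam)) > 1e-9
```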
This is saying a little more than the definition: if λ_1 to λ_n are distinct eigenvalues, then corresponding to each eigenvalue I get one nonzero eigenvector from the definition itself. But what we are saying here is that these n eigenvectors are linearly independent, which means they have unique directions, and since they are sitting in n-dimensional space, they will actually span ℝ^n. The second property is that if an eigenvalue is repeated r times, then A will still have n linearly independent eigenvectors, provided rank(A − λI) = n − r; and this should hold for every distinct repeated eigenvalue. This partially answers the question we asked at the beginning of the class: when will the matrix A have n linearly independent eigenvectors? One case is when all the eigenvalues are distinct. Another case is when there are repeated eigenvalues, but for each repeated eigenvalue the rank of A − λI equals n − r, where r is the number of times that eigenvalue is repeated; then also A has n linearly independent eigenvectors. However, the directions of the r eigenvectors associated with a repeated eigenvalue are not unique: there are multiple ways to choose a basis for that r-dimensional space, and therefore multiple ways to find a set of n linearly independent eigenvectors. The third point is that if v_1, v_2, ..., v_r are eigenvectors associated with an eigenvalue λ repeated r times, then any v in span{v_1, ..., v_r} is also an eigenvector associated with the same eigenvalue. For example, if I consider the 3 × 3 matrix A = diag(1, 0, 0), then any v in span{e_2, e_3}, where e_2 = (0, 1, 0) and e_3 = (0, 0, 1), is an eigenvector corresponding to λ = 0.
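The rank condition and the span property can both be verified on the diag(1, 0, 0) example from the lecture; the NumPy phrasing below is my own sketch.

```python
import numpy as np

# A has eigenvalue 0 repeated r = 2 times (and eigenvalue 1 once), with n = 3.
A = np.diag([1.0, 0.0, 0.0])
n, lam, r = 3, 0.0, 2

# Rank test for a full set of n linearly independent eigenvectors:
# rank(A - lambda*I) == n - r for the repeated eigenvalue lambda.
assert np.linalg.matrix_rank(A - lam * np.eye(n)) == n - r

# Any vector in span{e2, e3} is an eigenvector for lambda = 0,
# e.g. an arbitrary combination 2*e2 - 3*e3:
v = 2.0 * np.array([0.0, 1.0, 0.0]) - 3.0 * np.array([0.0, 0.0, 1.0])
assert np.allclose(A @ v, lam * v)
```

Here the null space of A − 0·I is two-dimensional, so any basis of span{e_2, e_3} serves as the pair of eigenvectors, illustrating why the directions are not unique.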
And the identity matrix has n repeated eigenvalues, all equal to 1, so any v in ℝ^n is actually an eigenvector. Okay, I am out of time, so we will stop here and continue again on Friday.