So remember we defined a non-zero vector v as a generalized eigenvector of rank k for lambda if k is the least value for which (A minus lambda I) to the k applied to v gives us the zero vector. First we'll introduce a quick result. Suppose v is a generalized eigenvector of rank k corresponding to eigenvalue lambda; then (A minus lambda I) applied to v is a generalized eigenvector of rank k minus 1 corresponding to the same eigenvalue lambda. And you should prove this.

Now we found generalized eigenvectors by finding an eigenvector v and then solving for the vectors that map onto v, or the vectors that map onto a vector that maps onto v, and so on. This gives us a sequence of vectors, and so we introduced the following idea. Let v be a generalized eigenvector of rank k for eigenvalue lambda. Then (A minus lambda I) applied to v is a generalized eigenvector of rank k minus 1, (A minus lambda I) squared applied to v is a generalized eigenvector of rank k minus 2, and so on. And we get a sequence of vectors which we'll call a Jordan chain.

At this point we'll introduce a useful strategy in linear algebra: given a set of vectors, always ask whether they're independent. Well, we can always ask; let's try to prove it. Clearly the set consisting of the last element of the chain, (A minus lambda I) to the power k minus 1 applied to v, is independent, since it's a single non-zero vector, in fact an ordinary eigenvector. So suppose we add the preceding chain elements one by one, and we let our set of vectors be the last independent set of chain elements. In particular, if we include the preceding chain element, we get a dependent set, so we can write that element as a linear combination of the later elements of the chain. Now if we multiply both sides of that equation by (A minus lambda I), every chain element gets pushed one step further along the chain, and since our assumption is that v has rank k, the term coming from the last chain element becomes (A minus lambda I) to the k applied to v, which is the zero vector. What's left is a dependence among the remaining chain elements, so the set consisting of all the remaining vectors is going to be dependent. But this contradicts our assumption that this set was our last independent set. Now we're not going to do all your homework for you.
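As a quick numerical sketch of the chain construction (this example is not from the lecture; it uses numpy and a hypothetical 3 by 3 matrix consisting of a single Jordan block with eigenvalue 2):

```python
import numpy as np

# Hypothetical example: a single 3x3 Jordan block with eigenvalue 2.
A = np.array([[2., 1., 0.],
              [0., 2., 1.],
              [0., 0., 2.]])
lam = 2.0
N = A - lam * np.eye(3)          # A - lambda*I

# v is a generalized eigenvector of rank 3:
# (A - lambda I)^3 v = 0, but (A - lambda I)^2 v != 0.
v = np.array([0., 0., 1.])
assert np.allclose(np.linalg.matrix_power(N, 3) @ v, 0)
assert not np.allclose(np.linalg.matrix_power(N, 2) @ v, 0)

# The Jordan chain: v, (A - lambda I)v, (A - lambda I)^2 v,
# with ranks 3, 2, 1 respectively.
chain = [v, N @ v, N @ N @ v]

# Stacking the chain as columns and checking the rank of the
# resulting matrix confirms the chain is linearly independent.
C = np.column_stack(chain)
print(np.linalg.matrix_rank(C))   # 3 -> independent
```

Applying (A minus lambda I) to each column here shifts it to the next one, which is exactly the "push one step along the chain" used in the independence argument above.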
There are some details we omitted in this proof, and you should find them and elaborate on them for homework. The important result in any case is that if v is a generalized eigenvector of rank k for eigenvalue lambda, the corresponding Jordan chain is linearly independent. Given that, we'll prove the following. Let matrix A have distinct eigenvalues lambda 1 through lambda k, and let u i be a Jordan chain of generalized eigenvectors for eigenvalue lambda i. Then the union of all of these Jordan chains is an independent set of vectors. Did I say we'll prove this? You should prove it.

Because the union of Jordan chains is an independent set of vectors, and the vectors in a chain have a nice relationship to one another, we define the canonical basis: the canonical basis of A is a union of the Jordan chains. Now if you actually understand this definition you should be a little bit confused, and that's because A is an n by n matrix, while the vectors in the Jordan chains are n by 1 vectors. So you might wonder how it is that these vectors are called a basis for the matrix. And the answer is, oh my, look at the time.

The important thing to remember here is that any set of independent vectors forms a basis for some vector space. Now if A is an n by n matrix, it acts on an n-dimensional vector space. There will be a total of n eigenvectors, counting both regular eigenvectors and generalized eigenvectors, and so the canonical basis is actually a basis for the domain of A. And if we're talking about matrices acting on vectors in Rn, then the union of Jordan chains forms a basis for Rn. This provides us a way to find all eigenvalues and eigenvectors without having to find the characteristic polynomial; that's the method we've been using all along. And it also explains what's characteristic about the characteristic polynomial, and that's the following. Let A have characteristic polynomial p of lambda; then for any vector v, applying the analogous matrix polynomial p of A to v is going to give us the zero vector.
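We can watch this annihilation happen numerically. The sketch below (again in numpy, with a hypothetical matrix; numpy's poly routine returns the coefficients of the characteristic polynomial, highest degree first) evaluates the matrix polynomial p of A by Horner's method and applies it to a vector:

```python
import numpy as np

# Hypothetical 3x3 matrix; nothing special about the entries.
A = np.array([[2., 1., 0.],
              [0., 2., 0.],
              [0., 0., 3.]])

# Coefficients of the characteristic polynomial det(lambda*I - A),
# highest degree first: lambda^3 + c2*lambda^2 + c1*lambda + c0.
coeffs = np.poly(A)

# Evaluate the analogous matrix polynomial p(A) via Horner's method.
p_A = np.zeros_like(A)
for c in coeffs:
    p_A = p_A @ A + c * np.eye(3)

# p(A) annihilates every vector -- in fact p(A) is the zero matrix
# (the Cayley-Hamilton theorem).
v = np.array([1., -2., 5.])
print(np.allclose(p_A @ v, 0))   # True
```

The stronger statement that p of A is the zero matrix itself is the Cayley-Hamilton theorem; the vector version above is what falls out of the canonical basis argument.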
This follows rather immediately, because v can be written as a linear combination of the eigenvectors and generalized eigenvectors of A, and each of those basis vectors is annihilated by the factor of the characteristic polynomial corresponding to its eigenvalue, raised to a sufficient power; since the factors commute, the full matrix polynomial sends every term of the combination to zero. Moreover, if lambda has algebraic multiplicity k, then there are k independent generalized eigenvectors for lambda; that is, the generalized eigenspace for lambda, the kernel of (A minus lambda I) to the k, has dimension exactly k. And that makes the final connection between the algebraic and geometric multiplicities of the eigenvalues and what they mean in terms of the matrix.
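To make the multiplicity connection concrete, here is one more numpy sketch (a hypothetical matrix where the two multiplicities disagree): the geometric multiplicity is the dimension of the kernel of (A minus lambda I), while the algebraic multiplicity is the dimension of the kernel of (A minus lambda I) to the k.

```python
import numpy as np

# Hypothetical matrix with eigenvalue 2 of algebraic multiplicity 2
# but geometric multiplicity 1, plus a simple eigenvalue 3.
A = np.array([[2., 1., 0.],
              [0., 2., 0.],
              [0., 0., 3.]])
lam, k = 2.0, 2                  # algebraic multiplicity of lambda=2 is k=2

N = A - lam * np.eye(3)

# Geometric multiplicity: dim ker(A - lambda*I) = n - rank(N).
print(3 - np.linalg.matrix_rank(N))                              # 1

# Dimension of the generalized eigenspace ker((A - lambda*I)^k):
# it equals the algebraic multiplicity k.
print(3 - np.linalg.matrix_rank(np.linalg.matrix_power(N, k)))   # 2
```

The gap between the two numbers is filled exactly by the generalized eigenvectors, which is why the canonical basis always reaches a full n vectors.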