So let's consider a new problem. Suppose we try to find all eigenvectors for this matrix. As before, we'll pick some convenient seed vector, how about 1, 0, and we'll find v and Av. These are obviously dependent, giving us the equation Av minus v equals the zero vector, with corresponding minimal polynomial x minus 1, giving us lambda equals 1 as our eigenvalue and 1, 0 as our eigenvector. Since this is a 2 by 2 matrix, it can have up to 2 linearly independent eigenvectors, so we'll see if we can find another one. To find another eigenvector, we'll pick a seed vector that's linearly independent of our known eigenvectors, then grow it into a seedling by applying A minus I to the seed. So we'll use a seed vector u, how about 0, 1? We'll apply A minus I to u, and use the result as our seedling vector. We'll apply A and again get an obviously dependent set, which gives us a minimal polynomial and, once again, the eigenvalue lambda equals 1.

Let's take a closer look at what happened. If v is an eigenvector for a matrix A corresponding to eigenvalue lambda, we have A minus lambda I applied to v equals the zero vector. What we found is that Av minus v equals the zero vector, or A minus I applied to v equals the zero vector. That says our v is an eigenvector corresponding to eigenvalue lambda equals 1. But remember, this vector v was a seedling vector grown from the seed vector u via A minus I applied to u. So in fact we have A minus I applied to A minus I applied to u, and we can rewrite that as A minus I squared applied to u equals the zero vector, which looks very similar to the eigenvalue equation, and this leads to the following idea. A non-zero vector v is a generalized eigenvector of rank k corresponding to eigenvalue lambda if k is the least whole number for which A minus lambda I to the power k applied to v gives the zero vector. Note that ordinary eigenvectors are generalized eigenvectors of rank 1. Now, we found that for our vector u equals 0, 1, A minus I squared applied to u gave the zero vector, but A minus I applied to u wasn't the zero vector. And so u is a generalized eigenvector of rank 2 for lambda equals 1.

The theory of generalized eigenvectors is very similar to the theory of ordinary eigenvectors, and we have a number of results that follow more or less immediately. So without going into the details, we claim the following: suppose v is a rank k generalized eigenvector corresponding to lambda, where we'll assume k is greater than 1; then A minus lambda I applied to v is a rank k minus 1 generalized eigenvector. What that means is that if we have a generalized eigenvector, we can produce more generalized eigenvectors by applying A minus lambda I repeatedly, and so this leads to another definition. If v is a rank k generalized eigenvector corresponding to lambda, then v, A minus lambda I applied to v, A minus lambda I squared applied to v, and so on up to A minus lambda I to the power k minus 1 applied to v, form a chain of generalized eigenvectors corresponding to lambda. And this leads to the following result: the vectors in a chain of generalized eigenvectors corresponding to lambda are linearly independent of each other, and also of the eigenvectors for any eigenvalue mu not equal to lambda. We like linear independence because that gives us a basis, and so this leads to the idea of a generalized eigenspace. A defective matrix might not have enough eigenvectors to form a basis for its vector space, but its eigenvectors and generalized eigenvectors together do form a basis for the vector space the matrix acts on.
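If you want to check the rank 2 claim numerically, here's a small sketch. The 2 by 2 matrix on the slide isn't reproduced in the transcript, so the matrix below is an assumed stand-in (the Jordan block with eigenvalue 1), chosen because it has the same behavior: one eigenvector 1, 0, and 0, 1 as a rank 2 generalized eigenvector.

```python
import numpy as np

# Assumed stand-in for the lecture's 2x2 matrix (its entries aren't in the
# transcript): a Jordan block with the single eigenvalue 1.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
I = np.eye(2)

u = np.array([0.0, 1.0])       # seed vector, independent of the eigenvector (1, 0)
seedling = (A - I) @ u         # grow the seed into a seedling: (A - I)u

print(seedling)                # (1, 0): not zero, so u is not an ordinary eigenvector
print((A - I) @ seedling)      # (0, 0): (A - I)^2 u = 0

# k = 2 is the least power with (A - I)^k u = 0, so u is a generalized
# eigenvector of rank 2 for lambda = 1, and (u, (A - I)u) is a chain.
```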
And we'll prove this as follows: we'll assign it as a homework problem. This means we can use our approach to find all eigenvalues and eigenvectors for a square matrix A. We'll choose a seed vector; all roots of the minimal polynomial with respect to that seed will be eigenvalues, and we can find the corresponding eigenvectors. If we haven't found all the eigenvectors, we choose another seed vector u, grow it into a seedling, and as long as our seedlings don't die out — that is, as long as the seedling isn't the zero vector — we can find the minimal polynomial with respect to the seedling, and lather, rinse, repeat.

So here's a rather horrifying matrix. Since this is a 4 by 4 matrix, the minimal polynomial has degree at most 4, so we'll choose our standard seed vector and compute v, Av, A squared v, A cubed v, and A to the fourth v. We'll use these as column vectors and row reduce, which gives us our minimal polynomial, and we find eigenvalues lambda equals minus 1 and lambda equals 2. Once we know the eigenvalues, we can find the corresponding eigenvectors.

Now we might notice something interesting here. Our minimal polynomial applied to the seed vector gives us the zero vector, and since the minimal polynomial contains the factor x minus 2 squared, that means A minus 2I squared applied to our seedling vector gives us the zero vector. So our seedling vector is a rank 2 generalized eigenvector, and that also means it's the start of a chain of generalized eigenvectors. The other vector in the chain is found by applying A minus 2I, and so we find this vector, which is a rank 1 generalized eigenvector for lambda equals 2 — in other words, it's the ordinary eigenvector. So we can replace the eigenvector we found with the chain of generalized eigenvectors, and since this gives us four linearly independent vectors, there can be no others: we've found our eigenvalues and our full set of eigenvectors and generalized eigenvectors.

If our minimal polynomial didn't have the factor x minus 2 squared, or if we didn't notice it, we could instead have grown another seed into a seedling. Our known eigenvalues are lambda equals minus 1 and lambda equals 2, so we'll pick a new seed vector v, how about 0, 1, 0, 0, and apply A minus 2I times A plus I — a polynomial with each of our known eigenvalues as a root — to get our seedling vector u. Now, since the minimal polynomial with respect to v has degree at most 4, and u is already a quadratic in A times v, the minimal polynomial with respect to u will have degree at most 4 minus 2, that is, 2. So we only need to compute u, Au, and A squared u. Using these as column vectors and row reducing gives us the minimal polynomial x minus 2. And again, A minus 2I applied to u is the same as A minus 2I squared applied to A plus I applied to v, and we've already computed A plus I applied to v. So once again that gives us the first vector in a chain of generalized eigenvectors, and it's the same vector we found before, so the chain of generalized eigenvectors is going to be the same. And once again we've found all four eigenvectors and generalized eigenvectors, so there can't be any more.
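To make the seed-and-seedling procedure concrete, here's a sketch of it in numpy. The 4 by 4 matrix from the slide isn't reproduced in the transcript, so the matrix below is an assumed stand-in built from a Jordan form with eigenvalues minus 1 and 2, where 2 is defective; the helper name minimal_poly_wrt is mine, and a least-squares dependence check stands in for the row reduction done on the slide.

```python
import numpy as np

def minimal_poly_wrt(A, v, tol=1e-9):
    """Coefficients (highest power first) of the minimal polynomial of A with
    respect to the seed v, found by locating the first linear dependence among
    v, Av, A^2 v, ... (least squares here plays the role of row reduction)."""
    cols = [v]
    while True:
        w = A @ cols[-1]                      # next power: A^k v
        K = np.column_stack(cols)             # [v, Av, ..., A^(k-1) v]
        c = np.linalg.lstsq(K, w, rcond=None)[0]
        if np.linalg.norm(K @ c - w) < tol:   # A^k v depends on earlier columns
            # A^k v = c_0 v + ... + c_{k-1} A^(k-1) v
            return np.concatenate(([1.0], -c[::-1]))
        cols.append(w)

# Assumed stand-in for the lecture's 4x4 matrix (its entries aren't in the
# transcript): eigenvalues -1 and 2, with 2 defective (one ordinary
# eigenvector plus a rank-2 generalized eigenvector), built as P J P^(-1).
J = np.array([[ 2, 1,  0,  0],
              [ 0, 2,  0,  0],
              [ 0, 0, -1,  0],
              [ 0, 0,  0, -1]], dtype=float)
P = np.array([[ 1, 1, 0, 1],
              [ 0, 1, 1, 0],
              [ 1, 0, 1, 1],
              [ 0, 0, 1, 1]], dtype=float)
A = P @ J @ np.linalg.inv(P)
I = np.eye(4)

v = np.array([1.0, 0.0, 0.0, 0.0])            # standard seed vector
p = minimal_poly_wrt(A, v)
print(np.roots(p))                            # roughly [2, 2, -1]: the eigenvalues

# The repeated root 2 signals a chain: stripping off the factor for lambda = -1
# leaves a seedling killed by (A - 2I)^2, i.e. a rank-2 generalized eigenvector.
seedling = (A + I) @ v
print((A - 2*I) @ (A - 2*I) @ seedling)       # ~ zero vector

eigvec = (A - 2*I) @ seedling                 # rank-1 vector in the chain
print(A @ eigvec - 2*eigvec)                  # ~ zero vector: ordinary eigenvector for 2
```

With a different assumed matrix you would swap in its entries for A and, if the repeated factor appears for a different eigenvalue, adjust which factor gets stripped off when forming the seedling.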