Suppose matrix A has eigenvector v corresponding to eigenvalue λ. How could we find a generalized eigenvector of rank k corresponding to eigenvalue λ? To answer this question, it's helpful to remember that every problem in linear algebra can be solved by reducing it to a system of linear equations and solving. So let's think about this. If u1 is a generalized eigenvector of rank 2 corresponding to eigenvalue λ, we must have (A - λI)² u1 = 0, which we can rewrite as (A - λI)((A - λI) u1) = 0. Now, remember v is an eigenvector corresponding to eigenvalue λ, so we know that (A - λI) v = 0. So we might require that (A - λI) u1 = v. Any non-zero vector u1 satisfying this is a generalized eigenvector of rank 2, since then (A - λI)² u1 = (A - λI) v = 0 while (A - λI) u1 = v is not zero.

Well, lather, rinse, repeat. Now suppose u2 is a generalized eigenvector of rank 3 corresponding to eigenvalue λ. We have (A - λI)³ u2 = 0, which we can rewrite as (A - λI)²((A - λI) u2) = 0. And we already know that (A - λI)² u1 = 0. So we might require (A - λI) u2 = u1.

This suggests the following approach. Find any eigenvector v with corresponding eigenvalue λ. Then, if possible, find a non-zero vector u1 where (A - λI) u1 = v. And again, if possible, find u2 where (A - λI) u2 = u1, and lather, rinse, repeat. Now remember, we're interpreting matrix multiplication as a linear transformation. So one way to look at these generalized eigenvectors is that they're vectors that eventually map onto the eigenvector: u2 → u1 → v → 0 under A - λI.

For example, let's try to find all eigenvectors and generalized eigenvectors for this matrix. Somehow we determined that this matrix has eigenvalue λ = 1 and corresponding eigenvector (1, 0, 0). First we'll see if we can find u1 where (A - I) u1 = v, where v is the eigenvector for λ = 1. We already have the eigenvector, so we set up and solve the equation: A - I is A with 1's subtracted along the diagonal, and we apply it to an unknown vector, wanting to get the eigenvector (1, 0, 0). Row reduction gives a parameterized solution, and if we let t = 0, we find u1 = (0, 1, 0), a generalized eigenvector for λ = 1. Since A - I applied to this vector gives the actual eigenvector, this is a rank-2 generalized eigenvector.

So (0, 1, 0) maps to (1, 0, 0). Now we want to see if there's something that maps to (0, 1, 0). Let's see if we can find u2 where (A - I) u2 = u1. This gives us a matrix equation, and row reduction gives a parameterized solution; no need to be fancy, we can choose t = 0 and get u2 = (0, -2, 1). This vector maps onto (0, 1, 0), which maps onto the eigenvector, so this is a rank-3 generalized eigenvector. Notice that we're up to three linearly independent vectors: one actual eigenvector, (1, 0, 0), and two generalized eigenvectors for λ = 1.

This should be all the eigenvectors, but here's a useful strategy in life: make sure things fail when they're supposed to. We should not be able to find a fourth eigenvector this way, so let's make sure we can't. We begin the same way and try to find some vector that maps onto that last generalized eigenvector, (0, -2, 1). When we try to solve this equation, the last row corresponds to the equation 0·x1 + 0·x2 + 0·x3 = 1, which is unsolvable. No, no, that's a good thing: we wanted to make sure that we didn't get a fourth eigenvector for a 3-by-3 matrix.
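The matrix in this example appears on screen rather than in the transcript, but one matrix consistent with every number quoted above (eigenvalue 1, eigenvector (1, 0, 0), and chain vectors (0, 1, 0) and (0, -2, 1)) is the upper-triangular matrix used below. Here's a minimal sketch in Python/SymPy of the solve-map-repeat process, under that assumption:

```python
import sympy as sp

# Hypothetical matrix consistent with the walkthrough's numbers;
# the actual matrix is shown on screen, not in the transcript.
A = sp.Matrix([[1, 1, 2],
               [0, 1, 1],
               [0, 0, 1]])
I = sp.eye(3)
v = sp.Matrix([1, 0, 0])              # eigenvector for lambda = 1

# Solve (A - I) u1 = v.  gauss_jordan_solve returns the general
# parameterized solution together with its free parameters.
u1, params = (A - I).gauss_jordan_solve(v)
u1 = u1.subs({p: 0 for p in params})  # choose t = 0  ->  (0, 1, 0)

# Lather, rinse, repeat: solve (A - I) u2 = u1.
u2, params = (A - I).gauss_jordan_solve(u1)
u2 = u2.subs({p: 0 for p in params})  # choose t = 0  ->  (0, -2, 1)

# The chain maps down to the eigenvector and then to zero.
assert (A - I) * u2 == u1
assert (A - I) * u1 == v
assert (A - I) * v == sp.zeros(3, 1)

# Make sure things fail when they're supposed to: a fourth solve
# is inconsistent, so there is no rank-4 generalized eigenvector.
try:
    (A - I).gauss_jordan_solve(u2)
except ValueError:
    print("no rank-4 generalized eigenvector, as expected")
```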
Let's take another look at finding eigenvalues and generalized eigenvectors. Here's a matrix, and somehow we find the first eigenvalue, λ = 2, with eigenvector (1, 1, 1). We'll see if λ = 2 produces a generalized eigenvector, so we want to find some vector that maps onto our eigenvector (1, 1, 1). That gives us a matrix equation. However, the coefficients in the first row are all zeros while the right-hand side is not, so the system has no solution, and there is no generalized eigenvector corresponding to λ = 2.

Again, somehow we find that eigenvalue λ = 1 has associated eigenvector (0, 0, 1), and we see if λ = 1 produces a generalized eigenvector by trying to solve (A - I) u1 = (0, 0, 1). This has a parameterized solution, and choosing t = 1 gives us a vector that maps to the eigenvector, so it is a rank-2 generalized eigenvector for λ = 1.

Now, since A acts on vectors in R³, we should expect there to be only three linearly independent eigenvectors, and we've already found three of them. So if we try to find another, we should fail. We can verify this by trying to find a vector that maps onto the rank-2 generalized eigenvector, and the first two rows of the row-reduced system correspond to the equations x1 = 0 and x1 = 1, so the system is unsolvable. Also, since additional eigenvalues would produce new eigenvectors, we've actually found all of the eigenvalues as well.

This suggests a revised strategy for finding the eigenvalues and eigenvectors of some matrix A. First, pick any nonzero seed vector w and find the least k for which the vectors w, Aw, A²w, ..., Aᵏw are linearly dependent; the non-trivial coefficients in that dependence give a polynomial p with p(A) w = 0, and all roots of that minimal polynomial will be eigenvalues. Then find all corresponding eigenvectors and generalized eigenvectors. At this point, if you don't have n linearly independent vectors, choose another seed vector linearly independent of the known eigenvectors, and lather, rinse, repeat. But any algorithm that begins with "choose" leaves open the possibility that the wrong choice will cause the algorithm to fail. Yes, this time the worry is appropriate. So the important question we have to ask is: will we always find all the eigenvalues this way? We'll take a look at that next.
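As with the first example, this matrix is only shown on screen; a matrix consistent with the numbers quoted (eigenvalue 2 with eigenvector (1, 1, 1), eigenvalue 1 with eigenvector (0, 0, 1) and one rank-2 generalized eigenvector) is assumed below. Here's a minimal sketch of the seed-vector step of the revised strategy, again in Python/SymPy:

```python
import sympy as sp

# Assumed matrix consistent with the second example's numbers;
# the actual matrix is shown on screen, not in the transcript.
A = sp.Matrix([[2, 0, 0],
               [1, 1, 0],
               [0, 1, 1]])
w = sp.Matrix([1, 0, 0])          # any nonzero seed vector

# Stack w, Aw, A^2 w, ... as columns until they become linearly
# dependent; if that first happens at A^k w, the dependence gives
# a degree-k polynomial p with p(A) w = 0.
cols = [w]
while sp.Matrix.hstack(*cols).rank() == len(cols):
    cols.append(A * cols[-1])
M = sp.Matrix.hstack(*cols)

# The non-trivial coefficients of the dependence
#   c0·w + c1·Aw + ... + ck·A^k w = 0
# are the coefficients of p.
coeffs = M.nullspace()[0]
x = sp.symbols('x')
p = sum(c * x**i for i, c in enumerate(coeffs))
print(sp.factor(p))               # (x - 2)*(x - 1)**2

# Every root of p is an eigenvalue: here 1 and 2, as in the example.
print(sp.roots(sp.Poly(p, x)))    # {2: 1, 1: 2}
```

From there, each root λ feeds the earlier solve-map-repeat process to collect its eigenvectors and generalized eigenvectors.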