So, given an n-by-n matrix A and any non-zero seed vector v, there is a least value k for which the set {v, Av, A^2 v, ..., A^k v} is dependent. And so we can find coefficients a_0, a_1, ..., a_{k-1} which satisfy the vector equation A^k v + a_{k-1} A^{k-1} v + ... + a_1 Av + a_0 v = 0 (because k is least, A^k v really is a combination of the earlier vectors, so we may take the leading coefficient to be 1), and there's a corresponding minimal polynomial m(x) = x^k + a_{k-1} x^{k-1} + ... + a_1 x + a_0.

So, why did we do that? We claim the following important result: all the roots of the minimal polynomial are eigenvalues of A. Here's a quick sketch of the proof. The factorization of the minimal polynomial, m(x) = (x - λ_1)(x - λ_2)···(x - λ_k), gives us a factorization of the matrix equation: (A - λ_1 I)(A - λ_2 I)···(A - λ_k I)v = 0. The first thing we note is that the product consisting of the last k - 1 factors applied to v, call it w = (A - λ_2 I)···(A - λ_k I)v, can't be the zero vector. The reason is that if it were, expanding it would give a non-trivial linear combination of the vectors v through A^{k-1} v equal to the zero vector, so that set would be dependent. But we assumed that k was the least value for which such a set is dependent.

Remember, another way of characterizing an eigenvector-eigenvalue pair is the following: if v is not the zero vector and (A - λI)v is the zero vector, then v is an eigenvector for the eigenvalue λ. Now, since the entire product does give us the zero vector, and the last portion w of the product is not the zero vector, it follows that w is an eigenvector corresponding to the eigenvalue λ_1. And since the factors commute, we can rearrange them to get eigenvectors for the other eigenvalues λ_j. (The argument is written out in symbols below.)

Well, let's try it out. Let's find the eigenvalues of this 2-by-2 matrix. We could choose any seed vector we want to, and there's no point in picking anything too complicated. We'll choose our seed vector and find v, Av, and A^2 v. This set of three vectors in R^2 is necessarily dependent. Row-reducing the matrix with these columns shows that x_3 is our free variable, so it's the third vector, A^2 v, that makes the set dependent. We get a parameterized solution, a linear combination equal to zero, and we'll write this in a more conventional form by putting the highest-power terms first.

Now, this gives us the minimal polynomial x^2 - 2x - 8, which, because the universe is kind and gentle, factors as (x - 4)(x + 2). We can't always count on this factoring, but in this particular case it works. The roots are x = 4 and x = -2, and so the eigenvalues are 4 and -2.

How about a 3-by-3 matrix? Since the matrix acts on vectors in R^3, any set of four vectors in R^3 will necessarily be dependent, so the set {v, Av, A^2 v, A^3 v} must be dependent. I'll pick a simple seed vector, v = (1, 0, 0), and we'll find Av, A^2 v, and A^3 v. We let these be column vectors, row-reduce our matrix of column vectors, and read off a parameterized solution, a linear combination equal to the zero vector, and a minimal polynomial with roots 1, 2, and 3, which are our three eigenvalues.

And once we have the eigenvalues, we can find the eigenvectors as before. For λ = 1, we row-reduce A - I and parameterize the solutions; letting t = 1 gives the eigenvector (-3, -2, 1). For λ = 2, we row-reduce A - 2I and parameterize the solutions; t = 1 gives the eigenvector. For λ = 3, we proceed in exactly the same way. (Numpy sketches of both computations appear below.)

So the question to ask is: can we find all the eigenvalues and eigenvectors this way, and avoid finding the characteristic polynomial? And the answer is... Yes!
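For reference, here is the proof sketch written out in symbols. This is just the argument above, assuming we work over the complex numbers so that m(x) factors completely into linear factors:

```latex
% m(x) factors as m(x) = (x - \lambda_1)(x - \lambda_2) \cdots (x - \lambda_k),
% so the vector equation m(A)v = 0 factors as
\[
  (A - \lambda_1 I)(A - \lambda_2 I) \cdots (A - \lambda_k I)\, v = \mathbf{0}.
\]
% Let w be the product of the last k - 1 factors applied to v:
\[
  w = (A - \lambda_2 I) \cdots (A - \lambda_k I)\, v
    = A^{k-1} v + (\text{a combination of } v, Av, \dots, A^{k-2}v).
\]
% If w were the zero vector, this would be a non-trivial dependence among
% v, Av, ..., A^{k-1}v, contradicting the minimality of k.  Hence
\[
  w \neq \mathbf{0} \quad\text{and}\quad (A - \lambda_1 I)\, w = \mathbf{0},
\]
% so w is an eigenvector of A with eigenvalue \lambda_1.  Rearranging the
% commuting factors gives eigenvectors for the remaining \lambda_j.
```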
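The 2-by-2 matrix itself appears only on the slide, not in the transcript, so the sketch below substitutes a hypothetical matrix, A = [[1, 3], [3, 1]], which happens to have the same minimal polynomial x^2 - 2x - 8. It's a minimal numpy sketch of the seed-vector computation, not the lecture's exact steps:

```python
import numpy as np

# Hypothetical stand-in for the 2x2 matrix on the slide (not reproduced in
# the transcript); its minimal polynomial is the same x^2 - 2x - 8.
A = np.array([[1.0, 3.0],
              [3.0, 1.0]])

v = np.array([1.0, 0.0])           # nothing too complicated for a seed

# Three vectors v, Av, A^2 v in R^2 must be dependent; express A^2 v in
# terms of v and Av by solving  [v  Av] c = A^2 v.
K = np.column_stack([v, A @ v])
c = np.linalg.solve(K, A @ A @ v)  # c = [c0, c1], so A^2 v = c0 v + c1 Av

# Minimal polynomial x^2 - c1 x - c0, coefficients highest power first.
print(np.roots([1.0, -c[1], -c[0]]))   # -> [ 4. -2.]
```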
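Likewise, the lecture's 3-by-3 matrix isn't in the transcript, so this sketch stands in a hypothetical one: the companion matrix of (x - 1)(x - 2)(x - 3), which has eigenvalues 1, 2, and 3 by construction (its eigenvectors will differ from the slide's (-3, -2, 1)). One caveat worth naming: an unlucky seed, for instance one that is itself an eigenvector, makes the Krylov set go dependent early and recovers only some of the eigenvalues, so the seed (1, 0, 0) works here but isn't guaranteed to in general.

```python
import numpy as np

# Hypothetical stand-in for the lecture's 3x3 matrix: the companion matrix
# of (x - 1)(x - 2)(x - 3) = x^3 - 6x^2 + 11x - 6, so its eigenvalues are
# 1, 2, 3 by construction.
A = np.array([[0.0, 0.0,   6.0],
              [1.0, 0.0, -11.0],
              [0.0, 1.0,   6.0]])

v = np.array([1.0, 0.0, 0.0])      # the seed vector from the lecture

# Columns v, Av, A^2 v; any four vectors in R^3 are dependent, so solve
# K c = A^3 v, i.e.  A^3 v = c0 v + c1 Av + c2 A^2 v.
K = np.column_stack([v, A @ v, A @ A @ v])
c = np.linalg.solve(K, A @ A @ A @ v)

# Minimal polynomial x^3 - c2 x^2 - c1 x - c0, highest power first.
eigenvalues = np.sort(np.roots([1.0, -c[2], -c[1], -c[0]]))
print(eigenvalues)                  # -> [1. 2. 3.]

# Eigenvectors as before: a null-space vector of A - lambda*I, here read
# off from the SVD (the right singular vector for the zero singular value).
for lam in eigenvalues:
    _, _, Vt = np.linalg.svd(A - lam * np.eye(3))
    w = Vt[-1]
    print(lam, w / np.max(np.abs(w)))   # scaled for readability
```

The null-space step uses the SVD rather than row reduction only because numpy has no built-in reduced-row-echelon routine; it finds the same eigenvectors the lecture gets by parameterizing the solutions of (A - λI)x = 0.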
Of course, the universe never gives us something for nothing, and so we will need to exert some effort, but nowhere near as much effort as we would if we were to try to find the characteristic polynomial. We'll take a look at some of these cases next.