Given an n by n square matrix A, we can find the eigenvalues in two ways: by finding the roots of the characteristic polynomial det(A - λI), or by choosing a random seed vector and finding the roots of a minimal polynomial. So which should you use? The correct answer is both, and this is a good place to illustrate a very important idea: theory guides practice.

So let's try to find the eigenvalues of a 3 by 3 matrix. If we use the characteristic polynomial, we need to solve det(A - λI) = 0. Expanding the determinant gives a cubic equation, and we know how to solve those: we factor it and read off the solutions. Once we have the eigenvalues, we can find the eigenvectors. The important thing to recognize here is that to find the eigenvalues of an n by n matrix via the characteristic polynomial, we must compute the determinant of an n by n matrix and then solve an nth-degree equation. Both of these are very hard to do, but the reward is that we obtain all the eigenvalues, and consequently all the eigenvectors.

On the other hand, let's take that same matrix, but this time apply our minimal polynomial approach. Since A acts on vectors in R^3, any set of four vectors must be dependent. So we choose a random seed vector v and compute Av, A^2 v, and A^3 v. Row reducing the resulting system gives a solution, and letting t = 0 and s = 1, we obtain a linear combination of these vectors equal to the zero vector. This gives us our minimal polynomial, which is quadratic, and we can factor it, giving us two eigenvalues. Those two eigenvalues can be used to find two eigenvectors. But there was a third eigenvalue-eigenvector pair. How do we get that? Here's the issue.
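As a concrete sketch of the seed-vector computation, here is one way to carry it out numerically. The 3 by 3 matrix below is a hypothetical stand-in (the lecture's actual matrix isn't shown in the transcript); its eigenvalues are 4, 4, and 6, so a generic seed yields a quadratic polynomial and only two distinct eigenvalues, while the characteristic polynomial has degree 3.

```python
import numpy as np

# Hypothetical 3x3 matrix (the lecture's matrix is not shown in the
# transcript); its eigenvalues are 4, 4, and 6.
A = np.array([[5.0, 1.0, 0.0],
              [1.0, 5.0, 0.0],
              [0.0, 0.0, 4.0]])

def annihilating_poly(A, v):
    """Monic coefficients c0, c1, ..., 1 (lowest degree first) of the
    smallest-degree polynomial p with p(A) v = 0, found from the first
    dependence among v, Av, A^2 v, ..."""
    cols = [v]
    while True:
        cols.append(A @ cols[-1])                 # next power of A times v
        K = np.column_stack(cols)
        k = len(cols) - 1
        if np.linalg.matrix_rank(K) < k + 1:      # dependence has appeared
            # Express A^k v in terms of the earlier columns; lstsq stands
            # in for the row reduction done by hand in the lecture.
            c, *_ = np.linalg.lstsq(K[:, :k], -cols[-1], rcond=None)
            return np.append(c, 1.0)              # c0 + c1 x + ... + x^k

coeffs = annihilating_poly(A, np.array([1.0, 0.0, 0.0]))
seed_eigs = np.roots(coeffs[::-1])   # quadratic: finds only 4 and 6

# By contrast, the characteristic polynomial det(A - x I) has degree n
# and its roots give every eigenvalue with multiplicity: 4, 4, and 6.
all_eigs = np.roots(np.poly(A))
```

Note that only matrix-vector products and one linear solve are needed for the seed-vector route, which is the "easier to implement" trade-off described above.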
Finding the eigenvalues of an n by n matrix using the minimal polynomial approach requires only row operations and matrix multiplication, and it can yield a lower-degree equation, so it is in fact significantly easier to implement. But you get what you pay for: you might miss eigenvalues.

So let's put the two ideas together. From the characteristic polynomial, we know that an n by n matrix must have n eigenvalues (counted with multiplicity), but finding them always requires solving an nth-degree equation. The minimal polynomial might have lower degree, but if it does, we'll miss some eigenvalues. So how can we find all of them? Again, theory guides practice: we know there are supposed to be n eigenvalues, and if we haven't found all n, we can lather, rinse, repeat.

So to find a third eigenvector, we can use a different seed vector, say (0, 1, 0), and repeat the computation. Row reducing gives us a linear combination equal to the zero vector, and this gives us our minimal polynomial, again a quadratic, which we can factor to get the eigenvalues λ = 4 and λ = 6. We already found eigenvectors for λ = 4, so we can ignore that result and focus on λ = 6, which gives us a third eigenvector.

Now, as a general rule, mathematicians are leery of solutions to problems that begin with "try this and see what happens." We like procedures that will definitely get us to an answer eventually. So the natural question to ask at this point is: can we improve our approach? The answer is yes, but we'll need to develop some more ideas. So let's take a look at those.
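The lather-rinse-repeat step can be sketched in code as well. The 3 by 3 matrix and seed vectors below are illustrative assumptions, not the lecture's actual data: the matrix has eigenvalues 4, 4, and 6, and a seed that happens to lie inside an eigenspace reveals only one eigenvalue, so a second seed is needed to recover the rest.

```python
import numpy as np

# Hypothetical 3x3 matrix (the lecture's is not shown in the transcript);
# its eigenvalues are 4, 4, and 6.
A = np.array([[5.0, 1.0, 0.0],
              [1.0, 5.0, 0.0],
              [0.0, 0.0, 4.0]])

def eigs_from_seed(A, v):
    """Eigenvalues visible from one seed vector: the roots of the
    lowest-degree polynomial p with p(A) v = 0."""
    cols = [v]
    while True:
        cols.append(A @ cols[-1])
        K = np.column_stack(cols)
        k = len(cols) - 1
        if np.linalg.matrix_rank(K) < k + 1:
            c, *_ = np.linalg.lstsq(K[:, :k], -cols[-1], rcond=None)
            # Monic polynomial x^k + ... + c0, highest degree first.
            return set(np.round(np.roots(np.append(1.0, c[::-1])).real, 8))

# This seed is an eigenvector for lambda = 4, so its polynomial is
# linear and misses lambda = 6 entirely.
found = eigs_from_seed(A, np.array([0.0, 0.0, 1.0]))   # {4.0}

# Theory says a 3x3 matrix has 3 eigenvalues with multiplicity, so if we
# suspect some are missing, we repeat with a fresh seed vector.
found |= eigs_from_seed(A, np.array([0.0, 1.0, 0.0]))  # now {4.0, 6.0}
```

This is exactly the "try this and see what happens" flavor the lecture warns about: the loop over seeds works in practice, but nothing here guarantees in advance how many seeds are needed, which motivates the improved approach promised at the end.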