When we're trying to solve the eigenvalue/eigenvector problem, we can find the eigenvalues by solving the characteristic polynomial, and we can find the characteristic polynomial by evaluating the determinant of A − λI. But finding the determinant of a matrix is hard! Remember, any method that requires computing determinants is going to be computationally expensive, and in general we should always ask: is there a better way?

So let's consider this. Suppose A is an n-by-n matrix and v is any non-zero vector of the appropriate size to be repeatedly left-multiplied by A. Then there is a least value k for which the set v, Av, A^2 v, ..., A^k v is dependent, and that's because v is in R^n, so any set of n + 1 vectors must be dependent.

Now we'll form a minimal dependent set of vectors, and we can do that as follows. The set consisting of just the vector v is necessarily independent, so we add Av to our set. If the set v, Av is dependent, we can stop; if it's independent, we add A^2 v to our set. If the set v, Av, A^2 v is dependent, we can stop. Otherwise, lather, rinse, repeat. Eventually we reach a point where adding A^k v gives us a dependent set. Since the set without A^k v was independent, we know there are scalars a_0 through a_(k-1) for which a_0 v + a_1 Av + ... + a_(k-1) A^(k-1) v + A^k v = 0. And the important thing here is that since A^k v is what made the set dependent, its coefficient in any such dependence must be non-zero, so we can scale it to 1 and guarantee that A^k v is included in the linear combination. We'll say that v is the seed that gives us the minimal polynomial, which is the polynomial a_0 + a_1 x + ... + a_(k-1) x^(k-1) + x^k formed using the coefficients of this linear combination.

So, for example, let's find a minimal polynomial for this matrix with a seed vector of, oh, I don't know, how about (1, 0)? We find our vectors: v is just our seed vector (1, 0), and then we find Av. Maybe we're lucky and the set of these two vectors is dependent, so we check the independence of the set v, Av. Letting our vectors be the columns of a matrix and reducing that matrix to row echelon form gives us, okay, nothing too exciting, but the important thing here is that we have no free variables, and so we know our set of vectors is independent. No, no, we wanted the set to be dependent! So we go on to the next step and find A^2 v, which is A applied to Av, and now we check the independence of the set v, Av, A^2 v. Again letting our vectors be the columns of a matrix and reducing to row echelon form gives us, again, not a lot of excitement, but we do see that x_3 = t is our free variable, and we can parameterize our solution. Letting t = 1 gives x_1 = −8 and x_2 = −2, so −8v − 2Av + A^2 v is the zero vector, and we have a corresponding minimal polynomial, −8 − 2x + x^2.
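For anyone who wants to try this on a computer, here's a minimal sketch of the procedure in Python with SymPy (the function name and interface are my own choices, not something from the lecture). It grows the set v, Av, A^2 v, ... one vector at a time and row reduces at each step, just as described above:

```python
import sympy as sp

def minimal_poly_from_seed(A, v):
    """Grow the set v, Av, A^2 v, ... until it first becomes dependent, then read
    the dependence coefficients off the row reduction and return the corresponding
    monic minimal polynomial for the seed v."""
    A, v = sp.Matrix(A), sp.Matrix(v)
    vecs = [v]
    while True:
        vecs.append(A * vecs[-1])                   # add the next vector A^k v
        R, pivots = sp.Matrix.hstack(*vecs).rref()  # row reduce [v | Av | ... | A^k v]
        k = len(vecs) - 1
        if k not in pivots:                         # newest column is free: set is dependent
            # Column k of the RREF expresses A^k v in terms of the earlier, independent
            # columns: A^k v = c_0 v + c_1 Av + ... + c_(k-1) A^(k-1) v, so the minimal
            # polynomial is x^k - c_(k-1) x^(k-1) - ... - c_1 x - c_0.
            x = sp.symbols('x')
            return sp.Poly(x**k - sum(R[i, k] * x**i for i in range(k)), x)
```

The `k not in pivots` test is exactly the free-variable check from the row reductions above: a column with no pivot is a free column, which means it can be written as a combination of the columns before it.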
Now, there are some efficiencies to be had here. In practice, we don't have to check our set at every step. If v is in R^n, then any set of n + 1 vectors is guaranteed to be dependent, so we can just let v, Av, A^2 v, and so on, up through A^n v, be the columns of a matrix and row reduce once; the first free variable corresponds to the first vector that gives us a dependent set.

So, for example, take the matrix A with rows (2, 5) and (0, 3), starting with our seed vector (1, 0). Since v is in R^2, it's guaranteed that the set of vectors v, Av, A^2 v will be dependent. So let's find those vectors: Av is (2, 0), and A^2 v is (4, 0). We let these be the columns of a matrix and row reduce, and we find that x_2 and x_3 are free variables, giving us a parameterized solution, say with x_2 = s and x_3 = t. Now remember, our goal is to find a minimal polynomial, a least-degree polynomial, and if x_3 is anything non-zero we'll have an A^2 v term. So let's let t = 0 and s = 1. This gives us a solution and a corresponding equation with the vectors, −2v + Av + 0·A^2 v = 0, or more simply, −2v + Av = 0. And so we have a minimal polynomial, −2 + x.

Or we could take a 3-by-3 matrix, and we can pick any vector we want as our seed. So let's pick, no, no, not that one, how about this one? We find Av, A^2 v, and A^3 v, and letting these be column vectors and row reducing, we find that x_4 is our free variable and we can parameterize our solutions. Letting t = 1 gives us a solution, a linear combination equal to zero, and a corresponding minimal polynomial.
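The row-reduce-once shortcut can be sketched the same way (again a SymPy sketch; the function name is my own). It builds all of v, Av, ..., A^n v up front, row reduces once, and reads the coefficients off the first free column:

```python
import sympy as sp

def minimal_poly_one_rref(A, v):
    """Build the columns v, Av, ..., A^n v up front, row reduce once, and use the
    first free (non-pivot) column to form the minimal polynomial for the seed v."""
    A, v = sp.Matrix(A), sp.Matrix(v)
    cols = [v]
    for _ in range(A.rows):                        # n more vectors, so dependence is guaranteed
        cols.append(A * cols[-1])
    R, pivots = sp.Matrix.hstack(*cols).rref()
    k = next(j for j in range(len(cols)) if j not in pivots)   # first free column
    x = sp.symbols('x')
    return sp.Poly(x**k - sum(R[i, k] * x**i for i in range(k)), x)

# The 2-by-2 example from above: seed (1, 0) with the matrix whose rows are (2, 5) and (0, 3).
print(minimal_poly_one_rref([[2, 5], [0, 3]], [1, 0]))   # Poly(x - 2, x, domain='ZZ')
```

Taking the first free column and ignoring any later ones is the same move as setting t = 0 and s = 1 above: it keeps the degree of the polynomial as low as possible.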