If v is an eigenvector, then any nonzero multiple of v is also an eigenvector. But could a matrix have more than one linearly independent eigenvector for a given eigenvalue? The answer is yes, and so we introduce the following definition. The geometric multiplicity of an eigenvalue is the number of linearly independent eigenvectors it produces, that is, the dimension of its eigenspace.

For example, suppose we know that this matrix has eigenvalue lambda = 1; let's find the corresponding eigenvectors. Since lambda = 1 is an eigenvalue, the eigenvectors v satisfy Av = v, or equivalently (A - I)v = 0. Solving this system for the components of the unknown vector v, we row reduce the coefficient matrix and obtain a parameterized solution with two free variables, s and t. The eigenvectors are therefore linear combinations of two vectors, so lambda = 1 produces two linearly independent eigenvectors.

Now suppose we also know that lambda = -3 is an eigenvalue. The eigenvectors then satisfy Av = -3v, or (A + 3I)v = 0. Again we set up the system of equations and row reduce the coefficient matrix; this time there is a single free variable, so the solutions are spanned by a single eigenvector.

So our matrix had two eigenvalues, lambda = 1 and lambda = -3, which together produced three linearly independent eigenvectors: (1, 0, 2) and (3, 2, 0) for lambda = 1, and (2, -2, 2) for lambda = -3. And so the question you've got to ask yourself is: could there be more eigenvalues and eigenvectors? The important observation to make is that A is a 3 by 3 matrix, so it acts on vectors in R3. But R3 is a 3-dimensional vector space, so a set of linearly independent vectors in R3 has at most three vectors.
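The matrix A itself appears only on screen in the lecture, so the sketch below reconstructs a matrix consistent with the stated eigen-data by forming A = P D P^-1, where the columns of P are the three eigenvectors above. It then checks that lambda = 1 has geometric multiplicity 2 by computing the dimension of the null space of A - I. This is an illustration under those assumptions, not the lecture's exact on-screen matrix.

```python
import numpy as np

# Columns of P: the stated eigenvectors, for lambda = 1, 1, -3 respectively.
P = np.array([[1.0, 3.0,  2.0],
              [0.0, 2.0, -2.0],
              [2.0, 0.0,  2.0]])
D = np.diag([1.0, 1.0, -3.0])

# A matrix with exactly this eigen-data (a reconstruction, since the
# lecture's matrix is not reproduced in the transcript).
A = P @ D @ np.linalg.inv(P)

# Verify A v = lambda v for each stated eigenvector.
for lam, v in [(1, P[:, 0]), (1, P[:, 1]), (-3, P[:, 2])]:
    assert np.allclose(A @ v, lam * v)

# Geometric multiplicity of lambda = 1: dim null(A - I),
# i.e. 3 minus the rank of A - I (rank-nullity theorem).
geo_mult = 3 - np.linalg.matrix_rank(A - np.eye(3))
print(geo_mult)  # 2
```

The two free variables s and t in the lecture's row reduction correspond exactly to this 2-dimensional null space of A - I.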
So there are at most three linearly independent eigenvectors. And since each eigenvalue produces at least one eigenvector, and eigenvectors corresponding to distinct eigenvalues are linearly independent, an n by n matrix has at most n eigenvalues.

So suppose we know a matrix has eigenvalues lambda = -2 and lambda = 1; let's see whether there can be any additional eigenvalues. For lambda = -2, the eigenvectors satisfy (A + 2I)v = 0. Row reducing the coefficient matrix gives two free variables, so we can parameterize the components, and the eigenvectors for lambda = -2 are linear combinations of two linearly independent eigenvectors. For lambda = 1, the eigenvectors satisfy (A - I)v = 0. Row reducing the corresponding coefficient matrix gives a single free variable, so the solutions are expressible in terms of a single eigenvector. This gives us three linearly independent eigenvectors, and since A is a 3 by 3 matrix acting on R3, there can be no additional eigenvalues.
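Again the lecture's matrix is only shown on screen; as a stand-in, the symmetric matrix below has the same eigenvalue pattern, lambda = -2 with geometric multiplicity 2 and lambda = 1 with geometric multiplicity 1, so it lets us check the counting argument symbolically. SymPy's eigenvects() returns, for each eigenvalue, a basis of its eigenspace, whose size is the geometric multiplicity.

```python
import sympy as sp

# Hypothetical stand-in matrix (the transcript does not reproduce the
# on-screen one): eigenvalues -2, -2, 1.
A = sp.Matrix([[-1, 1, 1],
               [1, -1, 1],
               [1, 1, -1]])

# eigenvects() yields (eigenvalue, algebraic multiplicity, eigenspace basis).
eigenspaces = {lam: vecs for lam, mult, vecs in A.eigenvects()}

# Geometric multiplicity of each eigenvalue = size of its eigenspace basis.
for lam, vecs in eigenspaces.items():
    print(lam, len(vecs))

# The geometric multiplicities already total 3 = dim(R^3), so no
# further eigenvalues are possible.
total = sum(len(vecs) for vecs in eigenspaces.values())
print(total)  # 3
```

Because the multiplicities sum to the dimension of the space, the counting argument from the transcript applies: three independent eigenvectors already fill R3, so searching for a third eigenvalue is futile.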