One of the more important problems in linear algebra is known as the eigenproblem, and it deals with what we call eigenvectors and eigenvalues. So what is the eigenproblem? Suppose I have some square matrix A. I want to find some non-zero vector v, and it's important that it be non-zero, such that the product A times v is lambda times that same vector. In effect, that means multiplying by the matrix A gives back a scalar multiple of the original vector. The term we use is that lambda and v form an eigenvalue-eigenvector pair: lambda is the scalar multiple, and v is the vector whose product with the matrix is that scalar multiple of itself.

Now, solving the eigenproblem isn't too difficult for a small matrix A. For example, take the 2 by 2 matrix with rows 7, negative 5 and 10, negative 8, and suppose I want to find its eigenvalues and eigenvectors. Going back to the definition, I want a scalar lambda and a vector x where A x equals lambda times x. That gives me a system of equations. The matrix product A x is computed row times column, so the first component is 7x1 minus 5x2 and the second component is 10x1 minus 8x2, and I want those to equal lambda x1 and lambda x2 respectively. It's a nice simple system of equations, and I can rearrange it into a homogeneous system and peel off the coefficient matrix. Because the system is homogeneous, I know there is at least one solution: the trivial solution, x1 and x2 both equal to 0. But what I want is a non-zero vector for which this holds.
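The rearrangement above, moving lambda x to the left side and peeling off the coefficient matrix, can be sketched symbolically. This is a minimal illustration using Python's sympy library; the variable names are mine, not part of the lecture:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[7, -5], [10, -8]])

# Rearranging A x = lambda x into (A - lambda I) x = 0 gives the
# coefficient matrix of the homogeneous system:
M = A - lam * sp.eye(2)
print(M)  # Matrix([[7 - lambda, -5], [10, -8 - lambda]])
```

The homogeneous system (A minus lambda I) x = 0 always has the trivial solution x = 0, which is why we go looking for the condition that forces a non-trivial one.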
In order for it to have a non-trivial solution, the determinant of the coefficient matrix has to be 0. So I'll set that determinant equal to 0. Every condition corresponds to an equation, and the condition of having a non-trivial solution corresponds to the determinant being 0. I know how to calculate the determinant of a 2 by 2 matrix: it's the product of one diagonal minus the product of the other. Here that's 7 minus lambda times negative 8 minus lambda, plus 50. After all the dust settles, I get lambda equals negative 3 and lambda equals 2, and these are my eigenvalues.

Once we know the eigenvalues, we can find the corresponding eigenvectors. Take the eigenvalue lambda equals negative 3. That translates into the matrix product giving negative 3 times the original vector: applying the matrix to x1, x2, the first component should be minus 3x1 and the second component minus 3x2. Again I can set that up as a homogeneous system, peel off the coefficient matrix, and reduce. That gives me a nice parameterized solution: some parameter s times the vector 1, 2. So there's my eigenvector for this eigenvalue. For the other eigenvalue, lambda equals 2, I drop that into the equation: the matrix times the vector should be 2 times the original, so the first component should be 2x1 and the second component 2x2. Again I make that a homogeneous system, peel off the coefficient matrix, reduce, and parameterize the solution, and I get the eigenvector s times 1, 1.

To summarize: this matrix has eigenvalue lambda equals negative 3 with eigenvector s times 1, 2, and eigenvalue lambda equals 2 with eigenvector s times 1, 1. And since s just produces a scalar multiple of these vectors, we usually omit it, treating it as implied by the eigenvector problem.
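The whole procedure, set the determinant to 0, solve for lambda, then solve the reduced homogeneous system for each eigenvalue, can be sketched with sympy. This is an illustrative sketch, not part of the lecture; the null space of A minus lambda I is exactly the set of solutions of the reduced homogeneous system:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[7, -5], [10, -8]])

# Characteristic equation: det(A - lambda I) = 0
char_poly = (A - lam * sp.eye(2)).det()   # (7 - lambda)(-8 - lambda) + 50
eigenvalues = sp.solve(sp.Eq(char_poly, 0), lam)
print(sorted(eigenvalues))                # [-3, 2]

# For each eigenvalue, the eigenvectors span the null space of A - lambda I.
for ev in sorted(eigenvalues):
    basis = (A - ev * sp.eye(2)).nullspace()
    print(ev, basis[0].T)  # each basis vector is a scalar multiple of (1, 2) or (1, 1)
```

Note that nullspace() returns one basis vector per eigenvalue here; any scalar multiple of it is an equally valid eigenvector, which is the role the parameter s plays above.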
And so we might just say the eigenvector is 1, 2 for lambda equals negative 3, and the eigenvector is 1, 1 for lambda equals 2. And we should verify that these are, in fact, eigenvalue-eigenvector pairs. For lambda equals negative 3, multiplying the eigenvector 1, 2 by the matrix should give me negative 3 times the original. I find that matrix product: it's negative 3, negative 6, which is in fact negative 3 times my original vector. Likewise, for lambda equals 2, if I multiply the eigenvector 1, 1 by the matrix, I get 2, 2, and that is in fact 2 times the original vector. So each of these is an eigenvector for its eigenvalue.
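The verification step is easy to do numerically. A minimal sketch assuming numpy; the names v1 and v2 are mine:

```python
import numpy as np

A = np.array([[7.0, -5.0], [10.0, -8.0]])
v1 = np.array([1.0, 2.0])  # claimed eigenvector for lambda = -3
v2 = np.array([1.0, 1.0])  # claimed eigenvector for lambda = 2

print(A @ v1)  # [-3. -6.], which is -3 * v1
print(A @ v2)  # [2. 2.], which is 2 * v2

assert np.allclose(A @ v1, -3 * v1)
assert np.allclose(A @ v2, 2 * v2)
```

This is exactly the check in the text: applying the matrix to each candidate eigenvector and confirming the result is the corresponding eigenvalue times that vector.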