So one of the more powerful methods for solving systems of linear differential equations comes to us from linear algebra, and it emerges as follows. Suppose we have a system of first-order linear differential equations with constant coefficients. Under the right conditions, which is to say with some work, we can represent the system in matrix form as Dx = Ax, where x is a vector of functions, A is a matrix of constants, and D is our linear differential operator.

So let's talk ansatz once again, and remember: it's easier to obtain forgiveness than permission. It will be useful to recall the following from linear algebra. Let A be a square matrix. An eigenvalue-eigenvector pair for A is a scalar λ and a nonzero vector x with Ax = λx.

Suppose x is an eigenvector of A, with Ax = λx. Then Dx must also equal λx. But if x_i is a component of our vector x, then we must have Dx_i = λx_i. Now remember, the x_i's are supposed to be functions, so this says the derivative of each function is λ times the function itself, and that means each x_i must be some constant times e^(λt). And this suggests the following: if (λ, v) is an eigenvalue-eigenvector pair for the matrix A, then x = c e^(λt) v is a solution to the differential equation Dx = Ax.

So here's a crash course on finding eigenvalues and eigenvectors. Suppose I have the square matrix

    A = [ 3  1 ]
        [ 4  3 ]

and I want to find its eigenvalue-eigenvector pairs. By assumption Ax = λx. I know what A is; I don't know the vector x, so I'll let its components x1 and x2 be our unknowns and do a little linear algebra. The eigen-equation gives us the system of linear equations

    3x1 + x2 = λx1
    4x1 + 3x2 = λx2,

and we'll rewrite the equations in standard form, getting all the unknowns over onto one side and all of the constants onto the other.
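Before working the example by hand, here's a quick numerical sanity check of that claim. This is a sketch, assuming NumPy is available and using the matrix from the example below: grab an eigenpair numerically and confirm that x(t) = c e^(λt) v differentiates to A times itself.

```python
import numpy as np

# the matrix from the crash-course example below
A = np.array([[3.0, 1.0],
              [4.0, 3.0]])

# pick one eigenvalue-eigenvector pair numerically
lams, vecs = np.linalg.eig(A)
lam, v = lams[0], vecs[:, 0]

# candidate solution x(t) = c * e^(lam t) * v, with an arbitrary constant c
c = 2.0
def x(t):
    return c * np.exp(lam * t) * v

# compare a central-difference derivative of x against A @ x at some t;
# if the claim holds, the residual is near zero
t, h = 0.1, 1e-6
dx = (x(t + h) - x(t - h)) / (2 * h)
print(np.max(np.abs(dx - A @ x(t))))
```

The constant c and the choice of t are arbitrary; any eigenpair of A passes the same check.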
And so the system Ax = λx corresponds to the system of linear equations

    (3 - λ)x1 + x2 = 0
    4x1 + (3 - λ)x2 = 0.

Since this system is homogeneous, we can reduce the coefficient matrix. Now notice that if (3 - λ)^2 - 4 is not equal to 0, there's going to be a unique solution to this system. But since the system is homogeneous, that solution has to be x1 = 0, x2 = 0, which makes x the zero vector, and we've specifically prohibited x = 0 as an eigenvector. To get a nonzero eigenvector, it's necessary that (3 - λ)^2 - 4 be equal to 0. So let's set it equal to 0 and solve: 3 - λ = ±2, which gives the solutions λ = 5 and λ = 1.

Now if λ = 5, our system Ax = λx becomes -2x1 + x2 = 0 and 4x1 - 2x2 = 0, which reduces to 2x1 = x2. So one solution is x1 = 1, x2 = 2, which gives us the eigenvector (1, 2). If λ = 1, the system becomes 2x1 + x2 = 0 and 4x1 + 2x2 = 0, which reduces to 2x1 = -x2. So one solution is x1 = 1, x2 = -2, which gives us the eigenvector (1, -2).

For example, let's solve the system x' = 3x + y, y' = 4x + 3y. We'll rewrite it in operator form: D applied to the vector (x, y) equals [3 1; 4 3] times (x, y). We'll find the eigenvalues for our matrix; well actually, we've already done that. We found that the eigenvalue λ = 5 has eigenvector (1, 2), and the eigenvalue λ = 1 has eigenvector (1, -2). And so our theorem says the solutions are e^(5t) (1, 2) and e^(t) (1, -2), or any linear combination thereof:

    c1 e^(5t) (1, 2) + c2 e^(t) (1, -2).
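The hand computation above can be double-checked numerically. A minimal sketch, assuming NumPy: verify Av = λv for both eigenpairs, then confirm that an arbitrary linear combination of the two solutions satisfies x' = Ax via a finite-difference derivative.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [4.0, 3.0]])

# the eigenpairs found by hand
pairs = [(5.0, np.array([1.0, 2.0])),
         (1.0, np.array([1.0, -2.0]))]

# check A v = lambda v for each pair
for lam, v in pairs:
    print(lam, np.allclose(A @ v, lam * v))

# general solution: x(t) = c1 e^(5t) (1, 2) + c2 e^t (1, -2),
# with arbitrarily chosen constants c1 and c2
def x(t, c1=1.0, c2=-3.0):
    return (c1 * np.exp(5.0 * t) * np.array([1.0, 2.0])
            + c2 * np.exp(t) * np.array([1.0, -2.0]))

# central-difference check that x' = A x at an arbitrary t;
# the residual should be near zero
t, h = 0.3, 1e-6
dx = (x(t + h) - x(t - h)) / (2 * h)
print(np.max(np.abs(dx - A @ x(t))))
```

Because the equation is linear and homogeneous, the same residual stays small for any choice of c1 and c2, which is exactly the "any linear combination thereof" part of the theorem.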