Welcome back to 6.2, everyone: characteristic polynomials, for the textbook Linear Algebra Done Openly. I want to talk about how eigenvalues are related to non-singular matrices. So you can see the following theorem in front of us: if A is an n by n matrix, I claim that A is non-singular if and only if zero is not an eigenvalue of A.

It's actually a fairly quick argument. If zero were an eigenvalue of A, we would have Ax = 0x, and whatever x is, 0x is the zero vector. That would mean x belongs to the null space of A, and since an eigenvector is never allowed to be the zero vector, that null space would be non-trivial, which is exactly what it means for A to be singular. And this argument is essentially reversible.

There's also a connection to the characteristic polynomial. If you take the determinant of A − λI and plug in λ = 0, you get det(A − 0I), which is just det(A). If zero were an eigenvalue, that determinant would be zero, and as we've seen before, a matrix is singular if and only if its determinant is zero. So a non-singular matrix cannot have zero as an eigenvalue.

In fact, when the matrix is non-singular, we can relate its eigenvalues to those of its inverse in the following way. Say Ax = λx. Since the matrix is non-singular, we can multiply both sides by the inverse. On the left-hand side, A⁻¹A gives the identity, so we're left with just x. On the right-hand side, λ is a scalar, so it factors out of the matrix product, leaving λ · A⁻¹x. So x = λ · A⁻¹x.
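As a quick numerical sanity check of that determinant fact, here is a small sketch in Python with NumPy; the 2 by 2 matrix is a made-up example of my own, not one from the lecture.

```python
import numpy as np

# A made-up non-singular 2x2 matrix, just for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# det(A - 0*I) is just det(A); it is non-zero exactly when A is non-singular.
print(np.linalg.det(A))        # non-zero (here 5.0), so A is non-singular

# Consistent with the theorem: no eigenvalue of A is zero.
print(np.linalg.eigvals(A))    # roughly 1.38 and 3.62 in some order, both non-zero
```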
And since we know λ is not zero (non-singular matrices do not have zero eigenvalues), we can divide both sides by λ, and we end up with a very important observation: A⁻¹x = (1/λ)x. So if a matrix is non-singular, the eigenvalues of the inverse matrix are the reciprocals of the eigenvalues of the original matrix, and the eigenvectors are the same. That's a really neat observation.

So, a matrix is non-singular if and only if all of its eigenvalues are non-zero. Now, I want to mention that although zero is a perfectly acceptable eigenvalue, the zero vector is never considered an eigenvector. And if zero is an eigenvalue, that does not mean the associated eigenvectors are zero; eigenvectors are non-zero by definition, even when their eigenvalue is zero.

So take this three by three triangular matrix:

    A = [ 3  6  -8 ]
        [ 0  0   6 ]
        [ 0  0   2 ]

I like this example because the matrix is triangular, so we can identify the eigenvalues right off the diagonal without using the characteristic polynomial: they are 3, 0, and 2. If we look for the eigenvectors associated with λ = 0, the zero eigenspace is none other than the null space of A, because subtracting 0·I from A changes nothing. So finding the null space is the same as finding the zero eigenspace of a matrix, and if that null space is non-trivial, its dimension, the nullity, is exactly the geometric multiplicity of the eigenvalue zero.

Investigating the null space of this matrix, you can grab the vector (−2, 1, 0). I want to verify that with everyone. Multiply A by (−2, 1, 0): the first entry is 3·(−2) + 6·1 + (−8)·0, which is −6 + 6 + 0.
The second entry is 0 + 0 + 0, and the third entry is 0 + 0 + 0 as well. Simplifying, we end up with (0, 0, 0), which is zero times the original vector (−2, 1, 0). So this is evidence that (−2, 1, 0) is an eigenvector of A associated with the eigenvalue zero. Zero can be an eigenvalue, and that happens if and only if the matrix is singular; this square matrix has no inverse. But the eigenvectors associated with λ = 0 are non-zero vectors themselves, and together with the zero vector they make up the null space of the original matrix. So these eigenspaces are, in some sense, generalizing the notion of null space we saw previously. All right, thanks everyone. See you next time.
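For anyone who wants to check the triangular example numerically, here is a short sketch in Python with NumPy; the library choice is mine, not something used in the lecture.

```python
import numpy as np

# The 3x3 triangular matrix from the example.
A = np.array([[3.0, 6.0, -8.0],
              [0.0, 0.0,  6.0],
              [0.0, 0.0,  2.0]])

# Triangular matrix: the eigenvalues are the diagonal entries 3, 0, 2.
print(np.sort(np.linalg.eigvals(A).real))

# (-2, 1, 0) lies in the null space of A, i.e. the zero eigenspace.
x = np.array([-2.0, 1.0, 0.0])
print(A @ x)

# Zero is an eigenvalue, so A is singular and has no inverse.
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError:
    print("A is singular: no inverse exists")
```

The reciprocal-eigenvalue fact from earlier in the lecture can be checked the same way on any non-singular matrix, by comparing `np.linalg.eigvals(np.linalg.inv(A))` against the reciprocals of `np.linalg.eigvals(A)`.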