All right. In this very short video, I want to talk about the spectral theorem for symmetric matrices. I've mentioned this before, but whenever a theorem gets a name, it's sort of a big deal, you know, like the fundamental theorem of linear algebra, the non-singular matrix theorem, things like that. As we're coming near the end of section 6.4 here, the spectral theorem is this major capstone theorem, not just about eigenvalues and diagonalization in general, but specifically for symmetric matrices. Now, before anyone gets worried about a ghost coming out here when we talk about spectral theorems in linear algebra: the idea is that a spectrum, in the linear algebra context, is the set of eigenvalues. The spectrum of a matrix A, sometimes written spec(A) for short, is just the set of all eigenvalues of that matrix, so a spectral theorem in linear algebra means a theorem about eigenvalues. The spectral theorem for symmetric and Hermitian matrices is a way of classifying what the eigenvalues of a symmetric or Hermitian matrix are going to be. For any square symmetric or Hermitian matrix, the following four things are true; some of them we've seen already, some we haven't. First, and one of the most impressive things here, is that a symmetric or Hermitian matrix A, if it's n by n, will have n eigenvalues, counting multiplicity. And in particular, those eigenvalues are going to be real numbers. Even for a real matrix, the eigenvalues could very well be non-real complex numbers, with a real part and an imaginary part. But for a symmetric matrix, those eigenvalues are always going to be real numbers, and you'll have n of them if you count multiplicities.
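To see this concretely, here's a quick NumPy sketch. The matrix entries are made up for illustration; the point is that `np.linalg.eigvalsh`, which is designed for symmetric/Hermitian input, returns all n eigenvalues as real numbers even though the matrix itself has complex entries.

```python
import numpy as np

# A hypothetical 2x2 Hermitian matrix: complex entries, but A equals its
# conjugate transpose (A = A*).
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)  # confirm A really is Hermitian

# eigvalsh is the solver for Hermitian/symmetric matrices: it returns the
# n eigenvalues (counting multiplicity) in a REAL array, not a complex one.
eigenvalues = np.linalg.eigvalsh(A)
print(eigenvalues.dtype)  # a real floating-point dtype
print(eigenvalues)        # two real eigenvalues for this 2x2 matrix
```

For this particular matrix the characteristic polynomial works out to (λ - 1)(λ - 4), so the two real eigenvalues are 1 and 4.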
And speaking of multiplicities, the geometric and algebraic multiplicities of a symmetric or Hermitian matrix are always equal to each other. In general we only know that the geometric multiplicity is less than or equal to the algebraic one, but for symmetric and Hermitian matrices we actually get a guarantee of equality. Equality of the algebraic and geometric multiplicities is equivalent to the matrix being diagonalizable, and since, as we've seen before, a symmetric or Hermitian matrix is orthogonally or unitarily diagonalizable, in particular it's always diagonalizable, so you get this equality of multiplicities. Jumping to the fourth one on our list, we had talked before about how a matrix is symmetric if and only if it's orthogonally diagonalizable, and Hermitian if and only if it's unitarily diagonalizable. So there's a strong sense of diagonalization for symmetric and Hermitian matrices; that right there was our theorem 6.4.4. We also saw that the eigenspaces of a symmetric or Hermitian matrix are mutually orthogonal to each other; that was theorem 6.4.1 in our textbook. So this list summarizes things we've done here. Part A we've hinted at, but we haven't actually proven it yet, so I just want to supply a quick argument of what's going on there, because after all, theorem 6.4.1 actually depended on the eigenvalues being real in the Hermitian case; for the real case it's a little more obvious. The argument really comes down to the following calculation. To see the argument behind part A, take the quantity x star A x. I'm going to focus on the Hermitian case in this situation, because it's less obvious there that the eigenvalues should be real. So consider this quantity x star A x, and take the star of that, the conjugate transpose.
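Here's a short sketch of the orthogonal diagonalization claim, again with made-up entries. `np.linalg.eigh` returns an orthonormal set of eigenvectors for a symmetric matrix, so the matrix Q of eigenvectors is orthogonal and A factors as Q D Q^T:

```python
import numpy as np

# A hypothetical real symmetric matrix (entries chosen just for illustration).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])

# eigh returns eigenvalues and an orthonormal basis of eigenvectors,
# so Q is an orthogonal matrix and A = Q D Q^T.
evals, Q = np.linalg.eigh(A)
D = np.diag(evals)

print(np.allclose(Q.T @ Q, np.eye(3)))  # columns of Q are orthonormal
print(np.allclose(Q @ D @ Q.T, A))      # orthogonal diagonalization of A
```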
Well, the conjugate transpose is a socks-and-shoes operation: it reverses the order of everything. So the last term becomes the first term, x star; then you get A star; and then x star star. The star, much like the transpose, cancels when you apply it twice, so you end up with x star A star x. And this is the point where Hermitian is actually important: Hermitian means A star equals A, so this equals x star A x. So you'll notice that the quantity x star A x, starred, is actually equal to itself; this one-by-one, quote unquote, matrix x star A x is Hermitian, since taking its star gives back the original thing. The reason this is significant is that it can be written from a slightly different perspective: x star A x is just x dot A x, where the inner product here is the Hermitian product. And an inner product of vectors always just gives you back a scalar, so this is just a complex number. But for a scalar, taking the star is just conjugation. So this is telling us that the conjugate of x dot A x equals x dot A x itself, which implies that it's in fact a real number. So, starting off with that: x star A x is a real number for any vector x, as long as A is a Hermitian matrix, which includes symmetric matrices as well. Now let's assume that x is an eigenvector, so that A x equals lambda x for some eigenvalue lambda. In that situation, x star A x becomes x star times lambda x.
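The step that x star A x is real for any vector x can be checked numerically. This sketch builds a hypothetical Hermitian matrix by adding a random complex matrix to its own conjugate transpose, picks a random complex vector, and confirms the imaginary part of x star A x vanishes up to roundoff:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a hypothetical Hermitian matrix: for any complex B, B + B* is Hermitian.
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = B + B.conj().T

# For ANY complex vector x, the scalar x* A x equals its own conjugate,
# so it must be real: its imaginary part is zero (up to floating-point error).
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
q = x.conj() @ A @ x
print(abs(q.imag))  # essentially zero
```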
Pulling lambda out of the second factor doesn't introduce any conjugation, so we get lambda times x star x, and x star x is the length of x squared; I often forget that square there. Since eigenvectors cannot be zero, the length of x is nonzero, so we end up with lambda equals x star A x over the length of x squared. And we saw that x star A x is a real number, while the squared length of any vector is a positive real number, so this shows that lambda is a real number, which is exactly what we were trying to verify. So for any symmetric matrix or any Hermitian matrix, the eigenvalues are always real numbers. That's a very impressive and powerful result when it comes to the spectral theory of these symmetric matrices. I can't overemphasize how important it is that these eigenvalues are real. We're not necessarily going to see it too much in this section, but I do want you to be aware of it: this is a significant result that helps in many, many situations in linear algebra.
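The final formula lambda = (x star A x) / ||x||^2 can also be verified directly. Here we reuse the same hypothetical Hermitian matrix from before, take an eigenvector from a generic (complex-output) eigensolver, and recover a real eigenvalue from the Rayleigh-quotient formula in the argument:

```python
import numpy as np

# The same hypothetical Hermitian matrix as earlier (eigenvalues 1 and 4).
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

# Deliberately use the GENERIC eigensolver, which returns complex output,
# to see that the eigenvalues still come out real.
evals, vecs = np.linalg.eig(A)
x = vecs[:, 0]  # one eigenvector of A

# The formula from the argument: lambda = (x* A x) / ||x||^2.
lam = (x.conj() @ A @ x) / (np.linalg.norm(x) ** 2)

print(abs(lam.imag))               # essentially zero: lambda is real
print(np.allclose(A @ x, lam * x)) # and it really is the eigenvalue for x
```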