Let's look at one last thing in section 6.4. This was a longer section than the others, and like I said, there's a lot going on here, bringing everything together. So, continuing on with the spectral theorem for symmetric and Hermitian matrices, I want to talk about the spectral decomposition of a symmetric or Hermitian matrix. We'll do this in the special case of a symmetric matrix, but be aware that all of this holds for Hermitian matrices as well. Since the matrix is symmetric, it has an orthogonal diagonalization A = PDP^T, in which the columns of P are the eigenvectors of A and the diagonal entries of D are exactly the eigenvalues of A. The rows of P^T are the left eigenvectors of A, which are none other than the transposes of the columns of P. So really, for a symmetric matrix, the left eigenvectors are the same as the traditional eigenvectors, the so-called right eigenvectors. Now, since D is a diagonal matrix, multiplying P by D on the right has the effect of scaling each column of P by the corresponding diagonal entry. When you do that, you get lambda one times u one, lambda two times u two, lambda three times u three, all the way up to lambda n times u n: each column gets scaled by its eigenvalue. Then if you multiply that matrix, whose columns are the scaled eigenvectors, by P^T, whose rows are the u_i^T, the usual rules of matrix multiplication give you something that looks like a sum of dot products: lambda one times u one u one transpose, plus lambda two times u two u two transpose, all the way down to lambda n times u n u n transpose. But these aren't dot products, because the transpose is on the outside, not the inside. That's right: we have outer products, not inner products.
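Here's a minimal NumPy sketch of that observation, not from the lecture itself, just an illustration: right-multiplying P by the diagonal matrix D scales each column of P by its eigenvalue, and P D P^T recovers A.

```python
import numpy as np

# The symmetric example matrix used later in the lecture
A = np.array([[7., 2.],
              [2., 4.]])

# eigh is NumPy's eigensolver for symmetric/Hermitian matrices;
# it returns the eigenvalues and an orthogonal matrix P of eigenvectors
eigvals, P = np.linalg.eigh(A)
D = np.diag(eigvals)

# Multiplying P by D on the right scales column i by lambda_i
PD = P @ D
assert np.allclose(PD[:, 0], eigvals[0] * P[:, 0])
assert np.allclose(PD[:, 1], eigvals[1] * P[:, 1])

# And the orthogonal diagonalization P D P^T gives back A
assert np.allclose(P @ D @ P.T, A)
```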
You get lambda one times u one tensor u one, plus lambda two times u two tensor u two, all the way up to lambda n times u n tensor u n. And this sum, A equals the sum of all of these matrices, is what we mean by the spectral decomposition of the matrix A. Now, these outer products u one tensor u one, u two tensor u two, u three tensor u three, we're going to abbreviate as B_i. So B_i is u_i tensor u_i, which is the outer product u_i u_i^T. These matrices B_i are n-by-n matrices, they're necessarily symmetric, and each one is rank one. That is, if you look at the columns of the matrix B_i, you just get the same column showing up over and over again, though possibly as different scalar multiples. A few other things about these B_i's: the column space of B_i is none other than the span of the eigenvector u_i you started with. If you take two different B's, B_i times B_j with i not equal to j, the product is always the zero matrix, so you could say these matrices are orthogonal to each other. And if you square any one of them, B_i times B_i, you just get B_i back, so these matrices are in fact idempotent. So we've constructed a set of orthogonal idempotent matrices. This is the discussion for symmetric matrices; of course, if you're considering complex vectors and you have a Hermitian matrix, then the outer product u tensor v is uv*, so as usual, change the transposes into conjugate transposes when working with complex vectors. The spectral decomposition is otherwise the same. So let's look at an example of this spectral decomposition for a matrix. We'll take the two-by-two symmetric matrix with entries seven, two, two, four.
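Everything claimed above can be checked numerically. Here's a small sketch (the helper name `spectral_parts` is just my own label) that builds the B_i's from the eigenvectors of a symmetric matrix and verifies each property: A is the linear combination of the B_i's, each B_i is symmetric, rank one, and idempotent, and distinct B_i's multiply to zero.

```python
import numpy as np

def spectral_parts(A):
    """Return the eigenvalues and the rank-one matrices B_i = u_i u_i^T
    of a real symmetric matrix A."""
    eigvals, P = np.linalg.eigh(A)  # columns of P are orthonormal eigenvectors
    Bs = [np.outer(P[:, i], P[:, i]) for i in range(A.shape[0])]
    return eigvals, Bs

A = np.array([[7., 2.],
              [2., 4.]])
lams, Bs = spectral_parts(A)

# Spectral decomposition: A = sum_i lambda_i * B_i
assert np.allclose(sum(l * B for l, B in zip(lams, Bs)), A)

# Each B_i is symmetric, rank one, and idempotent
for B in Bs:
    assert np.allclose(B, B.T)
    assert np.linalg.matrix_rank(B) == 1
    assert np.allclose(B @ B, B)

# Distinct B_i's are orthogonal: B_i B_j = 0 for i != j
for i, Bi in enumerate(Bs):
    for j, Bj in enumerate(Bs):
        if i != j:
            assert np.allclose(Bi @ Bj, np.zeros_like(Bi))
```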
It can be shown, and I won't give you all the details, that here is an orthogonal diagonalization of the matrix A. Its eigenvalues turn out to be eight and three, so let's record that: lambda one is eight and lambda two is three. Our u one, an eigenvector of A associated to the eigenvalue eight, is the first column of P: two over the square root of five, one over the square root of five. And for lambda two, our second eigenvector u two is the second column, copied down as you see it: negative one over the square root of five, two over the square root of five. So let's compute the outer products of these eigenvectors; they're unit eigenvectors, of course. If you take u one tensor u one, you multiply each entry of the vector against each entry. We had two over root five and one over root five. So two over root five times two over root five gives you four fifths; two over root five times one over root five gives you two fifths; one over root five times two over root five gives you two fifths again. This will be a symmetric matrix: whenever you take the outer product of a vector with itself, you always end up with a symmetric matrix, or a Hermitian matrix, depending on whether you're real or complex. Lastly, one over root five times one over root five gives you one fifth. So you get the following matrix, and this is our B one. Then for the second one, u two tensor u two, you get negative one over root five times itself, which is one fifth, and negative one over root five times two over root five, which is negative two fifths.
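As a quick sanity check on this arithmetic, here's a sketch computing those two outer products with `np.outer` and comparing against the fractions worked out above:

```python
import numpy as np

s = np.sqrt(5.0)
u1 = np.array([2 / s, 1 / s])    # unit eigenvector for lambda_1 = 8
u2 = np.array([-1 / s, 2 / s])   # unit eigenvector for lambda_2 = 3

# Outer products u_i u_i^T
B1 = np.outer(u1, u1)
B2 = np.outer(u2, u2)

# Match the hand computation: B1 = (1/5)[[4,2],[2,1]], B2 = (1/5)[[1,-2],[-2,4]]
assert np.allclose(B1, np.array([[4., 2.], [2., 1.]]) / 5)
assert np.allclose(B2, np.array([[1., -2.], [-2., 4.]]) / 5)
```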
You also get two over root five times negative one over root five, which is again negative two fifths; it's symmetric. And two over root five times two over root five gives four fifths. This gives us the matrix B two. Now I do want you to verify that these matrices are in fact orthogonal and idempotent. If we take B one squared, that means we multiply the matrix with entries four fifths, two fifths, two fifths (it's symmetric), and one fifth by itself. Take the first row times the first column: you end up with sixteen over twenty-five plus four over twenty-five. Take the first row times the second column: you get eight over twenty-five plus two over twenty-five. Admittedly, in hindsight, I could have factored all the one fifths out first, but oh well, we'll stick with it. The second row times the first column gives eight over twenty-five plus two over twenty-five again, and lastly, the second row times the second column gives four over twenty-five plus one over twenty-five. When you add these together, notice everything is over twenty-five, so I'll factor that out now: we end up with twenty, ten, ten, and five. Everything is divisible by five, so factoring out the five and canceling, we get one fifth times four, two, two, and one. And of course, if you redistribute that one fifth, you end up with B one again. I'll leave it up to you to check that B two is likewise idempotent; it's not too hard, a very similar calculation, and you can check that B two squared equals B two.
If we take B one times B two as a matrix product, again I'll use a little foresight and factor out the one fifth from each. So we get one fifth times four, two, two, one, and another one fifth factored out of B two gives one, negative two, negative two, four. The one fifths come together to give one over twenty-five. Then looking at the products: the first row times the first column gives four minus four; the first row times the second column gives negative eight plus eight; the second row times the first column gives two minus two; and lastly, the second row times the second column gives negative four plus four. So the coefficient of one over twenty-five out front is irrelevant, because you end up with the zero matrix. So we do indeed have orthogonal idempotent matrices here. Now let's combine them. If you take eight times B one (eight was the first eigenvalue) plus three times B two, we claim this equals the matrix A. Take B one, which is right here, take B two, which is right there, and simplify; it's not too horrible. You can factor the one fifth out of everything, and you end up with thirty-two plus three in the first position, then sixteen minus six, then sixteen minus six again (it's symmetric, so that's not surprising), and then eight plus twelve. Combining the terms in the different positions, we get thirty-five, ten, ten, and twenty. Those are all divisible by five, and dividing everything by five, we end up with seven, two, two, and four, which is the original matrix A.
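All three checks from this example, idempotency, orthogonality, and the reconstruction of A, fit in a few lines of NumPy. This is just a sketch mirroring the hand computation above:

```python
import numpy as np

B1 = np.array([[4., 2.], [2., 1.]]) / 5
B2 = np.array([[1., -2.], [-2., 4.]]) / 5

# Idempotent: B_i^2 = B_i
assert np.allclose(B1 @ B1, B1)
assert np.allclose(B2 @ B2, B2)

# Orthogonal: B1 B2 is the zero matrix
assert np.allclose(B1 @ B2, np.zeros((2, 2)))

# Spectral decomposition: 8*B1 + 3*B2 recovers the original matrix A
A = np.array([[7., 2.], [2., 4.]])
assert np.allclose(8 * B1 + 3 * B2, A)
```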
So we are able to decompose the matrix as a sum: a linear combination of these orthogonal idempotent matrices. This is what we mean by the spectral decomposition. Remember, a decomposition typically means we write a matrix or vector as a sum or linear combination of things, as opposed to a factorization, which means we write the matrix as a product of smaller factors. And that brings us to the end of section 6.4. It was a longer section than the others; like I said, there's a lot going on here, but there's a lot of really cool stuff to see. As always, if you have any questions, please post them in the comments below. I'd be happy to answer any questions you might have about these linear algebra topics. Click the like button, and feel free to subscribe so you can see more cool linear algebra videos or other math videos in the future. Check them out on my channel if you're interested, and I'll talk to you next time, where we can learn some more about linear algebra. Bye, everyone.