So what I want to talk about right now is the idea of a symmetric matrix, which is a special type of real matrix, along with its complex counterpart, the Hermitian matrix. A real matrix A is called symmetric if the transpose of the matrix is equal to the matrix A itself. So take, for example, this matrix A right here. It's a three by three matrix: 1, 2, 3, 2, 4, 5, 3, 5, 6. If we take the transpose of this matrix, rows become columns and columns become rows. If I take the first row of A, that becomes the first column of A transpose, so you get 1, 2, 3. If I take the second row of A, that becomes the second column of A transpose: 2, 4, 5. Then if you take the third row of A, it becomes the third column, so we get 3, 5, 6. You'll notice that the matrix I drew on the screen is identical to the matrix we started with. When you have a symmetric matrix, you have these elements on the main diagonal, and the things off the diagonal are, in some respect, reflected to the other side. We call the matrix symmetric because, with regard to the diagonal of the matrix, you have this line of symmetry, hence a symmetric matrix. It's a matrix which is equal to its own transpose.

So we define a symmetric matrix for real matrices. We honestly could define this notion of a symmetric matrix over any field; it doesn't really have to be the real numbers. But to get the true geometric benefit, we focus on real symmetric matrices here. For complex matrices, it turns out the symmetric matrix isn't the right notion to discuss, because for complex matrices we rarely use the plain transpose. Instead, we want to talk about matrices which are equal to their conjugate transpose. A complex matrix B is referred to as Hermitian if B star is equal to B. It's just the complex analog of the symmetric condition.

Let me give you an example of a Hermitian matrix. If I want to compute B star for this three-by-three complex matrix, it's the same basic idea: you take rows and turn them into columns, but when you form each column, you also take the complex conjugate. Take the first row. The conjugate of 1 is 1, so you don't see any difference there. The conjugate of i is negative i, and the conjugate of 1 plus i is 1 minus i. So the conjugated first row, turned into a column, is already the first column of B; they're identical. Moving on to the second row, we turn it into a column and take conjugates: that gives us i, then 5 (the conjugate of a real number is just itself), and then 2 plus i. Notice that agrees with the second column of B. Taking the third row and turning it into a column, we end up with 1 plus i as the first entry, 2 minus i as the second entry, and the conjugate of 3 is again 3. You'll see that this is identical to the original matrix B, so this is an example of a Hermitian matrix.

The theory of symmetric and Hermitian matrices is going to be almost identical, because after all, every real symmetric matrix is technically a Hermitian matrix: when you take the conjugate transpose of a real matrix, that's identical to the transpose.
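If you want to verify both examples numerically, here is a minimal sketch in NumPy. The entries of B are reconstructed from the row-by-row description above, so the exact layout is an inference from the spoken walkthrough.

```python
import numpy as np

# The real symmetric example: A equals its own transpose.
A = np.array([[1, 2, 3],
              [2, 4, 5],
              [3, 5, 6]])
print(np.array_equal(A.T, A))  # True

# The complex Hermitian example, reconstructed from the lecture's
# row-by-row description: B equals its own conjugate transpose.
B = np.array([[1,      1j,     1 + 1j],
              [-1j,    5,      2 - 1j],
              [1 - 1j, 2 + 1j, 3]])
print(np.array_equal(B.conj().T, B))  # True
```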
We will primarily focus on the real ones, that is, the symmetric ones, but be aware that there will nearly always be analogs for Hermitian matrices. Let me give you a few properties. Suppose we have two n by n matrices, A and B, that are both symmetric. Now, to be a symmetric matrix, you necessarily have to be square: if a matrix is m by n, its transpose is n by m, and if these are in fact equal to each other, you have to have m equal to n. So only square matrices can possibly be symmetric or Hermitian. So imagine we have two symmetric matrices and some scalar r in our field, real in the symmetric setting or complex in the Hermitian one.

If you add together two symmetric matrices, the sum is a symmetric matrix, and the same is true for Hermitian matrices. This comes from a very natural property of the transpose: (A + B) transpose equals A transpose plus B transpose, which, if A and B are in fact symmetric, is just A plus B. So (A + B) transpose is the same thing as A plus B; it's a symmetric matrix. If you take a scalar multiple rA of a symmetric matrix A, that likewise will be symmetric. A similar statement holds for Hermitian matrices, with the caveat that the scalar should be real, since the conjugate transpose turns r into its conjugate. To see why scalar multiples work, ask how the transpose interacts with scalar multiplication: (rA) transpose is just r times A transpose. If A is symmetric, A transpose is A, and therefore r times A transpose equals rA. So because the transpose map is linear, the sum of symmetric matrices is symmetric and a scalar multiple of a symmetric matrix is symmetric.

Put these two properties together: if we take the set of all matrices A such that A transpose equals A, we can naturally see it as a subset of F^(n x n), and it's in fact a subspace of the space of n by n matrices, because it's closed under matrix addition and closed under scalar multiplication. The other thing to mention is that the square zero matrix is necessarily a symmetric matrix as well.

But we can do more than just the vector space operations here; we can say a little bit about multiplication. It turns out that the product of two symmetric matrices will be symmetric exactly when the matrices commute. The reason is the following. When you take the product of two matrices and then take the transpose, the transpose, as a shoes-and-socks operation, reverses the order: (AB) transpose equals B transpose times A transpose. Now, if these are symmetric matrices, B transpose is B and A transpose is A, so (AB) transpose equals BA. That's not necessarily the same thing as AB, but if they commute, then BA is AB, and we see that (AB) transpose equals AB, so AB is symmetric as well. So the product of symmetric or Hermitian matrices will be symmetric or Hermitian if and only if the matrices commute, which of course doesn't happen in general.

But what can we say about inverses? A similar kind of statement: if a symmetric matrix is invertible, then its inverse will likewise be symmetric. And again, the idea is the following.
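As a quick numerical check of these closure and commutativity claims, here's a short NumPy sketch; the specific matrices are my own illustrative picks, not from the lecture.

```python
import numpy as np

def is_symmetric(M):
    """Check whether a real matrix equals its own transpose."""
    return np.array_equal(M.T, M)

# Two illustrative symmetric matrices (not from the lecture).
A = np.array([[1, 2],
              [2, 3]])
B = np.array([[0, 5],
              [5, -1]])

print(is_symmetric(A + B))   # True: sums of symmetric matrices are symmetric
print(is_symmetric(4 * A))   # True: real scalar multiples stay symmetric

# These two do not commute, so their product is not symmetric...
print(np.array_equal(A @ B, B @ A))  # False
print(is_symmetric(A @ B))           # False
# ...but A commutes with itself, so A @ A is symmetric.
print(is_symmetric(A @ A))           # True
```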
If we take A times A inverse, we know this is equal to the identity. Take the transpose of both sides: (A times A inverse) transpose equals the identity transpose. By the shoes-and-socks principle, (A times A inverse) transpose becomes A inverse transpose times A transpose. And the identity transpose is just the identity: the identity is a diagonal matrix, and diagonal matrices are symmetric. Since A is symmetric, this turns into A inverse transpose times A equals the identity. Now multiply both sides of the equation on the right by A inverse. In the end, the A's cancel and you end up with the statement that A inverse transpose is equal to A inverse. So the inverse of a symmetric matrix is symmetric.

It turns out that we can actually build symmetric or Hermitian matrices very easily from just about any old matrix. If you have any matrix A, then A transpose A and A A transpose are symmetric matrices. How do we see this? Well, if you take (A transpose A) transpose, by the shoes-and-socks principle this becomes A transpose transpose times... careful with the order: it becomes A transpose times the double transpose of A, and the double transpose is just A, so you get A transpose A back. The matrix is equal to its own transpose. A similar calculation holds for A A transpose. And of course, if you're talking about complex matrices, you replace the transpose with the conjugate transpose and the same argument goes through.

Let me show you an example of this. Take a specific two by three matrix: 1, negative 2, 4 in the first row and 3, 0, negative 5 in the second. We multiply A transpose by A. Going through the multiplication, let's just do a few entries together. Take the first row of A transpose times the first column of A: you get 1 plus 9, which is 10. With the second column you get negative 2, and with the third column you get 4 minus 15, which is negative 11. Then in the next row, notice what happens: the second row of A transpose times the first column of A is the same product as the first row times the second column, so you end up with negative 2 again. The second row times the second column gives positive 4, and the second row times the third column matches what you'd get from the third row with the second column, so those entries, negative 8, pair up. You can go through the rest of the entries the same way and see that A transpose A is symmetric.

A similar thing happens with A A transpose, which you can see right here. Take A times A transpose, same basic idea: the first row times the first column gives you 21. The first row times the second column is the same product as the second row times the first column, so you get negative 17 in both positions, and in fact you do get a symmetric matrix. Now, this is not the same symmetric matrix: notice that A is a two by three matrix, so A transpose A turns out to be three by three and A A transpose turns out to be two by two, the two numbers involved in the shape of A.
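To make this concrete, here is a small sketch with the same two by three matrix from the example; the symmetric matrix S used for the inverse check is my own illustrative choice, not from the lecture.

```python
import numpy as np

# The lecture's 2x3 example matrix.
A = np.array([[1, -2,  4],
              [3,  0, -5]])

print(A.T @ A)  # 3x3 symmetric: [[10, -2, -11], [-2, 4, -8], [-11, -8, 41]]
print(A @ A.T)  # 2x2 symmetric: [[21, -17], [-17, 34]]

# Inverse of an invertible symmetric matrix is symmetric.
# S is an illustrative choice, not from the lecture.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
S_inv = np.linalg.inv(S)
print(np.allclose(S_inv.T, S_inv))  # True
```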
Now, of course, if you're working with complex matrices, then you use the conjugate transpose in that construction: A star A and A A star are Hermitian. And that's perfectly fine; just make sure you take the conjugates when you transpose a complex matrix.
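For the complex case, the same construction with the conjugate transpose might look like this; the matrix C is an arbitrary illustrative choice.

```python
import numpy as np

# Any complex matrix C yields Hermitian products C* C and C C*.
# C here is an arbitrary illustrative choice.
C = np.array([[1 + 1j, 2],
              [0,      3 - 2j]])

CsC = C.conj().T @ C
CCs = C @ C.conj().T
print(np.array_equal(CsC.conj().T, CsC))  # True: C* C is Hermitian
print(np.array_equal(CCs.conj().T, CCs))  # True: C C* is Hermitian
```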