So, good afternoon. Last time we were looking at the Cholesky decomposition; we closed out that discussion, summarized the chapter, and then started discussing Hermitian and symmetric matrices. I alluded to three applications where Hermitian matrices arise: the first was in computing the Hessian of a function, the second was in the quadratic form, and the third was in graph theory. So we'll continue the discussion about Hermitian and symmetric matrices. This is a fairly long chapter in Horn and Johnson, so we'll go through it in some detail. We begin with the basic definition: a matrix A of size n cross n is said to be Hermitian if A equals A Hermitian, where the Hermitian is nothing but the conjugate transpose of the matrix. We say that the matrix is skew Hermitian if A equals minus A Hermitian. So that's the basic definition. Now, some very immediate and obvious facts are these. For any matrix A, the matrices A plus A Hermitian, A A Hermitian, and A Hermitian A are all Hermitian; just take the conjugate transpose and you get the same matrix back. If A is Hermitian, then A power k is Hermitian for any positive integer k, and in fact, if A is nonsingular, A inverse is also Hermitian. If A and B are two Hermitian matrices, then their linear combination with real valued coefficients, a A plus b B, is also a Hermitian matrix. Of course, if I take complex valued coefficients here, then the combination need not remain Hermitian. And for any matrix A, the difference A minus A Hermitian is skew Hermitian, because if I take the Hermitian of this, it becomes A Hermitian minus A, which is minus of A minus A Hermitian. These are very useful properties; we'll see that in a second.
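These basic facts are easy to check numerically. Here is a small NumPy sketch (the random test matrix and the is_hermitian helper are purely illustrative) that verifies them for an arbitrary complex matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

def is_hermitian(M):
    # M is Hermitian iff it equals its conjugate transpose
    return np.allclose(M, M.conj().T)

print(is_hermitian(A + A.conj().T))   # A + A^H is Hermitian
print(is_hermitian(A @ A.conj().T))   # A A^H is Hermitian
print(is_hermitian(A.conj().T @ A))   # A^H A is Hermitian

# A - A^H is skew-Hermitian: its conjugate transpose is its negative
D = A - A.conj().T
print(np.allclose(D.conj().T, -D))
```

All four checks print True for any complex square A, exactly as the argument above predicts.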
And similarly, if you take two skew Hermitian matrices, then their linear combination with real valued coefficients is always skew Hermitian. And if A is a Hermitian matrix, then i A is a skew Hermitian matrix, and vice versa: if A is a skew Hermitian matrix, then i A is a Hermitian matrix. Any matrix A, whether it's Hermitian or not, can be written as the sum of two matrices, the first being one half of A plus A Hermitian and the other being one half of A minus A Hermitian. If I expand this out, I get half A plus half A, which equals A, and half A Hermitian minus half A Hermitian, which is zero; so the sum equals A. The first part is H of A, and it is a Hermitian matrix. The second part, which I call S of A, is a skew Hermitian matrix. So H of A is called the Hermitian part of A, S of A is called the skew Hermitian part of A, and this representation is unique. In other words, if I write A as H plus S, where H is a Hermitian matrix and S is a skew Hermitian matrix, then half of A plus A Hermitian equals H, because A plus A Hermitian is H plus S plus H Hermitian plus S Hermitian; S Hermitian equals minus S, so those two terms cancel, and H Hermitian equals H, so this becomes half of H plus H, which equals H. And similarly, if I take half of A minus A Hermitian, I get S. So there's a one-to-one correspondence between A and the two matrices H and S: I can't write A in any other way as the sum of a Hermitian matrix and a skew Hermitian matrix. A similar result is that any A in C to the n cross n can be uniquely written as A equals S plus i T, where both S and T are Hermitian. That is almost trivial from this, because if I can write A as H plus S with S skew Hermitian, I can write S as i times minus i S, since i times minus i equals one, and minus i S is going to be Hermitian because S is skew Hermitian; that's something we just saw.
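As a quick numerical illustration of both decompositions (a NumPy sketch; the random test matrix is arbitrary), the Hermitian and skew Hermitian parts, and the A = H + i T form, can be computed and checked like this:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

H = 0.5 * (A + A.conj().T)   # Hermitian part H(A)
S = 0.5 * (A - A.conj().T)   # skew-Hermitian part S(A)

print(np.allclose(H, H.conj().T))    # H is Hermitian
print(np.allclose(S.conj().T, -S))   # S is skew-Hermitian
print(np.allclose(A, H + S))         # A = H(A) + S(A)

# the A = H + i*T decomposition: T = -i * S is Hermitian
T = -1j * S
print(np.allclose(T, T.conj().T))    # T is Hermitian
print(np.allclose(A, H + 1j * T))    # A = H + i*T
```

All five checks print True, matching the uniqueness argument above.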
So that's exactly what is here: I can write A as half of A plus A Hermitian, plus i times minus i over two times A minus A Hermitian. The second factor is the skew Hermitian part of A, with the factor half, multiplied by minus i, which makes it Hermitian, and then there is the i sitting in front. So the coefficient here is i, the square root of minus one, and S and T need not be real valued matrices, but the point is that both S and T are Hermitian matrices. Next, if A is Hermitian, then the diagonal entries of A are real. This is because when I take the conjugate transpose, the diagonal entries stay where they are; if A equals A Hermitian, the diagonal entries cannot have a nonzero imaginary part, because otherwise the diagonal entries would not match. So basically, in a Hermitian matrix, all the entries above the diagonal are complex conjugates of the reflected entries below the diagonal. And so if A is Hermitian, we can fully characterize or fully specify the matrix using n real numbers for the n diagonal entries and n times n minus one over two complex numbers for the off diagonal entries. So although A has n squared entries, a Hermitian matrix is completely specified by n real valued numbers and n into n minus one by two complex valued numbers. Okay, so basically Hermitian matrices are to complex matrices as real numbers are to complex numbers. Here is one result that kind of makes this point. Let A be a Hermitian matrix. Then (a) X Hermitian A X is real for every X in C to the n, (b) all eigenvalues of A are real, and (c) S Hermitian A S is Hermitian for every S in C to the n cross n. Okay, these are pretty obvious facts. For example, to show (a), if I take the complex conjugate of X Hermitian A X, that is the same as taking its conjugate transpose, because this is after all a scalar: X Hermitian A X, all Hermitian.
And the Hermitian of a product of vectors and matrices works exactly the same way as taking the transpose, with the extra conjugate in there. So the Hermitian of X Hermitian A X is X Hermitian A Hermitian X, because X Hermitian Hermitian is just X back on the right side. And since A equals A Hermitian, this is equal to X Hermitian A X. So the scalar equals its own complex conjugate, which means it is real. And similarly, if A X equals lambda X, and I take a unit norm eigenvector, so X Hermitian X equals one, then lambda, which equals lambda times X Hermitian X because X Hermitian X equals one, can be written as X Hermitian times lambda X, because lambda is a scalar. And lambda X is the same as A X, so that is equal to X Hermitian A X, which we just showed is real valued. And since this is real valued, lambda is real valued. So all the eigenvalues of a Hermitian matrix are real valued. And finally, if I take the Hermitian of S Hermitian A S, I get S Hermitian A Hermitian S, which is equal to S Hermitian A S. So that means S Hermitian A S is Hermitian for every S. So I guess these are pretty obvious facts, but they turn out to be very useful later: you take a Hermitian matrix and compute X Hermitian A X, and that's always real valued. Normally, if I take a quantity like X Hermitian A X, it would be complex valued, and I cannot order those values, because you can't order complex numbers. So if I had to, say, minimize something like X Hermitian A X, that's a tough problem if A is an arbitrary matrix. But if A is a Hermitian matrix, X Hermitian A X is always real valued, so that's a perfectly valid thing to try to minimize or maximize. And similarly, all the eigenvalues of A being real means that I can order the eigenvalues; I can ask which is the smallest eigenvalue, which is the largest eigenvalue, and so on.
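Here is a small NumPy check of these two facts (the test matrix is arbitrary; note that np.linalg.eigvals returns complex values in general, so we inspect their imaginary parts):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = B + B.conj().T                   # a Hermitian matrix

# x^H A x is real for every x
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
q = x.conj() @ A @ x                 # the quadratic form x^H A x
print(abs(q.imag) < 1e-12)           # imaginary part vanishes

# all eigenvalues are real, hence can be ordered
ev = np.linalg.eigvals(A)
print(np.allclose(ev.imag, 0.0))
print(np.sort(ev.real))              # smallest to largest
```

Because the quadratic form and the eigenvalues are real, minimizing or ordering them is meaningful, exactly as argued above.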
And we'll see that S Hermitian A S being Hermitian will also be very useful for us later. The thing you should think about now is whether the converse is true. Suppose you computed X Hermitian A X and found that it's real for all X in C to the n. Does it mean that A must be a Hermitian matrix? Likewise, suppose you found all the eigenvalues of A and found that they're all real valued. Does it mean that A is a Hermitian matrix? And so on. We'll come to that in a few minutes. Now, I mentioned that this is a theorem that illustrates somehow that Hermitian matrices are to complex valued matrices as real numbers are to complex numbers. How is that? Suppose I take n equals one. What is the property saying? n equals one means A is just some Hermitian complex number, and if that is the case, A must be a real valued number, because its conjugate transpose, which is its complex conjugate, equals A itself. And if A is a real valued number, X Hermitian A X is going to be this real valued number A times mod X squared, and that is real for every complex valued X. And if it's a one cross one matrix, whatever that A is, that is the eigenvalue of the matrix, and so condition (b) just says that A is real. And condition (c) is the same as saying mod S squared times A is real for every complex number S. Okay, so I asked about the converse of these points. Here's the result about that. I'll actually write A with a i j being its entries in this form, because we'll need these entries to prove the result. Then A is Hermitian if and only if any one of the following holds: (a) X Hermitian A X is real for every X in C to the n, (b) A is normal and all eigenvalues of A are real, (c) S Hermitian A S is Hermitian for every S in C to the n cross n.
And remember that all Hermitian matrices are also normal, because a matrix is normal if it satisfies A A Hermitian equals A Hermitian A, but for a Hermitian matrix, A Hermitian equals A, so A A Hermitian equals A squared, which equals A Hermitian A. So all properties of normal matrices, for example that eigenvectors corresponding to distinct eigenvalues are orthogonal, that there is a complete set of eigenvectors, that the matrix is unitarily diagonalizable, all of them hold for Hermitian matrices also. And we will use these properties extensively in the results that we are going to discuss. Okay, so now the proof. Notice that in condition (b), when I say that A is normal, I'm already assuming a lot about A; but the claim is that if, in addition to being normal, all the eigenvalues of A are real, then A is Hermitian. So Hermitian matrices are a special case of normal matrices, and if I assume that the eigenvalues of a normal matrix A are real, then that matrix A is Hermitian. Okay, so let's see how to show this result. Now, the necessity is what we showed in the previous result, so it's enough to show sufficiency. Let me actually explain this; this is basic logic. The statement of the theorem says that A is Hermitian if and only if one of these conditions holds. That means we need to show that A is Hermitian if the condition holds, which is that the condition is sufficient for A to be Hermitian; and we also need to show that A is Hermitian only if the condition holds, which is that the condition is necessary. We've already shown that if A is Hermitian, then each condition is true; that is the necessity part. Because, again, basic logic: if statement A implies statement B, then the complement of statement B implies the complement of statement A, that is, not B implies not A. And what we showed is that A being Hermitian implies X Hermitian A X is real for all X.
Okay, which means that not B, namely that there is some X for which X Hermitian A X is not real, implies not A, namely that A is not Hermitian. So the only if part is already shown by the previous result, and it suffices to show sufficiency, namely that if the condition holds, then A is Hermitian. So suppose X Hermitian A X is real for every X in C to the n. Then if I consider X plus Y, Hermitian, A, X plus Y, this is also real valued because of the condition. If I expand it out, I get X Hermitian A X plus Y Hermitian A Y plus Y Hermitian A X plus X Hermitian A Y. Now X Hermitian A X is real, again because of the assumption, and Y Hermitian A Y is also real. So this whole sum is real and those two terms are real, and so the remaining part must be real also, because if it were a complex number, there's no way this equality would be satisfied. So we now know that the quantity Y Hermitian A X plus X Hermitian A Y is always real valued regardless of which X and Y I choose. So I can cleverly choose X and Y and see what happens. If I choose X equal to e k, the kth column of the n cross n identity matrix, and Y equal to e j, then A X will pick out the kth column of A, and Y Hermitian times A X will pick out the jth entry of the kth column, the jth row and kth column, which is a j k. Similarly, X Hermitian A Y picks out a k j. So a j k plus a k j is real, and if that is real, it means the imaginary part of a j k is the negative of the imaginary part of a k j. The imaginary parts must cancel, otherwise the sum wouldn't be real.
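The mechanics of this step, that e j Hermitian A e k picks out the (j, k) entry, is easy to see numerically. Here is a sketch with a small Hermitian matrix chosen by hand:

```python
import numpy as np

A = np.array([[1+0j, 2+3j, 0+1j],
              [2-3j, 5+0j, 4-2j],
              [0-1j, 4+2j, 7+0j]])   # Hermitian by construction

n = A.shape[0]
E = np.eye(n)                        # columns are e_0, ..., e_{n-1}
j, k = 0, 2
x, y = E[:, k], E[:, j]

# y^H A x picks out the (j, k) entry, and x^H A y the (k, j) entry
print(y.conj() @ A @ x == A[j, k])
s = y.conj() @ A @ x + x.conj() @ A @ y   # a_jk + a_kj
print(abs(s.imag) < 1e-12)                # imaginary parts cancel
```

For this Hermitian A the sum a_jk + a_kj is real, exactly as the argument requires.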
Next, choose X equal to i times e k and Y equal to e j. Then Y Hermitian A X gives me i times a j k, and in X Hermitian A Y, the i becomes a minus i when I take the Hermitian, so I get minus i times a k j. And this sum is again real. Now if i a j k minus i a k j is real, it means the real parts of a j k and a k j must be equal. So if you now look at a j k and a k j, their real parts are equal and their imaginary parts are the negatives of each other. That means a k j equals the complex conjugate of a j k, and this is exactly the same as saying A equals A Hermitian, okay? So that proves part (a). Now for part (b): if A is normal, then it is unitarily diagonalizable; we've seen that result already, all normal matrices are unitarily diagonalizable. That means I can write A as U lambda U Hermitian, where lambda is a diagonal matrix containing lambda 1 to lambda n, the eigenvalues of A, along the diagonal, and U is a unitary matrix. So in general, A Hermitian equals U lambda Hermitian U Hermitian, and I can write lambda Hermitian as lambda star, the entrywise conjugate, because lambda is a diagonal matrix. But if the eigenvalues are all real valued, lambda star equals lambda, and so A Hermitian equals U lambda U Hermitian, which is equal to A from here itself. So A is Hermitian, and that shows (b). And similarly for part (c): if S Hermitian A S is Hermitian for all S in C to the n cross n, then A is Hermitian simply by choosing S equal to the identity matrix, because then S Hermitian A S is just A, so A must equal A Hermitian. So that's it; that's the proof.
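Part (b) can also be illustrated numerically: build a normal matrix with a real spectrum as U lambda U Hermitian for a random unitary U (obtained here, as a sketch, from a QR decomposition of a random complex matrix) and check that it comes out Hermitian:

```python
import numpy as np

rng = np.random.default_rng(3)
Z = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(Z)                  # U is unitary

lam = np.diag(rng.standard_normal(4))   # real eigenvalues on the diagonal
A = U @ lam @ U.conj().T                # normal, with real spectrum

print(np.allclose(A @ A.conj().T, A.conj().T @ A))   # A is normal
print(np.allclose(A, A.conj().T))                    # ...and Hermitian
```

Both checks print True: a normal matrix whose eigenvalues are all real is necessarily Hermitian, which is exactly the content of part (b).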
Okay, the point I made about Hermitian matrices being normal is an important point, so I want to actually write it here: Hermitian matrices are normal. That means all properties of normal matrices apply to Hermitian matrices. So, for example, eigenvectors corresponding to distinct eigenvalues are orthogonal. There is a complete set of eigenvectors, that is, the geometric multiplicity of every eigenvalue equals its algebraic multiplicity, so a Hermitian matrix can never be defective. And it is unitarily diagonalizable. All of these hold for Hermitian matrices. So one result we have seen earlier was the spectral theorem for Hermitian matrices, which was a specialization of the result for normal matrices. The result says that A in C to the n cross n is Hermitian if and only if there exists a unitary U and a real diagonal lambda, so lambda in R to the n cross n, such that A equals U lambda U Hermitian. Moreover, if A is real and Hermitian, which for a real valued matrix just means symmetric, then, to put it this way: A is real and symmetric if and only if there exists a real orthogonal matrix P and a real diagonal lambda such that A equals P lambda P transpose.
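NumPy's np.linalg.eigh computes exactly this factorization for Hermitian input; here is a short sketch (the test matrices are random and purely illustrative) covering both the complex Hermitian and the real symmetric statements:

```python
import numpy as np

rng = np.random.default_rng(4)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = 0.5 * (B + B.conj().T)              # Hermitian

w, U = np.linalg.eigh(A)                # w real (ascending), U unitary
print(np.allclose(A, U @ np.diag(w) @ U.conj().T))   # A = U Λ U^H

# real symmetric case: the eigenvector matrix can be taken real orthogonal
C = rng.standard_normal((4, 4))
Csym = 0.5 * (C + C.T)
w2, P = np.linalg.eigh(Csym)
print(np.isrealobj(P))                  # P is real...
print(np.allclose(P @ P.T, np.eye(4)))  # ...and orthogonal: A = P Λ P^T
```

All three checks print True, matching the two forms of the spectral theorem stated above.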