So, normal matrices. Don't get fooled by the name: it is just a property, and it has nothing to do with the matrix being "normal" in any everyday sense, or in any way related to the Gaussian distribution. The definition is this: a matrix A in C^(n x n) is normal if it commutes with its conjugate transpose, that is, A A^H = A^H A. Any matrix for which this is true is called a normal matrix. Normal matrices are a generalization of unitary, symmetric, and Hermitian matrices. For example, a real symmetric matrix satisfies A^T = A, and since it is real, A^H = A^T = A; therefore A A^H = A^2 and A^H A = A^2, the two products agree, and the matrix is normal. Similarly, for a Hermitian matrix the same equality holds, and as a consequence all such matrices, whether unitary, symmetric, or Hermitian, are normal. Just to illustrate: if U is unitary, then U U^H = U^H U = I, which shows that unitary matrices are normal. Likewise, A A^H = A^H A whenever A = A^H, so all Hermitian matrices are normal. Also, if A^H = -A (such matrices are called skew-Hermitian, or skew-symmetric in the real case), then A A^H = A^H A = -A^2, so skew-Hermitian matrices are normal as well. Finally, one more example: the matrix A = [[1, -1], [1, 1]] is normal, but it is not unitary, Hermitian, skew-Hermitian, or skew-symmetric. So the definition of normal matrices is a strict generalization of these other classes: unitary matrices, real symmetric matrices, Hermitian matrices, and skew-Hermitian matrices. Now, here is a very interesting result which outlines some properties of normal matrices. [A student asks whether this holds for orthonormal matrices also.]
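The examples above can be checked numerically. Here is a minimal sketch in Python with NumPy; the helper name `is_normal` is my own choice, not anything from the lecture:

```python
import numpy as np

def is_normal(A, tol=1e-12):
    """A matrix is normal iff it commutes with its conjugate transpose."""
    AH = A.conj().T
    return np.allclose(A @ AH, AH @ A, atol=tol)

# Real symmetric matrix: A^T = A, so A A^H = A^H A = A^2.
S = np.array([[2.0, 1.0], [1.0, 3.0]])

# Real skew-symmetric matrix: A^T = -A, so A A^H = A^H A = -A^2.
K = np.array([[0.0, 1.0], [-1.0, 0.0]])

# The example from the lecture: normal, but neither unitary,
# Hermitian, skew-Hermitian, nor symmetric.
A = np.array([[1.0, -1.0], [1.0, 1.0]])

print(is_normal(S), is_normal(K), is_normal(A))  # True True True
print(np.allclose(A @ A.T, np.eye(2)))           # False: A is not unitary
```

The last line confirms that A fails the unitarity test (A A^T = 2I, not I), even though it passes the normality test.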
This A matrix has orthogonal columns, and yes, the claim should be stated for orthonormal columns; it does not hold generally for merely orthogonal ones. If I had included an extra 1 over square root of 2 factor here, this would have become a unitary matrix, and then, being unitary, it would of course also be normal. But even without that factor, it satisfies the requirement of being normal. [A student clarifies that they are asking about the orthogonal case, not one from a previous lecture.] In terms of notation, I use two terms. A unitary matrix is a potentially complex-valued matrix U for which U U^H = I. A real orthogonal matrix is a real-valued matrix satisfying U U^T = I. I don't have a specific word for a matrix like this A that I've written here, whose columns are orthogonal to each other and have the same norm, but are not unit norm. But such a matrix is also normal. Because if I take A^H A, I get a diagonal matrix, and in fact A^H A is a scaled version of the identity matrix. From that you can see that if I divide A by the square root of that scaling, the resulting matrix becomes a unitary matrix. And as a consequence, multiplying in the other order also gives the same scaled identity, so A A^H = A^H A. Okay, so here's the theorem. If a matrix A in C^(n x n) has eigenvalues lambda_1 through lambda_n, the following are equivalent: (a) A is normal; (b) A is unitarily diagonalizable; (c) the sum over i, j from 1 to n of |a_ij|^2 equals the sum over i from 1 to n of |lambda_i|^2.
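Both the scaling remark and condition (c) of the theorem can be verified on the 2-by-2 example. A sketch, assuming NumPy:

```python
import numpy as np

# The matrix from the discussion: orthogonal columns of equal norm sqrt(2).
A = np.array([[1.0, -1.0], [1.0, 1.0]])

# A^H A is a scaled identity (here 2 I), so A / sqrt(scale) is unitary.
AH_A = A.conj().T @ A
scale = AH_A[0, 0]
U = A / np.sqrt(scale)
print(np.allclose(U @ U.conj().T, np.eye(2)))  # True: U is unitary

# Condition (c): sum of |a_ij|^2 equals sum of |lambda_i|^2.
# Eigenvalues of A are 1 + i and 1 - i, each with |lambda|^2 = 2.
eigvals = np.linalg.eigvals(A)
lhs = np.sum(np.abs(A) ** 2)             # 4
rhs = np.sum(np.abs(eigvals) ** 2)       # 4
print(np.isclose(lhs, rhs))              # True
```

Note that condition (c) compares the squared Frobenius norm of A against the squared moduli of the eigenvalues; for a non-normal matrix the left side strictly exceeds the right side.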
So this shows that any normal matrix is unitarily diagonalizable. This is a different condition under which you can be assured that a matrix is unitarily diagonalizable, compared to the result we saw earlier, which required the matrix to have distinct eigenvalues. Here the eigenvalues lambda_1 through lambda_n need not be distinct: normality alone is sufficient for A to be unitarily diagonalizable.
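To see that distinct eigenvalues are not needed, one can build a normal matrix with a deliberately repeated eigenvalue and check that a unitary diagonalization still exists. A sketch, assuming NumPy; constructing the random unitary Q via a QR factorization is my choice of method, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random unitary Q via QR of a random complex matrix.
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
Q, _ = np.linalg.qr(M)

# Normal matrix with a REPEATED eigenvalue: A = Q diag(2, 2, 3) Q^H.
A = Q @ np.diag([2.0, 2.0, 3.0]) @ Q.conj().T

# A commutes with its conjugate transpose, so it is normal.
AH = A.conj().T
print(np.allclose(A @ AH, AH @ A))  # True

# A is in fact Hermitian here (real eigenvalues), so eigh returns an
# orthonormal eigenbasis directly, even with the repeated eigenvalue 2.
w, V = np.linalg.eigh(A)
print(np.allclose(V @ np.diag(w) @ V.conj().T, A))  # True: A = V D V^H
print(np.allclose(V.conj().T @ V, np.eye(3)))       # True: V is unitary
```

Even though the eigenvalue 2 has multiplicity two, the eigenvector matrix V is still unitary, which is exactly the guarantee the theorem provides for normal matrices.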