Suppose A is a matrix with real entries. The eigenvalues of A will be the roots of a real polynomial, that is, a polynomial with real coefficients, and remember that the complex roots of a real polynomial must occur in conjugate pairs. So if lambda is an eigenvalue, so is the conjugate of lambda. And the question you've got to ask yourself is, self, what's the corresponding eigenvector?

So suppose A is a matrix with real entries, where A v equals lambda v. Now, I know that the conjugate of lambda is also an eigenvalue, so let's take the conjugate of both sides. After all, what's the worst that could happen? The conjugate of a product is the product of the conjugates, and this holds for matrix multiplication too. And remember, definitions are the whole of mathematics; all else is commentary. Since A has real entries, the conjugate of A is just A itself. So the conjugate of our equation says that A times the conjugate of v equals the conjugate of lambda times the conjugate of v, and this is exactly the equation that defines an eigenvalue-eigenvector pair. And so this proves that if lambda, v is an eigenvalue-eigenvector pair for a matrix A with real entries, then their conjugates also form an eigenvalue-eigenvector pair.

Now, if A has complex entries, this proof fails. However, if A is Hermitian, a remarkable thing happens. So suppose A is Hermitian with eigenvalue lambda and corresponding eigenvector v. We can write the eigenequation, and let's multiply both sides by the Hermitian of v. You might ask, why would we do that? And the only answer to that is, why not? It's something we can do, so we might as well try it. After all, what's the worst that could happen? Don't answer that question. More importantly, if you don't try something, you can't succeed at anything. Now, we do have some choices. We can multiply by the Hermitian on the left or on the right. Let's think about that. If v is an n by one column vector, which it is, then if I multiply on the right by the Hermitian, I get an n by n matrix.
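As a quick aside, the conjugate-pair fact is easy to check numerically. Here is a sketch using NumPy; the particular two by two matrix is my own example, chosen because its characteristic polynomial is lambda squared plus one, with roots i and minus i:

```python
import numpy as np

# A real matrix with genuinely complex eigenvalues:
# its characteristic polynomial is lambda^2 + 1, so the roots are i and -i.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals = np.linalg.eigvals(A)

# For a matrix with real entries, the conjugate of every
# eigenvalue must also be an eigenvalue.
for lam in eigvals:
    assert np.any(np.isclose(eigvals, np.conj(lam)))
```

The eigenvalues here are purely imaginary, so each one's conjugate is the other, just as the theorem predicts.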
But if I multiply on the left by the Hermitian, I get a one by one matrix. So let's multiply on the left, because let's try the easy things first. If we multiply on the left by v Hermitian, we get v Hermitian A v equals v Hermitian lambda v. But lambda is a scalar, so we can float that scalar multiplier to the front: v Hermitian A v equals lambda times v Hermitian v.

Now, let's take the conjugate of both sides. Why? Again, why not? What's the worst that could happen? The conjugate of a product is the product of the conjugates. And since v is a column vector, v Hermitian v is a one by one matrix with real entries, and the conjugate of a real number is just the real number. Now, note that lambda times v Hermitian v is a one by one matrix, and that means v Hermitian A v is also a one by one matrix, so taking its conjugate is really the same as taking its Hermitian, because the transpose of a one by one matrix is just the matrix itself. But wait, we have this useful property that the Hermitian of a product is the reverse product of the Hermitians, so this Hermitian can be rewritten: the Hermitian of v Hermitian A v is v Hermitian A Hermitian v. But A is assumed Hermitian, so A Hermitian is A itself, and so we are back to v Hermitian A v. And since v is an eigenvector corresponding to eigenvalue lambda, A v is lambda v, where again we can float our scalar to the front. And now we have lambda times v Hermitian v equals the conjugate of lambda times v Hermitian v. Since v is an eigenvector, it is nonzero, so v Hermitian v is nonzero and we can cancel it. That means lambda must equal lambda conjugate, which can only happen if lambda is a real number.

And so we get the following result: all eigenvalues of a Hermitian matrix are real. And if this isn't surprising, then you're not understanding what the theorem is saying. Because remember, A is a matrix that could have complex entries, so when we write down the characteristic polynomial for A, the coefficients may be complex. And it's not at all obvious that such a polynomial would have only real roots. Let's add in one more idea.
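Before moving on, the theorem can be tested numerically. A minimal sketch, assuming NumPy: average a random complex matrix with its own Hermitian to get a Hermitian matrix, then check that its eigenvalues come out real up to round-off.

```python
import numpy as np

rng = np.random.default_rng(0)

# Averaging any matrix with its Hermitian yields a Hermitian matrix:
# ((M + M^H)/2)^H = (M^H + M)/2.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2
assert np.allclose(A, A.conj().T)     # confirm A is Hermitian

# A has complex entries, yet every eigenvalue is real up to round-off.
eigvals = np.linalg.eigvals(A)
assert np.allclose(eigvals.imag, 0.0)
```

This is exactly why NumPy provides `np.linalg.eigvalsh` for Hermitian input: it can promise a real-valued result.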
Remember that if A transpose equals A, then A is a symmetric matrix. And if A is a real symmetric matrix, that is, a symmetric matrix with real entries only, then A Hermitian is by definition the transpose of the conjugate of A. Since A has real entries, the conjugate of A is A itself, so the transpose of the conjugate is just A transpose. And since A is symmetric, A transpose is A. If we join beginning to end, this says that A Hermitian is A, which is the definition of a Hermitian matrix. And this leads to the big reveal: real symmetric matrices are Hermitian. And we know that all eigenvalues of a Hermitian matrix are real. Consequently, all eigenvalues of a real symmetric matrix are real.
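The whole chain of reasoning can be checked in a few lines. A sketch, again assuming NumPy: symmetrize a random real matrix, confirm it equals its own Hermitian, and confirm its eigenvalues are real.

```python
import numpy as np

rng = np.random.default_rng(1)

# Symmetrize a random real matrix: (B + B^T)/2 is always symmetric.
B = rng.standard_normal((5, 5))
S = (B + B.T) / 2
assert np.allclose(S, S.T)            # S is symmetric

# With real entries, conjugation does nothing, so S^H = S^T = S:
# S is Hermitian.
assert np.allclose(S, S.conj().T)

# Hence all eigenvalues of S are real.
eigvals = np.linalg.eigvals(S)
assert np.allclose(np.imag(eigvals), 0.0)
```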