In general, if A is an n by n square matrix, then we know it has n eigenvalues, counted with multiplicity, and not much else. But in some cases we know a little more. The most important case is when the matrix is symmetric with real entries.

So suppose A is a real symmetric matrix. Remember: definitions are the whole of mathematics, all else is commentary. The eigenvalues might a priori be complex, but because all of the entries of A are real, the conjugate of A is just A itself. And since A is symmetric, A is A transpose.

So now let's talk eigenvalues. Suppose lambda, v is an eigenvalue-eigenvector pair for a real symmetric matrix A. Definitions are the whole of mathematics, all else is commentary: we know that A v equals lambda v, with v nonzero.

Now since A is real, let's work the conjugate in. If we conjugate both sides, we get: the conjugate of A, times the conjugate of v, equals the conjugate of lambda, times the conjugate of v. But since A is real, the conjugate of A is just A itself, so A times the conjugate of v equals the conjugate of lambda times the conjugate of v.

Since A is symmetric, let's also work the transpose in, so let's take the transpose of both sides. And remember, when you transpose a product, you get the product of the transposes in the reverse order. Meanwhile, over on the right-hand side, the conjugate of lambda is just a scalar, so here we'll just transpose the vector. That gives: the transpose of the conjugate of v, times A transpose, equals the conjugate of lambda times the transpose of the conjugate of v. But since A is symmetric, A transpose is just A.

Now we know something about A, namely that A applied to our vector gives us lambda times the vector. So let's right-multiply both sides by the vector. Over on the left we have the transpose of the conjugate of v times A v, and A v, we know, is lambda v; we'll float that constant to the front. Over on the right we have the conjugate of lambda times the transpose of the conjugate of v, times v.

But given any vector with real or complex components, the transpose of its conjugate times the vector itself is a one-by-one matrix with a non-negative real entry, namely the sum of the squared absolute values of the components; call that real number c. And since an eigenvector is by definition nonzero, c is actually positive. So we have lambda times c equal to the conjugate of lambda times c, and since c is positive we can cancel it: lambda equals its own conjugate. Well, the only numbers that are their own conjugates are the real numbers. And so the eigenvalues of a real symmetric matrix are real. And usefully, since all the eigenvalues are real, there are several numerical methods we can use to find them.
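Before moving on, if it helps to see the whole computation at a glance, here it is in standard notation. This is only a transcription of the argument we just made, nothing new:

```latex
% Setup: A is real (\bar{A} = A) and symmetric (A^{\mathsf T} = A),
% and A v = \lambda v with v \neq 0.
% Compute \bar{v}^{\mathsf T} A v in two ways:
\begin{align*}
  \bar{v}^{\mathsf T} A v
    &= \bar{v}^{\mathsf T} (\lambda v)
     = \lambda \, (\bar{v}^{\mathsf T} v), \\
  \bar{v}^{\mathsf T} A v
    &= (A \bar{v})^{\mathsf T} v
     = (\overline{A v})^{\mathsf T} v
     = (\bar{\lambda} \bar{v})^{\mathsf T} v
     = \bar{\lambda} \, (\bar{v}^{\mathsf T} v).
\end{align*}
% Since \bar{v}^{\mathsf T} v = \sum_i |v_i|^2 > 0, we may cancel it:
% \lambda = \bar{\lambda}, so \lambda is real.
```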
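And speaking of those numerical methods, here is a quick sanity check, a minimal sketch using NumPy (my choice of tool here, not one made in the lecture). numpy.linalg.eigvalsh is a solver specialized to symmetric matrices and returns real eigenvalues by construction, while a general-purpose solver applied to the same matrix returns imaginary parts that are zero up to rounding:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random real symmetric matrix: M + M^T is always symmetric.
M = rng.standard_normal((5, 5))
A = M + M.T

# A general-purpose eigenvalue solver makes no use of the symmetry...
general = np.linalg.eigvals(A)
print(np.max(np.abs(general.imag)))  # ~0: every eigenvalue is (numerically) real

# ...while the symmetric-specialized solver returns real floats by construction.
symmetric = np.linalg.eigvalsh(A)
print(symmetric.dtype)  # float64

# The two solvers agree once the eigenvalues are sorted.
print(np.allclose(np.sort(general.real), symmetric))  # True
```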
But wait, there's more. In general, given an n by n matrix, even one with real eigenvalues, we really can't say anything special about its eigenvectors. But what about the eigenvectors of a real symmetric matrix? At least we know that since the entries of A are real and the eigenvalues are real, the eigenvectors can be taken to be real as well, because A minus lambda times the identity is then a real matrix with a nontrivial null space. But what else do we know?

Well again, suppose A is a real symmetric matrix, and u and v are eigenvectors corresponding to distinct eigenvalues lambda and mu. Again, definitions are the whole of... you know the rest of it. We know that A u equals lambda u, and that A v equals mu v.

Since A is symmetric, let's throw in a transpose. Transposing A u equals lambda u gives u transpose times A transpose equals lambda times u transpose, and since A is symmetric, its transpose is itself: u transpose times A equals lambda times u transpose. Now let's right-multiply by v, and we get: u transpose times A v equals lambda times u transpose v. But we know something about A. We know that A v is mu v, because remember, v was an eigenvector of A corresponding to the eigenvalue mu. So the left side is u transpose times mu v, and we'll float that constant to the front: mu times u transpose v.

Since u and v have real components, u transpose v is a one-by-one matrix whose entry is some real number c. So now we have lambda times c equal to mu times c. But since mu and lambda are different, the only way we can have this equality is if c is zero. And remember, if u and v are n by one column vectors, then u transpose v is the one-by-one matrix whose entry is their dot product. Since that matrix must be the zero matrix, the dot product of the two vectors must be zero, which means they're orthogonal. Consequently, if A is a real symmetric matrix, eigenvectors for distinct eigenvalues are orthogonal.

Now, that's for distinct eigenvalues. What if lambda is an eigenvalue with geometric multiplicity greater than one? We can find a set of linearly independent eigenvectors for lambda, and then running Gram-Schmidt on that set produces an orthogonal set of eigenvectors for lambda: we stay inside the eigenspace, because any linear combination of eigenvectors for lambda is still an eigenvector for lambda. Consequently, if A is a real symmetric matrix, each of its eigenspaces has an orthogonal basis.
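Here is the orthogonality story checked numerically, again as a sketch with NumPy as my assumed tool. numpy.linalg.eigh, the eigensolver for symmetric matrices, returns its eigenvectors as the columns of a matrix Q with orthonormal columns, which is exactly what the two results above point toward: eigenvectors for distinct eigenvalues come out orthogonal, and within any repeated eigenvalue's eigenspace the basis comes back already orthogonalized:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random real symmetric matrix.
M = rng.standard_normal((6, 6))
A = M + M.T

# eigh is the solver for symmetric (Hermitian) matrices: eigenvalues in w
# (ascending order), eigenvectors as the columns of Q.
w, Q = np.linalg.eigh(A)

# The columns of Q are orthonormal, so Q^T Q is the identity.
print(np.allclose(Q.T @ Q, np.eye(6)))  # True

# Dot product of eigenvectors for the smallest and largest eigenvalues:
print(abs(Q[:, 0] @ Q[:, -1]))  # ~0: orthogonal
```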