So, last time we looked at several new things: eigenvectors, and the eigenspace you can define with respect to a particular eigenvalue. We looked at the concepts of geometric and algebraic multiplicity of an eigenvalue. We also defined left and right eigenvectors and the principle of biorthogonality, namely that if you take two distinct eigenvalues of a matrix, then any left eigenvector corresponding to the first eigenvalue is orthogonal to any right eigenvector corresponding to the other eigenvalue. We also looked at how eigenvectors change under a similarity transform: if $B = S^{-1}AS$ and $y$ is an eigenvector of $B$, then $Sy$ is an eigenvector of $A$ for the same eigenvalue. We saw that if you take a real symmetric matrix or a complex Hermitian matrix, its eigenvalues are always real. And if you have an operator norm, then for any eigenvalue $\lambda$ of the matrix $A$, $|\lambda| \leq \|A\|$. An immediate consequence is that if $\lambda$ is an eigenvalue of $A$, then
$$|\lambda| \;\leq\; \max_{1 \leq i \leq n} \sum_{j=1}^{n} |a_{ij}| \qquad \text{and} \qquad |\lambda| \;\leq\; \max_{1 \leq j \leq n} \sum_{i=1}^{n} |a_{ij}|.$$
That is, the largest absolute row sum and the largest absolute column sum are always upper bounds on the modulus of any eigenvalue of the matrix. Why is this true? Because these two quantities are themselves operator norms of $A$: the maximum absolute row sum is the norm induced by the $\infty$-norm, and the maximum absolute column sum is the norm induced by the 1-norm. So that's the recap, and now we'll continue on.

The next thing I want to discuss is the idea of unitary equivalence. Just to break the mystery here, unitary equivalence, or unitary similarity, refers to similarity under unitary transformations. Recall that two matrices $A$ and $B$ are similar if you can write $B = S^{-1}AS$ for some invertible matrix $S$. If this matrix $S$ happens to be unitary, we say that $A$ and $B$ are unitarily similar, or unitarily equivalent. That is the core idea here. Unitary equivalence is very closely related to one very important theorem that I am going to discuss soon, called the Schur unitary triangularization theorem; it forms the basis for that theorem, so this is the prelude leading up to it.
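To make the two facts above concrete, here is a minimal NumPy sketch (not part of the lecture, just an illustrative check): it draws a random complex matrix, verifies that every eigenvalue modulus is bounded by the maximum absolute row sum and the maximum absolute column sum, and confirms that a unitary similarity leaves the spectrum unchanged. The dimension and the use of QR to manufacture a unitary matrix are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A random complex n x n matrix.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
eigvals = np.linalg.eigvals(A)

# Largest absolute row sum (infinity-norm) and column sum (1-norm):
# both are operator norms, hence upper bounds on every |lambda|.
max_row_sum = np.abs(A).sum(axis=1).max()
max_col_sum = np.abs(A).sum(axis=0).max()
assert np.abs(eigvals).max() <= max_row_sum
assert np.abs(eigvals).max() <= max_col_sum

# QR of a random complex matrix gives a unitary Q (Q^H Q = I).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

# B = Q^H A Q is unitarily similar to A, so it has the same eigenvalues.
B = Q.conj().T @ A @ Q
assert np.allclose(np.sort_complex(np.linalg.eigvals(B)), np.sort_complex(eigvals))
```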
Okay, also recall that we say a set of vectors $x_1, \ldots, x_n$ is orthogonal if the inner product between any pair of distinct vectors is zero. If, in addition, each of those vectors has unit norm, then we say they are orthonormal. When defining these things, we typically consider only the usual inner product $x_i^H x_j$ and the usual Euclidean norm $\sqrt{x_i^H x_i}$. Also, if you are given a set of orthogonal vectors and these vectors are nonzero, you can obtain an orthonormal set from them by simply normalizing each vector. For example, if $y_1, y_2, \ldots, y_k$ are nonzero orthogonal vectors and I define $x_i = y_i / \sqrt{y_i^H y_i}$, then $x_1, x_2, \ldots, x_k$ form an orthonormal set. Obviously, orthonormal vectors are nonzero by definition. Okay, so we have the following result: an orthonormal set of vectors is linearly independent. This is very simple, so I'll just quickly write it out.

So let $x_1, \ldots, x_k$ be an orthonormal set of vectors; we need to show that they are linearly independent. Suppose $\sum_{i=1}^{k} \alpha_i x_i = 0$. We need to show that all these $\alpha_i$ must equal zero. If this is the zero vector, then, since the zero vector's inner product with itself is zero,
$$\left( \sum_{i=1}^{k} \alpha_i x_i \right)^{\!H} \left( \sum_{j=1}^{k} \alpha_j x_j \right) = 0.$$
When I expand this inner product, the left-hand side equals $\sum_{i,j=1}^{k} \alpha_i^* \alpha_j \, x_i^H x_j$ (note the conjugate on $\alpha_i$). But $x_i^H x_j = 0$ whenever $i \neq j$, because the vectors are orthonormal, so only the diagonal terms survive, and the sum reduces to $\sum_{j=1}^{k} |\alpha_j|^2 \, x_j^H x_j$ with $x_j^H x_j = 1$. This means $\sum_{j=1}^{k} |\alpha_j|^2 = 0$, which is possible only if $\alpha_j = 0$ for $1 \leq j \leq k$. So $x_1, \ldots, x_k$ are linearly independent.

We have seen this already, but just for the sake of completeness: a unitary matrix is a matrix $U \in \mathbb{C}^{n \times n}$ such that $U^H U = I$. At some point we may need the real-valued analogue, so I will make a distinction by calling the equivalent for real-valued matrices a real orthogonal matrix. This is a slight abuse of terminology, but it is just for the sake of concreteness when we are using these matrices: when I say $U$ is real orthogonal, I mean $U$ is a real-valued $n \times n$ matrix with $U^T U = I$. Okay, now before I state the next result, I want to recall one little property that, again, we have seen earlier: if $A \in \mathbb{C}^{n \times n}$ and $BA = I$ for some $B \in \mathbb{C}^{n \times n}$, then (1) $A$ is nonsingular, (2) $B$ is unique, and (3) $AB = I$ as well. As a consequence, we can write $B = A^{-1}$.
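Again as a quick numerical illustration (not from the lecture; the dimensions and column scalings are arbitrary), the sketch below normalizes a set of nonzero orthogonal vectors into an orthonormal set, checks that the Gram matrix $X^H X$ is the identity, which is nonsingular and is exactly why the set is linearly independent, and confirms for a square unitary $U$ that $U^H U = I$ forces $U U^H = I$ as well, i.e. the $BA = I \Rightarrow AB = I$ fact with $B = U^H$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 3

# Manufacture orthogonal (but not unit-norm) columns: orthonormalize a
# random complex matrix via QR, then rescale each column.
Y, _ = np.linalg.qr(rng.standard_normal((n, k)) + 1j * rng.standard_normal((n, k)))
Y = Y * np.array([2.0, 0.5, 3.0])   # y_i orthogonal, nonzero, not normalized

# Normalize: x_i = y_i / sqrt(y_i^H y_i).
X = Y / np.linalg.norm(Y, axis=0)

# The Gram matrix X^H X of an orthonormal set is the identity matrix.
assert np.allclose(X.conj().T @ X, np.eye(k))

# For a square U with U^H U = I, we also get U U^H = I (BA = I => AB = I).
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
assert np.allclose(U.conj().T @ U, np.eye(n))
assert np.allclose(U @ U.conj().T, np.eye(n))
```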