I want to introduce another definition here. Suppose we have a square matrix, and let's make it a real matrix first, so just real numbers. If you have a real square matrix U, we call it orthogonal if it satisfies the relationship U transpose U equals the identity: the transpose of the matrix times the original matrix gives you the identity. Now, if you look at this statement a little differently, U transpose U equaling the identity suggests that U transpose is the inverse of the matrix, and since the left and right inverses of a square matrix are the same, this also tells us that U U transpose equals the identity. This is what we call an orthogonal matrix.

You can see why someone might be interested in an orthogonal matrix. Remember how we did matrix inversion in the past: to find the inverse of a non-singular matrix U, you augment that matrix with the identity and row reduce until U becomes the identity, at which point the identity has become U inverse. Because a lot of row operations take place in the middle, this can be an expensive procedure. In comparison, taking the transpose is essentially free when it comes to complexity, so finding the inverse of an orthogonal matrix is a cinch, assuming you have an orthogonal matrix.

The complex counterpart of an orthogonal matrix is called a unitary matrix. As usual, when we work with complex matrices we never use the plain transpose; that's bad news for complex matrices. Instead, we use the conjugate transpose. A unitary matrix is a complex matrix whose conjugate transpose is equal to its inverse.

Now, why the name orthogonal matrix? It comes from the following theorem: a real square matrix is orthogonal, or a complex square matrix is unitary, if and only if the column vectors of the matrix form an orthonormal set. Remember, an orthonormal set means that every vector is a unit vector and that the inner products of distinct vectors from the set are equal to 0. So a matrix is orthogonal if and only if its column vectors form an orthonormal set, and that's why we call these things orthogonal matrices.

Because of this, people ask: why call it orthogonal? The set of column vectors isn't merely orthogonal; it actually has to be orthonormal. So why aren't these called orthonormal matrices? That question makes a lot of sense, and some authors of linear algebra textbooks insist on calling them orthonormal matrices for exactly that reason. But the label orthogonal is used more commonly, so I think it would do students a disservice to use a different term here. For unitary matrices, clearly there's no such confusion.

Now, whenever a matrix is orthogonal, since its inverse is its transpose, the transpose of an orthogonal matrix is itself an orthogonal matrix, and likewise the conjugate transpose when we're talking about unitary complex matrices. Because of that, it's also true that a matrix is orthogonal if and only if its row vectors likewise form an orthonormal set. So you can look at the columns or the rows, and either way these are going to be orthonormal sets.
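To see the definition in action, here is a minimal NumPy sketch, my own illustration rather than anything from the lecture, using a 2 by 2 rotation matrix, which is a standard example of an orthogonal matrix. It checks the defining property and confirms that the transpose really is the inverse.

```python
import numpy as np

# A 2 by 2 rotation matrix is a standard example of an orthogonal matrix.
theta = 0.7  # any angle works
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Defining property: U^T U = I (and therefore U U^T = I as well).
print(np.allclose(U.T @ U, np.eye(2)))  # True
print(np.allclose(U @ U.T, np.eye(2)))  # True

# The transpose really is the inverse, no row reduction needed.
print(np.allclose(U.T, np.linalg.inv(U)))  # True
```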
So I want to give you an example of this type of thing. Here's a 3 by 3 orthogonal matrix: its columns are 3, 1, 1 over the square root of 11; negative 1, 2, 1 over the square root of 6; and negative 1, negative 4, 7 over the square root of 66. This is a real matrix, and if you look at it you might say: what the heck? Why are all these square roots showing up here, square root of 11, square root of 6, square root of 66? What's going on? Well, remember, to be an orthogonal matrix we need the columns to form an orthonormal set, so each column has to be a unit vector. If you were to take the vector 3, 1, 1, without the square root of 11, and compute its norm, you would get the square root of 3 squared, which is 9, plus 1 squared, which is 1, plus 1 squared, which is 1. And 9 plus 1 plus 1 is 11, so the norm is the square root of 11. So 3 over root 11, 1 over root 11, 1 over root 11 is just the normalization of the vector 3, 1, 1. Similarly for negative 1, 2, 1: the length of that vector is the square root of 6. And if you take negative 1, negative 4, 7, the length of that vector is the square root of 66. So these columns are normalizations.

Are they orthogonal? Well, take the dot products; I'm going to ignore the normalizations for now. If you take the dot product of 3, 1, 1 with negative 1, 2, 1, you end up with negative 3 plus 2 plus 1, which is 0. So that is, in fact, an orthogonal pair. And be aware that taking scalar multiples of orthogonal vectors doesn't change the orthogonality condition, so the normalized vectors are orthogonal as well. If you dot vector 1 with vector 3, that's negative 3 minus 4 plus 7, which is 0, and vector 2 dotted with vector 3 is 1 minus 8 plus 7, also 0. So you can check that this set is an orthonormal set, and that's one way of showing the matrix is orthogonal.

The other way to show it's orthogonal is simply to use the definition: take U transpose U and show that it equals the identity. I'm not going to do that calculation here; I'm actually going to encourage the viewer to do it. So pause the video right now and check that U transpose U gives you the 3 by 3 identity: 1, 1, 1 on the diagonal, 0s everywhere else. Check that for us right now.
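If you'd rather have the computer do that check, here's a short NumPy sketch. Note that the exact column entries, in particular the third column negative 1, negative 4, 7 over root 66, are my reconstruction from the dot products worked out above.

```python
import numpy as np

# The 3 by 3 example: each column is a normalized vector.
U = np.column_stack([
    np.array([ 3.0,  1.0, 1.0]) / np.sqrt(11),
    np.array([-1.0,  2.0, 1.0]) / np.sqrt(6),
    np.array([-1.0, -4.0, 7.0]) / np.sqrt(66),
])

# The check the lecture asks for: U^T U should be the 3 by 3 identity.
print(np.round(U.T @ U, 10))
print(np.allclose(U.T @ U, np.eye(3)))  # True

# The rows form an orthonormal set too, so U U^T is also the identity.
print(np.allclose(U @ U.T, np.eye(3)))  # True
```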
I next want to give an example of a unitary matrix, the case where U star U equals the identity. The entries are complex numbers, so we take the conjugate transpose. The matrix here is one half times the 2 by 2 matrix whose first column is 1 plus i, 1 minus i and whose second column is 1 plus i, negative 1 plus i. Again, pause the video right now and double-check that this matrix is unitary directly from the definition.

Another way to show that it's unitary is to check that the columns form an orthonormal set. With complex numbers you have to be a little more careful: remember, when you take the Hermitian product, you have to conjugate the first factor. Take the first vector and let's show it's a unit vector. You can factor out the one half; since it's a real number, conjugation doesn't do anything to it. So take the length of 1 plus i, 1 minus i: you get one half times the square root of, well, 1 plus i times its conjugate, which is 2, plus 1 minus i times its conjugate, which is also 2. So you end up with one half times the square root of 4, which is one half times 2, which equals 1. So the first vector is a unit vector, and a similar calculation shows the second one is a unit vector as well.

Going back, take the Hermitian product of the two columns. You get one half times one half, which is one fourth, times the product of 1 plus i, 1 minus i with 1 plus i, negative 1 plus i. As these are complex numbers, don't forget to conjugate the first factor. So we get one fourth times: 1 minus i, the conjugate of 1 plus i, times 1 plus i, plus 1 plus i, the conjugate of 1 minus i, times negative 1 plus i. Working out the details: if you FOIL 1 minus i times 1 plus i, that equals 2, but 1 plus i times negative 1 plus i FOILs out to negative 2. So the whole thing adds up to 0 when you're done. And so you can see that the columns form an orthonormal set, and this is what we call a unitary matrix.

All right, so why do people care about orthogonal and unitary matrices at all? Well, I want to talk about what multiplication by an orthogonal or unitary matrix does geometrically. Suppose we have an orthogonal or unitary matrix U; I'll use the words orthogonal and unitary essentially interchangeably here, orthogonal for real matrices and unitary for complex ones, and there's no consequence to doing that. If we take any two vectors x and y that live inside our vector space F n, it turns out that the inner product of x and y is identical to the inner product of U x and U y. So if you multiply the vectors x and y by a unitary matrix, U x dot U y will equal x dot y.

Matrix multiplication corresponds to a linear transformation, and multiplying by a unitary or orthogonal matrix, since it's linear, preserves vector addition and scalar multiplication. But orthogonal and unitary matrices have the extra property that they preserve inner products: inner products before the map are the same as inner products afterwards. This is significant because, since we preserve inner products, we also preserve everything defined in terms of inner products. Like we saw at the start of this lecture, this preserves angles: angles between vectors are preserved because we can compute the angle from an inner product. It also preserves distances, because the distance between two vectors, say x and y, equals the length of x minus y, which, remember, is the square root of the dot product of x minus y with itself. Anything we define with a dot product will be preserved by orthogonal and unitary matrices. So multiplication by an orthogonal matrix doesn't affect distances, doesn't affect angles, doesn't affect norms. In particular, the norm of U x equals the norm of x: the length of a vector doesn't change when you multiply it by an orthogonal or unitary matrix. And that's a pretty useful fact.

The proof is actually pretty easy to see. The idea is the following: start with the left-hand side, U x dot U y. This equals U x, transposed, times U y; I'll assume everything is real here, and if it were complex you'd switch the transpose to a star. By properties of the transpose, since it reverses products (the shoes-and-socks rule), you get x transpose, U transpose, U, y. Since U is orthogonal, U transpose U is the identity, so this just becomes x transpose y, which is the same thing as the inner product x dot y. And like I said, for complex matrices we change the appropriate parts and get the same argument. Orthogonal matrices don't change inner products.
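Going back to the 2 by 2 unitary example, here's a quick NumPy check, again my own illustration. It verifies U star U equals the identity and redoes the column calculations, using np.vdot, which conjugates its first argument, for the Hermitian product.

```python
import numpy as np

# The 2 by 2 unitary example: one half times the matrix with columns
# (1+i, 1-i) and (1+i, -1+i).
U = 0.5 * np.array([[1 + 1j,  1 + 1j],
                    [1 - 1j, -1 + 1j]])

# Defining property of a unitary matrix: U* U = I,
# where U* is the conjugate transpose.
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True

# Column checks: unit lengths, and a zero Hermitian product.
# np.vdot conjugates its first argument, matching the convention above.
print(np.linalg.norm(U[:, 0]), np.linalg.norm(U[:, 1]))  # 1.0 1.0
print(np.vdot(U[:, 0], U[:, 1]))                         # 0 (up to rounding)
```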
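And here's a sketch of the preservation property itself. Any orthogonal matrix will do, so, purely as a convenience for the sketch, I grab one from a QR factorization of a random matrix, since the Q factor always has orthonormal columns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Grab an orthogonal matrix Q from a QR factorization of a random
# matrix; the Q factor always has orthonormal columns.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

x = rng.standard_normal(4)
y = rng.standard_normal(4)

# Inner products are preserved: (Qx) . (Qy) = x . y ...
print(np.allclose((Q @ x) @ (Q @ y), x @ y))  # True

# ... and therefore norms and distances are preserved too.
print(np.allclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))              # True
print(np.allclose(np.linalg.norm(Q @ x - Q @ y), np.linalg.norm(x - y)))  # True
```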
A consequence of all this: if we have two bases for our space F n, call them B and C, and these are two orthonormal bases, then the associated change of basis matrix, the one that converts C coordinates to B coordinates, will also be an orthogonal or unitary matrix.
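Here's one final sketch of that consequence, again with randomly generated orthonormal bases. The bookkeeping is my own: writing the basis vectors as the columns of matrices B and C, the change of basis matrix from C to B is B inverse times C, and since B is orthogonal its inverse is just its transpose.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two orthonormal bases for R^4, stored as the columns of B and C.
B, _ = np.linalg.qr(rng.standard_normal((4, 4)))
C, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Change of basis from C coordinates to B coordinates:
# [v]_B = B^{-1} C [v]_C, and B^{-1} = B^T since B is orthogonal.
P = B.T @ C

# P is itself orthogonal, as the theorem claims.
print(np.allclose(P.T @ P, np.eye(4)))  # True
```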