At the beginning of this chapter, we introduced the idea of an inner product. I now want to present the counterpoint to the inner product, which we call the outer product. The inner product was sometimes called the scalar product because you take two vectors and combine them to make a scalar. In the same spirit, the outer product is sometimes referred to as the matrix product because we're going to put two vectors together and form a matrix. So if you have two vectors u and v, which belong to F^n, the outer product is denoted u ⊗ v. That symbol in the middle looks like multiplication with a circle around it; it's the tensor product symbol. And much like the dot product is called the dot product because of the symbol we use, the outer product is also called the tensor product because of the symbol we use here. So we take u ⊗ v, and we create a matrix whose ij entry is u_i times v_j. I'll give you a concrete example in just a moment, but we create an n-by-n matrix of this form. Now, the outer product can actually be defined using matrix multiplication: u ⊗ v is equal to uv^T. I want you to compare this formula with the formula for the inner product. The inner product u · v was u^T v; the outer product u ⊗ v is uv^T. The location of the transpose has moved: with the outer product, the transpose is on the outside of the product, while for the inner product, it's on the inside. You can take that as motivation for why we call these the inner product and the outer product; we're talking about the location of the transpose. Is it on the inside or on the outside? I should caution you that when we work with complex vectors, the tensor product is actually defined to be uv*, much in the same way that the Hermitian product is defined to be u* v.
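As a quick sketch of these definitions in NumPy (the vectors here are just made-up examples), we can build the outer product entrywise via `np.outer` and confirm it agrees with the matrix-multiplication form uv^T:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# The (i, j) entry of the outer product is u[i] * v[j].
outer = np.outer(u, v)

# Equivalently, realize it as matrix multiplication u v^T:
# reshape u into a column (3x1) and v into a row (1x3).
outer_via_matmul = u.reshape(-1, 1) @ v.reshape(1, -1)

print(np.allclose(outer, outer_via_matmul))  # True

# Compare with the inner product u^T v, which is a scalar:
print(np.dot(u, v))  # 32.0
```

Note how the same two reshaped factors, multiplied in the other order, would give the 1-by-1 inner product: the transpose (row versus column) just moves from the inside to the outside.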
We'll focus on real matrices here, but be aware that we can define this for complex matrices as well. These inner and outer products are related to each other, not just in name, but also by a very important formula: if we take the outer product u ⊗ v and multiply it by w, this is the same thing as the inner product v · w times u. That is, (u ⊗ v)w = (v · w)u. So in some respect, outer products and inner products are connected to each other through matrix multiplication. Let's see exactly why that is. We start with the left-hand side, (u ⊗ v)w, and I want you to first convince yourself that this even makes sense. If u, v, and w are all vectors from F^n, where F could be the real numbers or the complex numbers (I don't really care about the field right now), then u ⊗ v is an n-by-n matrix. An n-by-n matrix can be multiplied by a vector in F^n, so the left-hand side is compatible and gives a vector in F^n. On the right-hand side, on the other hand, you have a vector in F^n multiplied by a scalar, and a scalar times a vector gives you back a vector. So this formula is telling you that multiplication by a matrix can be equivalent to multiplication by a scalar, although the vectors in play switch their roles. Now back to our argument: (u ⊗ v)w can be rewritten, using the definition of the tensor product, as (uv^T)w. This is all realized as matrix multiplication, which is associative, so we can re-associate to get u(v^T w). But v^T w is exactly the dot product of v and w. Now, the dot product is a scalar, and typically when it comes to scalar multiplication, we put the scalar in front, so you get (v · w)u.
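The identity above is easy to sanity-check numerically. This is a minimal sketch in NumPy, using arbitrary random vectors (the seed and the dimension are just illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(4)
v = rng.standard_normal(4)
w = rng.standard_normal(4)

lhs = np.outer(u, v) @ w   # an n-by-n matrix times a vector: a vector
rhs = np.dot(v, w) * u     # a scalar times a vector: also a vector

print(np.allclose(lhs, rhs))  # True
```

This mirrors the argument exactly: `np.outer(u, v) @ w` computes (uv^T)w, and associativity lets the scalar v^T w peel off in front of u.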
And that completes the argument. Really, we just played a shell game and moved the transpose between the inside and the outside, and that's where this identity comes from. Written this way, it can be hard to remember, so I actually prefer to think of this identity as (u ⊗ v)w = u(v · w). If you remember it that way, then u, v, and w appear in the same order, and the outer product turns into the inner product. Again, most people write it the other way around just because it's weird to put your scalars behind the vector. It's like writing x2 versus 2x: we're conditioned to write our coefficients in front. It's not required; it's just a convention we follow. So that's the outer product, and now I want to show you a calculation of such a thing. If we take the outer product of u and v, what kind of matrix do we form? On the previous slide I said the vectors have the same length, but in all reality, they don't have to; they can have any lengths whatsoever. You just take all the possible products of the entries. So we take 1 times each of these entries over here. Or, if you want to think of it in terms of matrix multiplication, you have the column (1, -2, 0) times the row (2, -2, 3, -6). Notice that the first one, the column vector, can be thought of as a 3-by-1 matrix, while the other one is a 1-by-4 matrix, so in the end we're going to end up with a 3-by-4 matrix for the outer product. The dimensions of the matrix will be the length of the first vector by the length of the second vector. All right, so when we do that, you just look at all the possible products. You multiply the row by 1.
So we get 2, -2, 3, and -6. We then take the row and multiply it by -2, so we get -4, 4, -6, and 12. And then we multiply everything by 0, so we get 0, 0, 0, 0. That turns out to be the outer product of the two vectors; fairly straightforward, very nice. Now, the case where u and v have the same length is going to be of critical importance to us, and let me show you how the outer product is connected to the special types of matrices we were introducing earlier in this lecture. So let u be a unit vector, and consider the outer product of u with itself. A unit vector means that the length of the vector is equal to one. If you take the outer product of a unit vector with itself, this always creates an idempotent matrix, and as such, the matrix transformation that sends x to Ax will be a projection: projection onto the span of the vector u. So if we ever want to project onto a line, it turns out we can always construct that matrix using this outer product. And the proof is very, very nice. How do we know that A squared is in fact A? That's what it means to be an idempotent matrix. We're considering A times A, which is (u ⊗ u)(u ⊗ u). But what is this outer product? You get (uu^T)(uu^T). If we redo the parentheses, we end up with u(u^T u)u^T. Notice that this friend in the middle, u^T u, is just the dot product of u with itself. And this is the key point about being a unit vector: if u is a unit vector, the length of the vector is one, and if you square both sides, the norm squared is equal to one squared, which is still one, right?
Now, the square of the norm is just u · u, so unit vectors have the property that u · u = 1. This gives us u times the number 1 times u^T, which is just uu^T, and that's the outer product u ⊗ u, which was the matrix A again. So we always get that the outer product of a unit vector with itself creates the projection map onto the line spanned by that unit vector. Let's see a specific example of that. If we have the vector (1, 1), maybe we want to project onto the line spanned by (1, 1), which, be aware, is the line y = x inside the plane R^2. Well, v = (1, 1) is not a unit vector; its length is the square root of 1 plus 1, which is the square root of 2. So in order to get a unit vector, we normalize: we take v and divide it by its length. The normalization gives you the vector (1/√2, 1/√2). For whatever reason, your previous algebra classes always insisted on rationalizing the denominator, with no good reason to do so; if you do that, you end up with (√2/2, √2/2). So that's our vector u in play here. Let's take the tensor product of u with itself. You take all the combinations of √2/2 times √2/2, or, as I like to think of it, 1/√2 times 1/√2, or if you prefer, √(1/2) times √(1/2); they're all the same number. When you compute the outer product of u with itself, you end up with the matrix whose entries are 1/2, 1/2, 1/2, 1/2. There you go. And I claim this is an idempotent matrix. If you're not convinced, let's check it.
If you take this matrix squared, the first row times the first column gives 1/2 times 1/2, which is a fourth, plus 1/2 times 1/2, which is another fourth. One fourth plus one fourth is, drum roll please, one half. And if you do this for each of the entries, you get 1/2, 1/2, 1/2, 1/2, which is the original matrix A. So this is in fact an idempotent matrix, and multiplication by this matrix is projection onto the line y = x.
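The whole example can be reproduced in a few lines of NumPy: normalize v = (1, 1), form A = u ⊗ u, and check both idempotence and the projection onto y = x (the test vector (3, 1) is just an illustrative choice):

```python
import numpy as np

v = np.array([1.0, 1.0])
u = v / np.linalg.norm(v)     # normalize: u = (1/sqrt(2), 1/sqrt(2))

A = np.outer(u, u)            # the matrix [[1/2, 1/2], [1/2, 1/2]]

# Idempotent: A @ A equals A.
print(np.allclose(A @ A, A))  # True

# A projects any vector onto the line y = x, the span of u:
x = np.array([3.0, 1.0])
print(A @ x)                  # [2. 2.] -- a point on the line y = x
```

Note that (3, 1) lands on (2, 2): its components along the line average out, which is exactly what orthogonal projection onto y = x does.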