What I want to talk about here is the QR factorization of a matrix. We've talked about various factorizations of matrices before; in particular, in Chapter 3 we talked a lot about matrix factorizations like the LU factorization, factorization using elementary matrices, and the like. The QR factorization offers yet another important factorization, one that's closely related to the Gram-Schmidt process we've just been talking about. So imagine A is an m by n matrix with linearly independent columns. It doesn't have to be a square matrix, but we do want it to have independent columns; if there is a dependence relationship among the columns, this factorization isn't going to work. Then, if we have independent columns, we can factor the matrix A as a product of two matrices, which we call Q and R, hence the name QR factorization. Q is an m by n matrix, so it has the same dimensions as the original matrix A, but its columns are orthonormal. I want to make a comment here: if Q is an n by n matrix, then it's what we call an orthogonal matrix. We talked about orthogonal matrices in a previous lecture, Section 4.4, I think. Orthogonal matrices can be defined in several ways: they're those matrices whose transpose equals their inverse, and that's equivalent to having orthonormal columns. Now again, our matrix here does not have to be square; it can be any rectangle, so Q is sort of the rectangular analogue of an orthogonal matrix. Moreover, the column space of A will be identical to the column space of Q, so we're not changing the span of the column vectors here. That's Q. R, on the other hand, is going to be n by n, so it will be square. It's an upper triangular matrix, and all of its diagonal entries are positive. In particular, a triangular matrix whose diagonal entries are nonzero is an invertible matrix.
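As a quick numerical sketch (not part of the lecture itself), NumPy's `np.linalg.qr` computes exactly this kind of reduced factorization; the matrix below is just an illustrative choice with independent columns:

```python
import numpy as np

# A 3x2 matrix with linearly independent columns (an arbitrary example).
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])

# Reduced QR: Q is 3x2 with orthonormal columns, R is 2x2 upper triangular.
Q, R = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns of Q are orthonormal
print(np.allclose(Q @ R, A))            # True: the factorization reproduces A
print(np.allclose(R, np.triu(R)))       # True: R is upper triangular
```

Note that Q here has the same 3x2 shape as A while R is square, matching the dimensions described above.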
So R will have an inverse. Now I want to give you a quick explanation of how one constructs this thing; the proof is essentially the algorithm for how one finds the QR factorization. You'll be given the matrix A; that's what you start with. The first step is to apply the Gram-Schmidt process to the columns of A, which gives you an orthonormal basis for the column space. The second step is that Q is formed from this orthonormal basis; that is, the columns of Q are exactly the vectors of the basis you just constructed. Once Q is in hand, it turns out R is pretty easy to compute: all you have to do is take R = Q^T A, and that's it. Of course, if you have a complex matrix, instead of Q transpose you should take Q*, the conjugate transpose, because that's what's needed for the appropriate orthonormality of the columns. The basic reason this works is that if you look at Q^T A, the (i, j) entry of that matrix is the inner product q_i · a_j, where the q_i are the columns of Q and the a_j are the columns of A. So when you take the transpose and work through the matrix multiplication, you get a matrix full of these dot products. I should also mention that the order in which you feed the columns into the Gram-Schmidt process matters, because the process is recursive; if you change the order of the original basis, it changes things a little. So in order to get the matrix R to have all positive diagonal entries, you might have to change the order of the columns. That's not such a big deal; one can do it and make sure the diagonal entries come out positive.
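The two-step recipe above can be sketched directly in code. This is a minimal classical Gram-Schmidt implementation (a sketch for illustration, not a numerically robust one; the function name is my own):

```python
import numpy as np

def qr_gram_schmidt(A):
    """QR factorization via classical Gram-Schmidt.

    Assumes A has linearly independent columns, as the
    factorization requires.
    """
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    for j in range(n):
        # Subtract from a_j its projections onto the previously built q's...
        v = A[:, j].copy()
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]
        # ...and normalize; normalizing this way makes R's diagonal positive.
        Q[:, j] = v / np.linalg.norm(v)
    # R = Q^T A: its (i, j) entry is the dot product q_i . a_j.
    R = Q.T @ A
    return Q, R
```

Because each q_j is a positive multiple of the residual vector v, the diagonal entries of R come out positive automatically in this ordering.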
This will actually become more obvious when we talk about the determinant in the next chapter, because we do similar things there: when you change the order, you change the sign of some things. So let's see an example of this. I want to use the same matrix we've been working with this whole time. Take the matrix you see right here: it's a 4 by 3 matrix whose first column is (1, 1, 1, 1), second column is (0, 1, 1, 1), and third column is (0, 0, 1, 1). The columns will probably look familiar; these were the x1, x2, and x3 that we had earlier. If we apply the Gram-Schmidt process to this, you'll remember that the first column became the all-one-halves vector (1/2, 1/2, 1/2, 1/2); the column (0, 1, 1, 1) became (-3/√12, 1/√12, 1/√12, 1/√12); and (0, 0, 1, 1) became (0, -2/√6, 1/√6, 1/√6). So applying the Gram-Schmidt process to these three column vectors gives you these three columns right here, and as we saw before, this is an orthonormal set of vectors. That's the hard part of the QR factorization: you take the matrix A and apply the Gram-Schmidt process, and that's how you get the matrix Q. Then how do you get R? You use the formula we had before: R = Q^T A. Now, if A is a square matrix, Q will be square as well, and Q^T is just the same thing as Q^{-1}; so what we're essentially doing is solving A = QR for R using the inverse of Q. But since our matrix is not necessarily square, we use the transpose instead, which mimics the inverse in this more general setting. So take Q^T, which you see right here, and multiply it by the original matrix A, going through all the possible products, like (1/2, 1/2, 1/2, 1/2) times (1, 1, 1, 1).
That gives you four one-halves added together, which is four halves, which is 2; that's the first entry. Then for the second one, you take (1/2, 1/2, 1/2, 1/2) times (0, 1, 1, 1), and you get 3/2. And for the last one, you get two halves, which is 1. Go through all the details like this, and it will give you your matrix R. We did position the vectors in such a way that we get all positive diagonal entries; if that hadn't worked out, you might have to permute the order of these columns and then permute the order of these rows accordingly, and that switches up the coefficients so that they come out positive. And this gives us our QR factorization. In the next lecture, we'll talk about an application of the QR factorization relevant to the least squares problem, amongst other things. But to get the QR factorization, you really just apply the Gram-Schmidt process and do a quick matrix multiplication after that. I say quick, but the matrix Q is probably going to have some square roots floating around; since its columns form an orthonormal set, they're not going to be the prettiest vectors you've ever seen. But there are some huge benefits to having that orthonormal set and this orthonormal factorization of the matrix A. And so that concludes Section 4.7. Thanks for watching today. If you liked this video, then please subscribe to help me make more videos in the future. If you have any questions, please put them in the comments below, and if you liked what you saw, please comment about that as well. I will see you next time as we conclude Chapter 4. Thanks for listening. Bye.
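As a postscript, the worked example above can be checked numerically. One caveat if you try this with NumPy: the LAPACK routine behind `np.linalg.qr` is free to return negative diagonal entries in R, so a sign flip is needed to recover the positive-diagonal convention used in the lecture:

```python
import numpy as np

# The matrix from the example: columns (1,1,1,1), (0,1,1,1), (0,0,1,1).
A = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])

Q, R = np.linalg.qr(A)  # reduced QR: Q is 4x3, R is 3x3

# Flip the sign of each column of Q and matching row of R wherever the
# diagonal of R is negative; Q @ R is unchanged, and R's diagonal
# becomes positive, matching the lecture's convention.
signs = np.sign(np.diag(R))
Q, R = Q * signs, signs[:, None] * R

# First column of Q is (1/2, 1/2, 1/2, 1/2); first row of R is (2, 3/2, 1),
# exactly the entries computed by hand above.
print(np.round(Q, 4))
print(np.round(R, 4))
```

With positive diagonal entries enforced, this reduced QR factorization is unique, so it must agree with the hand computation.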