Welcome to this short lecture on QR factorization: how to decompose a matrix A, whose columns are linearly independent, into two new matrices Q and R. Q is going to be very nice because all of its column vectors are orthonormal: they are orthogonal to each other and of unit length. R is going to be an upper triangular matrix.

I'm Dr. Juan Klopper; amongst other things, I'm a research fellow at the School for Data Science and Computational Thinking at Stellenbosch University. As you can see, I'm using a Pluto notebook, so we can use the Julia language, an exceptional language when it comes to computational thinking. If you're not already familiar with Julia, I urge you to learn more about it: I have a bunch of videos right here on YouTube, and even a university-level massive open online course on the Coursera platform for which you can earn a certificate.

I'm going to assume that you have a bit of knowledge about the following: vector arithmetic; the projection of a vector onto another vector, which is going to be very important; linear independence, because we want all our column vectors, initially at least, to be linearly independent so that they form a basis for a vector space, or at least a subspace of one, with respect to our matrix A; the rank of a matrix, because that helps us make sure that the column vectors are indeed linearly independent; orthogonality; and normalization, by which I mean that you can normalize each of those column vectors.
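As a quick preview of that claim, here is a minimal sketch in Julia, assuming the 3×3 example matrix that appears later in this lecture; it uses the qr function from the LinearAlgebra standard library:

```julia
using LinearAlgebra

# The example matrix from later in the lecture: columns are linearly independent
A = [1 0 0; 1 1 0; 1 1 1]

F = qr(A)          # QR factorization from the LinearAlgebra standard library
Q = Matrix(F.Q)    # materialize the compact Q factor as an ordinary matrix
R = F.R

@show Q' * Q ≈ I(3)   # orthonormal columns: QᵀQ is the identity
@show istriu(R)       # R is upper triangular
@show Q * R ≈ A       # the factorization reconstructs A
```

The check QᵀQ = I is exactly the statement that the columns of Q are orthonormal.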
The Julia packages we're going to use in this notebook are LinearAlgebra and RowEchelon. If you're setting up your environment (and remember, I have a video showing how to set up an environment for each and every one of your projects), never just use the base installation of Julia for everything; rather, create an individual environment for each project and install only the packages that you need for that specific project. For this project, showcasing a bit of linear algebra using Julia, I'm using the RowEchelon package, which I had to install, and LinearAlgebra, which is part of Julia itself.

So here's our matrix: three column vectors, each with three components. They are in fact linearly independent, which means they span all of three-space, but very often you're going to have more rows than columns, so the columns will only span a subspace. Setting up A is very easy in the Julia language: we enter the values row by row, leaving little spaces between the elements, with every new row denoted by a semicolon, and assign that to the variable A. Note that in a Pluto notebook the output of a cell appears just above the line of code.

Just to show that those three column vectors are linearly independent, we can of course set up the homogeneous system: if they are linearly independent, the only solution to the linear system is the zero vector. Or, because the matrix happens to be square in this instance, we can use elementary row operations to reduce it to reduced row echelon form through Gaussian elimination; if it ends up being the identity matrix, those three column vectors are linearly independent. You can see I'm using the RowEchelon package there; I always note, the first time I use a function, which package that function comes from. So I call rref, the reduced row echelon form function, pass A as my argument, and I get the identity matrix. I can also use the rank function from the LinearAlgebra package: pass A as the argument and we see that A has a rank of three; in other words, those columns are linearly independent.

In the LinearAlgebra package there's also a qr function. Pass A to it and it returns two matrices, so we assign the result to two variables, q and r, and once we do that we see both of them. Remember, you're going to find these little floating-point artifacts: anything times 10 to the power of negative 16 means, in computer terms, a zero. We see that each of the three column vectors of q is of unit length, and they are all orthogonal to each other, so this is an orthonormal basis for three-space as far as our matrix A is concerned. You see r there as well, and it's upper triangular, with zeros below the main diagonal. Just to confirm: if we multiply q and r, we get back our original matrix, with the proviso that those entries times 10 to the power of negative 16 are really zeros.

So, if we have a bunch of basis vectors, with each column in A part of a basis for either the space or a subspace, we have to choose one of them to start off with, and here we're going to keep things easy and just start with the first one. The notation I'm going to use: the column vectors of A are a₁, a₂ and a₃, and the new basis is u₁, u₂ and u₃, where u₁, u₂ and u₃ are mutually orthogonal; when I make them unit length, I put a little hat on them, as in equation (2).
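The independence checks just described can be sketched like this (the variable names are mine, and rref from the third-party RowEchelon package appears only in a comment, since it has to be installed into the environment first):

```julia
using LinearAlgebra

A = [1 0 0; 1 1 0; 1 1 1]

# Rank check: rank 3 means the three columns are linearly independent
@show rank(A)

# Homogeneous system A x = 0: the only solution should be the zero vector
x = A \ zeros(3)
@show x ≈ zeros(3)

# With the third-party RowEchelon package (install it into the environment first):
#   using RowEchelon
#   rref(A)    # reduces A to the identity matrix, confirming independence
```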
Choose the first one, a₁, to be u₁, because all the others will subsequently have to be orthogonal to it. You could just grab any one from the subspace you're dealing with, but take one of the ones you have; for me, column vector one, a₁, is going to be u₁. And if I divide that by the norm of a₁, then I have a unit vector, which I'll call û₁.

So I extract the first column vector using index notation: you can see there A, then square brackets, a colon (meaning all the rows), a comma, and the first column. That's just going to be (1, 1, 1), and that's what we see there; if we click on that little downward arrow, we see it is indeed the column vector (1, 1, 1). I assign that column vector to u₁. Then, to get the magnitude of the vector, I use the norm function, passing u₁ to it, and we see the numerical approximation of the norm. Finally, û₁ (remember, that's the one with the little hat on) is u₁ divided by the norm of u₁. And there we have it: the first vector in our new orthonormal basis, and it is of unit length.

Now I'm going to go to a little Google drawing, because I want to remind you what the projection of a vector onto another vector is and how beautifully that helps us in this Gram–Schmidt process. Let's have a look. In my little drawing I have my vector a₁, the orange one at the bottom, and a₂. They're not the same as the vectors in the problem we're dealing with; it's just a schematic to remind you of the idea of a projection. I want the projection of a₂ onto a₁, and remember, we've chosen a₁ as u₁. What we're looking for is another vector in the space that is orthogonal to this one.
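A sketch of this first step in Julia, assuming the matrix A from earlier (the names u1 and u1_hat are just illustrative):

```julia
using LinearAlgebra

A = [1 0 0; 1 1 0; 1 1 1]

a1 = A[:, 1]               # first column: all rows, column 1, i.e. [1, 1, 1]
u1 = a1                    # choose u₁ = a₁
u1_hat = u1 / norm(u1)     # normalize: û₁ = u₁ / ‖u₁‖

@show u1_hat               # each entry ≈ 1/√3 ≈ 0.577
@show norm(u1_hat) ≈ 1.0   # unit length
```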
If you think about the projection of one vector onto another: you see this component, the pink one, along a₁; that's the component of a₂ along a₁. And then we have this very beautiful pink vector going up to the left, which is orthogonal to it: that's u₂. By the way we set up these projections, the projection and this remaining piece are orthogonal to each other, so we might as well make that piece u₂. And remember: this pink one at the bottom, the projection, plus u₂ just gives me back a₂. That's what you see at the top: a₂ is the projection of a₂ onto u₁, plus u₂. (I'm putting u₁ in there to remind you; it's actually a₁, but a₁ is u₁, so it doesn't matter.) So if you add the projection and u₂, you get back to a₂.

Now, you know what a₂ is, so we can just solve for u₂: it's simply a₂ minus the projection, and then you have u₂. And if you make that into a position vector, taking it back to the origin, you now have the orange vector, a₁, as u₁, and you have u₂, and they're orthogonal to each other. All you need to do is normalize them and then they're orthonormal; as simple as that. And when we get to a₃, everything has to be orthogonal to everything else, so with a₃ you'd better take its projection onto u₂ and onto u₁, because what you really want is for the leftover part to be orthogonal to everything.

Let's get back to the code and I'll show you. Here I'm reminding you how we do these projections: the projection of a₂ onto a₁ (or u₁) is the dot product of the two, divided by the squared norm of the one you're projecting onto, times the one you're projecting onto; that's how we do projections, in case you can't remember. So here we have a₂: I've just extracted the second column, a₂ = (0, 1, 1).
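Using that formula, the computation of u₂ can be sketched as follows (again, the variable names are just illustrative):

```julia
using LinearAlgebra

A = [1 0 0; 1 1 0; 1 1 1]
u1 = A[:, 1]    # u₁ = a₁ = [1, 1, 1]
a2 = A[:, 2]    # a₂ = [0, 1, 1]

# Projection of a₂ onto u₁:  (a₂ · u₁) / ‖u₁‖² · u₁
proj = dot(a2, u1) / norm(u1)^2 * u1

# Solve for the orthogonal part and normalize it
u2 = a2 - proj
u2_hat = u2 / norm(u2)

@show dot(u1, u2)          # ≈ 0: u₂ is orthogonal to u₁
@show norm(u2_hat) ≈ 1.0   # unit length
```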
Now I'm calculating the projection of a₂ onto u₁, and I'm giving it an appropriate variable name. This is how you do the dot product: you use the dot function, with my two vectors, a₂ and u₁, as its arguments, separated by a comma. I then divide by the norm of u₁ squared, as we have in equation (3) up here, and multiply by u₁; that's all I'm doing, and there we go, I get that projection.

But the projection itself is not what we're interested in; we want to use it. Remember that u₂ plus that projection equals a₂, so I can solve for u₂, and that's exactly what we do here: u₂ equals a₂ minus the projection we've just computed. That gives us u₂, and all I have to do now is divide it by its norm, and I have another unit vector.

Equation (5) is what it's all about. I just want to show you: remember, each new vector has got to be mutually orthogonal to everything that came before it. So if I take any aₙ (next time, remember, we're going for a₃, to get u₃), then aₙ is uₙ, the orthogonal part that I want, plus this summation of all the projections onto the ones that came before: I'm cycling from i equals one to n minus one. So for n equal to three it's got to be against u₁ and u₂, not a₃ itself, since I'm working with a₃; that's why the sum only goes to n minus one. Each term is the dot product of aₙ with uᵢ (so u₁ first, then u₂), divided by the norm of uᵢ squared, multiplied by uᵢ itself. And all I have to do, just as before, is solve for uₙ: uₙ is going to be aₙ minus the summation, going from i equals one to i equals two, to stop at n minus one. So if we look at the projection of a₃ onto u₂, we also need
to look at the projection of a₃ onto u₁ as well, just as we have in the summation in equation (5). So let's do that: let's extract the third column vector, (0, 0, 1), and compute the projections of a₃ onto u₂ and onto u₁; we need both of them. And here's our little expansion: u₃, plus the projection of a₃ onto u₁, plus the projection of a₃ onto u₂, equals a₃, as you can see there. All we're doing is solving for u₃, and once we solve for it, this is what we do: u₃ is a₃ minus this projection minus that projection. Once we've done that, we just divide by its norm, and we have our third column vector.

All I'm doing now is putting them all together, as the û's, but remember: that's just Q. And what I'm showing you here: if I take the dot product between each pair of them, then, barring these values times 10 to the power of negative 16 or negative 17 (remember, those are all zeros), I end up with zero, so they are indeed orthogonal to each other. And since I normalized each of them, they are also of unit length. As simple as that.

Now, with QR factorization, I just want to show you the form that R takes: it's upper triangular, and its entries are just these dot products. If you compute it that way, you get R. You will see a difference from the library's result, though, because what we've used here is a simplified method, let's put it that way: it is over the field of the reals, but we've also got to think about working over the field of complex numbers, for instance. So there's actually a bit more to this, I just want to warn you; but when we're dealing with the reals, this is absolutely just an easy set of steps to follow.

What I do want to remind you of is the fact that we used the dot product here only because we were working over the reals, in a plain ℝⁿ space. When we move to things that are a bit more interesting, we use the inner product; remember, the dot product is then a special case of the inner product. So, in actual fact, it's not the dot product but the inner product, and equation (9) is exactly the same as equation (5) from before; it's just that we deal with inner products. If you can define an inner product space, you can do QR decomposition, and that's really powerful. I also show you the general form of what R would look like: not a bunch of dot products but a bunch of inner products. And now you can really expand what you view as vectors; you needn't stick to vectors in ℝⁿ, as long as you can define an inner product and the space you're dealing with meets the assumptions of an inner product space. Then you're good to go.

So that was the Gram–Schmidt process, just showing you how easy it is to use Julia code. As I said, look at the links in the description down below (when I have time, I'll put all those links in); you can learn a lot more about Julia there. Otherwise, go to my YouTube channel, JH Klopper, and you'll find lots of video tutorials on the use of Julia.
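To pull the steps of this lecture together, here is a compact sketch of the classical Gram–Schmidt process as a single Julia function that builds both Q and R. This is my own summary, not the notebook's code; note also that Julia's built-in qr uses Householder reflections rather than Gram–Schmidt, so its Q and R can differ from these by signs, which is one reason you may see a difference from the library's result.

```julia
using LinearAlgebra

# Classical Gram–Schmidt over the reals, as walked through above.
# Returns Q with orthonormal columns and an upper-triangular R with A ≈ Q * R.
function gram_schmidt(A::AbstractMatrix)
    m, n = size(A)
    Q = zeros(m, n)
    R = zeros(n, n)
    for j in 1:n
        v = float.(A[:, j])                  # start from column aⱼ
        for i in 1:j-1
            R[i, j] = dot(Q[:, i], A[:, j])  # dot product with each earlier ûᵢ
            v -= R[i, j] * Q[:, i]           # subtract the projection onto ûᵢ
        end
        R[j, j] = norm(v)
        Q[:, j] = v / R[j, j]                # normalize to get ûⱼ
    end
    return Q, R
end

A = [1 0 0; 1 1 0; 1 1 1]
Q, R = gram_schmidt(A)

@show Q' * Q ≈ I(3)   # orthonormal columns
@show istriu(R)       # upper triangular
@show Q * R ≈ A       # reconstructs A
```

For a complex matrix, the dot products would become inner products with a conjugate, which is exactly the generalization mentioned at the end of the lecture.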