Hi there. In this second video on the fundamental subspaces of a matrix, we're going to look at the column space. Remember, the first video was on the null space, so please go and watch that one first; I'll link it in the description down below, and there'll be a card up here if you're watching on a computer. We'll use similar code, and I will review a little bit about the null space, but it's best you view that video first. We're going to use a Jupyter notebook again, inside of Visual Studio Code as you can see here, and we're going to use the SymPy library. First of all, at the top you'll see "mathematics, Python 3.9.2". I create dedicated Python environments using conda for all of my work, and that's shown in the top right-hand corner if you want to set up your system a similar way; that would be Python 3.9.2. So let's set up those packages. I'm going to use SymPy once again, and I import sympy without any abbreviation, so just "import sympy". That means if I want to use one of the functions inside of SymPy, I have to type out the whole word sympy followed by a dot. Just for your interest, you can see I'm printing the dunder version attribute there, so 1.10.1 as far as SymPy is concerned. And then we want the LaTeX to be printed to the screen as nice mathematical notation.
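The setup just described might look like this in the first notebook cell (your version number will likely differ):

```python
import sympy

# Check which SymPy version is installed (the video uses 1.10.1)
print(sympy.__version__)

# Render results as nice LaTeX-style mathematical notation in the notebook
sympy.init_printing()
```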
So I'm going to call the init_printing function inside of SymPy. Let's start by thinking about a vector in R^n. We're talking here mostly about the geometric interpretation of a vector, but we can also just think of it as an array of numbers, whichever way you want to look at it. R^n can be a representation of each of the columns, or column vectors as we would call them, in a matrix. If we look at our matrix in equation (1), I have a column vector (1, 2, 5), a column vector (3, 3, 1), and a column vector (1, 2, 2). Each one of those, geometrically at least, exists in three-space, and that has to do with the way we set this matrix up, inasmuch as there are only three rows. So if we talk about an m-by-n matrix, as you can see there, and think about Ax = b, then x is always the solution and b is the result. We have a result that is an element of R^m, and here m is three purely because there are three elements in each of these column vectors; there are three rows in my matrix. Critically, we need to think about multiplying A by some vector x, and because we have a square matrix here, x will also be an element of R^3. But Ax is a linear combination of the column vectors: it's x_1 times the first column vector plus x_2 times the second column vector plus x_3 times the third column vector, and that gives us the result for some solution x. Now, what we have to think about is this vector b, the result vector: what values can it take? If we think about all of three-space, because our example here is really just three-space, what region of that geometric space can this
result vector b take? That is what we want to determine. First, a little revisit of the null space. We're going to create a matrix here, and as you can see I'm again using the sympy.Matrix constructor: I'm passing a Python list and then using the .reshape method, and I want that reshaped as a three-by-three matrix. That's one way to go about it. You can see we have assigned to the computer variable A the matrix that we had in equation (1), so you can see the column vectors there: (1, 2, 5), (3, 3, 1), (1, 2, 2). Now this is a matrix in which the three column vectors are linearly independent: not one of them is a linear combination, that is, an addition of constant multiples, of the others. We can look at the rank to confirm that, using the rank method in SymPy. Calling A.rank(), we see a rank of three, which means there are three linearly independent columns, or, as a mathematical textbook might put it, there are three pivots. Now we can set up the homogeneous system, so we'll have the zero vector as the result, and see if the only solution for x is the zero vector; if that is so, we know we have linear independence. We also remember the fact that the rank plus the nullity must equal the number of columns, and we have a rank of three and three column vectors; therefore the nullity is zero.
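As a sketch, creating the matrix from equation (1) and confirming its rank and (empty) null space:

```python
import sympy

# Matrix from equation (1), built from a flat list and reshaped to 3x3;
# its columns are (1, 2, 5), (3, 3, 1) and (1, 2, 2)
A = sympy.Matrix([1, 3, 1,
                  2, 3, 2,
                  5, 1, 2]).reshape(3, 3)

print(A.rank())       # 3: three linearly independent columns (three pivots)
print(A.nullspace())  # []: only x = 0 solves Ax = 0, so the nullity is zero
```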
We're not going to have any basis vectors when it comes to the null space. And if we call the nullspace method on our matrix A, we get back an empty Python list, which means we don't have a vector that spans any space. The null space contains only the zero vector, and remember, the zero vector does not span any space. So let's then dive into this column space. Remember we said it is the result b that we're after: what values can b take? Thinking of three-space, can it be any vector in three-space, or is it constrained to some plane, or even some line? So what we're going to call here on A is the columnspace method. Very fortunately, SymPy gives us a method called columnspace, and you can see it just returns the three column vectors for me as is: they are linearly independent, in other words they are a basis for the whole of R^3. There are three of them, which is what I require as a set of basis vectors to span the whole space, and they are one version of a basis; since this is three-space, you could of course also use the three standard basis vectors, which will also span R^3. And that's something I think we have to talk about a little bit, and that is this idea: if we do elementary row operations on our matrix, is that going to affect the column space? The end result of elementary row operations, with Gauss-Jordan elimination, is the reduced row echelon form; that's about as many elementary row operations as you can perform. So let's do that. We've seen the rref (reduced row echelon form) method, so I'm going to call that on A, and it's going to return two things: the reduced row echelon form of our matrix, and the pivot information. It returns a tuple, and the zeroth element of that tuple is the
reduced row echelon form of the matrix. The second element in that tuple is itself a tuple of the column indices that contain the pivots, and that's not what I want here, so I'm going to use indexing: I just want the zeroth element of the returned tuple, so that I only get the reduced row echelon form, and I'm going to assign that to a variable called A_rref. And as you can see, as we suspected, we get back the identity matrix. We have a square matrix in which all three columns are linearly independent, so if we do Gauss-Jordan elimination we end up with the identity matrix, and that is exactly what we get. Now let's have a look at the basis vectors for the column space of this matrix that has undergone elementary row operations, and what we get back are the three standard basis vectors. Of course they span R^3, and that is just a different set of basis vectors from the three we had before. In our instance, row reduction does not change the space that is spanned, but this is a very special case. So I want you to remember, and this is very important: elementary row operations are going to affect the column space. They are going to give you a different column space once you do any kind of elementary row operations. As before with the null space, we're going to go through all the kinds of matrices you might come across. Our second one here is going to be a square matrix, but now we don't have linear independence: I've created the third column as the addition of the first two column vectors, so we definitely have linear dependence here.
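Before examining the dependent case, here is a sketch of the column-space and rref calls just made on the full-rank matrix A:

```python
import sympy

A = sympy.Matrix([1, 3, 1,
                  2, 3, 2,
                  5, 1, 2]).reshape(3, 3)

# Basis for the column space: the three original (independent) columns
print(A.columnspace())

# rref() returns (matrix, pivot-column indices); [0] keeps just the matrix
A_rref = A.rref()[0]
print(A_rref)               # the 3x3 identity, since A has full rank

# Column space of the row-reduced matrix: the standard basis vectors of R^3
print(A_rref.columnspace())
```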
Any one of these column vectors can be written in terms of the other two. So if I compute the rank, I'd better get back a rank of two, and indeed we have a two there. Now this has consequences for us. Remember that I have three columns and a rank of two, which means I must have a nullity of one, so there's going to be a vector that is a basis of the null space. But what does that do for our column space? Well, we can no longer have a basis that spans all of R^3; we're only going to get two vectors back. What columnspace does is return the first two column vectors that are linearly independent of each other, and the first two it found were the first and second ones, or the zeroth and first ones in Python indices. You can see them there: (1, 2, 1) and (2, 4, 1). If you think about those two vectors in three-space, they span only a plane through the origin, and that plane, in inverted commas, is an "angled" plane: it's not the xy, yz, or xz plane. If I take this matrix and multiply it by any solution vector x I can think of, I cannot get to all of three-space. There is now a constraint placed upon the possible result vectors b, and that constraint is the column space; the two vectors we see there are a basis for it. Now, what if I were to take the column space of the reduced row echelon form? So now I've performed elementary row operations on A, and you can see the A.rref() method; I use the zeroth index of that, because I only want the matrix back and not the tuple of pivots, and then call columnspace on this. This is me looking at the effect of elementary row operations on my matrix: does that change the column space? And yes, look at that.
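The dependent square matrix and the comparison just described, as a sketch:

```python
import sympy

# Square matrix whose third column is the sum of the first two:
# columns (1, 2, 1), (2, 4, 1) and (3, 6, 2)
B = sympy.Matrix([1, 2, 3,
                  2, 4, 6,
                  1, 1, 2]).reshape(3, 3)

print(B.rank())         # 2, so the nullity is 3 - 2 = 1
print(B.columnspace())  # two basis vectors: an angled plane through the origin

# Column space after row reduction: vectors along the x- and y-axes instead
print(B.rref()[0].columnspace())
```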
I again have two vectors; they are now a basis of the column space of this new reduced row echelon form of the matrix. And we can see the first vector is in three-space and, if you think geometrically, points along the x-axis; the second one points along the y-axis, so any linear combination of them has to be a vector in the xy-plane. But we've just said that the two vectors that are a basis for the column space of the original matrix A span a slightly angled plane through the origin in three-space. So these are not the same subspaces; they are different subspaces. Please remember that if you do elementary row operations on a matrix, you are very likely going to change the column space. It's very rare that we deal with a square matrix that is totally linearly independent; in most cases we deal with matrices that have many more rows than columns, which is quite normal, and then we do affect the column space. So don't make that mistake. Elementary row operations do not affect the null space or the row space (the next video tutorial is going to be about the row space), but they do affect the column space. It's only in the very neat example of a linearly independent square matrix that we don't change the column space; in other instances, be on the watch for changing it. And the crucial bit that you have to realize is the following. Let me create a solution vector x: (2, 2, 1). There we go. Now I can use sympy.MatMul, which is matrix multiplication, with matrix A and the solution vector x.
So I'm doing Ax, and I'm going to get back some b. Now if I were to just do that, it would simply print A and x neatly to the screen for me, so I've got to call the .doit() method there, and that does the actual calculation. I get this result vector b: (9, 18, 6). What I did is take 2 times the first column vector of A, plus 2 times the second column vector, plus 1 times the third column vector; that is what Ax means. I've printed that for you in equation (3): 2 times the first column vector plus 2 times the second plus 1 times the third. We remember, though, from when we set this up, that the third column vector was just the addition of the first two. So I can rewrite this (3, 6, 2), and that's what we see on the second line, as 1 times (1, 2, 1) plus 1 times (2, 4, 1); it's just the addition of those two. If you check: 1 plus 2 is 3, 2 plus 4 is 6, and 1 plus 1 is 2, so nothing changes. But these two column vectors are the same as the first two column vectors. So I've got twice the first column vector plus one more of it, which gives me three times the first column vector; 2 times the second column vector plus 1 times the second column vector, which gives me three times the second column vector; and then there is nothing, zero, left of the third. This means I have a new vector, look at this: (3, 3, 0). That's a different solution vector than the first one; our first one was (2, 2, 1), and now we have a separate one, (3, 3, 0). But look what happens if I do this: I take matrix A and multiply it by this solution vector, and I get back the exact same b. So there is more than one solution vector that gives the same result when acted upon by my matrix.
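The two multiplications just worked through can be sketched as:

```python
import sympy

B = sympy.Matrix([1, 2, 3,
                  2, 4, 6,
                  1, 1, 2]).reshape(3, 3)

x1 = sympy.Matrix([2, 2, 1])
x2 = sympy.Matrix([3, 3, 0])

# MatMul builds the unevaluated product (which just prints A and x nicely);
# calling .doit() performs the actual multiplication
b1 = sympy.MatMul(B, x1).doit()
b2 = sympy.MatMul(B, x2).doit()

print(b1)        # the column vector (9, 18, 6)
print(b1 == b2)  # True: two different solution vectors land on the same b
```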
A times either of them gives me the same b; there was some constraint on b in this instance. And now you can clearly see why we don't have an inverse for this matrix: I can't go back from b, because there are two different solution vectors x, as we've seen here, that map to it. We'll use these terms in a different lecture, when we talk about a different way to look at the multiplication of matrix A with a vector x, namely as a linear transformation of x; two of these solution vectors get transformed to the exact same vector. We don't have all of the space available to us; it's constrained to the subspace that is this plane through the origin, and clearly two different solution vectors x get mapped to the same vector (9, 18, 6). I hope you can see that constraint. Hence this linear transformation, viewed as a function, is not one-to-one, and therefore it's not a bijection and it won't have an inverse. I can't get back, because we've shown there are at least two different vectors to get back to if I tried to take the inverse of A; it's impossible to calculate an inverse of A. That's the nice part of understanding this idea of a column space. So let's move on: we're going to talk about rectangular matrices with more columns than rows. If we look at this first simple Matrix object that we create, it's a three-by-four matrix, as you can see there, via the reshape: I have 12 elements in my Python list, and I reshape them into three rows and four columns. So each of my column vectors, think about it, exists in R^3, and in three-space I need three basis vectors.
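The three-by-four example being introduced here, as a sketch (the entries are read off the results discussed next):

```python
import sympy

# 3x4 matrix: 12 elements reshaped into 3 rows and 4 columns; the third
# column (3, 6, 2) is the sum of the first two
C = sympy.Matrix([1, 2, 3, 1,
                  2, 4, 6, 3,
                  1, 1, 2, 7]).reshape(3, 4)

print(C.rank())         # 3: the columns span all of R^3
print(C.columnspace())  # columns 1, 2 and 4; the dependent third is skipped

x = C.nullspace()[0]    # one basis vector; note it lives in R^4, not R^3
print(x)

# Any scalar multiple of x is still in the null space: C(3x) = 0
print(sympy.MatMul(C, 3 * x).doit())  # the zero vector in R^3
```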
So of these columns, I only need three to be linearly independent, and I have four, so there's no way that at least one of them is not a linear combination of the others; that should just be logical. If I look at the rank, well, I've created this such that three of them are linearly independent, so my rank is going to be three. So again, in this semi-special case, I'm still going to span all of R^3, but I now have four column vectors, and I can express one of them as a linear combination of the others. I have four columns and a rank of three, which means I must have a nullity of one: there must be a basis vector in the null space. But first, let's just look at the column space, and we can see it returns the first three linearly independent columns it can find, and those are the first, second, and fourth ones: (1, 2, 1), (2, 4, 1), and (1, 3, 7). You can see what happened here: look at the third one, (3, 6, 2); 1 plus 2 is 3, 2 plus 4 is 6, and 1 plus 1 is 2. So the algorithm looked at the first column, then looked at the second and said, well, that's not a linear combination of the first; then it looked at the third and said, oh, that is a linear combination of the first two, so it skipped that one; and the fourth, well, it's not a linear combination of those that came before. So it returns those three linearly independent columns, and that is a basis for all of R^3. So again I can span all of R^3, but I've got four columns, so a nullity of one; there must be a vector that is a basis for my null space. Look at that, though: this vector x that is a basis for the null space is an element of R^4, not R^3 anymore. So with this kind of matrix shape, the null space does not live in the same vector space as the column vectors. Now let's have a
look at taking this vector x, the basis vector of the null space, and multiplying it by three, which gives a different vector, (-3, -3, 3, 0). If I multiply this new 3x by A, it still gives me the zero vector, and that's what we mean by a basis: any scalar multiple of it, and, if there were more than one basis vector for the null space, any linear combination of them, lies in that subspace. If I take matrix A and multiply it by any vector in that subspace, the result is always the zero vector. That should be quite clear. Once again, let's do elementary row operations here and then call the column space, and we see the standard basis vectors of R^3. So this is once again a special case: I already spanned all of R^3 with my column vectors, inasmuch as three of them were linearly independent and the fourth, whichever of the four it is, must be a linear combination of the others. So I'm still talking about the same column space, R^3. But once again it's not always so, so watch out: elementary row operations will most of the time give you a different column space. Now we get to matrices that we'll see much more commonly: many more rows than columns. In our first instance, we create a simple Matrix object with five rows and three columns, which means that each of these column vectors is an element of R^5. If we look at the rank, we see that it's three, so these are linearly independent column vectors. Now let's call the column space of that, and once again it returns the first linearly independent column vectors it finds. We have a rank of three, and we only have three columns.
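The transcript doesn't spell out the entries of this five-by-three matrix, so here is a hypothetical stand-in with the same shape and rank; the behaviour described is the same:

```python
import sympy

# Hypothetical 5x3 example (the video's exact entries aren't given here),
# built so that all three columns are linearly independent
D = sympy.Matrix([1, 0, 2,
                  0, 1, 1,
                  1, 1, 0,
                  2, 0, 1,
                  0, 2, 3]).reshape(5, 3)

print(D.rank())         # 3: full column rank
print(D.columnspace())  # three vectors in R^5, so a 3D subspace of R^5

# After row reduction the basis vectors have zeros in the last two entries,
# so they span a different three-dimensional subspace of R^5
print(D.rref()[0].columnspace())
```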
So all three of those vectors are returned. Each of those vectors is an element of R^5, but I've only got three of them, so that is a basis for a subspace of R^5; I cannot span all of R^5. Now let's have a look at what happens when we do elementary row operations on this matrix, and as I said, the most extreme form of that is to use Gauss-Jordan elimination until we get the reduced row echelon form. Now look clearly at the difference: these are not the same column space. We have three basis vectors here and another three basis vectors there, and they do not span the same subspace of R^5. Look at the fourth and fifth components, x_4 and x_5: they are non-zero in the first three basis vectors, but here we're looking at standard basis vectors, and there's no way for me to get any non-zero element in x_4 and x_5; they are all zero. So you should clearly see that these are two very different column spaces. Do not do any elementary row operations on your matrix before you look at its column space. Now let's look at a rectangular matrix again with more rows than columns, but this time with linear dependence. Have a look at this one: 1 plus 2 is 3, 1 plus 3 is 4, 1 plus 5 is 6, 2 plus 2 is 4, 2 plus 4 is 6. So the third column is the sum of the first two, and clearly we have linear dependence here. If we look at the rank, those first two columns are linearly independent, so the rank is going to be two. I am definitely going to have a basis vector in the null space, but first let's have a look at the column space, which once again returns the first linearly independent vectors it finds, and now we can see there are only two. So my subspace is now even more constrained as far as R^5 is concerned.
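The dependent five-by-three example, with the null-space check that follows, sketched in SymPy:

```python
import sympy

# 5x3 matrix whose third column is the sum of the first two:
# columns (1, 1, 1, 2, 2), (2, 3, 5, 2, 4) and (3, 4, 6, 4, 6)
E = sympy.Matrix([1, 2, 3,
                  1, 3, 4,
                  1, 5, 6,
                  2, 2, 4,
                  2, 4, 6]).reshape(5, 3)

print(E.rank())         # 2, so the nullity is 3 - 2 = 1
print(E.columnspace())  # only two basis vectors: a plane inside R^5

# Row operations change the column space...
print(E.rref()[0].columnspace())

# ...but they do NOT change the null space
print(E.nullspace() == E.rref()[0].nullspace())  # True
```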
I only have these two basis vectors. Once again, let's compute the reduced row echelon form and take the column space of that. Look at the components x_3, x_4, and x_5: there's no way a linear combination of these two standard basis vectors will ever give non-zero values there, but I do have non-zero values there with the original two. So these are not the same column space. Now let's have a look at the null space. Of course I have to have a vector in the null space, because my rank was two and I have three columns, so the nullity is one, and there we can see I do have a vector in the null space. By the way, let's do this (and you could do it with any of the earlier examples): perform elementary row operations on this matrix and then call the null space, and you see nothing's changed; it's still the same basis vector. So the null space doesn't change, and in the next section we'll see that the row space doesn't change either, but the column space definitely changes in most cases; it's only in those special cases that it won't. So, as before, we're going to end off with a little proof. We have to show that this column space is a vector space. For any matrix A of shape m by n, the column space is a subspace of R^m, because there are m rows and my result vector will always be m by 1. And we know that R^m is a vector space; once you've learned about vector spaces, you can show that R^m fulfils all the properties of a vector space. To show that the column space is a subspace, we have to show three things: first, that the column space contains at least one vector, and remember, there's always the zero vector, even though that doesn't span a space.
There's still a vector there. More importantly, we should show that addition holds and that scalar multiplication holds. We know that we can always find the solution x = 0 for the homogeneous system b = 0, so there's always that zero vector b in the column space. Now we want to show that if b_1 and b_2 are in the column space of A, then b_1 + b_2 is also in the column space of A. That's easy enough to prove. By our statement, A x_1 = b_1 and A x_2 = b_2; we've stated that both of those are in the column space. So we write these two equations and do the addition: A x_1 + A x_2 must equal b_1 + b_2, and we know we can rewrite the left-hand side as A(x_1 + x_2). The right-hand side is a new vector, but written that way, this new vector b_1 + b_2 is in the column space of A, precisely because we can write it as A times some vector. And then the same for a scalar multiple, for a c which is an element of the real numbers. By definition b is in the column space, so we can write Ax = b; if I left-multiply both sides by the scalar c, we can rewrite c(Ax) as A(cx), from the properties of matrix-vector multiplication, and that equals the new vector cb. So cb is also in the column space. We've shown that this column space is a vector space: it's a subspace of R^m in the examples we've looked at. I hope you learned something from this video on the column space. Remember, it's all about the result vector b; that's what the column space is about.
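Before closing, the closure argument above can be checked numerically with the dependent matrix from earlier:

```python
import sympy

B = sympy.Matrix([1, 2, 3,
                  2, 4, 6,
                  1, 1, 2]).reshape(3, 3)

x1 = sympy.Matrix([2, 2, 1])
x2 = sympy.Matrix([1, 0, 1])
b1, b2 = B * x1, B * x2  # b1 and b2 are both in the column space of B

# Closure under addition: A(x1 + x2) = b1 + b2
print(B * (x1 + x2) == b1 + b2)  # True

# Closure under scalar multiplication: A(c*x1) = c*b1
c = 5
print(B * (c * x1) == c * b1)    # True
```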
You've learned that once you do elementary row operations, you're going to get a different column space unless you have a special case. And you can now clearly see, at least as far as square matrices are concerned, why you can't have an inverse when you have linear dependence, and why you'll have a determinant of zero: it's this idea of the matrix transforming a solution vector x into this column space, and that there can be more than one vector x that goes to the same vector b. If you then want to do an inverse, which one of those two are you going to go back to? So you can't; it's a clear way to see why you won't have an inverse. In the next video, we're going to look at the row space of a matrix.