So last time we went over the course outline and some basic definitions: matrices, matrix multiplication, matrix addition, and so on. Then we started discussing vector spaces, and in particular linear combinations and linear independence. Linear independence is a very central concept in matrix theory and linear algebra, so I'll just reiterate it: a set of vectors is linearly independent if the only linear combination of those vectors that gives you the zero vector is the all-zero combination, that is, all the coefficients must be equal to zero. That is the only way you can reach the zero vector. Otherwise the set is linearly dependent, meaning there is a non-trivial linear combination, one in which some of the coefficients are non-zero, whose weighted sum gives you the zero vector.

Today we'll discuss several related concepts, specifically basis and dimension. Last time somebody asked me about linear transformations, and I said that linear transformations are equivalent to matrices and matrices are equivalent to linear transformations; the question was how to define a linear transformation, which is then related to a matrix, and I'm going to talk about that. Then I'll talk about some fundamental subspaces associated with linear transformations, or matrices, and if time permits we will also discuss the notion of the rank of a matrix.

To recall the last thing we discussed in the previous class: a set of vectors v1, v2, ..., vn spans a vector space V if span(v1, ..., vn) = V. That means every vector in V can be written as a linear combination of v1 through vn.

Now, basis. A set of vectors v1 through vn is said to be a basis for a vector space V if it is both linearly independent and a spanning set for V. Some comments about bases are in order. First, a basis is not unique: you can define many different bases for a particular vector space. Second, every v in the vector space V can be written uniquely as a linear combination of the basis vectors. Obviously this is no longer true if you add or delete vectors: if you add vectors to the basis, there is more than one way to represent a vector in the space; if you delete vectors from the basis, there are points in V which cannot be represented as a linear combination of the remaining set. Also, every vector space has a basis.

Another way to say all this is captured by two popular phrases: a basis is a maximal independent set and a minimal spanning set. Maximal independent set means this is the maximum number of linearly independent vectors you can pull out of the vector space V: if you take a basis, take any other vector from V, and add it to the basis, the resulting set becomes linearly dependent. Minimal spanning set means that if you take away any vector from the basis, it no longer spans the vector space: there will be some points in the vector space which cannot be represented as a linear combination of the remaining set.
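As a concrete companion to this definition, here is a minimal numeric check, my own sketch in numpy rather than anything from the lecture: a set of vectors is linearly independent exactly when the matrix whose columns are those vectors has rank equal to the number of vectors.

```python
import numpy as np

def is_linearly_independent(vectors, tol=1e-10):
    # Stack the vectors as columns; the set is linearly independent
    # exactly when the rank equals the number of vectors.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A, tol=tol) == len(vectors)

# (1, 0) and (0, 1): only the all-zero combination reaches the zero vector.
print(is_linearly_independent([[1.0, 0.0], [0.0, 1.0]]))    # True
# (1, -1) and (-2, 2): dependent, since (-2, 2) = -2 * (1, -1).
print(is_linearly_independent([[1.0, -1.0], [-2.0, 2.0]]))  # False
```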
Another way to say this is that an independent set of vectors in a vector space is a basis if and only if no proper superset of it is linearly independent, and a set that spans V is a basis if and only if no proper subset of it still spans the vector space. These are things I have already said; I am just saying them another way.

Now, another concept related to the basis: a vector space is said to be finite-dimensional if there exists a finite set of vectors in V which is a basis for V. In this course we will look exclusively at finite-dimensional vector spaces; if no basis for a vector space has a finite number of vectors, we call the space infinite-dimensional. For example, the set of all polynomials in one variable, say x, forms an infinite-dimensional vector space, because you have 1, x, x^2, x^3, and so on, going all the way up to infinity. Most of the results for finite-dimensional vector spaces do extend to infinite-dimensional ones, but in some cases you have to make extension arguments that are beyond the scope of this course.

Here is one result related to bases; it is called the basis theorem. What do you think it says? Anybody want to guess, or does anybody know what the basis theorem says?

Student: Sir, the number of vectors present in the basis is actually the dimension of the vector space.

Yes, exactly, that is the theorem. It says that if some basis of a vector space has a finite number of elements, then all bases of the vector space have the same number of elements, and this common number is called the dimension of the vector space, denoted dim(V). Essentially, if you find a basis for a vector space and I find a basis for the same space, yours could be different from mine, but the number of vectors you used will be exactly the same as the number I used.

Just to illustrate that there can be different bases with the same number of vectors: the space R^n has dimension n. If I take the case n = 2, then (1, 0), (0, 1) is one basis, and so is (1, -1), (-1, 1), and so is (1, 2), (3, 5). All of these are bases; they span R^2, and they each have two vectors.

Student: Sir, the second example is not a basis, right? Those two vectors are linearly dependent.

Yes, you are right. Okay, now it is a basis, thank you.

Now, if I take C^n, this also has dimension n over the field C, and it has dimension 2n over the field of real numbers. And in n-dimensional real space, the vectors e1, e2, ..., en, where ej has a 1 in the jth position and 0 everywhere else, form what is called the standard basis.
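To connect these R^2 examples to something you can check mechanically, here is a short numpy sketch, again mine rather than the lecture's: a pair of vectors is a basis of R^2 exactly when the 2x2 matrix with those vectors as columns has rank 2. The "corrected" pair (1, -1), (1, 1) below is only a guess at the fix that went on the board.

```python
import numpy as np

# A pair of vectors forms a basis of R^2 iff the matrix having them
# as columns has rank 2; all bases of R^2 have exactly two vectors.
candidates = {
    "(1,0), (0,1)":            [[1.0, 0.0], [0.0, 1.0]],
    "(1,-1), (-1,1)":          [[1.0, -1.0], [-1.0, 1.0]],  # the flawed example
    "(1,-1), (1,1) corrected": [[1.0, -1.0], [1.0, 1.0]],   # a hypothetical fix
    "(1,2), (3,5)":            [[1.0, 2.0], [3.0, 5.0]],
}
for name, vecs in candidates.items():
    rank = np.linalg.matrix_rank(np.column_stack(vecs))
    print(f"{name}: rank {rank} -> {'basis' if rank == 2 else 'not a basis'}")
```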
So now let us prove this result: that if some basis of a vector space has a finite number of elements, then all bases of that vector space have the same number of elements.

The proof is by contradiction. Suppose I found one basis and you found a different basis, and they happen to have different numbers of elements: say v1 through vn is one basis and w1 through wm is a different basis of V. Without loss of generality I can assume n <= m; otherwise I simply switch what I call v and what I call w. So the v-set has no more vectors than the w-set, and what I am going to do is take the bigger set and replace one of its vectors with v1, then replace another of its vectors with v2, and so on.

As a first step, we can replace one of the wi with v1 and still have a basis for V. Why is that true? Suppose

v1 = alpha_1 w1 + alpha_2 w2 + ... + alpha_m wm.

I can always write this, because w1 through wm is a basis for V and v1 is a vector lying in V, so v1 can be represented as a linear combination of w1 through wm. And not all of these alphas can be zero: if they were, the right-hand side would be the zero vector while the left-hand side, v1, is not the zero vector, since no member of a linearly independent set can be the zero vector. So if alpha_k is one of the coefficients that is not zero, then wk can be written as

wk = (1 / alpha_k) (v1 - alpha_1 w1 - ... - alpha_(k-1) w_(k-1) - alpha_(k+1) w_(k+1) - ... - alpha_m wm).

I have just rewritten the first equation; notice there is no wk in this sum. So wk is a linear combination of v1 and the other w's, and this gives us two things. First, we can replace wk by v1 and still span V, because wherever wk is needed we can substitute the expression above. Second, the set w1, ..., w_(k-1), v1, w_(k+1), ..., wm is still linearly independent. Why is that true? I would like you to try to show it, but the argument is simple: suppose beta_1 w1 + ... + beta_(k-1) w_(k-1) + beta_k v1 + beta_(k+1) w_(k+1) + ... + beta_m wm = 0 with not all beta_i equal to zero. If beta_k = 0, this is a non-trivial combination of the w's alone, which is impossible because w1 through wm are linearly independent. If beta_k is non-zero, substitute for v1 its expression in terms of the w's: the coefficient of wk in the result is beta_k alpha_k, which is non-zero, so you end up with a non-trivial linear combination of w1 through wm that gives the zero vector. That means w1 through wm are not linearly independent, which contradicts one of our starting points, namely that w1 through wm is a basis: a linearly independent set that spans V.
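Before moving on, here is a small numeric illustration of this exchange step, my own sketch in numpy with an arbitrarily chosen basis of R^3: solve for the coefficients of v1 in the w-basis, pick a non-zero alpha_k, swap wk out for v1, and confirm the exchanged set still has full rank.

```python
import numpy as np

# Columns w1, w2, w3 form a basis of R^3 (an arbitrary invertible matrix).
W = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
v1 = np.array([1.0, 2.0, 3.0])  # any vector we want to exchange in

# v1 = alpha_1 w1 + alpha_2 w2 + alpha_3 w3: solve W @ alpha = v1.
alpha = np.linalg.solve(W, v1)

# Some alpha_k must be non-zero (v1 is not the zero vector); pick one.
k = int(np.flatnonzero(np.abs(alpha) > 1e-12)[0])

# Replace w_k by v1; the exchanged set still spans R^3 and is independent.
W_new = W.copy()
W_new[:, k] = v1
print(np.linalg.matrix_rank(W_new))  # 3: still a basis
```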
So that is the argument for the first exchange. Now that you have replaced one of the w's by v1, think of this set as your new basis and run the same argument to replace one of the remaining w's with v2. Clearly v2 is linearly independent of v1, so when I write v2 as a linear combination of the new basis, the coefficient of v1 may or may not be zero, but the coefficient of at least one of the remaining w's will certainly be non-zero, and it is that w that gets replaced by v2; you never need to remove v1. Then you continue with v3, v4, and so on.

But unless m = n, what we end up with is a set containing v1 through vn plus some leftover w's. Since v1 through vn spans V, any leftover wi can be expressed as a linear combination of v1 through vn, which means this new set is no longer linearly independent. That is a contradiction, because each exchange was supposed to preserve a basis. So m = n, and that is the proof. Any questions?

Student: Sir, while replacing the wi by the vi, you said that if the coefficient of a certain wi is zero, we take another wi. But while going from v3 to v4 and onwards, what if we run out of non-zero coefficients? That is, there is no wi left whose coefficient is non-zero and which I can replace by a v.

That is not possible. Suppose you have replaced three of the w's, so you now have a set containing v1, v2, v3, and the remaining w's, and you are looking to bring in v4. Now, v4 is linearly independent of v1, v2, and v3, so you cannot express v4 as a linear combination of this set with the non-zero coefficients sitting only on v1, v2, and v3. The coefficient of at least one of the remaining w's has to be non-zero, and that w can be used for the replacement. Is that clear?

Okay, maybe one more remark; I think I have already mentioned this. A property of a basis is that if you take any vector in the vector space V and express it as a linear combination of the basis vectors, that linear combination is unique. This is also something you can show, by the way.

What I am doing right now is really just reviewing some basic concepts from matrix theory that you must have seen in your undergraduate studies, so I am not proving all of these results. Once we get past this background material, I will generally try to prove every result that I put down.
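On that last uniqueness remark, a quick numeric illustration, again my own numpy sketch with made-up numbers: arrange a basis of R^n as the columns of a matrix B; the coefficients of any vector v in that basis are the unique solution of B c = v, because B is invertible.

```python
import numpy as np

# Basis (1, 2), (3, 5) of R^2 as the columns of B.
B = np.array([[1.0, 3.0],
              [2.0, 5.0]])
v = np.array([4.0, 7.0])

# The coordinates of v in this basis: the unique solution of B @ c = v.
c = np.linalg.solve(B, v)
print(c)      # [1. 1.]: v = 1*(1,2) + 1*(3,5), and no other combination works
print(B @ c)  # [4. 7.]: reconstructs v
```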