Now that we have understood this idea of a basis, we are going to make a very important claim, a statement that we are going to prove. In proving this statement, we shall be making use of a result that we proved quite a few lectures back; I hope you still remember it. You will see an intimate connection between results stated in terms of matrices and results that have deep implications even in abstract vector spaces whose elements need not look like n-tuples of numbers. We will see that connection shortly. So what is the result we are after? Here it is. Suppose S = {v1, v2, ..., vm} is contained in V with span(S) = V, which is another way of saying that S happens to be a generating set for the vector space V. Now pay careful attention to what is being said: then any set S tilde contained in V which has more than m elements must be linearly dependent. Quite a startling result at first glance. Because what we are saying, essentially, is this: take any generating set for a vector space V and look at its cardinality (the cardinality of a set is the number of elements in that set). Now take any other set, not necessarily one constructed from S by adding or removing elements; it is just another set, which may or may not have anything in common with S. But if it comes from the same vector space V and contains more elements than there are in the generating set, then there is no way this set S tilde can ever be linearly independent. This will have profound implications in everything we study hereafter. So let us look at a proof of this claim. We will do something similar to what we have done before. Here is what we are going to do.
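In symbols, the claim just stated reads as follows (a restatement using the lecture's own notation):

```latex
\textbf{Claim.} Let $S = \{v_1, v_2, \dots, v_m\} \subseteq V$ with
$\operatorname{span}(S) = V$, i.e.\ $S$ is a generating set for $V$.
Then every set $\tilde{S} \subseteq V$ with $|\tilde{S}| > m$
is linearly dependent.
```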
We are going to say: every vector in S tilde also comes from V, and therefore it must belong to span(S). That is exactly the idea we are going to use. But then you will see an interesting consequence of a result we proved a few lectures back kicking in and helping us on our way. So suppose S tilde = {s1, s2, ..., sn} with n > m, because that is the first thing we know about S tilde: it contains more elements than the generating set, and the number of elements in the generating set we have assumed to be m. Needless to say, m and n are finite. Note what we are assuming about S: it is a generating set, so everything in it is possibly more than I need to cook up any vector in this entire vector space. I may make do with fewer, but this is the most I need. I have said nothing about the linear independence of S specifically; I have only said it is a generating set. Pay attention to that point. It is an interesting point that was raised, but we have made no restrictions there. The first part of the claim does not talk about the linear independence, or lack thereof, of this set, and yet we will end up with something very interesting. The second part talks about linear independence. So what is it that connects the two? Here is what. Since S is a generating set for V and sk belongs to V for every k, what can I write? Each sk must be some linear combination of the vectors in S: sk equals the sum of alpha_ik times vi, with i going from 1 through m.
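Written out, the step just described is:

```latex
s_k = \sum_{i=1}^{m} \alpha_{ik}\, v_i , \qquad k = 1, 2, \dots, n ,
```

for some scalars $\alpha_{ik}$, which exist precisely because each $s_k$ lies in $\operatorname{span}(S)$.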
What I am required to show is that some non-trivial linear combination of these sk's vanishes. There might be multiple possibilities, but if I can exhibit one non-trivial linear combination of the sk's that vanishes, then I am done. So consider the sum of beta_k sk, with k going from 1 through n, set equal to 0. Let me just write the whole thing down, opened up; it is good that I have space at the bottom. The first term, beta_1 s1, is beta_1 times (alpha_11 v1 + alpha_21 v2 + ... + alpha_m1 vm); I hope I have not messed up the indices. The second term is beta_2 times (alpha_12 v1 + alpha_22 v2 + ... + alpha_m2 vm). We will have n such terms; I have written only the first two, so let me go over to the last: beta_n times (alpha_1n v1 + alpha_2n v2 + ... + alpha_mn vm). All of this must equal 0. What I am now going to do is pull together the terms involving v1 and write them together, then the terms involving v2, and so on and so forth. If you will excuse me writing the coefficients to the right and the vector to the left, it does not make much of a difference. So what is the coefficient of v1? It is alpha_11 beta_1 + alpha_12 beta_2 + ... + alpha_1n beta_n. And the coefficient of v2? It is alpha_21 beta_1 + alpha_22 beta_2 + ... + alpha_2n beta_n.
And so on and so forth, until I come to the last vector, vm, whose coefficient is alpha_m1 beta_1 + alpha_m2 beta_2 + ... + alpha_mn beta_n; the whole sum equals 0. You might at this stage ask: why am I going through this? After all, I have not invoked any condition that v1 through vm be linearly independent. If I had, I could simply say: this is 0, so each coefficient has to be 0. But for the claim I am about to make, I do not need that linear independence. Remember my original question at hand: it has to do with the expression sum of beta_k sk = 0. As long as I can find a bunch of betas, at least some of them non-zero, which lead to that condition, I am done. So let me force each of these coefficients to be 0. If I can find some betas which, despite not all being zero, make each of these coefficients 0, what does that mean? Within this expression, what is hidden is essentially the original expression. So follow this point very carefully: I am not assuming linear independence of the v's. Let us call these coefficients gamma_1, gamma_2, and so on; let me use a different colour: this coefficient is gamma_1, this one is gamma_2, and the last fellow is gamma_m. If somehow I can ensure that, with betas not all zero, all of these gammas vanish, what have I cooked up then? Have I not cooked up a non-trivial combination of the sk's that vanishes? If I want the sum to go to 0, one choice I can make is to make all the gammas 0. In other words, that is what I was shooting for.
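Collected vector by vector, the computation just done on the board reads:

```latex
0 = \sum_{k=1}^{n} \beta_k s_k
  = \sum_{k=1}^{n} \beta_k \Bigl( \sum_{i=1}^{m} \alpha_{ik} v_i \Bigr)
  = \sum_{i=1}^{m} \underbrace{\Bigl( \sum_{k=1}^{n} \alpha_{ik} \beta_k \Bigr)}_{\textstyle \gamma_i}\, v_i .
```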
Now, this condition has been shown to be equivalent to the original one. I do not know anything about the linear independence of the v's, so I should not use that property. But I do know that any vector multiplied by 0 gives 0, and that a sum of zeros is again 0. So if I can make each of these coefficients 0 with betas not all zero, I have shown that there exist non-zero betas which drive the combination to 0. That is the reason I am able to bypass, indeed not even consider, the linear independence or lack thereof of the vectors v1 through vm. I do not know it. But I must be able to ensure that there is some non-zero choice of betas which drives each of these gammas to 0, and the question is what guarantees that I can do it. So I am going to write these conditions in the following form; pay attention to it. Take the matrix with rows (alpha_11, alpha_12, ..., alpha_1n), (alpha_21, alpha_22, ..., alpha_2n), down to (alpha_m1, alpha_m2, ..., alpha_mn), multiplying the column (beta_1, beta_2, ..., beta_n), set equal to 0, with not all beta_k equal to 0. If such betas exist, I have shown exactly what I wanted to show: S tilde is linearly dependent. I am not saying this is the only way to show it, but it is definitely one way, because all I require is to cook up one non-trivial combination, one set of betas, not all zero, which leads to this being 0. This may not be the only way, but I do not care; I am just trying to show that there do exist betas, not all zero, which combine these sk's from S tilde and give 0. That is what I will have shown. But now look at this matrix. What does it remind you of? See, we were dealing with an arbitrary vector space; we never assumed these objects were n-tuples of numbers. Yet eventually the question boils down to one about matrices. What do we know about this matrix? What kind of matrix is it?
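The system of conditions gamma_1 = gamma_2 = ... = gamma_m = 0 described above is exactly the m-by-n matrix equation:

```latex
\begin{pmatrix}
\alpha_{11} & \alpha_{12} & \cdots & \alpha_{1n} \\
\alpha_{21} & \alpha_{22} & \cdots & \alpha_{2n} \\
\vdots      & \vdots      &        & \vdots      \\
\alpha_{m1} & \alpha_{m2} & \cdots & \alpha_{mn}
\end{pmatrix}
\begin{pmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_n \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix} ,
\qquad \text{with not all } \beta_k = 0 . \tag{$\star$}
```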
The one written in terms of the alphas: it is a fat matrix. What does a fat matrix have? It has more columns than rows, and therefore, just to refresh your memory of the earlier argument, it can have at most m leading ones; its row rank, and hence its rank, is at most m. Since n is greater than m, it must have at least n minus m free variables. And if it has free variables, there exists a non-zero solution to the equation Ax = 0. So, calling this equation star: since m is strictly less than n, star has a non-zero, that is non-trivial, solution, and we are done; you can just write up the next few steps. The point is: since m < n, there exist betas, not all zero, that make this vanish. If this vanishes, each of these coefficients vanishes; if each coefficient vanishes, the combination of the sk's vanishes, with not all of the betas zero. Therefore the set S tilde, which contained more elements than a generating set, must be linearly dependent. See, that is why it is very important to follow this, and I thank you for asking whether I would assume the linear independence of the generating set: I did not. It is a very important point. Anything that contains more elements than a generating set cannot be linearly independent. I do not care about the linear independence of the generating set that I started with; as long as it is a generating set, anything containing more elements than it must be linearly dependent. So now, immediately from this, we will see some consequences right away. Suppose V admits a finite basis. Let B1 and B2 be two bases for V; then the cardinality of B1 must be equal to the cardinality of B2.
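As a small numerical illustration of this argument (not part of the lecture; the matrix entries and the elimination routine are my own example), here is a sketch in Python that finds a non-trivial solution of a fat system by Gaussian elimination, which is exactly the free-variable argument used above:

```python
from fractions import Fraction

def null_space_vector(A):
    """Return a nonzero x with A x = 0, for an m x n matrix with n > m.

    Row-reduce: with more columns than rows there are at most m leading
    ones, so at least one column has no pivot (a free variable), and a
    nontrivial solution always exists."""
    m, n = len(A), len(A[0])
    M = [[Fraction(v) for v in row] for row in A]   # exact arithmetic
    pivot_cols = []
    r = 0
    for c in range(n):
        pr = next((i for i in range(r, m) if M[i][c] != 0), None)
        if pr is None:
            continue                        # no pivot in this column
        M[r], M[pr] = M[pr], M[r]           # swap pivot row up
        M[r] = [v / M[r][c] for v in M[r]]  # scale pivot to 1
        for i in range(m):                  # clear the rest of the column
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        pivot_cols.append(c)
        r += 1
        if r == m:
            break
    free = next(c for c in range(n) if c not in pivot_cols)
    x = [Fraction(0)] * n
    x[free] = Fraction(1)                   # set the free variable to 1
    for row_idx, pc in enumerate(pivot_cols):
        x[pc] = -M[row_idx][free]           # back out the pivot variables
    return x

# m = 2 generators, n = 3 vectors: the alpha matrix is fat (2 x 3),
# so equation (star) has a nontrivial solution.
A = [[1, 2, 3],
     [4, 5, 6]]
beta = null_space_vector(A)
print(beta)   # a nontrivial choice of betas
print(all(sum(a * b for a, b in zip(row, beta)) == 0 for row in A))  # True
```

The returned betas are exactly a non-trivial combination of the sk's that vanishes, which is what the proof needed.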
When I write this for a set, I am referring to the cardinality, and it may not be finite. If you have a spanning set, you can always keep extending it; and if you have a linearly independent set, we will see that you can always find a way to extend it as well. You may not be able to keep track of all the elements in a basis if the vector space is not finite dimensional, but there is always a way of proceeding, and we will see some interesting examples of such vector spaces. In fact, there is a number attached to all this, and I am going to define it: the number of elements in a basis is called the dimension of the vector space, and all that this result is saying is that the dimension is unique. If it is finite, we can keep a count of it. If it is not finite, you can still keep cooking up more and more elements: if you can give an algorithm for extending a linearly independent set, adding more and more, that paves the way for cooking up a possible basis, even if you cannot write down exactly what that basis is. Look at the way a basis is defined. The first requirement is that it must be a generating set. So if you want to guarantee the existence of a basis, you first have to guarantee the existence of a generating set; you need some procedure that ensures there is a set that generates the space. For a finite dimensional vector space you can always come up with a set, and the process eventually terminates; that is guaranteed. But for infinite dimensions, how do you even list it? All you can say is that there is an algorithm, a method, for churning out more and more elements. That is a general process.
You cannot terminate that process at any point; if you could, the space would be finite dimensional after all. For finite dimensions, you can definitely say there is a way to prove that a basis is guaranteed to exist: you start with a linearly independent set and extend. We will get there; hopefully we will have time for that. Now, the equality of cardinalities I will not write out in detail, but I will argue it; please do feel free to ask if you have doubts about the line of reasoning. What is being said here? Both B1 and B2 are bases, so both of them are generating sets and both of them are linearly independent. A sketch of the proof: assume the contrary, say the cardinality of B1 exceeds that of B2 (and you can flip the argument the other way). Immediately the contradiction is evident. Why? Because B2 is a generating set and B1 is a linearly independent set. They are each both of those things, of course, but when I am interested in proving that this cannot be, I invoke the property that B2 is a generating or spanning set and that B1 is a linearly independent set. And I have already told you that a set which contains more elements than any generating set cannot be linearly independent. By the same token, if you assume the cardinality of B2 exceeds that of B1, then you use the fact that B1 is a generating set and B2 is a linearly independent set, and again both properties are contradicted. So if you have two numbers, neither of which is greater than the other, the only possibility is that they are equal. And therefore it does make sense to talk about the dimension of a vector space independently of the basis you have chosen; it does not matter which basis it is.
Whether you have chosen 1, x and x squared as a basis, or I have chosen 5, 7x and 3x squared plus 2 as a basis, of the vector space of polynomials of degree 2 or less, we will both still end up concluding that this vector space has dimension 3. That is a very important conclusion. So we will end this module here. The outline of the proof once more: assume the cardinalities are not equal, so one must exceed the other. In the first case, B2 is a generating set (that is part of the definition of a basis) and B1 is a linearly independent set. But we have just seen that any set containing more elements than a generating set cannot be linearly independent; so B1 fails to be linearly independent, which cannot be if both are bases. By the same token, in the other case B1 is a generating set and B2 cannot be linearly independent. So they cannot both be bases and at the same time have different numbers of elements.
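To make the closing example concrete (a sketch of my own; the coefficient-triple encoding is an assumption, not something written on the board), one can check in Python that both sets from the lecture really are bases of the polynomials of degree 2 or less, and that both have cardinality 3:

```python
# Polynomials of degree <= 2 encoded as coefficient triples (c0, c1, c2)
# with respect to (1, x, x^2).
B1 = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]   # {1, x, x^2}
B2 = [(5, 0, 0), (0, 7, 0), (2, 0, 3)]   # {5, 7x, 3x^2 + 2}

def det3(rows):
    """Determinant of a 3x3 matrix, expanded along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = rows
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Three vectors in a 3-dimensional space form a basis exactly when the
# matrix of their coordinates is invertible (nonzero determinant).
print(det3(B1) != 0, det3(B2) != 0)   # True True: both are bases
print(len(B1) == len(B2) == 3)        # True: same cardinality, dimension 3
```

Two different bases, one common number of elements: that number is the dimension, and the theorem guarantees every basis will report it.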