Another important takeaway: we can give alternative definitions of a basis. A basis is either a maximally linearly independent set or a minimal generating set. What do we mean by minimal or maximal? The number of elements. If you have a basis, you cannot keep the set linearly independent while increasing its number of elements. Why? Because a basis is not just linearly independent, it is also a generating set. The moment you add one more element to a basis, the resulting set contains more elements than a generating set, namely the basis itself, and therefore it cannot be linearly independent. So a basis is a maximally linearly independent set: any linearly independent set in a vector space can have at most as many elements as the dimension of the space. By the same token, it is a minimal generating set. In other words, if you take away one element from a basis and it still remains a generating set, then the original basis contained something superfluous and could not have been linearly independent. If even with fewer elements than a basis you managed to generate the space, how could the basis have been linearly independent? So, with the same results — we are beating the same dead horse over and over — we are generating all of these conclusions. We can say, invoking just one property, that a basis is a maximally linearly independent set, or that a basis is a minimal generating set. It has just the right structure: just enough elements to generate every element of the vector space, and nothing more, so there is no redundancy. And this leads us to another very interesting conclusion.
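A minimal numerical sketch of the two characterizations, assuming numpy is available (the specific vectors are just illustrative): adding any vector to a basis of R^2 cannot raise the rank past 2, so the enlarged set is dependent; dropping a vector leaves a set of rank below 2, so it no longer generates.

```python
import numpy as np

# A basis of R^2 cannot be enlarged and stay independent,
# nor shrunk and stay generating (illustrative check, not a proof).
basis = np.array([[1.0, 0.0], [0.0, 1.0]])   # rows are the basis vectors
extra = np.vstack([basis, [3.0, 4.0]])       # adjoin an arbitrary third vector

print(np.linalg.matrix_rank(extra))    # 2: three vectors in R^2 are dependent

reduced = basis[:1]                    # drop one element of the basis
print(np.linalg.matrix_rank(reduced))  # 1 < 2: no longer a generating set
```

The rank test is a concrete stand-in for "maximal number of independent vectors" here; it only works for subspaces of R^n, not for abstract vector spaces.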
So, from here on we shall mostly be dealing with finite dimensional vector spaces, and I will use the shorthand FDVS for finite dimensional vector space. For a finite dimensional vector space V, any vector v has a unique representation with respect to a given basis. Of course, if you change the basis, the representation changes. A pictorial illustration of this point would again be our favourite, which we understand very well: R^2. Suppose you choose this as your v1 and this as v2 — the very standard basis, the x-axis and the y-axis. On the other hand, I might choose this as w1 and this as w2; that is also a basis. Now suppose I want to represent this vector v in terms of the first basis. What do I need to do? I project v onto the first vector parallel to the second, and onto the second parallel to the first — it is all the parallelogram law, you see. This gives me my alpha_1 and my alpha_2, so v = alpha_1 v1 + alpha_2 v2. On the other hand, if I wanted to write it in terms of the other basis, I would extend parallel to the other vectors, giving me beta_1 and beta_2, so I can also write v = beta_1 w1 + beta_2 w2. Of course, if you choose a different basis the representation varies, no questions asked. But what I am saying is: once you fix the basis, the representation must be unique. How do we go about proving this? We will of course not take the special case of R^2 or R^3 or Euclidean space; we will talk about a general vector space, because that is what the result claims. Again we assume the contrary. Suppose v, from the vector space V, equals the sum of alpha_i v_i, where the basis contains the vectors v1, v2, ..., vn — so it is an n-dimensional vector space, so to say.
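The geometric picture can be checked numerically, assuming numpy; the vector and the second basis below are made up for illustration. Finding the coordinates alpha and beta amounts to solving a linear system whose columns are the basis vectors, and a basis gives an invertible matrix, hence a unique solution.

```python
import numpy as np

# An illustrative vector and two bases of R^2 (v, V, W are hypothetical names).
v = np.array([3.0, 2.0])
V = np.array([[1.0, 0.0], [0.0, 1.0]]).T   # columns v1, v2: the standard basis
W = np.array([[1.0, 1.0], [-1.0, 1.0]]).T  # columns w1, w2: another basis

alpha = np.linalg.solve(V, v)   # unique coordinates w.r.t. {v1, v2}
beta  = np.linalg.solve(W, v)   # unique coordinates w.r.t. {w1, w2}

print(alpha)                     # coordinates in the standard basis
print(beta)                      # different numbers, same vector
print(np.allclose(W @ beta, v))  # True: beta_1 w1 + beta_2 w2 reconstructs v
```

Changing the basis changes the coordinate vector, but for a fixed basis `np.linalg.solve` returns the one and only solution — the computational face of the uniqueness result proved next.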
So, this is what it is. If v has no unique representation, then we can also write v as the sum, i going from 1 through n, of alpha-hat_i v_i, where alpha_i is not equal to alpha-hat_i for some i. At least one of the alpha-hats differs from the alphas — if not all, then at least one — and then we would have a different representation. But now, putting the two expressions for v together, we have the sum of alpha_i v_i equal to the sum of alpha-hat_i v_i, both sums from 1 through n. Bringing everything to the same side, the sum of (alpha_i minus alpha-hat_i) v_i equals 0, and I am saying I am done. Why? The v_i form a basis, and therefore a linearly independent set, and no non-trivial combination can lead to 0. The only possibility is that the coefficients are all trivial, which means alpha_i = alpha-hat_i for all i, contradicting what we assumed earlier. Therefore the representation of a vector with respect to a given basis must be unique. So this part is clear; I can erase this. Let us now delve into subspaces a bit and understand the relationship between vector spaces and subspaces. We are going to talk about subspaces because we already know what subspaces are, and it is important that we study these structures with respect to the new ideas we now have about bases and dimensions. So, suppose V, let us say a finite dimensional vector space, has a subspace W sitting inside of it.
Consider — let us not call it B — let us say S = {w1, w2, ..., wm}, a linearly independent set in W. If V \ W is not the empty set, and v belongs to this set, then S ∪ {v} is linearly independent. Let us take a close look at what we are saying: we have a vector space, within it sits a subspace W, and inside the subspace sits a linearly independent set. Now I choose to augment this set with an additional vector — but where do I pick it from? Not from W: I pick it from that part of V which does not overlap with W, which is not there in W. This vector is of course in V, and I am claiming that the resultant, augmented set is linearly independent. Why? Consider beta v plus the sum of alpha_i w_i, i going from 1 through m, equal to 0 — the 0 of the vector space V, which, W being a subspace, is also the 0 of W; I hope that is clear. The point is, to show linear independence it must somehow follow that beta is 0 and each alpha_i is 0, otherwise this cannot be true. So what leads us to that conclusion? Suppose beta = 0; then — obviously, the dreaded word — alpha_i = 0 for all i. Why? Because if beta is 0, what you are left with is a combination of elements from a linearly independent set, by my own claim here, so the alpha_i must be 0. The only case worth checking is beta not equal to 0, which immediately means beta inverse exists. So hit the equation with beta inverse; then what do we have?
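The claim can be sanity-checked numerically, assuming numpy; the subspace and vectors below are made up. Take W to be the xy-plane in R^3, S an independent set inside it, and v a vector with a nonzero z-component, so v ∈ V \ W; the augmented set has full rank.

```python
import numpy as np

# Sketch of the claim: S independent inside a subspace W of R^3,
# v picked from V \ W, then S ∪ {v} is still independent.
S = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 0.0]])   # independent set inside W = the xy-plane
v = np.array([0.0, 0.0, 1.0])     # lies outside W (nonzero z-component)

augmented = np.vstack([S, v])
# rank equal to the number of vectors means linear independence
print(np.linalg.matrix_rank(augmented))   # 3
```

Had v been picked from W instead (say v = (1, 1, 0)), the rank would stay at 2 and independence would fail, matching the role of the hypothesis v ∈ V \ W.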
We have v equal to minus the sum of beta-inverse alpha_i w_i, which can be written as the sum of alpha-hat_i w_i, with alpha-hat_i now defined as minus beta-inverse alpha_i. Is this clear? And again I am saying we have a contradiction. Why? What does this mean? It means v belongs to W, because each of these terms comes from W, so v must also come from the subspace W — which is a contradiction. Therefore we must regress back to the first possibility, beta = 0, and once beta = 0, each alpha_i must be 0. So this, in short, is basically the answer to the question you asked earlier about how to constructively cook up a basis — hold that thought for a moment. Immediately from this I will go to the next important result, which invokes dimensions, and we will see an interesting consequence of it. So this is clear; I can erase this, or maybe just the working and retain the main result. If V \ W is not empty — not phi, the notation for the empty set — then the dimension of V must be strictly greater than the dimension of W. Is this obvious? Why should it be obvious? Just invoke the result we have seen right away. Go back to basics: what is the definition of the dimension? The number of elements in a basis. So we have to start with a basis. Consider B_W, a basis for W, and choose v belonging to V \ W. Such a v is guaranteed to exist. Why? After you have taken away everything that is in W, you will still be left with something to choose from in this set, because it is not empty by assumption.
Now consider B_W ∪ {v} and its cardinality. What is this? It is greater than the dimension of W. But what else do we know about this set? We have just seen, by our previous claim, that B_W ∪ {v} is linearly independent in V. And if you have a linearly independent set inside V, what do we know about the number of elements therein? By maximal linear independence of a basis, it can be at most the dimension of V. So what can you say about the dimension of V? It must be at least this number — I do not know if it equals it, it might be just one more — and this number, the cardinality of B_W ∪ {v}, is strictly greater than the dimension of W. So dim V ≥ |B_W ∪ {v}| > dim W. You never know which result comes in handy when. Sometimes you might be required to show that two vector spaces are equal, and from somewhere you have plucked out the condition that one is contained in the other, and by some circuitous route you have somehow found that they have the same dimension. Then it immediately turns out that you do not have to show both containments: if one is contained in the other and the dimensions are equal, they must be the same vector space. That is what this result says. So, we will finally conclude by providing you a recipe for cooking up a basis starting with a linearly independent set — for a finite dimensional vector space, of course. Oh, I have not written the definition down; I have just said it in words.
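The "containment plus equal dimension implies equality" consequence can be sketched numerically, assuming numpy; the two spanning sets below are made up and both happen to span the xy-plane in R^3. Containment of one span in the other is detected by the rank not increasing when the sets are stacked together.

```python
import numpy as np

# Two spanning sets (rows) for subspaces of R^3; both span the xy-plane.
W = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
V = np.array([[1.0, 1.0, 0.0], [1.0, -1.0, 0.0]])

same_dim = np.linalg.matrix_rank(W) == np.linalg.matrix_rank(V)
# span(W) ⊆ span(V) iff adjoining W's vectors to V does not raise the rank
contained = np.linalg.matrix_rank(np.vstack([V, W])) == np.linalg.matrix_rank(V)

print(same_dim and contained)   # True: same subspace, one check of each suffices
```

By the contrapositive of the result above, once containment holds, equal dimension rules out the containment being strict — no second containment check is needed.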
So, now that we have cleared the air on the fact that the number of elements in any basis must be the same, we can simply say that the dimension of a vector space V is the number of elements in any of its bases. Very interesting results follow. You look at the dimension of a space of n-tuples, but already by this you would know that the vector space C^n over R still consists of n-tuples, yet it has dimension 2n — the field matters. Similarly, there is a very interesting conclusion about R over Q. I will not give you a complete formal proof of this; it is beyond the scope of this course. Can you guess its dimension? Infinity. How do you prove that? Of course, there is something I am going to use which I am not going to prove, but suffice it to say that there are algebraic numbers — algebraic in the sense that they are roots of polynomials whose coefficients are integers, or you can say rational numbers — and then there are numbers with none of those properties: they cannot be written as roots of such polynomials. Such numbers do exist, and they are called transcendental numbers (do not go into transcendental meditation). Two very well known ones are e, the base of the exponential, and pi. So at least we are guaranteed the existence of transcendental numbers. Now consider a transcendental number: it is an irrational number, but it is a real number. So suppose alpha is a transcendental number. Proving that pi or e is transcendental would itself take up a lot of time and is beyond the scope of this course, but we accept the fact that a transcendental number exists. Now suppose further that b is a basis for R over Q and that the cardinality of b equals some finite r. So we are assuming the opposite, the contrary. Then what do we have? What can we say?
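The point that the field matters can be made concrete for the simplest case, C over R: every complex number z is a*1 + b*i with real a, b, so {1, i} is a basis and dim_R(C) = 2; for C^n over R this gives 2n. A small self-contained check:

```python
# C as a vector space over R: the real and imaginary parts are exactly
# the (unique) coordinates of z with respect to the basis {1, i}.
z = 3 + 4j
a, b = z.real, z.imag          # real scalars: coordinates w.r.t. {1, i}

print(a * 1 + b * 1j == z)     # True: z reconstructed from two real coordinates
print((a, b))                  # (3.0, 4.0)
```

Over C itself, the same space C^n needs only n coordinates; the underlying set of tuples is unchanged, only the allowed scalars differ.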
How do we specially choose our set? Let us take 1, alpha, alpha squared, up to alpha to the r. If all of this came from a finite dimensional vector space, with some finite r, then the basis dictates the following: a set containing more elements than the dimension of the vector space — the number of elements in the generating set, which the basis also is — must be linearly dependent. So if the number of powers we take exceeds the dimension of this vector space, then this set, constructed by taking powers of alpha, must be linearly dependent. Therefore there exist rational c_i, not all zero, such that the sum of c_i alpha^i, i going from 0 to r, equals 0. But immediately we have a contradiction: alpha, which we assumed to be a transcendental number, ends up being a root of some such equation, a polynomial with rational coefficients — which it cannot be. So if you assume the basis is finite — that there is a finite generating set — then you can always cook up a set of powers like this containing more elements than that generating set, which must therefore be linearly dependent; but that would contradict the very definition of a transcendental number. So, just to quickly conclude: now that we know what a dimension is, let me mention, sort of like an algorithm, how you can cook up a basis starting with a linearly independent set.
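A small sketch of the dichotomy, assuming the stdlib only; the number and coefficients are illustrative. For an *algebraic* number such as sqrt(2), the powers 1, alpha, alpha^2 do admit a rational dependence (alpha is a root of x^2 − 2), whereas a transcendental alpha admits no such relation — which is exactly why its powers stay independent over Q and force the dimension of R over Q to be infinite.

```python
from math import isclose, sqrt

# sqrt(2) is algebraic: the rational combination -2*1 + 0*a + 1*a^2 vanishes,
# i.e. a is a root of x^2 - 2.  For transcendental pi or e no such rational
# coefficient list exists, by definition.
a = sqrt(2)
c = [-2, 0, 1]                               # rational coefficients c0, c1, c2
value = sum(ci * a**i for i, ci in enumerate(c))

print(isclose(value, 0.0, abs_tol=1e-12))    # True (up to floating-point error)
```

Note the code can only *witness* a dependence for an algebraic number; the non-existence of one for pi or e is the deep fact the lecture takes on faith.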
So, consider S inside V, a linearly independent set. Once you have this S, that is the initialization of the algorithm. The next step: construct the span of S. Of course, S is not a vector space, but span(S) definitely is. Then look at V minus span(S) and ask the question: is V minus span(S) empty? If it is, S is a basis and you cannot go further on this track. If not, what do you do? Choose v belonging to V minus span(S), and define the new S as the old S together with this v. This is of course a slightly mixed-up language, semi-mathematical and semi-coding, so do not hold it against me. Once you have redefined S, where do you go? You go back and construct the span of S again. If it is a finite dimensional vector space, this algorithm is guaranteed to terminate at some point and will eventually spit out a basis. And look at the things we have used: at every step along the way we have used results that we proved today. We have shown that if this set is non-empty and you cook up a new element from it, the augmented set is bound to be linearly independent. And if a set is linearly independent, then it is definitely part of some basis — it can be extended to a basis. So what I am saying is: any linearly independent set can be extended to a basis. Your initial condition is important — start with any linearly independent set in the vector space and repeatedly apply this algorithm. I am not saying how easy or difficult this is going to be, but in general it is a constructive way of obtaining a basis starting from a linearly independent set. And if it does not terminate, then you are dealing with an infinite dimensional vector space. Thank you.
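The algorithm above can be sketched for subspaces of R^n, assuming numpy; the helper name `extend_to_basis` and the candidate pool are my own choices, not part of the lecture. Since a machine cannot enumerate all of V \ span(S), the sketch draws candidate vectors from a known finite generating set (for R^n, the standard basis suffices) and uses a rank test as the membership check "v ∉ span(S)".

```python
import numpy as np

def extend_to_basis(S, candidates):
    """Greedy basis extension (hypothetical helper): starting from an
    independent set S (rows), adjoin each candidate that lies outside
    the current span, detected by the rank going up by one."""
    basis = [np.asarray(s, dtype=float) for s in S]
    for v in candidates:
        trial = np.vstack(basis + [np.asarray(v, dtype=float)])
        if np.linalg.matrix_rank(trial) == len(basis) + 1:
            basis.append(np.asarray(v, dtype=float))   # v is outside span(S)
    return np.vstack(basis)

# Extend the independent set {(1, 1, 0)} to a basis of R^3,
# drawing candidates from the standard basis e1, e2, e3.
S = [[1.0, 1.0, 0.0]]
B = extend_to_basis(S, np.eye(3))

print(B.shape[0])                 # 3 vectors ...
print(np.linalg.matrix_rank(B))   # ... of rank 3: a basis of R^3
```

Each accepted candidate is exactly the step "choose v in V minus span(S)", and termination is guaranteed because the rank can rise at most n times — the finite-dimensional case of the lecture's termination claim.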