In the previous lecture we started looking at products of subspaces, but a couple of questions were asked whose answers I felt I should share with all of you. One of those questions was an interesting one. We had these objects: a vector space V, its dual, and its double dual, and we showed that there is a natural isomorphism between V and its double dual. It is a very important point that if the space is not finite dimensional, you should not expect an isomorphism; in fact, it so happens that for infinite dimensional vector spaces the double dual is a much bigger vector space than the original. If you go through the map I constructed there, you will see that its injectivity is beyond question; however, we did not prove surjectivity directly. Instead we argued that, since the two spaces are of equal dimension, injectivity implies surjectivity. Now, if that equality of dimensions is not available a priori, and it is only valid in finite dimensions, since for infinite dimensional spaces you cannot equate dimensions in this way, then the surjectivity is no longer true. So all of this is very specific to finite dimensional vector spaces, not valid for an arbitrary vector space. We would do well to remember that; that is the first point I would like to get out of the way. The second was the dual map we had defined. Remember, when T is a mapping from some V to U, we defined the dual map T*, which goes from the dual of U to the dual of V.
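To make the dual map concrete, here is a small numerical sketch of my own (not from the lecture): if T is represented by a matrix A, and a functional phi on U by a row vector, then T*(phi) = phi composed with T corresponds to the row vector phi A, and the linearity the lecture asks you to verify can be checked directly.

```python
import numpy as np

rng = np.random.default_rng(0)

# T : V -> U represented by a matrix A, so T(v) = A @ v.
# dim V = 4 and dim U = 3 are arbitrary choices for this sketch.
A = rng.standard_normal((3, 4))

def dual_map(phi):
    """T* phi = phi o T: if phi is a row vector on U, then
    (T* phi)(v) = phi(A @ v) = (phi @ A) @ v, so T* phi is phi @ A in V*."""
    return phi @ A

phi = rng.standard_normal(3)
psi = rng.standard_normal(3)
a, b = 2.0, -3.0

# Linearity of T*, the property you are asked to check:
assert np.allclose(dual_map(a * phi + b * psi),
                   a * dual_map(phi) + b * dual_map(psi))
```

The matrix picture is only a crutch for intuition; the lecture's argument is basis-free.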
Now, from the notation itself you can guess that we are assuming this dual map is linear, but you should not just assume it; you should go ahead and check that it is indeed linear with the definition we gave, that composition with phi, remember. So I leave it to you to check that this is indeed a linear map. That was the second question that was posed, and I thought that before we proceed with what we were doing towards the end of the previous lecture, we would clarify these points. I hope that for those of you who asked, it is now well and truly explained. All right, so we were looking at products of subspaces. Suppose V1, V2, ..., Vk are all subspaces of V, and V is a finite dimensional vector space. Now, I have already defined what this product of subspaces looks like; the point is that we are now going to explore what its dimension is. The first thing you have to prove is that this object V1 × V2 × ... × Vk is, not a subspace, but rather a vector space in its own right. Subject to the rules of vector addition and scalar multiplication that we described in the previous lecture, you should be able to readily check that everything works out, because of the structure inherent in each of these Vi's; the fact that this is a vector space is beyond any doubt. So we have assured ourselves that this is a vector space, and not just that: each of V1 through Vk resides inside some finite dimensional vector space.
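The vector space operations on the product referred to here are componentwise; a minimal sketch in my own notation, with tuples of numpy arrays standing in for elements of V1 × ... × Vk:

```python
import numpy as np

# An element of V1 x ... x Vk is modelled as a tuple of numpy arrays.
def add(s, t):
    """Vector addition on the product: componentwise, slot by slot."""
    return tuple(a + b for a, b in zip(s, t))

def smul(c, t):
    """Scalar multiplication on the product: componentwise."""
    return tuple(c * a for a in t)

s = (np.array([1., 0.]), np.array([2., 2., 2.]))
t = (np.array([0., 1.]), np.array([1., 0., -1.]))

u = add(s, t)
assert np.allclose(u[0], [1., 1.]) and np.allclose(u[1], [3., 2., 1.])
assert np.allclose(smul(2.0, s)[1], [4., 4., 4.])
```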
The next logical question to ask is: is this product also finite dimensional? And if so, what could be a potential way of showing it? Of course, if you can explicitly construct a basis that is finite, you are done, and that is exactly what we are going to do. So suppose BV1, BV2, ..., BVk are bases for V1, V2, ..., Vk respectively, with the cardinality of BVi given by mi; that is, the number of distinct elements in the set BVi is mi. Maybe I should write an explicit expression for these. Let us say BV1 = {v11, v12, ..., v1m1}, BV2 = {v21, v22, ..., v2m2}, and so on, till BVk = {vk1, vk2, ..., vkmk}. That is an explicit way of listing out all the members of the individual bases. The claim will be about the following set: take v11 padded with zeros, then v12 padded with zeros, until v1m1 padded with zeros. How many is that? That is m1 vectors. Notice that each of these belongs to the product: there is an order, where the first entry comes from the vector space V1 and the remaining entries are the zeros of V2, V3, V4, ..., Vk. So these are all individual members of the product, exactly m1 of them so far. Next we carry on: now we take the first entry to be zero and take v21, padding the rest with zeros, and so on; the last one of this batch is (0, v2m2, 0, ..., 0). How many are there? The first batch had exactly m1 members, this one has m2; just keep count, that is the most important thing here. We will likewise have m3 vectors, m4 vectors, and so on, until the last batch, which starts with zeros and takes its nonzero entry from the last list, running from (0, 0, ..., vk1) through (0, 0, ..., vkmk). That batch is mk in number. So there are m1 vectors, m2 vectors, and so on, m3, m4, m5, until mk vectors; that gives the total number of members, the cardinality of this set.
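The padding construction can be sketched in code like so; the bases and dimensions here are my own toy choices, not the lecture's:

```python
import numpy as np

def padded_basis(bases):
    """Given bases B_V1, ..., B_Vk, build the claimed basis of the product:
    each basis vector v_ij placed in slot i, zeros in every other slot."""
    zeros = [np.zeros_like(B[0]) for B in bases]
    out = []
    for i, Bi in enumerate(bases):
        for v in Bi:
            tup = [z.copy() for z in zeros]
            tup[i] = v
            out.append(tuple(tup))
    return out

B1 = [np.array([1., 0., 0.]), np.array([0., 1., 0.])]   # m1 = 2
B2 = [np.array([0., 0., 1.])]                            # m2 = 1

B = padded_basis([B1, B2])
# The count is m1 + m2 = 3, exactly as in the lecture's tally.
assert len(B) == 3
# The first member is (v11, 0): v11 in slot 1, the zero of V2 in slot 2.
assert np.allclose(B[0][1], np.zeros(3))
```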
So let me give this set the name B. The claim is that B is a basis for V1 × V2 × ... all the way till Vk. What is the total number of members in B? The summation of mi, i going from 1 through k. So if this claim is indeed true, then we know that the dimension of the product is equal to the sum of the individual dimensions. What do we need to show in order to prove that B is a basis? Well, as usual, two things: one, that it is linearly independent, and two, that it is a generating set for every vector inside the product. That is what we shall now endeavour to verify. So suppose the summation over j from 1 through m1 of alpha1j times the padded v1j, plus the summation over j from 1 through m2 of alpha2j times the padded v2j, and so on until the summation over j from 1 through mk of alphakj times the padded vkj, is equal to zero. Zero of what? Of course, the zero of the product. What does this mean? It means I have already taken a linear combination of these vectors; please check that if you follow the rules of vector addition and scalar multiplication on the product space, it is already in that form. So if that is so, what does the zero of this vector space look like? It is the tuple (0, 0, ..., 0): the first entry is the zero of V1, the next is the zero of V2, and the last is the zero of Vk. Clear? Now, if the equation is to hold, then each component must individually equal the zero of the respective vector space. This means that the summation over j from 1 through mi of alphaij vij is equal to 0, for i = 1, 2, ..., k; I hope the indices are right, yes they are. Each of those must individually vanish, because each has to equate with the corresponding entry on the right-hand side. But what does this mean? What are these vij after all? From that list I just erased, they are nothing but elements of the individual bases of the vector spaces V1, V2, V3, and so on; therefore they come from linearly independent sets.
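One way to see the independence step numerically, in a toy example of my own: identify the tuple (x, y) in V1 × V2 with the stacked vector [x; y]; then the combination above vanishes only for zero coefficients exactly when the stacked padded vectors have full column rank.

```python
import numpy as np

B1 = [np.array([1., 0., 0.]), np.array([1., 1., 0.])]   # basis of V1, m1 = 2
B2 = [np.array([0., 1., 1.])]                            # basis of V2, m2 = 1
z = np.zeros(3)

# Padded basis of V1 x V2, flattened: (v, 0) -> [v; 0] and (0, w) -> [0; w].
padded = [np.concatenate([v, z]) for v in B1] + \
         [np.concatenate([z, w]) for w in B2]

M = np.column_stack(padded)
# Full column rank m1 + m2 means the only combination giving zero is trivial.
assert np.linalg.matrix_rank(M) == len(padded)
```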
So if you ask for them to be zero, it cannot help but lead to the conclusion that alphaij = 0 for all i, j, which essentially means that whenever you take a linear combination of vectors in the set B, it can only result in zero if all the coefficients are zero. That means B is a linearly independent set. So one part of the proof is quite straightforward. What about the second part; does it require much effort? What do I have to show? I have to show that you can pick any arbitrary object inside the product space, and I should be able to write it as a linear combination of members of the set B. I am not going to write that down completely; I am just going to tell you how it works. Choose an arbitrary (v1, v2, ..., vk) belonging to V1 × V2 × ... × Vk, and observe that vi equals the summation over j from 1 through mi of alphaij vij, for some scalars alphaij, since BVi is a basis, and therefore a generating set, for Vi. And it is done; do you agree that it is done? I am not completing the entire reasoning. The point is, you pick out any of these vi's; they individually come from the respective spaces, this one from V1, this one from V2, this one from Vk, it does not matter which one you pick. Each of them can be written as a linear combination of the vectors in its respective basis, but those are already sitting inside B, only padded with some zeros on this side or that side.
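And the spanning step, in the same stacked toy notation as before (again my own example): any (v1, v2) is recovered as a combination of the padded vectors, with the coefficients being exactly the basis coordinates of v1 and v2.

```python
import numpy as np

B1 = [np.array([1., 0., 0.]), np.array([1., 1., 0.])]   # basis of V1
B2 = [np.array([0., 1., 1.])]                            # basis of V2
z = np.zeros(3)
padded = [np.concatenate([v, z]) for v in B1] + \
         [np.concatenate([z, w]) for w in B2]
M = np.column_stack(padded)

# An arbitrary element (v1, v2) of the product, built from known coordinates:
v1 = 2.0 * B1[0] - 1.0 * B1[1]
v2 = 3.0 * B2[0]
target = np.concatenate([v1, v2])

# Solve M @ alpha = target; the system is consistent, so lstsq is exact here.
alpha, *_ = np.linalg.lstsq(M, target, rcond=None)
assert np.allclose(M @ alpha, target)
assert np.allclose(alpha, [2.0, -1.0, 3.0])   # the original coordinates
```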
So the moment you observe that these entries can be written as linear combinations of the individual bases, there is really nothing left to prove in showing that B is a generating set; it is quite obvious, is it not? Please ask if there is any doubt about this; if not, I am leaving the remainder of the reasoning to you to complete. Is that clear? Okay, good. So then, once we have explicitly exhibited one such basis, it is straightforward to write down the important result: the dimension of V1 × V2 × ... × Vk is equal to the summation of the dimensions of the individual Vi's, i going from 1 through k. I am claiming that this is exactly what I have proved by constructing a basis. So we could let matters rest here in so far as products of subspaces are concerned; you can look through examples of these products, familiar subspaces that you might have seen apart from the Euclidean ones, and check out the different cute aspects of these results. But at this point there is something lurking here that is a bit too tempting to forgo. What other kind of construction do you remember where the dimensions get added up? The direct sum, right? So here is a result that is almost crying out to be investigated. The product's dimension is equal to this sum, and we know that a direct sum's dimension is also equal to this sum; both being finite dimensional, one must then be able to infer that they are isomorphic, because for finite dimensional vector spaces, equal dimension means they must be isomorphic. So we wonder what such a potential isomorphism could be in case you are faced with a direct sum. That will sort of bring the story to a close, and also allow us to relate this product with another object that you are aware of. So I am going to talk about a map, say gamma, which is a mapping from the product; actually, I will not right away define the codomain
as a direct sum; I will just take it as a sum for the moment, and let us see what happens. How is this map given? It is given like so: it takes an object (v1, v2, ..., vk) and maps it to v1 + v2 + ... + vk. So it takes a tuple sitting inside the product and maps it to the sum of its entries. For the time being we have not used the symbol for direct sum; that is because we will ask one or two questions before we attach that direct sum label. So what can we say about this map gamma? Can we say that it is a surjection? If I were to claim that gamma is a surjection, would you object to it; would you be able to object to it? You can of course object to a lot of things, but that does not make the objection valid. The point is, every time I write out an element of the sum like this, it guarantees that such objects exist inside the individual vector spaces, so there must be a pre-image that maps to it. Again, what does surjection mean? That every time I give you an object in the codomain, you should be able to find a pre-image in the domain. The moment I give you an object like this, just go ahead and look at it: the pre-image is quite straightforward to read off. So surjection is really quite straightforward. What is not very obvious is injection. Unless I have imposed the condition that the sum is a direct sum, it is not going to follow directly that gamma is an injection. Why? Because when I ask for injectivity, I am going to investigate the kernel of this map: I am going to ask for those elements of the product which lead to zero in the sum. Now, if it is not a direct sum, then I cannot assume the property that every vector therein is uniquely representable. Remember the definition of the direct sum: every vector in a direct sum has a unique representation. So this zero may not have a unique representation unless, what? Unless it is a direct sum. In fact, if you look at subspaces that have anything other than zero in common, then you go ahead and pick that common object.
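A sketch of gamma in code, with toy vectors of my own: the pre-image of any element of the sum is just the tuple of its summands, which is why surjectivity onto the sum is immediate.

```python
import numpy as np

def gamma(tup):
    """gamma(v1, ..., vk) = v1 + ... + vk, as defined in the lecture."""
    return sum(tup)

# Any w in V1 + V2 already comes written as w = v1 + v2, so the tuple
# (v1, v2) is a pre-image of w under gamma.
v1 = np.array([1., 2., 0.])   # imagine v1 drawn from V1
v2 = np.array([0., 0., 3.])   # imagine v2 drawn from V2
w = v1 + v2

assert np.allclose(gamma((v1, v2)), w)
```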
See what I am saying: suppose any two of these subspaces have some nonzero vector in common, say some vector called p. Then what you do is write the tuple (0, ..., p, ..., -p, ..., 0): put p in one of those two slots and -p in the other. Because p belongs to at least two of the subspaces, -p also belongs to both of them, since a vector space must be closed under vector addition and additive inverses must exist. Therefore this is clearly a tuple whose entries are not all zero, and yet it results in zero under the map. You see the point: unless it is a direct sum, it is not just the zero tuple that maps to zero; there are other elements in addition to it which also map to zero, so you do not get injectivity. Clear on this? On the other hand, if it is a direct sum, then of course no two of the subspaces have anything in common; no matter how many of their intersections you take at a time, they have nothing other than the zero vector in common. So the only way you are ever going to be able to write this zero is by choosing zeros from each of the individual vector spaces in your domain. The kernel is then trivial, which means gamma is injective. So the direct sum implies injectivity, and because of that, this is also another way of proving the result which you might have proved in your second assignment: the dimension of the direct sum of the Vi, i going from 1 through k, is equal to the summation of the dimensions of the Vi, i going from 1 through k. I have already established that gamma is a surjection, and I have now shown that if the sum is direct then it is also an injection. Well, you might argue: is it linear? I leave that to you to check; just being a bijection is not enough, it must be a linear bijection. So of course, for this map gamma, go ahead and verify that it is linear; that is your exercise, not a big deal. So once you have linearity, surjectivity and injectivity,
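The failure of injectivity described here, in the same toy notation as before: a shared nonzero vector p produces the nonzero tuple (p, -p) sitting in the kernel of gamma.

```python
import numpy as np

def gamma(tup):
    """gamma(v1, ..., vk) = v1 + ... + vk."""
    return sum(tup)

# Suppose V1 and V2 share the nonzero vector p. Then (p, -p) is a
# nonzero element of the product V1 x V2, yet gamma maps it to zero,
# so the kernel is not trivial and gamma is not injective.
p = np.array([1., 1., 0.])
bad = (p, -p)

assert np.allclose(gamma(bad), np.zeros(3))                # maps to zero
assert any(np.linalg.norm(x) > 0 for x in bad)             # but is not the zero tuple
```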
this is an isomorphism. And if it is an isomorphism, then for a direct sum the dimensions do add up in exactly this way. You have already seen one proof in the solutions of the assignment, and now this is another way of arguing the same thing. So we will bring this section on products of subspaces to a close and move on to something else. Yes? No, of course not; I have not said anything special about V itself, and these are just arbitrary subspaces. If it is a direct sum, then you are basically assuming that these subspaces have nothing other than 0 in common; I am not asking for a complete decomposition of the entire vector space. We will do something similar to that later: we will look at things like those, in terms of so-called invariant subspaces, when we study the eigenvalue-eigenvector problem, namely a complete decomposition of the entire vector space written in the form of a direct sum. But not yet. Okay.