So, again, phi is a map from V to W; let me draw that figure again, and now the picture is a little different. We have phi here, mapping V to W, and, let me use that other colour, we have psi mapping W back to V, and the composition of the two must be the identity on W; that is what we mean by a right inverse. Now suppose phi is surjective, with V of dimension n and W of dimension m, as before. What do we claim about n and m and their relation? Go back to the discussion in the previous lecture. If phi is a surjection, then W is exactly the image of phi, because every element of W has a pre-image in V. So the dimension of the image is m. The kernel may well be nontrivial: if it were zero, phi would simply be a bijection, and if phi is not a bijection the kernel has positive dimension. On the left-hand side we have n, so by rank-nullity, n minus m equals the dimension of the kernel, which is a non-negative integer. Therefore n is greater than or equal to m; in the language of matrices this means a wide, or fat, matrix. Clear so far? Now, with this in mind, we make the claim you might already guess: phi has a right inverse, that is, there exists psi mapping W to V such that phi composed with psi is the identity on W, and this is equivalent to phi being surjective. It is just the dual of the previous result, is it not? So it is befitting that we show both sides of this implication. First, let me again get rid of this figure.
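To make the dimension count concrete, here is a small numerical sketch of my own (not from the lecture): a hypothetical surjective map from a 3-dimensional space to a 2-dimensional one, represented by a wide matrix, where rank-nullity gives n minus m as the kernel dimension.

```python
import numpy as np

# A hypothetical surjective map phi: R^3 -> R^2 (n = 3, m = 2),
# represented by a wide (fat) 2x3 matrix of full row rank.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

n = A.shape[1]  # dim V
m = A.shape[0]  # dim W
rank = np.linalg.matrix_rank(A)

# Full row rank <=> surjective: the image is all of W, so dim(image) = m.
assert rank == m

# Rank-nullity: dim V = dim(kernel) + dim(image), so dim(kernel) = n - m >= 0.
dim_kernel = n - rank
print(dim_kernel)  # 1, a non-negative integer, so n >= m
```

Any wide full-row-rank matrix would do here; the point is only that the kernel dimension comes out as n minus m, a non-negative integer.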
So, first let us assume phi is surjective. Consider B_W = {w_1, w_2, ..., w_m}, a basis for W. Now, if phi is a surjection, what can we say about each member of B_W? They are, after all, members of W, and any member of W must have a pre-image under the map phi in the vector space V. Therefore each of w_1 through w_m must also have a pre-image in V: there exist v_i such that phi(v_i) = w_i. Remember, I am not saying anything about whether these v_i are linearly independent, or distinct, or whatever; I do not care. Suppose, in the worst case, you have two different vectors that map to the same w. What do you do then? Well, you just map w back to any one of them. You see what I am saying: it could very well be that some point w in W has come about as the image under phi of a point v as well as of a different point v-tilde. So there can be multiple pre-images; there is no one-to-one guarantee any more. It could be many-to-one; of course it cannot be one-to-many, that is the one thing a map always rules out, but it can be many-to-one. So notice that I am saying "there exists"; I never said there exists a unique v_i. I am not claiming any uniqueness here, only that there exists at least one v_i, maybe several. In that case, how do you cook up this inverse? Quite simply: if there are multiple elements that map to w_i, pick any one of them, it really does not matter, and define psi(w_i) = v_i, extending psi linearly from the basis to all of W. Now you see what is going on: think about the composition that is involved in a right inverse, phi composed with psi.
So, you give the first right of action to psi. Psi acts on w_i, and even if there are multiple possible pre-images v_1, v_2, v_3, it picks out any one of them; but once you are at one of the elements that map to w_i, if you then let phi act on it, it takes you back to the same w_i. So the identity property, the fact that this composition really does give the identity, is still legitimate. There is no problem with this definition, you agree? Again, just to be sure: surjectivity guarantees that at least some v_i exists. If phi happens to be one-to-one, fantastic, you have an isomorphism; if not, you at least have a surjection, so you at least have some v_i. Pick any one of the v_i that map to w_i and let the right inverse psi that you are constructing map w_i to that potential pre-image; surjection guarantees at least one. Then if you let phi act on that pre-image, it obviously takes you back to w_i, so the composition is indeed the identity. Surjection does guarantee this, and we are done: psi is the desired right inverse. All that we now need to show is the converse, that if a right inverse exists then the map must be surjective, because so far we started with phi surjective and produced a right inverse. So suppose there exists psi mapping W to V such that phi composed with psi gives the identity on W. What I have to show is that phi must be a surjective mapping. How do I prove this? Take any w belonging to W. Then phi(psi(w)) = w; that is what the composition says. But what does this mean? It means psi(w) is a pre-image of w under phi. So any w that you pick has a pre-image.
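The construction of psi can be sketched numerically. This is my own illustration, under the assumption that phi is represented by a hypothetical full-row-rank wide matrix A and the w_i are the standard basis vectors of W: for each w_i we pick one pre-image (least-squares happens to pick the minimum-norm one, but any pre-image would do) and stack the choices as the columns of psi.

```python
import numpy as np

# phi: R^3 -> R^2 represented by a full-row-rank wide matrix (hypothetical example).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 0.0]])
m, n = A.shape

# For each basis vector w_i of W, pick ONE pre-image v_i with A v_i = w_i.
# lstsq returns the minimum-norm solution, but any pre-image works equally well.
cols = []
for i in range(m):
    w_i = np.zeros(m)
    w_i[i] = 1.0
    v_i, *_ = np.linalg.lstsq(A, w_i, rcond=None)
    cols.append(v_i)

# psi: W -> V, defined on the basis and extended linearly (columns of Psi).
Psi = np.column_stack(cols)

# phi composed with psi is the identity on W: psi is a right inverse.
print(np.allclose(A @ Psi, np.eye(m)))  # True
```

Different choices of pre-image give different matrices Psi, but every one of them satisfies A Psi = I, exactly as the non-uniqueness discussion above suggests.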
So, where does psi(w) belong? Of course, psi(w) belongs to the vector space V. So for any w you pick, I will go and figure out what psi(w) is; in fact I do not even need an explicit definition, because some such psi is guaranteed to exist by hypothesis. Go ahead and apply psi to w; whatever the result is, that is itself your pre-image. Therefore, for an arbitrary w, chosen without loss of generality, I can construct its pre-image with respect to phi by looking at psi(w). So surjectivity is established, and we have proved both sides of the assertion. In the language of matrices, what this means is that a wide matrix with full row rank always has a right inverse. Let me just point that out: A belonging to F^(m x n) with m less than or equal to n and rank(A) = m guarantees a right inverse for A; this is just a special case of the result on linear transformations. So: tall matrix with full column rank, a left inverse will exist; wide matrix, fat matrix, with full row rank, a right inverse will exist. We have established these results about the existence of inverses. Any questions on this? Is this clear so far? Alright. For practice there is really no shortcut, so I will share a few more problems; that first problem sheet is now done and dusted, and in the second problem sheet you will get enough practice at trying and proving these sorts of things.
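For the matrix statement, one standard closed-form right inverse, not written out in the lecture but a common choice, is A^T (A A^T)^(-1); full row rank makes the m x m matrix A A^T invertible, so this always exists. A small sketch with a hypothetical wide matrix:

```python
import numpy as np

# Hypothetical wide matrix: m = 2 <= n = 4, rank(A) = m (full row rank).
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0]])
m, n = A.shape
assert np.linalg.matrix_rank(A) == m

# Full row rank => A A^T is m x m and invertible, so this right inverse exists.
R = A.T @ np.linalg.inv(A @ A.T)

print(np.allclose(A @ R, np.eye(m)))  # True: A R = I_m
# Note: R A is n x n and generally NOT the identity -- R is only a right inverse.
```

This mirrors the tall-matrix case, where full column rank gives the left inverse (A^T A)^(-1) A^T by the same reasoning.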
But you will begin to see the pattern already, I suppose: we are repeatedly using certain kinds of techniques and tools. Every once in a while we will use something a little different, but more often than not we are just using basic definitions and pushing our way through. These are mostly standard results; textbooks would have them. The problems you will get in your sheets are not really the standard textbook material derived in your books. They are more like the geometry problems we had in our childhood: first you had the theorems, and then you had what we used to call riders, the problems you are asked to solve. The problem-sheet questions are those riders. So we will go back and forth with this: sometimes we deal with square matrices and sometimes with rectangular ones; similarly, sometimes with isomorphisms and sometimes with linear transformations between vector spaces of different dimensions. So let us pull ourselves back to isomorphisms and look at some more of their interesting properties. I know we are going back and forth on this, but it is important to grasp it in its entirety: if you have both a surjection and an injection, the vector spaces must be of equal dimensions, and in terms of matrices you must have square matrices of full rank, which is to say invertible matrices. So now I am going to make a claim; since we have already dealt a bit with compositions, here is an interesting one, not really as difficult to prove as it sounds: compositions of isomorphisms are also isomorphisms. Let me draw this, even though I hate drawing figures, because it is very easy to cheat people using figures; still, sometimes it is very instructive to draw these things, they help us.
So, suppose you have a mapping phi_1 from V to U and a mapping phi_2 from U to W; these are all linear transformations, that goes without saying, and suppose both are isomorphisms. What can you immediately infer? If these are finite-dimensional vector spaces, the dimensions must match, because you have gone ahead and found linear bijections between them; so the dimensions of V, U and W are all the same. That is a first observation. What we are now trying to prove is that the overall map from V to W, the composition of the two, is itself an isomorphism. First you take an element of V and map it to an element of U via phi_1, and then you let phi_2 act on that element and map it to an element of W. That is tantamount to phi_2 composed with phi_1, and we have to show this is an isomorphism. You agree that if we can show this, we are done: the mapping is an isomorphism, and when you have found an isomorphism, the underlying vector spaces are isomorphic. So what do we need to check in order to prove an isomorphism? Linearity, injectivity and surjectivity. Remember, we are looking at phi_2 composed with phi_1 as a map from V to W. Let us look at the linearity property first: consider (phi_2 composed with phi_1)(alpha v_1 + v_2), where of course alpha v_1 + v_2 comes from V. What is this? This is nothing but phi_2(phi_1(alpha v_1 + v_2)); that is the composition. But then what can I write? Phi_1 is an isomorphism, so it is definitely linear.
So I can say this is phi_2(alpha phi_1(v_1) + phi_1(v_2)). Where are these elements coming from now? They are sitting inside U, because phi_1 maps elements of V to elements of U. Now if phi_2 acts on elements of U, it takes them to objects in W; and what do we also know about phi_2? It is also a linear bijection. So you can just think of these as objects in U, say u_1 = phi_1(v_1) and u_2 = phi_1(v_2), in which case, let me use that colour, this is just phi_2(alpha u_1 + u_2). By the linearity of phi_2 I can pull the scalar out: this is alpha phi_2(phi_1(v_1)) + phi_2(phi_1(v_2)), which is nothing but alpha (phi_2 composed with phi_1)(v_1) + (phi_2 composed with phi_1)(v_2). So linearity is established: the composition acting on alpha v_1 + v_2 is the same as alpha times the composition acting on v_1, plus the composition acting on v_2. What we next need to show is that it is one-to-one. So suppose v_1 is not equal to v_2, and yet (phi_2 composed with phi_1)(v_1) = (phi_2 composed with phi_1)(v_2); let this be true, let it violate the tenets of injectivity. What does this mean? It means phi_2(phi_1(v_1)) = phi_2(phi_1(v_2)). But what do we know about phi_1(v_1) and phi_1(v_2)? Since phi_1 is an injection (please fill out those gaps in the argument yourself), if v_1 is not equal to v_2 then phi_1(v_1) can never equal phi_1(v_2); only when v_1 = v_2 can they be equal. Therefore this is like claiming that phi_2(u_1) = phi_2(u_2) with u_1 not equal to u_2. But is that possible, given that phi_2 is also a one-to-one map and u_1 is not equal to u_2?
So, this is absurd since phi 2 is also an injection right. So, this is the fellow that I am calling U 1 and this is the fellow that I am calling U 2. So, obviously U 1 is not equal to U 2, but phi 2 U 1 I am claiming to be equal to phi 2 U 2 which is again absurd because phi 2 is also an injection therefore, phi 2 U 1 cannot be equal to this unless U 1 is equal to U 2. So, therefore, whatever I started with that this be equal despite these two being not equal is not possible which means that this composition phi 2 composition phi 1 is an injection. Do I need to prove surjection if I am dealing with finite dimensional vector spaces I actually do not yeah because dimension of V and dimension of W are same we proved this earlier right dimension of V and dimension of W are same and there is a map which is a linear map which is an injection therefore, injection would imply surjection. But of course, you can try that as an exercise to show independent of that result that we have derived that indeed this map that is the composition of these two must also be a surjection right how do we do that is also very simple just have to have that picture in mind. So, what you are asking for is you have this these three objects this is V this is U this is W I am just going to do it in a pictorial form just please fill it out by yourself. So, what I am saying is that for every object here there must be a preimage here through that composition, but look it is just about retracing your steps back at the first step map this object here back to something here that is definitely going to exist because phi 2 is definitely a surjection then because phi 1 is a surjection this will indeed map it back to here. So, then you go ahead and in one stretch in one fell swoop you sort of relate this back here this fellow will indeed be the required preimage under the composition mapping. 
So what we have is a linear injection and a surjection, therefore a linear bijection, which means we have an isomorphism composed out of two different isomorphisms. So compositions of isomorphisms are also isomorphisms, and you can go ahead and keep increasing the number of maps in the composition; I have taken just two, but you see the argument will be very similar, and you do not really need to argue about the dimensions per se. Yes, of course the order matters, because when you go from V to W you first hit the element with phi_1 and then compose; the order tells you precisely from which vector space to which vector space you are going, and in which sequence. You cannot, in fact, go from an arbitrary vector space to another arbitrary vector space, because phi_1 and phi_2 are defined in this manner. If you invert them, if you flip them, then you are saying you first want to apply phi_2 and then phi_1, and then you have no way to go: phi_1 is defined as a map from V to U, not from W to somewhere, so the flipped composition is not even defined. Of course, when we have the same vector space, as with endomorphisms, it will still be defined, but even then the order matters; you might readily think of the order of multiplication of matrices, which is not commutative in general. There are some very special conditions under which matrices, square matrices, multiply and commute, very special properties; otherwise they are not going to commute in general. So, that brings this module to a close.
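The remark about order can be checked in a couple of lines; here is a sketch of my own with a hypothetical pair of square matrices, where both orders of composition are defined yet disagree:

```python
import numpy as np

# Even for square matrices (endomorphisms), where both orders are defined,
# the order of composition matters: matrix multiplication is not commutative.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[1.0, 0.0],
              [0.0, 2.0]])

print(np.allclose(A @ B, B @ A))  # False: A B != B A in general
```

Here A B has top-right entry 2 while B A has top-right entry 1, so the two compositions are genuinely different maps.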