So, in the previous lecture we started looking at inverses of linear transformations, and we saw that if you have an isomorphism from a vector space V to another vector space W, then you can always define an inverse; not just that, the inverse also turns out to be a linear transformation from the range back to the domain. The fact that it is linear is what is so beautiful, because over finite dimensional vector spaces we have seen that defining a linear transformation on just any one basis of the domain suffices. Not just that: the definition is also unique. For instance, take phi1 as a mapping from V to W, and suppose phi2 is another mapping from V to W, and suppose {v_1, ..., v_n} is a basis for V — of course these are finite dimensional vector spaces, so such a basis exists. If these two maps agree on the basis, it turns out they agree on every other vector, which is to say that once you define a linear transformation on any basis, it is not only defined but uniquely defined.

Let us check out a quick proof of this. Every v in V can be represented as a linear combination of the basis vectors, say v = Σ alpha_i v_i. Then phi1(v) = phi1(Σ alpha_i v_i), and by the same token phi2(v) = phi2(Σ alpha_i v_i). So what can we conclude from this? We have to show that these two must agree; this v was chosen arbitrarily. But what do we know? We already know that phi1(v_i) = phi2(v_i) for all i in {1, 2, ..., n}. So then it is a trivial matter, because by linearity the first expression equals Σ alpha_i phi1(v_i) and the second equals Σ alpha_i phi2(v_i), and since these sums agree term by term, phi1(v) = phi2(v).
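As a quick numerical illustration of this uniqueness — not part of the lecture's own development, just a minimal NumPy sketch, with R^3 and R^5 standing in for V and W and a randomly chosen basis:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 3, 5
B = rng.standard_normal((n, n))        # columns of B form a basis of R^3 (invertible with prob. 1)
images = rng.standard_normal((m, n))   # prescribed images phi(v_i), one column per basis vector

# The unique linear map sending v_i to images[:, i] is pinned down by phi @ B = images:
phi = images @ np.linalg.inv(B)

# Any vector v = sum_i alpha_i v_i is then mapped the only way linearity allows:
alphas = rng.standard_normal(n)
v = B @ alphas
assert np.allclose(phi @ v, images @ alphas)   # phi(v) = sum_i alpha_i phi(v_i)
```

Once the images of the basis vectors are fixed, there is exactly one matrix consistent with them, which is the content of the uniqueness proof above.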
So there is this uniqueness: once you have defined a linear transformation on one basis, that is the only linear transformation you get. No other linear transformation can match it on a basis and yet mismatch on some arbitrary element; once it matches on a basis, it is uniquely determined. This means the inverse we defined is also unique, because once you have shown that the inverse is a linear transformation and that it is defined on a basis of W — remember, if phi goes from V to W, then the inverse was defined as a linear map from W to V, specified on a basis of W — then not only have we shown that the inverse exists for an isomorphism, but also that the inverse is unique.

Notice that if you tried to show this uniqueness for matrices without understanding the notion of vector spaces, you might have to do some bit of work. But we are not getting into any formulae; that is the point. We are dealing with a much more abstract notion of linear transformations, of which matrices just happen to be a special case. So once we have shown this is unique, you also know that the inverse of a square matrix, when it exists, has to be unique; that is just another way of seeing the same fact.

But today we have promised a little more than looking at inverses of square matrices or inverse mappings for isomorphisms. We said that even if we do not have an isomorphism, but just a surjection or an injection, there would still exist some limited kind of inverse for such maps. So now we shall investigate what kinds of inverses we have in mind under such circumstances. We laid out some hints in the preceding lecture when we talked about the left inverse and the right inverse; today we shall look at them more closely.

To recap the point of the discussion so far: we proved the uniqueness of a linear map defined on a basis. If two maps phi1 and phi2 agree on any one particular basis, then they agree on every other vector in the vector space. Because the inverse is also a linear map, as we showed the other day, the inverse must also be unique. That is how it fits in.

That, of course, is the inverse of an isomorphism: a bijective mapping between V and W, so that the dimensions of V and W are equal if they are finite dimensional. Now we are going to talk about mappings from V to W when V and W are not necessarily of the same dimension — which is to say they are not isomorphisms — and investigate when we can cook up some different kinds of inverses. So let us say phi is defined from V to W, where V has dimension n and W has dimension m. Suppose phi is an injection, a one-to-one map. What do we know about these numbers n and m from our previous lecture? In view of the rank-nullity theorem for finite dimensional vector spaces, what does injectivity immediately tell us?
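Before the answer, here is a small NumPy sketch of the rank-nullity bookkeeping for a concrete injective map; the particular 3 × 2 matrix is my own toy example, not from the lecture:

```python
import numpy as np

# A "tall" matrix representing phi: R^2 -> R^3 with trivial kernel:
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])

rank = np.linalg.matrix_rank(A)     # dim(image of phi)
nullity = A.shape[1] - rank         # rank-nullity: dim(kernel) = n - rank
print(rank, nullity)                # 2 0 -> kernel is trivial, and n = 2 <= m = 3
```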
Of course, if phi is an injection, the kernel's dimension is 0, because there is nothing in the kernel other than the zero vector. Since the kernel dimension is 0, the dimension of the image must equal the dimension of the domain, which is dim V. The image sits inside W, so the dimension of the image is less than or equal to dim W; but it is also equal to dim V. So dim V must be less than or equal to dim W — we are essentially talking about a situation where n ≤ m. In the language of matrices, viewing matrices as a special case of linear transformations, what sort of matrices does n ≤ m correspond to? Tall matrices: a matrix that maps an n-dimensional space to an m-dimensional space is m × n, and if n ≤ m then it is a tall kind of matrix.

Now let us look back on the figure we drew the other day. Here is V, here is W, and there is this map phi that takes you from V to W. If we want an inverse of this, we must find some psi here which takes you back to V — not to just any arbitrary vector, but in such a way that the composition is the identity mapping on V: any v in V must be mapped back to v itself. That is what we are essentially asking for, and we will show that the existence of such a psi is predicated on this property of injection. In other words, whenever you have an injective mapping you can always find such a psi, and if the map is not injective then you will actually not be able to find such a psi.

So this psi is called a left inverse. Why "left"? Because first we hit with phi and then we hit with psi; it is the composition psi ∘ phi that should be the identity on V. Remember, that is the definition of a left inverse. Now we are claiming: a left inverse for phi exists if and only if phi is injective. We have to show both sides, because this is an if-and-only-if condition, necessary and sufficient: whenever a left inverse exists, phi must have been injective to start with, and whenever you start with an injective phi, you will always be able to cook up a left inverse. So how do we go about establishing this fact? Any questions on this so far? We may not be able to get a complete inverse like we got for isomorphisms, but at least with an injection there is one kind of inverse you will get, which is a left inverse.

By the way, in terms of matrices, what does this condition mean — that the dimension of V equals the dimension of the image of phi? When can you say that a matrix has a left inverse? Matrices we understand better; they are arrays of numbers. So take something in F^{m×n}, a tall object: m rows, n columns. When exactly does such a matrix have a left inverse? Think about it: n is the dimension of V, and what am I saying about the dimension of the image of phi?
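To make the left-inverse definition concrete before the proof, here is a minimal sketch in NumPy; the embedding/projection pair is an assumed toy example of mine, not one drawn on the board:

```python
import numpy as np

# phi embeds R^2 into R^3; psi projects back onto the first two coordinates.
phi = np.array([[1., 0.],
                [0., 1.],
                [0., 0.]])
psi = np.array([[1., 0., 0.],
                [0., 1., 0.]])

assert np.allclose(psi @ phi, np.eye(2))   # psi o phi = identity on V: psi is a left inverse
print(phi @ psi)                            # but phi o psi is NOT the identity on W (last row is 0)
```

Note the asymmetry: psi undoes phi on everything that came from V, but phi ∘ psi collapses the third coordinate of W, so psi is only a left inverse, not a two-sided one.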
The dimension of the image of phi is, in the case of matrices, also known as the rank of the matrix. So we are asking for the rank of the matrix to equal n, which means full column rank. Of course, row rank and column rank are the same, but what we are saying is that in the case of matrices this condition is tantamount to full column rank — in short, FCR. In other words, if I manage to establish this equivalence, I will have established, in the parlance of matrices, the fact that whenever a matrix has full column rank, you will always be able to cook up a left inverse for it.

Now, if you have a wide matrix, can it have full column rank? Because row rank and column rank are the same, if you do not even have that many rows, how can the rank reach the number of columns? The rank is at most the smaller of the number of rows and the number of columns. So if you are asking for full column rank, you must have a suitable number of rows: the matrix can at most be square, or else tall. That is all implicit here. Just being a tall matrix is not enough; but by dint of this assertion — we have not proved it yet, but we will — a tall matrix with full column rank will always have a left inverse. So first try to absorb the implication of this result in the language of matrices, and then we will do the proof. Any doubts so far? The numbers add up, right? Please be sure you are grasping how the dimensions fit together; it is important to have a picture of the matrix in mind.

Okay, if that is clear, let us move on to the proof. Suppose phi is injective; then we have n ≤ m. Consider B_V, a basis for V, given by v_1, v_2, ..., v_n, and look at the set S = {phi(v_1), ..., phi(v_n)}. What can we say about this set S? Any special claim? Is it a basis? How can it be a basis — a basis for W must have m vectors. But you are on the right track: it is something more than a set of distinct vectors. The claim is that S is a linearly independent set. Of course its elements are distinct, but it is more than that. Why? Suppose not. Then some linear combination Σ c_i phi(v_i) = 0 holds with not all c_i zero. By linearity the summation goes inside: phi(Σ c_i v_i) = 0. But what do we know about phi? It is one-to-one; we assumed it is injective. If it is injective, then only the zero vector can exist in its kernel. This combination maps to the zero of W, so the vector inside the argument of phi must also be zero: Σ c_i v_i = 0, the zero of V. But this is a linear combination of vectors from a linearly independent set — the basis — which implies c_i = 0 for all i. And the c_i are precisely the coefficients of the phi(v_i)'s, so any linear combination of the phi(v_i)'s that equals zero must be the trivial one where all c_i are zero. That means the set S is indeed linearly independent — a contradiction of our supposition, so linear independence is checked.
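The full-column-rank statement can be checked numerically. A standard construction — one concrete choice among many, and my own addition here rather than the lecture's — is L = (AᵀA)⁻¹Aᵀ, which is well defined precisely because AᵀA is invertible when A has full column rank:

```python
import numpy as np

A = np.array([[1., 2.],
              [0., 1.],
              [1., 0.]])                        # tall, 3 x 2

assert np.linalg.matrix_rank(A) == A.shape[1]   # full column rank

# With full column rank, A^T A is invertible, and (A^T A)^{-1} A^T
# is one concrete left inverse of A:
L = np.linalg.inv(A.T @ A) @ A.T
assert np.allclose(L @ A, np.eye(2))            # L . A = identity on the domain
```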
Now S is residing inside a vector space of dimension m, which is greater than or equal to n. What do we know about such a set? We can always extend a linearly independent set until we get a basis for W. So let us go ahead and do that — at every step we are invoking nothing more than what we proved earlier. Extend S to a basis for W given by B_W = {phi(v_1), phi(v_2), ..., phi(v_n), w_1, w_2, ..., w_{m-n}}. That adds up: there are exactly m vectors here. We can always cook up a basis by extending the existing set that was given to us; we proved earlier that any linearly independent set can be extended to a basis. So this is a basis, and now define psi as a mapping from W to V by psi(phi(v_i)) = v_i, and psi(w_j) — anything. What does that mean? I don't care. Never have I claimed that this inverse map has to be unique. You can come up with some map, I can come up with some other map; as long as the part psi(phi(v_i)) = v_i is sacrosanct, that is all I need.

Initially you might feel a bit uncomfortable with this idea, because we are used to cooking up unique maps. But I am saying this left inverse does not have to be unique. When you have the inverse of an isomorphism, we have shown uniqueness; here, the assignment on the w_j's is at my discretion. The only thing that is sacrosanct is that if there is a vector in W which came as a result of mapping by phi, then psi maps it back to the vector it came from, v_i itself. As for the other vectors — look at these w_j's; phi need not be a surjection, so the w_j's need not have had any preimage under phi at all — where psi maps them is really unimportant to me. The only thing I am concerned with, if you go back to that figure I have erased, is that if something originated in V, then after the successive mappings by phi and then by psi it should lead back to that same vector. For the vectors that did not originate in V, that part of the map from W to V does not matter; psi only matters on the vectors that came from V. So this is indeed a left inverse — you can check it on any arbitrary v.

These w_j assignments are like don't-care conditions: I can map them to anything I like. And this flexibility is actually good for many design problems; sometimes you might want some sort of best left inverse from a certain perspective, so you can play around with these assignments — as long as you keep the sacrosanct part fixed — and get the best possible solution for your particular application. So that is a left inverse: we started with the assumption that phi is injective, and we have constructively arrived at a left inverse. By the way, I urge you to verify for yourself that this psi is linear. Once you fix the assignments on the w_j's, psi becomes uniquely defined; but since there is a degree of flexibility there, the left inverse as such is not unique.
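The don't-care freedom can be exhibited directly: any two left inverses differ by something that annihilates the image of phi. A small NumPy sketch, with the particular matrix and perturbation being assumed toy choices of mine:

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 1.],
              [0., 0.]])                 # injective phi: R^2 -> R^3

L1 = np.linalg.pinv(A)                   # one left inverse (the Moore-Penrose choice)

# Adding any rows that kill the columns of A gives another left inverse;
# here e_3 = (0, 0, 1) is orthogonal to both columns of A:
L2 = L1 + np.outer(np.array([2., -3.]), np.array([0., 0., 1.]))

assert np.allclose(L1 @ A, np.eye(2))
assert np.allclose(L2 @ A, np.eye(2))
assert not np.allclose(L1, L2)           # two genuinely different left inverses
```

The perturbation only changes what psi does on the third coordinate direction of W — exactly the "w_j" direction that has no preimage under phi — so the sacrosanct part psi ∘ phi = id is untouched.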
Now the other part remains to be shown, and unfortunately I have to erase this — that's okay. Suppose a left inverse psi exists, so that psi ∘ phi — which is a mapping from where to where, by the way? From V to itself — equals the identity mapping on V. Now I have to show that phi is an injection, because I have to show both sides, you see: I started with phi being an injection and showed you that a left inverse exists; now I am assuming a priori that a left inverse exists, and I am going to have to show that phi is injective.

What is the criterion for injectivity to fail? There should be two different vectors in V mapping to the same vector; that would violate injection. So suppose v_1 ≠ v_2, with v_1 and v_2 in the vector space V, such that phi(v_1) = phi(v_2) — that is, suppose phi is not injective. Now I have to prove that this is absurd, that it can never be true, and I will do so exactly by using the left-inverse condition. Look at what happens: (psi ∘ phi)(v_1) is psi acting on phi(v_1), and (psi ∘ phi)(v_2) is nothing but psi acting on phi(v_2). But what do we know about phi(v_1) and phi(v_2)? They are equal. So I must have (psi ∘ phi)(v_1) = (psi ∘ phi)(v_2), because of course the same map psi, my left inverse, taking the same argument must map to the same point — the same point cannot be mapped to multiple points, otherwise it is not a valid mapping. But what is psi ∘ phi? By the definition of a left inverse it is the identity mapping on the vector space V. So the identity mapping applied to v_1 equals the identity mapping applied to v_2, and the identity map only returns the vector itself — let me use a different colour, because I am running out of space, just to highlight this point — which means v_1 = v_2. That is a contradiction of the claim I made here. In other words, what I have shown you is that if a left inverse exists, then the map you started with must be injective. So both directions of the equivalence have been established. Any questions? All right, then we'll move on to the next module.
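The matrix shadow of this direction is worth seeing once: if A is not injective (not full column rank), then no psi can be a left inverse, because rank(psi · A) ≤ rank(A) < n, while the identity has rank n. A minimal sketch, with a rank-deficient matrix chosen by me for illustration:

```python
import numpy as np

A = np.array([[1., 2.],
              [2., 4.],
              [3., 6.]])                  # second column = 2 x first: not injective

print(np.linalg.matrix_rank(A))           # 1 < 2, so rank(P @ A) <= 1 for ANY P,
                                          # and P @ A can never equal the 2 x 2 identity
P = np.linalg.pinv(A)                      # even the pseudoinverse, the best candidate,
print(np.allclose(P @ A, np.eye(2)))       # fails: prints False
```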