Let us now see some implications of this rank-nullity theorem in the context of one-to-one and onto maps. Suppose phi is one-to-one. What is the equivalent condition for one-to-one that we saw a while back? The kernel of phi must be the zero subspace, so the dimension of the kernel in that case is 0. What should we then be able to infer from the rank-nullity theorem? That the dimension of V, the space from which you are taking objects and passing them through phi, must necessarily equal the dimension of the image of phi. It is just a straightforward application of the rank-nullity theorem to this condition.

Suppose instead phi is onto, i.e. surjective. What do we know about the image of phi in that case? It is all of W. So dimension of V equals dimension of kernel of phi plus dimension of W. Now, what sort of values can the dimension of the kernel take? Non-negative integers. So what can you say for onto maps about the relation between the dimensions of V and W? Dimension of V minus dimension of W equals dimension of kernel of phi, which is greater than or equal to 0. Therefore dimension of V is greater than or equal to dimension of W; you cannot discount the possibility that they are equal. And for the one-to-one case, can we say anything more? Of course: the image of phi resides inside W, so it is a subspace of W, and the dimension of a subspace cannot be greater than the dimension of the space inside which it resides. So dimension of V, which equals the dimension of the image of phi, is less than or equal to dimension of W.

Now I urge you to think in terms of matrices again; we will keep going back and forth between general phi's and matrices as special cases of such linear transformations. Where do fat matrices and tall matrices fit into this picture? Can tall matrices be both one-to-one and onto? If at best they can be one of them, which one can they be? Suppose you have a matrix A which is m cross n; can I not think of this as a mapping from F^n to F^m? Tall means m is greater than n, and since n is the dimension of the space you pick objects from while m is the dimension of the space you map into, a tall matrix fits the one-to-one picture. Similarly, if the map is to be onto, it must be a wide or fat matrix, with m less than or equal to n. Now, the only way to fit both of these pictures at the same time, not just for matrices but for mappings in general, if I want a bijection, is to impose both restrictions, and therefore for a bijection I require that the dimension of V must equal the dimension of W. In the language of matrices, I am talking about square matrices. If you have to invoke both conditions, equality is the only possibility.
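To make the matrix picture concrete, here is a minimal numerical sketch using numpy (the specific matrices are arbitrary illustrations, not anything from the discussion above): it reads off the rank and nullity of a tall and a fat matrix and checks which of the two properties each can have.

```python
# A small numerical check of rank-nullity for matrices viewed as maps F^n -> F^m.
import numpy as np

def rank_nullity_report(A):
    m, n = A.shape                       # A maps F^n into F^m
    rank = np.linalg.matrix_rank(A)      # dim(image of A)
    nullity = n - rank                   # dim(kernel of A), by rank-nullity
    print(f"{m}x{n}: dim V = {n}, rank = {rank}, nullity = {nullity}, "
          f"one-to-one: {nullity == 0}, onto: {rank == m}")

rank_nullity_report(np.array([[1., 0.], [0., 1.], [1., 1.]]))  # tall: one-to-one, never onto
rank_nullity_report(np.array([[1., 0., 2.], [0., 1., 3.]]))    # fat: onto, never one-to-one
```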
So if equality of dimensions holds, is a bijection guaranteed? Is it an if-and-only-if condition? Not really; this is a necessary condition for a bijection. Just because the two dimensions are equal, I can cook up many linear maps between vector spaces of equal dimension that do not turn out to be bijections. On the other hand, if you do manage to find a bijection between two vector spaces, independent of any knowledge of their dimensions, that is an independent way of proving that the dimensions of the two vector spaces must be equal. You follow the point? Just because I have told you that the dimensions are equal, not every linear map taking fellows in V to fellows in W is a bijection; but if you have found a linear bijection from a vector space V to a vector space W, then you can definitely be sure that V and W have the same dimension.

In fact this is a very important class of mappings, and we call them isomorphisms. Morph, as in morphology, means structure; iso means same. Finding a linear bijection is the same as finding an isomorphism. You have already seen one isomorphism, though I probably did not give it that name at the time. Can you think of what I have described in this class which is an isomorphism? The way we assigned coordinates. The coordinate assignment is a clear-cut case of an isomorphism, and here is an interesting, cute little result that will show us in a very easy manner why what we constructed there is indeed an isomorphism.

The claim is this: suppose dimension of V equals dimension of W. I have already said that this by itself does not give a bijection, but under this hypothesis, surjection implies injection and the other way round. In other words, when you know a priori that the dimensions of the two vector spaces V and W are the same, if you manage to show that a particular linear map is an injection, a one-to-one mapping, you do not need to separately prove that it is onto; and in the same way, if it is easy to show that it is onto, you do not need to separately prove that it is an injection. It follows automatically, so verifying just one property brings the other as a guarantee.

How do we show this? The base we build on is the rank-nullity theorem, which holds in any case. Assume injection first; just as we did a while back, the equivalent condition is that the kernel of phi is the zero subspace. So dimension of V equals dimension of kernel of phi plus dimension of image of phi, and the kernel term is of course 0. And what do I know about dimension of V? It equals dimension of W.
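To see that equal dimensions alone guarantee nothing, here is a tiny numpy sketch (my own example): a singular square matrix maps R^2 to R^2, so the dimensions on the two sides are certainly equal, yet the map is neither one-to-one nor onto.

```python
# Equal dimensions do not guarantee a bijection: a singular 2x2 example.
import numpy as np

A = np.array([[1., 2.],
              [2., 4.]])               # second row is twice the first
rank = np.linalg.matrix_rank(A)        # 1 < 2, so the map is not onto
nullity = A.shape[1] - rank            # 1 > 0, so the map is not one-to-one
print(rank, nullity)                   # prints: 1 1
```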
So this means what? That dimension of W, which I am invoking from the given condition, equals dimension of the image of phi, which I am invoking from the injection assumption. But look, the image of phi is contained in W, and its dimension equals that of W. We went over this step in the previous lecture: we saw that if a subspace has the same dimension as the vector space containing it, then the two must be one and the same; you do not have to show containment both ways. So W equals the image of phi, which is surjection.

Next, assume surjection first; again it is rank-nullity to the rescue all the way. Write the base step again (which is why I have not erased it): dimension of V equals dimension of kernel of phi plus dimension of the image of phi, and the image of phi is now W itself. But by the given condition dimension of V equals dimension of W, so those two terms cancel, and therefore dimension of kernel of phi must be 0. We know of only one subspace which has dimension 0, namely the zero subspace. So the kernel of phi must be the zero subspace, that is, phi is an injection. So when you are dealing with vector spaces whose dimensions are the same, verifying one property is tantamount to verifying both at the same time; you do not have to verify them separately.

Why is this useful? What were we discussing just before this? Isomorphisms. Now think about isomorphisms and think about the case where we assigned these coordinates, and about my claim from a while back that the assignment is an isomorphism. Think about that dictionary from one language to the other: what is in the kernel of that map? We have already claimed that the map is non-singular, it is invertible, because we saw that only the 0 of the vector space V, the n-dimensional vector space, gets mapped to the 0 of the n-tuple space F^n, and nothing else maps there. So what does that mean? Dimension of V is n and dimension of F^n is n, so they are definitely vector spaces of the same dimension; that is how the coordinate assignment is set up. So the condition that dimension of V equals dimension of W checks out, and I have already shown injection, because the kernel contains nothing other than 0. By the result we just proved, that means it is also a surjection; if it is both an injection and a surjection it is a bijection; and if it is a bijection then it is an isomorphism. Nothing here is carved in stone; it is all basically part of the definitions and just pushing your way through.

Isomorphic spaces have the same structure, which means you can deal with them the way you deal with Euclidean spaces. That is why the idea of matrices is so powerful: once you have a finite-dimensional vector space and you have figured out its dimension, you can always say there is an isomorphism, through coordinate assignment, through the choice of an ordered basis, and then you can deal with the space as you deal with n-tuples of numbers. That is the power of this isomorphism. So it is very important that you find, in any way whatsoever, some one-to-one, onto linear map. The three things you need in a map: it has to be linear, it has to be one-to-one, and it has to be onto. If you can guarantee those three things, you do not have to check the dimensions of the two spaces separately; you can rest assured that they will be equal.
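As an illustration of the coordinate isomorphism (the example space and helper names below are my own choices, not from the lecture), take the three-dimensional space of symmetric 2x2 real matrices with a fixed ordered basis; the coordinate map identifies it with R^3, and the sketch checks linearity and the round trip both ways.

```python
# A hedged sketch of a coordinate isomorphism: symmetric 2x2 matrices <-> R^3.
import numpy as np

E = [np.array([[1., 0.], [0., 0.]]),
     np.array([[0., 1.], [1., 0.]]),
     np.array([[0., 0.], [0., 1.]])]      # an ordered basis of the symmetric 2x2 matrices

def coords(S):
    """Coordinate map: symmetric matrix -> its coordinate triple in R^3."""
    return np.array([S[0, 0], S[0, 1], S[1, 1]])

def from_coords(v):
    """Inverse of the coordinate map: triple in R^3 -> symmetric matrix."""
    return v[0] * E[0] + v[1] * E[1] + v[2] * E[2]

S = np.array([[2., -1.], [-1., 5.]])
T = np.array([[0., 3.], [3., 1.]])
assert np.allclose(from_coords(coords(S)), S)             # bijection: round trip recovers S
assert np.allclose(coords(S + T), coords(S) + coords(T))  # linearity of the coordinate map
```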
There is another beautiful thing that this allows us to do: when there is an isomorphism you can always cook up inverses, you can always get an inverse map, and the fact is that this inverse map is itself going to be linear. So not only is the existence of the inverse guaranteed, the inverse map is also guaranteed to be linear. That is the question we shall now investigate. For matrices you already know this to be true; you have an explicit formula (we are not going into formula-based things like determinants), and you know that if A is an invertible matrix then A inverse is also a matrix, and of course any matrix is a linear transformation. If you are operating between spaces of n-tuples with the same n on either side, then you are dealing with a square matrix, and for invertible square matrices you know the explicit formula. What we shall see is that this is true in general, for any linear transformation. And then we shall see something very interesting: even when you do not have square matrices, that is to say even when the dimensions of the two sides are not equal, you can still have some limited kind of inverses. You cannot have inverses like you do for square matrices, but you can still push your luck and get certain special kinds of inverses. That is going to be the object of our investigations next. Any questions or doubts so far?

Before we move over to the study of these inverses, there is a very important property that will set the tone for our study. What is it that is so good and nice about linear transformations? Why do we prefer them, or why do we feel that they are tractable? When you are dealing with finite-dimensional vector spaces, if you want to characterize what a linear transformation does to each and every object in the vector space, and that can mean infinitely many objects, you only need to know what it does to a finite number of them; you do not care about any other piece of information. So if phi is a linear mapping from a finite-dimensional vector space V to a finite-dimensional vector space W, then phi can be completely characterized by its action on any basis of V. That is to say, suppose B_V = {v_1, v_2, ..., v_n} is a basis for V; then knowledge of phi(v_1), ..., phi(v_n) suffices for evaluating phi(v) for any v in V. This is going to be the theme for any linear system: you know what it does to a few objects, and you can extend that to figure out what it does to every one of them. This is why it is so important to understand what a linear transformation does to a basis set.

So let us try to understand why this is true. Any v in V can be written as v = sum of alpha_i v_i, with i going from 1 through n. Consider phi(v), which is nothing but phi acting on that same object, phi of the sum of alpha_i v_i. But now, because of linearity, I can pull these terms out one by one and separate them, so this is nothing but the sum of alpha_i phi(v_i). I have done two operations in one go.
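Here is a short sketch of this "action on a basis determines the map" principle (the numbers are arbitrary, my own illustration): prescribe the images of the standard basis vectors of R^3, stack them as the columns of a matrix, and check that applying the matrix to any v agrees with expanding phi(v) as the sum of alpha_i phi(v_i).

```python
# A linear map is determined by what it does to a basis: a numpy illustration.
import numpy as np

phi_of_basis = [np.array([1., 0.]),      # phi(e1)
                np.array([2., 1.]),      # phi(e2)
                np.array([0., -1.])]     # phi(e3)
A = np.column_stack(phi_of_basis)        # the matrix of phi w.r.t. the standard bases

v = np.array([3., -2., 5.])              # alpha_1 = 3, alpha_2 = -2, alpha_3 = 5
direct = A @ v
expanded = sum(alpha * img for alpha, img in zip(v, phi_of_basis))
assert np.allclose(direct, expanded)     # phi(v) = 3*phi(e1) - 2*phi(e2) + 5*phi(e3)
```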
I first said this is phi of (alpha_1 v_1 + alpha_2 v_2 + ... + alpha_n v_n), which equals phi(alpha_1 v_1) + phi(alpha_2 v_2) + ... + phi(alpha_n v_n), and then I subsequently pulled the scalars, the alphas, outside and said it is alpha_1 phi(v_1) + alpha_2 phi(v_2) + ... + alpha_n phi(v_n). So I have done both of those steps in one go. But notice what these objects are: they are exactly the action of phi on the basis. And the v here is nothing special; I did not choose it to be anything in particular, it is an arbitrary fellow sitting inside V, representable in terms of this basis. So all that you need in order to characterize the action of phi is to understand what it does to a basis set. Once you know what it does to a basis set, you have defined it for good.

It is then important to investigate how we can say that when there is a one-to-one, onto mapping, a bijection, between two vector spaces V and W, the inverse will also turn out to be linear, and how we then define the inverse. Of course, the recipe for defining the inverse is going to be very straightforward once we have figured out that the inverse is also linear, because, just like the original map, the inverse map will also lend itself to easy description once we know what it does to particular fellows.

So the idea is this: suppose phi, a mapping from V to W, is an isomorphism. First we have to understand what we mean by this idea of an inverse, and I will try to explain that by means of a diagram. Here is V, here is W, and here is V again. Suppose I go this way using phi, and then I come back this way using psi. If all that this does is map each v in V back to itself, we get what we call the identity mapping, the identity mapping in the vector space V. What is the cumulative effect of the two? If I compose these operations together, isn't that the same as saying psi composed with phi equals the identity in the vector space V? On the other hand, consider the picture with W at both ends: there is a fellow w here which gets mapped back to the same w, so the composite is the identity mapping in W, it takes objects in W and maps each back to the same object. And the object I am seeking, in this picture too, is psi.
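To pin down these two composition identities numerically, here is a tiny sketch (an arbitrary invertible matrix of my own choosing stands in for phi): the inverse matrix plays the role of psi, and the two compositions give the identity on V and on W respectively.

```python
# Two-sided inverse of an isomorphism: psi o phi = id_V and phi o psi = id_W.
import numpy as np

Phi = np.array([[2., 1.],
                [1., 1.]])               # an isomorphism from R^2 to R^2
Psi = np.linalg.inv(Phi)                 # the inverse map

assert np.allclose(Psi @ Phi, np.eye(2))     # psi composed with phi = identity on V
assert np.allclose(Phi @ Psi, np.eye(2))     # phi composed with psi = identity on W
```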
So psi is the much sought-after entity, but the actions in the two pictures are different. In the second picture, notice what is happening: first psi acts on the object, followed by the action of phi, so this is the same as phi composed with psi giving me the identity operation in W. Is that clear? It is very important to keep the subscript: if I do not put the subscript there, you might be led to believe it is simply "the" identity, but it matters which identity it is. That is the significance of the left operation and the right operation; in either case the object I am seeking is psi. So that is the idea of an inverse: one-sided inverses may exist, but a two-sided inverse will exist for isomorphisms; that is the claim. With this picture we would like to bring this module to a close, and next we will discuss the idea of inverses and how, given a linear map, we can actually go about constructively defining its inverse.
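As a preview of the "limited" inverses mentioned earlier for non-square matrices (this uses numpy's pseudo-inverse as a stand-in; the lecture has not yet said which construction it will use), a tall matrix with full column rank admits a left inverse but no right inverse.

```python
# One-sided inverses for a non-square matrix: a hedged preview.
import numpy as np

A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])                # 3x2, one-to-one but not onto
L = np.linalg.pinv(A)                   # one particular left inverse of A

assert np.allclose(L @ A, np.eye(2))    # left inverse: L A = identity on R^2
print(np.allclose(A @ L, np.eye(3)))    # False: A L is not the identity on R^3
```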