So, the moment you have an isomorphism, it is a one-to-one, onto, linear map, a bijection. Suppose psi is an inverse for phi, which means psi takes objects in W and maps them back to objects in V; it goes in the opposite direction. Let v1, v2, ..., vn be a basis for V, and consider phi(v1), phi(v2), ..., phi(vn). What can you say about this set immediately, based on whatever we have discussed? It will generate W. But is it also a basis or not? If it were not linearly independent, then some nontrivial linear combination of these vectors would be 0. But what do we know about phi? It is an isomorphism; it carries all those properties we have discussed so far. So it is certainly one-to-one, and if it is one-to-one its kernel is trivial. Then take any linear combination: sum c_i phi(v_i) = 0 is the same as phi(sum c_i v_i) = 0, which, because of the one-to-one nature of phi, is the same as sum c_i v_i = 0, which means c_i = 0 for all i. So this is also a linearly independent set contained inside W; everything checks out, so it has to be a basis for W. And here is why this is so easy: define psi as a mapping from W to V in the following manner, psi(phi(v_i)) = v_i. You can go ahead and name these images w1, w2, ..., wn for ease of notation, in which case this is just psi(w_i) = v_i. So this is how I am going to define the mapping, but this is only good enough if psi is after all linear, because we have seen just a while back that for a linear map, if you know what it does to every object in a basis set, then you know what it does to every element of that vector space.
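The argument above, that the images of a basis under an isomorphism are linearly independent, can be checked numerically. Here is a minimal sketch using an arbitrary invertible matrix (chosen purely for illustration) to represent phi on R^3; full rank of the image matrix certifies independence.

```python
import numpy as np

# A hypothetical invertible map phi on R^3, represented by a matrix.
# (This particular matrix is an assumption chosen for illustration.)
phi = np.array([[2.0, 1.0, 0.0],
                [0.0, 1.0, 3.0],
                [1.0, 0.0, 1.0]])

basis = np.eye(3)            # v_1, v_2, v_3: the standard basis of V = R^3
images = phi @ basis         # columns are phi(v_1), phi(v_2), phi(v_3)

# If the images were linearly dependent, this matrix would be rank-deficient;
# full rank means they form a basis for W = R^3.
print(np.linalg.matrix_rank(images))  # 3
```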
So, we have to be sure that when we define a mapping psi in this manner, this is enough; we do not need to define it at any other object. For that we need to establish that psi is after all linear. Notice, by the way, that the composition of psi with phi is the identity map on V: you take any object in V and it spits out the same object in V. But is psi linear? That is the question we need to answer. How do we check that psi is linear? We need to check whether psi(alpha w1 + w2) equals alpha psi(w1) + psi(w2) for all alpha in the field and all w1, w2 in the vector space W. You agree that once we have answered this in the affirmative, we have not just shown that an inverse exists; we have shown that it is linear, and that this is exactly the way to cook that inverse. You look at a basis for the original vector space, see where phi sends each basis vector, take those images in W, and map them back to the original vectors, and you have the recipe for the inverse. But this is the hurdle we must cross before that: to show that psi is also linear. How do we show this? We know nothing about the linearity of psi, but we do know about the linearity of phi. So this is the technique we have to rely on: whatever we know, we have to harness that knowledge in order to prove whatever we do not yet know, no trickery really. Somehow we have to massage this question in a way that it reflects back on phi, using the properties of phi instead of properties of psi, because about psi we know nothing much apart from the fact that this is how it must act on those objects.
Suppose w_hat = alpha w1 + w2. What do we know about w1 and w2? There exist v1, v2 in V such that phi(v1) = w1 and phi(v2) = w2, because of surjectivity, one of the many properties we have already invoked. So then we can write w_hat = alpha phi(v1) + phi(v2), and by the linearity of phi this is phi(alpha v1 + v2). But what does this tell us? What should psi(w_hat) spit out? If psi is an inverse mapping, then by the definition of the inverse, psi(w_hat) must be alpha v1 + v2. And from the same definition, it immediately follows that psi(w1) must be v1, and likewise psi(w2) must be v2. If there is any step that you have questions about, please ask. What have we assumed? We have taken an arbitrary object alpha w1 + w2, which is what we must handle in order to show linearity, and we have gone ahead and named it w_hat for convenience, no trickery. Now, what you have to show is that psi(w_hat) equals alpha psi(w1) + psi(w2). But it turns out that psi(w_hat) is alpha v1 + v2, and by the definition of the inverse, and nothing except that, we have the conclusion: because of surjectivity every object w1 has a preimage v1 and every object w2 has a preimage v2, and by the definition of the inverse, psi(w1) = v1 and psi(w2) = v2. So I can just plug these back in and say that psi(w_hat) = alpha psi(w1) + psi(w2), which in other words is just the definition of linearity. So we have indeed shown that such an inverse, which is guaranteed to exist because we have already mapped everything back like this, is linear.
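The linearity identity just proved, psi(alpha w1 + w2) = alpha psi(w1) + psi(w2), can be verified numerically for the matrix case. Here is a small sketch, assuming an arbitrary invertible matrix stands in for phi, so that the basis-image recipe for psi coincides with the matrix inverse.

```python
import numpy as np

# Hypothetical invertible phi on R^2 (matrix chosen only for illustration).
phi = np.array([[1.0, 2.0],
                [3.0, 4.0]])

# psi is defined by where it sends the images of the basis: psi(phi(v_i)) = v_i.
# For a matrix, that recipe is exactly the usual matrix inverse.
psi = np.linalg.inv(phi)

# Check the identity psi(alpha*w1 + w2) = alpha*psi(w1) + psi(w2)
# on arbitrary w1, w2 in W and a scalar alpha.
rng = np.random.default_rng(0)
w1, w2 = rng.standard_normal(2), rng.standard_normal(2)
alpha = 2.5
lhs = psi @ (alpha * w1 + w2)
rhs = alpha * (psi @ w1) + (psi @ w2)
print(np.allclose(lhs, rhs))  # True
```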
So, this is constructive: we have shown that this inverse, when it exists, is linear, and because it is linear, just defining it on a basis in this manner does my job for me. Now go back to matrices once again, precisely square matrices, and try to see what this essentially entails. For the basis set, take the standard basis, which in the case of n-tuples of numbers is (1, 0, ..., 0), (0, 1, ..., 0), and so on until (0, ..., 0, 1). What do you get when these objects are acted upon by a matrix? The first one generates the first column, does it not, the second generates the second column, and so on, until the last one generates the last column. Now, because the matrix is invertible, the columns have to be linearly independent, and there are exactly n of them, so they form a basis for the range, which is W in this case. Then you look at those columns and map them back to the standard basis vectors; that is what your inverse does. So the standard inverse of matrices that we see is nothing but a special case of what we have done here. Forget about all the determinant-related formulas; those are useful for solving problems quickly, but at least at a conceptual level you must be able to see the similarity between what is done here and what you actually do for matrices. But then things get tricky, as I said, when you do not have square matrices, in other words when you do not have an isomorphism but only one of those two properties, either onto or one-to-one. In such cases also we seek to find some form of an inverse, but before that I will briefly delve into a couple of things. First, suppose I define this object L(V, W) as the collection of all linear transformations from V to W.
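The column picture described above can be made concrete. A minimal sketch, with an invertible matrix chosen arbitrarily for illustration: A sends each standard basis vector e_i to its i-th column, and the inverse sends that column back to e_i.

```python
import numpy as np

# An invertible matrix A (the entries are an assumption for illustration).
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 3.0, 1.0]])
A_inv = np.linalg.inv(A)

n = A.shape[0]
for i in range(n):
    e_i = np.eye(n)[:, i]
    col_i = A @ e_i                      # A sends e_i to its i-th column
    assert np.allclose(col_i, A[:, i])
    # ...and the inverse sends that column back to e_i.
    assert np.allclose(A_inv @ col_i, e_i)
print("A maps e_i to column i; A_inv maps column i back to e_i")
```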
The claim is that this is, guess what, a vector space. How do we go about showing that? Take any two linear transformations phi1 and phi2. Now remember, that is the important thing: the objects in this set are transformations themselves, not the vectors of the vector spaces V and W. So you have phi1 and phi2, and you have to consider alpha phi1 + phi2. Check whether this object is also a linear transformation from V to W; that is the closure property. That is, if you allow it to act on objects in V, does it spit out objects in W in accordance with the linearity property? What you have to check is whether this object is also a linear transformation, given that these two are. So, these two coming from L(V, W), does alpha phi1 + phi2 also belong to L(V, W) for all alpha? That is the question you have to answer. Do I need to prove this? Good, thank you. We will just leave that as an exercise; it is a straightforward exercise really. You just have to check the linearity property of this object as an operator: let it act on a sum of two objects in V, and on a scaled version of an object in V, and check those two properties. That is all that is required here. The next important result, let me leave it here: suppose V has dimension n and W has dimension m. What do you think is the dimension of this object? See, we have already convinced ourselves, or at least you asked me to leave it as an exercise, so I have done so; but the answer to that question is a yes, I am telling you it is a yes, though it remains to be proved by you.
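The closure check left as an exercise above can be sketched numerically, treating linear maps from R^3 to R^2 as matrices (the particular matrices are assumptions for illustration): the pointwise combination alpha*phi1 + phi2 still satisfies the linearity property.

```python
import numpy as np

# Two hypothetical linear maps from R^3 to R^2, represented as matrices.
phi1 = np.array([[1.0, 0.0, 2.0],
                 [0.0, 1.0, 1.0]])
phi2 = np.array([[0.0, 1.0, 0.0],
                 [3.0, 0.0, 1.0]])
alpha = -1.5

# The combined transformation alpha*phi1 + phi2...
combo = alpha * phi1 + phi2

# ...still satisfies linearity: combo(a*v1 + v2) = a*combo(v1) + combo(v2),
# so L(V, W) is closed under this operation.
rng = np.random.default_rng(1)
v1, v2 = rng.standard_normal(3), rng.standard_normal(3)
a = 0.7
print(np.allclose(combo @ (a * v1 + v2),
                  a * (combo @ v1) + (combo @ v2)))  # True
```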
So, once it is a vector space, first of all, is it a finite dimensional vector space? And if yes, then a legitimate question would be that, given the dimensions of V and W, we should also be able to talk about the dimension of this vector space. What do you think is a reasonable guess for this dimension? mn? Why do you say that? I am just trying to understand the intuition; any intuitive answer is good enough. Matrices, very good; that is the object that you have been most familiar with. If you talk about matrices, just think of linear transformations between n-tuples and m-tuples of numbers. An m-by-n matrix is a rectangular array of mn numbers, and you can always think of it as its entries stacked up together into one tall column of, how many numbers, mn. So matrices of size m-by-n are basically also going to come from a vector space whose dimension is mn. Now, the important property we are implicitly, rather intuitively, using here is this: you have the vector space V from which you are mapping to the vector space W, but at the back of your mind, because you use the idea of matrices, what you have done is pass through some ordered bases, say B_V here and B_W there, and coordinate assignments, down to F^n and F^m. You have answered the question while looking at that mapping between tuples, and you are asking me to believe that the dimension of this space of mappings is the same as the dimension of that space of mappings. What are you essentially claiming? You are claiming some sort of an isomorphism. Between what and what? Between this vector space L(V, W) and the vector space of matrices.
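The guess dim L(V, W) = mn can be made tangible in the matrix picture: the elementary matrices E_ij (a 1 in position (i, j), zeros elsewhere) are a natural candidate basis for the m-by-n matrices. A minimal sketch, flattening them to vectors to confirm there are mn independent ones:

```python
import numpy as np

m, n = 2, 3

# Build the mn elementary matrices E_ij, each with a single 1 entry.
basis = []
for i in range(m):
    for j in range(n):
        E = np.zeros((m, n))
        E[i, j] = 1.0
        basis.append(E.ravel())   # flatten, to test independence as vectors

B = np.stack(basis)               # mn vectors, each of length mn
print(np.linalg.matrix_rank(B))   # 6 == m*n: linearly independent, a basis
```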
Remember, these are not merely the isomorphisms between V and F^n and so on that we have seen; what you are now claiming is an isomorphism between L(V, W) and the space of m-by-n matrices, that these are isomorphic in some sense (that symbol, by the way, denotes isomorphism). So, can you think of any ready-made recipe to cook up an isomorphism between these objects? What I mean by that is some sort of a linear bijection that takes every matrix here and maps it to a map there, and every map there to a matrix here. It is a very important question, and we will answer it a while later; but because we are already discussing these objects, I thought it important to mention. We will resume our discussion of inverses of maps in cases where the map is not an isomorphism, but I could not resist the temptation of mentioning this, because it is very important that we understand that this is isomorphic to that, and ask how we show that isomorphism, how we extract it out of these two objects. So, in the next lecture we shall investigate further the existence of the left inverse or the right inverse, in which cases they are possible and in which cases they may not be, despite the matrix being suitably fat or tall. Once again we will come up with some rank condition, which is not surprising, because we have seen that it is the rank-nullity theorem that allowed us to infer all there is to know about these finite dimensional vector spaces and their into and onto maps. Thank you.
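One ready-made candidate for the recipe asked about above is vectorization: flattening a matrix into a tuple of its mn entries. A minimal sketch, under the assumption that this flattening is the bijection in question, checking that it is linear and invertible:

```python
import numpy as np

m, n = 2, 3
rng = np.random.default_rng(2)
A, B = rng.standard_normal((m, n)), rng.standard_normal((m, n))
alpha = 3.0

# Candidate linear bijection between m-by-n matrices and R^{mn}: flattening.
vec = lambda M: M.ravel()

# Linearity: vec(alpha*A + B) == alpha*vec(A) + vec(B).
print(np.allclose(vec(alpha * A + B), alpha * vec(A) + vec(B)))  # True
# Invertibility: reshaping recovers the matrix exactly, so vec is a bijection.
print(np.allclose(vec(A).reshape(m, n), A))  # True
```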