In the previous lecture we saw this important vector space: the space L(V, W) of all linear transformations from a vector space V to another vector space W. And we saw that, so long as these vector spaces V and W are themselves finite dimensional, there is always a way to look at these transformations as nothing more or less than matrices. So suppose the dimension of V is n and the dimension of W is m; what we essentially showed is that there is a constructive way of viewing these mappings as m × n matrices, through an assignment of coordinates: under an ordered basis for V and an ordered basis for W.

Now remember, in and of themselves the coordinate assignments are not matrices. It is very important to know that, because of what they take in as their inputs. When you talk about matrices, they take in n-tuples of numbers and spit out m-tuples of numbers in general, but the input to a coordinate map B_V is an object residing inside V. So ideally I should be writing it as B_V(·), where the object that comes in is an object inside V and the dot stands for whatever I have not chosen to qualify very precisely. B_V stands for the mapping that takes an object from V to F^n, and B_V(v) is the image of the precise object v in V which is being mapped into F^n. Many of you might have been misled into thinking that these coordinate maps are matrices in and of themselves, but they are not; is that clear? The symbol B_V does not stand for a matrix. It is only when we map between the coordinate spaces that we get a matrix: there is this phi, which is an object residing in L(V, W), and there is this A, and it is only A that is a matrix; phi is not.

Later on we showed something else: you could also have chosen a different basis, indeed. So let us call the first pair B_V1 and B_W1, and call the second pair B_V2 and B_W2, giving a second coordinate assignment. Of course, when I now look at the matrix of phi under the second pair, I get some A tilde, Ã, which is a different matrix. However, the interesting thing is this. Suppose I choose to overlook the abstract vector spaces themselves and look instead at their different coordinate representations, subject to the different choices of basis: B_V1 and B_V2 will each take every object of V to some n-tuple, possibly different n-tuples, and similarly any object in W will be taken to an m-tuple corresponding to one choice of basis for W and to some different m-tuple corresponding to the other. Now, if I want to understand the relation between Ã and A without always having to revert to what phi actually is, there is a way of doing that, because everything in sight now lives between F^n and F^m. So if I completely overlook the abstract portion, what I am left with is to show an equivalence of two paths: Ã via the direct path and via the so-called indirect path. What is happening here?
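Before tracing those paths, here is a minimal numerical sketch of what a coordinate assignment is — not from the lecture, just an illustration, assuming V is modeled as R^3 and the basis Q is my own choice. A basis is stored as an invertible matrix whose columns are the basis vectors; the coordinate map B_V is then "solve against Q", which is an operation you perform, not a matrix you are handed.

```python
import numpy as np

# Model V = R^3. Columns of Q form an ordered basis (Q is invertible here).
Q = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])
v = np.array([2., 3., 4.])          # an "abstract" vector in V

coords = np.linalg.solve(Q, v)      # B_V(v): the coordinates c with Q @ c = v
print(coords)                       # [1.5 0.5 2.5]
print(np.allclose(Q @ coords, v))   # True: the inverse map rebuilds v from coords
```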
If I follow an arrow in the opposite direction, it is the inverse mapping, is it not? If a coordinate map takes me from V to F^n, its inverse takes me from F^n back to V; remember, none of these maps individually are matrices, that is the important point to note. And why is the inverse guaranteed to exist? Because each coordinate assignment is an isomorphism: we have shown that the dimensions of the two spaces are the same and that the kernel is only trivial, so the assignment of coordinates, the way it is done, has to be one-to-one and onto, and therefore the inverse is guaranteed to exist.

So what am I doing in going along the indirect path? Maybe I should leave a lot of space here, because as I go I will be accumulating fellows to the left. The first thing I do is hit an object with B_V2^{-1}. What is the input to this fellow? An n-tuple of numbers. But the moment it has acted on something, where is the result dwelling? Inside V. Now this whole object has to be acted upon by B_V1 — see where we are going with this? The claim is that at this point the composition B_V1 ∘ B_V2^{-1} is now legitimately a matrix: this entire operation is nothing but the matrix corresponding to a change of basis, one which we have already seen earlier; it is just a substitution, going from one basis of V to the other. The change of basis is encapsulated by a matrix.

Now what do you do? First you hit it with B_V2^{-1}, then with B_V1, and then whatever n-tuple you have gotten needs to be acted on by the A that I have chosen. So there is this A. But that is not where the story ends, because now I have to go from the resulting m-tuple back to the abstract object inside W. So again, B_W1^{-1}, the inverse thereof, acts on an m-tuple — I am putting those brackets in, and by the time we end we will probably have lots of those brackets, but I hope you understand what is going on — and now I am back in W. Once I am back in W, I need to go to F^m again, but in terms of the representation given by B_W2. So I have the final piece, B_W2, acting on this entire object. Look closely and you will see once again that the cumulative action of B_W2 ∘ B_W1^{-1} is nothing but something that can be captured by a matrix, because it takes an m-tuple to another m-tuple; the underlying object is still the same w inside the big W. It is just a basis change, and we already know that a basis change between m-tuples or n-tuples of numbers is captured through matrices.

So, getting rid of all the parentheses, this is what I had written in the previous lecture, and for brevity we will just write it as Ã = B_W2 ∘ B_W1^{-1} ∘ A ∘ B_V1 ∘ B_V2^{-1}, where the object B_W2 ∘ B_W1^{-1} and the object B_V1 ∘ B_V2^{-1} are representable as matrices.
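To make the two paths concrete, here is a hedged numpy sketch, with V and W modeled as R^3 and R^2 and all four bases chosen at random; the names QV1, BV1 and so on are mine, not the lecture's, and the abstract phi is unavoidably represented by an array here, standing in for the abstract map. The matrix of phi under the second pair of bases agrees with the composition traced above.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2

# Each basis is an invertible matrix whose columns are the basis vectors;
# the coordinate map B is multiplication by its inverse.
QV1 = rng.random((n, n)) + np.eye(n)   # first ordered basis for V (almost surely invertible)
QV2 = rng.random((n, n)) + np.eye(n)   # second ordered basis for V
QW1 = rng.random((m, m)) + np.eye(m)   # first ordered basis for W
QW2 = rng.random((m, m)) + np.eye(m)   # second ordered basis for W
BV1, BV2 = np.linalg.inv(QV1), np.linalg.inv(QV2)
BW1, BW2 = np.linalg.inv(QW1), np.linalg.inv(QW2)

phi = rng.random((m, n))               # stands in for the abstract map V -> W

A = BW1 @ phi @ QV1                    # matrix of phi under (B_V1, B_W1)
A_tilde = BW2 @ phi @ QV2              # matrix under (B_V2, B_W2): the direct path
indirect = BW2 @ QW1 @ A @ BV1 @ QV2   # B_W2 . B_W1^-1 . A . B_V1 . B_V2^-1
print(np.allclose(A_tilde, indirect))  # True: the two paths agree
```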
So you have the matrix Ã on one hand, you have the matrix A on the other hand, and they are related by some transformation matrices that capture the change of basis in the respective n-tuples and m-tuples.

Now, stretch your imagination slightly and consider V = W; in other words, this phi then becomes what we call an endomorphism — we have used that term earlier — a mapping from V to itself. So it is an endomorphism, and furthermore suppose B_V1 = B_W1 and B_V2 = B_W2. What do we have then? Can you not say that the two bracketed objects, the matrices B_W2 ∘ B_W1^{-1} and B_V1 ∘ B_V2^{-1}, are actually inverses of one another? The first thing to observe is that these are matrices, which we have just argued. Once you are agreed on that, to show they are inverses of one another you just take the cumulative effect: operate with one and then the other, successively. Take B_W1^{-1} composed with — not times, composed with — B_V1; but remember B_W1 and B_V1 are the same, so that inner pair leads to the identity map. And then the outer pair — ah, I have probably miswritten something here; it is V2, right, so this is B_V2 — take B_W2 with B_V2^{-1}: because the identity sitting in the middle does not matter, these two also lead to the identity. The identity map, mind you, not yet the identity matrix.

So in that case we have the following: any transformation of the form Ã = P^{-1} A P — let me use P rather than T. This is very important; those of you who have done some preliminary courses on control theory or other such courses will have come across it very often. We call them similarity transformations. What is preserved? A lot of things are preserved, but quintessentially what is preserved is the action: what is being done to every object of V is no different. It is just a different name, a different form that you have given; all that you have essentially done is nothing but a change of basis. If you keep this sort of picture in mind — we already derived this explicitly in the previous lecture — you do not really need to always recall that derivation; you can just see what is going on. All you have to know is that what happens between the coordinate representations is the same as what happens in the abstract space; that is what is being traced out here.

Any questions about this? No, these are not matrices — the coordinate assignments individually are not matrices; they are just assignments of coordinates. I am not saying those are matrices; I am saying the cumulative effects of composing two of them are matrices, because when you go from an n-tuple to another n-tuple through the composition of two such linear mappings, then between F^n's, between n-tuples, the only possible linear transformations are, after all, matrices.
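For the endomorphism case, a quick sketch of a similarity transformation Ã = P^{-1} A P; P here is just a random matrix assumed invertible, and "the action is preserved" shows up numerically as identical eigenvalues, trace and determinant.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.random((n, n))                 # matrix of an endomorphism in one basis
P = rng.random((n, n)) + np.eye(n)     # change-of-basis matrix (almost surely invertible)

A_tilde = np.linalg.inv(P) @ A @ P     # similarity transformation: same map, new basis

# The action is preserved: spectrum, trace and determinant all match.
print(np.allclose(np.sort(np.linalg.eigvals(A)), np.sort(np.linalg.eigvals(A_tilde))))
print(np.isclose(np.trace(A), np.trace(A_tilde)))
print(np.isclose(np.linalg.det(A), np.linalg.det(A_tilde)))
```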
In fact, we will say something much more, now that we have clarified this. The space L(V, W) and the space F^{m×n} are both vector spaces — I asked you to verify that L(V, W) is indeed a vector space; I hope you have done that check — because now it makes sense to say that these two vector spaces are actually isomorphic: there is an isomorphism that takes you between them. Every time you give me a phi in L(V, W), I can spit out a matrix that captures exactly the action of phi on every object in V; the matrix does to n-tuples exactly what phi does to abstract vectors, taking each one to an m-tuple. That is a big claim. I think a couple of lectures back we made some claim like that, but now we are in a position to illustrate why it must be so. Any other questions on this?

So how do we show that this is indeed, as we are claiming, an isomorphism? Suppose you define a mapping M that takes objects in L(V, W) and maps them to objects in F^{m×n}, where of course dim V = n and dim W = m. We must show that there is a linear bijective mapping M which takes objects like phi — mappings from V to W — and translates them to matrices. So what do you think that mapping is? Do I always need to give it a closed-form representation? No — and that is a very important concept: it is not as if every function must look like y = f(x). If I give you a constructive way of relating an object in one vector space to an object in the other, I have adequately told you what the mapping is. Once the mapping is understood in that sense, all I need to show is that it is also linear, injective and surjective; that is all I need for an isomorphism. The claim is that the two spaces are isomorphic, so I must be able to show that this M, whatever I am about to define, is indeed a legitimate linear bijection.

So what do we do? Any suggestions? What is it that we have seen so far which immediately leads to this sort of a mapping; what should be done to an object such as phi so that I get a matrix? We have already seen one such case in the previous lecture. Yes, exactly. Remember L(V, W) and F^{m×n} are different kinds of vector spaces than V and W individually, but you nonetheless take a basis for V — any basis, without loss of generality — and a basis B_W for W, and you study the action of phi on the elements of the basis set, as we have seen. The images phi(v_1), ..., phi(v_n) are objects in W, so we go ahead and represent them in terms of B_W. It sounds like a fancy statement, but it is really just a recollection. See what we are doing: given a phi, that is all I need; the choices of basis in V and W I can always make — these are finite-dimensional vector spaces, so all of this is guaranteed to exist; the only thing I needed was the phi. So M(phi) is given by this: the matrix whose j-th column is the B_W-coordinate representation of phi(v_j). And you agree that this is a matrix — it is a legitimate object, because when you represent the images in terms of the basis in W, they are m-tuples, so this is indeed an object in F^{m×n}.
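As a sketch of this construction (my own illustration, assuming V = R^3 and W = R^2 with a sample phi): apply phi to each basis vector of V, solve for the coordinates of the image in the chosen basis of W, and stack those coordinate vectors as columns.

```python
import numpy as np

def matrix_of(phi, basis_V, basis_W):
    """M(phi): column j holds the B_W-coordinates of phi applied to the
    j-th basis vector of V -- exactly the construction described above."""
    QW = np.column_stack(basis_W)                         # basis of W as columns
    cols = [np.linalg.solve(QW, phi(v)) for v in basis_V]
    return np.column_stack(cols)

phi = lambda v: np.array([v[0] + v[1], v[2]])             # a sample map R^3 -> R^2
eV = [np.eye(3)[:, j] for j in range(3)]                  # standard basis of V
eW = [np.eye(2)[:, j] for j in range(2)]                  # standard basis of W
print(matrix_of(phi, eV, eW))                             # [[1. 1. 0.], [0. 0. 1.]]
```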
So all that we need to show now is that M, the way we have defined it, is linear, injective and surjective.

How do we show that it is linear? Well, we essentially have to show two properties — take phi_1 + phi_2, let M act on it, and show that this is M(phi_1) + M(phi_2), and likewise for scaling — but I can do it even more easily by taking both at one go: show M(alpha phi_1 + phi_2) = alpha M(phi_1) + M(phi_2) for all alpha belonging to the field and all phi_1, phi_2 belonging to L(V, W). That is what I have to show for linearity.

Second, I have to show that M(phi) equals the zero matrix if and only if phi is identically the zero transformation from V to W. Remember, this is not the number 0; it is the zero transformation — sorry, not operator, the zero transformation — which means that no matter what vector v you pass to it as argument, it always maps it to 0. Essentially, every object in the basis for V is taken to 0, and if every object in the basis is taken to 0, then every object in the vector space is taken to 0, because any such linear function is clearly characterized by its action on a basis, as we have seen in the previous lecture.

And thirdly, for any A belonging to F^{m×n} there exists a phi in L(V, W) such that M(phi) = A.

So, basically these three verifications. I am not going to do them completely, but I am going to give you the idea behind them, the sketch; you can fill it out yourself, and it will also give you some practice. How do we go about the first one? Remember, phi_1 and phi_2 are not objects in the individual vector spaces V and W; they are linear transformations themselves. To prove linearity we have to show that whether you combine the abstract maps first and then take the matrix, or take the matrices first and then combine them, the resulting object does not change. M(alpha phi_1 + phi_2) is a matrix; alpha M(phi_1) + M(phi_2) is also a matrix; and how are these matrices characterized? Column by column: the j-th column of M(phi_1) is phi_1 acting on the j-th basis vector, written in coordinates, and a scaled map gives a scaled column — multiplying a matrix by a scalar is equivalent to multiplying every one of its columns individually, and then you can push that scalar inside. So you start with one side and show that it has to be the same as the other, because the linearity of phi itself and the linearity of matrix addition just dovetail nicely. Linearity is hardly anything much to prove. The second one needs a slight amount of thought — a little sophisticated, but not much.
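The linearity check is mechanical enough to demonstrate numerically. A self-contained sketch with standard bases on both sides (so the coordinates of phi(e_j) are just phi(e_j) itself); the maps phi1 and phi2 are arbitrary examples of mine:

```python
import numpy as np

def M(phi, n):
    # matrix of phi: R^n -> R^m under standard bases, built column by column
    return np.column_stack([phi(np.eye(n)[:, j]) for j in range(n)])

phi1 = lambda v: np.array([v[0], v[1] - v[2]])
phi2 = lambda v: np.array([v[2], v[0]])
alpha = 2.5

lhs = M(lambda v: alpha * phi1(v) + phi2(v), 3)   # M(alpha*phi1 + phi2)
rhs = alpha * M(phi1, 3) + M(phi2, 3)             # alpha*M(phi1) + M(phi2)
print(np.allclose(lhs, rhs))                      # True: M is linear
```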
When I say M(phi) = 0, what does it mean? Remember, it is not the zero vector in V or in W; it is the zero over here, in F^{m×n}. And hitting an n-tuple with the zero matrix — what does it do? No matter what n-tuple you give it, it always gives the zero m-tuple. So which is the only operator — or rather, sorry, not operator — which is the only linear transformation from V to W such that every object in V gets mapped to the 0 in W? Can it be a non-zero linear map? If phi were not identically the zero linear map between V and W, there would be at least one object in V which it maps to a non-zero object in W. But then do this particular operation: choose a basis that contains precisely that vector — suppose there is one vector which does not get mapped to the 0 of W; choose it as a starting point for a basis — and then at least one of the columns of M(phi) will be non-zero. So nothing other than the zero linear map from V to W can map to the zero m × n matrix. Therefore the kernel of this M, which maps linear transformations phi to m × n matrices, contains only the zero linear transformation from V to W. The kernel is indeed trivial, and therefore the mapping M that I have defined is one-to-one, injective.

Finally, surjectivity: the very construction of M immediately tells me how to go about it. Just as we did with inverses, I can simply define the map piece by piece. I know that an n-dimensional space has n elements in its basis; go ahead and choose any basis, it has n objects. Now suppose I give you some matrix A — any arbitrary A, which has n columns. Start scanning those columns: take the first column and simply declare that phi(v_1) is the vector in W whose B_W-coordinates are that first column. You are defining the phi, remember — you are not defining v_1 or anything; you can just define phi like that. And you keep doing this reverse assignment likewise, column by column. You are defining the phi; that is the important bit, and it is always important to keep your focus on what objects you are defining: you have been given a matrix A, and from the columns of A you have to cook up the phi. So choose a basis in W, a basis in V, and assign the columns of A to the basis vectors of V in this way. It does not matter whether the columns happen to be linearly independent or not: either way, phi is pinned down on a basis, extending by linearity finishes the job, and the rest I do not really need to care about. So if I have to prove that M is a surjection, I will always be able to find such a map.
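The surjectivity recipe can be sketched as well: hand me an arbitrary A, and I cook up a phi by decreeing that each basis vector of V goes to the vector of W whose coordinates are the corresponding column of A. With standard bases on both sides (an assumption of this sketch), "extend by linearity" collapses to matrix-vector multiplication, and the matrix of the phi so built is A again.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 2, 3
A = rng.random((m, n))                 # an arbitrary m x n matrix, given to us

# Define phi by phi(e_j) = column j of A, extended by linearity; with
# standard bases on both sides this is exactly v |-> A @ v.
phi = lambda v: A @ v

# Rebuild M(phi) column by column: we recover A, so M is onto.
M_phi = np.column_stack([phi(np.eye(n)[:, j]) for j in range(n)])
print(np.allclose(M_phi, A))           # True
```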
So go ahead and do this as part of an exercise and convince yourself that the mapping M is indeed linear, one-to-one and onto. The important conclusion from all of this is exactly what we had said a couple of lectures back: if two spaces are isomorphic, then their dimensions must be the same, and you know that the dimension of the space of m × n matrices over the field F is mn. A rather unsophisticated argument will convince you of that: you have an m × n array of numbers from the field F, and how you view it is up to you. You can stack it up as a rectangular m × n array, or you can stack it up as an mn × 1 array, a tall column, and then it looks exactly like an mn-tuple in the Euclidean spaces; essentially they are the same, exactly mn different numbers. That restacking is also an isomorphism, by the way. In some texts you will see the vec operation used quite routinely: vec of a matrix is the vectorization of the matrix — you take the data inside the matrix and write it as a tall column, but essentially it is the same thing (a one-line sketch follows at the end of this section).

So the dimension of F^{m×n} is exactly mn, and therefore, because of the isomorphism that we have now shown you, the dimension of L(V, W) is also going to be mn. So when you have finite-dimensional vector spaces and you look at the space of all linear transformations between them, a basis for that space must contain exactly mn linear transformations, which can then uniquely specify every linear transformation. Any questions on this so far? Once we have seen this, we will now move on to a very specific kind of linear transformation, and eventually hope to reveal something more interesting; a result that we have already proved will be revisited in light of this new construction that we shall now see.
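A quick coda, not from the lecture itself: the vec operation mentioned above is one line in numpy, where order="F" gives the column-by-column stacking — the same six numbers viewed either as a 2 × 3 array or as a 6-tuple.

```python
import numpy as np

A = np.arange(6).reshape(2, 3)      # a 2 x 3 matrix: mn = 6 numbers
vec_A = A.reshape(-1, order="F")    # vec(A): stack the columns into a tall column
print(A)                            # [[0 1 2], [3 4 5]]
print(vec_A)                        # [0 3 1 4 2 5] -- the same data as a 6-tuple
```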