So let us discuss some more properties of linear transformations. The first important result for linear transformations whose domain is a finite dimensional vector space is called the rank nullity dimension theorem, so let me prove that first. First, what is the rank and what is the nullity? Let T be a linear transformation from V into W, where V is finite dimensional. The rank of T is the dimension of the range of T, or the range space of T; let us denote it by gamma. So gamma is the dimension of the range of T; that is the rank. The nullity of T, let us denote it by eta, is the dimension of the null space of T; that is, eta is the dimension of N(T). So I have defined these two numbers; gamma and eta are non-negative integers. The dimension of the range space of T is the rank; the dimension of the null space of T is the nullity. Then there is the dimension of the domain space V, which I have assumed to be finite dimensional. This theorem relates these three numbers. The rank nullity dimension theorem, which I will call the RND theorem: let V be finite dimensional and let T be a linear map from V into W. There are no conditions on W, which means W could even be infinite dimensional; but V is finite dimensional and T is linear. Then the theorem says that rank plus nullity equals the dimension of the domain space. Now you see that the dimension of W does not come into this equation; that is the reason why there is no condition on W. So let us prove this and then look at some of the consequences of this result. In the proof I want to make use of the fact that any linearly independent subset of the vector space V can be extended to a basis of V. So let me start with the null space of T. What I know is that the null space of T, being a subspace of V, must be finite dimensional.
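Before going into the proof, let me record the statement in symbols, using the notation just introduced (this is only a restatement of what was said above):

```latex
% T : V \to W linear, V finite dimensional;
% \gamma = \dim R(T) is the rank, \eta = \dim N(T) is the nullity.
\operatorname{rank}(T) + \operatorname{nullity}(T) = \gamma + \eta = \dim V .
```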
The null space of T is finite dimensional, so I will take a basis consisting of finitely many elements. Let u1, u2, ..., u eta be a basis of the null space of T; the null space of T is eta dimensional, so there are eta vectors here. This basis of the null space of T is linearly independent, and so it can be extended to a basis of V. Extend it to a basis of V: that is, I will have u1, u2, ..., u eta, and then let me call the other vectors v1, v2, ..., vL. This is a basis of V, and so the dimension of V must be eta plus L, where eta is the dimension of the null space of T, that is, the nullity. So we need only show that L equals the rank of T. Is that clear? The dimension of V is eta plus L because this is a basis and there are eta plus L vectors; the eta vectors we started with form a basis for the null space of T, so eta is the nullity of T. We need only show that L equals the rank of T, that is, gamma, the dimension of the range space of T. So we must exhibit a basis of the range of T consisting of precisely L vectors. It is probably not unnatural to take all these basis vectors and look at their images under T. The action of T on the vectors u1, ..., u eta will give you 0, because they are in the null space. So look at the other vectors: in fact what we will show is that T v1, T v2, ..., T vL is a basis of the range of T. Suppose we show that this is a basis of the range of T; since there are L vectors here, it then follows that the rank of T is L, and so the equation holds. So that is what we will do: we will show that these vectors are linearly independent and that they span the range of T. Let us first dispose of the spanning part.
We must show that this is a basis, so we must show that it is linearly independent and a spanning set. So let us take y in the range of T and show that y is a linear combination of these vectors. y is in the range of T, so by definition there exists x in V such that y = T x. Now u1, ..., u eta, v1, ..., vL is a basis of V, and so I can write x as a linear combination of these vectors: x = alpha1 u1 + alpha2 u2 + ... + alpha eta u eta + beta1 v1 + beta2 v2 + ... + betaL vL. T is linear, so y = T x can be rewritten as alpha1 T u1 + alpha2 T u2 + ... + alpha eta T u eta + beta1 T v1 + ... + betaL T vL. But u1, u2, ..., u eta are in the null space of T, so the terms T u1, ..., T u eta are 0, and y is just beta1 T v1 + ... + betaL T vL. That is, y belongs to the span of the vectors T v1, T v2, ..., T vL, being a linear combination of these vectors. So, for one thing, T v1, T v2, ..., T vL is a spanning set: it spans the range of T. Next, linear independence: let us consider a linear combination of T v1, T v2, ..., T vL, equate it to 0, and show that each coefficient is 0.
So suppose delta1 T v1 + delta2 T v2 + ... + deltaL T vL = 0; I must show that each of the scalars is 0. Use the fact that T is linear: this means T(delta1 v1 + ... + deltaL vL) = 0, and so the vector delta1 v1 + ... + deltaL vL belongs to the null space of T, since a vector x belongs to the null space of T precisely when T x = 0. The null space of T has u1, u2, ..., u eta as a basis, so delta1 v1 + delta2 v2 + ... + deltaL vL must be a linear combination of u1, u2, ..., u eta; let us say it equals beta1 u1 + beta2 u2 + ... + beta eta u eta. Bring this to the left hand side: delta1 v1 + ... + deltaL vL minus beta1 u1 minus ... minus beta eta u eta is 0. But u1, u2, ..., u eta, v1, v2, ..., vL are linearly independent, so each of the scalars delta1, ..., deltaL and beta1, ..., beta eta is 0. Go back and see what we had: I started with delta1 T v1 + delta2 T v2 + ... + deltaL T vL = 0 and I have shown that the scalars delta1, ..., deltaL are 0. So T v1, T v2, ..., T vL is a linearly independent subset. Our claim was that L is the dimension of the range of T: first we showed that the vectors T v1, T v2, ..., T vL form a spanning set, and then we showed that they are linearly independent, so they are a basis of the range of T, and hence the theorem. That is the rank nullity dimension theorem. Okay, let us look at some consequences. One of the consequences of the rank nullity dimension theorem is the following, which I will state as the first
corollary: V and W are finite dimensional, and in fact this time I will take dimension W equal to dimension V, with T from V to W linear. Then we have the following: T is injective if and only if T is surjective. If the dimensions of the domain and the co-domain vector spaces are the same, then a linear map is injective if and only if it is surjective. Now go back to the last example of the last lecture: T from R2 to R2, T(x) = (x1 + x2, x1 - x2). We verified that it is one-to-one as well as onto. If we had known this theorem, it would have been enough to verify only one of those: the dimension of the co-domain and the dimension of the domain are the same, so it is enough to verify one of them. How do we prove this? It is a consequence of the rank nullity dimension theorem: the rank of T plus the nullity of T is the dimension of the domain space, which is also the dimension of the co-domain space. Suppose T is injective. We had shown last time that the null space of T is then the singleton {0}; in fact these are equivalent: T is injective if and only if the null space of T is {0}. The null space of T is {0} if and only if the nullity of T is 0, because we had defined the dimension of the zero space to be 0. So the null space of T is {0} if and only if the nullity, that is eta, is 0. And eta is 0 if and only if gamma equals the dimension of W, since gamma plus eta is the dimension of V, which is the dimension of W. But what is gamma?
Gamma is the dimension of the range of T, so this happens if and only if the dimension of the range of T equals the dimension of W. The range of T is a subspace of W, and if it has the same dimension as W then it must be equal to W. So this happens if and only if the range of T equals W, which is the same as saying that T is surjective. So T is injective if and only if T is surjective. That is one of the consequences. Now look at the finite dimensional case, when both V and W are finite dimensional: dimension V and dimension W are integers, and one can compare these two integers. I will state this as the next corollary. Again V and W are finite dimensional; this time I will assume dimension V to be n and dimension W to be m, and T from V into W is linear. Then we have the following. The equality case has been discussed earlier, so let us discuss the cases when one of them is greater than the other; n greater than m is one case, and m greater than n is the other case. If n is greater than m, then the conclusion is that T is not injective: if n is greater than m then T cannot be injective. If m is greater than n, then T cannot be surjective. That completely exhausts the three cases when V and W are both finite dimensional. Proof, by contradiction. For the first part, if T is injective, then we know that the null space of T is the singleton {0}, and so the nullity is 0; and so the rank nullity dimension theorem, rank plus nullity equals dimension of the domain, says that the rank equals the dimension of V, which is n. But gamma is the dimension of the range space, so what this means is that the dimension of the range space of T is equal to n. But the range space is a subspace of W, and the dimension of W is m, so this n cannot exceed m. First we have used that the dimension of V is n, so gamma is
n. Gamma is the dimension of the range space, and the dimension of a subspace of a finite dimensional vector space cannot exceed the dimension of that space, so this n cannot exceed m. That is a contradiction: what we have shown is that if T is injective then n is less than or equal to m, which means that if n is strictly greater than m then T cannot be injective. That is the first statement. The second one is similar; I am going to leave it as an exercise. The second part is again proved by contradiction. So these are two important consequences of the fundamental result on the dimensions of the null space and the range space of a linear map whose domain is a finite dimensional vector space. Let us look a little closer at linear transformations that are both injective and surjective. If a function is injective and surjective then it is invertible: if a function is both one-to-one and onto then it is bijective, it is invertible, and the inverse function is also one-to-one and onto. If f is bijective then f inverse exists and f inverse is bijective. The question that we would like to ask here is this: if T is injective, surjective, and linear, then I know T inverse is injective and surjective; but is T inverse linear? The answer is yes. Let us first prove that the inverse of a linear transformation is also a linear transformation, and then look at some consequences for vector spaces when there exists an invertible linear transformation between them.
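Both corollaries, and the rank nullity dimension theorem itself, are easy to check numerically for matrix maps T(x) = Ax, where gamma is the matrix rank and eta is the dimension of the null space. Here is a minimal sketch using numpy and scipy; the matrices A and B are my own illustrative choices, not from the lecture:

```python
import numpy as np
from scipy.linalg import null_space

# T : R^4 -> R^3 given by T(x) = A x.
A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],   # a multiple of row 1, so the rank drops
              [0.0, 1.0, 1.0, 0.0]])
gamma = np.linalg.matrix_rank(A)      # rank = dim(range of T)
eta = null_space(A).shape[1]          # nullity = dim(null space of T)
assert gamma + eta == A.shape[1]      # rank + nullity = dim of domain (here 4)

# First case, n > m (here 4 > 3): T cannot be injective,
# so there is a nonzero vector in the null space.
x = null_space(A)[:, 0]
assert not np.allclose(x, 0) and np.allclose(A @ x, 0)

# Second case, m > n: B maps R^2 into R^3, so T(x) = B x cannot be
# surjective; its range is a subspace of R^3 of dimension at most 2.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
assert np.linalg.matrix_rank(B) < 3
```

None of the assertions fail: for A the rank and nullity add up to the domain dimension 4, and the range of B is a proper subspace of R^3.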
So let me first prove this result; I will state it as a lemma. Let T from V into W be linear and bijective, by which I mean injective and surjective, one-to-one and onto. Then we know that T inverse exists as a function, and we know that T inverse must be a function from W to V. The claim is that it is linear: T inverse is linear. Let us first prove this and then look at some of the consequences of this result. To show T inverse is linear, I have to verify two equations: that T inverse is additive, and that T inverse of alpha x is alpha times T inverse x. Proof: let us first take two vectors x, y in W, this time, and consider T inverse of (x + y). I must show that this equals T inverse x plus T inverse y. For one thing, let us call T inverse of (x + y) the vector z; this vector z belongs to V, and remember that x and y come from W. Now I operate by T: T of T inverse (x + y) is T of z. But T composed with T inverse, T circle T inverse, is the identity transformation, so x + y = T z. Now look at x and y: they come from W, and I know that T from V to W is bijective, so each of them must have a pre-image: x = T u and y = T v for some u, v in V. In fact these pre-images must be unique; you can verify that if T is bijective then pre-images are unique. In any case, we have u and v in V such that x = T u and y = T v. Go back, substitute, and use the injectivity of T. Let me now write T z first: T z = x + y, x = T u, y = T v, so T z = T u + T v; T is linear, so T z = T(u + v). T is injective, T x = T y implies x = y, so z = u + v. Since T u = x and T v = y, u + v is T inverse x plus T inverse y; but z, on the other hand, is T inverse (x + y). So T inverse (x + y) equals T inverse x plus T inverse y, and T inverse is additive. The second part is similar; let me go through that
quickly. Consider again T inverse of alpha x; I will call that z. Then alpha x = T z. Now x is in W, so I can write x = T u, using the same u as before, and alpha T u = T z. T is linear, so the scalar goes in: T(alpha u) = T z. T is injective, so alpha u = z. On the one hand, alpha u is alpha times T inverse x; on the other hand, z is T inverse of alpha x. So T inverse is linear. That is, if T inverse exists and T is linear, then T inverse is linear. There are no dimensions here; this is true for a linear transformation between any two vector spaces. Now let us look at what an invertible linear transformation does on finite dimensional vector spaces. We need the following definition: T from V to W is called an isomorphism if T is linear and bijective. Isomorphism means same structure; morphism refers to structure, so an isomorphism preserves the structure. It is a special name for a linear transformation which is also bijective. What we have seen just now is that if T is an isomorphism then T inverse is also an isomorphism: note that we have just shown T inverse is linear, and that T inverse is bijective is known anyway. What does an isomorphism do to finite dimensional vector spaces? Before that, another notion: isomorphic vector spaces. If T from V to W is an isomorphism, then we say that V is isomorphic to W; if there is an isomorphism from V into W, we say that V is isomorphic to W, and we use the following notation to denote it: on the left I have V, on the right I have W, with the symbol for isomorphism in between; V is isomorphic to W. If V is isomorphic to W, can I conclude that W is isomorphic to V? Why? T inverse: if T is an isomorphism then T inverse is an isomorphism from W to V, and so we can now say that V and W are isomorphic; it is not just that V is isomorphic to W, we
can in this case say that V and W are isomorphic. Also, if V is isomorphic to W and W is isomorphic to Z, can I conclude that V is isomorphic to Z? Yes, because the composition of bijective maps is bijective, and the composition of linear maps is linear. I have not proved this before, but it is not difficult to prove that the composition of linear maps is linear; and for the inverse of a composition there is a reverse order law, similar to the matrix inverse: the inverse of T1 circle T2 is T2 inverse circle T1 inverse. And what is the trivial isomorphism from a vector space to itself? The identity: it is linear and bijective, and its inverse is linear and bijective. So V is isomorphic to itself; V isomorphic to W implies W isomorphic to V; and V isomorphic to W together with W isomorphic to Z implies V isomorphic to Z. So this is an equivalence relation, and it partitions the set of all vector spaces into equivalence classes. The classes have the property that if you take two vector spaces from two different classes, they cannot be isomorphic, and if you take two vector spaces that are isomorphic, they belong to the same class. What an isomorphism also does is to split finite dimensional vector spaces according to their dimensions; that is the important thing: the isomorphism classes correspond precisely to the dimensions of the vector spaces. That is, if two finite dimensional vector spaces are not isomorphic, they cannot be of the same dimension; on the other hand, if two vector spaces have the same dimension, then they belong to the same isomorphism class, that is, there is an isomorphism between them. This is what we will prove in a short while from now.
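The reverse order law for the inverse of a composition, mentioned above, is the familiar matrix identity (AB) inverse = B inverse times A inverse, and it can be sanity-checked numerically for any pair of invertible matrices. A small sketch; the two matrices are arbitrary invertible examples of mine:

```python
import numpy as np

# Two invertible linear maps on R^2, represented by matrices.
T1 = np.array([[1.0, 2.0],
               [0.0, 1.0]])
T2 = np.array([[0.0, 1.0],
               [1.0, 1.0]])

# The composition T1 circle T2 corresponds to the matrix product T1 @ T2.
composed = T1 @ T2

# Reverse order law: inverse of (T1 circle T2) = T2 inverse circle T1 inverse.
lhs = np.linalg.inv(composed)
rhs = np.linalg.inv(T2) @ np.linalg.inv(T1)
assert np.allclose(lhs, rhs)
```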
So let me first give an example of an isomorphism and then proceed. You will notice, after I write the isomorphism down, that it is really trivial; still, I want to look at this example. Let us consider P2 and R3: P2 is V and R3 is W. P2 is the space of all polynomials with real coefficients of degree less than or equal to 2; the dimension of P2 is 3, and R3 is of dimension 3. I want to give an isomorphism between them. Let me define T from P2 to R3: for T of a polynomial p, I must have on the right hand side a vector with 3 coordinates. Can you give a natural vector on the right? Just take the coefficients. What I know is that p is a polynomial: p(t) = alpha0 times 1 + alpha1 t + alpha2 t squared, where t is a real variable and alpha0, alpha1, alpha2 are from R; this is a real polynomial. Define T of p to be the 3 dimensional vector (alpha0, alpha1, alpha2). Then, and I am going to leave this as an exercise, T is linear and T is one-to-one; that is enough, for then T is onto, because the dimensions are the same. So this is an isomorphism, and T inverse is an isomorphism from R3 to P2; maybe you can take that as an exercise again: what is the inverse of this transformation? Let me just state that T is an isomorphism; what I have exhibited here is an example of an isomorphism. You see that it is almost natural how to associate an isomorphism between vector spaces of the same dimension; we will try to imitate this in the general case. But before that, I want to prove the following result. Let T from V into W be an isomorphism, and assume that V is finite dimensional: let u1, u2, ..., un be a basis of V. Given that T is an isomorphism, what can be shown is this; what is your guess, can I get a basis for W? Look at T u1, T u2, ..., T un: this is a
basis of the vector space W. As a consequence, the dimension of V is equal to the dimension of W. This is what I said: if you have an isomorphism between finite dimensional vector spaces, then the vector spaces must have the same dimension. We will prove the converse also: if two vector spaces have the same dimension, then there is an isomorphism, very similar to this particular example. Let me first prove this result. I want to show that T u1, ..., T un is a basis of W: a spanning set, and linearly independent. Let y belong to W. T is an isomorphism, so T is bijective, in particular surjective: there exists x in V such that y = T x. Now u1, ..., un is a basis of V, so I can write y = T(alpha1 u1 + alpha2 u2 + ... + alphan un); T is linear, so y = alpha1 T u1 + ... + alphan T un. I have written y as a linear combination of T u1, ..., T un, so T u1, T u2, ..., T un is a spanning set. There is nothing much in this; we have only used that T is onto. For linear independence we will use the injectivity of T. Suppose that a linear combination of these vectors is 0: let beta1 T u1 + ... + betan T un = 0. This means T(beta1 u1 + ... + betan un) = 0, which means the vector inside, beta1 u1 + ... + betan un, belongs to the null space of T. But T is injective, so the null space of T is the singleton {0}, and this must be the 0 vector: beta1 u1 + ... + betan un = 0. But remember that u1, u2, ..., un form a basis for V, so these vectors are linearly independent, and beta1, beta2, ..., betan must all be 0; each of the scalars is 0. So, I started with beta1 T u1 + ... + betan T un = 0 and have shown that each of the scalars is 0; it follows that T u1, ..., T un is linearly independent. We have already shown that it is a spanning set, so it is a basis of W, and the number of elements in this basis is n, so it follows that the dimension of W is n, which is the same as the dimension of V. In the next lecture I will prove the converse and also consider other examples.
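The coefficient map from P2 to R3 in the example, and the result just proved, that an isomorphism carries a basis to a basis, can both be illustrated in code. In this sketch a polynomial alpha0 + alpha1 t + alpha2 t^2 is represented simply by its coefficient triple; that representation is my choice for illustration, not something fixed in the lecture:

```python
import numpy as np

def T(p):
    """The isomorphism from the lecture, P2 -> R^3: send the polynomial
    a0 + a1*t + a2*t^2 to the vector (a0, a1, a2). Here a polynomial is
    represented by its coefficient array (a0, a1, a2)."""
    return np.asarray(p, dtype=float)

# A basis of P2 other than the standard one: 1, 1 + t, (1 + t)^2,
# i.e. coefficient triples (1,0,0), (1,1,0), (1,2,1).
basis_P2 = [np.array([1.0, 0.0, 0.0]),
            np.array([1.0, 1.0, 0.0]),
            np.array([1.0, 2.0, 1.0])]

# Images under T, stacked as the columns of a 3 x 3 matrix.
images = np.column_stack([T(p) for p in basis_P2])

# T carries a basis of P2 to a basis of R^3: the three image vectors
# are linearly independent, i.e. the matrix has full rank.
assert np.linalg.matrix_rank(images) == 3
```

The matrix of images has rank 3, so T u1, T u2, T u3 is indeed a basis of R3, matching the theorem just proved.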