One of the central notions in linear algebra is that of a linear transformation; indeed, it is central not only to linear algebra but to much of mathematics. Today I will discuss the notion of a linear transformation and give several examples. These examples will justify the statement I just made: you will see that linear transformations arise in differential equations, in integral calculus, in matrices, in transformations between vector spaces, and so on. So let us first look at the notion of a linear transformation, then at several examples, and towards the end of the class at some properties, some simple and some not so simple; then we will be able to compare how a linear transformation behaves in contrast with a general function between vector spaces. Let us begin with the definition. Let V and W be vector spaces. A linear transformation T : V -> W is, to begin with, a function (I am emphasizing that it is a function) that satisfies the following two conditions: T(u + v) = T(u) + T(v) for all u, v in V, and T(alpha u) = alpha T(u) for all u in V and all scalars alpha in the underlying field, which we are assuming is the field of real numbers. So a linear transformation between two vector spaces is a function between those spaces that satisfies these two conditions. Observe where each vector lives: u + v is in V, so T(u + v) is in W, and the two conditions give the formulas for T(u + v) and T(alpha u). Sometimes we write Tu instead of T(u); this is just a notational convenience.
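The two defining conditions can be spot-checked numerically. The following is a minimal sketch, assuming vectors are represented as tuples of floats; the helper name `check_linearity` is illustrative, not from the lecture.

```python
# Spot-check the two defining conditions of a linear transformation on a
# finite sample of vectors and scalars. A passing check is only evidence
# on the sampled inputs, not a proof of linearity.

def vec_add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def vec_scale(alpha, u):
    return tuple(alpha * a for a in u)

def check_linearity(T, samples, scalars):
    """Check T(u + v) == T(u) + T(v) and T(alpha u) == alpha T(u)."""
    for u in samples:
        for v in samples:
            if T(vec_add(u, v)) != vec_add(T(u), T(v)):
                return False
        for alpha in scalars:
            if T(vec_scale(alpha, u)) != vec_scale(alpha, T(u)):
                return False
    return True
```

With integer-valued sample vectors the floating-point arithmetic here is exact, so plain equality comparison is safe.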
Let us now look at examples, disposing of the trivial ones first. Example 1: the map 0 : V -> W defined by 0(u) = 0 for all u in V. Here the 0 on the left-hand side denotes the zero transformation, while the 0 on the right-hand side is the zero vector in W. Trivially this satisfies the two conditions, so it is a linear transformation; it is called the zero transformation. Example 2: the identity transformation I : V -> V defined by I(u) = u for all u in V. Notice that the u on the right-hand side is the same u you started with, so we need W to equal V for the identity transformation. These two are trivial examples of linear transformations. Let us now look at one non-trivial example, and then at two examples motivated by notions from geometry.

Example 3: define T : R^2 -> R^3 by T(x1, x2) = (x1, x2, x1 - x2); for a typical element (x1, x2) of R^2, the first component of the image is x1, the second is x2, and the third is x1 - x2. Observe that (x1, x2) belongs to R^2 while the right-hand side belongs to R^3. Let us verify that this is a linear transformation. Take x, y in R^2, and write x = (x1, x2) and y = (y1, y2). First I must verify T(x + y) = T(x) + T(y). Since addition is coordinatewise, x + y = (x1 + y1, x2 + y2); call this (z1, z2), with z1 = x1 + y1 and z2 = x2 + y2. By the defining formula, T(z1, z2) = (z1, z2, z1 - z2). Substituting back, z1 - z2 = (x1 - x2) + (y1 - y2), so T(x + y) = (x1 + y1, x2 + y2, (x1 - x2) + (y1 - y2)) = (x1, x2, x1 - x2) + (y1, y2, y1 - y2), where the plus sign between the two triples is addition in R^3. The first term is T(x) and the second is T(y), so T(x + y) = T(x) + T(y). We can also quickly verify T(alpha x) = alpha T(x): T(alpha x) = T(alpha x1, alpha x2), which by definition is (alpha x1, alpha x2, alpha x1 - alpha x2) = alpha (x1, x2, x1 - x2) = alpha T(x). So this simple verification shows that the T defined here is a linear transformation.

Let us look at two examples coming from geometry. Example 4: the transformation that sends a vector in R^2 to its reflection with respect to the horizontal axis, namely T(x1, x2) = (x1, -x2). If you think of R^2 with a horizontal and a vertical axis, a point (x1, x2) goes to the point (x1, -x2), its mirror image across the horizontal axis. This T is linear; I will not verify the two defining equations here. So this is a reflection, here taken with respect to the horizontal axis, though any axis would do. Example 5 is rotation.
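Examples 3 and 4 can be spot-checked on concrete vectors. This is a sketch (the function names are mine, not the lecture's): the map into R^3 and the reflection across the horizontal axis.

```python
def T3(x):
    """Example 3: (x1, x2) -> (x1, x2, x1 - x2)."""
    x1, x2 = x
    return (x1, x2, x1 - x2)

def reflect(x):
    """Example 4: reflection across the horizontal axis, (x1, x2) -> (x1, -x2)."""
    x1, x2 = x
    return (x1, -x2)

u, v = (1.0, 2.0), (3.0, 5.0)
# Additivity of T3 on one pair of vectors:
assert T3((u[0] + v[0], u[1] + v[1])) == tuple(a + b for a, b in zip(T3(u), T3(v)))
# Reflecting twice returns the original vector:
assert reflect(reflect(u)) == u
```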
Rotation: let us first derive the formula and then get the transformation from it. I am again in R^2. I have a vector x = (x1, x2) at a distance r from the origin, making an angle alpha with the positive horizontal axis. I rotate it through an angle theta; rotation does not change the length, the distance from the origin. Say the rotated vector is y = (y1, y2). Can we write down a formula for the transformation that sends x to y? Using horizontal and vertical components (a right triangle, if you like), x1 = r cos(alpha) and x2 = r sin(alpha). Similarly for y: y1 = r cos(alpha + theta) and y2 = r sin(alpha + theta). Expanding with the addition formulas cos(A + B) = cos A cos B - sin A sin B and sin(A + B) = sin A cos B + cos A sin B, we get y1 = r cos(alpha) cos(theta) - r sin(alpha) sin(theta) and y2 = r sin(alpha) cos(theta) + r cos(alpha) sin(theta). Going back, r cos(alpha) = x1 and r sin(alpha) = x2, so y1 = x1 cos(theta) - x2 sin(theta) and y2 = x1 sin(theta) + x2 cos(theta). Writing y as a column vector this time, these two expressions become a matrix equation y = A_theta x, where A_theta is the 2 x 2 matrix whose rows are (cos(theta), -sin(theta)) and (sin(theta), cos(theta)), and x = (x1, x2) is also written as a column vector. The entries of this matrix depend on theta, hence the notation A_theta. This is the transformation formula: given x, and knowing the angle of rotation theta, I substitute and get y. Now use this A_theta to define T : R^2 -> R^2 by T(x) = A_theta x; rotation rotates x to y, and the properties of matrix multiplication let us conclude that this T is linear. T is the rotation map, or the rotation transformation.

These two examples come from geometry. Let us look at further examples, still geometric in spirit, but this time in higher dimensions. Example 6: T : R^m -> R^n in the case m >= n (that is, n <= m). There is a natural definition: T(x1, x2, ..., xm) = (x1, x2, ..., xn), so on the right-hand side I have a vector with fewer coordinates; what I have done is to drop the coordinates from n + 1 to m. This T is called the natural projection, and you can verify that it is linear; these are among the simplest examples of linear transformations. Let us also look at the usual projections that we encounter in engineering drawing, for instance: T : R^3 -> R^3 defined by T(x1, x2, x3) = (x1, x2, 0). Such maps are called projection operators; this T is a projection operator on R^3. We will reserve the word "operator" for the case when the vector spaces V and W are the same.
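The rotation matrix A_theta derived above can be sketched directly. The helper name `rotate` is mine; `round` is used only to absorb floating-point error.

```python
import math

def rotate(theta, x):
    """Apply the rotation matrix A_theta to x = (x1, x2)."""
    c, s = math.cos(theta), math.sin(theta)
    x1, x2 = x
    return (c * x1 - s * x2, s * x1 + c * x2)

# A quarter turn sends e1 = (1, 0) to (0, 1):
y = rotate(math.pi / 2, (1.0, 0.0))
assert (round(y[0], 12), round(y[1], 12)) == (0.0, 1.0)

# Rotation preserves the distance from the origin:
x = (3.0, 4.0)
y = rotate(0.7, x)
assert abs(math.hypot(*y) - math.hypot(*x)) < 1e-12
```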
If V = W, a linear transformation is in particular called a linear operator; so this T is an operator, a projection operator. You see that any point in 3-space is dropped onto the horizontal plane, the xy-plane: for any point with coordinates (x, y, z), the z-coordinate is set to 0, so we are looking at the projection of a point in 3-space onto the so-called xy-plane. That is the projection operator, and this is just one example among several: for instance T(x1, x2, x3) could be (x1, 0, x3), or (x1, 0, 0) (projection onto the x-axis), and so on. All of these are called projection operators. Example 7: this time take T : R^m -> R^n with m <= n (equality is allowed), defined as follows: T(x1, ..., xm) = (x1, ..., xm, 0, ..., 0). If m is strictly less than n, the remaining n - m coordinates are taken to be 0; if m = n there are no zero components to add and the map reduces to the natural projection, which in this case is just the identity. This T is not an operator; it is linear, and it is called the natural inclusion. In particular, it allows us to think of R^m as sitting inside R^n when n >= m. So the natural inclusion is another example of a linear transformation.
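The natural projection and natural inclusion of Examples 6 and 7 can be sketched in a few lines (function names are mine, not the lecture's).

```python
def natural_projection(x, n):
    """Keep the first n coordinates of x, dropping the rest (m >= n)."""
    return x[:n]

def natural_inclusion(x, n):
    """Pad x with zeros up to n coordinates (m <= n)."""
    return x + (0.0,) * (n - len(x))

assert natural_projection((1.0, 2.0, 3.0), 2) == (1.0, 2.0)
assert natural_inclusion((1.0, 2.0), 3) == (1.0, 2.0, 0.0)
# Projecting after including recovers the original vector:
assert natural_projection(natural_inclusion((1.0, 2.0), 3), 2) == (1.0, 2.0)
```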
Let us look at an example from the space of matrices. Example 8: let T : R^{m x n} -> R^{n x m}, from the real vector space of m x n matrices to the real vector space of n x m matrices, be defined by T(A) = A^T, the transpose. The transpose was defined earlier: if A = (a_ij), then A^T = (a_ji), so if A is m x n then A^T is n x m. This T is linear. To check additivity, look at T(A + B): by definition this is (A + B)^T, and it is an easy consequence of how matrix addition works that (A + B)^T = A^T + B^T; since A^T = T(A) and B^T = T(B), we get T(A + B) = T(A) + T(B). For homogeneity, T(alpha A) = (alpha A)^T; multiplying A by alpha multiplies each entry of the matrix by alpha, so the scalar can be taken outside: (alpha A)^T = alpha A^T = alpha T(A). (Remember we are in the real case; in the complex case, with the conjugate transpose, an alpha-bar would come outside instead.) So we have verified that T is linear; this one is a transformation between vector spaces of matrices.

Let us look at an example from differential calculus. Example 9: write C[0, 1] for the space of continuous functions on the interval [0, 1] (complex-valued, if you like, but I will consider it as a real vector space), and C^1[0, 1] for the space of continuous functions on [0, 1] whose first derivative exists and is continuous, again as a real space. Define T : C^1[0, 1] -> C[0, 1] by T(f) = f', the derivative function; for instance, T(sin) = cos. First of all, this is well defined: if f is in C^1[0, 1], does f' belong to C[0, 1]? Yes, by the very definition of C^1[0, 1]: f' is the first derivative, and it must be continuous, so f' belongs to C[0, 1] and T is a function. That T is linear comes from differential calculus: d/dt (f + g) = df/dt + dg/dt, and d/dt (alpha f) = alpha df/dt. So this T is linear; it is called the differential operator. And this is not just a superficial connection to differential calculus: later, perhaps in the next lecture, we will discuss the notions of the range space and the null space of a linear transformation, and you will see that when T is a differential operator, especially one with constant coefficients, the null space is precisely the set spanned by the solutions known as the complementary functions of the differential equation. So this connection is not superficial; it will be made clear later.
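Restricted to polynomials, both the differential operator and the integration map f -> integral of f over [0, 1] act on coefficient lists, which makes their linearity easy to check. This is a sketch, not a treatment of the full spaces C^1[0, 1] and C[0, 1]; a polynomial p(t) = c0 + c1 t + ... is stored as its coefficient list.

```python
def deriv(c):
    """Coefficients of p'(t), given the coefficients of p(t)."""
    return [k * c[k] for k in range(1, len(c))]

def integrate01(c):
    """Integral of p(t) over [0, 1], namely the sum of c_k / (k + 1)."""
    return sum(ck / (k + 1) for k, ck in enumerate(c))

p = [1.0, 2.0, 3.0]                 # p(t) = 1 + 2t + 3t^2
q = [0.0, 5.0, 0.0]                 # q(t) = 5t
pq = [a + b for a, b in zip(p, q)]  # (p + q)(t)

# Additivity of d/dt: (p + q)' == p' + q'
assert deriv(pq) == [a + b for a, b in zip(deriv(p), deriv(q))]
assert deriv(p) == [2.0, 6.0]       # (1 + 2t + 3t^2)' = 2 + 6t

# Additivity of integration over [0, 1]:
assert integrate01(pq) == integrate01(p) + integrate01(q)
assert integrate01([0.0, 2.0]) == 1.0   # integral of 2t over [0, 1] is 1
```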
So that was the differential operator, coming from differential calculus. Now one example from integral calculus, and with it I will probably stop this list. Example 10: take T : C[0, 1] -> R, where for simplicity C[0, 1] is now the real vector space of continuous functions on [0, 1]; this time the codomain is one-dimensional, just R. Define T(f) = the integral from 0 to 1 of f(t) dt, the Riemann integral. This is well defined because, from integral calculus, we know that every continuous function is Riemann integrable, so the right-hand side makes sense. You can verify easily that T is linear: for two continuous functions f and g, the integral from 0 to 1 of f(t) + g(t) dt equals the integral of f plus the integral of g, that is, T(f + g) = T(f) + T(g); and T(alpha f) = alpha times the integral from 0 to 1 of f(t) dt, which is alpha T(f). So this T is linear; it is called an integral operator, or simply the integration transformation.

One final example, which in a sense summarizes several (though not all) of the previous ones. Example 11: given an m x n real matrix A, define T : R^n -> R^m by T(x) = A x, matrix multiplication. You see that if A is m x n and x is n x 1, then A x is m x 1, a vector in R^m, so this is well defined. Most of the examples we have discussed previously - the zero transformation, the identity transformation, the reflection, the rotation, the natural projection, the natural inclusion, the projection operators - are particular cases of this for different choices of A. This T is linear, which follows from the properties of matrix multiplication. Now what is also true, and this is one of the most interesting parts of linear algebra, is that a certain converse holds: if I have a linear transformation between finite-dimensional vector spaces, then there is a matrix A - we can construct it - such that T satisfies the equation T(x) = A x. So let me just record that a certain converse is true, and that it holds for finite-dimensional vector spaces. This list should convince you that linear transformations are indeed important objects.

Before I proceed to some simple properties, let me also point out that what is sometimes called "linear" is not the linearity we want, as the following illustrates. Example 12, which is not really an example: take T : R -> R defined by T(x) = x + 1, the translation. We can plot this in R^2: calling the output y, the graph of y = x + 1 is a straight line not passing through the origin. You can verify by simple examples that this T is not linear, in spite of the fact that, intuitively, y = x + 1 is a line in R^2. So a formula representing a line in R^2 does not necessarily correspond to a linear transformation; what I wanted to emphasize is that a transformation that takes a line to a line is not necessarily a linear transformation. So anything that looks linear is not necessarily linear; on the other hand, if the graph is a straight line passing through the origin, then the map is a linear transformation.

Let us now look at some simple properties. Theorem: let T : V -> W be linear. Then the following hold. Property 1: T(0) = 0; the zero vector must be mapped to the zero vector. For instance, you could use this property on the last example: for T(x) = x + 1, T(0) = 1, which is not 0, so that T is not linear. Property 2: just as T(u + v) = T(u) + T(v), the same holds for differences: T(u - v) = T(u) - T(v). Property 3: the additivity condition T(u + v) = T(u) + T(v) can be extended to a finite sum, in fact to finite linear combinations: T(alpha_1 u_1 + alpha_2 u_2 + ... + alpha_k u_k) = alpha_1 T(u_1) + alpha_2 T(u_2) + ... + alpha_k T(u_k), where the coefficients alpha_1, ..., alpha_k are in R and u_1, ..., u_k come from V.

Let us quickly verify that these properties hold. For the first, call u = T(0). Since 0 = 0 + 0 and T is linear, u = T(0 + 0) = T(0) + T(0) = u + u. From the first simple property of vector spaces it follows that u = 0, that is, T(0) = 0. For Property 2, T(u - v) = T(u + (-1)v), since -v = (-1)v; T is linear, so this equals T(u) + (-1)T(v) = T(u) - T(v), the subtraction happening in W. For Property 3, write w = alpha_2 u_2 + ... + alpha_k u_k, so that T(alpha_1 u_1 + ... + alpha_k u_k) = T(alpha_1 u_1 + w) = alpha_1 T(u_1) + T(w). Keep the first term as it is and expand T(w) in the same way, calling the remaining sum w_1 = alpha_3 u_3 + ... + alpha_k u_k, to get alpha_1 T(u_1) + alpha_2 T(u_2) + T(w_1). Proceeding by induction, this equals alpha_1 T(u_1) + ... + alpha_k T(u_k). So these are really simple properties, just making use of the definition of linearity.

Next we will discuss a somewhat less trivial property of linear transformations. To motivate it, let me start with an example. Look at the functions sin x and cos x, real-valued functions of the real variable x. They have the property that sin x = cos x at infinitely many points: all the points pi/4 + 2n pi, for instance. So here are two functions that coincide at infinitely many points and yet are not equal as functions. For linear transformations this kind of thing cannot happen: if two linear transformations T_1 and T_2 coincide on the elements of a basis, then they must be the same linear transformation. This is one important property that separates a linear transformation from a general function. Let me make this precise. Let V be a finite-dimensional vector space and let T_1, T_2 : V -> W be linear transformations (sometimes I will also call them linear maps; they are functions). Note that nothing is assumed about W; it is V that is finite-dimensional, so V has a basis consisting of finitely many elements, say B = {u_1, u_2, ..., u_n}. Suppose T_1 and T_2 satisfy T_1(u_i) = T_2(u_i) for all i with 1 <= i <= n, that is, the transformations T_1 and T_2 coincide on the basis vectors. Then we can show that T_1 = T_2. Contrast this statement with the motivating example: sin x and cos x are equal at infinitely many points, but as functions they are not equal. Remember that T_1 = T_2 means equality as functions: T_1(x) = T_2(x) for all x in V; they are one and the same. One can also draw the following informal statement from this theorem: a linear transformation is completely determined by its action on any basis of the domain space, where I am assuming the domain space is finite-dimensional. Let us prove this quickly. I want to show that T_1(x) = T_2(x) for all x. Let x belong to V. Since the basis B is given explicitly, I can write x = alpha_1 u_1 + alpha_2 u_2 + ... + alpha_n u_n. Now apply T_1 to this representation: T_1 is linear, so T_1(x) = alpha_1 T_1(u_1) + alpha_2 T_1(u_2) + ... + alpha_n T_1(u_n). Next use the hypothesis that T_1(u_1) = T_2(u_1), T_1(u_2) = T_2(u_2), and so on - that is what is given, T_1 and T_2 coincide on the basis vectors - so the sum becomes alpha_1 T_2(u_1) + alpha_2 T_2(u_2) + ... + alpha_n T_2(u_n). Again using the fact that T_2 is linear, rewrite this as T_2(alpha_1 u_1 + alpha_2 u_2 + ... + alpha_n u_n), which is T_2 of the x we started with, that is, T_2(x). So we have shown that T_1(x) = T_2(x) for all x in V, and hence T_1 = T_2.
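Example 11 and Property 1 can be sketched together: T(x) = A x by plain matrix-vector multiplication, with earlier examples recovered as special choices of A, and the translation x -> x + 1 failing the test T(0) = 0. The names here are mine, for illustration.

```python
def matvec(A, x):
    """Matrix-vector product: A given as a tuple of rows."""
    return tuple(sum(a * b for a, b in zip(row, x)) for row in A)

# Reflection across the horizontal axis (Example 4) as a matrix:
A_reflect = ((1.0, 0.0), (0.0, -1.0))
assert matvec(A_reflect, (2.0, 3.0)) == (2.0, -3.0)

# The map of Example 3, (x1, x2) -> (x1, x2, x1 - x2), as a 3 x 2 matrix:
A = ((1.0, 0.0), (0.0, 1.0), (1.0, -1.0))
assert matvec(A, (4.0, 1.0)) == (4.0, 1.0, 3.0)

# Any T(x) = A x sends 0 to 0, while the translation x -> x + 1 does not:
assert matvec(A_reflect, (0.0, 0.0)) == (0.0, 0.0)
translate = lambda x: x + 1.0
assert translate(0.0) != 0.0    # fails Property 1, so it is not linear
```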
Let me look at a numerical example to illustrate this result. Take T : R^3 -> R^2 such that T(e_1) = (-1, 0), T(e_2) = (1, 1), and T(e_3) = (0, 1), where e_1 = (1, 0, 0), e_2 = (0, 1, 0), e_3 = (0, 0, 1) are, in our notation, the standard basis vectors of R^3. These three formulas define T uniquely. What is the general formula for T(x)? I can write it down, because any x can be expressed as a linear combination of the basis vectors. So let us do that quickly: take x in R^3 and, following our notation consistently, write x = (x1, x2, x3); then x = x1 e_1 + x2 e_2 + x3 e_3, the coefficients being given by the components of x. I want T(x), and by linearity T(x) = x1 T(e_1) + x2 T(e_2) + x3 T(e_3). Just plug in the given values: T(x) = x1 (-1, 0) + x2 (1, 1) + x3 (0, 1) = (-x1 + x2, x2 + x3). This is the general formula: if you know x, you substitute here and get T(x). So the action of a linear transformation on a basis is enough to determine the linear transformation completely. Let us also ask a question whose answer will be given in the next lecture. As before, let B = {u_1, u_2, ..., u_n} be a basis of V, and let {w_1, w_2, ..., w_n} be a subset of W, not necessarily a basis; so I have a basis of V and just a subset of W. The question is: does there exist a linear transformation (a linear map) T : V -> W that takes each u_i to the corresponding w_i, that is, such that T(u_i) = w_i for each i? For this to be possible, do we need any conditions on w_1, ..., w_n? And if such a linear transformation exists, is it unique? We will answer these questions in the next lecture.
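The worked example above can be spot-checked numerically, assuming the prescribed values on the standard basis: T(e_1) = (-1, 0), T(e_2) = (1, 1), T(e_3) = (0, 1).

```python
def T(x):
    """General formula derived in the lecture: T(x) = (-x1 + x2, x2 + x3)."""
    x1, x2, x3 = x
    return (-x1 + x2, x2 + x3)

# The formula reproduces the prescribed values on the basis vectors:
assert T((1.0, 0.0, 0.0)) == (-1.0, 0.0)
assert T((0.0, 1.0, 0.0)) == (1.0, 1.0)
assert T((0.0, 0.0, 1.0)) == (0.0, 1.0)
# Linearity then determines T everywhere, for example:
assert T((2.0, 3.0, 4.0)) == (1.0, 7.0)
```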