Let me recall the framework for the matrix of a linear transformation. Suppose T : V → W is a linear transformation, where V and W are finite-dimensional vector spaces. I write B_V for an ordered basis of V, say u_1, u_2, ..., u_n, so that the dimension of V is n, and B_W for an ordered basis of W, say v_1, v_2, ..., v_m, so that the dimension of W is m. We are going to define the matrix of T relative to these two bases.

T is completely determined once the vectors T(u_1), T(u_2), ..., T(u_n) are determined. These are n vectors in W, and each of them is in turn determined by m scalars, because B_W is a basis of W: each T(u_j) is a linear combination of the v_i's. Let me write

T(u_j) = A_1j v_1 + A_2j v_2 + ... + A_mj v_m.

There are m basis vectors, so there are m terms, and the numbers A_1j, A_2j, ..., A_mj are unique for T(u_j) with respect to the fixed basis B_W. In summation notation,

T(u_j) = Σ_{i=1}^{m} A_ij v_i.

Remember that on the right-hand side i is the running index and j is the free index corresponding to T(u_j); the first subscript is what varies. In other words, T(u_1) = A_11 v_1 + A_21 v_2 + ... + A_m1 v_m. This summation notation will be useful in some of the computations later.

Now recall from the last lecture the matrix of a vector relative to a basis: you take the coefficients and arrange them in a column. So the matrix of T(u_j) relative to B_W is the column vector with entries A_1j, A_2j, ..., A_mj; I emphasize the basis by writing the subscript, [T(u_j)]_{B_W}.

Now for the matrix of T relative to the ordered bases B_V, B_W. Let us recollect: the complete information about T is given by the n vectors T(u_1), ..., T(u_n), and each of these in turn depends on the m numbers A_1j, A_2j, ..., A_mj. So T is completely determined by m times n numbers. We arrange them as a matrix, which I call A: the (i, j)-th entry is A_ij, and the j-th column is A_1j, A_2j, ..., A_mj. This completely defines the matrix A, which is the matrix of the linear transformation T relative to the two fixed ordered bases, written [T]_{B_V, B_W}. This is in conformity with the usual notation for writing down a matrix:

A_11 A_12 ... A_1n
A_21 A_22 ... A_2n
...
A_m1 A_m2 ... A_mn

To summarize the construction: take two ordered bases, one for V and one for W; look at the action of T on each element of the basis of V; write the result in terms of the basis vectors of W; collect the coefficients and arrange them as columns. That is the matrix of the linear transformation.
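The construction just summarized can be sketched in code. This is my own illustration, not part of the lecture; the function name `matrix_of` and the example data are made up. The coordinates of each T(u_j) relative to B_W are obtained by solving a linear system whose coefficient matrix has the B_W vectors as columns.

```python
import numpy as np

def matrix_of(T, basis_V, basis_W):
    """Matrix of a linear map T relative to ordered bases of V and W.

    A sketch, assuming V and W are realized as R^n and R^m and T is a
    callable taking and returning NumPy vectors.
    """
    # m x m matrix whose columns are the basis vectors of W.
    BW = np.column_stack(basis_W)
    cols = []
    for u in basis_V:
        # Coordinates A_1j, ..., A_mj of T(u_j) relative to basis_W:
        # solve BW @ a = T(u_j).
        cols.append(np.linalg.solve(BW, T(u)))
    return np.column_stack(cols)  # the m x n matrix [A_ij]
```

Each solved coordinate vector becomes one column, exactly as in the summary above.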
What is important is this formula: if

T(u_j) = Σ_{i=1}^{m} A_ij v_i,

then A_ij is the (i, j)-th entry of the matrix of the linear transformation. If this formula holds, it completely determines the matrix.

Let us look at some examples, to consolidate. Say T : R^2 → R^3 is defined by

T(x_1, x_2) = (x_1 + x_2, x_1 − x_2, 2x_2).

I want to determine the matrix of T relative to the standard bases first; you will see that it is very simple. Let B_1 be the standard basis of R^2, namely (1, 0), (0, 1), and B_2 the standard basis of R^3, namely (1, 0, 0), (0, 1, 0), (0, 0, 1). So u_1, u_2 are the first pair and v_1, v_2, v_3 are the second. I must compute T(u_1), write it as a linear combination of the three vectors, and collect the coefficients as the first column; then do the same for T(u_2) to get the second column.

T(1, 0) = (1, 1, 0). Since we are using the standard basis, the coefficients are easy to see:

(1, 1, 0) = 1·(1, 0, 0) + 1·(0, 1, 0) + 0·(0, 0, 1).

That gives the first column. Similarly T(0, 1) = (1, −1, 2), again an easy linear combination. Collecting the coefficients, the first column is (1, 1, 0) and the second column is (1, −1, 2), so the matrix of T relative to the standard bases is

1  1
1 −1
0  2

Now let us redo this problem with B_2 replaced by a different basis, which I will call B_3; this is only to illustrate that the matrix of a linear transformation changes if you change the bases. Take B_3 to consist of (1, 1, 0), (1, −1, 0), (0, 0, 1), for instance. These three vectors are linearly independent, and the dimension of R^3 is 3, so this must be a basis. I want the matrix of T relative to B_1, B_3, and I will reuse the earlier calculations, since B_1 has not changed.

T(1, 0) = (1, 1, 0). Earlier this was written in terms of the standard basis; now I must write it in terms of B_3. But notice that (1, 1, 0) is itself the first basis vector of B_3, which makes matters simple:

(1, 1, 0) = 1·(1, 1, 0) + 0·(1, −1, 0) + 0·(0, 0, 1),

so the first column is (1, 0, 0). T(0, 1) = (1, −1, 2) has been computed before; here the contribution comes from the second and third basis vectors:

(1, −1, 2) = 0·(1, 1, 0) + 1·(1, −1, 0) + 2·(0, 0, 1),

so the second column is (0, 1, 2). This has turned out to be simpler than the previous one: the matrix of T relative to B_1, B_3 is

1 0
0 1
0 2

which is obviously different from the matrix of T relative to the bases considered earlier.

Let us look at one more example: the differentiation operator D : P_3 → P_2 defined by D(p) = p′, where p is a polynomial and p′ is its derivative. Take the standard bases for both spaces: B_1 = {1, t, t^2, t^3} for P_3, which has dimension 4, and B_2 = {1, t, t^2} for P_2, which has dimension 3. I want the matrix of D relative to these two bases. Call the first basis p_0, p_1, p_2, p_3, where p_0(t) = 1, p_1(t) = t, p_2(t) = t^2, p_3(t) = t^3, and the second q_0, q_1, q_2, where q_0(t) = 1, q_1(t) = t, q_2(t) = t^2.

First, remember the order of the matrix of D: D goes from a 4-dimensional space to a 3-dimensional space, so its matrix is 3 × 4, three rows and four columns.

Now the calculations. D(p_0) = 0, since p_0 is a constant; the unique way to write 0 as a linear combination of linearly independent vectors is 0·q_0 + 0·q_1 + 0·q_2, so the first column is (0, 0, 0). D(p_1) = 1 = 1·q_0 + 0·q_1 + 0·q_2, so the second column is (1, 0, 0). D(p_2) = 2t = 0·q_0 + 2·q_1 + 0·q_2, so the third column is (0, 2, 0). D(p_3) = 3t^2 = 3·q_2, all other coefficients 0, so the fourth column is (0, 0, 3). The matrix of the derivative transformation with respect to these two bases is therefore

0 1 0 0
0 0 2 0
0 0 0 3

Let us get back to the general ideas. We have defined the matrix of a linear transformation, but remember that there are really three matrices that we have defined. The framework is as before: T : V → W is linear, V and W are finite-dimensional, B_V is a basis of V and B_W is a basis of W. The three matrices are: the matrix [x]_{B_V} of a vector x relative to B_V; the matrix [Tx]_{B_W} of the vector Tx, which lies in W, relative to B_W; and the matrix [T]_{B_V, B_W} of T relative to the fixed bases B_V, B_W.

How are these three related? We will derive a relationship between them, and that relationship will tell you why I made the statement that any linear transformation between finite-dimensional spaces is like multiplication by a matrix. The result connecting the three matrices is the following:

[Tx]_{B_W} = [T]_{B_V, B_W} [x]_{B_V}.

So the matrix of Tx equals the matrix of T times the matrix of x. According to our notation, [T]_{B_V, B_W} is A, so the matrix of Tx is A times the matrix of x: any linear transformation between finite-dimensional vector spaces is like Tx = Ax, multiplication by a matrix, for x in V.

Let us prove this; remember it is an equation involving vectors. For the proof I will use the formula written down earlier. Recall that the matrix of T relative to the two bases is A, with (i, j)-th entry A_ij, where the j-th column is related to T by

T(u_j) = Σ_{i=1}^{m} A_ij v_i,

with i the running index and j the free index. To emphasize the notation once more: u_1, u_2, ..., u_n is the basis of V, and v_1, v_2, ..., v_m is the basis of W. From this relationship we will establish the formula.
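Before the proof, the relation can be checked numerically on the R^2 → R^3 example worked out above. This is my own sketch with NumPy, not part of the lecture; the variable names are made up. For the standard basis the coordinate vector of Tx is Tx itself; for B_3, multiplying the coordinate vector by the matrix whose columns are the B_3 vectors recovers Tx.

```python
import numpy as np

def T(x):
    # T(x1, x2) = (x1 + x2, x1 - x2, 2 x2), as in the example above.
    x1, x2 = x
    return np.array([x1 + x2, x1 - x2, 2.0 * x2])

# Matrix of T relative to the standard bases B1, B2.
A_std = np.array([[1.0,  1.0],
                  [1.0, -1.0],
                  [0.0,  2.0]])

# B3 = {(1,1,0), (1,-1,0), (0,0,1)}, stored as columns.
B3 = np.column_stack([[1.0, 1.0, 0.0],
                      [1.0, -1.0, 0.0],
                      [0.0, 0.0, 1.0]])

# Matrix of T relative to B1, B3.
A_B3 = np.array([[1.0, 0.0],
                 [0.0, 1.0],
                 [0.0, 2.0]])
```

In both cases [Tx]_{B_W} = A [x]_{B_V} holds for any x, which is exactly the statement to be proved.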
Let us start with the matrix of x relative to B_V. Say x has the representation

x = α_1 u_1 + α_2 u_2 + ... + α_n u_n = Σ_{j=1}^{n} α_j u_j.

Then the matrix of x relative to B_V is just the column (α_1, α_2, ..., α_n). I will need this representation of x.

Now what is T(x)? Using the linearity of T,

T(x) = T( Σ_{j=1}^{n} α_j u_j ) = Σ_{j=1}^{n} α_j T(u_j).

Plug in the formula for T(u_j) from the previous step (the summation index there is i):

T(x) = Σ_{j=1}^{n} α_j Σ_{i=1}^{m} A_ij v_i.

This is a finite sum, so I can interchange the order of summation without a problem. Writing the sum over i first, taking v_i outside, and clubbing the other two sets of scalars together,

T(x) = Σ_{i=1}^{m} ( Σ_{j=1}^{n} A_ij α_j ) v_i.

Look at the inner sum: for every i it is a number, whereas the v_i are vectors. Its running index is j, and the free index is i. Let me call this number β_i:

β_i = Σ_{j=1}^{n} A_ij α_j,   so that   T(x) = Σ_{i=1}^{m} β_i v_i.

Let us go back, see what we have done, and consolidate. T(x) has been written as a linear combination of v_1, v_2, ..., v_m with coefficients β_1, ..., β_m. T(x) is a vector in W, so if I want the matrix of T(x) relative to B_W, I must look at its unique representation in terms of the vectors in B_W, and that is exactly what is in front of us: [Tx]_{B_W} is the column vector (β_1, β_2, ..., β_m). This column is what I really want. Now just plug in the formulas and watch it unfold. Written out in full (the summation in each line is over j):

β_1 = A_11 α_1 + A_12 α_2 + ... + A_1n α_n   (i = 1),
β_2 = A_21 α_1 + A_22 α_2 + ... + A_2n α_n   (i = 2),
...
β_m = A_m1 α_1 + A_m2 α_2 + ... + A_mn α_n   (i = m).

These are simply the expanded forms of β_1, β_2, ..., β_m coming from the formula above. What you have on the right is precisely the matrix product

A_11 A_12 ... A_1n     α_1
A_21 A_22 ... A_2n  ·  α_2
...                    ...
A_m1 A_m2 ... A_mn     α_n

But this matrix is A, that is, [T]_{B_V, B_W}, and the column (α_1, α_2, ..., α_n) is what we started with: [x]_{B_V}.
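The same relation can be checked on the differentiation example from earlier. This is a sketch of my own, assuming the convention used above that coordinates are listed in increasing degree; `M_D` and `alpha` are illustrative names. NumPy's polynomial helpers provide an independent cross-check of the derivative coefficients.

```python
import numpy as np
from numpy.polynomial import polynomial as P

# Matrix of D : P3 -> P2 relative to {1, t, t^2, t^3} and {1, t, t^2},
# as computed above.
M_D = np.array([[0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 2.0, 0.0],
                [0.0, 0.0, 0.0, 3.0]])

# p(t) = 4 + 3t - 5t^2 + 2t^3, so [p]_{B1} = (4, 3, -5, 2).
alpha = np.array([4.0, 3.0, -5.0, 2.0])

# [p']_{B2} should be M_D @ alpha, i.e. beta_i = sum_j A_ij alpha_j.
beta = M_D @ alpha
```

Here β is exactly the coordinate vector of p′(t) = 3 − 10t + 6t^2.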
So the left-hand side is [Tx]_{B_W} and the right-hand side is [T]_{B_V, B_W} [x]_{B_V}; that establishes the basic relationship connecting the matrix of Tx, the matrix of T, and the matrix of x.

Let us look at some of the consequences. To summarize: from a matrix we can define a linear transformation, and from a linear transformation, with fixed bases, we get a unique matrix. This means there is a one-to-one correspondence between linear transformations and their matrices relative to fixed bases. Let me make this precise as a function; then we have the following result.

Define a function Φ from L(V, W) to R^{m×n}. What is L(V, W)? It is the set of all linear transformations T from V into W: I collect all linear transformations from V into W and call that set L(V, W). It is easy to see that this is a vector space. What are the two operations? Vector addition is addition of two linear transformations, defined pointwise: if S and T are linear transformations, then (S + T)(x) = S(x) + T(x), and with respect to this operation S + T is again a linear transformation from V into W. Scalar multiplication is also pointwise: (αS)(x) = α S(x). With respect to these two operations, L(V, W) is a vector space over the same field we started with; we are always looking at the real field for simplicity, so this is a real vector space. I would like to determine the dimension of this real vector space, but before I do that, let me tell you what the function Φ is.

Φ is a function from L(V, W) to R^{m×n}. We know how to uniquely determine a matrix corresponding to a linear transformation: all we have to do is fix two bases. So let us say I have fixed bases, B_V for V and B_W for W. Once these are fixed, there is a unique matrix corresponding to any linear transformation T, so define

Φ(T) = [T]_{B_V, B_W},   for T in L(V, W).

How do I get the numbers m and n? m is the dimension of W and n is the dimension of V. Can you see that Φ is well defined? On the left, T is a linear transformation; on the right is a matrix, an element of R^{m×n}, and it is unique because, relative to two fixed bases, the matrix of a linear transformation is unique. So Φ is a well-defined function.

Φ has some further properties. First, it is linear; it is in fact invertible, being one-to-one and onto; and it also has the property of preserving products, which we will use later — I will explain that a little later. What does it mean that Φ is linear? You must verify that

Φ(S + T) = Φ(S) + Φ(T)   and   Φ(αS) = α Φ(S).

These properties follow simply from the corresponding properties of matrices: if A and B are two matrices, then (A + B)x = Ax + Bx and (αA)x = α(Ax); that will give you these two properties. Please verify this.

What about Φ being one-to-one and onto, that is, injective and surjective? Since Φ is linear, for injectivity it is enough to look at the null space. Suppose Φ(T) = 0, the zero matrix. Then the matrix of T relative to B_V, B_W is the zero matrix, which means each entry is 0; in particular every column is 0. If the j-th column is 0, then T(u_j) is the zero vector, because the unique way to write T(u_j) as a linear combination of v_1, v_2, ..., v_m then has all coefficients 0. So T(u_j) = 0 for every j: T(u_1) = 0, T(u_2) = 0, ..., T(u_n) = 0. But the only transformation that vanishes on every basis vector is the zero transformation, so T = 0. What we have shown is that the null space of Φ is the singleton {0}. Please note the difference here: Φ(T) = 0 refers to the zero matrix, while T = 0 is the zero transformation. Null space {0} means Φ is injective.

How is Φ surjective? Rephrase the question: given a matrix, is there a linear transformation corresponding to it? We know that there is, but the fact that Φ is surjective needs a little argument, which I am going to leave as an exercise. What you must actually do is take a matrix, define a linear transformation from it, and verify that the matrix of that linear transformation relative to B_V, B_W is the matrix you started with. So Φ is a homomorphism which is bijective, that is, an isomorphism.

We know that an isomorphism preserves dimensions. So what is the dimension of L(V, W)?
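The two linearity identities for Φ noted above reduce to matrix identities, and that reduction is easy to see concretely. A sketch of my own (not from the lecture): identify each linear map with its matrix relative to the fixed bases, with arbitrary made-up numerical data for m = 3, n = 2.

```python
import numpy as np

# Identify S and T with their matrices Phi(S), Phi(T) relative to fixed
# bases; the entries are arbitrary, for illustration only.
rng = np.random.default_rng(0)
Phi_S = rng.standard_normal((3, 2))
Phi_T = rng.standard_normal((3, 2))
```

Pointwise addition and scalar multiplication of the maps then correspond exactly to entrywise addition and scaling of the matrices.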
What is the dimension of R^{m×n}? It is mn. Can you give a basis for R^{m×n}? Think it over: the standard basis, consisting of the matrices with a 1 in one entry and 0 in all other entries — one in the first entry, all others 0; one in the second entry, all others 0; and so on for all mn entries. So the dimension of R^{m×n} is mn, and therefore the dimension of L(V, W) is mn: L(V, W) is m times n dimensional if V is n-dimensional and W is m-dimensional.
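The standard basis just described can be generated mechanically. A small sketch of my own (the name `standard_basis` is made up), confirming that the mn matrices E_ij with a single 1 are linearly independent and hence form a basis:

```python
import numpy as np

def standard_basis(m, n):
    # E_ij has a 1 in entry (i, j) and 0 elsewhere -- mn matrices in all.
    basis = []
    for i in range(m):
        for j in range(n):
            E = np.zeros((m, n))
            E[i, j] = 1.0
            basis.append(E)
    return basis
```

Flattening each matrix into a column and checking the rank verifies linear independence, so dim R^{m×n} = mn.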