Okay, so the cyclic decomposition theorem is the next important result. I will first set up the notions of cyclic subspaces and annihilators; annihilators we have seen before. Let me define cyclic subspaces first. What is a cyclic subspace, and what is a cyclic vector? To define a cyclic vector I will look at the following subspace. Let V be finite dimensional and let T be a linear operator on V. For a fixed vector x in V, the T-cyclic subspace generated by x — so T is the operator we start with, and x is any fixed vector — is denoted Z(x; T) and defined by

Z(x; T) = { g(T)x : g ∈ F[t] }.

Observe how this is constructed: take an arbitrary polynomial g, of any degree, form the operator g(T) — T is fixed — and apply it to x; that gives some vector in V. Put that vector into Z(x; T), and do this for all polynomials. The collection of polynomials is infinite, and it is easy to check that Z(x; T) is a subspace of V. I have already given it the name, but let me emphasize: this is a subspace, called the T-cyclic subspace generated by x. Now, there are times when this subspace is equal to the whole of V. When that happens for some vector x, that x is called a cyclic vector for T. (Why the word "cyclic" is used will become clear once we get a basis for this subspace; for now we just keep the name.) So: x is a cyclic vector for T if Z(x; T) = V. To consolidate, let us look at some
examples. Take the case x = 0: what is the cyclic subspace generated by 0? Take all polynomials g; since T is linear, g(T) is also linear, so g(T)0 = 0 and Z(0; T) is just the single-element set {0}. What about an arbitrary vector x and the identity operator? It is not difficult to see that the subspace is span{x} — of course assume x ≠ 0; the case x = 0 has been taken care of already. A little more general is the third example: Z(x; T) = span{x} if and only if x is an eigenvector of T (again x is assumed nonzero; I will not keep emphasizing this). A quick proof. Suppose Z(x; T) = span{x}. The general element of this subspace is g(T)x — for any polynomial g, g(T)x ∈ Z(x; T) by definition — and if the subspace is the span of a single vector, then each g(T)x is some multiple αx, with α in the field. This holds for every polynomial g; in particular, taking g(t) = t, we get Tx = αx, and since x ≠ 0, x is an eigenvector. Conversely — this is something we have seen before, so let me go through it quickly — suppose Tx = λx and let g be any polynomial. What happens to g(T)x? I want to show it is a multiple of x, so that everything in Z(x; T) lies in span{x}. Since Tx = λx, we have g(T)x = g(λ)x; g(λ) is a number, call it α, so g(T)x = αx, a multiple of x, which belongs to span{x}. That is the third example, generalizing the second. So much for examples of the subspace; is there an example of a cyclic vector? Define T on R^2 as follows: for x in R^2, the first coordinate of Tx is 0 and the second coordinate is
the first coordinate of x; that is T, so T(x_1, x_2) = (0, x_1). Let e_1 = (1, 0) be the first standard basis vector. I want to determine the subspace Z(e_1; T) and show that it equals R^2. Observe first that this T has the property T^2 = 0. By definition,

Z(e_1; T) = { g(T)e_1 : g ∈ F[t] }.

Take g to be any polynomial, say g(t) = α_0 + α_1 t + ... + α_k t^k. Since T^2, T^3, and so on are all 0, g(T) is simply α_0 I + α_1 T. Now I calculate g(T)e_1 — the cyclic subspace I am looking at in this example is the one generated by e_1. Applying g(T) to e_1 gives g(T)e_1 = α_0 e_1 + α_1 T e_1, and T e_1 = e_2, so

g(T)e_1 = α_0 e_1 + α_1 e_2.

Here g is arbitrary; essentially only its first two coefficients α_0 and α_1 matter, and as g varies they vary over all scalars. Can I get all vectors of R^2 from this combination? Clearly yes: the cyclic subspace generated by e_1 is the whole of R^2, so e_1 is a cyclic vector. I leave it as an exercise to show that e_2 is not a cyclic vector — note that T e_2 = 0. Can you see already what cyclicity means here? We used e_1 and T e_1; in general a vector x together with Tx, T^2 x, and so on will form a basis for the cyclic subspace — that is what we will prove next. So I have given an example of a cyclic vector in R^2 for a particular linear transformation. I now want to prove the theorem that makes it clear why this is called a cyclic subspace, but before that I need a definition — one we have seen before, which I want to recall. The T-annihilator of a vector x: it is the
subspace defined as follows. I remember having used the notation S(T; x, W) for a subspace W earlier — the set of polynomials g with g(T)x ∈ W; the T-annihilator of x is the particular case where W is the single-element subspace {0}. This was mentioned even at that time: it is the set of all polynomials g such that g(T)x = 0 — the polynomials that annihilate x. (Not quite "all annihilating polynomials of T": every polynomial that annihilates T does belong to it, but in general it contains more polynomials than those.) In this situation it will be renamed M(x; T); I give this new notation because we have seen the object before. So for me M(x; T) is this subspace of F[t], and in fact it is an ideal. In the principal ideal domain F[t], there is a unique monic polynomial which generates this ideal, and that monic polynomial will also be called the T-annihilator of x. Let p_x be the unique monic polynomial generating M(x; T): I started with a fixed x, I am fixing T, and p_x is the unique monic generator of this ideal in F[t]. We have seen before that the minimal polynomial m of T is a member of M(x; T) — m(T) = 0, so in particular m(T)x = 0 — and since p_x generates the ideal, p_x divides m. Not the other way around: p_x could have a smaller degree than the minimal polynomial. So p_x divides m, okay.
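To make this concrete, here is a small numerical sketch, assuming numpy; the function name t_annihilator and the search-by-increasing-degree strategy are my own, not from the lecture. It finds the first power T^k x that is a linear combination of the earlier powers and reads off the monic coefficients of p_x. On the shift operator T(x_1, x_2) = (0, x_1) from the example above, it recovers p_{e1}(t) = t^2 and p_{e2}(t) = t.

```python
import numpy as np

def t_annihilator(T, x):
    """Coefficients [p_0, ..., p_{k-1}, 1] (lowest degree first) of the monic
    least-degree polynomial p with p(T)x = 0.  A numerical sketch only."""
    n = len(x)
    vecs = [x]                       # x, Tx, T^2 x, ... collected so far
    for _ in range(n):
        nxt = T @ vecs[-1]
        M = np.column_stack(vecs)
        # Is T^k x already in span{x, ..., T^(k-1) x}?  Solve M c ~ T^k x.
        c, *_ = np.linalg.lstsq(M, nxt, rcond=None)
        if np.allclose(M @ c, nxt):
            # T^k x = c_0 x + ... + c_{k-1} T^(k-1) x, so
            # p(t) = t^k - c_{k-1} t^(k-1) - ... - c_0.
            return np.concatenate([-c, [1.0]]) + 0.0
        vecs.append(nxt)

# The shift operator on R^2: T e1 = e2, T e2 = 0, so T^2 = 0.
T = np.array([[0.0, 0.0],
              [1.0, 0.0]])
print(t_annihilator(T, np.array([1.0, 0.0])))   # [0. 0. 1.]: p_{e1}(t) = t^2
print(t_annihilator(T, np.array([0.0, 1.0])))   # [0. 1.]:    p_{e2}(t) = t
```

Note that deg p_{e1} = 2 = dim R^2, consistent with e_1 being a cyclic vector, while deg p_{e2} = 1 matches dim Z(e_2; T) = 1 — exactly the relation the next theorem establishes.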
Now, these notions — cyclic vectors and this p_x — are related; that is the next theorem, which I will state and prove. (Is it clear? I will refer to the unique monic polynomial that generates this ideal also as the T-annihilator of x, with notation p_x; in general p_x could have a smaller degree than the minimal polynomial of T.) So we are in a position to prove the theorem. The framework is already there: V is a finite-dimensional vector space and, instead of T, I will take a linear operator U — we will later apply this to operators on subspaces — with p_x the U-annihilator of x. Theorem: if deg p_x = k, then dim Z(x; U) = k; in fact, the set

B = { x, Ux, U^2 x, ..., U^{k-1} x },

which has k elements, is a basis for Z(x; U). Now it should be clear why this is called a cyclic subspace: there is a single vector x, and all the other basis vectors are obtained from it by the action of U. Proof. Let g be any polynomial; I will study Z(x; U) closely by understanding the action of g on x, and then show that B is a basis. Take g in F[t]. I then have two polynomials, g and p_x, and I will apply the division algorithm: F[t] is a principal ideal domain, indeed a Euclidean domain, so division with remainder is available. It is the generalization of the familiar fact for integers: if a and b are positive integers, then I can write a = bq + r, where either r = 0 or r < b. The same thing happens for a
Euclidean domain. For the Euclidean domain F[t], applied to g and p_x: there exist polynomials q (the quotient) and r (the remainder — that is what the letters suggest) such that

g = p_x q + r, where either r = 0 or deg r < deg p_x.

Between any two polynomials this can be done: if deg g < deg p_x, take q = 0 and r = g; if deg g ≥ deg p_x, you can actually divide g by p_x, and there is a quotient q and a remainder r whose degree is smaller. We are looking at Z(x; U), the collection of all g(T)x, and now I have a formula for g:

g(T)x = p_x(T) q(T) x + r(T) x = q(T) p_x(T) x + r(T) x,

where either r = 0 or deg r < deg p_x; these are polynomials in T, so they commute, which justifies the rearrangement. But p_x(T)x must be 0: p_x is, to begin with, a member of the annihilator ideal. So that term vanishes and g(T)x = r(T)x. In other words, whatever the degree of the polynomial g, in its action on x the powers of T from k onwards do not matter, because deg r < deg p_x = k: writing r = r_0 + r_1 t + ..., the expansion stops at r_{k-1} t^{k-1}. (There was a question: where does p_x(T)x = 0 come from?) M(x; T) collects all the polynomials g with the property
that g(T)x = 0; this subspace is an ideal, generated by the unique monic polynomial p_x. So in the first place p_x is a member of it, and p_x(T)x = 0: it is an annihilating polynomial for x. (Also, I stated the theorem with U but, by force of habit, have gone back to T; it does not matter — let us keep it as T.) Is this clear? Since deg r < deg p_x and p_x has degree k, I can write r = r_0 + r_1 t + ... + r_{k-1} t^{k-1}, and so

g(T)x = r(T)x = (r_0 I + r_1 T + ... + r_{k-1} T^{k-1}) x = r_0 x + r_1 Tx + ... + r_{k-1} T^{k-1} x.

I started with g an arbitrary polynomial, and I have now written the complete action of g(T) on the vector x. So, for one thing, it must be clear that

span{ x, Tx, T^2 x, ..., T^{k-1} x } = Z(x; T):

this is a spanning set. I want to show it is in fact a basis, so I must also show that these vectors are linearly independent. Is the calculation clear? To know Z(x; T), I must know the action of g(T) on x for every polynomial g; applying the division algorithm, I observed that this action is the same as the action of a polynomial of degree at most k - 1.
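The reduction g(T)x = r(T)x can be checked numerically. A sketch assuming numpy; the operator T, the vector x, and the polynomial g below are my own sample choices, with p_x(t) = (t - 2)^2 computed by hand for this pair — here (T - 2I)x is nonzero but (T - 2I)^2 = 0.

```python
import numpy as np

def polymat(coeffs, T):
    """Evaluate a polynomial (coefficients listed highest degree first) at the
    square matrix T, by Horner's scheme."""
    out = np.zeros_like(T)
    for c in coeffs:
        out = out @ T + c * np.eye(len(T))
    return out

T = np.array([[2.0, 1.0],
              [0.0, 2.0]])
x = np.array([1.0, 1.0])
g = [1.0, 0.0, 0.0, 3.0, 1.0]     # g(t) = t^4 + 3t + 1, an arbitrary choice
p = [1.0, -4.0, 4.0]              # p_x(t) = t^2 - 4t + 4 = (t - 2)^2

q, r = np.polydiv(g, p)           # g = p_x q + r with deg r < deg p_x

print(polymat(g, T) @ x)          # g(T)x      -> [58. 23.]
print(polymat(r, T) @ x)          # r(T)x: the same vector
```

A degree-4 polynomial acts on x exactly like its degree-1 remainder mod p_x: the spanning argument above, in numerical form.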
So this step is clear: the span of these vectors is Z(x; T). I must now show that they are linearly independent; if they are not, we will get a contradiction. Suppose, then, that x, Tx, ..., T^{k-1} x are linearly dependent. Then there exist scalars δ_0, δ_1, ..., δ_{k-1}, not all 0, such that

δ_0 x + δ_1 Tx + ... + δ_{k-1} T^{k-1} x = 0.

Let i be the greatest subscript, 0 ≤ i ≤ k - 1, such that δ_i ≠ 0; there is at least one, since not all of them are 0. All the terms beyond the i-th have coefficient 0, so

0 = δ_0 x + δ_1 Tx + ... + δ_i T^i x, with δ_i ≠ 0.

What we must do next is clear: using these numbers, define the polynomial h(t) = δ_0 + δ_1 t + ... + δ_i t^i. This polynomial has the property that h(T)x = 0 — this is the kind of argument we have seen before — and deg h = i ≤ k - 1 < k. That is a contradiction: p_x, the T-annihilator of x, has degree k and generates the ideal M(x; T), so it divides every polynomial that annihilates x; no nonzero polynomial of degree less than k can do so. Hence these vectors are linearly independent, and so x, Tx, ..., T^{k-1} x is a basis for the cyclic subspace Z(x; T) — that is what I wanted to say. Okay, before going further let me tell you the general scheme: why are these cyclic subspaces important? This is
really the cyclic decomposition theorem, which we will prove over the next couple of lectures. The objective is to write V as a direct sum of cyclic subspaces,

V = Z(x_1; T) ⊕ Z(x_2; T) ⊕ ... ⊕ Z(x_r; T),

with a cyclic vector x_i generating each subspace. We have already seen a decomposition of this type, for the subspaces W_1, ..., W_k where W_i is the null space of p_i(T)^{r_i}, coming from the primary decomposition theorem. There you have a connection between eigenvalues and the subspaces right away; here the connection is not very clear — where do the eigenvalues come in? — but that will be made clear in due course. This is what we want to prove, the cyclic decomposition theorem; that is the reason these subspaces are important, and why one introduces the cyclic subspaces of an operator T. We then need to look at the matrix of T corresponding to this decomposition: the matrix of T relative to the cyclic basis of the first subspace will be one block, the next block will be the matrix of T relative to a cyclic basis of the second subspace, and so on. I will collect those cyclic bases, take their union, and write down the matrix of T relative to the resulting basis; this will essentially be the Jordan canonical form. That is what we will prove — the last result of this course. So we really need to look at each subspace and the matrix of T relative to the cyclic basis constructed for it. Let us now look at the operator U rather than T: say I have this framework, with U an operator on — really — a subspace W, and suppose
the degree of p_x is k. What we have shown just now is that the vectors x, Ux, U^2 x, ..., U^{k-1} x form a basis for W; nothing has changed except the names, V to W and T to U. I want to write down the matrix of U relative to this basis: what is that matrix, and what information do we get from it? To write it down, let me introduce another notation — really a renaming. Define

v_i = U^{i-1} x, for 1 ≤ i ≤ k.

So v_1, for instance, is just x, the first vector; v_2 is Ux, which is Uv_1; and so on. The last one, v_k, is U^{k-1} x = U^{k-1} v_1, and you can proceed by induction — v_2 = Uv_1, v_3 = Uv_2, ... — to show that v_k = Uv_{k-1}. So the basis vectors have been relabeled as v_1, ..., v_k, and I have B = {v_1, v_2, ..., v_k}. What is the matrix of U relative to this basis? By definition, I look at the image Uv_1 and write it as a linear combination of v_1, ..., v_k — that will be the first column; Uv_2 as a linear combination of v_1, ..., v_k is the second column; and so on. Now Uv_1 = v_2, Uv_2 = v_3, ..., and Uv_{k-1} = v_k, which means I have the first k - 1 columns. Since Uv_1 = v_2 has no v_1 component: first entry 0, second entry 1, all others 0 — that is the first column. For v_3: 0, 0, 1, all others 0; then 0, 0, 0, 1, and so on; the (k-1)-th column is 0, 0, ..., 0 with the last entry 1. But remember, these are just k - 1 equations. The last column will be filled once I know Uv_k — what is the action of U on v_k? That comes from
the annihilating polynomial p_x. I need to fill up the last column, so I need to know Uv_k; once I write Uv_k as a linear combination of v_1, ..., v_k, I know the last column. So let us do the calculation. What I know is that deg p_x = k; let me write

p_x(t) = p_0 + p_1 t + ... + p_{k-1} t^{k-1} + t^k,

where the coefficient of t^k is 1. Using the fact that p_x is an annihilating polynomial for x, I will be able to compute Uv_k:

0 = p_x(U)x = p_0 x + p_1 Ux + p_2 U^2 x + ... + p_{k-1} U^{k-1} x + U^k x
  = p_0 v_1 + p_1 v_2 + ... + p_{k-1} v_k + Uv_k,

since x = v_1, Ux = v_2, ..., U^{k-1} x = v_k, and you can show U^k x = U(U^{k-1} x) = Uv_k — which is what we need to determine. Push the other terms to the other side and I get the unique representation of Uv_k in terms of these vectors:

Uv_k = -p_0 v_1 - p_1 v_2 - ... - p_{k-1} v_k.

Remember that p_x is a polynomial of degree k, so it has k + 1 terms, while on this side there are the k coefficients p_0 up to p_{k-1}; those k coefficients fill up the last column. So I go back and complete the matrix of U relative to the basis B:

    [ 0  0  0  ...  0  -p_0     ]
    [ 1  0  0  ...  0  -p_1     ]
    [ 0  1  0  ...  0  -p_2     ]
    [ .  .  .  ...  .   .       ]
    [ 0  0  0  ...  1  -p_{k-1} ]

This is a k × k matrix with a special structure: 1's just below the diagonal and the negatives of the coefficients of p_x in the last column. The matrix of U relative to the cyclic basis has this form, and this matrix is called the companion matrix of the polynomial p_x. The entries -p_0, -p_1, ..., -p_{k-1} are exactly the coefficients that determine
the polynomial p_x. Okay, so this is called the companion matrix of p_x — the matrix that accompanies the polynomial. What we have shown is that the matrix of the operator U relative to the cyclic basis is the companion matrix. Is the converse true? What is the converse? I know p_x and I know its companion matrix; suppose all I know is that the matrix of U relative to some basis is that companion matrix. Does there then exist a cyclic basis for W — by which I mean a basis generated by a single vector: some x_0, then Ux_0, U^2 x_0, and so on? (I should probably not call the generator x, since x is already in use; call it x_0.) In other words: given that the operator U has, with respect to some basis, the companion matrix of p_x as its matrix, does it follow that there is a cyclic basis for W? The answer is yes, and the proof is really one line. What is that one line? Let B' = {v_1, v_2, ..., v_k} be a basis — I will reserve the script B for the cyclic basis, so let me call this one B' — such that the matrix of U relative to B' equals the companion matrix of p_x. The question is whether I can get from it a basis B which is a cyclic basis for this
subspace Z(x; U). How do I construct it? I am not going to prove this; I will simply tell you what you must do: show that v_1 is a cyclic vector — that is all. So I am really leaving this as an exercise: it can be shown that v_1 is a cyclic vector, that is, that the subspace W is Z(v_1; U). So v_1 is a candidate for what I called x_0 — just one candidate; in fact there are at least k candidates for generating a cyclic basis. The final result I will again state and not prove. Remember what we have done: we looked at the notion of the cyclic subspace generated by a vector, we found a basis for it, and then we wrote down the matrix of U relative to that basis — the companion matrix. The final result is the following: if A is the companion matrix of p_x, the T-annihilator of x, then both the minimal polynomial and the characteristic polynomial of A equal the polynomial p_x. The proof is really by verifying that the T-annihilator of x is an annihilating polynomial for the operator, so that p_x(U) = 0; it then follows that p_x is the characteristic polynomial as well as the minimal polynomial. Okay, so let me stop here.
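Both the companion-matrix construction and this final claim can be verified numerically. A sketch assuming numpy; the helper companion and the sample polynomial p(t) = t^3 - 2t^2 + 5t - 7 are my own choices, not from the lecture. The check: the characteristic polynomial of the companion matrix A is p itself (np.poly lists coefficients highest degree first), and p(A) = 0, consistent with the minimal polynomial also being p.

```python
import numpy as np

def companion(p):
    """Companion matrix of the monic p(t) = p[0] + p[1] t + ... + t^k, where
    p lists the k non-leading coefficients p_0, ..., p_{k-1}."""
    k = len(p)
    A = np.zeros((k, k))
    A[1:, :-1] = np.eye(k - 1)    # ones on the subdiagonal: U v_i = v_{i+1}
    A[:, -1] = -np.asarray(p)     # last column: U v_k = -p_0 v_1 - ... - p_{k-1} v_k
    return A

# Sample monic polynomial p(t) = t^3 - 2t^2 + 5t - 7.
A = companion([-7.0, 5.0, -2.0])

# Characteristic polynomial of A: should be [1, -2, 5, -7] up to rounding.
print(np.poly(A))

# p(A) = A^3 - 2A^2 + 5A - 7I vanishes, as the proof hint p_x(U) = 0 asserts.
pA = A @ A @ A - 2.0 * (A @ A) + 5.0 * A - 7.0 * np.eye(3)
print(np.allclose(pA, 0))         # True
```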