We are proving Step 2; I have written down the statement once again. Essentially we have this representation: f(T)y = y_0 + Σ_{i=1}^{k-1} g_i(T) y_i. Here y is an arbitrary vector, f is the conductor polynomial, and f(T)y must belong to W_{k-1}; I am looking at a general representation of f(T)y. Whenever I have a representation like this, it always holds — this is what we must show — that f divides each of the polynomials g_1, g_2, ..., g_{k-1}, and that y_0 can be written as f(T)z_0 for some z_0 in W_0. I mentioned that this is T-admissibility: W_0 is a T-admissible subspace of V. Okay, the proof is by induction on k, and the case k = 1 is just T-admissibility of W_0: for k = 1 there are no g_i terms, so f(T)y is y_0, and then y_0 = f(T)z_0 follows. So for k = 1 I need to verify only this part, because the polynomials g_i are not present; k = 1 is just T-admissibility of the subspace W_0. We will assume the result is true for k − 1 and prove it for k. So consider k > 1. I look at the polynomials g_i and f and apply the division algorithm: there exist polynomials h_i and r_i such that g_i = f h_i + r_i for every i, where either r_i = 0 or deg r_i < deg f. The claim is that the remainder r_i equals 0 for all i. If, let us say, we have proved this claim, it would then follow that f divides g_i, which is what we want to show; the second part follows from admissibility. So we will show that r_i is 0. The proof will be by contradiction: suppose some r_i is not 0; we will get a contradiction to the hypothesis
that it is true when the index is k − 1. Okay, so we want to show r_i = 0 for all i. Remember that I have been given a vector y; I will use this vector y and define a vector z = y − Σ_{i=1}^{k-1} h_i(T) y_i. The polynomials h_1, ..., h_{k-1} coming from the previous step are known, and I define the vector z in this manner. Now, W_0 is T-admissible, and each W_i is T-invariant: each of the cyclic subspaces is invariant under T, W_0 is invariant under T, and at every step you are adding a subspace which is invariant under T, so each W_k is invariant under T. Look at the terms h_i(T) y_i, taken from i = 1 to k − 1: can you see that each of them belongs to W_{k-1}? This is because the subspaces form a nested sequence, W_1 ⊆ W_2 ⊆ W_3 ⊆ ..., so W_1, W_2, ..., W_{k-2} are all contained in W_{k-1}; each of these terms belongs to the last one, W_{k-1}. And so the difference z − y belongs to W_{k-1}. Now if this happens, we should quickly make this observation: look at s(z; W_{k-1}). I am claiming that it is the same as s(y; W_{k-1}), which I have already denoted by f. So all I need to do is show that these two polynomials are the same. In other words, I want to show that if two vectors x and u satisfy x − u ∈ W for a T-invariant subspace W, then s(x; W) = s(u; W) — and the difference z − y does belong to the subspace W_{k-1}. The fact that the two conductors are the same comes from the definition of those sets S(·; W), so maybe I will just quickly highlight that. Let us say I have vectors x and u such that x − u belongs to W; then I want to show that
this set S(x; W) is in fact the same as S(u; W). Okay — if these two sets are the same, then their monic generators, the conductors, will also be the same. So let us say g belongs to S(x; W); then by definition g(T)x belongs to W, say g(T)x = w with w in W. Now consider g(T)(u − x), which is g(T)u − g(T)x. Since u − x belongs to W and W is invariant under T, g(T)(u − x) belongs to W. So g(T)u − g(T)x, that is, g(T)u − w, belongs to W; w is in W and W is a subspace, so g(T)u is also in W. This means g belongs to S(u; W). See, this is really straightforward; I am just explaining it quickly: g ∈ S(x; W) implies g ∈ S(u; W), and the whole process can be reversed. So please check the details and then verify that, since those ideals are the same, the monic generators are also the same. So I get s(z; W_{k-1}) = s(y; W_{k-1}); the notation for this polynomial is f — that is what we have here, we have fixed this notation. Also look at f(T)z: applying f(T) to the vector z, f(T)z = f(T)y − Σ_{i=1}^{k-1} f(T) h_i(T) y_i. For f(T)y there is an expression from the theorem, from the given representation, so I write this as y_0 + Σ_{i=1}^{k-1} g_i(T) y_i − Σ_{i=1}^{k-1} f(T) h_i(T) y_i, which is y_0 + Σ_{i=1}^{k-1} (g_i − f h_i)(T) y_i. But g_i − f h_i is r_i, so let me write the expression for f(T)z: it is y_0 + Σ_{i=1}^{k-1} r_i(T) y_i, where r_i is the remainder. Okay, I want to show that each r_i is 0. Suppose some r_i is not 0; among all those
non-zero r_i, I will take the one with the largest subscript: let j be the largest index such that r_j ≠ 0. Then r_j ≠ 0, and I also know, from the division-algorithm condition on each r_i, that deg r_j < deg f. I go back to the representation for f(T)z and rewrite it by making use of this j: I can write f(T)z = y_0 + Σ_{i=1}^{j} r_i(T) y_i — this time the sum is only up to j, because after the largest such index the other r_i are 0. Let me now use a short notation for the conductor s(z; W_{j-1}): I look at all polynomials g such that g(T)z belongs to W_{j-1}; that ideal is generated by a unique monic polynomial, which I am calling p. So p is the monic generator of the ideal of all polynomials g such that g(T)z belongs to W_{j-1}. Okay, with this notation, observe first that W_{j-1} is contained in W_{k-1} — j is the fixed largest index with r_j ≠ 0, so W_{j-1} ⊆ W_{k-1}; this is always true. From this, can we see — tell me if this is correct — that f, already defined as s(y; W_{k-1}) (which equals s(z; W_{k-1})), divides p? W_{j-1} ⊆ W_{k-1} only means that if you take a polynomial g with the property that g(T)z belongs to W_{j-1}, then g(T)z also belongs to W_{k-1}; any polynomial present in the first ideal is also present in the second. In particular p belongs to the ideal generated by f. From this it follows that this conductor f divides p, and so
the degree of f is at most the degree of p — the conductor corresponding to the larger subspace has degree at most that of the conductor corresponding to the smaller one — and in fact the polynomial f divides p; this can be verified quickly. Once you have this, we have the following: since f divides p, I can write p = f g for some polynomial g. I look at p(T)z: p(T)z by definition is f(T) g(T) z, which is g(T) f(T) z — these two commute — and I will write g(T) f(T) z as g(T) y_0 + Σ_{i=1}^{j} g(T) r_i(T) y_i, using the expression for f(T)z here. So I have p(T)z equal to this. Now what is p? p is defined so that p(T)z must be in W_{j-1}; that is the definition. So, let me write it once again, this vector belongs to W_{j-1}. Okay, look at what you have on the right; I will rewrite it as g(T) y_0 + Σ_{i=1}^{j-1} g(T) r_i(T) y_i + g(T) r_j(T) y_j — I am just splitting the sum into the part up to j − 1 and then the last term. We have observed that the left-hand side, this vector, belongs to W_{j-1}. Now look at the middle sum: it belongs to W_{j-1}, because in that sum the first term has y_1, the second term has y_2, and so on, so each term lies in some W_i with i ≤ j − 1, and we have just observed that these are all contained in the last one, W_{j-1}.
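To keep track of where we are, the splitting just performed can be displayed in one line (a restatement of the lecture's computation, in its notation):

```latex
\underbrace{p(T)z}_{\in\,W_{j-1}}
  \;=\; \underbrace{g(T)\,y_0}_{\in\,W_0}
  \;+\; \underbrace{\sum_{i=1}^{j-1} g(T)\,r_i(T)\,y_i}_{\in\,W_{j-1}}
  \;+\; g(T)\,r_j(T)\,y_j .
```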
So this sum belongs to W_{j-1}. The term g(T) y_0 belongs to W_0, but W_0 is also contained in W_{j-1} — W_0 is contained in each W_k. So these two terms belong to W_{j-1}, and the left-hand side belongs to W_{j-1}, so the remaining term must also belong to W_{j-1}: g(T) r_j(T) y_j belongs to W_{j-1}. Which means I will now compare the polynomial g r_j with s(y_j; W_{j-1}), the unique monic generator of the ideal of all polynomials, let us say some l, such that l(T) y_j belongs to W_{j-1} — the one with the least degree among them. So, for one thing, deg(g r_j) must be greater than or equal to deg s(y_j; W_{j-1}) — agreed? g r_j is a polynomial that belongs to that ideal, the ideal of all polynomials l such that l(T) y_j belongs to W_{j-1}; but that ideal has this conductor s as its generator, the polynomial of least degree with that property. So I have this. Now come back to s(y_j; W_{j-1}): that is p_j — the polynomials are the same; for each j, p_j is s(y_j; W_{j-1}). So deg(g r_j) ≥ deg p_j. But p_j has a maximality property — tell me if you agree with this: by the choice made in Step 1, among all the conductors s(w; W_{j-1}) of vectors w into W_{j-1}, p_j is one of maximum degree. And s(z; W_{j-1}) is one such polynomial — by its definition, s(z; W_{j-1}) applied to T sends z into W_{j-1} — while p_j has maximum degree among all of them. So deg p_j ≥ deg s(z; W_{j-1}), and that is equal to deg p by our notation: p is s(z; W_{j-1}). This is the most crucial step in this proof — Step 2 is the most crucial step, and the most crucial part is these inequalities. We are at the last step of a sequence of inequalities. But what is p?
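Collected in one place, the inequalities established so far read (again just a restatement, in the lecture's notation):

```latex
\deg\bigl(g\,r_j\bigr)
  \;\ge\; \deg s\bigl(y_j;\,W_{j-1}\bigr) \;=\; \deg p_j
  \;\ge\; \deg s\bigl(z;\,W_{j-1}\bigr) \;=\; \deg p .
```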
p is f g, so deg p is deg(f g). Let me just write down the final inequality that we want. By the way, do not assume that you can understand the proof of this theorem right here in the class and be done with it; you have to work out many of the steps. Some of the steps I am not giving in detail, some I am giving in detail here, but you may not be able to follow everything here; you have got to go back home, work it out, and verify that these are all correct statements. Finally, as I said, I want only this inequality: deg(g r_j) ≥ deg(f g), that is, deg g + deg r_j ≥ deg f + deg g. I would like to cancel deg g, and this means deg r_j ≥ deg f — a contradiction, because the remainders have been chosen in such a way that their degrees are strictly less than deg f. Remember that this contradicts the choice of the r_i, in particular of r_j: deg r_j cannot reach deg f. The contradiction is because of the fact that we have assumed some r_i is not 0. Okay, so each r_i must be 0, that is, f divides g_i for all i. We have taken k polynomials this time. We should also ponder over where we have used the induction hypothesis — I have not mentioned it explicitly, but this could not have come without the hypothesis that the result is true up to k − 1 while we are proving it for k; ponder over that. But I am saying that the second step is over here: f divides each g_i, which is what we wanted to show, and there exists z_0 with the stated property. But then, as I told you, y_0 is a vector that comes from W_0, and W_0 is T-admissible, so that part is easy: since W_0 is T-admissible, there exists z_0 in W_0 such that y_0 = f(T) z_0. So the second part is there immediately. That is the proof of Step 2. Let me write Step 3 — I will write Step 3 here
and prove it quickly. Okay, what is left, really, in the cyclic decomposition theorem? We have got to show that V is a direct sum of these subspaces, and that the T-annihilators of the x_k — those p_k — have the property that p_k divides p_{k-1} for k = 2, ..., r. So what is left is that the sum is a direct sum; Step 2 already gives me the sum, and I want to show this is a direct sum. It is not enough with these y_1, ..., y_r; I will define new vectors which will give rise to a direct sum. So I need to prove, really, independence of these subspaces, and then the fact that p_k divides p_{k-1}; then the proof is over, except for the last part, where there is some uniqueness — the uniqueness I will not prove here. Okay, what is Step 3? There exist nonzero vectors, this time x_1, x_2, ..., x_r, with corresponding T-annihilators, we call them p_1, p_2, ..., p_r, such that V is the direct sum W_0 ⊕ Z(x_1; T) ⊕ ... ⊕ Z(x_r; T) — this time x_1, not y_1 — and p_k divides p_{k-1}. This is what I need to prove; that is Step 3. So I will just look at the construction of these vectors x_1, etc. Proof of Step 3: start with the vectors y_1, y_2, ..., y_r; these are coming from Step 1. Apply Step 2 to the vector y = y_k, and so f is p_k; okay, that is the reason why I have retained Step 2 here. Step 2, I told you, is essentially some inference about this representation: if I have this representation, then each of the polynomials g_i must be divisible by f, and y_0 has the stated property. Now I am going to apply this representation for y = y_k. If y equals y_k — you go back to the notation I used earlier; yes, that is here in front of you — then s(y_k; W_{k-1}) is p_k. So all that I will do is apply this with y = y_k and f = p_k
So I have the following; in other words, I am just rewriting this representation for y = y_k and f = p_k. On the left I have p_k(T) y_k, and so p_k(T) y_k = y_0 + Σ_{i=1}^{k-1} g_i(T) y_i. This g_i I should also rewrite: remember that the g_i are divisible by f, so g_i is just f h_i; but f is p_k, so each term g_i(T) y_i is p_k(T) h_i(T) y_i — g_i is divisible by p_k, and we have now written g_i as f h_i with f = p_k. So I have this written out: p_k(T) y_k = y_0 + Σ_{i=1}^{k-1} p_k(T) h_i(T) y_i. Define the vectors x_k — I will do that here itself — by x_k = y_k − z_0 − Σ_{i=1}^{k-1} h_i(T) y_i. This step is rather similar to Step 2, where I defined z as y minus something; here I am defining a new vector x_k. Look at x_k − y_k: it is −z_0 − Σ_{i=1}^{k-1} h_i(T) y_i, and this vector belongs to W_{k-1} — z_0 is in W_0, which is also in W_{k-1}, and the sum is in W_{k-1}. So x_k − y_k belongs to W_{k-1}, and the argument is as before: if I have two vectors x and u such that x − u belongs to W, then the monic generators are the same. That is, s(x_k; W_{k-1}) is the same as s(y_k; W_{k-1}), which is what I am calling p_k. What happens to p_k(T) x_k? Without writing the details, let us see this quickly: p_k(T) x_k = p_k(T) y_k − p_k(T) z_0 − Σ_{i=1}^{k-1} p_k(T) h_i(T) y_i. The sums Σ p_k(T) h_i(T) y_i cancel, and p_k(T) z_0 is y_0, which cancels against the y_0 in the expression for p_k(T) y_k; so p_k(T) x_k = 0. Please check this. So the new vectors x_k that we have defined have this property, and this is true for all k. Now can you see that this means W_{k-1} ∩ Z(x_k; T) = {0}? This is a crucial step for independence. That is — see, at each step — look at the first step: we start with W_0, then we are adding Z(x_1; T); I want independence, so I would like to know whether W_0 ∩ Z(x_1; T) = {0}. In a general step I have W_k = W_{k-1} + Z(x_k; T), and I would like to know whether the subspace Z(x_k; T) that I am adding is independent of W_{k-1}. So this question is important: is the subspace Z(x_k; T) independent of W_{k-1}? I am claiming that the vectors x_k defined here in this manner have that property. Now why is this true? You have got to go back and use the condition p_k(T) x_k = 0: it means that the intersection is {0}. Let us do this quickly. What is Z(x_k; T)? Z(x_k; T) is just the set of all g(T) x_k, where g runs over all polynomials in F[t]. And what is p_k? In particular, p_k is s(x_k; W_{k-1}), the conductor of least degree: if I collect all the polynomials g with the property that g(T) x_k belongs to W_{k-1}, then every such g is a multiple of p_k — is that clear? I want to see what happens when g(T) x_k belongs to W_{k-1}. If g(T) x_k belongs to W_{k-1}, then this g must be a multiple of p_k, because p_k is the unique monic generator of that ideal, and anything in the ideal is a multiple of it. But p_k(T) x_k = 0, so g(T) x_k must be 0. So if g(T) x_k belongs to W_{k-1}, then g(T) x_k is 0 — please verify this; it is not immediate: if this vector belongs to W_{k-1}, then g must be a multiple of p_k, but if it is a multiple of p_k, then because p_k(T) x_k = 0, it follows that g(T) x_k is also 0. And so Z(x_k; T) is independent of W_{k-1}; this condition guarantees independence. That is the first part. Which means what? Instead of y_1, ..., y_r I will use x_1, ..., x_r, and I get a direct sum decomposition. The last part is that p_k divides p_{k-1}. How does this follow? What we
have proved just now is this: use the fact that p_k(T) x_k = 0 for all k. Now I will go back to this representation — in particular, I will remain here and go back to this representation — and remember that whenever I write f(T)y in this manner, f must divide each of those polynomials; instead of f I have p_k, that is what I have here. I also observe that p_k(T) x_k = 0, and likewise p_i(T) x_i = 0 for every i, so I have p_k(T) x_k = 0 + p_1(T) x_1 + p_2(T) x_2 + ... + p_{k-1}(T) x_{k-1}; all that I have done is to write 0 as a sum of 0s. I can write this as 0 + Σ_{i=1}^{k-1} p_i(T) x_i, if you want, just to get this similar to that representation. Okay, what is it that we have done? p_k(T) x_k — I know that belongs to W_{k-1} — and this is a representation of it; and whenever I have this representation, I know from Step 2 that p_k must divide each of these polynomials. I am through, as a consequence of Step 2; so I will just write: by Step 2, p_k divides p_i for all i running from 1 to k − 1. That is the complete proof of the cyclic decomposition theorem — except, of course, the last part, which I have not done: that the positive integer r and the annihilating polynomials p_1, ..., p_r are uniquely determined by the nonzero vectors satisfying the conditions of the theorem. That I am going to skip. Let me look at some quick consequences of the cyclic decomposition theorem — one of the results that I have been mentioning; let me emphasize it once again. Let me give you this corollary first, before stating that result. Remember we started with this question: given a subspace W which is invariant under T, can I find a complementary subspace W′ which is also invariant under T? The answer is the following. Let W be not just invariant — I want T-admissible — let W be T-admissible; then there exists W′ such that T(W′) is contained in W′ and the vector space V is the direct sum of these two
subspaces. So if you take a general subspace — just an invariant subspace — in general it will not work; I have given an example yesterday. If you take a T-admissible subspace, then it works. Proof: this is a corollary of the cyclic decomposition theorem. Start with W_0 = W, which is T-admissible. Okay, can I say this: if W is the whole of V, then there is nothing to prove. If W is not V, apply the cyclic decomposition theorem — I will not give the details here. Each time you add a cyclic subspace, and remember that each cyclic subspace is invariant under T; so all you are trying to do is look at W ⊕ Z(x_1; T) ⊕ ... ⊕ Z(x_r; T). Let us call Z(x_1; T) ⊕ ... ⊕ Z(x_r; T) = W′; then I know that this W′ is an invariant subspace and V is the direct sum. Okay, so the answer to this question is: if W is not just invariant but also T-admissible, then we get an invariant subspace W′. W′ may not be T-admissible — we do not know that — but W′ is invariant under T. This is one consequence. What is the matrix form — the matrix analogue, really — of the cyclic decomposition theorem? It is that any matrix B is similar to a block-diagonal matrix A with diagonal blocks A_1, A_2, ..., A_r and all other entries 0, where each A_i is the companion matrix of the polynomial p_i. How does the proof go? We know that when W is a cyclic subspace, that is W = Z(x_i; T), and I write down the matrix of the restriction of the operator T to W with respect to the cyclic basis, that matrix is the companion matrix of the annihilating polynomial — the T-annihilator of x_i. The cyclic bases together form a basis; collect all such bases, put the blocks in this block form, and then this is the matrix of the operator T. Okay, so this is the matrix form, the matrix analogue, of the cyclic decomposition theorem. The above form is called
the rational form of B. That is, if you start with any matrix B, then it can be reduced to the rational form; the construction is by means of the cyclic subspaces. See, there is also a Jordan form, but I do not have the time for that. So all that I will do is give an example, a numerical example, of the rational form of a matrix. Okay, look at this matrix:

B = [  5  −6  −6
      −1   4   2
       3  −6  −4 ].

This matrix has characteristic polynomial (t − 1)(t − 2)^2 and minimal polynomial (t − 1)(t − 2). The minimal polynomial is a product of distinct linear factors, so the matrix is diagonalizable — B is actually diagonalizable; there is a basis of R^3 having the property that each of the basis vectors is an eigenvector. Okay, but I am interested in the rational form of this matrix. It can be shown that this matrix B is similar to the matrix A — that is, you can construct a cyclic basis together with a basis vector which corresponds to an eigenvalue; I will not give the details here — where

A = [ 0  −2  0
      1   3  0
      0   0  2 ].

See, this matrix actually can be diagonalized; I am just looking at another form. I can show that B is similar to A, that is, there is an invertible matrix P such that P⁻¹ B P equals this matrix A. And what is the structure here? The first block, [[0, −2], [1, 3]], is a companion matrix: calculate a suitable vector x_1, look at x_1 and T x_1 — they span a cyclic subspace — compute the T-annihilator of x_1, here (t − 1)(t − 2), and compute the companion matrix of the annihilator; this block corresponds to that. (I said this block corresponds to the eigenvector for the eigenvalue 2 — no, I am sorry, it does not correspond to that; in any case, please verify.) So this block is the companion matrix corresponding to a cyclic subspace, and the last entry, 2, corresponds to just an eigenvalue. This is the rational form of the matrix B. Okay, I think I will stop here.
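The numerical claims in this example can be checked directly. Here is a small pure-Python verification — my own sketch, not part of the lecture; the helper names are mine — that B and A share the characteristic polynomial (t − 1)(t − 2)^2 (via the similarity invariants trace, sum of principal 2×2 minors, and determinant), that the first block of A is the companion matrix of (t − 1)(t − 2), and that the minimal polynomial of B is (t − 1)(t − 2):

```python
def char3(M):
    """Invariants (trace, sum of principal 2x2 minors, det) of a 3x3 matrix;
    its characteristic polynomial is t^3 - c2*t^2 + c1*t - c0."""
    a, b, c = M[0]; d, e, f = M[1]; g, h, i = M[2]
    trace = a + e + i
    minors = (e * i - f * h) + (a * i - c * g) + (a * e - b * d)
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return (trace, minors, det)

def companion(coeffs):
    """Companion matrix of t^k + c_{k-1} t^{k-1} + ... + c_0, with
    coeffs = [c_0, ..., c_{k-1}]: 1s on the subdiagonal and -c_i down
    the last column (the convention matching the blocks of A)."""
    k = len(coeffs)
    C = [[0] * k for _ in range(k)]
    for i in range(1, k):
        C[i][i - 1] = 1
    for i in range(k):
        C[i][k - 1] = -coeffs[i]
    return C

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

B = [[5, -6, -6], [-1, 4, 2], [3, -6, -4]]
A = [[0, -2, 0], [1, 3, 0], [0, 0, 2]]

# Same characteristic polynomial t^3 - 5t^2 + 8t - 4 = (t - 1)(t - 2)^2:
assert char3(B) == char3(A) == (5, 8, 4)

# First block of A is the companion matrix of (t - 1)(t - 2) = t^2 - 3t + 2:
assert companion([2, -3]) == [[0, -2], [1, 3]]

# Minimal polynomial of B is (t - 1)(t - 2): check (B - I)(B - 2I) = 0.
BmI  = [[B[i][j] - (i == j)     for j in range(3)] for i in range(3)]
Bm2I = [[B[i][j] - 2 * (i == j) for j in range(3)] for i in range(3)]
assert matmul(BmI, Bm2I) == [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```

Since the trace, the sum of principal 2×2 minors, and the determinant are similarity invariants, matching all three for 3×3 matrices confirms that B and A have the same characteristic polynomial.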