We will continue to analyze a single linear operator. We have looked at the matrix formulation of diagonalizability; we will now look at a formulation in terms of subspaces rather than matrices. So we will discuss the notion of direct sum decompositions of subspaces and their relationship with projection operators, and see how this is related to the problem of diagonalizability of an operator.

We first need the notion of independent subspaces. Subspaces W1, W2, ..., Wk of V are called independent if the following implication holds: whenever w1 + w2 + ... + wk = 0 with w_i in W_i, each w_i must be 0. That is, if I take any sum of vectors from W1, W2, ..., Wk and equate it to 0, then each summand must be the zero vector; the zero vector has only the trivial representation of this form.

Example 1. I will discuss just one example for the moment, because we have encountered it before. Let T be an operator on V and let W1, W2, ..., Wk be the eigenspaces of T corresponding to its k distinct eigenvalues. We have seen before that these eigenspaces are independent. In fact we have seen something more: if B1 is a basis for W1, B2 a basis for W2, ..., Bk a basis for Wk, then B1 ∪ B2 ∪ ... ∪ Bk is a basis for the sum W1 + W2 + ... + Wk.

Example 2. Take a basis B = {u1, u2, ..., un} for V and look at the one-dimensional subspaces spanned by each of these vectors. These are obviously independent subspaces.

So those are two examples of independent subspaces. Before proving a quick characterization of independence, let us make the following observation. Suppose the subspaces W1, W2, ..., Wk are independent, and let W denote their sum. Then we have the following easy consequence: any x in W has a unique representation x = w1 + w2 + ... + wk with w_i in W_i. In some sense you can then think of w1, ..., wk as coordinates of the vector x in the sum: w1 is the first coordinate, w2 the second, and so on; you can associate unique data to each x in W. Why is the representation unique? Suppose x can also be written as z1 + z2 + ... + zk with z_i in W_i; we will show the two representations are the same. Equating the right-hand sides gives (w1 − z1) + ... + (wk − zk) = 0. Each W_i is a subspace, so y_i = w_i − z_i belongs to W_i, and y1 + y2 + ... + yk = 0. Since the subspaces are independent, this equation tells me each term must be 0, and so w_i = z_i for all i.
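To make the coordinates analogy concrete, here is a minimal NumPy sketch; the three one-dimensional subspaces of R^3 are chosen purely for illustration.

```python
import numpy as np

# Three independent one-dimensional subspaces of R^3, each given by a
# spanning vector (an illustrative choice, nothing canonical).
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([0.0, 1.0, 0.0])
b3 = np.array([1.0, 1.0, 1.0])

# Stack the spanning vectors as columns; here independence of the
# subspaces amounts to this matrix having full rank.
B = np.column_stack([b1, b2, b3])
assert np.linalg.matrix_rank(B) == 3

# Any x in W1 + W2 + W3 then has a unique representation x = w1 + w2 + w3,
# recovered by solving a single linear system.
x = np.array([2.0, 3.0, 5.0])
c = np.linalg.solve(B, x)              # c[i] is the coefficient of b_i
w1, w2, w3 = c[0] * b1, c[1] * b2, c[2] * b3
assert np.allclose(w1 + w2 + w3, x)    # the "coordinates" of x in the sum
```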
So the representation must be unique if the subspaces are independent. Note that this is a representation in the sum W, not in the entire space V.

Let us now look at a characterization of independent subspaces. I will write down two statements that are equivalent to independence; remember that we are working only with finite-dimensional vector spaces.

(A) W1, W2, ..., Wk are independent.
(B) For every j with 2 ≤ j ≤ k, we have Wj ∩ (W1 + W2 + ... + W_{j−1}) = {0}.
(C) If B1, B2, ..., Bk are ordered bases for W1, W2, ..., Wk, then B = B1 ∪ B2 ∪ ... ∪ Bk is a basis for the sum.

Condition (B) says, for instance, that W2 ∩ W1 = {0}, that W3 ∩ (W1 + W2) = {0}, and so on, up to Wk ∩ (W1 + W2 + ... + W_{k−1}) = {0}; all of these must hold.

For this theorem I will prove only that (A) implies (B) and leave the rest for you; the remaining implications are almost tautologies. (In fact the equivalence of (A) and (B) follows by a similar argument, but I will skip that part.) So suppose the subspaces are independent; we must show the intersection in (B) is {0} for every j. Take x in Wj ∩ (W1 + W2 + ... + W_{j−1}); I must show that x = 0. Since x belongs to W1 + ... + W_{j−1}, I can write x = x1 + x2 + ... + x_{j−1} with x_i in W_i for 1 ≤ i ≤ j − 1. Are we through? Bring x to the left-hand side: x1 + x2 + ... + x_{j−1} + (−x) = 0, where the first vector is in W1, the second in W2, ..., the (j−1)th in W_{j−1}; and now I use the fact that x, and hence −x, belongs to Wj. So I have a sum of vectors, one from each subspace, equated to 0. I am assuming condition (A), that the subspaces are independent, so each term must be 0; all I need is that x = 0, which follows. So if x is in this intersection, x must be 0: if the subspaces are independent, these intersections are trivial. As I mentioned, the other equivalences are exercises for you.
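Condition (C) suggests a simple numerical test for independence: stack the bases side by side and check for full column rank. A minimal sketch, with illustrative subspaces of R^3 and a hypothetical helper name:

```python
import numpy as np

def are_independent(bases):
    """Subspaces, given as matrices whose columns form bases, are
    independent iff the union of the bases is linearly independent,
    i.e. the stacked matrix has full column rank (condition (C))."""
    stacked = np.hstack(bases)
    return np.linalg.matrix_rank(stacked) == stacked.shape[1]

W1 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [0.0, 0.0]])                 # span{e1, e2}
W2 = np.array([[0.0], [0.0], [1.0]])        # span{e3}
W3 = np.array([[1.0], [1.0], [0.0]])        # span{e1 + e2}, inside W1

print(are_independent([W1, W2]))   # True:  W1 and W2 meet only in 0
print(are_independent([W1, W3]))   # False: W3 meets W1 nontrivially
```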
Let us now look at a couple more examples of independent subspaces. Example 3. Take the vector space V = R^{n×n}, the set of all square matrices with real entries. Let W1 be the subspace of all A in V such that A = A^T, the subspace of real symmetric matrices, and let W2 be the subspace of all A in V such that A = −A^T, the skew-symmetric matrices. It is easy to verify that these are indeed subspaces: the sum of two symmetric matrices is symmetric, a scalar multiple of a symmetric matrix is symmetric, and similarly for the skew-symmetric matrices. Now any A in V can be written as a sum A = A1 + A2, where A1 = (A + A^T)/2 belongs to W1 and A2 = (A − A^T)/2 belongs to W2; A1 is called the symmetric part of A and A2 the skew-symmetric part. Can you see this? A1 is obviously symmetric: take the transpose and you get the same matrix back. A2 is skew-symmetric: taking the transpose swaps the two terms, so the matrix comes back with a minus sign. Moreover this representation is unique, because the intersection of the two subspaces is {0}: the only matrix that is both symmetric and skew-symmetric is the zero matrix. So W1 ∩ W2 = {0}, which is exactly condition (B) for j = 2, and W1 and W2 are independent; every matrix in V has a unique representation of this form. (A numerical illustration of this example follows at the end of this passage.)

I think that is enough examples; let me now tell you what a direct sum is. Definition: if W1, W2, ..., Wk are independent subspaces and W denotes their sum, we say that W is the direct sum of these subspaces. Sometimes this is called the interior direct sum; in group theory you must have studied this notion, interior and exterior direct sums and how they are isomorphic. To differentiate the direct sum from the ordinary sum we use the notation W = W1 ⊕ W2 ⊕ ... ⊕ Wk, the plus inside a circle that you must have seen in group theory. Remember: the sum itself is just the usual sum; when I write W in this manner, it carries the additional information that W1, W2, ..., Wk are independent subspaces.

Examples of direct sums: we have seen three examples above, and in all three the sum is a direct sum. One final example: suppose T is a diagonalizable operator on V. Then V is the direct sum of the eigenspaces of T corresponding to its eigenvalues. The proof follows from the facts that eigenspaces are independent and that there is a basis of V each of whose vectors is an eigenvector. So you can already see that there is a connection from diagonalizability to direct sums of subspaces.
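Here is the promised quick NumPy illustration of Example 3; the random test matrix is just for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # an arbitrary real square matrix

A1 = (A + A.T) / 2                # symmetric part, in W1
A2 = (A - A.T) / 2                # skew-symmetric part, in W2

assert np.allclose(A1, A1.T)      # A1 = A1^T
assert np.allclose(A2, -A2.T)     # A2 = -A2^T
assert np.allclose(A1 + A2, A)    # the unique decomposition A = A1 + A2
```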
We will explore this relation a little further, but before that let us look at the notion of a projection. What is a projection? An operator E on a vector space V is called a projection if E^2 = E. This property is called idempotence, so E is an idempotent operator. The other name, projection, is used because there is a geometric connotation; the geometry cannot be invoked immediately, though, and we will have to wait for orthogonal projections. We have seen immediate examples of projections, the natural projections, when we studied linear transformations, and each of those obviously satisfies E^2 = E.

I want to collect some simple properties of projections which will be useful later, and then connect the two notions, direct sum decompositions and projection operators.

Property 1: a projection acts like the identity on its range and like 0 on its null space. More precisely, x belongs to range(E) if and only if Ex = x. In this sense E behaves somewhat like the identity operator. A general operator is not like this: if x belongs to range(T), all I know is that x = Ty for some y; but if T is also idempotent, then Tx = x.

Property 2: null(E) = range(I − E). So E and I − E are complementary kinds of operators.

Property 3: V = range(E) ⊕ null(E). The range and null space of a projection give a direct sum decomposition of V. In this case we say they are complementary subspaces: if W1 and W2 are subspaces such that W1 ⊕ W2 = V, then W1 and W2 are called complementary. Again, this will not hold for a general operator T; the range and null space need not be complementary in general, but for a projection they are.

Quick proofs. Property 1: if x belongs to range(E), then x = Ey for some y. Operate E on both sides: Ex = E(Ey) = E^2 y = Ey = x, since E is idempotent. The converse is trivial: if x = Ex, then x obviously belongs to range(E). So we have the implication both ways.

Property 2: let x belong to null(E). Then Ex = 0, so x − Ex = x, that is, (I − E)x = x. So I have written x as (I − E)y for some y, namely y = x itself, and hence x belongs to range(I − E). This whole argument can also be reversed, once you observe that if E^2 = E then (I − E)^2 = I − E as well: expanding, (I − E)^2 = I − E − E + E^2, and −E + E^2 cancel, leaving I − E. So E is a projection if and only if I − E is a projection. Now use Property 1: if x belongs to range(I − E), then the projection I − E acts like the identity on x, so (I − E)x = x, that is, x − Ex = x, so Ex = 0 and x belongs to null(E). The two subspaces therefore coincide.
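A small numerical check of Properties 1 and 2 on a concrete idempotent matrix; the matrix and test vectors are illustrative choices.

```python
import numpy as np

E = np.array([[1.0, 1.0],
              [0.0, 0.0]])         # E^2 = E: a (non-orthogonal) projection on R^2
I = np.eye(2)

assert np.allclose(E @ E, E)                   # idempotence
assert np.allclose((I - E) @ (I - E), I - E)   # so I - E is a projection too

y = np.array([3.0, -7.0])
x = E @ y                           # x lies in range(E) ...
assert np.allclose(E @ x, x)        # ... so E fixes it (Property 1)

z = np.array([1.0, -1.0])           # E z = 0, so z lies in null(E) ...
assert np.allclose(E @ z, 0)
assert np.allclose((I - E) @ z, z)  # ... and z = (I - E)z is in range(I - E) (Property 2)
```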
Property 3, that a projection gives rise to a direct sum decomposition, now follows quickly. But first an observation: nowhere in these results have we used the dimension of the vector space, so all of these properties hold even for a projection on an infinite-dimensional vector space.

For the last part, just note that any x in V can be written as x = Ex + (x − Ex), since the Ex terms cancel; that is, x = Ex + (I − E)x. The first term belongs to range(E) and the second to range(I − E), which I proved just now is null(E). So any x has such a representation, and V = range(E) + null(E). But I must also prove that the intersection is {0}; only then does it follow that this is a direct sum decomposition. That is almost obvious from what we have proved so far. Take u in range(E) ∩ null(E); I want to show that u = 0. Since u is in the range of a projection, Eu = u, because the operator acts like the identity on anything in its range. But u also belongs to null(E), so Eu = 0. Hence u = 0, the intersection is trivial, and V = range(E) ⊕ null(E), where this plus is now the direct sum.

So a single projection gives rise to a direct sum decomposition into two subspaces. If we have several projections, can we relate them to a decomposition into several subspaces? That is a question we will answer, and the answer is yes. But before I prove that result, there is another property of projections we must observe: every projection is diagonalizable. It is almost trivial, but let me state it as Property 4: if E is a projection, then E is diagonalizable. To see this, first ask: what are the possible eigenvalues of E? They are 0 and 1. Let lambda be an eigenvalue of E, so Ex = lambda x for some x ≠ 0, and look at E^2 x. On the one hand E^2 x = Ex = lambda x; on the other hand, from the same equation, E^2 x = E(lambda x) = lambda Ex = lambda^2 x. So if lambda is an eigenvalue, then lambda x = lambda^2 x for some x ≠ 0, that is, lambda(lambda − 1)x = 0 with x ≠ 0, which forces lambda = 0 or lambda = 1.
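Here is a quick check of Property 3 and of the eigenvalue claim on a 3×3 idempotent matrix; again the matrix and test vector are arbitrary illustrative choices.

```python
import numpy as np

E = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 0.0]])    # idempotent: E @ E == E
I = np.eye(3)
assert np.allclose(E @ E, E)

# Property 3: x = Ex + (I - E)x splits x into range(E) and null(E) parts.
x = np.array([2.0, 5.0, 7.0])
assert np.allclose(E @ x + (I - E) @ x, x)
assert np.allclose(E @ ((I - E) @ x), 0)    # the second piece lies in null(E)

# The only eigenvalues of a projection are 0 and 1.
print(np.sort(np.linalg.eigvals(E)))        # [0. 1. 1.]
```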
So 0 and 1 are the only possible eigenvalues of a projection. If the only eigenvalue is 1 then E is the identity; if the only eigenvalue is 0 then E is the zero operator. The identity is trivially a projection, I^2 = I, and 0 is trivially a projection, 0^2 = 0; all other projections lie between these two in the sense that their eigenvalues are either 0 or 1.

I have found the eigenvalues; now the claim is that E is diagonalizable, so can you give me a basis each of whose vectors is an eigenvector? Having two distinct eigenvalues does not by itself mean a matrix is diagonalizable: two distinct eigenvalues give two independent eigenvectors, but that does not exhaust the number of vectors needed for a basis. Remember E is an operator on V, so its matrix is n×n, and an n×n matrix with only two eigenvalues must have repetitions: say 1 repeats r times and 0 repeats n − r times. When eigenvalues repeat there is in general no guarantee of diagonalizability, but for a projection there is a guarantee. How does that follow? It follows from the last property, V = range(E) ⊕ null(E). Take a basis u1, u2, ..., ur for range(E) and a basis u_{r+1}, ..., un for null(E). That the union of these two is a basis of V is immediate, because range(E) and null(E) are independent subspaces, a direct sum decomposition, as we have seen. More importantly, I must show that each vector in this union is an eigenvector, and for which eigenvalue. Anything in range(E) satisfies Property 1: u1 is in range(E), so Eu1 = u1, that is, Eu1 = 1·u1. So anything in the range of E is an eigenvector for the eigenvalue 1, and anything in the null space of E is an eigenvector for the eigenvalue 0. Thus Eu_i = u_i for 1 ≤ i ≤ r and Eu_i = 0 for r + 1 ≤ i ≤ n: the vectors u1, ..., ur are eigenvectors for the eigenvalue 1, the vectors u_{r+1}, ..., un are eigenvectors for the eigenvalue 0, their union is a basis, and so E is diagonalizable. In fact this is more than one could ask for: not just one particular basis works; any basis for range(E) together with any basis for null(E) forms a basis of V each of whose vectors is an eigenvector. Is that clear? So every projection operator is diagonalizable.
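A sketch of this diagonalization on the 2×2 projection used earlier; the bases for the range and null space are read off by hand for this particular matrix.

```python
import numpy as np

E = np.array([[1.0, 1.0],
              [0.0, 0.0]])          # the projection from the earlier sketch

u1 = np.array([1.0, 0.0])           # basis for range(E): E u1 = 1 * u1
u2 = np.array([1.0, -1.0])          # basis for null(E):  E u2 = 0 * u2
assert np.allclose(E @ u1, u1) and np.allclose(E @ u2, 0)

# In the eigenvector basis {u1, u2} the matrix of E is diagonal.
P = np.column_stack([u1, u2])
D = np.linalg.inv(P) @ E @ P
assert np.allclose(D, np.diag([1.0, 0.0]))
```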
How are projections related to direct sum decompositions? I will state that theorem and prove it next time, but one of the ideas I want to use in that proof we can discuss today. Suppose V has a direct sum decomposition V = W1 ⊕ W2 ⊕ ... ⊕ Wk, where W1, W2, ..., Wk are independent subspaces. Let me define operators E_i on V, for i running from 1 to k: to each subspace I will associate a projection, as follows. Take any x in V. Then x has a unique representation x = x1 + x2 + ... + x_{i−1} + x_i + x_{i+1} + ... + xk, with the jth term coming from Wj. When I write this sum, what is implicit is the order: first the term from W1, then the term from W2, and so on. Vector addition is commutative, so this equals x2 + x1 + ... as well, but I am not looking at that representation; I fix the representation in which the first term comes from W1, the second from W2, and so on. Given that representation, I define E_i x to be the ith term, namely x_i. I claim that each E_i is a projection, and also that E_i E_j = 0 whenever i ≠ j.

Let me prove that E_i is a projection; that is easy to see. What is E_i^2 x? It is E_i applied to E_i x, and E_i x, by definition, is the ith term of the representation above, namely x_i. Now I want E_i of x_i, so I must write x_i itself as such a sum and pick out the ith term. But x_i = 0 + 0 + ... + x_i + ... + 0 + 0, where the jth summand, 0, comes from Wj for every j ≠ i, and the ith summand is x_i itself. The ith term of this representation is x_i, so E_i x_i = x_i. I have shown E_i^2 x = x_i, and x_i by definition is E_i x; so E_i^2 = E_i. E_i is an idempotent operator, hence a projection.

The second property, E_i E_j = 0 for i ≠ j, is also easy. Look at E_i E_j x = E_i(E_j x). For E_j x I look at the representation x = x1 + x2 + ... + xj + ... + xk; E_j x is the jth term, x_j. Now E_i x_j requires the ith term of the representation of x_j, and that is obviously 0: just to emphasize, writing x_j = 0 + 0 + ... + x_j + ... + 0 as before, the ith summand is 0. (Here I am picturing i < j, but j < i causes no problem: every summand other than the jth is 0, so the ith summand is 0 either way.) So E_i E_j x = 0 for every x, assuming i ≠ j; for i = j we have already proved E_i^2 = E_i. So if i ≠ j then E_i E_j is the zero operator. We will prove two more properties, that range(E_i) = W_i and that the sum E_1 + E_2 + ... + E_k is the identity operator, and then also prove a converse; that I will do in the next lecture.
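To see these identities concretely, here is a sketch that builds the operators E_i for an illustrative decomposition R^3 = W1 ⊕ W2, by passing to coordinates in a basis adapted to the decomposition; the helper name make_projection is hypothetical.

```python
import numpy as np

# Illustrative decomposition R^3 = W1 (+) W2:
# W1 = span{e1, e2}, W2 = span{(1, 1, 1)}.
B1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
B2 = np.array([[1.0], [1.0], [1.0]])

P = np.hstack([B1, B2])            # columns: basis of W1, then basis of W2
Pinv = np.linalg.inv(P)

def make_projection(block):
    """Projection onto the subspace spanned by the given columns of P,
    along the span of the remaining columns: keep those coordinates."""
    S = np.zeros((3, 3))
    for j in block:
        S[j, j] = 1.0
    return P @ S @ Pinv

E1 = make_projection([0, 1])       # E1 x = the W1-component of x
E2 = make_projection([2])          # E2 x = the W2-component of x

assert np.allclose(E1 @ E1, E1) and np.allclose(E2 @ E2, E2)  # E_i^2 = E_i
assert np.allclose(E1 @ E2, 0) and np.allclose(E2 @ E1, 0)    # E_i E_j = 0
assert np.allclose(E1 + E2, np.eye(3))                        # E_1 + E_2 = I
```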