We are discussing direct sum decompositions and projections. I gave you some of the preliminary ideas of the proof of the following theorem in the last lecture; let us prove it completely today. Let V be a finite dimensional vector space, and suppose that V is the direct sum of the subspaces W1, W2, ..., Wk. I must mention that there is no operator to begin with; the operator connection will come later, in the next result. Let us first discuss the relationship between direct sum decompositions and projections.

So V is the direct sum of these subspaces. The first part of the theorem says that there exist k linear operators E1, E2, ..., Ek on V such that the following conditions are satisfied. Condition 1: the identity can be written as E1 + E2 + ... + Ek. Condition 2: Ei squared equals Ei, that is, each Ei is a projection; we are going to prove that these are projections. Condition 3: if you take two different Ei's and multiply them, you get the zero operator, that is, Ei Ej = 0 whenever i is not equal to j. Condition 4 gives the relationship between the Ei's and the Wi's: the range of Ei is Wi. The first three are properties of the operators E1, ..., Ek alone; the last one connects these operators to the subspaces Wi that we started with, which form a direct sum decomposition of V.

Now what is important is that the converse is also true. Conversely, suppose that there exist k non-zero operators E1, E2, ..., Ek such that conditions 1, 2, 3 hold. If we set Wi to be the range of Ei, then the converse statement says that V is the direct sum of W1, ..., Wk. So this is really the connection, and you can go from one to the other: if you have a direct sum decomposition of a vector space into independent subspaces, you can define projections onto those subspaces; conversely, if you have projections satisfying these conditions, you can use them to obtain a direct sum decomposition of the vector space.

We will need this result. Perhaps in today's lecture itself we will go a little further and also discuss how these projections are related to operators. For the projections, and hence the subspaces, to be related to an operator, the natural connection is that the subspaces must be invariant. We will also try to discuss invariant direct sum decompositions today, but first, direct sum decompositions and projections.

Proof. I have already given the first few steps in my last lecture. There are two parts. The first part is that if there is such a decomposition, then we can define projections which satisfy these properties, with the relationship between the projections and the subspaces given by the last condition. So let us assume that V has this direct sum decomposition. If this happens, then we know that any vector x in V has a unique decomposition x = x1 + x2 + ... + xk, where the first term comes from W1, the second term comes from W2, and so on; the last term comes from Wk. There is an order in which I write the terms.
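A minimal numerical sketch of this unique decomposition, assuming numpy and some made-up subspaces of R^3 (illustrative choices, not from the lecture): take W1 spanned by two vectors and W2 spanned by one, so that R^3 = W1 ⊕ W2, and recover the two components of an arbitrary x by solving one linear system in the combined basis.

```python
import numpy as np

# Illustrative subspaces of R^3: W1 = span{(1,0,0), (0,1,1)}, W2 = span{(1,1,0)}.
# Together they span R^3 and intersect only in 0, so R^3 = W1 ⊕ W2.
B1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])   # basis of W1 as columns
B2 = np.array([[1.0], [1.0], [0.0]])                   # basis of W2 as a column

B = np.hstack([B1, B2])          # combined basis of V = R^3
x = np.array([2.0, 3.0, 5.0])    # an arbitrary vector to decompose

# Coordinates of x in the combined basis; uniqueness of the decomposition
# is exactly the invertibility of B.
c = np.linalg.solve(B, x)
x1 = B1 @ c[:2]                  # component in W1
x2 = B2 @ c[2:]                  # component in W2

assert np.allclose(x, x1 + x2)   # x = x1 + x2, with x1 in W1 and x2 in W2
print(x1, x2)
```

The invertibility of the combined basis matrix is exactly what makes the decomposition unique.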
Remember that these are additions of vectors, so I could also write this as x2 + x1 and so on, but I will not do that. I want to look at a decomposition which corresponds to the direct sum decomposition that is given to me, so given a vector x I will look for its unique decomposition in this manner; that is always possible because we have a direct sum decomposition.

I want to define the operators. This was done last time, so let me do it quickly. For each i, I define Ei of x to be the i-th term in this decomposition. That is why I need to know what the i-th term is; when I say the i-th term, I must be careful about the order in which I write the terms, and that order has already been fixed. So I define Ei x = xi, the i-th term on the right hand side. The claim is that these Ei, for i = 1 to k, satisfy the first three conditions and that the range of Ei equals Wi. That is what we will prove, but all of this is very straightforward.

First, this is my definition of the Ei. Whenever I define a function I must check that it is well defined: for a given x, is this xi unique? It is unique, because the representation is unique and I am following this fixed order, the first term coming from W1, the second from W2, and the last term from Wk. So Ei is well defined. Linearity is also easy to see, because the unique decomposition of x + c y is obtained by adding the decomposition of x to c times the decomposition of y, term by term.

The first condition is that the identity can be written as the sum of these operators, and that is almost immediate. Let us do it here: x = x1 + x2 + ... + xk, where x1 is by definition E1 x, x2 is E2 x, and so on up to xk = Ek x, since I have defined these k operators by that formula. Each is linear, so I can take the sum: x = (E1 + E2 + ... + Ek) x. I started with x and have written it as this operator applied to x, and this is true for all x, so this operator must be the identity. It follows that E1 + E2 + ... + Ek is the identity operator. That is the first condition.

Second condition: Ei squared. Let us look at Ei squared applied to a vector x, where x has the representation above. Ei squared x is by definition Ei applied to Ei x, and Ei x is the i-th term, so this is Ei of xi. Now I will rewrite xi as 0 + 0 + ... + 0 + xi + 0 + ... + 0, where the zeros come from the other subspaces W1, W2, ..., W(i-1), W(i+1), ..., Wk. This is the unique representation of xi as a sum of vectors from the subspaces W1, ..., Wk, and Ei of any such vector is its i-th term. The i-th term here is xi, so Ei xi = xi; but xi by definition is Ei x. So we have shown that Ei squared x = Ei x for all x, that is, Ei squared = Ei. The second condition is satisfied.

Let me go back and verify the third condition; I will do it here. So this is the statement of the theorem: I want to verify that Ei Ej = 0 for i not equal to j, and finally that the range of Ei is Wi. That too is an immediate consequence of the definition. Look at Ei Ej x. This is Ei of (Ej x), and Ej x is the j-th term xj. So consider Ei of xj; again I will do a similar thing and write xj as 0 + 0 + ... + 0 + xj + 0 + ... + 0, where the i-th term is 0 and xj appears as the j-th term. I am assuming i is less than j; there is no loss of generality, since if i is greater than j the term xj simply comes earlier.
This is the unique representation of the vector xj as a sum of elements of W1, ..., Wk, and Ei of this representation is its i-th term, which is 0. If i comes after j, the argument is the same, so there is no loss of generality. So I have shown that Ei Ej x = 0 for all x, hence Ei Ej must be the zero operator.

Now, range of Ei equals Wi. How does that follow? What is Ei x? I need to go back to the representation: if x = x1 + ... + xk, then Ei x = xi, which by definition belongs to Wi. And remember that if x belongs to the range of Ei, then, since Ei is idempotent, Ei acts like the identity on its range; that is what we saw last time. So if x belongs to the range of Ei, then x = Ei x, and Ei x is xi, which belongs to Wi. Take any idempotent operator: it acts like the identity on its range, so x in the range of Ei gives x = Ei x, and Ei x is by definition xi, which comes from Wi. So the range of Ei is contained in Wi.

On the other hand, if x belongs to Wi, then I want to know what Ei x is. To know Ei x I must know the unique representation of x as a sum; since x is in Wi, x itself is the i-th term and all the other terms are 0. So Ei x is the i-th term, which is x itself. Thus if x belongs to Wi, then x = Ei x belongs to the range of Ei; that is, Wi is contained in the range of Ei. Combined with the previous inclusion, condition 4 holds. That is the connection between the projections and the subspaces; the first three conditions, by contrast, talk only about the collection E1, ..., Ek and the properties these operators satisfy. So this is the first part, and it is really simple if you just follow the logical steps coming from the definition.

Now the converse. I need the three conditions. In this part I must show that if I have operators satisfying the first three conditions, and if I call the ranges of these operators Wi, then this gives rise to a direct sum decomposition of the vector space V. What I will do is first show that any vector in V can be written as a sum of vectors coming from these subspaces, and then show that this representation is unique; it would then follow that the sum is a direct sum decomposition.

Proof of the converse part. I must show that any x in V can be written as a sum of vectors coming from these subspaces, but that is straightforward. Look at the first condition that these operators satisfy: the identity is the sum E1 + E2 + ... + Ek. So if I take x in V, then x = I x = E1 x + E2 x + ... + Ek x, where E1 x belongs to W1, E2 x belongs to W2, and so on, Ek x belongs to Wk. So for one thing, I have written any vector as a sum of vectors coming from W1, ..., Wk; V is contained in this sum. If I now show that these subspaces are independent, then it is a direct sum, and that is the same as showing that this representation is unique. That is the content of the definition of independent subspaces. Is that correct?
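Before completing the converse, here is a minimal numerical check of the first part just proved, again assuming numpy and the same made-up decomposition of R^3 used in the earlier sketch: build the matrices of E1 and E2 explicitly and verify conditions 1 to 4.

```python
import numpy as np

# Same illustrative decomposition R^3 = W1 ⊕ W2 as in the earlier sketch.
B1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
B2 = np.array([[1.0], [1.0], [0.0]])
B = np.hstack([B1, B2])
Binv = np.linalg.inv(B)

# Ei x = "the Wi-component of x": change to the combined basis, keep the
# Wi-coordinates, zero out the rest, and change back.
E1 = B @ np.diag([1.0, 1.0, 0.0]) @ Binv   # projection onto W1 along W2
E2 = B @ np.diag([0.0, 0.0, 1.0]) @ Binv   # projection onto W2 along W1

I = np.eye(3)
assert np.allclose(E1 + E2, I)                    # condition 1: E1 + E2 = I
assert np.allclose(E1 @ E1, E1)                   # condition 2: Ei^2 = Ei
assert np.allclose(E2 @ E2, E2)
assert np.allclose(E1 @ E2, np.zeros((3, 3)))     # condition 3: Ei Ej = 0, i != j
assert np.allclose(E2 @ E1, np.zeros((3, 3)))

# Condition 4: range(E1) = W1 -- E1 fixes W1 and kills W2, and similarly for E2.
assert np.allclose(E1 @ B1, B1) and np.allclose(E1 @ B2, 0)
assert np.allclose(E2 @ B2, B2) and np.allclose(E2 @ B1, 0)
```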
Subspaces W1, W2, ..., Wk are independent if the equation u1 + u2 + ... + uk = 0, with each ui coming from Wi, implies that each ui is 0. This is equivalent to saying that the representation of a vector as such a sum is unique. So we will prove that this representation is unique.

By the way, I must mention that E1 x belongs to W1, E2 x belongs to W2, and so on, Ek x belongs to Wk, because Wi is just a notation for the range of Ei; it is not an assumption. The claim is that if the operators satisfy conditions 1, 2, 3, and if we call their ranges W1, ..., Wk, then that must give rise to a direct sum decomposition. So E1 x is in the range of E1, which is W1, and so on, and I have the representation above.

I want to show uniqueness. Suppose that x is also written as y1 + y2 + ... + yk, where each yi belongs to Wi. I must show that y1 = E1 x, y2 = E2 x, and so on, yk = Ek x. I will just look at E1 x. E1 x is by definition E1 of (y1 + y2 + ... + yk), and E1 is a linear operator, so this is E1 y1 + E1 y2 + ... + E1 yk. I will keep the first term E1 y1 as it is for the moment and look at the other terms. Take E1 y2: y2 is in W2, and W2 is the range of E2, so y2 = E2 y2; again I am using the fact that a projection acts like the identity on its range. So E1 y2 is really E1 E2 y2, and similarly for each later term, up to E1 yk = E1 Ek yk. Now look at the terms from the second onwards and use property 3, which these operators satisfy: all those terms are 0. So I am left with just E1 y1. But where does y1 come from? y1 comes from W1, which is the range of E1, so E1 y1 = y1. This is what we wanted to prove: E1 x = y1. It can be similarly shown that Ei x = yi for all i. So the representation of x is unique, that is, these subspaces are independent, which is the same as saying that the sum is not just an ordinary sum but a direct sum decomposition. So V = W1 ⊕ W2 ⊕ ... ⊕ Wk. That is the converse part.

Okay, is that clear? As I mentioned, we also need to look at operators and their relationships with these projections: what is the relationship, and how is it expressed in terms of the subspaces? I will state the next result, which will make use of this theorem, so I will keep the whole statement in view. Let T be a linear operator on this finite dimensional vector space V, and let E1, ..., Ek and W1, ..., Wk be as above; the meaning is that E1, E2, ..., Ek are operators satisfying the first three conditions of the previous theorem, and W1, W2, ..., Wk are the subspaces defined by Wi = range of Ei. So I already have a direct sum decomposition of the vector space V. How do we get T into the picture? The first thing we must observe is that T commutes with each Ei, that is, T Ei = Ei T for every i, if and only if each Wi is invariant under T, that is, T(Wi) is contained in Wi for every i. So the answer is in terms of invariant subspaces; that is why this notion was introduced some time ago. It is the natural notion connecting a linear operator and the projections associated with it.
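A minimal numerical sketch of this equivalence, with numpy and illustrative matrices of my own: a T that is block-diagonal with respect to the W1 and W2 coordinates maps each Wi into itself and commutes with both projections, while adding a single off-block entry breaks the invariance and, with it, the commutation.

```python
import numpy as np

# Illustrative decomposition R^3 = W1 ⊕ W2 and its projections, as before.
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 1.0, 0.0]])
Binv = np.linalg.inv(B)
E1 = B @ np.diag([1.0, 1.0, 0.0]) @ Binv
E2 = np.eye(3) - E1

# A T that is block-diagonal in the W1/W2 coordinates maps each Wi into itself.
T_inv = B @ np.array([[2.0, 1.0, 0.0],
                      [0.0, 3.0, 0.0],
                      [0.0, 0.0, 5.0]]) @ Binv
assert np.allclose(T_inv @ E1, E1 @ T_inv)   # T Ei = Ei T
assert np.allclose(T_inv @ E2, E2 @ T_inv)

# An off-block entry sends part of W2 into W1, so the invariance fails,
# and with it the commutation.
T_bad = B @ np.array([[2.0, 1.0, 1.0],
                      [0.0, 3.0, 0.0],
                      [0.0, 0.0, 5.0]]) @ Binv
assert not np.allclose(T_bad @ E1, E1 @ T_bad)
```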
We will show later how such projections are associated with the linear operator T, but this is already one relationship: each Wi is invariant under T if and only if each Ei commutes with T, and conversely. Using this, we will look at diagonalization in a different language. For diagonalization we have already seen two characterizations; now we will look at another characterization, diagonalization using invariant direct sums. That is the objective of this topic. Let us first prove this result and then look at that theorem.

Proof. I can freely use the properties of the Ei's and Wi's established earlier. Let us first show that if the commutation condition holds, then each Wi is invariant. Suppose that T Ei = Ei T for all i. I want to show that Wi is invariant under T, so let y belong to T(Wi); I must show that y belongs to Wi. I can write y = T x for some x in Wi. Now x is in Wi, and Wi and Ei are related by the fourth condition: Wi is the range of Ei, and Ei acts like the identity on its range, so x = Ei x. Therefore y = T Ei x, and since T Ei = Ei T by assumption, y = Ei (T x). Whatever T x may be, y is Ei of something, so y belongs to the range of Ei, which by definition is Wi. That is what I wanted to prove: if y belongs to T(Wi), then y belongs to Wi. So if T Ei = Ei T, then T(Wi) is contained in Wi.

We must establish the converse. Conversely, assume that each Wi is invariant under T; I must show that T and each Ei commute. So we need to look at the representation again. Take x in V. Then x = E1 x + E2 x + ... + Ek x; this comes from the first condition, the identity is E1 + E2 + ... + Ek. What is T x? T x is T applied to this sum, that is, T E1 x + T E2 x + ... + T Ek x. Now each Wi is invariant under T, and Ei x belongs to Wi, so T Ei x belongs to Wi; but Wi is the range of Ei, so T Ei x can be written as Ei of some vector, which I will call yi. So T x = E1 y1 + E2 y2 + ... + Ek yk, that is, T x equals the sum over j from 1 to k of Ej yj.

Now I want to look at Ei T. Ei T x is the sum over j from 1 to k of Ei Ej yj. Here i is fixed and j is the running index. When j takes the value i, I get Ei squared yi, which is Ei yi; all the other terms are 0 because of the properties these projections satisfy, Ei Ej = 0 when j is not equal to i. So Ei T x = Ei yi. But Ei yi, by the way these terms were introduced, is exactly the i-th term of the representation of T x, namely T Ei x. So I have shown Ei T x = T Ei x for all x, and that is the second part. Let me go over the key step once more: I had T applied to a vector lying in W1.
So I have T of some vector, call it w1, with w1 in capital W1. T w1 is again in W1 because W1 is invariant, and W1 on the other hand is the range of E1, so T w1 can be written as E1 of some vector, which I am calling y1. If a vector belongs to the range of E1, it is E1 of something; I do that for each term, so the first term is E1 y1, the second is E2 y2, and so on. So this relationship, T Ei = Ei T for every i, holds between the operator T and all the projections if and only if each of these subspaces, that is, the range of each projection, is invariant under T.

Let us now connect all of this with diagonalizability. V is a finite dimensional vector space and T is a linear operator on V. Let lambda 1, ..., lambda k be the distinct eigenvalues of T. If T is diagonalizable, then there exist k linear operators E1, E2, ..., Ek such that the following conditions are satisfied. Condition 1: T is a linear combination of these operators, and the coefficients come from the eigenvalues, T = lambda 1 E1 + lambda 2 E2 + ... + lambda k Ek. This is one relationship between T and the operators, the projections E1, ..., Ek. Conditions 2, 3, and 4 we have seen before: the identity is E1 + E2 + ... + Ek; each Ei is a projection, that is, Ei squared equals Ei for all i; and they are in a sense mutually perpendicular, the product of two distinct ones is zero, Ei Ej = 0 whenever i is not equal to j. The final condition identifies the range spaces. Recalling what we proved just now, the Ei's are such that their ranges are certain subspaces which give rise to a direct sum decomposition; now I also have the operator T with its eigenvalues and eigenvectors, so what do you expect these subspaces to be? Eigenspaces. Condition 5: the range of Ei equals the eigenspace corresponding to the eigenvalue lambda i, for all i; the range of E1 is the eigenspace corresponding to lambda 1, and so on. Since T is diagonalizable, this also means that V is the direct sum of these eigenspaces; that is what diagonalizability means.

The converse also holds. What is the converse? Conversely, suppose that there exist k non-zero linear operators E1, E2, ..., Ek and distinct numbers lambda 1, lambda 2, ..., lambda k such that conditions 1 to 4 hold: T is this particular linear combination, the identity is the sum of the Ei, each Ei is a projection, and the product of any two distinct ones is zero. Then lambda 1, ..., lambda k are the eigenvalues of T, that is the first conclusion; the operator T is diagonalizable; and the range of Ei is the eigenspace corresponding to the eigenvalue lambda i. That is the converse part. So this is a kind of necessary and sufficient condition for T to be diagonalizable.

Again this has two parts, and the first part is relatively easy, so maybe I will prove the first part in today's lecture. Proof. The first part is this: if T is diagonalizable, then I must show that there exist k linear operators satisfying these conditions, together with the condition that the range of Ei is the eigenspace corresponding to lambda i. Of these, conditions 2, 3 and 4 have been verified already, so I will simply appeal to that. I am given that T is diagonalizable.
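Before going through the proof, a minimal numerical sketch of the statement, using numpy and an illustrative diagonalizable matrix with three distinct eigenvalues: build the Ei from the eigenvectors of T and check conditions 1 to 5.

```python
import numpy as np

# An illustrative diagonalizable operator on R^3 with three distinct eigenvalues.
T = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])
lam, P = np.linalg.eig(T)   # columns of P are eigenvectors, T = P diag(lam) P^{-1}
Pinv = np.linalg.inv(P)

# Ei keeps the i-th eigen-coordinate and kills the others: the projection onto
# the eigenspace of lam[i] along the remaining eigenspaces.
E = [P @ np.diag((np.arange(3) == i).astype(float)) @ Pinv for i in range(3)]

# Condition 1: T = lam_1 E_1 + lam_2 E_2 + lam_3 E_3.
assert np.allclose(T, sum(lam[i] * E[i] for i in range(3)))
# Condition 2: I = E_1 + E_2 + E_3.
assert np.allclose(np.eye(3), sum(E))
# Conditions 3 and 4: each Ei is idempotent, and Ei Ej = 0 for i != j.
for i in range(3):
    assert np.allclose(E[i] @ E[i], E[i])
    for j in range(3):
        if j != i:
            assert np.allclose(E[i] @ E[j], 0)
# Condition 5: T acts as lam[i] on range(Ei), i.e. every vector in range(Ei)
# is an eigenvector for lam[i].
for i in range(3):
    assert np.allclose(T @ E[i], lam[i] * E[i])
```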
So I can write V = W1 ⊕ W2 ⊕ ... ⊕ Wk, where Wi is the eigenspace corresponding to the eigenvalue lambda i. I can write this because T is diagonalizable: it has a basis in which each basis vector is an eigenvector, so V is the direct sum of the eigenspaces. That is the starting point. If T is diagonalizable, I must show that these conditions are satisfied; conditions 2 to 4 will follow from what we did earlier, so we only need to verify conditions 1 and 5.

Define the Ei as before. To do that you need a representation: for x in V, I have x = x1 + x2 + ... + xk, where xi comes from Wi, and this representation I know is unique. Using this representation I can define E1, E2, ..., Ek as before, so we do not have to do it again; I will simply mention that conditions 2 to 4 hold: the identity is E1 + E2 + ... + Ek, Ei squared is Ei, and Ei Ej = 0 when i is not equal to j. Of these, I will use the condition that the identity is E1 + E2 + ... + Ek. Look at T x. T x is T acting on I x, so T x = T E1 x + T E2 x + ... + T Ek x.

Now look at E1 x. E1 x is in the range of E1. For the moment, assume condition 5 and proceed; condition 5 is immediate and I will prove it next. So suppose condition 5 holds: the range of Ei is Wi, and the Wi have the property that V = W1 + W2 + ... + Wk and anything in Wi, so long as it is non-zero, is an eigenvector corresponding to the eigenvalue lambda i. In particular, if z belongs to Wi, then z satisfies T z = lambda i z; that is what I will use. E1 x is in W1, and W1 is the eigenspace for lambda 1, so T of that vector must be the number lambda 1 times that vector; the vector here is E1 x, so T E1 x = lambda 1 E1 x, and similarly for the other terms up to lambda k Ek x. Take x outside: T x = (lambda 1 E1 + ... + lambda k Ek) x. So T x has been shown to be this operator acting on x, for every x, so this operator must equal T. So condition 1 holds: T is the specific linear combination lambda 1 E1 + lambda 2 E2 + ... + lambda k Ek.

I must prove condition 5 so that this argument is valid, but condition 5 is really straightforward. What is the range of Ei? Condition 5 says I must show that the range of Ei is the eigenspace corresponding to the eigenvalue lambda i. But how is Ei defined? Ei is defined via the i-th term in that representation. Let us look at x = x1 + x2 + ... + xi + ... + xk; then Ei x is the i-th term xi, and that xi comes from Wi. So the range of Ei is Wi, exactly as before. But what is Wi? Wi is the eigenspace we started with; we began with the decomposition in which Wi is the eigenspace corresponding to the eigenvalue lambda i. So the argument for the fifth property is as before; the only extra thing now is that Wi is the eigenspace of the operator T corresponding to the eigenvalue lambda i. So condition 5 holds, and the earlier argument is valid. That proves the first part. The second part I will prove in the next lecture; that will take some time.
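As a preview of that converse part, here is a minimal numerical sketch of it, again with numpy, made-up projections, and two distinct scalars: define T by condition 1 and observe that T comes out diagonalizable with exactly those scalars as its eigenvalues.

```python
import numpy as np

# Converse direction, previewed numerically: start from projections satisfying
# conditions 2 to 4 and from two distinct scalars (illustrative choices),
# define T by condition 1, and check diagonalizability.
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 1.0, 0.0]])            # made-up basis of R^3
Binv = np.linalg.inv(B)
E1 = B @ np.diag([1.0, 1.0, 0.0]) @ Binv   # E1 + E2 = I, Ei^2 = Ei, E1 E2 = E2 E1 = 0
E2 = B @ np.diag([0.0, 0.0, 1.0]) @ Binv
lam1, lam2 = 2.0, 7.0                      # distinct numbers

T = lam1 * E1 + lam2 * E2                  # condition 1 defines T

# Every x = E1 x + E2 x, with T(E1 x) = lam1 (E1 x) and T(E2 x) = lam2 (E2 x),
# so V is spanned by eigenvectors of T: T is diagonalizable.
assert np.allclose(T @ E1, lam1 * E1)
assert np.allclose(T @ E2, lam2 * E2)
# Its eigenvalues are exactly lam1 (with multiplicity dim range(E1)) and lam2.
assert np.allclose(np.sort(np.real(np.linalg.eigvals(T))), [lam1, lam1, lam2])
```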