the sum of the eigenvalues. So there is, or at least appears to be, a close connection between the diagonal entries of a matrix and its eigenvalues. What is the precise relationship between them? Beyond saying that both sets are real valued and have the same sum, can we be more specific about the relationship between these two sets of real numbers? The relevant notion is called majorization, and we will define it now. By the way, the word "majorization" also appears in the context of optimization, but that is a different notion, so do not get confused; what we define here is the linear-algebra notion.

So let α ∈ ℝⁿ and β ∈ ℝⁿ be given, with their entries arranged in increasing order: α_{j1} ≤ α_{j2} ≤ ... ≤ α_{jn}, where j1 is the index of the smallest entry of α and jn the index of the largest, and similarly β_{m1} ≤ β_{m2} ≤ ... ≤ β_{mn}. If

  Σ_{i=1}^{k} β_{mi} ≥ Σ_{i=1}^{k} α_{ji}  for every k = 1, ..., n,

with equality holding at k = n, then β is said to majorize α. So basically β majorizes α, in the sense that β is in a certain sense "greater than or equal to" α, if the sum of the k smallest entries of β is at least the sum of the k smallest entries of α for k = 1, ..., n − 1, and the total sums of the entries are equal. So for example the vector (1, 2, 3, 4) majorizes (0, 1, 2, 7).
Indeed, 1 ≥ 0; 1 + 2 = 3 ≥ 0 + 1 = 1; 1 + 2 + 3 = 6 ≥ 0 + 1 + 2 = 3; and the sum of all the entries is 10 on both sides. So equality is required only when you add up all the entries together. Note also that I do not need the vectors to have been written in increasing order: (4, 3, 1, 2) majorizes (1, 2, 7, 0) just the same, because only the sorted partial sums matter. Of course, it is quite possible for two vectors with the same sum not to majorize each other; majorization is a very special structure, and not all pairs of vectors can be ordered like this. Okay, so this is the notion of majorization.

So we have the following result (this is Schur's theorem): if A ∈ ℂ^{n×n} is Hermitian, then the vector of diagonal entries of A majorizes the vector of eigenvalues of A. That means that the smallest diagonal entry of A is greater than or equal to the smallest eigenvalue of A; the sum of the two smallest diagonal entries is greater than or equal to the sum of the two smallest eigenvalues; and so on. So let's quickly prove this. You can see this is very interesting, and, I would say, a very counter-intuitive result: it is remarkable that such a clean relationship exists between the diagonal entries of a matrix and its eigenvalues. The proof goes by induction on the size of the matrix. For n = 1 the matrix is a scalar, which is equal to its own diagonal entry and also to its own eigenvalue, so there is nothing to prove.
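As a brief aside before continuing with the proof: the definition of majorization above is easy to check numerically. Here is a minimal sketch (the function name `majorizes` and the use of NumPy are my own choices, not from the lecture):

```python
import numpy as np

def majorizes(beta, alpha, tol=1e-12):
    """Check whether beta majorizes alpha in the convention used here:
    for k = 1, ..., n-1 the sum of the k smallest entries of beta is
    >= the sum of the k smallest entries of alpha, and the total sums
    are equal.  The input order of the entries does not matter."""
    a = np.sort(np.asarray(alpha, dtype=float))  # increasing order
    b = np.sort(np.asarray(beta, dtype=float))
    ca, cb = np.cumsum(a), np.cumsum(b)          # partial sums of the smallest entries
    return (np.all(cb[:-1] >= ca[:-1] - tol)     # k = 1, ..., n-1
            and np.isclose(ca[-1], cb[-1]))      # equality at k = n

# the example from the lecture, in any order of the entries:
print(majorizes([1, 2, 3, 4], [0, 1, 2, 7]))   # True
print(majorizes([4, 3, 1, 2], [1, 2, 7, 0]))   # True
print(majorizes([0, 1, 2, 7], [1, 2, 3, 4]))   # False: 0 < 1 already at k = 1
```

Sorting first is what makes the input order irrelevant, exactly as in the remark above.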
Now suppose the result holds for all k × k matrices with k up to n − 1; we need to show that it holds for n. So let A be a given n × n Hermitian matrix. Consider the matrix A₁ obtained from A by deleting one row and the corresponding column, namely the row and column containing the largest diagonal entry of A. For A₁ the result holds by the induction hypothesis. And if you are quick, you can already see how this proof will go: since A₁ is obtained by deleting a row and a column, we will use the interlacing result. So let λ₁ ≤ ... ≤ λₙ be the eigenvalues of A, and λ′₁ ≤ ... ≤ λ′_{n−1} the eigenvalues of A₁. First, let a_{i₁i₁} ≤ a_{i₂i₂} ≤ ... ≤ a_{iₙiₙ} be the diagonal entries of A arranged in increasing order; we obtained A₁ by deleting the iₙ-th row and column of A, so the diagonal entries of A₁, arranged in increasing order, are exactly a_{i₁i₁}, ..., a_{i_{n−1}i_{n−1}}. Now, by the induction hypothesis,

  Σ_{j=1}^{k} a_{i_j i_j} ≥ Σ_{j=1}^{k} λ′_j  for k = 1, ..., n − 1.

From the interlacing theorem we also have λ₁ ≤ λ′₁ ≤ λ₂ ≤ λ′₂ ≤ ... ≤ λ′_{n−1} ≤ λₙ.
So if I take the first k terms here, each λ′_j is at least λ_j: λ′₁ ≥ λ₁, λ′₂ ≥ λ₂, and so on. Adding these up, I can write

  Σ_{j=1}^{k} λ′_j ≥ Σ_{j=1}^{k} λ_j  for k = 1, ..., n − 1.

Chaining this with the previous inequality gives

  Σ_{j=1}^{k} a_{i_j i_j} ≥ Σ_{j=1}^{k} λ_j  for k = 1, ..., n − 1,

and at k = n equality certainly holds, because the trace equals the sum of the eigenvalues. So that's it: what we have shown is that the vector of diagonal entries of a Hermitian matrix A majorizes the vector of eigenvalues of A, and the interlacing property was the essential ingredient in showing this result.
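Both ingredients of the proof can be observed numerically on a random example. The following sketch (NumPy-based, with names of my own choosing) builds a random real symmetric matrix, checks the interlacing of the eigenvalues of the submatrix obtained by deleting a row and column, and checks that the sorted diagonal majorizes the sorted eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                        # random real symmetric (hence Hermitian) matrix

lam = np.linalg.eigvalsh(A)              # eigenvalues of A, in increasing order
A1 = np.delete(np.delete(A, n - 1, 0), n - 1, 1)   # delete last row and column
mu = np.linalg.eigvalsh(A1)              # eigenvalues of the principal submatrix

# interlacing: lam[0] <= mu[0] <= lam[1] <= mu[1] <= ... <= mu[n-2] <= lam[n-1]
tol = 1e-10
interlaced = all(lam[k] <= mu[k] + tol and mu[k] <= lam[k + 1] + tol
                 for k in range(n - 1))

# Schur majorization: partial sums of the k smallest diagonal entries
# dominate those of the k smallest eigenvalues, with equality at k = n
d = np.sort(np.diag(A))
partial = np.all(np.cumsum(d)[:-1] >= np.cumsum(lam)[:-1] - tol)
total = np.isclose(d.sum(), lam.sum())   # trace = sum of eigenvalues

print(bool(interlaced), bool(partial and total))
```

Here the deleted row and column is simply the last one; the theorem holds for deleting any row together with its matching column, and the proof above only needed the particular choice of the largest diagonal entry.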
Now, majorization is actually very useful in expressing the relationship between the eigenvalues of a sum of matrices and those of the individual summands. For example, if you recall, we have seen results like

  λ_k(A) + λ₁(B) ≤ λ_k(A + B) ≤ λ_k(A) + λ_n(B),

so the k-th eigenvalue of A + B is at least the k-th eigenvalue of A plus the smallest eigenvalue of B, and at most the k-th eigenvalue of A plus the largest eigenvalue of B. If B is positive semidefinite then λ₁(B) ≥ 0, and we get λ_k(A) ≤ λ_k(A + B). We have also seen that λ_{j+k−n}(A + B) ≤ λ_j(A) + λ_k(B). These are results we saw earlier. In this context, there are two more results that I am just going to state, because we do not have time to do the proofs right now; they describe majorization-type relationships between the eigenvalues of the summands and the eigenvalues of the sum of two matrices.

The first result is the following. Let A and B be n × n Hermitian matrices, and let λ(A), λ(B) and λ(A + B) denote the column vectors in ℝⁿ whose components are the eigenvalues of A, B and A + B respectively, arranged in increasing order (this is very important). Then the vector λ(A + B) majorizes the vector λ(A) + λ(B). So this gives a more precise relationship between the eigenvalues of A + B on one side and the eigenvalues of A and the eigenvalues of B on the other; what you have to do is arrange the eigenvalues of A and of B in increasing order and add them componentwise.

Then we also have the following converse result. Let n ≥ 1, and let a₁ ≤ a₂ ≤ ... ≤ aₙ and λ₁ ≤ λ₂ ≤ ... ≤ λₙ be given real numbers; imagine that the aᵢ are some prospective diagonal entries and the λᵢ some prospective eigenvalues. Suppose the vector a with entries aᵢ majorizes the vector λ with entries λᵢ. Then there exists a real symmetric matrix A = (a_{ij}) ∈ ℝ^{n×n} such that a_{ii} = aᵢ for i = 1, ..., n, and λ₁, ..., λₙ are the eigenvalues of A. So given two sets of real numbers, one of which majorizes the other, you can always find a matrix whose diagonal entries form the majorizing vector and whose eigenvalues form the majorized vector. There is always such a matrix. So that is all I wanted to say today; we will continue on Monday.
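Both the earlier eigenvalue bounds for A + B and the majorization statement for sums can be spot-checked numerically. A sketch under the same increasing-order convention (the helper name `rand_sym` is mine; NumPy's `eigvalsh` already returns eigenvalues in increasing order):

```python
import numpy as np

def rand_sym(rng, n):
    """Random real symmetric n x n matrix."""
    M = rng.standard_normal((n, n))
    return (M + M.T) / 2

rng = np.random.default_rng(1)
n = 5
A, B = rand_sym(rng, n), rand_sym(rng, n)
la = np.linalg.eigvalsh(A)               # eigenvalues of A, increasing
lb = np.linalg.eigvalsh(B)               # eigenvalues of B, increasing
lab = np.linalg.eigvalsh(A + B)          # eigenvalues of A + B, increasing

tol = 1e-10
# lambda_k(A) + lambda_1(B) <= lambda_k(A+B) <= lambda_k(A) + lambda_n(B)
weyl = np.all(la + lb[0] <= lab + tol) and np.all(lab <= la + lb[-1] + tol)

# lambda(A+B) majorizes lambda(A) + lambda(B): partial sums of the k smallest
# entries of lambda(A+B) dominate those of la + lb, with equal totals
# (trace is additive, so the full sums agree)
s = la + lb                              # componentwise sum, already increasing
major = (np.all(np.cumsum(lab)[:-1] >= np.cumsum(s)[:-1] - tol)
         and np.isclose(lab.sum(), s.sum()))

print(bool(weyl), bool(major))
```

A numerical check on one random pair is of course no substitute for the proofs, but it makes the increasing-order convention in the statement concrete.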
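For the converse result, a full construction for general n takes more work than we have time for, but the 2 × 2 case can be written down explicitly: matching the trace is automatic, matching the determinant forces the off-diagonal entry, and majorization guarantees the square root below is real. This is my own illustrative sketch, not the general construction:

```python
import numpy as np

def symmetric_2x2(a1, a2, lam1, lam2):
    """Given a1 <= a2 and lam1 <= lam2 with (a1, a2) majorizing (lam1, lam2)
    in the convention used here (a1 >= lam1 and a1 + a2 == lam1 + lam2),
    return a real symmetric 2x2 matrix with diagonal (a1, a2) and
    eigenvalues lam1, lam2.  The trace condition holds automatically;
    the determinant condition a1*a2 - b**2 == lam1*lam2 determines b."""
    b = np.sqrt(a1 * a2 - lam1 * lam2)   # real, since majorization holds
    return np.array([[a1, b], [b, a2]])

A = symmetric_2x2(1.0, 2.0, 0.0, 3.0)    # (1, 2) majorizes (0, 3)
print(np.diag(A))                        # diagonal entries: 1 and 2
print(np.linalg.eigvalsh(A))             # eigenvalues: 0 and 3
```

Here (1, 2) majorizes (0, 3) because 1 ≥ 0 and 1 + 2 = 0 + 3, and the resulting matrix has exactly the prescribed diagonal and spectrum.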