Okay, so in order to proceed further I need one more definition, which is just a small extension of what we already know. A matrix B ∈ C^{n×n} is said to be Hermitian positive semi-definite, which I will often abbreviate by PSD, if it is Hermitian and x^H B x ≥ 0 for every x ∈ C^n. So the only difference from a positive definite matrix is that the inequality becomes ≥ 0, and it is okay for it to hold for every x ∈ C^n; we do not have the restriction that x must be a non-zero vector. Of course, for defining positive definite matrices I cannot allow x = 0, because then x^H B x = 0 and a strict inequality cannot be satisfied; but for positive semi-definite matrices we can allow x = 0, because after all the condition is only that x^H B x ≥ 0. We will denote this by B ≥ 0, and unless I explicitly say otherwise, B ≥ 0 means positive semi-definite; it does not mean that all the entries of B are real and non-negative. Once we have defined positive semi-definite matrices, we can state what is called a monotonicity theorem, which says that none of the eigenvalues of a Hermitian matrix can decrease if you add a positive semi-definite matrix to it: each eigenvalue either increases or remains the same. It is the following corollary. Let A and B be n × n Hermitian matrices with B positive semi-definite, and arrange the eigenvalues of A and A + B in increasing order. Then λ_k(A) ≤ λ_k(A + B) for k = 1, 2, ..., n. This follows directly from the previous theorem; in fact, all you need is the lower inequality.
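Since the statement is easy to test numerically, here is a quick sanity check (a sketch using numpy; the random Hermitian A and the construction B = C^H C are my own choices for illustration, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A random Hermitian A: (M + M^H)/2 is Hermitian for any square M.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2

# A random PSD B: C^H C is Hermitian and x^H (C^H C) x = ||Cx||^2 >= 0.
C = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = C.conj().T @ C

# eigvalsh returns the eigenvalues of a Hermitian matrix in increasing order.
eig_B = np.linalg.eigvalsh(B)
assert np.all(eig_B >= -1e-10)  # PSD: every eigenvalue is non-negative

# Monotonicity: lambda_k(A) <= lambda_k(A + B) for every k.
eig_A = np.linalg.eigvalsh(A)
eig_AB = np.linalg.eigvalsh(A + B)
assert np.all(eig_A <= eig_AB + 1e-10)

# The lower Weyl inequality is sharper: lambda_k(A) + lambda_1(B) <= lambda_k(A + B).
assert np.all(eig_A + eig_B[0] <= eig_AB + 1e-10)
```

The construction B = C^H C is a standard way to generate a PSD matrix; in fact every PSD matrix arises this way, for instance with C = B^{1/2}.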
One immediate consequence of B being positive semi-definite is that x^H B x ≥ 0 for all x, and so all the eigenvalues of B are greater than or equal to 0; in particular, λ_1(B) ≥ 0. The lower inequality of the previous theorem then gives λ_k(A + B) ≥ λ_k(A) + λ_1(B), that is, λ_k(A) plus a non-negative quantity. If I drop that non-negative quantity the inequality still holds, and so λ_k(A) ≤ λ_k(A + B), which is exactly what the result says. Now, once you start looking at positive semi-definite matrices, you can actually say more about these eigenvalues. It is not just the smallest and largest eigenvalues of B that you can use to bound the kth eigenvalue of A + B; you can prove much more refined results about the location of these eigenvalues, and such bounds often take the form of what is known as an interlacing theorem. There are many interlacing theorems in linear algebra, and we will discuss a few of them. The first interlacing theorem we are going to discuss is for the case where B is a rank-1 matrix. It is the following theorem. As before, let A be an n × n Hermitian matrix and let z be a vector in C^n. Note that A + zz^H and A − zz^H are two different matrices, but whatever I am going to say is valid for both of them, which is why I combine them and write A ± zz^H. Suppose the eigenvalues of A and A ± zz^H are arranged in increasing order. Then we have two parts.
Part (a): λ_k(A ± zz^H) ≤ λ_{k+1}(A) ≤ λ_{k+2}(A ± zz^H), for k = 1, 2, ..., n − 2. Of course, since k + 2 appears here, k can only go up to n − 2. Part (b): λ_k(A) ≤ λ_{k+1}(A ± zz^H) ≤ λ_{k+2}(A). So what this result is saying is that the (k+1)th eigenvalue of A is bounded below by the kth eigenvalue of A ± zz^H and bounded above by the (k+2)th eigenvalue of A ± zz^H; and similarly, the (k+1)th eigenvalue of A ± zz^H is bounded below by λ_k(A) and bounded above by λ_{k+2}(A). In particular, if I chain the inequalities of part (b), taking k = 1 and then k = 3 and so on, I get λ_1(A) ≤ λ_2(A ± zz^H) ≤ λ_3(A) ≤ λ_4(A ± zz^H) ≤ λ_5(A) ≤ ..., and if I start instead with part (a), I get λ_1(A ± zz^H) ≤ λ_2(A) ≤ λ_3(A ± zz^H) ≤ λ_4(A) ≤ ... and so on. That is what these two inequalities say. Let us see how to show this. The starting point is again the Courant-Fischer theorem. So let k be some number in {1, 2, ..., n − 2}, as the condition in part (a) requires, and consider λ_{k+2}(A ± zz^H).
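Before going through the proof, the two parts of the theorem can be sanity-checked numerically (a sketch using numpy; the random A and z are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

# Random Hermitian A and a random vector z in C^n.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)

eig_A = np.linalg.eigvalsh(A)  # eigenvalues of A in increasing order
for sign in (+1, -1):          # the theorem covers both A + zz^H and A - zz^H
    eig_P = np.linalg.eigvalsh(A + sign * np.outer(z, z.conj()))
    # 0-based indices: position k here is eigenvalue number k+1 in the lecture.
    for k in range(n - 2):
        # part (a): lambda_k(A +/- zz^H) <= lambda_{k+1}(A) <= lambda_{k+2}(A +/- zz^H)
        assert eig_P[k] <= eig_A[k + 1] + 1e-10
        assert eig_A[k + 1] <= eig_P[k + 2] + 1e-10
        # part (b): lambda_k(A) <= lambda_{k+1}(A +/- zz^H) <= lambda_{k+2}(A)
        assert eig_A[k] <= eig_P[k + 1] + 1e-10
        assert eig_P[k + 1] <= eig_A[k + 2] + 1e-10
```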
Recall that for λ_k the Courant-Fischer theorem gives a min over w_1, ..., w_{n−k}; so for λ_{k+2} I must take λ_{k+2}(A ± zz^H) = min over w_1, ..., w_{n−k−2} ∈ C^n of max over x ≠ 0, x ⊥ w_1, ..., w_{n−k−2}, of x^H (A ± zz^H) x / x^H x. This in turn is greater than or equal to (and here comes a slightly clever step in the proof) the min over the same w_1, ..., w_{n−k−2} ∈ C^n of the max over x ≠ 0 with x ⊥ w_1, ..., w_{n−k−2} and additionally x ⊥ z, of x^H (A ± zz^H) x / x^H x. Why is this true? It is true because all I am doing is adding one more constraint to the inner maximization: the more constrained problem cannot achieve as large a maximum value, so this quantity is no larger than the previous one. Now, if x is perpendicular to z, the term x^H zz^H x vanishes, so the zz^H term just drops off, and I can write this as the min over w_1, ..., w_{n−k−2} ∈ C^n, with w_{n−k−1} defined to be z, of the max over x ≠ 0, x ⊥ w_1, ..., w_{n−k−1}, of x^H A x / x^H x; making x perpendicular to w_{n−k−1} is the same as making it perpendicular to z. Finally, instead of minimizing subject to the constraint that w_{n−k−1} = z, if I just drop this constraint and allow w_{n−k−1} to be any vector in C^n, the minimization can possibly achieve a lower value than this optimization problem. So I get a further lower bound by allowing w_{n−k−1} to be anything.
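To keep track of these steps, the chain of bounds just derived can be summarized as:

```latex
\lambda_{k+2}(A \pm zz^H)
  = \min_{w_1,\dots,w_{n-k-2}} \;\max_{\substack{x \neq 0 \\ x \perp w_1,\dots,w_{n-k-2}}}
      \frac{x^H (A \pm zz^H)\, x}{x^H x}
  \;\geq\; \min_{w_1,\dots,w_{n-k-2}} \;\max_{\substack{x \neq 0 \\ x \perp w_1,\dots,w_{n-k-2},\; x \perp z}}
      \frac{x^H A\, x}{x^H x}
  \;\geq\; \min_{w_1,\dots,w_{n-k-1}} \;\max_{\substack{x \neq 0 \\ x \perp w_1,\dots,w_{n-k-1}}}
      \frac{x^H A\, x}{x^H x}
  \;=\; \lambda_{k+1}(A).
```

The first inequality adds the constraint x ⊥ z (which also kills the zz^H term), and the second relaxes w_{n−k−1} = z to an arbitrary vector.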
That is, the quantity above is ≥ the min over w_1, ..., w_{n−k−1} ∈ C^n of the max over x ≠ 0, x ⊥ w_1, ..., w_{n−k−1}, of x^H A x / x^H x, and by the Courant-Fischer theorem this is exactly λ_{k+1}(A). Okay, so you can see that the proof involves this interesting step: if x is perpendicular to z you get a lower bound on the first objective; then you push the constraint x ⊥ z into w_{n−k−1} and allow that vector to become arbitrary. Both steps are lower-bounding steps, and together they give λ_{k+2}(A ± zz^H) ≥ λ_{k+1}(A). Proving the full theorem involves proving four inequalities, and this proves one of them. I will do one more; the remaining ones follow from the same ideas. So let k be in {1, 2, ..., n − 1} and look at λ_k(A ± zz^H). This time I will use the max-min version of the Courant-Fischer theorem: λ_k(A ± zz^H) = max over w_1, ..., w_{k−1} ∈ C^n of the min over x ≠ 0, x ⊥ w_1, ..., w_{k−1}, of x^H (A ± zz^H) x / x^H x. This is less than or equal to the max over the same w_1, ..., w_{k−1} of the min over x ≠ 0 with x ⊥ w_1, ..., w_{k−1} and the extra constraint x ⊥ z, of x^H (A ± zz^H) x / x^H x: since I have thrown in this extra constraint, the minimum may not be able to reach as low a value as the unconstrained problem, and therefore this is an upper bound. The next steps are exactly the same as before: I take the max over w_1, ..., w_{k−1} ∈ C^n with w_k set equal to z, of the min over x ≠ 0, x ⊥ w_1, ..., w_k, of x^H
A x / x^H x, where I can drop the zz^H term because x is perpendicular to z here. Then I neglect the constraint w_k = z, and thereby the maximization can possibly achieve an even higher value than this optimization problem. So I have the max over w_1, ..., w_k ∈ C^n of the min over x ≠ 0, x ⊥ w_1, ..., w_k, of x^H A x / x^H x, and by the Courant-Fischer theorem this is exactly equal to λ_{k+1}(A). So λ_k(A ± zz^H) ≤ λ_{k+1}(A), and the inequalities in the statement of the theorem follow from inequalities of this kind. Notice that we used the max-min formulation to prove an upper bound on an eigenvalue, and the min-max formulation to prove a lower bound on an eigenvalue; this is something you will notice in many of the results we show going forward. Now, one useful fact to keep in mind is the following. If B ∈ C^{n×n} is a Hermitian matrix, then it is unitarily diagonalizable (in particular, it is non-defective), so we can write B = U Λ U^H, where the unitary matrix U contains the eigenvectors of B and Λ = diag(λ_1, ..., λ_n) is a diagonal matrix containing the eigenvalues of B. Because B is non-defective, the rank of B equals the number of non-zero eigenvalues. In particular, if rank(B) = r, then only some r of the eigenvalues are non-zero and the remaining ones are zero; say λ_{r+1} = ... = λ_n = 0. So in fact we can write B = Σ_{i=1}^r λ_i u_i u_i^H. Conversely, any matrix of the form Σ_{i=1}^r λ_i u_i u_i^H, where u_1, ..., u_r are linearly independent, has rank at most r, and
if in addition all the λ_i are non-zero, it has rank exactly r. If the u_i are not known to be linearly independent, then all we can say is that it has rank at most r. This fact will turn out to be quite useful. For example, a rank-1 Hermitian matrix can be written as λ_1 u u^H for some non-zero vector u, and so on. Okay, so the next result I want to share is also an interlacing theorem, and it is about what happens if you pad a matrix by a row and a column to get a matrix whose size is one more than that of the matrix you started with. It reads like this. Suppose A ∈ C^{n×n} is Hermitian, y ∈ C^n, and a is a real number, all given. Let Â denote the bordered matrix Â = [A, y; y^H, a]; this is a matrix of size (n+1) × (n+1). The question is: how are the eigenvalues of A related to those of Â? Remember that Â has n + 1 eigenvalues while A has only n. Let the eigenvalues of A and Â be arranged in increasing order and denote them λ_i (for i = 1 to n) and λ̂_i (for i = 1 to n + 1) respectively. Then λ̂_1 ≤ λ_1 ≤ λ̂_2 ≤ λ_2 ≤ ... ≤ λ_{n−1} ≤ λ̂_n ≤ λ_n ≤ λ̂_{n+1}. Okay, so this is the interlacing theorem. What it says is that the largest eigenvalue of Â is going to be at least the largest eigenvalue of A, and the smallest eigenvalue λ̂_1 is going to be at most the smallest eigenvalue of A.
Not only that: all the eigenvalues of Â interlace between pairs of eigenvalues of A. So λ̂_2 lies between λ_1 and λ_2, λ̂_3 lies between λ_2 and λ_3, and so on, with λ̂_n between λ_{n−1} and λ_n. In other words, the second largest eigenvalue of Â lies between the second largest and the largest eigenvalues of A, while the largest eigenvalue of Â is greater than or equal to the largest eigenvalue of A.
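The bordering construction and the interlacing pattern are straightforward to check numerically as well (a sketch using numpy; the particular A, y, and a below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

# Random Hermitian A, border vector y in C^n, and real scalar a.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)
a = float(rng.standard_normal())

# A_hat = [A, y; y^H, a] is an (n+1) x (n+1) Hermitian matrix.
A_hat = np.block([[A, y[:, None]],
                  [y[None, :].conj(), np.array([[a]])]])
assert np.allclose(A_hat, A_hat.conj().T)

lam = np.linalg.eigvalsh(A)          # lambda_1 <= ... <= lambda_n
lam_hat = np.linalg.eigvalsh(A_hat)  # lambda_hat_1 <= ... <= lambda_hat_{n+1}

# Interlacing: lambda_hat_k <= lambda_k <= lambda_hat_{k+1} for k = 1, ..., n.
assert np.all(lam_hat[:-1] <= lam + 1e-10)
assert np.all(lam <= lam_hat[1:] + 1e-10)
```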