So, the generalization of this to all eigenvalues is the following theorem, which is central to the variational characterization of eigenvalues. It is due to two people, Courant and Fischer, and is called the min-max theorem. As usual, our setup is that A is an n × n Hermitian matrix with eigenvalues λ_1 ≤ λ_2 ≤ … ≤ λ_n, and k is an integer with 1 ≤ k ≤ n. Then we have two results. First, the minimum over w_1, …, w_{n−k} ∈ C^n (so I am allowed to choose n − k vectors in C^n over which I do the minimization) of the maximum over x ≠ 0, x ∈ C^n, x ⟂ w_1, …, w_{n−k}, of x^H A x / x^H x is equal to λ_k. Second, the maximum over w_1, …, w_{k−1} of the minimum over x ≠ 0, x ∈ C^n, x ⟂ w_1, …, w_{k−1}, of x^H A x / x^H x is also equal to λ_k. So there are two ways to write λ_k as the solution of an optimization problem, and in both cases it is a double optimization: a min-max in one case and a max-min in the other. And of course, as I mentioned before, the cases k = 1 and k = n are special. When k = 1, the first formula runs over w_1 through w_{n−1}, with x ⟂ w_1, …, w_{n−1}, whereas in the second formula the outer maximization step can be dropped, because there is no such thing as w_0, so that constraint does not arise. You just take the minimum over all x ≠ 0, x ∈ C^n, of x^H A x / x^H x, and that equals λ_1; the orthogonality constraint on x also drops off.
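For reference, the two formulas just stated can be written in displayed form as follows (this is only a transcription of the spoken statement into LaTeX, with x^H denoting the conjugate transpose):

```latex
% Courant--Fischer min--max theorem: A Hermitian, n x n, with
% eigenvalues \lambda_1 \le \cdots \le \lambda_n, and 1 \le k \le n.
\min_{w_1,\dots,w_{n-k} \in \mathbb{C}^n}
  \;\max_{\substack{x \ne 0,\ x \in \mathbb{C}^n \\ x \perp w_1,\dots,w_{n-k}}}
  \frac{x^H A x}{x^H x} \;=\; \lambda_k,
\qquad
\max_{w_1,\dots,w_{k-1} \in \mathbb{C}^n}
  \;\min_{\substack{x \ne 0,\ x \in \mathbb{C}^n \\ x \perp w_1,\dots,w_{k-1}}}
  \frac{x^H A x}{x^H x} \;=\; \lambda_k.
```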
Similarly, when k = n there is no such thing as w_0 in the first formula, so the outer minimization drops off, and likewise x ⟂ w_1, …, w_0 is vacuous, so that constraint drops off too; the maximum over all nonzero x of x^H A x / x^H x is equal to λ_n. So when k = 1 or k = n, in one of the two formulas we omit the outer optimization. We will prove only the first part; the other part is actually almost exactly the same, you just have to modify the steps a bit. As usual, we write A = U Λ U^H, where U is unitary and Λ = diag(λ_1, …, λ_n). Then let k be some integer between 1 and n. If x is some vector with x ≠ 0, then, as usual, x^H A x / x^H x = (U^H x)^H Λ (U^H x) / ((U^H x)^H (U^H x)). Further, the set { U^H x : x ∈ C^n, x ≠ 0 } is the same as the set { y ∈ C^n : y ≠ 0 }. In other words, I am thinking of U^H x as y, and if I want to optimize over all x ≠ 0, I can just as well optimize over all y ≠ 0. So if w_1, …, w_{n−k} ∈ C^n are given, then the sup over x ≠ 0, x ⟂ w_1, …, w_{n−k}, of x^H A x / x^H x equals the supremum over y ≠ 0, y ⟂ U^H w_1, …, U^H w_{n−k}, of y^H Λ y / y^H y, since substituting y = U^H x turns each constraint x ⟂ w_i into y ⟂ U^H w_i. As before, I expand this out: the numerator is equal to Σ_{i=1}^n λ_i |y_i|². Further, I can impose the normalization y^H y = 1 and optimize over all y with y^H y = 1.
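As a quick numerical sanity check of the setup so far, and in particular of the special cases k = 1 and k = n, here is a small sketch (not from the lecture; it assumes NumPy, whose `eigh` returns the eigenvalues of a Hermitian matrix in ascending order):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Random Hermitian matrix A = (M + M^H) / 2.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2

lam, U = np.linalg.eigh(A)      # lam[0] <= ... <= lam[-1], columns of U are eigenvectors

def rayleigh(A, x):
    """Rayleigh quotient x^H A x / x^H x (real, since A is Hermitian)."""
    return (x.conj() @ A @ x).real / (x.conj() @ x).real

# For every nonzero x the quotient lies in [lambda_1, lambda_n] ...
samples = [rayleigh(A, rng.standard_normal(n) + 1j * rng.standard_normal(n))
           for _ in range(1000)]
assert lam[0] - 1e-9 <= min(samples) and max(samples) <= lam[-1] + 1e-9

# ... and the endpoints are attained at the corresponding eigenvectors.
assert np.isclose(rayleigh(A, U[:, 0]), lam[0])
assert np.isclose(rayleigh(A, U[:, -1]), lam[-1])
```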
Okay, so just for the sake of completeness let me write this step: this is equal to the supremum over y^H y = 1, y ⟂ U^H w_1, …, U^H w_{n−k}, of Σ_{i=1}^n λ_i |y_i|². Now I will do my brilliant thing from the previous discussion and say that this is greater than or equal to the supremum over the same set with the additional constraint y_1 = … = y_{k−1} = 0, of Σ_{i=1}^n λ_i |y_i|²; and since I have set all those coordinates to zero, the sum runs only from i = k to n, giving Σ_{i=k}^n λ_i |y_i|². (Note that this restricted set is nonempty: the condition y_1 = … = y_{k−1} = 0 defines a subspace of dimension n − k + 1, and imposing the n − k orthogonality constraints still leaves dimension at least 1.) Since the first k − 1 coordinates are zero, the constraint y^H y = 1 reduces to |y_k|² + |y_{k+1}|² + … + |y_n|² = 1, and y should still remain perpendicular to U^H w_1, …, U^H w_{n−k}; so this in turn equals the supremum, subject to those constraints, of Σ_{i=k}^n λ_i |y_i|². And of course, as before, this is a convex combination of λ_k, λ_{k+1}, …, λ_n, and a convex combination is at least equal to the smallest value involved. Another way to think about it: if I replace all these λ_i with λ_k, I am only decreasing the value, and Σ_{i=k}^n |y_i|² = 1 because of the constraint, so there is nothing left to optimize and I can say that this is greater than or equal to λ_k. Okay, so what we have shown is that the supremum over x ≠ 0, x ⟂ w_1, …, w_{n−k}, of x^H A x / x^H x is greater than or equal to λ_k, and this is true for arbitrary w_1, …, w_{n−k}. But now recall the previous result above, the one I am scrolling up to. As a first step I need to change k to n − k in it. Then I am counting down from u_n, u_{n−1}, and so on, down to u_{k+1}, and if x is perpendicular to all of these vectors, then the maximum there, let me write this with a different color so that it is not confusing later on, becomes λ_k, because I have replaced k with n − k. Now I will call these vectors w_1, w_2, and so on. How many vectors do I have here? I am going down from n to k + 1, so there are exactly n − k vectors, w_1 through w_{n−k}. So basically if I set w_1 = u_n, w_2 = u_{n−1}, and so on down to w_{n−k} = u_{k+1}, the maximum of x^H A x / x^H x subject to x perpendicular to all these vectors is exactly λ_k. So the largest value this supremum can take, for that specific choice of w_1, w_2, …, w_{n−k}, is equal to λ_k. Therefore, and do go back and think about it, this result shows that equality holds when w_i = u_{n−i+1}. That implies that the infimum over all w_1, …, w_{n−k} of the supremum over x ≠ 0, x ⟂ w_1, …, w_{n−k}, of x^H A x / x^H x is equal to λ_k, which completes the proof. Okay, so I don't want to go further ahead; the next theorem is another very crucial theorem, called Weyl's theorem, and we'll discuss that in the next class. But what I strongly suggest is that you should definitely go over the proof on your own and make sure you understand every step of it, because the ideas in this proof will be used again and again to prove many more results. These arguments are slightly tricky to convey orally, especially in this online mode, and since I can't see you guys I don't
know if you were able to follow the proof as I explained it or not. But based on whatever I said, if you now go back and look at the proof on your own, you should be able to fill in the steps; and if you are not, then please stop me at the beginning of the next class and ask me which step you weren't able to follow, and we can go over the argument again. The arguments we made today are crucial; we are going to reuse them in many proofs going forward, and if at that time this proof is not completely clear to you, you won't follow many of the proofs we are going to discuss in the following classes. So please spend some time on it. We'll stop here for today and continue in the next class.
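If you want a concrete handle while going over the proof on your own, the following numerical sketch checks both halves of the argument: for arbitrary w_1, …, w_{n−k} the constrained supremum is at least λ_k, and for the specific choice w_i = u_{n−i+1} it is exactly λ_k, so the min over the w's equals λ_k. This is my own illustration, not part of the lecture; it assumes NumPy, and uses the fact that the supremum of the Rayleigh quotient over a subspace spanned by the orthonormal columns of Q is the top eigenvalue of Q^H A Q:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 7, 3

# Random Hermitian A = (M + M^H) / 2, eigenvalues lam[0] <= ... <= lam[-1].
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
lam, U = np.linalg.eigh(A)

def sup_on_complement(A, W):
    """sup of x^H A x / x^H x over nonzero x perpendicular to the columns of W.

    Writing x = Q y, where Q's columns are an orthonormal basis of the
    orthogonal complement of span(W), this is the top eigenvalue of Q^H A Q.
    """
    Uw, s, _ = np.linalg.svd(W, full_matrices=True)
    Q = Uw[:, np.sum(s > 1e-12):]      # orthonormal basis of the complement
    return np.linalg.eigvalsh(Q.conj().T @ A @ Q)[-1]

# (1) For ANY choice of n-k vectors w_1, ..., w_{n-k}, the sup is >= lambda_k.
for _ in range(200):
    W = rng.standard_normal((n, n - k)) + 1j * rng.standard_normal((n, n - k))
    assert sup_on_complement(A, W) >= lam[k - 1] - 1e-9   # lambda_k, 1-based

# (2) For w_i = u_{n-i+1}, i.e. the eigenvectors of the top n-k eigenvalues,
#     the sup is exactly lambda_k, so the min over all choices is lambda_k.
W_star = U[:, k:]                       # columns u_{k+1}, ..., u_n (1-based)
assert np.isclose(sup_on_complement(A, W_star), lam[k - 1])
```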