Again, we will drop the pretense of dealing with general operators and just talk about $m \times n$ complex matrices. So we start with some $A \in \mathbb{C}^{m \times n}$, and as we proceed with the proof you will see it really does not matter whether $m$ is greater than $n$ or less than $n$; the idea carries forward either way. So, in general, we can just assume $m < n$, and everything we do hereafter, you can check, goes through the same way if you instead take $n < m$. So this is the matrix: no symmetry, nothing. What do we do? The first thing we do is look at $A^H A$. What can you say about this matrix? No matter what $A$ is, $A^H A$ is of size $n \times n$ and is Hermitian. (I should not use the term "symmetric" when we are dealing with complex matrices.) Where are we going with this? What have we just learnt about Hermitian matrices? They are diagonalizable by a very precise kind of basis: an orthonormal basis provided by the eigenvectors. So for $A^H A$ there is some unitary $V$, and it leads us to what exactly? This (let me just not draw it up here):
$$V^H (A^H A)\, V = \Sigma, \qquad \Sigma = \operatorname{diag}(\sigma_1^2, \sigma_2^2, \ldots, \sigma_r^2, 0, \ldots, 0),$$
where the number of zeros is $n - r$. Now, this warrants a bit of explanation. Up to the point of saying $\Sigma$ is a diagonal matrix, there is no problem in admitting it; you will agree. But notice very carefully what I have written: the nonzero diagonal entries are squares of real numbers, which means these diagonal entries have to be positive. Why is that true? Why should that be the case? What I am claiming is that a matrix of the form $A^H A$ must have non-negative eigenvalues. We have actually spoken about this briefly, if you remember, in the least squares solution, but now we are making it very explicit. We gave this matrix a kind of a name: we said it is positive definite, of course, when it was real. But now I am claiming that such a matrix will always have only positive, or at worst zero, eigenvalues; it can never have negative eigenvalues. Why? If you hit $A^H A$ with some $y^H$ on the left and $y$ on the right, then it is exactly equal to $y^H A^H A y = \|Ay\|^2$, and a norm is non-negative, so a norm squared is also non-negative. Therefore this matrix is positive (semi-)definite. Now suppose this fellow ends up having a negative eigenvalue; let me do this as a side note for any Hermitian matrix. Suppose $P = P^H$ and $P \succ 0$ (I am going to use this symbol to denote positive definite), and yet there exists a negative $\lambda \in \mathbb{R}$. (Of course it has to be real: if $P$ is Hermitian, its eigenvalues can only be real.) So let us assume the contrary, that there exists $\lambda < 0$ such that $Pv = \lambda v$. What do I immediately have as a consequence? Taking $v^H P v$ gives $v^H (\lambda v) = \lambda \|v\|^2$, which is surely negative. That is a contradiction, because if $P$ is positive definite, then for no vector $v$ should I be led to the conclusion that $v^H P v$ is negative.
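As a side check (not part of the lecture itself), here is a minimal numpy sketch of the two facts just used: $A^H A$ is Hermitian with non-negative eigenvalues, and the quadratic form equals $\|Ay\|^2$. The matrix sizes and random seed are my own choices for illustration.

```python
import numpy as np

# For any complex A, the matrix A^H A is Hermitian and its eigenvalues
# are real and non-negative.
rng = np.random.default_rng(0)
m, n = 3, 5                                   # m < n, as assumed in the lecture
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

AhA = A.conj().T @ A                          # n x n
assert np.allclose(AhA, AhA.conj().T)         # Hermitian: (A^H A)^H = A^H A

# eigh is the eigensolver for Hermitian matrices; eigenvalues come out
# real, in ascending order.
eigvals, V = np.linalg.eigh(AhA)
assert np.all(eigvals > -1e-12)               # all >= 0, up to round-off

# The quadratic-form argument: y^H (A^H A) y = ||Ay||^2 >= 0.
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)
quad = (y.conj() @ AhA @ y).real
assert np.isclose(quad, np.linalg.norm(A @ y) ** 2)
```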
But the moment it has one eigenvalue which is negative, positive definiteness breaks. So it is at least a necessary condition that every eigenvalue of this fellow be non-negative; as it turns out, it is also a sufficient condition, and that too is fairly easy to see. So the side note proves necessity: every eigenvalue of a positive definite matrix must be non-negative. (Of course, when we say positive definite we normally take the matrix to be real, so the Hermitian business matters not; we just say $P = P^T$. But I am generalizing to cover this matrix as well.) So what we have said is that this fellow cannot have negative eigenvalues, because that would violate the very tenet of positive definiteness. But the condition is also sufficient, you see, because I have an orthonormal basis of eigenvectors, so I can represent every vector as a linear combination of the eigenvectors. And if all the eigenvalues are non-negative, what do I conclude? Take any vector; I am doing this a little loosely, because it is not strictly part of our syllabus. Suppose
$$v = \alpha_1 v_1 + \cdots + \alpha_n v_n,$$
where each $v_i$ is an eigenvector; remember, it is because we have a sufficient number of eigenvectors that we are able to diagonalize in the first place. And each of the eigenvalues is at worst $0$ but never negative, is what we are claiming; so, strictly, we have to relax "positive definite" to "positive semi-definite" here. Now, if this is the case, what does $v^H P v$ turn out to be? Look at $P$ acting on $v$:
$$Pv = \alpha_1 \lambda_1 v_1 + \cdots + \alpha_n \lambda_n v_n.$$
When I hit this with $v^H$ on the left, the $v_1^H$ paired with the first term leads to $|\alpha_1|^2 \lambda_1$; and $v_1^H$ paired with the second term turns out to be $0$, because these fellows are orthogonal. So the whole thing is exactly
$$v^H P v = |\alpha_1|^2 \lambda_1 + |\alpha_2|^2 \lambda_2 + \cdots + |\alpha_n|^2 \lambda_n.$$
(Again, please read all of this as real now; I am using the Hermitian only for general notational purposes. Choosing a complex matrix and calling it positive definite really makes little sense: you are dealing with a real-life problem, so the matrix will be real, and symmetric matrices are a special case of Hermitian matrices.) Now, if all these $\lambda_i$ are positive, or at worst $0$ but not negative, then this sum is greater than or equal to $0$. So indeed, having all eigenvalues non-negative is not just a necessary condition for positive semi-definiteness; it is also a sufficient condition. Is that part clear? I took an arbitrary vector, represented it as a linear combination of the eigenvectors, and hit the matrix with this arbitrary $v^H$ on the left and $v$ on the right (in the real case you can just say $v^T P v$). So, once more: what is $Pv$? $P$ acting on $\alpha_1 v_1$ leads to $\alpha_1 \lambda_1 v_1$, $P$ acting on $\alpha_2 v_2$ leads to $\alpha_2 \lambda_2 v_2$, and so on up to $\alpha_n \lambda_n v_n$. That is what I have written here. Now I act on this using $v^H$, whose coefficients are the conjugates $\bar{\alpha}_i$ (but then these are real).
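The sufficiency direction can also be checked numerically. Below is a small sketch (my own construction, not from the lecture): build a Hermitian $P$ from an orthonormal eigenbasis and non-negative eigenvalues, then confirm $v^H P v \ge 0$ for many random vectors.

```python
import numpy as np

# Sufficiency, numerically: P with non-negative eigenvalues and an
# orthonormal eigenbasis satisfies v^H P v >= 0 for every v.
rng = np.random.default_rng(1)
n = 4
# Q: a random unitary matrix (orthonormal columns) via QR.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
lam = np.array([3.0, 2.0, 0.5, 0.0])          # non-negative eigenvalues (one zero)
P = Q @ np.diag(lam) @ Q.conj().T             # P = P^H by construction

for _ in range(1000):
    v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    # v^H P v = sum_i |alpha_i|^2 lambda_i, where alpha = Q^H v
    assert (v.conj() @ P @ v).real >= -1e-12
```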
So $v^H = \bar{\alpha}_1 v_1^H + \bar{\alpha}_2 v_2^H + \cdots + \bar{\alpha}_n v_n^H$; that is what this object is. Now, when this object acts on each individual term, $v_1^H$ acting on fellows such as $v_2, v_3$ gives $0$. So it only filters out $\alpha_1 \lambda_1$, which gets multiplied by $\bar{\alpha}_1$ (or $\alpha_1$ in the real case), and that is how you get $|\alpha_1|^2$. So this is the exact expression you get, and in this expression the coefficients $|\alpha_1|^2, |\alpha_2|^2, \ldots, |\alpha_n|^2$ are already non-negative. Non-negative numbers multiplied by non-negative eigenvalues and summed together can only lead to something non-negative; the worst case is that all the terms are $0$. So this is at least positive semi-definite. That means: if all your eigenvalues happen to be non-negative (and for a symmetric matrix they will obviously be real, so there is no point worrying about complex eigenvalues; that will never occur), that is a short test for positive semi-definiteness. Now go back to that least squares problem we approached, where we talked about the Hessian; we spoke about why the sign of the Hessian matters. Look at the sign of the Hessian: if it is positive definite, then it is a minimum. So checking the positive definiteness of that Hessian is akin to just checking the eigenvalues and ensuring they are all positive, instead of checking every possible vector in every possible direction. Not that I am saying finding eigenvalues is trivial, by the way, but it is an equivalent condition. So non-negativity of eigenvalues is not just a necessary condition, as highlighted here, but also a sufficient condition for positive semi-definiteness of a symmetric matrix. Now we get back to where we were. So now you agree that if I write out this form here, it is true, because $A^H A$ is exactly one such positive semi-definite matrix: $\|Ay\|^2$ can only be $0$ if $y$ belongs to the kernel of $A$; otherwise the norm is going to be nonzero, hence positive. Therefore $A^H A$ is positive semi-definite, and therefore its eigenvalues can look like this: exactly $r$ nonzero eigenvalues, with the rest, $n - r$, equal to $0$. So this part is clear, and I will now erase this apparent digression that we took, but one that was essential. So I hope that what we wrote so far, based on what we have just shown, is clear. What does that mean? The columns of $V$ are exactly the eigenvectors of $A^H A$, and the diagonal entries of $\Sigma$ (which is a diagonal matrix anyway) contain the eigenvalues of $A^H A$, which are at worst $0$ but never negative. So let us assume, without loss of generality, that there are $r$ nonzero eigenvalues and the remaining $n - r$ are $0$. You could have chosen $r = n$; it does not matter. So what is the next step we are going to carry out? What is the crucial observation in all of this? If you also look at $A A^H$, could you not have seen something similar?
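To make the "short test" concrete, here is a sketch for the least squares case (my own example; the rank-deficiency construction is mine): the Hessian of $f(x) = \|Ax - b\|^2$ is $2A^T A$, and its eigenvalues are non-negative, with exactly $n - r$ of them zero when $A$ has rank $r$.

```python
import numpy as np

# Short test: instead of checking v^T H v over every direction v,
# check the eigenvalues of the symmetric Hessian H = 2 A^T A.
rng = np.random.default_rng(2)
A = rng.standard_normal((6, 4))
A[:, 3] = A[:, 0] + A[:, 1]                   # force rank deficiency: rank(A) = 3

H = 2 * A.T @ A
eigvals = np.linalg.eigvalsh(H)               # real, ascending
assert np.all(eigvals > -1e-10)               # non-negative: H is PSD

r = np.linalg.matrix_rank(A)
n_zero = np.sum(np.isclose(eigvals, 0.0, atol=1e-10))
assert n_zero == A.shape[1] - r               # exactly n - r zero eigenvalues
```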
What holds for $A^H A$ also holds for $A A^H$. We could prove the result by that route too, but let us not go that way. What I am instead going to do is write this down explicitly:
$$A^H A \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} \operatorname{diag}(\sigma_1^2, \ldots, \sigma_r^2, 0, \ldots, 0).$$
Let us consider another matrix $W$, say
$$W = \begin{bmatrix} \sigma_1 & & & \\ & \ddots & & \\ & & \sigma_r & \\ & & & I_{n-r} \end{bmatrix},$$
where $\sigma_1, \ldots, \sigma_r$ are positive numbers (you can always take the positive square roots) and the lower-right block is the identity of size $n - r$. So far so good. Let us also assume, without loss of generality (this is our choice, by the way; there is nothing to prove in it), that
$$\sigma_1^2 \ge \sigma_2^2 \ge \cdots \ge \sigma_r^2 > 0.$$
We can just rearrange the eigenvalues and eigenvectors to get this; there is nothing special here. If you have two eigenvalues, $2$ and $3$, with eigenvectors $v_1$ and $v_2$, it does not matter if you put $v_2$ first as the first column and $v_1$ as the second, calling the first eigenvalue $3$ and the second eigenvalue $2$. It is just an ordering of the basis, so I can always choose it like this: some ordering, a monotone decrease. Now, the way I have cooked up this $W$, can I not say that $W$ is going to be invertible? It is a diagonal matrix after all: its inverse just has the reciprocals of these terms, and by my definition here these are nonzero, so the reciprocals exist term by term. So I am going to erase this part now; in fact, I can probably erase the entire board. We have
$$V^H A^H A V = \operatorname{diag}(\sigma_1^2, \sigma_2^2, \ldots, \sigma_r^2, 0, \ldots, 0),$$
and let us hit both sides with $W^{-1}$ on the left and $W^{-1}$ on the right. Legitimate operations; I am just doing it. Do not ask me why; you will see in a moment. But I am allowed to do this operation, agreed? Now, if I do this, what happens to the right-hand side? What is $W^{-1}$? It is just $\operatorname{diag}(1/\sigma_1, \ldots, 1/\sigma_r, I_{n-r})$, and you are hitting the diagonal matrix with it on the left and on the right. So what does that do to the diagonal block? Is it not equal to
$$W^{-1} V^H A^H A V\, W^{-1} = \begin{bmatrix} I_r & 0 \\ 0 & 0 \end{bmatrix}?$$
And notice carefully what I am going to write for the left-hand side: it is $(A V W^{-1})^H (A V W^{-1})$. Would you agree? Because $W^{-1}$ is also Hermitian: $W$ is a real diagonal matrix, a diagonal matrix is always symmetric, and the inverse of a diagonal matrix is also diagonal, therefore also symmetric. So whether you take the inverse of $W$ and then the Hermitian or not, it matters not. Hence
$$(A V W^{-1})^H (A V W^{-1}) = \begin{bmatrix} I_r & 0 \\ 0 & 0 \end{bmatrix}.$$
We will probably stop here for this module and then carry on. But this is an important observation, because what is this showing us? If you write $A V W^{-1}$ as one big matrix, say $Y$, then the left-hand side is $Y^H Y$.
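Here is a numerical sketch of this whole construction (my own sizes and seed, not from the lecture): diagonalize $A^H A$, order the eigenvalues in decreasing order, build $W$, and verify that $Y = A V W^{-1}$ satisfies $Y^H Y = \begin{bmatrix} I_r & 0 \\ 0 & 0 \end{bmatrix}$.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 3, 5
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

eigvals, V = np.linalg.eigh(A.conj().T @ A)   # ascending order
eigvals, V = eigvals[::-1], V[:, ::-1]        # reorder: descending, as in the lecture
r = int(np.sum(eigvals > 1e-12))              # number of nonzero eigenvalues
sigma = np.sqrt(eigvals[:r])                  # positive square roots

# W = diag(sigma_1, ..., sigma_r, 1, ..., 1): invertible since sigma_i > 0.
W = np.diag(np.concatenate([sigma, np.ones(n - r)]))

Y = A @ V @ np.linalg.inv(W)                  # m x n, same shape as A
target = np.zeros((n, n))
target[:r, :r] = np.eye(r)
assert np.allclose(Y.conj().T @ Y, target, atol=1e-10)  # Y^H Y = [[I_r, 0], [0, 0]]
```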
So, what we will see next is what happens if you split up this $Y$ matrix into columns. What is the size of this matrix? $A$ is $m \times n$, $V$ is $n \times n$, and $W^{-1}$ is also $n \times n$, so the whole thing is $m \times n$. So there are $n$ columns, each an $m$-tuple. That is how we are going to represent this whole object now: this is $Y$, and that then becomes $Y^H$. We will see what this leads us to conclude in the next module.
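To fix notation before that next step, here is the column partition written out (the column labels $y_1, \ldots, y_n$ are my own, not the lecture's):
$$Y = A V W^{-1} = \begin{bmatrix} y_1 & y_2 & \cdots & y_n \end{bmatrix}, \qquad y_i \in \mathbb{C}^m, \qquad (Y^H Y)_{ij} = y_i^H y_j.$$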