There are very few results in this course that apply to any matrix; the Schur decomposition theorem is one of them, and this is another: you can find the singular value decomposition of any matrix. The way we prove it is in two parts. First we show, by induction, that there exist unitary U and V such that U^H A V = S, where S is a matrix whose top-left r x r block is diagonal with real, non-negative entries and which has zeros everywhere else. It is actually a rectangular matrix; only the top-left r x r block is non-negative and real. Then we show that this S is equal to the Sigma we defined earlier.

So we first prove that such U and V exist, by induction on min(m, n). In any induction-based approach, the first step is to show that the claim holds for the index equal to 1, so suppose min(m, n) = 1. Then A is either a row vector or a column vector.

If A is m x 1, it is a column vector; call it a. Let U in C^{m x m} be unitary with first column a / ||a||_2. Throughout this proof we assume, without loss of generality, that A is not the zero matrix: if A = 0, the result holds trivially, because you can choose Sigma = 0 and any unitary U and V. So a is nonzero here, and it is fine to divide by its L2 norm. It is always possible to find such a unitary U: once you pick the first column, pick columns orthonormal to it and form a unitary matrix of size m x m. Let V be the 1 x 1 matrix with entry equal to 1. Now compute U^H A V: U^H A has only one column, that column is in the same direction as the first row of U^H, and all other rows of U^H are orthogonal to a. So U^H A V equals (||a||_2, 0, ..., 0)^T, which is exactly the form we wanted, U^H A V = Sigma; here Sigma is of size m x n with n = 1, so it is m x 1. You can write this out for the column vector and check that it is true.

Similarly, if A is 1 x n, then A is a row vector and we can write it as A = a^H for some column vector a. Take U to be the 1 x 1 matrix with entry 1 and V to be an n x n unitary matrix with a / ||a||_2 as its first column. Then in U^H A V, when the first column of V multiplies A, we get a^H a / ||a||_2 = ||a||_2^2 / ||a||_2 = ||a||_2, and all other columns of A V are zero because the remaining columns of V are orthogonal to a.
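For concreteness, here is the column-vector base case written out as a worked equation, in the same notation as above (the 1 x n case is the mirror image, with the roles of U and V swapped):

```latex
% Base case min(m,n) = 1 with A = a \in \mathbb{C}^{m \times 1}, a \neq 0.
% U is unitary with first column a/\lVert a\rVert_2, and V = [1].
U^{H} A V
  = \begin{bmatrix} a^{H}/\lVert a\rVert_2 \\ u_2^{H} \\ \vdots \\ u_m^{H} \end{bmatrix} a
  = \begin{bmatrix} \lVert a\rVert_2 \\ 0 \\ \vdots \\ 0 \end{bmatrix},
\qquad \text{since } u_i^{H} a = 0 \text{ for } i = 2, \dots, m.
```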
So in that case U^H A V equals the row vector (||a||_2, 0, ..., 0) of size 1 x n, and again it is in the form U^H A V = Sigma. This shows the theorem is true when min(m, n) = 1.

Now we go to the induction step. Assume the theorem holds for all m x n matrices with min(m, n) strictly less than k, that is, up to k - 1; we need to show it also holds when min(m, n) = k. Let A be of size m x n and define sigma to be the spectral norm of A; to write out what this is so that it is clear, sigma = ||A||_2 = max{ ||Ax||_2 : ||x||_2 = 1 }. This is an optimization problem that has a solution, and whatever the solution gives as the objective value is by definition sigma, which means there exists some x for which ||Ax||_2 = sigma. So suppose x in C^n is such that ||x||_2 = 1 and ||Ax||_2 = sigma, and define y = sigma^{-1} A x. Again, because A is not the zero matrix, this maximum is strictly positive, so sigma is strictly positive and it is fine to write sigma^{-1}, i.e. (1/sigma) A x. In other words, A x = sigma y, and ||y||_2 = (1/sigma) ||Ax||_2 = 1.

Now let U in C^{m x m} be unitary with y as its first column (y has unit norm, so we can choose it as the first column), and let V in C^{n x n} be unitary with x as its first column (again, ||x||_2 = 1). Now comes the magic. If we compute U^H A V, it can be written in block form with sigma in the (1,1) entry, w^H filling out the rest of the first row, zeros filling out the rest of the first column, and some matrix C in the remaining block, where w is in C^{(n-1) x 1} and C is an (m-1) x (n-1) matrix. This happens because of the way we have chosen U and V; if you want, you should multiply it out and convince yourself that it is in fact true (the first column is U^H A x = sigma U^H y = (sigma, 0, ..., 0)^T). Now multiply this matrix by the Hermitian of its first row, that is, by the vector (sigma, w)^T: the first entry of the product is sigma^2 + w^H w, and C w is what sits below it. Taking the norm of this product gives something that is at least the first entry. Written out in full, the L2 norm of U^H A V (sigma, w)^T is the square root of (sigma^2 + w^H w)^2 plus the sum over i from 1 to m - 1 of |(C w)_i|^2.
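Here is that block computation written out as a worked equation, in the same notation (this just restates the step above compactly):

```latex
% Induction step: first column of U is y, first column of V is x, and A x = \sigma y.
U^{H} A V = \begin{bmatrix} \sigma & w^{H} \\ 0 & C \end{bmatrix},
\qquad w \in \mathbb{C}^{n-1}, \ C \in \mathbb{C}^{(m-1)\times(n-1)}.
% Multiplying by the Hermitian of the first row:
\begin{bmatrix} \sigma & w^{H} \\ 0 & C \end{bmatrix}
\begin{bmatrix} \sigma \\ w \end{bmatrix}
= \begin{bmatrix} \sigma^{2} + w^{H} w \\ C w \end{bmatrix},
\qquad
\left\lVert \begin{bmatrix} \sigma^{2} + w^{H} w \\ C w \end{bmatrix} \right\rVert_2
= \Bigl( (\sigma^{2} + w^{H} w)^{2} + \sum_{i=1}^{m-1} \lvert (C w)_i \rvert^{2} \Bigr)^{1/2}
\;\ge\; \sigma^{2} + w^{H} w .
```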
We need the modulus here because these entries are complex, and then the whole thing is raised to the power one half. If we drop all the |(C w)_i|^2 terms and then take the power one half, what is left is sigma^2 + w^H w, so that becomes a lower bound on the norm of this vector.

Now we use submultiplicativity. A student asked: since left and right multiplication by unitary matrices preserve the spectral norm, won't the spectral norm of U^H A V be equal to the spectral norm of A, not just bounded by it? Good question; in fact we are going there, just bear with me for a minute. For now I am only using the submultiplicativity property to claim that the spectral norm of U^H A V is at most ||U^H||_2 ||A||_2 ||V||_2, and since U and V are unitary their spectral norms equal 1, so this is just ||A||_2. Applying submultiplicativity once more, the L2 norm of U^H A V (sigma, w)^T is at most ||U^H A V||_2 times the L2 norm of the vector (sigma, w)^T. I could probably have done this faster by saying the two spectral norms are equal, but I am just following the textbook here.

So what we have is this: sigma^2 + w^H w is a lower bound on the norm of U^H A V (sigma, w)^T, which is at most ||U^H A V||_2 ||(sigma, w)^T||_2, which in turn is at most ||A||_2 times ||(sigma, w)^T||_2 = ||A||_2 (sigma^2 + w^H w)^{1/2}. If we square both sides and cancel one factor of (sigma^2 + w^H w), we get sigma^2 + w^H w <= ||A||_2^2. But sigma is by definition equal to ||A||_2. Substituting that in, we get w^H w <= 0, which in turn implies w = 0, because w^H w is a non-negative quantity and the only way it can be at most 0 is if w = 0. So in the block matrix we wrote, U^H A V is now reduced to a form with sigma in the (1,1) entry, zeros in the rest of the first row and first column, and C, an (m-1) x (n-1) matrix, in the remaining block. Now apply the inductive assumption to the (m-1) x (n-1) matrix C; this is similar to what we did in the proof of Schur's theorem, and it completes the induction part of the argument.

There is one last part of the proof, where we need to show that S is equal to Sigma. So suppose there exist U in C^{m x m} and V in C^{n x n}, whose existence we have shown by the inductive argument, such that U^H A V = S, where S has zeros everywhere except for the top-left r x r block, which is diagonal with entries gamma_1 through gamma_r. We need to show that gamma_1 through gamma_r are equal to sigma_1 through sigma_r.
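For reference, here is the chain of bounds from the induction step collected in one place, in the same notation as above (just a compact restatement of the argument):

```latex
% Lower and upper bounds on the same quantity, with \sigma = \lVert A \rVert_2:
\sigma^{2} + w^{H} w
  \;\le\; \left\lVert U^{H} A V \begin{bmatrix} \sigma \\ w \end{bmatrix} \right\rVert_2
  \;\le\; \lVert U^{H} A V \rVert_2 \, \left\lVert \begin{bmatrix} \sigma \\ w \end{bmatrix} \right\rVert_2
  \;\le\; \lVert A \rVert_2 \, \bigl(\sigma^{2} + w^{H} w\bigr)^{1/2}.
% Squaring and cancelling one factor of (\sigma^{2} + w^{H} w):
\sigma^{2} + w^{H} w \;\le\; \lVert A \rVert_2^{2} = \sigma^{2}
\;\Longrightarrow\; w^{H} w \le 0
\;\Longrightarrow\; w = 0 .
```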
A student asked how we got rid of the square root of sigma^2 + w^H w. If you square both sides of that inequality, you get (sigma^2 + w^H w)^2 <= ||A||_2^2 (sigma^2 + w^H w); there was indeed a square missing on the board, thanks for pointing that out. Cancelling one factor of (sigma^2 + w^H w) leaves sigma^2 + w^H w <= ||A||_2^2. And sigma = ||A||_2, so sigma^2 = ||A||_2^2; these cancel, and from that we get w^H w <= 0.

We can assume gamma_1 >= gamma_2 >= ... >= gamma_r without loss of generality, because if not we can always permute the columns of U and V to put them in decreasing order. Now, U^H A V = S means A V = U S; I am just pre-multiplying by U. So A times the i-th column of V equals gamma_i times the i-th column of U, that is, A v_i = gamma_i u_i for i = 1 through r. This is just writing out what the equation means, because S has an r x r diagonal sub-block and everything else is zero. Similarly, right-multiplying by V^H gives U^H A = S V^H, which implies u_i^H A = gamma_i v_i^H for i = 1 to r. Taking the Hermitian of this, A^H u_i = gamma_i v_i, since gamma_i is real. Combining the two relations, A^H A v_i = gamma_i A^H u_i = gamma_i^2 v_i for i = 1 to r. That implies gamma_1^2 up to gamma_r^2 are the nonzero eigenvalues of A^H A. By definition, sigma_i are the square roots of the eigenvalues of A^H A for i = 1 up to r, so gamma_i = sigma_i. That completes the proof.

A couple more remarks. The columns of V are a full set of orthonormal eigenvectors of the matrix A^H A, and the columns of U are a full set of orthonormal eigenvectors of A A^H. It is possible that there are multiple such sets, but these columns form one such full set; this is by construction, from how we built the proof. Another consequence of this proof is that the sigma we defined, the spectral norm of A, is the largest singular value of A: we saw that sigma is one of the singular values, and it is in fact the largest. Recall also that for a square matrix, ||A||_2 equals the square root of mu, where mu is the largest eigenvalue of A^H A. You can try to relate these on your own later.
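As a quick numerical sanity check on these remarks, here is a small NumPy sketch. The random matrix and the specific checks are my own illustrative choices, not part of the lecture; the sketch verifies that the singular values are the square roots of the eigenvalues of A^H A, that the columns of V are eigenvectors of A^H A, and that the spectral norm equals the largest singular value.

```python
import numpy as np

# A random complex matrix as a stand-in example (any A works).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))

# Full SVD: A = U @ diag(s) @ Vh, with U and Vh unitary and s non-negative, decreasing.
U, s, Vh = np.linalg.svd(A)

# Singular values are the square roots of the eigenvalues of A^H A.
eigvals = np.linalg.eigvalsh(A.conj().T @ A)                 # ascending order
print(np.allclose(np.sqrt(eigvals[::-1]), s))                # True

# Columns of V (conjugate-transposed rows of Vh) are eigenvectors of A^H A.
V = Vh.conj().T
print(np.allclose(A.conj().T @ A @ V, V @ np.diag(s**2)))    # True

# The spectral norm of A is the largest singular value.
print(np.allclose(np.linalg.norm(A, 2), s[0]))               # True
```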