So, if you have gone through the problem sheet we shared with you on inner products, there are a few questions there on this idea of so-called adjoints and self-adjoints. I would like to revisit that a bit, because we have not discussed at all in this class where they originate from. We have vector spaces that are also inner product spaces, and the moment we have an inner product we can pick up an object v1 and another object v2 in the same vector space V; the inner product ⟨v1, v2⟩_V maps this pair to an element of the field, which of course is either R or C.

Now suppose, without loss of generality, you have a linear transformation phi mapping one inner product space V to another inner product space W. Here is what we say. Let phi act on an object v inside V; then phi(v) belongs to W, and you can take its inner product ⟨phi(v), w⟩_W with some w, where this is the inner product defined in W, and you get something in the field, which could be R or C. If you can find a map phi* from W to V such that

⟨phi(v), w⟩_W = ⟨v, phi*(w)⟩_V  for all v in V and all w in W,

then phi* is the adjoint of phi. Here is the idea: you have the inner product in V, you have the inner product in W, both are inner product spaces, and you have a transformation that takes you from one to the other. You could simply go ahead and perform the transformation phi on v, so that you reach W, and then take the inner product with an arbitrary vector in W. If the number this spits out turns out to be the same as the number spat out by the inner product now defined on V — where you do nothing to v, but instead let phi* act on the chosen arbitrary w — then it is just the sequence of operations being changed, in some sense: on one side you first act using phi, go to W, and then take an inner product to land in the field; on the other side you dwell in V, using the fact that V is itself an inner product space, and end up with the same number over the field. If you can find such a map, it is called the adjoint of the original map. A map and its adjoint have opposite effects: one maps from V to W, the other from W to V.

So, that is the definition of an adjoint. Let us take an example in the spaces we understand very well, the R²'s and R³'s. Let phi be a mapping from R³ to R². What does it do? It takes (x1, x2, x3) and maps it to (2x1 + x3, x2 + 5x3). You can already see what the matrix of this transformation is: it is the 2×3 matrix with rows (2, 0, 1) and (0, 1, 5). So, at the heart of it is this matrix.
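To make this concrete before we work it out by hand, here is a minimal numerical sketch in Python with NumPy — not part of the lecture, just an illustration — assuming the standard dot products on R³ and R², and using the transpose as the candidate adjoint, which is exactly what the hand calculation below will confirm:

```python
import numpy as np

# Matrix of phi: R^3 -> R^2, taking (x1, x2, x3) to (2*x1 + x3, x2 + 5*x3)
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 5.0]])

rng = np.random.default_rng(0)
v = rng.standard_normal(3)   # arbitrary vector in R^3
w = rng.standard_normal(2)   # arbitrary vector in R^2

lhs = np.dot(A @ v, w)       # <phi(v), w>, computed in R^2
rhs = np.dot(v, A.T @ w)     # <v, phi*(w)>, computed in R^3, with phi* = A^T
print(np.isclose(lhs, rhs))  # True: the transpose realizes the adjoint
```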
So, these are both real inner product spaces, as simple as it can get. What are we saying now? Let phi act on (x1, x2, x3); we get the vector (2x1 + x3, x2 + 5x3), and we take this fellow's inner product in R² with some vector — let us call it (p1, p2). What is this? It is

2 x1 p1 + x3 p1 + x2 p2 + 5 x3 p2.

Let us leave it there and not collect terms at this stage. Now, if there does exist some phi*, it needs to act on a fellow in R²: for any pair (p1, p2) that I give you, I must be able to describe phi* in terms of its action on (p1, p2), and it must generate a 3-tuple. So if I let phi* act on (p1, p2), it must give me some 3-tuple

(alpha1 p1 + alpha2 p2, beta1 p1 + beta2 p2, gamma1 p1 + gamma2 p2)

— you agree this is all that can happen, after all — where the alpha1, alpha2, beta1, beta2, gamma1, gamma2 will be the matrix representation of the transformation phi*. I am required to find these fellows such that the inner product of this 3-tuple with (x1, x2, x3) equals the expression above. So, with this 3-tuple as the second entry, the inner product now defined in R³ — I could have done this in a much shorter fashion; I am going about it the long-winded way just to not skip any steps — works out to

alpha1 p1 x1 + alpha2 p2 x1 + beta1 p1 x2 + beta2 p2 x2 + gamma1 p1 x3 + gamma2 p2 x3,

and I want this inner product in R³ to equal the earlier inner product, which was defined in R². If I manage to do that and solve for alpha1, alpha2, beta1, beta2, gamma1, gamma2, I will have got the adjoint of phi; that is the question before us. So, what is my strategy going to be? Compare coefficients, exactly. What is alpha1? The coefficient of p1 x1, which is 2. What is alpha2? The coefficient of p2 x1 — p2 x1 does not appear there, so it is 0. What is beta1? The coefficient of p1 x2 is 0 again. What is beta2? 1. What is gamma1? 1. And what is gamma2? 5. So notice that phi* — in fact, because I know the answer, I can just look at it — is going to turn out to be just the transpose: the 3×2 matrix with rows (2, 0), (0, 1) and (1, 5). In the case of real inner product spaces, the conjugate transpose is the same as the plain transpose, because complex conjugation has no effect there. If you are dealing with complex inner product spaces, you would be required to take the complex conjugate of each term here as well.

So, now comes the notion of what is called a self-adjoint operator. If it has to be self-adjoint, can the two vector spaces be different? No: it has to be a mapping from a space to itself, because otherwise, as you can see, the matrices of the map and its adjoint will be of different sizes.
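For the complex case just mentioned, the same coefficient-matching forces a conjugation, so the adjoint matrix becomes the conjugate transpose. A minimal sketch of that check, assuming the standard complex inner product ⟨x, y⟩ = Σ x_i conj(y_i) on C^n:

```python
import numpy as np

rng = np.random.default_rng(1)
# An arbitrary complex matrix, representing a map from C^3 to C^2
A = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))

v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w = rng.standard_normal(2) + 1j * rng.standard_normal(2)

def inner(x, y):
    return np.vdot(y, x)         # <x, y> = sum_i x_i * conj(y_i)

lhs = inner(A @ v, w)            # <A v, w>, computed in C^2
rhs = inner(v, A.conj().T @ w)   # <v, A^H w>, computed in C^3
print(np.isclose(lhs, rhs))      # True: the adjoint is the conjugate transpose
```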
So, self-adjoint is this: let me carve out a little space here and say that if V = W and phi = phi*, then phi is self-adjoint. And there are two beautiful things about self-adjoint operators. One: if you have a self-adjoint operator, its eigenvalues are always going to be real. Two: such a matrix will always have exactly n linearly independent eigenvectors, irrespective of whether the eigenvalues are repeated or not. Not just that — moreover, the eigenvectors will be orthogonal; they will give you an orthogonal basis. So these self-adjoint operators are very, very special kinds of operators. In simple terms, if you look at matrices, over the real field the symmetric matrices are the representatives of self-adjoint operators; over the complex field it is the Hermitian matrices, named after the famous mathematician Charles Hermite — who, by the way, is the person responsible for showing that the number e is transcendental: you cannot have any polynomial with rational coefficients whose root is e. Hermite showed this. So it is "Hermitian" when you are dealing with complex inner product spaces.

In the remainder of the time, that is what we shall endeavor to demonstrate to you. Up until this point we have just described adjoints and self-adjoints; now we will say that if you have a self-adjoint operator over a finite-dimensional vector space, its matrix representation will be symmetric in the case of a real inner product space, or Hermitian in the case of a complex one. Such Hermitian operators will always have real eigenvalues, and their eigenvectors will be not just distinct but orthogonal — they will form an orthogonal basis. That is the big result. Of course, this is a very special case; you might argue that it does not help us in answering the general question we have posed — when can you diagonalize an arbitrary operator — because this is a very special class of operators after all. We will take the Hermitian case because, as you see, the complex field subsumes the real field.

Alright, so we will cut the frills and get to Hermitian matrices. Suppose A = A^H, where of course A^H is nothing but the conjugate transpose, and A belongs to C^(n×n). First, to show that the eigenvalues of A can only be real. Suppose

(1)  A v = lambda v

for some lambda in C and some n-tuple v ≠ 0. Let us take the conjugate transpose on both sides; that gives the second expression,

(2)  v^H A^H = lambda* v^H,

where lambda gets conjugated. So far so good. Now, if we post-multiply equation (2) by v on both sides, we get v^H A^H v = lambda* v^H v. What is on the left-hand side? A^H = A, so I might as well write v^H A v = lambda* ‖v‖², since v^H v is nothing but the norm of v squared. And what about v^H A v itself? From (1), A v = lambda v, so it equals lambda ‖v‖².
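Here is a minimal numerical illustration of the big result, again assuming NumPy: build a Hermitian matrix, check that a generic eigensolver returns real eigenvalues, and check that the Hermitian eigensolver returns an orthonormal eigenbasis that diagonalizes it.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T                      # A = A^H, Hermitian by construction

# A generic eigensolver that assumes no structure still finds real eigenvalues:
eigvals, _ = np.linalg.eig(A)
print(np.allclose(eigvals.imag, 0.0))   # True

# The Hermitian eigensolver returns an orthonormal eigenbasis:
lam, V = np.linalg.eigh(A)
print(np.allclose(V.conj().T @ V, np.eye(4)))          # columns are orthonormal
print(np.allclose(V @ np.diag(lam) @ V.conj().T, A))   # and they diagonalize A
```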
So, I can just pull out that lambda and subtract: lambda ‖v‖² − lambda* ‖v‖² = 0, which means (lambda − lambda*) ‖v‖² = 0. What can I now say about ‖v‖²? Nonzero, of course, because v is an eigenvector, so it is not 0 and I can get rid of it. That means lambda = lambda*. What sort of complex numbers are equal to their own conjugates? Only real numbers. Therefore, any eigenvalue of a Hermitian matrix — and by extension of a symmetric matrix, because real symmetric matrices are just special cases of Hermitian matrices — can only be real. Clear? Great.

Now, suppose (lambda_1, v_1), ..., (lambda_n, v_n) are eigenvalue-eigenvector pairs for A, with the lambda_i not necessarily distinct. Then v_1, v_2, ..., v_n form an orthogonal basis for C^n. So we are not asking for distinct eigenvalues or anything; this is just for Hermitian matrices: these eigenvectors live in C^n, and the claim is that the eigenvectors of the n×n matrix form a basis for C^n. If n = 1, there is nothing really to prove; it is obviously true. We are going to use induction, which of course you have already seen, I suppose. So let the claim be true for n = k; that is, we are assuming it is true for every size less than or equal to k. Now we have to show that it must then be true for k + 1.

So, let A belong to C^((k+1)×(k+1)) and suppose A v = lambda v where v ≠ 0, which means this is an eigenvalue-eigenvector pair. In fact, I will do something better and go one step further: say that the norm of v equals 1. That in one shot rules out v being 0 and also says it is a unit vector. I can always do that: no matter what eigenvector you choose, just divide it by its norm; it is only scaled. Alright, so let us say this is true. Then, by the Gram-Schmidt procedure, extend v to an orthonormal basis for C^(k+1). Again, nothing fancy: you start with one vector, which is linearly independent of course because it is not 0, and you can always extend it to a basis for the (k+1)-dimensional vector space.

Now, if you permit me to erase this, I am going to use this entire space. Let this basis be given by v, v_1, v_2, ..., v_k; only v is the eigenvector, remember — the others are just there to complete the orthonormal basis. Let us form the matrix P whose columns are v, v_1, ..., v_k, and consider A P: its columns are going to be lambda v, A v_1, ..., A v_k. Let us hit this on the left with P^H, whose rows are the row vectors v^H, v_1^H, ..., v_k^H. What can you say about the first column of P^H A P? v^H v is 1, so the top entry is just lambda. What about the terms below it? This is an orthonormal basis, so the inner product of v with v_1, v_2, v_3, and all the others is 0.
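The step "extend v to an orthonormal basis" is easy to carry out in code. A minimal sketch, using a QR factorization in place of hand-rolled Gram-Schmidt (it performs the same orthonormalization); the helper name `extend_to_orthonormal_basis` is mine, not a standard routine:

```python
import numpy as np

def extend_to_orthonormal_basis(v):
    """Extend a unit vector v in C^n to an orthonormal basis of C^n.

    Put v in front of the standard basis and orthonormalize left to right
    with QR; the first column of Q is then v up to a unit-modulus phase.
    """
    n = v.shape[0]
    M = np.column_stack([v, np.eye(n, dtype=complex)])
    Q, _ = np.linalg.qr(M)   # reduced QR: Q is n x n with orthonormal columns
    return Q

v = np.array([1.0, 1j, 1.0]) / np.sqrt(3.0)
P = extend_to_orthonormal_basis(v)
print(np.allclose(P.conj().T @ P, np.eye(3)))        # columns are orthonormal
print(np.isclose(np.abs(P[:, 0].conj() @ v), 1.0))   # first column spans v
```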
So, the whole first column below lambda is 0. The remaining entries in the block I do not care about for the time being, except that when the time comes I will use them. Now, what is the size of that remaining block? k × k, agreed. This is clear, no doubts about it: P^H acting on lambda v gives lambda at the top, because v^H v is 1; v_1^H acting on v gives 0; and all the subsequent fellows up to v_k^H acting on v give 0. But now I have this important observation: what is (P^H A P)^H? The order gets reversed, so it equals P^H A^H P, which, because A^H = A, turns out to be P^H A P again. Which means what? That this matrix P^H A P is itself Hermitian. And if this matrix is Hermitian, what must the entries to the right of lambda be? 0, because every element below lambda is 0, so every element to its right had also better be 0, by the symmetry argument. So what are we eventually left with? We are left with

P^H A P = [ lambda  0     ]
          [ 0       A_hat ]

— lambda, a whole row of zeros, a whole column of zeros, and some poor little A_hat. What exactly is it? It is of size k × k. Is it Hermitian? Well, of course: the whole matrix is Hermitian, so this block is also Hermitian. But from the induction hypothesis, what do we know? That any Hermitian matrix of size k or lower can always be diagonalized by an orthonormal set of eigenvectors. Therefore there exists U such that U^H A_hat U = Lambda_hat, where Lambda_hat is diagonal: the columns of U are orthonormal eigenvectors, they are linearly independent, and they diagonalize A_hat. Then can we not write A_hat = U Lambda_hat U^H, so that

P^H A P = [ 1  0 ] [ lambda  0          ] [ 1  0   ]
          [ 0  U ] [ 0       Lambda_hat ] [ 0  U^H ],

where the middle factor is diagonal, of course? This finally leads us to conclude that if we hit it with the block matrix [1, 0; 0, U^H] on the left and [1, 0; 0, U] on the right, we get

[ 1  0   ] P^H A P [ 1  0 ] = [ lambda  0          ]
[ 0  U^H ]         [ 0  U ]   [ 0       Lambda_hat ],

which is the diagonal matrix. All that remains to show is that the combined matrix P [1, 0; 0, U] is unitary — that it multiplied by its conjugate transpose gives the identity. This I leave to you as an exercise; I have almost spelt it out, so there is hardly much left to do: show that P [1, 0; 0, U] times its conjugate transpose is the identity, and we are done. Why? Because now I have shown you exactly the orthonormal basis that diagonalizes A, and once you have diagonalized, those columns are of course the eigenvectors. So, for this very special class of self-adjoint operators over finite-dimensional vector spaces, you will always be able to diagonalize them, and this is beautiful. Unfortunately, we do not have time for the applications I had spoken about, but we will delve into those in the next lecture.
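Since the induction is constructive, it can be turned directly into code. A sketch for illustration only, assuming NumPy — in practice you would just call np.linalg.eigh — with the hypothetical helper `hermitian_diagonalize` mirroring the proof: peel off one unit eigenvector, rotate A into the block form [lambda, 0; 0, A_hat], and recurse on the k × k Hermitian block:

```python
import numpy as np

def hermitian_diagonalize(A):
    """Orthonormal eigenbasis of a Hermitian A, mirroring the induction proof."""
    n = A.shape[0]
    if n == 1:                           # base step: nothing to do
        return np.eye(1, dtype=complex)
    # Take any one eigenpair (lambda, v) and normalize v to a unit vector.
    _, vecs = np.linalg.eig(A)
    v = vecs[:, 0] / np.linalg.norm(vecs[:, 0])
    # Extend v to an orthonormal basis P = [v, v_1, ..., v_k] via QR.
    P, _ = np.linalg.qr(np.column_stack([v, np.eye(n, dtype=complex)]))
    B = P.conj().T @ A @ P               # block form [[lambda, 0], [0, A_hat]]
    A_hat = B[1:, 1:]                    # k x k and Hermitian again
    U = hermitian_diagonalize(A_hat)     # induction step
    W = np.eye(n, dtype=complex)         # assemble the block matrix [[1, 0], [0, U]]
    W[1:, 1:] = U
    return P @ W                         # columns: orthonormal eigenvectors of A

rng = np.random.default_rng(3)
R = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = R + R.conj().T                       # a Hermitian test matrix
V = hermitian_diagonalize(A)
D = V.conj().T @ A @ V
print(np.allclose(D, np.diag(np.diag(D))))     # D is diagonal
print(np.allclose(V.conj().T @ V, np.eye(4)))  # V is unitary
```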
We will show you what applications this has, starting with a test for positive definiteness or the lack thereof, then certain special curves such as ellipses, hyperbolas and parabolas in 2D, and quadric surfaces such as paraboloids and hyperboloids, etcetera. So, that is all in the next lecture. Thank you.