Welcome back to the lecture on Quantum Chemistry, or Principles of Quantum Mechanics. This is the third lecture, giving some more details on the basic mathematics, particularly matrices and the algebra of matrices. In the last lecture we looked at the eigenvalues and eigenvectors of a simple 2 by 2 matrix, and then I gave you some examples to work out as problems. We shall continue that in this lecture, since eigenvalues and eigenvectors are fundamentally important quantities in quantum mechanics. The Schrodinger equation, particularly the time-independent Schrodinger equation, which people commonly read as H psi = E psi, where H is the Hamiltonian, psi the wave function and E the energy eigenvalue, is an eigenvalue equation for the Hamiltonian matrix or Hamiltonian operator, and a major activity in quantum mechanics is to determine the eigenvalues and eigenfunctions of the Hamiltonian operator once the Hamiltonian is known. Therefore, the elementary properties of eigenvalues and eigenvectors that we examined in the last lecture, and the ones we will look at in this lecture and probably later on, are extremely important for you to keep in mind, as they will come up again and again.

So, let me start with today's lecture, lecture 3, on more properties of matrices. I am Mangala Sundar from the Chemistry Department of IIT Madras. This lecture is brought to you through the funding and support of the National Programme on Technology Enhanced Learning, organized by the IITs and the Indian Institute of Science and funded by the Ministry of Human Resource Development.

Like the last two lectures, we shall divide this into two or three smaller parts. In the first part, which is about 15 to 20 minutes, I shall talk about a couple of elementary properties and definitions: the properties of traces and determinants, and some special matrices, namely orthogonal, unitary and Hermitian matrices. We shall follow this with specific properties of Hermitian matrices, because Hermitian matrices are the most directly useful quantities in quantum mechanics; the Hamiltonian matrix, and the matrix of any operator corresponding to an observable, that is, an experimentally measurable quantity, are all Hermitian. We shall study the properties of Hermitian matrices only to the point that we need them, without worrying about the formal details. First we will look at traces and determinants, and after that I shall give you the example of the eigenvalues and eigenvectors of a 3 by 3 matrix that I promised in the last lecture; these properties are important, and I have deliberately put them in between the eigenvalue problem for a 2 by 2 matrix and that for a 3 by 3 matrix.

So, let us look at the trace. If there are two matrices A and B such that AB is not equal to BA, that is, they do not commute, we still have Tr(AB) = Tr(BA). It is easy to verify. Suppose A has elements A_ij and B has elements B_ij; then the (i,j) element of the product is (AB)_ij = sum_k A_ik B_kj, where k is the column index of A or the row index of B, as you see it. The trace of AB is nothing but the sum of the diagonal elements (AB)_ii, so instead of j we have i, and we can immediately write Tr(AB) = sum_i sum_k A_ik B_ki. But A_ik and B_ki are just numbers, real or complex, so we can rewrite this as sum_k sum_i B_ki A_ik, and the inner sum over i is easily seen to be the (k,k) matrix element of BA. What we have is sum_k (BA)_kk, which is the trace of BA. So this is an extremely important property: the trace of AB equals the trace of BA, even when the matrices do not commute.
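If you want to check this on a computer, here is a minimal sketch in Python, assuming numpy is available; the random matrices are arbitrary examples chosen only for illustration, and the last two lines preview the cyclic three-matrix version derived next.

```python
import numpy as np

# Numerical check of the trace identity for non-commuting matrices.
# The matrices are arbitrary random examples, not from the lecture itself.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))

print(np.allclose(A @ B, B @ A))                     # False: A and B do not commute
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))  # True: Tr(AB) = Tr(BA)

# Preview of the cyclic property for three matrices (derived next):
print(np.isclose(np.trace(A @ B @ C), np.trace(B @ C @ A)))  # True
print(np.isclose(np.trace(A @ B @ C), np.trace(C @ A @ B)))  # True
```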
From this it follows immediately that when we have more than two matrices we get a cyclic order. If you want Tr(ABC) for three matrices, there are different ways of associating them. Treating BC as a single matrix (the product BC is a matrix anyway), Tr(A(BC)) = Tr((BC)A) by exactly the interchange we just proved; that is one relation. Now, instead, write Tr(ABC) as Tr((AB)C), the product AB being a matrix, and that is the same as Tr(C(AB)). Therefore you see that the trace of ABC is cyclically invariant: Tr(ABC) = Tr(BCA) = Tr(CAB). This is a very simple way of looking at the trace of products of matrices: they are cyclically invariant. Of course, if A and B commute, the product AB is itself equal to BA, and we have only the trivial identity that the trace of a matrix equals the trace of itself; it is when the matrices do not commute that this property is still important.

In the same way, the determinant of a product of two matrices satisfies det(AB) = det(BA); in fact, since det(AB) = det(A) det(B) is just a product of numbers, the determinant of a product of many matrices is the same under any cyclic reordering: det(ABC...N) = det((BC...N)A), and repeating this several times, det(N A B C ...) and so on around the cycle. We made use of these two properties in the eigenvalue and eigenvector determination in the last lecture, when we said that the sum of the eigenvalues equals the trace of the original matrix and the product of the eigenvalues equals the determinant of the original matrix. There we did make use of these cyclic relations, and now we see where they come from: basically from the elementary definition of matrix multiplication.

Now, the other things we need to worry about are two more special matrices, called orthogonal and unitary matrices. A matrix A is orthogonal if its inverse is given by its transpose, A^{-1} = A^T. This means that the determinant of A is never 0, because the inverse exists; the transpose of any matrix is obtained by interchanging its rows and columns, and if that has to equal the inverse, obviously the determinant of the matrix must be nonzero. But if you look at it carefully, A A^T = A A^{-1} = 1, the identity matrix, whenever A^{-1} = A^T. Therefore det(A A^T) = det(1) = 1. What you have is the determinant of A squared, because det(A^T) = det(A), so det(A A^T) = det(A)^2 = 1, which implies that det(A) has to be either +1 or -1. In the case of +1, A is known as a special orthogonal matrix.
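As a quick numerical illustration, here is a sketch, again assuming numpy: a plane rotation is orthogonal with determinant +1, while a reflection is orthogonal with determinant -1; the angle is an arbitrary choice.

```python
import numpy as np

# Sketch: a 2x2 rotation matrix is orthogonal (inverse = transpose).
theta = 0.3  # arbitrary angle, chosen only for illustration
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(np.linalg.inv(R), R.T))   # True: R^{-1} = R^T
print(np.isclose(np.linalg.det(R), 1.0))    # True: det = +1, special orthogonal

# A reflection is also orthogonal, but with determinant -1.
F = np.array([[1.0, 0.0],
              [0.0, -1.0]])
print(np.allclose(np.linalg.inv(F), F.T))   # True
print(np.isclose(np.linalg.det(F), -1.0))   # True
```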
Now, if we examine this a little further: when we write A A^T = 1, the identity matrix, what we imply is that the (i,j) element of the product, (A A^T)_ij, has to be the Kronecker delta, delta_ij, since the right-hand side is the identity. You can write (A A^T)_ij in terms of the elements of A: it is sum_k A_ik (A^T)_kj = delta_ij, but (A^T)_kj is nothing other than A_jk, therefore sum_k A_ik A_jk = delta_ij. So if I write the matrix out as

A = | A_11  A_12  A_13 |
    | A_21  A_22  A_23 |
    | A_31  A_32  A_33 |

and I say that this is an orthogonal matrix, then from this relation you see that in A_ik A_jk the row indices i and j are held constant and the column index k is summed over, so you are summing over elements of two different rows. For example, with i = 1 (the first row) and j = 2 (the second row), A_11 A_21 + A_12 A_22 + A_13 A_23 = 0, since delta_12 = 0. If instead you take A_11 with itself, A_12 with itself and A_13 with itself, that is A_11^2 + A_12^2 + A_13^2, the sum is 1. So this is known as row orthogonality: the rows are orthogonal to each other.

Now, A^T A is also equal to the identity, and that immediately gives you column orthogonality. You can write (A^T A)_ij = delta_ij, and the left-hand side is sum_k (A^T)_ik A_kj = sum_k A_ki A_kj = delta_ij. If you look at the matrix above, A_ki A_kj summed over k means, for instance, A_11 A_12 + A_21 A_22 + A_31 A_32: you take one column and another column, multiply the two columns element by element and add them all up, and it gives you 0; whereas if you take the same column, the elements A_11, A_21, A_31 multiplied with themselves, the sum gives you 1. Therefore the columns are normalized to 1 and orthogonal to the other columns, and the rows are normalized to 1 and orthogonal to the other rows. This is the characteristic property of an orthogonal matrix.
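Here is the same statement checked element by element for a 3 by 3 rotation about the z axis; a minimal sketch, with the angle again an arbitrary assumption.

```python
import numpy as np

# Row and column orthonormality, checked explicitly for a 3x3 rotation
# about the z axis (an illustrative orthogonal matrix).
theta = 0.7
c, s = np.cos(theta), np.sin(theta)
A = np.array([[c,  -s,  0.0],
              [s,   c,  0.0],
              [0.0, 0.0, 1.0]])

# Row orthonormality: sum_k A_ik A_jk = delta_ij
for i in range(3):
    for j in range(3):
        print(i, j, round(np.dot(A[i, :], A[j, :]), 10))  # 1 if i == j else 0

# Column orthonormality, all at once: A^T A = 1
print(np.allclose(A.T @ A, np.eye(3)))   # True
```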
Now, a unitary matrix differs only to a very small extent: the inverse of a unitary matrix, which we now call U rather than A, is the Hermitian adjoint, U^{-1} = U† = (U^T)*, the transpose followed by the complex conjugate. So if you write the matrix element (U^{-1})_ij, it is (U†)_ij = ((U^T)_ij)* = U_ji*. This is the defining property of a unitary matrix: the inverse equals the Hermitian adjoint. In this case, again, U U† = U† U = 1, the identity matrix, and in no time you can see that this means sum_k U_ik (U†)_kj = delta_ij, with (U†)_kj = U_jk*. So what you get is sum_k U_ik U_jk* = delta_ij, and you see this is the equivalent of row orthogonality, because you are keeping the two rows i and j constant. The difference between a unitary and an orthogonal matrix is that the row orthogonality involves the complex conjugate of one of the rows. It does not matter which row you complex conjugate: when a number equals another number, its complex conjugate equals the complex conjugate of that number, and delta_ij is 1 or 0, so its complex conjugate is also 1 or 0. So you multiply one row with the complex conjugate of another row; for a unitary matrix, when you do it element-wise and sum, the answer is 0. Likewise, for column orthogonality you can very easily show that sum_k U_ki U_kj* = delta_ij.
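Again a minimal numerical sketch; this particular 2 by 2 complex matrix is just an assumed example of a unitary matrix, chosen for illustration.

```python
import numpy as np

# Sketch: checking the unitary property U^{-1} = U† for an assumed example.
U = np.array([[1.0, 1.0j],
              [1.0j, 1.0]]) / np.sqrt(2)

Udag = U.conj().T                            # Hermitian adjoint U† = (U^T)*
print(np.allclose(U @ Udag, np.eye(2)))      # True: U U† = 1
print(np.allclose(Udag @ U, np.eye(2)))      # True: U† U = 1
print(np.allclose(np.linalg.inv(U), Udag))   # True: U^{-1} = U†

# Row orthogonality with a complex conjugate: sum_k U_0k U_1k* = 0
print(np.sum(U[0, :] * U[1, :].conj()))      # ~0
```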
So, let us have a break here, and right after this we will come back to some properties of Hermitian matrices; then, in the last part of the lecture, we shall solve the eigenvalue problem for a 3 by 3 matrix which has degenerate eigenvalues.

Let us start with the properties of Hermitian matrices. What is a Hermitian matrix? A matrix A is Hermitian if it is equal to its Hermitian adjoint: A = A† = (A^T)*. In terms of elements, A_ij = A_ji*; this is the definition of a Hermitian matrix. Let us use this property to prove a few simple results. Result 1: Hermitian matrices have real eigenvalues, or let me write it this way, the eigenvalues of a Hermitian matrix are always real; we shall see that in just a few minutes. Please remember the bracket notation, in which we write vectors as columns and also as rows; keep that in mind, for we will need it during this part of the lecture.

So, let the Hermitian matrix H have an eigenvector x with eigenvalue lambda: H x = lambda x. Remember what this stands for: a column vector x with elements x_1, x_2, ..., x_n, acted on by the Hermitian matrix with elements H_ij, where i and j run from 1 to n, equal to lambda times the column with elements x_1, x_2, ..., x_n. Written out for one element, this statement reads sum_j H_ij x_j = lambda x_i: the i-th row of H multiplying the column x gives the i-th element on the right. Let us multiply both sides by x_i* and then sum over all i from 1 to n. When we do that we get sum_i sum_j x_i* H_ij x_j = lambda sum_i x_i* x_i, lambda being a constant that comes outside the sum. Since this is an equation between two numbers, we may take the complex conjugate of both sides and they remain equal: sum_i sum_j x_i H_ij* x_j* = lambda* sum_i x_i x_i*. Of course this left-hand side equals the previous one, but we do not yet know that lambda* equals lambda; that is the very point we want to show. Now, the indices i and j are dummy indices, so if we interchange i and j throughout, the sum remains the same: sum_i sum_j x_j H_ji* x_i* = lambda* sum_i x_i x_i*. Next use the Hermitian property, namely H_ji* = H_ij, so that what we have is sum_i sum_j x_i* H_ij x_j = lambda* sum_i x_i x_i*. But this left-hand side is exactly the same as the first line we started with, whose right-hand side was lambda sum_i x_i* x_i; and the sums sum_i x_i* x_i and sum_i x_i x_i* are one and the same. Therefore (lambda - lambda*) sum_i x_i x_i* = 0, and this sum is nonzero, so lambda = lambda*, that is, lambda is real. A very simple way of verifying that Hermitian matrices have real eigenvalues.

Now, the second property concerns the eigenvectors of Hermitian matrices; there will be different eigenvectors for different eigenvalues. One important property is that the eigenvectors of a Hermitian matrix corresponding to different eigenvalues are orthogonal to each other; that is what we will show next. If the eigenvalues are the same, the eigenvectors are not necessarily orthogonal, but they can be made orthogonal, which is the third property we will show, by defining what is known as an orthogonalization process, the Gram-Schmidt process; then we will see how eigenvectors of a Hermitian matrix with the same eigenvalue can be orthogonalized. These are properties we need again and again and apply all the time, and therefore it is important to establish them right at the beginning.
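A quick numerical confirmation of result 1, using an assumed 2 by 2 Hermitian matrix:

```python
import numpy as np

# Sketch: the eigenvalues of a Hermitian matrix are real.
H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])             # an assumed example; H = H†
print(np.allclose(H, H.conj().T))         # True: Hermitian

evals = np.linalg.eigvals(H)              # generic solver, complex output
print(np.allclose(evals.imag, 0.0))       # True: imaginary parts vanish

# numpy's eigvalsh exploits Hermiticity and returns real eigenvalues directly.
print(np.linalg.eigvalsh(H))              # [1. 4.]
```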
So: the eigenvectors of a Hermitian matrix corresponding to different eigenvalues are orthogonal. For this we recall the bracket notation. We have the Hermitian matrix H with one eigenvector x and eigenvalue lambda_1, H x = lambda_1 x, and another eigenvector y of the same Hermitian matrix with eigenvalue lambda_2, H y = lambda_2 y, and we assume lambda_1 is not equal to lambda_2. Given that, we want to show that x and y are orthogonal, or, in the bracket relations I discussed in the previous lecture, that the inner product <x|y> = 0 = <y|x>. Here <x| is the row of complex conjugates and |y> is the column, so what we want to show is sum_i x_i* y_i = 0, or equivalently sum_i x_i y_i* = 0. Again we will use the Hermitian property. First write H x = lambda_1 x componentwise as sum_j H_ij x_j = lambda_1 x_i, and H y = lambda_2 y as sum_k H_ik y_k = lambda_2 y_i.

Let us multiply the first equation by y_i* and sum over i: sum_i sum_j y_i* H_ij x_j = lambda_1 sum_i y_i* x_i. Now, if we interchange the dummy indices i and j we still get the same equation, because both are summed over: sum_i sum_j y_j* H_ji x_i = lambda_1 sum_i y_i* x_i. Since the matrix is Hermitian, H_ji = H_ij*, so the left-hand side becomes sum_i (sum_j H_ij* y_j*) x_i = lambda_1 sum_i y_i* x_i. But the quantity in parentheses comes from the eigenvalue equation for y: taking the complex conjugate of sum_j H_ij y_j = lambda_2 y_i gives sum_j H_ij* y_j* = lambda_2* y_i* = lambda_2 y_i*, since lambda_2 is real. Therefore what you have is lambda_2 sum_i y_i* x_i = lambda_1 sum_i y_i* x_i, that is, (lambda_1 - lambda_2) sum_i y_i* x_i = 0, and since lambda_1 is not equal to lambda_2, sum_i y_i* x_i = 0. The vectors y and x are orthogonal if the eigenvalues lambda_1 and lambda_2 associated with them are different. So it comes out naturally from the property of the Hermitian matrix.

Now, another important question: what if the eigenvalues are the same? If lambda_1 = lambda_2, then lambda_1 - lambda_2 is automatically 0, and the argument tells us nothing. So if a matrix has what are known as degenerate eigenvalues, that is, two eigenvalues are the same, can we still obtain orthogonal eigenvectors? We do not have to worry: if two eigenvectors are there in the first place and they are not orthogonal, it is always possible for us to orthogonalize them. Therefore we can assume that the eigenvectors of a Hermitian matrix are always orthogonal, whether or not the eigenvalues are the same, because it is possible to orthogonalize them anyway.
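Numerically, using the same assumed Hermitian matrix as above, the two eigenvectors (eigenvalues 1 and 4, which are distinct) indeed come out orthogonal; a minimal sketch:

```python
import numpy as np

# Sketch: eigenvectors belonging to distinct eigenvalues are orthogonal.
H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])             # same assumed Hermitian example

evals, evecs = np.linalg.eigh(H)          # columns of evecs are eigenvectors
x, y = evecs[:, 0], evecs[:, 1]           # eigenvalues 1 and 4, distinct

# Inner product <x|y> = sum_i x_i* y_i; np.vdot conjugates its first argument.
print(np.vdot(x, y))                      # ~0: orthogonal
```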
So, let me show you, for two vectors x and y whose inner product is not 0, that is, which are not orthogonal, how we get orthogonal vectors out of them. This procedure is called the Gram-Schmidt orthogonalization procedure, and it can be extended to any number of arbitrary non-orthogonal vectors. It is a very simple way of visualizing what is meant by orthogonality: if x and y are not orthogonal to each other, it means that the component of y in the direction of x is nonzero, or vice versa, the component of x in the direction of y is nonzero. Therefore, if we subtract this component from that vector, what remains obviously has no component in common with the original vector and is therefore orthogonal to it. That is all you do: you subtract out the non-orthogonal part of one eigenvector from another eigenvector through a projection, and this removal makes the resulting two eigenvectors orthogonal because they have nothing in common. It is an extremely simple, physical way to visualize it, so let me write the answers out quickly. To follow the classwork notation, I denote the two non-orthogonal vectors, with the same eigenvalue, by H x_1 = lambda x_1 and H x_2 = lambda x_2; the eigenvalue is doubly degenerate in this case, that is, the matrix has two equal eigenvalues, these are the corresponding eigenvectors, and <x_1|x_2> is not necessarily 0.

How do we make two orthogonal eigenvectors out of these? First define the two vectors x and y as follows: x = x_1, and y is x_2 with the component of x_2 in the direction of x_1 subtracted out,

y = x_2 - (<x_1|x_2> / <x_1|x_1>) x_1.

The scalar <x_1|x_2>, divided by the normalization quantity <x_1|x_1> and multiplying the vector x_1, is precisely the component of x_2 along x_1. In the bracket notation that we used, this can also be written as y = x_2 - (|x_1><x_1| / <x_1|x_1>) x_2, and you can see that |x_1><x_1| is a projection operator: when it acts on x_2 it gives the number <x_1|x_2> multiplying the vector x_1. If the vectors are orthogonal to begin with, this product is 0, and the two orthogonal vectors are x_1 and x_2 themselves, which equal x and y; but if the product is nonzero, you remove exactly that magnitude from x_2, and what you get as y is orthogonal to the vector x. Indeed, taking the inner product with x = x_1, <x|y> = <x_1|x_2> - (<x_1|x_1> <x_1|x_2>) / <x_1|x_1> = <x_1|x_2> - <x_1|x_2> = 0: x and y are orthogonal.

Do x and y have the same eigenvalue? That is easy: H x = H x_1 = lambda x_1, since that is how we defined x. What about H y? Also easy: H y = H x_2 - (<x_1|x_2> / <x_1|x_1>) H x_1 = lambda x_2 - lambda (<x_1|x_2> / <x_1|x_1>) x_1, and lambda times whatever is in the bracket is the vector y; therefore H y = lambda y. So y is an eigenvector with the same eigenvalue, except that now y and x are orthogonal. This process is called the Gram-Schmidt orthogonalization.
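Here is a minimal sketch of this procedure in Python; the routine and the example vectors are assumptions for illustration (no rank checks, no re-orthogonalization), not a production implementation.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a list of linearly independent (possibly complex)
    vectors: from each vector, subtract its projection onto every
    previously accepted vector. A minimal sketch only."""
    ortho = []
    for v in vectors:
        w = v.astype(complex)
        for u in ortho:
            # subtract the component along u: (<u|w>/<u|u>) u
            w -= (np.vdot(u, w) / np.vdot(u, u)) * u
        ortho.append(w)
    return ortho

# Two non-orthogonal vectors (imagined to share a degenerate eigenvalue).
x1 = np.array([1.0, 1.0, 0.0])
x2 = np.array([1.0, 0.0, 1.0])
x, y = gram_schmidt([x1, x2])
print(np.vdot(x, y))                   # ~0: now orthogonal
```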
Now, what about three vectors which are not orthogonal, but all of which have the same eigenvalue? If we have a larger Hamiltonian matrix, 4 by 4, 10 by 10, 20 by 20, and some eigenvalues are the same, how do we find the orthogonal eigenvectors for such Hermitian matrices? The same process. From each vector you remove the components along all the vectors already orthogonalized; subtract them out, and you get orthogonal eigenvectors, but you will have to order them. For example, you choose one vector, x_1, as the starting point and call it x; you choose the second vector y as x_2 minus the component of x_2 in the direction of x_1; and you choose z as x_3 with its components along the already-orthogonalized x and y subtracted out. If you do that, then obviously z is orthogonal to both y and x, and you can create a chain of orthogonal eigenvectors, in sequence, from a set of non-orthogonal eigenvectors. I will give that as a problem in one of the assignments, but it is important to remember that these things are fundamentally important.

Now, let me pause for a break, and in the next ten minutes I shall solve for the eigenvectors of a 3 by 3 matrix as an example; then we will stop, and continue with the properties of matrices in the next lecture.

So, the third section: eigenvalues of a 3 by 3 matrix. We shall look at the eigenvalues of a Hermitian matrix, again with degenerate eigenvalues. I shall use a well-known textbook example, namely the matrix

A = | 5  0  2 |
    | 0  1  0 |
    | 2  0  2 |

I could use any other matrix, but I found this one in the book of Arfken, and I will just solve it. This is also referenced in my lecture notes; I follow the book by George B. Arfken and his colleague Weber, Mathematical Methods for Physicists, one of the best books on mathematics at the graduate-student level that I have come across. I use it a lot, and I am thankful to the authors for having written it through several editions; I believe the book is now in at least its sixth edition, which is the latest one I have. Occasionally I shall pick up examples from there, even though I will try to solve mostly problems of my own; this is an example picked up from that book.

Now, very simply, the eigenvalues of this matrix are obtained from the determinant condition

det | 5-lambda    0        2      |
    | 0        1-lambda    0      |  =  0.
    | 2           0     2-lambda  |

If you expand this, along either the row or the column, it does not matter, you immediately get (5-lambda)(1-lambda)(2-lambda) + 2 [-2 (1-lambda)] = 0, that is, (1-lambda)[(5-lambda)(2-lambda) - 4] = 0. The bracket is lambda^2 - 7 lambda + 10 - 4 = lambda^2 - 7 lambda + 6, so the equation is (1-lambda)(lambda^2 - 7 lambda + 6) = 0, and if you factorize that, (1-lambda)(lambda - 6)(lambda - 1) = 0. So you have lambda = 1 twice, two equal eigenvalues, and lambda = 6 as the other eigenvalue. Here is the degeneracy.
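A quick check with numpy; the polynomial coefficients printed below come straight from expanding the characteristic determinant above.

```python
import numpy as np

# The Arfken-Weber textbook matrix used in this example.
A = np.array([[5.0, 0.0, 2.0],
              [0.0, 1.0, 0.0],
              [2.0, 0.0, 2.0]])

# np.poly(A) returns the characteristic-polynomial coefficients of A,
# here lambda^3 - 8 lambda^2 + 13 lambda - 6 = (lambda - 6)(lambda - 1)^2.
print(np.poly(A))                     # [ 1. -8. 13. -6.]
print(np.roots(np.poly(A)))           # ~[6. 1. 1.] (order may vary)

print(np.linalg.eigvalsh(A))          # [1. 1. 6.]
```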
Now we shall use the fact that the eigenvectors of a Hermitian matrix can always be orthogonalized, and therefore we shall assume that the eigenvectors will be orthogonal; we will use that property. For lambda = 6 the eigenvector is easy to obtain: write A acting on the column (x_1^(1), x_2^(1), x_3^(1)) equal to 6 times that column. The equations you get are -x_1^(1) + 2 x_3^(1) = 0 from the first row, -5 x_2^(1) = 0 from the second, and the third row obviously gives the same thing, -2 x_1^(1) + 4 x_3^(1) = 0; these two are one and the same. So you have x_1^(1) = 2 x_3^(1) and x_2^(1) = 0. The eigenvector, not normalized, is (2a, 0, a), and if you normalize it the a cancels out and you get (2/sqrt(5), 0, 1/sqrt(5)). This is one eigenvector.

Now, the eigenvectors for eigenvalue 1. When you write A (x_1^(2), x_2^(2), x_3^(2)) = 1 (x_1^(2), x_2^(2), x_3^(2)), you will see the redundancy right away; and since 1 occurs twice, there are two such columns to find. Let us call this eigenvalue lambda_2 = 1, which is why the superscript 2. You can see immediately that the first row gives 4 x_1^(2) + 2 x_3^(2) = 0 (the third row, 2 x_1^(2) + x_3^(2) = 0, is the same relation), and in the second row x_2^(2) cancels with x_2^(2): there is no equation for it. So x_2^(2) is arbitrary, or undetermined, in addition to one of the other two components being free; that is what happens when you have a doubly degenerate eigenvalue. If you have a triply degenerate eigenvalue, you will see that two of the eigenvector coefficients are not constrained by any equation, in addition to the third. So these things are not without structure; there is a fundamental structure that the eigenvalues reveal when you look for the symmetry: degeneracy refers to some hidden symmetry in the system.

Here the situation is that we know nothing about x_2^(2), and we know that x_1^(2) has to be minus half of x_3^(2). So we have one equation, and we can choose x_2^(2) to be 0, or to be a constant; both choices are possible, with x_1^(2) = -x_3^(2)/2 in either case. If we choose x_2^(2) = 0, the normalized eigenvector is (1/sqrt(5), 0, -2/sqrt(5)), or with the minus and plus interchanged. What about choosing x_2^(2) = a? If we choose x_2^(2) equal to some constant a and choose x_1^(2) = 0, then what we have is (0, a, 0). This choice is meaningful in the sense that these two eigenvectors are orthogonal to each other: multiply the two columns element by element and sum, and you get 0. So the choice x_2^(2) = a, x_1^(2) = 0 immediately gives an eigenvector orthogonal to the other eigenvector, both of which have the same eigenvalue. And a, of course, becomes 1 when you normalize: (0, 1, 0).

So you have three eigenvectors: lambda = 6 corresponds to (2/sqrt(5), 0, 1/sqrt(5)), and lambda = 1 corresponds to the two eigenvectors (1/sqrt(5), 0, -2/sqrt(5)) and (0, 1, 0). Now, if you do not choose x_2^(2) = 0 but some arbitrary value, while keeping x_1^(2) = -x_3^(2)/2, you have to ensure explicitly that the two degenerate eigenvectors are orthogonal to each other. I made the easier choice of setting the extra components to 0, but you can play around; the point is that when eigenvalues are degenerate, you have to be careful in the way you derive the eigenvectors.
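Here is a numpy check of the full eigenvector set; note that for the degenerate pair, eigh may return any orthonormal basis of the lambda = 1 plane, not necessarily the two vectors chosen by hand above.

```python
import numpy as np

A = np.array([[5.0, 0.0, 2.0],
              [0.0, 1.0, 0.0],
              [2.0, 0.0, 2.0]])

evals, evecs = np.linalg.eigh(A)      # eigh returns an orthonormal eigenbasis
print(evals)                          # [1. 1. 6.]
print(evecs)                          # columns are the eigenvectors

# The lambda = 6 column is (2, 0, 1)/sqrt(5) up to sign; the two lambda = 1
# columns span the same plane as (1, 0, -2)/sqrt(5) and (0, 1, 0).
print(np.allclose(evecs.T @ evecs, np.eye(3)))   # True: mutually orthonormal
print(np.allclose(A @ evecs, evecs * evals))     # True: A v = lambda v
```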
It is always better to find the eigenvectors one at a time and make sure that the second eigenvector is orthogonal to the first one, and also to the other eigenvectors, and so on. It is a process one has to carry out carefully; of course, you will learn this as you go along and as you solve more and more problems. So let me stop at this point on the properties and characteristics of eigenvalues and eigenvectors. In the first part of the next lecture I will talk a little about functions of matrices, or matrix operators as we call them, functions of operators, and about commutativity and related properties of matrices; in the second half of the lecture we will move on to the next topic, namely the solution of differential equations. It is not possible for me to discuss matrices in this course beyond the point that we need them; as we need more and more matrix techniques, we will develop them then and there. But it is important to take a look at differential equations, first order and second order, again in a rather quick way, in order to start the process of solving the Schrodinger equation, which we will first treat as a differential equation and only later as an eigenvalue equation for matrices. Until then, thank you very much.