All right, let's once again do the whole enchilada. That is, let's diagonalize a matrix without knowing any of its eigen theory or any of the spectral results about it ahead of time. But this time we're going to do a complex matrix, and this complex matrix will be Hermitian: if we take the conjugate transpose of A, we get back A. So when we diagonalize it, we can actually do a unitary diagonalization. That's what we're going to try to do here, and it will take a little bit of effort, because there's a lot going on.

Let's begin by looking for the eigenvalues of this Hermitian matrix. By the spectral theorem for Hermitian matrices, we can anticipate that the eigenvalues are going to be real. So how do we find them? The characteristic polynomial of this matrix, remember, is the determinant of A minus lambda I. We don't know what the lambdas are yet, so we treat lambda as a variable. We're taking the determinant of the matrix with entries 2 minus lambda and 1 plus i on top, and 1 minus i and 3 minus lambda on the bottom. Since this is just a two-by-two matrix, we take the product of the diagonal entries and subtract the product of the off-diagonal entries: (2 minus lambda)(3 minus lambda) minus (1 plus i)(1 minus i).

FOIL out (2 minus lambda)(3 minus lambda): we get 6, then minus 2 lambda, minus 3 lambda, plus lambda squared. Do be careful with the complex arithmetic. If you haven't had much practice with it, you can FOIL out (1 plus i)(1 minus i) as well: you get 1, minus i, plus i, and then minus i squared, which is actually plus 1, since i squared is negative 1. The minus i and plus i cancel, so that product is just 2. The minus 2 lambda combines with the minus 3 lambda, and we end up with 6 minus 5 lambda plus lambda squared, and then we subtract that 2.

Putting this all together, our characteristic polynomial is lambda squared minus 5 lambda plus 4, and our goal, of course, is to factor it. We want factors of 4 that add up to negative 5, so it factors as (lambda minus 4)(lambda minus 1), and our eigenvalues are 4 and 1. They're real, just as I predicted, and there was no soothsaying involved; I didn't have to read any bones or anything. A Hermitian matrix necessarily has real eigenvalues, and sure enough we got the real numbers 4 and 1.

Now that we have the eigenvalues, let's proceed to compute the eigenspaces and find a basis for each one. We want to look at the null space of A minus 4I over here, and then on the other side, the null space of A minus 1I. So first take the matrix A minus 4I. As a matrix, this looks like: 2 minus 4, which gives us negative 2, then 1 plus i, then 1 minus i, and then 3 minus 4, which gives us negative 1. And notice that this two-by-two matrix has to be singular, because after all, 4 is an eigenvalue.
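Before we row reduce, by the way, here's a quick machine check of the eigenvalue computation, a minimal sketch in Python with NumPy. This isn't part of the hand work, just an optional sanity check; the matrix A is the one from our example, and eigvalsh is NumPy's eigenvalue routine for Hermitian matrices, which returns real eigenvalues.

```python
import numpy as np

# The Hermitian matrix from the example.
A = np.array([[2, 1 + 1j],
              [1 - 1j, 3]])

# A is Hermitian: it equals its own conjugate transpose.
print(np.allclose(A, A.conj().T))   # True

# eigvalsh handles Hermitian matrices and returns real eigenvalues,
# here in ascending order; they match our hand computation.
print(np.linalg.eigvalsh(A))        # [1. 4.]
```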
So we can actually do ourselves a little favor here: we know the second row is basically going to vanish, because it has to be a multiple of the first row; otherwise this would be a nonsingular matrix. Since it is singular, this thing row reduces, and I'm going to multiply the first row by negative 1 while I'm at it. So it reduces to 2 and negative 1 minus i on top, with 0, 0 underneath. You could divide by 2 if you wanted to, but here's why I set it up this way: if you think of the associated homogeneous system of equations, we have 2 x1 equals (1 plus i) x2. I want to solve for x1, so x1 equals (1 plus i) over 2, times x2. If you set x2 equal to 1, you get the vector v1 whose second entry is 1 and whose first entry is (1 plus i) over 2.

As always, if you don't care for the fractions, you can factor out the one half, and you end up with 1 plus i and 2, which is what I was hinting at earlier. You can ignore that scalar because, again, we're looking for a basis; we don't need one specific vector. So we can take v1 to be the vector with entries 1 plus i and 2. That's perfectly good.

Since we're after a unitary diagonalization, though, we do have to normalize this thing. Taking the length of v1, we get 1 plus 1 plus 4 all inside the square root; you just square the real and imaginary parts of each entry individually and add them all together. So the length is the square root of 6, which gives us the eigenvector we want, a normalized vector of length 1: u1 is 1 over the square root of 6 times the vector with entries 1 plus i and 2. This is the u1 we're going to use in our forthcoming unitary matrix.

Now let's go back and repeat this process for the eigenvalue 1. Same basic idea: we look at the matrix A minus I, mimicking what we did before. If you're not sure what to do at this stage, just look at the matrix you took the determinant of before and plug in lambda equals 1; of course, don't actually take the determinant this time. We get 2 minus 1, which is 1, then 1 plus i, then 1 minus i, and then 3 minus 1, which is 2. This is the matrix we want to row reduce. And as I said before, because we have a two-by-two matrix that we know is singular, we don't even need the second row; it has to be a multiple of the first. That helps us avoid some of the arithmetic with the complex numbers. We end up with 1 and 1 plus i on top, and 0, 0 underneath. In terms of our free and dependent variables, x1 equals negative (1 plus i) times x2. For our vector v2, we'll just take x2 to be 1 again, and the first entry comes out to negative 1 minus i. That gives us a pretty good eigenvector: no fractions to worry about. We can't avoid the imaginary numbers, though.
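Before we normalize that second vector, here's another quick NumPy sketch, again just an optional check and not part of the hand work. It confirms that A v equals lambda v for the two eigenvectors we just found, and computes their lengths.

```python
import numpy as np

A = np.array([[2, 1 + 1j],
              [1 - 1j, 3]])

# The eigenvectors we found by hand, before normalizing.
v1 = np.array([1 + 1j, 2])    # for the eigenvalue 4
v2 = np.array([-1 - 1j, 1])   # for the eigenvalue 1

# A v should equal lambda v for each eigenvalue-eigenvector pair.
print(np.allclose(A @ v1, 4 * v1))   # True
print(np.allclose(A @ v2, 1 * v2))   # True

# The norm squares the real and imaginary parts entrywise and sums them:
# ||v1|| = sqrt(6), ||v2|| = sqrt(3).
print(np.linalg.norm(v1), np.linalg.norm(v2))
```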
But be aware as we normalize this thing, and note that this one is v2 now, not v1: when you take the length of v2, you get 1 squared plus 1 squared plus 1 inside the square root, so the length of this vector is the square root of 3. Again, you just square the real and imaginary parts of each entry and add those all up. So u2 is 1 over the square root of 3 times the vector with entries negative 1 minus i and 1. You could flip the signs if you wanted to, but I think we'll be content with the vector in front of us.

Now that we have u1 and u2, the original matrix A can be factored as P D P star; we're looking for a unitary factorization here. The matrix P takes u1 and u2 as its columns, the diagonal matrix D takes the two eigenvalues we found as its diagonal entries, and since P star is the conjugate transpose of P, its rows are u1 star and u2 star.

So if we plug in the information we've now found (give yourself some space here), for u1 we get 1 plus i over the square root of 6 and 2 over the square root of 6. Don't feel any desire to rationalize those denominators; it won't help us one bit here. For u2 we get negative 1 minus i over the square root of 3 and 1 over the square root of 3. So that's our matrix P. Our matrix D has the eigenvalues from before, 4 and 1; make sure you list them in the same order you placed the eigenvectors. Then for P star, we take the conjugate transpose of P: rows become columns, and make sure you take conjugates. So we get 1 minus i over the square root of 6 (we took the conjugate there), then 2 over the square root of 6 (that was a real number, so conjugation did nothing), then negative 1 plus i over the square root of 3 (notice how I took the conjugate there), and then 1 over the square root of 3.

And so you now see in front of you the unitary diagonalization of this matrix. You can multiply this out and verify that it is in fact a correct factorization of the original matrix. So this is again the whole enchilada, with the extra benefit that we did some complex numbers this time and obtained a unitary diagonalization, which is the complex counterpart of an orthogonal diagonalization.
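If you'd rather let a computer do that verification, here's a minimal NumPy sketch in the same spirit as before. It rebuilds P and D from our hand computation and confirms both that P is unitary and that P D P star reproduces A.

```python
import numpy as np

A = np.array([[2, 1 + 1j],
              [1 - 1j, 3]])

# P has the normalized eigenvectors u1, u2 as its columns, in the same
# order as the eigenvalues on the diagonal of D.
P = np.column_stack([np.array([1 + 1j, 2]) / np.sqrt(6),
                     np.array([-1 - 1j, 1]) / np.sqrt(3)])
D = np.diag([4, 1])

# P is unitary: P* P = I, so P* plays the role of the inverse of P.
print(np.allclose(P.conj().T @ P, np.eye(2)))   # True

# The factorization P D P* reassembles the original matrix A.
print(np.allclose(P @ D @ P.conj().T, A))       # True
```

The allclose comparisons are used because floating-point arithmetic only matches the exact square roots up to rounding error.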