So, in the previous lecture, we started to look at this notion of norms, and we saw that every inner product space is automatically endowed with a norm, and this norm has certain properties. One of the very important properties is the Cauchy-Schwarz inequality, which we proved for a general field, and when I say general field, of course, it means either the complex numbers or the real numbers. Now, there is an interesting little offshoot of that result: if you are dealing with just the real field, there is an alternate proof of the Cauchy-Schwarz inequality. I will just sketch that proof. So, what is the sketch? You take alpha v1 plus v2, where v1 and v2 belong to V and alpha is real. So, this is an inner product space over R. We have already done the general proof; I am just giving you another way of doing it when the inner product space is over R. In that case, it becomes even simpler. So, let us look at this object: alpha v1 plus v2, inner with itself, is nothing but the norm of alpha v1 plus v2 squared, which by definition is non-negative. Now, let us open this up and see what it leads to. Remember, the field is real now. So, what do we know about this inner product? It is not sesquilinear anymore; it is bilinear. And since alpha is real, alpha times alpha is just alpha squared; you do not need alpha times alpha conjugate. So, the first term is alpha squared times the norm of v1 squared; you can open up the brackets and see for yourself. Next, alpha v1 inner with v2 and v2 inner with alpha v1 are one and the same, because you can flip the order now, and conjugation has no effect since everything is real. So, the cross terms give twice alpha times the inner product of v1 with v2, and finally there is the norm of v2 squared.
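Written out in symbols, the expansion being described on the board is (a reconstruction from the spoken steps, with the inner product bilinear over R):

```latex
0 \;\le\; \langle \alpha v_1 + v_2,\ \alpha v_1 + v_2 \rangle
  \;=\; \alpha^2 \,\|v_1\|^2 \;+\; 2\alpha\,\langle v_1, v_2 \rangle \;+\; \|v_2\|^2,
  \qquad \alpha \in \mathbb{R}.
```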
Now, let us look at this object as a function of alpha, for some fixed v1 and v2. What sort of a function of alpha is this? It is a quadratic polynomial in alpha, and on top of that, it is non-negative for every alpha. So, what must the discriminant of this quadratic polynomial be? Non-positive. If the quadratic is strictly positive, the discriminant is strictly negative: 4 times the inner product of v1 with v2, squared, minus 4 times the norm of v1 squared times the norm of v2 squared, must be negative. In that case, just cancel out the 4, take square roots, and you have the strict Cauchy-Schwarz inequality. But suppose there is equality. What can happen in the case of equality? Can the quadratic have two distinct roots? If there were two distinct roots, then in between those two roots there would have been a sign reversal. But there can be no sign reversal, because the quadratic is never negative. So, plotting alpha against f of alpha, at best you can have a repeated root at some point; you cannot have two distinct roots, because in that case the non-negativity would be violated. And if you have a repeated root, the inequality just changes to equality. So, it is as straightforward as that. If you are dealing with real vector spaces, this gives you the Cauchy-Schwarz inequality, which took slightly more effort to prove in the case of a general inner product space, where the field is allowed to be complex as well as real. You can complete this argument yourself; I am only saying it in words here. So, that is the Cauchy-Schwarz inequality; that story has been told. Now, along with the Cauchy-Schwarz inequality, when we dealt with vectors in Euclidean spaces, we had the notion of angles, and we said that when one vector is perpendicular to another, their inner product is zero.
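The real-field Cauchy-Schwarz inequality, including the equality case for linearly dependent vectors, can be checked numerically; a small sketch using random vectors and the standard dot product (the choice of dimension and seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random real vectors; the standard dot product is an inner product over R.
v1 = rng.standard_normal(5)
v2 = rng.standard_normal(5)

lhs = abs(np.dot(v1, v2))                      # |<v1, v2>|
rhs = np.linalg.norm(v1) * np.linalg.norm(v2)  # ||v1|| ||v2||
assert lhs <= rhs                              # Cauchy-Schwarz

# Equality holds exactly when one vector is a scalar multiple of the other
# (the "repeated root" case of the quadratic in alpha).
w = 3.0 * v1
assert np.isclose(abs(np.dot(v1, w)), np.linalg.norm(v1) * np.linalg.norm(w))
```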
But the notion of perpendicularity extends to what we call orthogonality. So, by definition, if two vectors have inner product zero, then the two vectors are said to be orthogonal. And by extension, such notions also lead to the idea of so-called orthogonal sets; but we will get there later. We have another inequality, if you recall, which we are yet to prove, introduced in the previous lecture: the triangle inequality. As it turns out, the triangle inequality is a straightforward application of the Cauchy-Schwarz inequality, which is why we proved that first. So, let us look at the triangle inequality. Consider the norm of v1 plus v2, squared; of course, v1 and v2 come from the vector space V, which is an inner product space, so all those things are understood. This is equal to the inner product of v1 plus v2 with itself. Expanding, this gives the norm of v1 squared, plus the norm of v2 squared, plus the cross terms. Remember, now we are back in the domain of the general field, which is either complex or real, F equal to C or R, which is to say we have to respect the rules of sesquilinearity; we cannot treat the inner product as bilinear anymore. So, what are the cross terms? The inner product of v1 with v2, plus the inner product of v2 with v1. But if we flip the order, the second is just the conjugate of the first, from the definition. And if you take a complex number and add it to its conjugate, what results? Two times the real part. So far, everything is equality. But now, what do you think about the comparison between the real part of a complex number and its absolute value, or modulus? The real part is a real number; the modulus is a real number. So, we are not comparing apples and oranges; we are comparing two real numbers, and there is always an ordering between them.
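The expansion just described, reconstructed in symbols (using the convention that the second cross term is the conjugate of the first):

```latex
\|v_1 + v_2\|^2
  = \|v_1\|^2 + \langle v_1, v_2 \rangle + \langle v_2, v_1 \rangle + \|v_2\|^2
  = \|v_1\|^2 + 2\,\mathrm{Re}\,\langle v_1, v_2 \rangle + \|v_2\|^2,
\quad\text{since } \langle v_2, v_1 \rangle = \overline{\langle v_1, v_2 \rangle}.
```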
Is there something that you can say about the real part of a complex number and its modulus? If a complex number is x plus iy, then x is its real part and its modulus is the square root of x squared plus y squared. So, one of them is always at least as big as the other: the modulus is always going to be the bigger one, or at least no smaller. So, if I replace the real part of the inner product by its modulus, I get an upper bound; no problems there. And now, if I use the Cauchy-Schwarz inequality, what follows? That modulus, the object I have encircled in blue, is at most the norm of v1 times the norm of v2; the argument connecting these two is exactly Cauchy-Schwarz. But what do we have now? The norm of v1 squared, plus twice the norm of v1 times the norm of v2, plus the norm of v2 squared. That is just a perfect square: the norm of v1 plus the norm of v2, whole squared. So, we started with the norm of v1 plus v2, squared, and we ended up with the square of the norm of v1 plus the norm of v2. These are all non-negative numbers, and if you take the square root, it only makes sense to talk about the non-negative square root. So, based on all of this, the norm of v1 plus v2 is less than or equal to the norm of v1 plus the norm of v2. That is the triangle inequality. Any doubts? The only non-trivial step we have used is the Cauchy-Schwarz inequality; everything else is just complex numbers and their algebra. So, as I said a while back, we have this idea of orthogonality. Let us dig a little deeper and see how far we can stretch our notions from Euclidean spaces. We have the famous Pythagorean theorem: base squared plus height squared is the hypotenuse squared. So, let us first define orthogonality formally: v1 and v2, belonging to V, an inner product space, are orthogonal if the inner product of v1 with v2 is zero. That is the definition of orthogonality. But now, when we have the Pythagorean theorem, we understand it geometrically as follows.
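The chain of steps in this proof can be verified numerically for complex vectors; a sketch using numpy's Hermitian inner product (`np.vdot` conjugates its first argument, so `np.vdot(v2, v1)` matches the convention of conjugating the second slot; the dimension and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random complex vectors with the standard Hermitian inner product.
v1 = rng.standard_normal(4) + 1j * rng.standard_normal(4)
v2 = rng.standard_normal(4) + 1j * rng.standard_normal(4)

inner = np.vdot(v2, v1)  # <v1, v2>, conjugation on the second argument

# Each step of the proof, checked:
lhs_sq = np.linalg.norm(v1 + v2) ** 2
expand = np.linalg.norm(v1) ** 2 + 2 * inner.real + np.linalg.norm(v2) ** 2
assert np.isclose(lhs_sq, expand)  # exact expansion with cross terms
assert inner.real <= abs(inner)    # Re(z) <= |z|
assert abs(inner) <= np.linalg.norm(v1) * np.linalg.norm(v2)  # Cauchy-Schwarz
assert np.linalg.norm(v1 + v2) <= np.linalg.norm(v1) + np.linalg.norm(v2)
```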
And if you have this as v1 and this as v2, drawn perpendicular to each other, then the hypotenuse is v1 plus v2. So, in that case, what happens to the aforesaid triangle inequality? We will see that the norm of v1 plus v2, whole squared, is equal to the norm of v1 squared plus the norm of v2 squared. Now, the question that immediately arises is: is this also an equivalent condition for orthogonality? That is the question. You can go ahead and check that one direction is obvious, trivial in fact: if v1 and v2 are orthogonal, then this equality holds. But is the converse also the case? We know that in Euclidean spaces it is. But in general, is it true that if this equality holds, then you can immediately infer that v1 and v2 are orthogonal? No, and this is why it is important to go through these proofs, so that you are not misled into inferring more than you actually can. There is nothing special about Euclidean space, except for the fact that the field is real. The moment you are dealing with inner product spaces over the complex field, this one-sided implication does not work. That is, the Pythagorean equality does not imply orthogonality. The equality only tells you that the real part of the inner product of v1 with v2 is zero; that does not mean the inner product of v1 with v2 is zero, because the inner product might have a non-zero imaginary part. Only when you are dealing with inner product spaces over the real field does the Pythagorean equality give you orthogonality, and that is a very big standing assumption, is it not? So, do not infer that this is an equivalence with orthogonality. The equivalence with orthogonality is just as much as is stated in the definition.
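This failure of the converse over C can be made concrete with a one-dimensional example (the specific vectors are my own illustration): v1 = (1) and v2 = (i) satisfy the Pythagorean equality, yet their inner product is -i, not zero.

```python
import numpy as np

# One-dimensional complex vectors: v1 = (1), v2 = (i).
v1 = np.array([1.0 + 0.0j])
v2 = np.array([0.0 + 1.0j])

inner = np.vdot(v2, v1)  # <v1, v2> = 1 * conj(i) = -i

# The Pythagorean equality holds: |1 + i|^2 = 2 = 1 + 1 ...
assert np.isclose(np.linalg.norm(v1 + v2) ** 2,
                  np.linalg.norm(v1) ** 2 + np.linalg.norm(v2) ** 2)

# ... because the REAL part of the inner product is zero ...
assert np.isclose(inner.real, 0.0)

# ... yet v1 and v2 are NOT orthogonal: the inner product itself is -i.
assert not np.isclose(abs(inner), 0.0)
```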
So, you have to understand what the definition is. The definition is that the inner product is zero, not the Pythagorean equality; the latter is equivalent only in the special case when the field is real. So, be very careful about carrying your intuition for real fields into more general vector spaces over complex fields. Check every possible outcome, and then assure yourself that the intuition indeed carries forward. All right. So, this is the notion of orthogonality. With this, we can go ahead and also define an orthogonal set. What is an orthogonal set? Suppose S is a set contained in the vector space V, which is an inner product space, with S equal to v1, v2, up to vk. If the inner product of vi with vj is equal to 0 for all i not equal to j, then S is an orthogonal set. So, you take any pair of vectors from that set and take their inner product; if the two are distinct, their inner product vanishes. In that case, you say that S is an orthogonal set of vectors. Why are we interested in this? We will see that shortly, but before that, there is a claim that follows immediately from these notions. Suppose S, inside an inner product space V, is an orthogonal set containing no zero vector. Can you guess what is coming? That is the setup of this result. If this is true, then S is a linearly independent set. It is really not very hard to see why this should be the case. Of course, if you have the zero vector, the question of linear independence vanishes; we proved much earlier, even before we delved into inner product spaces, that any set containing the zero vector cannot be linearly independent. So, we have to rule out the zero vector if we are looking for a linearly independent set.
So, of course, the zero vector cannot be contained in S; but note that the zero vector is definitely a legitimate candidate for being part of an orthogonal set, because it is orthogonal not just to every other vector, but also to itself. So, we rule out the situation where zero is part of the set we have picked, and then we check whether the claim holds water. Suppose the summation of alpha i times si, for i going from one through k, is equal to the zero vector, where S equals s1, s2, up to sk and is contained inside V. Now take the inner product of both sides with any sr, for r belonging to the set 1 through k; the right-hand side is the inner product of the zero vector with sr, which is obviously zero. But what can we immediately say about the left-hand side? Is the inner product with sr not going to act like a filter that picks out alpha r? Because you take any other si apart from sr, and its inner product with sr just vanishes. That is the beauty of an orthogonal set: only the inner product of sr with itself survives. So, what survives at the end of the day is alpha r times the norm of sr squared, which must equal zero. And we have ruled out the case where any member of the set S is zero; therefore sr is not the zero vector, and the norm of sr is not zero. So, I can just go ahead and divide by the norm of sr squared on either side, and alpha r is equal to zero. But did I make any special choice of r? I chose r to be an arbitrary number from 1 through k. So, having established that alpha r is zero for any r between 1 and k, have I not also established that alpha i is equal to zero for all i in the set 1 through k?
You can go ahead and pick out s1, s2, s3, and so on sequentially, and prove that each of alpha 1, alpha 2, alpha 3, up to alpha k is zero, which means that every alpha i is zero. And that implies that S is a linearly independent set. So, we have indeed established that when you have an orthogonal set, you just have to check that the set does not contain the zero vector, because a set with the zero vector can also be an orthogonal set. But once you have excluded the possibility that zero is part of the set, linear independence comes as a guarantee. That is what we have seen so far.
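The whole argument can be sketched numerically; a small illustration with an orthogonal (but not orthonormal) set of my own choosing in R^3, checking pairwise orthogonality and then that the only solution of the homogeneous system is all alphas zero:

```python
import numpy as np

# An orthogonal set in R^3, none of whose members is the zero vector.
s1 = np.array([1.0, 1.0, 0.0])
s2 = np.array([1.0, -1.0, 0.0])
s3 = np.array([0.0, 0.0, 2.0])
S = [s1, s2, s3]

# Pairwise inner products vanish, so S is an orthogonal set.
assert all(np.isclose(np.dot(S[i], S[j]), 0.0)
           for i in range(3) for j in range(3) if i != j)

# Suppose sum(alpha_i * s_i) = 0. Taking the inner product with s_r
# "filters out" alpha_r * ||s_r||^2, forcing every alpha_r to be zero.
# Numerically: the only solution of A @ alpha = 0 is alpha = 0,
# i.e., the columns of A are linearly independent (full rank).
A = np.column_stack(S)
alpha = np.linalg.lstsq(A, np.zeros(3), rcond=None)[0]
assert np.allclose(alpha, 0.0)
assert np.linalg.matrix_rank(A) == 3
```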