Hello all, welcome to another session of our NPTEL course on nonlinear and adaptive control. I am Srikanth Sukumar from Systems and Control, IIT Bombay. We start, as always, with our motivational image: a rover on Mars. Our motivation is to design algorithms that help drive autonomous systems such as rovers on Mars.

Let us recap what we did last time. We started with the notion of an inner product space and defined a Hilbert space, which is an inner product space that is complete with respect to the norm induced by the inner product. Beyond that, we went into more detail on the induced matrix norm. We looked at the notion of the supremum: how it is defined and how to compute it for functions and for sets. We also looked at a few important matrix properties, namely properties of symmetric square matrices, which we will refer to regularly in later lectures; a key one is the two-sided bound on a quadratic form, lambda_min(P) ||x||^2 <= x^T P x <= lambda_max(P) ||x||^2 for symmetric P. We also saw how the induced matrix norm reduces to simple formulae in the special cases of the one, two, and infinity norms, and we computed an example.

We were left with the Cauchy-Schwarz inequality for the general case. If you remember, in the lecture before last, while proving the norm property for the two norm (the Euclidean norm), we proved a Cauchy-Schwarz inequality for that very specific case. What we want to do today is start by proving the Cauchy-Schwarz inequality for a general inner product space.

General Cauchy-Schwarz inequality proof. This is one of the most critical inequalities we will use, for norms of all sorts of objects: vectors, matrices, and, later, signals. So we definitely want to understand how it is proven in general.

The starting observation is that any vector u can always be decomposed as

u = (<u, v> / ||v||^2) v + w, where v is orthogonal to w, that is, <v, w> = 0.

In words, u has two components: one in the direction of v and one in a direction w orthogonal to v. This is easy to see in the typical spaces: given vectors u and v, I can always take the projection of u onto v and let w be what is left over; then u is the sum of the projection and the orthogonal remainder, which is exactly the expression above. Note the ||v||^2 in the denominator: one factor of ||v|| turns v into a unit vector, and the other normalizes the projection coefficient along that unit direction. Now, if I want to compute the norm of u, what do I do?
I take the inner product of u with itself, because that is how the norm is defined in an inner product space:

||u||^2 = <u, u> = < (<u, v>/||v||^2) v + w , (<u, v>/||v||^2) v + w >.

Now I use three properties of the inner product: it is distributive (additive in each argument), scalar factors come outside, and it is symmetric. Using these, I can expand the right-hand side. Notice that <u, v>/||v||^2 is a scalar. The first piece comes from pairing the v-term with itself:

(<u, v>/||v||^2)^2 <v, v> = <u, v>^2 / ||v||^2,

where <v, v> = ||v||^2 cancels against the ||v||^4 in the denominator to leave ||v||^2. The second and third pieces come from pairing the v-term with w, in both orders; by symmetry of the inner product they are equal, and together they give 2 (<u, v>/||v||^2) <v, w>. But <v, w> = 0, exactly because we assumed v and w are orthogonal — it is an orthogonal projection — so the cross terms vanish. The last piece is simply <w, w> = ||w||^2. Altogether,

||u||^2 = <u, v>^2 / ||v||^2 + ||w||^2.

Now, ||w||^2 is certainly greater than or equal to zero, so I can drop it and turn this equality into an inequality:

||u||^2 >= <u, v>^2 / ||v||^2.

Comparing the two ends, I multiply through by ||v||^2 and take square roots, and I immediately get the desired Cauchy-Schwarz inequality:

|<u, v>| <= ||u|| ||v||.

All I have done is move the ||v||^2 to the left and get rid of the squares everywhere.
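If you would like to check this decomposition and the resulting inequality yourself, here is a minimal NumPy sketch; it is not part of the lecture, and the particular random vectors in R^4 with the standard dot product are just illustrative choices. It verifies that the cross term vanishes, that the norm identity holds, and that the Cauchy-Schwarz inequality follows.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(4)   # arbitrary test vector in R^4 (illustrative choice)
v = rng.standard_normal(4)   # arbitrary test vector in R^4 (illustrative choice)

# Decompose u = (<u,v>/||v||^2) v + w, with w orthogonal to v.
proj = (u @ v) / (v @ v) * v
w = u - proj

print(abs(v @ w))                            # ~0: the cross term <v, w> vanishes
print(np.linalg.norm(u)**2)                  # ||u||^2 matches ...
print((u @ v)**2 / (v @ v) + w @ w)          # ... <u,v>^2/||v||^2 + ||w||^2
print(abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v))  # True: Cauchy-Schwarz
```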
So that is the Cauchy-Schwarz inequality, in its more general form. Once you have it in your toolkit, you will see that it is a very, very useful inequality. It should also remind you of a closely related variant, for instance the triangle inequality ||u + v|| <= ||u|| + ||v|| for the induced norm, which is not exactly the Cauchy-Schwarz inequality but can easily be proven using what we have just shown. I leave that little piece to you.

Now that we have done vector norms and matrix norms, we go to the next object, which is signal norms. It should be clear to you by now that we are going to be dealing with states that are functions of time, outputs that are functions of time, and controls that are functions of time. So we are not just dealing with vectors and with matrices operating on those vectors; we are dealing with signals, that is, vectors that change as functions of time. Therefore we also want the notion of a signal norm.

How do we define signal norms? Using vector norms. As you can imagine, and as we have already discussed, mathematicians like to develop new notions from existing ones: the induced matrix norm was developed from the vector norm, and the signal norm is likewise developed with the vector norm as its basis.

What is interesting about the signal norm? You will notice that the signal norm is not a function of time, even though the signals themselves are functions of time. A signal norm tells you something about the overall behavior of the signal, for the entire span of time.

The first one is the p-norm. For a continuous-time signal x(t), which may be vector-valued, the p signal norm is defined as

||x||_p = ( ∫_0^∞ ||x(t)||^p dt )^(1/p), for 1 <= p < ∞.

That is: take the vector norm at each time, raise it to the p-th power, integrate over all time, and then take the p-th root. Further, the infinity norm, which as you have noticed is always slightly different from the rest, is simply the supremum over all time of the vector norm of the signal:

||x||_∞ = sup_{t >= 0} ||x(t)||.

So in the first definition we integrate over time, and in the second we take a supremum over time; either way, the time argument completely vanishes on the left-hand side.
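As a quick sanity check on these two definitions, here is a small sketch, again not from the lecture; the scalar test signal x(t) = e^(-t), the truncated horizon, and the grid resolution are my own choices. It approximates the 2 signal norm and the infinity signal norm on a finite grid and compares them with the exact values.

```python
import numpy as np

t = np.linspace(0.0, 50.0, 500_001)   # truncated stand-in for [0, infinity)
dt = t[1] - t[0]
x = np.exp(-t)                        # scalar test signal x(t) = e^(-t)

p = 2
# p signal norm: integrate |x(t)|^p over time, then take the p-th root.
norm_p = (np.sum(np.abs(x) ** p) * dt) ** (1.0 / p)
# infinity signal norm: supremum of |x(t)| over all time.
norm_inf = np.max(np.abs(x))

print(norm_p)    # ~0.7071; the exact value is (1/2)^(1/2)
print(norm_inf)  # 1.0; the supremum is attained at t = 0
```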
And this is what I meant when I said that signal norms tell you something about the behavior of a signal for all time: a globally-in-time property of the signal, if you like.

The important thing to notice is that the vector norm ||x(t)|| here is arbitrary. If you remember, for the matrix norm I had mentioned very clearly that the p-induced matrix norm uses the p vector norm on the right-hand side. Here there is no such constraint: the p in ||x||_p comes from the p-th power and the p-th root, and has no connection to which vector norm we use. So you are free to use any vector norm. The only requirement is that within one complete problem you should not mix different vector norm notions; use the same vector norm for every norm computation in a particular problem, otherwise there will be inconsistencies in your results. That is all you need to remember; otherwise the choice is arbitrary. So, please do not switch vector norms in the middle of a problem.

One of the important things about these signal norms is that they define spaces of signals, in fact vector spaces of signals, and this is a very critical notion. We say that if ||x||_p is finite for some p between 1 and ∞, then x is said to be in the L_p space; L_p is a vector space of signals. One critical thing to notice: until now, when we talked about R^n, R^p, R^k, and so on, we were talking about finite-dimensional vector spaces. Here, each L_p — for example the L_1 space, the L_2 space, and so on up to L_∞ — is an infinite-dimensional vector space, because these are spaces of functions. If you have seen a serious course in mathematical analysis and vector spaces, you will know these notions; if not, you can read up on them. But they are vector spaces nonetheless, and the categorization is simple: if a function has a finite L_p norm, then it belongs to the L_p space. These are very useful regularity conditions that appear in many places in mathematics, such as approximation theory and Fourier transforms; so they are significant not just in the context of control and adaptation but in a much more general mathematical setting.

There is also a discrete counterpart, the small l_p spaces: in the definitions above, the integral gets replaced by a summation, and you get the small l_p space instead of the capital L_p space. In this course, of course, we are concerned only with the capital L_p spaces.
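To see how membership genuinely depends on p, here is a short worked example of my own choosing, not from the lecture: the scalar signal x(t) = 1/(1 + t) is bounded and belongs to L_2, but not to L_1.

```latex
\|x\|_1 = \int_0^\infty \frac{dt}{1+t}
        = \lim_{T \to \infty} \ln(1+T) = \infty ,
\qquad
\|x\|_2 = \left( \int_0^\infty \frac{dt}{(1+t)^2} \right)^{1/2}
        = \left( \Big[ -\tfrac{1}{1+t} \Big]_0^\infty \right)^{1/2} = 1 .
```

So x is in L_2 and in L_∞ (indeed ||x||_∞ = 1), but not in L_1: a finite norm for one value of p says nothing about the others.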
Another thing to note is that a signal being in L_∞ is the same as the signal being a bounded signal. This is an important characterization to remember, and it is not very difficult to prove; you know what the infinity norm of a signal is. Suppose first that x(t) is bounded. What does it mean for a signal to be bounded? It means there exists a positive number m such that ||x(t)|| <= m for all t >= 0. This piece of information tells me that sup_{t >= 0} ||x(t)|| is also less than or equal to m. That immediately means x is in L_∞, because the infinity norm is bounded.

Now, what about the other way around? Say I know that ||x||_∞ = m, that is, sup_{t >= 0} ||x(t)|| = m. This immediately implies ||x(t)|| <= m for all t >= 0. Why? Because the supremum is the least upper bound, but it is still an upper bound; therefore at each instant in time ||x(t)|| has to be less than or equal to m, there is no choice. And that is of course the definition of a bounded function. So the two notions are equivalent.

As far as notation is concerned, the signal norm never has a time argument, because the time argument gets eliminated by the integration or by taking the supremum, while the vector norm always has a time argument. Please be very careful about this even while writing: the vector norm has to be evaluated at an instant in time, otherwise there is no vector at all and no question of a vector norm. You choose a particular time, say one second, insert it into x, obtain a fixed vector, and then compute its norm just the way we have learned.

One of the nice facts we know about vector norms, which we will not prove, is norm equivalence. It says that you can take any two vector norms, say the p norm and the q norm, and they are related by constants alpha, beta > 0:

alpha ||x||_p <= ||x||_q <= beta ||x||_p.

This is always true for any pair of vector norms. However, one of the interesting things to note is that this is not possible for signal norms, and it is very easily shown by a simple counterexample. Take the vector signal x(t) = (sin t, cos t)^T. First let us look at the infinity norm. I will compute ||x||_∞ = sup_{t >= 0} ||x(t)||, and I choose the two norm because it is easy to compute:

||x(t)||_2 = sqrt(sin^2 t + cos^2 t) = 1 for all t,

so the supremum of 1 over all time is just 1. We have shown that ||x||_∞ = 1, so I can even say that x belongs to L_∞ as per our definition.

Now let us try to compute, say, the 1 norm, again with the two vector norm inside:

||x||_1 = ∫_0^∞ ||x(t)||_2 dt = ∫_0^∞ 1 dt = ∞,

because the integrand is 1, integrated from 0 to ∞. So x does not belong to L_1. In fact, you can show that x does not belong to any L_p for finite p.
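Here is a numerical sketch of the same counterexample; the horizons and grid size are my own choices, not from the lecture. The supremum of ||x(t)||_2 stays at 1, while the truncated 1 norm grows roughly like the horizon T, which is consistent with the full integral diverging.

```python
import numpy as np

def truncated_norms(T, n=200_001):
    """Sup norm and truncated 1-norm of x(t) = (sin t, cos t) on [0, T]."""
    t = np.linspace(0.0, T, n)
    dt = t[1] - t[0]
    x = np.stack([np.sin(t), np.cos(t)])   # 2 x n array of samples of x(t)
    v2 = np.linalg.norm(x, axis=0)         # ||x(t)||_2 = 1 at every sample
    return v2.max(), np.sum(v2) * dt       # sup over grid, Riemann sum of the integral

for T in (10.0, 100.0, 1000.0):
    sup_n, one_n = truncated_norms(T)
    print(T, sup_n, one_n)   # sup norm stays at 1; truncated 1-norm grows like T
```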
So what do we know? x belongs to L_∞ and not to any other L_p; all the other signal norms are infinite. Therefore there is no possibility of norm equivalence. It can happen that the infinity norm is bounded while the other norms are unbounded, so there is no question of having inequalities like the ones above, which by definition would require all the norms to be bounded together. In the signal norm case one cannot even guarantee that: a signal which is L_∞ may not be L_1, a signal which is L_1 may not be L_2, and so on and so forth. This is a rather critical thing to remember: there is no norm equivalence for signal norms.

So, what have we seen today? We completed the proof of the Cauchy-Schwarz inequality for a more general case, a general inner product space if you like. After that, we started to look at the notion of signal norms, the next notion we need in order to complete various proofs that will occur through this course. In the process, we also learned about the notion of L_p spaces. And finally, we saw that there is no norm equivalence in L_p spaces, unlike for vector norms, where norm equivalence is a standard notion. This is where we will conclude today. See you again next time. Thank you.