Hello everyone, and welcome to yet another session of the NPTEL course on nonlinear and adaptive control. I am Srikant Sukumar from Systems and Control Engineering, IIT Bombay. As you can see, we are again in front of our very motivating background image of a rover on Mars, and we are almost at the stage where we can state all these very interesting results for analyzing such autonomous algorithms. What we have been doing is looking at the Lyapunov stability theorems. These are, as I had stated, among the most seminal results of nonlinear control. So far we have looked at the first four of these theorems. In each case we start with a C1 function V which is also positive definite, and these two criteria together make V a candidate Lyapunov function. Given a candidate Lyapunov function, we evaluate its derivative along the dynamical system and draw conclusions from that. First we saw that if V dot is only negative semi-definite, which is the weakest possible property on the derivative, then the equilibrium at zero is stable. If V dot is negative semi-definite and V itself is decrescent, then the equilibrium is uniformly stable. Then, in the previous lecture, we saw the two stronger properties; as I said, the conditions become stronger and stronger as we go down the list. If V dot is in fact negative definite, we have local asymptotic stability of the equilibrium, denoted AS. If on top of that V is decrescent, we get local uniform asymptotic stability, denoted UAS. Where we were last time is that we were working out an example for asymptotic stability and had not completed it, so first we want to complete that example today.
In any case, I will mark the beginning of today's lecture here so that I know where I have to begin with the new definitions. But before we go forward, we are going to complete our discussion of this example. If you notice, this is a linear system, but since we are doing a Lyapunov analysis, it may look a little more complicated to you than a typical linear system; that may not be something you like, but that's what it is. If you look at this system, what you have is something like a typical spring-mass-damper which has been normalized. Here we take a candidate Lyapunov function which is quadratic in x1 and x2: a squared linear combination of the states, k x1 plus x2, plus an x1 squared term. What I do additionally, compared with last time, is introduce some tweaks that I can potentially play with, which were missing earlier: the k in front of x1 here, and the alpha in the 2 alpha in the denominator (I think the 2 was already there; I am just adding the alpha). Now, if you want to check positive definiteness and so on, it is not too difficult. You can see that V is zero only when both x1 and x2 are zero; this is something we discussed a little last time. If the state x, which is (x1, x2), is non-zero, that is, if at least one of x1 and x2 is non-zero, you can see that the function gives a strictly positive value. So this is nice; let's not worry about positive definiteness anymore. We already have a C1 function which is positive definite.
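As a quick sanity check of the positive definiteness argued above, the candidate V = (k x1 + x2)^2 / 2 + x1^2 / (2 alpha) can be written as a quadratic form (1/2) x^T P x and tested with Sylvester's criterion. This is only a verification sketch of the lecture's claim; the symbolic setup (variable names, the use of the Hessian to recover P) is my own.

```python
import sympy as sp

x1, x2, k, alpha = sp.symbols('x1 x2 k alpha', positive=True)

# Candidate Lyapunov function from the lecture, with the tweaks k and alpha
V = (k*x1 + x2)**2 / 2 + x1**2 / (2*alpha)

# V is a quadratic form (1/2) x^T P x; the Hessian of V recovers P.
P = sp.Matrix([[sp.diff(V, a, b) for b in (x1, x2)] for a in (x1, x2)])

# Sylvester's criterion: both leading principal minors must be positive.
minor1 = sp.simplify(P[0, 0])   # k**2 + 1/alpha, positive for alpha > 0
minor2 = sp.simplify(P.det())   # 1/alpha, positive for alpha > 0
print(minor1, minor2)
```

Since the determinant reduces to 1/alpha, V is positive definite for every positive k and alpha, exactly as argued in the lecture.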
Therefore, V is a candidate Lyapunov function. Now, if I take the derivative carefully with the k and alpha present, I get (k x1 plus x2) times (k x1 dot plus x2 dot), and from the second term I get (1 over alpha) x1 x1 dot. Then it is just substitution of the dynamics: k x1 dot is simply k x2, x2 dot is minus x1 minus x2, and x1 dot is x2 again. Now I try to combine at least a bit of this last term. To do that, I write this x2 as (k x1 plus x2) minus k x1; it is easy to see this is still x2. Because I do that, the (k x1 plus x2) part can be combined with the first bracket, and the minus k x1 part gives me a minus (k over alpha) x1 squared term. This is already a good term, so I am going to carry it along everywhere: minus (k over alpha) x1 squared. I am just using the red colour to distinguish the additional constants that were not there last time. So, going back to this step, I combine this piece and bring it inside the first bracket: taking (k x1 plus x2) common, I retain the terms k x2 minus x1 minus x2, and add the x1 over alpha right here. Very simple. The minus (k over alpha) x1 squared term remains as it is throughout, so I don't even worry about it. Now, inside the bracket, I combine the x1 terms and the x2 terms; I have one term in x1 and two terms in x2, and combining them gives me something like this. What do I do next? I take the negative of (1 minus k) common outside.
So I am left with x1 times (1 minus 1 over alpha) divided by (1 minus k), plus x2. Now, for this to give a negative squared term, which is what I want for negative definiteness, this bracket has to resemble (k x1 plus x2). For that to happen, we have a few requirements. The first is that k has to be less than one, because otherwise (1 minus k) is not positive anymore. The second is that alpha has to be positive; of course k also has to be positive, this is not negotiable. All the constants we introduce have to be positive, otherwise we land in problems with definiteness. So k has to be positive but less than one, to make sure (1 minus k) is positive, so that I get the negative squared term I want; and alpha has to be positive. Now, because I want this bracket to be exactly (k x1 plus x2), I want the coefficient (1 minus 1 over alpha) divided by (1 minus k) to be exactly equal to k; that is what I have written here. Expanding this as a quadratic in k: 1 minus 1 over alpha equals k minus k squared. Bringing everything to one side, I get a quadratic equation, which of course has two solutions, and I can pick either solution; in fact, we don't really need to distinguish between them as such. Note that for a solution to exist, I need the quantity under the square root to be positive, because if it is negative then k becomes imaginary, so k has no real solution, which is not okay: if k has no solution, I don't have a candidate Lyapunov function. So we definitely want this quadratic equation in k to have a real solution.
Therefore, I need whatever is inside the square root to be positive, and for that I have the requirement that 4 times (1 minus 1 over alpha) is less than 1. I can solve this very quickly; it is not too difficult. Taking 1 minus 1 over alpha less than 1 over 4 and simplifying, I get alpha less than 4 over 3. So I already know that alpha has to be positive, and now I have alpha less than 4 over 3. Actually, I was not completely precise earlier about the sign choice in the root formula: I had mentioned that k has to be less than 1, so you might worry that the plus sign gives k greater than 1. But because there is a division by 2 in the formula, either sign is in fact possible; you just have to think carefully and make sure the resulting k remains less than 1. That is all; either case is possible. So I have two conditions now: alpha has to be between 0 and 4 over 3, and k has to be between 0 and 1. These are the important things. Now, in case you are wondering what a concrete choice could be, one possibility is to make 4 times (1 minus 1 over alpha) exactly equal to one half, instead of merely less than 1. From this I can calculate alpha to be 8 over 7, and it is not difficult to see that 8 over 7 is less than 4 over 3. So simple. So what have I achieved by making these choices of alpha and an appropriate k? Let me complete this.
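The concrete choice above can be checked numerically: with alpha = 8/7 the condition 1 - 1/alpha = k(1 - k) becomes the quadratic k^2 - k + 1/8 = 0, and both of its roots should indeed lie strictly between 0 and 1. A small sketch of that check (the variable names are mine):

```python
import sympy as sp

k = sp.symbols('k')

# With alpha = 8/7, 1 - 1/alpha = 1/8, so the lecture's condition
# 1 - 1/alpha = k*(1 - k) becomes k**2 - k + 1/8 = 0.
alpha = sp.Rational(8, 7)
roots = sp.solve(sp.Eq(k**2 - k + (1 - 1/alpha), 0), k)
print([sp.nsimplify(r) for r in roots])   # both roots lie in (0, 1)
```

So, as stated in the lecture, either root is admissible here; both keep k strictly between 0 and 1.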
This is minus (1 minus k) times (k x1 plus x2) whole squared, minus (k over alpha) x1 squared, which is in fact negative definite. Using the Lyapunov theorem now, with V dot negative definite, I can conclude local asymptotic stability. In fact, V is also decrescent in this case: if you notice, there is no time dependence in V at all, so it is obviously decrescent; decrescence comes for free. Therefore, not only can I apply the asymptotic stability result, I can in fact apply the stronger result as well: V dot negative definite and V decrescent, which gives me local uniform asymptotic stability. So I am going to characterize this equilibrium as uniformly asymptotically stable. All right, let's continue with the rest of the definitions. Once we have the local result, we also want to look at the global result. What do we need for that? Again, you keep adding more and more qualifiers. The first two conditions are the same: V dot has to be negative definite and V has to be decrescent; additionally, V has to be radially unbounded. In this case, notice that V cannot be a map on a ball anymore; I have to carefully require V to map [t0, infinity) cross Rn to R. The candidate Lyapunov function has to be defined on the entire state space because, if you remember, radial unboundedness requires V to dominate a class K-infinity function, which by definition is increasing for all values of its argument and goes to infinity. For radial unboundedness of V, we require this lower bound to hold for all values of the state.
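The whole completed-square computation above can be verified symbolically: with alpha = 8/7 and k a root of k^2 - k + 1/8 = 0, the derivative of V along x1 dot = x2, x2 dot = -x1 - x2 should equal exactly -(1 - k)(k x1 + x2)^2 - (k/alpha) x1^2. This is a verification sketch of the lecture's algebra, with my own variable names; the smaller root is used, but the larger one works equally well.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
alpha = sp.Rational(8, 7)
k = sp.Rational(1, 2) - sp.sqrt(2)/4   # smaller root of k**2 - k + 1/8 = 0

# Candidate Lyapunov function and the normalized spring-mass-damper dynamics
V = (k*x1 + x2)**2 / 2 + x1**2 / (2*alpha)
f = sp.Matrix([x2, -x1 - x2])          # x1' = x2, x2' = -x1 - x2

# Derivative of V along the dynamics
Vdot = sp.diff(V, x1)*f[0] + sp.diff(V, x2)*f[1]

# The completed-square form claimed in the lecture
target = -(1 - k)*(k*x1 + x2)**2 - (k/alpha)*x1**2
print(sp.simplify(Vdot - target))      # 0: the identity holds
```

Since the difference simplifies to zero, V dot is exactly the claimed negative definite expression.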
And since this has to be the case, V itself must first be defined for all values of the state: no more ball of radius r. We need V to map all times and all states to a real number. Great. Other than that, this theorem is identical to the previous one, just with the additional radial unboundedness property, and then I have global uniform asymptotic stability. Now, if I go back to my example, it is in fact a very nice example which, to be honest, helps me cover all my definitions; that is why it is a really simple, nice little example, nothing too magical. So let's look at it as a global stability example. If you look at this V function itself, you already see that V is not just positive definite; it is in fact also radially unbounded. Why? As the state goes to infinity in any direction, V has to go to infinity; it doesn't matter which direction you take. The only potential issue is if you move towards infinity along the line k x1 plus x2 equal to zero, because along that straight line the squared term (k x1 plus x2) squared remains zero, even for large values of the state. That is true. (If k is positive, this is a line sloping in the opposite direction.) However, along that line the other term, x1 squared over 2 alpha, still goes to infinity. So there is no way of avoiding V going to infinity as the state goes to infinity for this problem. Therefore, this V is radially unbounded as well, and all the rest of the proof remains exactly the same.
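The "worst direction" argument above is easy to see numerically: walking to infinity along the line k x1 + x2 = 0 kills the squared cross term, yet V still blows up because of the x1^2 / (2 alpha) term. A small illustration (the particular values of k and alpha here are illustrative, not the ones derived in the lecture):

```python
# Illustrative check that V grows without bound even along k*x1 + x2 = 0
k, alpha = 0.5, 8/7

def V(x1, x2):
    # Candidate Lyapunov function from the lecture
    return (k*x1 + x2)**2 / 2 + x1**2 / (2*alpha)

# Along the line x2 = -k*x1 the cross term is identically zero,
# but V(r, -k*r) = r**2 / (2*alpha) still goes to infinity with r.
for r in (1, 10, 100, 1000):
    print(r, V(r, -k*r))
```

The printed values grow like r^2 / (2 alpha), confirming there is no escape direction along which V stays bounded.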
In fact, this system is not just UAS; it is globally uniformly asymptotically stable. Now, one might wonder: are there examples which are not global? Yes, of course. As soon as nonlinear things start happening, there is the possibility of many different kinds of phenomena. So let me give this a shot; this is example five. Consider the system x1 dot equals x2 and x2 dot equals minus sine x1 minus x2. To make the analysis a little simpler for us, I am going to modify it slightly and use something like this. So, looking at this system, what can I say? Here I choose my V(x1, x2) as 1 minus cosine x1 plus one half x2 squared; let's see if this helps me, though it may create some trouble. This may look like a somewhat contrived construction, but let's not worry about that: I am just trying to illustrate a case where you don't get global properties. So if I take V equals 1 minus cosine x1 plus one half x2 squared, is this positive definite? I believe we considered this function in one of our earlier lectures. Look at the term 1 minus cosine x1: when x1 is 0, cosine x1 is 1, so this term is 0; hence it should be evident that V(0, 0) is of course 0. Now take a non-zero x1: when does 1 minus cosine x1 go back to 0? Cosine goes from 1 down to minus 1 and back up, so let me think about where that happens.
The next time it goes to 0 is at x1 equal to 2 pi: at x1 equal to 2 pi, cosine x1 is 1 again, so 1 minus cosine x1 is 0. Absolutely. So what I will do, just to keep myself safe, is consider x1 in (minus pi, pi) and x2 in R. You can think of this as a ball, or you can take any interval (minus r, r) of any size in x2; x2 doesn't matter. But if x1 is within minus pi to pi, you are guaranteed that V is positive for all non-zero (x1, x2). Therefore, this function is positive definite on (minus pi, pi) cross R, the Cartesian product. However, it is not radially unbounded. Notice that the largest value the term 1 minus cosine x1 can take is 2, so if I take x2 to be 0 and propagate only along the x1 axis towards infinity, the maximum value V will reach is 2; it never goes to infinity. So this is an example of a function that is not radially unbounded. Okay, great. Now we do the analysis. We quickly take the derivative: V dot is sine x1 times x1 dot (since the derivative of 1 minus cosine x1 is sine x1) plus x2 times x2 dot, and here I just plug in the dynamics for x1 dot and x2 dot. Now you can see that the cross terms cancel out, and I am left with minus sine squared x1 minus x2 squared, which is negative definite on (minus pi, pi) cross R. This is again not difficult to verify: since pi and minus pi are not included, sine x1 vanishes only at x1 equal to 0 in this interval. So the equilibrium is of course uniformly asymptotically stable, because V is also decrescent; decrescence is free, since no time appears here. Excellent.
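For the cancellation above to leave exactly minus sine squared x1 minus x2 squared, the modified dynamics sketched on the board are most plausibly x1 dot = x2 - sin(x1), x2 dot = -sin(x1) - x2; this choice of x1 dot is an assumption on my part (the transcript only says the original pendulum-like system was modified "for simplicity"). A symbolic check under that assumption:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')

# V from the lecture
V = 1 - sp.cos(x1) + x2**2 / 2

# ASSUMED modified dynamics (x1' = x2 - sin(x1) is my reading of the
# board; it is what makes the cross terms cancel as described)
f1 = x2 - sp.sin(x1)
f2 = -sp.sin(x1) - x2

# Derivative of V along the dynamics
Vdot = sp.diff(V, x1)*f1 + sp.diff(V, x2)*f2

# The sin(x1)*x2 cross terms cancel, leaving -sin(x1)**2 - x2**2
print(sp.simplify(Vdot + sp.sin(x1)**2 + x2**2))   # 0
```

Under this reading, V dot is indeed -sin(x1)^2 - x2^2, negative definite on (-pi, pi) x R but not globally, matching the lecture's conclusion.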
So this is an example where V is not radially unbounded, and therefore we cannot conclude global stability; having only local properties is in fact possible. There are a couple more properties which I will probably look at in the next lecture, because we won't have enough time now. What we have looked at today is the global stability property. We worked out the earlier missing example of asymptotic stability, which also turned out to be uniformly asymptotically stable, and in fact globally uniformly asymptotically stable. Finally, we looked at an example which was not globally uniformly asymptotically stable; it only had local uniform asymptotic stability. So we saw that this is also a possibility: we are not guaranteed global properties all the time, especially for nonlinear systems. Excellent. We will continue this discussion on stability next time, wrap up the Lyapunov theorems, and move forward. That is the plan. I will see you again next time. Thank you.