Welcome back. Last time we were discussing autonomous systems of first-order equations: we defined orbits and positive orbits through a given point, studied some simple properties of these orbits, and stated a calculus lemma which will be useful in the analysis of the simple examples we are going to do now. So let me continue with some more examples.

In this example I would like to indicate to you the significance of the equilibrium points. This is the third example, again in one dimension:

    x' = 1 + x².

Notice that the right-hand side 1 + x² is always ≥ 1. There is nothing special about this particular function: you can put x⁴, x⁶, or even 1 + sin²x; what I want is that the right-hand side is strictly positive, so that it does not vanish for any x. In other words, there are no equilibrium points, and I claim that for this particular equation no solution is bounded. The absence of equilibrium points forces this, and it is more or less true in general: in one dimension we can see it very easily, and even in higher dimensions I will mention examples where it is also true. In one dimension the argument is simple: once the right-hand side does not vanish for any x, it has to remain either positive or negative everywhere, so every solution is either always increasing or always decreasing; that is the reason no solution can be bounded. Let me give some different reasons as well — in fact, I will mention three methods to see why no solution is bounded — and you will see the power of simple calculus. I am not writing any solution in explicit form; in fact, I do not know whether I even can, if I take an arbitrary positive function on the right-hand side.
I do not know that, but simple calculus will tell me that no solution of that equation is bounded.

Method 1. This is just calculus, and we will also learn some analysis here. Look at the right-hand side: 1 + x² ≥ 1, so any solution x(t) of the given equation must satisfy x' = dx/dt ≥ 1 for all t. Though we cannot differentiate an inequality, fortunately we can integrate one. Integrating with respect to t gives

    x(t) ≥ t + c

for some constant c. Now look at the right-hand side t + c: as t → ∞ it goes to infinity, and the solution x(t) always stays above it. Therefore x(t) → ∞ as t → ∞; the solution is unbounded.

Now let me use the same idea for a second-order equation: x'' = 1 + x². Again there are no equilibrium points, and again the claim is that no solution is bounded, even though this is a second-order equation. Let me describe it in different ways. Method 1 again — simple calculus, and let me stress that point. You can try to solve the equation explicitly if you can; I do not know whether an explicit solution is possible, but as of now I am not interested in the explicit solution. I just want to show that any solution of this equation is unbounded. Write it as a system: introduce another variable y by x' = y; then y' = x'', and from the equation I get its value.
So with the new variable y the system is x' = y, and from the given equation y' = 1 + x². Now the previous example applies to y: y' ≥ 1, and again upon integration y(t) ≥ t + c. This already says that y(t) → ∞. In particular, y(t) ≥ 1 — not necessarily for all t, but for all t ≥ t* for some t*, which you can calculate from t + c by asking when it reaches 1. Now go back to the first equation and use y(t) ≥ 1: it gives x' ≥ 1 for t ≥ t*. Integrating from t* to t — so I work only in this range — we get x(t) ≥ t + c' for t ≥ t* and some constant c', and hence x(t) → ∞ as well; we already had that y(t) → ∞ as t → ∞. Remember that y is nothing but the first derivative of x, so in this case both x and x' go to infinity; both are unbounded. It is not necessary in general that both x and x' are unbounded — in some situations one of them can be bounded — but at least one component will be unbounded, and hence the orbits will be unbounded. This is the point we note: in the absence of equilibrium points, the solutions, and hence the orbits, will be unbounded.

Another method — method 3 — is to view the equation x'' = 1 + x² as a conservative system. We will have a detailed discussion on conservative systems; at that time you can come back to this example and analyse it using whatever methods we derive for conservative systems.
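Both calculus arguments above are easy to confirm numerically. Here is a small sketch (my own illustration, not from the lecture) integrating the two equations with Euler's method; the discrete iterates inherit the same lower bounds — each Euler step adds at least h, so the computed x(t) stays above t + x(0), exactly as the integrated inequality says.

```python
import math

def euler_scalar(x0, t_end, h=1e-5):
    """Euler's method for x' = 1 + x^2; each step adds at least h, so x(t) >= x0 + t."""
    x = x0
    for _ in range(int(t_end / h)):
        x += h * (1.0 + x * x)
    return x

def euler_system(x0, y0, t_end, h=1e-4):
    """Euler's method for x' = y, y' = 1 + x^2 (i.e. the equation x'' = 1 + x^2)."""
    x, y = x0, y0
    for _ in range(int(t_end / h)):
        x, y = x + h * y, y + h * (1.0 + x * x)
    return x, y

x1 = euler_scalar(0.0, 1.0)           # for this start the exact solution is tan(t)
x2, y2 = euler_system(0.0, 0.0, 1.5)  # both x and x' = y keep growing
```

The scalar run can be checked against the exact solution tan(t) of x' = 1 + x² with x(0) = 0, and in both runs the lower bounds from the lecture hold for the computed values.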
In general, an equation of the form x'' = f(x), where the right-hand side depends only on x — for example f(x) = −V'(x) for some function V, the differentiation being with respect to x — is called a conservative system. We will see this in detail later; I am just introducing some terminology, that is all. Notice that only x is involved on the right-hand side, so it is an autonomous system. When the right-hand side also contains x' — for example, an equation of the form x'' = f(x, x') — we get what are called dissipative systems. Dissipative systems are more difficult to handle than conservative systems; we will see them a little later. For example, you have seen in the discussion on problems that the Duffing equation and the van der Pol equation are examples of dissipative systems. We will see them in more detail later.

Let me now go to another example: the one-dimensional equation x' = sin x, a cousin of the pendulum equation. For this equation, let us write down the equilibrium points. The equilibrium points are those at which the right-hand side vanishes, that is, the zeros of sin x, and these are plenty: x̄ = nπ, all multiples of π, for every integer n = 0, ±1, ±2, …. So there are an infinite number of equilibrium points; remember that. Let me just draw these points on the line: 0, π, 2π — that is sufficient. The orbit passing through each of these points is just a singleton; each is itself an orbit. Any other orbit, if it starts between two equilibrium points, has to stay there — it cannot cross them. So in this case we will have an infinite number of orbits. Let us analyse the behaviour in one particular interval.
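Before the phase-line analysis, here is a quick numerical confirmation (my own sketch, not from the lecture) that these singleton orbits really are orbits: integrating x' = sin x starting exactly at an equilibrium point, the solution never moves.

```python
import math

def integrate(x0, t_end, h=0.01):
    """Euler's method for x' = sin x."""
    x = x0
    for _ in range(int(t_end / h)):
        x += h * math.sin(x)
    return x

at_zero = integrate(0.0, 20.0)     # start at the equilibrium x = 0
at_pi = integrate(math.pi, 20.0)   # start at the equilibrium x = pi
```

Starting at 0 the right-hand side is exactly zero at every step, and starting at π the residual floating-point value of sin(π) is far too small to move the iterate.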
Take the interval I_n = (nπ, (n+1)π) between two consecutive equilibrium points. I would like to see how an orbit starting in this open interval behaves — that is our phase-line analysis. So start an orbit through a point x₀ with x₀ ∈ I_n. Again go back to simple calculus: if you recall what we did in the other examples, we would like to see whether the solution is increasing or decreasing, and that will help us determine the direction of the orbit. Go back to the equation x' = sin x: if I can determine the sign of sin x, that determines the sign of x', and that gives the direction of the orbit. Now, sin x has different behaviour on these intervals: for x in I_n, sin x is positive if n is even and negative if n is odd — that you can easily check. So if n is odd, x' < 0 and the solution is decreasing, whereas if n is even, x' > 0 and the solution is increasing. You see, the behaviour of the solution depends entirely on whether sin x is positive or negative there. Now use the calculus lemma in either case — this is where that important result comes in. If n is odd, the solution is decreasing and bounded below by nπ: it cannot cross nπ, because two orbits cannot cross — one of the simple properties of orbits. So we have the following.
The limit of x(t) as t → ∞ exists, and again by one of the properties of orbits, when a solution has a limit, the limit has to be an equilibrium point. In this situation the only possibility is that it goes to nπ, because there are no other equilibrium points between nπ and x₀; the nearest equilibrium point below x₀ is nπ. Whereas if n is even, the solution is increasing and bounded above by (n+1)π, so it has to go to (n+1)π. The important thing you should notice here is that I do not have any explicit form of the solution. Just from the simple properties of orbits and this calculus lemma I am able to derive the behaviour of a solution starting in the interval I_n. I have no knowledge of the explicit form of the solution, but I have this qualitative behaviour — and that is the power of the qualitative theory. You should realise that: without explicit knowledge of the solution, we are able to derive, for example, these limits.

In this case you also have an explicit form of the solution, so let me state that as an exercise. It is a very simple equation, x' = sin x, and you might try separation of variables to derive an explicit formula for the solution. You can do it, but I would just like to caution you: when you try it you get an implicit relation involving the tangent, of the form tan(x/2) = something, and the task is then to invert the tangent. Since tan is a multi-valued function when inverted, you should exercise caution. So let x₀ = nπ + α with 0 < α < π, which means x₀ is in the open interval I_n. It is a good exercise to show that the solution of x' = sin x with this initial condition is the following.
With the initial condition x(0) = x₀ = nπ + α at t = 0 — and we have already observed the different behaviour of the solution depending on whether n is odd or even, so we can expect the same thing in the form of the solution — the solution is

    x(t) = nπ + 2 tan⁻¹(c e^(−t))  if n is odd,
    x(t) = nπ + 2 tan⁻¹(c e^t)     if n is even,

for all t, where the constant c is determined by α: c = tan(α/2). You have used tan⁻¹ many times from your plus-two level; just to recall, tan⁻¹ is a function from ℝ onto (−π/2, π/2), and it is one-one. So it is a very good exercise, and as and when we develop the concepts we will come back to these examples again and again.

Let me now mention one 2-d example which we have already considered many times. Here there are two functions; let me use x and y:

    x' = y,  y' = −x.

It is a very simple 2-d system. Here (0, 0) is the only equilibrium point: I want the right-hand side to be zero, and the first equation gives y = 0 while the second gives x = 0. Of course you can solve this system explicitly, but let us try to analyse the orbits without doing so. Look at d/dt (x² + y²) — this is a standard example which we will be seeing again and again. If x and y are solutions of the given system, then substituting x' = y and y' = −x from the equations,

    d/dt (x² + y²) = 2x x' + 2y y' = 2xy − 2xy = 0.

This is true for all t, so x² + y² is constant: whatever initial values x(0) = x₀ and y(0) = y₀ you give, we get x(t)² + y(t)² = x₀² + y₀².
So the orbits satisfy the simple equation x² + y² = x₀² + y₀², the equation of a circle centred at the origin with radius r₀ = √(x₀² + y₀²). It is now very easy to draw these orbits. If I take x₀ = y₀ = 0, I get the equilibrium point, namely the origin; if one of them is non-zero, then r₀ is positive and the orbit is the circle of radius r₀ around the origin. These are all the orbits of this simple system. Again we have to indicate the direction in which t is increasing: for this system the motion along the circles is clockwise, and if you change the signs — x' = −y, y' = x — it becomes counter-clockwise. Remember that whenever we draw orbits we have to indicate the direction in which the values of t are increasing. We will come back to these examples again as and when we introduce new concepts.

Next we are going to introduce the concept of stability. Let me begin with a simple game, or experiment, whatever you want to call it; all of us at some stage of our childhood have either done it or seen it. Take a bowl and place it the usual way, and also place one in an inverted way, and carefully place a marble at the bottom of the first and on top of the second. In the first case, if I move the marble a little bit this way or that way, it will have a few oscillations and come back to its original position — this is an example of stability. If I do the same for the marble on the inverted bowl and push it to either side, it will just fall off — this is unstable. Both are equilibrium positions, but one is stable and the other is unstable. So we are going to introduce the concept of stability of an equilibrium.
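The conservation of x² + y² for this system can also be confirmed numerically. Here is a small sketch (my own illustration, not from the lecture) integrating x' = y, y' = −x with a classical fourth-order Runge–Kutta step and checking that the computed orbit stays on its circle over one full revolution.

```python
import math

def rhs(x, y):
    return y, -x                      # the system x' = y, y' = -x

def rk4_step(x, y, h):
    """One classical RK4 step for the system above."""
    k1x, k1y = rhs(x, y)
    k2x, k2y = rhs(x + h / 2 * k1x, y + h / 2 * k1y)
    k3x, k3y = rhs(x + h / 2 * k2x, y + h / 2 * k2y)
    k4x, k4y = rhs(x + h * k3x, y + h * k3y)
    return (x + h / 6 * (k1x + 2 * k2x + 2 * k3x + k4x),
            y + h / 6 * (k1y + 2 * k2y + 2 * k3y + k4y))

x, y = 1.0, 0.5                       # initial point, so r0^2 = 1.25
r0_sq = x * x + y * y
h = 0.01
for _ in range(int(2 * math.pi / h)): # roughly one full revolution
    x, y = rk4_step(x, y, h)
drift = abs(x * x + y * y - r0_sq)    # should stay essentially zero
```

The drift in x² + y² after a full revolution is tiny (RK4 is very accurate here, though not exactly conservative), which matches the exact statement d/dt (x² + y²) = 0.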
At this point let me make some remarks. There are more general concepts of stability — stability of a solution, orbital stability, and even structural stability. Those are more advanced topics. Though we want to make even this introductory course a little advanced by including some topics beyond the usual M.Sc. syllabus — and this qualitative theory is one part of such an advanced treatment — in this course we just introduce the concept of stability of an equilibrium point. Equilibrium points are special solutions, as we have already learnt, so we concentrate only on the stability or instability of equilibrium points in this course. For more advanced things, once you learn this concept you can always refer to the more advanced texts listed in the references and learn more about them.

Before going further, one more remark, since we will now be talking more about equilibrium points; let me just spend a few minutes on this. Just as there are synonyms for orbits — trajectories, paths — there are many synonyms for equilibrium points: fixed point, rest point, stationary point, critical point, and so on. Different books may use different terminology, and you will find many more. One terminology used frequently in geometry, especially differential geometry, is singularity: there they call these equilibrium points singularities. Talking about geometry: the system (1) with which we started, x' = f(x) — we will just call it (1), because all our analysis is about that equation — has a right-hand side which in geometry is called a vector field. For example, in the 2-d case we have f₁(x₁, x₂) and f₂(x₁, x₂); these are given functions, and so at every point a direction is given.
So at each point of the plane a direction is given — that is the vector field — and given this vector field one would like to construct a curve having the given vector field as tangent; geometrically, the field is interpreted as the tangent. In this geometric terminology a solution, or solution curve, is called an integral curve. Now, regarding the terminology "singularity": when f₁ and f₂ are not both zero — at least one of them is non-zero — the pair (f₁, f₂) defines a unique direction, as you can see. But if both are zero, there is no direction; the direction is suddenly lost, and that is perhaps the reason why in geometry these equilibrium points are referred to as singularities. Since we are taking the viewpoint of dynamical systems, "equilibrium point", "stationary point", or even "critical point" is the appropriate terminology when you view the equation as a dynamical system.

With this simple remark, let me now introduce the definition of stability. The Russian mathematician Lyapunov is the one who contributed a great deal to this theory of stability; in fact, what we are going to study is Lyapunov stability, and everything in this lecture and perhaps the next is attributed to Lyapunov even where I do not write his name. Let x̄ be an equilibrium point of x' = f(x) — that is our autonomous system, which I again call (1) so that you always remember. Then x̄ is said to be stable (Lyapunov stable) — and this definition is quite close to the usual epsilon–delta definition of continuity of a function at a point; let me state it and then we see the geometry — if: given ε > 0, there exists δ > 0 such that for any solution x of (1),

    |x(t₀) − x̄| < δ  implies  |x(t) − x̄| < ε for all t ≥ t₀.

Here | · | is some new notation; let me just introduce it.
The notation | · | is just the Euclidean distance in ℝⁿ: if x ∈ ℝⁿ has components x₁, x₂, …, xₙ, then |x| denotes √(x₁² + ⋯ + xₙ²). So what does the definition say? Here is x̄; if at some initial time t₀ the solution is within a distance δ of x̄ — that is, |x(t₀) − x̄| < δ — then the requirement is that the orbit does not go beyond distance ε: for all future times t ≥ t₀, the solution stays within an ε-distance of x̄. The important thing is that we are discussing the stability of this given x̄.

Otherwise — if that condition is not satisfied — x̄ is said to be unstable. For this "otherwise" you just write the negation of the statement, and you must write it carefully: there exists ε > 0 such that for every δ > 0 there is a solution x of (1) with |x(t₀) − x̄| < δ but |x(t̃) − x̄| ≥ ε for some t̃ > t₀. Sometimes negations are difficult to write, but look at the logic carefully: what happens is that no matter how close I start to x̄, there is a later time at which the solution leaves the ε-ball. So there is an ε-ball around x̄, and the starting point x(t₀) is within distance δ — any δ, that is important, any δ.
You can start as close as you like, except that you should not start at x̄ itself — if x(t₀) = x̄, that is the equilibrium solution and you will stay there for all time; that I have to mention. So x(t₀) ≠ x̄, and then, starting there, eventually the solution comes out of the ε-ball at some time t̃; x(t̃) has left the ε-ball — that is what instability says.

In the remaining few minutes, let me just tell you what I am going to do in the next class. Next we are going to introduce the concept of asymptotic stability, again of an equilibrium point; it is slightly more than stability. Again let x̄ be an equilibrium point. Let me just say it in a few words and I will continue in the next class: x̄ is said to be asymptotically stable if it is stable — that is the first requirement — and, slightly more than that, there exists b > 0 such that any solution x of (1) with |x(t₀) − x̄| < b satisfies

    |x(t) − x̄| → 0 as t → ∞,

which is the same as saying lim x(t) = x̄. In order for these definitions to even have a good meaning, one additional thing has to be noticed: x̄ should be an isolated equilibrium point — a concept I will introduce — otherwise what we have stated is technically not correct, because if there are other equilibrium points arbitrarily near x̄, we cannot even formulate these definitions properly. I will say what an isolated equilibrium point is; then the definitions make very good sense, and then we will have some more examples and we will proceed.
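As a preview of such examples, here is a small numerical sketch (my own, not from the lecture) contrasting the two systems above: for x' = y, y' = −x the origin is stable — the orbits are circles, so given ε one can simply take δ = ε — while for x' = sin x the equilibrium x̄ = 0 is unstable, since a solution started at any small x₀ > 0 increases toward π and leaves any small ε-ball.

```python
import math

# Stability of the origin for x' = y, y' = -x: the explicit solution
#   x(t) = x0 cos t + y0 sin t,  y(t) = y0 cos t - x0 sin t
# stays at constant distance from the origin, so delta = epsilon works.
def dist_from_origin(x0, y0, t):
    x = x0 * math.cos(t) + y0 * math.sin(t)
    y = y0 * math.cos(t) - x0 * math.sin(t)
    return math.hypot(x, y)

delta = 0.1
max_dist = max(dist_from_origin(delta, 0.0, 0.1 * k) for k in range(200))

# Instability of x = 0 for x' = sin x: Euler integration from x0 = 0.001
# climbs toward pi, leaving (for example) the ball of radius 1 around 0.
x, h = 1e-3, 0.01
for _ in range(2000):                 # integrate up to t = 20
    x += h * math.sin(x)
```

The first check confirms that the distance from the origin never grows along the circular orbits; the second shows the solution of x' = sin x escaping any small neighbourhood of 0 despite starting only 0.001 away.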