So let me share the screen so you can see my slides. Yes. Perfect. So hi, everyone. Thank you, Jacopo, for the introduction. Today and tomorrow we will be talking about nonlinear dynamics. So let me first show you how I have organized all the material for today's and tomorrow's tutorial, and then we can get started right away. I will start with a very simple and brief introduction just to remind ourselves what nonlinear systems are and why they are interesting, and I will also give you a few examples with a particular focus, of course, on ecology. Then I will show you how, for example, by using stream plots, we can actually understand the general behavior of the solutions of a nonlinear system without actually solving the equations analytically. Then we will be talking about stability in nonlinear systems. In particular, I will show you the two main tools that can be used to study the stability of equilibria in nonlinear differential equations. Of course, throughout all these topics, I will show you plenty of examples. We will also do a couple of very simple exercises, so we will get a practical sense of the topic, we will get our hands on it, let's say, and we can better understand what we're doing. Now, as Jacopo already told you, these tutorials shouldn't be just simple lectures; they are meant to be a moment of discussion. So please, at any time, if what I'm saying is not clear or if you have questions, please ask me. So if everything is clear for now, we can get started. So what we would like to do in general is, let's say, solve or understand something about nonlinear differential equations. Now, by nonlinear differential equations, we basically mean an equation or a system of equations that looks like this, where this function f here is any nonlinear function. So these, for example, are all simple cases. They can be in one dimension or in two dimensions, where this function is nonlinear.
So these are all simple examples of a nonlinear system. And they are interesting, and we want to be able to understand something about them, basically because any interesting phenomenon in nature is described by a nonlinear differential equation. Let's see some examples. One of the, I think, simplest but also most overlooked physical systems that is actually described by a nonlinear differential equation is the pendulum. Now, you may be used from introductory physics courses to seeing this equation with sine of theta approximated by just theta. But that is only an approximation of the true pendulum equation, which is actually a nonlinear one. Another important case of nonlinear systems are fluids, because fluid dynamics is governed by the Navier-Stokes equations, which in general are nonlinear. For example, this is the equation for an incompressible fluid, where u here is the velocity field of the fluid. And this term, for example, we can see is not linear. So again, fluids are an important example of a nonlinear system. Now, I want to introduce two examples that we will consider over and over again in these tutorials, and that are actually relevant for ecology. The first one is the logistic growth equation. You may already have seen it somewhere at some point in your life. But basically, this equation simply describes the growth of a population x in a system with, let's say, a limited amount of resources. Now, of course, I'm going to show you this in more detail in a few slides. But basically, you see that if we didn't have this term here, this equation would basically be a simple linear differential equation, and so the population x here would grow exponentially without limit. But this term, as I will show you later, basically makes it impossible for the population to be larger than k. So this system has a maximum population, let's say, that can be sustained.
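As a side note, the saturation behavior just described is easy to check numerically. This is a minimal sketch, not part of the tutorial itself, using forward-Euler integration of the logistic equation x' = r x (1 - x/k); the values r = 1, k = 10 and the initial conditions are illustrative choices.

```python
# Minimal sketch (not from the tutorial): forward-Euler integration of the
# logistic equation x' = r*x*(1 - x/k). The values r = 1, k = 10 and the
# initial conditions below are illustrative choices.

def logistic_rhs(x, r=1.0, k=10.0):
    """Right-hand side of the logistic equation."""
    return r * x * (1.0 - x / k)

def integrate(x0, t_end=30.0, dt=0.01):
    """Integrate the logistic equation from x(0) = x0 with forward Euler."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * logistic_rhs(x)
    return x

# Starting below k, the population grows and saturates towards k;
# starting above k, it decays back towards k.
print(integrate(0.1), integrate(25.0))  # both close to k = 10
```

Either way the population ends up at k, which is exactly the "maximum sustainable population" behavior described above.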
The other system that is relevant for ecology is the Lotka-Volterra equations. Again, this is a very famous system, which basically describes the dynamics of the populations of a prey, which is x here, and a predator, which is y here. And these parameters basically measure the interactions between these two populations. You see that in both cases, here and here, the functions that define our differential equations are nonlinear. So these are indeed nonlinear systems. OK, so what we would like to do, in an ideal world, with nonlinear differential equations would be to solve them analytically. But unfortunately, this is almost never possible, because we don't have a theorem or a general recipe, let's say, that can give us a direction on how to solve any given nonlinear system. So it is reasonable to ask ourselves if and how we can understand something about nonlinear systems without actually solving the equations. Now, what people generally want to do with nonlinear systems is study their equilibria. I'm going to introduce definitions that are probably very familiar to you, but I just want to first refresh their meaning. And then I also want to build a small vocabulary on nonlinear systems that can be useful throughout this course. Now, in general, if we have a general nonlinear system like this, a given point x star is said to be an equilibrium of the system if the function that defines our equations here is equal to 0. You see that this basically means that the equilibria of a system are the points where the variable doesn't change, because if x is equal to x star, then f is 0, x dot is 0. And so the variable will not change once it is equal to this equilibrium here. Now, what people generally want to do with equilibria is understand if they are stable or unstable. So we need some notion of stability and instability to study them. So let me introduce some very informal definitions of stability and instability.
I will make these definitions a little bit more formal in a few slides, but I just want you to get an intuitive idea of what they mean. So in general, an equilibrium is said to be stable if any solution of the system that starts with an initial condition that is close enough to the equilibrium will always remain close to x star. Now, of course, this notion of closeness, of close enough, is anything but rigorous, but still I just want you to get the intuitive idea for now. On the other hand, an equilibrium is said to be unstable if it is not stable. So if we can find solutions of the equations that start close to the equilibrium and eventually go away from it. Now, in light of these concepts, we can basically reformulate our initial question as follows. Can we understand something about the stability of the equilibria of a nonlinear system without actually solving its equations? The answer, of course, is yes. There are several tools that we can use. And one of the simplest but also quite effective ones that we can use in this direction is drawing stream plots, which basically means drawing the trajectories of the system in the state space. Now, I think the easiest way to understand how stream plots work is to see how they are done practically. So let's start with a simple example. Assume we are given this differential equation, which is, of course, nonlinear, because this function here is a simple cubic function. So let's draw it. This function basically looks like this. We can also factorize the expression in this way. So we can see very easily that the system has three equilibria in this case, which are the three points where this function is equal to 0. In particular, in this case, they are minus 1, 0, and 2. Now, the basic principle behind drawing stream plots is the following. Wherever this function f here is positive, x dot will be positive.
So the solutions that start from points where f is positive will be characterized by the fact that x is increasing. So for example, if we consider this interval here between minus 1 and 0, in this interval the function f is positive. So any solution that starts from these points here will be characterized by the fact that x is increasing. And the same will happen, for example, in this interval here. On the other hand, wherever f is negative, x dot will be negative, and so the solutions will be characterized by the fact that x is decreasing. So if we start from any point in this interval here, the solutions will actually go towards the left, and the same here. So in the end, what we can draw are these trajectories. This is the stream plot of our system. And so you can see that by doing this very, very simple drawing, we can already guess which equilibria are stable and which are unstable. In particular, we can guess that this equilibrium here is unstable, because you see that if we take any initial condition that is slightly larger than 2, this solution basically increases without limit. And something similar happens here if we take an initial condition that is slightly lower than minus 1. On the contrary, we see that the solutions here are actually going towards the equilibrium in 0. So we can guess that this equilibrium is stable. Let's see another example. Is everything clear here? OK. I think it's a good time to ask a question if you have one. Please remember that this is a tutorial. So this is really meant to have everyone on the same page on the topic. So if something isn't clear, don't hesitate to ask a question. OK? OK. So let's see basically the same thing applied to a different case. So let's consider this equation: x dot equal cosine of x. In this case, we know that our function here looks like this.
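The sign-checking principle just described can be sketched in a few lines of code. The cubic is not written out explicitly here, so the sketch below assumes f(x) = x(x + 1)(x - 2), one choice consistent with the equilibria at minus 1, 0, and 2 and with the signs described above.

```python
# Sketch of the sign-checking principle above. The cubic is not written out
# in the lecture; f(x) = x*(x + 1)*(x - 2) is one choice (an assumption)
# consistent with equilibria at -1, 0, 2 and with the signs described.

def f(x):
    return x * (x + 1.0) * (x - 2.0)

def classify(x_star, eps=1e-3):
    """An equilibrium is stable when trajectories move towards it from both
    sides: f > 0 just below it (x increasing) and f < 0 just above it."""
    left, right = f(x_star - eps), f(x_star + eps)
    if left > 0 and right < 0:
        return "stable"
    if left < 0 and right > 0:
        return "unstable"
    return "semi-stable"

for eq in (-1.0, 0.0, 2.0):
    print(eq, classify(eq))  # -1 unstable, 0 stable, 2 unstable
```

This is exactly the reasoning of the stream plot, just automated: look at the sign of f on either side of each zero.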
So we will have basically an equilibrium at every positive and negative odd multiple of pi halves, because these are the zeros of the cosine function. Then, if we apply the same principle as before, we will have, for example, that in this interval here, in this interval here, and in this interval here, the function is positive. So the solutions will go towards the right. On the other hand, in this interval here and in this interval here, the function is negative, and so the solutions will be going towards the left. So in the end, the stream plot that we can draw of this system looks like this. So you see that in this case, we basically have an alternation of stable and unstable equilibria in our state space, which, of course, is the x-axis here, because this is a one-dimensional system; we only have x as a variable. Let me now show you basically how we can use these stream plots in cases that are ecologically relevant. The first thing I want to show you is the stream plot of the logistic equation. Now, actually, this is one of the few lucky cases where we can actually solve an equation analytically. So I hope that with this example, by comparing what we see with the stream plots to what we see with the analytical solution, it will be clear that stream plots can actually help us understand the general behavior of the solutions of a nonlinear system. So first, let me show you how this equation can be solved analytically. Let me write the equation here on the whiteboard. OK. So to solve this equation, I'm going to use some physicist's tricks. So if you are a mathematician, I apologize, because you will probably be horrified by what I'm going to do. But basically, what we can do in this case is separate the variables. So we bring everything that depends on x to one side and everything that depends on t to the other. I'm sorry, this whiteboard is a little bit slow, but I hope you can see everything.
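The same sign check applied to x' = cos(x) shows the alternation of stable and unstable equilibria just mentioned. This is a minimal sketch; the window of equilibria (n from -2 to 2) is an arbitrary choice.

```python
import math

# Sketch: the equilibria of x' = cos(x) sit at the zeros of cosine,
# x = pi/2 + n*pi. Checking the sign of cos just left and right of each
# zero shows the alternation of stable and unstable equilibria.
# The window n = -2..2 is an arbitrary choice.

def classify(x_star, eps=1e-3):
    """Stable if the flow points towards x_star from both sides."""
    left, right = math.cos(x_star - eps), math.cos(x_star + eps)
    return "stable" if (left > 0 and right < 0) else "unstable"

equilibria = [math.pi / 2 + n * math.pi for n in range(-2, 3)]
print([classify(x) for x in equilibria])
# ['stable', 'unstable', 'stable', 'unstable', 'stable']
```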
So once we have done this, we can actually decompose this fraction here into two terms. So we can write this as dx over x plus dx over k over 1 minus x over k. You see that if we simply add these two fractions, we get exactly this. So we have this equation. But these are very simple terms to integrate, because, for example, this one will be the logarithm of x plus a constant. This one will be minus the logarithm of 1 minus x over k plus a constant. And this here will be simply rt plus a constant. So in the end, we can rewrite everything as logarithm of x over 1 minus x over k equal to rt plus c, which basically means that x over 1 minus x over k is the exponential of rt plus c. Now, of course, we have to determine the value of this constant, so we can evaluate this expression for t equal to 0. And by simply rearranging the expression that we get in this way, the final analytical solution of the logistic equation looks like this, where x0 is, of course, the initial condition on x. Now, you see that here we have an exponential with a negative exponent, because remember that both r and k are positive parameters. So as time passes, as t becomes larger and larger, we can neglect this term here. Then x0 and x0 cancel out, and so we can see that the solutions of this equation tend towards the value k. The only case in which this doesn't happen is when the initial condition is 0, because you see that in this case, if x0 is equal to 0, the numerator is 0. Is there a question? Yes? Could you explain what r and x and k are? Yes, of course. Basically, the biological meaning of r is the growth rate of the species. You see that if we didn't have, let me write it down, if we didn't have the quadratic term, so if we had only this equation here, the solution of this equation would simply be an exponential. So in this case, we would have a population growing exponentially.
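For reference, the separation-of-variables computation described above can be written out as follows, with the same symbols r, k, x0 as in the lecture:

```latex
\frac{dx}{x\left(1 - x/k\right)} = r\,dt
\quad\Longrightarrow\quad
\frac{dx}{x} + \frac{dx/k}{1 - x/k} = r\,dt
\quad\Longrightarrow\quad
\ln\frac{x}{1 - x/k} = rt + c .

% Exponentiating and fixing the constant c with x(0) = x_0 gives:
x(t) = \frac{k\,x_0}{x_0 + (k - x_0)\,e^{-rt}}
\;\xrightarrow[\;t \to \infty\;]{}\;
\frac{k\,x_0}{x_0} = k
\qquad (x_0 \neq 0).
```

Here e^{-rt} is the negative-exponent term that can be neglected at large times; the x0 in the numerator then cancels with the x0 in the denominator, leaving k.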
The biological meaning, on the other hand, of this other parameter is going to be clear in a few seconds. But basically, this is the maximum population that the system can sustain. Let me draw basically how the solutions look. So for example, if we start from x0 equal to 0, as I just told you, the solution is constantly equal to 0. This is a trivial case, because it basically describes a system with no population. Something similar happens also when x0 is equal to k, because you see that here we would have k minus k, so this is always equal to 0, and again this and this cancel out. But then, if we take any initial condition between these two values, this function basically behaves like this. So we have a population that grows. And the initial growth here is actually an exponential growth, because you see that when x is very small, we can neglect the quadratic term. So at the very beginning, the growth is well approximated by an exponential. But eventually, this population saturates towards k. On the other hand, if the initial condition is larger than k, what happens is that the population quickly decays towards k. So basically, the meaning of this parameter is the maximum population that the system can sustain. The idea behind the logistic growth equation is that the population here, which could be microorganisms, could be animals, could be anything, is in an environment with limited resources. So at a certain point, the system will reach a population that is not sustainable anymore. And so x cannot grow larger than k. And even if x is driven to a value larger than k, then it quickly decays back to k. Does that answer your question? I mean, was I clear enough? Yeah, yeah. OK.
So the fact that k here is basically the maximum population of the system is also why this parameter is called the carrying capacity of the system. So now we know how these solutions work. In particular, if we look at what is happening in the state space along this axis, you see that these solutions here are basically going towards k. And in the same way, these solutions here are going down towards k. So let's see if we could have understood this general behavior already from the stream plot. So again, this is our equation. The nonlinear function that describes the system now is simply a parabola, and so we can draw it. Now, of course, here I am not considering the negative part of the state space, simply because this variable here represents a population, so it makes sense only when it is non-negative. So this is the shape of this function. We see immediately that we have two equilibria, in 0 and k. So now let's apply what I've shown you before for stream plots. In this interval here, the function f is positive, and so the trajectories of the solutions will go from 0 towards k. On the other hand, in this other part of the state space, the function here is negative, and so the solutions will move again towards k. So in the end, the stream plot of the system is like this, from which we can also guess that this equilibrium here is unstable, because the solutions are moving away from it, while this equilibrium here is stable. So you see that if we look at what is happening in the state space, we are actually recovering the same behavior. Of course, by using the stream plots, we can't say, for example, how quickly these trajectories are moving or the exact way in which this is happening. We couldn't guess, for example, that there is an exponential, as I've shown you before, that regulates the speed of this motion. But again, without even trying to find the analytical solutions, we can see how the solutions are moving in general.
Now, if everything is clear here, so if there are no further questions... there is a question. Yes, of course, of course. What is the meaning of unstable and also stable? By stable and unstable, in this case, I mean the very informal definitions that I gave a few slides ago. So unstable, in this case, means that the solutions are moving away from the equilibrium. Because you see that in this case, if you pick any initial condition that is slightly larger than zero, the solutions will eventually go away from this equilibrium. While, on the other hand, a stable equilibrium means that the solutions remain close to the equilibrium. So in this particular case, for k, for example, we have that the solutions are actually going towards the equilibrium. But this is not necessary for the equilibrium to be stable. So this is a particular case of a stable equilibrium. In general, for an equilibrium to be stable, we just need the solutions to remain close to it. Does that answer your question? Yeah, thank you, thank you. OK, no problem. So I hope it is clear with this example that what we are seeing with the stream plot actually makes sense if we compare it to the analytical solution. Can I ask? Yes, of course. So ecologically, it means that K is the point where it achieves equilibrium, right? So there's no resource limitation, we can say, for the growth. No, the resource limitation is in the fact that this population is not growing exponentially. If we had a system with an unlimited amount of resources, basically here we would have only an exponential growth with a fixed growth rate. OK, OK. What happens in this case is that K is the maximum population that can be sustained by the system, meaning that if for any reason the population grows larger than K, there are not enough resources to sustain that population. So you see that eventually the population will go back to K. OK. So in this sense, there is a resource limitation. OK. OK.
So what we can do now is basically apply the same thing, so we can try to draw the stream plot in the case of the Lotka-Volterra system. So these are our equations in this case, and this is our state space. Again, here I am considering only positive values for x and y, because they represent populations. So it doesn't really make sense to look at the other quadrants of the state space here. So if we look at these equations here, we can find very easily two equilibria, which are the origin and this non-trivial equilibrium. Now, for the origin it is very easy to see why it is an equilibrium, because if x and y are both equal to 0, then both x dot and y dot here are 0. The other, non-trivial equilibrium is also very easy. Let me show you. For example, we have that x dot is equal to alpha x minus beta xy. So if we want this to be equal to 0, we can rewrite this as x times alpha minus beta y equal to 0. So if x is not 0, we simply have that y is alpha over beta. If we do this exact thing with the other equation, we get that the equilibrium in the end is this point here. Now let's try to draw the stream plot. The first thing that we can do is see how the system behaves on this axis here. So for example, if we take an initial condition where x0 is equal to 0, so we start from the y-axis here, you see that... there is a question. Sorry. Am I having a problem with the audio? No. So yeah, I think I can hear you. Yes. Can you hear me? Can you hear me? OK. OK. What if the k is not constant? So what do we do? I'm sorry, I'm not hearing you well. There is a little bit of background noise. I think the question was, what if the carrying capacity is not constant? What if the carrying capacity here? OK, you mean here. Well, in this case, we would have basically a different system. k could be a function of x if you want, or a function of any other variable. But it would simply be a different system. It would require a different stream plot.
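A quick numerical sanity check of the two equilibria just computed; this is my own sketch, the parameter values alpha = 1, beta = 0.5, gamma = 0.8, delta = 0.4 are arbitrary illustrative choices, and the sign conventions follow the equations as stated above.

```python
# Quick sanity check (my own sketch) of the two equilibria of the
# Lotka-Volterra system  x' = alpha*x - beta*x*y,  y' = delta*x*y - gamma*y.
# The parameter values are arbitrary illustrative choices.

alpha, beta, gamma, delta = 1.0, 0.5, 0.8, 0.4

def rhs(x, y):
    """Right-hand side (x_dot, y_dot) of the Lotka-Volterra equations."""
    return (alpha * x - beta * x * y, delta * x * y - gamma * y)

# The origin and the non-trivial equilibrium (gamma/delta, alpha/beta):
for x_eq, y_eq in [(0.0, 0.0), (gamma / delta, alpha / beta)]:
    print((x_eq, y_eq), rhs(x_eq, y_eq))  # both derivatives vanish
```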
So we can use the same tools, but it would be simply a different system. OK. So where were we? OK, so if we take an initial condition, for example, that is on the y-axis here for the Lotka-Volterra system, you see that our system reduces to this. So the solutions will basically decay exponentially towards the origin. Remember that all these parameters here are positive, so minus gamma y is negative. On the other hand, if we take any solution that starts on the x-axis, so if y0 is equal to 0, we can write our system like this. And so the solutions will grow exponentially on the x-axis. So the first thing that we can draw of the stream plot are these trajectories here. And notice that this alone would be enough for us to guess that the origin here is an unstable equilibrium, because you see that along this direction, we have solutions that actually go away from the equilibrium. So by definition, this equilibrium here is unstable. So let's try to see what happens in the rest of the state space. What we can do is see exactly when the components of this function here are positive and when they are not. So for example, if we look at when x dot is positive, we have, again, as I've shown you before, that this must be positive. Now, we are not on the y-axis, so x is different from 0. And in the end, we get that y must be lower than alpha over beta. So we can divide the state space into two regions, one where x dot is negative and one where x dot is positive. Similarly, if we look at where y dot is positive, we get that this is true when x is larger than gamma over delta. So in the end, we can basically divide our state space into four regions. And for each of these, we know whether x dot and y dot are positive or negative. This basically means that we are able to see the general direction, let's say, towards which the solutions are pointing. For example, here, if x dot and y dot are both positive, it means that the solution is growing both in x and in y.
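The four-region sign analysis can be sketched as follows; again the parameter values are arbitrary illustrative choices, and the labels only record the general direction of the flow in each region.

```python
# Sketch of the four-region sign analysis above: away from the axes,
# x' > 0 iff y < alpha/beta, and y' > 0 iff x > gamma/delta.
# Parameter values are arbitrary illustrative choices, giving
# alpha/beta = 2 and gamma/delta = 2; one sample point per region
# shows the general direction the solutions point in.

alpha, beta, gamma, delta = 1.0, 0.5, 0.8, 0.4

def direction(x, y):
    """General direction of the flow at the point (x, y)."""
    xdot = alpha * x - beta * x * y
    ydot = delta * x * y - gamma * y
    return ("right" if xdot > 0 else "left",
            "up" if ydot > 0 else "down")

print(direction(1.0, 1.0))  # ('right', 'down')
print(direction(3.0, 1.0))  # ('right', 'up')
print(direction(3.0, 3.0))  # ('left', 'up')
print(direction(1.0, 3.0))  # ('left', 'down')
# Reading the four regions in order traces out the rotation of the
# solutions around the non-trivial equilibrium.
```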
So in general, the solution is pointing in this direction. And this is true also for these other areas. So you see that we haven't even tried to solve the equations, but we can already guess that the solutions of the Lotka-Volterra system oscillate around this equilibrium. Of course, we don't know exactly how this happens, but we know that it is happening. Is that clear? Are there questions? OK. So I hope it is. Sorry, I was muted. So I hope it is clear now that stream plots are actually useful to understand the general behavior of the solutions of a nonlinear system. But of course, the power of this approach is limited, because we can't always understand something about the stability of the equilibria. For example, if we consider this case here, we know that the solutions are oscillating, but we can't say anything about the stability of this equilibrium, because the solutions here could be spiraling towards the equilibrium, they could be spiraling away from it, or we could have any other kind of behavior. In this case, we can't say anything about the stability of the equilibrium. So how can we study in general the stability of equilibria in nonlinear systems? There are two main tools that can be used in this direction, which are Lyapunov functions and spectral analysis, which is also known simply as linearization. Now, is everything clear up to now? OK. So before I go on and talk about how to use Lyapunov functions, I want to introduce the formal version of the definitions that I gave you before about stable and unstable equilibria. So we consider a generic nonlinear differential equation, and assume that we know x star is an equilibrium. Now, using the language of mathematicians, and I'm not a mathematician, so I won't be very formal, I just want to let you know what the formal definition of stability is.
So this equilibrium, in the language of mathematicians, is said to be stable if for every neighborhood A of x star, there is a neighborhood B in A such that the solutions starting from points in B will always remain in A. Now, this is just the formal way to say what I told you before, which is that an equilibrium is stable if every solution starting close to it always remains close to it, where the notion of closeness in the language of mathematicians is given by the use of neighborhoods. So let me show you graphically, because I think it's easier to understand this way. Assume we have an equilibrium here. Now, this equilibrium will be stable if for every choice that we can make of a neighborhood A, so a set in the state space that contains this point, we can always find a smaller set B such that if we pick any point inside here and use it as the initial condition of our nonlinear system, the solution starting from here can move around all it wants, but it will never go out of A. So it will always remain close to the equilibrium in this sense. Now, I want to introduce a couple of other definitions of stability, because this way we can build this vocabulary that will be useful throughout the school. Now, if this definition of stability is true not only, let's say, in the future, so looking at how the solution behaves for positive times, but also for negative times, we say that this equilibrium here is not only stable, but stable at all times. On the other hand, if this solution here, instead of going around this set A here, at a certain point moves towards the equilibrium, so if the limit of the solutions is the equilibrium, we say that the equilibrium is asymptotically stable. So the difference between a simply stable equilibrium and an asymptotically stable one is that in the asymptotically stable case, we know that the solutions are moving towards the equilibrium.
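In symbols, the definitions just given can be written as follows; the metric form with epsilon and delta is an equivalent restatement that I add here for reference:

```latex
% Stable (Lyapunov): for every neighborhood A of x^*, there is a
% neighborhood B \subseteq A of x^* such that
x(0) \in B \;\Longrightarrow\; x(t) \in A \quad \forall\, t \ge 0 .

% Equivalent metric form:
\forall\, \varepsilon > 0 \;\; \exists\, \delta > 0 :\quad
\|x(0) - x^*\| < \delta \;\Longrightarrow\; \|x(t) - x^*\| < \varepsilon
\quad \forall\, t \ge 0 .

% Asymptotically stable: stable, and in addition
\lim_{t \to \infty} x(t) = x^*
\quad \text{for } x(0) \text{ close enough to } x^* .
```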
While if an equilibrium is simply stable, this doesn't necessarily happen. Is this clear? Are there questions? Okay. So the other side of the coin is instability. So again, using the language of mathematicians, an equilibrium is said to be unstable if it is not stable. So if we can find at least one neighborhood of x star such that for any choice of a smaller neighborhood, there is always at least one initial condition such that its solution goes out of A. Which again, it's just... Yes? Yes? So, Shamolina, sorry for misspelling the name, you have raised your hand. Yeah, I have a question. I could not understand the difference between an asymptotically stable and a normal stable equilibrium. Of course. The difference basically is in the fact that if an equilibrium is asymptotically stable, it means that we know that the solution is actually going towards the equilibrium. Yes. While if an equilibrium is just stable, so stable but not asymptotically stable, then the solution is not going towards the equilibrium. It can be oscillating, it can be going around the equilibrium, but it's not actually going towards the equilibrium. Then why are you calling it stable? If you don't know its fate, I mean, ultimately when it will go towards that equilibrium, then we can call it stable, right? So in what sense are you calling it stable? Yeah, I'm calling it stable in the sense that the solution is not going away from the equilibrium. So the idea of a stable equilibrium is an equilibrium where the solutions always remain close to it. If on top of remaining close to it, the solutions are actually moving towards the equilibrium, then we call the equilibrium asymptotically stable. Does that answer your question? No, I actually don't know how I could differentiate them. Let's say I have a set of dynamical equations and I solve them, and say after a long period of time, my simulation shows that it is approaching the equilibrium. Then it's asymptotically stable.
But how could I know from the graphical simulation that it is stable? Because in my system, I don't know where the boundary of this B and A thing is. This is a mathematical definition. Yes, yes. In this case, I mean, of course it depends on the system you are considering, but I would say that you could see if the equilibrium is actually stable by using different initial conditions. So if, I don't know, you guess, for example, from your simulations that x star here has a particular value, you start sampling some solutions around this value. So for example, just to make things simpler, if we are in one dimension, let's say you guess from your simulation that x star is more or less equal to one. Then you could sample several points around one, so, I don't know, 0.75, 0.80, 0.85, then 1.05, 1.10, and see how these solutions behave. If you see, for example... Like starting from those initial conditions? Exactly, you take, I don't know, four, five, ten points close to the equilibrium and you see how the solutions starting from these points behave. If you see, for example, that all of them are going towards the equilibrium, then you have a good indication that the equilibrium is asymptotically stable. So going towards the equilibrium means, like, over how long a time, or is there no sense of time here? No, in this case, there is no time scale here. The mathematical requirement is that for t going to infinity, the solutions go to the equilibrium. So you don't have any kind of measure in this sense. Oh, then we are always limited to the asymptotically stable case. We cannot... That means we always see the asymptotically stable equilibrium, never the merely stable equilibrium. It depends. I'm going to show you some examples in this tutorial, but for example, if we have an oscillating system... Okay, so if we have, for example, I mean, I will show you the Lotka-Volterra system, or for example a pendulum. So assume this... Excuse me? Yes?
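The sampling procedure just described can be sketched like this. As an illustration (my own choice, not a system discussed at this point) I use the logistic equation with r = k = 1, whose equilibrium x* = 1 is asymptotically stable, together with the sample points mentioned in the discussion.

```python
# Sketch of the numerical probe described above: sample initial conditions
# around a guessed equilibrium, integrate forward, and see whether all of
# them approach it. As an illustration I use the logistic equation with
# r = k = 1 (my own choice), whose equilibrium x* = 1 is asymptotically
# stable, and the sample points mentioned in the discussion.

def flow(x0, t_end=20.0, dt=0.01):
    """Forward-Euler integration of x' = x*(1 - x) from x(0) = x0."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * x * (1.0 - x)
    return x

samples = [0.75, 0.80, 0.85, 1.05, 1.10]
finals = [flow(x0) for x0 in samples]
print(all(abs(x - 1.0) < 1e-4 for x in finals))  # True: all converge to x* = 1
```

If instead some of the sampled solutions stayed nearby without approaching, that would suggest stability without asymptotic stability; if any ran away, instability.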
Leonardo, can you hear me? Yes, I can hear you. On the last slide, can I go back to it? Yes. Yeah, so you have written that x star is stable not only for all t greater than or equal to zero, but for all t in R, right? But time, for the sake of its definition, is supposed to be contained in R plus, right? Yes, I mean, physically, yes; mathematically, not necessarily. Meaning that... Let me write it this way. If you have a system like this, so x dot of t equal to f of x of t, you would say that I can solve this for t larger than zero. But then I could define, I don't know, tau equal to minus t and solve this system here, x dot of tau equal to minus f of x of tau. This is another nonlinear differential equation which can be solved, but when tau here is larger than zero, t is lower than zero. I don't know if I'm being clear enough. So in any case... It is supposed to be a transformation on t, but t itself cannot be negative. Yes, if you want, I mean, mathematically, yes. If you want, I can replace what I'm saying here by exactly what I've written here. So if you can say that the equilibrium is stable for t larger than zero, and that it is also stable if you make this change of variables, then it is stable at all times. Okay, thank you. No problem. So about the question I was answering before: for example, if we have... This is our system, this is our equilibrium. So for example, if we have a solution that goes in circles like this, this is, for example, what happens for the pendulum equation when we approximate it for very small angles. So you see that in this case, the solution is going around the equilibrium, but it's never getting closer to it or farther from it. So in this case, the equilibrium is not asymptotically stable, because the solution is not... Now I understand, the periodic orbit.
Yeah, because the solution is not doing something like that, so the equilibrium is not asymptotically stable, but still the solution is not going away from the equilibrium. Yes, yes, yes. Okay, is that clear? Yeah, thank you. So... Just another point, Leonardo. Yes, yes. So actually, in linear stability analysis, you are always constrained to stay near these stable or unstable states. Yes. Never far from them. And then you have to invoke large deviations, right? If you go far enough, you have to invoke large deviations. Yes, it depends on the system you are studying, and I'm going to show you something about it, but please continue. Please. So again, whenever you are applying these analytical results to conclude something, you are always close enough to your steady states. Yes. So this asymptotic stability always matters. Yes, thank you. This is, we can say, the price that we have to pay for not being able to solve the equations analytically in general. Of course, if we were able to solve nonlinear differential equations analytically, we would be able to say everything globally, basically, about an equilibrium; but since we cannot do that, we have to restrict ourselves to the points of the state space which are close to the equilibrium. Does that answer your question? Yeah, please go ahead. Okay, thank you. So I was saying, the other side of the coin is instability. Graphically, what happens is what I was saying before: an equilibrium will be unstable if I can find some initial conditions for which the solutions eventually go away from the equilibrium. Here, the idea of "going away" is given by neighborhoods. So if there are no further questions — what time is it? Okay, we still have 10 minutes, more or less. If there are no other questions, I can start introducing Lyapunov functions.
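The neighborhood idea of instability just mentioned can also be checked numerically. A minimal sketch, under assumed choices: the system dx/dt = x (with unstable equilibrium x* = 0), the neighborhood radius, and the perturbation sizes are all hypothetical.

```python
# Numerical illustration of the instability definition: for dx/dt = x the
# origin is an equilibrium, but solutions started at any nonzero point
# eventually leave a fixed neighborhood of it.  The neighborhood radius
# and step size are arbitrary choices for the example.

def escapes(x0, radius=0.5, dt=0.01, max_steps=10_000):
    """Forward-Euler integration; True if |x(t)| ever exceeds radius."""
    x = x0
    for _ in range(max_steps):
        x += dt * x          # dx/dt = x
        if abs(x) > radius:
            return True
    return False

# Even very small perturbations away from x* = 0 escape the neighborhood,
# so the equilibrium is unstable; the exact equilibrium itself stays put.
print(escapes(1e-6), escapes(-1e-6), escapes(0.0))  # expect True True False
```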
So okay, one of the tools that I anticipated we can use to study the stability of equilibria in a nonlinear system is Lyapunov functions. Let me show you the basic principle behind using Lyapunov functions, so that things will be clearer. Let's assume we have a nonlinear system like this, and we know that x* is an equilibrium. Here I am drawing this in one dimension simply because it's easier, but what I'm saying is true in any number of dimensions. Assume now that we are somehow able — I'm going to explain how in a few slides — to define, in a neighborhood of this point, a function W that has a minimum at x*, at the equilibrium. Now, if we had the solutions of the system and we computed the value of this function along the trajectories of the system, we could find, for example, that this function is decreasing along the trajectories. So we would be in a situation like this. Again, let me repeat: we define this function in the proximity of an equilibrium, then we take the trajectories of the system and we compute the function along the trajectories — I'm going to explain, of course, how we can do that. If, for example, we find that this function is decreasing along the trajectories, so the time derivative of this function is negative, then we are in a situation like this. You see that this necessarily means our solutions are moving towards the equilibrium, and this happens because we know that the function has a minimum at x*. So in this case the equilibrium would be asymptotically stable, because we know that the solutions are going towards it. On the other hand, if we find that this function is increasing along the trajectories, so if the time derivative of this function is strictly positive, then we are in a situation like this.
And so you see that in this case, necessarily, the solutions are going away from the equilibrium. So in this case we could say that the equilibrium is unstable. Is that clear? Okay. So now what I've... Yes? A question from Mgaka. Hello. Are you approaching from just one side? Yeah, of course I'm showing you just one side, but if you mirror this image, you see that the same is true also from the other side. If we are in this situation — sorry — so if the value of the function is decreasing, you see that if I started here... let me draw that, probably this will be a little clearer. So if this is my situation, this is my equilibrium x*, and let's say that I can define this function here — let me draw it a little better. You see that if I start here, for example, and I find that the value of the function is decreasing along the trajectories, then I have exactly the same situation: the solution will be approaching the equilibrium from the other side. Does that answer your question? Okay, great. So, what I've told you so far is absolutely non-rigorous. What I've said can, of course, be made mathematically rigorous, and this is done by what is called Lyapunov's second theorem. This theorem basically states that if we are exactly in the situation I have just described — a general nonlinear system with an equilibrium, and this function defined with a minimum at the equilibrium — then, if the time derivative of this function along the trajectories is equal to zero, the theorem says that x*, the equilibrium, is stable at all times. So if this function is constant along the trajectories, the equilibrium is stable at all times. If the time derivative is non-positive, so it's either negative or equal to zero, the equilibrium is stable.
If it is strictly negative, the equilibrium is asymptotically stable. And if it is strictly positive, the equilibrium is unstable. Finally, a function like the one I have just introduced is simply called a Lyapunov function for this equilibrium. Now, we still have five or ten minutes, more or less. It's about 10 minutes. So perhaps we can see if there are more questions from the audience. Yeah, absolutely. It depends — because Leonardo is giving the second part of this tutorial tomorrow — it depends, Leonardo, on how you want to divide it into parts. But I would say, let's see if... Yeah, absolutely. I can stop here, no problem. Okay. So, are there questions, clarifications, something that is unclear? I have one question. Yes, please. My question is: in this case, the equilibrium is at a minimum. Is this property still useful when the equilibrium is at a maximum? Yes, you could write basically the same theorem when the function has a maximum at the equilibrium instead of a minimum. Of course, you would have to change the conditions accordingly, because if we have a function with a minimum at the equilibrium and we find, for example, that the time derivative is negative, then we are moving — let me draw this, sorry, probably it will be clearer. What I've just told you in the slides is basically that if we are in this situation — I have my equilibrium and this function with a minimum — then if the function decreases along the trajectories, we are moving towards the equilibrium, so the equilibrium is asymptotically stable. But if we decided to define the function differently, so that instead of a minimum it had a maximum at the equilibrium, then the situation looks a little bit like this.
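The way the theorem is used in practice can be sketched with a small example. The system dx/dt = -x³ and the candidate function W(x) = x²/2 are hypothetical choices made for illustration; the point is the sign check on dW/dt along trajectories.

```python
# Lyapunov's second theorem in practice: for dx/dt = -x**3 (a hypothetical
# example system) take the candidate Lyapunov function W(x) = x**2 / 2,
# which has a minimum at the equilibrium x* = 0.  Along trajectories,
#   dW/dt = W'(x) * f(x) = x * (-x**3) = -x**4,
# which is strictly negative for x != 0, so x* is asymptotically stable.

def f(x):
    """Right-hand side of the example system dx/dt = -x**3."""
    return -x**3

def dW_dt(x):
    """Time derivative of W(x) = x**2/2 along trajectories: x * f(x)."""
    return x * f(x)

# Check the strict-negativity condition on a grid of points around the
# equilibrium (excluding the equilibrium itself, where dW/dt = 0).
points = [x / 10.0 for x in range(-10, 11) if x != 0]
print(all(dW_dt(x) < 0 for x in points))  # expect True
```

Note that this example is one where linearization fails (the linearized system at x* = 0 is dx/dt = 0), so the Lyapunov approach genuinely adds information.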
You see that in this case the equilibrium would be asymptotically stable only if the value of the function increases along the trajectories, because then the function is increasing like this, so we are actually moving towards the equilibrium. So yes, we could very well do the exact same thing with a function that has a maximum at the equilibrium, provided that we change the definitions accordingly. Is that clear? Mohammed, is the answer clear? Yeah, it is clear. Then there is a question from Juan Jose. Please unmute yourself. Yes, thank you. I wanted to ask: when we are looking for an attractor of a chaotic system, with those trajectories that are closed in a region of space near the equilibrium points, are we talking about the same notion of equilibrium, or is it a bit different? You know, I would say that it is similar, but the notion of attractor is actually wider than a simple equilibrium, because by equilibrium we mean a specific point in the state space, while an attractor can be something more — it can be a different kind of set. For example, I don't know if you have ever heard of limit cycles, but there are some dynamical systems where we find — I'm just sketching an example — that there is, for example, this circle, and any solution that starts, for example, here, instead of tending towards a particular point, tends towards this trajectory. So the solution does something like this — sorry, I am terrible at drawing — but the idea is that the solution, instead of going towards a particular point, goes towards a trajectory.
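The limit-cycle picture just sketched can be illustrated with a standard textbook-style example, written here in polar coordinates. The specific system (dr/dt = r(1 - r²), dθ/dt = 1) is an assumed choice, not something from the slides; its attracting circle is r = 1.

```python
# A hypothetical system with a limit cycle, in polar coordinates:
#   dr/dt = r * (1 - r**2),   dtheta/dt = 1.
# Every trajectory with r(0) > 0 tends to the circle r = 1 rather than to
# a single point — the kind of attractor described above.  Since the
# angular equation is decoupled, it is enough to integrate the radius.

def integrate_r(r0, dt=0.01, steps=2000):
    """Forward-Euler integration of dr/dt = r * (1 - r**2)."""
    r = r0
    for _ in range(steps):
        r += dt * r * (1.0 - r * r)
    return r

# Trajectories starting inside and outside the circle both approach r = 1.
inner, outer = integrate_r(0.2), integrate_r(2.0)
print(round(inner, 3), round(outer, 3))  # both should be close to 1.0
```

As noted in the discussion, the intuition (solutions tending towards something) is the same as for equilibria, but the mathematical tools for studying such sets are quite different.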
Now, the intuitive idea is the same: the solution tends towards a trajectory instead of a point. But mathematically these things are very different, and there is a whole different set of tools that can be used to study them. Thank you very much. No problem. Next in line for asking a question was Ayan. Ayan? Yeah, I can hear you. Yes, yes. So I was just wondering: if we go from dynamical systems theory to studying ecological networks, the concept of equilibrium versus steady state — non-equilibrium is what you have when there is an exchange of energy, right? Yes. Which I guess is always the case in ecological networks. So the proper terminology there is maybe to use steady states, or unstable steady states, rather than equilibrium states. Yes, when we talk about thermodynamically open systems — systems with an input of matter or energy or other thermodynamic quantities, like ecosystems — technically we should say steady states and not equilibria, because there is no thermodynamic equilibrium. So you are right. Thank you. Okay, there was another question by Pablo Liacciano. Hello. So, when you're talking about not being able to solve the nonlinear system, and then, in order to talk about stability, you have to talk about stability in a region near the equilibrium — why is that? Is it because, if you don't know how to solve the system, you linearize it around the equilibrium? Yes, exactly. If we knew the full analytical solution, like in the case I showed you a few slides ago with the logistic equation — if we could find the analytic solution like this one here — we could say anything there is to say about the system. We would have the maximum information possible on the system.
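The logistic equation referred to here is one of the rare nonlinear systems where that "maximum information" is actually available. A minimal sketch, with arbitrary parameter values: the closed-form solution below is the standard one for dx/dt = r·x·(1 - x/K), and it can be used to check a numerical scheme.

```python
import math

# The logistic equation dx/dt = r*x*(1 - x/K) has the closed-form solution
#   x(t) = K*x0*exp(r*t) / (K + x0*(exp(r*t) - 1)).
# Parameter values (r = 1, K = 1, x0 = 0.1) are arbitrary illustrations.

def logistic_exact(t, x0, r=1.0, K=1.0):
    """Closed-form solution of the logistic equation at time t."""
    e = math.exp(r * t)
    return K * x0 * e / (K + x0 * (e - 1.0))

def logistic_euler(t_end, x0, r=1.0, K=1.0, dt=1e-4):
    """Forward-Euler integration up to t_end, for comparison."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * r * x * (1.0 - x / K)
    return x

# Having the exact solution means we can evaluate x(t) at any time directly
# — the "maximum information" mentioned above — and verify the numerics.
exact = logistic_exact(5.0, 0.1)
numeric = logistic_euler(5.0, 0.1)
print(abs(exact - numeric) < 1e-3)  # expect True
```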
But when this is not possible — and it is not possible almost always — we have to find ways to extract as much information as we can, which necessarily cannot be the full information, because we cannot solve the system. So we have to do our best, and often doing our best means studying the system close to equilibria. Did that answer your question? Yes, thank you. No problem. Great. Next in line is Debas Mita. Can you hear me? Yes. Hello. Yeah. So my question is about the construction of this Lyapunov function: whenever we change the model, I face problems constructing a Lyapunov function. Can you suggest some techniques, some basic techniques, for choosing what kind of Lyapunov function to try in order to test the stability of a particular system? Sorry, I lost the audio for a moment. Can you repeat the question? Yeah, I'm asking about the construction of the Lyapunov function for a given system. Yeah, this is the great problem of this approach. As you can see, it's actually very powerful, because it can tell us a lot about an equilibrium, but the problem with this approach is exactly building a Lyapunov function. In general, it is not easy at all to find a Lyapunov function for a generic system. We have to be either lucky, or we have to use some intuition to build the Lyapunov function. So there is always this kind of trade-off: we have a very powerful tool, but this powerful tool is not easy to use, and it's not possible to use it always. Okay, thank you. Then there was — okay, sorry, you were saying something, Debas Mita? Yeah, yeah, it's fine. Then there was a question by Samson. All right, thank you very much for giving me the opportunity to ask my question. First of all, I want to thank the presenter for this wonderful presentation. My question goes like this: when can we have global asymptotic stability? Sorry, I don't think I heard correctly because of some audio problems.
What was your question? My question is about when you are going to have global asymptotic stability. Well, it depends. A case where we can say globally that an equilibrium is asymptotically stable is, for example, systems with conserved quantities. For example, if we have a particle in a potential — a very simple physical system — the particle can move only on the x-axis, and we know that it is subject to a given energy potential V. And suppose we know that this potential is, for example, a parabola, defined on the whole x-axis, so we know globally how the potential behaves. From physics, we know that the equilibria of the system are the minima of the energy potential. In this case, since the potential is a parabola, we have only one minimum. And so we can say: okay, there is one stable equilibrium, and it is the only one in the system. So this is a lucky case in which we can say everything about the system globally, and not only locally. Did that answer your question? Oh, thank you. Yes? I think that's right. I'm satisfied. Oh, okay, thank you. So there is a question which is quite popular in the chat, and I think it could be the last question of today's session. The question is: is there any general principle guiding us in the construction of a Lyapunov function? No — this is the great problem of this approach. There is no general principle; we have to be lucky or smart. There is one big exception, though, which I'm going to talk about tomorrow: if we know, for example, that we have a system with a conserved quantity — if, for any reason, we know that there is, like I've shown you before, a potential energy or any other conserved quantity — then generally these conserved quantities are a good first guess for a Lyapunov function. This is not always true.
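The "lucky case" of a conserved quantity can be sketched concretely. The specific system below (a frictionless particle in the parabolic potential V(x) = x²/2, i.e. the harmonic oscillator dx/dt = v, dv/dt = -x) is an assumed example; the point is that the total energy is conserved and has a global minimum at the equilibrium, so it serves as a Lyapunov function everywhere, not only locally.

```python
# A particle on the x-axis in the parabolic potential V(x) = x**2 / 2:
#   dx/dt = v,   dv/dt = -V'(x) = -x.
# The total energy E = v**2/2 + V(x) is conserved, and it has a global
# minimum at the equilibrium (x, v) = (0, 0), so it works as a Lyapunov
# function on the whole plane.

def dE_dt(x, v):
    """dE/dt along trajectories: v * dv/dt + V'(x) * dx/dt = v*(-x) + x*v."""
    return v * (-x) + x * v

# dE/dt vanishes identically, so by the theorem stated earlier the
# equilibrium is stable — and globally so, since E is defined everywhere.
grid = [(x / 5.0, v / 5.0) for x in range(-5, 6) for v in range(-5, 6)]
print(all(abs(dE_dt(x, v)) < 1e-12 for x, v in grid))  # expect True
```

Note that with dE/dt = 0 the theorem gives stability, not asymptotic stability: the frictionless oscillator circles the equilibrium forever, exactly like the small-angle pendulum discussed earlier.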
Every time, we have to check that they are good Lyapunov functions, but generally, when we have conserved quantities, these can be good Lyapunov functions. If we don't have conserved quantities, then we are completely on our own, and we have to find them ourselves, which can be difficult or even outright impossible in some cases. Okay, great. So I would say that this is the perfect time to end today's session, today's tutorial. As a reminder, Leonardo will give the second part of this tutorial tomorrow at the same time. You can also rewatch the tutorial that Leonardo gave today on YouTube, as many times as you want — so if you need something and want to watch it again, please do. The next lecture, by Joshua Weitz, is starting in about 13 minutes. So what we're going to do now is split into randomly assigned breakout rooms. You are free to stay in the breakout rooms and chat with whoever you are randomly assigned to. You should also be able to switch rooms if you see someone you want to say hi to. I guess Leonardo is staying with us, so you can also chat with him. Yes, absolutely — more informally. But you are also free, of course, to stretch your legs, get a cup of coffee, or take a break from the meeting. So with that — since I'm not host — okay, yes, here we go. Okay, 20 breakout rooms, is that okay? Yes, okay, because there are 150 people. Yes. Okay. Can someone on the ICTP side, if you can hear me, let me know. Yes. I have answered you in the chat: your sharing is good, we can see your screen very well. So it's okay. I just remind you that we are now live on streaming. Yeah, no, I understand. Okay. On streaming. Hello, streaming people. Yeah, perfect. Great. Hello, do you need help? Breakout rooms — there is only one breakout room, room number 15, where I am assigned.
But can I switch rooms, or can I see the available rooms? You can leave the room. And then I go back to the main session, but I cannot see the other rooms or... If you want, I can move you. Do you have a breakout-rooms button at the bottom of your bar? The bottom bar? It may depend on the version of your Zoom client — you need at least version five. What is your Zoom version number? Do you want them to move you to another... Yes, actually, I want to ask a question to the person who was giving the lecture. I think he's out of the breakout room. So just leave the room. Oh, okay. Okay. Hey, Josh, can you hear me? Hi. Yes, I can hear you. Oh, great. Okay, so I was... Because we have the participants paired randomly in meeting groups, I was in one of those. Okay, that's fine. Sorry to find you like this. So how is it going? It's going fine. You have a big crowd. I think we're live on streaming, so... Okay. Yeah, just keep that in mind. Hello, everyone who's watching us have this chit-chat. I see. Okay. Great. Yeah. If you want to discuss anything before we start, we can go to a breakout room, but otherwise, I think I'm set now. I'll leave you to the coffee break. Okay, great. Thanks.