Welcome back. We now have a complete picture of 2-by-2 linear systems: their stability analysis and their phase portraits. When we go to an n-dimensional system, things are not so easy, because there will be n eigenvalues: some will be simple, some will have different algebraic and geometric multiplicities, some will be distinct complex eigenvalues, and some will be complex eigenvalues of higher multiplicity. Accordingly, the decomposition of the matrix is much more complicated, but one can still make it linearly equivalent to another matrix built out of various types of blocks, a block way of diagonalizing it. In the 2-by-2 case we saw three types of block: the diagonal form [lambda, 0; 0, mu], the form [lambda, 1; 0, lambda], and the form [a, -b; b, a]. In higher dimensions we will have different blocks according to the multiplicities. We will appeal to the Jordan decomposition theorem, and you will see that there are essentially only two types of blocks; those two types can occur in different sizes and in different orders, and for each block computing the exponential is easy. That is precisely what we are going to do here. For example, when n = 3 you have three eigenvalues, and the only possibilities are that all three are real, or one is real and the other two are complex conjugates. A typical case: if you have three real eigenvalues with three independent eigenvectors, the matrix is equivalent to the diagonal block diag(lambda1, lambda2, lambda3).
The main issue is a possible lack of eigenvectors. With a repeated real eigenvalue you can get the 2-by-2 block [lambda, 1; 0, lambda] together with a separate eigenvalue lambda2, that is, a matrix of the form [lambda, 1, 0; 0, lambda, 0; 0, 0, lambda2]; you have already studied the 2-by-2 part of this. You can also get a full 3-by-3 Jordan block [lambda, 1, 0; 0, lambda, 1; 0, 0, lambda], with lambda on the whole diagonal and 1's on the superdiagonal. In higher dimensions it will be more complicated, with several blocks of different sizes and orders. These are the cases with real eigenvalues. With one real eigenvalue and one complex-conjugate pair you get a block of the form [a, -b, 0; b, a, 0; 0, 0, lambda]. For all of these you will see that you can compute e^{tB}. So here is an exercise to start with: for each of these block types B, work out all possible cases for the system x-dot = B x, see how the trajectories look, and sketch the phase portrait if possible. Let us start with an easy case: A already in diagonal form, A = diag(1, 1, -1), so the system is decoupled. The eigenvalues are lambda1 = 1, with algebraic multiplicity 2, and lambda2 = -1; luckily the matrix is already in that diagonal format.
Since the system is decoupled, you can write down the solution immediately: x1(t) = x01 e^t, x2(t) = x02 e^t, and x3(t) = x03 e^{-t}. The first two components go to plus or minus infinity as t tends to infinity, but the third goes to the origin. How does this look? If your initial value lies in the x1-x2 plane, so that x03 = 0, then x3(t) = 0 for all t, and the solution remains in that plane. Restricted to the x1-x2 plane you have a node-type singularity, an unstable node, and the trajectories are straight lines because both eigenvalues equal 1. So E^u, the unstable subspace, is the x1-x2 plane. Now look at the x1-x3 plane: there you have the eigenvalue 1 in the x1 direction and the eigenvalue -1 in the x3 direction, so you have a saddle, and trajectories starting in this plane remain in it. The same happens in the x2-x3 plane: again a saddle. Finally, if you take an initial point on the x3 axis, the solution stays on the x3 axis and goes to the origin, because the eigenvalue there is -1; that gives the stable part.
So the x3 axis is the stable subspace, which is one-dimensional, and the x1-x2 plane is the unstable subspace. Let us try to plot this. Draw the x1, x2 and x3 axes. In the x2-x3 plane you have a saddle: x2 goes to infinity while x3 decays, so the trajectories come in along the x3 axis and turn out along the x2 axis. Similarly, in the x1-x3 plane you again have a saddle, with x1 going to infinity, so the trajectories bend away along the x1 direction. In the x1-x2 plane the solutions are straight lines going to infinity: wherever you start, the trajectory moves outward along a straight line, higher or lower according to the starting point. If you start from an arbitrary point in space, you get a trajectory whose projections onto these coordinate planes show exactly this saddle and node behaviour; you can draw those projections in a different colour to see it.
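The behaviour just described can be checked numerically. The following is a small sketch of my own (Python with NumPy, not part of the lecture; the function name `flow` is mine) that evaluates the exact solution of the decoupled system A = diag(1, 1, -1) and confirms that the x3 axis attracts while the x1-x2 plane repels.

```python
import numpy as np

# A = diag(1, 1, -1): the system decouples completely, so the flow is
# x1(t) = x01*e^t, x2(t) = x02*e^t, x3(t) = x03*e^(-t).
A = np.diag([1.0, 1.0, -1.0])

def flow(x0, t):
    """Exact solution e^(tA) x0 for this diagonal A."""
    return np.exp(np.diag(A) * t) * np.asarray(x0, dtype=float)

# A point on the x3 axis stays on the x3 axis and decays to the origin
# (the x3 axis is the stable subspace E^s) ...
x = flow([0.0, 0.0, 1.0], 5.0)
assert x[0] == 0 and x[1] == 0 and x[2] < 1e-2

# ... while a point in the x1-x2 plane stays in that plane and blows up
# (the x1-x2 plane is the unstable subspace E^u).
y = flow([1.0, 2.0, 0.0], 5.0)
assert y[2] == 0 and y[0] > 100 and y[1] > 200
```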
That is the complete picture: that is how the phase portrait looks, with the curves drawn on the respective projected planes x1-x2, x2-x3 and x1-x3. If you start from an arbitrary point, the trajectory moves accordingly. Now one more example, so that you slowly get used to how things work in three dimensions; you can do this with a little more imagination. I am going to take a matrix with complex eigenvalues. The best way is the block form A = [-2, -1, 0; 1, -2, 0; 0, 0, 3], that is, a 2-by-2 block [a, -b; b, a] with a = -2, b = 1, together with the entry 3. The first block corresponds to a 2-by-2 subsystem with complex eigenvalues, and the last entry to a single scalar equation. As an exercise, check each time: the eigenvalues are -2 + i, -2 - i and 3. The system is already in decoupled form, and in the x1-x2 plane the 2-by-2 part behaves like a focus. I say "like" a focus, without classifying, because the full system is three-dimensional and we restrict only to the x1-x2 plane. If you try to draw the phase portrait, draw the x1, x2 and x3 axes. Because of this particular block form, the x1-x2 part and the x3 part are decoupled, so if you start at any point in the x1-x2 plane, the solution will remain there.
So, if you start anywhere in the x1-x2 plane, the x3 coordinate is 0, and since the way we constructed the matrix makes the x3 component decoupled, the solution remains in that plane and behaves like a focus. Since the real part of the complex eigenvalues is -2, it is a converging focus: the trajectory spirals into the origin, with an orientation given by the sign of the rotation term. On the other hand, suppose you start from an arbitrary point off the plane. The x3 eigenvalue is 3, so x3(t) = x03 e^{3t}, and the x3 component tends to infinity as t tends to infinity, while the x1 and x2 components spiral like a focus and go to 0. So the trajectory winds around the x3 axis and moves upward, with the amplitude of the spiral becoming smaller and smaller. If instead of -2 you put +2, it diverges: the trajectory still winds around the x3 axis, but with larger and larger amplitude. That is how you can sketch the graph from the structure of the matrix. So we have two examples: one with focus behaviour on a restricted plane, and one with saddle-point behaviour. You can build all kinds of combinations with nodes and so on.
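A sketch of the same kind for this focus example (again my own Python illustration, assuming the matrix A = [-2, -1, 0; 1, -2, 0; 0, 0, 3] from above): the closed-form flow shows the spiral radius in the x1-x2 plane shrinking while the x3 component grows.

```python
import numpy as np

# A has the block form [[a, -b, 0], [b, a, 0], [0, 0, 3]] with a = -2, b = 1,
# so the eigenvalues are -2 + i, -2 - i, and 3: a stable focus in the
# x1-x2 plane, exponential growth e^(3t) along the x3 axis.
a, b = -2.0, 1.0

def flow(x0, t):
    # Closed-form e^(tA): a rotation scaled by e^(at) in the x1-x2 plane,
    # and plain exponential growth e^(3t) in x3.
    c, s = np.cos(b * t), np.sin(b * t)
    R = np.exp(a * t) * np.array([[c, -s], [s, c]])
    x12 = R @ np.asarray(x0[:2], dtype=float)
    return np.array([x12[0], x12[1], x0[2] * np.exp(3.0 * t)])

x = flow([1.0, 0.0, 0.1], 3.0)
r12 = np.hypot(x[0], x[1])   # spiral radius in the x1-x2 plane shrinks ...
assert r12 < 1e-2
assert x[2] > 100            # ... while the x3 component explodes
```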
We will skip further examples of that kind and go to the general case: the Jordan form. I am not going to explain the Jordan form in detail; you have already studied it in the preliminaries or in a general linear algebra course. The Jordan form says: look at all the eigenvalues and classify them into real and complex ones. For an n-by-n system you essentially need n eigenvectors. If all n eigenvalues are distinct, whether real or complex, you can produce n eigenvectors and you have the diagonalization. But there may be a real eigenvalue whose geometric multiplicity is less than its algebraic multiplicity; then you do not have enough eigenvectors to diagonalize, and the idea is to look for what are called generalized eigenvectors, also studied in your preliminaries. The deficiency between geometric and algebraic multiplicity measures the lack of eigenvectors. Just as in the 2-by-2 case, where an eigenvalue repeated twice gave the non-diagonalizable block [lambda, 1; 0, lambda], for n = 3 you get all the possibilities listed above. What the Jordan decomposition theorem tells you is that every matrix A is linearly equivalent to a block-diagonal matrix: not diagonal with scalar entries, but diagonal with blocks B1, ..., Br. Each block Bi has some order ki, determined by the multiplicities, and of course k1 + k2 + ... + kr = n.
So the total order is n, and the important question is how each block looks. Call the block-diagonal matrix B. If you want to solve your system x-dot = A x, then because of this block-diagonal form, after the change of variables by the transformation P (with P inverse, which exists), it is enough to solve the subsystem for each block: yi-dot = Bi yi for i = 1, ..., r, where yi is a vector of order ki when Bi is a ki-by-ki matrix. So a big system eventually reduces to solving smaller systems, each possibly of much smaller order. And Bi takes only the following two forms, which is the whole interesting point. The first form has the eigenvalue lambda on the diagonal, 1's on the superdiagonal, and 0's elsewhere; this is the case corresponding to a real eigenvalue lambda. The second form is block-structured: 2-by-2 blocks D on the block diagonal and 2-by-2 identity blocks I2 on the block superdiagonal, with 2-by-2 zero blocks everywhere else. And what is D? D is of the familiar form [a, -b; b, a]; everything eventually reduces to these, and I2 is the 2-by-2 identity [1, 0; 0, 1].
So in B there are different blocks B1, ..., Br, each possibly of a different order, and depending on whether the eigenvalue is real or complex the block takes the first form or the second form built from D and I2; the second, consisting of 2-by-2 blocks, corresponds to complex eigenvalues. Essentially, then, it is enough to know how to compute the exponential of these two types of matrices, which is what I am going to do now. Call the first form C1 and the second form C2. Let us compute e^{C1}. Look at C1: I can split it as C1 = lambda I + N, where I is the identity of the appropriate order. I keep only the diagonal entries lambda in the first term; N has 0's on the diagonal and 1's on the superdiagonal, with all remaining entries 0. It is a very nice matrix. Here is a small exercise for you; you should keep doing these exercises. Assume C1 is of order k, so that I is the k-by-k identity and N is a k-by-k matrix. Show that N^k is the zero matrix while N^{k-1} is not; just compute N, N^2 and so on, either directly or by induction. Matrices of this type are called nilpotent.
A matrix Q is said to be nilpotent of order k if k is the first instance where Q^k is the zero matrix, that is, Q^k = 0 while Q^{k-1} is not 0. Of course, Q^k = 0 implies Q^{k+1} = 0, Q^{k+2} = 0 and so on. This makes the computation easy, because you do not have to compute any powers from k onward: the exponential series of a nilpotent matrix reduces to a finite sum, e^Q = I + Q + Q^2/2! + ... + Q^{k-1}/(k-1)!. If k is large this is still a lot of work, but it is finite, so you do not have to worry about convergence. Our aim is to compute e^{C1}. As I remarked earlier, for two matrices, e^{A+B} = e^A e^B holds if A and B commute. The interesting fact here is that one of the two matrices is a multiple of the identity, and the identity commutes with every matrix; hence lambda I and N commute. This is a trivial but important fact: without commutation you cannot split the exponential. So e^{C1} = e^{lambda I + N} = e^{lambda I} e^N, and you already have the formula for e^N.
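The nilpotency exercise and the finite exponential sum can be verified directly. This is a small illustration of my own in Python/NumPy, not from the lecture, for the case k = 4.

```python
import numpy as np
from math import factorial

# N: zeros everywhere except 1's on the superdiagonal (here k = 4).
k = 4
N = np.eye(k, k=1)

# N is nilpotent of order k: N^k = 0 but N^(k-1) != 0.
assert np.allclose(np.linalg.matrix_power(N, k), 0)
assert not np.allclose(np.linalg.matrix_power(N, k - 1), 0)

# Hence the exponential series terminates after k terms:
# e^N = I + N + N^2/2! + ... + N^(k-1)/(k-1)!.
eN = sum(np.linalg.matrix_power(N, j) / factorial(j) for j in range(k))

# The first row of e^N is [1, 1, 1/2!, 1/3!].
assert np.allclose(eN[0], [1.0, 1.0, 0.5, 1.0 / 6.0])
```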
You are really interested in computing e^{t C1}, because you want the solutions corresponding to C1. Doing the same simple computation with t, e^{t C1} = e^{t lambda I} e^{t N} = e^{lambda t} e^{tN}, and writing out the finite sum you get e^{t C1} = e^{lambda t} times the upper triangular matrix with first row [1, t, t^2/2!, ..., t^{k-1}/(k-1)!], second row [0, 1, t, ..., t^{k-2}/(k-2)!], and so on down to the last row [0, ..., 0, 1]. So you have an immediate solution: for the system y-dot = C1 y, which is the first of the only two forms that come up, y(t) = e^{lambda t} e^{tN} y0, with e^{tN} the matrix above. That is a nice solution. Now we go to the second case, C2. Again you can write it in a nice way, but not with the identity: C2 = diag(D, ..., D) + N, where the first term has D on the 2-by-2 block diagonal with 2-by-2 zero blocks elsewhere, and N has I2 blocks on the block superdiagonal, with 2-by-2 zero blocks everywhere else.
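A quick consistency check of this closed form for e^{t C1} (my own sketch; it uses SciPy's general-purpose `expm` as an independent reference, which is not part of the lecture):

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

# C1 = lambda*I + N, with N the nilpotent superdiagonal matrix.  Since
# lambda*I commutes with N, e^(t C1) = e^(lambda t) * e^(tN), where e^(tN)
# is the finite sum with entries t^j / j! on the j-th superdiagonal.
lam, k, t = -0.5, 4, 1.3
C1 = lam * np.eye(k) + np.eye(k, k=1)

# np.eye(k, k=j) is exactly N^j, so the finite sum is easy to assemble.
etN = sum((t ** j / factorial(j)) * np.eye(k, k=j) for j in range(k))
closed_form = np.exp(lam * t) * etN

# Agrees with the general matrix exponential.
assert np.allclose(closed_form, expm(t * C1))
```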
So again C2 is a sum of two matrices, and again these two matrices commute, so there is no problem. With a little computation, which I leave as an exercise along the same lines, you can write e^{t C2} = e^{at} times the block upper triangular matrix with first block row [R, tR, t^2 R/2!, ..., t^{k-1} R/(k-1)!], second block row [0, R, tR, ...], and so on down to the last block row [0, ..., 0, R], where R = R(t) is the rotation matrix [cos bt, -sin bt; sin bt, cos bt]. It is the same formula as before; the only difference is that you cannot separate the cosine and sine parts. So again you have the solution representation: for y-dot = C2 y, the solution is y(t) = e^{at} times this block matrix times y0. That gives a more or less complete description. Things are not as easy as in two dimensions, but you can write down the solutions completely. The important point is that every system reduces to smaller systems, and the only functions appearing in the solutions are exponentials, polynomials in t (coming from t, t^2 and so on) and trigonometric functions. So every solution of your linear system is a linear combination of terms like t^k e^{at} cos(bt) and t^k e^{at} sin(bt).
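Similarly for C2, here is a sketch of mine checking the block formula against SciPy's `expm` in the smallest interesting case, two D-blocks (so C2 is 4-by-4):

```python
import numpy as np
from scipy.linalg import expm

# C2: 2x2 blocks D = [[a, -b], [b, a]] on the block diagonal and I2 on the
# block superdiagonal.
a, b, t = -1.0, 2.0, 0.7
D = np.array([[a, -b], [b, a]])
I2 = np.eye(2)
Z2 = np.zeros((2, 2))
C2 = np.block([[D, I2], [Z2, D]])

# Closed form: e^(t C2) = e^(at) * [[R, t R], [0, R]] with the rotation
# R = [[cos bt, -sin bt], [sin bt, cos bt]].
c, s = np.cos(b * t), np.sin(b * t)
R = np.array([[c, -s], [s, c]])
closed_form = np.exp(a * t) * np.block([[R, t * R], [Z2, R]])

# Agrees with the general matrix exponential.
assert np.allclose(closed_form, expm(t * C2))
```

The split works because diag(D, D) commutes with the block-superdiagonal part, exactly as lambda I commuted with N in the real case.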
So only these elementary functions and their combinations appear in the representation of the solutions; that is a small remark, and we can state it as a corollary. The solution x(t) of the initial value problem is a linear combination of terms of the form t^k e^{at} cos(bt) and t^k e^{at} sin(bt), where lambda = a + ib runs over the eigenvalues; this is an important observation. From this you can immediately read off stability: since only such expressions occur, you have stability only when a is negative for all the eigenvalues. If any one eigenvalue has real part greater than or equal to zero, you do not have asymptotic stability. And x(t) goes to zero exactly when the real parts of all the eigenvalues are negative; in that case it goes to the origin exponentially, because of the exponential factor. That is the first remark. Secondly, I want to make one more remark about the general case, which I will not prove here. Look at the subspaces: E^s is the span of the vectors v_j, u_j, where u_j + i v_j is an eigenvector corresponding to lambda_j. I am writing everything in complex form; if the imaginary part is zero, it reduces to the real case.
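The stability criterion in this corollary, all eigenvalues with strictly negative real part, can be packaged in a few lines; this is my own sketch, and the helper name `is_asymptotically_stable` is mine, not standard.

```python
import numpy as np

def is_asymptotically_stable(A):
    """x(t) -> 0 for every initial value iff every eigenvalue of A has
    strictly negative real part: each solution component is a combination
    of terms t^k e^(at) cos(bt) and t^k e^(at) sin(bt) with lambda = a + ib."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# Stable focus plus a stable direction: eigenvalues -2 + i, -2 - i, -1.
A_stable = np.array([[-2.0, -1.0, 0.0],
                     [ 1.0, -2.0, 0.0],
                     [ 0.0,  0.0, -1.0]])
# One positive eigenvalue (3) ruins stability.
A_unstable = np.array([[-2.0, -1.0, 0.0],
                       [ 1.0, -2.0, 0.0],
                       [ 0.0,  0.0, 3.0]])

assert is_asymptotically_stable(A_stable)
assert not is_asymptotically_stable(A_unstable)
```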
More precisely, write each eigenvalue as lambda_j = a_j + i b_j with eigenvector w_j = u_j + i v_j; write everything in the complex way. If v_j = 0, it is a real eigenvalue with a real eigenvector, so there is no problem: the notation incorporates the real case as well. Now collect the real and imaginary parts of all eigenvectors corresponding to eigenvalues whose real part is negative and span them; that is the stable part, E^s = span{u_j, v_j : a_j < 0}. Similarly you have E^u = span{u_j, v_j : a_j > 0}, and E^c, for which we have not yet seen an example but will, E^c = span{u_j, v_j : a_j = 0}. The interesting theorem, which I do not prove, tells you that your space R^n can be decomposed as the direct sum of E^s, E^u and E^c. These are called the stable subspace, the unstable subspace, and the center subspace; you will learn more about them in the nonlinear theory. What I have not mentioned here are the generalized eigenvectors: to do the Jordan decomposition, as I said, you may not have enough eigenvectors, so one has to work with what are called generalized eigenvectors, and the whole analysis should really be carried out in that generalized setting. We do not have time in this course to go into more detail, but for the Jordan decomposition to hold in that form you need to work with generalized eigenvectors.
One more notion I want to introduce, which you will understand better when we go into nonlinear analysis. Definition: a subspace E is called invariant under the flow, where the flow is e^{tA}, if e^{tA} applied to E remains in E. The other result, a proposition that is really part of the previous theorem and which I have not stated, is that these spaces are invariant: e^{tA} E^s is contained in E^s, and e^{tA} E^u is contained in E^u. What this shows is that if you start from a point in E^s and follow the trajectory under the flow, it remains in E^s; it never leaves E^s. On the other hand, if you start from an initial point in E^u, then e^{tA} of that point remains in E^u for all time. That is the invariance: E^s and E^u are invariant subspaces of the flow. In particular, if x0 is in E^s, then all the relevant real parts of the eigenvalues are negative, and hence the trajectory goes to 0 exponentially; that is the interesting thing. In more generality you may learn this in the form of the stable manifold theorem, which we will not do here; we may do it in a module on nonlinear analysis if possible, otherwise not, but basically this is its content in the linear setting. In the nonlinear case you no longer get subspaces.
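The invariance statement can be illustrated on the earlier diagonal example (a sketch of mine, using SciPy's `expm` for the flow map; not from the lecture):

```python
import numpy as np
from scipy.linalg import expm

# For A = diag(1, 1, -1): E^u = x1-x2 plane, E^s = x3 axis.  The flow
# e^(tA) maps each subspace into itself, which we check by confirming
# that the components outside each subspace stay (numerically) zero.
A = np.diag([1.0, 1.0, -1.0])
Phi = expm(2.5 * A)                      # the flow map at t = 2.5

x_u = Phi @ np.array([1.0, -3.0, 0.0])   # starts in E^u ...
assert abs(x_u[2]) < 1e-12               # ... stays in E^u

x_s = Phi @ np.array([0.0, 0.0, 4.0])    # starts in E^s ...
assert abs(x_s[0]) < 1e-12 and abs(x_s[1]) < 1e-12   # ... stays in E^s
assert abs(x_s[2]) < 4.0                 # and decays toward the origin
```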
In the linear system, if you start in the stable subspace E^s, you go to the origin. I will skip one more lemma here, but I want to give one or two more examples in the remaining time of this lecture, and maybe one more in the next class. Consider the system with A = [0, -1, 0; 1, 0, 0; 0, 0, 2]. This is again decoupled: a 2-by-2 block with a = 0, which is something like a center, and a separate x3 component. What are the eigenvalues? You have i and of course -i, and the third eigenvalue lambda3 = 2. You can also compute the eigenvectors; as always, whatever I leave unworked is the exercise part. The eigenvector for the complex pair is (0, 1, 0) + i (1, 0, 0), which we call u1 + i v1, and corresponding to the real eigenvalue you have a single eigenvector u2 = (0, 0, 1). For lambda1 and lambda2 the real part is 0, so the x1-x2 plane behaves like a center: E^c is the x1-x2 plane. The other eigenvalue is 2, corresponding to the x3 part, and since 2 is positive, solutions there go to infinity; the unstable part E^u is the x3 axis. There is nothing stable here: E^s is empty (trivial).
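Before sketching the portrait, here is a numerical illustration of this decomposition (my own Python sketch; `flow` below is the closed-form solution, pure rotation in x1-x2 and growth e^{2t} in x3):

```python
import numpy as np

# A = [[0, -1, 0], [1, 0, 0], [0, 0, 2]]: eigenvalues i, -i (center in
# the x1-x2 plane, E^c) and 2 (unstable x3 axis, E^u); E^s is trivial.
def flow(x0, t):
    c, s = np.cos(t), np.sin(t)
    return np.array([c * x0[0] - s * x0[1],
                     s * x0[0] + c * x0[1],
                     x0[2] * np.exp(2.0 * t)])

x0 = np.array([3.0, 4.0, 0.5])
for t in (0.0, 1.0, 5.0):
    x = flow(x0, t)
    # The projection onto the x1-x2 plane stays on the circle of radius 5 ...
    assert np.isclose(np.hypot(x[0], x[1]), 5.0)
# ... while the trajectory curls upward along the x3 axis.
assert flow(x0, 5.0)[2] > 1000
```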
So R^3 decomposes as E^c direct sum E^u, with E^s empty. Let us plot the graph: draw the x1, x2 and x3 axes; the x3 part is decoupled. Whenever you start a solution with initial values in the x1-x2 plane, since x3 is decoupled it remains in that plane, and there the real part of the eigenvalue is 0, so it behaves like a center: the solution is a circle. What happens if you start from a point above or below the plane? The trajectory still rotates over the x1-x2 plane, but the x3 component is x3(t) = x03 e^{2t}, which goes to infinity without the rotation shrinking. So if you start from any point off the plane, the trajectory curls around like a center while x3 goes up: it spirals upward with the same radius, and below the plane it spirals downward, again with the same radius. It is an exercise for you to take all kinds of cases in three dimensions and see all the possibilities. Now maybe one more example, and in the next class I will give another if there is time; examples are the best way to build imagination. We have already seen a case with one eigenvalue 0; now take one with both eigenvalues 0. What is the corresponding system? x1-dot = 0, x2-dot = x1.
That immediately implies x1(t) is a constant, namely x1(0) = x0_1, and x2(t) = x0_1 t + x0_2. Let us find the equilibrium points: look at Ax = 0. The first equation, 0 = 0, gives no information; the second gives x1 = 0. So the first component is 0 and the second component is arbitrary, which means every point on the x2-axis is an equilibrium point. If you plot the graph, all the points on the x2-axis are equilibria. Recall the earlier example with one eigenvalue 0 and the other eigenvalue non-zero: there every point on the x1-axis was an equilibrium, and anything starting off the axis moved towards it. But here you will see something different; in this degenerate case the situation changes. Take any point x0. Again x1(t) = x0_1, so the solution remains on the vertical line x1 = x0_1, while x2(t) = x0_1 t + x0_2 depends linearly on t. If x0_1 > 0, that is, in the right half-plane (the first and fourth quadrants), x2(t) tends to plus infinity as t tends to infinity; if x0_1 < 0, x2(t) tends to minus infinity as t tends to infinity.
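The explicit solution above can also be obtained from the matrix exponential. Here A = [[0, 0], [1, 0]] is nilpotent (A squared is the zero matrix), so the exponential series truncates after two terms: e^{tA} = I + tA. A minimal sketch, with names of my own choosing:

```python
import numpy as np

A = np.array([[0.0, 0.0],
              [1.0, 0.0]])

# A is nilpotent (A @ A == 0), so e^{tA} = I + t*A exactly.
def expm_tA(t):
    return np.eye(2) + t * A

x0 = np.array([2.0, -1.0])
x = expm_tA(3.0) @ x0
print(x)  # x1 stays at 2.0; x2 = x0_1 * t + x0_2 = 2.0*3.0 - 1.0 = 5.0
```

This reproduces x1(t) = x0_1 and x2(t) = x0_1 t + x0_2 for any t.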
So if you start at any point with x0_1 > 0, the first component is retained, so the trajectory has to move along the vertical line through that point, and it moves up to plus infinity; it does not go towards any equilibrium point. On the other hand, if x0_1 < 0, the trajectory again stays on its vertical line but moves down to minus infinity. Near the x2-axis the behaviour is the same: wherever you start, the motion is along the vertical line, up on the right, down on the left. If you start on the x2-axis itself you remain there, because every point of it is an equilibrium. This is the phase portrait of the system. With this we have finished more or less everything; I will end this class with one definition, because the terminology will be used for non-linear systems. Definition: if all the eigenvalues of A have non-zero real part (a zero real part leads to a center), then the flow is said to be hyperbolic and the system is called a hyperbolic system. You will study more about hyperbolic systems in the non-linear part of the course. If we have time in the next lecture I will try to present one more example, but the main aim of the last lecture of this module is to see how to represent solutions of the non-homogeneous system. So far we were studying x dot equal to Ax.
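The definition of hyperbolicity is easy to test mechanically. Below is a small helper, a sketch of my own (the function name and tolerance are assumptions, not from the lecture), that checks whether every eigenvalue of A has non-zero real part.

```python
import numpy as np

def is_hyperbolic(A, tol=1e-12):
    """A linear flow x' = Ax is hyperbolic iff no eigenvalue of A
    lies on the imaginary axis, i.e. all real parts are non-zero."""
    return bool(np.all(np.abs(np.linalg.eigvals(A).real) > tol))

# The center-times-expansion example has eigenvalues i, -i, 2,
# so it is NOT hyperbolic; a saddle diag(1, -1) is hyperbolic.
A_center = np.array([[0.0, -1.0, 0.0],
                     [1.0,  0.0, 0.0],
                     [0.0,  0.0, 2.0]])
A_saddle = np.diag([1.0, -1.0])
print(is_hyperbolic(A_center), is_hyperbolic(A_saddle))  # False True
```

The degenerate example A = [[0, 0], [1, 0]] also fails the test, since both its eigenvalues are 0.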
We want to see how to use what we have developed to represent a solution of x dot equal to Ax + g(t). We will also make a few remarks on the non-autonomous case, where A depends on t: there we do not have a representation of that form, but we can represent the solution using what are called the fundamental and transition matrices. So, with that we stop this lecture. Thank you.