Hello. First of all, I want to recall what we got yesterday about the existence of a flow starting from a regular initial network. If we start with a regular initial network, so only triple junctions, only 120 degrees, and if at every triple junction the sum of the three curvatures is zero, we call the network geometrically compatible; you can then reparametrize it so that it also satisfies the analytic compatibility conditions. So if you have this, this is for a triod, and then for a general network, you have an evolution for a short time. The evolution is immediately smooth, and smooth means not only C-infinity, but also that all the compatibility conditions, geometric or not, are satisfied at every positive time. So if you have only geometric compatibility, you reparametrize, you restore your flow, and by approximation you also get a flow starting from a merely geometrically compatible network; in this case the curvature is continuous up to time zero. Instead, when you want to evolve an initial network which is only C2, without compatibility conditions, you can still get a flow by approximation, but the curvature is no longer continuous up to time zero; you only have C1 continuity up to time zero. So the tangents converge to the tangents, hence you keep at least the continuity of the 120 degrees condition, but not C2 continuity: the compatibility conditions are satisfied immediately at every positive time, so if the curvature were continuous they would have to be satisfied also at time zero, and this is possibly not true.
Moreover, since, as I wrote yesterday, you really want to evolve also non-regular networks, in that case you have to use this theorem here; I think most of you know something about Brakke flows from the lectures of the previous week. Here you even lose the C1 continuity at time zero. So you have a Brakke flow which is immediately smooth, immediately all the compatibility conditions at the triple junctions are satisfied, the evolving networks are immediately regular, so only triple junctions and all the 120 degrees conditions, but between time zero and positive times you only have a kind of measure-theoretic continuity, the continuity given by being a Brakke flow. Anyway, all these flows are smooth at every positive time, so if we forget the starting time, immediately we are on a smooth flow, we have all the compatibility conditions, and the flow satisfies something like gamma_t equal to the curvature times the normal plus some tangential component times the tangent vector, gamma_t = k nu + lambda tau. Moreover, we have this flow starting from time zero, so if we move a little bit and look at the flow on an interval [epsilon, T] with positive epsilon, this is a fully smooth flow. And there is always the possibility to reparametrize a flow like this so that it becomes a special flow, which means that it satisfies gamma_t = gamma_xx / |gamma_x|^2, which was the ingredient to produce a curvature flow using Solonnikov's theory. This is quite good, because if I only write gamma_t = k nu + lambda tau, this function lambda is unknown and can actually be varied, because, as we said, a reparametrization modifies only the tangential velocity of your flow.
So there is geometric uniqueness of this flow in the right class, where geometric means that you can vary the tangential velocity in the interior of the curves and you don't see anything if you only look at the whole shape of your network. So if I choose the right reparametrization, not up to time zero, but from some positive time on, I can always assume that my flow is a special flow. Then I have an expression for lambda, which is useful: lambda is the tangential part of gamma_xx / |gamma_x|^2, so it is explicitly given in terms of the parametrization. So I can use it to get evolution equations not only for the curvature and its derivatives, like I did yesterday, but also for lambda, by means of the evolution equation. And this is what I'm going to try to do today, because for the analysis of singularities you need to estimate the curvature and all the quantities around it. To get the estimates, you usually write down the evolution equation for the relevant quantities and try to estimate them in some way. Classically, for smooth hypersurfaces, you do this by means of the maximum principle; that is the most important tool for getting pointwise estimates. In this case we have boundary points around, so the maximum principle does not work very well unless you know that the maximum of the quantity you want to estimate stays inside. If you are not able to conclude that the maximum of your quantity is in the interior, not on the boundary, you cannot use the maximum principle, so you cannot follow the same line; one possibility is then to use integral estimates.
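Written in formulas, the setup just described is the following (a sketch in the notation of these lectures, with gamma the parametrization, tau and nu the unit tangent and normal, k the curvature and lambda the tangential velocity):

```latex
% Motion law of the flow, with unknown tangential component lambda:
\partial_t \gamma \;=\; k\,\nu \;+\; \lambda\,\tau .
% A "special flow" is a reparametrization for which
\partial_t \gamma \;=\; \frac{\gamma_{xx}}{|\gamma_x|^{2}} ,
% so that, projecting onto tau, the tangential velocity is explicit:
\lambda \;=\; \Big\langle \frac{\gamma_{xx}}{|\gamma_x|^{2}},\, \tau \Big\rangle .
```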
With integral estimates things get better, because the triple junctions, if the quantities you are interested in are good (and good means several things), are in a way not really boundary points: you will do integrations by parts and get boundary contributions, but because of the 120 degrees condition, in several situations these boundary contributions cancel each other and give no contribution. So from a distributional point of view, if you are looking for integral estimates, in several situations these are not boundary points but inner points of your network. When I speak of boundary points, I will really have in mind the fixed endpoints of the network on the boundary of the domain. So today I will show you how to get some of these estimates, but let me first say another couple of things. What can be done by means of the maximum principle is to show that all these smooth flows, reparametrized in this way (from now on I will consider only these special flows), are always embedded. A network cannot lose its embeddedness until it becomes singular. So things like two curves coming to touch each other cannot happen; the network stays embedded, and even more, the curves cannot cross each other. This is because near a possible touching point you can use an argument by maximum principle which is standard in mean curvature flow: if you look at things locally there, you only deal with curvature flow of curves, there are no triple junctions, forget the triple junctions. And by the same argument showing that an initially embedded curve cannot lose its embeddedness, a network, for as long as the evolution is smooth, cannot lose its embeddedness. So all these flows are flows of embedded networks, smooth, with all the compatibility conditions, and special. Another thing that I mentioned very quickly yesterday is that there is really no hope for uniqueness.
To see this, suppose that you start with a cross inside a square, with a four-point at the center; then you are exactly in the situation where you can use the theorem of Ilmanen, Neves and Schulze. So there is a Brakke flow starting from this network, immediately regular: the four-point, as you expect, opens into two triple junctions moving away from each other. For instance, you can expect that something like this happens, 120, 120, and then it moves on. But because of the symmetry of the problem, since our initial network is symmetric under rotation by 90 degrees, you must admit that if this solution exists, you also have the other solution, obtained by opening in the other direction. And there is no way to decide which one is the best. So there is really a loss of uniqueness in this problem. What we can hope, then, is not uniqueness, but conjecturally, I think it is a conjecture of Tom Ilmanen, that for generic initial data, not so symmetric as in this situation, there should be uniqueness of the flow. But this is only a conjecture, and what "generic" must mean here is only a bit of speculation. OK, so now we start with these smooth flows. We know that for some time we have a smooth regular evolution, and then we want to understand what happens at some time T. Possibly nothing happens, possibly your network goes on for all times. And in the hopeful case, if the maximal time of smooth existence is plus infinity, you expect that your network will converge, since we are moving by the gradient of the total length, to a critical point of the length among networks connecting the fixed points on the boundary of your domain, what is sometimes called a Steiner network. So this could happen if you have a smooth flow without any kind of singularity for all positive times.
Unfortunately, it is possible to show that there are examples where this T is not plus infinity; I'll show you one in a while. So at some point something happens: the maximal time is finite and some singularity appears. And in the simulations that we saw yesterday, the singularities are apparently related to a change of structure; more or less the goal of today and tomorrow is to show that singularities correspond to changes of structure: no change of structure, no singularities. The first theorem, the usual "theorem zero" in geometric flows, is that at a singular time of a smooth evolution the curvature must blow up. It happens in mean curvature flow, it happens in Ricci flow: if your curvature is bounded, there is no singularity. But here we have to admit that possibly there is also a different kind of singularity, where the curvature behaves well but the length of one curve is going to zero, so the curve is vanishing, or a whole region is vanishing. So the first theorem that I want to show today, at least a sketch of the proof, is that if the maximal time T is finite, then one of two things happens: either the curvature is unbounded as t goes to T, or the length of at least one curve of the evolving network S_t goes to zero as t goes to T. These two conditions are not mutually exclusive: you can have the first, the second, or both together. This is the basic theorem telling you what happens at a singular time, and what I'm going to do tomorrow is to refine it, saying that the vanishing of a curve must always happen: at every singular time there must be an associated change of structure, some curve going to zero, a curve or a region, because if a region goes to zero, at least one curve goes to zero.
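Stated compactly, the dichotomy just described reads (a sketch of the statement, with L^i(t) the length of the curve gamma^i at time t and S_t the evolving network):

```latex
T < +\infty
\;\Longrightarrow\;
\limsup_{t\to T}\,\max_{S_t} |k| = +\infty
\qquad\text{or}\qquad
\liminf_{t\to T}\, L^{i}(t) = 0 \ \text{for some curve } \gamma^{i},
```

and the two alternatives can occur separately or together.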
So this is the ground level, and tomorrow we will try to refine it, saying that the vanishing of a curve must always happen. Then we will divide into two cases, curvature unbounded or curvature bounded, and we will see that the case where a curve vanishes and the curvature is also unbounded is exactly the case where a region is vanishing, while the case where a curve vanishes but the curvature stays bounded is exactly the case where only one curve goes to zero. So there is a full description of what happens at a singular time. OK, now, I'm sorry, today will be a little bit boring, because in order to do this, what you usually do in the smooth case is: you take the curvature, you write down its evolution equation, you write down the evolution equations for the derivatives of the curvature, time and space derivatives, and then you try to prove that if your curvature is uniformly bounded (so you negate the blow-up alternative), then all the derivatives of the curvature are uniformly bounded during the flow, hence all the derivatives of your maps gamma are bounded during the flow; then you can get a limit by Ascoli-Arzelà, a limit network as t goes to T, and this limit is smooth because you have bounds on all the derivatives, so you can reapply the short-time existence theorems from before, contradicting the fact that T is the maximal time of smooth existence of the flow. This is the line when there are no triple junctions around, and the tool is the maximum principle. In the presence of triple junctions, you also have the possibility that one length goes to zero, and the maximum principle cannot be used. So the line is the same, but you have to use different tools to get the same result. So we start working on evolution equations in order to get the estimates; from now on we have this special flow. The first computation I want to do is the evolution of the length of a single curve; let's call it L^i.
So we have our network here; let's take one curve of the network, and L^i is simply the length of that curve, so it is given by the integral over gamma^i of 1 with respect to the arclength measure, L^i = ∫ |gamma^i_x| dx. If we want to compute the time derivative of this, the point is to compute the evolution of the measure associated with the arclength. Here again we use the formulas that I showed you yesterday: the arclength measure is |gamma_x| dx, so you simply have to take the time derivative of this; dx is not affected by time, but |gamma_x| is. If you do the computation like yesterday, here it is easier because you can always interchange x and t; it is s and t that do not commute, remember, there is a commutation formula with an extra term. Doing this computation, what you get is that the time derivative of ds equals (lambda_s − k²) ds. If you are doing mean curvature flow, or the flow of smooth closed curves, you don't have the lambda_s term: possibly you have already seen in a previous lecture that the evolution of the measure associated to the moving hypersurface, in this case a curve, is given by minus h², the mean curvature squared, times the measure. In our case, since we have these multiple junctions around, there is also a contribution from the space derivative of lambda, the tangential part. OK, so using this, the derivative of L^i must be equal to the integral over gamma^i of (lambda_s − k²) ds. I keep the −k² part, and for the lambda_s part, I am integrating the derivative of something, so I can integrate by parts, or simply take the primitive. So this is equal to the values of lambda at the endpoints; let's suppose that your curve is parametrized so that these endpoints are gamma^i(1, t) and gamma^i(0, t).
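Written out, the evolution of the arclength measure is the following short computation (a sketch; it uses the motion law ∂_t γ = λτ + kν together with ∂_s τ = kν and ∂_s ν = −kτ):

```latex
\partial_t |\gamma_x|
= \Big\langle \tfrac{\gamma_x}{|\gamma_x|},\, \partial_x(\lambda\tau + k\nu) \Big\rangle
= |\gamma_x|\,\big\langle \tau,\; \lambda_s\tau + \lambda k\,\nu + k_s\,\nu - k^2\tau \big\rangle
= (\lambda_s - k^2)\,|\gamma_x| ,
% hence, for the arclength measure ds = |gamma_x| dx,
\partial_t(ds) = (\lambda_s - k^2)\, ds .
```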
So what I get is d/dt L^i = −∫_{gamma^i} k² ds + lambda(1, t) − lambda(0, t). As I told you, let me forget in this computation all the contributions coming from the boundary points where the network is fixed; say gamma^i(1, t) is such a fixed endpoint, while gamma^i(0, t) is a triple junction, where we have the 120 degrees condition. I did it for this picture, but remember that we are dealing only with triple junctions, with regular networks. OK, so this is the evolution of L^i, and I cannot do anything more than that. But now take the evolution of the length of the whole network; let's think of the special case of a triod, but it generalizes to any regular network. I have to sum all of these: the time derivative of L is the time derivative of the sum over all the curves of L^i, so I get the same conclusion, I simply add all the contributions: minus the sum over all curves of ∫_{gamma^i} k² ds, minus the sum over all i of lambda^i(0, t). In this triod the three endpoints at the junction are free, but you can imagine how it works for a general network. The first term is simply −∫_{S_t} k² ds over the whole network, like in mean curvature flow. Now look at the lambda terms: in this special situation there are only three curves; in general there are several others, but you can sum them in groups of three, the three curves arriving at each triple junction, and the same for all the other triple junctions. For each junction I am adding the three values of lambda at that triple junction. But if you remember the computation I did yesterday: at every triple junction, when you add the three curvatures, or the three tangential components lambda, for this flow, you get zero. So for every triple junction, the three contributions coming from the three curves sum to zero.
And this holds at all the triple junctions, so all this contribution is simply zero. This is a consequence of the 120 degrees condition, and in a way it is a first hint that these triple junctions satisfying the 120 degrees condition are, in a sense, inner points, because in the end the conclusion is that the evolution of the length is simply d/dt L = −∫_{S_t} k² ds, like for a single closed curve: no contribution from the triple junctions. Here I am forgetting the contributions from the fixed boundary points, but it can be shown that since the boundary points are fixed, not moving at all, the time derivative of gamma there is zero, so at the boundary points the curvature must be zero and the tangential velocity must be zero. So I am not cheating too much in throwing those terms away, because they are actually zero in this case. And this formula is identical to the formula for a closed curve without triple junctions at all. So this is the evolution of the length. Now the second geometric element, after the length, which is the easiest one you can associate to your network, is the curvature. I want to compute its evolution; I will do everything in the easy situation of a triod, but it can be done for a general network. So I always have this, and now I want to compute the evolution of an integral of the curvature, because usually for curves you estimate the curvature using the evolution equation that we wrote yesterday, k_t = k_ss + k³ + lambda k_s; do you remember, we derived this yesterday. And you could try to use the maximum principle here: you put yourself at a maximum point of the curvature, so the k_ss term is nonpositive, the k³ term gives some contribution, and the lambda k_s term must be zero, because at a maximum point k_s must be zero.
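As a quick sanity check of the formula d/dt L = −∫ k² ds, here is a minimal numerical sketch (not part of the lecture) on the explicitly solvable shrinking circle, where there are no junctions at all and the formula must reduce to the classical one:

```python
import math

# Shrinking circle under curvature flow: r(t) = sqrt(r0^2 - 2t),
# length L(t) = 2*pi*r(t), curvature k = 1/r(t).
r0, t, h = 2.0, 0.5, 1e-6

def L(s):
    return 2 * math.pi * math.sqrt(r0**2 - 2 * s)

# Finite-difference time derivative of the length ...
dL_dt = (L(t + h) - L(t - h)) / (2 * h)

# ... against -∫ k^2 ds = -(1/r^2) * (2*pi*r) = -2*pi/r.
r = math.sqrt(r0**2 - 2 * t)
assert abs(dL_dt - (-2 * math.pi / r)) < 1e-6
```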
So at a maximum point of the curvature you get k_t ≤ k³, and then you use this to estimate the evolution of the curvature. This works if you can use the maximum principle, but here you must be sure that the maximum of your curvature stays in the interior of the curves, not at a triple junction, and no one can tell you that. So you cannot use this line for networks, which instead is the main line for a closed curve, for instance. So what you can do is try integral estimates, with the idea that possibly there are cancellations like before, so that you are able to conclude something. So we start considering this quantity, the L² norm squared of the curvature on one of the three curves gamma^i: ∫_{gamma^i} k² ds. And again we start taking the time derivative; here you have the ds, so you have to take the derivative of k² and the derivative of the measure as before. Using the evolution equation, I get ∫_{gamma^i} 2k (k_ss + k³ + lambda k_s) ds, plus the contribution of the evolving measure: I wrote it there, the time derivative of ds is (lambda_s − k²) ds, so you have an extra contribution k² (lambda_s − k²). OK, now I work on this expression. First I integrate by parts in the first term: I get −2 ∫ k_s² ds, and from 2k⁴ − k⁴ a single k⁴. Moreover, when I do this integration by parts I have boundary terms. What are the boundary terms? They are given by 2 k k_s; again, I forget the contribution at the fixed endpoints and only write the contribution at the triple junction. And I have the remaining terms, 2k lambda k_s and k² lambda_s: these two guys are still in the game.
These two terms together are 2k k_s lambda + lambda_s k², and you can see that this is exactly the s-derivative of lambda k². So again I can throw this term onto the boundary: integrating it gives another boundary contribution, plus lambda k², at the triple junction. So let me rewrite everything: d/dt ∫_{gamma^i} k² ds = ∫_{gamma^i} (−2 k_s² + k⁴) ds plus the boundary term (2 k k_s + lambda k²) at the triple junction. I have a very nice term here, the one containing the highest order of derivatives, k_s, and it appears with a minus sign, so it is pushing things down. The k⁴ term pushes things up, but at least it does not contain any derivative, it is of zero order in the curvature. Instead, the boundary term unfortunately involves k_s, the derivative of the curvature, pointwise. So apparently there is no hope. I want to get an estimate from above on this time derivative, in order to say that this quantity cannot become too big, and I can imagine controlling the k⁴ term by means of the good −2 ∫ k_s² term; but there is no way to control a pointwise quantity with an integral quantity at the same level. The integral of k_s² cannot control k_s pointwise, there's no chance. OK, now the point is that if you look at things like that, you are stuck, you cannot go on. But actually we are interested in the total integral of k²: d/dt ∫_{S_t} k² ds is the time derivative of the sum over all the curves of this quantity, so the result is the sum over all the curves of the expression above. And as before, when I sum, I have to look at the sum, over all the curves arriving at a triple junction, of these boundary terms.
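Collecting the terms, the computation just described gives, on a single curve (a sketch; the square bracket denotes the boundary contribution at the triple junction, the fixed endpoints giving nothing):

```latex
\frac{d}{dt}\int_{\gamma^i} k^2\, ds
= \int_{\gamma^i} 2k\,(k_{ss} + k^3 + \lambda k_s)\, ds
  + \int_{\gamma^i} k^2(\lambda_s - k^2)\, ds
= \int_{\gamma^i} \big(-2k_s^2 + k^4\big)\, ds
  + \Big[\, 2k\,k_s + \lambda k^2 \,\Big]_{\text{triple junction}} .
```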
And again, either you think of the simplified situation where there is only one triple junction, or you divide into groups of three curves joining a single triple junction. So suppose we are in the easy case and we only have three curves, gamma¹, gamma² and gamma³. I am looking at the term: sum for i = 1, 2, 3 of (2 k^i k_s^i + lambda^i (k^i)²) at the triple junction. Yesterday, if you remember, when I differentiated in time the 120 degrees condition, which means that the sum of the unit tangents at the triple junction must be zero, I found out, if you look at your notes of yesterday, that it must hold that the sum over i of (k_s^i + lambda^i k^i) nu^i is zero. It was one of the conditions that I called third order conditions, coming from differentiating the Herring condition, the 120 degrees condition. OK, now, if you do some linear algebra: this relation is satisfied by the three normals, which are rotations of the three tangents, so they are three unit vectors with angles of 120 degrees between them. There is only one possibility for such a relation to hold, linear algebra: the coefficients in front of the normals must all be equal. This is true if and only if k_s^i + lambda^i k^i = k_s^j + lambda^j k^j for every i and j in {1, 2, 3}; so this quantity is the same at the triple junction for the three curves. OK, how can we use this? Write it again: k_s^i + lambda^i k^i = c, a constant independent of i. In particular, since I know that the sum of the k^i is zero, another condition at the triple junction, and these three quantities are constant, this means that 0 = 2c times the sum of the k^i, which equals the sum over i of 2 k^i (k_s^i + lambda^i k^i).
Expanding, this is equal to the sum over i of (2 k^i k_s^i + 2 lambda^i (k^i)²), which is the sum of (2 k^i k_s^i + lambda^i (k^i)²) plus the sum of lambda^i (k^i)². And see, our boundary term appears, that guy there. And this being equal to zero means that the sum of (2 k^i k_s^i + lambda^i (k^i)²) is equal to minus the sum of lambda^i (k^i)², because of the relations. Why is this interesting? Because the derivative of the curvature is gone: you no longer have something that pointwise is at the same level as the derivative appearing in the good term. The whole boundary contribution becomes minus the sum over i of lambda^i (k^i)², and the k_s is gone. So in this case, differently from the length, the contribution of the triple junction does not vanish completely, as it did before; but at least it simplifies in the order of derivation: you gain one order of derivation by using the relations at the junction. And this is good for doing estimates, because now there is hope that the L² norm squared of k_s is able to control this term, since we are one-dimensional. And in fact this can be done, and the tool is an interpolation inequality, of the kind of the Gagliardo-Nirenberg inequalities, developed by Gagliardo and Nirenberg more or less independently; the theorem that I wrote there you can actually find in the book of Adams. You have a curve with boundary and finite length, and a C-infinity function on it. Then the first inequality tells you that you can bound an L^p norm of an intermediate derivative by a product of powers of the L² norm of the highest derivative and the L² norm of the function u, plus an extra term, the L² norm of the function divided by a power of the length of the curve, which means that the smaller the curve, the worse your inequality, because the constant in front of the second term becomes larger.
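The cancellation of k_s at the junction can be checked with a few lines of arithmetic (a sketch, not from the lecture: random data forced to satisfy exactly the two junction relations, the sum of the three curvatures being zero and k_s^i + lambda^i k^i being the same constant c for the three curves):

```python
import random

random.seed(0)

for _ in range(100):
    c = random.uniform(-2.0, 2.0)
    k = [random.uniform(-2.0, 2.0) for _ in range(2)]
    k.append(-k[0] - k[1])                      # enforce k^1 + k^2 + k^3 = 0
    lam = [random.uniform(-2.0, 2.0) for _ in range(3)]
    ks = [c - lam[i] * k[i] for i in range(3)]  # enforce k_s^i + lam^i k^i = c

    # Boundary term of d/dt ∫ k^2 ds at the junction ...
    boundary = sum(2 * k[i] * ks[i] + lam[i] * k[i] ** 2 for i in range(3))
    # ... equals the simplified expression, with no derivative of k left.
    simplified = -sum(lam[i] * k[i] ** 2 for i in range(3))
    assert abs(boundary - simplified) < 1e-12
```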
But if the length is well separated from zero, you can think of those as universal constants. Moreover, there is also an L-infinity version, so you have a pointwise L-infinity estimate on intermediate derivatives in terms of the L² norm of the highest derivative and the L² norm of your function. I think you can guess that the function I want to put in place of u is k, the curvature. Because in our situation we have a good highest-derivative term with a good negative sign, something to estimate in L⁴ norm, and something pointwise. OK, in the boundary term there is lambda; if you remember another relation that I wrote yesterday, at a junction the vector K of the three curvatures and the vector Lambda of the three tangential velocities are related by a rotation, so that |Lambda| = |K|: the square root of the sum of the squares of the lambdas equals the square root of the sum of the squares of the k's. This means that you can always control each lambda^i by a constant times |K| = ((k¹)² + (k²)² + (k³)²)^(1/2). So I can always estimate the boundary term by a constant times the maximum of |k|³, because of that relation; this L-infinity norm is on the whole network. Now what I want to do is to use the interpolation inequalities to control the k⁴ term and this pointwise term by means of the good −2 ∫ k_s² term. I will skip all the details, but let me do the first one. With u = k you take n = 0, m = 1 and p = 4, and what you get is that the L⁴ norm of k is bounded by a constant times the L² norm of k_s to the power 1/4, times the L² norm of k to the power 3/4, plus another constant, divided by the length to the power 1/4, times the L² norm of k.
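In formulas, the interpolation inequality being used reads as follows (a sketch, in the form usually quoted for curves of finite length L, with the exponents as I recall them; C and B depend on n, m, p):

```latex
\|\partial_s^{\,n} u\|_{L^p}
\;\le\; C\,\|\partial_s^{\,m} u\|_{L^2}^{\sigma}\,\|u\|_{L^2}^{\,1-\sigma}
\;+\; \frac{B}{L^{m\sigma}}\,\|u\|_{L^2},
\qquad
\sigma=\frac{n+\tfrac12-\tfrac1p}{m},
% and, choosing u = k, n = 0, m = 1, p = 4 (so sigma = 1/4):
\|k\|_{L^4}
\;\le\; C\,\|k_s\|_{L^2}^{1/4}\,\|k\|_{L^2}^{3/4}
\;+\; \frac{B}{L^{1/4}}\,\|k\|_{L^2} .
```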
Now if you take the fourth power of both sides, what you get is that ∫ k⁴ ds is bounded by a constant times (∫ k_s² ds)^(1/2) (∫ k² ds)^(3/2), plus (a constant over L) times (∫ k² ds)². Now you use the Peter-Paul inequality here, or Young's inequality, as you prefer, in order to separate these two factors and to put a small constant in front of the k_s² term. So what you get is that ∫ k⁴ ds is smaller than (1/4) ∫ k_s² ds plus a constant times (∫ k² ds)³, plus (a constant over L) times (∫ k² ds)². OK, this must be done on each single curve and then summed over all the curves, and you need a bound from below on your lengths, because these constants are bad when the length is small. So if you assume that all your lengths are bounded below by some epsilon, your constants are uniform, bounded above by some constant depending on epsilon. So if you have this and you substitute, the k⁴ term is bounded by (1/4) ∫ k_s² ds plus a constant times (∫ k² ds)³, plus a constant times (∫ k² ds)², plus another constant. And along the same line, using now the second inequality, the L-infinity one, which you can do by yourself following the same steps: the choice is again u = k, n = 0, m = 1 and p = +infinity, and you get that sigma in this case is equal to one half. So you again have an inequality like this.
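The splitting step is just the Peter-Paul form of Young's inequality, a·b ≤ ε a² + b²/(4ε), which follows from (√ε a − b/(2√ε))² ≥ 0. A quick numerical sketch of it with ε = 1/4, the choice used above so that the k_s² term comes with the coefficient 1/4:

```python
import random

random.seed(42)
eps = 0.25
for _ in range(10_000):
    a = random.uniform(0.0, 10.0)   # plays the role of (∫ k_s^2 ds)^(1/2)
    b = random.uniform(0.0, 10.0)   # plays the role of C (∫ k^2 ds)^(3/2)
    # Peter-Paul: a*b <= eps*a^2 + b^2/(4*eps)
    assert a * b <= eps * a**2 + b**2 / (4 * eps) + 1e-9
```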
You use Young's inequality to separate the multiplicative terms as you did before, and again you find that this pointwise term is controlled by something like (1/4) ∫ k_s² ds plus a constant times (∫ k² ds)³ and lower order terms. So the k⁴ term is smaller than this, the pointwise term is smaller than this, and then you have the −2 ∫ k_s² ds: the two bad terms together give at most one half of this good term, so against the minus two, all of them can be absorbed. Final conclusion: the time derivative of ∫_{S_t} k² ds is smaller than a constant times (∫_{S_t} k² ds)³ plus another constant, with constants depending only on the lengths of your curves not going too close to zero. So this holds if during your flow the lengths stay bounded away from zero; then you have this with constants uniform in time. And this already tells you something at the zero level: if you call F(t) the expression ∫_{S_t} k² ds and you assume the lengths during your flow are uniformly bounded away from zero, then you have an ordinary differential inequality, F' ≤ C F³ + D, with C and D uniform in time, which means that if at time zero F has some value, it cannot go to plus infinity as fast as it wants: you get a bound on a small time interval, because you solve the ODE with equality and make a comparison, and from a bound at time zero you get a bound at positive times on a uniform interval, all related to the bound from below on the lengths. OK, now, it's already a little bit of a mess, but you don't want to do this only at the zero level: you want to do it for all the derivatives of the curvature. I'm not going to show you all the details, but only the final result. You repeat all of this, including the tricks to lower the order of derivation at the triple
junction what can be done is that if you start now working on space derivative of the curvature square the line is the same, it's only more complicated the algebra of the triple junction becomes more complicated and the inequalities also well you get something a little bit weaker which is the following what you find out that you can estimate this guy by constant times the integral between zero and t of the integral on s of t of k square again to the power 2j plus 3 the s the t plus a constant integral over s t of k square 2j plus 1 only because it only works because the algebra that I use the algebra tricks that I use on the triple junction only works when j is even you can more or less do the same with the little weaker estimate you see here you have this integral in time in front that is not present in the other estimates but actually this is sufficient to say that suppose that the curvature is bounded uniformity in your flow and all of this again under a bound from beyond the length so suppose the length is uniformly bounded from below the curvature is uniformly bounded from above if the curvature is uniformly bounded from above all these guys are bounded so the time derivative of L2 norm of every even derivative of the curvature is uniformly bounded in the flow depending on the bound from below on the length and on the bound from below on the curvature but now if we get back to the theorem as we were interested in we want to show that something happens all the length goes to zero all the curvature goes to plus infinity but now we okay let's try to work by contradiction suppose that the length is not going to zero well the length is not bounded away from zero the length is bounded away from zero and the curvature is bound so we are exactly in the hypothesis in which I did this estimate so you have all these estimates this and the other on the curvature so all the L2 norm or even derivatives of your curvature integrated on your network or on your network are 
uniformly bounded during the flow. But then, using again the Gagliardo–Nirenberg estimates — the third one, the last inequality — from $L^2$ estimates you pass to $L^\infty$ estimates, so you have $L^\infty$ estimates on all the even derivatives; and if you have $L^\infty$ estimates on the even derivatives, you also get $L^\infty$ estimates on the odd ones. The same line can be followed for $\lambda$ — in that case, just as a curiosity, the first round works only for odd $j$ — and then the argument is the same. At the end, what you find is: if on $[0,T)$ the lengths are bounded below by some $\epsilon > 0$ and the curvature is bounded, then $\|\partial_s^j k\|_{L^\infty}$ and $\|\partial_s^j \lambda\|_{L^\infty}$ are bounded by constants, for every order $j$. Using the evolution equations you can then also bound the time derivatives, because the evolution rules always let you transform time derivatives into space derivatives; so control of the space derivatives implies control of the time derivatives, and of the mixed ones too. And once you control the curvature and all its derivatives, you also control all the derivatives of $\gamma$: $\|\partial_t^m \partial_s^l \gamma\|_{L^\infty}$ is bounded by a constant. Finally, with all this control, sending $t$ to $T$, by Ascoli–Arzelà I have a limit family of curves. The condition that could fail is regularity of the limit curves as parametrized curves: $|\partial_x \gamma^i|$ could vanish somewhere, while I want regular curves. Actually, this you can also get. If you consider the time derivative of the logarithm of $|\gamma_x|$ and do the computation, it is simply $\langle \gamma_{xt}, \gamma_x \rangle / |\gamma_x|^2$, which is equal to the tangent paired with the arclength derivative of $\lambda\tau + k\nu$; if you expand this and take the component along the tangent, what you get at the end is only $\lambda_s - k^2$ — everything else pairs with the normal and goes away. Since all these quantities are bounded, this is bounded, so the logarithm of $|\gamma_x|$ cannot blow up on a finite time interval: $|\gamma_x|$ can get neither close to zero nor too large, which means there is a uniform bound from below on $|\partial_x\gamma^i(x,t)|$, and this passes to the limit. Since by Ascoli–Arzelà we pass to the limit smoothly, we get a smooth regular limit network, and we can use the short-time existence theorem to restart the flow, contradicting the fact that $T$ is the maximal time of smooth existence. These are only one family of estimates; there are a couple of other families, all proved along the same lines. For instance, as an exercise — nothing harder than what we did in detail — write down the time derivative of $\int k^2\,ds$ over the network plus the sum over all curves of the inverses of the lengths, which is exactly the quantity we don't want to become too large: if the length of every curve is bounded away from zero, this quantity is not too large. Along the same computations one shows that its time derivative is bounded by a constant times its own third power, similar to the other estimate; and again, calling $F$ this quantity, you find such a differential inequality — even better inequalities — telling you that if at the starting time this quantity is small, it needs some time before it can get large: it cannot do so as fast as it wants, because of the ordinary differential inequality. Moreover, you can also make these estimates invariant by scaling, which is extremely important for what follows, because what we want to do with this whole set of estimates is to perform blow-ups, understanding what happens when we pass to a blow-up limit.
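The ODE comparison argument used above — a quantity satisfying $F' \le C F^3 + D$ stays bounded for a short, uniform time — can be illustrated numerically. A minimal sketch; the constants, the initial value, and the helper `ode_bound` are illustrative, not from the lecture:

```python
# Comparison principle for F'(t) <= C*F(t)**3 + D:
# any such F with F(0) <= F0 is dominated by the solution G of the
# ODE with equality, G' = C*G**3 + D, G(0) = F0, as long as G exists.

def ode_bound(F0, C, D, t_end, dt=1e-5):
    """Integrate G' = C*G**3 + D by explicit Euler; G dominates F on [0, t_end]."""
    G, t = F0, 0.0
    while t < t_end:
        G += dt * (C * G**3 + D)
        t += dt
    return G

# Even though G blows up in finite time, on a short interval it stays
# finite: this is the a-priori bound on a uniform small time interval.
bound = ode_bound(F0=1.0, C=1.0, D=1.0, t_end=0.1)
print(bound)  # finite a-priori bound for F on [0, 0.1]
```

Any $F$ satisfying the differential inequality with $F(0) \le F_0$ stays below $G$ as long as $G$ exists; since the blow-up time of $G$ depends only on $F_0$, $C$, $D$, a bound at time zero gives a bound on an interval of uniform length, exactly as claimed above.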
You want your estimates to help you take such a limit. So, in the last five minutes: we have arrived at this theorem — either the curvature is unbounded or some length must go to zero — so let's see a couple of examples that can actually be worked out in detail; these are actually theorems. Consider simple situations: a triod, for instance. What happens to the evolution of a triod? If the lengths of the three curves stay away from zero, then the flow is fully smooth for every time and, possibly, converges to the minimal connection between the three fixed points on the boundary. When does this happen? It happens when the triangle with vertices at the three boundary points has no angle larger than 120 degrees; otherwise the minimal connection through an interior point — this interior point is called the Steiner point of the triangle — does not exist. In fact, if we take a triangle with an angle larger than 120 degrees, the Steiner point is not inside the triangle, and the minimal connection between the three points is given by the union of the two edges arriving at the vertex where the angle exceeds 120 degrees. In practice, in the first case the flow is smooth and converges to the minimal connection, while in the second case the curve connecting the triple junction to the vertex with the large angle shrinks to zero, and then you have to decide what to do with this guy — in practice, the flow stops there. For a spoon-shaped network, what happens? There are two situations: either the closed curve shrinks down to a point, and in this case the curvature goes to plus infinity, or the open curve shrinks and vanishes, with the formation of a double point on the boundary; it can be shown that in this case the curvature stays bounded. For the guy without boundary points, the theta-shaped network, there are again two situations: either one of the three curves — in this case the central one — shrinks down to a point, giving a limit like this, and in this case the curvature remains bounded; or a full region shrinks down, and in this case the curvature goes to plus infinity — this is not easy to show; maybe at the end I can tell you why. What cannot happen: one might expect that at some point the whole network shrinks down to a single point all at the same time, but this actually cannot happen for the theta-shaped network. So in all these examples there is always a length going to zero — sometimes with the curvature bounded, sometimes unbounded — but in none of them does the curvature blow up while no length goes to zero. And this is exactly tomorrow's theorem: of the two situations, the first one is necessary, it always happens, and then you separate the cases into one plus two, or one alone. The idea for doing this — I have to be sincere — rests on a conjecture, which in our opinion is the main open problem: we are able to show the conclusion if the following property holds. Take your network flow before the singular time, choose whatever sequence of times you want, take the networks at those times, enlarge or contract them as you want, and then take a limit of these rescaled networks. If you can take a limit, what you get is never something with multiplicity larger than one. In a way this means the following: suppose that in your enlarged networks, which are converging, you have two curves getting closer and closer, and in the limit you see what is apparently a single curve, but which is actually a curve with multiplicity two. This enlarging-and-taking-a-limit procedure is the blow-up procedure, which you possibly already saw in a previous lecture, and it is, well, one
possibility, but really the main line, in order to understand what happens at a singular time. Why is this multiplicity so important? Because this is a good situation: if I know that my limit line with multiplicity two comes from a situation like this, it is not so bad, because in a way I can separate the two sheets — the two curves that are converging — deal with the first and the second as graphs, and run a lot of arguments and estimates on them. But if I only see the limit and I don't know where it comes from, it could also come from two curves like this, shrinking, becoming thinner and thinner, which in the limit again produce a line with multiplicity two. So the multiplicity-one conjecture, which we hope and believe to be true, prevents this kind of phenomenon: if you have multiplicity one and you see a line in the limit, it comes, for instance, from a single curve approaching it like a graph. If you see a line with multiplicity two, you have no hint whether it comes from the good situation or from the bad one. And the bad one is really bad: you can imagine that in the first case the two curves approach the line while their curvature slows down — they are getting straight, so the curvature must decrease — whereas in the second case the two curves degenerate to the double line, which has zero curvature, while a lot of curvature concentrates in a region that vanishes in the limit. This is a situation you don't want to see, because in a way there is no connection between the limit and what happens a little bit before; in the first case, instead, the connection is clear. This is why the multiplicity-one conjecture is very important: it connects what you see in the limit with what happens immediately before the singular time. Tomorrow we will use this blow-up technique to show that the first condition is necessary at a singular time, and the same blow-up technique will be useful to understand, in general, the shape of the singularities in more complicated situations. OK, I can stop here today.
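For reference, the enlarging procedure mentioned above is usually formalized as parabolic rescaling; the following is the standard formulation, stated here as background rather than taken verbatim from the lecture. Around a space-time point $(x_0, T)$ one sets, for $\mu > 0$:

```latex
% Parabolic rescaling of the flow at (x_0, T):
\gamma^{\mu}(x,\mathfrak{t}) \;=\; \mu\,\bigl(\gamma\bigl(x,\;T+\mathfrak{t}/\mu^{2}\bigr)-x_{0}\bigr),
\qquad \mathfrak{t}\in[-\mu^{2}T,\,0).
```

This is again a flow by curvature, since the curvature rescales as $k^{\mu} = k/\mu$ while time is rescaled by $\mu^{2}$; the multiplicity-one conjecture discussed above asks that every limit of $\gamma^{\mu_j}$ as $\mu_j \to \infty$ have multiplicity one.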