Let me start from the last slide of last time. As I said, what we are at the moment able to show is that, under this conjecture (up to now still a conjecture), if you take a rescaling of the network flow and take a limit, you always get something with multiplicity 1. Under this conjecture we are able to show an improvement of the theorem that we saw the other time: if no length goes to 0, then the flow actually stays smooth, so there is no singularity at the time T. This means that at every singularity some catastrophic event must happen: either a whole region collapses, or a curve disappears; there is a change of structure. As I said, for the motion of a single simple embedded curve, at a singularity the curvature is unbounded. But actually, by Grayson's theorem, if you take a simple closed curve in the plane, it becomes convex at some point, gets rounder and rounder, and shrinks down to a point in finite time; so also there the length must go to 0. This is no longer true for non-embedded curves. Up to now I have only discussed the analog of embedded closed curves, namely embedded networks. One can also ask the same questions for immersed networks, where you allow the curves to intersect each other, but I am not going to discuss that. What is the main tool for proving this theorem, which is my goal today? The main tool is a technique that you possibly already saw last week: taking a blow-up, a rescaling of your flow. There are two main ways to rescale a flow and take some sort of limit. One is to parabolically rescale the flow in space and time, so that the rescaled flow is still moving by curvature, hoping to have enough compactness to take a limit and find a limit flow which is still a flow by curvature.
And then you want to possibly classify these limit flows. As we will see, for curvature flows, and for mean curvature flow in general, the way to classify the limit flows is by means of Huisken's monotonicity formula, which also holds for the network flow. What I am going to use instead is the other way of rescaling, also introduced by Huisken: what we call the dynamical rescaling of a flow. So we take S_t, the flow of our network, for time t varying between 0 and the singular time T, on which everything is smooth. Remember that I put myself in the situation where γ_t = γ_xx/|γ_x|² for every curve, which is the curvature times the normal plus some tangential velocity λ times the tangent. We proceed like this. We define a new time parameter τ, varying between −(1/2) log T and +∞, related to the original time by τ(t) = −(1/2) log(T − t). Then we choose some point x₀ in the plane (we still think of the simplified situation of a single triod; x₀ can be taken here, or here at one of the fixed end points, wherever) and we define the rescaled network: we take the original network at time t(τ) and rescale it around x₀ by the factor 1/√(2(T − t(τ))). That is the standard Huisken rescaling. So you get a new flow, defined on an infinite interval of time, differently from the previous one, which was defined on a finite interval. And, differently from a parabolic rescaling, this flow in the τ parameter is no longer a curvature flow of a network: the evolution equation has changed. I write it for the network, but what is encoded here is a set of curves, so it is like I am rescaling every curve γ by this operation.
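In symbols, the dynamical rescaling just described reads as follows (a sketch with the conventions of the lecture: T is the singular time, x₀ the chosen point):

```latex
% Huisken's dynamical rescaling of the flow around the point x_0
\tau(t) \;=\; -\tfrac{1}{2}\,\log\big(T - t\big)\,,
\qquad
\tau \in \Big[ -\tfrac{1}{2}\log T,\; +\infty \Big)\,,
\qquad
\widetilde{\gamma}(\tau)
\;=\; \frac{\gamma\big(t(\tau)\big) - x_0}{\sqrt{2\big(T - t(\tau)\big)}}\,.
```

The same formula is applied to every curve of the network, giving the rescaled networks S̃_τ.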
Then let us look at the evolution equation of the rescaled curve γ̃. What happens is that γ̃ moves by k̃, the curvature of the rescaled curve, plus γ̃ itself (together with a tangential term). So it is no longer a motion by curvature: a motion by curvature would be γ̃_τ = k̃ν̃ + λ̃τ̃, while after the rescaling the evolution law has the extra term γ̃. This is a simple computation, so let me show it here. Consider ∂γ̃/∂τ = (∂γ̃/∂t)(dt/dτ). First I compute ∂γ̃/∂t by means of the equation over there: differentiating the numerator gives γ_t/√(2(T − t)), and differentiating the rescaling factor gives (γ − x₀)/(2(T − t))^{3/2}. Then, since dτ/dt = 1/(2(T − t)), I multiply by the inverse of this derivative, that is, by 2(T − t). Now distribute this factor. In the first term the square root comes up, and you get γ_t √(2(T − t)), whose normal component is k√(2(T − t)). In the second term, the power 3/2 in the denominator against 2(T − t) leaves exactly the rescaling factor 1/√(2(T − t)), so this product is simply γ̃. The last observation is that when you rescale something by some factor, the curvature rescales inversely proportionally to the same factor: since we are rescaling lengths by 1/√(2(T − t)), the curvature of the rescaled curve is exactly k√(2(T − t)).
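The chain-rule computation just carried out at the blackboard can be condensed as follows (ν̃ and τ̃ denote the unit normal and tangent of the rescaled curve):

```latex
\partial_\tau \widetilde{\gamma}
\;=\; \frac{\partial \widetilde{\gamma}}{\partial t}\,\frac{dt}{d\tau}
\;=\; \left( \frac{\gamma_t}{\sqrt{2(T-t)}}
   + \frac{\gamma - x_0}{\big(2(T-t)\big)^{3/2}} \right) 2(T-t)
\;=\; \sqrt{2(T-t)}\,\gamma_t + \widetilde{\gamma}
\;=\; \widetilde{k}\,\widetilde{\nu} + \widetilde{\lambda}\,\widetilde{\tau} + \widetilde{\gamma}\,,
```

where in the last step one uses γ_t = kν + λτ and the scaling of the curvature, k̃ = √(2(T − t)) k (and likewise λ̃ = √(2(T − t)) λ).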
So this term is simply k̃, and you get the evolution equation above. Now, what is the idea? Since we have our original flow on the interval [0, T), we are enlarging the time interval and at the same time dynamically expanding the network, because the factor 1/√(2(T − t)) blows up as t approaches T: it is an expansion factor. In a way you are looking closer and closer at the point x₀, where your network is possibly arriving, and while it arrives you keep enlarging and looking at what happens. You want to get some limit, because such a limit is, in a way, a magnified vision of what is happening close to the point x₀ and close to the time T. So you want to take a limit of this rescaled flow, at least along a sequence, as τ goes to +∞. The shape of that limit tells you something about your network approaching the point x₀ as t goes to T. The limit can actually be taken, because we have enough compactness thanks to the estimates I mentioned yesterday. But once you have the limit, you still need more information about it. The technique, the result that tells us something about this limit, is Huisken's monotonicity formula, which I am quite sure you already saw last week. In the smooth situation, for a single curve, it says the following. Take the time derivative of the integral over the curve of the backward heat kernel e^{−|x−x₀|²/(4(T−t))}/√(4π(T−t)); let me call it ρ. Here x is the position vector of your curve; sometimes you find this written with H¹, the 1-dimensional Hausdorff measure, as the reference measure. In our situation we are dealing with curves, so we can use the arc-length parameter.
So if you want, all this can be written in ds, and if you want you can also write γ instead of x; usually it is written with x, which is easier, so let me fix that. OK, this time derivative, if your curve is moving by curvature, can be computed. It is a tricky computation, but in the end straightforward, and you get the same weight ρ multiplying minus a square, which for a curve is |k + ⟨x − x₀, ν⟩/(2(T − t))|². This is for a smooth curve moving by curvature. Now, the good fact is that this formula almost holds also for the motion of a network. We do the same: take this quantity, which is sometimes called Huisken's integral. For our triod we have three curves, so we sum over the curves 1, 2, 3 (for a general network, you sum over all the curves of the network). This kind of integral, this quantity evaluated on your network S_t, is a kind of weighted length of your network. Well, you get almost exactly the same result: on the right-hand side you find the integral over S_t of exactly the same quantity, plus an extra contribution given only by the three fixed end points, but no extra contribution coming from the triple junction. Indeed, in computing this derivative you start doing integration by parts on what you get, and integration by parts gives you boundary terms. Some boundary terms are related to the end points of the network, and they remain; I am going to write them in a moment. But, as in the computation that we did for the evolution of the length of a network, where there was also an integration by parts, the boundary terms at the triple junction add up to 0, so they disappear.
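Written out, the formula for a single curve moving by curvature is the following (k̲ denotes the curvature vector):

```latex
\rho_{x_0}(x,t) \;=\; \frac{1}{\sqrt{4\pi(T-t)}}\,
  e^{-\frac{|x-x_0|^2}{4(T-t)}}\,,
\qquad
\frac{d}{dt} \int_{\gamma_t} \rho_{x_0}\, ds
\;=\; -\int_{\gamma_t} \rho_{x_0}
  \left|\, \underline{k} + \frac{\langle x - x_0, \nu\rangle}{2(T-t)}\,\nu \,\right|^2 ds\,.
```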
In this situation, since this quantity is a kind of weighted length, here too the boundary contributions of the integration by parts at every triple junction add up to 0. So the corrected monotonicity formula for a network is actually the same; let me write it like this: plus some boundary terms, coming only from the end points of S_t, with no contribution from the triple junctions. So we have a monotonicity formula. We still call it a monotonicity formula even though, without the boundary terms, the right-hand side is clearly non-positive, so that the quantity would be decreasing during the flow, while here the boundary terms possibly interfere and the quantity is no longer monotone in general. But we can still use this formula. The boundary terms can be computed, but that is not particularly important now. And how do we use the monotonicity formula for our problem of understanding what we can get by sending τ to +∞? Well, the monotonicity formula speaks about the original, unrescaled flow, while there we have a rescaled flow. So the second step is to take this formula and rewrite it in terms of the rescaled flow. If you do that, you get a very nice formula, thanks to the right choice of the rescaling; that is possibly why one chooses this kind of dynamical rescaling. The monotonicity formula becomes very clean: you consider the τ-derivative of this simple integral over the rescaled network, where you see that the time-dependent weight is gone (before there was a term depending on t, here there is nothing, and σ̃ is the arc-length measure on the rescaled network), and it is equal to minus a quantity which you can recognize as the rescaling of the previous one, plus some rescaling of the boundary terms. In the case of a closed curve, if you do exactly the same procedure, you do not have the boundary terms: you simply have the clean formula.
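In the rescaled variables, the formula for the network takes the clean form described above (σ̃ is the arc-length measure on the rescaled network S̃_τ, and B̃(τ) collects the rescaled boundary terms at the end points):

```latex
\frac{d}{d\tau} \int_{\widetilde{S}_\tau} e^{-\frac{|x|^2}{2}}\, d\widetilde{\sigma}
\;=\; -\int_{\widetilde{S}_\tau} e^{-\frac{|x|^2}{2}}\,
  \big|\, \widetilde{k} + \langle x, \widetilde{\nu} \rangle \,\big|^2\, d\widetilde{\sigma}
\;+\; \widetilde{B}(\tau)\,,
```

with B̃ ≡ 0 in the case of a closed curve.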
Once we have this, we can integrate between a pair of values τ₁ and τ₂ in this interval. Integrating, the right-hand side integral moves to the left, and integrating the derivative you only get the values of the rescaled Huisken integrals at τ₁ and τ₂, plus the integral in time of the extra boundary terms. So what you see is that the left-hand side, which is non-negative, is bounded by the value at τ₁, the smaller time, plus this other term. About this other term, I am not going to show you the computation. I asked to upload, among the material of my first lecture, a survey on what we know at the moment on the network flow, and another paper with more details, where you can find all the computations that I did at the blackboard and several more details of these lectures; for instance, you can find the exact form of the boundary terms, if you are curious. So I ask you to trust me that this last integral is uniformly bounded, independently of the two rescaled times τ₁ and τ₂. This means that the left-hand side is bounded by the value at τ₁ plus some constant, independent of time. Then I can send τ₁ to the extreme left of the interval, the value −(1/2) log T, and τ₂ to +∞. Since the integrand is non-negative, I have monotone convergence, and I conclude that the integral over the whole infinite interval is bounded by something independent of τ: some constant depending, as you can imagine, only on the initial network. Indeed, the value at τ = −(1/2) log T corresponds to the time t = 0, so it is the rescaling of Huisken's integral evaluated on the initial network of your flow, plus some constant.
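Writing Ẽ(τ) for the rescaled Huisken integral and B̃(τ) for the rescaled boundary terms at the end points, the integrated estimate just described is:

```latex
\int_{\tau_1}^{\tau_2} \!\!\int_{\widetilde{S}_\tau}
  e^{-\frac{|x|^2}{2}}\,
  \big|\, \widetilde{k} + \langle x, \widetilde{\nu} \rangle \,\big|^2\,
  d\widetilde{\sigma}\, d\tau
\;=\; \widetilde{E}(\tau_1) - \widetilde{E}(\tau_2)
  + \int_{\tau_1}^{\tau_2} \widetilde{B}(\tau)\, d\tau
\;\le\; \widetilde{E}(\tau_1) + C\,,
```

with C independent of τ₁ and τ₂; sending τ₁ → −(1/2) log T and τ₂ → +∞ gives the finiteness of the integral on the whole interval.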
In particular, since the initial network is compact, you can compute this value, and in the end you get that all of this is bounded by some constant, strictly smaller than +∞. So it is finite: you have an integral over an infinite interval of something non-negative, and the result is finite. So there must be a sequence of times along which the integrand goes to 0, otherwise the integral would be infinite. So you can find a sequence τᵢ → +∞ such that the integrand, which is itself an integral, goes to 0. Actually, there must be a lot of such sequences, because the interval is infinite: in every family of intervals of infinite total measure you can find such a sequence, by the same argument; and one has to choose the right one. Anyway, along the sequence you have this limit, and now you take the rescaled networks associated to this sequence. We have enough compactness, pushing a little the inequalities that I showed yesterday on the curvature, the derivative of the curvature, λ and so on, to get enough geometric compactness so that, up to reparametrization of the curves of the rescaled network, you get a limit: the convergence is weak in W²,² locally (so every curve is locally in the Sobolev space W²,², with second derivatives in L²_loc) and strong in C^{1,α} locally. The limit network, which I call S̃_∞, possibly has multiplicities: as I said yesterday, there could be two lines getting closer and closer, and in the limit you do not see two lines, you only see one, because the two lines get superposed one on the other.
This is exactly why I need the multiplicity-1 conjecture: to exclude the possibility that this limit is obtained by superposition of different parts of the network. Moreover, we still have that the integral quantity goes to zero. On every ball, the Gaussian weight is bounded from below, so, since the whole integral goes to zero, on every fixed ball the quantity k̃ + ⟨x, ν̃⟩ must go to zero in L². Since we have weak convergence in W²,², the L² norm of the curvature is lower semicontinuous along this convergence, while the term ⟨x, ν̃⟩ is continuous, because it is the position times the normal vector and we have C¹ convergence. So on every ball, the integral of |k̃ + ⟨x, ν̃⟩|² on the limit is bounded by the liminf along the approximating networks, which is zero. So this quantity must be zero on every ball; and if it is zero on every ball, distributionally in W²,², then it is zero everywhere. Now, if the equation k̃ + ⟨x, ν̃⟩ = 0 holds in W²,², a priori the curvature is only an L² function; but ⟨x, ν̃⟩ is a continuous function, because of the C^{1,α} convergence. So you have a bootstrap argument: if the position is C^{1,α}, then ⟨x, ν̃⟩ is continuous; hence the curvature, which was only in L², is actually continuous, which means that the position is C². Then you go on with the bootstrap in a standard way, and you conclude that these limit networks, which we call shrinkers (networks satisfying this equation are called shrinker networks), are actually smooth, composed of smooth curves, and the equation holds in the classical sense; indeed they are C^∞. Why do we call a network satisfying this equation a shrinker? Because if a network satisfies this equation and you let it evolve by curvature, it simply shrinks down homothetically to the origin.
All this is simply an extension of the work of Huisken for hypersurfaces, in particular for curves, and it can be extended to the network flow. The bad point, which is a problem also in classical mean curvature flow in several situations, is the following: I told you that this holds for a subsequence of rescaled times, and I still have to extract a further subsequence in order to find a convergent one, by a compactness argument. So, possibly changing the subsequence going to +∞, you can get a different limit: possibly two different networks coming out of this procedure. This is the problem of the uniqueness of the blow-up limit. Also in the standard smooth case of mean curvature flow it is an open problem in several situations, and in our situation it is open. So you could possibly have two different shrinker limits, even if, in some way, they both describe the behavior of your network approaching the point x₀ as t goes to T. What is a regular shrinker? "Regular" means the same as for regular networks: you only allow triple junctions, with angles of 120 degrees between the concurring curves. For instance, this configuration is a shrinker: four lines crossing at right angles. It is easy to see that the equation is satisfied: k is 0, the curvature is 0, and the projection of the position vector on the normal is clearly 0. More generally, every finite collection of lines through the origin is a shrinker, but a non-regular one. In this family there is only one regular shrinker: having only triple junctions means that you have at most three half-lines, and the three angles must be 120 degrees. Among all these configurations there are a lot of shrinkers, but only one regular one, which we call the flat regular triod. Moreover, also a single line through the origin is a shrinker, the one in this family without triple junctions. But there are several others.
As I said, if you take a shrinker and write down the evolution, you get a motion by homothetic contraction: t is a time parameter moving in (−∞, 0), so it is an ancient solution, and your shrinker is precisely the flow at time t = −1/2. If your network is a shrinker, then this homothetically shrinking evolution is actually a curvature flow. Let me show you some examples more complicated than simple lines. There is one with this topology, which is a shrinker: it was invented by Ken Brakke, in his book on the motion of a surface by its mean curvature, where, working with varifolds, he gave the definition of Brakke flow; it is called the Brakke spoon. This one is called the lens, and I believe its existence was proven rigorously by a group of people including Oliver Schnürer and Felix Schulze. And also this one (I do not remember, sorry, who proved its existence) is sometimes called the fish. Then there is this one, whose existence I do not think is rigorously proved, and for sure the existence of these other ones is absolutely not proved. And let me tell you that it can get very complicated. These ones here, one, two, and three, for instance, are non-compact shrinkers: this one has four half-lines going off to infinity, and it comes from a very nice collection of shrinkers that Tom Ilmanen was able to produce. Again, the existence of several of these is not proven rigorously. There is a kind of classification by the number of regions they enclose. With no regions, you only have the straight line and the flat regular triod. With one region: a circle is a shrinker, with the right radius, namely radius 1; then the Brakke spoon, the lens, the fish (the names come from Tom Ilmanen). These are the four with only one region inside. With two regions, things get more complicated.
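As a small sanity check, here is a numerical sketch (not from the lecture; the discretization parameters are an arbitrary choice) verifying that the unit circle satisfies the shrinker equation k + ⟨x, ν⟩ = 0, with ν the inner unit normal:

```python
import numpy as np

# Discretize the unit circle; theta is also the arc-length parameter here.
n = 2000
theta = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
x = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# Curvature vector x_ss via periodic central second differences.
ds = 2 * np.pi / n
xss = (np.roll(x, -1, axis=0) - 2 * x + np.roll(x, 1, axis=0)) / ds**2

nu = -x                                      # inner unit normal of the unit circle
k = np.einsum('ij,ij->i', xss, nu)           # scalar curvature <x_ss, nu> (= 1 here)
residual = k + np.einsum('ij,ij->i', x, nu)  # shrinker equation k + <x, nu>

print(np.abs(residual).max())  # ~ 0, up to discretization error
```

For any other radius the residual would not vanish, which is the sense in which radius 1 is "the right radius".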
Compact or non-compact, more and more complicated; and I stop here. There are also non-embedded examples. And not all shapes are possible. Let me tell you that after the second line of this classification everything is conjectural: there is good numerical evidence, but even these shapes are conjectural. And, again by numerical evidence, the shapes in the last line are not possible, with one exception: for this one, the theta, we proved rigorously, with my two colleagues in Naples, Pietro Baldi and Emanuele Haus, that the shape is impossible for a shrinker. The fact that this shape is impossible is actually the reason why a network with the shape of a theta cannot vanish instantaneously, as a whole, at a single time: if there is no shrinker with a given shape, then a network with the same topological shape cannot vanish as a whole at a single time. Anyway, there are a lot of geometric conjectures about these objects. For instance, the strongest conjecture of Tom Ilmanen is that the topological possibilities are actually finite: there is a bound from above on the number of regions with which you can construct a new shrinker; at some point you cannot go on. But at the moment it is only a conjecture. So there are several possible blow-up limits that you can get. And, as you can imagine, the possibilities are related to the topology of the original moving network, with the idea that when you take a limit, the topology cannot get more complicated: if you have a moving network with low topological complexity, the blow-up limit can have at most the same complexity. So the lower the topological complexity of the moving network, the fewer blow-up limits you can get by this procedure. OK, this is the general framework.
Now I want to apply these techniques, all these ideas, to prove the theorem which was my goal today: if no length goes to 0 and the multiplicity-1 conjecture holds, then the curvature stays bounded. Remember that at a singular time either the curvature is unbounded or some length is not bounded away from 0; so, if I am able to show this, T cannot be a singular time. It is a kind of contradiction argument: T cannot be the maximal time, so the flow is smooth. The idea is to look at the possible blow-up limits and try to get information from them. When you take the blow-up limit, since we are expanding networks, if you put yourself at the wrong point, for instance at a point far from your network, and start expanding around it, you are sending your network off to infinity, and you get nothing in the limit: the empty set. This can happen. If the limit is not empty, the first consequence of assuming the multiplicity-1 conjecture is that the limit is embedded. Indeed, suppose your limit network had a self-intersection like this. Since it is a limit of rescalings of networks of the flow, and the convergence is in C¹, one moment before passing to the limit you would already be close to this configuration, so the rescaled network would already be self-intersecting. But it was a rescaling of a network of the flow, and we already saw that during the flow the network cannot intersect itself: it cannot lose embeddedness. So this is really impossible. But also this other situation is impossible, where there is a touching with a common tangent. Because if I have a touching with a common tangent, looking back and choosing the right point, you can enlarge and enlarge with the right factor, taking an enlargement of something which is already enlarged.
And what you produce is actually a double line, which is excluded by the multiplicity-1 conjecture. This is a little rough, but the idea is like that: if you have a contact with a common tangent, enlarging with the right factor you see in the limit a double line, which is forbidden. So our limit shrinker is embedded, with no self-intersections. Then, we are assuming that all the lengths are bounded away from 0; but you are enlarging by a factor going to +∞, so all the lengths go to +∞. That means that all the curves you find in the limit shrinker must have infinite length. So configurations like these are not allowed, and you have excluded a lot of them. The reason is that every curve belonging to a shrinker must satisfy an equation. Suppose γ is a curve of our shrinker; then γ must satisfy k + ⟨γ, ν⟩ = 0. What is k? Putting a normal ν here and here, k is the scalar curvature, and the curvature vector is γ_ss; so, in vector form, the equation is γ_ss + ⟨γ, ν⟩ν = 0. Moreover, if I go back to the equation and differentiate in arc length, I consider k_s plus the s-derivative of ⟨γ, ν⟩. When I put the s-derivative on γ, I get the tangent, and tangent times normal gives no contribution; when instead I put the s-derivative on the normal, I get minus the curvature times the tangent. So I obtain k_s − k⟨γ, τ⟩ = 0, which can also be rewritten, if k does not vanish, by dividing by k.
Now this is an ODE in the arc-length parameter. Suppose that at some point the curvature is 0. Then at that point also k_s must be 0, so by the uniqueness theorem for ODEs the curvature must be 0 everywhere. Hence, if the curvature vanishes at a single point, it vanishes everywhere, and you are dealing with a straight line or a segment. If instead the curvature is non-zero at some point, it is non-zero everywhere, and it does not change sign; so we can assume, for instance, that it is positive. So there are two possibilities: either k is identically 0, hence a segment, or k is always positive. In the second case I can divide by k, getting (log k)_s = ⟨γ, τ⟩, and the curve is a piece of a convex curve. And if you look at this equation, you also see that the convexity must be toward the origin: being a shrinker, this curve must move closer to the origin during the flow; if the convexity were in the opposite direction, it would move away. The second observation is that if you have a segment, the only way to satisfy the equation is that the segment is part of a line passing through the origin: since k = 0, we need ⟨γ, ν⟩ = 0 along it, so it cannot be a generic segment. There is another consequence of this ODE that I want to use, but first: the curves must have infinite length, by what I said before. So in the segment case the curve must be at least a half-line. In the other case, k is always positive, so the curve is convex and has infinite length; it starts rotating around the origin, and it cannot turn back, because it is convex. So the only possibility is that it starts winding like this, since it cannot self-intersect. Now I want to exclude this possibility. There is an analysis by Abresch and Langer, in which they classify all the possible pieces of curves of a shrinker, and there is a relevant result.
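The dichotomy just described comes with a useful first integral; differentiating the shrinker equation in arc length (with the convention ν_s = −kτ), as in the computation above:

```latex
k + \langle \gamma, \nu \rangle = 0
\;\Longrightarrow\;
k_s - k\,\langle \gamma, \tau \rangle = 0
\;\Longrightarrow\;
(\log k)_s = \langle \gamma, \tau \rangle = \tfrac{1}{2}\,\partial_s |\gamma|^2
\;\Longrightarrow\;
k(s)\,e^{-\frac{|\gamma(s)|^2}{2}} \;\equiv\; \mathrm{const}\,,
```

so in the non-flat case the curvature grows exactly like e^{|γ|²/2} with the distance from the origin.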
Their analysis is absolutely non-elementary, very smart, because this equation looks innocuous but is actually quite complicated. A lemma that follows from their work is that a curve satisfying this equation must cross itself at every full turn: it cannot wind around without self-intersecting. I can give you a heuristic argument. Look at the equation: here is the origin, the position is here, and the normal, in this situation, points more or less in the same direction. The point is that the term ⟨x, ν⟩ is larger in size the farther you are from the origin, and smaller the closer you are. So, by the equation, the farther you are from the origin, the more you are curving; the closer you are, the less you are curving. In the extreme case, if you want to pass through the origin, there x = 0, so your curvature is 0 there; but then you are a straight line. So only straight lines can pass through the origin. And since the farther you go, the more you curve, a consequence is that these curves either are lines, or they live in a compact set: they cannot go too far. Moreover, this picture is wrong: when I am out here, I must curve a lot, so when you make a turn, you cross yourself. That is why an infinite-length curve with positive k must self-intersect. Since in our situation the curves of the shrinker must have infinite length, by this argument they would have to cross themselves, so they simply cannot be present. Therefore, with the bound from below on the lengths, the blow-up limit shrinker must be composed only of half-lines, half-lines that are parts of lines passing through the origin. But then they must actually originate at the origin.
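To illustrate this heuristic numerically, here is a sketch (not from the lecture; the initial data are an arbitrary choice satisfying the constraint k = −⟨γ, ν⟩) that integrates the arc-length ODE system for a shrinker curve with an RK4 scheme and checks that the shrinker constraint and the first integral k e^{−|γ|²/2} are both preserved:

```python
import numpy as np

# State = (x, y, th, k): position, tangent angle, curvature, as functions of
# arc length s. Conventions: tau = (cos th, sin th), nu = (-sin th, cos th),
# so that th_s = k and nu_s = -k tau. The shrinker ODE is k_s = k <gamma, tau>.

def rhs(state):
    x, y, th, k = state
    tx, ty = np.cos(th), np.sin(th)
    return np.array([tx, ty, k, k * (x * tx + y * ty)])

def rk4_step(state, h):
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * h * k1)
    k3 = rhs(state + 0.5 * h * k2)
    k4 = rhs(state + h * k3)
    return state + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def invariants(state):
    x, y, th, k = state
    nux, nuy = -np.sin(th), np.cos(th)
    constraint = k + x * nux + y * nuy            # shrinker equation, should stay 0
    first_int = k * np.exp(-(x * x + y * y) / 2)  # should stay constant
    return constraint, first_int

# Initial data satisfying k = -<gamma, nu>: gamma = (1.2, 0), tangent straight up.
state = np.array([1.2, 0.0, np.pi / 2, 1.2])
c0, I0 = invariants(state)
h = 1e-3
for _ in range(3000):                             # integrate up to arc length 3
    state = rk4_step(state, h)
c1, I1 = invariants(state)
print(abs(c1 - c0), abs(I1 - I0))  # both ~ 0
```

Both quantities are exactly conserved by the continuous ODE, so the printed drifts measure only the accuracy of the integrator; the conserved first integral is what forces the curvature to grow like e^{|γ|²/2} with the distance from the origin.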
Indeed, suppose you have a half-line that stops here, with the origin over there. Since you have C¹ convergence, this limit of regular networks must still have regular triple junctions, so angles of 120 degrees. This means that if your half-line stops here, there are two other curves starting from this point, forming angles of 120 degrees with it. And they cannot be half-lines of our kind, because they would not pass through the origin. So the only possibility is that all the half-lines originate at the origin. But then, since the only regular configurations of this kind through the origin are a single line and the flat regular triod, these two are the only possible limits that you can find by this procedure. In general, by this argument, every unbounded piece in the limit, which means a piece of infinite length, must be a half-line or a line. So in the end you conclude that the limit network coming out of this procedure is either a flat regular triod through the origin, or a straight line, or the empty set, as I said at the beginning. There is one exception: all this is true if x₀ is not an end point of the network; end points require a special treatment. But I think you can believe me that if you put x₀ at an end point, what you get after the rescaling is a single half-line. So in the special case where x₀ is an end point of the network, the blow-up produces a single half-line. Anyway, I do not want to discuss this case too much; as I told you at the beginning, the end points are usually dealt with by a reflection argument. You reflect your network centrally around the end point, then you consider the new network, where this point is no longer an end point but an interior point, and you use the analysis for the interior points. So this is more or less what I wrote: in the end you get only these possibilities, with the single half-line only in the end-point situation.
And with some effort, even without assuming the multiplicity 1 conjecture, one can show that the triode or the halfline automatically has multiplicity 1. For the straight line, instead, there is no way: it is a problem also in the smooth case, in the classical case of closed curves, to exclude that the line obtained here has multiplicity larger than 1. So we absolutely need to assume it. Actually, the multiplicity 1 conjecture can be weakened a little: asking only that you cannot get double lines would be sufficient for all the conclusions. But OK, we assume the multiplicity 1 conjecture, so all the limits we find have multiplicity 1. Notice that they all have zero curvature: the curvature is gone in the limit. So, morally: you take your evolving network, assuming the lengths do not go to zero; then the other theorem tells you that the curvature must go to plus infinity; then you rescale things, and when you rescale in that way, a la Huisken, the curvature is not going away, it is always there, and you expect it to still be present in the limit. Instead, in the limit the curvature is gone. So if one could formalize the argument that you are not enlarging too much, that the curvature is still there and some curvature must survive in the limit, you would get a contradiction, because all the limits we found have no curvature at all. And they cannot hide curvature in the triple junction, or do what I described yesterday, two very curved pieces whose concentrated curvature vanishes in the limit, because this is prevented by the multiplicity 1 conjecture. But this is only heuristics. What you can actually do is the following: if you get the empty set, it simply means that your network is not getting close to your point, so you can forget about that point.
The meaningful points are the reachable points, the points where the network is arriving as t gets close to the singular time T. If you get a line, there is a very useful and famous theorem of Brian White, which says that if the blow-up of a curvature flow of curves is a multiplicity 1 line, then the curvature around the point x0 is locally uniformly bounded in time, up to time T. So it is a local regularity theorem: whenever this procedure produces a multiplicity 1 line as limit, locally the curvature stays bounded. Locally in R2, the curvature stays bounded. It is a very strong and useful theorem of Brian White, and it was more or less the basis for the generalization I am about to discuss. For a halfline, you perform the reflection argument and you get back to the White situation: if you get a halfline here and you reflect, you get the other halfline as well, so in the end you get a line with multiplicity 1 and you are back to White's theorem; hence, also in this case, the curvature is bounded. The non-elementary generalization was to extend White's theorem to the case in which you get a triple junction like this. We did this, more or less independently and at the same time: Matteo Novaga, Annibale Magni and myself on one side, and Tom Ilmanen, Andre Neves and Felix Schulze on the other. The conclusion is the same as in White's theorem: if you take a blow-up limit and you get an infinite flat regular triode, then locally the curvature is uniformly bounded in time. So, since we can only get these limits, and in all these situations the curvature is locally uniformly bounded in time, you conclude that the curvature of the whole network is uniformly bounded in time, which is a contradiction, because we have a theorem saying that if the lengths do not go to 0, the curvature must be unbounded. And this is the conclusion of the proof.
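Schematically, the White-type statement used here has the following shape (my paraphrase; the radius r and the constant C are illustrative, not the precise ones):

```latex
% Paraphrase of the White-type local regularity statement
% (radius r and constant C are illustrative):
\[
  \text{every blow-up limit at } x_0 \text{ is a multiplicity-1 line}
  \ \Longrightarrow\
  \sup_{t \in [0,T)}\ \sup_{B_r(x_0)} |k| \;\le\; C \;<\; +\infty.
\]
% The generalization replaces "line" with "standard flat triode"
% and reaches the same local conclusion.
```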
Let me only make a couple of comments; again, I will be a little rough. The proof by Ilmanen, Neves and Schulze of the analogue of White's theorem goes more or less along the same lines: it is a generalization, with the same techniques, of the case of a line to the situation of a triple junction. And, like White's theorem, which is a kind of local regularity result for mean curvature flow or for curvature flow of curves, this result must be seen as a local regularity result for the motion of regular networks. Indeed, for a general network, if the lengths are not going to 0, the triple junctions are not getting too close to each other, because when they do get too close, one can prove that the length of the curve connecting them goes to 0. So a bound from below on the lengths means the triple junctions do not get too close; and if they do not get too close, then, restricting the region you are looking at, you can only see a single triple junction, and for a single triple junction, after the blow-up, no other triple junction comes in. So you look at things locally, you get an infinite flat triode, and then you use the theorem. It is a kind of local regularity result for the network flow, analogous to White's theorem. And why was our proof different, following a different line from White's? We also used, at some point, White's result. The idea was this: you have this flow here, which is moving, and you know, by the hypothesis that the blow-up is a triple junction, that enlarging and enlarging, you keep seeing things getting closer and closer to the flat triode. So the curvature apparently is going away, but the scales you are looking at are smaller and smaller. At some point, you are very close to an infinite flat triode, and White's theorem tells you that, in the annulus, you are getting close to this straight segment, to this one, and to this one.
White's theorem provides an estimate on how close you are: not only in C1, but in C infinity, actually, or C2, which is sufficient; you are getting very, very close in a strong norm here. Then, using some geometric-analytic ideas taken from the papers of Grayson and Angenent, and in particular some estimates of Ecker and Huisken, what you can show is that you can control the rate at which the curvature goes to plus infinity, if it does, inside the annulus. In a way, the curvature can go to plus infinity, but the interior estimates of Ecker and Huisken tell you that it can only do so after some waiting time, and this time is longer the closer you are to something with low curvature, like here. So at some point, the Ecker and Huisken estimates tell you that it takes too much time for the curvature to become unbounded, more than the time remaining before the maximal time T. That is a contradiction: the curvature could only go to plus infinity after T plus epsilon, which means that at time T it did not go to plus infinity at all. So you get an estimate of this kind, because you are very close to something with zero curvature; even if inside here you are not, it is sufficient to be very close in a suitable annulus like this. So it is a matter of estimates of this kind. And these things can be generalized when you get different blow-up limits, again with zero curvature: not only in this situation, but also, for instance, in a situation that we will see tomorrow, where you have two lines crossing, that is, four halflines, forming angles of 120 and 60 degrees. You will see that also when you take a blow-up limit in that situation, the curvature must be bounded; the argument can more or less be carried out along the same lines. Morally: when the blow-up limit has zero curvature, usually the curvature of the original flow was bounded; it cannot go to plus infinity. But this is only morally.
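The shape of the interior estimate being invoked can be sketched as follows (illustrative constants, not the precise Ecker and Huisken statement):

```latex
% Schematic shape of an interior estimate of Ecker--Huisken type
% (illustrative; not the precise statement):
\[
  \sup_{B_r(x_0)} |k(\cdot,t_0)| \le C_0
  \ \Longrightarrow\
  \sup_{B_{r/2}(x_0)} |k(\cdot,t)| \le 2\,C_0
  \quad \text{for } t \in [t_0,\ t_0 + c(C_0,r)],
\]
% with a waiting time c(C_0,r) > 0 that is longer the smaller C_0 is.
% Being close in a strong norm to a zero-curvature configuration makes
% C_0 small, so the time needed for |k| to blow up exceeds T - t_0.
```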
To show this rigorously is not so easy. OK, so, five minutes left. From now on, we assume the multiplicity one conjecture, and we conclude that at a singular time, some length must go to zero. There are two situations: either a single length goes to zero, or a region collapses, with the lengths of all the curves bounding that region going to zero. Anyway, we can separate into two cases: either while the length goes to zero, the curvature stays bounded; or while the length of some curve goes to zero, the curvature goes to plus infinity, or better, in general, is simply unbounded. And we want to analyze these two cases. Actually, in the smooth situation we do not have this fact that a length goes to zero; we only have that the curvature must be unbounded. And it is possible to work out an estimate, which also has an analogue in higher dimension, saying that approaching a singular time, the maximum of the modulus of the curvature is bounded from below by a constant over the square root of T minus t. It cannot go to plus infinity more slowly than this rate; it must blow up at least at this rate. Let me mention that the analogous statement, which we believe holds for networks, is actually an open problem: we are not able to prove it. We have another rate from below on the blow-up of the curvature, but far from this one. Given that this is the minimal rate of blow-up of the curvature, one can then classify singularities by a bound from above: if you also have a bound from above of the same order, one over the square root of T minus t, these are called type one singularities. I think the terminology was introduced by Hamilton for the Ricci flow; possibly, someone knows this better than me. If instead you do not have this bound from above, so the blow-up of k max at time t is worse than this order, you say you have a type two singularity; these are usually the more difficult ones to deal with.
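In formulas, writing $k_{\max}(t)$ for the maximum of $|k|$ at time $t$ (my shorthand), the minimal rate and the dichotomy read:

```latex
% Minimal blow-up rate of the curvature at a singular time T (smooth case):
\[
  k_{\max}(t) \;\ge\; \frac{C}{\sqrt{T-t}} \qquad \text{as } t \to T^-.
\]
% Type I / type II dichotomy:
\[
  \text{type I:}\ \ k_{\max}(t) \;\le\; \frac{C'}{\sqrt{T-t}},
  \qquad
  \text{type II:}\ \ \limsup_{t \to T^-} \sqrt{T-t}\;k_{\max}(t) \;=\; +\infty.
\]
```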
But, following a suggestion of Tom Ilmanen, one can say that the network flow also has examples of type zero singularities, meaning singularities with bounded curvature, with curvature not going to plus infinity. What is happening, as I am going to show you tomorrow, is that you can have collapse of curves with bounded curvature. And this goes in two directions, which I will try to show tomorrow and possibly also Friday: there will be two situations, collapse of a curve with bounded curvature, and collapse of a region with unbounded curvature. And conversely: if you have a singularity with unbounded curvature, there must be a region collapsing; and if you have a singularity with bounded curvature, only a single isolated curve (isolated, I will tell you in a moment what this means) can be vanishing, not a region. So there is a dichotomy related to the behavior of the curvature: bounded, a single curve collapses; unbounded, a region collapses. And what does isolated mean in this situation? Suppose that this curve is vanishing during the flow as t goes to T; then you get what I mentioned several times, the curve disappears. In this case, we will see that the curvature is bounded; and if the curvature is bounded, this is the only case. But one can say: OK, I can imagine that no region is collapsing, but two curves, this one and that one, are both collapsing at the same time; so the regions are not collapsing, only two curves. What happens in this situation, if we have something like this? Well, this curve is collapsing, and you have to remember that here there are angles of 120 degrees, like here, here and here. If you collapse these two curves, you see that the two curves that survive have the same tangent: this curve, call it one, arrives at the junction, and curve two becomes another curve arriving with the same tangent. And the same happens here, with curves three and four.
Now, with an argument similar to the one I used before to exclude a non-embedded limit network: also here, if you have two curves arriving at the same junction with the same tangent, then with an appropriate choice of points and of the rescaling, getting very close to this point and enlarging just a little, I can again produce a double line, which is excluded by the multiplicity one conjecture. So two curves close to each other cannot collapse together, because of this argument. Then one says: OK, I try with three curves. When can they collapse? Well, if all these regions collapse, then they can; but that is exactly what I want to exclude, a region collapsing. If you try, one has to go through some cases: one can try with three curves, one, two and three, all collapsing. Remember, you always have 120 degree angles. This case is even easier, because if you collapse these three, the two curves that are not collapsing come to cross each other, and this is easier to exclude: you do not even need the multiplicity one conjecture, you simply exclude it by the impossibility of self-intersections. So again, in such a collapsing situation, either the curves must cross, or the whole region must collapse. And the same if you have five curves together; beyond that, it becomes pointless to go on, because it is easy to conclude that you cannot collapse without a whole region collapsing, otherwise you get self-intersections, which are excluded. The only delicate situation, requiring a slightly careful argument and the use of the multiplicity one conjecture, is the situation of two curves, because in the limit you get something like this, which apparently could be fine, but is excluded by the multiplicity one conjecture. So the two real cases will be: a single isolated curve collapsing (possibly several single curves, not touching each other), or a region collapsing.
And in the two cases: in the first, the curvature is bounded; in the second, the curvature is unbounded. These two cases describe all the possible singularities of the network flow. The first case we are able to treat in detail, up to the multiplicity one conjecture. For the second case, instead, we have a less precise description of what is going on, and we also need another assumption which at the moment is only conjectural. OK, I stop here.