Let me first thank Frank, Jeremy, and Tomás very much for the invitation to speak at this conference, and also the other organizers for putting together such a great program; it's great to be here. I apologize to those of you who have heard me speak at least once before, and I can see Alex there: like Mihail said, feel free to prove a lemma of your own while I talk. Because of the mixed audience, the plan is first to review a little of the work of Bourgain, which sits in the background of all of this; then to talk about some recent work with Gigliola Staffilani on the cubic equation in 2D and the focusing quintic in 1D; and then to discuss some more recent work, which we are still writing, with Zaher Hani, Jonathan Mati, and Gigliola. As we move between these parts, keep one distinction in mind: the first two are about Gibbs measures, that is, about equilibrium states, while the last part is about non-equilibrium states.
I am going to talk about the Schrödinger equation, and for the purposes of this talk mostly about the periodic case, on the square torus rather than irrational tori. Also for the purposes of this talk, every time I say critical, supercritical, or subcritical, I always mean relative to scaling. As everybody here knows, because we have people in the room who did this work, there has been a lot of progress in studying these equations, most of it on the deterministic side of the phenomena. Still, some questions remain: in the supercritical case, even for defocusing equations, we do not know whether there is blow-up, and there are also gaps in the well-posedness theory, as we will see.

Here is a quick review of work you all know on R^d. In the subcritical regime there is the work of Bourgain and the work of Colliander and collaborators; for critical equations there has been a lot of fantastic work, which I will not attribute name by name, together with some conditional results at intercritical regularities; and in the supercritical regime, as I said, essentially nothing is known, even in the defocusing case.

In the periodic case the situation is much more limited and the results are fewer; there is a problem already at the level of the local well-posedness theory. Bourgain proved in 1993 that dispersion in the periodic setting is indeed weaker: for example, the L^6 Strichartz estimate in 1D and the L^4 estimate in 2D really do lose ε derivatives; he constructed explicit examples. What this means for the local theory is that although, for example, the cubic equation in 2D and the quintic in 1D are L²-critical, you cannot close the estimates at the level of L²: you need s strictly bigger than zero, and there are no results for local well-posedness in L²; that is open.

In the same paper Bourgain proved local well-posedness in the subcritical regime, as the examples show, and of course you have global well-posedness at the energy level. What about global? There was work for small data by Herr, Tataru, and Tzvetkov, and then the breakthrough: the first large-data global result, by Ionescu and Pausader, who did the large-data quintic defocusing equation in 3D. After that breakthrough, a student of mine did the cubic defocusing NLS in 4D as part of his thesis; as I said, there was a small-data result before. I should also mention work of Killip and Vişan, who gave a different proof of the small-data result and new proofs in the irrational case. But I am going to focus on the rational case, and the reason is that in the probabilistic setting one still relies heavily on analytic number theory, and we still do not know how to treat many of these questions on irrational tori.

So the point is that at the L² level even local well-posedness is not known. The point of view one wants to take, then, is to study these equations from a nondeterministic point of view, and this brings us to Bourgain, which is the starting point: in 1994–96 he studied the long-time dynamics of periodic equations in the almost sure sense and showed that you have global well-posedness on a set of data of full Gibbs measure. Gibbs measures for NLS had been constructed before, for example for the cubic defocusing NLS in 1D, 2D, and 3D: by Glimm and Jaffe in the context of quantum field theory, the φ⁴ model, or the stochastic Allen–Cahn equation; and then in the context of Hamiltonian systems by Lebowitz, Rose, and Speer, who also treated the focusing case in one dimension.
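Since criticality is always meant relative to scaling here, it may help to record the standard bookkeeping; this is background I am writing out for convenience, not something stated explicitly on the slides:

```latex
\[
 i\,\partial_t u + \Delta u = \pm |u|^{p-1} u \quad \text{on } \mathbb{R}^d,
 \qquad
 u_\lambda(t,x) := \lambda^{\frac{2}{p-1}}\, u(\lambda^{2} t,\, \lambda x),
\]
\[
 \|u_\lambda(0)\|_{\dot H^{s}} = \lambda^{\, s - s_c}\, \|u(0)\|_{\dot H^{s}},
 \qquad
 s_c = \frac{d}{2} - \frac{2}{p-1}.
\]
```

Data in H^s with s > s_c is subcritical, s = s_c critical, s < s_c supercritical. Both the cubic equation in d = 2 (p = 3) and the quintic in d = 1 (p = 5) give s_c = 0, i.e. they are L²-critical, consistent with the remark above that the ε-loss in the periodic Strichartz estimates forces the local theory to live at s > 0.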
Here, in case you don't know: invariance just means that if you take a set of initial data and flow it, the flow preserves its measure. So why does this work? Usually, when you sit in the statistical ensemble, on the support of the measure, as I will show in a minute, you are at a regularity which is very rough, and there are no conserved quantities at that level. So why can you get global solutions? The point is that, assuming you have a local well-posedness theory, and it does not have to be deterministic, it can be probabilistic, the invariance of the measure acts as a conserved quantity, and that allows you to continue the solutions from local to global at a level of regularity where there are no conserved quantities. That is the key. And the virtue of doing this is that it captures generic behavior; I will say more about that in a minute. What do I mean by generic? Generic in the sense of probability. Infinite dimensions is a big place, but all right.

Of course there are limitations and challenges. One, for a given PDE, is the actual construction of these measures: it is not that easy. They are constructible without much trouble on finite-volume domains; in infinite volume there are problems; and if the PDE is not Hamiltonian it is not always clear how to proceed. Moreover, as I said, in 2D and 3D in the focusing case you cannot construct the measure directly at all: it was proved that you have to renormalize the nonlinearity, removing certain divergences, certain infinities, for the measure to make sense. In the focusing case in 2D, Brydges and Slade proved that there is in fact no Gibbs measure: the measure is not normalizable. The measure is also not available in higher dimensions, four and above, and even in 3D, if instead of the cubic you take the quintic, it is known that the same renormalization procedure produces something trivial, the Gaussian measure, so there is no Gibbs measure there either. And if you take something like the defocusing quintic in 2D or the cubic in 3D, for which a measure does exist, the problem, and I will show you why in a minute, is that to prove invariance you of course need a global flow. You can produce weak solutions, but if you want genuine global well-posedness, then as you go up in dimension the support of the measure sits in rougher and rougher spaces, and it is actually not clear how to do this; it is challenging. The defocusing quintic in 2D might be within reach thanks to new techniques, paracontrolled distributions in the sense of Gubinelli and coauthors, or regularity structures, but it is a little too soon to talk about that.

So, quickly, how is this measure defined? I am going to need this. You think of your equation as an infinite-dimensional Hamiltonian system in the Fourier coefficients. Take one dimension and nonlinearity power p ≤ 5; Lebowitz, Rose, and Speer said: you can construct a measure that formally looks like "dμ = Z⁻¹ e^{−H(u)} du", with the Lebesgue measure "du", the kinetic energy, and the potential energy. If you look at this, it is total nonsense: there are three factors here, the Lebesgue measure, the kinetic part, and the potential part, and every single factor is infinite. So it makes no sense as written, but it is a gesture toward what you want to do. You cannot even define the Lebesgue measure in infinite dimensions; what you do instead is you
construct it in two steps, using a Gaussian measure as a reference measure and then building the Gibbs measure as a weighted measure on top of it. The Gaussian measure, on some Hilbert space, whatever you wish, is constructed as the weak limit of finite-dimensional measures: in the finite-dimensional density you see the kinetic part of the Hamiltonian together with the Lebesgue measure, which, since the system is Hamiltonian, Liouville's theorem tells you is preserved by the flow. This ρ_N can be viewed as an induced probability measure, the law of the map ω ↦ Σ_{|n|≤N} g_n(ω)⟨n⟩⁻¹ e^{inx}, where from now on the g_n are always i.i.d. complex Gaussian random variables with mean zero.

The most important point now is to understand the Gaussian measure as a weak limit of these finite-dimensional approximations. It is easy to see that the approximations are finitely additive but not automatically countably additive, and there is a theorem that tells you exactly when they are countably additive. The upshot: in 1D, on H^s the limit is countably additive only for s < 1/2, not for s ≥ 1/2; in 2D it is countably additive only below L², and above you have probability zero; in 3D only below H^{−1/2}, and probability zero above. This is exactly what guides you to the support of the measure, and that is the regularity where your data is going to live: in 1D, for the Gaussian measure, you have to sit in H^{1/2−}; in 2D just below L²; and so on. You see that as you go up in dimension you get very, very rough. Once you have done this, you know that the Gaussian measure is just the law of the random variable φ^ω(x) = Σ_n g_n(ω)⟨n⟩⁻¹ e^{in·x}, and these are the typical elements in the support of the measure; this is how they look. Almost surely this is a function in 1D and a genuine distribution in higher dimensions, and you have the standard Gaussian tail estimates.

As I said, once you have constructed the Gaussian measure, the Gibbs measure is just a weighted measure: the weight involves the nonlinear part of the Hamiltonian, with the renormalization, and what you need is for this weight, the Radon–Nikodym derivative, to be in L¹ with respect to ρ. In the focusing case nothing comes for free: for this to be a probability measure you have to insert an L² cutoff, the same cutoff you see in the deterministic theory. It is a nice exercise to check which constant is larger, and in fact this one is smaller, so you do not win anything. If the power is less than five you just need some cutoff; if the power is five, which is the mass-critical case, the cutoff has to be sufficiently small.

So here is Bourgain's 1D result, and I state it because I want you to get used to how it reads. You take the equation with data in the support of the measure. What does it mean to prove something almost surely? It means you can find a set Σ of Gibbs measure one (in the focusing case everything is conditioned on small L² norm) such that for any data in Σ the initial value problem can be continued globally. And then you also prove that the measure is invariant. Why is this important? Recall that deterministically you have local well-posedness for s strictly bigger than zero and global well-posedness in H¹, and that is it. What the invariant measure gives you, as I said before, is a global result in H^{1/2−}, where there is nothing:
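To fix notation, the two-step construction just described can be summarized as follows; this is the standard setup (1D, ⟨n⟩ = (1+n²)^{1/2}, normalization constants Z, Z_N suppressed), written out here for convenience rather than copied from the slides:

```latex
% Formal Gibbs measure (every factor infinite as written):
%   "d\mu = Z^{-1}\, e^{-H(u)} \prod_x du(x)",
%   H(u) = \tfrac12 \int |u_x|^2\,dx \mp \tfrac{1}{p+1}\int |u|^{p+1}\,dx .
% Step 1: the Gaussian reference measure, as a weak limit of finite-dimensional laws:
\[
 d\rho_N
 = Z_N^{-1} \exp\Big(-\tfrac12 \textstyle\sum_{|n|\le N} \langle n\rangle^{2}\,
   |\widehat{u}(n)|^{2}\Big) \prod_{|n|\le N} d\widehat{u}(n)
 \;=\; \mathrm{law\ of}\ \ \phi^{\omega}_{N}(x)
   = \sum_{|n|\le N} \frac{g_n(\omega)}{\langle n\rangle}\, e^{inx},
\]
% with g_n i.i.d. complex Gaussians; \rho = \lim_N \rho_N is countably additive
% on H^{s} exactly for s < 1/2 in 1D.
% Step 2: the Gibbs measure as a weighted measure (focusing case, with the L^2 cutoff):
\[
 d\mu \;=\; Z^{-1}\, \chi_{\{\|u\|_{L^2} \le B\}}\;
        e^{\frac{1}{p+1}\int |u|^{p+1}\,dx}\; d\rho .
\]
```

In the defocusing case the exponential carries a minus sign and no cutoff is needed; in either case the requirement is exactly that the density be in L¹(dρ).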
there is no conserved quantity at that regularity, so how are you going to get it? That is what the measure does for you. Let me quickly go through how he proves this, because we are going to need it. You approximate the equation by a finite-dimensional (truncated) system, and the crucial fact, what I want to bring home here, is how the invariance of the measure is used. A point of frequent confusion: what you really use is the invariance of the finite-dimensional measure, not the invariance of the infinite-dimensional one; the invariance of the infinite-dimensional measure comes in parallel with the global well-posedness, at the end. Why is the finite-dimensional measure invariant? Because for this equation the truncation stays Hamiltonian. That is not always true when you truncate an equation: sometimes, to prove a local result you have to gauge, and then you might stay Hamiltonian but you no longer know what the measure looks like; this is a genuinely delicate point with Gaussian measures and these non-canonical transformations. As I said before, Liouville's theorem then tells you that, because you stay Hamiltonian, the flow is volume preserving, and since the Hamiltonian itself is conserved, the finite-dimensional measure is invariant.

Now you use the invariance of the finite-dimensional measure to continue: you have a deterministic local theory for the finite-dimensional system, and the invariance lets you continue its solutions globally in time, provided you are on a set of good data. What that means is that along the way you have to select a set of ω's of probability one on which this continuation process can be carried out. Once you have this, how do you continue the actual solution? There is an approximation lemma: once the finite-dimensional solution u_N reaches time T, you take the local solution u and walk it side by side with u_N in an iterative fashion. This is not a trivial lemma, because after the local time of existence the two solutions could in principle drift far apart, and you have no a priori bound on u. What you do is take, as a stepping stone, an intermediate system: the infinite-dimensional equation with projected data, with solution u′. You compare u to u′ and u′ to u_N, step by step, and that is how it goes. Once you have that, you have the global u, and the icing on the cake is that you prove μ_N converges to μ and obtain the invariance of μ, but what is really at work is the invariance of μ_N.

How about 2D? What did Bourgain do in 2D? This is where it becomes interesting. In 2D he considered the cubic equation, which is L²-critical, and now, as I said before, the Gibbs measure, which was known to exist, needs a renormalization in order to exist. You can actually see this renormalization in the proof of the theorem as well, because it is the same mechanism: you need it to deal with resonances. You work with what is called the Wick-ordered nonlinearity, subtracting the divergent term from the cubic, and he proved exactly the same theorem: global well-posedness below L², and the associated Wick-ordered measure is invariant. The point here is that the potential energy is unbounded almost surely, so
you have to Wick order: you renormalize the finite-dimensional approximation, and that removes the divergent term. What I would like to say is that this is actually the first result ever that is both global and supercritical; in the dispersive community this is the first global supercritical result. (Supercritical in the sense of L², you mean? Relative to scaling; as I said, every time I say critical or supercritical I mean relative to scaling.) And there is a nice observation: the measure always charges open sets with positive measure, so if you prove such a result with measure one, it tells you that you cannot have blow-up that is stable in that topology. You can have blow-up, but it cannot be stable. In the focusing case you do not know whether there is blow-up, but you know it cannot be stable, which is something.

Now, the additional difficulty here is that Bourgain did not have a local well-posedness theory in place as he did in 1D, so he could not even start the argument I described a minute ago: below L² is supercritical and there is no deterministic local well-posedness. So he said: look, I only want to prove an almost sure global result, so why do I need a deterministic local theory? It is enough to prove local well-posedness for data in the support of the measure; in other words, all I need is a probabilistic local well-posedness result. If you look at this 1996 paper, which is a very long paper, 56 pages, about 50 of those are devoted to proving the probabilistic local well-posedness. That is the heart of the matter in that paper.
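For concreteness, the Wick renormalization just mentioned can be written out for the truncated 2D cubic problem; normalization conventions vary from paper to paper, so take this as the standard shape rather than the exact constants on the slides:

```latex
% Divergent constant: the expected mass density of the truncated free field,
\[
 \sigma_N \;=\; \mathbb{E}\,\big|P_{\le N}\phi^{\omega}(x)\big|^{2}
 \;=\; \sum_{n\in\mathbb{Z}^2,\;|n|\le N} \frac{1}{\langle n\rangle^{2}}
 \;\sim\; \log N \;\longrightarrow\; \infty \qquad (d=2).
\]
% Wick-ordered cubic nonlinearity:
\[
 :\!|u_N|^{2} u_N\!:\ \;=\; \big(|u_N|^{2} - 2\sigma_N\big)\, u_N .
\]
% On the Fourier side this removes exactly the resonant pairings n_2 = n_1 and
% n_2 = n_3 in \sum_{n_1-n_2+n_3=n}
% \widehat{u}(n_1)\,\overline{\widehat{u}(n_2)}\,\widehat{u}(n_3),
% which is also what preserves the independence of the Gaussians later in the talk.
```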
And the rest just follows, the same as before; that is what I said there. This brings us to the following notion: forget for a moment about the Gibbs measure, and forget for a moment about global; think just about local well-posedness. What Bourgain is saying, essentially, is that if you take data and randomize it, you will always be able to improve your local theory. That is the takeaway. And this is not a foreign idea for anybody, but especially not for harmonic analysts, because it is exactly what we know and use all the time, for example when proving Littlewood–Paley inequalities: random series enjoy better L^p estimates. The randomization does not improve the regularity; what it improves is the integrability, the L^p. But once you have that, you use it in turn to get better estimates than you would deterministically.

So here is the strategy to prove this, and it is the same strategy used, at around the same time, by Da Prato and Debussche for the stochastic φ⁴ model; they were using exactly the same idea. The idea is to look for solutions of a particular form. You randomize the data, and then you look for solutions that are the linear evolution of the random data plus some remainder w, and instead of solving for u you solve the difference equation for w. The consequence is that you solve for w in a smoother space than the one where you start: you will find w at a smoother
regularity than the initial data. As a consequence, he proves that almost surely in ω the nonlinear part of the solution is smoother than the linear part; this is in the 1994–96 papers. An important remark: by doing this you are not saying that randomization turns a supercritical problem into a subcritical one. That is not true. What you have is a problem where the nonlinearity is hybrid: there are supercritical terms, which are random, and there are the w's, which you treat deterministically and which are smoother, maybe subcritical, but you have to treat all the interactions: random–random, random–deterministic, deterministic–deterministic. And one thing you need in place before you even start, because you must handle the deterministic–deterministic interactions in the smoother space, is a deterministic local well-posedness theory in that smoother space, at least in this approach.

Let me quickly tell you why it works. Everybody here, I am sure, knows the typical large deviation estimate; think of k as the number of random factors you are dealing with in your nonlinearity, whatever that is. For F(ω) a polynomial of degree k in the Gaussians, the L^p(Ω) norm is controlled by p^{k/2} times the L²(Ω) norm, so by Chebyshev the set where |F(ω)| is bigger than λ has probability decaying like exp(−c(λ/‖F‖_{L²(Ω)})^{2/k}). Now take any δ, set λ = δ^{−k/2}‖F‖_{L²(Ω)}, and plug it in: you get a set Ω_δ, the complement of the bad set, with P(Ω_δ^c) ≤ C e^{−c/δ}, such that for ω outside the bad set, that is for ω ∈ Ω_δ, you can replace |F(ω)| by δ^{−k/2} times the L²(Ω) norm. That is fundamental, and let me show you why; that is the key.

Let us do one example. Suppose you have data φ^ω = Σ_{n∈Z²} g_n(ω)⟨n⟩⁻¹ e^{in·x}, which is what you have in two dimensions, just below L². In your random–random interactions you have to estimate sums of the form Σ_{Γ(n)} g_{n₁} \bar g_{n₂} g_{n₃}/(⟨n₁⟩⟨n₂⟩⟨n₃⟩), where Γ(n) is the set of triples (n₁, n₂, n₃), each in Z², with n₁ − n₂ + n₃ = n and n₂ different from n₁ and n₃. What the Wick ordering does for you here, and this is an interesting, different way of understanding resonances, is precisely to guarantee that n₂ is never n₁ or n₃, which means you never lose the independence of the random variables. Every time n₁ equals n₂, a factor becomes an absolute value squared and you lose independence; the Wick ordering, which is the same as removing the resonance, tells you that you always keep independence. Now, if you were simply to estimate the L² norm of this sum by Cauchy–Schwarz, summing in n and in m because I am taking L², you would lose: you pick up the cardinality of the set Γ(n), which is big. These are integers; you are counting lattice points on spheres, or on intersections of spheres, or whatever. That loss means you lose derivatives and you cannot close. And what the large deviation estimate tells you is
that, as I said a minute ago, modulo a set of ω's you can replace the absolute value by the L² norm, so what you are really estimating is the sum without that cardinality loss, and that is enough to close. That, in a nutshell, is how it works.

Now, since Sergio is here, I wanted to say something quickly; well, maybe I will say it in a minute. Let me move ahead a couple of transparencies. The takeaway is that randomization allows you to improve the local well-posedness theory, and you should view the randomization as a separate issue from the invariance of the measure. Once you have improved your local theory, the question is how to go global, and that depends on which equation and which regime you are in: if you have a measure you can use it; if you do not, you can sometimes do other things, or you may not be able to do anything, at least with current technology. One thing I should mention, going back for a moment: one problem is that even if the equation has conserved quantities for u, in the way you study these problems, through the difference equation, you have no conservation, because w does not satisfy any conservation law. That is a bit problematic.

So, separating the two issues: randomization improves the local theory; then, if you have a measure, you can go from local to global, and if you do not, maybe you can do energy estimates or something else depending on the regime, or you may still be working on it because you have nothing. And as I said, there has been a lot of work after Bourgain. Essentially nothing happened between Bourgain's papers and roughly 2007, and then there was a lot of activity; I am not being comprehensive here. Many people have worked on many equations, and not just dispersive ones, also for example fluid equations. And, since Sergio is here, let me mention, this is part of a long-term project: in this paper we consider a nonlinear wave equation with the Klainerman–Machedon null forms, the Q_ij's. We are interested in this question, for this equation, for a geometric purpose, and to understand these methods beyond the regime in which you expand around the linear evolution: we want to look around special solutions. These geometric equations have, for instance, ground states, and we are interested in understanding how to propagate, how to transport, the randomization a little better. What I want to say is that there is a beautiful recasting of these null forms: they are a gift that keeps on giving, because they are exactly what you need, when you have quadratic derivative nonlinearities, to preserve the independence I showed you before. I told you that for random–random interactions you have to do something like Wick ordering to keep the random variables independent; what the null form does is exactly that renormalization, in the probabilistic context. It is quite interesting.

So let me quickly talk about this work with Gigliola: two results which are still in the subcritical regime, but which we care about because they close an important gap in the subcritical theory between the deterministic theory and the almost sure global well-posedness. The first is in 2D: you
take the cubic defocusing equation. As I said a minute ago, below L² you have almost sure global well-posedness, while the best known deterministic global result, via the I-method, and I have a picture of this here, is for s above 2/3. So below L² you have almost sure global, above 2/3 you have deterministic global, and as I will explain in a minute, when your global well-posedness is probabilistic you cannot simply invoke the deterministic propagation-of-regularity machinery to fill in the gap in between. We were interested in understanding how to get global well-posedness in that range. For the 1D focusing quintic you have the same problem: there is still a gap, since the almost sure global result is just below 1/2, while deterministic global in the focusing case holds above H¹, and in the defocusing case above 4/9; in the focusing case, in between, there is nothing. We wanted to close that gap too. Both results follow from a method we like to call probabilistic propagation of regularity.

Let me put this up, because it is what I just said, and let me just talk about the 2D case: why is the result not trivial? Because if I give you data that is smoother than the random data sitting in H^{−ε}, that data has probability zero: as I told you, it lies in a set of measure zero for the measure, so I cannot just take smoother data and run Bourgain's argument; you get nothing. Usually, in dispersive PDE, having more regularity never hurts you: you prove the global result at the level where you have a conserved quantity, and then you use propagation of regularity to handle the smoother solutions. Here that does not work. So what we do instead, and this is the key idea, is to decompose the data into a term that is close, and I will make precise what close means, to the support of the measure in the rougher topology; for example, in 2D we want to do this for s bigger than zero, so the rougher topology is H^{−ε}; plus a smoother remainder. The question is how to do this so that everything works, and then we run a perturbation argument to conclude. The argument I am going to present is carried out in these two cases, but it is very general: every time you have an almost sure result proved via a measure, and a gap above it, you can use it; every single time you have a measure, you can use this.

So here is the statement of the theorem; sorry, I do have to state something. The notation u_α is just to remind you of the form of the data: the data φ^ω_α has Fourier coefficients g_n(ω)/⟨n⟩^{1+α}. When α = 0 you are in H^{−ε}, in the support of Bourgain's measure; for α > 0 the data is smoother and has probability zero with respect to that measure. You solve the cubic equation, and since you are now above L², even after renormalization you can put the Wick-ordering term, which is now finite, into the linear part, so I am not going to worry about that. The theorem says: given any large time T, any α > 0, and some ε, there exists a set of, well, this is the Gaussian measure, there is no Gibbs measure here, but you always have the Gaussian measure associated to the 1 + α regularity, meaning the one you form with ⟨∇⟩^{1+α}: instead of the gradient you put the
The support of this Gaussian measure is here, and there exists a set of measure one such that you can continue the data to a global solution, and it has this form. That is the global result.

So how do we prove it? You take the data, and there are two steps. In step one, all we want is to show that this data gives rise to a global solution in some rougher topology; in step two we will optimize the regularity. The idea is that you take this data and decompose it: you take some L, which is going to be determined later, and you break the data into the frequencies n ≤ L, and then a second part which, if you wish, you can think of as a high-pass filter: you let just the high frequencies pass, and that is what I call a_N. You write this as h_L plus ψ_1^ω, that is this term. What you know is that the low-frequency part, in the smoother topology here, is uniformly bounded for all L, no problem at all, and these a_N, the high-pass pieces, as I said, are in size always at most 1/L.

Now you use Bourgain's result in H^{-ε} to claim that there exists a set with probability one such that the following holds. Look at this y_L; leave it alone, take this y_L, and add to it this term, which is smoother. The data itself has probability zero with respect to the Gibbs measure, but y_L is in the support of Bourgain's measure, β_L is just 1/L, and you can add anything smooth to an element of the support, because the smooth piece is not seen by the measure. So the sum is still in the support of Bourgain's measure, and it is one of the elements that Bourgain can continue globally. So write the data as this y_L plus the a_N outside.
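Numerically, the high-pass decomposition is easy to visualize. Here is a minimal 1D sketch (everything in it, including the ⟨n⟩^{-(1+α)} normalization of the random data and the choice of cutoffs, is an illustrative assumption, not the talk's exact setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# A 1D caricature of the random data: u(x) = sum_n g_n/<n>^(1+alpha) e^{inx},
# with g_n standard complex Gaussians (normalization is an assumption).
alpha = 0.5
N = 4096
n = np.arange(1, N + 1)
g = rng.standard_normal(N) + 1j * rng.standard_normal(N)
coeffs = g / (1.0 + n**2) ** ((1 + alpha) / 2)

def tail_l2(coeffs, L):
    """l2 size of the high-pass piece: frequencies n > L only."""
    return np.sqrt(np.sum(np.abs(coeffs[L:]) ** 2))

# The low-frequency part (n <= L) is a fixed smooth function; the high-pass
# tails shrink as the cutoff L grows, which is the smallness the
# perturbation argument exploits.
tails = [tail_l2(coeffs, L) for L in (16, 64, 256, 1024)]
print(tails)
```

Since each larger cutoff keeps a strict subset of the coefficients, the tail sizes decrease monotonically, mirroring the 1/L-type smallness of the a_N pieces in the talk.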
Now you know that, by Bourgain's result, this evolves globally to some solution u_L that has this form, and you have these bounds, because Bourgain proved that the evolution is always smoother than the data: you start here, you end up here. Furthermore, and I am not going to go into the details, as a byproduct of the approximation and iteration argument of Bourgain that I mentioned before, you also get these uniform bounds in the X^{s,b} norms, with s = ε and b = 1/2, on each iterative time step, uniformly. That is part of the approximation and iteration argument of Bourgain that I showed you before.

So now you use the fact that u_L is global to prove the existence and uniqueness of the u_α that you want. That is the perturbation argument, and the reason it works is that these norms are at most 1/L: they are small, so these terms can be made small. Instead of solving for u_α directly, you look at a suitable difference equation. You call z'_L, roughly, the difference between the solution you want, minus the linear evolution of its data, and the Bourgain solution; that is the perturbation, it looks like this, and it solves this problem, where this is kind of like the cubic term and this is the cubic term. The fact that these two things are small, together with the uniform X^{s,b} bound, is what allows you to finish with a contraction and iteration and prove that z'_L indeed exists. Then the z_α that you are after is simply this z'.
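As a hedged reconstruction of this perturbation step (the precise decomposition is on the slides; the notation below, including writing the cubic nonlinearity as N(u) = |u|²u and giving z'_L the small leftover piece as initial data, is my guess at the talk's convention):

```latex
% Write the desired solution as the Bourgain evolution plus a perturbation,
%   u_\alpha = u_L + z'_L ,
% so that z'_L solves the difference equation
\[
  i\,\partial_t z'_L + \Delta z'_L
  \;=\; \mathcal{N}\bigl(u_L + z'_L\bigr) - \mathcal{N}\bigl(u_L\bigr),
  \qquad \mathcal{N}(u) = |u|^2 u,
\]
% with initial data the small leftover piece, of size O(1/L).  Smallness of
% that data, together with the uniform X^{s,b} bounds on u_L on each time
% step, is what closes the contraction/iteration for z'_L.
```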
So you have found the solution as the linear evolution of that data plus the z_α. (From the audience: Yes... no, no, he's cheating... come on.) All right. And once you have that, step two is the recovery of regularity, which I am going to skip, since I am now being pressed for time.

The second part is what I want to talk about now. All of this first part was about global well-posedness with Gibbs measures, which are kind of the equilibrium states. What I want to talk about quickly now is this other result, about the transfer of energy. This is related to Patrick's talk on the first day, on the half-wave equation, but also to Zaher's lectures: it is kind of the other side of what Zaher was describing last week. This is the typical question of the transfer of energy, of the out-of-equilibrium dynamics, for NLS. Everybody knows that on R^d, if you have scattering, there are no cascades, and that on compact domains you lose that. So the question was how to capture this, and this is what also goes by the name of turbulence. Bourgain's approach was that you should study the growth of the Sobolev norms to see how the energy is transferred. His conjecture, in 2000, was the question whether there exist global solutions to the cubic equation whose H^s norms grow indefinitely; this is what is called the infinite cascade conjecture. Bourgain himself made some progress on it: he constructed some special modified equations, a 1D nonlinear wave-type model, in which he exhibited growth. There is also work
of Kuksin, and then the fundamental progress came in the paper of Colliander, Keel, Staffilani, Takaoka, and Tao, in which they constructed solutions with large but finite growth of the Sobolev norms; if you have not seen that result, I put it here. There is also work of Hani, of Gérard and Grellier, of Guardia and Kaloshin, and of Procesi, and more recently, in a really nice paper, Hani, Pausader, Tzvetkov, and Visciglia proved that if instead of the torus you take a product domain, these cylinders, then they actually settle the conjecture: the answer is yes, and they also gave a rate. And on Monday, in the first talk of this conference, we saw Patrick talking about similar things for the half-wave equation.

Now, understanding Bourgain's conjecture for the cubic NLS is very hard, a very ambitious problem. But something in between Bourgain's conjecture and what I described in the first part of the talk, the invariant Gibbs measures, is the study of the existence and uniqueness of non-equilibrium invariant measures. In statistical mechanics this is very hard and very poorly understood: the existence of stationary non-equilibrium states. For us, this is an attempt to give some rigorous framework for the statistical description of the out-of-equilibrium dynamics. The problem is very hard even for stochastically forced systems, but there has been progress for chains of oscillators, in the work of Eckmann, Pillet, and Rey-Bellet, of Rey-Bellet and Thomas, and more recently of Hairer and Mattingly, who look at a collection of anharmonic oscillators with nearest-neighbor coupling. Here you have the Hamiltonians they
consider. In the first two papers the relationship between the pinning and the coupling is this one, and this is the much harder case, where the relationship is like this: there they cannot solve the problem unless they have only three oscillators. So here there is no problem with N, but here they cannot do more than three. The idea is that you take this chain and put it in contact with two heat baths at different temperatures: you have your system, you inject energy at one end and dissipate it at the other, and you want to see whether the system relaxes to a stationary non-equilibrium state.

For us, the starting point is the toy model in the work of Colliander, Keel, Staffilani, Takaoka, and Tao, whose Hamiltonian has this form. This is much harder than the oscillator chains, because the interactions do not depend just on the relative distances but also on the momenta of the particle and its neighbors. It is proved there that there is a finite set of frequencies for which the resonant NLS collapses to this toy model, the one with this Hamiltonian. So we do the same thing: we attach two heat baths, at the modes c_1 and c_N, which is a standard way of adding and dissipating energy, and what we expect is that this system converges to a non-equilibrium invariant measure which carries some flux from the low frequencies to the high ones, and we also want some rate of convergence. That is what we want to do. The stochastic model looks like this for that H: you see the heat baths here, and we also add these terms; that is the Hamiltonian part. We leave the middle modes alone and we just do this.
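To make the heat-bath mechanism concrete, here is a toy Euler–Maruyama simulation of the simpler situation: a chain of three anharmonic oscillators with Langevin baths at the two ends. The Hamiltonian, the friction γ, and the temperatures are illustrative choices, not the slides' model (which has complex modes and momentum-dependent interactions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy chain of 3 anharmonic oscillators, Langevin heat baths at the two ends.
# H = sum_i p_i^2/2 + q_i^4/4 + sum_i (q_{i+1}-q_i)^2/2   (illustrative choice)
gamma, T1, T3 = 1.0, 0.2, 1.0     # friction and end temperatures
dt, nsteps = 1e-3, 200_000

def force(q):
    # -dH/dq: quartic pinning plus nearest-neighbor harmonic coupling
    f = -q**3
    f[:-1] += q[1:] - q[:-1]
    f[1:] += q[:-1] - q[1:]
    return f

q = np.zeros(3)
p = np.zeros(3)
kin = np.zeros(3)                 # running sums of p_i^2
for _ in range(nsteps):
    p += dt * force(q)
    # heat baths act only on oscillators 1 and 3 (Ornstein-Uhlenbeck kicks)
    p[0] += -gamma * p[0] * dt + np.sqrt(2 * gamma * T1 * dt) * rng.standard_normal()
    p[2] += -gamma * p[2] * dt + np.sqrt(2 * gamma * T3 * dt) * rng.standard_normal()
    q += dt * p
    kin += p**2

temps = kin / nsteps              # time-averaged p_i^2: empirical temperatures
print(temps)
```

With the cold bath at T1 and the hot bath at T3, the time averages of p_i² give empirical temperatures, and one expects a temperature gradient across the chain: the non-equilibrium steady-state picture the talk is after.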
Just for convenience, I wrote here the generator of the transition semigroup; the Fokker–Planck operator is here. The first thing you can prove is that at equal temperatures, T_1 = T_N, this system indeed relaxes to an invariant Gibbs measure, and all of that follows from the fact that if you apply the adjoint of the Fokker–Planck operator to this, you get zero. What is interesting is the non-equilibrium case: what happens at different temperatures? You want to prove that there exists a unique, smooth, ergodic, non-equilibrium invariant measure that has this transfer of energy. The answer is that, again, if you have three modes, in other words just one, two, three, so j = 2, then we can actually prove that.

Maybe, to finish, because I do not want to push your patience... (How much time? Thirty seconds? Oh, three minutes. Perfect, wonderful.) What I want to explain is that in the proof of this, the only thing that is really hard is the existence; everything else is more or less standard, but the existence of the measure is very hard. To understand this, let me show you two transparencies. First, let me show you what we can prove. We can also get some rates, but we are still finishing: there is one region which is a little delicate, so we do not know yet whether we will get an exponential or a polynomial rate. One useful thing is to change coordinates; you actually have to change coordinates more than once. So this is the I_j, this is M, the sum of these squares, and these are the angles. The first problem we have, like in the previous works, is that we lose hypoellipticity in part of phase space: we are not hypoelliptic everywhere, and there is a degenerate set that we have to remove.
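Schematically, the equal-temperature computation just mentioned looks as follows, for a real-variables Langevin caricature of the setup (the talk's model has complex modes; the formulas below are the standard textbook version, not the slides' exact operator):

```latex
% Generator: Hamiltonian flow plus Ornstein--Uhlenbeck baths at the two ends,
\[
  \mathcal{L}
  \;=\; \sum_j \Bigl( p_j\,\partial_{q_j} - \partial_{q_j}H\,\partial_{p_j} \Bigr)
  \;+\; \sum_{j\in\{1,N\}} \gamma_j \Bigl( T_j\,\partial_{p_j}^2 - p_j\,\partial_{p_j} \Bigr).
\]
% At equal temperatures T_1 = T_N = T, a direct computation gives
\[
  \mathcal{L}^{*}\, e^{-H/T} \;=\; 0,
\]
% i.e. the Gibbs measure is invariant; at unequal temperatures no explicit
% invariant density is available, which is the non-equilibrium difficulty.
```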
The good thing is that if you start with c_2 = 0, then you remain at c_2 = 0; but for the hypoellipticity, which is what gives you that if something exists it is smooth, we now have a boundary, this boundary. The existence of the measure then follows from this: what you want to do is construct a Lyapunov function that somehow penalizes the region where this is small and the high frequencies, and once you construct it, the construction gives you an upper bound on the hitting time of the good region, which should be a compact set.

Let me quickly show you a picture, because maybe then it will be easier. This is actually what the problem is: we have to chop phase space, which they did not have to do before. This is the good region, where you want all the dynamics to happen, and what you want to do is construct Lyapunov functions that tell you that as soon as you enter any of these regions, you are kicked back into here. Usually, when these things are very hard to do, people say: well, your best guess is that the Lyapunov function should be e^{δM}. But that does not work everywhere here, and you actually have to understand very well the dynamics of the deterministic system to understand what works where. These boundaries are determined by the temperatures and the γ and so on, and in the regions where the boundaries have contact, where they talk to each other, you can guess one Lyapunov function, and then the way to get the others is to solve a sort of Poisson equation with the right boundary values and some convexity conditions, and flow from the one you guessed to get all the others.
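The kind of drift condition such a construction aims at is the standard one (a schematic statement; the actual construction in the talk is region-by-region and much more delicate):

```latex
% Find V >= 1, blowing up at infinity and at the bad boundaries, with
\[
  \mathcal{L} V \;\le\; -\,c\,V \;+\; C\,\mathbf{1}_{K}
  \qquad \text{for some compact set } K \text{ inside the good region.}
\]
% By Dynkin's formula this yields, up to constants, exponential control of
% the hitting time \tau_K of K,
\[
  \mathbb{E}_x\bigl[e^{c\,\tau_K}\bigr] \;\lesssim\; V(x),
\]
% from which tightness, existence of the invariant measure, and rates follow.
```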
In some of the regions, to get this, you really have to understand the phase diagram and know what the phases look like; it is actually quite delicate, and this is where the heart of the matter is. As I said, once you have that, the rest, the uniqueness and the ergodicity, follows from a control argument, a controllability lemma that tells you the deterministic system can access any region of phase space: if you give me two points, I can find an ε-neighborhood of one so that, starting here, I end up at the other, anywhere in phase space. Then there is a standard theorem of Stroock and Varadhan that tells you that once you have this for the deterministic system, the stochastic one behaves in the same way. Thank you very much; I will leave you the picture, I have spent a lot of time on it. Thank you.

Questions? (Question: Do I understand correctly that the difficulty is to understand, very precisely, the dynamics of this ODE in high dimensions?) Yes, and we cannot. You see the problem: look at this, we have to break up all of this with only three modes. If you have four, for example, what is the problem? At high energies you can manage; the problem is at low energies, when you are near zero. If you are stuck in the middle and your energy does not move anywhere, with four modes maybe you are stuck there. So we are happy first if we can do three; the paper of Jonathan with Martin is also on three, they cannot do four either. So it is not clear; the way you would have to chop and understand things would be much more complicated, and the construction of this Lyapunov
function is hard. (Question: Does it help if you have more conserved quantities?) I don't know. You see, the Hamiltonian is here... I don't know. Good question.