Thank you, and thank you for the invitation. So Keith and I went to the same meeting at the bank for the first time in Montevideo, but Keith has already come here eight times. So I have some catching up to do, because I have come only six times; but it's very hard, because each time I come, he's already there. So my setup is compact: a closed Riemannian manifold with negative sectional curvature. And you know that the geodesic flow is Anosov. And one object of interest, if you want, is the central stable foliation of the unit tangent bundle. So you have the unit tangent bundle, the spherical tangent bundle; the unit tangent bundle carries the geodesic flow, the geodesic flow is Anosov, and there is a central stable foliation. And the dimension of the leaves is the same as the dimension of the manifold. And you might be interested in, I mean, you have a foliation on a compact space and it's transitive, right? The leaves are dense, they go everywhere. It's a very nice foliation. So you wonder whether there is some rigidity for such an object, which goes everywhere and mixes everything. But there is no natural mapping. I mean, there is one flow which preserves it: of course the geodesic flow preserves the central stable foliation. So you have the geodesic flow and X̄, the geodesic spray; X̄ is tangent to the foliation: X̄ at v is tangent to W^cs(v). But of course the leaves have higher dimension, so the foliation is much richer than that, and you want something which takes into account all the directions. And the natural object is the leafwise Laplacian. Because locally, a central stable leaf is identified with the universal cover of the manifold, so you have a metric on the leaves. So you have what we call Δ^cs, the leafwise Laplacian, which is the divergence of the gradient along the leaf. And it makes sense for f which is smooth, or even just C², along W^cs. I don't care much, I mean, transversely the coefficients are only Hölder, because they depend on the metric.
And in all the talk, the metric will be C^∞. But nevertheless, transversely the coefficients are only Hölder continuous; still, they are continuous, which is already not so bad. And I have these operators on the leaves. And now I have some dynamics, because I have the diffusion associated to the Laplacian. And what I want to consider is not only the Laplacian: I mix the two, and I will call L_λ the operator Δ^cs + λX̄. So it's an operator which is very nice along the leaf, but globally it's not nice: it takes into account only the derivatives along the leaf. Nevertheless, on a compact space you have an operator which looks elliptic; it's not elliptic, but it's elliptic at least in a few directions. So you can ask whether there are invariant probability measures. So what does that mean? A measure m is L_λ-stationary if for all f which are C² along W^cs, with the two leafwise derivatives continuous globally, applying the generator of my diffusion gives zero: ∫ L_λ f dm = 0. So it's invariant for the diffusion semigroup e^{tL_λ}. That's the notion of stationary measure, and it's quite old; it has been studied already. So for λ = 0, you just don't take the geodesic spray into account. And that's general for a foliation: you can always talk about the diffusion along the foliation and stationary measures; in that case they are called harmonic measures. And for the central stable foliation, it was shown by Lucy Garnett in '83 that there exists a unique m₀ which is stationary for L₀. And more or less she described m₀ sufficiently well. So locally, you have the leaf of your foliation and a transversal; I mean, the picture is not very realistic, the leaf is bigger than the transversal: the leaf has dimension n, the transversal dimension n − 1, as you can see. And locally you know the density: the conditional measures of m₀ are proportional to k(x, y) dy.
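To make the notion of stationarity concrete, here is a toy one-dimensional sketch (my example, not the operator from the talk): a discretized analogue of L_λ = Δ^cs + λX̄ on a circle, with the Laplacian replaced by a second difference and X̄ by a hypothetical drift field b(θ) = sin θ. The stationary measure m is the probability vector with m L = 0, and stationarity ∫ L_λ f dm = 0 can be checked directly.

```python
import numpy as np

# Toy discretization of L = Delta + lambda * X on the circle.
# Delta -> periodic second difference; X -> hypothetical drift b(theta) = sin(theta).
N = 200
h = 2 * np.pi / N
theta = np.arange(N) * h
b = np.sin(theta)
lam = -5.0

L = np.zeros((N, N))
for i in range(N):
    ip, im = (i + 1) % N, (i - 1) % N
    # leafwise Laplacian: second difference
    L[i, ip] += 1 / h**2; L[i, im] += 1 / h**2; L[i, i] -= 2 / h**2
    # drift term lambda * b, upwinded so L stays a Markov generator
    d = lam * b[i]
    if d >= 0:
        L[i, ip] += d / h; L[i, i] -= d / h
    else:
        L[i, im] -= d / h; L[i, i] += d / h

# stationary measure: m L = 0 with sum(m) = 1, solved in the least-squares sense
A = np.vstack([L.T, np.ones(N)])
rhs = np.zeros(N + 1); rhs[-1] = 1.0
m_stat, *_ = np.linalg.lstsq(A, rhs, rcond=None)

f = np.cos(theta)                                  # any test function
stationarity_gap = float(abs(m_stat @ (L @ f)))    # ~ integral of L f dm
```

The point of the sketch is only the defining identity: for the computed m, the integral of L f against m vanishes for every test function f.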
Anyway, a conditional measure is only defined up to a constant; here it is proportional to k(x, y) dy. You have the Lebesgue measure dy on the leaf because you have the metric, and the functions k(x, ·) are harmonic: Δ^cs_y k(x, y) = 0. So that's one example which is well known. Another one which is well known: so let me call v the volume entropy. Because the density is only defined up to a constant, I have a density here and a density there, and the only intrinsic thing is the ratio between them; that ratio is k(x, y). It doesn't matter: I fix x and view k(x, ·) as a function of y on the leaf. Yeah, but it's a cocycle: if I take another x′, it will be the same function up to a constant, which is exactly k(x, x′). It's long to describe, so, yeah. Okay, so the volume entropy of M is v = lim_{r→∞} (1/r) log vol B(x, r), the ball of radius r in the universal cover. As you know, it's also the topological entropy of the geodesic flow. So if I take λ = v, there is also (that's something which I did some time ago) a unique stationary measure, and that measure has many names. It's the Bowen–Margulis measure, for people who know it. It's also the Knieper measure: you take the sphere of radius r, you lift it to the unit tangent bundle of the universal cover by attaching the stable direction, you transfer, and you look at the limit as r goes to infinity; that gives you a family of measures, and the limit gives you a measure on the quotient. The conditional measures of that one are also proportional to a kernel, but here it's much better: it's e^{−v b(x,y)}, with b the Busemann function. So that's not the nicest way of describing it.
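As a quick numerical illustration of the volume entropy just defined (my example): in the hyperbolic plane, constant curvature −1, the ball volume is vol B(r) = 2π(cosh r − 1), and (1/r) log vol B(r) converges to v = n − 1 = 1.

```python
import math

# Volume entropy of the hyperbolic plane: vol B(r) = 2*pi*(cosh(r) - 1),
# so (1/r) * log(vol B(r)) -> v = 1 as r -> infinity.
def vol_ball_h2(r):
    return 2 * math.pi * (math.cosh(r) - 1)

v_est = [math.log(vol_ball_h2(r)) / r for r in (10, 50, 200)]
```

The estimates decrease toward 1, the topological entropy of the geodesic flow in curvature −1.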
The nicest way of describing it is to say that its conditional measures on the strong stable manifolds are proportional to Lebesgue; and transversely, it is the unique measure which is invariant under the strong stable holonomy. So it's nice, a very natural measure. And a little later, for all λ smaller than v, the same result was shown by Hamenstädt: there exists a unique M_λ stationary for L_λ. (For λ bigger than v, there are plenty of them.) And the conditional measures of M_λ are proportional to some function, K_λ(x, y) dy, and K_λ satisfies some relation: Δ^cs_y K_λ(x, y) − λ div_y(K_λ(x, y) X̄(y)) = 0. So you have some equation like that which is satisfied by the conditional measures. Again, I take the derivatives in y; and the fact that this is 0 does not depend on x, because if you change x, you multiply by a constant in y: it's the ratio of the densities, each time you take a genuine conditional measure. Okay, so that's what is known. And so the result I want to talk about today, so the theorem (more than work in progress, but it's not completely written), is what happens when λ goes to minus infinity: as λ → −∞, M_λ converges to the Liouville measure. So what happens, I mean, there is one example, constant curvature, or locally symmetric spaces; that's the example where there is not much to say: if M is locally symmetric with negative curvature, then M_λ is the Liouville measure for all λ.
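As a sanity check (in my notation, reconstructing the step rather than quoting the talk), the equation satisfied by K_λ is just the formal leafwise adjoint of L_λ: integrating by parts along a leaf,

```latex
\int \big(\Delta^{cs} f + \lambda \bar X f\big)\, K_\lambda(x,y)\, dy
 \;=\; \int f \,\Big(\Delta^{cs}_y K_\lambda(x,y)
 \;-\; \lambda\, \operatorname{div}_y\!\big(K_\lambda(x,y)\,\bar X(y)\big)\Big)\, dy ,
```

so requiring ∫ L_λ f dM_λ = 0 for all test functions f forces the bracket to vanish, leaf by leaf.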
And conversely, I mean, it's not very hard to see that if for one λ, M_λ is the Liouville measure, then you look at what it means for that kernel: going from one point to another, you have the Jacobian; if it is the Liouville measure, you know how it changes under the flow via the Jacobian, and you know this equation. So you have sufficiently many equations to conclude. To conclude what? To conclude that the mean curvature of the horospheres, Δb of the Busemann function, is constant; and that, by many people, the French crowd, implies that it's locally symmetric. So locally symmetric is the same as Liouville measure for all λ, and the same as Liouville for one λ. And for surfaces, there is a very nice phenomenon: M is locally symmetric, M has constant curvature, as soon as m₀ is equivalent to Liouville. That's a famous result by Katok. So m₀ = Liouville implies locally symmetric; and the very nice result of Katok is that m₀ being equivalent to Liouville, having the same negligible sets, already implies constant curvature. But of course you use conformal equivalence: any metric in dimension 2 is conformally equivalent to a metric of constant curvature, and then you see that this amounts to saying that the entropy of the Liouville measure equals v, and then there is some Cauchy–Schwarz somewhere. And you have the same kind of result for surfaces for the other measures. What do I say? Katok's case is the measure of maximal entropy: Liouville has maximal entropy. The other one is the harmonic measure having maximal dimension. And what's remaining is M_v equivalent to the Liouville measure. So that's true also, in dimension 2 only. I did it, but Katok also did it, in the late '80s; once you have the idea of Katok, it's just a small modification. And there are conjectures, of course, that this is still true in higher dimensions.
So the conjectures have names in higher dimensions. M_v equivalent to Liouville implies locally symmetric: that was, of course, conjectured by Katok after this result. But in fact the other ones had been conjectured before, even if they didn't have the family: m₀ equivalent to M_v implies locally symmetric was conjectured by Sullivan, because (I didn't say it) M_v is also related to the Patterson–Sullivan construction. And m₀ equivalent to Liouville implies locally symmetric was conjectured even earlier, by Leon Green. Anyway, these are conjectures, and very little is known about them. Yeah, it's still open in all higher dimensions. The other thing which is known is a result by Livio Flaminio: Katok's conjecture holds in a neighborhood of constant curvature. So the local version is true around constant curvature: you cannot have the Liouville measure of maximal entropy there, which is another form of it. As far as I know, that's the only positive result. So what does our result say? We couldn't do anything with it yet, but it's fresh, so we still hope. In some sense we interpolate between the Liouville measure and M_v, with the whole family of measures, not only m₀. So we can try to follow the different quantities along the parameter λ. It doesn't work yet; I mean, there is something we want to be increasing, non-decreasing, but it's not clear it is. But that was basically the idea behind this. And what do I want to say? I want to say a few words about the proof. So, what happens when λ goes to minus infinity? Well, I just erased it, but the equation is ∫ (Δ^cs + λX̄) f dM_λ = 0. So λ goes to minus infinity, and I can set ε = −1/λ. So what do I have? I divide by −λ: ∫ (−X̄ f + ε Δ^cs f) dM_λ = 0, and ε goes to 0 now. So in some sense it's a perturbation of −X̄. So you expect any...
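Here is a toy illustration (my example, not from the talk) of that rescaling ε = −1/λ: for a generator −X + εΔ on the circle, with flow θ' = −sin θ (stable fixed point at θ = 0), the stationary density has the closed form ρ_ε ∝ exp((cos θ − 1)/ε), and as ε → 0 the stationary measures concentrate on a flow-invariant set, here the fixed point.

```python
import numpy as np

# Stationary densities of -X + eps*Delta on the circle, X the flow of -sin(theta).
# Closed form: rho_eps ~ exp(cos(theta)/eps); as eps -> 0 the mass concentrates
# at the stable fixed point theta = 0, a flow-invariant set.
theta = np.linspace(-np.pi, np.pi, 4001)

def mass_near_zero(eps, width=0.3):
    rho = np.exp((np.cos(theta) - 1.0) / eps)   # shift by -1 avoids overflow
    rho /= rho.sum()
    return float(rho[np.abs(theta) < width].sum())

masses = [mass_near_zero(eps) for eps in (1.0, 0.1, 0.01)]
```

The mass near the invariant set increases monotonically as the noise is turned off, which is the weak-* concentration used in the proof sketch.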
I mean, you are in a compact space, so you have weak-* limits of the measures. So you know that any weak-* limit will be invariant under −X̄, so invariant under the reversed geodesic flow, which is the same as being invariant under the geodesic flow. So that's already something. And here is what you know about the M_λ: the conditional measures on the central stable manifolds are nice, given by a density which satisfies the equation I just erased: Δ^cs_y K_λ − λ div_y(K_λ X̄) = 0. And for such an equation you have a Harnack inequality, so along the leaf K_λ is quite regular. So it's good. We have many examples of that, right? We have the SRB measures of Pesin–Sinai, or Gibbs u-states: families of measures which are regular along the central stable manifolds; they go to the limit, the limit is still regular, and the only one which is absolutely continuous is Lebesgue. Except that, if I'm not mistaken, the sign is wrong: when λ goes to minus infinity, you still have a Harnack inequality, but it's worse and worse. So the estimates that you have on the regularity of log K_λ disappear as λ goes to minus infinity. I mean, it's intuitive, right? You have less and less influence of the noise. You know, when you take the whole Laplacian, that's also classical, it's Kifer in the '70s: the limit as ε goes to zero, or something like that, is the Liouville measure, or the Sinai–Ruelle–Bowen measure in the non-conservative case. But here we have something which is not elliptic. We cannot use Kifer's ideas, because we have good estimates for the Laplacian only along the stable direction: when ε goes to zero, we have good estimates in a neighborhood inside the central stable manifold; it doesn't go very far. So we use thermodynamic formalism, to try to characterize Liouville by the entropy.
And this has been done also in the case of... no, okay. The entropy of what? That's the question. But the idea is that if you have a diffusion like that, there is a way of taking a bigger space and thinking of the diffusion as a projection of a family of random diffeomorphisms. So we will do dynamics, but random dynamics. Random dynamics means you have a family of diffeomorphisms of your manifold, and each time you make a random walk, if you want, on the group of diffeomorphisms of the manifold: you choose a diffeomorphism at random with some distribution, then you choose another one at random with the same distribution, and so on. And for each path, in fact, if the diffeomorphisms are close to a hyperbolic map, along each path you have very nice hyperbolic sequences, and you can do hyperbolic dynamics, path by path. Okay, you have to construct the thing, and when you construct it you realize it's not as nice as that, but that's the idea. So there are four steps. The first one is to construct the random dynamics, with two properties: it preserves W^cs, because that's what you want, the randomness is only along W^cs; and it is related, somehow, to the diffusion. I'll be a little more precise. Now that you have a random dynamics, there are notions of random entropy and random Lyapunov exponents, and for that random dynamics we just have to show the Pesin-type formula: h is bigger than ∫ log g^cs dM̄_λ. You have to define h, but the thing which makes the idea work is: what does this look like on the universal cover of the space? A lift of a leaf is the set of all the geodesics which converge to the same point at infinity. So now, on that, I run the operator, which is the Laplacian on M̃ plus λX̄, and λ is very negative. So what do I do?
I go very far backwards along the flow, and then I do a small perturbation: a big step, and a small perturbation. So W^cs is invariant by the construction, and it will be something like the unstable manifold of this random family. It's a very nice random family, where you have a foliation which is invariant and which really has the same properties as an unstable foliation, because you are close to −X̄. So g^cs is the Jacobian in the direction along W^cs, which in the random world plays the role of the unstable, central unstable, manifold. And then, step three: h_{M̄_λ}, this guy, will just be a limit. What do I want? I want to have a Pesin formula at the end, so I want ∫ log g^cs dm smaller than the limit, as λ goes to minus infinity, of ∫ log g^cs dM̄_λ. You have to do the construction in a way where all the elements are continuous when λ goes to minus infinity; that's what people who do this kind of construction know how to do. And the last step, which is the most important, is to show that the entropy of the limit, for the reversed flow φ^{-1}, is at least the lim sup, as λ goes to minus infinity, of these h_λ, which I have not defined yet. So it's a question of upper semicontinuity of the entropy. But this has been done by Yomdin and Buzzi. You have to adjust it to these circumstances, but we are in good shape, because we have C^∞ mappings which preserve W^cs, and this entropy, whose definition I'll give in a moment, only involves how your dynamics acts on W^cs. So Yomdin's arguments are essentially correct, but they are not immediately adaptable. But it has been done in the case of Cowieson and Lai-Sang Young: it has been done, to get Kifer's result, in the case of random perturbations of an Anosov diffeomorphism. So we are not in the same situation, we have much weaker hypotheses, but all the points can be taken care of.
So maybe I explain the construction of the random dynamics. There are two points: the construction of the random dynamics, which is classical, but maybe not well known in this audience, and Keith already made the picture for me, so I'll try to say a few words about that part; and the definition of the entropy, which is classical too. So how do you construct a random dynamics from a diffusion operator? It was realized, essentially by Malliavin and people after him, that what you have to do is go to the orthonormal frame bundle. On the orthonormal frame bundle there is a natural way of constructing a random dynamics which projects back: the trajectories (so it's a random product, as I said) project to the trajectories of the diffusion, with the right probability space. You lift it; you go to the frame bundle. Okay, there are small subtleties. So, a crash course on stochastic flows. How do you construct Brownian motion on R^m? You take a small δ, you start from zero, you choose a direction at random, and you go a distance √δ in that direction; that gives you a small straight segment. Then you choose again at random on the sphere of directions, you go √δ, and you continue. Okay. And the old theorem, of Kolmogorov for instance, is that when δ goes to zero, the distribution of these paths converges to some distribution on paths, paths which, of course, are only (1/2)-Hölder. And that's the construction of Brownian motion on R^m, in one word, one picture. And the wonderful observation of Malliavin is that you can do the same to construct any diffusion. So I have my generator, and now, what do I need? I need some mappings. The diffusion will be on O^cs(SM̃), the frames tangent to the central stable leaves. I do it on the universal cover because there will be a small problem when you go to the quotient.
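The Kolmogorov-style picture above can be simulated directly (a minimal sketch in flat R^m, with hypothetical step parameters): pick a uniform random direction, move √δ along it, repeat. At time t = nδ the walk already has the Brownian scaling E|X_t|² = t.

```python
import numpy as np

# Random-direction walk converging to Brownian motion on R^m:
# n steps of length sqrt(delta) in i.i.d. uniform directions.
rng = np.random.default_rng(0)
m, delta, t, paths = 3, 0.01, 1.0, 2000
n = int(t / delta)

steps = rng.normal(size=(paths, n, m))
steps /= np.linalg.norm(steps, axis=2, keepdims=True)  # uniform on the sphere
endpoints = np.sqrt(delta) * steps.sum(axis=1)         # positions at time t
msd = float(np.mean(np.sum(endpoints**2, axis=1)))     # should be ~ t
```

Each step has squared length exactly δ and mean zero, so the mean squared displacement after n steps is nδ = t, independently of how small δ is; that is the invariance underlying the δ → 0 limit.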
But, a priori, I do it on the universal cover. And this is the space of orthonormal frames tangent to W^cs. And I do the same. I start from some element u here; no, I start from a frame, sorry, a frame u, which belongs to the frame space over a point. So I need a way to map my R^m into the space of frames: I have a frame e₁, ..., e_m, I know what the directions are, and I can do it in a smooth way. So now, if I take one of these frames, I do the same as before. I choose one direction, and I do parallel transport, the same picture as in the Euclidean sketch: I parallel-transport the frame for a time √δ along the geodesic in that direction. And if I want to have the λX̄ term, I also add parallel transport along the geodesic flow direction for a length λδ. So I arrive here: I started with the frame u, I have another frame u′ here, and my mapping tells me how I choose my next geodesic, by the image of the previous frame: √δ, then λδ along X̄, and so on. And the same proof of Kolmogorov tells you that this converges for almost every choice; at each step you choose, and you put everything on the same probability space, the same as the one where we had convergence before. And for almost every choice, you have at the end a continuous path, as before. So it's a path on the space of frames. But essentially, if you change your starting frame, you go elsewhere, but you still have a frame; and if you start with the natural measure on the frames, it's invariant under that process, it's stationary, you know. And so it gives you paths, and the distribution of the projected paths is the distribution of the diffusion associated with that generator. The nice thing about that construction...
I mean, it needs a proof, but it's common sense: okay, I start from one u, on one leaf, but if I take the leaf nearby, it's continuous, right? If I have my ξ and another ξ′ and I do my construction; and moreover, you have to check that indeed, for λ smaller than v, the trajectory goes far away, it goes to infinity, so it does not go toward ξ. And since it goes far away from that point, everything is very close, and even closer and closer. So what you construct here is continuous in ξ. Continuous in ε for the same reason: if you change ε, or λ, it will be close; a little bigger, a little bigger, but on a finite path it will be close. And so, for almost every trajectory here, you have not only a trajectory, but really a flow of maps: I pass to the limit, and I really have a map from the frames to the frames. And the map, I mean, the construction of the map up to time one, say, depends only on what I did up to time one in the driving path. So I am really composing independent maps. So this gives you much more: this gives you really a random walk, even a random process, on the space of diffeomorphisms of the orthonormal frame bundle. And that's the '80s: Malliavin, Bismut, Elworthy. But the nice thing is that it tells you that you can do this for any stochastic differential equation. So the nice model for random perturbations is random diffeomorphisms, because if you want to make a continuous-time analogue, that's the only way of doing it. Of course, in discrete time you can make a Markov chain where each step is close to the image, but if you want a continuous-time analogue, that's the only way. So that's my advertisement: stochastic flows are the only way.
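The flow-of-maps point can be seen in a toy simulation (my example: a one-dimensional SDE with a hypothetical drift sin x, Euler–Maruyama scheme): solving from two nearby initial points with the *same* Brownian increments applies one random map to both, and nearby points stay nearby, which is exactly the continuity in the starting point described above.

```python
import numpy as np

# One noise realization drives a whole map x0 -> x_1 (the stochastic flow).
rng = np.random.default_rng(1)
dt, n = 0.001, 1000
dW = np.sqrt(dt) * rng.normal(size=n)      # shared Brownian increments

def time_one_map(x0):
    x = x0
    for k in range(n):
        x = x + np.sin(x) * dt + dW[k]     # same increment for every x0
    return x

a = time_one_map(0.1)
b = time_one_map(0.1001)                   # nearby initial point, same noise
```

With independent noise for the two points they would separate diffusively; with shared noise the separation stays of the order of the initial 10⁻⁴, controlled by the derivative of the random map.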
If you go now to discrete time, random diffeomorphisms are a natural way of making a random perturbation of a diffeomorphism. It seems a little far-fetched, a priori. But because of that, and random diffeomorphisms are much easier: trajectory by trajectory, you can do dynamics; it's just non-autonomous dynamics. If you have cones, you have cones; sometimes they are destroyed, but you have some estimate of how often. It's really dynamics: you repeat, but you don't always repeat the same thing. So in particular, you have some entropy. What I said about the continuity in λ is also true when λ goes to minus infinity: the trajectories are closer and closer to the trajectories of the system where you don't do anything, which is the frame flow, the frame flow on the space of frames over the central stable manifolds. So this guy needs to be checked, but there is no obstacle. Okay, so, that's what I explained, maybe. So what is the entropy? To define the entropy, you define Bowen balls. So what is the entropy? I mean, in the simplest setting: I have diffeomorphisms of a compact manifold, and a product measure on them, so at each stage I take one of these at random. A typical sequence is φ₀, φ₁, and so on up to φ_n, with the product measure. And now, for such a sequence, I can define the stable relative Bowen ball B^s(u, φ̄, η, n). I start with some u, I have my sequence, some number η and some integer n, and I look at all the w in the same frame space such that the distance between φ_{k−1} ∘ ... ∘ φ₁ ∘ φ₀(u) and φ_{k−1} ∘ ... ∘ φ₁ ∘ φ₀(w) is smaller than η for k between 0 and n. That's exactly the usual definition of the Bowen ball. And we take only elements in the space of interest: the orthonormal frames above, tangent to the central stable manifolds.
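To make the Bowen-ball definition tangible, here is a toy computation (my example, not the frame-bundle setting): for random compositions of the circle maps x → 2x and x → 3x (mod 1), chosen i.i.d., two points stay η-close for n steps roughly iff they start within η / (deg₀ ··· deg_{n−1}) of each other, so −(1/n) log Leb(Bowen ball) converges to E[log deg] = (log 2 + log 3)/2, the random entropy of Lebesgue.

```python
import numpy as np

# Length of the stable Bowen ball for a random composition of expanding
# circle maps: distances multiply by deg_k at each step (for small eta),
# so Leb(B(u, phibar, eta, n)) ~ 2*eta / prod(deg_k).
rng = np.random.default_rng(2)
eta, n = 0.01, 500
degs = rng.choice([2.0, 3.0], size=n)            # random sequence of maps
leb_bowen = 2 * eta / degs.prod()
h_est = -np.log(leb_bowen) / n                   # Bowen-ball entropy estimate
expected = 0.5 * (np.log(2) + np.log(3))         # E[log deg] ~ 0.896
```

The estimate depends on the random sequence, but by the law of large numbers it is almost surely constant in the limit, which is the stationarity phenomenon used for the random entropy below.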
And otherwise, it's the same definition as the usual Bowen ball. And just by the subadditive ergodic theorem, I mean, it's not very hard, you define h_m(φ̄, u) as the limit of −(1/n) log m̄_λ(B^s(u, φ̄, η, n)), and then you take the limit as η goes to 0. For us the measure is M_λ, the stationary measure. I have a stationary measure on SM, and I want a stationary measure on the frame bundle; there is one way of doing it, which is just to put Lebesgue measure on all the fibers. So that's M̄_λ. That's Bowen's definition of the entropy, and it works. It's random, but it's subadditive along the sequence. And again, the way the maps are defined, they preserve W^cs. I didn't say it explicitly, but I work within W^cs: I have diffeomorphisms of the manifold which preserve the central stable foliation. So I have a central stable foliation, and I can also look at the central stable foliation saturated by the fibers, and I know that the conditional measures are absolutely continuous, because they are absolutely continuous in the projection (that's the property of M_λ) and I put Lebesgue on the fibers. So the conditional measures are absolutely continuous. That gives a Pesin formula. This is a Pesin formula for the random system, which, okay, is not exactly in the literature, but Aaron Brown, for instance, has a proof of the Pesin formula in that kind of case; I mean, it's really partially hyperbolic foliations and random dynamics. It's an adaptation of the Pesin formula. Okay, so we are in good shape; we just have to check that. And maybe I'm saved by the gong, but... oh, great. There are two propositions to prove. That's where all this was a little painful at moments. But okay, so the trick now: we have the relative Bowen balls, and we have also the notion of Bowen conditional entropy, I don't know if you know it, which we can adapt to this case. How much time do I have...
I fix ρ positive and a sequence φ̄ as before, and I call R(ρ, φ̄, u, η, n) the minimal number of Bowen balls B^s(w, φ̄, ρ, n) needed to cover the Bowen ball B^s(u, φ̄, η, n). So I fix my ρ, and I am interested in the failure of upper semicontinuity of the entropy. The entropy will be very nice, very continuous, if I fix the scale ρ; if upper semicontinuity fails, it's because something strange happens when ρ goes to zero. So you just take a topological invariant, introduced by Bowen, which he called conditional entropy. Conditional was maybe not the best word, but since we already say relative, let's say conditional, stable, relative. Which is what? H(ρ, φ̄) is the sup over u (it's a topological object; given the random sequence, it's topological) of the limit, as η goes to zero, of the lim sup of (1/n) log R(ρ, φ̄, u, η, n). Okay, that's a topological number which measures how much entropy the random diffeomorphisms create below a very small scale: you know the entropy at scale ρ, but maybe that's not enough to know the whole entropy, and this is measured by that number. And that number depends on the sequence, but it depends on the sequence in a stationary way. So, for λ bigger than minus infinity, H(ρ, φ̄) is almost everywhere constant, because it depends on φ̄, the sequence of independent objects, but it's stationary. So I call this object H_λ(ρ), and the two propositions we have to prove are: one which says that we can control this object, and another which says that the defect of upper semicontinuity is exactly this object.
The statement. So the first proposition is that the lim sup, as λ goes to minus infinity, of the h_{M_λ} we defined is smaller than h_m(φ^{-1}), the entropy of the reversed geodesic flow for the limit measure, plus a defect; and the defect comes from here: the defect is the limit, as ρ goes to 0, of the lim sup, as λ goes to minus infinity, of H_λ(ρ). It's constant; it does not depend on φ̄. That's the first proposition, and the second proposition is that this guy is 0. Okay. So what is behind this? It's what Bowen proved when he introduced this condition, at least in the non-random case: that H(ρ) goes to 0 as ρ goes to 0, for a diffeomorphism, is a condition for upper semicontinuity of the topological entropy in a neighborhood of the diffeomorphism. But that's another question. And this one, maybe the most interesting, was done by Yomdin for C^∞ maps, when he proved the entropy conjecture for C^∞ maps. It was interpreted by Jérôme Buzzi as, really, a theorem of upper semicontinuity of the entropy for C^∞ diffeomorphisms: for C^∞ diffeomorphisms, the map sending a measure to its entropy is upper semicontinuous. And how do you prove that? Yomdin has this topological argument, and the topological object is the local growth of submanifolds: you take a submanifold, you apply your diffeomorphism, and you look at how much it can grow while staying in a ball of radius ρ; you take the submanifold, you apply the map, and each time you intersect with the ball of radius ρ. If this is controlled, then the entropy is upper semicontinuous. And how is this controlled? That's the wonderful idea of Yomdin: in the C^∞ case, you approximate by polynomials.
And the control is, eventually, the log of the derivative divided by r: for a C^r diffeomorphism, the lack of upper semicontinuity of the entropy is controlled by the log of the Lipschitz constant divided by the regularity. So if you assume C^∞, the regularity is infinite and the defect vanishes. Okay, you have to do this in our setting, where it's not uniform, it's random, but it's essentially the same. It's longer. Most of the steps which I thought might be a problem, we have checked. Okay, maybe in writing it completely there will be a new surprise, but a priori I'm confident that it's true. I can stop here. [Question] Yeah, assume C^∞: I mean, assume the manifold is C^∞. The leaves of the central stable foliation are C^∞, and those maps are C^∞ along the leaves; but the foliation itself is not C^∞. It's not C^∞. Okay, I have the impression of using a hammer to kill a fly, but it's not really clear how to make a counterexample; we haven't really looked at it, because it's quite fresh. [Question] What is it? This guy is zero.
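For reference, the bound alluded to here can be stated, in the deterministic case and in my notation (to be checked against Buzzi's paper), as:

```latex
% Tail ("conditional") entropy bound for a C^r map f of a compact
% d-dimensional manifold, via Yomdin theory as formulated by Buzzi:
h^{*}(f) \;\le\; \frac{d}{r}\,\limsup_{n\to\infty}\frac{1}{n}\log^{+}\operatorname{Lip}(f^{n}),
% so the defect of upper semicontinuity of the entropy vanishes as r -> infinity,
% i.e. for C-infinity maps.
```

The random version needed in the talk replaces f^n by the random compositions φ_{n−1} ∘ ··· ∘ φ₀ and the tail entropy by the stationary quantity H_λ(ρ) defined above.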