Very good. Thank you. So let's go now to the second talk of today. It's a pleasure to have Sylvain Crovisier, from CNRS in Orsay. Sylvain will talk about dynamics of smooth surface diffeomorphisms and strong positive recurrence, and it is a kind of sequel to what Jérôme said yesterday. Thanks, Sylvain. Yeah, thanks, Yuri. Also to Stefano and Joseph for organizing this event. So as Yuri said, I will talk about surface dynamics, but it's the continuation of the talk Jérôme gave yesterday. And there will be a third talk by Omri Sarig tomorrow. So it's about the same topic: several works we are doing together. I think my talk is rather independent from Jérôme's talk; I will recall what I need from his talk. In any case, you can ask questions, of course. OK. So I will talk about surface dynamics, dynamics of diffeomorphisms of surfaces, and a bit in higher dimension too. I want to do it in a quite general way. So let's take any diffeomorphism. There are different situations. Of course, one particular case is when the complexity, the entropy, vanishes. In this case, you may build some simple examples, and you may expect some rigidity: that in fact it's difficult to find other examples, so that any such dynamics will fit with these examples. There are works in that direction, but it's not the subject of today. Today I will rather consider dynamics with positive entropy. On surfaces, you know by Ruelle's inequality that this implies that you have invariant probability measures which are hyperbolic, which have one positive and one negative exponent. Then, by a famous theorem of Katok, this implies that you have horseshoes, uniformly hyperbolic sets. But beyond those uniformly hyperbolic sets, the dynamics is delicate to describe. So there is the part of the dynamics which carries these hyperbolic measures, and which is not necessarily uniformly hyperbolic. We call this the non-uniformly hyperbolic set.
And this is what I would like to talk about today. So what is the goal? As I said, the goal is to consider an arbitrary system, not focusing on a particular setting, and to focus on this non-uniformly hyperbolic set. As it is a very rich and large set, we won't be able to describe the dynamics completely, so we have to take a viewpoint. We will focus on the part of the dynamics with large entropy, so in particular on invariant measures with large entropy and, if they exist, on measures which maximize the entropy. At the end of the talk, I will also mention some results in higher dimension. OK, so to start with, let me recall some very classical results, for uniformly hyperbolic systems. So take a diffeomorphism which is uniformly hyperbolic; "uniformly hyperbolic" will mean Anosov for now. I consider a system which is mixing, so that I cannot split it into several pieces, and which is hyperbolic. Then we have many good properties. Regarding the topic of this conference, it's possible in this case to build Markov partitions, and from the Markov partitions to introduce a coding of the dynamics: a coding by a subshift of finite type which is irreducible, or transitive, which is parallel to the fact that the initial system cannot be decomposed. OK, then I was mentioning the measure maximizing the entropy. In this case, yes, there exists a measure maximizing the entropy: a measure whose entropy coincides with the topological entropy. This measure is unique, and from the measurable viewpoint we understand exactly the dynamics: this measure is isomorphic to a Bernoulli system. But this measure appears in other ways. One way is to consider the periodic orbits of the system, with higher and higher periods, and to see how they distribute. And when you do that, you see that they converge towards this measure maximizing the entropy.
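The periodic-orbit description just mentioned can be written as a formula; this is Bowen's classical equidistribution statement for a topologically mixing Anosov diffeomorphism (notation chosen here, not from the slides):

```latex
% Equidistribution of periodic orbits towards the measure of maximal
% entropy \mu_{\max} (topologically mixing Anosov case):
\mu_{\max} \;=\; \lim_{n\to\infty}
\frac{1}{\#\operatorname{Per}_n(f)} \sum_{x \in \operatorname{Per}_n(f)} \delta_x ,
\qquad
\#\operatorname{Per}_n(f) \;\sim\; e^{\,n\,h_{\mathrm{top}}(f)} .
```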
Then you may want to understand this measure keeping in mind that you have a smooth system, so you will consider smooth observables, or at least Hölder continuous ones. Then you can see how the mixing property behaves for this measure, and you get a speed of mixing which is exponential: the correlations decay exponentially fast. You also get limit theorems, for instance the central limit theorem: you consider the Birkhoff sums and you look at their fluctuations compared to the average. If you rescale by the square root of the number of iterates, they distribute according to a normal law. OK, so this is very well known. Here it is in the special case where your hyperbolic system is made of one single piece. Then you have a generalization of that: you may consider hyperbolic systems with more pieces. And then you have what is called Smale's spectral decomposition theorem, which asserts that in this case the non-wandering set decomposes into finitely many pieces that are disjoint, compact, invariant and transitive, so you cannot decompose further. These pieces are called the basic sets. To recover what I described before, you have to decompose these pieces further: in fact, each of these pieces is a finite union of disjoint compact subsets that are cyclically permuted, and when you iterate at the period on one of these sub-pieces, what you get is topologically mixing. And from that, you recover the previous properties. So in particular, you have one measure maximizing the entropy on each of these periodic pieces, maximizing the entropy of the basic set. And you also have the limit theorems and the exponential mixing, as I described before. Okay. So the goal is to discuss whether these properties extend to more general systems that are not uniformly hyperbolic. So what are the questions?
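For Hölder observables, the two statistical properties just described take the following standard form (notation chosen here):

```latex
% Exponential decay of correlations: for H\"older \varphi, \psi there
% exist C>0 and \rho\in(0,1) such that
\Big| \int \varphi\,(\psi\circ f^n)\,d\mu
      - \int\varphi\,d\mu \int\psi\,d\mu \Big|
\;\le\; C\,\rho^{\,n} .

% Central limit theorem: with S_n\varphi=\sum_{k=0}^{n-1}\varphi\circ f^k,
\frac{S_n\varphi - n\int\varphi\,d\mu}{\sqrt{n}}
\;\xrightarrow[n\to\infty]{\text{law}}\;
\mathcal{N}\big(0,\sigma^2(\varphi)\big).
```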
When you look at a general system, you want to see if it's possible to decompose it, and then to study each piece independently. One question is: can you code these pieces by building Markov partitions? Can you find equilibrium measures on each of these pieces? I will discuss measures maximizing the entropy. So you want the existence; you also want to discuss the uniqueness, and then their properties: are they Bernoulli? And do you also find some limit theorems? For smooth surface diffeomorphisms, this works quite well; this is what I will explain now. Consider a C-infinity surface diffeomorphism. As before, I assume the entropy is positive, and to start with, let's assume it's topologically mixing. Then we have results that go along the previous lines. First, Newhouse has shown that you have a measure maximizing the entropy. More recently, David Burguet has characterized this measure, has found it as the limit of sets of periodic orbits. The difference with the uniformly hyperbolic case is that you should not take exactly all periodic orbits: in this case, you should take the saddles whose Lyapunov exponents avoid a small neighborhood of zero, and that are not too degenerate. Okay. Then Omri built generalized Markov partitions for these systems: a coding which does not code exactly the whole system, but a huge part of it, in particular all the measures whose entropy is above some number. And to use that coding, it's useful to have an irreducible coding, a coding which is transitive. This is what we did afterwards with Omri and Jérôme: we improved his coding to find a transitive Markov coding. And from that you can get properties for the system; in particular, we have proved that the measure is unique and isomorphic to a Bernoulli system. I think Yuri explained that in his mini-course. Just a rough sketch, yes. You talked a lot about the coding, and you mentioned a bit the consequences. Okay. Can I just ask a quick question?
In this general setting, is it known from the examples whether this measure of maximal entropy is supported on a uniformly hyperbolic set? Or can it also fail to be supported on a uniformly hyperbolic set? Is anything known about this? Well, you can build examples where the system is not uniformly hyperbolic, but is a degenerate version of a uniformly hyperbolic system, and then the measure is everything. So it's not supported on a uniformly hyperbolic set. Yes. Sorry, the measure of maximal entropy? Yes. "Is everything" in what sense? You mean that it has full support. Okay. It has full support, but does that mean that there cannot exist a set which is uniformly hyperbolic and which has full measure? I mean, as the support? This cannot exist in that case. Okay. So it's an example in which the measure cannot really live on a set which is uniformly hyperbolic. Yeah. We don't show that, but you should imagine that in this topologically mixing case the measure is fully supported on the full manifold. There are examples like that. And it's not a theorem, so I don't know if it's true in general, but it's the picture that I think you should have in mind. I think you can do the following. If the measure is supported on a uniformly hyperbolic set, like a horseshoe, then the entropy is the log of an eigenvalue of a transition matrix, so there are only countably many possible values for that. But you have diffeomorphisms with any value of the entropy in the real numbers, if I'm not mistaken. Actually, right? I think that whenever you have a non-uniformly hyperbolic homoclinic class, then the results show that the measure of maximal entropy is fully supported on the coding of the shift. Yeah, exactly. So this is what I wanted to say. Yeah, you're right.
What I think the question is: for topologically mixing systems (which means, of course, the periodic points are dense and the whole business), whether in that case you can have these irrational entropies, logs of non-algebraic numbers rather than logs of algebraic numbers. In the uniformly hyperbolic case, of course, you have logs of algebraic numbers, at most a countable set; but can you do this for general diffeomorphisms that may not be mixing? As everybody here knows, with what Jérôme and Sylvain did, you have the measure of maximal entropy and its entropy is not the log of an algebraic number. But the question is: what about topologically mixing on the full manifold? I don't know. Yeah. Okay, thank you. Okay, let me say one thing about these results. I put some on the left and some on the right; that's because of the techniques used to prove them. The results on the left are based on Yomdin theory. This is what Newhouse started to do: to show that at small scales you cannot produce a lot of entropy. This gives some semi-continuity for the measure of maximal entropy, and this has then been pushed by David Burguet. On the right, everything is based on the symbolic approach: you have this coding that is built, and then consequences of that. And the result I would like to present, the goal for my lecture and for Omri's one, is this one: to get more properties of the measure maximizing the entropy, to show the exponential mixing and to get the central limit theorem, and other consequences that Omri will state, probably tomorrow. Okay. So that was the result in the topologically mixing case. Now we would like to see what happens when you are not topologically mixing. As in the Anosov case, we have to decompose the system, and the natural objects for that, as has been said, are the homoclinic classes.
So if you pick a saddle, what is its homoclinic class? I recall now the topological homoclinic class: you take the transverse intersections between the stable and unstable manifolds, and take the closure. So it's an invariant compact set, but it's more than that: it's transitive, and inside, the periodic points are dense. You have a dense set of periodic points that are homoclinically related to the saddle: you have orbits that connect them to this periodic orbit, in both directions. Fine. In the uniformly hyperbolic case, this corresponds precisely to the basic sets, and so you want to use them in order to decompose a system. OK. If you want sets that are topologically mixing, still here you can find a finer decomposition: cyclically permuted compact subsets, where the period of the cycle is the GCD of the periods of the periodic orbits homoclinically related to the saddle. The problem is that, in general, these classes are not disjoint, so it's not exactly a decomposition. Nevertheless, for smooth surface diffeomorphisms, it's not far from being a good decomposition; this is what we have shown. First, by a theorem of Katok, if you take all the homoclinic classes together, they cover a set which has full measure for any measure with positive entropy; so you describe all of the part of the dynamics with positive entropy. Second, these classes are almost disjoint with respect to the entropy: if you have two different classes, their intersection has zero entropy. And the number of classes is maybe not finite, but almost: if you fix a bound on the entropy, there are only finitely many classes with entropy larger than this bound. Okay, so in this sense, it's not far from being a good decomposition. And then you can extend the results of the topologically mixing case.
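In symbols, with O the orbit of the saddle (standard notation, not from the slides), the topological homoclinic class just described is:

```latex
% Closure of the transverse intersections of the stable and unstable
% manifolds of the saddle orbit O:
\operatorname{HC}(O) \;=\; \overline{\,W^s(O) \pitchfork W^u(O)\,} .
```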
So you restrict now to each homoclinic class. If the class is topologically mixing, that's fine. If it's not, as I mentioned, there is a sub-decomposition again, and you may induce on one piece, which is now topologically mixing for the return map. And when you restrict to these sets, what you get is the same as before: you have a unique measure whose entropy corresponds to the entropy of the set you consider, and this measure has these good properties: exponential mixing and limit theorems, like the central limit theorem. Okay. So this is what happens on surfaces for smooth diffeomorphisms. But as I mentioned, this is no longer true in higher dimension, nor for diffeomorphisms that are not C-infinity. There are examples in intermediate smoothness where there is no measure maximizing the entropy; for instance, such examples have been built by Jérôme Buzzi. And in higher dimension, as I mentioned, the problem is that having a measure with positive entropy is not enough to guarantee that the measure is non-uniformly hyperbolic: you may have a zero exponent, and then everything is more complicated. So you will need to restrict to some particular setting. Anyway, the decomposition into these classes is maybe not very well adapted to these cases, and so you would like to do something more measurable. So this is what you can do. First, when you fix a saddle, you consider the measures that are ergodic and hyperbolic and that are homoclinically related to this saddle's orbit. What does that mean? For almost every point of such a measure, its stable and unstable manifolds intersect transversely the unstable and stable manifolds of your initial saddle. This is an equivalence relation, and so it allows to define an at most countable number of sets of hyperbolic measures, which we could call the homoclinic classes of measures.
And then, if you want a set in the manifold, what you do is take the set that carries all the measures in the same class of measures: you consider a measurable invariant subset of the topological class which carries exactly the measures that are in the class of measures of your saddle. Okay. In this case, you recover a partition, if you want: the decomposition into these Borel homoclinic classes will be disjoint. And then you can get some of the previous properties. You can code as before; in higher dimension, this is an extension of Omri's work due to Snir Ben Ovadia. And the transitivity is also what we did in our work with Jérôme and Omri. From that you may deduce that on these Borel homoclinic classes there is at most one maximizing measure; it is Bernoulli if the period is one. So the question is, in this more general setting, how to go beyond and find the finer properties of this measure: first the existence of this measure, and then the more precise properties. Okay. So what I'm doing now is to identify a good setting that ensures these good properties. This setting is what we call strong positive recurrence. So once again, we focus on the non-uniformly hyperbolic set. Let me recall, and make precise, what we mean in our work. So remember, Pesin introduced what we now call Pesin sets: when you have a hyperbolic measure, you can find subsets with arbitrarily large measure where the points see some good hyperbolicity. Our spirit here is not to fix one measure: we don't specify one measure, but we still consider the sets, independently of any measure. So a Pesin set for us is a set of points where the tangent space has a splitting into a stable space and an unstable space. The set itself is not invariant.
So anyway, along the stable space you have some exponential contraction, and along the unstable space, expansion in the future, or rather contraction in the past. Yes. So the contraction comes with a rate, which is this number chi, and also with a constant C. Okay. So this is what you see when you start from a point in the set: you see contraction in the future and in the past along these two bundles. But more than that, you also need to control what happens when you consider other points along the orbit. Of course, asymptotically you will also see contraction and expansion, but maybe with a new constant C, and you don't want this constant C to deteriorate too much. So there is an extra term, with this epsilon, that controls how this C degenerates: it degenerates with a small exponential behavior, given by a small constant epsilon. So a Pesin set is defined by three numbers: chi, the bound on contraction and expansion; this constant C; and epsilon, which controls the degeneracy along the orbits. When you pick one specific measure, you can build Pesin sets with arbitrarily large measure, taking epsilon as small as you want, which is good. Here we will sometimes need to reduce epsilon: reducing epsilon allows you to have a unique splitting, and also to develop the theory that Pesin did, so to build the stable and unstable manifolds at the points of the set, which vary continuously for some topologies, at least. And, you see, this set is compact. And because of the uniformity of the stable and unstable manifolds at the points of the set, it can only intersect finitely many Borel homoclinic classes. OK. So this is a Pesin set. And what is the non-uniformly hyperbolic set? It's everything that you can capture through these sets. So you take the union. OK, the only point is that you will need epsilon to be small, so you don't exactly take the union: you take the intersection over epsilon. OK.
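Schematically, a Pesin set with parameters (χ, ε, C) can be written as follows; this is one common normalization, possibly differing in details from the one used in the talk:

```latex
% Points with a splitting T_xM = E^s(x)\oplus E^u(x) such that, for all
% n \ge 0 and all k \in \mathbb{Z} (k indexing other points of the orbit,
% where the constant is allowed to deteriorate by e^{\varepsilon|k|}):
\Lambda(\chi,\varepsilon,C) \;=\;
\Big\{\, x \;:\;
\|Df^{\,n}|_{E^s(f^k x)}\| \le C\, e^{\varepsilon |k|}\, e^{-\chi n},\;\;
\|Df^{-n}|_{E^u(f^k x)}\| \le C\, e^{\varepsilon |k|}\, e^{-\chi n}
\,\Big\}.
```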
So you fix a bound chi, you take all the Pesin sets defined with some epsilon, then take the intersection over epsilon. And then you want all the bounds chi, so you take again the union. What's important for today is more the dependence on C. You may imagine that chi is fixed, because I am interested in measures with entropy large enough, so the exponents are large enough; so you may fix chi if you want. And epsilon is more something technical, so you may forget it if you want. But the C here is important, because it tells you how the hyperbolicity degenerates. OK. For a uniformly hyperbolic set, a fixed C captures the whole set. If you are not uniformly hyperbolic, when you iterate, sometimes the hyperbolicity is weaker, and this corresponds to a larger C. So these Pesin sets give a filtration: you filtrate your non-uniformly hyperbolic set through these Pesin sets, which are larger and larger with larger and larger C. And this gives a notion of behavior at infinity, or topology at infinity, in the non-uniformly hyperbolic set, by considering the complements of the Pesin sets. OK. So you say that in this part of the dynamics you go to infinity, or the hyperbolicity degenerates, if you leave every one of these Pesin sets. We like to say that this defines a notion of infinity on the non-uniformly hyperbolic set. So what to do with that? The point is that there is an infinity: this set is not compact. So what to do with that? Well, in general, this may generate some pathologies, and so the goal is to introduce a property that ensures that the infinity does not kill the good properties you look for. This property appeared in other settings before and has been called strong positive recurrence, or SPR. So how to state it? Again, if you want, let's forget chi and epsilon.
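With that notation, the non-uniformly hyperbolic set described here would read (again schematically, following the order of operations stated above):

```latex
% Union over C (filtration by weaker and weaker hyperbolicity),
% intersection over small \varepsilon, union over the rates \chi:
\mathrm{NUH} \;=\;
\bigcup_{\chi>0}\; \bigcap_{\varepsilon>0}\; \bigcup_{C>0}\;
\Lambda(\chi,\varepsilon,C).
```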
It says that there exist a value h smaller than the full entropy, some number tau, and some Pesin set, such that any ergodic measure whose entropy is large enough, larger than this value h, gives large weight, at least tau, to the Pesin set. So what does it say? It says that the measures with large entropy do not escape to infinity with respect to this Pesin topology. If you consider a sequence of measures whose entropies go to the full entropy of the system, then these measures cannot have their whole mass going to infinity, escaping from every Pesin set. As I said, this is a notion that appeared before, because it's very natural when you have a system which is not compact. It appeared first in the theory of countable Markov chains, where you have a classification, and one of the cases is strong positive recurrence. It has been developed by many people; I mention some of them here. Then there are other non-compact settings: you may consider homogeneous dynamics, or geodesic flows on non-compact negatively curved manifolds, where you have hyperbolicity but non-compactness. This has been developed recently in the work of Gouëzel, Schapira and Tapie. In their work, it's not that the hyperbolicity degenerates; it's really a lack of compactness, because the manifold itself is not compact. In our case, the manifold is compact, but it's the lack of hyperbolicity that creates some non-compactness. Let me comment a bit on countable Markov shifts, just to give some flavor of this notion. They are defined by considering an alphabet with countably many letters and giving some transition rules: from each letter, you can jump to a finite number of other letters. You may represent these transitions by a graph whose vertices are the letters of the alphabet and whose arrows are the allowed transitions.
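Written out, the definition of SPR just stated is (suppressing the parameters χ and ε, as the speaker suggests):

```latex
% Strong positive recurrence:
\exists\, h < h_{\mathrm{top}}(f),\;\; \exists\, \tau > 0,\;\;
\exists\ \text{a Pesin set } \Lambda \ \text{such that}
\qquad
h(\mu) > h \;\Longrightarrow\; \mu(\Lambda) \ge \tau
\quad \text{for every ergodic measure } \mu .
```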
So it's an oriented graph, and from that you define the Markov shift: you consider the sequences of letters in this alphabet which respect these transitions, endowed with the shift as usual. The fact that you only have a finite number of transitions from any vertex implies that this Markov shift is locally compact. But it's non-compact if the number of letters is infinite. And what is the topology at infinity? Analogously to what we had before, it's given by the complements of finite unions of cylinders; the finite unions of cylinders are the analogue of the Pesin sets. You may wonder what the entropy is in this setting. You can take, for instance, the supremum of the entropies of the invariant probability measures, so that you have the variational principle. OK. And then, as before, you have a notion of SPR, but there are many ways to state it, many equivalent formulations. The first one I give here is the analogue of what I have stated: any measure whose entropy is large enough gives some uniform weight to a compact set, a finite union of cylinders. But there are other interesting viewpoints. You may try to introduce an entropy at infinity: it's what remains if you cut out a big part of your system and look at what you have in the complement, or you count the orbits that spend only a finite number of iterates in a big compact part but most of their time outside; from that you define an entropy at infinity, and SPR is equivalent to saying that this entropy at infinity is strictly smaller than the full entropy. Another way is to say that you have a measure maximizing the entropy, but more than that: each time you take a sequence of measures whose entropies increase towards the full entropy, the sequence of measures converges to this maximizing measure. Another viewpoint is to say that you have exponentially small tails.
So if you look at the set of points which avoid a big compact part during n iterates, then the measure of this set, for the measure maximizing the entropy, decays exponentially fast with the number of iterates. And another viewpoint is to see it as a spectral gap for some transfer operator; this has been developed by Omri with Van Cyr, considering the one-sided graph. And here I give two examples. On the left you have a graph which defines a shift which is not SPR, because if you remove a big compact part, you will always see a graph that contains a copy of your initial graph, and so you cannot check this way that the entropy at infinity is smaller. Whereas in the second example, if I remove a big part, say the left part of my graph, what I see is just this ladder, and from each point I have two ways to escape, so the entropy at infinity will be, let's say, log 2; whereas if I take the full shift, I have a copy of this graph where I have three symbols with three transitions from each point, and so the entropy is at least log 3. So I have this gap between the entropy at infinity and the full entropy, and so it is SPR. This was just to give some flavor in a setting which is better understood than diffeomorphisms. But now let's go back to diffeomorphisms. We have this definition, and the goal is to use it and to check that it is satisfied in some cases. And we have been able to show that it is satisfied by C-infinity smooth surface diffeomorphisms, of course with positive entropy.
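To make the log 2 versus log 3 comparison concrete, here is a small sketch in Python for finite graphs (the graphs in the talk are infinite, so this is only a finite-alphabet analogue, and the matrices below are toy examples of my own): for a subshift of finite type given by a 0-1 adjacency matrix, the topological entropy is the log of the spectral radius, i.e. of the Perron eigenvalue.

```python
import numpy as np

def sft_entropy(adjacency):
    """Topological entropy of the subshift of finite type defined by a
    0-1 adjacency matrix: log of the spectral radius (Perron eigenvalue)."""
    eigenvalues = np.linalg.eigvals(np.asarray(adjacency, dtype=float))
    return float(np.log(np.abs(eigenvalues).max()))

# Full shift on 3 symbols: three transitions from each vertex -> entropy log 3.
full_shift_3 = np.ones((3, 3))

# A "ladder"-like piece: only two transitions from each vertex -> entropy log 2.
ladder = np.array([[1, 1, 0],
                   [0, 1, 1],
                   [1, 0, 1]])

print(sft_entropy(full_shift_3))  # log 3 ~ 1.0986
print(sft_entropy(ladder))        # log 2 ~ 0.6931
```

Since the ladder grows strictly more slowly (log 2 < log 3), a system whose part "at infinity" looks like the ladder, while the whole graph contains a full 3-shift, has entropy at infinity strictly below the full entropy: this is the SPR gap described above.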
So this is what I will discuss in the remaining time, and tomorrow the consequences of the property will be discussed. So: how to check that we have this SPR property? Let me focus on surfaces, and for the moment on C^r diffeomorphisms. Okay. When you consider an ergodic measure, you have three natural numbers: you have its entropy, but you also have two Lyapunov exponents, which measure how orbits separate at the tangent level; one is lambda plus, one is lambda minus. And let me introduce another property for the map, for the diffeomorphism. Let's say that the Lyapunov exponents are entropy-continuous if you have some continuity when you look at measures with large entropy: if you look at a sequence of ergodic measures, as we have already considered, whose entropies go to the full entropy of the system, and assume they converge, then the limit has to be ergodic, and the exponents of the measures converge to the exponents of the limit measure; this is well defined because the limit measure is ergodic. Okay. I am interested in this property because it turns out to be equivalent to the SPR property. The direction that I will discuss today is the direct direction, because our goal is to check SPR; the converse direction uses more, and uses what Omri will talk about tomorrow, so I won't mention it. Okay. So to check the entropy continuity, we will need a result that Jérôme presented yesterday. Jérôme introduced that result, which seems independent, which seems to have nothing to do with SPR at first glance. You look at any C-infinity surface diffeomorphism, and now look at a sequence of ergodic measures. I am not assuming that the entropies go to the full entropy of the system, but I am assuming that the entropies converge to some positive number; and let's assume also that the positive Lyapunov exponents converge to a positive number. Then the goal is to compare these with the entropy and the exponents of
the limit measure. You know that the exponents are not continuous, and that the entropy is not continuous in general; you have some semi-continuity, but in general not full continuity. And this result tells you that if they are not continuous, then the gaps, the defects of continuity for the entropy and for the Lyapunov exponents, are related, related by these inequalities. The one on the left tells you some semi-continuity, and the other inequality relates these two quantities. So here is a particular case that interests me: assume now that the entropies of the measures mu_k converge to the entropy of the limit measure. Then the ratio on the right has to converge to 1, and so does the one in the middle. So if the entropies converge to the entropy of the limit measure, then the Lyapunov exponents also converge to the Lyapunov exponents of the limit measure. Okay, so this is exactly the property we wanted to have, the entropy continuity. And so it implies almost immediately that C-infinity surface diffeomorphisms are entropy-continuous. Okay, you just need to be a bit careful, because, remember, in the definition we required that for any sequence of ergodic measures going to the top of the entropy, any limit measure has to be ergodic. The last part comes exactly from the theorem I have stated, but for the ergodicity you need to do something. What to do? First, if you take a sequence of measures mu_k whose entropies go to the full entropy, then the semi-continuity I mentioned, due to Yomdin's theory and then used by Newhouse, tells you that the limit measure has to have full entropy; so it's a measure maximizing the entropy. And then we use what we did before to get the uniqueness of the maximizing measure in the class: there is only one such measure, so it has to be ergodic. Okay, fine. So this is how to check that we have entropy continuity. Now, how to check SPR? Okay, so let me put together these two properties: the entropy continuity of the
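The particular case singled out here can be summarized as follows (notation as in the talk):

```latex
% Entropy continuity of the Lyapunov exponents: for ergodic measures
% \mu_k \to \mu with \mu ergodic,
h(\mu_k) \xrightarrow[k\to\infty]{} h(\mu) > 0
\quad\Longrightarrow\quad
\lambda^{\pm}(\mu_k) \xrightarrow[k\to\infty]{} \lambda^{\pm}(\mu).
```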
Lyapunov exponents, which is what we have, and the SPR property, which is what we want. When you write them as I did here, there is some parallel: in both cases you consider sequences of measures whose entropies go to the full entropy of the system, and you want to see how they behave along the sequence. Okay, so let me try to explain why the first implies the second. To check SPR, we have to build a Pesin set. What is a Pesin set? If you remember, it's a set of points having stable and unstable spaces with some uniform behavior: some quantitative contraction and expansion, in the past or in the future. So this is related to the classical notion of hyperbolic times: a point in the Pesin set has good contraction, starting from the initial point, at some positive rate, for the stable direction for instance. So how to check these hyperbolic times? Well, there is a classical tool: the Pliss lemma. Okay. Sorry, just one question: is there another step, or why is finding the simultaneous hyperbolic times enough to prove the result you're stating? Well, not exactly, because in the definition of a Pesin set, from a point we require contraction, for instance along the stable direction; so that's a hyperbolic time. The other property is that when you move along the orbit, you also have hyperbolic times with a different constant that does not degenerate too much. I will focus on the initial point; I won't talk so much about this other part, which is more technical. I will mention something, but I don't want to go into the technicalities. I'm just trying to understand the logic of this: so you're assuming you have entropy continuity, so you're taking a sequence of measures, and then you're using the conclusion, the fact that the Lyapunov exponents converge, to show that you have these simultaneous hyperbolic times, which basically means for the stable and the unstable; but it's not yet a Pesin set. Okay, so that's the idea. So let me focus on one difficulty:
The difficulty is the word "simultaneous" here. If you have a vector direction that is contracted asymptotically, then the classical Pliss lemma tells you that for many iterates in the future you get hyperbolic times, that is, times from which the contraction is good for all further forward iterates. So the Pliss lemma tells you that you have many such hyperbolic times for the stable direction, meaning that the proportion of iterates satisfying this is non-zero. Then the Pliss lemma applied to the other space, the unstable, will tell you the same for backward iteration. So, for the measure, it tells you that you have a set with positive measure of points that are hyperbolic times for the stable direction, and another set of hyperbolic times for the past. If you don't control them, these sets can be disjoint when their measures are not large enough. So the goal is to improve this, to get measure close to one in both cases, and then to take the intersection. I focus here on the stable direction, but the conclusion is that we get a set of points which are hyperbolic times and which has large measure.

So why is the entropy continuity of the exponents useful? Because the entropy continuity of the Lyapunov exponents implies that if you have measures mu_k converging to mu as in the property, then the Oseledets splittings of the measures mu_k have to converge to the Oseledets splitting of the measure mu, in a weak-star sense, meaning that on sets with large measure the stable and unstable directions are close to those of the measure mu. Why?
Because at the iterates where the splitting of mu_k does not fit with the splitting of mu, if you want to check the contraction of the stable direction, the stable of mu_k is not aligned with the stable of mu, and then you will not see the full contraction of mu: you will see a smaller contraction, and when you continue to iterate you never recover it. So for a measure to have Lyapunov exponents similar to those of mu, it really must not waste time, and so most of the time its splitting needs to be aligned with the stable of mu. OK, so this is why the entropy continuity of the exponents forces the Oseledets splittings to match on sets with large measure.

So this is the first point. The second is to first apply Oseledets, or Birkhoff, to the initial measure mu and say that on a set with large measure, if you iterate enough, you see the Lyapunov exponent by iterating a fixed number of times, chosen large. From the continuity of the Oseledets splittings we have obtained, this property can be passed to the measures mu_k: the measures mu_k not only have close Lyapunov exponents, but also the time needed to see the Lyapunov exponents of mu_k, when k is large, is similar to the time for mu. So you get some uniformity now. OK, good.

And now let's apply the Pliss lemma that I mentioned. You want to check the contraction, and you know that the Pliss lemma will tell you that you have many hyperbolic times; but the Pliss lemma will only give you some proportion, some measure of points satisfying this, and this measure a priori may not be so good. Asymptotically you have some contraction, but maybe there are places where you see a much stronger contraction; and if for some iterate you have a very strong contraction, then it allows you, after that, to have an expansion, and the iterate where you have the expansion cannot be a hyperbolic time for the stable space. So you should not have places where the contraction is too strong, and this
is why our second point is used: for sets of points with large measure, up to replacing F by a fixed large iterate, the contraction is neither too strong nor too weak. This is not exactly the usual setting of Pliss: usually you just have a uniform bound on the linear cocycle you are iterating; here it's a modified version of Pliss where, instead of that uniform upper or lower bound, you have a bound satisfied only on a set with large measure. So it's a modified Pliss lemma, whose proof is a consequence of the maximal ergodic theorem. So you have it, and from that you get that the hyperbolic times, up to considering hyperbolic times for an iterate f^n, have measure as close to one as you want. You can do the same for the unstable direction, take the intersection, and you have the simultaneous hyperbolic times.

OK, so this is the main point. Then, as we said, it's not finished: to get a Pesin set you need a bit more. What do you need? You need to control also the oscillations, so that the constants do not degenerate. Once you have obtained a set with large measure, you may look at the returns to this set, at the frequency of returns to this set, and apply the Pliss lemma again, so that for many points the number of times you enter this set, when you iterate up to time n, is a good proportion. Having this property is exactly what allows one to control the fluctuation of the constants. It's more technical, but the job has already been done in what I mentioned before. I think the main thing I wanted to say is why the entropy continuity of the exponents can be helpful to build these Pesin sets.

OK, so this is the end of this story: we obtained entropy continuity yesterday, and from entropy continuity the SPR property for smooth surface diffeomorphisms. Let me just mention one more thing, if I still have a few minutes. Yeah, OK.
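Since the Pliss lemma was used at several steps in this argument, let me record one standard additive formulation; the notation and the constant theta are mine, and the modified version mentioned in the talk replaces the uniform bound below by a bound holding on a set of large measure:

```latex
% Pliss lemma (one standard formulation; notation mine).
% Let c_1 < c_2 \le A and set \theta = (c_2 - c_1)/(A - c_1) > 0.
% If a_1, \dots, a_N are reals with a_j \le A for all j and
% \sum_{j=1}^{N} a_j \le c_1 N, then there exist l \ge \theta N
% times 1 \le n_1 < \dots < n_l \le N such that
\[
\sum_{j=n+1}^{n_i} a_j \;\le\; c_2\,(n_i - n)
\qquad \text{for all } 0 \le n < n_i,\ 1 \le i \le l.
\]
```

Taking a_j to be log of the norm of Df restricted to the stable direction along an orbit segment (suitably reversed, depending on whether one wants the contraction forward or backward) turns the times n_i into hyperbolic times, in a proportion at least theta of all iterates.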
Thank you. So I was focused on surfaces; what about higher dimensions? Even for smooth diffeomorphisms, the situation in higher dimensions is different: as I said, you cannot expect SPR in general, you may have no hyperbolic measures. So you need to restrict to some subclasses of diffeomorphisms. Let me mention one; I will just try to give you an idea of how it works in higher dimensions. In this setting we are partially hyperbolic: let's say in dimension three, we have three bundles, one strong stable, one strong unstable, and the center, and I assume the center is one-dimensional. Let's assume that you have an invariant foliation tangent to the center whose leaves are circles. You may consider the product of an Anosov diffeomorphism of T2 times the identity on the circle, and if you deform that, you remain in this class. There are two more assumptions, which you can always obtain by perturbation, even by smooth perturbation. The first one is accessibility: the fact that, traveling along strong stable and strong unstable manifolds, you can go from any point to any other point. The last one is that along the center the dynamics is not isometric; this is guaranteed once you have somewhere a hyperbolic periodic point whose center exponent does not vanish. In this setting, the diffeomorphism is SPR, and from that you deduce the consequences that Omri will describe tomorrow.

This class has been considered several times before. So let me comment on what I discussed before about homoclinic classes. There are open sets inside this class where the system is topologically mixing; nevertheless, you have periodic points whose center exponent is negative and others whose center exponent is positive. So when you consider the Borel homoclinic classes, which specify the center exponent, there are subsets: you have at least two different Borel homoclinic classes, one corresponding to negative and one corresponding to positive center Lyapunov exponent.
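In symbols, the class just described is, with my notation (f_0 being the unperturbed model map mentioned in the talk):

```latex
% Partially hyperbolic splitting with one-dimensional center on a
% 3-manifold, and the model example on T^2 x S^1 (notation mine).
\[
TM \;=\; E^{ss} \oplus E^{c} \oplus E^{uu}, \qquad \dim E^{c} = 1,
\qquad f_0 \;=\; A \times \mathrm{id}_{S^1}
\ \ \text{on}\ \ \mathbb{T}^2 \times S^1,
\]
```

where A is a linear Anosov diffeomorphism of T2; one then considers smooth deformations of f_0 preserving an invariant center foliation by circles, and imposes accessibility and the existence of a hyperbolic periodic point with non-zero center exponent.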
OK, so this shows that Borel homoclinic classes are really finer than topological homoclinic classes in higher dimension. And then, what about measures maximizing the entropy? There is a result: you have a finite number of measures maximizing the entropy that are hyperbolic along the center, and under the assumptions I gave, at least two, one with stable center and one with unstable center. If you want more properties for these measures, you need for instance to check that the system is SPR, and you can do it thanks to Ali Tahzibi and Jiagang Yang: SPR was not their goal, but they proved exactly what is needed. They proved a property which is very similar to the entropy continuity on surfaces, and that directly implies SPR. What they prove, well, part of what they prove, and it's exactly what we use, says that ergodic measures with large entropy have their center exponent uniformly away from zero. So since you have Lyapunov exponents uniformly away from zero, you may play with Pliss, try to build Pesin sets that are uniform for these measures, and check the SPR property. OK, so that was to mention something in higher dimension.

So I have finished. To summarize again, there are three lectures: yesterday Jérôme discussed the continuity of Lyapunov exponents; today I talked about the SPR property, which you can check from the continuity of Lyapunov exponents; and tomorrow Omri will give many consequences. Thank you.

Thank you, Sylvain. Do we have questions or comments? Just a quick comment, based on one of the early questions that Stefano asked: if you remember the Thurston diffeomorphisms on surfaces that were smoothed out by Gerber and Katok, you can see C-infinity mixing diffeomorphisms which are not SPR: they have these Hamiltonian-like critical points near the Thurston singularities. That's a great talk, Sylvain. Thank you, Sylvain. I have a question regarding one of the uses of the lemma you mentioned, the Pliss lemma, or the
actually the refined Pliss lemma, yes. Actually a technical question: say you have a hyperbolic orbit with nice enough properties, for example that its Pesin constants are tempered with some epsilon (epsilon-temperedness). Can you get some stronger properties about the decay, or bounds on the growth, of the Pesin constants? What happens when you change epsilon? Is there some smaller, finer structure that applies to all epsilon? Yeah, so this is something we do. OK, so I think I will talk about it tomorrow; OK, but there will be a lot of suspense. Let me say shortly that our definition of the SPR property requires something for all small epsilon. We are only interested in small epsilon, but in principle, to check SPR, you need to check it for all epsilon. In fact, from what we will tell tomorrow, it's possible to check SPR by looking at just one epsilon, provided it's small enough. I won't say why; maybe... I don't know; I will wait. Thank you. Thank you for the talk.

So, any other question or comment? I have one: I didn't understand what the relation is between the usual homoclinic classes and the Borel classes. Yes. So, for instance, the usual ones, they can only intersect along sets with zero topological entropy. If you think of smooth surfaces, they are not that different: they correspond. If you take a Borel homoclinic class, its closure is a topological homoclinic class, and a topological homoclinic class contains only one non-trivial Borel homoclinic class, because of this intersection property. But when you consider other systems in higher dimension, this is no longer true, and that's also why I wanted to give this example in higher dimension, on T3. On T3, at least for some of the systems I described, T3 is a single topological homoclinic class. To be more precise, you may have periodic orbits with different behaviors, one which is stable in the center, another which is unstable in the center. So there are two periodic orbits, but if you build
the topological homoclinic classes, as sets, they are the same: they coincide with T3. So their intersection is huge; you don't have this property at all. Whereas if you only consider Borel subsets, you only consider points that have stable and unstable manifolds and which are homoclinically related to one of the two different periodic orbits; then you will distinguish these two behaviors, stable or unstable in the center, at least, and maybe more, and you get finer classes, which are essential to introduce in higher dimension. OK, it's clear now. It allows you to understand finer things on surfaces too, but there it's not key, because the topological classes are enough; but otherwise no, you need them.

So, does anyone have any other question or comment for Sylvain? OK, I think not, so let's thank Sylvain again. Thank you, Sylvain. It's good to see you all, and I will see you again tomorrow at the same time, at the same place; tomorrow we will have, I think, Agnieszka and Omri speaking. OK, so goodbye everybody, see you tomorrow. Sylvain, thank you.