So, today there will be no homogeneous space; there will be a projective space, but no homogeneous space. It will be a talk about stationary measures. First, generalities about stationary measures and Markov operators, and then, if we have enough time, I will discuss the applications of the notion of stationary measure to results in the study of products of random matrices. The purpose of this introduction about stationary measures is two-fold: first, to introduce some general tools that we will need in the study of dynamics on homogeneous spaces, and also to prove results about products of random matrices that we will likewise use in our study of homogeneous dynamics. OK, so let me recall the general definition I gave you last time. Assume X is a locally compact space, and I will assume that it is second countable. (This is very practical when you give the talk in English, because you have this notion of being second countable; there is no classical translation in French, where it is something like "metrizable and sigma-compact", so it's better.) And I will define a Markov-Feller chain, or a Feller Markov chain, whichever you prefer, so everybody is happy. This is a map x ↦ P_x from X to P(X); I will usually denote by P(X) the space of Borel probability measures on a locally compact space X. And I will ask this map to satisfy the following natural continuity property.
I assume that for every φ which is continuous and bounded on X, the function Pφ(x) = ∫_X φ(y) dP_x(y) is continuous in x. For me, this is the random analogue of a dynamical system: I have my space, maybe not compact, and when I start from a point I go to another point, but this point is not deterministic. For example, you can take P_x to be the Dirac mass δ_{Tx}, where T is a continuous map; but now I allow the image to be random, that is, from x I go to a point which is distributed with respect to the probability measure P_x. In fact, we could define the notion of a Borel Markov chain and everything would work, but this is not very useful here, because all the Markov chains we will encounter will be of this form. Much of what I will say only depends on the Borel structure of X, so I could take X to be a standard Borel space, etc., but this is just abstract formalism. Now, in this situation (what I introduce is very classical in probability theory), one has a probability measure giving the distribution of the first image of x. But one can also look at the second image of x, which gives a new probability measure. What is this probability measure? It is an average: the distribution of the second image of x is ∫_X P_y dP_x(y). But I can also look at the joint distribution of the first two images: this is a probability measure on X × X.
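As an aside, here is a small sketch of these definitions in code (a hypothetical 3-point space with made-up transition probabilities, purely for illustration, not an example from the lecture): on a finite space, the chain x ↦ P_x is a stochastic matrix whose row x is P_x, the operator Pφ(x) = ∫_X φ dP_x is a matrix-vector product, and the deterministic case P_x = δ_{Tx} corresponds to a 0/1 matrix.

```python
import numpy as np

# Hypothetical 3-point space X = {0, 1, 2}: the Markov chain x -> P_x is
# encoded as a stochastic matrix whose row x is the probability measure P_x.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5]])

def apply_markov_operator(P, phi):
    """(P phi)(x) = integral of phi with respect to P_x = sum_y P[x, y] * phi(y)."""
    return P @ phi

phi = np.array([1.0, 2.0, 3.0])
Pphi = apply_markov_operator(P, phi)

# Deterministic special case: P_x = delta_{T x} for a map T.
T = [1, 2, 0]
P_det = np.eye(3)[T]              # row x is the Dirac mass at T(x)
assert np.allclose(apply_markov_operator(P_det, phi), phi[T])
```

In this finite picture the continuity of x ↦ P_x is vacuous; on a genuine locally compact space it is the Feller condition stated above.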
It is the probability measure on X × X whose projection on the first factor is P_x and whose projection on the second factor is the measure ∫_X P_y dP_x(y): if I forget the first image, that is what I get. In fact, this joint distribution is ∫_X (δ_y ⊗ P_y) dP_x(y): when the first image is y, the second image is distributed with respect to P_y. So this is a fibered measure: the base measure is P_x, and the measure on the fiber over y is P_y. This is what I write formally as the measure above, the joint distribution of the first two images. And now I can iterate this process. I won't write the formulae, but you can look at the joint distribution of all the images, and this measure exists. This is a theorem, not a difficult one: a standard fact from abstract measure theory. From the data of the distribution of the first image, by a standard iteration process, I can define the joint distribution of all the images. This will be a measure on the set W of trajectories: I take countably many copies of X, W = X^ℕ, and I want to put on it the distribution of the trajectories which start from x. Proposition: equip W with the product sigma-algebra of the Borel sigma-algebra of X; then there exists (and it is even unique) a map from X to the probability measures on W which associates to a point x a measure, which I will denote ω_x, the distribution of the trajectories which start from x, and this measure satisfies an equation which is as follows.
The equation that ω_x satisfies is this: the coordinate at time zero is x, and then ω_x = ∫_X (δ_x ⊗ ω_y) dP_x(y); that is, if the first step lands at y, the sequence of further coordinates is distributed as a trajectory starting from y. So there is a map x ↦ ω_x which is Borel and satisfies this property. To prove it, you write the natural distribution of the n first coordinates and, by standard arguments about measures on product spaces, you show that you have a limit and that it satisfies this property. These are called Markov measures. In probability theory they are denoted ℙ_x, but I sometimes find this notation a little confusing, so I like to denote them ω_x; my co-author doesn't share this point of view, which sometimes makes communication difficult. Anyway, there exists such a map, giving the distribution of the trajectories which start from x. And now, if ν is a Radon measure on X (I will come back to this point in one minute), I will denote by ω_ν the average: ω_ν = ∫_X ω_x dν(x). This is a Radon measure on W. This is no longer the distribution of trajectories from a fixed starting point: you take a starting point that is itself distributed according to ν. And the proposition answers the following question: for any such ν, we have equivalence between saying that ν is P-invariant, P_*ν = ν (if you have an operator on functions, you have an adjoint operator on measures), and saying that ω_ν is shift-invariant.
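The two-step joint distribution and the iteration producing ω_x can be made concrete in the same kind of finite sketch (again a hypothetical 3-point chain; the numbers are made up):

```python
import itertools
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5]])

# Joint law of the first two images started at x: the fibered measure
# integral of (delta_y tensor P_y) dP_x(y), i.e. J[y, z] = P[x, y] * P[y, z].
x = 0
J = P[x][:, None] * P
assert np.allclose(J.sum(axis=1), P[x])         # first marginal is P_x
assert np.allclose(J.sum(axis=0), (P @ P)[x])   # second marginal is (P^2)_x

# Iterating gives the Markov measure omega_x on cylinders of any finite length.
def cylinder_measure(P, x, n):
    """Probability of each trajectory (x, w_1, ..., w_n) under omega_x."""
    law = {}
    for path in itertools.product(range(P.shape[0]), repeat=n):
        p, prev = 1.0, x
        for y in path:
            p *= P[prev, y]
            prev = y
        law[path] = p
    return law

law = cylinder_measure(P, 0, 3)
assert abs(sum(law.values()) - 1.0) < 1e-12     # a probability measure on cylinders
```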
So on the space X you have, instead of a deterministic dynamics, a random dynamics, and saying that the measure ν is invariant is saying that it is invariant on average under this dynamics. When the Markov operator comes from the action of a group equipped with a measure, this is the property of being stationary. Now, to the random dynamical system you can associate a deterministic dynamical system: you look at the shift map on the space of trajectories. Once you fix the whole trajectory, the dynamics becomes deterministic, and it is just forgetting the first letter. So if you start with a measure invariant under the random dynamics, you can associate to it a natural measure on the space of trajectories, and invariance under the operator is equivalent to the property that this measure on the space of trajectories is invariant under the shift map. In general this may not look very useful, because the shift space is a very large space, but it tells you why properties that are true for deterministic dynamics, such as the fact that every finite invariant measure is an average of ergodic finite measures, etc., also hold for random dynamics, that is, for Markov operators: just because you have this nice correspondence. Be careful: I am not saying that from any measure invariant under the shift map you can recover a stationary measure; this is only true for measures of a very particular form, those obtained by this process. OK, so now let us speak precisely about ergodicity. Definition: if ν is a measure that is invariant under the operator P, we say that ν is P-ergodic if and only if every function φ in L^∞(X, ν) which is P-invariant, Pφ = φ, is constant ν-almost everywhere.
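In the same finite sketch (hypothetical chain, made-up numbers), P-ergodicity of the invariant measure can be read off the spectrum: an invariant ν is P-ergodic exactly when the only bounded solutions of Pφ = φ are the constants, i.e. when the eigenvalue 1 of P is simple.

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5]])

# The P-invariant probability measure nu: nu P = nu (this P happens to be
# doubly stochastic, so nu is the uniform measure).
w, vl = np.linalg.eig(P.T)
nu = np.real(vl[:, np.argmax(np.real(w))])
nu = nu / nu.sum()
assert np.allclose(nu @ P, nu)

# The eigenvalue 1 of P acting on functions is simple, so every bounded
# harmonic function is constant and nu is P-ergodic.
eigvals = np.linalg.eigvals(P)
assert np.sum(np.isclose(eigvals, 1.0)) == 1
```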
This is really the analogue of the usual notion of ergodicity for transformations which preserve a measure, and now again I claim that there is a link between the ergodicity of P with respect to ν and the ergodicity of the shift map T with respect to ω_ν. If I want to draw this link, I need to relate functions that are P-invariant (such a function is sometimes called P-harmonic) to functions on the shift space that are T-invariant. This is not a direct link; it needs a construction, and the construction relies on martingale convergence, so we need some more probability theory. There is a general fact: to a harmonic function one can associate a martingale. This is what I will construct now. Let me recall what a martingale is. If (X, A, μ) is a probability space and (B_n) is an increasing sequence of sub-sigma-algebras of A, I will say that a sequence (φ_n) in the space L^1 of the probability measure is a martingale with respect to the sequence of sigma-algebras if and only if, for every n, the conditional expectation of φ_{n+1} with respect to B_n is φ_n. The picture is this: you have successive experiments, and at each step you measure the result. The space X is the space of all possible outcomes, and at each step your measurement determines some information, restricting what is possible in the space; but on average, what you will measure at step n + 1 is what you had at step n. For example: an experiment gives you 1 with probability 1/2 and gives you minus 1 with probability 1/2; if you take the sum of the results of independent such experiments, then, since each has zero expectation, the sequence of sums satisfies this property, so it is a martingale. For example, at each
step you can change the distribution: as long as the increments keep zero average, you still have a martingale. OK? And there will soon come an example which is not of this form. So there is a theorem. Of course, if you take a function (I will denote by B the sigma-algebra generated by all my intermediate sub-sigma-algebras B_n) and you take its conditional expectations with respect to the algebras B_n, this gives you a martingale. The theorem is a kind of converse to this statement. Doob's theorem: assume (φ_n) is a martingale contained in the space L^p for some p strictly larger than 1, and that it is uniformly bounded there; then there exists some φ in L^p such that, for every n, the conditional expectation of φ with respect to B_n is equal to φ_n. OK? The theorem is false for p equal to 1; I am not sure I can give you a counterexample at once, but I know that it is false for p equal to 1. You need p to be larger than 1, and for us this won't be a problem.
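Doob's construction in miniature (a hedged toy model, not from the lecture): take φ(w) = Σ_k 2^-(k+1) w_k for fair coin flips w_k (truncated at 30 flips here), and let B_n be the information carried by the first n flips. The conditional expectation φ_n = E[φ | B_n] just replaces each unseen flip by its mean 1/2, and one can check both the martingale property and the almost-sure convergence φ_n → φ along a trajectory.

```python
import random

random.seed(0)
N = 30
flips = [random.randint(0, 1) for _ in range(N)]
phi = sum(2.0 ** -(k + 1) * flips[k] for k in range(N))

def phi_n(flips, n):
    """E[phi | B_n]: keep the first n flips, replace the rest by their mean 1/2."""
    seen = sum(2.0 ** -(k + 1) * flips[k] for k in range(n))
    unseen = sum(2.0 ** -(k + 1) * 0.5 for k in range(n, len(flips)))
    return seen + unseen

# Martingale property: averaging phi_{n+1} over the (n+1)-st flip gives phi_n.
for n in range(5):
    avg = 0.5 * phi_n(flips[:n] + [0] + flips[n + 1:], n + 1) \
        + 0.5 * phi_n(flips[:n] + [1] + flips[n + 1:], n + 1)
    assert abs(avg - phi_n(flips, n)) < 1e-12

# Convergence: once every flip has been seen, phi_n equals phi.
assert abs(phi_n(flips, N) - phi) < 1e-12
```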
But for p equal to 1 it is not true. Moreover (I should write this on the other side): φ_n goes to φ, μ-almost everywhere φ_n(x) converges to φ(x), and you also have convergence in the L^p space, except of course if p is infinity; if p is infinity this last part fails in general. OK, so this is a general fact from probability theory. Now my claim is that from the probability space W, equipped with the Markov measure ω_x for a given x, and from a given harmonic function, I will define a martingale, by a very simple formula. Define φ_{n,x} (this is a different φ: the harmonic function on X is φ, and φ_{n,x} is a function on W, which will be my probability space) by φ_{n,x}(w) = φ(w_n), where w is a sequence. I should also define the sigma-algebras: W_n is the sigma-algebra of functions which depend on the n first coordinates; this is very natural. So: I have my point x, then I run the random walk, there are some w_1, w_2, I reach w_n, and I measure the value of the function there. If the dynamics is deterministic, saying that the function is harmonic is just saying that it is invariant under the dynamics: when you have a deterministic dynamics, that is, just one map, then since the function is invariant, what you measure at time n is what you had at time 0, and this sequence is constant. Of course you lose this when the dynamics is random, but you keep the fact that this is a martingale: I claim that (φ_{n,x}) is a martingale with respect to this sequence of sigma-algebras. For this I need a formula for computing conditional expectations, and then to apply this formula to this function. The formula for computing conditional expectations will come from the construction of the measure ω_x. This is not a product measure; when you have a product measure, it is easy to compute conditional expectations: if you
have a space X which is a product of two spaces Y and Z, with a measure on each, and you form the product measure; the sigma-algebra is the tensor product of the given sigma-algebras on Y and Z, and you can see the sigma-algebra of Y as a sub-sigma-algebra of that of X. So if you compute the conditional expectation of a function ψ of the two variables with respect to this sigma-algebra of Y (that is, you want to average against functions which only depend on the coordinate y), what you get, evaluated at a point y, is a function of y only: it is just the integral of ψ(y, ·) over the fiber. This is Fubini; this is the conditional expectation in the product case. But you see that what you need for this formula to hold is not really a product measure: what you need is a fibered measure. You see one factor as a base and the other as a fiber; you have a projected measure m on the base, and then a measure on each fiber. Here, of course, the fiber measure does not depend on the fiber, but when the measure on the fiber depends on the point y of the base, you still have a formula of this form. What happens for ω_x is that, when I fix the first n coordinates, I know the measure on the fiber very well: it is just the Markov measure of the last visited point. This is what the formula says. So if I have a function φ in L^1 of this measure, I have a formula for the conditional expectation with respect to W_n, which comes from the iterated version at time n of the defining property of ω_x: the conditional expectation with respect to W_n, evaluated at a point (and I do not have to write the further coordinates, since this is a function of the n first coordinates only), is E[φ | W_n](w_1, …, w_n) = ∫_W φ(w_1, …, w_{n-1}, w′_0, w′_1, …) dω_{w_n}(w′). Why so? In this formula the points w_1, …, w_{n-1} are fixed, and I build a new sequence; this function φ is a function on the set of sequences, and how do I build a new sequence? I take my n - 1 first coordinates and then I append a random sequence w′, and this random sequence is distributed with respect to the Markov measure at the last step, ω_{w_n}, whose coordinate at time zero is w_n itself. So this is exactly what probabilists mean when they say that this is a Markov process: what happens after time n only depends on where you are. The distribution of the future only depends on the point I have reached; it does not depend on the past. This is why I have such a nice formula: a conditional expectation of this kind always looks like this, but the measure that I put here should a priori depend on everything, whereas, since the process is Markov, it only depends on the last step. This is the Markov property; for n equal to one, it is just the defining formula of ω_x. So now I have a formula for conditional expectations and I can check that my sequence is a martingale; this is just the fact that the function is harmonic. This is very abstract and formal probability theory, but it will stop soon, because it is just what I need. Why is this a martingale? I need to compute the conditional expectation of φ_{n+1,x} with respect to the sigma-algebra W_n and to evaluate it at a point of this form. I take my formula: this is the integral over W of φ_{n+1,x} along the recombined sequence. But what is this φ_{n+1,x}? By definition, it is just the harmonic function φ evaluated at the coordinate at time n + 1, which in the recombined sequence is the coordinate 1 of the sequence w′.
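As an aside, the fibered conditional-expectation formula can be checked numerically; here is a small sketch with a base measure m and fiber measures λ_y depending on the base point (all values hypothetical):

```python
import numpy as np

# Fibered measure on Y x Z: base measure m on Y = {0, 1} and a fiber
# measure lam_y on Z = {0, 1, 2} that depends on the base point y.
m = np.array([0.4, 0.6])
lam = np.array([[0.2, 0.3, 0.5],
                [0.5, 0.5, 0.0]])
joint = m[:, None] * lam                     # joint[y, z] = m(y) * lam_y(z)

def cond_exp_given_base(psi):
    """E[psi | Y](y) = integral of psi(y, .) d lam_y: average over the fiber."""
    return (lam * psi).sum(axis=1)

psi = np.array([[1.0, 2.0, 3.0],
                [4.0, 5.0, 6.0]])
e = cond_exp_given_base(psi)

# Defining property of conditional expectation: for any function h of the
# base variable, integrating h * psi against the joint measure equals
# integrating h * E[psi | Y] against the base measure m.
h = np.array([2.0, -1.0])
assert np.isclose((h[:, None] * psi * joint).sum(), (h * e * m).sum())
```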
So, applying the formula, this is equal to ∫_W φ(w′_1) dω_{w_n}(w′): the sequence w′ is distributed with respect to the Markov measure at w_n, and I only look at its first component. By definition of the Markov measure, the distribution of the coordinate at time 1 is just given by the measure P_{w_n}, so this is ∫_X φ(y) dP_{w_n}(y); this is just the definition of the Markov measure. That is: I have reached the point w_n, and then there is a point w_{n+1}, which I denoted w′_1, and I integrate over all its possible values. But this is exactly Pφ(w_n); this is a very general fact: if I look at the function φ evaluated at w_{n+1} and take the average over all possible values when w_n is fixed, what I get is Pφ(w_n). And since φ is harmonic, Pφ(w_n) = φ(w_n), which is φ_{n,x}(w). So my sequence is a martingale. That is: when you have random invariance, you build martingales; when you have deterministic invariance, you have constant functions along the trajectories, and when you have random invariance, you have martingales along the trajectories. The key point is really the Markov property. Now, since I have a martingale, and since I was careful to take a bounded function, what I have is a uniformly bounded martingale, so by Doob's theorem it admits a limit: when you take a harmonic function and follow a trajectory, there is a limit. By Doob's theorem, there exists some function, call it φ̂_x, which belongs to L^∞(W, ω_x), such that ω_x-almost surely φ(w_n) converges to φ̂_x(w). Now, I cheated a little, because from the beginning I used the fact that the property Pφ = φ holds everywhere, but it does not hold everywhere: my function is an invariant element of L^∞(X, ν),
so this property holds only ν-almost everywhere, which means that the conclusion I wrote holds not for every x but for ν-almost every x, that is, on a set whose complement has ν-measure zero (for the moment ν is not assumed finite, so let me say it this way). For the moment I did not assume ν to have total mass 1, because I will apply this to random walks on groups in one minute, and when the group is not compact, the natural invariant measure is a Haar measure, which is infinite; I want to keep in mind that what I am saying holds when you study random walks on groups. So I have the function, and in fact the dependence on x is contained in the dependence on the first variable: the coordinate at time zero of the trajectory is x. So we get one function φ̂ on W, the limit of the φ(w_n), and in particular φ̂ is shift-invariant. And I will say that φ is the Poisson transform of φ̂. All this is very general; there is nothing here but formalism about Markov operators (this is maybe due to Neveu). So the construction gives a map from L^∞(X, ν)^P, the P-invariant functions in this space (P is the Markov operator, and these are the bounded harmonic functions on the space), to L^∞(W, ω_ν)^T, the T-invariant functions (T is the shift map). And the claim is that this is a bijective isometry. I almost gave the proof; I just didn't give you the converse, the reciprocal map, which is simply this: if you take a T-invariant function φ̂, you send it to the function φ(x) = ∫_W φ̂ dω_x, its integral with respect to the Markov measure: you take the average value. When you have a martingale, the value at step zero is an expectation, so the computation has this form; this is natural: if φ̂ is the limit, then to recover the original function you just have to take the expectation. OK.
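Here is the Poisson transform in the simplest possible sketch (a hypothetical absorbing chain on four points, not an example from the lecture): the shift-invariant function φ̂ records where the trajectory is absorbed, and its Poisson transform φ(x) = ∫_W φ̂ dω_x, the absorption probability, is a bounded P-harmonic function.

```python
import numpy as np

# Hypothetical chain on {0, 1, 2, 3}: states 0 and 3 are absorbing, so every
# trajectory converges, and the limit function hat_phi(w) = 1_{absorbed at 3}
# is shift-invariant.
P = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.0, 0.0, 0.0, 1.0]])

# Its Poisson transform phi(x) = integral of hat_phi d omega_x is the
# probability of absorption at 3 starting from x, computed exactly by
# solving (I - Q) h = r on the transient states {1, 2}.
Q = P[1:3, 1:3]                  # transitions among transient states
r = P[1:3, 3]                    # one-step absorption at 3
h = np.linalg.solve(np.eye(2) - Q, r)
phi = np.array([0.0, h[0], h[1], 1.0])

# phi is indeed a bounded harmonic function: P phi = phi, and it is not
# constant, reflecting the fact that this chain is not ergodic.
assert np.allclose(P @ phi, phi)
```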
[Question:] You also take the expectation, the expectation E[φ̂ | B_1]? Is that correct? Yes, yes: it is exactly the expectation of φ̂ with respect to B_0; now this is the average value, and it is taken with respect to the Markov measure. All the conditional expectations here are taken with respect to the Markov measure ω_x: what you construct is a martingale with respect to this measure, not with respect to the measure ω_ν, which is possibly not finite. What you construct is a martingale with respect to the Markov measure, and then you let x vary in a set of full ν-measure. That is what you do. OK. So maybe all this feels quite formal, as it did to me the first time I encountered it, but I just want to emphasize the fact that there is a correspondence: when you have a Markov operator, there is a limit process on the trajectories, and there is this Poisson transform which allows you to change the point of view. Instead of speaking of the random dynamics on X, you may speak of the deterministic dynamics on the space of trajectories, and these are dual points of view. So, the corollary of this. But before stating corollaries, let me justify the terminology "Poisson transform"; it comes from an example. Example: take X to be the unit disk inside ℂ, and I see it not simply as the unit disk but as the Poincaré disk of the plane, the disk with the Poincaré metric. And you define Pφ(x) as follows: you fix a radius r, and Pφ(x) is 1 over the area of the disk of center x and radius r, times the integral of φ over this disk with respect to the surface measure.
By "surface measure" I should say: the distance here is not the Euclidean distance but the hyperbolic one, and the measure is the one which comes from the Poincaré metric, the Riemannian measure associated with the Poincaré metric. You can say that it is the measure invariant under PSL(2,ℝ) for the natural action of PSL(2,ℝ) on the disk, or under PU(1,1), maybe, if you are familiar with that; and "radius" means hyperbolic radius, a hyperbolic disk. And now, the Poisson transform of this operator identifies with the Poisson transform in the classical sense. That is: when you have a trajectory of this Markov process, starting from a point x, it will converge to a limit point in the boundary; the bounded harmonic functions for this operator are more or less the same as the harmonic functions for the Laplacian, and what you get at the limit is just the boundary value, so that the harmonic function is the Poisson transform of its boundary limit. The reason is the following. I do not want to study Markov chains with continuous time here, for a simple reason: I have absolutely no competence in that. But this operator is just a discretized version of (hyperbolic) Brownian motion. Of course, when you want to understand the harmonic functions, you have the Laplacian: you do not have one discrete operator, you have a one-parameter semigroup. But everything I did for one operator could be done for a one-parameter semigroup of operators, and in this case you would take the semigroup which is the exponential of the Laplacian; what you would get is exactly the Poisson transform in the classical sense. Here it is not quite that, but a useful discretized Poisson transform. The classical Poisson transform on the unit disk is a particular case of all this.
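For reference, the classical Poisson transform on the unit disk that this discussion recovers is the standard formula (not written out explicitly in the lecture): a bounded boundary function φ̂ on S¹ is sent to the bounded harmonic function

```latex
\varphi\bigl(re^{i\theta}\bigr)
  = \frac{1}{2\pi}\int_0^{2\pi}
      \frac{1-r^{2}}{1-2r\cos(\theta-t)+r^{2}}\,
      \hat\varphi\bigl(e^{it}\bigr)\,dt,
  \qquad 0 \le r < 1 .
```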
Doob's theorem, as I mentioned, can also be stated as the fact that when you have a harmonic function and a trajectory of the Brownian motion (here, the hyperbolic Brownian motion), and you follow the harmonic function along the trajectory of the Brownian motion, it has a limit. By the way, in contrast with the classical case, hyperbolic Brownian motion takes an infinite time to reach the boundary, whereas the classical one hits the boundary at some point. And when you follow the value of the harmonic function along the trajectory of the Brownian motion, it has a limit, and this limit is linked to the boundary value of your harmonic function in classical potential theory. [Question:] In this theorem, you define the Poisson transform without defining the Poisson boundary? Yes, because the Poisson boundary here is just the space of ergodic components. But in general, for example when the measure is infinite, it is not very clear what the space of ergodic components is. So I prefer to speak in this way, because if you take the sigma-algebra of T-invariant subsets, it is not clear whether the restriction of the measure ω_ν to this sigma-algebra is sigma-finite; you have this kind of problem. But you can always speak of this space of invariant functions; this space makes sense. In general, if you have a nice notion of a space of ergodic components (when the measure is finite, for example; but this is not the case you want to consider in group theory, in the case of random walks on groups), if the measure ν is finite, then you can see this space as just the L^∞ of the space of ergodic components. In this case, this space of ergodic components is the Poisson boundary, and what you get is the classical notion of the Poisson isomorphism, etc. OK.
OK, so, precisely, the corollary of this is that P is ergodic with respect to ν if and only if T is ergodic with respect to ω_ν. OK. So maybe this was a little abstract; I am sorry. But now I will be more concrete, and we will focus on a particular case, the case of group actions. So again I have a space X which is locally compact and second countable, but now I also have a group G of the same species, that is, also locally compact and second countable, and a continuous action of G on X. So that if μ is a probability measure on G, I can define the associated Markov operator P_μ: P_μφ(x) = ∫_G φ(gx) dμ(g). So instead of acting by one transformation: I have my point, I take a random transformation g, and I jump to gx; but there are several possibilities for g, and I take the average over all the possibilities. Which means that P_μ can be seen as the map sending the point x to the measure μ * δ_x, the convolution product of μ with the Dirac mass at x. This is the class of examples of Markov operators that I will always study.
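A finite sketch of this class of operators (entirely hypothetical: G = ℤ/5ℤ acting on itself by translation, with a made-up measure μ), showing the operator P_μ, the identification of the chain with x ↦ μ * δ_x, and the way a sequence of random group elements generates a trajectory:

```python
import itertools
import numpy as np

n = 5                                      # G = X = Z/5Z, written additively
mu = np.array([0.0, 0.7, 0.0, 0.3, 0.0])   # a probability measure on G

def P_mu(phi):
    """(P_mu phi)(x) = integral over G of phi(g x) d mu(g)."""
    return np.array([sum(mu[g] * phi[(g + x) % n] for g in range(n))
                     for x in range(n)])

def mu_conv_delta(x):
    """The measure mu * delta_x: the law of the random image g x."""
    out = np.zeros(n)
    for g in range(n):
        out[(g + x) % n] += mu[g]
    return out

phi = np.arange(n, dtype=float)
assert np.allclose(P_mu(phi), [mu_conv_delta(x) @ phi for x in range(n)])

# A sequence b = (b_1, b_2, ...) of independent mu-distributed elements of G
# generates the trajectory (x, b_1 x, b_2 b_1 x, ...).
def trajectory(x, b):
    traj = [x]
    for g in b:
        traj.append((g + traj[-1]) % n)
    return tuple(traj)

# Pushing the law of two such elements through this map, the first step of
# the trajectory is distributed according to mu * delta_x, as it should be.
x = 2
first = np.zeros(n)
for b in itertools.product(range(n), repeat=2):
    first[trajectory(x, b)[1]] += mu[b[0]] * mu[b[1]]
assert np.allclose(first, mu_conv_delta(x))
```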
And in this case there is a very classical example, and the existence of this example is the reason why I have always been careful to allow the case where ν is infinite: the case where X is G itself and you take the right action of G on itself. You could also take the left action, of course; this is exactly the same, but for a reason we will see in one minute, I want to take the right action. Then a harmonic function is just a function which satisfies the following: you equip G with a Haar measure (maybe we can assume that G is unimodular, to simplify; this is not very essential), and the condition is that for Haar-almost every g in G, φ(g) = ∫_G φ(gh) dμ(h). To be very formal: since I assume all my actions to be left actions, I should put an inverse here if I want to see the right action as a left action, but I won't put an inverse; if you are not happy, you can replace μ by μ̌, its image under the inverse map. Anyway, this is just a detail. So you have harmonic functions, and the construction I just made identifies the harmonic functions with something on the space W. But what is W here? There is another way of seeing W; in general, in this situation, there is another description of the measure ω_x, because from the measure μ on G you can form a space of sequences. In all these talks, B will denote the set of sequences in G. This is again not a standard notation, but when we started working on this topic, for Benoît and me this was the Bernoulli space, so we denote it by B, for Bernoulli sequences. [Someone: the Benoît space?] Ah, OK, the Benoît space, it's a very good idea; we will keep it. The point is that you could say B is
The point is that you could say that B is more or less the same as W — for example in the case of the random walk of a group on itself — but I don't denote it by the same letter because I don't put the same measure on it. This is not the set of trajectories; it is the set of trials: before moving in the space, I first select all the elements of G that I will let act, and only then do I move in the space. Which means that, given x in X, you have a map from B to W which sends b to the trajectory of x. The first point of the trajectory is x — you start from x; the measure on B I will usually denote by β — then you go to the first point b₁x, then to the second point b₂b₁x, and so on; this is a map from B to W, with x fixed. So for a given x there is such a map, because I have to fix my starting point: for the moment there is no measure ν on X, so to describe ω_x I have to fix the starting point. So there is a map of this form, b ↦ (x, b₁x, b₂b₁x, …); this is a trajectory. And it is not very surprising — it is a lemma, or a sub-lemma; call this map P_x — that ω_x is just the image measure of β under P_x: the Markov measure can be recovered from the measure on the space of trials. Of course, if in the space X there are stabilizers, this map is maybe not a measure-space isomorphism — in general there are cases where you forget something when you push forward — but let me be careful: in all the cases we will study, this will be a measure-space isomorphism. And you also have a third point of view when there is a group action: if you have a measure ν on X, you say that it is μ-stationary if the convolution μ * ν is defined and equal to ν.
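As a toy illustration of the map P_x from the space of trials B to the space of trajectories W — here with G = Z acting on X = Z by addition, a choice of mine, not the lecture's example — one can write:

```python
import random

def trajectory(x, b, n):
    """The map P_x from the Bernoulli space B to the trajectory space W:
    a trial b = (b1, b2, ...) is sent to (x, b1 x, b2 b1 x, ...).
    Illustration with G = Z acting on X = Z by addition, so applying the
    group element b_k on the left means adding b_k."""
    traj = [x]
    for k in range(n):
        x = b[k] + x        # next point of the trajectory: b_k acting on x
        traj.append(x)
    return traj

# beta is the Bernoulli measure: i.i.d. increments, here +1/-1 with equal
# probability (the simple random walk on Z); omega_x = (P_x)_* beta.
random.seed(0)
b = [random.choice([-1, 1]) for _ in range(10)]
print(trajectory(5, b, 10))
```

The starting point x must be fixed before the map makes sense, exactly as in the talk.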
When ν is infinite it is not even clear that the convolution is defined; but when it is defined and equal to ν, you say that the measure is stationary. And you have the following equivalence — again, this is formalism: saying that ν is stationary amounts to saying — and this is almost the same result as above — that when you take the measure ν and you make the product with the measure β, β being this measure on the space of trials, the product measure is invariant under the map that sends (x, b) to (b₁x, Tb), where T is the shift map. So this is almost — my voice is getting lost — this is almost the same theorem as before: when the maps P_x are measure-space isomorphisms, when there are no stabilizers, this is the same theorem. Yes, b₁, because I want the coordinate 0 to be x; these are the sequences with indices at least one, this is just a matter of convention — you want the starting point to be the coordinate 0. So this is just formalism, this is more or less the same theorem; and saying that the stationary measure is ergodic is the same as saying that this system is ergodic. So in this case, if I come back to Poisson, to harmonic functions of the full group: let us go back to the example where X is G and you have the action on the right. So you have a dynamics on B × G, and what I am saying is that the set of harmonic functions — you take L∞(G) and you take the μ-harmonic functions — is the same as the set of T̃-invariant functions, if I denote this map by T̃. So this is the same theorem: you have the Poisson isomorphism. But here the point is that when one speaks of the Poisson boundary of groups, there is an additional structure, which comes from the fact that the group acts on itself on the left, and this left action commutes with the Markov operator on the right. This is a key feature of groups.
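To record the equivalence stated above between stationarity and invariance under the skew shift:

```latex
% \nu is \mu-stationary:
\mu * \nu \;=\; \int_G g_* \nu \; d\mu(g) \;=\; \nu
% if and only if the measure \nu \otimes \beta on X \times B is invariant
% under the skew-product map
\widetilde T : (x, b) \longmapsto (b_1 x,\; Tb),
% where T(b_1, b_2, \dots) = (b_2, b_3, \dots) is the shift on B; and \nu is
% ergodic if and only if (\widetilde T, \nu \otimes \beta) is ergodic.
```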
Actions on the right and on the left commute with each other, and in particular, when you have a harmonic function, you can regularize it on the left, so you can always assume that it is continuous on the left. (Careful: here it is not b₁x, it is multiplication by b₁, because the random walk acts on the right.) So now you have the action of G on the left, and what happens is that in this L∞ space there is an action of G. It is not a strongly continuous action in general, because it is an action on L∞; but the vectors which are strongly continuous form a weakly dense subspace of L∞, due to the fact that when you have a harmonic function you can regularize it on the left, and this gives an approximation in the weak-star topology of L∞. So forget about these technical details. What happens here is that you have another space: the space of vectors which are continuous for the G-action. And here you have a C*-algebra structure: the ambient space is an L∞ space, and what I am describing is a sub-C*-algebra — the product of two continuous vectors is continuous. So this C*-algebra is the algebra of continuous functions on some compact space, which is in general very large, and this is what is called the Poisson boundary: you take the set of continuous vectors here, and, as you have an action, this space is equipped with a continuous action of G. So this is a G-space, and this is what is called the Poisson boundary. But in general you don't have this kind of additional structure: the Poisson boundary comes from the fact that you have a left action of G which commutes with the Markov operator. This is very formal; we will come back to products of random matrices.
So, for example, if G is the free group on two generators and μ is the measure of the simple random walk on the free group, the Poisson boundary here is the boundary of the free group, the Gromov boundary; and in general, when you have a hyperbolic group, under mild conditions, if you have a nice random walk on the hyperbolic group, the space you recover here is exactly the Gromov boundary. This is just to indicate the link between what we are doing, when we handle stationary measures, and the classical theory: there are close links with the theory of Poisson boundaries. So let me now go back to the case of group actions: I want to study stationary measures on homogeneous spaces. I tried to give a general description of stationary measures even in the infinite case, because I wanted to draw the link with Poisson boundaries of groups; but now I want to study dynamics, so I am interested in probability measures: in all the sequel, the stationary measures will be finite measures. For the study of finite stationary measures there is another way — and this is one of the richnesses of the structure of finite stationary measures: when you have a stationary measure, there are two ways to see it. One way is as a probability measure that is invariant on average; the other way is to see it as a family of probability measures that satisfies an equivariance property. For the moment this sounds strange, and I will say precisely what I mean. There is a transformation on the set of stationary measures, and it is totally analogous to the Poisson transformation — this is why I mentioned all this; it will be simpler. So this is a proposition by Furstenberg, this Poisson transformation for stationary measures. What does it say? It says that when you have G acting on X, and you have two probability measures — μ, and a μ-stationary probability measure ν —
then there exists a map from B to the set of probability measures on X, b ↦ ν_b, which is a Borel map, such that for every continuous function φ with compact support on X — every continuous bounded function makes no difference — and for β-almost every b in B, the integral over X of φ(b₁⋯b_n x) dν(x) converges to the integral over X of φ with respect to ν_b. This is very strange, because you see, I want my group to act on the left: I want to act first by b₁, then by b₂, then b₃, etc., so the product b₁⋯b_n is not written in the order of the random walk but somehow in the reverse order — and you will see in one minute where this comes from. So this tells me: I have my measure on my space X, there is some distribution, the measure ν, and then I push it by b₁⋯b_n, and it has a tendency to concentrate on something else. Of course, if the measure is really invariant under the group, then this something else is just the measure itself; but it tells me that when the measure is not invariant, something happens at the limit. It is not very clear what it means, even for me, but I can prove to you why it is true. This is something wonderful, really due to Furstenberg: this idea of starting from a stationary measure and transforming it into something else. And the proof is just the Poisson isomorphism. It comes from the fact that, in this situation, when I have a stationary measure, I can look at the function on the group which sends an element g of the group to the integral over X of φ(gx) dν(x) — careful, this is a measure which depends on the parameter. What happens is that this function is harmonic on the right — this is exactly what stationarity tells you — and since it is harmonic on the right, you have this convergence: this is just the martingale theorem; this is the Poisson boundary, all the construction I made. This is why I wanted to be very careful.
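Written out, the proposition on limit measures above says:

```latex
% There is a Borel map B \to \mathcal{P}(X), \; b \mapsto \nu_b, such that
% for every bounded continuous \varphi on X and \beta-almost every b \in B,
\int_X \varphi(b_1 \cdots b_n \, x)\, d\nu(x)
\;\xrightarrow[\,n \to \infty\,]{}\;
\int_X \varphi \, d\nu_b \, ,
% that is, the pushforwards (b_1 \cdots b_n)_* \nu converge weakly to \nu_b.
```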
I let the group act on the right in the case of group theory because here, when you act on the left on X, you get something harmonic on the right on G. And this is the Poisson isomorphism theorem, but applied just at the neutral element. In fact, of course, I could put any g — then the limit measure would depend on g, I would get a translate — but if this holds at g, it holds at the neutral element for the function composed with g. The point here is that stationarity holds not merely almost everywhere on the group with respect to Haar measure, but everywhere: from this function you can build a martingale in the usual way, and the harmonicity holds for every starting point in the group, in particular at the neutral element. So this function is harmonic. So this is, in fact, the Poisson theorem for measures, but it comes from the Poisson theorem on the group. And now, what are these measures ν_b? You know what happens when you shift b: when you shift b, you forget the first letter, and the limit measure gets multiplied by b₁⁻¹. So almost surely, for β-almost every b, you have the equivariance property ν_b = b₁ ν_{Tb}: if you shift on the shift space, you just let one element of the group act on your measure. And what is ν? This is just the Poisson theorem — the harmonic function is the expectation of the limit martingale — so ν is just the integral over B of ν_b dβ(b). And in fact this is an equivalence — again some form of the Poisson theorem — which tells you that to give a stationary measure, you can give a family of measures ν_b satisfying this equivariance property and take expectations: if you have this property, you directly get that the resulting measure is stationary. So there are two points of view.
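The two identities just described, which together characterize stationarity, read:

```latex
% Shift equivariance, for \beta-almost every b \in B:
\nu_b \;=\; (b_1)_* \, \nu_{Tb}
\qquad \Longleftrightarrow \qquad
\nu_{Tb} \;=\; (b_1^{-1})_* \, \nu_b \, ,
% and recovery of the stationary measure by taking expectations:
\nu \;=\; \int_B \nu_b \; d\beta(b).
% Conversely, averaging any Borel equivariant family (\nu_b)_{b \in B}
% produces a \mu-stationary measure.
```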
What do you lose when you take stationary measures instead of invariant ones? What you lose is invariance, of course — invariance would imply that the family b ↦ ν_b is constant — but you still have something at the limit. When you take harmonic functions, they are not invariant, but there is something at the limit; and for stationary probability measures you have the same kind of property: you lose invariance, but you get a limit object. And you can either study the stationary measure in itself, or study the equivariant family of limit measures. It turns out that in all the study of homogeneous dynamics I will carry out, this is what I will study: I won't study the stationary measure in itself, I will study the limit measures ν_b. So now I will focus — let me check I didn't forget anything — OK: now we go back to Lie groups, and I will describe what happens. You will see, this was the origin of Furstenberg's interest in this theory: the description of products of random matrices. So, are you OK with all this?
I'm sorry, this is a little abstract; but the key point is that we will use this relation very, very often, so one has to try to understand where it comes from. I cannot say that it is completely transparent, even for me, but the key point is that when you have harmonicity, you have limit objects — you have harmonicity on the disc, you have limit objects, etc. This is the idea. So now I am studying products of random matrices. I take V a real vector space, and I take μ a probability measure on the linear group GL(V), and I will study the stationary measures on the projective space P(V), the set of vector lines in V. I will always denote by Γ_μ, inside GL(V), the sub-semigroup spanned by the support of μ. And here is the theorem, by Furstenberg. Assume that Γ_μ is proximal and strongly irreducible — I will explain in one minute what this means. Then there exists a unique μ-stationary probability measure ν on the projective space, and for β-almost every b in B, ν_b is just a Dirac mass. So first I need — maybe I should erase this — [there was a question] — no: there is a unique μ-stationary probability measure ν on P(V), and for this probability measure you still have the limit measures ν_b, and in this case these are Dirac masses. So let me explain the words in the statement. You say that Γ_μ is irreducible, in the usual sense, if for any subspace W of V which is invariant, W is the zero space or W is the full space: this is an irreducible sub-semigroup. And you say that Γ_μ is strongly irreducible — in fact this is not a property of the semigroup but of the full group spanned by the support, but you will see in the proof that the key role is played by the sub-semigroup spanned by the support — if the same holds not only for invariant subspaces but for invariant finite unions of subspaces: you consider any finite family of subspaces W₁, …, W_r whose union is invariant.
If such a finite union is invariant, then either there exists some i such that W_i is the full space, or for every i, W_i is the zero subspace. And this is equivalent to saying that not only does Γ_μ act irreducibly, but so do all its finite-index subgroups: saying that Γ_μ acts strongly irreducibly is the same for the semigroup as for the full group it spans, and in case Γ_μ is a group, it amounts to saying that every finite-index subgroup acts irreducibly. So this is the contrary of an irreducible action of a finite group: an irreducible action of a finite group is never strongly irreducible. You want to avoid finite group actions, of course, because if the group is finite, then every orbit in the projective space carries a stationary measure, so the theorem cannot hold. And this is not the only counterexample — this is a very trivial counterexample — but you really need this assumption of strong irreducibility. So now, what is proximality, and why do you even need to speak about the semigroup spanned by the support instead of the group? Not for strong irreducibility, but for proximality. So let me think one minute — maybe I'm cheating you: in this case, over the real numbers, this is the same; if the subgroup spanned by the support is proximal, so is the semigroup. But this will not be evident in the proof, because the proof works over any locally compact field — in particular it works over Q_p — and over Q_p this is genuinely a stronger assumption: you can have a sub-semigroup that is not proximal whereas the full group spanned by it is. Over the reals the difference disappears as soon as the action is strongly irreducible; of course, if it is not strongly irreducible, you can take this matrix... but I did not explain the assumption yet — one minute. I say that an element g of GL(V) is proximal if you can split V as the direct sum of a line and a hyperplane.
Both the line and the hyperplane W are stable by g, and the spectral radius of g on W is strictly smaller than the modulus of the eigenvalue λ on the line. That is, as soon as you are out of W, the asymptotic behavior of the linear orbits under the iterates of g is governed by λ: you don't see W. This amounts to saying that g has an attracting fixed point in the projective space. One direction is clear, because everything outside P(W) converges to the attracting point; and the converse follows from a reduction of matrices — you have to use the Jordan form to see that, as soon as you have several eigenvalues of maximal modulus, or a unipotent part on top, you don't have an attracting fixed point. So, to come back to your question: the semigroup Γ_μ is proximal if there exists g in Γ_μ that is proximal. Of course, if you take, in SL₃, the sub-semigroup of the matrices of this form, with n non-negative, it is proximal, but the inverse semigroup is not proximal; so proximality of the sub-semigroup is not the same as proximality of the group spanned by Γ_μ in general. It turns out — and it requires some work — that over the field of real numbers, if you are strongly irreducible, the semigroup is proximal if and only if the group is: this is the same. Today I will not use the fact that I am working over the real numbers; I will just use the fact that I have a local field, maybe of characteristic zero, maybe not even that. In SL₃(Q_p) you can build a Zariski-dense sub-semigroup which is not proximal and for which the inverse semigroup is proximal: you only have matrices whose Jordan form is of this particular shape in your semigroup. So you realize that I also have the p-adic case in mind when I am writing all these theorems.
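Proximality can be tested numerically: g is proximal exactly when its eigenvalue of maximal modulus is unique and simple, and then power iteration finds the attracting fixed point in projective space. A small sketch with diagonal matrices of my own choosing (not the board's examples, and not normalized to determinant one): the first matrix is proximal, while its inverse has a repeated top modulus and is not.

```python
# g is proximal iff it has a unique, simple eigenvalue of maximal modulus;
# then g has an attracting fixed point in P(V), found by power iteration.

def apply3(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def normalize(v):
    s = max(abs(c) for c in v) or 1.0
    v = tuple(c / s for c in v)
    # fix the sign so that the dominant coordinate is positive
    return v if max(v, key=abs) > 0 else tuple(-c for c in v)

def limit_direction(m, v, n=200):
    for _ in range(n):
        v = normalize(apply3(m, v))
    return v

g     = [[4, 0, 0], [0, 1, 0], [0, 0, 1]]     # moduli (4, 1, 1): proximal
g_inv = [[0.25, 0, 0], [0, 1, 0], [0, 0, 1]]  # moduli (1/4, 1, 1): top modulus repeated, not proximal

u = limit_direction(g, (1.0, 2.0, 3.0))
w = limit_direction(g, (1.0, -5.0, 7.0))
print(u, w)     # proximal: every generic line converges to the same point R.e1

u2 = limit_direction(g_inv, (1.0, 2.0, 3.0))
w2 = limit_direction(g_inv, (1.0, -5.0, 7.0))
print(u2, w2)   # not proximal: the limit stays in the plane <e2, e3> and depends on the start
```

The second pair of limits differ, which is exactly the failure of an attracting fixed point.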
And when this theory was built, this result — that the assumption on the semigroup is the same as the assumption on the group — was not known. The fact that over the real numbers the property of proximality only depends on the Zariski closure of Γ_μ, and not on Γ_μ itself, is not true over Q_p: SL₃(Q_p) is proximal, but it has Zariski-dense sub-semigroups which are not. Over the real numbers, for a strongly irreducible semigroup, if proximality holds for the group it holds for the semigroup, because they have the same Zariski closure. But these results about Zariski-dense subgroups or sub-semigroups of semisimple linear groups over R were not known to Furstenberg — they are known to him now, but fifty years ago they were not — so if you look at his papers, he writes it this way; in fact he was also dealing with the p-adic case without knowing it, somehow. So I will prove this theorem, in several steps. The first step is to prove a very general fact, which we will use today but which we will also use often later. It is as follows: assume that you have a probability measure μ on GL(V) such that Γ_μ is strongly irreducible, and assume that ν is a μ-stationary probability measure on the projective space. Then, I claim, due to the strong irreducibility, your probability measure cannot see subspaces: for every proper nonzero subspace W, the measure ν(P(W)) of the projectivization of W is 0. This is a quantitative form of irreducibility: since you do not preserve subspaces, you cannot give measure to them. The proof — this is a lemma by Furstenberg too, and we will encounter it several times, it is very important in this theory — is as follows: you let d be the smallest dimension of a subspace W of V with ν(P(W)) > 0.
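The continuation of the proof, in the standard form of Furstenberg's argument (a reconstruction, not the speaker's words):

```latex
% Sketch (standard argument): with d the smallest dimension of a subspace
% W \subset V such that \nu(\mathbb{P}(W)) > 0, set
m \;=\; \sup \{\, \nu(\mathbb{P}(W)) \;:\; \dim W = d \,\} \;>\; 0 .
% Two distinct d-dimensional subspaces meet in dimension < d, so by
% minimality of d their projectivizations are \nu-essentially disjoint, and
% \mathcal{F} = \{\, W : \dim W = d, \ \nu(\mathbb{P}(W)) = m \,\} is finite.
% Stationarity gives, for each W \in \mathcal{F},
\nu(\mathbb{P}(W)) \;=\; \int_G \nu\big(\mathbb{P}(g^{-1} W)\big)\, d\mu(g),
% so \nu(\mathbb{P}(g^{-1}W)) = m for \mu-a.e. g, i.e. the finite union of
% the W \in \mathcal{F} is \Gamma_\mu-invariant. Strong irreducibility then
% forces some W \in \mathcal{F} to be V itself, hence d = \dim V and no
% proper subspace has positive \nu-mass.
```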
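Furstenberg's theorem itself can be watched numerically. A toy sketch of my own (these two positive matrices are an assumption, not from the lecture): they generate a strongly irreducible, proximal sub-semigroup of GL(2, R), so the products b₁⋯bₙ contract the projective line and the pushforwards of any measure concentrate near a single point, reflecting that the limit measures ν_b are Dirac masses.

```python
import random, math

# Two positive matrices (my choice); we form P = b1 b2 ... bn, as in the
# proposition on limit measures, and watch it send two different lines of
# P(R^2) to (numerically) the same line.
A  = [[2.0, 1.0], [1.0, 1.0]]
B2 = [[1.0, 1.0], [1.0, 2.0]]

def matmul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def direction(v):
    """A line of P(R^2) as an angle in [0, pi)."""
    return math.atan2(v[1], v[0]) % math.pi

random.seed(1)
P = [[1.0, 0.0], [0.0, 1.0]]
for _ in range(40):
    P = matmul(P, random.choice([A, B2]))    # P = b1 b2 ... bn
    s = max(abs(c) for row in P for c in row)
    P = [[c / s for c in row] for row in P]  # renormalize to avoid overflow

u = direction((P[0][0], P[1][0]))    # image of the x-axis
w = direction((P[0][1], P[1][1]))    # image of the y-axis
gap = min(abs(u - w), math.pi - abs(u - w))
print(gap)    # essentially zero: (b1...bn)_* nu concentrates at one point
```

The gap between the two image lines shrinks geometrically with n, which is the numerical shadow of the Dirac-mass conclusion.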