It's a wonderful place and it's a wonderful conference; I'm really pleased to give a talk here. Thanks a lot. I will be talking about infinite-dimensional groups, the corresponding actions, and the ergodic measures. So my talk will be really close to the lectures of Yanxi Xu. I want to start with our favorite example, de Finetti's theorem. I will stress a couple of things that are important for me, and then I'll show the similar pattern in several other examples. So we have a sequence of random variables, and I want to permute them somehow, like this: x3, x1, x2. And I'm interested in when the corresponding joint distributions are the same here and there. This will be the definition of exchangeable sequences. Or, speaking in other words, let me say that we have only the values zero and one; then this is just a random sequence of zeros and ones, I have the infinite symmetric group acting on this space, and my question is: what are the corresponding invariant measures? The answer is de Finetti's theorem. Of course, if we have something independent and identically distributed, we can exchange the variables and nothing changes. If I have only zero and one here, I have only one parameter: the probability of one. This means that if I consider the measures mu_p, just the Bernoulli measures with parameter p, they are certainly exchangeable. In fact, the first part of de Finetti's theorem tells us that these mu_p, for p from zero to one, zero and one included, are all the ergodic measures. Now, if we're interested in invariant measures, then we can consider, for example, the sequence that is all zeros with probability one half and all ones with probability one half. It is certainly exchangeable, and it is certainly extremely far from independent: the values are just the same all the time. And we can say that it is just the mixture of mu_0, sitting on the identically zero sequence, plus mu_1, sitting on the identically one sequence.
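As a small sanity check of exchangeability, here is a sketch in Python. I pick the uniform distribution on [0, 1] as a hypothetical spectral measure; then the probability of a word of zeros and ones is an exact Beta integral and visibly depends only on the number of ones, which is exchangeability.

```python
from fractions import Fraction
from itertools import product
from math import factorial

def word_prob(word):
    # P(X1=w1,...,Xk=wk) = integral of p^s (1-p)^(k-s) dp over [0,1]
    # = s!(k-s)!/(k+1)!  (Beta integral), s = number of ones in the word.
    k, s = len(word), sum(word)
    return Fraction(factorial(s) * factorial(k - s), factorial(k + 1))

# The probability depends only on the number of ones, so every
# permutation of a word has the same probability: exchangeability.
probs = {w: word_prob(w) for w in product((0, 1), repeat=3)}
assert probs[(1, 0, 0)] == probs[(0, 1, 0)] == probs[(0, 0, 1)]
```

The uniform spectral measure is my choice for the illustration; any distribution on [0, 1] would do, only the Beta integral would be replaced by the corresponding mixture.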
We can take any mixture of this kind, which means that I take my measures mu_p and integrate them with respect to some measure, which I will call the spectral measure. Of course, I have to tell you what this means. Let me denote the resulting measure by mu. If I want to integrate some function, say a continuous bounded function f, with respect to my mu, this just means that I consider the integrals of f with respect to my measures mu_p; this depends on p, and I integrate it with respect to my spectral measure. This might look complicated, but think of the picture as follows. We have the interval [0, 1]; we take any distribution on [0, 1]; we pick some particular point as p; and then we consider the corresponding sequence with this Bernoulli(p) distribution. Thus in these two steps we recover all the sequences corresponding to my measure mu on this space. So this is just a simple two-step procedure. And what I want to tell you is that a similar procedure can be carried out in a much more general situation. Let me describe what the more general situation is. Instead of just S_infinity, which is the inductive limit of the finite symmetric groups, I consider a sequence of compact groups K(n), and I will be interested in their inductive limit. Think of it just as the union of my infinite sequence with respect to these inclusions. Sometimes we call it an inductively compact group. I need an action on some space; metric and separable will be enough. So the setting is quite general. The question is whether we can say something about the ergodic measures here. And if you have been present at the previous lectures, you know that we can. Yanxi told us about the ergodic method, which is as follows. I want to take a point in my space X. I want to consider a mapping of the following type, from my group to the space, which just maps an element of the group to the corresponding element of the orbit. Here K(n) is the compact group, and I have the Haar measure sitting on it.
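The two-step procedure can be sketched directly. This is a minimal simulation, assuming the spectral measure is given as a sampler; with the measure (1/2)delta_0 + (1/2)delta_1 from the talk, the resulting sequence is constant, as claimed.

```python
import random

def sample_exchangeable(spectral_sample, n, rng):
    """Two-step procedure: first draw p from the spectral measure,
    then draw an i.i.d. Bernoulli(p) sequence of length n."""
    p = spectral_sample(rng)
    return [1 if rng.random() < p else 0 for _ in range(n)]

rng = random.Random(0)
# Spectral measure (1/2) delta_0 + (1/2) delta_1 from the talk:
# p is 0 or 1, so the sequence is all zeros or all ones.
seq = sample_exchangeable(lambda r: r.choice([0.0, 1.0]), 10, rng)
assert len(set(seq)) == 1
```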
The Haar measure was denoted m_K(n), if I'm not mistaken, and I consider the corresponding push-forward, which is just m_{K(n),x}. The group is compact, so this is well defined. This means that I consider the orbit of the particular point and the push-forward of my Haar measure to this orbit. That's it. What is the theorem? It is a theorem of Vershik: take an ergodic measure nu and almost any point. What will we have? We consider these measures sitting on the orbits, and they converge weakly to my ergodic measure nu, for almost every point with respect to this measure. You have to get a bit used to it to understand what it means. I hope that some of you understood it when we spoke on Wednesday about the infinite symmetric group; there we could apply this directly. In this subject, usually, when you know something about the infinite symmetric group, you want to understand the similar results for the infinite unitary group. All right, let me write it like this, otherwise it will not fit. If we have a group of this type, it is neither compact nor locally compact. One of the natural questions for this group is: what is the substitute for the Haar measure? For a locally compact group we have an obvious answer; here there is no obvious answer at all. We can try to consider the corresponding projective limits of the unitary groups, but then we have to define how to project a bigger unitary group to a smaller one. This can be done in several different ways, and it is not the easiest way to proceed. We will try to simplify things by passing from the unitary group to the Hermitian matrices, via the Cayley transform. This means, if I'm not mistaken, something like this: I take the matrix here, and it should be more or less like this. It is not everywhere one-to-one, but if we forget about a set of measure zero sitting here, then it will be one-to-one. Now we take our Haar measure, really the Haar measure, and consider the corresponding push-forward. Let me denote it, say, by mu. Why?
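The Cayley transform is easiest to see in size one, where a Hermitian matrix is just a real number and a unitary matrix is a point on the unit circle. This toy check uses one common normalization, x maps to (x - i)/(x + i); the missed point 1 is the measure-zero set from the talk, reached only as x goes to infinity.

```python
def cayley(x):
    """Scalar Cayley transform: a real number x (a 1x1 Hermitian
    matrix) is sent to a point on the unit circle (a 1x1 unitary)."""
    return (x - 1j) / (x + 1j)

# The image always has modulus 1; the map is one-to-one onto the
# circle minus the single point 1.
for x in (-3.0, 0.0, 0.5, 100.0):
    assert abs(abs(cayley(x)) - 1.0) < 1e-12
```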
For the symmetric or for the Hermitian matrices, there is no difficulty at all in defining the corresponding projective limit. We need a projection from H(n) to H(m). I have an n-by-n Hermitian matrix; I consider the corresponding corner, and this will be my m-by-m matrix: I just cut out the upper left corner. Of course, if the big matrix was Hermitian, this corner is also Hermitian. When I pass to the corresponding projective limit, I just get all infinite Hermitian matrices. That's it; it's much simpler than in the case of the unitary group. Now I have the projective limit, and I have my measures on the finite levels. Believe me, unfortunately there is no time to show the calculations, but this sequence of measures is, how to say, compatible with respect to these projections. This means, in some sense, that if we consider the measure on a bigger level and project it to the lower level, we get the corresponding measure on the lower level. This means there exists a unique measure on the infinite Hermitian matrices compatible with all of them; I will call it mu. So, instead of the question of what the substitute for the Haar measure is, I arrive at understanding this object. Here is how I would like to do it. Instead of speaking about this measure itself, I want to speak about the corresponding spectral measure, but this is part two. First of all, to speak about any spectral measure, I have to understand the corresponding ergodic measures. So now my question is: describe the ergodic measures for the action of U(infinity) on the infinite Hermitian matrices. Of course, I want to use the method of Vershik–Kerov–Olshanski. Let me say a couple of words here. This is the theorem of Vershik, but you will see now that it is not quite easy to apply: you have to invent something to apply it in each particular case. It was done first by Vershik–Kerov for the infinite symmetric group, and then by Vershik–Kerov–Olshanski for this case, for the infinite unitary group.
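The corner projection is literally a slice. This sketch, with matrices as lists of rows, also checks the compatibility that makes the projective limit work: cutting twice is the same as cutting once.

```python
def corner(h, m):
    """The projection H(n) -> H(m) from the talk: keep the upper-left
    m x m corner of an n x n matrix given as a list of rows."""
    return [row[:m] for row in h[:m]]

h3 = [[1, 2 - 1j, 0],
      [2 + 1j, 5, 1j],
      [0, -1j, 3]]          # a 3 x 3 Hermitian matrix
assert corner(h3, 2) == [[1, 2 - 1j], [2 + 1j, 5]]
# Compatibility of the projections: H(3) -> H(2) -> H(1) equals H(3) -> H(1).
assert corner(corner(h3, 2), 1) == corner(h3, 1)
```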
That's why this is probably the best way to proceed. What can we write? Now, if we deal with the infinite symmetric group or something countable like it, we can use this weak convergence directly; we did it on Wednesday. For genuinely continuous groups we need some restatement. Let me write it here, and I will denote my matrix by lambda. I have convergence to some ergodic measure mu. It is a quite general fact that weak convergence of this type lets me pass to the corresponding characteristic functions, where the convergence is uniform on compact subsets. And I will leave some space here; I will have one more formulation a bit later. Now I want the formula for the characteristic function at a matrix A. What I get is an integral over the unitary group, and here the scalar product on the space of matrices: it is just A paired with the action of my group element on lambda. Yes, I probably forgot to tell you, so let me tell you now: my unitary group acts by conjugation, and this is my orbit. I want to simplify things for a while. Any Hermitian matrix can be diagonalized: I can write lambda as some u_1 times a diagonal matrix times u_1 inverse. If we put this part here and use the invariance of my Haar measure under multiplication, we see that we can put a diagonal matrix here and a diagonal matrix here instead of arbitrary matrices, and nothing changes. This means that I can parameterize any orbit, and any ergodic measure, just by the corresponding eigenvalues, and I am interested in the values of the characteristic function only on the set of eigenvalues, that is, on diagonal matrices. So I have an object like this, given by this formula, and I want to write a more convenient formula for it, so that I can work with it. This is an integration over a compact group, so I always get a result, no problem; this is an entire function, and I can consider the corresponding Taylor expansion.
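In formulas, the characteristic function discussed here can be written as follows; this is my reconstruction of what is on the board, with the scalar product $\langle A, B\rangle = \operatorname{tr}(AB)$:

$$
\varphi_n(A) \;=\; \int_{U(n)} e^{\,i\,\operatorname{tr}\!\left(A\,u\Lambda u^{-1}\right)}\,dm_{U(n)}(u),
\qquad
\Lambda = \operatorname{diag}(\lambda_1,\dots,\lambda_n),
$$

and by the invariance of the Haar measure under left and right multiplication it suffices to evaluate $\varphi_n$ at diagonal $A = \operatorname{diag}(a_1,\dots,a_m,0,\dots,0)$.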
Now, it is easy to see that if I permute my eigenvalues, the result does not change, which means that I actually have a symmetric function here, symmetric with respect to a_1, ..., a_m. This means that if I take a monomial from the Taylor expansion and act by a permutation, I get another monomial with the same coefficient, and so on; I combine them together. What I get in the end is an expansion in terms of symmetric polynomials. Here is something that depends on lambda_1, ..., lambda_n. By the same argument this expression is symmetric with respect to lambda_1, ..., lambda_n, which means that we have a second sum here, and here is some coefficient depending on n and on the two partitions. Let me say it once again. I have an expression depending on a_1, ..., a_m and on lambda_1, ..., lambda_n. This is a quite general statement; it depends on nothing: if I have an expression symmetric in these variables, I get a double sum over some basis of symmetric polynomials, the same basis here and here, with some coefficients. This is extremely general. Now I want to consider our particular case, and there are several calculations; I will give not the calculations themselves but just a sketch. The key ingredient here is the simplification of this formula. In fact, the Harish-Chandra formula is an explicit formula for an integral of this type. After some algebraic calculations, we get the following. I had an arbitrary basis of symmetric polynomials; now, let me write it somewhere here, my s_mu are the Schur polynomials. And according to Harish-Chandra, almost all my coefficients here are zero: the only non-zero coefficients are those where the two partitions are equal. That is, I have only a single sum like this. Bearing this in mind, let me come back to my proposition: weak convergence of measures is equivalent to uniform-on-compact-sets convergence of characteristic functions, and it is equivalent...
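For reference, the Harish-Chandra (Harish-Chandra–Itzykson–Zuber) integral mentioned here can be stated as follows; this is the standard formula, not copied from the board, and the normalizing constant varies between sources:

$$
\int_{U(n)} e^{\operatorname{tr}\left(A\,uBu^{*}\right)}\,du
\;=\;
\left(\prod_{k=1}^{n-1} k!\right)
\frac{\det\!\left(e^{a_i b_j}\right)_{i,j=1}^{n}}{\Delta(a)\,\Delta(b)},
\qquad
\Delta(a) = \prod_{i<j}(a_j - a_i),
$$

where $a_i$ and $b_j$ are the eigenvalues of $A$ and $B$. Expanding both sides in Schur polynomials, only the diagonal terms $s_\mu(a)\,s_\mu(b)$ survive, which is exactly the vanishing of the cross coefficients used in the talk.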
Okay, I don't actually need equivalence here; let me write it like this: under some mild assumptions, which always hold in our case. No, actually I do need the equivalence. The convergence of characteristic functions will be equivalent to the convergence of the Taylor coefficients. So we come back here: we need to consider these coefficients, and now we have an explicit formula for them. So believe me that the result is as follows. The convergence is equivalent to this, with lambda_1 over n, ..., lambda_n over n, meaning that if I divide this quantity by this quantity, the ratio tends to one. Application of my proposition tells me that if I have convergence of the measures, then these quantities must converge as n tends to infinity, for every partition mu. We can restate it, more or less, as: for any symmetric polynomial, we have convergence as n tends to infinity. Schur polynomials are quite complicated, so instead of Schur polynomials I will consider the simplest symmetric polynomials possible: the Newton power sums. Let me write the definition: p_m(x_1, ..., x_n) is just the power sum, the sum of x_i to the power m. I want to consider the convergence of these polynomials instead. So what do I have? I have p_1 of (lambda_1/n, ..., lambda_n/n); this is just the sum of the normalized eigenvalues, and so on. Now you can guess when we have convergence: this converges somewhere, this converges somewhere, this converges somewhere, and I want to know what the natural assumptions for that are. Well, the natural assumption is that for all my eigenvalues we have the corresponding limits of the normalized eigenvalues, meaning there exists a sequence x_i such that this holds. If we have this, it is natural to expect convergence here, convergence here, and so on. But there is a subtlety. I understand you, Andrey. The point is that we have to interchange an infinite sum with the limit, and this cannot always be done. In fact, we will lose something.
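The power sums of the normalized eigenvalues are a one-liner; this sketch, with a hypothetical eigenvalue list of my own, just confirms that p_1 of the scaled values is the normalized trace and p_2 the normalized sum of squares.

```python
def p(m, xs):
    """Newton power sum p_m(x_1, ..., x_n) = sum of x_i^m."""
    return sum(x ** m for x in xs)

lam = [4.0, 2.0, -1.0, 0.0]        # hypothetical eigenvalues, n = 4
n = len(lam)
scaled = [x / n for x in lam]      # the lambda_i / n from the talk
assert p(1, scaled) == sum(lam) / n                          # tr / n
assert abs(p(2, scaled) - sum(x * x for x in lam) / n ** 2) < 1e-12
```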
For example, for p_1, what do we get in this situation? We can get anything: the limit is not determined by the x_i, and it becomes an additional parameter, gamma_1. If we consider the convergence for p_2, it is easy to see that the limit is at least this sum, so in the general situation we have gamma_2 greater than or equal to the sum of the squares. And it is an easy exercise that if we have convergence here and convergence here, then for m greater than or equal to three we get actual equalities. So this is the description. What I tried to say is that if we have convergence of this type, then we have convergence of that type. Let me restate this and this in terms of matrices: these were just the eigenvalues; now this is the normalized trace, and this is the normalized trace of the square. This is the Olshanski–Vershik theorem, and we have just described the parameterization of the ergodic measures for this action: we have gamma_1, we have a non-negative gamma_2, and we have a sequence of x_i. So, when did I start? That's all. Okay, then just one last minute. You see that the main ingredient here is the Harish-Chandra formula. What if we want to prove the corresponding theorem for, say, the orthogonal group? In the case of the unitary group it doesn't matter that we start with skew-Hermitian matrices: we multiply such a matrix by i and we get a Hermitian one. But there is no such map in the orthogonal or quaternionic case; that's why symmetric and skew-symmetric are genuinely different there. The skew-symmetric case is simpler. Our statement is that if X is skew-symmetric, then we have a similar classification. Just let me remind you that for skew-symmetric matrices the trace is zero, so we have only gamma_2 greater than or equal to zero and the sequence of x_i. And last but not least: we have made step one, the classification of ergodic measures.
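The inequality gamma_2 >= sum of x_i^2 can be seen numerically. This is a toy construction of mine, not from the talk: two macroscopic eigenvalues n*x_i, plus many eigenvalues of size sqrt(n) whose normalized values tend to zero yet still carry mass into gamma_2 (the "lost" part when sum and limit are interchanged).

```python
import math

def power_sum(m, xs):
    return sum(x ** m for x in xs)

def gammas(eigs):
    """One-level estimates of gamma_1 and gamma_2: the normalized trace
    and the normalized sum of squares of the eigenvalues."""
    n = len(eigs)
    scaled = [x / n for x in eigs]
    return power_sum(1, scaled), power_sum(2, scaled)

n = 10_000
x = [1.0, 0.5]                       # intended limits of lambda_i / n
eigs = [n * xi for xi in x]
# +-sqrt(n) eigenvalues in cancelling pairs: they vanish in the
# normalized spectrum and in gamma_1, but contribute about 1 to gamma_2.
eigs += [math.sqrt(n)] * (n // 2 - 1) + [-math.sqrt(n)] * (n // 2 - 1)

g1, g2 = gammas(eigs)
assert g2 > sum(xi ** 2 for xi in x)   # gamma_2 >= sum x_i^2, strict here
```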
If we forget about gamma_2, we have a sequence of points sitting on the real line, which means that any spectral measure, that is, a measure on the ergodic measures, is just a measure on configurations of points on the real line. And we were told this morning that this is just a point process. And for the case of the unitary group, as the spectral measure (I remind you that this was our question) we get the sine process described at 10 o'clock. Thank you for your attention.