Okay, so today we are going to prove the classification theorem for the ergodic measures for the action of the infinite p-adic linear group on infinite matrices, and first I would like to introduce a very successful method for treating such problems. It is called the Vershik-Kerov-Olshanski method, or the ergodic method. In this method the general setting is as follows. We consider a group K_∞ given as the union of a growing chain of compact subgroups K_1 ⊂ K_2 ⊂ ⋯ ⊂ K_∞. Here every K_n is a compact subgroup, but K_∞ itself, the union of all these compact subgroups, is usually not compact. We are interested in actions of such a group K_∞, like U(∞), O(∞), S(∞), and in our case GL(∞, Z_p). The group acts on a complete separable metric space; separable means there exists a countable dense subset.

Recall that in the first exercise yesterday we obtained the simple classification of the ergodic measures for the action of a compact group: they are just the orbital measures. In fact something of this sort remains true for groups that can be approximated by compact subgroups. Here is the precise statement. It is a theorem originally due to Vershik, and it finally turned out to be very successful through a series of papers by the people named above. It says the following. Intuitively, we have a space X with the action of K_∞. If instead of K_∞ we had just the compact subgroup K_n, then we would know that the ergodic measures are the orbital measures, each supported on a single orbit K_n·x₀. The theorem says, intuitively, that any ergodic measure of the K_∞-action can be approximated by a sequence of orbital measures generated by a single point. So we have a point x₀, whose K_n-orbit is the support of an ergodic measure for K_n; then we pass to a bigger group, so the orbit gets bigger: the K_{n+1}-orbit, the K_{n+2}-orbit, and so on.
Finally the orbits mix more and more, in some sense, so the limit turns out to be the ergodic measure. That was the intuitive way of saying it; more precisely, what does "approximate" mean? If μ is ergodic for this action, then there exists a point x₀ (in fact the set of such points has μ-measure 1) such that the following holds for any continuous bounded function f on X:

lim_{n→∞} ∫_{K_n} f(g·x₀) dm_{K_n}(g) = ∫_X f dμ,

where m_{K_n} is the Haar measure on the compact group K_n. In the notation of the exercise, the left-hand side is ∫_X f dm_{K_n·x₀}, where the orbital measure m_{K_n·x₀} is the push-forward of the Haar measure under the orbit map K_n → X that sends g to g·x₀. So this convergence says exactly that the sequence of orbital measures converges weakly to our original ergodic measure. (Yes, this is the orbit of the subgroup; yes, the measure is the push-forward of the Haar measure.)

At least this gives us an effective way of studying ergodic measures. It is important that any ergodic measure can be approximated in this way, so we should first study those measures which can actually be approximated by orbital measures. By the way, a measure which is a weak limit of a sequence of orbital measures is in general not ergodic; but at least we can restrict ourselves to studying this smaller class, and that turns out to be very successful.
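As a toy illustration of an orbital measure as a push-forward of Haar measure, take K_n = S(n), the permutation group (one of the example groups above): the first-coordinate marginal of the orbital measure of a point x₀ is just the empirical distribution of the first n entries of x₀. A minimal Python sketch, with a made-up point x0 and a hypothetical helper name:

```python
from itertools import permutations
from collections import Counter

def orbital_measure_first_coord(x0):
    """Push the Haar (uniform) measure on the symmetric group S(n)
    forward through the orbit map g -> g.x0, where g permutes the
    coordinates, and return the induced distribution of the first
    coordinate as a dict value -> probability."""
    n = len(x0)
    counts = Counter()
    perms = list(permutations(range(n)))
    for p in perms:                      # enumerate all of S(n) exactly
        counts[x0[p[0]]] += 1
    return {v: c / len(perms) for v, c in counts.items()}

# Toy point x0: the first-coordinate marginal of the orbital measure is
# the empirical distribution of the entries of x0.
x0 = (1.0, 1.0, 2.0, 5.0)
dist = orbital_measure_first_coord(x0)
print(dist)   # value 1.0 has probability 1/2, values 2.0 and 5.0 have 1/4
```

The exact enumeration of S(4) replaces Haar sampling, since the group is finite; for the infinite groups in the lecture one only has the weak limits.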
So let us give an example: we can actually use the theorem to prove the Schoenberg theorem. Let us recall it; for simplicity we do the real case. We have O(∞), which is just the union of the groups O(n), where O(n) sits inside as the n×n block in the upper-left corner, with 1's on the diagonal afterwards and 0's elsewhere. So these compact subgroups exhaust the whole group and we are in the same situation as before. We consider the action of O(∞) on R^∞: as explained yesterday, an element of this form acts on the first n coordinates and does nothing to the coordinates afterwards.

Suppose now that μ is ergodic for this group action. Then we can actually find a point x₀ in the space, an infinite column vector x₀ = (x₁, x₂, …) ∈ R^∞, such that the orbital measures m_{O(m)·x₀} approximate our μ. Let us see what this means. The orbital measure is exactly the probability distribution of the following random vector: we choose a random matrix W_m Haar-randomly, that is, uniformly from O(m), and apply diag(W_m, 1, 1, …) to our point x₀; it acts on (x₁, …, x_m) and leaves the remaining coordinates alone. This is just the definition of the orbital measure: choose the group element randomly, as it should be. But we know the Haar measure is invariant under left and right multiplication, so we can replace W_m by W_m·V_m, where W_m is still Haar-random and V_m is deterministic: in distribution this is the same, since W_m·V_m is again a Haar-random element of O(m). Now, since x₀ is fixed, we can choose V_m such that V_m(x₁, …, x_m)ᵀ = (λ_m, 0, …, 0)ᵀ, because O(m) acts transitively on the sphere of the appropriate radius in those m coordinates, and we do nothing to the rest of the coordinates; of course such a V_m exists. So in distribution our random vector is the same as diag(W_m, 1, 1, …) applied to (λ_m, 0, …, 0, x_{m+1}, x_{m+2}, …)ᵀ. The vector has changed, but the norm is preserved: λ_m = sqrt(x₁² + ⋯ + x_m²), and λ_m ≥ 0.

So this is a random column vector, and what does the weak convergence of these orbital measures to μ mean? For measures on R^∞ it means that for any finite set of coordinates, the projection of the orbital measure to those coordinates converges to the projection of μ. So we may look at a fixed number n of coordinates of this random vector and let m go to infinity; we study this first for fixed n, and then for every n. If we can show that the limit is a Gaussian measure for every n, this means that μ itself is a Gaussian measure.

For this we need the following exercise on how to construct a random orthogonal matrix ("random" meaning: distributed according to the Haar measure on O(m)). Exercise: the Gram-Schmidt procedure applied to a Gaussian matrix G = (G_{ij}), where all entries are independent standard Gaussians, applied for example to the columns (columns will be more convenient for us), gives a Haar-random orthogonal matrix, for fixed m of course. In particular, let us see what Gram-Schmidt does to the first column: we just normalize it. So the first column of W_m is, in distribution, the first Gaussian column multiplied by a normalizing constant.
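The exercise can be checked numerically. A minimal NumPy sketch of the explicit column-by-column Gram-Schmidt described above; we verify orthogonality and the first-column formula, while the Haar-distribution claim itself is the content of the exercise, not of this check:

```python
import numpy as np

def gram_schmidt_columns(g):
    """Classical Gram-Schmidt on the columns of g; returns a matrix
    with orthonormal columns spanning the same flag of subspaces."""
    m = g.shape[1]
    q = np.zeros_like(g, dtype=float)
    for k in range(m):
        v = g[:, k].astype(float).copy()
        for j in range(k):
            # remove the component along each earlier orthonormal column
            v -= (q[:, j] @ g[:, k]) * q[:, j]
        q[:, k] = v / np.linalg.norm(v)   # normalize
    return q

rng = np.random.default_rng(0)
m = 6
g = rng.standard_normal((m, m))           # i.i.d. standard Gaussian entries
w = gram_schmidt_columns(g)

# w is orthogonal, and its first column is the normalized first Gaussian column
print(np.allclose(w.T @ w, np.eye(m)))                          # True
print(np.allclose(w[:, 0], g[:, 0] / np.linalg.norm(g[:, 0])))  # True
```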
Explicitly, the first column of W_m is, in distribution, (G₁, …, G_m)/sqrt(G₁² + ⋯ + G_m²): we just normalize the first column, which is the first step of Gram-Schmidt. We only care about the first n coordinates, and we must not forget the constant λ_m multiplying the first column. It turns out there is a subtlety here: for such weak convergence to hold, a necessary condition (proved via the Fourier transform) is that λ_m/√m converges. So looking only at the first n coordinates, we can write the i-th coordinate as (λ_m/√m) · (√m · G_i / sqrt(G₁² + ⋯ + G_m²)); moving the factor √m inside, the second factor is G_i / sqrt((G₁² + ⋯ + G_m²)/m), and by the law of large numbers sqrt((G₁² + ⋯ + G_m²)/m) goes to 1 almost surely. With the necessary condition λ_m/√m → σ, say, the first n coordinates are, when m is large, approximately σ·(G₁, …, G_n). This is for fixed n, but then m goes to infinity and the statement holds for every n; so finally the measure μ has to be a Gaussian measure, the law of i.i.d. N(0, σ²) coordinates.

But we have not finished: what we have proved so far is that an ergodic measure must be a Gaussian measure; we have not yet proved that every such Gaussian measure actually is an ergodic measure. The fact that every such Gaussian measure is ergodic is argued via de Finetti's theorem: the Gaussian measure here, being i.i.d. over the coordinates, is a product measure, like a Bernoulli measure, and O(∞) contains the subgroup S(∞) that permutes the coordinates. We already know that i.i.d. product measures are ergodic for this subgroup, and a measure ergodic for a subgroup is also ergodic for the bigger group.

Okay, so this is in fact the philosophy behind our treatment of the p-adic case. I should mention that p-adic is not essential: any non-archimedean local field works.
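A quick Monte Carlo sketch of the limit just computed, under the assumed normalization λ_m/√m → σ (the value σ = 1.5 and the sample sizes below are arbitrary choices): the first coordinate σ·√m·G₁/sqrt(G₁² + ⋯ + G_m²) should have mean approximately 0 and variance approximately σ².

```python
import numpy as np

rng = np.random.default_rng(0)
m, trials = 400, 20000
sigma = 1.5                      # hypothetical limit of lambda_m / sqrt(m)

g = rng.standard_normal((trials, m))      # each row: one Gaussian vector
norms = np.linalg.norm(g, axis=1)
# First coordinate of lambda_m * W_m e_1 with lambda_m = sigma * sqrt(m):
# sigma * sqrt(m) * g_1 / ||g||, and sqrt(m) / ||g|| -> 1 by the LLN.
coord = sigma * np.sqrt(m) * g[:, 0] / norms

print(round(coord.mean(), 2), round(coord.var(), 2))  # approx 0 and sigma^2 = 2.25
```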
So now let us go back to the setting of GL(∞, Z_p) × GL(∞, Z_p) acting on infinite matrices over Q_p, and let us restart this procedure. Let μ be an ergodic measure for the above action. (By the way, even the ergodicity here, simple as it sounds, requires something, namely a necessary condition which is a key point below.) Then there exists a point a, an infinite matrix, such that the compact orbital measures m_{(GL(n,Z_p) × GL(n,Z_p))·a} converge weakly to μ; that is Vershik's theorem. Now recall what this means: by definition, the orbital measure is the push-forward of the Haar measure under the map (x, y) ↦ x a y⁻¹. Since the group is compact, y and y⁻¹ have the same distribution, so we may just as well use the map (x, y) ↦ x a y. Here x acts as diag(x_n, 1, 1, …) and y as diag(y_n, 1, 1, …) with x_n, y_n ∈ GL(n, Z_p) in the top-left n×n block, and we do not care what happens outside; write a_n for the top-left n×n corner of a. Since we always do the same thing, recall from the second exercise that a_n can be written as a_n = w₁ · diag(p^{-k₁}, …, p^{-k_n}) · w₂, where w₁, w₂ are some fixed elements of GL(n, Z_p); of course the exponents k₁, …, k_n depend on n, the size of a_n, and so do w₁ and w₂. By the invariance of the Haar measure under multiplication, we can absorb w₁ into x and w₂ into y without changing the distribution; so without loss of generality we may suppose that a_n is this diagonal matrix.
Of course we cannot argue directly that the whole infinite matrix is diagonal; but we only care about a small corner of this big random matrix. So we look at the L×L upper-left (north-west) corner M_L of the big random matrix. In distribution, M_L is the L×L corner of x · diag(p^{-k₁}, …, p^{-k_n}) · y, where x and y are uniformly sampled from our compact group GL(n, Z_p). What does taking the L×L corner mean? It means multiplying by the L×n matrix (1_L | 0) on the left and by its transpose on the right: M_L = (1_L | 0) · x · diag(p^{-k₁}, …, p^{-k_n}) · y · (1_L | 0)ᵀ. So from x, randomly and uniformly chosen from GL(n, Z_p), we only care about the first L rows, and doing the same thing with y, we only care about its first L columns; in between, the diagonal matrix multiplies the m-th column of the first rectangular block by p^{-k_m}.

Now we will use a variant of the last exercise from yesterday. By transposition, rows and columns behave the same, and x and y are independent of each other. So suppose we know that the first L rows of a uniform element of GL(n, Z_p) are, for large n, approximately L·n independent entries, uniformly sampled from the p-adic integers Z_p, which is a compact group.
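Before continuing, the corner bookkeeping above can be sanity-checked in NumPy: the L×L upper-left corner of x·D·y equals (first L rows of x)·D·(first L columns of y), i.e. multiplication by (1_L | 0) on both sides. Integer matrices stand in for the p-adic ones, and the diagonal entries are placeholders for p^{-k_i}:

```python
import numpy as np

rng = np.random.default_rng(1)
n, L = 6, 2
x = rng.integers(-5, 5, size=(n, n))
y = rng.integers(-5, 5, size=(n, n))
d = np.diag(rng.integers(1, 4, size=n))   # stand-in for diag(p^{-k_1}, ..., p^{-k_n})

# the L x n projection matrix (1_L | 0)
proj = np.hstack([np.eye(L, dtype=int), np.zeros((L, n - L), dtype=int)])

corner = (x @ d @ y)[:L, :L]              # L x L upper-left corner
via_proj = proj @ x @ d @ y @ proj.T      # multiply by (1_L | 0) on both sides
via_rows_cols = x[:L, :] @ d @ y[:, :L]   # first L rows of x, first L columns of y

print(np.array_equal(corner, via_proj))       # True
print(np.array_equal(corner, via_rows_cols))  # True
```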
Then we can conclude the following, assuming everything is independent; this independence is the first point, which we will later try to convince ourselves is true. The second point is a necessary condition, which needs some Fourier analysis on the p-adic field: sup_n k₁^{(n)} must be bounded. By the way, all these exponents k₁ ≥ k₂ ≥ ⋯ ≥ k_n are decreasing and take values in Z ∪ {−∞}. Since the largest one is bounded, say by some number c, they all lie in the set {k ∈ Z : k ≤ c} ∪ {−∞}, which is a compact set, and a countable product of compact sets is again compact. So in fact we can assume, without loss of generality (passing to a subsequence), that each k_i^{(n)} converges to a limit point as n → ∞.

So we have the first L rows of x, the transposed picture for y, everything independent, and multiplying things out, the (i, j) entry of the corner M_L is approximately Σ_m p^{-k_m} x_{im} y_{mj}, where the index m runs from 1 to n, and n gets larger and larger, in effect from 1 to infinity; the x_{im} and y_{mj} are independent and uniformly sampled from Z_p. But remember, this is not precisely what we proved: as we wrote yesterday, there are also terms like p^{-k} z_{ij}. For this I will give an exercise, and I will come back to it. The setup is the following: the sequence k_m is decreasing, but not necessarily strictly; it can stop at some value and repeat, k, k, k, …. So let us just look at the (1,1) entry and consider the partial sums p^{-k}(x₁y₁ + ⋯ + x_M y_M).
The claim is that, as M goes to infinity, the distribution of these partial sums converges to that of p^{-k} z, where z is uniform on Z_p. Actually the precise exercise I want to give is the following. Suppose G is a compact abelian group and μ is a probability measure on G. Because we are in a group setting we can define the convolution of μ: we have the multiplication map G × G → G, and since we are in the compact setting, the push-forward of μ × μ under this map is what we denote by μ ∗ μ; this is just a recall. Exercise: the convolution power μ^{∗M} converges weakly to the Haar measure of the closed subgroup generated by the support of μ. As a hint: study the Fourier coefficients of this measure. This is actually already very close to our final result.

Then let us go back to the technical part: why are the first L rows of x, which is after all an invertible matrix, approximately just L·n coefficients without any restriction? Here L is fixed, but n, the size of the big matrices, goes to infinity, and we look only at this fixed small corner. (And to write the indices correctly in the entry formula above: the (i, j) coefficient of the product of the three matrices is Σ_{m=1}^{n} p^{-k_m} x_{im} y_{mj}, with n → ∞.) I can probably give a proof of this last claim, which also gives a hint for such things. How much time do I have? Two minutes, okay. Our matrices are in GL(n, Z_p), which is not so easy to analyze directly, so let us look at the finite version GL(n, F_p), the finite group of invertible n×n matrices over F_p. I want to look at the first L rows and say that they are approximately L·n independent entries, everything uniformly
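A minimal sketch of the convolution exercise on a finite abelian group, taking G = Z/8Z and a measure μ supported on {0, 2} (both the group and the measure are arbitrary choices for illustration). The closed subgroup generated by the support is {0, 2, 4, 6}, and the convolution powers approach its Haar (uniform) measure:

```python
from fractions import Fraction

def convolve(mu, nu, n):
    """Convolution of two measures on the cyclic group Z/nZ:
    the push-forward of mu x nu under addition mod n."""
    out = [Fraction(0)] * n
    for a in range(n):
        for b in range(n):
            out[(a + b) % n] += mu[a] * nu[b]
    return out

n = 8
# A measure on Z/8Z supported on {0, 2}: the subgroup generated by the
# support is {0, 2, 4, 6}, whose Haar measure is uniform on those points.
mu = [Fraction(0)] * n
mu[0] = Fraction(1, 2)
mu[2] = Fraction(1, 2)
haar = [Fraction(1, 4) if a % 2 == 0 else Fraction(0) for a in range(n)]

power = mu
for _ in range(20):            # compute mu^{*21} exactly
    power = convolve(power, mu, n)

tv = sum(abs(power[a] - haar[a]) for a in range(n)) / 2
print(float(tv))               # total variation distance to Haar, close to 0
```

The Fourier hint is visible here: each nontrivial character of the subgroup has a convolution coefficient of modulus < 1, so the powers decay geometrically.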
sampled from F_p. This can be seen as follows. By the way, how do we count the cardinality of this finite group? A convenient way of counting: first we choose a non-zero first row, which gives p^n − 1 choices; then for the second row we choose from those vectors of F_p^n which are linearly independent from the chosen one, that is, we exclude the linear subspace generated by the first non-zero row, so the second row has p^n − p choices; for the third row we choose from the subset outside the linear span of the two linearly independent rows already chosen, giving p^n − p² choices; and we continue in the same way, so |GL(n, F_p)| = (p^n − 1)(p^n − p)(p^n − p²)⋯(p^n − p^{n−1}).

But you see that if we fix L linearly independent rows, the number of ways of completing them to an invertible matrix is independent of the fixed choice: it is just the product of the remaining factors. So the distribution of the first L rows of a uniform element of GL(n, F_p) is the uniform distribution on the subset S_n ⊂ (F_p^n)^L of tuples (v₁, …, v_L) of linearly independent vectors. And the measure we want to compare it with is the uniform distribution on all of (F_p^n)^L: no restriction at all, just choosing any L vectors. Now S_n is a subset of (F_p^n)^L, and if we compare the cardinalities we immediately see that

|S_n| / p^{nL} = ∏_{i=0}^{L−1} (1 − p^{i−n}),

which tends to 1 as n → ∞ for fixed L. So asymptotically these two sets are not very different.
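The counting argument can be checked exactly: the probability that L uniformly random vectors in F_p^n are linearly independent is ∏_{i=0}^{L−1}(1 − p^{i−n}), and it increases to 1 as n grows (p = 2 and L = 3 below are arbitrary choices):

```python
from fractions import Fraction

def lin_indep_ratio(p, n, L):
    """Exact probability that L uniformly random vectors in F_p^n are
    linearly independent: |S_n| / p^{nL} = prod_{i=0}^{L-1} (1 - p^{i-n})."""
    r = Fraction(1)
    for i in range(L):
        r *= Fraction(p**n - p**i, p**n)
    return r

p, L = 2, 3
ratios = [lin_indep_ratio(p, n, L) for n in range(L, 12)]
print([float(x) for x in ratios])   # increases towards 1 as n grows
```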
Hence the uniform distribution on S_n and the uniform distribution on (F_p^n)^L are actually very close. And in fact, for our p-adic problem this argument is the starting point; the rest will then be automatic, in some sense, once we know the Haar measure of this group.