Today we will go a bit further into what I explained yesterday. So, we have a graph, and we defined the adjacency operator on ℓ²(V), the square-summable functions on the vertex set. From that, we defined, for any vector, a spectral measure whose moments encode the number of closed walks. And we saw that when the adjacency operator is defined on a finite graph, on a vertex-transitive graph, or on a unimodular random graph, we get this spectral measure. The expected spectral measure is a nice object which, morally, depends only on the eigenvalues of the adjacency operator A, and not on the eigenvectors. Today I want to study the Lebesgue decomposition of these measures; that is, how much of the measure is atomic, how much is singular continuous, and how much is absolutely continuous. Ok, so if A is assumed, for simplicity, to have uniformly bounded degrees, then my operator is bounded, and there is an orthogonal decomposition of ℓ²(V) into H_pp ⊕ H_sc ⊕ H_ac, where, for example, H_sc is the set of vectors ψ in ℓ²(V) such that μ_{G,ψ} is purely singular continuous. And, for example, if we take the vector e_o and look at, say, the singular continuous part, its total mass is the squared norm of the orthogonal projection of e_o onto that subspace. Similarly, if you remember the resolution of the identity, μ_{G,e_o}({λ}), the mass of the singleton {λ}, will be ‖E_λ e_o‖² (I think that was my notation), where E_λ is the orthogonal projection onto the eigenspace of λ; indeed H_pp can be further decomposed into vectors ψ for which μ_{G,ψ} has a Dirac mass somewhere. And, if we have time, I will talk a lot about atoms in trees.
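On a finite graph the objects above are easy to compute explicitly. Here is a minimal sketch (my own illustration, not from the lecture) of the spectral measure at a root vertex, obtained from an eigendecomposition of the adjacency matrix; the weights are |⟨φ_k, e_o⟩|², and the k-th moment of the measure counts closed walks of length k from the root.

```python
import numpy as np

# Spectral measure at a root vertex o of a finite graph:
# mu_{G,e_o} = sum_k |<phi_k, e_o>|^2 * delta_{lambda_k}, where (lambda_k, phi_k)
# runs over an eigendecomposition of the adjacency matrix A.

def spectral_measure_at_root(A, o=0):
    eigvals, eigvecs = np.linalg.eigh(A)
    weights = eigvecs[o, :] ** 2          # |<phi_k, e_o>|^2
    return eigvals, weights

# Example: the path graph on 4 vertices, rooted at an endpoint.
A = np.zeros((4, 4))
for i in range(3):
    A[i, i + 1] = A[i + 1, i] = 1

lam, w = spectral_measure_at_root(A, o=0)
print("total mass:", w.sum())             # = 1
print("2nd moment:", (w * lam**2).sum())  # = closed walks of length 2 from o = deg(o) = 1
```

The second moment being the degree of the root is the simplest instance of the moments-count-closed-walks fact recalled above.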
Let me start with a comment: what are the sources of atoms, where do atoms come from? So, we have seen that if you have a finite graph, this measure is purely atomic. We also mentioned an example of an infinite graph where this measure can be purely atomic as well. Now, there is a very simple way to create atoms on an infinite graph. I think it was first observed by Kirkpatrick and Eggarter, and it is the following. Imagine that you are interested in the spectral measure at the root, and your graph looks like this: you have a vertex x, a vertex y; here you have the rest of your graph, which could possibly be infinite; and here you have a finite graph, and here a finite graph; call them G₁ and G₂. Imagine that somewhere in your graph you have this configuration, and that G₁ is isomorphic to G₂. These two graphs then have the same eigenvalues; they are finite graphs. Imagine that you have A_{G₁}ψ = λψ, with ψ not 0. Then ψ is also an eigenvector of G₂, with the same eigenvalue. And so, consider the vector Ψ where you put −ψ/√2 here on G₁, you put ψ/√2 here on G₂, and you put 0 everywhere else. Then the eigenvalue equation is satisfied here, inside G₁, because outside there is 0; it is satisfied there, inside G₂, because outside there is 0; and it is satisfied here, at the attachment vertices, because everything here is 0 and you get −ψ(u)/√2 + ψ(u)/√2 = 0 from the two copies. Ok? So in particular, whenever you have such a configuration, if you imagine that ψ is normalized, you will find that μ_{G,e_o}({λ}) is at least |Ψ(o)|², which is |ψ(o)|²/2 when the root sits in one of the two copies. Ok?
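The construction just described can be checked numerically. A small sketch (my own, with the ad-hoc choice of G₁ = G₂ = a single edge, both attached to a common vertex x): the antisymmetric vector Ψ is indeed an eigenvector of the whole graph.

```python
import numpy as np

# Vertices: 0 = some other vertex, 1 = x, {2,3} = G1 (an edge), {4,5} = G2 (its copy).
# G1 has eigenvector psi = (1,1)/sqrt(2) with eigenvalue lambda = 1.
n = 6
A = np.zeros((n, n))
for u, v in [(0, 1), (1, 2), (1, 4), (2, 3), (4, 5)]:
    A[u, v] = A[v, u] = 1

lam_ = 1.0
psi = np.array([1.0, 1.0]) / np.sqrt(2)
Psi = np.zeros(n)
Psi[2:4] = -psi / np.sqrt(2)   # -psi/sqrt(2) on G1
Psi[4:6] = psi / np.sqrt(2)    # +psi/sqrt(2) on G2, zero elsewhere

# The contributions of the two copies cancel at x, so Psi is an eigenvector of G.
print(np.allclose(A @ Psi, lam_ * Psi))   # True
```

The key line is the cancellation at the attachment vertex: its two relevant neighbors carry −ψ/√2 and +ψ/√2.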
Because this mass is the squared norm of the orthogonal projection onto the λ-eigenvectors, and you have found one; so that is the point. So, whenever you have such finite subgraphs inside the infinite graph, you have atoms everywhere in the spectral measure. Ok? For example, I think the motivation of Kirkpatrick and Eggarter was the following: they were looking at percolation on Z^d, with parameter p between p_c and 1. Ok? If you look at the spectrum of the adjacency operator, with probability 1 there exist infinite connected components, and on these components you can prove that the atomic part is dense, because you have such finite configurations everywhere. Ok? Of course, this is just one possibility: G₁ and G₂ need not be isomorphic; what is important is that they have an eigenvalue in common. Ok? But this is not the only source of atoms, as Justin told me this morning. What I want to do now is not to try to count the atoms, but to try to bound the total mass of the atoms, to prove the existence of a continuous part, or of an absolutely continuous part; either for the spectral measure at the root e_o, which is easier, or for the expected spectral measure. Ok? So, at this point, we have a result with Arnab Sen and Bálint Virág which says the following. You do bond percolation on Z², with p larger than p_c, which is one half here of course, and you look at the percolation clusters. So, maybe I should write ρ: ρ = Perc(Z², p) is the law of the connected component of the origin, rooted at the origin; here d = 2. You do percolation on Z^d with parameter p and you look at the connected component of the origin, rooted at the origin. This is what we saw yesterday as an example: it is a unimodular measure, and we are interested in the expected spectral measure μ_ρ.
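As a concrete handle on the measure ρ (my own sketch, with an arbitrary finite-box truncation standing in for the infinite lattice), one can sample the cluster of the origin under bond percolation on a box of Z² and compute its spectral measure at the origin, as before:

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(0)

def origin_cluster(N, p):
    """Open each edge of the (2N+1)x(2N+1) box with prob. p; BFS from the origin.
    Returns the adjacency matrix of the origin's cluster (origin = vertex 0)."""
    open_h = rng.random((2*N + 1, 2*N)) < p   # edges (i,j)-(i,j+1)
    open_v = rng.random((2*N, 2*N + 1)) < p   # edges (i,j)-(i+1,j)
    start = (N, N)
    seen, order, edges, dq = {start: 0}, [start], [], deque([start])
    while dq:
        i, j = dq.popleft()
        nbrs = []
        if j + 1 <= 2*N and open_h[i, j]: nbrs.append((i, j + 1))
        if j - 1 >= 0 and open_h[i, j - 1]: nbrs.append((i, j - 1))
        if i + 1 <= 2*N and open_v[i, j]: nbrs.append((i + 1, j))
        if i - 1 >= 0 and open_v[i - 1, j]: nbrs.append((i - 1, j))
        for v in nbrs:
            if v not in seen:
                seen[v] = len(order); order.append(v); dq.append(v)
            edges.append(((i, j), v))
    A = np.zeros((len(order), len(order)))
    for u, v in edges:
        A[seen[u], seen[v]] = A[seen[v], seen[u]] = 1
    return A

A = origin_cluster(N=15, p=0.6)
lam, vecs = np.linalg.eigh(A)
w = vecs[0, :] ** 2   # spectral measure at the origin
print(A.shape[0], "vertices; total mass:", w.sum())
```

Averaging the resulting atoms over many samples would approximate μ_ρ on this finite box.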
And what we have is that this measure has a non-trivial continuous part. So, I am going to sketch the proof of that, and we will see that we get a lower bound on the total mass of the continuous part. Of course, for p less than or equal to p_c = 1/2, this measure is purely atomic, since all clusters are finite. So, this is for percolation; and for trees we have more refined results, and I have just picked one of them: if ρ is unimodular and supported on trees, then μ_ρ has a non-trivial continuous part if T has two or more ends with positive probability, where an end of a tree is simply a ray, an infinite geodesic. Ok. So, in fact, a unimodular random tree can have 0, 1, 2, or infinitely many ends. Ok. And we have a more precise result: we can bound here, individually, the mass of each atom, but I will not go into that; let me focus on this. Both of these statements rely on the quantitative observation that, on planar graphs and on trees, as soon as you have long paths which almost cover your graph, you must have some continuous part. Ok. Long paths on planar graphs and trees destroy the atomic part and create a continuous part. That is the philosophy of these two results. Ok. So, as I said, I will try to explain to you the first result, about percolation on Z². So, I will introduce something which is called a path matching. So, what is it? You have G, a finite graph. Ok. And you have two subsets I, J of vertices which have the same cardinality, call it b. A collection of paths, called capital Π, is a path matching if, well, what is it?
Π = (π₁, …, π_b), where each π_i is a sequence of vertices; sorry, my notation is not very nice, let me write it properly. π_i = (π_i(1), …, π_i(k_i)) is a self-avoiding path in the graph, so these are vertices of the graph and consecutive ones are adjacent, which starts at the i-th vertex of I and ends at j_{σ(i)}, the σ(i)-th vertex of J, where σ is a permutation in S_b which we call the matching map. Ok, so the picture is: there is a set I, there is a set J, the graph is somewhere around, and π₁ is a path which goes from i₁ to j_{σ(1)}; there is a path π₂, and the rest. And, as I said, I want the paths to be vertex-disjoint: π_i ∩ π_j = ∅ for i different from j. It is as in the picture: I do not want them to cross. Ok. So, a path matching is a collection of vertex-disjoint paths which go from the set I to the set J. And then, the length of Π is the total number of vertices covered by the paths. Ok? And we say that the path matching is minimal if its length is minimal: the length of Π is less than or equal to the length of any other path matching from I to J. It could be achieved with a different matching map; I just want the total length, over all path matchings from I to J, to be minimized. You will see later where planarity is important; it is not important yet. So, we have a theorem which generalizes earlier work on trees. The theorem says the following. So, G is finite.
Assume that I and J are given as above, assume that there is a path matching from I to J, and that all path matchings of minimal length have the same matching map; for example, this holds when there is a unique path matching of minimal length. Can you remind us what S_b means? Ah, it is the set of permutations of the first b integers. Yes, so imagine that you are in such a good position. Then, if m₁, …, m_r are the multiplicities of the eigenvalues, we have Σ_i (m_i − b)₊ ≤ L, where b is the common cardinality of I and J, the cardinality of V is n, and L is n minus the length of the minimal path matching; it is the number of leftover vertices, the vertices which are not covered by the path matching of minimal length. Ok, so in particular, for any k, if you sum the k largest multiplicities, it is bounded by k·b + L. Ok, so let us see. If you want: you see, the positions of the eigenvalues are completely irrelevant, which makes it a remarkable result. We can do finer estimates which depend on the positions of the eigenvalues, but only for trees. Ok, so, for example, just to understand what these definitions are about, and to head towards the theorem on percolation on Z², imagine that we do some kind of vertical percolation: we have the grid of size N and you just remove some vertical edges, arbitrarily. Ok. Then you can build a path matching, if you take I to be this set, the left side, and J to be this set, the right side. Ok, so, since it is a planar graph, between a given vertex of I and a given vertex of J there is either zero or one path in a path matching; otherwise the paths would have to cross. That is where planarity comes into play: the matching map is forced. Ok, so in particular, the N horizontal lines form a path matching, and the number of
leftover vertices is zero: all vertices are covered. So if you apply this result, it tells you that the multiplicity of any eigenvalue is bounded by b, which is equal to N here; and that is very small compared to N², the total number of vertices. So, whatever the way you remove the vertical edges, the multiplicity of each eigenvalue is quite low. Ok, so it is not exactly percolation on Z², but quite close, and you can refine this argument to prove the result stated above. Let us try to do that; let us try the sketch. So, what happens? Remember the local approximation: instead of working with the infinite graph, so this is Z², we are interested in the law of the percolation cluster on Z², which is infinite; but we know that the mass of the atoms converges under local approximation, we saw that yesterday. So it is enough to restrict ourselves to a finite box and let N go to infinity, because we know that the mass of each atom converges. Ok, so here are my bonds; they are present or absent with probability p and 1 − p, and so on. I want to mimic the previous argument. So I will take b, sorry, as being the maximal number of disjoint open paths which are crossing from left to right. So here is one such path, another one, another one, and so on, and I take b as being the maximal number of those. Now I take i₁ to be this point, i₂ to be this point, up to i_b, and j₁, j₂, …, j_b on the other side. Ok, I can apply my theorem. It says that if you sum the multiplicities of, for example, the k largest multiplicities, it will be bounded by k·b, and b is less than N of course, so it will be bounded by k·N, plus the number of leftover vertices. Each orange crossing path is of length at least N. Ok, so the number of leftover vertices is upper bounded by N², the total number of vertices, minus b
times N. This is an upper bound for L, because each of my orange paths covers at least N vertices. So in particular, if you can prove that b is larger than δ·N, for some δ positive, with high probability, ok, then you divide everything by N²: the first term is bounded by k/N, which goes to 0 in the limit N → ∞, plus 1 minus, well, b is at least δ·N, so 1 − δ. Which implies, by local approximation, that the total mass of the atoms is at most 1 − δ; for example, the total mass of the continuous part is at least δ. Ok. So in fact the continuous part has a total mass which is lower bounded by the maximal number of disjoint crossing paths, the maximal flow from left to right, divided by N. And you see, we were able to use our theorem because the graph is planar: if you find such paths, they are unique in the sense that the matching map is forced, so all minimal path matchings have the same matching map. Ok. So, but then, the question is: what is the maximal flow from left to right? Ok, so by Menger's theorem, b equals the min cut, by max-flow min-cut duality. So this is a minimal vertex cut from top to bottom: you want to go from bottom to top, and you want to cross a minimal number of vertices which are in the percolation cluster. So here, sometimes, you cross edges which are covered by...
Ok. So there is at least a factor of one quarter here, since every vertex is adjacent to four edges; so, min cut, sorry, let me rather take a minimum edge cut. Ok, I write edge cut because it is easier. Because if you look in the dual graph, so this is my box, you look in the dual graph, here, ok, and then what you do is the following: if this edge is present in the percolation configuration, ok, this edge here, then the dual edge crossing it will receive a weight which is one. Ok? If this edge is not present, ok, you see, this edge, if this edge is not present in the percolation configuration, then the dual edge will receive a weight which is zero. So every edge in the dual graph receives a weight which is zero or one, according to whether or not it crosses an occupied edge. And then the minimal edge cut, so let us call that T_N: T_N is the minimum of the sum of the weights of the edges in γ, where you take the minimum over all dual paths γ from top to bottom. So it is a first-passage percolation problem. And Kesten proved, so this is Kesten '86, that T_N over N converges in probability to some constant δ, and that δ positive is equivalent to: the probability that the weight is zero (the weights are i.i.d.) is less than the critical percolation probability of the dual graph, which is one half here. But the probability that the weight is zero is 1 − p; so for us, since p > 1/2, it proves that δ is positive. Ok, maybe it was too fast, but the idea is just that, since you are in a supercritical percolation cluster, you can pack many yellow crossing paths from left to right, with a positive density; so the number b of paths from left to right is of order N, and that's it. So in particular, our theorem implies Kesten's theorem, which says that there exists an infinite cluster when p is larger than one half; but of course we used another theorem of Kesten to prove it. Ok, so...
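The vertical-percolation example above can be checked numerically. This is my own sketch (with an arbitrary N and a random choice of surviving vertical edges), not code from the lecture: since the N horizontal lines form a path matching with b = N and L = 0, the theorem forces every eigenvalue multiplicity to be at most N, whatever vertical edges remain.

```python
import numpy as np

def grid_adjacency(N, keep_vertical):
    """N x N grid with all horizontal edges kept and vertical edges kept
    according to the boolean array keep_vertical of shape (N-1, N)."""
    A = np.zeros((N * N, N * N))
    idx = lambda i, j: i * N + j
    for i in range(N):
        for j in range(N - 1):          # horizontal edges: always kept
            A[idx(i, j), idx(i, j + 1)] = A[idx(i, j + 1), idx(i, j)] = 1
    for i in range(N - 1):
        for j in range(N):              # vertical edges: kept or removed
            if keep_vertical[i, j]:
                A[idx(i, j), idx(i + 1, j)] = A[idx(i + 1, j), idx(i, j)] = 1
    return A

N = 5
rng = np.random.default_rng(0)
A = grid_adjacency(N, rng.random((N - 1, N)) < 0.5)
lam = np.linalg.eigvalsh(A)
# group numerically equal eigenvalues and look at the multiplicities
mult = np.unique(np.round(lam, 10), return_counts=True)[1]
print("max multiplicity:", mult.max(), "<= N =", N)
```

With all vertical edges removed the graph is N disjoint paths and every multiplicity is exactly N, so the bound is sharp.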
Let us see, I will try to give you the proof of that, because I think it is fun. So, it is an open problem to prove the same statement for d = 3, and the obstruction is not first-passage percolation: the result of Kesten on first-passage percolation is valid in all dimensions. The problem is in our theorem, in the condition that there exists a unique, well, that all minimal path matchings have the same matching map. This condition has no reason to be satisfied in dimension 3 or higher. Ok, so we should prove this theorem. So, the proof is based on a divisibility property of the characteristic polynomial, of det(A − x); more precisely, on a divisibility property of a collection of monic polynomials, which are the minors det((A − x)^{I,J}) of the characteristic polynomial, where the rows in I and the columns in J have been removed. So I am not sure this is conventional notation: what I write here as a superscript is the set of rows and columns which I remove before taking this determinant. Ok, so you introduce Δ_b(A − x), sorry, as being the monic polynomial which is the greatest common divisor of det((A − x)^{I,J}) over all I, J such that the cardinality of I is equal to the cardinality of J, which is b. Ok. For example, if I is equal to J, then det((A − x)^{I,I}) has degree n minus |I|, which is n − b; so n will be the total number of vertices. Right? Because in this case, this is simply the characteristic polynomial of a Hermitian matrix of size n − b. Ok, and...
So in particular, the degree of Δ_b is at most n − b. And we want to prove that, in fact, in situations which are described like that, this degree is much lower. So, and another thing is that we can relate easily the degree of this greatest common divisor to the multiplicities: in fact, it is exactly Σ_i (m_i − b)₊. So, the first fact is: why is this true? Let us check this fact if A is diagonal. Ok, so you have λ₁ repeated m₁ times, λ₂ repeated m₂ times, and so on. Ok. When you take det((A − x)^{I,I}), by removing b rows and columns you can remove at most min(b, m₁) occurrences of λ₁ from the diagonal, so you will get that Δ_b divides a polynomial in which the factor (λ₁ − x) appears with exponent (m₁ − b)₊; you can do that for any eigenvalue, and you get this statement. Ok, and because otherwise it is 0: if you take (A − x)^{I,J} with I different from J, then, since there are zeros off the diagonal, a zero row appears, and the determinant will be simply 0. So Δ_b is the product over i of (λ_i − x)^{(m_i − b)₊}. So, this is true if A is diagonal, but in fact it is true as soon as A is any Hermitian matrix, and it is not difficult. The only observation, call it Fact 2, is the following: you could define, for B(x) any n-by-n matrix with polynomial coefficients, Δ_b(B(x)) in the same way; then Δ_b(B(x)) is equal to Δ_b(U·B(x)·V) if U and V are invertible constant matrices. Ok, so this would be Fact 2. Ok, and if you know Fact 2, then Fact 1 follows: A − x = U(D − x)U* for some unitary matrix U, and we have already computed Δ_b for diagonal matrices. So, Fact 2: it is just an exercise on the multilinearity of the determinant. It says that, for example, if b₁, …, b_n are the columns of B(x), and you look at a minor of the matrix whose first column is replaced by a linear combination of the columns, ok, we would like to prove that Δ_b(B) divides this determinant; because then, using linearity,
and doing the same on the rows and on the columns, it will imply the claim. So you want to prove that Δ_b(B(x)) divides each such minor, with I, J of cardinality b. So you use the linearity of the determinant: this is the sum over i of a_i times the determinant where b_i sits in the first position, the other columns being b₂, …, b_n. Ok, there are a few cases to consider. If 1 is in J, meaning that in this expression you have removed the first column, then the value there is irrelevant; so we can assume that 1 is not in J, otherwise there is nothing to prove. If 1 is not in J, imagine that i is not in J either. Then it means that b_i appears somewhere else among the remaining columns, because i is not in J; so there is a determinant with two equal columns, and it is zero. Ok, so again there is nothing to prove. So we can assume that the only case to consider is 1 not in J and i in J. But then you just switch: you put b_i at the i-th position; it will give you a minus sign, but it will still be, up to sign, one of the minors of B(x). Ok, so this proves Fact 2. Charles, I just need some scaling here: is there some assumption on the determinant of U times V? Ah, so U and V should be invertible; otherwise what we prove is only that Δ_b(B(x)) divides Δ_b(UB(x)V). Yeah, invertible. But what about the constants? These are matrices... Ok, so you have to define, yes, you have to define the greatest common divisor so that it is a monic polynomial; then you do not have to care about constants. Otherwise, of course, you should ask the product of the determinants to be one. Yeah. So, and then, we are back to our problem. So now the goal, the only thing left to prove, is to take det((A − x)^{I,J}), where I and J are exactly the endpoints of my path matching, and we should prove that the degree of that is at most L. Because, you know, by the definition of the greatest common divisor, its degree is less than the degree of any other non-zero polynomial that it divides; so this will bound
the degree of Δ_b(A − x), which, as we saw, is the sum of the (m_i − b)₊. Ok. So now it is simply a determinant expansion; I will stop very soon. So what we did is just to write this determinant; there is nothing fancy here. You can write it, we found it convenient to write it, as the determinant of another matrix, B − xD, of size n times n, where D is diagonal with D_ii = 1 minus the indicator that i is in I ∪ J, and where B is the same as A, except that when you consider a vertex in I, you keep only outgoing edges, and when you consider a vertex in J, you keep only ingoing edges. So, B e_i = Σ_{j ≠ i} A_{ij} e_j for i in I; ok, this is to say you consider only outgoing edges, and symmetrically only the ingoing edges towards a vertex of J are kept. Ok, so what you do is you write this determinant as a sum over all permutations of the signature of the permutation times the product over i of (B − xD)_{i,σ(i)}. Ok, and for this product to be non-zero, let me do one last picture, all the rest is routine: for this product to be non-zero, a vertex i in I has to be matched to one of its neighbors, then this neighbor has to be matched to somewhere along the path, until it reaches an element of J; then the only way to continue is to come back to i₂, because all other entries are zero, and so on like that. Ok, so you do some really basic computation, and what you will find is that this will be equal to the sum, over all path matchings Π, of some sign, which you can compute, times det((A − x) restricted to the complement of Π). Ok, because when you have removed the path matching, on the complement the matrices B − xD and A − x coincide. So this term has degree exactly n minus the number of vertices covered by Π, which for minimal path matchings is L. So the only thing that you have to check, for the minimal ones, is that the sum of the plus and minus signs does not cancel; this is where the assumption that all minimal path matchings have the same matching map comes in. Ok. So that's the idea. Let us see. Ok, so, there would be another way to phrase this, but I will not mention it. We have an analogous
criterion, which I will not state, because I do not have enough time, but let me just give you a consequence. So, as you see, the proof is based on determinants, and so we do not have a direct analogue of this theorem on an infinite graph. But the other criterion, which I will just not state, applies directly to infinite graphs, to unimodular measures. So, the theorem that we have is the following: you define an invariant line ensemble as a pair formed by, so, it will take a ρ which is unimodular and supported on trees, ok, some tree; and imagine that, on an enlarged probability space, so, this is my tree, you can build lines, which go from one end to another of the tree, in such a way that the tree together with the lines is still unimodular. So, the way you can encode this: you can see it as a weighted graph. So there are some lines, there are some other vertices which are not covered by the lines, and edges which are not covered by the lines. It is simply: you can put weights on the edges, and say that the weight ℓ(uv) of an edge of the form uv is either 0 or 1, and the constraint is that Σ_u ℓ(uv) belongs to {0, 2}, where you sum over the neighbors u, and this should be true for any vertex v. Ok, so any vertex is either crossed by a line, in which case it has two distinguished edges, or not covered at all. The local weak topology and unimodularity extend to weighted graphs, and we say that we have an invariant line ensemble if you can build such weights on the edges such that the weighted tree is still unimodular. So then, your tree can be covered, partly, by infinite lines. Then what we have is that the total mass of the continuous part is at least the probability, under the measure, that the root is covered by a line, where you say that the root is on a line if the sum of the incident weights is 2. Ok. So then you can try to find examples: on which kinds of trees are you able to
build such infinite line ensembles? For example, you can prove that there always exists, well, there exists an invariant line ensemble such that the probability that the root is on a line is at least 1/6 times the expectation of (deg(o) − 2)₊ divided by, divided by what, divided by the expectation of the degree. We will see in an exercise that the fact of having two or more ends, when you are supported on infinite trees, is equivalent to saying that the expectation of the degree is larger than 2. So as soon as you have a tree which has two or more ends, this quantity is positive, and you will have some invariant line ensemble of positive density; so you will have a continuous part. So I am skipping many details here, of course, but we have something which is slightly stronger, because we can bound the individual mass of the atoms; but I will not enter into those details. So, there are open problems in this part. One is some criterion for the existence of an absolutely continuous part in the expected spectral measure. You have seen that for Schrödinger operators you have the Wegner estimates, which give you exactly that; but here everything is discrete, so it is a tough question, and we have no criterion for an absolutely continuous part, just criteria which bound the total mass of atoms. We have spent some energy trying to do critical percolation in dimension larger than 3, at least 3 or more. And there are some other models where we do not know, like trees with one end, or random limits of uniform planar maps, for example, which are objects that are critical in some sense, where we cannot apply either of our criteria. So we do not know whether or not the corresponding expected spectral measure has some continuous part. Ok, now I want to simplify, to look at a very simple setting, to try to be able to say something; the general situation is much harder. So, for example, a toy problem. I think this kind of problem was called quantum percolation, in a paper by de Gennes, Lafore and Millot in the fifties, shortly after the paper of Anderson.
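As a quick handle on the toy model about to be discussed, here is a sketch (mine, with arbitrary parameters c and depth, and a hard truncation of the tree) that samples a Poisson(c) Galton–Watson tree and computes the spectral measure at its root, as in the finite-graph examples above:

```python
import numpy as np

rng = np.random.default_rng(0)

def gw_tree_adjacency(c, depth):
    """Sample a Poisson(c) Galton-Watson tree truncated at the given depth;
    return its adjacency matrix (root = vertex 0)."""
    edges, frontier, n = [], [(0, 0)], 1   # frontier holds (vertex, depth) pairs
    while frontier:
        v, d = frontier.pop()
        if d == depth:
            continue
        for _ in range(rng.poisson(c)):    # offspring number ~ Poisson(c)
            edges.append((v, n))
            frontier.append((n, d + 1))
            n += 1
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1
    return A

A = gw_tree_adjacency(c=2.0, depth=6)
lam, vecs = np.linalg.eigh(A)
w = vecs[0, :] ** 2   # spectral measure of the truncated tree at the root
print(A.shape[0], "vertices; total mass:", w.sum())
```

Replacing `rng.poisson(c)` by any other offspring distribution gives the general Galton–Watson setting of the discussion below.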
So, I do not really know what the name means; what we concretely speak about is, for example: you take a Galton–Watson tree with offspring distribution P. Ok, so the root has a number N of offspring, which is sampled according to P, and all its offspring have numbers of offspring sampled independently, again with distribution P, and so on. Ok, so we could try to understand the adjacency operator here, and try to understand this measure, try to decompose it as an absolutely continuous part plus a pure point part plus a singular continuous part. Ok, it is not a unimodular tree, except when P is Poisson, but there is a unimodular version of it, which I will not describe; if you prefer the unimodular setting, you can just consider the unimodular Galton–Watson tree with degree distribution P, and what I am going to say remains correct. So, we want to understand this measure, and it is a very hard question. Me, my personal interest was to take P Poisson(c), because this is the local weak limit, the Benjamini–Schramm limit, of the Erdős–Rényi random graph, and to try to prove that for c large enough this measure has a continuous part. Because we know that for any c, the expected spectral measure has atoms, dense atoms everywhere: every totally real algebraic integer is an atom, as was probably explained to you recently. So, but still, even if this measure has atoms everywhere, there could still be some hope that some continuous part emerges at some quantum percolation threshold. Ok. And some physicists, very close to here, even have a conjecture on what this quantum percolation threshold is. Ok. So, back to earth: unfortunately, I am not able to say anything about that, so I will do something simpler, which is: we consider an offspring distribution P which is close to being constant, in the sense that the Wasserstein distance W_p(P, δ_Q), which is the p-th root of the expectation of |N − Q|^p, is small, for some exponent p which is larger than 1. So, Keller has proved a theorem here, in joint work with Simone Warzel and
Daniel Lenz. So Keller proved the theorem which says that if P(0) = 0, and W_p(P, δ_Q) is less than some small enough constant, for some p larger than 1, then, almost surely with respect to the randomness of the tree, μ_{T,e_o} is purely absolutely continuous. Ok. Since I was interested in Poisson, another case which is interesting is, for example, when you do bond percolation: then you are interested in P being a binomial variable with parameters, it is too many p's, sorry, Q and p. Ah yes, sorry, Q is an integer. So what you ask is that your tree is close to T_Q, which is the infinite Q-ary tree: every vertex has exactly Q offspring. So it is not unimodular, because the root has a different degree from the other vertices. Ok. So what I could prove is how to lift this condition P(0) = 0. From what I started with: as soon as you allow P(0) > 0, there are finite pendant subgraphs everywhere in the tree, ok, so it implies that the atomic part, ah, you will possibly find atoms, a dense set of atoms, in the support of this measure; but there could still be some continuous part. So that is what I did; ah, so this is not Keller anymore. So: if W_p(P, δ_Q) is small enough, for some p at least one, ah, then μ_{T,e_o} has a non-trivial absolutely continuous part, with positive probability. Only with positive probability, because with positive probability your tree is finite, so it is just atomic. Ok, and moreover, you can be more precise than that. So this measure has atoms everywhere, but you can prove that, if f(λ) is the density of the absolutely continuous part of μ_{T,e_o}, then f approximates the density of the semicircle law of radius 2√Q; so it is something like working in random matrices, but not knowing the normalizing factor, a bit of a shame, but ok. So the semicircle density is f_sc(λ) = √(4Q − λ²)/(2πQ) times the indicator that |λ| is less than 2√Q. Ok, so, ah, you can prove that the expectation of ∫ |f
You can prove that the expectation of the integral of |f(lambda) - f_sc(lambda)| goes to 0 as W_1(P, delta_q) goes to 0. So it implies that this absolutely continuous part has a total mass which goes to 1 as you get closer to the infinite q-ary tree, and its density approximates the semicircular law. If you do this for the unimodular Galton-Watson tree, you get, instead of the semicircular law, the Kesten-McKay law which we saw yesterday. [Question: so there is no P(0) = 0 condition?] Yes, there is no such condition in the second statement; it is to be compared with the distance to delta_q. [Question: which distance in the second statement?] Yes, W_p(P, delta_q); maybe just put p = 1. So if, in the L^1 Wasserstein distance, you are close enough to the Dirac mass at q, that is, if the expectation of |N - q| is small enough, then you have some absolutely continuous part with positive probability. In fact, what I did is some preprocessing on this random tree, and then I applied existing results on random Schrödinger operators. So what I want to explain to you is how you can connect this quantum percolation problem to problems on random Schrödinger operators on trees, which are more or less well understood. [Comment: I would not call it percolation; there is a paper by de Gennes, Lafore and Millot where quantum percolation means studying the adjacency operator on percolation graphs.] Yes, ok: studying the adjacency operator on a percolation graph, that is quantum percolation, because the adjacency operator is the kinetic energy of a quantum particle and the percolation cluster is now the random environment; we call it that way. [Question: if we understand this result as a perturbation of the q-ary tree, is the spectral measure of the q-ary tree the Kesten-McKay law?] No, it is semicircular, because the tree is not regular: the Kesten-McKay law would be for the regular tree.
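For comparison with the semicircular law at the root of the q-ary tree, the Kesten-McKay law for the d-regular tree with d = q + 1 has density d sqrt(4(d-1) - lambda^2) / (2 pi (d^2 - lambda^2)) on |lambda| <= 2 sqrt(d-1); this formula is from the standard literature, not from the talk. A quick normalization check:

```python
import math

def kesten_mckay(lam, d):
    """Density of the Kesten-McKay law for the d-regular tree:
    d*sqrt(4(d-1) - lam^2) / (2*pi*(d^2 - lam^2)) on |lam| <= 2*sqrt(d-1)."""
    if abs(lam) > 2 * math.sqrt(d - 1):
        return 0.0
    return d * math.sqrt(4 * (d - 1) - lam ** 2) / (2 * math.pi * (d ** 2 - lam ** 2))

# Check it integrates to 1 for the 4-regular tree (q = 3, d = q + 1).
d = 4
R = 2 * math.sqrt(d - 1)
n = 200000
h = 2 * R / n
mass = sum(kesten_mckay(-R + (i + 0.5) * h, d) for i in range(n)) * h
print(round(mass, 4))
```

Note that the support, [-2 sqrt(d-1), 2 sqrt(d-1)], is the same as for the semicircular law with q = d - 1; only the shape of the density differs, reflecting the different degree at the root.
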
Since we have a bias at the root, you get the semicircular law; but as I said, you can modify the statement and put the Kesten-McKay law if you replace the tree at the beginning by its unimodular version. Let me explain briefly how you transfer this problem, which apparently is very tough because you have atoms everywhere, to more well-understood problems of random Schrödinger operators on trees. So let us consider the Anderson operator A + lambda V on the infinite q-ary tree, with an i.i.d. potential V (some of you may like to put a minus sign here). When lambda is zero, your spectral measure at the root is simply the semicircular law of radius 2 sqrt(q). And if you call G(z) the Cauchy-Stieltjes transform of the semicircular law, it satisfies a fixed point equation: G(z) = -1 / (z + q G(z)). This fixed point characterizes the semicircular law. Now consider the Anderson operator. Exactly as in the recursion formula that we saw yesterday, if you take G_0(z), defined as the resolvent of the operator at the root, then due to the recursive structure of the q-ary tree, G_0 satisfies a fixed point equation in distribution: G_0(z) has the same distribution as -1 / (z + lambda V_0 + sum over x = 1 to q of G_x(z)), where everything depends on z and the G_x are i.i.d. copies of G_0. This is a simple consequence of the resolvent formula. Then, since the resolvent is bounded by one over the imaginary part of z, it is easy to see that for any z with positive imaginary part, G_0(z) converges in probability to G(z) as lambda goes to 0: this potential term becomes negligible and you arrive at the same fixed point. But of course the main difficulty is to get this convergence uniformly in the imaginary part of z, at least when the real part of z is in some spectral region. The first one to achieve that was Klein, in 1996.
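The fixed point G(z) = -1/(z + q G(z)) can be solved either by iteration or in closed form (it is equivalent to q g^2 + z g + 1 = 0, keeping the root in the upper half-plane). A minimal sketch, with names of my choosing, checking that the two agree at a test point with Im z > 0:

```python
import cmath

def G_fixed_point(z, q, iters=300):
    """Iterate g <- -1/(z + q*g), starting from -1/z; for Im z > 0 this
    converges to the Cauchy-Stieltjes transform of the semicircular law
    of radius 2*sqrt(q)."""
    g = -1 / z
    for _ in range(iters):
        g = -1 / (z + q * g)
    return g

def G_closed_form(z, q):
    """Solve q*g^2 + z*g + 1 = 0 and pick the root in the upper half-plane."""
    s = cmath.sqrt(z * z - 4 * q)
    g1 = (-z + s) / (2 * q)
    g2 = (-z - s) / (2 * q)
    return g1 if g1.imag > 0 else g2

q, z = 3, 0.7 + 0.5j
print(abs(G_fixed_point(z, q) - G_closed_form(z, q)) < 1e-9)  # True
print(G_fixed_point(z, q).imag > 0)  # True: Stieltjes transforms map C+ to C+
```

Each iteration corresponds to growing the tree by one generation, which is why the iteration converges for any fixed z in the upper half-plane.
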
He transposed this equation: he wrote it in terms of the Fourier transforms of the laws, viewed it as a fixed point, and applied an abstract implicit function theorem saying that the fixed point is locally stable when you perturb around the q-ary tree fixed point with lambda going to 0. It is a beautiful method. Then there are more refined methods. There was an approach by Froese, Hasler and Spitzer, which was further extended by Keller and Lenz, which is more geometric: it quantifies, in some sense, the fact that this equation looks like a contraction, locally, when q is larger than 2. And there is another proof, which gives a weaker statement but which is very neat, by Aizenman, Sims and Warzel in 2006. But the idea is always the same: you have to prove that this map has some contraction property which allows you to make this convergence uniform in the imaginary part of z. One important point is that, compared with random Schrödinger operators, in my case we have G_0(z) = -1 / (z + sum over the offspring of the root, say x = 1 to N, of G_x(z)), where, conditionally on N, the G_x are i.i.d. copies of G_0. We have replaced the randomness in the potential by randomness in this sum. (And the convergence can even be made almost sure, if you define things properly.) So what I want to do quickly is to explain how you can write this equation in a form which looks roughly like the Anderson one, so that if you read one of those papers you will be able to apply one of those methods to this problem. It is based on a tree decomposition which is very simple. You have your infinite tree T, and for a vertex x, T_x is the subtree rooted at x: the set of all vertices whose path to the root goes through x. You look at S, the survival set: the set of x such that T_x is infinite. And p_e, the probability of extinction, is the probability that the root is not in S. As you all know, p_e is the smallest root of phi(x) = x, where phi is the generating function of the offspring distribution.
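This last fact can be turned into a two-line computation: iterating x <- phi(x) from 0 converges monotonically to the smallest fixed point of the generating function, i.e. to p_e. A sketch (helper name mine), with Poisson(2) offspring, and a check that p_e is tiny when the offspring law is close to delta_q:

```python
import math

def extinction_probability(phi, iters=500):
    """Smallest fixed point of phi on [0, 1]: iterate x <- phi(x) from 0;
    the iterates increase monotonically to the extinction probability."""
    x = 0.0
    for _ in range(iters):
        x = phi(x)
    return x

# Poisson(c) offspring: phi(x) = exp(c*(x - 1)); for c = 2, p_e ~ 0.2032.
p_poisson = extinction_probability(lambda x: math.exp(2 * (x - 1)))
print(round(p_poisson, 4))  # 0.2032

# Bin(q, 1 - eps) offspring (bond percolation), close to delta_q:
# phi(x) = (eps + (1 - eps)*x)**q, and p_e is tiny, consistent with
# p_e -> 0 as W_p(P, delta_q) -> 0.
q, eps = 3, 0.05
p_bin = extinction_probability(lambda x: (eps + (1 - eps) * x) ** q)
print(p_bin < 1e-3)  # True
```
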
We are in a setting where p_e, the probability of extinction, goes to 0 as W_p(P, delta_q) goes to 0: the extinction probability is continuous for the local, weak topology at any point except the Dirac mass at 1, and we are getting closer and closer to delta_q. Now what you do is rewrite this equation in terms of whether or not each subtree survives. Of course you have to condition on the fact that the subtree at the root is infinite, because otherwise there is nothing to say. Write N_e + N_s for the total number of offspring of the root: N_e is the number of offspring whose subtree gets extinct, and N_s the number of offspring whose subtree survives forever. Of course you can compute the joint law of those two; that is quite easy, an exercise in probability. And conditioning on N_s at least 1 is the same thing as conditioning on the tree being infinite. So we denote by G_0^s(z) the resolvent at the root under this conditioning, and it satisfies: G_0^s(z) has the same distribution as -1 / (z + sum over x = 1 to N'_s of G_x^s(z) + V(z)), where (N'_e, N'_s) has the joint law of the numbers of extinct and surviving offspring conditioned on the fact that at least one survives, the G_x^s are i.i.d. copies of the resolvent conditioned on survival, and the term V(z), which I put in as a noise, is V(z) = sum over x = 1 to N'_e of G_x^e(z), where G_x^e(z) has the law of the resolvent at the root conditioned on extinction. Another fact is that the expectation of N'_e goes to 0 as W_p(P, delta_q) goes to 0. So you are almost there; the problem is that your potential now depends on z, and that is exactly where you have atoms everywhere.
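The "exercise in probability" here is that, given N = n offspring, each one independently heads an extinct subtree with probability p_e, so (N_e, N_s) given N = n is a binomial split. A sketch (all names mine) computing this joint law and the conditional mean E[N'_e], which is indeed small when the offspring law is close to delta_q:

```python
from math import comb

def split_law(pmf, pe):
    """Joint law of (N_e, N_s): conditionally on N = n, the number N_e of
    offspring whose subtree dies out is Bin(n, pe)."""
    law = {}
    for n, pn in pmf.items():
        for a in range(n + 1):
            key = (a, n - a)  # (extinct, surviving)
            law[key] = law.get(key, 0.0) \
                + pn * comb(n, a) * pe ** a * (1 - pe) ** (n - a)
    return law

def mean_extinct_given_survival(pmf, pe):
    """E[N'_e], the mean number of extinct offspring given N_s >= 1."""
    law = split_law(pmf, pe)
    p_surv = sum(p for (a, b), p in law.items() if b >= 1)
    return sum(a * p for (a, b), p in law.items() if b >= 1) / p_surv

# Offspring Bin(3, 0.9); first solve pe = phi(pe) = (0.1 + 0.9*pe)**3.
pe = 0.0
for _ in range(200):
    pe = (0.1 + 0.9 * pe) ** 3
pmf = {n: comb(3, n) * 0.9 ** n * 0.1 ** (3 - n) for n in range(4)}
m = mean_extinct_given_survival(pmf, pe)
print(m < 0.01)  # True
```
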
But there is a lemma which says that for any epsilon there exists a set K, a subset of R (compact, say; or maybe not, let us just ask that the Lebesgue measure of the complement is bounded by epsilon), such that the expectation of the supremum, over lambda in K and eta positive, of |V(lambda + i eta)|^p goes to 0 as W_p(P, delta_q) goes to 0. So, up to removing a nasty subset of real numbers, which will be dense, because that is exactly where the atoms of the measure are, you can control any moment of this remainder, as soon as your offspring distribution concentrates fast enough. And then what you do is restrict yourself to this deterministic set K and apply one of those methods. [Question: why not a second moment bound?] No: when lambda is not in K you control nothing on the potential; the potential is bounded in L^p only for lambda in your set K. [Question: but the complement of K is dense in R?] Yes, but it is still a set of small Lebesgue measure. So, in two minutes, I can almost give you the proof of the lemma. The first point concerns the tree conditioned on extinction. Since it is a supercritical tree, either it dies out very fast or it is infinite. It turns out that you can prove that the probability, conditioned on extinction, that the tree has size at least k is less than C times delta to the power k - 1, with delta arbitrarily small as P gets close to delta_q. It means exactly what I said: conditioned on extinction, you are very likely to die out fast, because delta is very small. That is the first point, which again can be proved as an exercise in probability. The second point is that we now build this set K. Introduce Lambda_k as the set of values lambda such that lambda is in the spectrum of some finite tree of size at most k. The number of unlabeled trees grows exponentially, any two isomorphic trees have the same spectrum, and a tree of size k has at most k distinct eigenvalues; so it implies that Lambda_k does not grow too fast.
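To make Lambda_k concrete, here are the spectra of all unlabeled trees on at most 4 vertices (the five edge lists below are enumerated by hand; numpy is assumed available): pooling their eigenvalues, rounded to deduplicate numerically equal values, gives a small symmetric set whose largest element is sqrt(3), coming from the star K_{1,3}.

```python
import numpy as np

# All unlabeled trees on at most 4 vertices, as edge lists
# (enumerated by hand).
trees = [
    [],                          # single vertex
    [(0, 1)],                    # one edge
    [(0, 1), (1, 2)],            # path P3
    [(0, 1), (1, 2), (2, 3)],    # path P4
    [(0, 1), (0, 2), (0, 3)],    # star K_{1,3}
]

def spectrum(edges):
    """Eigenvalues of the adjacency matrix of the graph given by edges."""
    n = max((max(e) for e in edges), default=0) + 1
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    return np.linalg.eigvalsh(A)

# Pool all eigenvalues, rounding to merge numerically equal values.
lambda_4 = {round(float(ev), 8) for edges in trees for ev in spectrum(edges)}
print(sorted(lambda_4))
```

Trees are bipartite, so each Lambda_k is symmetric about 0, as the output shows.
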
Its cardinality is probably like C^k for some universal constant C, which you can compute: open the book by Flajolet and Sedgewick and you will know more about this constant; C is governed by the number of unlabeled trees of size k. Now your set K is very easy. You take B_k(epsilon) to be the set of x in R which are close to one of the points of Lambda_k, exponentially close: at distance at most epsilon 2^{-k} / |Lambda_k|. And then you take K = R minus the union over k of the B_k(epsilon). So the Lebesgue measure of the complement of K is bounded by the sum of the Lebesgue measures of those sets: you sum over all eigenvalues in Lambda_k, and around each of them you grow a ball of radius epsilon 2^{-k} / |Lambda_k|, so the k-th term contributes at most epsilon 2^{-k} times a constant; if my computation is correct, the complement is arbitrarily small. And then what you have to check is that the expectation of the noise is bounded; due to the first fact, you can even prove it in L^p: if you put p here, what is enough to check is that each term of this sum is bounded in expectation by some constant, which will depend on delta, uniformly in z in my set K; and the sum has a small number of terms. The only thing to use is the Cauchy-Stieltjes bound: the transform of a measure nu at z is bounded by 1 over the distance between the support of nu and z. So let us do p = 1. You can bound the expectation of |G^e_0(z)| by the sum over all k of the probability that the tree which gets extinct has size at least k, times the corresponding bound on the resolvent: when the extinct tree has size less than k, its spectral measure is supported on Lambda_k, and since z is in K, you can bound this term using that the subtree has size at most k.
Namely, by 1 over the distance to Lambda_k, which is at least epsilon 2^{-k} / |Lambda_k|, so the bound is the reciprocal of that. So this resolvent factor is bounded by C'^k for some constant C', and the probability factor is bounded by delta^k; so if delta is small enough, this will be a convergent series. Ok, so thank you. [Question: actually, I have an interesting question. For what I would call bond percolation, do you have a guess that would be useful here? Because what is needed is the percolation threshold for quantum percolation.] There are results in that direction, yes, but for the density of states, not for the non-averaged measure; that is why it was easy, I mean easy: it is just an averaged estimate. [Comment: I would not believe this value. I mean, the physics for the percolation threshold for quantum percolation is that it is actually higher than the classical one: near the classical percolation threshold there are always bottlenecks, and the quantum particle just cannot get through these bottlenecks, it gets stuck. So it could actually be strictly higher.] On trees? [On trees it is clear. In general there is no proof, but I would bet my shirt that it is higher.] I would not be that confident, but I would also bet my shirt that it is higher. But it is a much more difficult question, of course.