And it goes down to chi_1, which is just the absolute value of a Gaussian. So it's a Jacobi matrix, and this is Trotter's theorem: if you write this tridiagonal matrix with these independent entries, it has the same eigenvalue distribution as the GOE, and in fact the same spectral measure as well, which we have now seen.

How much time do I have? Twenty minutes? Okay, that's fine for me. So I want to do some Benjamini-Schramm convergence here, but first let me look at the spectral measure of Z; we didn't actually do that. It belongs to the first part of the talk, so we'll do it now.

So what is the spectral measure of Z? You can compute it in many ways, but let me do it without computation. Let's look at the n-cycle. You know that the spectral measure of the n-cycle converges to the spectral measure of Z; in fact the eigenvalue distribution of the n-cycle also converges to the spectral measure of Z. So what is the eigenvalue distribution of the n-cycle? That's not hard. If I call this matrix T, the right shift matrix, then the adjacency matrix is just T + T^T. Now T^T is T^{-1}, so T and T^T commute, and I can get the eigenvalues of T + T^T just by understanding the eigenvalues of T. But you know the eigenvalues of the right shift matrix: they are the roots of unity; that's the basics of finite Fourier analysis. The eigenvalues of T^T are the conjugates of those, so on the same eigenspace we get the conjugate, or the inverse, of the eigenvalue. These roots of unity live on the unit circle, and you take each one and add its inverse. You can think of it the following way: take a circle of radius two and project, just take the real parts. So take the circle of radius two and take the real parts of the points on it; those are the eigenvalues. You can also compute this directly.

So where does this converge? The n-cycle converges in the Benjamini-Schramm sense to Z, and from this picture you can figure out what the limiting spectral measure of Z is. You take the circle of radius two; the essentially uniform discrete measure on it converges to the continuous uniform measure on the circle, and then you project. So you take the length measure of the circle and project it to the real line. That's called the arcsine distribution, and that's the spectral measure of Z. This was just an exercise I did for you.
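To spell out the n-cycle computation just described, here is a short sketch in standard notation; the explicit arcsine density at the end is the usual one on [-2, 2] and is an addition, not something written on the board.

```latex
% The right shift T on C^n is diagonalized by the discrete Fourier basis,
% so its eigenvalues are the n-th roots of unity:
\[
T v_k = \omega^k v_k, \qquad \omega = e^{2\pi i/n}, \qquad k = 0, 1, \dots, n-1 .
\]
% The n-cycle adjacency matrix is T + T^{\top} = T + T^{-1}, so its eigenvalues are
\[
\lambda_k = \omega^k + \omega^{-k} = 2\cos\!\Big(\frac{2\pi k}{n}\Big),
\]
% i.e. the real parts of n equally spaced points on the circle of radius 2.
% Projecting the uniform (length) measure of that circle to the real axis gives the arcsine law
\[
d\mu_{\mathrm{arcsine}}(x) = \frac{dx}{\pi\sqrt{4 - x^2}}, \qquad -2 < x < 2 .
\]
```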
So my next goal is to prove the Wigner semicircle law for you; there will be one or two proofs, depending on how much time I have, and I'll try to make this a gentle introduction. Let's think of the matrix I have here as the adjacency matrix of a weighted graph. What is the weighted graph? It's just a path, with n vertices and n-1 edges, and you have weights on the edges: those are the chi's. You also have weights on the vertices, which you can think of as loops on the graph; those are the normals, and they're all independent. So this random weighted graph is your GOE. It's another way of thinking of the GOE, and it gives the GOE a geometric structure. Let's use this geometric structure to get the Wigner semicircle law.

The exercise I need for that is the following. If you look at a chi_n random variable, what is it? It's the length of a Gaussian vector in n dimensions, so it's like the distance reached by a Gaussian random walk in n steps. That's going to be roughly square root of n, and in fact chi_n minus square root of n converges to a normal in distribution; you can deduce this from the CLT for the square of the chi, which is a sum of independent random variables (the estimate is spelled out after this passage). So for all intents and purposes these chi's look like square root of n plus noise of smaller order, and on the diagonal you also have some noise, which is of order one.

So what do you do? You take this graph, divide all the labels by root n, and then take a Benjamini-Schramm limit. What does the Benjamini-Schramm limit see in this graph? Let's redraw it. Benjamini-Schramm convergence doesn't really see the small noise, because it's of smaller order once you divide by root n; each label just converges to its square root. So it's almost the same as the graph where you put square root of n-1 over root n on this edge, square root of n-2 over root n on the next one, and so on, down to square root of 1 over root n. In terms of Benjamini-Schramm convergence you just see this graph; you don't see the noise. And in fact you have local convergence to the limit of this thing.

So what is the limit of this? Well, the loops have disappeared, and the graph structure is going to be Z. But what are the labels? The labels come from picking your root uniformly at random: the position you put at zero is a uniform random variable from 1 to n-1, essentially 1 to n, and once you pick your root, the labels change very little near it. So sit back for a moment and think about what the Benjamini-Schramm limit of this rooted graph is. The answer is simple: you put the square root of a uniform random variable on every edge of Z, and it's the same uniform random variable on every edge, not independent ones. The only randomness here comes from the choice of root, nothing else; the randomness of the original model is basically forgotten, and you only see the deterministic picture. So that's the Benjamini-Schramm limit of your GOE. We have taken the Benjamini-Schramm limit of the GOE matrix, and this is it.
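Here, spelled out, is the chi estimate used above: a sketch using the standard chi-squared moments; the variance constant 1/2 in the limit is a standard fact, not something stated in the lecture.

```latex
% chi_n is the length of a standard Gaussian vector in R^n:
\[
\chi_n^2 = \sum_{i=1}^n g_i^2, \qquad g_i \sim N(0,1)\ \text{i.i.d.}, \qquad
\mathbb{E}\,\chi_n^2 = n, \quad \operatorname{Var}\,\chi_n^2 = 2n .
\]
% By the CLT for this sum, chi_n^2 = n + O(sqrt(n)), and taking square roots,
\[
\chi_n = \sqrt{n + O(\sqrt{n})} = \sqrt{n} + O(1), \qquad
\chi_n - \sqrt{n} \xrightarrow{\ d\ } N\big(0, \tfrac12\big).
\]
% So after dividing by sqrt(n), the edge label chi_{n-k}/sqrt(n) is
% sqrt((n-k)/n) plus noise of order 1/sqrt(n), which the
% Benjamini-Schramm limit does not see.
```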
So let's see, what is the spectral measure? Well, it's the expected spectral measure: the eigenvalue distribution mu_n, as we discussed, converges to the expected spectral measure of this random rooted graph. So what is it? The randomness just comes from U. If we fix U, what's the spectral measure? Well, if you fix U it's just the arcsine, scaled width-wise by the square root of U. So the expected spectral measure is the average of these arcsines over the square root of a uniform random variable.

Let me tell you this story slightly differently. What do you do? You pick a circle of radius square root of U, with U uniform, you pick a random point on the circle, and you project it down to the real line. That's the answer. So what is it? Well, picking a circle whose radius is the square root of a uniform and then picking a random point on it is just the same as picking a random point from the disc. That's another exercise, but it's easy; both little computations are written out after this passage. So we have proved the Wigner semicircle law, and it's important that it was without any computation. That's the end of the first proof.
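The two little exercises behind the first proof can be checked directly. This is a sketch, using the normalization where the arcsine from before lives on [-2, 2], so the scaled circles have radius 2 sqrt(U) and the disc has radius 2; the exact constants were left implicit in the talk.

```latex
% Random radius: with U uniform on [0,1], the radius R = 2 sqrt(U) satisfies
\[
\mathbb{P}(R \le r) = \mathbb{P}\Big(U \le \tfrac{r^2}{4}\Big) = \frac{r^2}{4},
\qquad 0 \le r \le 2,
\]
% which is exactly the law of the radius of a uniform random point in the disc of radius 2.
% Projection: if (X, Y) is uniform on that disc, the marginal of X is
\[
f_X(x) = \frac{1}{4\pi} \cdot 2\sqrt{4 - x^2} = \frac{\sqrt{4 - x^2}}{2\pi},
\qquad -2 \le x \le 2,
\]
% which is the Wigner semicircle density.
```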
Now I'm going to give you another proof, which again has almost no computation, and it will be just enough to finish the first lecture. It's a proof that in many ways is very annoying to me, but it works.

The proof goes as follows. I take my graph here, with the labels divided by root n, and now I take the rooted limit, with the root here. This is not the Benjamini-Schramm limit: I'm not picking the root at random, I'm taking the rooted limit of this graph. Where does that go? Well, these chi's over root n go to one near the root, so it just goes to Z_+, because this is rooted convergence with this root. So what does this tell me? It tells me that sigma_n, the spectral measure of the GOE (not the eigenvalue distribution, the spectral measure), converges to the spectral measure of this graph, which we already discussed is the Wigner semicircle law. What kind of convergence is this? It's weak convergence, in probability. There are two things here, because these are random objects: it's weak convergence of measures, and it's convergence in probability to this deterministic limit. These random measures converge to a deterministic limit.

So this doesn't quite prove the Wigner semicircle law, because it's about the spectral measure, and the spectral measure is not the eigenvalue distribution: it has weights. But what are the weights? Here you can probably guess, because you've been here for a while. What are the spectral weights in the GOE? In fact, in any ensemble which is invariant under orthogonal conjugation, the eigenvectors form a random orthonormal frame, so their first coordinates form a random point on the sphere. The squares of the first coordinates therefore always form a Dirichlet(1/2, ..., 1/2) distribution. These are the same as taking gamma(1/2)'s, chi-squared things, adding them up and normalizing by their sum, so that the total mass is one, because the spectral measure is a probability measure and the weights have to add up to one. And not only is it Dirichlet, it's also independent of the eigenvalues: even if you condition on the eigenvalues, conjugating by a random orthogonal matrix doesn't change the eigenvalues but changes the weights to exactly this Dirichlet thing. So the spectral measure here is something very simple: you take the eigenvalue distribution and you put these random Dirichlet weights on it, independently of the eigenvalues.

What that tells you is the following. Take any set A on the real line and look at the weight the eigenvalue distribution mu_n assigns to it and the weight the spectral measure sigma_n assigns to it. The difference of the two converges to zero in probability, by the law of large numbers: for this to be non-zero in the limit you need about order n eigenvalues in the set, and then you are just adding order n independent, or almost independent, things from this Dirichlet, so they behave according to the law of large numbers.

So you put together this and this, and you get a new proof of the Wigner semicircle law, which tells you that mu_n converges to the Wigner semicircle law. What's really annoying to me about this is that the information about what the spectral measure looks like here at the corner and at a random root should have nothing to do with each other; if you take a general graph, there is absolutely no connection. Yet in this particular case they give you the same answer. So what this tells you is that these Jacobi matrices that correspond to the GOE are very special: they have some very special properties that are not typical, and often you can exploit these special properties, as we have done before, to prove certain things.
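The comparison of weights and eigenvalue counts in the second proof can be written like this; a sketch, with the Dirichlet weights represented as normalized chi-squared variables as mentioned above, and with sigma_n, mu_n and the set A as in the talk.

```latex
% Spectral weights of an orthogonally invariant ensemble:
\[
w_i = \frac{\gamma_i}{\gamma_1 + \cdots + \gamma_n}, \qquad
\gamma_i \sim \chi_1^2 \ \text{i.i.d., independent of the eigenvalues},
\]
% so (w_1, ..., w_n) ~ Dirichlet(1/2, ..., 1/2). For a set A on the real line,
\[
\sigma_n(A) = \sum_{i:\ \lambda_i \in A} w_i, \qquad
\mu_n(A) = \frac{\#\{i : \lambda_i \in A\}}{n},
\]
% and since E[w_i] = 1/n, the law of large numbers gives
\[
\sigma_n(A) - \mu_n(A) \longrightarrow 0 \quad \text{in probability},
\]
% which upgrades the convergence of the spectral measures sigma_n to
% convergence of the eigenvalue distributions mu_n.
```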
Do I have any more time? Five minutes, okay. So what I was planning to do next time is to show how the general beta ensemble can also be represented in this way. Here is how it goes. You take this tridiagonal matrix and you still put a normal(0,2) here and so on; everything is the same as before, except that you put a beta in the index of the chi's, so you have chi_{beta(n-1)}, chi_{beta(n-2)}, and so on, and for beta equal to one you get back the matrix we had. (A small numerical sketch of this model appears at the end of this passage.) You still have a Jacobi matrix, and then there is this beautiful theorem of Dumitriu and Edelman which says that the eigenvalues of this matrix correspond to the beta ensemble, the Hermite beta ensemble: their joint distribution has a density given by the Vandermonde determinant raised to the power beta, with respect to some Gaussian background measure; I think it's exp of minus beta over four times the sum of the lambda_i squared, I'm not sure about the constant. So you have a Gaussian background measure which tries to make these points Gaussian, but then there is some repulsion which makes them spread out, and their joint distribution is this.

This gives you a very nice tool to analyze these beta ensembles and prove various things about them. So my plan for the rest of the week, the rest of the three lectures, is the following. Tomorrow I will give you a proof of the Dumitriu-Edelman theorem, an almost complete proof. It will actually even give you a complete proof of the eigenvalue distribution for beta equals two, which, if you want to do precisely, is not so easy; you can see in the Anderson-Guionnet-Zeitouni book that doing it precisely is fairly involved. But this will give you a technicality-free proof of it. So that's tomorrow. I'm also going to show you the Baik-Ben Arous-Péché transition; you may have heard about it. I think this is probably the most famous thing nowadays about random matrix theory, period; it's in engineering textbooks. Gérard told me that this is one of his weakest theorems, and I told him, you know, you're still better off than Fatou. The proof there is not that simple, but we will be able to do a very simple proof which works for general beta ensembles, using these tridiagonal matrices.

Then on days two and three, or three and four, my plan is to look at this nice geometric structure that we have given the GOE and try to take a limit of it in various ways. I've actually already given you two ways to take a limit, and they're nice and simple: the Benjamini-Schramm limit and the rooted limit.
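As promised above, a minimal numerical sketch of the tridiagonal beta-Hermite model. It assumes the normalization with N(0,2) on the diagonal and chi_{beta(n-1)}, ..., chi_beta on the off-diagonal; the function name and the sqrt(n) rescaling are illustrative choices, not anything fixed in the lecture.

```python
import numpy as np

def beta_hermite_tridiag(n, beta, seed=None):
    """Sample the tridiagonal beta-Hermite model (Dumitriu-Edelman type):
    N(0, 2) on the diagonal and chi_{beta*(n-1)}, ..., chi_beta on the
    off-diagonal. Conventions differ by an overall scaling; this is the
    one matching the beta = 1 (Trotter / GOE) matrix from the lecture."""
    rng = np.random.default_rng(seed)
    diag = rng.normal(0.0, np.sqrt(2.0), size=n)
    dof = beta * np.arange(n - 1, 0, -1)           # beta*(n-1), ..., beta
    off = np.sqrt(rng.chisquare(dof))              # chi_k = sqrt of chi-squared_k
    return np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)

if __name__ == "__main__":
    n = 2000
    H = beta_hermite_tridiag(n, beta=1.0, seed=0)  # beta = 1: Trotter's GOE model
    eigs = np.linalg.eigvalsh(H) / np.sqrt(n)      # rescale to order one
    # For beta = 1 the rescaled spectrum should fill out roughly [-2, 2]
    # with a semicircle profile; the same sampler works for any beta > 0.
    print("empirical support:", eigs.min(), eigs.max())
```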
But I want to take a limit in which this becomes a differential operator, called the stochastic Airy operator, and that will be good for capturing the Tracy-Widom distributions for general beta. I also want to talk about how to use this limit, this representation of the Tracy-Widom distribution, to deduce various things. And if I have time, I'll tell you how to do the same thing in the bulk: you can take a third, or fourth, kind of limit, where you get the bulk eigenvalue distribution, which is called the sine beta process, and there is a nice operator there too, called the sine beta operator. So that's the plan for the next three days. Thanks.

Questions? So, a tridiagonal matrix always corresponds to a path, right? Only the weights are different. Yes, I was going to give this as an exercise. Take the Wishart ensemble: you take some rectangular matrix A, you fill it with i.i.d. Gaussians, and for the Wishart ensemble you look at A A^T. Here you can take this original rectangular matrix and bring it into bidiagonal form without changing the eigenvalues of A A^T, and then A A^T will automatically be tridiagonal. That's very similar to what we did, so it's a good exercise to do, and it will actually give you the Wishart case of the Baik-Ben Arous-Péché transition. Other questions?

Yes, so: you have this vector e_1, the first coordinate vector, and let's say you want to get the vector e_k. It's not so hard to check that there exists a unique polynomial such that if you apply it to A and act on e_1, so some linear combination of powers of A times e_1, you get e_k. That's where the orthogonal polynomials come in. I'm not completely sure either, but let me tell you this way: there is an L2 isomorphism in general here. On the one hand you have the L2 of the matrix, just the ordinary Euclidean space in which the matrix, the Jacobi matrix, acts; on the other hand you have the real line with the spectral measure sitting on it, and multiplication by the matrix corresponds to multiplication by x on the real line. There is a commuting diagram, and that's the L2 isomorphism between these two things. So when you construct orthogonal polynomials, which has to do with multiplication by x, with x and x squared and so on, the same story plays out on the matrix side of what we did here. I don't know if that helps.

Another question? Last question? Okay, so we can thank Bálint again.
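As a footnote to that last answer, the L2 isomorphism can be written compactly as follows; a sketch in standard spectral-theorem notation, with sigma the spectral measure of the Jacobi matrix A at e_1 as earlier in the talk.

```latex
% For a Jacobi matrix A (positive off-diagonals) the vector e_1 is cyclic,
% and the map
\[
U : \mathbb{R}^n \to L^2(\mathbb{R}, \sigma), \qquad U\big(p(A)\, e_1\big) = p,
\]
% is a unitary isomorphism, since
\[
\langle p(A) e_1, \; q(A) e_1 \rangle = \int p(x)\, q(x)\, d\sigma(x),
\]
% and it intertwines A with multiplication by x: U A U^{-1} = (f -> x f).
% Gram-Schmidt applied to 1, x, x^2, ... in L^2(sigma) produces the orthogonal
% polynomials p_0, p_1, ..., and p_k(A) e_1 = e_{k+1}, which is the unique
% polynomial of A sending e_1 to a coordinate vector mentioned in the answer.
```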