Okay, let me start by thanking the organizers for putting together this workshop at this time of year, in this very nice place, and for inviting me to speak. I'll pick up a bit on something Julien mentioned yesterday about long-range interacting systems: the fact that they can have non-equivalent ensembles and non-concave entropies. He mentioned that briefly, and I want to explain it in full detail, because there are some nice results about long-range interacting systems. It's something very interesting that comes up when you look at the thermodynamics, the equilibrium statistical mechanics, of long-range interacting systems, and I'd like to explain in much more detail how long-range interactions can bring non-equivalent ensembles. I'm supposed to be representing mathematical physics, so I'll be using the board as well as the slides; I'll come back to the slides in a minute. What I'll be considering is mostly classical systems, although you can also make sense of equivalent or non-equivalent ensembles for quantum systems, and I'll mention that at some point. So I have an N-particle system. I had to use the board, this is really nice. To describe this you need the microstates. A microstate is just a microscopic configuration; in the slides this will be omega. So this is the state of your first particle, your second particle, and so on, up to N particles. And I assume that the whole system is described by some Hamiltonian, which I'll write H_N as a function of the microstate. This describes the microscopic energy of your system. The question is how you describe this in terms of thermodynamics and equilibrium statistical mechanics. So I'll be talking about the equivalence of ensembles in equilibrium statistical mechanics, and I'll be taking a thermodynamic limit.
As you know, I can describe this by fixing the energy or by fixing the temperature. So I could say: I fix the number of particles and I fix the energy. In that case, the thermodynamic function to consider is the entropy, which is a function of the energy. I could decide instead to fix the number of particles and the temperature, and in that case the thermodynamic potential I should use is the free energy. So that's a description in terms of thermodynamic potentials, and the question will be whether the two sides are equal. Then you can go one level deeper and look at equilibrium states. At fixed energy, I can look at some macrostate: it could be the magnetization of a spin model, or the velocity distribution of a gas. For fixed energy there will be a set of equilibrium states, which I can call, say, m-star, parameterized by the energy; maybe there are many, so in fact for each energy I have a set of equilibrium states. If instead I look at the system as a function of temperature, I get the same thing, but now parameterized by temperature: a set of equilibrium states. That's the macrostate level, and I can discuss equivalence at that level too. The question is: is equivalence at the thermodynamic level the same as equivalence at the macrostate level? And then I can go one level deeper still. For the statistical description of my system I'm really defining a distribution: underlying all this is a probability for the microstate. If I fix the energy, that distribution, that measure, is parameterized by the energy. This is the micro-canonical distribution: it's constant over all the states with the fixed energy.
And it is zero otherwise. For the canonical ensemble, the distribution you use is entirely different. As you know, it's the Gibbs distribution: essentially the exponential of minus beta times the Hamiltonian, with a normalization constant, the partition function. So these are two very different distributions for the microscopic configurations, and the question is: obviously they're not the same, but are they similar? I'll come back to define these three levels. Of course, I haven't written it, but this is the micro-canonical ensemble and this is the canonical ensemble. They describe two very different physical descriptions of the system. Yet if you read textbooks on thermodynamics, they'll say, and I think you probably know this, that they're the same. But are they the same? In what sense? What are the conditions for saying that the two ensembles are equivalent, that they give the same description of your system in some limit? This is what I want to discuss. And of course, physically, they describe two different situations. The micro-canonical ensemble is a system with a fixed energy, a closed system, whereas the canonical one is a system in contact with a heat bath at a fixed temperature. So a priori there's no reason to think they should be the same. In the canonical picture, I have my system coupled to a much bigger bath at a fixed temperature, and if I look at the energy of that subsystem, my N-particle system, the energy is not fixed, it's fluctuating. So how can that canonical system be the same as the micro-canonical system with its fixed energy? This is what I want to discuss.
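In symbols, the two distributions just described can be written as follows (a standard form; Ω_N is the number of microstates of energy E and Z_N the partition function):

```latex
P^{\text{micro}}_{E}(\omega) \;=\;
\begin{cases}
\dfrac{1}{\Omega_N(E)} & \text{if } H_N(\omega) = E,\\[4pt]
0 & \text{otherwise},
\end{cases}
\qquad\qquad
P^{\text{can}}_{\beta}(\omega) \;=\; \frac{e^{-\beta H_N(\omega)}}{Z_N(\beta)},
\quad
Z_N(\beta) = \sum_{\omega} e^{-\beta H_N(\omega)}.
```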
I think you know the basic picture: yes, if I look at the energy of my system in the canonical ensemble, the energy of that subsystem fluctuates; there's a probability distribution for it. But the point is that for a system of a given size you get some distribution; for a bigger system that distribution tends to concentrate; and for an infinitely large system the energy actually concentrates on a specific value, the equilibrium energy, which I'll call U-star as a function of temperature. So although the canonical ensemble does have energy fluctuations, in the thermodynamic limit it concentrates on a fixed energy, just like the micro-canonical ensemble. This is the basic idea of equivalence, and it shows that equivalence can only be defined, can only be meaningful, in the thermodynamic limit. For finite-size systems the two ensembles are not the same: one has fluctuations, the other has not, and in fact the fluctuations of any macrostate are different in general. But the point is that the equilibrium states can be the same, or can be put in correspondence. That's what I want to discuss. Now, that used to be the full story: if I had given this talk ten years ago, I would be finished here. The thermodynamic limit brings a fixed equilibrium energy, that can be put in correspondence with the micro-canonical ensemble, and that's it. What long-range interactions brought into the picture is the fact that you can have non-equivalent ensembles, and this is what I want to discuss. Something we've been working on a lot in the last five or ten years, maybe even more, is to define equivalence at these three levels. Equivalence at the thermodynamic level had been studied for a long time.
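As a minimal numerical sketch of this concentration (my own toy example, not the speaker's): take N independent ±1 spins with H = -Σᵢ σᵢ, sample the canonical ensemble at fixed β, and watch the fluctuations of the energy per spin shrink like 1/√N.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0
# each spin is independent in this toy paramagnet, with canonical weight e^{beta*sigma}
p_up = np.exp(beta) / (np.exp(beta) + np.exp(-beta))

def energy_per_spin_samples(n_spins, n_samples=2000):
    """Draw canonical samples of the energy per spin, H = -sum(sigma_i)."""
    spins = np.where(rng.random((n_samples, n_spins)) < p_up, 1, -1)
    return -spins.mean(axis=1)

small = energy_per_spin_samples(100)
large = energy_per_spin_samples(10_000)
# both distributions are centered on the equilibrium energy -tanh(beta),
# but the larger system's distribution is far narrower (~ 1/sqrt(N))
print(small.std(), large.std())
```

The standard deviations differ by roughly a factor of ten, as expected from the 1/√N scaling, while both means sit at the equilibrium value U*/N = -tanh(β).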
But the new bit we've been working on is to define equivalence at the macrostate level and also at the measure level, and we'll see that they're actually related. They should be, of course, because thermodynamics is only a reflection of the underlying statistical mechanics of the microstates. If you see something at the thermodynamic level, it should come from beneath, from the statistical mechanics, and ultimately from the measures. So equivalence should be the same at all three levels, and this is what I want to discuss. The punch line is this: short-range systems have equivalent ensembles for a very specific reason which I'll discuss, namely the concavity of the entropy, whereas long-range systems can have non-equivalent ensembles. I'll discuss which long-range systems have equivalent ensembles and which don't, and as I mentioned, it's all related to the concavity properties of the entropy. And this is the surprising bit. Most textbooks on statistical mechanics will tell you that the entropy is always concave and the free energy is always concave. That's actually not the case, and I'll show you some examples. The basic cartoon picture is this: short-range systems always have a concave entropy, whereas for long-range systems you can have, in the thermodynamic limit, an entropy as a function of energy with this shape, non-concave. And then, if I have time, I'll discuss the issue of finite systems actually developing a non-concave entropy. Before going there, I want to come back to something Julien mentioned about short-range systems having a concave entropy. He presented an argument based on phase separation to show that the entropy has to be concave.
This is an argument that was put forward already in the 60s and 70s by Ruelle, to prove that short-range systems with a certain class of interactions have a concave entropy. The idea, again, is that for short-range, essentially finite-range, interactions, each particle only feels a certain neighborhood around it, so you can divide the full system into more or less independent parts. You have extensive energy, but you also have subsystem separation: you can separate the energy into additive, more or less uncorrelated parts, and this leads to a concave entropy, as I'll explain on the next slide. Whereas if you have a long-range system, and what I have in mind by long-range is the first definition put forward by Julien, an interaction decaying with a power of the distance smaller than the dimension of the system, then the system feels the interaction over its whole length. You cannot subdivide it into independent parts: as you cut the system, there is a huge energy cost in that separation. The surface energy from the cut is as important as the bulk energy; it's not negligible as it is for short-range systems. And this is the source of the fact that the entropy can be non-concave: the argument Ruelle used to show that the entropy has to be concave just cannot be applied to long-range systems. So here is the argument. Look at the entropy, the usual logarithm of the number of microstates with a certain energy, and take the thermodynamic limit, so I'm considering the entropy per particle, the entropy density. This is assumed to be a well-defined function in the thermodynamic limit.
You have a thermodynamic limit. Now the argument for short-range systems is that you can cut your system into parts, for instance two parts, with the energy of the first part and the energy of the second part. The point is that for short-range interactions there is an interaction energy between the parts, but it's much smaller than the bulk energy, so you can neglect it, and you essentially have an additive system. For a perfect gas it's exactly additive; for short-range interactions it's more or less additive in the thermodynamic limit. So you have these two energy contributions, and you can use this in the counting of microstates to separate the density of states. What you get is the inequality here: the entropy of the combined system is greater than the sum of the entropies of the parts. That's coming, again, from the short-range interaction, and mathematically from what we can call a super-additivity argument for the entropy. And this is actually a mathematical statement of concavity. In the plot, the blue part on top is the entropy of the mixture, whereas the red is the mean of the entropies of the two parts, the mean entropy rather than the entropy of the mixture. So that's concavity. It applies to a large class of systems, under conditions in a class defined by Ruelle. And again, this separation doesn't work for long-range, and we'll see what happens in that case. The question, of course, is this: here the entropy is concave, but if you really look at the definition of the entropy, is there anything that says it has to be concave? The answer is no. For the free energy, on the other hand, from its definition via the partition function you can prove it is always concave, so there's no problem there.
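The counting step in this argument can be written out as follows (a sketch, suppressing the surface terms that short-rangeness lets us neglect):

```latex
\Omega_N\!\big(N\varepsilon\big) \;\ge\;
\Omega_{N/2}\!\big(\tfrac{N}{2}\varepsilon_1\big)\,
\Omega_{N/2}\!\big(\tfrac{N}{2}\varepsilon_2\big),
\qquad \varepsilon = \tfrac{\varepsilon_1 + \varepsilon_2}{2},
```

since one way to realize total energy Nε is to put energy (N/2)ε₁ in one half and (N/2)ε₂ in the other. Taking (1/N) log and the thermodynamic limit gives midpoint concavity,

```latex
s\!\left(\tfrac{\varepsilon_1+\varepsilon_2}{2}\right) \;\ge\;
\tfrac{1}{2}\,\big[s(\varepsilon_1) + s(\varepsilon_2)\big],
```

which, together with continuity of s, implies full concavity.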
But the entropy doesn't have that property, and for a long time I'd been trying to find a very simple system with a non-concave entropy. This is it. I think you're familiar with the plus-minus spin model: that's the Curie-Weiss-type spin model where there's no interaction, and if you just calculate the entropy you get the bell-shaped curve. This is a variation of that model where you take half of the spins to be independent and the other half to be completely dependent: if one is up, they're all up; if one is down, they're all down. If you do the entropy calculation for this, you get this shape, which is obviously non-concave. It's essentially twice the entropy of the independent block, translated. And you can make sense of that function very simply. What is the ground state of the system? All spins down; there's only one such state, so entropy zero. What is the maximally excited state? All spins up; again only one state. So you have those two points, the ground state and the excited state. But there's also a state of energy zero: half the spins down and half up, or vice versa. There are only two such states, so the entropy is also zero there, that middle point C. In between, you have a mixture of states where the independent spins give a nonzero entropy contribution, so the entropy has to be greater than zero over those two intervals. With these points alone you already know that the entropy is non-concave; but if you do the calculation, and it's really a back-of-the-envelope calculation, you get that curve. So this is a very simple toy model where you see that the entropy is not concave. Again, there's nothing in the definition of the entropy that says it has to be concave.
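That back-of-the-envelope calculation is easy to reproduce. Below is a sketch with my own parameterization (each spin contributes ±1 to the energy, so the energy per spin is u ∈ [-1, 1]; the coupled half contributes ±1/2 to u with a single state each, and the free half contributes the usual binary entropy):

```python
import numpy as np

def h(x):
    """Entropy per spin of independent +/-1 spins at magnetization x in [-1, 1]."""
    p = (1 + x) / 2
    with np.errstate(divide="ignore", invalid="ignore"):
        out = -(p * np.log(p) + (1 - p) * np.log(1 - p))
    return np.nan_to_num(out)  # h(+/-1) = 0

def s(u):
    """Entropy per spin of the half-free / half-coupled toy model."""
    u = np.asarray(u, dtype=float)
    best = np.full(u.shape, -np.inf)
    for c in (-0.5, 0.5):      # coupled block: all down (-1/2) or all up (+1/2)
        x = 2 * (u - c)        # magnetization of the free block
        cand = np.where(np.abs(x) <= 1, 0.5 * h(x), -np.inf)
        best = np.maximum(best, cand)
    return best

vals = s(np.array([-0.5, 0.0, 0.5]))
print(vals)  # two bumps of height (ln 2)/2 at u = -1/2, +1/2, but s(0) = 0
```

The middle point sits strictly below the chord between the two bumps, which is exactly the non-concavity described on the board.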
And then someone could say: okay, but what about physically, is this a physical model? To some extent, for mathematical purposes, or for going deeper into the theory, you don't need it to be: you only need a proof of concept that the entropy can be non-concave. Afterwards we can discuss what kind of interactions can actually produce such a non-concave entropy. In this case, the non-concavity is brought up by the block of spins that are all coupled together; that's the long-range interaction. The simplest way to put in a long-range interaction is all-to-all coupling, here on half of the spins only. Okay, so now I want to discuss the consequences of this. I'll start with the thermodynamic level, trying to relate thermodynamic functions that depend on different parameters. I'll take again micro-canonical and canonical, but the same discussion applies to canonical versus grand-canonical. The point is that you have one ensemble where a quantity is fixed, and another ensemble where that quantity is fixed only on average, via some conjugate parameter. Here the parameter conjugate to the energy is the temperature, but it could also be the density and the chemical potential, or the volume and the pressure, and so on. So everything I say about micro-canonical and canonical applies to your favorite pair of ensembles. The first case is a concave entropy. In this case, you have equivalence. What do I mean by equivalence? It means there's a one-to-one relationship between the entropy and the free energy, and that relationship, as you know, is the Legendre transform. So for a concave entropy we know we have equivalence, and what does that mean?
It means this. Here I have the micro-canonical picture with the entropy, and the other function, which I forgot to define, is my free energy: the thermodynamic limit of one over N times the log of the partition function. Up to a constant, that's the free energy F(T) you see defined in textbooks, with F = E - TS. Here I'm just changing the parameters a bit: the Legendre transform you see there is between the entropy as a function of the energy and my phi of beta, which is the free energy, up to some rescaling and a constant. The Legendre transform is that relationship: it's the statement that the free energy is the Legendre transform of the entropy, and, vice versa, the entropy is the Legendre transform of the free energy. Here it goes both ways because you have concavity of the entropy. As I said before, the free energy is always concave, so it's always true that the free energy is the Legendre transform of the entropy; there's no problem, you can always go from left to right, from micro-canonical to canonical. And here you can also go back to the entropy, because the entropy is concave, so it is in turn the Legendre transform of the free energy. In this case, what's encoded in the entropy is also encoded in the free energy: you have the same information, it's a one-to-one transform. It also means there's a one-to-one relationship between the energy and the temperature, and it's this: if I fix the energy, there's a temperature associated with it, the temperature that fixes the equilibrium energy of the canonical ensemble to that value. And that equilibrium energy is given by the Legendre transform.
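With φ(β) denoting the rescaled free energy of the talk, the two-way Legendre transform just described reads:

```latex
\varphi(\beta) \;=\; \inf_{u}\,\big[\beta u - s(u)\big],
\qquad
s(u) \;=\; \inf_{\beta}\,\big[\beta u - \varphi(\beta)\big],
\qquad
u^{\star}(\beta) \;=\; \varphi'(\beta),
```

where the first equality always holds, while the second (the way back) holds only when s is concave.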
It's given, in fact, by the derivative of the free energy: for a given temperature, the derivative of the free energy is the equilibrium energy. So in this case we say that we have thermodynamic equivalence of ensembles. The two thermodynamic potentials encode the same information about the thermodynamics, because you can go from one to the other; there's a one-to-one correspondence between the two descriptions in terms of the two different parameters. Essentially, by changing the temperature I can scan any energy of the system; I can pin the energy down to a very definite value with no fluctuations, so the system behaves just like a micro-canonical ensemble. That's the easy part, and that's essentially the picture that's been in textbooks, known since Gibbs, of what equivalence means. Now, the new thing that appeared is that as you study long-range systems, you have the possibility of a non-concave entropy, and you have to ask how the Legendre transform is affected by this. What happens is that the duality of the Legendre transform between the entropy and the free energy is broken, in the following way. First, suppose you have a system with this non-concave entropy. Again, the free energy is always concave, so you can always say that the free energy is given by the Legendre transform of the entropy; that's always true. What you cannot do, though, is take the Legendre transform of that free energy and get back the entropy: the Legendre transform only yields concave functions, so it cannot give you the non-concave entropy. What it gives you, mathematically, is the concave envelope of the entropy. And there's an easy argument to see this: suppose I have an entropy like this, and then another entropy that's more or less the same thing, except it looks like this.
They have the same concave envelope, but they differ in the non-concave part. If you take the Legendre transform of these two functions, function one and function two, you get the same Legendre transform, the same free energy, the one on the board. So one and two have the same free energy although the two entropies are different. There's something missing here: there's information in the micro-canonical picture, parameterized by the energy, that is not carried over by the Legendre transform to the free energy. Something is missing in the canonical ensemble, and this is the basis of non-equivalent ensembles. As I scan the energy, a lot is happening here, a lot is encoded in this part, which we call the non-concave part: the part of the entropy that doesn't coincide with its concave envelope. That part is lost in the Legendre transform. So if you want to describe your system as a function of temperature, something is missing: going back with the Legendre transform, you recover only the concave envelope. The picture you get by varying the energy shows you one thing; varying the temperature instead shows you something else. So the two ensembles cannot be equivalent, and since we're talking about the thermodynamics here, we say they are thermodynamically non-equivalent. Now, notice something I've put in the drawing: the free energy is non-differentiable, it has a corner. This is very important. Mathematically, you can show that if the entropy is non-concave, the Legendre transform gives you a function with a non-differentiable point, a corner. And a corner in the free energy means that its derivative jumps: the equilibrium energy jumps.
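This loss of information is easy to see numerically. Here is a sketch with a made-up non-concave entropy, two concave bumps as in the board drawing: the double Legendre-Fenchel transform returns the concave envelope, flat across the non-concave region, rather than s itself.

```python
import numpy as np

u = np.linspace(-1.0, 1.0, 2001)
# made-up non-concave entropy: two concave bumps peaking at u = -1/2 and +1/2
s = np.maximum(0.25 - (u + 0.5) ** 2, 0.25 - (u - 0.5) ** 2)

beta = np.linspace(-5.0, 5.0, 2001)
# phi(beta) = inf_u [beta*u - s(u)]  -- always well defined, always concave
phi = np.min(beta[:, None] * u[None, :] - s[None, :], axis=1)
# double transform: s**(u) = inf_beta [beta*u - phi(beta)]
s_env = np.min(u[:, None] * beta[None, :] - phi[None, :], axis=1)

i = 1000                # u = 0, middle of the non-concave region
print(s[i], s_env[i])   # the envelope sits strictly above s here
```

On the concave branches the double transform reproduces s, so the two entropies of the board argument (differing only in the non-concave part) would indeed give the same φ.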
It means that you have a phase transition: a discontinuous, first-order phase transition in the canonical ensemble. And this is exactly the physical source of the non-equivalence of ensembles. As I vary my temperature I see some energy, but as I cross that critical temperature, the energy jumps. So there's a whole range of energies that I could reach if I could fix the energy directly, but which the system jumps over when I fix the temperature; it never sees that energy region. That's the source of the non-equivalence: again, you're jumping over that region, so you have two different descriptions, one as a function of energy and one as a function of temperature. And this is very general; it's a general result of convex analysis that the Legendre transform of a non-concave function gives you a non-differentiable function. Now, yesterday there was also mention of systems having negative specific heat, or negative heat capacity, and this is related. If you read textbooks, they'll say: no, the heat capacity is always positive because it's the variance of the energy. That's true in the canonical ensemble. The definition of the heat capacity is the variation of energy with temperature; in the canonical ensemble the parameter you vary is the temperature, so the energy is the slave variable: as you change the temperature, you change the energy. In that case you can show that the heat capacity is indeed always positive, and it's related to the variance of the energy, because the free energy is always concave, so its second derivative is always negative and this quantity is always positive. In the micro-canonical ensemble, you see, you have to remember that what I change now is not the temperature but the energy, so the notion of heat capacity is slightly different.
The temperature is now given by the derivative of the entropy; that's your slave variable, and the parameter is the energy. So you have to invert the definition of the heat capacity, and in this case the heat capacity can be negative, because the sign of the second derivative of the entropy can be positive or negative, depending on whether the entropy is concave or non-concave there. So there's no contradiction in saying that the canonical heat capacity is always positive; actually, there's been a lot of noise in the literature about this. It's just that with two different ensembles you have to look at that quantity in two different ways, and those two ways are not the same in general. They are one-to-one related if the entropy is concave, but not if the entropy is non-concave. In fact, you can show that negative heat capacity is sufficient for non-equivalence: if you have negative heat capacity, the entropy has a non-concave region, so you must have non-equivalent ensembles. Okay, so that was the thermodynamic level. Now I want to go one level down. Suppose I don't care about my thermodynamics, but I'm actually measuring something in the lab as an equilibrium state, either as a function of energy or as a function of temperature. Will I see the same thing by varying these two parameters in two different experiments? Mathematically, the setup is that I have a certain macrostate, a function of the microscopic configuration. And again, it's very general: it's anything you want to study in your system. It could be the magnetization of your spin system, the velocity distribution of a gas, or a combination of different macrostates; it need not be a scalar, it could be a vector.
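In formulas, the two notions of heat capacity (per particle, in units with k_B = 1) compare as follows:

```latex
c_{\text{can}}(\beta) \;=\; \frac{\partial u^{\star}}{\partial T}
\;=\; \frac{\beta^{2}}{N}\,\mathrm{Var}(H_N) \;\ge\; 0,
\qquad\qquad
c_{\text{micro}}(u) \;=\; \left(\frac{\partial T}{\partial u}\right)^{-1}
\;=\; -\,\frac{[s'(u)]^{2}}{s''(u)},
```

using T(u) = 1/s'(u) in the micro-canonical case. So c_micro < 0 exactly where s''(u) > 0, i.e. in a non-concave region of the entropy, with no contradiction with the canonical variance formula.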
It could be a distribution. It just has to be a function of the microstate. And then what you want to calculate, of course, is the distribution of that macrostate. You have an underlying ensemble, a distribution over the microstates, and from it you compute the distribution of your macrostate. Starting from the micro-canonical distribution, you get the probability that your macrostate takes some value, parameterized by the energy; and you can do the same starting from the canonical ensemble, giving the canonical distribution of your macrostate. Now again, there are fluctuations, but what I'm interested in is the equilibrium state. So you have a concentration picture very similar to before, where the fluctuations shrink like one over the square root of the number of particles, or the volume, of your system. What you pick out is the most probable value, and that's the equilibrium state. And again, I can vary the parameter of my ensemble: I find the point where the probability is maximal, that's my equilibrium state, and that equilibrium state is parameterized by the parameter of my ensemble. In general there could be many of them, so you have a set of equilibrium states, parameterized either by the energy, or by the temperature or the inverse temperature beta. The question now is: what is the correspondence between these two sets? I have two sets now, so the notion of equivalence is very different. It's not a Legendre transform; I have to compare two sets labelled by two different parameters. I'm just summarizing the result here; I'll give you on the next slide an idea of where that result comes from.
But essentially the result is that you have macrostate equivalence under exactly the same condition as thermodynamic equivalence: concavity of the entropy. So if I pick a point, for instance in the blue region where the entropy is concave, where the envelope coincides with the entropy, then for each of those energies the set of equilibrium states is in full correspondence with the set for a given temperature. And the temperature to choose is exactly the one you would guess: the temperature given by the derivative of the entropy, the thermodynamic one. So thermodynamic equivalence implies macrostate equivalence, and vice versa. And the point, again, is that if you have a non-concave entropy, then at the energies where the entropy is non-concave I don't have thermodynamic equivalence, and I don't have macrostate equivalence either. There will be equilibrium states that I see as a function of energy but never as a function of temperature, and this is related, again, to the jump at the phase transition. And this is true for any macrostate: any macrostate that has equilibrium states will have this property. So this is very general, and it's something we can prove using large deviation theory. This is a bit abstract, so I'll show you an example. It's a very simple model: the mean-field Potts model with three states. The three-state mean-field Potts model has a non-concave entropy; for two states you have a concave entropy, but for three or more states it's non-concave. And this is a sketch of the entropy. Actually, the non-concave region is so thin that you wouldn't see it if I just plotted the entropy, so I'm exaggerating it here.
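Stated a bit more formally (this is essentially the large-deviation result of Ellis, Haven and Turkington that the speaker seems to be summarizing; ℰ^u and ℰ_β denote the sets of micro-canonical and canonical equilibrium macrostates):

```latex
s \text{ concave at } u:\qquad
\mathcal{E}^{\,u} \;\subseteq\; \mathcal{E}_{\beta(u)},
\quad \beta(u) = s'(u)
\quad \text{(with equality under strict concavity)};
```

```latex
s \text{ non-concave at } u:\qquad
\mathcal{E}^{\,u} \cap \mathcal{E}_{\beta} \;=\; \varnothing
\quad \text{for every } \beta,
```

so the micro-canonical equilibrium states at such energies are never seen canonically, at any temperature.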
I'm showing the non-concave region quite big here, but it's non-concave, so there's no equivalence. How do I see this? Here I have three colors, three states, so I'll look at the distribution of spins: the fraction of spins in state one, in state two, and in state three. And I know that the equilibrium state has the symmetry (a, b, b), so the macrostate can just be the fraction of ones, essentially: the number of spins in state one over the total number of spins. It's a very simple macrostate, and I can calculate it as a function of the energy or as a function of temperature, using the two different ensembles, without assuming anything about equivalence between them. And here are the results. On this plot, in gray, you see the variation of that fraction over the energy range of the model. You see that it's continuous: as I vary the energy, a goes from, I think it's one third, all the way down to zero. There's no phase transition, nothing really happening; the fraction just varies continuously. If I vary the temperature instead, what I see is that at high temperature it's essentially constant, then I reach a critical temperature and the fraction jumps, and then it varies continuously again. If I take that plot now and put it in correspondence with the previous one, it's the black line here, and you see there's something missing: as I vary the temperature, I'm not getting all the equilibrium states that I see as a function of energy. This is the jump associated with the phase transition, the first-order jump you see here. There's something I don't see because I vary the temperature, and this is macrostate non-equivalence.
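The canonical jump can be seen with a minimal numerical sketch (my own parameterization, not the speaker's calculation): for the q = 3 mean-field Potts model with H = -(1/2N) Σ_{i,j} δ(σᵢ, σⱼ) and the symmetric ansatz (a, b, b), minimize the canonical free-energy functional over a. The minimizer sits at a = 1/3 above the transition temperature and jumps to an ordered value below it (the first-order point is at β_c = 4 ln 2 ≈ 2.77 for q = 3).

```python
import numpy as np

def a_star(beta):
    """Canonical equilibrium fraction of spins in state 1, ansatz (a, b, b)."""
    a = np.linspace(1e-3, 1 - 1e-3, 4000)
    b = (1 - a) / 2
    u = -0.5 * (a**2 + 2 * b**2)                    # energy per spin
    entropy = -(a * np.log(a) + 2 * b * np.log(b))  # entropy per spin
    f = u - entropy / beta                          # free energy per spin
    return a[np.argmin(f)]                          # global minimizer

print(a_star(2.0))  # high temperature (beta < beta_c): disordered, a = 1/3
print(a_star(4.0))  # low temperature (beta > beta_c): ordered, a jumps well above 1/3
```

The fraction a never takes values in between the two branches as β is varied, which is exactly the range of equilibrium states visible only in the micro-canonical ensemble.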
Okay, so this is a one-dimensional picture of something you can also see in much higher dimension. If your macrostate is a vector, you'll see this too: as you vary the temperature, there'll be a jump. It's a generic jump, because it's related to a generic first-order phase transition, and that jump skips over something that you do see as a function of energy. Okay. So before I go to the lowest level, the level of the measures themselves, I just want to give you an idea of why we have this: why do we have macrostate equivalence exactly where we have thermodynamic equivalence? The idea, if I need the brush, maybe I can do it here, is that essentially the canonical ensemble is a mixture of microcanonical ensembles. What do I mean by this? If you look at the distribution here, you see that the canonical ensemble has many energies; the subsystem can exchange energy, so you have energy fluctuations. But the point is that if you look at a specific energy, the system behaves like the microcanonical ensemble. And how do I know this? I know this if I take the canonical distribution, the Gibbs distribution, and I condition on the energy. If I condition on the states having a fixed energy, you see that all those states will have the same probability, because I'm fixing the energy here. So conditionally on the energy, the canonical is just like the microcanonical: it has constant weight on all the states with that fixed energy. So now I can use a mixture: for any probability of a microstate, I can decompose it using Bayes' rule by conditioning on the energy, and here I get the energy distribution under the canonical ensemble, and then I integrate over all energies. I'm just separating, using Bayes' rule, one probability into a bunch of conditional probabilities.
But you see now, once I fix the energy it's the same as the microcanonical. So what I really have is a bunch of microcanonical probabilities, each weighted by the probability of seeing that energy. Okay, so the canonical is a mixture of microcanonicals. So where is this mixture going to concentrate? It concentrates where the microcanonical concentrates, but also where the canonical distribution of the energy concentrates. And the concavity enters exactly in that second probability: it's the concavity of the entropy that determines where the energy distribution concentrates. So the integral will concentrate on one point, and that will be the concentration point of your canonical ensemble. Okay, that's basically why the thermodynamic level really determines the macrostate level. So now I'll finish with the last level, which is a much more refined level: I want to discuss in what sense the two measures themselves are similar, the equivalence of ensembles at the measure level, the microscopic level. It's totally clear that the two measures are different; they cannot be the same and they will never be the same. One is a distribution that puts weight on configurations with many different energies, whereas the other puts weight only on the configurations having a fixed energy. So they're two different distributions. Yet, in the thermodynamic limit, something happens: one measure becomes more and more like the other, and this is what I want to discuss. If you can make sense of this, then this is the ultimate equivalence you can discuss, because if one measure resembles, in some limit, the other one, then all the properties above, the macrostate equilibrium states and the thermodynamics, have to follow from the equivalence of the underlying measures.
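The mixture argument above can be written in one line. In notation I'm adapting from the talk (h_n = H_n/n is the energy per particle, P_β the canonical measure, P^u the microcanonical one at energy u; the large-deviation weight is only sketched, up to normalization):

```latex
P_\beta(\omega)
  \;=\; \int P_\beta\bigl(\omega \,\big|\, h_n = u\bigr)\, P_\beta(h_n \in \mathrm{d}u)
  \;=\; \int P^u(\omega)\, P_\beta(h_n \in \mathrm{d}u),
\qquad
P_\beta(h_n \in \mathrm{d}u) \;\asymp\; e^{\,n\,[\,s(u) - \beta u\,]}\,\mathrm{d}u .
```

The weight concentrates on the maximizers of s(u) - βu, and a given energy u can be selected this way only if s has a supporting line of slope β at u, which is exactly concavity of the entropy at u. This is the reason, in this sketch, that the canonical ensemble misses the non-concave energies.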
Okay, and this is the ultimate level, which we've been discussing much more recently; it wasn't really discussed before. So again, you have these two ensembles, and what you can prove is that they become similar, but on a specific scale: the log scale. You have to define a sort of distance, and this came up a bit yesterday when Julien was talking about Vlasov dynamics: you want a distance between two distributions. Here the distance is this log distance; you can also define a distance using the relative entropy, but I'm going to discuss this one. So the point is that, all right, the distributions are different, but if you take the ratio, take the log, and divide by n, then in the case of equivalence that limit goes to zero. If you take energies in the non-concave region instead, that limit does not go to zero, okay? And it's in that sense that the two distributions look the same with respect to that distance. What does it mean concretely, if you look at that limit? I'm basically defining equivalence on a log scale in the number of particles. I'm saying the two distributions are the same up to exponential correction factors whose exponent is small o of n, sub-linear in n. So the distributions are not the same, but they're the same up to corrections that are sub-exponential in this sense, and this is why, when I take the log and divide by n, those corrections disappear. That's the distance I'm taking. Again, they cannot be exactly the same; you'll never have equality between this one and this one in any limit. But with that notion of distance, that notion of equivalence, you get this. We call this the asymptotic equivalence of measures, or the log equivalence of measures, and it's very important.
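To make the log-scale distance concrete, here is a toy check of my own (not from the talk): n noninteracting ±1 spins, microcanonical at fixed magnetization m versus canonical at the matching inverse temperature β = arctanh(m). For a typical configuration, the per-spin log-ratio of the two probabilities shrinks like (log n)/n, so in the sense of the definition above the two measures become equivalent even though they are never equal.

```python
import math

def per_spin_log_ratio(n, m):
    """(1/n) * log[ P^m(omega) / P_beta(omega) ] for one configuration omega
    with k = n(1+m)/2 up spins; beta = arctanh(m) matches the magnetization."""
    k = round(n * (1 + m) / 2)
    beta = math.atanh(m)
    # microcanonical: uniform on the C(n, k) configurations with k up spins
    log_micro = -(math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1))
    # canonical: P_beta(omega) = exp(beta * sum_i sigma_i) / (2 cosh beta)^n
    log_canon = beta * (2 * k - n) - n * math.log(2.0 * math.cosh(beta))
    return (log_micro - log_canon) / n

for n in (100, 10_000, 1_000_000):
    print(n, per_spin_log_ratio(n, 0.4))   # shrinks toward zero as n grows
```

Here the entropy is strictly concave (no interactions), so the log-ratio vanishes in the limit; the non-zero correction at finite n is exactly the sub-linear term the definition divides away.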
In information theory, this is called the asymptotic equipartition property: the fact that for systems with different distributions, as you take bigger and bigger systems, the distributions look more or less the same, but not quite, and "not quite" means exactly this. And here the equivalence, again, is controlled exactly by the concavity of the entropy: where the entropy is non-concave, the two distributions are not similar, and where the entropy is concave, they are. Now, there's something I haven't mentioned, which is that this limit is actually a function of the microstate you take. So this is really a probabilistic statement; it has to be. It's true for almost all microstates: there can be microstates that don't satisfy this even in the concave region, but it holds for almost all microstates with respect to the measure of the ensemble. Okay, and again, I just want to emphasize that this is really the deepest level of equivalence you can define: if you can show that the two measures are similar, then anything above them, calculated with these two measures, will also be similar; in fact, it will be equivalent under the same conditions. Okay, so I'll finish by presenting some examples and then discussing some more general points. Here I'm putting just a partial list; I haven't had time to draw up a list of all the systems that we know are long-range and have equivalent or non-equivalent ensembles. For teaching purposes, the simplest model with non-equivalent ensembles is the three-state mean-field Potts model, which I've shown before. There's also the Blume-Emery-Griffiths model, another three-color spin model with non-equivalent ensembles, which was studied by Julien, by David Mukamel, and by Stefano Ruffo.
And then you have various other long-range models with non-equivalent ensembles, for instance self-gravitating systems in 3D, where you regularize the potential and confine the system; in some range of energies you then have this non-concave entropy. In turbulence, various point-vortex models will also have non-concave entropies, as will models of geostrophic turbulence; there are also confined plasmas, which were studied by Michael Kiessling. And then you have other models with long-range interactions that nevertheless have equivalent ensembles, for instance the HMF model, the Hamiltonian Mean-Field model that was mentioned yesterday. So to have non-equivalence you need long range, but long range is not enough, okay? It's a necessary condition for having non-equivalent ensembles, but not a sufficient one, and this we know. Now, there are also other ensembles you can consider. I've been discussing microcanonical and canonical, but you can go to the grand canonical, to isobaric ensembles, or to an ensemble where the magnetization is fixed versus one where the magnetic field is fixed; these would be conjugate pairs in the same sense as microcanonical and canonical. But there are also less common pairs, like this one, from stretching experiments on DNA. What people do is fix one end of the DNA strand and do the statistical mechanics of stretching it: either they impose a certain elongation and measure the force it produces, or they fix the force and watch how much the DNA elongates, in which case the length is random. So you have two ensembles: either the force is held constant and you see how the length changes, or the elongation is held constant and you see what force the DNA strand applies, okay?
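The fixed-elongation and fixed-force descriptions can be put in the same conjugate form as energy and temperature. In notation I'm introducing here (not from the talk; signs and conventions are a sketch only), let φ(L) be the free energy per monomer of the fixed-length ensemble; the fixed-force ensemble then only ever sees its Legendre-Fenchel transform:

```latex
g(f) \;=\; \inf_{L}\,\bigl[\varphi(L) - f L\bigr].
```

Mirroring the entropy case, the two ensembles are equivalent at an elongation L exactly when φ has a supporting line there, i.e. where φ coincides with its convex envelope; where it doesn't, some elongations are invisible at fixed force.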
And then you have two ensembles, and it's not clear that they should be equivalent. In this case you have the same notion of equivalence, by looking at the concavity of the thermodynamic potential associated with the fixed-length ensemble, okay? It's always the concavity of the potential associated with the fixed-constraint ensemble that determines whether your ensemble is equivalent to the other one. Here it's a bit funny to define a thermodynamic limit, but you can define a sort of equivalence for finite-size systems, and that's been discussed a lot in these papers. More recently, people have looked at other objects, comparing a measure with a hard constraint to a measure where that constraint is only fixed on average. One example is random graphs: you generate graphs randomly, either with a fixed number of links, say, or with that number of links fixed only on average; or with a fixed degree sequence versus another ensemble whose distribution gives you that degree sequence on average. So you can again imagine microcanonical ensembles of graphs and canonical ensembles of graphs, and that's been done a lot in the context of Erdős-Rényi graphs, but also scale-free and small-world graphs: two different measures, one with a hard constraint and one where the constraint is fixed on average through a conjugate parameter, like a temperature for the graph. And again, the equivalence of these ensembles works exactly as I discussed before: you define a thermodynamic potential, the function associated with the fixed constraint, and it has to be concave in order to have equivalence.
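The graph version of log equivalence can be checked in the same toy spirit (my own example and conventions, not from the papers mentioned): compare the "microcanonical" ensemble G(n, m), uniform on graphs with exactly m links, with the "canonical" ensemble G(n, p) at the matching link probability p = m/M, where M = n(n-1)/2 is the number of possible links, and evaluate the per-link log-ratio of the probabilities the two ensembles assign to one graph with m links.

```python
import math

def per_link_log_ratio(n, density):
    """(1/M) * log[ P_Gnm(G) / P_Gnp(G) ] for one graph G with m links,
    where M = n(n-1)/2, m = round(density * M), and p = m / M."""
    M = n * (n - 1) // 2
    m = round(density * M)
    p = m / M
    # microcanonical: uniform on the C(M, m) graphs with exactly m links
    log_micro = -(math.lgamma(M + 1) - math.lgamma(m + 1) - math.lgamma(M - m + 1))
    # canonical: each of the M possible links is present independently with prob. p
    log_canon = m * math.log(p) + (M - m) * math.log(1 - p)
    return (log_micro - log_canon) / M

for n in (10, 100, 1000):
    print(n, per_link_log_ratio(n, 0.3))   # shrinks as the graph grows
```

With this single extensive constraint the potential is concave and the ratio vanishes; the non-equivalence found for graphs shows up only once you impose many more constraints, as discussed next.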
That's something that was discussed recently by a group in Leiden, Squartini and collaborators, in a recent PRL, and what they find is non-equivalence for graphs, but not really related to the long-rangedness of any interaction: they get non-equivalence if you impose too many constraints. Okay, so the idea here, and it actually brings something new to what we've been discussing in the field of long-range systems, is that you can have different ensembles and fix so many constraints that the constrained system is no longer equivalent to the one that fixes those constraints on average. Essentially, if the number of constraints grows extensively with the number of components, the number of particles, then you can also have non-equivalence. So non-equivalence is not related per se to having long-range interactions; it can also come from having constraints like this, and what kinds of constraints can lead to non-equivalent ensembles is something people are trying to work out now. Okay, this last example brings me to my conclusion, and for the conclusion I'd like to put the problem in a much more general way, just to give you a feeling that this idea of equivalence is actually not tied to physics at all. It's a mathematical question, and it's the kind of game I try to play all the time: to separate the physics from the mathematics. In this problem, the equivalence itself has nothing to do with the physics; the physics comes later, when you ask what kinds of interactions lead to a non-concave entropy. The problem of equivalence as such is a problem of measures, a problem of probability: you have a system, you describe it using two different probability distributions, and you ask how similar the two descriptions are.
And this is something that's been studied a bit in mathematics, and we're trying to continue it now. Suppose you have some kind of construction: it could be a statistical-mechanical system, it could be a graph. You want a distribution on that construction where you fix something: the energy, the number of links, some constraint. This is like the microcanonical ensemble; you have a hard constraint, so it's a conditional distribution. Now you say, okay, this is a bit difficult to treat; I'd much prefer a distribution without a constraint. So instead you fix the constraint on average, by choosing a distribution with a certain parameter. Now, it so happens that in physics we like exponential distributions, we like the Gibbs distribution; there's a physical basis for this, so we've been using them a lot. And the question in this case is: what is the relation between that exponential distribution and the conditional distribution I started from? This is the equivalence question, but as a mathematical question. You can pose it either at the level of concentration points or at the level of measures, and I've shown you that the right notion to use is log equivalence: depending on the concavity of a certain function associated with these distributions, you're going to have log equivalence of the measures under some conditions. But now you can go deeper. Do I need to use the exponential distribution? I don't. I could use any other kind of distribution and ask the same question: I have a conditional distribution, and I'm comparing it with something else, any distribution. For instance, here I'm writing what we call a generalized canonical distribution, where you have the exponential form but with an arbitrary function of your Hamiltonian.
Is this distribution now equivalent to the microcanonical one, the constrained one? The answer is yes, under some conditions, and there are conditions, again related to concavity, that tell you whether this distribution is equivalent to your conditioning. The question is always one of equivalence between conditioning and something that is not conditioned. And again, these distributions cannot be exactly the same, but there's a sense, a scale, at which they look the same, or, under other conditions, at which they're not similar at all and give you different descriptions of your system. In fact, I could put here any Q: I don't even have to use a distribution of exponential form. I just put a distribution Q here and ask: is this Q equivalent to the initial conditional distribution? And this is the bare mathematical problem. In many cases there's actually an infinite number of Q distributions that are equivalent to your conditioning, and this is quite amazing when you think about it. We think the canonical is so special: it's the one with a fixed temperature, and by fixing the temperature you fix the energy. But in fact there's an infinite number of ensembles equivalent to the microcanonical ensemble, an infinite number of different physical or unphysical descriptions that give you exactly the same thing as fixing the energy of your system. And this I find quite amazing. But deep down it's related to concentration, so you have to take a limit. You don't have that equivalence at fixed n: for a fixed system size, of course, the two distributions give you something different.
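The generalized canonical distribution just mentioned can be written out. In notation I'm adding here, and with the equivalence condition stated only as I recall it from the generalized-ensemble literature (treat it as a sketch, not a theorem statement):

```latex
P_{g}(\omega) \;\propto\; e^{-n\, g\bigl(h_n(\omega)\bigr)} .
```

For g(h) = βh this is the usual canonical ensemble. The equivalence with the microcanonical ensemble at energy u is then governed, in the same spirit as before, by whether the modified entropy s(u) - g(u) admits a supporting line at u. When g is linear this reduces to the ordinary concavity condition on s, and a nonlinear g (a quadratic one gives the so-called Gaussian ensemble) can restore equivalence at energies where s itself is non-concave.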
It's only when you take the thermodynamic limit, as your system gets bigger and bigger, that the two descriptions look more and more the same, and they become the same, in that sense, in the thermodynamic limit proper. Okay, so with this I'll just finish. That was the mathematics; I'm quite fond of mathematics, and if you know me you won't be surprised, but I want to finish with the more physical questions, which are things we should discuss. We've been trying to think about these questions for a very long time and to make sense of the answers. We have some partial answers, but no definitive ones, and it would be nice if you'd like to think about them too. We have different people here, people we've never seen before at long-range conferences, at least the ones we organize; maybe you had your own long-range conferences, but there seems to be more overlap now with new people, so maybe we can get new input on the more physical questions. Here I'm listing three; there might be more, and I'd like to discuss more if you have them. The first important one is the class of interactions that gives you non-equivalent ensembles. What we know from the 60s and 70s, from the work of Lanford and Ruelle, is that they defined a class of interactions, which they call stable, tempered, short-range interactions, including finite-range interactions, for which they could prove that the entropy is concave, so you have equivalence, fine. Can we do something similar for long range? Or can we do something similar for the number of constraints: what kinds of constraints, how many constraints, what kinds of interactions can we pin down precisely, so that we can say, okay, non-concave entropy, yes, and non-equivalent ensembles? So again, here we don't have any answers, only a clue.
The only clue that we have is that, if you look only at the interaction, it has to be long range, but that's not enough. Even within a fixed model, for some parameter values the entropy is concave and for others it's non-concave, okay? So it's not just a class of Hamiltonians per se; it's also the values of the parameters you put in your Hamiltonian that determine equivalence or non-equivalence. And fundamentally this is related to first-order phase transitions, because I've shown that non-concavity translates into a first-order phase transition. So the question I'm asking is: what class of interactions gives you a first-order phase transition? You have to go back to the models you know that have first-order phase transitions and try to make sense of them, to see whether you can define a specific class that gives you these non-concave entropies. And then the last question is quite important, and we're still working on it; I don't know whether Julien will be talking about this, I don't know, he's finished already. Can you see something experimentally? Can you measure something that's non-concave? Can you actually see experimentally that two ensembles are different? We can see this in simulations, we can see it in calculations. You'd say, okay, gravity, we know it may have non-equivalent ensembles, but that's not accessible: you cannot control gravity, you cannot control galaxies. What would be quite nice is a sort of in-house experiment where you can do both experiments, fixed energy and fixed temperature, and see something non-equivalent. And again, this is related to phase transitions: you have to play with a system that has a first-order phase transition. And there are some proposals floating around now about using laser traps to do this.
I think the idea is that in 1D you have equivalence for laser traps, in 2D maybe non-equivalence, and in 3D you would have non-equivalence for sure, but it's quite difficult to manipulate these traps; I think it's ongoing work. But it's really something important that we need for the field, okay? We have the theory, we have the examples, we have lots of calculations, and what remains to be done, one big challenge, is to see this experimentally, and to see it convincingly. Okay, I'll end the talk on this. Thank you.