Good morning. Oh, excuse me. Okay, everybody take your seat. Just one announcement: as written on the program, this evening we have the social dinner, which will be up here on the terrace where we have the coffee breaks. Hopefully there will be plenty of food; we planned for that. So be around after the end of the last lecture today. This morning we have Werner Krauth's third lecture on introduction to Monte Carlo, then again we have a coffee break, and then the first lecture by Deepak Dhar. Okay, it's all yours. So, good morning again, and bravo, as we say in French, for having had the strength to come this morning. Today I have a very tight program, so there will be no jokes and I will go straight through. I will be much more serious than on the first days, because I have no time to lose. We will have two parts. The first part is exploring detailed balance and global balance, and as yesterday, I give you further reading. For this first part there is the original paper by Metropolis et al. (1953), a very insightful paper; we will go through some aspects of it that are very little known. Then there are two recent papers of mine that you can read through if you have lost contact a little bit with what we are doing.
The pioneers in this field of exploring global balance are the mathematicians and statisticians Diaconis, Holmes, and Neal, who wrote a very interesting article in a mathematics journal. I would not have put it here if I did not think it is accessible to everybody. We will touch on their idea, the lifting idea, which has considerably renewed first the theory and now the practice of Monte Carlo calculations. In the second part I will discuss the Metropolis-Hastings algorithm on our little example of computing pi; I call it the triangle algorithm. And if I have time, I will go through cluster algorithms; there are two sections on this in my book, which I mention just so that you can read there if you get lost. The original paper where I hope to get today is by Ulli Wolff (1989). It is one of those papers of which I still have the visual impression: in 1989 we had preprints, I picked up this preprint, and I was just amazed by it, as I have been by the other articles, for a long time. That is also how you recognize a really fantastic paper: maybe not in this situation, but a long time later, you still remember where you were standing when you first read it. Anyway, that is the way with me; maybe you are different. So let us explore detailed balance and global balance. I will have a very slight overlap with Professor Deepak Dhar, who will speak about hard-core models; we checked yesterday that the overlap is finite, but not too large. So I will discuss hard-sphere systems.
But before this, let us remember that the detailed balance condition is

pi(C) P(C -> C') = pi(C') P(C' -> C).

In other words, the flow from C to C' has to be equal to the flow from C' to C. The global balance condition, which I put here again, is

pi(C) = sum over all C' (including C itself) of P(C' -> C) pi(C').

Each term P(C' -> C) pi(C') is the flow from C' into C, and the sum of these terms is the total flow into C. This total flow into C must be equal to pi(C). This is the condition we have to check in order to be sure that the global balance condition is satisfied. If the global balance condition is satisfied, and if we have aperiodicity and irreducibility, then we know that the algorithm is mixing and will converge towards the probability distribution pi. The model I will look at now is the one-dimensional version of the original model that was treated in the paper by Metropolis et al. (1953). It is the model of hard spheres, but I only do it in one dimension: N hard spheres of radius sigma on a ring of length L, with periodic boundary conditions, and each legal configuration is equally likely. We can write this as a partition function, which is an integral:

Z = integral dx_1 ... dx_N of pi(x_1, ..., x_N),

where pi(x_1, ..., x_N) is equal to 1 if we have a legal configuration and 0 if we have an overlap. So there will be a little question about permutations.
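To make the weight pi concrete, here is a minimal sketch in Python (the function and variable names are mine, not from the lecture): the sphere centres are assumed to be stored in cyclic order around the ring, and pi equals 1 exactly when every cyclic gap between neighbours is at least 2 sigma.

```python
def is_legal(x, sigma, L):
    """Indicator pi(x_1, ..., x_N) for hard spheres of radius sigma on a ring.

    x is a list of sphere centres in cyclic order on a ring of length L
    (at least two spheres assumed). Neighbouring centres, including the
    pair that wraps around the ring, must be at least 2*sigma apart.
    Returns True (pi = 1) for a legal configuration, False (pi = 0) otherwise.
    """
    n = len(x)
    for i in range(n):
        gap = (x[(i + 1) % n] - x[i]) % L   # cyclic distance to the right neighbour
        if gap < 2.0 * sigma:
            return False
    return True
```

With three spheres of radius 0.4 spread evenly on a ring of length 3, every gap is 1, so the configuration is legal; squeezing two centres to distance 0.5 creates an overlap.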
I just put the word "permutations" here. The question is whether the labels of particles 1, 2, 3, ..., N matter. There are two versions of this model: one where I demand that the labels also get mixed up, and one where I do not care about labels. Fundamentally, I will mostly not care about labels, and I will not treat the whole system in all its complexity; these are not even approximations, just simplifications. In particular, I am not so interested in permutations. So, believe it or not, on these two pages I have nine algorithms. Last night I decided that I will only speak about seven of them. This was a very difficult decision, but too much would simply be too much, so I will have the pleasure of presenting you seven algorithms: three of them really old, and four of them from last year, which still amazes me, because this is so simple. Algorithm number one. I told you yesterday and the day before that the Metropolis algorithm is just one algorithm to enforce the detailed balance condition; the other algorithm is called the heat bath algorithm. To follow up on this, I will explain the heat bath algorithm first, and it is the following. I have a configuration at time t, and I just draw part of it: particle i sitting somewhere between its neighbours i - 1 and i + 1. What I do is simply take out particle i, as an intermediate step, even though this costs me thirty seconds, and put it back between i - 1 and i + 1. So I close my eyes.
I put it back at a random position between i - 1 and i + 1. So the algorithm is: sample i, that is, choose i randomly between 1 and N; then replace x_i with a new x_i drawn uniformly between x_{i-1} + 2 sigma and x_{i+1} - 2 sigma. All right, let us check that this algorithm satisfies the detailed balance condition. I am not really committed to writing all of it out, so I just say: replace x_i somewhere. Now, this algorithm satisfies detailed balance: with C the configuration at t and C' the configuration at t + 1, evidently pi(C) = 1 and pi(C') = 1, so in order to satisfy detailed balance, the probability to go from C to C' must equal the probability to go from C' to C. And you realize that this is trivially satisfied: I can erase particle i from either configuration and I reach the same intermediate state, and both reinsertion probabilities are the same. So detailed balance is okay; it is satisfied. Now we can also check irreducibility, and I want to give you an example of how you check it. You start from one initial condition, for example a compact one where the particles 1, 2, ... are packed together, and you convince yourself that from this initial state you can reach any other configuration by a sequence of well-engineered moves. Since we satisfy detailed balance, you can also go back from any configuration to this one. So from any configuration C you can get to any other configuration C' indirectly: from C to the compact state and from there to C'. (I should have drawn the same number of particles.) So there is a sequence of moves that takes you from C to C', and irreducibility is okay. The algorithm is also aperiodic, because it has a finite probability of leaving the particle where it was; so aperiodicity is okay too. These two parts are kind of boring. Irreducibility: that was a question earlier today. In much of the physics literature, what should be called irreducibility and aperiodicity is called ergodicity, and this creates a lot of confusion, because ergodicity is something else. What I am trying to do is sample the probability distribution pi(x_1, x_2, ..., x_N): I am trying to reach configurations such that the probability of any legal configuration (x_1, ..., x_N) is the same. So thanks for the question. This is what we want to do today. Two days ago we had the partition function. Yesterday and the day before we discussed two types of algorithms: direct sampling, where we simply threw pebbles into the square, so simply particles in a square; and now I have a related problem. It is true that for this particular case there is a direct sampling algorithm; I can show it to you in two minutes, and I will mention it, but I will not go through it, because direct sampling is the complete exception. We also discussed Markov chain algorithms, and now we want to do the same here.
We want to discuss Markov chain algorithms that sample this distribution, and this integral, from any initial configuration, and at the end of this hour I will tell you about mixing times: how long it takes to go from an arbitrary initial configuration to equilibrium. One such algorithm: you start with an initial configuration, you replace particle i, then particle j, then particle k, and so on; and if you do this long enough, the theorems that have been proven show that the configuration you see is a configuration taken from the equilibrium distribution. Thanks again for the question. Yes, as I said, I am not so much interested in the permutation problem: if I do not allow a particle to hop from here to somewhere far away, then of course the permutations do not get mixed. I consider local algorithms; this is what I am really interested in, because this is what I can later take to higher dimensions. I am really interested in having a toy model for higher-dimensional algorithms, so I study local algorithms. Let me put it here: this is the local heat bath algorithm, and even though the local heat bath algorithm does not mix the permutations, it will get the configurations into equilibrium. So this one samples this distribution up to relabeling of particles. Let me give you one little theorem, if you want, which may be related to what Professor Dhar will mention later.
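The local heat-bath move just described can be sketched in a few lines (a sketch with my own naming conventions, assuming the centres are kept in cyclic order): take out a random sphere and reinsert it uniformly in the free interval between its two neighbours.

```python
import random

def heat_bath_step(x, sigma, L):
    """One local heat-bath move for hard spheres of radius sigma on a ring.

    Pick a random sphere i, remove it, and put it back at a uniform random
    position in the free interval between its neighbours i-1 and i+1.
    x is a list of centres in cyclic order on a ring of length L; the
    cyclic order is preserved because the new position lies between the
    same two neighbours. Modifies x in place and returns it.
    """
    n = len(x)
    i = random.randrange(n)
    left = x[(i - 1) % n]
    right = x[(i + 1) % n]
    width = (left_to_right := (right - left) % L)   # arc from left to right neighbour
    # the new centre must keep a distance 2*sigma from each neighbour
    x[i] = (left + 2.0 * sigma + random.uniform(0.0, width - 4.0 * sigma)) % L
    return x
```

Running many such steps from a legal configuration keeps every cyclic gap at least 2 sigma, which is exactly the invariant the detailed-balance argument above relies on.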
So here I have a configuration of hard spheres of diameter 2 sigma on a ring of length L. If you watch carefully: I take this interval, which makes three centimetres, now 4.5 centimetres, I put it here, this makes 2.7 centimetres, and so on; in other words, I take out of each sphere its diameter, and I obtain an equivalent configuration of point particles on a ring of which length? (Speak up, please.) Of length L' = L - 2 N sigma. These two are equivalent, and now you see that I can run the heat bath algorithm on this configuration or on that one, and they are just the same: a Markov chain Monte Carlo algorithm on the hard-sphere configuration is the same as the same Markov chain Monte Carlo algorithm, with the same random numbers, on the point-particle configuration. What this shows is that this is simply point particles on a ring, and this can be used to show that the partition function of this system is in fact

Z = (L - 2 N sigma)^N / N!,

on the left and on the right. Even though I do not want to go into detail, the fact that we can compute the partition function of this system exactly tells me that there must be a direct sampling algorithm, and it is very easy to find.
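The mapping to point particles immediately suggests such a direct sampler, which I sketch here (my own sketch, not code from the lecture; the overall rotation of the ring and the labelling are not symmetrized): drop N uniform points on the shrunken ring of length L' = L - 2 N sigma, sort them, and re-inflate each point back to a sphere.

```python
import random

def direct_sample_hard_rods(n, sigma, L):
    """Direct-sample a hard-sphere configuration on a ring of length L.

    Uses the mapping to point particles on a ring of length
    L' = L - 2*n*sigma: draw n uniform points on [0, L'), sort them,
    then re-inflate the k-th point to a sphere by shifting it by
    2*sigma*k. Returns a list of centres in cyclic order.
    """
    L_prime = L - 2.0 * n * sigma
    assert L_prime > 0.0, "packing too dense for this ring"
    points = sorted(random.uniform(0.0, L_prime) for _ in range(n))
    return [(points[k] + 2.0 * sigma * k) % L for k in range(n)]
```

Every sample produced this way is legal by construction: consecutive gaps are the point-particle gaps plus 2 sigma.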
This step-by-step equivalence with point particles on a ring also implies immediately that this system cannot have a phase transition at any finite packing. All right, so the second algorithm on my list of seven is the Metropolis algorithm. Yes? Well, to give a simple answer: in the heat bath algorithm I freeze N - 1 degrees of freedom; then I have a single-particle problem, and this single-particle problem can be solved by direct sampling. That is the general point: I take my whole configuration and freeze the positions of all particles except i, and then I solve the problem of how to equilibrate this one particle in the environment of all the other particles. If you have interactions, you simply have to solve this problem for a single particle, and usually you can solve it. Yes, it is called "heat bath" because it corresponds to equilibrating the single particle in the bath of all the other ones. Yes, it is just a name: you could imagine that you fix the N - 1 particles and then little demons push the single particle around for a long time; after a while it ends up at a random position. All right, so now the Metropolis algorithm. I have particles i - 1, i, i + 1, or other labels. It consists in the following: I sample particle i. This word "sampling", I must admit, took me two years to understand.
I had a famous teacher, David Ceperley, really the king of quantum Monte Carlo algorithms, and I spent years with him. He always said "you sample this" and "you sample that", and I never understood what it actually meant. Now I do the same thing. "Sample i" simply means: you choose a random particle i between 1 and N, taken from the uniform distribution on the elements 1 to N. Then you move particle i: x_i goes to x_i plus or minus epsilon, where epsilon is the displacement. In fact we called it delta before, but now let me use epsilon; epsilon is drawn from some distribution p(epsilon), normalized to one. Then you do one more thing, or rather, there are two rules for the trial move: you do not allow overlaps, and you do not allow jumps. What this means: you take the particle, you try to move it by plus epsilon or minus epsilon, and you put it there. If the move is legal, then the configuration C at time t becomes this new configuration at t + 1. But if you would move it over particle i + 1, or onto it, you leave it where it is; and this connects to the situation of yesterday: at t + 1 you have the same configuration as at t, and you make a little pile of these repeated configurations. So you do not allow a move that hops over a neighbour, and you do not allow a move that would create an overlap between two spheres. Now, of course, this algorithm satisfies detailed balance: the plus and minus are chosen with probability one half each, so with equal probability you go to the right (which for me, facing you, is the left) and with equal probability you go to the left. Because of this, the probability to make the move by minus epsilon from one position
to another is the same as the probability to make the move by plus epsilon back, so we have detailed balance, and aperiodicity and irreducibility and so on, as before. So now it is not too early to get into the thick of it. Until now it was really easy, but now it will get really complicated, very complicated. All right: detailed balance is okay. But now I want to check that the Metropolis algorithm also satisfies the global balance condition. You will tell me that this is kind of easy, because the detailed balance condition implies the global balance condition, so there is nothing to compute and you can go back to sleep. But maybe you don't do this, because now I want to show explicitly that the global balance condition is true. The global balance condition just escaped my attempt to erase everything; you see it here. Now let us check that this algorithm satisfies it. For this you can read the papers: we have moved from 1953 to 2017. So here I have particles i - 1, i, and i + 1, and since all the pi's are equal to one, I have to check what the flow into this configuration is: where can I have come from to have reached this configuration? That is what the condition means: P(C' -> C) is the probability to go from C' to C, and I have to sum over all the ways to get into this configuration. There are four possibilities. One possibility: particle i was a bit behind and moved forward into place. Second possibility: it was a bit ahead and moved back. Third possibility: I tried to move forward, but the move was rejected; if the move is rejected, I stay where I was. Fourth possibility: I tried to move backward too far, and this move was rejected. You are free to ask questions, because this is what I want to discuss in the next two minutes. Let me now show these possibilities explicitly. It is possible that this particle wanted to move by epsilon in the negative direction, but it would have gone too far, so the move was rejected and it stayed. (No, I just suppose for the moment that we move particle i; at the end I will sum over all i.) This is a flow that I call R_i^-(epsilon), the rejected flow of particle i by minus epsilon: if the move is rejected, I stay where I am, and I get the configuration I started with. The other possibility is that I was at x_i minus epsilon and moved forward into this configuration; this comes with an accepted flow A_i^+(epsilon) in the plus direction. Are you following? So I have four possibilities. Possibility number three is that I tried to move forward, but too far forward, so I was rejected and stayed where I was; this is a flow R_i^+(epsilon). And the final possibility is that the initial configuration had i - 1 and i + 1 at the same positions as here, with particle i ahead by epsilon, and it moved back; this is given by an accepted flow A_i^-(epsilon). So now I can write the flow into configuration C. I take up what you wanted me to write from the beginning: I have the choice among N particles, so I choose i from 1 to N, and then I have the choice of going forward or backward. So the flow into C is

(1 / 2N) sum over i of integral from 0 to infinity of d epsilon p(epsilon) [ A_i^+(epsilon) + R_i^-(epsilon) + A_i^-(epsilon) + R_i^+(epsilon) ].

Now a beautiful thing happens. If epsilon is such that the move by minus epsilon would be rejected, then I cannot have moved particle i into place by plus epsilon. Let me show it: the accepted flow in the plus direction plus the rejected flow in the minus direction satisfies

A_i^+(epsilon) + R_i^-(epsilon) = 1.

Look at particle i and particle i - 1: if epsilon is such that particle i could have arrived here by moving forward, then it is clear that the move by minus epsilon cannot have been rejected. So exactly one of the two is true: if there is enough space, the forward arrival is possible, and if there is not enough space, the backward move was rejected. In the same way,

A_i^-(epsilon) + R_i^+(epsilon) = 1.

Then, fortunately, it is really easy: all of this, one plus one, means the bracket is equal to 2, and this makes (1 / 2N) times the sum over i from 1 to N of 2, which equals 1, times the integral of p(epsilon), which equals 1. Because this equality holds for each epsilon separately, I integrate the constant function 1 over all epsilon, and epsilon drops out.
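The Metropolis update whose flows we just balanced can be sketched as follows (a sketch with my own names; the displacement distribution p(epsilon) is passed in as a sampler): the two rejection rules, no overlap and no jump over a neighbour, collapse into a single gap test.

```python
import random

def metropolis_step(x, sigma, L, p_eps):
    """One local Metropolis move for hard spheres of radius sigma on a ring.

    Pick a random sphere i and a direction (each with probability 1/2),
    draw a displacement eps >= 0 from the sampler p_eps(), and accept the
    move only if sphere i neither overlaps nor jumps over its neighbour
    in that direction; otherwise the configuration is kept (and the old
    configuration is counted again). x is a list of centres in cyclic
    order on a ring of length L; modified in place and returned.
    """
    n = len(x)
    i = random.randrange(n)
    eps = p_eps()
    if random.random() < 0.5:                     # forward move
        gap = (x[(i + 1) % n] - x[i]) % L
        if eps <= gap - 2.0 * sigma:              # no overlap, no jump
            x[i] = (x[i] + eps) % L
    else:                                         # backward move
        gap = (x[i] - x[(i - 1) % n]) % L
        if eps <= gap - 2.0 * sigma:
            x[i] = (x[i] - eps) % L
    return x
```

The gap test is exactly the complementarity used above: for a given epsilon, either the arrival was possible or the reverse move is rejected, never both.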
It is not so complicated, although it took us a day or two to realize how easy it was. So now the flow into this configuration is one, and we satisfy global balance; which of course the two of us knew immediately, but we are happy to have verified it explicitly. So this is what we have done, and now let us go to algorithm number three. Yes, exactly: there is some normalization that I do not take into account; I am just saying that there is a probability density, normalized to one. This is also what I used in computing the partition function: I count every legal configuration with weight one. The difference is: in the heat bath algorithm I take the particle out and put it in at a new place, while in the Metropolis algorithm I take the particle and move it by epsilon. Yes, of course: take a simple distribution for epsilon, some distribution; it will work. There does not have to be a cut-off; it can extend to infinity. It should allow small moves and large moves, but, as I said, do not allow hopping. So now let me move on a little bit. Using the principles of liberté and égalité, French principles, I now move to your side of the board, so that there is equality, égalité, between those sitting there and those sitting here, and I even draw as large as possible, so we have égalité between those sitting in the back and in the front. Now I do another algorithm, which is the sequential Metropolis algorithm. Things become more complicated.
In the sequential Metropolis algorithm I do not sample i. Instead I take i = 1, 2, 3, ..., N, and when I am done, I start again: 1, 2, 3, ..., N, or any other fixed sequence containing all i's. So now we pose the question. I take particle one and try to move it, then particle two, then particle three, then particle four, and so on. Does this algorithm satisfy detailed balance? Certainly not: if I move particle i, then the next particle to be updated cannot be the same one moving back; the next one is particle i + 1. So if I have just moved particle i from here to here, the next particle to move is i + 1, and I cannot move back: I violate the detailed balance condition. All the other rules are the same: I take particle i, I move it by plus epsilon or minus epsilon, and I accept or reject with the same rule as before. So now I have to check that the global balance condition is satisfied. Take particles i - 1, i, i + 1, and let us say I have just moved particle i; I keep track of which particle was just updated. So what is the flow into this configuration C at some time t? The flow into C can only come from moves, or attempted moves, of particle i. Before, I had a factor 1/(2N), because N was the choice of i; now I only have a factor 1/2, because i is fixed by the sequence, and I could have attempted to move particle i forward or backward. So the flow is

(1/2) [ A_i^+(epsilon) + R_i^-(epsilon) + A_i^-(epsilon) + R_i^+(epsilon) ],

and you see what it is:
it is equal to one. So the sampling stage is unnecessary: I can do sequential updates, first particles 1, 2, 3, ... wherever they sit, or I can update 1, then 4, then 3, then 5. The flow is one, and the sequential algorithm satisfies global balance. Now I can give a historical note: if you read the article by Metropolis et al. carefully, they in fact used the sequential Metropolis algorithm. They never invented what we call the Metropolis algorithm, the one that satisfies detailed balance; they immediately used an algorithm that satisfies global balance. Yes, the correlation time is smaller; of course it is smaller, and I will give a list of correlation times at the end. It is of the same order of magnitude. I must say that there is no mathematical proof of the correlation time here, only simulations; for the heat bath algorithm we have a mathematical proof. It is of the same order, but a little faster. But now I do not want to use up your precious time explaining things that were invented already in 1953. Let us move forward a little bit, to algorithm number four, on the same model. You know what? I started out with the Metropolis algorithm, moving forward and backward and taking any particle from 1 to N. Then I noticed that I do not have to sample particle i: I can take 1, 2, 3, 4, 5. So now we become more radical and ask: why should we move forward and backward? Then we do the forward Metropolis algorithm. Forward Metropolis means: sample i, then update x_i to x_i + epsilon. What this means: I have my configuration, I take a particle at random and move it forward, then I take another particle and move it forward, then another, and so on.
This is a continuum version of what, in a discrete version, is the totally asymmetric simple exclusion process (TASEP), which was solved by Professor Dhar a long time ago. So the forward Metropolis algorithm (I am only at four out of seven): as I told you, you take a particle and you only move it forward. So this is configuration C, and what is the flow into this configuration? I choose i from 1 to N, and, using periodic boundary conditions in the sum from i = 1 to N (so the right neighbour of N is 1), I can write the flow as

(1/N) sum over i of [ A_i^+(epsilon) + R_{i-1}^+(epsilon) ].

So let me do the presentation again with particles i and i - 1. If I could have moved particle i from behind into its current position, that is, if epsilon is only this big, then I cannot have rejected the forward move of particle i - 1, because epsilon is too small to make it hit particle i. This means that

A_i^+(epsilon) + R_{i-1}^+(epsilon) = 1.

Let me do it again:
if epsilon is small, then A_i^+(epsilon) = 1 and the rejection term vanishes; if epsilon is so large that moving particle i - 1 forward by epsilon would actually reach particle i, then the move of i - 1 is rejected, R_{i-1}^+(epsilon) = 1, and the arrival term vanishes. So each bracket is equal to one, and I have (1/N) times the sum over i of one, which is equal to one. This means that the forward Metropolis algorithm, consisting of sampling i and moving only forward, also satisfies the global balance condition. I will show you a little later that this algorithm is really interesting, because in the limit of N going to infinity it is much faster than the other algorithms. So now things become really interesting: the dynamical exponent is the same as the one computed by Professor Dhar a long time ago, even though here it is the exponent for the mixing time, and not the exponent for the correlation time that was computed at that time. Yes, well, they can have different rates, but in some sense it is the same. Yes: it is true that A_i^+(epsilon) + R_{i-1}^+(epsilon) = 1, but it is not true that A_i^+(epsilon) + R_i^+(epsilon) = 1, so I have to keep the sampling stage. The forward Metropolis algorithm cannot be used in a sequential version: I must sample which particle I want to move, and then I move it. Unless... unless you think very hard for about two minutes. And this brings me to the final algorithm before the break, number five: the sequential forward Metropolis algorithm with relabeling. The names get longer and longer, and the reason I stop at seven is that they become so long that I need the whole blackboard to name them. This is what I just did with Ze Lei at ENS. The algorithm is like this:
Let me do it like this: I change labels. So I explain the algorithm: I go only forward, and there are two outcomes. If I make a move that is accepted — if I move like this — then this is configuration C at time t, and the new configuration is C′ at time t+1. But now say the move goes too far and I cannot accept it. In the original version I said: well, then I stop where I am. But now I do the following. If the move goes too far — I want to move particle i to here, but it's too far — then I arrive at the same configuration, but I do a relabeling: I call this particle j and this particle i. I hope you understand what I am doing. OK, it is already nearly the break, so I will be quick. What I am doing is the following: I try to move particle i — let me not do it with my head. If I can move it, I move it; but if the move is too far, then I swap the two labels. You say it gives the same rates — and it should. So now let me ask: where can this configuration, where I just updated i, have come from?
This configuration can have come from a configuration where particle i was a little bit to the left and then moved forward to here; or it could have come from the same configuration as this one, but where this other particle carried the label i — with the labels i, k, and j exchanged. You see that the flow into this configuration C, given that I just updated i, is a_i^+(ε) plus R_i^+(ε) — and because, after the relabeling, this particle plays the role that i−1 played in the previous argument, this again equals one. No — all the algorithms of the last thirty minutes violate detailed balance, but this one satisfies the global balance condition, because the flow into the configuration equals the statistical weight of the configuration, which equals one. So let me just summarize — but now I really have to break. I had the heat bath algorithm, the Metropolis algorithm, the sequential Metropolis algorithm, the forward Metropolis algorithm, and the sequential forward Metropolis algorithm with relabeling. Only two of them satisfy the detailed balance condition, and three of them satisfy only the global balance condition. And now I liberate you, because the third principle in France is liberty: it is not enough — you also need the liberty to go out and take ten minutes of break from all these algorithms. We meet again at quarter past. Yes.
Yes — well, thank you. There was a question of whether this was just a game — you know, 1d toys — or whether this is serious stuff. The answer I want to give now is this: I would of course not abuse your precious time, multiplied by a hundred, if this only applied to one-dimensional hard-sphere systems that we can solve analytically anyway. The algorithms I present to you have a deep meaning, just as hard-sphere models have had a deep meaning in Monte Carlo since Metropolis et al. 1953: they presented the Metropolis algorithm for hard spheres, but of course they understood that the algorithm generalizes to more complicated models. In particular, the algorithms I will now go into generalize to higher than one dimension and to arbitrary interactions. To give you an idea: we are using some version of the algorithm I will present as number six right now, for solving problems related to the Coulomb plasma — particles with long-range interactions in three dimensions, real particles, water molecules, and so on. So, as I said, we would not waste your time if this were only a hard-sphere example. Now let me go to algorithm number six, the penultimate algorithm in this list, which is the lifted forward Metropolis algorithm. This algorithm is really important — in my opinion, really important. We presented it last year, and it is very closely related to what is called the lifting idea, which traces back to Diaconis, Holmes, and Neal. If you have a very good memory, you remember that the name of Persi Diaconis was already on my recommended reading list yesterday, in this really nice article from 2011, which is called
"The Mathematics of Mixing Things Up". So the lifted forward Metropolis algorithm is the following: you have a configuration C, and you single out one particle — this is particle i. The algorithm is a forward Metropolis algorithm, meaning you only move in the forward direction, and particle i is singled out: you sample a displacement ε and you move particle i forward. The next move: you take the same particle — this is related to the questions I had just before — and you move it forward again. And again — but now something happens: the move is rejected. What happens then is that what is called the lifting index is handed on. Let's say the lifting index is the chalk: I always move the particle that has the chalk. So I move this particle, and I move it again, and again — and now I want to move but the move is rejected, so the move is now what is called a lifting move. It is this. So, as I just explained: you go from here by some positive ε, and the next configuration is this one. But if ε is too large — if you want to move like this — then, for more clarity: if the particle move is accepted, you do the particle move; but if the particle move is rejected, the particles remain where they are, and it is this other particle that moves next. So this is the configuration at time t, and this is the configuration at time t+1. This is kind of strange, but it is a true Monte Carlo algorithm. Let me simulate the lifting move again: the lifting move means that I freeze the particle configuration and move from this configuration to that configuration. So this is called a lifting move, and this is a particle move.
I have it here: a particle move, and this is a lifting move. I have noticed since the beginning of this series that there are a few people here who are very interested in semantics — they want to know why an algorithm is called this or that. So why is it called a lifting move? The reason is the following. Before, we had a configuration C, which was the n particle coordinates. This configuration C now gets lifted into n configurations: (C, 1), where particle 1 is the active particle, the particle that moves; (C, 2), where 2 is the active particle; (C, i), where i is the active particle. You see, to describe a configuration I now need n real variables plus one index — the label of the particle that is moving. In the original inspiration of the authors, this corresponded to taking a configuration and making n copies of it, and these n copies are kind of lifted on top of the old one. Later on, in order to compute observables, I undo the lifting and do a reduction. So I have a configuration (C, i), and it either goes to (C′, i), or it goes to the configuration (C, i+1), if we keep the order: the chain can move from (C, i) to (C′, i) or to (C, i+1). Now, this algorithm of course does not satisfy detailed balance. We have almost forgotten what detailed balance means — I have a hard time remembering it myself, because I don't use it at all anymore. But what we have all become experts in is checking that an algorithm satisfies the global balance condition. The way we check this is to compute the flow into a configuration. The flow into this configuration is either a particle move.
It goes from here to here: ε was smaller than this free space, I moved this particle from here to here, and that is how I could have gotten here. The other possibility is that previously the configuration was this one — the same as this — but I tried to move the previous particle too far and the move was rejected, so the active particle is no longer that one but this one. You realize that this works for hard spheres, and the algorithm can be generalized to Coulomb systems, to any system you want — and, as I'll show you later, to any dimension. So either ε was smaller than this gap here, and then this configuration can have come only from here; or ε was larger than these 2.5 centimeters, and then I must have come from there. So this plus this is one, and the flow into the configuration (C, i) is equal to one. This means that the lifted forward Metropolis algorithm satisfies the global balance condition. All right. Yes — I don't have to sample i; I just pick it at the beginning, but this is what I'll get to in one second. I don't sample i, because this chain is infinite, and in an infinite chain the initial configuration doesn't count — it's just like starting in the upper right corner. Exactly — so now I'll do exactly what you propose. This is the lifted forward Metropolis algorithm, and now I have the great regret of showing the final algorithm.
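The lifted forward Metropolis step just verified can be sketched as follows. Again a toy illustration under my own naming conventions; the step returns both the configuration and the new active index i — the particle "holding the chalk".

```python
import random

def lifted_forward_step(x, i, sigma, length, eps_max):
    """One lifted forward Metropolis step for 1d hard spheres on a ring:
    the active particle i tries a forward displacement; if the move is
    blocked, the configuration is frozen and the lifting index passes to
    the blocking right neighbor instead (a lifting move)."""
    n = len(x)
    eps = random.uniform(0.0, eps_max)
    gap = (x[(i + 1) % n] - x[i]) % length - sigma   # free space ahead of i
    if eps < gap:
        x[i] = (x[i] + eps) % length   # particle move: i stays active
        return x, i
    return x, (i + 1) % n              # lifting move: hand over the chalk
```

One would run this from an arbitrary initial index; as discussed above, for a long chain the choice of the initial active particle does not matter.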
I really regret it, because we could go on for longer, but we have to stop. Just like yesterday: we were in front of a birreria — my favorite beer place here — but we made the common decision to go home, because I still had to prepare, and we did not have a beer. It is with the same kind of regret that I now show you the final algorithm. It is exactly the algorithm that you may have proposed two minutes ago. It is the following: I do the lifted forward algorithm, with restarts. What does it correspond to? (Who posed the question — you did, right?) So what is the algorithm? I sample i, then I run the lifted forward Metropolis dynamics for, say, 17 steps. Then I sample another i and run it for 27 steps; then another, for 315 steps; and so on. So I do the lifted forward algorithm, but I stop it, and then I restart with a random index. Unless — well, restarting from a random initial configuration would have been algorithm number 8, but, just like at the birreria yesterday,
we don't do it; we do other things. So the lifted forward Metropolis with restarts is: sample i — the particle index — and sample a parameter that we call L, the repetition number, the number of steps for which we will run the lifted forward dynamics; then we take i as the active particle and we move, and so on. This algorithm of course also satisfies the global balance condition. And now let me conclude. Yes — I always satisfy global balance in what I am doing here. Well, we had a lively discussion in the break, and some of you presented other versions that just don't work: for example, the sequential forward Metropolis algorithm, without the relabeling, does not satisfy the global balance condition. Yes, of course — you are complaining that I am not doing the long calculation. But it is always trivial, I agree — yet nobody else does it, so I have to do it. And I am really surprised: everybody who has done Monte Carlo knows how to check the detailed balance condition — you take two configurations and you check that the flow from a to b equals the flow from b to a. And exactly as you are saying — this is the message.
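Putting the pieces together, the lifted forward algorithm with restarts might look like the following self-contained sketch. The uniform choice for the sampled chain length L is my assumption — the lecture only says that L is sampled, not from which distribution.

```python
import random

def lifted_forward_with_restarts(x, sigma, length, eps_max, n_restarts, l_max):
    """Sketch of the lifted forward Metropolis algorithm with restarts for
    1d hard spheres on a ring: repeatedly resample the active particle i
    and a chain length L, then run the lifted forward dynamics for L steps
    before restarting."""
    n = len(x)
    for _ in range(n_restarts):
        i = random.randrange(n)                 # restart: fresh active particle
        chain_len = random.randint(1, l_max)    # sampled repetition number L
        for _ in range(chain_len):
            eps = random.uniform(0.0, eps_max)
            gap = (x[(i + 1) % n] - x[i]) % length - sigma
            if eps < gap:
                x[i] = (x[i] + eps) % length    # particle move
            else:
                i = (i + 1) % n                 # lifting move
    return x
```

The restart is what reintroduces the coupon-collector logarithm into the mixing time, as explained a little further on.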
I want to drive home: checking the flow into a configuration is just as easy. But you can be sure that, in the process of coming up with these algorithms, we had a number of other ones — and some of you in the break presented other algorithms, and they just don't do it. You have to check. But now let me do the following: let me give the result of whether these algorithms are any good, and I'll do a table of mixing times. The mixing time is the time to reach equilibrium starting from an arbitrarily bad initial configuration, and you will believe me that the worst initial configuration I can choose is having all particles glued to each other — starting from the configuration that goes this, this, this, this on a circle. (It can actually be proven that this is the worst initial configuration.) So what is the time it takes to reach equilibrium from this state? Again, I wouldn't waste your time if the result were just that all nine algorithms are within one percent of each other. So: the heat bath algorithm — remember, you take a particle, you lift it out and you replace it uniformly between its two neighbors — there is now a proof that the mixing time is of order n³ log n, independent of density, independent of σ, of L, of all parameters. This is a mathematical result due to Randall and Winkler, from about ten years ago. It is a beautiful calculation — the first rigorous result. Professor Dhar, this is a rigorous result: it has been proven by coupling methods, upper bound and lower bound. It was again one of those days: when I read this paper I was struck, and I didn't do much else that day.
I can tell you. There is also a discrete version, which is called the simple exclusion process (SEP), and a few years ago Hubert Lacoin proved that the mixing time of this algorithm is also of order n³ log n. Now the Metropolis algorithm — you take a particle and you try to move it forward or backward — also has a mixing time of n³ log n. This has not yet been proven rigorously, because Randall and Winkler, in another paper, were only able to prove that it is either n³ or n³ log n; but numerically it is indeed n³ log n. Note, by the way, that for the heat bath algorithm we have reasons to believe that the mixing time is n³ log n while the correlation time is n³ — so the mixing time can be infinitely longer in the limit of n going to infinity, and you see that logarithms have this tendency of creeping in. Now the forward Metropolis algorithm — the same as the Metropolis algorithm, with the change that we only move in one direction — has a mixing time, measured in the number of individual steps, of order n^{5/2}. This is related to the discrete model called the totally asymmetric simple exclusion process (TASEP). The correlation time — in other words the inverse gap of the transfer matrix — was computed by Professor Dhar to be of order n^{5/2}, and since last year there is a rigorous mathematical result showing that the mixing time is also O(n^{5/2}). So this is a case where the mixing time and the correlation time scale in the same way. Now the lifted forward — let me put it here.
So: this one is rigorous, this one is rigorous, this one is rigorous in the discrete version, and this one is numerical. The lifted forward algorithm also has a mixing time of n^{5/2} — again a numerical result of ours. But the final algorithm that we discussed, the lifted forward with restarts, has a mixing time of order n² log n, and this is a rigorous result that I obtained a few months ago with Ze Lei. So the lifted forward with restarts goes like n² log n, and the algorithms that I did not show are actually able to reach O(n²), without the logarithm — also a rigorous result, but one I did not present. Yes — when I say rigorous, it means, at least for the first one, that there is a rigorous proof; you can go through it, and we have the references in our papers. Now, we had a discussion in the break, and one of you asked me to explain how this can be generalized to higher dimensions. I want to spend five minutes on this. I will only do the hard-sphere case, even though, as I told you, we can do anything else — other interactions, quantum systems, whatever; all of this is possible. How do we treat higher dimensions? "Treat" means: how to abandon detailed balance algorithms and go to global balance algorithms. But first, let me open one parenthesis — the one parenthesis I want to answer.
It is related to what you just mentioned: I just want to explain where this logarithm comes from. It comes from the restarts that we discussed — you have to choose each particle. What happens if this is the initial configuration you start from? You may start with this particle, but you cannot get very far; then you may start with this one; and at some point you will pick this particle and move it. You know, if you have n particles, and at each time step you take one particle at random and then put it back, and take another particle and put it back — after which time have you seen each and every particle? This problem is called the coupon collector problem, and it has an answer: after of the order of n log n trials you have seen every particle. This explains the logarithm. So now I come back to how to treat higher dimensions. The idea: we have hard spheres, but they may be in two dimensions, in three dimensions, and they may even have more complicated interactions. So now I do the same lifted forward Metropolis algorithm: I sample this particle as the starting particle and I move it by ε — I would like to move it like this. So this is the configuration at time t, and the configuration at time t+1 — I just draw the upper part — is simply this configuration. Let's say the ε's are always in one direction; I could also move in any direction. So now the configuration would be this one.
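Before closing the parenthesis on the logarithm: the coupon collector argument can be checked numerically. This is a small illustrative experiment of my own; the sample sizes are arbitrary.

```python
import math
import random

def coupon_collector_time(n):
    """Draw uniformly from n 'coupons' (particles) until each one has been
    seen at least once; the expected number of draws is n * H_n ~ n log n."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))   # pick a particle and put it back
        draws += 1
    return draws

random.seed(0)
n = 50
mean = sum(coupon_collector_time(n) for _ in range(300)) / 300
# the average should sit near n * H_n, i.e. on the n log n scale
# (n * log(n) is about 196 for n = 50)
```

This n log n waiting time is exactly the overhead that the restarts add, which is where the logarithm in the n² log n mixing time comes from.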
But I should hear protest from you. Imagine that the displacement was not this particle by this ε, but this other particle by this ε. You see that we are now stuck, because: who will be the next particle to move in the lifted algorithm? I move the active particle until there is a rejection, and this rejection hands on the chalk — I give the chalk from this particle to that particle. But in higher dimensions I can have multiple overlaps, and to whom do I give the chalk? If I have this configuration, do I give the chalk to this one or to that one? It is undecided. I could give it with probability p here and probability 1−p there, but then — and it is very easy to see — this makes the global balance condition break down, badly. There is a solution, and the solution is: instead of moving by ε, a finite displacement, I move by an infinitesimal displacement. There was a question on this too — what is the distribution of ε now, you ask? I take ε just infinitesimal — but remember that in the lifted forward Metropolis, if the move is accepted, I move the same particle again, and again, always the same one until there is a rejection. This sounds really strange, right?
Well, the probability for such a tie to happen is zero in a continuum system. So the algorithm is the following: take the limit ε → 0 with the repetition number L → ∞, such that ε times L stays constant. This algorithm has been my constant joy over the last few years; we call it the event-chain algorithm. You move until — you understand: if I move this particle here with infinitesimal displacements, I will certainly get here first, or here, so there is always a uniquely defined particle with which I can exchange the lifting variable. This is an algorithm that we have used for many years, and we are now happy to have been able to prove rigorously that it has a different dynamical scaling exponent: in 1d the dynamical scaling exponent is 3 if we have detailed balance, and with global balance we have a dynamical scaling exponent of 2. All right — it's a whole new world; I could go on for many hours discussing more algorithms, but let me just use the last five minutes. I wanted to have another discussion of Metropolis–Hastings, but we had such a lively discussion.
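In the ε → 0, L → ∞ limit with εL fixed, the dynamics between lifting events becomes deterministic, which suggests the following sketch. This is my own toy rendering of the event-chain idea for 1d hard spheres on a ring; the function name and the fixed total displacement budget are assumptions, not the lecturer's code.

```python
import random

def event_chain_move(x, sigma, length, budget):
    """One event-chain move for 1d hard spheres on a ring: a randomly chosen
    active particle advances until it touches its right neighbor, the lifting
    index then passes on, and the chain stops once the total displacement
    budget (the constant eps * L) has been used up."""
    n = len(x)
    i = random.randrange(n)
    remaining = budget
    while remaining > 1e-12:
        # distance the active particle can travel before contact
        gap = max(0.0, (x[(i + 1) % n] - x[i]) % length - sigma)
        step = min(gap, remaining)
        x[i] = (x[i] + step) % length   # advance the active particle
        remaining -= step
        if step == gap:                 # collision: lifting move
            i = (i + 1) % n
    return x
```

Note that the rejection of the finite-ε algorithm has become a collision event: the "chalk" always passes to the unique particle that is hit first, which is why the higher-dimensional ambiguity disappears.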
So I didn't do everything I wanted — I think you will not hold this against me — and let me do a résumé, a conclusion. (I'll take your question in two minutes.) The résumé of what we have been discussing: I have presented to you the fundamentals of the Monte Carlo method, a method traditionally based on the detailed balance condition, where the flow from one configuration to another equals the flow back. In physics, we usually express the passage from the distribution π_t to the distribution π by saying that we approach equilibrium: we start, for example, from an initial non-equilibrium configuration, and after a certain time we have configurations that are nicely distributed — they are in equilibrium. Equilibrium exists in statistics, in the algorithms I presented to you: equilibrium in statistics means that we go from π_t to the limit π. In statistics this limit is simply called the stationary distribution, but in physics we call it equilibrium. Now, this is the equilibrium of the world of Monte Carlo, of statistics. But there are other definitions of equilibrium, and one is chemical (or physical) equilibrium: you have two species, one conformation and another conformation of some molecule, and equilibrium in chemistry means that the rate of producing B from A is the same as the rate of producing A from B. So in chemical equilibrium you naturally have the detailed balance condition — detailed balance is the balance condition of chemical equilibrium. What we have been discussing for three days now is that the true condition, in the world not of chemistry but of statistics, is the global
balance condition. The global balance condition is a condition of flow in equals flow out — as someone in this audience came up to me and said: this is simply a continuity equation. It is a continuity equation for the probability flow: what comes in must go out. This was maybe a little bit theoretical or abstract yesterday, but today it can no longer be abstract, because all the algorithms that we discussed today have periodic boundary conditions — otherwise there would be a problem — and, besides that, they have a flow. Algorithms four, five, six, seven — and I regret again not having been able to get to eight and nine — go only in the forward direction, and the condition that we have studied and verified (you say it's trivial, but there are many algorithms that don't satisfy it) is: for any configuration C, the flow in equals the flow out. Now, with detailed balance we fundamentally have a diffusive process: we go once to the right, then to the left, then up, then down — and a diffusive process comes with high values of the dynamical scaling exponents. The initial hope we had when we started working on this was that with algorithms that are more convective — particles move like this, but they don't go back — then, instead of a diffusive approach to equilibrium, which is the world of detailed balance,
we would have a convective one. The first result in this direction was the result by Diaconis et al., in this model but with only a single particle, where they showed that you could go from a dynamical scaling of n² to n. Now we have been able to extend this, showing that even in an n-particle world we can change the dynamical scaling. Which, after all, is natural: you all know that putting sugar into your coffee and waiting until the sugar has dissolved takes a much longer time than taking a spoon and stirring it around. These algorithms are specially designed so that you can stir: the configuration moves around like crazy — it always moves in one direction (in two or three dimensions, only +x, or +y, or +z; it never goes back) — yet all this stirring does not create eddies and does not change the equilibrium distribution that you actually sample. So what we are doing is mixing the world of equilibrium physics and non-equilibrium physics: we are using a hydrodynamic, convective way of equilibrating, but the configurations that we visit are exactly the same configurations that we would have with an equilibrium method. OK — too long, didn't read. The résumé of the résumé is: I think that in this world of statistics, of Monte Carlo algorithms, we are just at the beginning of the science — and I think this is true for the rest of you as well; many of you don't work exactly in my field. We are not just there to do the final little bit of what people have done before us. It's just the beginning, and a really exciting time to be scientists. And with this I would like to thank you for your attention.