So, the first thing is done, and this was all done by Gibbs: the canonical, NVT ensemble. Energy is the first thing, because in nature a system will exchange energy with its surroundings. So Gibbs did this amazing thing; he must have had a very clear idea of what was going on. It is very interesting if you think of it: there was this man, sitting alone in the Sloane Laboratory at Yale. He went to Europe once, came back, and for the rest of his life he never left. Many people made him offers, but he never left, and he never married. Like Immanuel Kant, the great philosopher in Germany, he would go to his laboratory in the morning and come back home in the evening. His entire life he lived with his sister, who took care of him, and that's it. That kind of ascetic, highly dedicated life was quite common in those days. So he was sitting there, and he must have had this huge clarity of vision about how to proceed. The first brilliant idea was to go to the ensemble. I have given you again and again the example of the ten glasses of water: they are all in different microscopic states, but their properties are the same. That gives you the idea of creating an ensemble, because in one stroke you go from solving Newton's equations to a distribution function. That was the whole idea: you go from one sample to a distribution. As soon as you go to a distribution you can define averages, and as soon as you can define averages you can start doing calculations, analytical calculations; otherwise you are stuck, since you cannot even solve the three-body problem. So that ensemble was the brilliant beginning of the whole of statistical mechanics.
Then he made those two hypotheses. That the time average equals the ensemble average, as I have explained many times, follows from the concept of phase space, which I think I did in the first or second class. Then he had to assume equal a priori probability, because these are all microscopic states of the same energy, and there is nothing we know about them other than that they have the same energy. So there is no option other than to assume they are equally probable; that is where equal a priori probability comes from. But then he landed in trouble. His ensemble is now taken care of, but who guarantees the first postulate, that the time average equals the ensemble average? For that, one system has to go through all the microscopic states; all the microscopic states must be visited. So the concept of dynamics, of time dependence, came in through the ergodic hypothesis. That is why there is a huge school of mathematics even today working on the ergodic hypothesis. The first such system was shown to be ergodic by the great Bunimovich and Sinai, only in 1982. They showed that the billiard-ball model, or rather two-dimensional hard disks, is ergodic, by showing that diffusion exists, and the main point is that two trajectories starting close together separate and go over all of the available space. You might have heard of beautiful things like chaos and the Lyapunov exponent; all of these are part of this ergodic hypothesis and chaos, and that is why chaos is so interesting. Now coming back: when you relax the constraint of constant energy, then again I do not have an obvious next step to take. That is why he must have had a huge clarity of vision, because the only thing we have is this. So what does he do? I put my picture here on the board: I put all the members of my ensemble, all the members of my canonical ensemble. Listen very carefully.
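As a toy illustration of the two postulates just stated (not the Bunimovich–Sinai billiard, which is far harder): take six equal-energy microstates, let one system hop among them with a trivially ergodic dynamics, and compare the time average of an observable with the equal-a-priori-probability ensemble average. The six-state system, the observable, and the uniform hopping are all illustrative choices, not from the lecture.

```python
import random

def time_average(n_steps, seed=1):
    """Time average of an observable A(state) = state along one
    trajectory that hops uniformly among 6 equal-energy microstates
    (a trivially ergodic dynamics: every microstate keeps being visited)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_steps):
        state = rng.randrange(6)   # uniform hop among the 6 microstates
        total += state
    return total / n_steps

# Ensemble average with equal a priori probabilities over the 6 states
ensemble_average = sum(range(6)) / 6          # = 2.5

print(time_average(200_000))   # approaches the ensemble average
print(ensemble_average)
```

With a few hundred thousand steps the two averages agree to a couple of decimal places, which is the content of the first postulate for this toy dynamics.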
All the systems in my canonical ensemble are exchanging energy with the surroundings. I put them up against each other and then put them in a bath, and let them equilibrate: this is my super system. The super system equilibrates at a constant temperature, but its members are exchanging energy, so they sit in different energy states. These now become my one super system. Then I make an ensemble of the super system. The super system is completely isolated after the initial bath, so it has a fixed total energy. I hope you get the picture now. So I take a super ensemble of my super system, but the super system is characterized by a constant number of particles: the total number of systems in my canonical ensemble that make up the super system, multiplied by N each. Similarly the volume, and similarly the total energy: these are the constraints. Now I want to write this down. This is the distribution I discussed last time, the one we considered in the microcanonical ensemble. There I have a system characterized by, say, four energy levels and a fixed energy; V does not appear explicitly, though it enters through the energy levels themselves, just as in the particle in a box. Remember E_n = n²h²/8mL², where L is the length: the energy spacing comes through the size. And I have four systems, so this is my microcanonical ensemble, and its Ω equals 4. All right. Now I allow the system to have different energies. As soon as I do that, many more states become available to it than in the microcanonical case. And immediately you realize that not all of them will have equal probability. So now I need a way to say which energy is more probable, and that is done by counting n_j, the number of systems in energy state E_j. Summing n_j over j you get the total number of systems, and summing n_j E_j you get the total energy in my super ensemble. Remember, there are two layers to what we are doing here: these are numbers of systems, not particles; n_j is the number of systems in energy state E_j. Then this is bare combinatorics, but the combinatorics has to be maximized subject to these constraints. And I can now define the probability. So this was the question I asked: now that all the energy levels are allowed, what is the probability? What condition determines which energy state is most probable? I have removed the barrier; the system can be in different energy states. I still have the constraints of number and volume, and the super ensemble I have constructed. What condition will tell me the probability of a given energy level, now that all the energy levels are allowed? No, no, that will come up, there is an outcome, but without doing the algebra, take a guess. It is very simple: it will be the arrangement which maximizes this count. This is an amazing thing, really amazing, and even now it never fails to surprise me that this holds with enormous accuracy, to 1 part in 10 to the power 23 or some such number. This law of maximum entropy, of the maximum number of arrangements, holds with such accuracy. It is amazing because the laws of large numbers come into play: by the time you have a hundred or a thousand particles, this distribution already dominates; there are of course fluctuations around it, which we will discuss soon. And at a very deep level it is connected with the stability of the system.
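To see this dominance concretely, here is a small counting sketch. The numbers are illustrative, not from the lecture: 30 systems distributed over three energy levels with energies 0, 1, 2 and total energy fixed at 30, so the allowed occupation sets form a one-parameter family. We compute W = A!/∏ n_j! for each allowed set and find the maximum.

```python
from math import factorial

def multiplicity(ns):
    """W = A! / prod(n_j!): the number of ways to distribute A labelled
    systems so that n_j of them sit in energy level j."""
    w = factorial(sum(ns))
    for n in ns:
        w //= factorial(n)
    return w

# Toy super ensemble: A = 30 systems over levels with E = (0, 1, 2),
# total energy fixed at 30.  The two constraints leave one free
# parameter t, with occupations (t, 30 - 2t, t).
candidates = [(t, 30 - 2 * t, t) for t in range(16)]
weights = [multiplicity(ns) for ns in candidates]

best = candidates[weights.index(max(weights))]
share = max(weights) / sum(weights)
print(best)    # the most probable distribution
print(share)   # fraction of all arrangements it alone accounts for
```

Even with only 30 systems, the single most probable distribution already carries a sizeable fraction of all arrangements, and the dominance sharpens rapidly as the numbers grow, which is the law-of-large-numbers point made above.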
And that is why statistical mechanics gives you real insight into natural systems, into how things work. It is really mind-boggling if you think of it. So now I can define the probability of my system being in an energy level, and that probability is P_j. That is where I asked the question, because I am going to go for P_j, and the only principle I have is to maximize the number of arrangements, to maximize Ω. So this n_j is the number of systems in energy level E_j, and my super ensemble can be distributed in Ω ways. Then this is the normalization, and this defines the probability P_j. Take a look at it; it is not difficult. But we are looking at things at a higher level, and when I go to the grand canonical ensemble, yet another higher level will come. This rests on the microcanonical ensemble, and the grand canonical ensemble rests on the canonical ensemble. Let us see what we are trying to say. Ω is the number of ways I can arrange things; here Ω is the total number. They might be the same; let me work it out. In this case it should be 440. So, given an energy E_j, I have a number of ways, Ω_j. All right. Now I come to the point: I said it has to be weighted by Ω, which gives n_j its weight. In probability, how do we go about it? We define a probability, as I discussed a little in lecture 2 on probability and statistics, which is given in my book, I think in the second chapter itself. You have the concept of the sample space. What is the sample space? The sample space is the total set of outcomes available to you. If I toss a coin, the sample space has 2 outcomes, head and tail. If I throw a die, the sample space has 6.
It is the total count of the sample space: here the sample space is defined by the total number of microscopic states available, not the total number of systems. Remember the Venn diagram we drew; that is the reason this should be Ω. Okay, is it clear? It's a good question. Now, what do we do next? We apply Stirling's approximation, ln N! ≈ N ln N − N, and if you do that you get this quantity here. This is the thing we have to maximize, but we have to maximize it with this constraint. So we apply Stirling's approximation, and then we come to a very important method which I would love to derive, but it would take a whole class: the method of constrained variation, the method of Lagrange multipliers, introduced by Lagrange. Essentially it says that you are maximizing something, but subject to a constraint. For example, you want a certain distribution, but you impose certain conservation conditions on the distribution, and these are the conservation conditions. The way the Lagrange multiplier method goes is to put the constraints in the following form, so that you can see very easily that when I take the variation with respect to dN_i, this term goes to 0, since this is constant. It is constant because the total energy is fixed: my super system is a microcanonical system with constant energy, so that variation is again 0. So the way the Lagrange multiplier method works is that you add the constraints in this form to the quantity you want to maximize. All right, in the appendix of the book I have described it a little better. When you do that, we get this equation; you can go over it, because this quantity is that quantity, and you can take the derivatives.
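Since the whole maximization leans on Stirling's approximation ln N! ≈ N ln N − N, it is worth checking numerically how good it is. A quick sketch, using the exact ln N! from the log-gamma function:

```python
from math import lgamma, log

def ln_factorial(n):
    return lgamma(n + 1)            # exact ln N!

def stirling(n):
    return n * log(n) - n           # ln N! ≈ N ln N − N

for n in (10, 100, 1000, 10_000):
    rel_err = (ln_factorial(n) - stirling(n)) / ln_factorial(n)
    print(n, rel_err)               # relative error falls as N grows
```

The relative error is already below a hundredth of a percent by N = 10⁴, and for the 10²³ particles of a real system the approximation is essentially exact.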
Then, taking the derivative, you get this condition, equation 17, and from it equation 18 comes out; this is actually the easier part of what we have done here. Then you can easily see the distribution that comes out: I just take the logarithm, exponentiate back, and sum over j. The star is there to denote that this is the most probable distribution, and, as I was saying, though it is only the most probable one, it works with amazing accuracy. Summing over j, I get the total 𝒩, and then e^(−α) equals 𝒩 divided by Σ_j e^(−βE_j); there should actually be an 𝒩 in front of it, I do not know where the 𝒩 disappeared. Then the probability P_j is n_j* divided by 𝒩, so P_j = e^(−βE_j) / Σ_j e^(−βE_j): this is the probability you were asking about, n_j* over the total number of systems in my super ensemble. So now this is the probability of finding my system in the j-th energy state; P_j is the probability, with j the index of the energy state. And this is what you recognize. Two things: first, this is our Boltzmann factor, but we have not yet proved that β = 1/k_B T; β is just a constant parameter here. That is why this is called the method of undetermined multipliers: at this level the multipliers are not determined; they are introduced as constants and we have to determine them in due course. Second, the quantity that appears here as the normalization is the partition function, one of the most important quantities in equilibrium statistical mechanics. It is just the Boltzmann factor summed over all the energy levels.
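As a concrete sketch of this distribution, here is the Boltzmann weight and partition function for a hypothetical four-level system; the energy values and β = 1 are illustrative choices, not from the lecture.

```python
from math import exp

def boltzmann_probs(energies, beta):
    """P_j = e^(-beta*E_j) / Q, with Q = sum_j e^(-beta*E_j)."""
    factors = [exp(-beta * e) for e in energies]
    Q = sum(factors)                 # the partition function
    return [f / Q for f in factors], Q

# Hypothetical four-level system; energies in units where beta = 1
probs, Q = boltzmann_probs([0.0, 1.0, 2.0, 3.0], beta=1.0)
print(Q)
print(sum(probs))                    # normalization: the P_j sum to 1
print(probs)                         # lower energy levels are more probable
```

The probabilities sum to one by construction, and lower energy levels carry larger weight, which is all the Boltzmann factor says.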
So the total weight: this is exactly the analogue of Ω, but for the canonical ensemble. The difference from the microcanonical ensemble is that there I just have Ω, the total number of states, while in the canonical ensemble each state is weighted by the Boltzmann factor e^(−βE_j); that is the weight. So I do not just count the states. I can also have a degeneracy here, but that we will discuss later, so I am not going to do it now. Then the next important thing, which I think we will just touch on and then probably stop, because there is a limit to how long this kind of heavy class can go on, is the following. The common way of writing the ensemble is with N and V: all the energy levels enter, and whether a given level is occupied or not does not matter. The energy levels are determined by V, and, very interestingly, the temperature enters through β, since we will show that β is one over k_B T. So the volume determines the energy levels, the temperature comes through β, and the total number of particles is fixed. This quantity, the partition function, holds a lot of interest, and a lot of insight can go into it, but I will discuss that a little later.
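One payoff of treating Q as the total weight is that averages follow from it directly; the standard canonical identity is ⟨E⟩ = −∂ ln Q/∂β. A sketch comparing the direct average Σ_j P_j E_j with the derivative of ln Q (the four energy levels are an illustrative choice):

```python
from math import exp, log

def partition_function(beta, energies):
    return sum(exp(-beta * e) for e in energies)

def avg_energy_direct(beta, energies):
    """<E> = sum_j P_j E_j with P_j = e^(-beta*E_j)/Q."""
    q = partition_function(beta, energies)
    return sum(e * exp(-beta * e) for e in energies) / q

def avg_energy_from_lnQ(beta, energies, h=1e-6):
    """<E> = -d(ln Q)/d(beta), here by a central finite difference."""
    lo = log(partition_function(beta - h, energies))
    hi = log(partition_function(beta + h, energies))
    return -(hi - lo) / (2 * h)

E = [0.0, 1.0, 2.0, 3.0]   # hypothetical energy levels
print(avg_energy_direct(1.0, E))
print(avg_energy_from_lnQ(1.0, E))   # the two agree
```

The two numbers agree, which is why one says that all canonical averages are generated by the partition function.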
So what do we need to do now? Two things. First, we have to establish how thermodynamics follows, as we did before. How thermodynamics follows from entropy was easy, because once we had established the entropy through the relation S = k_B ln Ω, we knew how to get the temperature: ∂S/∂E equals 1/T, that is the well-known relation. Then we get the expression for pressure from entropy, and the expression for chemical potential from entropy. It is not the way you usually think about them, but the microcanonical ensemble gives you those relations. Now we have to derive the analogous relations in the canonical ensemble; that is the idea. So we start by defining an average energy, ⟨E⟩ = Σ_j P_j E_j, the average that comes out of my partition function, of my distribution, where P_j is the probability of the system being in energy level j and E_j is the energy of the j-th level. This is the standard definition of an average over a probability distribution. Then I have the partition function again. The partition function Q is actually a measure of the sample space, so all averages are determined with respect to it: it is the total weight of the sample space. Just as Ω is the weight of the phase space in the microcanonical ensemble, where a system characterized by N, V, E has a weight Ω and entropy k_B ln Ω, so in the canonical ensemble Q is the weight of the system. That is the way to think of the partition function: it gives the weight of the system, and that is why we maximized the weight. Now we again play a very interesting trick, which I will probably not get to do fully, but it takes you from here to there. The trick is to use the definition of P_j: P_j = e^(−βE_j)/Q. Taking the logarithm, ln P_j = −βE_j − ln Q, so E_j, which sits in the exponent, can be written as βE_j = −ln P_j − ln Q. That gives me an expression for E_j, which I substitute. Then I take the differential of the average energy: d⟨E⟩ = Σ_j E_j dP_j + Σ_j P_j dE_j. And we know the condition Σ_j P_j = 1, because the system has to be in some energy state; adding over everything, the distribution is normalized, which also follows from the earlier definition. Tell me if you have any problem. If that is so, then Σ_j dP_j = 0; this is a fact we use constantly. Then we go back and use the entropy, S = −k_B Σ_j P_j ln P_j. Do I have to explain this or not, yes or no? This is what appears in information entropy and all those things, but notice that it reduces to the microcanonical result when all states are the same: if P_j = 1/Ω for every j, then the sum over j gives a factor Ω that cancels the 1/Ω, and the minus sign turns ln(1/Ω) into ln Ω, so S = k_B ln Ω; I did that a little last class. So this is the generalization of those things to the energy levels. Now I take the differential of this quantity, and I get this equation; one can work it out. Taking dS of S = −k_B Σ_j P_j ln P_j gives dS = −k_B Σ_j (ln P_j dP_j + dP_j). The second sum, Σ_j dP_j, I just showed is equal to 0, so that term goes, and I am left with dS = −k_B Σ_j ln P_j dP_j. Substituting ln P_j = −βE_j − ln Q, the ln Q piece again multiplies Σ_j dP_j and vanishes, so dS = k_B β Σ_j E_j dP_j. But from d⟨E⟩ = Σ_j E_j dP_j + Σ_j P_j dE_j, where Σ_j P_j dE_j is the work term −p dV, we get Σ_j E_j dP_j = d⟨E⟩ + p dV. So dS = k_B β (d⟨E⟩ + p dV). We compare this with the thermodynamic relation T dS = dE + p dV, and the first contact with thermodynamics gives 1/(k_B β) = T, that is, β = 1/(k_B T). Let us stop now; we will continue from here. I have a little confusion about what happened here, so we will start at 4 o'clock from this equation. At the end of the day, what are you defining? Let me describe this a little. As I was saying, I have a sample space, and the sample space, as I explained, is the total set of states of the system. That space is now partitioned. Say, for example, I bias my coin: if I bias my coin, then head and tail become different, and there are two ways to look at it. The probabilities that I assign by the biasing have to add up to one over my sample space; that is the question he was asking. The biased probabilities add up to one, and the total number of outcomes, in this case two, is also there.
So here this quantity gives that total number. I have partitioned my whole configuration space into the different energy levels; all the different energy levels are there, and this is my sample space. Each of them, because each is at constant energy, is exactly microcanonical. So each of these pieces is microcanonical, and for each of them I find an Ω. Now say this level is 10, this is 12, this is 14, and so on. I ask: given energy level 10 in my diagram, in how many ways can I realize it? That is this Ω, and that Ω, and so on; look at this partitioning. Now I have to add up all the counts that I get here, and that gives me the total sample space, like in a Venn diagram: the total sample space is this quantity. I go back and say: this energy 10 has a number attached to it, which is the number of ways I can realize 10 in a microcanonical ensemble, and that is this quantity. This is very neat and clean. The stat mech part is very clean; where one sometimes tends to get confused is at the interface with thermodynamics, because I always think that the stat mech–thermodynamics interface is still a little rugged: there is always a little approximation made there, and that is where a little ambiguity sometimes creeps in. So we will meet again, and we will start from the thermodynamics part, T dS = dE + p dV; we will finish the canonical ensemble and start the grand canonical ensemble. Hopefully tomorrow we will start doing the fluctuations, at least halfway through by the end of tomorrow morning; that is the most interesting thing. In my book, in the initial part, I have the title "realization of the promises" that statistical mechanics holds. See, the difference from quantum mechanics is the following. In quantum mechanics I write down the Schrödinger equation, no derivation required: H = p²/2m + V(x), and Hψ = Eψ is a differential equation. Now I put in V(x) for the harmonic oscillator, solve it, and I immediately get experimentally observed quantities, the lines of vibrational spectroscopy; if I include a little distortion because of rotation, I get the Q branch and the rest, exactly as in experiment. Similarly you do the particle in a box and go to conjugated polymers: at an almost trivial level you get results connected to experiment and spectroscopy. That is why spectroscopy came first in quantum mechanics. Statistical mechanics, on the other hand, requires a lot of work: you have to go through the whole set of postulates, which is certainly far more non-trivial than writing down the Schrödinger equation. Of course, if you read Schrödinger's original papers, which are discussed in the first edition of Berry, Rice, and Ross, it is a lot of fun to see how he argued. I used to have all these original papers at one time; that is how I know he did everything from the de Broglie hypothesis, the de Broglie wavelength. He did not do the operator formalism; that was done by Dirac. The history is very interesting. So there was a formidable development, the photoelectric effect, black-body radiation, Planck, that led to quantum mechanics. Here it was Maxwell, then Boltzmann, but Boltzmann is so difficult that we cannot work out Boltzmann's equation in a class, so we start with Maxwell. Then Boltzmann tried and arrived at S = k_B ln Ω, and then comes Gibbs and this formidable wall of the postulates: microcanonical, canonical, grand canonical. Only then do you come to applications, and the first applications are actually the fluctuations and the linear response. The response function is one of the most fundamental things in natural science; that we will be able to do tomorrow.