We will start by briefly recapping what we did last time, with a few extra things here and there. The basic idea is that statistical mechanics has two parts: equilibrium statistical mechanics, which addresses problems like phase transitions and the properties of matter, and time-dependent statistical mechanics, which addresses things like chemical kinetics and the dynamics of protein folding. These are very complex systems, and as I tried to explain, a huge amount of work has gone into them. In theoretical chemistry we really do only one thing in a sophisticated way, density functional theory and that kind of thing; nothing against it, because the people who developed it are physicists and they are very smart people. In statistical mechanics, however, starting with people like Onsager, Kirkwood, Zwanzig and others, there has been a considerable contribution from chemistry, from physical chemists. Physical chemistry, as you know, was born around 1901 with the advent of Arrhenius and Ostwald, and what they did was essentially some form or other of statistical mechanics. There is a history here, which I will recap again, but before that let me go through the preliminaries we did last class. I will do just a five-minute recap, because I like to hammer again and again on a few points that I think are important for the students, and I get very upset when they do not understand them.
Now, what I will do today is probability and statistics. I will explain why we need to know some amount of very elementary material, by and large what you learned in class 11 or 12, and some fundamental concepts, things like phase space and trajectories, and then I will try to motivate and explain why time-dependent statistical mechanics, not just equilibrium statistical mechanics, needs these things. I hope to finish this today. Then come the Liouville theorem and the Liouville equation; we will do those next, not in this lecture. Then we go on to one of my very favorite topics, fluctuations and response functions. Then the ideal monatomic gas, and this is amazing: even the translational entropy is used almost every day in drug discovery and in drug-DNA interaction, because it gives you the entropy a drug loses on going into DNA; that is the Sackur-Tetrode equation. Then of course we have all the things we need, rotational and vibrational entropy, things essential for chemical kinetics; we use them without knowing where they came from, but they came from very elementary statistical mechanics, so we will do that. Then we will do a bit of quantum statistics. I am not too greatly fond of doing it, but there are some nice points here that I think students should know, which are not taught and not covered in books, very simple things. Then we will go to the Ising model, which I think is the first and perhaps the only simply solvable model of interacting systems; one must know the Ising model. That is the chronology of what I will be doing. Now let me briefly tell the hierarchy, the way things happened in statistical mechanics as it evolved. It started with Maxwell, who worked out the velocity distribution; that was the first step. Then came Ludwig Boltzmann, who, as I told you, was so
impressed that for the rest of his life he carried this paper of Maxwell with him. He is the one, so this is the chronology, who started very seriously introducing the concepts of probability theory, for which he was enormously criticized, because in those days everybody believed that the mechanical laws of motion would be able to explain everything. Boltzmann introduced many, many things, one of them the Boltzmann kinetic equation. From Maxwell's distribution all these things came out: PV = (1/3)Nm<c^2>, C_V = (3/2)R, then expressions for the viscosity, which I forget, but something involving the density and the cross-section sigma. A huge number of things flowed from this equation. So the birth of the kinetic theory of gases started with Maxwell, and that was also the start of statistical mechanics. Maxwell and Boltzmann could connect to thermodynamics and at the same time get expressions like these, which are transport properties, and as you know this approach also gave the universal gas constant, which was ultimately used by many people, including Einstein, to estimate Avogadro's number, one of the landmarks of Einstein's 1905 papers. Boltzmann wanted to go beyond this collisionless ideal gas. He introduced binary collisions and the binary collision operator, which is used even today, but in the process he had to introduce, essentially, many-body distribution functions, because the probability of a particle at (r1, v1) colliding with another particle at (r2, v2), the pair going off to (r1', v1') and (r2', v2'), is hugely complicated; these are three-dimensional vectors. So he made the approximation that the two-particle distribution f2(r1, v1; r2, v2) factorizes into f1(r1, v1) f1(r2, v2); that is called the approximation of molecular chaos, for which by and large he was criticized. But then what came out of Boltzmann was the
rigorous definition of entropy, S = k ln Omega, which ultimately led to what was very heavily used by Gibbs. Then around that time we had van der Waals, and as I told you last day, van der Waals had his famous equation of state, which has an unphysical loop in the P versus density isotherm, and Maxwell fixed this with the tie-line (Maxwell) construction. This work motivated a person across the Atlantic, Willard Gibbs, who made an important observation. He of course knew all of Boltzmann's work; he knew the difficulty Boltzmann suffered in trying to carry through a Newtonian description, in which everything obeys the laws of conservation and Newton's laws of mechanics, and of course, as I told you, we cannot even solve the three-body problem. Gibbs then realized that there could be a time-independent approach, because if we have to solve for an Avogadro number of particles, and we are interested in at least one or two cc, then you cannot solve for that many molecules when you cannot even do three bodies, even for hard spheres, forget about water. So he realized very early that the time-dependent approach of Boltzmann was not going to go very far for equilibrium properties, because for equilibrium properties it should not be that difficult; after all, as I told you yesterday, we have millions of glasses of water and they all have the same properties. Then he realized one very important thing, which I forgot to tell last time: such a system of an Avogadro number of particles has a huge number of microscopic states, and if I have billions and billions of glasses half full of water, then each of them will be in a different microscopic state, and since each of them, in a different microscopic state at any given instant, is exploring other microscopic states, it makes sense to talk of a distribution. That is why he introduced the concept of an ensemble, which I will elaborate a little bit more, because these things are now, other
than in the book of Tolman, I have not seen anybody do a really good job of describing; Tolman of course is a little old-fashioned and does things differently. So then Willard Gibbs, and we do not need anything beyond Gibbs as of now, explained how to think of van der Waals, and explained phase equilibrium in his famous work on the thermodynamics of heterogeneous substances, and Maxwell was extremely impressed; Boltzmann meanwhile was off on his time-dependent venture. Gibbs was unmarried, lived near the Sterling laboratory at Yale, and the only journey he used to take was from his home to the second floor, though before that he had spent time in Europe, where he learned all these things. Then Maxwell died, at a rather young age, and the joke at Yale was that there was only one man who understood Gibbs, and he is dead. But Gibbs alone kind of went on and did many, many things. OK, now coming back: Gibbs introduced these ensembles and the concept of probability, and now I will briefly go through it without wasting too much time. You know, there are an enormous number of jokes about probability theory, and many poems on probability; my favorite is from Sherlock Holmes. So now I go through it, but before that let me tell you about the books. This is not the book I have in mind; I told my student to put this list together. An Introduction to Probability Theory by Kai Lai Chung, famous in mathematics departments as KLC, is a wonderful book, which I studied when I went to take a course on probability theory. And this other one is absolutely fantastic; I got a lot of help from it when I solved the barrierless reaction problem, which is really quite significant and is a classic now. I could do that because I knew how to solve by the method of images and repeated
reflections; all those techniques I knew, so I could solve it. And there is another beautiful introductory book; I think all of them are available free on the internet. This one is beautiful because it talks of probability and determinism, and of the two together, and goes on deriving beautiful equations. So these three books on probability theory are highly recommended to everybody. This particular part of the lecture will now be motivation; I have already done a little of that. Then I will go through what a random variable is, and how you define a sample space, which is so important when you do statistical mechanics; then distributions, and conditional probability, which is extremely important because conditional probability is the one that goes over to become the radial distribution function. You have learned conditional probability and Bayes' theorem and things like that in your class 12, but it is the same thing we come back to again and again. Then we will do the central limit theorem; as the name says, mathematicians are not given to adjectives, not like chemists. We say everything is very interesting; they hardly say anything is interesting, to them it is trivial. Feynman has a wonderful joke about mathematicians. OK, so as I said, what is the motivation in statistical mechanics? The interest is in large macroscopic systems with many degrees of freedom, not several degrees of freedom, many, many degrees of freedom. One cc of liquid water contains an enormous number of molecules, and each has nine degrees of freedom, so we are dealing with huge numbers. If we want a Newtonian description, solving for the positions and momenta of each particle, you will not be able to do it even on computers; right now we are doing probably a few thousand particles, and even then we need the very same concepts we will be using here. And even if we had all those trajectories, it is
not necessary to have all that huge detail. When we started doing this, we could hardly simulate a few hundred water molecules or a few hundred Lennard-Jones particles, so we had to be a bit more elaborate in our descriptions, but computer simulation has since come a long way. So now here is a very important and very loaded statement: we no longer can afford, and it is no longer realistic or practical, to take a deterministic approach, which is the Newtonian approach; instead we have to go over to a probabilistic approach, which was already initiated by Maxwell with the single-particle distribution, the first thing we did. Now a very, very important point comes in: the experimental observables, for example the specific heat or the density, are all averages over random variables. If you look, for example, at the standard textbook of Castellan, it has a nice picture of how to get the pressure of a liquid. What Castellan showed was a column filled with a colored liquid, and if you now look at it with an optical microscope, the kind we have in our undergraduate laboratories, you will find that the height is continuously fluctuating. So there is no such thing as a strictly constant pressure; the pressure we mean is an average over that fluctuation, a time average. This is a very important point in understanding that experimental observables are themselves random variables. And of course when you do spectroscopy you get a broad distribution, because the probe, a dye molecule you put into the liquid, faces different environments: when you put dye molecules in at 0.1 molar or 0.01 molar, there are some 10^20 dye molecules, each of them facing a different environment, so what you see in spectroscopy is again an average over all these molecules. Now I will quickly go through something; I am just using this slide, I don't
need to use that, but it saves a little time and makes things more precise. Now we have to think, as I start moving toward statistical mechanics proper: so far we have argued that we give up the deterministic approach of Newtonian dynamics and go over to a probabilistic approach, and the reason is not only that a deterministic approach is not feasible; the probabilistic approach should be the method of choice, because, as I said, everything is fluctuating. So the central quantity in statistical mechanics is a random variable X. For example, the energy H of a system in equilibrium with a bath is a random variable. A random variable means we do not know, and will never know, the exact instantaneous value of the quantity, but we know its average, we will know its second moment, and if necessary its fourth moment. This quantity is the central quantity in our description of statistical mechanics. For example, in a liquid I may want the position r1 and orientation omega1 of one water molecule, and then I would like to know how far away another water molecule is, at position r2 with orientation omega2. In statistical mechanics, in chemistry and physics, whether we are doing a conductivity or anything else, we are dealing with these quantities all the time. So X can be any quantity of our interest. Some examples of random variables: when you throw a die, the random variable is the number that comes up, so the possible outcomes are 1 through 6, each with probability 1/6; similarly for a coin, head or tail, each with probability 1/2. Then there are things which come to mind immediately: a random walk, stepping left or right in one dimension, is like the tossing of a coin; the distance travelled by different walkers in a given time
interval; the instantaneous pressure of a liquid, as I said; the number of nearest neighbours of a molecule in a liquid of spherical molecules, which is anywhere between 9 and 13. These are very important quantities: if you want to describe a random walker, how far it moves and how fast it moves, which is diffusion, or the pressure, everywhere we are dealing with this kind of randomness, which is essentially the same as the throwing of a die or a coin. So a random variable, let me summarize, is a quantity of interest which can take a number of values which are not predetermined but which follow a distribution, as we will see. Now, the collection of all the outcomes of an experiment is the sample space. This is very important: when in statistical mechanics we integrate over all the possible values, all the possible values together define the sample space. For example, the sample space of a single coin toss is head or tail, of size 2; toss twice and the sample space becomes 4, because you can have head-head, head-tail, tail-head or tail-tail, and that is the total weight of the sample space. Similarly with dice: if you throw twice, there are 36 possibilities. So the sample space, which you have done in school with Venn diagrams and similar things, is just the collection of possible events. We need the collection of all the events, the total sample space, in order to find the probability of a given event. Once we have all the events, we can say: the probability that a variable has a value, in the continuous case between x and x + dx, is P(x) dx. In the discrete case it is easy: for a die, the probability that it shows the value 1 is 1/6. In the continuum you need the measure between x and x + dx; for a uniform distribution it will be the length dx divided by the total length L, and it will be
normalized, because when you integrate dx over the interval you get L. These are fairly trivial, straightforward things; it gets a little more difficult later. For now we just write the probability of an outcome between x and x + dx as P(x) dx, normalized because the variable has to take some value somewhere; for the uniform case you can take the interval from -L to +L, and there are many, many examples like that. A very famous one, as I told you, was given by Maxwell, and you have to realize that it came out of nowhere: at that time, around 1859 or 1860, nobody talked of probability distributions except mathematicians, and those ideas were not applied to physics or chemistry. Maxwell derived his velocity distribution, and the derivation of this particular form you have done in your undergraduate textbooks. It is normalized because in 1D, for example, the velocity can go from minus infinity to plus infinity, and there is a normalization constant. The average velocity, we know, goes to 0, but the mean-square velocity gives the width of the distribution, which in turn is dictated by the temperature, and almost everything in the elementary kinetic theory of gases flows from this equation.
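To make the discrete examples from earlier in the lecture concrete, here is a small sketch (my own illustration, not part of the lecture; the function name is mine) that simulates throws of a fair die and checks that the sample average approaches the expectation (1 + 2 + ... + 6)/6 = 3.5:

```python
import random

def die_average(n_throws, seed=0):
    """Throw a fair six-sided die n_throws times and return the sample mean.

    The sample space of one throw is {1, 2, 3, 4, 5, 6}, each outcome
    with probability 1/6, so the expectation is 3.5.
    """
    rng = random.Random(seed)
    return sum(rng.randint(1, 6) for _ in range(n_throws)) / n_throws
```

With a large number of throws the sample mean lands within a few hundredths of 3.5, an elementary instance of the averaging over a random variable that underlies every macroscopic observable.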
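The one-dimensional random walk mentioned above can be sketched the same way (again my own illustration, with hypothetical parameter names): each of many independent walkers takes steps of +1 or -1, like coin tosses; the mean displacement stays near zero, while the mean-square displacement grows linearly with the number of steps, which is the signature of diffusion:

```python
import random

def walk_displacements(n_walkers, n_steps, seed=1):
    """Final displacements of independent 1D random walkers.

    Each step is +1 or -1 with probability 1/2, so for the ensemble
    <x> = 0 and <x^2> = n_steps (diffusive growth).
    """
    rng = random.Random(seed)
    return [sum(rng.choice((-1, 1)) for _ in range(n_steps))
            for _ in range(n_walkers)]
```

Averaging over the ensemble of walkers plays exactly the role of the ensemble average Gibbs introduced: no single walker's path is predictable, but the distribution of displacements is.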
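Finally, the 1D Maxwell velocity distribution just discussed, P(v) = (m / (2 pi k_B T))^(1/2) exp(-m v^2 / (2 k_B T)), is a Gaussian of zero mean whose width is set by the temperature. A sketch in reduced units (k_B T / m = 1; the units and names are my own choice) samples it and illustrates both statements from the lecture, that the average velocity vanishes and that the mean-square velocity gives the thermal width:

```python
import math
import random

def sample_velocities(n, kT_over_m=1.0, seed=2):
    """Sample 1D velocities from the Maxwell distribution: a Gaussian
    with mean 0 and variance kT/m (reduced units here)."""
    rng = random.Random(seed)
    sigma = math.sqrt(kT_over_m)  # thermal width of the distribution
    return [rng.gauss(0.0, sigma) for _ in range(n)]
```

The second moment reproduces the equipartition statement (1/2) m <v^2> = (1/2) k_B T, which is where the elementary kinetic-theory results quoted earlier come from.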