So, let us begin this course on non-equilibrium statistical physics with a little recap of equilibrium statistical mechanics, which I presume all of you have had a course in. Now, if you recall, equilibrium statistical mechanics does not anywhere have the notion of time in it. Things are supposed to be in thermal equilibrium. When you deal with just the averages of microscopic variables, you have this thing called thermodynamics; when you include fluctuations about the average, you need statistical mechanics in various ensembles, depending on the external conditions, and that gives you all the information that you can find about a macroscopic system in thermal equilibrium. And just to recall: a state of thermal equilibrium does not mean a state of mechanical equilibrium. If you take a container of gas and put it at a fixed temperature, then even though the individual molecules are moving and there is a lot of dynamics going on, certain overall macroscopic quantities do not change with time: the average values do not change, the variances do not change, no moments change with time at all. So you can get away with removing little t from consideration and dealing with everything in thermal equilibrium. Now of course the systems that you look at are very complicated. Typically they have a very large number of particles or atoms or molecules, of the order of Avogadro's number for instance, and it is completely futile to write down the equations of motion for every one of these objects, try to solve them, and then calculate from the solutions whatever macroscopic quantities you want to compute.
But the need for statistical physics is deeper than that. It is not just the fact that you have a very large number of degrees of freedom; the fact is that in almost all cases, if you look at classical systems for instance, the systems are chaotic in the dynamical-systems sense. In other words, even if you could write down all these equations, you would soon find that your computational power is not sufficient to enable you to find out what happens to the system, for given initial conditions, once a sufficient amount of time has elapsed. And that is true even for very, very small systems. Even if you took a little box with a handful of particles in it, the system is in general chaotic; with just two particles it is still generally chaotic; and even a single particle in a box can become chaotic, depending on the shape of the box. If there are obstacles in between, fixed scatterers and so on, then the future trajectory of the particle is chaotic and essentially not predictable in practice.
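As an aside, the sensitive dependence on initial conditions that makes such long-time prediction hopeless is easy to exhibit in a one-dimensional caricature. The sketch below uses the logistic map x → 4x(1 − x), a standard chaotic toy system of my choosing (not the billiard with scatterers discussed above): two initial conditions differing by one part in a billion end up completely decorrelated after a few dozen steps.

```python
# Sensitive dependence on initial conditions in the logistic map x -> 4x(1-x),
# a standard chaotic toy system (not the particle-in-a-box of the lecture).
x, y = 0.3, 0.3 + 1e-9   # two initial conditions about 1e-9 apart
max_sep = 0.0

for step in range(60):
    x = 4.0 * x * (1.0 - x)
    y = 4.0 * y * (1.0 - y)
    max_sep = max(max_sep, abs(x - y))

print(max_sep)  # order one: the 1e-9 difference has been amplified completely
```

Shrinking the initial separation buys almost nothing: it grows roughly exponentially, so halving it delays the divergence by only about one step.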
So right away you know that there is need for some kind of statistical or probabilistic approach to the entire thing. But fortunately for us, taking the example of a gas, the kind of information you want from the system does not in general involve microscopic degrees of freedom. You are not really interested in finding out the velocity history of an individual molecule as a function of time. You are interested, for instance, in the pressure exerted by this gas, which means you would like to know the average force exerted by it on the wall per unit area; you are not interested in which particles are hitting the wall at any given instant of time. Of course, if you sketched the instantaneous force of the molecules on the wall at the level of femtoseconds, you would find a very grainy kind of graph, fluctuating very widely. But if you looked at it on time scales which are more macroscopic, say of the order of microseconds, then you discover that so many collisions have occurred in your time span that things have averaged out, and you get a certain average which is quite robust and does not change with time. Similarly, you would like to know not the individual energy of a molecule but the average kinetic energy of the system as a whole, the average internal energy, and so on. So the questions you ask are also questions about statistical averages. Now, you can compute a statistical average in two different ways in such a complex system. You could, for instance, follow a given particle and find out what its history is; and if there is a sufficient degree of mixing among the particles, then the time history of one particular particle is essentially the same as the history, at a given time, of all the particles. In other words, the time average over certain degrees of freedom would be the same as
the ensemble average over the collection of particles. So the system in some sense self-averages as time elapses, which is the reason why you can eliminate time averages and use ensemble averages. Now, we will say much more about ensemble averages more carefully; this is just to give you a rough idea of what is involved in this business. Of course, once I say an ensemble average, you have to tell me an average over what distribution. The whole of statistical mechanics is really a specification of the correct probability distribution function over which statistical averages are taken, and which we believe will yield the same answer as long-time or infinite-time averages. This equality of the long-time average and the ensemble average, over suitable distribution functions, is what makes statistical mechanics work. Then, of course, what specific average you take depends on the conditions to which you subject the system. Suppose, for instance, you had an isolated system: let us take a fluid with a given number of particles in some enclosed container, and let us say this container is completely isolated from the rest of the universe, sitting somewhere without interaction. The particles are in a volume V, the number of particles is N, and let us say the total energy of the system is E. If this is completely isolated, with no exchange of either matter or energy with the rest of the universe, then in thermal equilibrium this system's energy does not change; it remains constant, a constant of the motion. The kind of average you would take would then be over a distribution which takes into account the fact that there is one constant of the motion in the problem. Whatever happens inside you do not bother about, but the total energy of the system is constant; in other words, the Hamiltonian of the system
(and I presume that you are familiar with classical dynamics at the Hamiltonian level), which is a function H(q, p) of the whole set of generalized coordinates and generalized momenta of all the particles. For a conservative Hamiltonian system which does not explicitly involve time, an autonomous system (we make certain assumptions here, which we will make more clear as we go along), the numerical value of this quantity remains constant as a function of time. So as the q's and p's change with time according to the equations of motion, this function of all the generalized coordinates and momenta remains constant. You could now ask: what is the distribution in phase space, as a function of all the q's and p's, that I am going to work with? A good first guess would be to say that ρ_eq(q, p), with a subscript 'eq' for equilibrium, is proportional to δ(H(q, p) − E), a Dirac delta function, which says that the total energy of the system remains constant. In other words, the motion of the system is restricted to a hypersurface on which H(q, p) is equal to a prescribed constant value E. So that is the equilibrium distribution function. Now of course there are subtle questions arising here, which is to say that maybe there are other constants of the motion: maybe the total angular momentum is constant, or the total linear momentum is constant. There are other constants of the motion of that kind which you can identify even in the simplest of cases. How come I am using a distribution function which depends only on the Hamiltonian and not on those other constants of the motion? That is a much more subtle question, and I leave it as a question for you to ponder over for the moment; we will come back to it much later, and at some stage we will answer it. But it is worth asking: why is
it that the other constants of the motion do not play a role? They actually do, but it is not explicit here, and I want you to think about it. Now, this distribution function is called the microcanonical ensemble. So for an isolated system you have the familiar microcanonical ensemble, which is essentially saying that the motion is restricted to an energy surface. Well, there is always a resolution in these matters; this is the energy of a macroscopic system, so there is always some resolution ΔE. If, schematically, this is the surface on which the energy is E, what is really happening is that the system is restricted to a little shell in which the Hamiltonian lies between E and E + ΔE. The system remains in the shell: it does not go inside, it does not go outside. That, essentially, is your distribution function, and that is the simplest case. Now, as you know from equilibrium statistical mechanics, the very fact that you have this conservation law tells you a great deal. Once I have a system which is isolated and in thermal equilibrium, in other words macroscopic averages over this ensemble do not change as a function of time, you can make a large number of deductions from it, and that is the business of equilibrium statistical mechanics. Just to refresh your memories, the first thing you do is to say: aha, suppose I draw an imaginary partition here and break this into two roughly equal pieces, a system 1 and a system 2. Let us suppose this one has N₁ particles, energy E₁ and volume V₁, and that one has N₂ particles, energy E₂ and volume V₂, and the whole system is in thermal equilibrium. Then of course the energy E₁ is not constant, because it is constantly suffering fluctuations from the other side, and vice versa. Particles from here are able to go there, particles from there are able to go here; the energy of this system
fluctuates, the energy of that system fluctuates, and so on. So if you have taken the total number of degrees of freedom and partitioned it with an imaginary partition into two pieces, then you have two still-macroscopic systems, but neither of them is an isolated system, because each of them is in interaction with the other. But now we impose the condition that these two systems be in equilibrium with each other, and we ask what conclusions we can draw from that fact. Then you need to introduce the concept of the number of microstates of this system. Now, what do I mean by a microstate? Taking our example of gas particles: specifying the three position components and the three momentum components of a particle specifies its state completely at any instant of time (we assume there are no internal degrees of freedom). You then have to do this for each one of the particles, and that is a microstate of the system. If at any instant of time you tell me the exact value of the momentum of each particle, all the components, and the position of each particle, then you have a microstate of the entire system. And now the question asked is: how many microstates does this full system have? You assume that it has some total number Ω_total of microstates, and that under these conditions system 1 has Ω₁ microstates and system 2 has Ω₂ microstates. Then a fundamental assumption is that the total number of microstates of the system is essentially Ω₁ times Ω₂, because these are both very large. You could say, well, there are some particles near the partition which are the fellows actually exchanging energy, driving fluctuations into each system, but that number is infinitesimal compared to Ω₁ and Ω₂. Because imagine, for
example, a partition in a box which is 1 cubic metre with 10²⁴ particles in it. I put a little imaginary partition there and ask how many particles at any given time are passing through it. That will be of the order of (10²⁴) to the power two-thirds, since it is proportional to the surface; that is 10¹⁶, which is one part in 10⁸. So you see that the number of degrees of freedom which, so to speak, talk to both systems is negligible compared to the number of degrees of freedom on either side, and effectively Ω_total = Ω₁ Ω₂. And these are enormous numbers, huge numbers: for a litre of gas at normal temperature and pressure the number of microstates is astronomically large, because you have to take the full phase-space volume and then divide by the cell size in phase space (a factor of h for each coordinate-momentum pair), and you end up with a very, very large number. So you have the product of two large numbers, and then it is convenient to take the log of it, so that it becomes completely additive, and that gives you a measure of how many microstates there are on each side. As you know, the entropy of any system is given by the Boltzmann formula S = k_B ln Ω; the constant k_B fixes the dimensions of the entropy, and the entropy is essentially the log of the number of microstates of the system. That is an additive quantity. So what it is saying is that if the entropy of the system is a measure of the disorder, then the total system has an entropy which is the sum of the two entropies. You see immediately that it is natural that the log of the number of accessible microstates appears here. Then, imposing the condition that this entire system will be in its most probable state, you immediately start getting information about it. For instance, the first piece of information you derive is that Ω is a function of the energy, and of course of the other two
variables as well, and the first thing you derive is that ∂ ln Ω₁(E₁)/∂E₁ = ∂ ln Ω₂(E₂)/∂E₂. By the way, I should note that E₁ + E₂ = E; the total energy is conserved, and again I neglect the interaction energy at the partition compared to the energies of these systems. Now, what do you call this quantity, the derivative of the log of the number of accessible microstates of a system with respect to its energy? The inverse temperature. It has dimensions of inverse temperature, since the energy is in the denominator: ∂ ln Ω/∂E = 1/(k_B T). So the condition essentially says 1/(k_B T₁) = 1/(k_B T₂); it says the temperatures on the two sides are equal. Similarly, you could differentiate with respect to V. Although I have not written it explicitly, Ω is a function of V as well, and you could ask what the condition ∂ ln Ω₁/∂V₁ = ∂ ln Ω₂/∂V₂ tells you, using the constraint V₁ + V₂ = V. What is this telling you? This essentially tells you the pressures on the two sides are equal. Now, I leave you to verify that this is so by a very simple check. What you need to do is to use the fact that the entropy is given by the Boltzmann formula; then the derivative of the entropy with respect to volume will give you the pressure, because if you go back and look at the laws of thermodynamics, that is exactly what it does. So let us write down those so-called laws of thermodynamics. They follow from statistical mechanics, but let us write them down. What does the first law of thermodynamics say? Let us look at a fluid, just a single molecular species, at some volume V and some temperature T and a fixed number of particles N. What do the laws of thermodynamics say about the internal energy of the
system? Well, you start by saying that if I supply a certain amount of heat đQ to the system, and that is an imperfect differential, then đQ = dE + đW: đW, another imperfect differential, is the incremental work done by the system, and what is left, dE, is the change in the internal energy of the system. That is a perfect differential, because E is a state function. On the other hand, Q and W are not state functions, which is why their differentials are imperfect until you multiply them by integrating factors; then you get perfect differentials. And what does the second law of thermodynamics say? It says that for reversible processes, if you take đQ and multiply it by 1/T, that is the integrating factor which makes dS = đQ/T a perfect differential. So on this side you have T dS, and on that side you have dE + P dV; that is the familiar first and second law combined, for reversible processes. All right, so this immediately says dE = T dS − P dV. If you allowed the number of particles to change as well, then there would be another generalized-force term here in addition to −P dV: it costs a certain amount of energy to change the number of particles, and the cost in energy per particle is called the chemical potential μ, giving a term μ dN. It follows, since dE is a perfect differential, that ∂E/∂S = T, or ∂S/∂E = 1/T, and that is what I had before, so it is completely consistent with thermodynamics. Similarly, it says that ∂E/∂V is −P, and so on; so you can derive all the thermodynamic quantities. And while I am at it, let me also point out that this relation generalizes, and you
can add many other terms here. In general, what you have is dE = T dS + Σᵢ fᵢ dXᵢ, where the fᵢ are generalized forces and the Xᵢ are the corresponding generalized displacements, the extensive variables; the simplest pair is fᵢ = −P with Xᵢ = V, and then a μ here and a dN there, and so on. For instance, if you applied a magnetic field, the system would respond by changing its magnetization, so the force in this case is the magnetic field and the response is the magnetization, the extensive variable. If you applied an electric field, the system could change its polarization, so again you have a generalized force, the electric field, and a response, the polarization, and so on. This essentially exhausts the content of the laws of thermodynamics: you derive all sorts of relations using the fact that these are all perfect differentials, that they are all state functions. So, to come back here: in the microcanonical ensemble, the fact that these two parts of the system are in equilibrium with each other immediately tells you that the temperatures have to be equal, the pressures have to be equal, and the chemical potentials have to be equal.
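The microcanonical identifications used above, ∂ ln Ω/∂E = 1/(k_B T) and ∂ ln Ω/∂V = P/(k_B T), can be checked numerically for the classical monatomic ideal gas, for which ln Ω ≈ N ln V + (3N/2) ln E up to terms independent of E and V. That form of ln Ω is a standard result I am assuming here, not something derived in the lecture; in reduced units (k_B = 1) the derivatives reproduce E = (3/2) N k_B T and PV = N k_B T:

```python
import math

def ln_omega(E, V, N):
    # ln of the microstate count of a monatomic ideal gas, up to
    # E- and V-independent constants (assumed standard form)
    return N * math.log(V) + 1.5 * N * math.log(E)

k_B = 1.0                     # reduced units
N, V, E = 1000.0, 2.0, 750.0  # illustrative values
h = 1e-6

# numerical partial derivatives of ln Omega
dlnO_dE = (ln_omega(E + h, V, N) - ln_omega(E - h, V, N)) / (2 * h)
dlnO_dV = (ln_omega(E, V + h, N) - ln_omega(E, V - h, N)) / (2 * h)

T = 1.0 / (k_B * dlnO_dE)    # temperature from 1/(k_B T) = dlnOmega/dE
P = k_B * T * dlnO_dV        # pressure from P/(k_B T) = dlnOmega/dV

print(T, 2 * E / (3 * N * k_B))  # E = (3/2) N k_B T
print(P * V, N * k_B * T)        # P V = N k_B T
```

The same derivatives taken for two subsystems sharing a fixed total E and V are what force T₁ = T₂ and P₁ = P₂ above.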
So we can also write ∂ ln Ω₁/∂N₁ = ∂ ln Ω₂/∂N₂, which implies that μ₁/T₁ = μ₂/T₂; and since T₁ = T₂ already, the chemical potentials are also equal. So much for the microcanonical ensemble. (By the way, we will come back to the laws of thermodynamics at suitable intervals.) Now, there is one more piece of information that goes into thermodynamics which I have not put in here. While dE = T dS + Σᵢ fᵢ dXᵢ expresses dE as a perfect differential in dS, dV and dN, you could ask: can I make a similar statement about Σᵢ Xᵢ dfᵢ? The answer is that you need a little more physical input about what sort of dependence E has on S, V and N. You see, the relation only says that E is a function of those three variables; it does not say what sort of function. But now if you make the assumption that it is a homogeneous function of degree 1 of these variables, that is, if you double the number of particles keeping the intensive quantities constant the internal energy doubles, and so on, if you make that assumption of extensivity, then you get the Euler relation E = T S + Σᵢ fᵢ Xᵢ, and from that you derive the Gibbs-Duhem relation S dT + Σᵢ Xᵢ dfᵢ = 0; the rest of thermodynamics proceeds that way. But we are not interested in thermodynamics as such, because first of all it applies only in equilibrium, and secondly it only deals with average quantities. It does not tell you anything about fluctuations about the average; in fact, anything which involves fluctuations about statistical averages has to be put in as input information into thermodynamics. Can you give me a notable example of
such an input? Well, when you deal with gases in thermodynamics, whether it is a real gas or an ideal gas does not matter, you cannot compute specific heats within the framework of thermodynamics, because specific heats involve the variance of the energy, and thermodynamics does not go to the level of the variance; it sticks at the level of the first moment, the average. So the specific heat has to be put in as an input parameter into thermodynamics. Now, in your elementary courses in physics you learnt that the monatomic ideal gas has a C_V which is (3/2) k_B per particle and a C_P which is (5/2) k_B per particle. Now you might say: but where did that basic information come from? It came in because in that particular case we know the equation of state, PV = nRT, which by the way is simply written down empirically in thermodynamics. Once you know the equation of state of a system, matters become different altogether; one of the purposes of this whole business is to find the equation of state, and if you give it to me, then I can compute specific heats. For a real gas, the van der Waals gas for example, if you have an equation of state you can start trying to compute these quantities. But a priori, in an arbitrary thermodynamic system, given only thermodynamic information, you do not know what the specific heats are, you do not know what the susceptibilities are, you do not know what the compressibility is, you do not know what the response functions are; they are put in from outside.
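The statement that the specific heat is a fluctuation quantity can be made explicit: in the canonical ensemble C_V = (⟨E²⟩ − ⟨E⟩²)/(k_B T²), the variance of the energy, which is exactly the object thermodynamics never sees. Here is a minimal check for a two-level system (my own toy example: levels 0 and ε, reduced units with k_B = 1), comparing the fluctuation formula against a direct numerical derivative of ⟨E⟩ with respect to T:

```python
import math

def moments(eps, beta):
    # canonical <E> and var(E) for a two-level system with levels 0 and eps
    z = 1.0 + math.exp(-beta * eps)
    e_avg = eps * math.exp(-beta * eps) / z
    e2_avg = eps**2 * math.exp(-beta * eps) / z
    return e_avg, e2_avg - e_avg**2

k = 1.0           # Boltzmann constant in reduced units (assumption)
eps, T = 1.0, 0.7
e_avg, var_e = moments(eps, 1.0 / (k * T))

# specific heat from the fluctuation formula C_V = var(E) / (k T^2)
c_fluct = var_e / (k * T**2)

# specific heat as the numerical derivative d<E>/dT
dT = 1e-6
e_plus, _ = moments(eps, 1.0 / (k * (T + dT)))
e_minus, _ = moments(eps, 1.0 / (k * (T - dT)))
c_deriv = (e_plus - e_minus) / (2 * dT)

print(c_fluct, c_deriv)  # the two agree
```

The fluctuation form also makes it obvious that C_V cannot be negative, since it is a variance over T².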
Yes, that is a good point. Thermodynamics will, however, say the following: the fact that you are in thermal equilibrium means that you are always at the minimum or maximum, depending on how you define it, of some thermodynamic potential, and if it is supposed to be a stable equilibrium, then that gives you certain convexity properties. Think of a function of two variables shaped like a parabolic bowl: the bottom of it is the state of stable equilibrium. So the fact that you have stability tells you that the various second partial derivatives have specific signs, and that will tell you which quantities are positive and which are negative. For instance, you can show that thermodynamic stability implies that the specific heat at constant volume of a system cannot be negative, and that the compressibility of a system cannot be negative. So certain inequalities are obtained, but you cannot find the actual values of these quantities; for that you need a little more information. If you give me the Hamiltonian and you tell me how to do statistical mechanics, then in principle, if I can compute what is called the partition function of the system, I can find all the thermodynamic variables, all the information that I need; all moments can be found. Again, in thermal equilibrium. But now, in this course, we are going to go out of thermal equilibrium, so even the fundamental postulate of equilibrium statistical mechanics goes out of the window. As you know, the fundamental postulate in some sense defines what we mean by thermal equilibrium: it says that when you are in thermal equilibrium, all accessible microstates of the system are equally probable. But that still begs the question of what I mean by thermal equilibrium. Well, the short answer is: a state in which all macroscopic averages are time independent. And how is that achieved?
It is achieved, obviously, by taking a probability distribution function which is itself time independent; only then, when you take averages with respect to it, will everything be time independent. So the statement is: in thermal equilibrium, which is defined by saying that the distribution function is time independent, all accessible microstates are equally probable. Now, this is not derivable from mechanics, because if it were, then we could go home; statistical mechanics would become a special case of mechanics, classical or quantum. No one has ever been able to derive this; on the other hand, people have been able to find conditions under which it is true, and so on. What we do know is the following: if at any instant of time you discover that all accessible microstates are equally probable, then you can show that they will remain so at all times, and that is a statement about the dynamics. So I repeat: if at any time you can show that in an isolated system all accessible microstates have equal probabilities, then you are guaranteed that it will remain so for all time; in other words, the system is in thermal equilibrium. But we are trying to do the converse of it: we are trying to say, if the system is in thermal equilibrium, what can you say about the probabilities of the various accessible microstates? And the postulate of equilibrium statistical mechanics says that they are equally probable. So I want you to appreciate the fact that this is not derived from mechanics; there is an extra postulate that has gone in. All right, so let us get back to where we were: this is what happened in the case of the microcanonical ensemble.
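The forward statement above, that equal probabilities, once established, persist in time, can be illustrated with a toy microstate dynamics. A sketch under my own simplifying assumption that each time step is a mixture of permutations of the microstates (a doubly stochastic matrix, a discrete stand-in for measure-preserving Hamiltonian flow): the uniform distribution is then invariant.

```python
import random

random.seed(1)
M = 6  # number of accessible microstates

def permutation_matrix(perm):
    # stochastic matrix sending state i to state perm[i] with probability 1
    return [[1.0 if perm[i] == j else 0.0 for j in range(M)] for i in range(M)]

# a doubly stochastic evolution: equal-weight mixture of random permutations
perms = [random.sample(range(M), M) for _ in range(4)]
mats = [permutation_matrix(p) for p in perms]
W = [[sum(m[i][j] for m in mats) / len(mats) for j in range(M)] for i in range(M)]

p = [1.0 / M] * M        # all accessible microstates equally probable
for _ in range(50):      # evolve for many steps
    p = [sum(p[i] * W[i][j] for i in range(M)) for j in range(M)]

print(p)  # still uniform: every entry remains 1/6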
So the whole of what I have said so far is in the microcanonical ensemble. Now you can say, look, this is not very realistic, because a much more realistic physical situation is not a system in isolation from the rest of the universe but a system which is interacting with its environment, like a glass of water placed on this table: it is certainly in thermal equilibrium with the external atmosphere. So there is a system with a much larger number of degrees of freedom, namely the atmosphere, which is maintaining this system in equilibrium at some temperature. One can then ask: what is the correct ensemble, what is the correct distribution function, in this case? So we now have an example of a system which is closed: not an isolated system, but a closed system in thermal equilibrium. We have in mind a huge heat bath or reservoir, which is assumed for example to be isolated and in thermal equilibrium, and inside it you have your little system of interest; all the rest of it is the heat bath or reservoir. We will assume that this system has a fixed number of degrees of freedom, so it does not exchange any matter with the surroundings, but it can certainly exchange energy. For example, it could be a beaker of water which can be heated; we are assuming that the number of particles remains the same. Then what happens in such a case? The energy of the system is no longer constant. Let us suppose the energy is E; that is fluctuating very rapidly as a function of time, and if the system is sufficiently small compared to the number of degrees of freedom of the bath, then the fluctuations in this energy can actually be quite large, can be enormous.
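How large those fluctuations are depends on the size of the system: for N weakly coupled degrees of freedom the relative fluctuation of the total energy falls off like 1/√N, so a truly small system fluctuates wildly while a macroscopic one hardly fluctuates at all. A quick Monte Carlo sketch, using my own toy model of N classical degrees of freedom whose individual energies are exponentially distributed with mean k_B T (reduced units):

```python
import math
import random

random.seed(7)

def rel_fluct(n, samples=5000):
    # relative fluctuation std(E)/<E> of the total energy of n degrees of
    # freedom, each energy drawn from an exponential of mean 1 (= k_B T)
    totals = [sum(random.expovariate(1.0) for _ in range(n))
              for _ in range(samples)]
    mean = sum(totals) / samples
    var = sum((t - mean) ** 2 for t in totals) / samples
    return math.sqrt(var) / mean

print(rel_fluct(1), rel_fluct(100))  # roughly 1.0 and 0.1: scales as 1/sqrt(n)
```

For a mole of particles the same scaling gives a relative fluctuation of order 10⁻¹², which is why macroscopic energies look sharp.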
So there could be a huge scatter about whatever average you have out there. The question then is: what is the equilibrium distribution function in this case? The temperature of the system is essentially decided by the bath; the temperature is whatever the bath fixes it to be, and in fact our ignorance of what is going on in the heat bath is summarized in the parameter called temperature. The external bath sets the temperature, and then whatever internal dynamics happens here decides which states the system will be in, and so on. This is called the canonical ensemble. And what would ρ_eq be in this case? Once again it turns out to be a function of the energy of the system, for reasons which will become clear when I talk about the Liouville equation. But it is clearly not the delta function δ(H − E), because it is very clear that large energy fluctuations are driven into this system by the environment; it is some function of the Hamiltonian, but certainly not a delta function at E. And you can derive it: you can actually take this setup, make the system smaller and smaller compared to the bath, and ask what happens to the distribution. You then discover that in this instance ρ_eq is a functional of H, the Hamiltonian of the system, given by e^(−βH), where β = 1/(k_B T) and T is the bath temperature; it is proportional to this, and has to be normalized, and so on. This object is called the density matrix or density operator; it is already written in somewhat abstract form, and I will come back and tell you what the meaning of this is more carefully, both classically and quantum mechanically. In general it is some operator, but the more practical handle is afforded by what we mean by the partition function of the system. The canonical partition function is
equal to a summation; well, there is a formal expression for it, Z = Tr e^(−βH), where by trace I mean the sum of all the diagonal elements in some suitable basis, for instance the basis in which the Hamiltonian is diagonal, with specific eigenvalues. Written out in language which is familiar to you, this is Z = Σ e^(−βE), a sum over all the states of the system; let us write it properly as a sum over states. I have written it in a discrete notation, assuming the energies E₁, E₂, E₃, … form a denumerable set; whenever you have a continuous set of energy levels, you replace the summation by an integration in a straightforward way. But the sum is over the states of the system, and it might so happen that you have more than one state for a given value of the energy, as happens for example in the hydrogen atom, where you know that there are 3 quantum numbers n, l and m which specify the state of the electron, spin not being taken into account; and for a given value of n you have n² possible states (2n² once spin is included). So you have to sum over those, and therefore you could write Z = Σᵢ gᵢ e^(−βEᵢ), a summation over levels i, where gᵢ is the degeneracy of the energy level Eᵢ: you have to count that particular energy level as many times as its degeneracy. What happens when this becomes a continuous spectrum? This form is fine as long as you have discrete levels and you can say: at this energy level there are 2 possible states, at that energy level there are 4 possible states, and so on.
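The equivalence of the sum over states and the sum over levels weighted by degeneracies is easy to verify directly. Below is a toy hydrogen-like spectrum, E_n = −1/n² in arbitrary units with g_n = n² (spin ignored); the truncation at n_max is my own choice for illustration:

```python
import math

beta, n_max = 2.0, 30
E = lambda n: -1.0 / n**2   # toy hydrogen-like levels (arbitrary units)
g = lambda n: n**2          # degeneracy of level n, spin ignored

# sum over states: list every state (n, l, m) individually
Z_states = sum(math.exp(-beta * E(n))
               for n in range(1, n_max + 1)
               for l in range(n)             # l = 0 .. n-1
               for m in range(-l, l + 1))    # m = -l .. l

# sum over levels with degeneracy factors g_n = n^2
Z_levels = sum(g(n) * math.exp(-beta * E(n)) for n in range(1, n_max + 1))

print(Z_states, Z_levels)  # identical
```

The inner loops over l and m contribute exactly Σ_(l=0)^(n−1) (2l + 1) = n² terms per level, which is where the degeneracy factor comes from.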
So you are counting over all these states here. But what happens if you have a continuum of energy levels? You must replace the summation with an integration, and then you ask the question: how many states are there in an interval between E and E plus dE? So for a continuous spectrum this will go over to an integral of e to the minus beta E times the number of states there are between E and E plus dE, and that is generally written as rho of E dE; this gives you the number of states that the system has, due to its dynamics, between E and E plus dE. And what would you call rho of E? The density of states. So you can now see that this Boltzmann factor e to the minus beta E tells you the relative weight attached to the energy value E here, and the very fact that this decreases as E increases tells you that at any given temperature the higher energy states are less probable than the lower energy states, weighted by this quantity here. This quantity rho of E has nothing to do with the external heat bath; it has nothing to do with temperature or anything like that; it is a property of your system. So that has got to be computed by doing whatever you do for the system — classical mechanics, quantum mechanics, we do not care; the dynamics tells you what this density of states is. So this is where the mechanics part of it comes in, and this is the statistical part.
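For a continuous spectrum one can check a case where the integral is known in closed form. The density of states rho of E = square root of E is an assumed, illustrative form (it is what a free particle in three dimensions gives, up to constant factors), and the grid spacing and energy cutoff are likewise assumptions of the sketch.

```python
import numpy as np

# Sketch of Z = integral of rho(E) exp(-beta E) dE for a continuous spectrum.
# rho(E) = sqrt(E) is an assumed form (free particle in 3D, up to constants).
beta = 1.5
dE = 1e-3
E = (np.arange(50000) + 0.5) * dE        # midpoint grid, assumed cutoff at E = 50
rho = np.sqrt(E)                         # density of states (assumed)

Z_numeric = np.sum(rho * np.exp(-beta * E)) * dE   # midpoint-rule integral
Z_exact = 0.5 * np.sqrt(np.pi) * beta**-1.5        # exact: Gamma(3/2) / beta^(3/2)
print(Z_numeric, Z_exact)
```

The cutoff at E = 50 is harmless here because the Boltzmann factor has made the integrand negligible long before that.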
This is the ignorance factor: it tells you, we do not know what this external heat bath is doing to us at a detailed level, but we do know that the relative probability with which an energy value E occurs for the system is proportional to e to the minus beta E, okay. This has nothing to do with classical or quantum mechanics; it is true for the canonical ensemble for both classical and quantum systems. So if the energy levels are bounded from below — go from 0 to infinity, for example — the partition function is essentially the Laplace transform of the density of states. And the statement is that once you are in the canonical ensemble, once you know the partition function, all quantities can be found. For instance, if you take the log of this and differentiate it with respect to beta and put a minus sign, you bring this E i down here, and then you get the average value of the energy, etc. So once you have the canonical partition function, all of thermodynamics is derivable, and more: fluctuations about thermodynamic averages can also be derived, all moments are known. So once you have a knowledge of this basic quantity, the canonical partition function, you have complete information about the equilibrium statistical properties of this system, okay. Now you can ask, when do I work with this and when do I work with that? Well, the answers are already clear: for isolated systems you choose this, for systems in contact with a heat bath you choose this. So this is largely dictated by what conditions you put on the system, and then of course you can go from one thermodynamic potential to another, just as when you have an isolated system in thermal equilibrium you would say that the entropy of the system is at a maximum, okay.
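The claim that minus the beta-derivative of log Z gives the average energy can be checked numerically. The four-state spectrum below is an assumption chosen just for the check, with one doubly degenerate level.

```python
import numpy as np

# Check that <E> = -d(ln Z)/d(beta) for an assumed small spectrum
# (four states; the level at E = 1.0 appears twice, i.e. is doubly degenerate).
E = np.array([0.0, 1.0, 1.0, 2.5])

def lnZ(beta):
    return np.log(np.sum(np.exp(-beta * E)))

beta, h = 1.2, 1e-6
w = np.exp(-beta * E)
mean_E_direct = np.sum(E * w) / np.sum(w)                   # direct Boltzmann average
mean_E_from_Z = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)  # numerical -d(ln Z)/d(beta)
print(mean_E_direct, mean_E_from_Z)                         # the two agree
```

Differentiating twice with respect to beta would give the energy fluctuations in the same way, which is the sense in which all moments are contained in Z.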
When you have a system at a given temperature, with the volume and the number of particles fixed, then you would say the Helmholtz free energy is at a minimum; if the pressure, temperature and the number of particles are fixed, the Gibbs free energy is at a minimum, and so on. So you choose appropriate thermodynamic variables to control the system, and depending on that you are in ensembles of various kinds, but the standard ones are the microcanonical and canonical ensembles. And you generalize this a little bit more by saying that if this became an open system and could also exchange particles with the surroundings, then you go to what is called the grand canonical ensemble: essentially you include the chemical potential in this Boltzmann factor here, so you write the exponent as minus beta times E minus mu N, and that would be it, okay. You can find your own ensemble depending on what problem you are looking at; it does not matter, okay. A deep question is: will the answers that I get for a given physical system be the same if I compute them in different ensembles? This is called the equivalence of ensembles. It is not guaranteed; there are exceptions to this rule, and there are also the kind of exceptions which would violate this extensivity property here, in which case the various ensembles are not equivalent to each other. The textbook examples that you do in equilibrium statistical mechanics certainly satisfy this, so people do not bother about which ensemble you compute things in, but there are many interesting physical cases where the ensembles are not equivalent to each other, and the physics will dictate the correct one that you need to choose. So this is all I want to say about equilibrium statistical mechanics, and most of the time we will work in this canonical ensemble.
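The grand canonical weight e to the minus beta times E minus mu N is easiest to see in the smallest possible open system: a single level that can hold zero or one particle. The values of beta, eps and mu below are assumptions for the sketch; the average occupation that comes out is the familiar Fermi factor.

```python
import numpy as np

# Grand canonical sketch: one level of assumed energy eps holding N = 0 or 1
# particles, weighted by exp(-beta (E - mu N)); beta, eps, mu are assumptions.
beta, eps, mu = 2.0, 0.7, 0.3

N = np.array([0, 1])
weights = np.exp(-beta * (eps * N - mu * N))   # e^{-beta (E - mu N)} with E = eps*N
Xi = weights.sum()                             # grand partition function
mean_N = np.sum(N * weights) / Xi              # average particle number

print(mean_N, 1.0 / (np.exp(beta * (eps - mu)) + 1.0))   # equals the Fermi factor
```

Exactly the same bookkeeping, with more allowed values of N and E, generates any grand canonical average.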
The next question is: if I have a system in thermal equilibrium, described in the canonical ensemble, and I disturb it by putting an external force on it which might be time dependent — so I drive it out of equilibrium — what happens then? What happens to physical quantities? Is there a way in which I can compute statistical averages, is there a way in which I can find the time-dependent probability distributions? And a very important class of questions here, which we will spend a lot of time on, is the following: suppose I disturb the system away from thermal equilibrium and then switch off the disturbance. The system is in general expected, if the disturbance was sufficiently small, under normal conditions, to come back to equilibrium, and a very interesting question is how it relaxes to equilibrium. What are the time scales, how long will it take on the average to relax, and in what manner does it relax to equilibrium? That is a crucial question: will it do so fast, will it do so slowly, what is the characteristic time scale? This will of course depend on the dynamics of what is going on inside. So the question is, can we find a kind of general framework in which this question can be answered: a system moved a little bit away from equilibrium — how does it return to equilibrium? First we have to assure ourselves that it does return to equilibrium, because there could be cases when a small disturbance moves you out completely, either to another state of equilibrium, or somewhere else, or to a multiple set of states.
So we need to be assured about that, and even if we are assured about that, we need to know how it returns to equilibrium — what the relaxation to equilibrium is like. So that is the first question we are going to look at, namely, how do things relax to equilibrium when you make a small disturbance? In some sense we will perturb it with a small perturbation, a time-dependent part of the Hamiltonian: the Hamiltonian will not just be a function of the q's and p's, but there is an external time-dependent portion which is driving the system away from equilibrium; we will switch it on and off and see what happens in this case. So first we can ask: will it go to a new state of equilibrium if this force remains constant, and if I switch it off, will it come back to the old state, and if so, how are these related to each other? That is the business of linear response theory, and that is the next question we will answer. So we will start with that tomorrow, with a simple physical example which is still not quite linear response theory, but it will help us fix our ideas and see what is the general trick we are going to use to answer such questions. So we will start with that simple model of a single tagged particle, a colloidal particle in a fluid, and then take it from there. So let me stop here today.