Today, we are going to discuss another very important thermodynamic quantity: entropy. Remember that entropy is connected with spontaneity; all spontaneous processes occur in the direction that leads to an overall increase in entropy. When I say overall increase in entropy, I mean the entropy change of the system plus the entropy change of the surroundings. That is about entropy change, and then recall that with the third law of thermodynamics we also talked about the entropy of a substance. We talked about entropy at absolute zero, and we discussed the formal statement of the third law: the entropy of every substance is positive, it may become zero at absolute zero, and it does become zero at absolute zero for perfectly crystalline substances. Today, we are going to talk about statistical entropy. We will derive formulae connecting entropy with the weight of the most probable configuration, and eventually we will establish a relationship between entropy and the molecular partition function. Let us get started. Our first goal is to establish a relationship between entropy and the weight of a configuration. Now, how do we go about this derivation? Recall the definition of the change in entropy, dS. From our discussion of chemical thermodynamics, dS was defined as dq_rev/T. Keep in mind that dS is not simply dq/T; dS equals dq_reversible/T, which means the heat must be transferred along a reversible path. One usually works under either constant-pressure or constant-volume conditions. Since we have so far been talking about constant-volume conditions, at constant volume I can write dS = dq_V/T = dU/T.
That sets us up to connect the change in entropy with the partition function, because remember that we have already derived a relationship between internal energy and the molecular partition function. So if I connect the change in entropy with the internal energy, I can find a way of eventually connecting entropy with the partition function. Now, talking about the internal energy means talking about the total energy, because the two are related. Let us write that relation: U = U_0 + E, where E is the total energy. Whenever you want to determine the internal energy, you find the total energy and add a constant to it. Can I write this as U = U_0 + Σ_i n_i E_i? Here n_i is the number of molecules in the state of energy E_i. Now let there be some change. What I will write is dU = dU_0 + Σ_i n_i dE_i + Σ_i E_i dn_i. Concentrate on this equation: I have taken the equation above, U = U_0 + Σ_i n_i E_i, and let the system undergo some change, which gives dU = dU_0 + Σ_i n_i dE_i + Σ_i E_i dn_i. Note that U_0 is a constant, so dU_0 vanishes; it is zero. Now we need to figure out which of the other terms remains, or whether both remain. The first involves changes in the energy levels, dE_i; the second involves changes in the number of molecules in the i-th state, dn_i. To understand this, let us take a look at the figure, and concentrate first on panel (a): a spread of energy states, or energy levels, and their populations. Let us supply some heat to the system — we are connecting dS with heat. When you heat a system, the populations change. The second possibility is that you do work on the system, working in such a way that you change the length, the dimensions, of the box.
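The relation U = U_0 + Σ_i n_i E_i can be checked with a minimal numerical sketch. All the numbers below (the level energies, the temperature, the number of molecules) are illustrative assumptions, not values from the lecture:

```python
import math

# Minimal sketch of U = U0 + sum_i n_i * E_i for a small model system.
# Levels, N, and T are hypothetical, chosen only for illustration.
k_B = 1.380649e-23            # Boltzmann constant, J K^-1
T = 300.0                     # assumed temperature, K
E = [0.0, 1.0e-21, 2.0e-21]   # hypothetical state energies E_i, J
N = 1000                      # total number of molecules

# Boltzmann populations n_i = N exp(-E_i / kT) / q
q = sum(math.exp(-Ei / (k_B * T)) for Ei in E)
n = [N * math.exp(-Ei / (k_B * T)) / q for Ei in E]

U0 = 0.0                      # reference (zero-point) contribution
U = U0 + sum(ni * Ei for ni, Ei in zip(n, E))
print(U)                      # total energy above U0, in joules
```

The populations n_i sum to N by construction, and U is just the population-weighted sum of the state energies.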
If you do that — if you change the dimensions or the size of the container — the length is affected, and remember that the energy levels are connected with the length, so the energy levels will be affected. But in the case we are discussing, you are providing heat, and heating does not alter the energy levels; it only alters the populations. So what is going to happen is that the first term vanishes: there is no change in the energy states, so dE_i = 0. The term that remains is Σ_i E_i dn_i. So, in the absence of all changes other than heating — this is important to note down — we have just established that dU = Σ_i E_i dn_i. It is important to note here that we have an expression for dU, and we showed at the beginning that dU is connected to the change in entropy: dS is simply dU/T. We will make use of this now. We want to derive an expression connecting entropy with the weight of the most probable configuration of the system, namely S = k ln W. Establishing this relationship between S and ln W is our next goal. We will get into the derivation soon, but just look at what this expression tells us. When the temperature approaches zero, all the molecules are to be found in the ground state. If all the molecules are in the ground state, there is only one way that configuration can be achieved, so W approaches 1. If W approaches 1, ln W approaches 0, which means S approaches 0 as T approaches 0. This is compatible with the third law of thermodynamics: the entropies of all perfect crystals approach the same value as T approaches 0.
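The statement that heating moves populations while the levels stay put can also be sketched numerically: with the E_i held fixed, the whole change in U is carried by Σ_i E_i dn_i. The levels and temperature below are hypothetical:

```python
import math

# Sketch of "heating changes populations, not levels": with fixed E_i,
# dU = sum_i E_i dn_i. Hypothetical levels, N, and temperature.
k_B = 1.380649e-23
E = [0.0, 1.0e-21, 2.0e-21]   # fixed energy levels (box size unchanged)
N = 1000

def populations(T):
    w = [math.exp(-Ei / (k_B * T)) for Ei in E]
    q = sum(w)
    return [N * wi / q for wi in w]

def U(T):
    return sum(ni * Ei for ni, Ei in zip(populations(T), E))

# Heat slightly: T -> T + dT. Only the n_i change, so the change in U
# equals the sum of E_i dn_i.
T, dT = 300.0, 0.01
dn = [nf - ni for nf, ni in zip(populations(T + dT), populations(T))]
dU_from_dn = sum(Ei * dni for Ei, dni in zip(E, dn))
print(dU_from_dn, U(T + dT) - U(T))   # the two agree
```

Since the levels never move, dU computed from the population shifts and dU computed as U(T + dT) − U(T) are the same quantity.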
If you read that statement very carefully, it says the entropies approach the same value as T approaches 0, and that value does become zero for perfectly crystalline substances. Let us start the derivation. We just wrote that dU = Σ_i E_i dn_i, and we also showed that dS = dU/T under the constant-volume condition. That means I can write dS = (1/T) Σ_i E_i dn_i, making use of the definition dS = dU/T. Now remember that β = 1/kT, which means 1/T = kβ, and I can use that. So what do I have now? dS = kβ Σ_i E_i dn_i — you will soon realize why I am making these kinds of transformations. Let us rearrange: I will keep k outside and take β inside the sum, giving dS = k Σ_i βE_i dn_i. Keep in mind that we have to work with the most probable configuration. Remember that when we derived the Boltzmann distribution, we went through a procedure using Lagrange's method of undetermined multipliers, setting each term equal to zero. At that time we derived the condition (∂ln W/∂n_i) + α − βE_i = 0 for the most probable configuration, and after that we worked on obtaining expressions for α and β. From this condition I have βE_i = (∂ln W/∂n_i) + α, and now I can substitute for βE_i.
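The substitution just described can be summarized in one line, using the same symbols as in the lecture:

```latex
\frac{\partial \ln W}{\partial n_i} + \alpha - \beta E_i = 0
\quad\Longrightarrow\quad
\beta E_i = \frac{\partial \ln W}{\partial n_i} + \alpha,
\qquad
dS = k\sum_i \beta E_i \, dn_i
   = k\sum_i \left( \frac{\partial \ln W}{\partial n_i} + \alpha \right) dn_i .
```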
So, what do I have now? dS = k Σ_i [(∂ln W/∂n_i) + α] dn_i. Let us expand this: dS = k Σ_i (∂ln W/∂n_i) dn_i + k Σ_i α dn_i, which I can simplify to a first term, k Σ_i (∂ln W/∂n_i) dn_i, plus a second term, kα Σ_i dn_i. Now let us carefully examine what we have: two expressions, the first k Σ_i (∂ln W/∂n_i) dn_i and the second kα times Σ_i dn_i. The sum of all the changes has to be zero, because the total number of molecules is constant. Therefore the term Σ_i dn_i has to be zero: if the population of the ground state is decreasing, the populations of the upper excited states must increase by an equal amount. So we can set that term equal to zero, and after doing so I have the expression dS = k Σ_i (∂ln W/∂n_i) dn_i, because the other part, with Σ_i dn_i, has been set equal to zero. Now let us try to recognize this expression. Refer back to our previous discussion: when we treated ln W as a function of n_0, n_1, n_2, n_3, and so on, we wrote d(ln W) = Σ_i (∂ln W/∂n_i) dn_i. Remembering that equation, what we have arrived at is dS = k d(ln W). Upon integration, this strongly suggests that the expression takes the form S = k ln W, where W is the weight of the most probable configuration. If we know the weight of the most probable configuration and the value of the Boltzmann constant, it should be possible to find out the entropy of a system.
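The result S = k ln W can be checked numerically for a small system. The configuration below is hypothetical, chosen just large enough for Stirling's approximation to hold; `lgamma` gives the logarithm of the factorials in W = N!/(n_0! n_1! …) directly:

```python
import math

# Numerical check of S = k ln W for a hypothetical configuration {n_i}.
# W = N! / (n_0! n_1! ...); math.lgamma(x+1) = ln(x!).
k_B = 1.380649e-23   # Boltzmann constant, J K^-1

def ln_W(ns):
    N = sum(ns)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in ns)

ns = [6000, 3000, 1000]       # molecules in states 0, 1, 2 (illustrative)
N = sum(ns)

S = k_B * ln_W(ns)            # S = k ln W

# With Stirling's approximation, k ln W becomes -N k sum_i p_i ln p_i
S_stirling = -N * k_B * sum((n / N) * math.log(n / N) for n in ns)
print(S, S_stirling)          # agree to within about 0.1% at this N
```

The close agreement between the exact k ln W and the Stirling form is exactly what the derivation above relies on when it differentiates ln W with respect to the n_i.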
We talked about the fact that if W approaches a value of 1, ln W approaches 0, and in that case S approaches 0; this is possible when the temperature approaches 0. When the temperature approaches 0 — I am repeating what I discussed on the previous slides — the molecules are all going to be in the ground state. And in how many ways can you arrange that most probable configuration? W is going to be exactly 1, because there is only one way to have that configuration with all the molecules in the ground state. So, as the temperature approaches 0, W approaches 1; if W approaches 1, ln W approaches 0, and your entropy approaches 0 — and this is true for perfectly crystalline substances. A small comparison with the results we obtained from chemical thermodynamics can be made. In chemical thermodynamics, remember the expression dS = dq_rev/T, at constant volume or constant pressure. Let us say we keep the volume constant; then dS becomes dU/T, which at constant volume I can write as dS = C_V dT/T. Now integrate from T_i to T_f: S(T_f) = S(T_i) + ∫_{T_i}^{T_f} (C_V/T) dT at constant volume. If instead you keep the pressure constant, then C_V is replaced by C_p. You can compare here how we get information about the entropy at different temperatures. When using the concepts of chemical thermodynamics — classical thermodynamics — the temperature dependence of the entropy is connected with the heat capacity, depending on whether your constraint is constant volume or constant pressure; whereas when you use the concepts of statistical thermodynamics, you are talking about the weight of a configuration.
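The classical route just described can be sketched numerically. The numbers below are illustrative assumptions: one mole of a monatomic ideal gas with C_V = (3/2)R, and a hypothetical starting entropy S(T_i):

```python
import math

# Classical route to S(T_f): S(T_f) = S(T_i) + integral of C_V/T dT at
# constant volume. Illustrative: 1 mol monatomic ideal gas, assumed S(T_i).
R = 8.314             # gas constant, J K^-1 mol^-1
C_V = 1.5 * R         # molar C_V of a monatomic ideal gas
S_Ti = 100.0          # hypothetical entropy at T_i, J K^-1 mol^-1
T_i, T_f = 298.15, 500.0

# Midpoint-rule integration of C_V/T from T_i to T_f
steps = 100_000
dT = (T_f - T_i) / steps
integral = sum(C_V * dT / (T_i + (j + 0.5) * dT) for j in range(steps))

S_Tf = S_Ti + integral
# For constant C_V the integral is just C_V ln(T_f / T_i)
print(S_Tf, S_Ti + C_V * math.log(T_f / T_i))
```

For a temperature-dependent C_V one would simply replace the constant `C_V` with a function of T inside the integration loop; the closed form C_V ln(T_f/T_i) holds only when C_V is constant over the range.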
As the temperature changes, the instantaneous configurations will change, and if they change, then we have to find the weight of the most probable configuration. Let us take an example: a perfectly ordered crystal of HCl. Why do I call it perfectly ordered? Because all the molecules — hydrogen, chlorine; hydrogen, chlorine; hydrogen, chlorine — are arranged in a perfectly ordered manner. Let us apply the definition S = k ln W. As the temperature approaches 0, the arrangement shown here on the screen is the only one that gives the minimum-energy configuration. That means the weight of the configuration is 1: there is only one way of arranging HCl in the crystal to obtain the minimum-energy configuration. So with W = 1 at T = 0, S = k ln 1 = 0. The reason only this configuration leads to the minimum energy is the large electronegativity difference between H and Cl. What I mean is that if instead of HCl you place ClH, the energy of the crystal will change. Now take another example: carbon monoxide. Here the molecular dipole moment of CO is very small, so whether the crystal contains CO or OC — you see, the orientation is flipped — this change in orientation does not change the energy of the crystal; either of the two orientations is possible with virtually the same energy. That means each molecule, CO or OC, can take two possible orientations that lead to the same energy. So for one molecule W = 2, and if there are N molecules, W = 2^N.
For N molecules, W = 2^N. Substitute W = 2^N into S = k ln W and solve it: you have S = Nk ln 2, or per mole R ln 2, which gives a value of 5.76 joules per kelvin per mole even at absolute zero. So, by using the statistical definition you can calculate the entropy at absolute zero. I once again refer back to the statement of the third law of thermodynamics: the entropy of every substance is positive, it may become zero at absolute zero, and it does become zero for perfectly crystalline substances. This particular example of CO is not a perfectly crystalline substance, because there is configurational disorder even at absolute zero. And from the knowledge of the weight of the configuration — here CO or OC, two possible orientations, so for N molecules the weight is 2^N — there is some value of the entropy even at absolute zero, and this entropy has a meaning which we will discuss later on. For the time being, remember that S = k ln W can be used to calculate the entropy of a system, provided we have information on the weight of the configuration — the weight of the most probable configuration. We have now connected S with W; next we will connect entropy with the molecular partition function, in the next lecture. Thank you very much.
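The 5.76 J K⁻¹ mol⁻¹ figure is just R ln 2, and it follows in a couple of lines from S = k ln W with W = 2^N:

```python
import math

# Residual entropy of CO from S = k ln W with W = 2^N: each of the N
# molecules can sit as CO or OC at essentially the same energy.
k_B = 1.380649e-23        # Boltzmann constant, J K^-1
N_A = 6.02214076e23       # Avogadro constant, mol^-1

# Per mole: S = k ln(2^N_A) = N_A k ln 2 = R ln 2
S_molar = k_B * N_A * math.log(2)
print(S_molar)            # about 5.76 J K^-1 mol^-1, even at T -> 0
```

The same pattern gives the residual entropy for any number m of equal-energy orientations per molecule: S = R ln m per mole.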