So, at this point, we know a fair amount about a few thermodynamic functions like the entropy, but also some slightly less convenient thermodynamic functions like the heat and the work, which are path dependent. What I'm going to illustrate now is that there's a somewhat surprising and really fairly important connection between this path dependent function, the heat, and the state function, the entropy. And actually I should say the connection is not just between the heat and the entropy, but between the inexact differential dq and the exact differential dS.

To see where that comes from, let's start with what we know about the entropy. Thinking back a little, we know that the entropy can be written in terms of the probabilities of the states in a system. If each state i of the system has some probability p sub i, we can sum up these p log p's with a negative k in front, and that gives us the entropy. This is our measure of the entropy, or the disorder, of the system.

If I'm going to make a connection between dS and dq, then I need to take the differential. So if I take the differential on the left side to turn S into dS, I need to also take the differential on the right side, and for that we use the product rule. We're not taking a derivative, which you might be more comfortable with; we're taking a differential, but it's almost the same idea. Taking the differential of the first factor gives me dp, and I leave the second factor alone. Then I do the same thing again, leaving the p alone and taking the differential of log p. For the derivative that would be 1 over p, but since it's the differential, it's 1 over p times dp. That's the second term in our product rule.

Now we can see that some simplification is going to happen here. In the second term, p times 1 over p gives me 1. So when I rewrite this, I'll switch the factors in the first term around and write minus k times the summation of log p times dp, and then minus k times a summation where the p and the 1 over p cancel, leaving just the sum of the dp's.

That second term requires a little explanation. Remember what the p's are: the probability of occupying any individual state of the system. If I add up the probabilities of all the states, I have a 100% chance of being in some state, whether it's state 1 or 2 or whatever, so those probabilities have to sum to 1. And since I'm talking about differentials: taking the differential of the left side, the differential of p1 plus p2 plus p3 and so on is just dp1 plus dp2 plus dp3 and so on, so the p's become dp's. On the right side, the differential of 1 is 0. This is the key point in making the simplification: the sum of the dp's is 0. Essentially, what that means is that if I change the system in some way, say by increasing the number of molecules in state 1, that has to come with a decrease in the molecules in state 2 or state 3 or some combination of other states. The sum of all the changes has to add up to 0. So the sum of the dp's is 0, and that whole term drops out.
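Written out, with $p_i$ the probability of state $i$ and $k$ Boltzmann's constant, the steps so far are:

$$S = -k \sum_i p_i \ln p_i$$

$$dS = -k \sum_i (\ln p_i)\,dp_i \;-\; k \sum_i p_i \cdot \frac{1}{p_i}\,dp_i = -k \sum_i (\ln p_i)\,dp_i \;-\; k \sum_i dp_i$$

and because $\sum_i p_i = 1$ implies $\sum_i dp_i = 0$, the second sum vanishes, leaving $dS = -k \sum_i (\ln p_i)\,dp_i$.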
We don't have to include it because it's 0. Now we need to think about what to do with the other term, the sum of log p times dp. Again, think about what these probabilities are. The Boltzmann distribution tells us that the probability is 1 over q times e to the minus energy over kT, so the thing we need to take the logarithm of is 1 over q times e to the minus E i over kT. That's a good thing, because the log will end up partially canceling the exponential. The logarithm of this product, 1 over q times the exponential, is the sum of two logs: minus log q for the log of 1 over q, plus the log of the exponential. The log of an exponential is just the exponent, so that part becomes minus E i over kT. What I've just written in parentheses is the log of the probability p sub i, and I still have to multiply it by dp i.

So we're getting there. We now have two different terms inside the summation. If I break those up, I can write: minus of minus, so it's positive, k times the sum of log q times dp i; and another minus of minus, which becomes positive, k times the sum of E i over kT times dp i.

Those are all going to clean up a little more. In the first term, log q is just a number; it doesn't have a different value for i equals 1 or 2 or any of the individual states, so I can pull it outside the sum. Likewise, I can pull the k and the T outside the sum in the second term, though I have to leave the E sub i inside. After pulling those various things out, I find that dS equals k log q times the sum of the dp's, plus k divided by kT, so the k's cancel, times the sum of the E's times the dp's, summed over all the states in both cases.

We know what to do with the first term: the sum of the dp's is zero, the same way it was a minute ago, so that term goes away. Ask yourself now what to do with the second term. We've seen this sum of the energies times the changes in the probabilities before. That, in fact, is exactly what we called dq. The difference between heat and work, remember, is that heat is a form of energy transfer where we change the occupations, promoting molecules from one state to another, whereas work leaves the probabilities the same but changes the energy levels. That's the difference between E dp, which is what we call heat, and p dE, which we call work. So this term, the sum of the E i dp i's, is exactly what we've called dq, and dS equals 1 over T times dq.

So now we're just about done; I've made the connection I promised you between the entropy and the heat. But let me rewrite it one more time with an important caveat: dS is dq over T, where I'm going to specifically label the heat as reversible heat. All the way through the derivation we've just performed, we've assumed that the probabilities are what Boltzmann says they should be. The probability that each state is occupied is the Boltzmann probability, which means the system needs to be in equilibrium, which means that any process we perform needs to be done slowly enough that the system remains in equilibrium, and therefore is a reversible process. I'll put that equation in a box because it's important; we'll come back to it several times.
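Here is that whole chain in one place, with $q$ the partition function (not to be confused with the heat $q$), $E_i$ the energy of state $i$, and $T$ the temperature:

$$p_i = \frac{1}{q}\,e^{-E_i/kT} \quad\Rightarrow\quad \ln p_i = -\ln q - \frac{E_i}{kT}$$

$$dS = -k \sum_i \left(-\ln q - \frac{E_i}{kT}\right) dp_i = k \ln q \sum_i dp_i + \frac{1}{T} \sum_i E_i\,dp_i = \frac{1}{T} \sum_i E_i\,dp_i$$

since $\sum_i dp_i = 0$ again. And because the heat is exactly the part of the energy change that comes from shifting populations among fixed levels, $dq = \sum_i E_i\,dp_i$ (while work is $\sum_i p_i\,dE_i$), this is the boxed result:

$$\boxed{\,dS = \frac{dq_{\mathrm{rev}}}{T}\,}$$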
That equation is what we can call the Clausius theorem. This connection between the entropy and the heat applies specifically to the reversible heat. Remember also that when we talked about the difference between reversible and irreversible processes, we convinced ourselves that the heat in a reversible process is always going to be larger than the heat in an irreversible process. So another way of writing this statement is that dS is greater than or equal to dq over T, where the equals sign applies to a reversible process, just as in the boxed equation, and the greater-than sign applies to an irreversible process, because the reversible heat is larger than the irreversible heat. That's an alternate version of the Clausius theorem, and we'll explore this inequality more when we get to topics like the second law and spontaneity, but that's a little further in the future.

For now, I'll just point out that this statement, dS equals dq over T for a reversible process, is a fairly remarkable one, because it makes a direct connection between what we've been thinking of as a state function and what we've been thinking of as a path function. Remember, a path function is path dependent. Picture a system plotted against two variables; it doesn't matter what they are, they could be temperature, pressure, volume, anything we want. If the system is in some particular state, I can talk about the entropy of the system in that state: say state A and state B. If I go from state A to state B, the change in entropy is some delta S associated with that process, and there's only a single value for it. But depending on which path I take between the two states, every one of those paths might have a different value for the heat. That's what it means to be a path dependent property. So what I've told you is that even though the heat can be different for each of those paths, while the entropy change is the same along all of them, there's still a connection between the two.

The reason that's not a contradiction is the 1 over T. The 1 over T is an integrating factor; you might recall we talked about integrating factors when we first talked about exact and inexact differentials. That's really what we've just succeeded in determining: we've found that 1 over the temperature happens to be an integrating factor that converts the path function, the inexact differential dq, into the state function, the exact differential dS. So that's another way of describing this Clausius theorem we've determined.

Qualitatively speaking, I'll point out that this equation matches our common sense for a process that involves some heat. Say I have a system, a box containing some molecules. Remember what a positive value of q means: energy has entered the system in the form of thermal energy, energy that's used to raise the populations of the higher energy states. So when I put heat into the system, I've supplied some energy to it. The system now has more thermal energy than it used to, and its temperature will have gone up. And the things we associate with systems at higher temperatures: they're moving around more quickly, they're more randomized, they have a higher entropy. So it's not a surprise that when q is a positive number, delta S is a positive number.
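None of this is abstract, by the way; the result is easy to sanity-check numerically. Here's a minimal sketch in Python. The two-level system, its energy gap, the temperature, and the step size are all arbitrary choices for illustration, and the small finite temperature step is just standing in for an infinitesimal reversible change. It compares the change in the Gibbs entropy against the heat, the sum of E i dp i, divided by T:

```python
import numpy as np

k = 1.380649e-23                 # Boltzmann constant, J/K
E = np.array([0.0, 1.0e-21])     # made-up energy levels for a two-level system, J

def probs(T):
    """Boltzmann probabilities p_i = exp(-E_i / kT) / q."""
    w = np.exp(-E / (k * T))     # Boltzmann factors
    return w / w.sum()           # dividing by q, the sum of the factors

def entropy(T):
    """Gibbs entropy S = -k * sum_i p_i ln p_i."""
    p = probs(T)
    return -k * np.sum(p * np.log(p))

T, dT = 100.0, 1.0e-4            # a tiny step stands in for a quasi-static change

dS = entropy(T + dT) - entropy(T)          # change in the state function S
dq_rev = E @ (probs(T + dT) - probs(T))    # dq = sum_i E_i dp_i, levels fixed

print(dS, dq_rev / T)            # the two agree to finite-difference accuracy
```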
One way to understand what this integrating factor is doing: if I supply one joule of heat to a system, that gives a different boost in the entropy for a cold system than it would for a hot system. In fact, because the integrating factor is 1 over T, the entropy change at cold temperatures, where T is small, is larger than at high temperatures, where T is large.

To illustrate what I mean by that, remember what the Maxwell-Boltzmann probability distribution tells us: the probability that the molecules have a given molecular speed. At a fairly cold temperature, say 50 or 100 kelvin, there's a fairly narrow distribution of molecular speeds; the molecules are not moving very fast, and their speeds are sharply distributed around a low velocity. If I increase the temperature by supplying some energy in the form of heat to that system, the distribution of velocities will shift toward higher velocities and become broader. I've made that system have a higher entropy: I have less knowledge of what the velocities are than I used to, because there's more width, more uncertainty, in the distribution. So there's some delta S associated with that process, and if we start at a cold temperature where the velocity distribution used to be sharply peaked and now it's less sharply peaked, that makes a pretty big difference.

On the other hand, say I've got the velocity distribution at some hot temperature, where there's a very broad distribution of velocities and a relatively high average speed for the molecules. If I supply the same amount of heat I supplied to the colder system, it will also shift that distribution to higher speeds and make it a little broader. But the change from that already fairly disordered system to an even more disordered one is a delta S that isn't nearly as dramatic. In equation form, that says: when I supply a given amount of heat at a cold temperature, it makes a big difference to the entropy; when the temperature is large and I divide by that large temperature, it makes a relatively small difference to the entropy. So that's a couple of different ways of looking at how this integrating factor converts heat into entropy.
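To put rough numbers on that (an arbitrary illustration, treating the temperature as approximately constant while the heat flows in):

$$\Delta S = \frac{q_{\mathrm{rev}}}{T} = \frac{1\ \mathrm{J}}{50\ \mathrm{K}} = 0.02\ \mathrm{J/K} \qquad\text{versus}\qquad \frac{1\ \mathrm{J}}{500\ \mathrm{K}} = 0.002\ \mathrm{J/K};$$

the same joule of heat buys ten times more entropy at the cold temperature.

The next thing we can ask ourselves is: can we use this equation quantitatively, to perform some actual calculations of the entropy change associated with particular processes? Indeed, that's what we'll do next.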