So, if we want to know how the entropy changes as we change the temperature of an object (solid, liquid, gas, whatever), we have this derivative, the rate of change of entropy with temperature, which equals the heat capacity over the temperature. And if we integrate that, we obtain an expression for the change in entropy: if I change the temperature from T1 to T2, I just integrate heat capacity over temperature. We have an equivalent set of equations for Cv, too; just replace the P's with V's, and the same expressions work at constant volume.

This only tells us the change in entropy, the relative difference in entropy between two states. So if I have a system that's initially at T1, with some initial entropy S(T1), the entropy after heating it up or cooling it down to some different temperature T2 is that initial entropy plus the change in entropy: S(T1) plus the integral of Cp over T, dT, from T1 to T2. So I can obtain the entropy at a different temperature if I know what it is at the starting temperature, as long as I can do this integral. But of course, to obtain it at the starting temperature, I would have needed to know it at some other temperature before that. That sounds like a chicken-and-egg problem: how am I ever going to find the entropy at one temperature unless I already know it at some other temperature?

It turns out, though, luckily, that we do know the entropy at one particular temperature. To see why, let me remind you that when we were thinking about entropy not from a thermodynamic point of view, like we are here, but from a statistical mechanical point of view, we often wrote the entropy as minus k times the sum of p log p's, probability times the log of probability, where the p's are the Boltzmann probabilities of being in a particular state: the ground state, the first excited state, and so on. (As written, this would be the molar entropy; multiplying by n gives the total entropy.) Writing that sum out explicitly, it's minus k times (p0 log p0 + p1 log p1 + p2 log p2 + ...).

Now, imagine what happens if I take the temperature colder and colder and approach the limit of zero Kelvin. If I cool the system down to zero Kelvin, all of the molecules fall down into the ground state: 100% of the molecules in the ground state, 0% of the molecules in any of the excited states. So at zero Kelvin, the probability of being in the ground state approaches one, and the probability of being in any other state approaches zero. The expression becomes minus k times (1 log 1 + 0 log 0 + 0 log 0 + ...), and every one of those terms evaluates to zero: the log of one is zero, which kills the first term, and zero times log zero goes to zero in the limit, which kills the second term and all the others.
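To collect what we've said so far as equations (standard notation: the subscript P marks constant pressure, and the p_i are the Boltzmann probabilities):

```latex
% Entropy change on heating at constant pressure, and the entropy at T_2
% built from a known value at T_1:
\[
  \left(\frac{\partial S}{\partial T}\right)_P = \frac{C_P}{T},
  \qquad
  \Delta S = \int_{T_1}^{T_2} \frac{C_P}{T}\, dT,
  \qquad
  S(T_2) = S(T_1) + \int_{T_1}^{T_2} \frac{C_P}{T}\, dT.
\]
% Statistical expression for the entropy, and its zero-Kelvin limit
% (p_0 -> 1, all excited-state probabilities -> 0; each 0 ln 0 is
% understood as the limit of p ln p as p -> 0, which is 0):
\[
  S = -k \sum_i p_i \ln p_i
  \quad\longrightarrow\quad
  -k \left( 1 \ln 1 + 0 \ln 0 + \cdots \right) = 0
  \quad \text{as } T \to 0.
\]
```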
So each one of those terms is zero, and the entropy at zero Kelvin, if I cool a system down as far as it can possibly be cooled, is zero. That makes sense: if we remember that entropy is disorder, then a system cooled down until it's as cold as it can get is as ordered as it can ever be, so its entropy is zero. There is no disorder left in that system.

So we know that at zero Kelvin, the entropy of the system is zero. Let me write that down, because it's an important statement, important enough that we call it the third law of thermodynamics. What the third law says is: for every substance, the entropy approaches zero as the temperature approaches absolute zero. It doesn't matter whether we're talking about a gas, a copper penny, any material at all; if I cool it down toward zero Kelvin, its entropy approaches zero in that limit.

Since we have that statement about the entropy at zero Kelvin, if I want to know the entropy at a particular temperature, I can think of it as the entropy at zero Kelvin, which I know is zero, plus the integral from zero Kelvin up to the temperature I'm interested in, of the heat capacity over T, dT. So it turns out I can know not just the change in entropy in going from T1 to T2; if I want the actual entropy, the absolute entropy rather than a relative one, at a particular temperature T, I just have to integrate from zero all the way up to T. If I know the heat capacity at every temperature between zero and T, I can do that integral, and it tells me the actual entropy of the system.

So entropy is very different in this respect from other thermodynamic properties like the internal energy or the enthalpy. I'm normally free to choose the zero of energy anywhere I want. Say I have an energy ladder with a bunch of different energy levels; we've had occasion before to put the zero of energy at one level, or to define it at another. I can choose the zero to be wherever I want, and that won't change the difference in energy between any pair of states; the difference in energy doesn't depend on where I choose zero to be. Moving the zero of energy changes the value I call the energy, but it doesn't change any of the physical properties of the substance. That's not true for the entropy. It turns out we have a very natural reason to make the entropy exactly zero when the system is fully ordered, when all the molecules are in the ground state. That allows us to define an absolute entropy, something we can't do for a property like the energy: I can't say exactly what the energy is, or if I do, the value depends on my choice of where zero is. So entropy is rather special in that sense.

We now have a way to calculate the change in the entropy, as well as the actual, absolute entropy, when I change the temperature.
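Since in practice that absolute-entropy integral is done numerically over measured heat-capacity data, here is a minimal sketch of the calculation. Everything in it is illustrative: the cp function below is a hypothetical Debye-like curve standing in for real tabulated data, not the heat capacity of any particular substance.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def cp(T, theta=300.0):
    """Hypothetical heat capacity, J/(mol K): grows like T^3 at low T
    (Debye-like behavior for a solid) and levels off near 3R at high T.
    A stand-in for tabulated experimental Cp data."""
    x = (T / theta) ** 3
    return 3.0 * R * x / (1.0 + x)

def absolute_entropy(T, n=100_000):
    """S(T) = integral from 0 to T of Cp(T')/T' dT', evaluated with the
    trapezoidal rule. Starting just above 0 K is safe here because
    Cp/T ~ T^2 -> 0 as T -> 0, so the integrand vanishes at the
    lower limit."""
    Ts = np.linspace(1e-6, T, n)
    integrand = cp(Ts) / Ts
    dT = Ts[1] - Ts[0]
    return float(np.sum(integrand[:-1] + integrand[1:]) * 0.5 * dT)

print(f"S(298.15 K) = {absolute_entropy(298.15):.2f} J/(mol K)")
```

With experimental Cp values, only the cp function changes; the integration step is the same. We can also calculate changes in the entropy when we do things other than change the temperature, and we'll look at those next.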