Okay, so we've met several different thermodynamic variables as variables themselves, but now it's time to start talking about what those look like when we think of them as differentials. For example, we've talked some about the internal energy, U, and one of the things we know about the internal energy is that we can write it as U = Σᵢ pᵢ eᵢ, an average of the microscopic energies of the system; in fact, that's how we first defined the internal energy. These eᵢ's are the individual microscopic energies that we could obtain, for example, by solving Schrodinger's equation for the system. For a particular system, maybe this energy level is allowed, or this one, or this one: I've got a family of different energy levels that the system can occupy. Microscopically, the system can occupy any of these. Macroscopically, the average energy for the entire system depends on how populated each of those energy levels is. Each energy level has its own Boltzmann probability of being occupied. So if I multiply each of those energies by the probability that it's occupied and sum over all the different states, that tells me my macroscopic, or average, thermodynamic energy. Let's keep in mind what that summation notation means, because it'll be useful in just a second: the energy is the probability that I occupy state one multiplied by its energy, plus the probability that I occupy state two multiplied by its energy, and so on for as many states as we have. So if I want to think about what variables U depends on, it depends on a lot of them: the value of p1 and p2 and p3 and so on, and also the value of e1 and e2 and e3 and so on.
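As a concrete check on that sum, here is a minimal numerical sketch in Python. The three-level system and its energies are hypothetical choices for illustration, with energies measured in units of kT:

```python
import math

# Hypothetical three-level system: energies in units of kT.
energies = [0.0, 1.0, 2.0]   # e_1, e_2, e_3
kT = 1.0

# Boltzmann weights and the partition function Q.
weights = [math.exp(-e / kT) for e in energies]
Q = sum(weights)
probs = [w / Q for w in weights]          # p_i = exp(-e_i / kT) / Q

# U = sum_i p_i * e_i: the population-weighted average energy.
U = sum(p * e for p, e in zip(probs, energies))
print("populations:", [round(p, 3) for p in probs])
print("average energy U =", round(U, 3))
```

The average U always lands between the lowest and highest level, weighted toward the ground state, exactly as the population-weighted sum says it must.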
So if I'm listing all the variables that the energy depends on, it depends on all the Boltzmann probabilities, and it also depends on all the energies of these individual states. Now the main goal we're after: we know how to talk about the energy, so what is the differential of this energy? How much does the energy change as I change some of these variables? I can write that term by term. For the first variable, p1, I ask how quickly the energy changes as I change p1, multiply that by the change in p1, and then continue for the rest. Let's think about what that term means. What is the derivative of this energy with respect to p1? The only place p1 shows up in the expression is in its own term, so when I take the partial derivative of the energy with respect to p1, I just get e1. Likewise, the remaining terms look like e2 times dp2, and so on. So as I change the probabilities, the rate at which the internal energy changes is just the corresponding microscopic energy, and I've got one of those terms for each of my energy levels. But I can't forget that the energies are also variables. As I change the value of e1, there's also a dU/de1 term, which pulls out just p1. So the total differential of the thermodynamic energy is e1 times dp1, plus e2 times dp2, and so on for all the probabilities; and then p1 de1, p2 de2, and so on for each of the changes in the energy levels themselves. I can write that more compactly by summing up all of the eᵢ dpᵢ's over the states, and, separately, summing up the pᵢ deᵢ's over the states.
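Written out, the step above is just the total differential of a multivariable function, using the same pᵢ and eᵢ symbols as in the lecture:

```latex
U = \sum_i p_i \, e_i
\quad\Longrightarrow\quad
dU = \sum_i \frac{\partial U}{\partial p_i}\, dp_i
   + \sum_i \frac{\partial U}{\partial e_i}\, de_i
   = \sum_i e_i \, dp_i \;+\; \sum_i p_i \, de_i ,
```

since each partial derivative ∂U/∂pᵢ pulls out eᵢ, and each ∂U/∂eᵢ pulls out pᵢ.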
So the first row is all encapsulated in the first sum, and the second row in the second sum. That's fine as a mathematical statement, but it becomes even more interesting when we think about what these terms mean. What do we mean when we say an energy times the change in the probability that that state is occupied? If we go back to our energy ladder, remember what these probabilities indicate: Boltzmann tells us the ground state will be pretty well populated, the next state up somewhat less populated, and so on. As we climb the ladder, each state becomes less populated than the one below it, because the Boltzmann probability drops as the energy increases. So this first term describes how much the average thermodynamic energy, the overall energy of the system, will change as I change the probabilities without changing the energies. What that means is I'm taking some of these molecules and lifting them up to higher energy levels. If I lift a few molecules from e1 to e2 and some from e2 to e3, then the probabilities of occupying states 1, 2, and 3 have changed. I haven't changed where the rungs on the ladder are; I've just changed how many particles sit at each rung. How might we accomplish that? Go back to Boltzmann's probability distribution: the probability that I occupy a state is 1 over the partition function times e to the minus the energy of that state over kT. The energy levels are among the variables I could change, but in this term I'm leaving them unchanged, and if the energies are unchanged, the only way to change the probability is by changing the temperature.
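To see that temperature is the only knob here, this short Python sketch (a hypothetical two-level system with fixed energies) shows the populations shifting up the ladder as kT rises, while the levels themselves never move:

```python
import math

def boltzmann_probs(energies, kT):
    """Boltzmann populations p_i = exp(-e_i / kT) / Q for a fixed energy ladder."""
    weights = [math.exp(-e / kT) for e in energies]
    Q = sum(weights)                      # partition function
    return [w / Q for w in weights]

energies = [0.0, 1.0]                     # the ladder: these never change below

cold = boltzmann_probs(energies, kT=0.5)  # low temperature
hot = boltzmann_probs(energies, kT=2.0)   # high temperature

# Heating drains the ground state and fills the excited state,
# while e_1 and e_2 themselves stay exactly where they were.
print("cold:", [round(p, 3) for p in cold])
print("hot: ", [round(p, 3) for p in hot])
```

The energy change produced this way, populations moving with the levels held fixed, is exactly the first sum, the heat term.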
And that makes sense: if I have my set of molecules and I heat them up, if I apply some heat to the system, that will excite some of the molecules up into higher energy levels, just as Boltzmann says it should. And that is a way of increasing the overall energy of the system. In fact, that's what we mean by this first term. Any change in the internal energy that comes from exciting molecules up to higher energy levels we call a change resulting from heat. When we supply energy to the system in a form that causes molecules to occupy higher energy levels, that's called heat. On the other hand, the second term is asking about something totally different. Here we're saying: leave the probabilities unchanged, leave the probability of occupying this state or this state as it is, but change the energies of the states themselves. So what am I saying there? Take one of these energy levels and lift it up to a higher energy, without changing the number of molecules that occupy it; the molecules in that state just go along for the ride. That's another way of increasing the energy: if I raise the energy levels themselves, that raises the thermodynamic energy. But think about what would be involved to do that experimentally. Say I have a box of ideal gas whose energy levels are given by the 3D particle in a box. If my goal is to physically raise an energy level, I can't change Planck's constant, and I can't change the mass of the molecule. The only variable I have the ability to change experimentally is the size of the box. If I shrink the box in one dimension, or in all dimensions simultaneously, I've decreased the box length a, and as a decreases, the volume of the box decreases.
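A quick numerical sketch of that dependence: for a particle in a box, Eₙ = n²h²/(8ma²), so the only experimental handle is the box length a. The mass and box lengths below are illustrative choices, not a specific experiment:

```python
# Particle-in-a-box level: E_n = n^2 h^2 / (8 m a^2).
h = 6.626e-34      # Planck's constant, J*s
m = 6.6e-27        # roughly the mass of a helium atom, kg (illustrative)

def box_energy(n, a):
    """Energy of level n for a 1D box of length a (meters)."""
    return n**2 * h**2 / (8 * m * a**2)

# Halving the box length quadruples every energy level,
# since E_n scales as 1 / a^2.
E1_big = box_energy(1, 1.0e-9)
E1_small = box_energy(1, 0.5e-9)
print(E1_small / E1_big)
```

Shrinking a raises every rung of the ladder at once, which is exactly the deᵢ's in the second sum.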
And the energy levels of the gas will increase. So that's one way of changing the energy levels, and what this term represents is that I can change the total energy of the molecules in the box by shrinking the size of the box: as I shrink the box, the energy levels go up. Again, that makes sense from our everyday experience. If I have a balloon and I squeeze it to make it smaller, it definitely costs me energy to do work on the gas, compress it, and apply pressure to it. The energy that I have supplied has been transferred into the system, and the energy of the gas has increased because I have done work on the system. And that's the name we give to this form of energy: work, in the same sense as in physics, where you do work when you apply a force over a distance. In a physical-chemistry sense, work is any transfer of energy into a system that changes the energy of the system specifically by changing the energy levels themselves, without changing the probability that those energy levels are occupied. So if we give labels to those quantities, dU is equal to heat plus work, and I've given the labels q for heat and w for work. This is the differential change in the internal energy, so I've got a differential amount of q and a differential amount of w. Notice that I've written those as inexact differentials. That's because although dU is an exact differential (I took the expression for the energy and took its differential to obtain it, so it's certainly exact), the piece that I've called heat, the part involving only the dpᵢ's, is just one part of that total exact differential. There is no function whose differential is just that first row of terms, so it's an inexact differential.
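Putting labels on the two sums, with δ marking an inexact differential in the usual way:

```latex
dU \;=\; \underbrace{\sum_i e_i \, dp_i}_{\delta q \ (\text{heat})}
   \;+\; \underbrace{\sum_i p_i \, de_i}_{\delta w \ (\text{work})}
```

Each sum on its own is inexact, but their total, dU, is exact.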
And likewise the work, the second row of terms, is an inexact differential. They happen to be two inexact differentials that form an exact differential when added together, but taken one at a time they are inexact. So, as definitions go: heat is the sum of the eᵢ dpᵢ's, or a transfer of energy into a system that changes the energy of the system only by changing the populations of the energy levels. Work is the opposite: exchange the p's and the e's. Mathematically, work is the sum of the probabilities times the changes in the energies, the pᵢ deᵢ's. More descriptively, work is a transfer of energy into or out of a system that changes the system's energy specifically by changing the energy levels, without directly changing the occupation of those levels. That's what we mean by the terms heat and work when we use them in a technical sense in physical chemistry. The next step will be to spend a little more time looking at heat and work and how they sum up to make energy, not just for these differential amounts, but for finite amounts of heat and work as well.