OK, so let's recap a little bit of what we've learned so far and what the purpose of it is. What we've been talking about recently is this: if we know the energies of the states of a system (the energies of its molecules, its particles, or whatever the states of the system are), we can use those energies to calculate probabilities with equations like the Boltzmann probability distribution, in which the probability of a state goes as e^(-E/kT). Recall that in order to get actual probabilities we also need to know q, the partition function, which means we need to know not just one energy but all of the energies; the complete set of energies gives us the complete set of probabilities, p_i = e^(-E_i/kT) / q. That's what the brackets mean: if I know the set of all the energies, I can calculate the complete set of all the probabilities, and Boltzmann's probability distribution helps us do that. If you think back a little further, the reason we're interested in the probabilities is that they can tell us something about the larger state of the system, for example its entropy or its energy. We've seen equations like the one for the intensive entropy, S = -k Σ_i p_i ln p_i, and we've seen that the average energy in the system, the intensive energy, can be calculated from the probabilities and the energies by summing probability times energy, ⟨E⟩ = Σ_i p_i E_i. So we have equations that, once we have the probabilities, tell us what the entropy of the system is and what the energy of the system is. This collection of techniques and equations is essentially what we refer to when we talk about statistical mechanics.
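To make this recap concrete, here's a minimal sketch of the whole chain in Python: given a set of state energies, compute the partition function q, the Boltzmann probabilities, the entropy, and the average energy. The three energy levels are made-up illustrative values, not anything from the lecture, and we set kT = 1 for simplicity.

```python
import numpy as np

# Hypothetical energies of a three-state system, in units of kT
# (illustrative values only).
energies = np.array([0.0, 1.0, 2.0])
kT = 1.0

# Boltzmann factors e^(-E_i / kT) and the partition function q
boltzmann_factors = np.exp(-energies / kT)
q = boltzmann_factors.sum()

# Probabilities of each state: p_i = e^(-E_i / kT) / q
p = boltzmann_factors / q

# Intensive entropy in units of k: S/k = -sum_i p_i ln p_i
entropy_over_k = -np.sum(p * np.log(p))

# Average (intensive) energy: <E> = sum_i p_i * E_i
avg_energy = np.sum(p * energies)

print("probabilities:", p)
print("S/k =", entropy_over_k)
print("<E> =", avg_energy)
```

Note the order of the calculation mirrors the recap: energies first, then q and the probabilities, and only then the macroscopic quantities S and ⟨E⟩.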
And the whole goal of statistical mechanics is to be able to stop talking about the individual microscopic properties of the system, the individual energies and which particles have which energies, the microstate of the system, and instead talk about collective properties: the entropy of the system as a whole, the energy of the system as a whole. Those are the things that describe macrostates of the system, so statistical mechanics helps us move from microstates over to macrostates. Those macrostate quantities, things like the entropy and the energy, are themselves useful for calculating other properties. Once we know something about the energy and the entropy, we can calculate things like enthalpies, free energies, and pressures, other thermodynamic properties. Those terms are familiar from earlier chemistry courses, but we'll learn a lot more about how to calculate them from the entropy and the energy. The relationships between all these different thermodynamic variables are what we mean when we talk about thermodynamics. So once statistical mechanics has helped us turn microstates, individual molecule energies, into macrostates, descriptions of the system as a whole, the relationships between those properties let us predict other properties of the system using thermodynamics. On the other side, however, before we can do any of that, we need to know what the energies are. And it's possible not just to measure but to predict, to calculate, what those energies are, and that's the next thing we'll focus on. This part of the process, predicting the energy of a molecule or a system, is the domain of quantum mechanics, where we'll use different starting points, different equations we haven't discussed yet, for calculating the energies of a system. We'll use those as input for predicting entropies, energies, and other thermodynamic properties.
So this is a sort of roadmap of where we're going. But on this roadmap we're going to start at the beginning: the next thing we'll do is start talking about quantum mechanics and how to use it to predict the energies of molecules.