So the properties that we like to discuss in physical chemistry, we like them to be either extensive or intensive. It's very convenient, when we combine two systems, either to be able to add two masses together and get the total mass, or to ignore the fact that we've changed the system size and leave properties like the temperature alone. So extensive is convenient, intensive is convenient, and for every property we're interested in, we want to figure out which one it is. The first thing we're going to do right now is consider the property of multiplicity; we'll get to entropy shortly. We want to find out whether multiplicity is an extensive or an intensive property.

So let's go back to our example of a lattice gas, or any type of lattice model; you can imagine that this is describing a lattice gas. I've sketched two different systems here, call them system A and system B, each in a particular microstate. For system A, the volume is 20 and I've drawn 10 molecules in the box. For system B, the volume is also 20 and I've put 10 molecules in that box. Now, to think about whether the properties of the system are extensive or intensive, we combine the two systems. If I combine systems A and B into a combined system AB, then the volume of that combined system is just 20 plus 20, which is 40, and the number of molecules is 10 plus 10, which is 20. Those are both extensive properties: the volume and the number of molecules are extensive.

What about the multiplicity? For system A, the multiplicity, the number of microstates I could have drawn that had 10 molecules in a volume of 20, is 20 choose 10: that many ways of putting 10 molecules in 20 boxes. For system B, I could also have drawn 20 choose 10 different microstates consistent with its macrostate, so the two systems have the same multiplicity as each other. What about the multiplicity of the combined system AB? You might be tempted to count the number of ways of putting 20 molecules in 40 boxes, but that's not quite right, because I haven't taken away the barrier between the two halves. The right way to describe system AB is to ask how many ways there are of putting 10 molecules in the first box and 10 molecules in the second box. So the multiplicity is 20 choose 10 for the left half multiplied by 20 choose 10 for the right half. The multiplicity of the combined system, it turns out, is not the sum of the two individual multiplicities; it's the product of them. It's also not the same as either individual multiplicity. So volume and number of molecules are extensive properties, but multiplicity is not: if it were extensive, I'd get 20 choose 10 plus 20 choose 10. That's inconvenient. The multiplicity is neither an extensive property nor an intensive property, which makes it less convenient than it could be. But the good news is we can define a new property: take the natural log of the multiplicity and multiply it by a constant, k times the natural log of W. For now, k is just an arbitrary constant.
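Here's a minimal Python sketch of that counting argument (just an illustration, using math.comb for the "n choose k" counts), checking that the combined multiplicity is the product of the individual multiplicities, not their sum, and not 40 choose 20 either:

```python
from math import comb

# Each subsystem: 10 molecules placed on a lattice of 20 sites.
W_A = comb(20, 10)      # multiplicity of system A = "20 choose 10"
W_B = comb(20, 10)      # multiplicity of system B

# With the barrier still in place, a microstate of AB is a choice of
# 10 sites in the left half AND 10 sites in the right half.
W_AB = W_A * W_B

print(W_A)              # 184756
print(W_AB)             # 34134779536  (the product)
print(W_A + W_B)        # 369512       (what an "extensive" multiplicity would give)
print(comb(40, 20))     # 137846528820 (only if the barrier were removed)
```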
The reason I've taken the log of the multiplicity is that, from how logs work, the log of a product is equal to the sum of the logs. So suppose I want to know this new property, which will turn out to be the entropy, for the combined system A and B. We've just convinced ourselves that the multiplicity of the combined system is the product of the multiplicity of A and the multiplicity of B. Logs behave in such a way that the log of that product, the log of W_A times W_B, is equal to the log of W_A plus the log of W_B, both still multiplied by the constant k. But k times the natural log of a multiplicity is, according to our definition, just this new property S. So the first term is S for system A, and the second term, k times the log of W_B, is S for system B. The reason to define this new property as k times the natural log of the multiplicity is that it guarantees the S of a combined system will be the sum of the S values of the two individual systems. That means S is an extensive property, the kind we're looking for. We make this definition in order to come up with a property that is as useful as the multiplicity but, in addition, is extensive as well. And as the title and my comments have already given away, this property has the name entropy. You're probably familiar with entropy already as a measure of disorder in a system, and you may or may not have seen this particular equation before. What this definition does is allow us to calculate, quantitatively and numerically, the value of the entropy for systems we're interested in.

To give you a small example of how that works, let's say we have a chemistry problem: an electron in the 2p orbital of a hydrogen atom. A 2p electron can be in the 2px, the 2py, or the 2pz orbital, so there are three different equivalent, equal-energy orbitals that the electron can occupy. It has a multiplicity of three. If we'd like to know the entropy of that electron, we just need to calculate k times the log of W. We know W is three, so we take the natural log of three. To do anything with this expression, we have to talk about the meaning of k. Earlier I mentioned that k is an arbitrary constant, and it turns out that if all we're interested in is making entropy an extensive property, then k could be any constant we want. In fact, the value of k we're going to end up using is chosen for specific reasons that we'll be able to discuss later in the course: when we take k to be Boltzmann's constant, specifically 1.38 times 10 to the minus 23 joules per kelvin, that will be very convenient when we start talking about gases and other particular physical systems. So that's the value of k we're actually going to use, and we'll see later in the course why it's the right value. Using that value, if we want to calculate a numerical value for the entropy, we take Boltzmann's constant, 1.38 times 10 to the minus 23 joules per kelvin, and multiply it by the log of three, for the multiplicity of the three states this electron can occupy. The log of three, our calculator tells us, is about 1.1.
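As a quick sketch of that arithmetic (an illustration, assuming the rounded value of Boltzmann's constant quoted above):

```python
import math

k_B = 1.38e-23          # Boltzmann's constant, J/K (rounded value used here)
W = 3                   # 2px, 2py, 2pz: three equal-energy orbitals

S = k_B * math.log(W)   # S = k ln W
print(math.log(W))      # ~1.0986, the "about 1.1" from the calculator
print(S)                # ~1.52e-23 J/K
```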
So this numerically works out to be 1.52 times 10 to the minus 23 joules per kelvin as our answer. We can calculate the numerical value of the entropy for this electron occupying one of three different orbitals. And again, for now, the important feature of the entropy is that it's simply an extensive version of what we already think of as the multiplicity.
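And as a closing check on that extensivity claim, here's a short sketch (again just illustrative, reusing the lattice-gas numbers from above) confirming that when multiplicities multiply, the entropies add:

```python
import math
from math import comb

k_B = 1.38e-23                         # J/K

W_A = comb(20, 10)
W_B = comb(20, 10)
W_AB = W_A * W_B                       # multiplicities multiply

S_A = k_B * math.log(W_A)
S_B = k_B * math.log(W_B)
S_AB = k_B * math.log(W_AB)

# ln(W_A * W_B) = ln(W_A) + ln(W_B), so S is extensive:
print(math.isclose(S_AB, S_A + S_B))   # True
```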