We are now in a position to calculate the entropy of an ideal monatomic gas. The dynamical state of an atom inside a box is specified by its three spatial coordinates, x, y, and z, and by its three momentum components, p_x, p_y, and p_z. These can be represented graphically by three phase-space points, one in each of the x–p_x, y–p_y, and z–p_z planes; each point specifies a spatial coordinate and the corresponding momentum. We divide each of these three phase planes into rectangles of area h, and a set of three such rectangles defines a phase-space cell, labeled with index i.

Using the techniques of statistical mechanics, we have solved for the most probable occupation frequency of the ith cell: f_i = (1/Z) e^(−ε_i/kT), where ε_i is the kinetic energy of the ith cell, and where Z, the partition function, equals the box volume V times (2πmkT/h²)^(3/2).

The entropy of a system, S, equals Boltzmann's constant k times the natural logarithm of Γ, the number of possible ways to arrange the components of the system. We have seen that this does not change significantly if we take Γ to be the number of arrangements of the most likely configuration, or macrostate, and we have solved for this, giving S = −kN Σ_i f_i ln f_i, where N is the number of atoms.

We will use the following terminology: S denotes entropy, with dimensions of joules per kelvin; S/k is dimensionless, and we call it the dimensionless entropy; and S/(kN), with N the number of atoms, we call the dimensionless entropy per atom.

For our gas sample, the dimensionless entropy per atom equals −Σ_i f_i ln f_i. The log of f_i is −ε_i/kT − ln Z, so we can rearrange this as ln Z Σ_i f_i + (1/kT) Σ_i f_i ε_i. Because Σ_i f_i = 1, and Σ_i f_i ε_i equals the average energy per atom, (3/2)kT, this reduces to ln Z + 3/2.
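Collecting the steps above, the derivation of the dimensionless entropy per atom can be written compactly as:

```latex
\begin{align}
\frac{S}{kN} &= -\sum_i f_i \ln f_i,
  \qquad f_i = \frac{1}{Z}\, e^{-\varepsilon_i/kT} \\
&= \sum_i f_i \left( \frac{\varepsilon_i}{kT} + \ln Z \right)
 = \ln Z \sum_i f_i + \frac{1}{kT} \sum_i f_i\, \varepsilon_i \\
&= \ln Z + \frac{1}{kT}\cdot\frac{3}{2}kT
 = \ln Z + \frac{3}{2},
\qquad Z = V \left( \frac{2\pi m k T}{h^2} \right)^{3/2}.
\end{align}
```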
Again, here is the dimensionless entropy per atom. Plugging in our expression for the partition function Z leads to the result that the dimensionless entropy of a gas with N atoms in volume V at temperature T is S/k = N[ln(V(2πmkT/h²)^(3/2)) + 3/2].

However, there is a fundamental problem with the entropy expression we have derived, which is illustrated by the so-called Gibbs paradox, a thought experiment described by Josiah Willard Gibbs in 1875. Suppose we have a monatomic gas system with N atoms in a volume V at temperature T. Its entropy, S1, equals Boltzmann's constant times the natural logarithm of the number of possible arrangements of the atoms, Γ1. And we have a second system with the same N, V, and T values, entropy S2, and arrangements Γ2. If we think of the two systems as a single composite system, the number of ways to arrange the atoms in the combined system equals the number of ways to arrange the atoms in the first system times the number of ways to arrange the atoms in the second system, which leads to the entropy of the combined system being equal to the sum of the entropies of the two component systems: S = k ln(Γ1 Γ2) = S1 + S2.

So the dimensionless entropy of the combined system should be two times our dimensionless entropy expression for N atoms in volume V at temperature T. Here is that expression, with the two component systems shown sharing a common wall.

Now, remove the common wall. In principle, this can be done without doing work on the system, so without changing the internal energy, and hence the temperature. This gives us a single system with 2N atoms in volume 2V at temperature T. Its entropy should be the expression at left without the factor of 2 in front, with N replaced by 2N and V replaced by 2V. Here's the problem: this entropy is greater than the previous value.
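As a quick numerical check (a sketch, with arbitrary illustrative values for N, V, and T, and helium's atomic mass assumed), the pre-correction formula predicts an entropy increase of exactly 2Nk ln 2 when the wall is removed:

```python
import math

# Physical constants (SI units)
k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s

def S_naive(N, V, T, m):
    """Pre-correction entropy: S = k*N*(ln(V*(2*pi*m*k*T/h**2)**1.5) + 1.5)."""
    return k * N * (math.log(V * (2 * math.pi * m * k * T / h**2)**1.5) + 1.5)

# Illustrative values: 10^22 helium atoms per box, 1 liter each, at 300 K
N, V, T = 1e22, 1.0e-3, 300.0
m = 6.6465e-27  # mass of a helium atom, kg

S_two_boxes = 2 * S_naive(N, V, T, m)   # two identical systems behind the wall
S_combined = S_naive(2 * N, 2 * V, T, m)  # wall removed: 2N atoms in volume 2V
delta = S_combined - S_two_boxes

print(delta, 2 * N * k * math.log(2))  # the difference equals 2*N*k*ln(2)
```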
Both expressions have a factor of 2N in front, but the right expression has an extra factor of 2 in the argument of the logarithm. This says that if we remove the divider, the entropy increases, which is a strange result. The local properties of the gas did not change: in both cases the temperature is the same, the total number of atoms is the same, and so is the pressure. There does not seem to be any irreversible change taking place. Even stranger, if we replace the common wall, we go from the right situation back to the left, and the total entropy apparently decreases. This is a paradox. It implies that if you very slowly close an interior door in your house, the total entropy of the air in your house decreases. There is no thermodynamic rationale for this conclusion.

To gain insight into what is behind the paradox, let's imagine that we can somehow distinguish the atoms that are initially on the left and right, without their being different elements or even different isotopes of the same element, because in that case the mass m in our formula would not be the same for all atoms. We show the atoms colored blue and orange. If we remove the divider, it makes sense that the entropy increases, because the distinguishable atoms mix. There would be observable consequences of this: if you were originally surrounded by only blue atoms, you would start to see a mix of blue and orange atoms. And if we replace the divider, the system does not return to its original state; the blue and orange atoms remain mixed. So we can argue that the entropy remains at the higher value, instead of decreasing.

The key to the paradox, then, is the concept of distinguishable versus indistinguishable atoms. Our entropy formula appears to be consistent with atoms being distinguishable. This is actually not surprising, given that the key statistical result we used to derive the formula was developed by considering how N differently colored balls could be distributed among some number of boxes.
But identical atoms are not distinguishable, and we need to modify our formula to account for this. The dimensionless entropy is the log of Γ, the number of possible arrangements of the atoms. Suppose we place N colored balls in various cells of a grid. Without changing which cells are occupied, how many ways can we rearrange this system? The answer is N factorial: for the first occupied cell we have a choice of N balls, for the second cell N − 1 balls, and so on. But if the balls are indistinguishable, there is only the one possible arrangement shown. Our implicit assumption of distinguishable atoms has led us to over-count Γ by a factor of N!.

Therefore, in our formula, we need to replace Γ by Γ/N!. This leads to S/k being replaced by S/k − ln N!. Using the Stirling approximation ln N! ≈ N ln N − N, we add N − N ln N to our formula. We can combine the second and third terms into a single (5/2)N term. For the log terms, we use N ln a − N ln N = N ln(a/N), and multiply through by Boltzmann's constant to arrive at our final result, called the Sackur-Tetrode equation, developed independently by Otto Sackur and Hugo Tetrode in 1912. S equals kN times the sum of two terms. The second term, 5/2, replaces the 3/2 of our original expression. The first term is the original log term with V replaced by V/N.

If our logic is correct, this additional 1/N factor should rectify the Gibbs paradox. Here are our two systems, each with N atoms in volume V at temperature T, sharing a common wall. The dimensionless entropy is 2 times the N, V, T expression. Removing the divider, we have a single system with 2N atoms in a volume 2V at temperature T. We remove the factor of 2 in front of the first expression, replace N by 2N, and V by 2V.
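Written out, the correction steps described above are:

```latex
\begin{align}
\frac{S}{k} &= N\left[\ln\!\left(V\left(\frac{2\pi m k T}{h^2}\right)^{3/2}\right)
  + \frac{3}{2}\right] - \ln N! \\
&\approx N\ln\!\left(V\left(\frac{2\pi m k T}{h^2}\right)^{3/2}\right)
  + \frac{3}{2}N + N - N\ln N \\
&= N\left[\ln\!\left(\frac{V}{N}\left(\frac{2\pi m k T}{h^2}\right)^{3/2}\right)
  + \frac{5}{2}\right],
\end{align}
```

which, multiplied by k, is the Sackur-Tetrode equation: S = kN[ln((V/N)(2πmkT/h²)^(3/2)) + 5/2].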
The result is that both expressions have a factor of 2N in front, and in the argument of the logarithm they have, respectively, factors of V/N and 2V/(2N). Since the factors of 2 cancel, the two expressions are equal. Therefore, if we remove the divider, the entropy is the same, and if we replace the divider, the entropy is the same. We have resolved the Gibbs paradox, which gives us some confidence in our formula.
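As a further check (a sketch; the comparison value is the measured standard molar entropy of helium, about 126.15 J/(mol·K)), the Sackur-Tetrode equation can be evaluated numerically for one mole of helium at 298.15 K and 1 bar, and its extensivity verified directly:

```python
import math

# Physical constants (SI units)
k = 1.380649e-23     # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s
NA = 6.02214076e23   # Avogadro's number, 1/mol

def S_sackur_tetrode(N, V, T, m):
    """Sackur-Tetrode entropy: S = k*N*(ln((V/N)*(2*pi*m*k*T/h**2)**1.5) + 2.5)."""
    return k * N * (math.log((V / N) * (2 * math.pi * m * k * T / h**2)**1.5) + 2.5)

# One mole of helium at T = 298.15 K and P = 1 bar
T, P = 298.15, 1.0e5
N = NA
V = N * k * T / P           # ideal-gas volume, m^3
m = 4.0026 * 1.66054e-27    # mass of a helium atom, kg

S_molar = S_sackur_tetrode(N, V, T, m)  # J/(mol*K)
print(S_molar)  # close to the measured value of about 126.15 J/(mol*K)

# Extensivity: doubling N and V doubles S, so removing a divider changes nothing
print(S_sackur_tetrode(2 * N, 2 * V, T, m) / S_molar)  # 2.0
```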