Welcome to Quantum Mechanics 1, Birth of the Quantum, the first installment of this YouTube video series on Quantum Mechanics. The word quantum might bring to mind exotic phenomena, such as particles having wave properties, electron clouds in atoms, esoteric particles and processes associated with particle accelerators, and spooky science-fiction scenarios like quantum teleportation. But if you want to understand why quantum theory is necessary, you need look no further than an incandescent light bulb.

Near the end of the 19th century, the electric light was revolutionizing society, based on the seemingly simple process of using electric current to heat a filament until it glowed. The underlying physics, however, was not well understood. Most of the energy was emitted outside the visible range of wavelengths, severely limiting efficiency. The general problem of understanding the relation between temperature and the variation of light intensity with wavelength is what gave birth to quantum theory.

The hero of our story is Max Planck, who began studying physics in 1874. He did so despite being advised against it by a physics professor on the grounds that, quote, in this field almost everything is already discovered and all that remains is to fill a few holes. A premature statement if ever there was one. Planck's doctoral thesis concerned thermodynamics, the study of the relationship between heat, temperature, and the properties of mechanical systems. In 1894 he was commissioned to research how to obtain the most output from a light bulb for a given input of electrical power.

As reflected in the physics professor's advice, many people at the end of the 19th century thought physics was largely complete. Three broad areas of phenomena had been successfully described. Mechanics explained such things as orbital motion in the solar system, the behavior of structures and machinery, acoustics, fluids, hydraulics, and aerodynamics.
Thermodynamics and statistical mechanics explained such things as the relation between heat flow, temperature, pressure, and volume; steam and gas engines; refrigeration; irreversible processes; and the relation between microscopic molecular dynamics and these macroscopic phenomena. Finally, electromagnetics explained the wave nature of light, other types of radiation such as radio waves, and the laws governing electric power generation, transmission, and utilization by electric motors. It's in the interplay of the last two areas where the need for quantum theory became apparent. To understand that, we need to start by looking at how statistical mechanics explains the distribution of energy among a collection of molecules at a given temperature.

As an example, let's consider the behavior of gas molecules in a box. There are about 30 billion billion molecules in a cubic centimeter of gas. We're going to look at a computer simulation of 10 ideal billiard balls on an ideal billiard table as a very simple model of our gas. We'll start with all the balls at rest, and then we'll set one in motion and see what happens. As time goes on, collisions transfer energy between balls. The kinetic energy that was originally contained in a single ball eventually spreads among all the balls, but not equally. Some balls are moving faster than others. In fact, the speed of each ball changes every time it suffers a collision.

Let's look at this more closely by following the history of a single ball, which we'll color black. Its velocity can change dramatically with each collision. It seems quite hopeless to come up with an accurate description of any given ball's motion, and if this 10-ball scenario is intractable, then describing a gas with billions of billions of molecules is laughable. Or maybe not.
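The billiard-ball experiment described above can be reproduced in a few dozen lines. The following is a minimal sketch, not the simulation used in the video: it assumes equal-mass, equal-radius disks on a unit square with reflecting walls, a fixed time step, and the standard equal-mass elastic-collision rule (exchange the velocity components along the line joining the centers). All names and parameters are illustrative.

```python
import math

N_BALLS = 10
RADIUS = 0.03
STEPS = 20000
DT = 0.001          # small enough that a ball cannot jump through another

# Place the balls on a grid so none overlap; all start at rest except
# ball 0, which carries all of the kinetic energy.
pos = [[0.1 + 0.2 * (i % 5), 0.25 + 0.5 * (i // 5)] for i in range(N_BALLS)]
vel = [[0.0, 0.0] for _ in range(N_BALLS)]
vel[0] = [1.0, 0.7]

for _ in range(STEPS):
    # Free flight plus reflection off the table edges
    for i in range(N_BALLS):
        for k in range(2):
            pos[i][k] += vel[i][k] * DT
            if pos[i][k] < RADIUS:
                pos[i][k] = 2 * RADIUS - pos[i][k]
                vel[i][k] = -vel[i][k]
            elif pos[i][k] > 1 - RADIUS:
                pos[i][k] = 2 * (1 - RADIUS) - pos[i][k]
                vel[i][k] = -vel[i][k]
    # Pairwise elastic collisions between equal-mass disks
    for i in range(N_BALLS):
        for j in range(i + 1, N_BALLS):
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            d2 = dx * dx + dy * dy
            if 0 < d2 < (2 * RADIUS) ** 2:
                d = math.sqrt(d2)
                nx, ny = dx / d, dy / d
                # Relative velocity along the collision normal
                dvn = (vel[i][0] - vel[j][0]) * nx + (vel[i][1] - vel[j][1]) * ny
                if dvn > 0:  # collide only if the balls are approaching
                    vel[i][0] -= dvn * nx
                    vel[i][1] -= dvn * ny
                    vel[j][0] += dvn * nx
                    vel[j][1] += dvn * ny

energies = [0.5 * (vx * vx + vy * vy) for vx, vy in vel]  # unit mass
total = sum(energies)
moving = sum(1 for e in energies if e > 1e-9)
print(f"total kinetic energy: {total:.4f}")      # conserved at 0.745
print(f"balls carrying energy: {moving} of {N_BALLS}")
```

Because both wall reflections and the collision rule are elastic, the total kinetic energy stays fixed while the collisions redistribute it among the balls, which is exactly the behavior the narration describes.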
In the 19th century, physicists, most notably Ludwig Boltzmann, realized that while it was impossible to keep track of the trajectory of each molecule, it was possible to make definite statements about the statistics of those trajectories. The resulting analysis, called statistical mechanics, turns out to be very powerful and played a critical role in the development of quantum mechanics. The key is to find solvable problems that can be used to model the behavior of a complex physical system.

To see how this works, consider a number of boxes and a number of balls distributed among those boxes. Here we start with three balls in each of ten boxes. Now, randomly choose a box, and if it's not empty, move one of its balls to another randomly chosen box. Then randomly choose another box and move one of its balls, and so on. After doing this many, many times, is there anything you can say about the distribution of the balls among the boxes?

Here we'll investigate this by simulating 1,000 boxes, each initially containing ten balls. Performing one million random ball transfers, we end up with widely varying numbers of balls per box. Now comes the statistics part. For any number of balls per box, for example the red line here corresponds to 15, we can go through and count how many boxes have that many balls. We can do this for every possible number and display the results in a histogram, where the number of balls per box is shown on the horizontal axis, and the fraction of boxes with that many balls on the vertical axis. For the 15-ball case, we see that the fraction is about 0.03, or 3% of the boxes. Running more simulations, we find similar histograms. These histograms tell us that a box is much more likely to contain a small number of balls than a large number. Now the nice thing about statistics is that the more objects you have, the more precise the statistics are.
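The random-transfer experiment just described is easy to try yourself. Here is a minimal sketch, assuming the same parameters as the narration (1,000 boxes, ten balls each, one million transfers); the seed and variable names are illustrative, and the final loop prints the simulated fractions next to the decreasing exponential discussed below.

```python
import math
import random

N_BOXES = 1000
BALLS_PER_BOX = 10
N_TRANSFERS = 1_000_000

random.seed(1)
boxes = [BALLS_PER_BOX] * N_BOXES

for _ in range(N_TRANSFERS):
    src = random.randrange(N_BOXES)
    if boxes[src] > 0:                       # skip empty boxes
        dst = random.randrange(N_BOXES - 1)  # pick a *different* box
        if dst >= src:
            dst += 1
        boxes[src] -= 1
        boxes[dst] += 1

# Histogram: fraction of boxes holding each number of balls
counts = {}
for n in boxes:
    counts[n] = counts.get(n, 0) + 1
fractions = {n: c / N_BOXES for n, c in counts.items()}

for n in range(6):
    predicted = (1 / BALLS_PER_BOX) * math.exp(-n / BALLS_PER_BOX)
    print(f"{n:2d} balls: simulated {fractions.get(n, 0.0):.3f}, "
          f"exponential {predicted:.3f}")
```

Running it shows the same qualitative picture as the video's histograms: small occupation numbers are common, large ones are rare, and the fractions fall off roughly exponentially.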
In the limit of a very large number of boxes and balls, you can solve for these statistics with pencil and paper, and you get this red curve. Here's the result. In the limit of a large number of boxes and a large average number of balls per box, m, the probability that a given box will contain n balls is 1 over m times e to the minus n over m, a decreasing exponential distribution. Again, the average number of balls per box is m.

OK, so what? Well, now we'll make an analogy to our statistical mechanics problem. Assume each ball represents a chunk of energy, denoted by the Greek letter epsilon, and the boxes represent anything that can contain energy, the molecules in our case. The process of randomly passing balls between boxes represents the random exchange of energy between molecules. The balls-per-box distribution should, therefore, represent the energy-per-molecule distribution. The average energy per molecule is m times epsilon. The energy of any particular particle is n times epsilon.

To find the probability that a molecule will have an energy E, we multiply in two places by epsilon over epsilon, which is just 1, and identify the expressions for the energy E and the average energy. Now, temperature is a measure of average energy per molecule, and we write the average energy per molecule as a constant k times the temperature T. k is called Boltzmann's constant. Finally, we let our chunk of energy epsilon shrink until it is infinitesimally small. We'll call it dE, the differential of energy. This describes a continuous exchange of energy among the particles. The resulting probability that a molecule will have an energy E is 1 over kT times e to the minus E over kT. This is called the Boltzmann distribution.

Statistical mechanics provided powerful explanations for phenomena arising from the interaction of many particles. It accounted for the relation between the pressure, volume, and temperature of an ideal gas.
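The substitutions narrated above can be written out compactly. This is just the chain of identifications from the transcript, using its own symbols (chunk size ε, average balls per box m, energy E = nε, and kT for the average energy):

```latex
% Discrete distribution of balls (energy chunks) per box (molecule):
%   average balls per box m, so average energy per molecule is
%   m\epsilon = kT, and a box with n balls has energy E = n\epsilon.
\[
  P(n) \;=\; \frac{1}{m}\, e^{-n/m}
       \;=\; \frac{\epsilon}{m\epsilon}\, e^{-n\epsilon/m\epsilon}
       \;=\; \frac{\epsilon}{kT}\, e^{-E/kT} .
\]
% Letting the chunk \epsilon shrink to the differential dE gives the
% continuous Boltzmann distribution:
\[
  p(E)\, dE \;=\; \frac{1}{kT}\, e^{-E/kT}\, dE .
\]
```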
It clarified the somewhat mysterious thermodynamic concept of entropy. But when these ideas were applied to situations involving electromagnetics, a serious problem arose, the solution to which required a radical rethinking of the physics of the very small.

According to electromagnetic theory, when an electric charge is accelerated or jiggled around, it radiates an electromagnetic wave. When this wave encounters other charges, it exerts forces that cause them to jiggle in response. They would also radiate their own waves, but we don't show that in this animation. Given that electromagnetic radiation from our sun is the primary energy source for our ecosystem, it's clear that these waves carry energy. Therefore, they should somehow fit into our chunks-of-energy analysis as additional places to transfer energy in and out of.

Consider a piece of metal, which consists of an array of more or less fixed atomic nuclei and bound electrons, represented by green dots, and a collection of conduction electrons which are more or less free to move around, represented by red dots. These particles will have average kinetic energies proportional to the temperature T of the metal. The low-mass electrons will travel with especially large velocities and accelerations. They're charged, so they'll produce radiation. Therefore, we see that there will be a close relation between the temperature of a material and the radiation it produces. This connection between temperature and radiation is readily apparent when an object is heated until it glows. The initially reddish glow becomes whiter with increasing temperature.

Suppose we have a box enclosing a vacuum, the sides of which are made out of some material at a temperature T. The charges in the material will be in thermal motion, and the box will be filled with radiation. The particles making up the box contain kinetic energy, and the radiation in the box contains electromagnetic energy.
We've discussed the relation between temperature and kinetic energy. Now we want to see what the relation is between temperature and electromagnetic energy.