Towards this end, we need to consider the wave aspects of electromagnetic radiation.

First consider one-dimensional waves, such as a vibrating guitar string. The string is fixed at its two ends and cannot move there. In between, it can oscillate with a single bump at a certain frequency. It can also oscillate with a two-bump shape at twice the single-bump frequency, or with a three-bump shape at three times that frequency, and so on. We call each of these vibrations a mode of the string, and we index them by the number of bumps in the string shape. We can draw a number line and place red dots at the integers, n equals 1, 2, 3, and so on, and each dot will then represent a mode. We use the Greek letter nu, which looks a lot like a v, to represent frequency. The frequency of the nth mode is n times some fundamental frequency, nu-zero. So the fifth mode oscillates with the frequency 5 nu-zero.

An important question for us is this: within a given range of frequencies, say between these blue lines, how many modes are there? For the one-dimensional case, we see that the so-called density of modes is constant. If we move the blue lines to the left or right, we'll, on average, get the same number of modes between them. We express this as the Greek letter rho of nu equals a constant a.

Now consider two-dimensional waves, such as on the surface of water or a vibrating membrane. A membrane fixed at its perimeter can vibrate with one bump at a certain frequency, or two bumps at a higher frequency. But there are two ways to have two bumps, so we can have two different modes with the same frequency. In general we can have any number of bumps, n, in one direction and any number, m, in the other direction. So the modes are now represented as points on a two-dimensional grid, indexed by two numbers, n and m. The corresponding frequency is a fundamental frequency, nu-zero, times the square root of n squared plus m squared.
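These mode counts can be checked numerically. Here is a minimal sketch, assuming nu-zero equals 1 in arbitrary units (the function names are illustrative, not from any library): it counts modes in equal-width frequency windows, showing that the 1-D count stays constant while the 2-D count grows roughly in proportion to frequency.

```python
import numpy as np

nu0 = 1.0  # fundamental frequency, arbitrary units (an assumption for illustration)

def count_modes_1d(nu_lo, nu_hi, n_max=2000):
    # 1-D modes: the frequency of mode n is n * nu0
    freqs = np.arange(1, n_max) * nu0
    return int(np.count_nonzero((freqs > nu_lo) & (freqs < nu_hi)))

def count_modes_2d(nu_lo, nu_hi, n_max=1500):
    # 2-D modes: the frequency of mode (n, m) is nu0 * sqrt(n^2 + m^2)
    n = np.arange(1, n_max)
    freqs = nu0 * np.sqrt(n[:, None] ** 2 + n[None, :] ** 2)
    return int(np.count_nonzero((freqs > nu_lo) & (freqs < nu_hi)))

# 1-D: equal-width windows hold the same number of modes (constant density)
print(count_modes_1d(100, 200), count_modes_1d(900, 1000))

# 2-D: a window nine times higher in frequency holds roughly nine times
# as many modes (density proportional to nu)
print(count_modes_2d(100, 110), count_modes_2d(900, 910))
```

The 2-D count is just the number of grid points between two circles, which is why it tracks the area of the annulus and hence grows linearly with frequency.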
This frequency is constant on a circle, so a range of frequencies corresponds to the area between two circles. At higher frequencies there will be more area, hence more modes for a given frequency range. So here there will be more modes with frequencies between seven and eight nu-zero than between two and three nu-zero. For two-dimensional waves, the density of modes is proportional to frequency, and we write rho of nu equals a times nu.

Finally we get to three-dimensional waves, such as the oscillations inside a microwave oven. These are a bit harder to illustrate. The bumps now correspond to cubicles inside a larger volume. We could, for example, divide the volume into two regions in one dimension, three regions in a second, and four in the third. So our modes are now indexed by three numbers, n, m, and l. Each mode is a point in a three-dimensional grid, and a given frequency represents a spherical surface. The density of modes now corresponds to the number of grid points between two spheres. This is proportional to the square of the frequency, and we write rho of nu equals a times nu squared.

The reason this is important is that we would expect each of these modes to act like an energy container, similar to molecules, with an average energy of Boltzmann's constant k times the temperature T. Since the number of modes increases as the square of frequency, we should then expect, for a system in thermodynamic equilibrium, the intensity of radiation to increase with temperature and as the square of frequency. That is, much more radiation at higher frequencies. This is the prediction of classical physics, called the Rayleigh-Jeans law. We write the intensity of radiation as u of nu equals some constant a times nu squared times k times T. This is often plotted using logarithmic scales, in which case it forms a straight line.

Let's see how this stacks up against observation for the largest box we know of, the universe.
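As a quick numerical check of the nu-squared scaling, this sketch (again assuming nu-zero equals 1, with illustrative function names) counts 3-D grid points between pairs of spheres: doubling the frequency should roughly quadruple the number of modes in a window of fixed width, and the Rayleigh-Jeans intensity inherits exactly that scaling.

```python
import numpy as np

nu0 = 1.0  # fundamental frequency, arbitrary units

def count_modes_3d(nu_lo, nu_hi, n_max=150):
    # 3-D modes: the frequency of mode (n, m, l) is nu0 * sqrt(n^2 + m^2 + l^2)
    n = np.arange(1, n_max)
    f2 = n[:, None, None] ** 2 + n[None, :, None] ** 2 + n[None, None, :] ** 2
    freqs = nu0 * np.sqrt(f2)
    return int(np.count_nonzero((freqs > nu_lo) & (freqs < nu_hi)))

def rayleigh_jeans(nu, T, a=1.0, k=1.380649e-23):
    # Classical prediction: u(nu) = a * nu^2 * k * T
    return a * nu**2 * k * T

# Doubling the frequency roughly quadruples the mode count (density ~ nu^2)
print(count_modes_3d(50, 55), count_modes_3d(100, 105))
```

The grid-point count approximates the volume of a thin spherical shell, which is proportional to nu squared times the window width, matching rho of nu equals a times nu squared.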
Here we have the intensity of the cosmic microwave background radiation, the afterglow of the big bang, versus frequency. The points with error bars are observed values and the line is the Rayleigh-Jeans law. The agreement is excellent, at least for the range of frequencies shown. In general, this law accurately describes what is observed at low frequencies.

We see that it cannot be true in general, however, by continuing to follow the line to arbitrarily high frequencies. At infinite frequency we would have infinite intensity, which is not good. This dilemma is sometimes called the ultraviolet catastrophe. Fortunately, this is not the way nature works. As frequency increases, the radiation intensity reaches a peak and then rapidly drops off. Convenient for us, but why does this happen? High frequency radiation, ultraviolet light, x-rays, gamma rays, certainly exists, and the density of modes argument is sound. We are forced to conclude that something is keeping the high frequency modes from absorbing as much energy as the low frequency modes. But what causes this?

So the prediction of classical physics is that radiation intensity varies as the square of frequency, due to the greater number of modes at higher frequencies, and as the temperature, due to the average thermal energy available for each mode. The failure of this at high frequencies leads us to seek some mechanism that would cause the average energy per mode to decrease rapidly at high frequencies. The problematic kT term comes from the Boltzmann distribution, which says that the probability that a mode will have an energy E varies as the exponential of minus E over kT, the average value of this being kT. Let's see if we can put the frequency nu into the energy factor, so the entire term will depend on frequency.

Thinking back to our discussion of the Boltzmann distribution, we used a bookkeeping trick where we assumed energy was exchanged in discrete chunks, epsilon. We then let epsilon decrease to zero to get our final result.
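The bookkeeping trick can be replayed numerically. A minimal sketch, assuming energy comes in chunks of size epsilon with Boltzmann weights exp(-n epsilon / kT): when the chunks are comparable to kT the average energy is suppressed, but as epsilon shrinks toward zero the average approaches the classical kT.

```python
import numpy as np

k = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0         # temperature in kelvin (an arbitrary choice for illustration)
kT = k * T

def avg_energy(eps, kT):
    # Average energy when energy is exchanged in chunks of size eps:
    # <E> = sum(n*eps * exp(-n*eps/kT)) / sum(exp(-n*eps/kT))
    n = np.arange(0, 200_000)
    w = np.exp(-n * eps / kT)
    return float(np.sum(n * eps * w) / np.sum(w))

# Chunk size comparable to kT suppresses the average energy below kT...
print(avg_energy(kT, kT) / kT)         # about 0.58, i.e. kT/(e - 1)
# ...but as the chunks shrink, the average approaches the classical kT
print(avg_energy(kT / 1000, kT) / kT)  # about 1.0
```

The sum has the closed form epsilon over the exponential of epsilon over kT minus 1, which is exactly the expression that will reappear below with epsilon replaced by h nu.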
Let's do the same thing, but now let the chunks be proportional to frequency. We write epsilon equals some constant h times nu. By letting h decrease to zero, epsilon will decrease to zero for all frequencies, which describes a continuous exchange of energy, as before. So if a mode of frequency nu has n chunks of energy, the mode energy is E equals n times h times nu. Now Boltzmann's distribution says that the probability this mode will have this energy is a constant A times the exponential of minus n h nu over kT. To make the sum of all probabilities be 100%, the constant A needs to be that shown here in parentheses.

Now we calculate the average energy. This is the sum over all possible energies of the energy times the probability of that energy. Putting in our expressions for E and P of E, and summing from zero to an infinite number of energy chunks, we end up with the result shown at the right. That is, the expected or average energy of a mode with frequency nu is h nu over the exponential of h nu over kT, minus 1. For large frequencies, the exponential in the denominator will make this very small, which was our goal.

Unfortunately, if we complete our bookkeeping trick by letting h shrink to zero, this expression reduces to kT, and we're back to the Rayleigh-Jeans law, which is what we were trying to fix in the first place. However, for a particular small but non-zero value of h, replacing kT in the Rayleigh-Jeans law with our new result produces Planck's law, first presented in 1900, which does indeed correctly describe the intensity of radiation at all frequencies and temperatures. The current measured value of h, called Planck's constant, is about 6.6 times 10 to the minus 34 joule-seconds. Small, but not zero. Here's Planck's law in blue compared to the Rayleigh-Jeans law in red. They agree at low frequencies, but Planck's law reaches a peak at h nu equals 2.82 kT, and then rapidly decreases.
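Here's a sketch of that comparison, keeping h at its measured value. The function names and the choice of the constant a equal to 1 are illustrative; u of nu follows the text, nu squared times the average mode energy, with h nu over the exponential of h nu over kT minus 1 for Planck and kT for Rayleigh-Jeans.

```python
import numpy as np

h = 6.62607015e-34  # Planck's constant, J*s
k = 1.380649e-23    # Boltzmann's constant, J/K

def planck_u(nu, T, a=1.0):
    # u(nu) = a * nu^2 * h*nu / (exp(h*nu/kT) - 1); expm1 avoids round-off at small x
    x = h * nu / (k * T)
    return a * nu**2 * h * nu / np.expm1(x)

def rayleigh_jeans_u(nu, T, a=1.0):
    # Classical limit: every mode carries the average energy kT
    return a * nu**2 * k * T

T = 2.725  # temperature of the cosmic microwave background, in kelvin
nu = np.linspace(1e9, 1e12, 500_000)
nu_peak = nu[np.argmax(planck_u(nu, T))]

# The two laws agree at low frequency...
print(planck_u(1e9, T) / rayleigh_jeans_u(1e9, T))  # close to 1
# ...and Planck's law peaks near h*nu = 2.82 kT
print(h * nu_peak / (k * T))  # about 2.82
```

For the 2.725-kelvin background this puts the peak near 160 gigahertz, after which the exponential in the denominator takes over and the intensity rapidly falls.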
This accurately describes the cosmic microwave background radiation, and indeed the radiation of any so-called black body at a given temperature T. So nature tells us that Planck's law is the correct description of black body radiation. But built into it is the assumption that there is a finite quantum of energy, h times nu, such that a mode of frequency nu can only have an energy which is an integer number of these quanta. Does this have any physical meaning?

Initially, Planck and others treated it as just a formalism used to derive the correct law. Indeed, the concept of energy quanta seems laughably absurd from a classical point of view, and here's why. The energy in a wave is related to the oscillation amplitude. For example, energy is put into a guitar string by stretching it with a finger. When you let the string go, it oscillates. As it radiates away energy, the oscillation amplitude continuously decreases until all the energy is gone and the string is flat. If instead energy is transferred in discrete quanta, and say we start off with two quanta, the wave would oscillate with a constant amplitude until it lost one quantum, at which point the amplitude would suddenly drop. Then it would oscillate at that level until it lost the second quantum, at which point the wave would vanish. If electromagnetic energy truly comes in discrete quanta, then electromagnetic waves would seemingly display these bizarre quantum jumps. This is absurd in the context of classical theory.

Maybe we don't need to worry about this. After all, seemingly no one had ever observed this type of quantum behavior. If you can't see a quantum, how can you say it does or does not exist? This conservative approach is sometimes summed up as "shut up and calculate." There's no point in worrying about the reality of energy quanta; Planck's law is verifiable, so let's just use it. Or maybe this is more than just a formality. Maybe there's a there there.
Indeed, as we'll see in the next video, these quanta are more than just a math trick or formalism. They have observable consequences that lead us to conclude that they are physically real. But physically real what, exactly? And there's the rub. They're things unlike anything we are familiar with in our day-to-day lives.