In 1965, Bell Labs engineers Arno Penzias and Robert Wilson discovered an isotropic background radiation in the microwave frequency range. In the years since, ever more accurate instruments have refined our measurements of this key piece of evidence for a small, hot beginning to the universe. To understand how important this discovery is, we need a temperature history of the universe. The Big Bang theory holds that there was a time when the universe was very small and very hot. The contents of the universe would have been in thermal equilibrium at that time. Particles in thermal equilibrium all have the same temperature. Roughly speaking, temperature is a measure of the kinetic energy of the particles. In this example, we have a small volume of protons and electrons in thermal equilibrium at 10,000 kelvins, hotter than the surface of our sun. At 10,000 kelvins, the electrons and protons are too hot to combine into hydrogen. If we add photons, they will scatter off the charged particles. Light cannot travel far through this space because it is constantly interacting with these free-moving charged particles. The plasma is opaque. As the universe expands, it cools. At some point, it cools enough for protons to capture electrons to form electrically neutral hydrogen. This process is called recombination. Once the transformation to electrically neutral particles is complete, light travels through space without any further interactions. This is called decoupling. The radiation is said to have decoupled from the electrons and protons. The plasma becomes transparent. In addition to all the electrons and protons in the universe packed into this relatively small space, there were around 1.6 billion photons for every baryon in the universe. Baryons are the protons and neutrons in the plasma. Being in thermal equilibrium, these photons would be characterized by the black body radiation curve.
Knowing the energy that it takes to separate an electron from its proton in a hydrogen atom, we can calculate the temperature at which recombination and decoupling would occur. The current figure is around 3,000 kelvins, the surface temperature of a cool star like Proxima Centauri. As we observe the space around us, we see our solar system, our galaxy, and our local group of galaxies first. We then see significant numbers of large, well-formed galaxies in our local supercluster. The further out we see, the further back in time we go. And the further back in time we go, the more we notice a reduction in the size and structure of the galaxies. Eventually we reach as far as the first galaxies to ever form from the first stars that started to shine. Before that, it was just hydrogen and dark matter. No light was being created for us to see. As we look back in time, we are also looking back at an ever-shrinking volume because the universe was getting smaller and its temperature was getting hotter. Eventually it reached 3,000 kelvins. At that point, hydrogen atoms began to dissociate into protons and electrons and space became opaque. Coming back the other way, the surface where the transition from opaque to transparent occurred is called the surface of last scattering. At that time, all the photons in the universe were released. These photons are still with us today. We see them all across the sky in tremendous numbers. They are the cosmic microwave background, or CMB, photons, and they tell us a great deal about the past, present, and future of the universe. Here's a projection of the celestial dome as seen by the Wilkinson Microwave Anisotropy Probe, factoring out all local and Local Group motion. The mapping preserves the relative sizes of the surface features. The key observation is that the light fits the black body radiation curve perfectly. Its mean wavelength is around 2 mm and its peak intensity is at a wavelength of 1 mm. That's in the microwave range.
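That peak wavelength pins down the temperature through Wien's displacement law, lambda_peak = b / T. Here is a minimal sketch of that calculation; the 1.06 mm input is the measured CMB peak, slightly finer than the rounded 1 mm figure above:

```python
# Estimate the blackbody temperature implied by the CMB's ~1 mm peak
# wavelength, using Wien's displacement law: lambda_peak = b / T.

WIEN_B = 2.898e-3  # Wien displacement constant, m*K

def temperature_from_peak(lambda_peak_m: float) -> float:
    """Blackbody temperature (K) for a given peak wavelength (m)."""
    return WIEN_B / lambda_peak_m

if __name__ == "__main__":
    T = temperature_from_peak(1.06e-3)  # measured CMB peak, ~1.06 mm
    print(f"T ~ {T:.3f} K")  # lands close to the measured 2.725 K
```

Feeding in the rounded 1 mm figure instead gives about 2.9 K, which shows how sensitive the estimate is to the exact peak.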
This gives us the temperature of the radiation today. It is 2.725 kelvins. We know that at decoupling it was 3,000 kelvins. So the temperature has been reduced by a factor of 1,100. We also know that the ratio of the current temperature to the temperature at decoupling is equal to the ratio of the scale factor at decoupling to the current scale factor. So the universe has expanded by a factor of 1,100 since decoupling. The black body radiation formula also gives us the number density of CMB photons. There are over 400 million of them in every cubic meter of space throughout the cosmos. This is a thousand times more than all the photons from all the starlight ever created by all the stars in all the galaxies for all the billions of years that stars have been shining. Earlier measurements of the CMB indicated that it was homogeneous. That would be a problem, because if the CMB were 100% homogeneous, the resulting universe would be 100% homogeneous, and it isn't. But our measurement technologies have improved dramatically over the past half century. The Planck satellite measurements detected small amounts of temperature deviation called anisotropy, meaning different in different directions. The image uses color to show variations from the average, with blue for minus 200 millionths of a degree through green and yellow to red, which represents plus 200 millionths of a degree. That temperature deviation comes to one part in 100,000. These temperature deviations come from small mass density deviations in the plasma at the time of decoupling. For example, suppose we had a small mass density excess in this region. Light from this region would be gravitationally redshifted. These mass density deviations would be the same magnitude as the temperature deviations, one part in 100,000. It is also important to note that these anisotropies have structure. We see large structures, small, even tiny structures, and giant structures. We even see structures within structures at every scale.
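The 400-million-per-cubic-meter figure falls out of the blackbody formula for photon number density, n = (2*zeta(3)/pi^2) * (kT/(hbar*c))^3. A quick check with standard physical constants:

```python
import math

# Number density of blackbody photons at temperature T:
#   n = (2*zeta(3)/pi^2) * (k*T / (hbar*c))^3
# Evaluated at T = 2.725 K this should land near the ~411 million
# photons per cubic meter quoted for the CMB.

K_B = 1.380649e-23   # Boltzmann constant, J/K
HBAR = 1.054572e-34  # reduced Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
ZETA_3 = 1.2020569   # Riemann zeta(3)

def photon_number_density(T: float) -> float:
    """Blackbody photon number density (photons per m^3) at temperature T (K)."""
    return (2 * ZETA_3 / math.pi**2) * (K_B * T / (HBAR * C))**3

if __name__ == "__main__":
    n = photon_number_density(2.725)
    print(f"{n:.3e} photons/m^3")  # ~4.1e8
```

Because n scales as T cubed, the same formula says the photon density at decoupling was 1,100 cubed, over a billion, times higher than today.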
In other words, they're quite fractal. These small-scale anisotropies in the CMB are what led to the large-scale structures such as galaxy clusters, filaments, and voids that we see today. For example, a very tiny spot of red on the surface of last scattering, representing a small decrease in mass density in that region, will have expanded 1,100 times to the size of the Coma Cluster today. The fact that there is a cosmic microwave background with all these characteristics is one of the most important pieces of evidence we have that verifies and validates our current Big Bang model of the universe. Just how the universe evolved from small-scale matter deviations at the time of decoupling to filaments of superclusters and vast voids can be explained by a physical process called caustics. Originally developed to explain the behavior of light, it works just as well for protons and dark matter. Picture a set of uniformly distributed particles on a line, each with a slightly different velocity. They start out with a uniform particle density. But because of the small velocity differences, the particle density will vary as time goes by. Areas of high and low density will develop. The density at a later time, t, is described by an equation that develops hotspots when its denominator approaches zero. Here's a plot of the rate of change in velocity with respect to location on the x-axis, multiplied by time. Naturally, we get a very small curve when time is small. But as time goes on, the deviations grow. When they reach the critical time and beyond, we get hotspots. Expanding this to two dimensions, we get density peaks along curved lines that themselves intersect at points of maximum intensity. I see this phenomenon in my own backyard. The lines at the bottom of a swimming pool are examples of caustics caused by small waves on the surface of the water.
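The one-dimensional picture above can be sketched explicitly. Assuming particles start at positions q with a velocity profile v(q), each sits at x = q + v(q)*t a time t later, and the density scales as rho0 / |1 + t * dv/dq|, spiking wherever the denominator approaches zero. The notation and sample numbers here are illustrative, not from the source:

```python
# 1-D sketch of caustic formation: a region where velocity decreases
# with position (dv/dq < 0) is converging, and its density
#   rho(t) = rho0 / |1 + t * dv/dq|
# blows up ("hotspot") at the critical time t = -1 / (dv/dq).

def density_factor(dvdq: float, t: float) -> float:
    """Ratio of the density at time t to the initial density, for gradient dv/dq."""
    return 1.0 / abs(1.0 + t * dvdq)

def caustic_time(dvdq: float) -> float:
    """Critical time at which a converging region (dv/dq < 0) forms a caustic."""
    return -1.0 / dvdq

if __name__ == "__main__":
    dvdq = -0.5  # illustrative converging flow
    for t in (0.5, 1.0, 1.9, 1.99):
        print(t, density_factor(dvdq, t))  # grows without bound as t -> 2
    print("caustic at t =", caustic_time(dvdq))  # 2.0
```

A region with dv/dq > 0 instead spreads out, and its density only ever falls, which is how the same small velocity differences carve out voids.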
And when we extend this to three dimensions, we get curved surfaces of increased density that intersect along lines, which in turn intersect at points. This is the web-like pattern we see in the large-scale universe. This is what you get from the initial conditions characterized by the cosmic microwave background. No additional forces are involved. Gravity is the key to the formation of stars and galaxies within these filaments of dense accumulations of baryons and dark matter. But it was caustics that took the minor anisotropies that existed at the time of decoupling and turned them into giant walls around great voids. We've mentioned dark matter several times in reference to the CMB. It was the largest matter component of the opaque universe at the time of decoupling. And it is the dark matter density anisotropies that produced the filament structures we see in the universe today. But this only works if the dark matter doesn't fly off at near the speed of light like neutrinos do. In this respect, it is said to be cold: cold dark matter, CDM for short. Using galaxy rotation rates, galaxy cluster dynamics, and gravitational lensing, the best estimate for the dark matter density parameter of the universe is 0.262. The baryonic matter density parameter is measured to be around 0.048. Radiation, mostly from the CMB, contributes significantly less than that. The problem is that all of this isn't enough to account for our flat universe observations. But there is one component of the universe that we have not yet taken into consideration, and that is that empty space, the vacuum itself, has energy. You may recall from our video on the Higgs boson that empty space is not actually empty. It is filled with matter and energy fields. We model the waves in these fields as quantum harmonic oscillators. And given the Heisenberg uncertainty principle, the zero point energy for any wave in the field must be greater than zero.
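The shortfall described here is simple bookkeeping: a flat universe needs the density parameters to sum to 1, and matter alone falls well short. A minimal sketch, ignoring radiation's tiny share:

```python
# Flatness budget from the density parameters quoted above.
# In a flat universe the parameters must sum to 1; the difference is
# the share some other component must supply.

OMEGA_CDM = 0.262     # cold dark matter density parameter
OMEGA_BARYON = 0.048  # baryonic matter density parameter

omega_matter = OMEGA_CDM + OMEGA_BARYON
omega_missing = 1.0 - omega_matter  # gap left for vacuum energy

print(f"matter total: {omega_matter:.2f}")           # 0.31
print(f"missing for flatness: {omega_missing:.2f}")  # 0.69
```

That missing 0.69 is exactly the share the vacuum energy discussed next turns out to fill.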
In the 1940s, a physicist named Hendrik Casimir proposed that this zero point energy was real, and that for the electromagnetic field it could be measured by placing two parallel low-mass conducting surfaces close to each other. The fluctuations in the quantum field will be limited between the surfaces but not outside them. This creates a negative pressure on the surfaces and pushes them together. Negative pressure is called tension. This effect is called the Casimir effect. It wasn't until the 1990s that instruments sensitive enough to register the very small amount of force involved became available. Here we see that the force on one-square-centimeter plates placed one micron apart is equivalent to about one one-hundredth the weight of an average mosquito. The Casimir effect shows that the vacuum energy is real, it's small, and it has negative pressure. But this is only for the electromagnetic field, and it's only one of a number of fields filling empty space. We do not know how many there are, so we can't predict the total amount of vacuum energy. Vacuum energy has a key implication for our flat cosmological model. We have seen that radiation, with w equal to one-third, dilutes by the cosmic scale factor raised to the fourth power. Relatively still matter that exerts no pressure has w equal to zero and dilutes by the scale factor cubed. The vacuum energy density, with w equal to minus one, is a constant. It does not dilute. Therefore the total amount of vacuum energy increases with the volume of the universe. In a small universe it would have had little impact, but today it is almost seventy percent of the energy density of the universe, filling the gap left by the matter-and-radiation-only numbers. In this model, the universe was matter dominated for most of its existence since the Big Bang.
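The plate experiment above can be put into numbers with the ideal-plate Casimir pressure formula, P = pi^2 * hbar * c / (240 * d^4). The mosquito mass below is an assumed round figure, used only to give a feel for the scale:

```python
import math

# Ideal parallel-plate Casimir force, evaluated for 1 cm^2 plates one
# micron apart, then compared to the weight of a ~2.5 mg mosquito
# (the mosquito mass is an assumption for scale, not a measured value).

HBAR = 1.054572e-34  # reduced Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
G_ACCEL = 9.81       # gravitational acceleration, m/s^2

def casimir_force(area_m2: float, gap_m: float) -> float:
    """Attractive Casimir force (N) between ideal parallel conducting plates."""
    pressure = math.pi**2 * HBAR * C / (240 * gap_m**4)
    return pressure * area_m2

if __name__ == "__main__":
    f = casimir_force(1e-4, 1e-6)        # 1 cm^2 plates, 1 micron gap
    mosquito_weight = 2.5e-6 * G_ACCEL   # weight of an assumed 2.5 mg mosquito
    print(f"Casimir force: {f:.2e} N")
    print(f"fraction of mosquito weight: {f / mosquito_weight:.4f}")
```

The striking feature is the fourth-power dependence on the gap: halving the plate separation multiplies the force by sixteen, which is why the effect only became measurable with 1990s-era instruments.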
It was radiation dominated for a mere forty-seven thousand years, matter dominated for nine point eight billion years, and is currently in transition to complete vacuum energy domination. The name we use for this zero point quantum vacuum energy is dark energy. We use the symbol lambda to represent this component of the universe. The symbol was first used by Einstein as a cosmological constant to account for a static universe. It went by the wayside when Edwin Hubble discovered that the universe was not static, but it has now been repurposed to represent this vacuum energy. Well before the major role played by vacuum energy was understood, it was assumed that the universe was matter dominated and that the matter was slowing the expansion of the universe. Two major efforts were started in the late nineteen nineties to measure how quickly the universe's expansion was decelerating. Both groups used distant Type Ia supernovae as their standard candles. Supernovae provide a luminosity reading that enables us to determine their luminosity distance via the inverse square law. And they provide a redshift reading that gives us the cosmic scale factor at the time the light was emitted. The redshift also enables a correction to the expected luminosity due to the expansion of space. Luminosity and redshift combined can tell us whether this expansion is constant, decelerating, or accelerating. Here's how it works. If the expansion rate is constant, the relationship between luminosity distance and redshift takes a simple, known form. Given a redshift, we can compute the expected luminosity distance and therefore the expected observed luminosity. Comparing this to the actual observed luminosity, we would find them equal. But if the expansion is slowing down, the expansion rate in the past would have been greater than what we see now, which means it took a shorter time to expand from its size at light emission time to its present size compared to a non-accelerating universe.
This results in a shorter light travel time, a shorter distance, and a brighter supernova. By the same token, if the expansion is speeding up, the universe was expanding more slowly in the past than it is today, which means it took a longer time to expand from its size at light emission time to its present size compared to a non-accelerating universe. This results in a longer light travel time, a larger distance, and a fainter supernova. This is what both of the supernova study teams found. The supernova observations that discovered the acceleration of our universe's expansion also provided key missing information for our benchmark model. What astronomers do is plot the expected luminosity distance for a variety of scenarios concerning the contents and curvature of the universe. Then they lay the actual observed luminosity distances over the graph to see which scenario is the best fit. Here we see that the lambda cold dark matter scenario, with matter accounting for 30% and vacuum energy accounting for 70%, is the current best fit. It is the current benchmark model. With this benchmark model, we can map the expansion history of the universe from decoupling to the present and on into the future. To illustrate, let's take a final look at GN-z11's numbers. Its redshift of 11.09 gives us the scale factor at emission time, which gives us the time the light was emitted. All the other numbers follow. For the CMB radiation, the redshift tells us that the light we see now was only 42 million light-years away from our location when it was emitted. It traveled for just under 13.8 billion years to reach us, and its starting location is now 46.5 billion light-years away, making the diameter of the visible universe 93 billion light-years. Type Ia supernova redshifts provide a measure of the scale factor back to near the earliest galaxies. The cosmic microwave background provides the scale factor for the surface of last scattering.
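The scenario comparison can be sketched numerically. For a flat universe, the luminosity distance is d_L = (1+z) * (c/H0) * integral of dz'/E(z'), with E(z) = sqrt(Omega_m*(1+z)^3 + Omega_Lambda). This is a minimal sketch, assuming a round H0 of 70 km/s/Mpc, comparing a matter-only model against the benchmark 30/70 mix:

```python
import math

# Luminosity distance vs redshift for two flat models. A larger d_L at
# the same redshift means a fainter supernova, which is what the two
# survey teams observed for the LambdaCDM-like case.

C_KM_S = 299792.458  # speed of light, km/s
H0 = 70.0            # Hubble constant, km/s/Mpc (assumed round value)

def luminosity_distance(z: float, omega_m: float, omega_l: float,
                        steps: int = 10000) -> float:
    """Luminosity distance in Mpc for a flat universe, by midpoint integration."""
    dz = z / steps
    integral = 0.0
    for i in range(steps):
        zp = (i + 0.5) * dz
        e = math.sqrt(omega_m * (1 + zp) ** 3 + omega_l)
        integral += dz / e
    return (1 + z) * (C_KM_S / H0) * integral

if __name__ == "__main__":
    z = 0.5
    d_matter = luminosity_distance(z, 1.0, 0.0)  # decelerating, matter-only
    d_lcdm = luminosity_distance(z, 0.3, 0.7)    # accelerating benchmark
    print(f"matter-only: {d_matter:.0f} Mpc, LambdaCDM: {d_lcdm:.0f} Mpc")
    # The LambdaCDM distance comes out larger, so the supernova appears fainter.
```

At z = 0.5, roughly where the original surveys had their best data, the benchmark model puts the supernova around twenty percent further away, and therefore fainter, than the matter-only model predicts.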
We can tie scale factor and time together given the vacuum energy, radiation, baryonic, and cold dark matter contents of the universe. So we can say, with a fair degree of accuracy, that decoupling started around 250,000 years into the expansion, and the surface of last scattering occurred around 370,000 years into the expansion. But how did we get to the cosmic microwave background? Where did the matter come from? And how did a fractal landscape form? To address these questions, we need to go beyond the surface of last scattering. Back to the time when atomic nuclei formed, called nucleosynthesis, kicked off by neutrino decoupling, and back further still to the time when quarks and protons were created from radiation, called baryogenesis. It is in this time frame that cosmologists propose that a super rapid increase in the size of the universe happened, called inflation. In these earliest times, the universe was radiation dominated, and we have a fairly simple relationship between time, scale factor, temperature, and energy that we can use as we cover each of these key areas. Neutrino decoupling triggers the start of nucleosynthesis. Both processes happen in the unobservable realm, so we have to rely entirely on theory. But the theory is well established in nuclear physics laboratories. When the universe was just one-tenth of a second old, its temperature was around 30 billion degrees. That's about two thousand times the temperature of the interior of our sun. At this temperature, photons can create electrons and positrons, and neutrinos are coupled to protons and neutrons. That coupling enables a fluid flow of protons to neutrons and neutrons to protons. This puts the entire mix of photons, neutrinos, electrons, positrons, protons, and neutrons into thermal equilibrium. Given the mass difference between neutrons and protons, we can calculate the expected ratio of neutrons to protons for any given temperature.
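That expected ratio comes from the Boltzmann factor applied to the 1.293 MeV neutron-proton mass-energy difference: n/p = exp(-(delta_m*c^2)/(k*T)). A minimal sketch:

```python
import math

# Equilibrium neutron-to-proton ratio at temperature T:
#   n/p = exp(-(delta_m * c^2) / (k * T))
# using the 1.293 MeV neutron-proton mass-energy difference.

DELTA_MC2_MEV = 1.293        # neutron-proton mass difference, MeV
K_B_MEV_PER_K = 8.617333e-11 # Boltzmann constant, MeV/K

def neutron_proton_ratio(T: float) -> float:
    """Equilibrium n/p ratio at temperature T (kelvins)."""
    return math.exp(-DELTA_MC2_MEV / (K_B_MEV_PER_K * T))

if __name__ == "__main__":
    for T in (3e10, 9e9, 3e9):
        print(f"T = {T:.0e} K: n/p = {neutron_proton_ratio(T):.3f}")
```

At the 9-billion-degree freeze-out temperature discussed next, this gives n/p of about 0.19, close to the one-neutron-per-five-protons figure the narration quotes.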
Note that the number of neutrons decreases exponentially as the temperature cools. Left unchecked, the universe would have had only one neutron left for every million protons by the time it was only six minutes old. But as the universe continued to expand and cool, the neutrinos decoupled from the protons and neutrons. Using the best available laboratory information, this would have occurred when the temperature cooled to 9 billion degrees, when the universe was around one second old. At that point, two things happened. One, without neutrino interactions, the ratio of neutrons to protons froze. Computations show that at the time of neutrino decoupling, the ratio would have been one neutron for every five protons. This neutrino decoupling process would have lasted around one second. And two, while the neutrinos were coupled to the protons and neutrons, they could not travel far. But once decoupled, they were free to travel across the universe, like photons did at their decoupling. The end of neutrino decoupling marked the beginning of nucleosynthesis. The universe was around two seconds old, with five protons for each neutron. Here we track the predicted ratios of each kind of nucleus as these processes evolved. At the start, there was sufficient energy for protons and the remaining neutrons to combine into deuterons, the nuclei of deuterium. As the deuterium density climbed, significant amounts of helium and hydrogen isotopes were formed, along with lots of tightly bound helium. This diagram illustrates the various nuclear interactions along the way. Helium resists combinations with protons and neutrons, so the creation of beryllium and lithium went more slowly, producing far less of these elements than helium. By the time the temperature of the universe had fallen to 300 million degrees, when it was around 1,000 seconds old, the nucleosynthesis energies were no longer available, and the epoch was over.
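A rough helium yield follows from the frozen neutron fraction. The 1:5 ratio decays toward roughly 1:7 while the surviving neutrons wait for nucleosynthesis to lock them into helium, and the helium mass fraction is then Y = 2*(n/p)/(1 + n/p). The ~200 second waiting time below is an assumed round figure for this sketch:

```python
import math

# Primordial helium-4 mass fraction from the frozen n/p ratio, after
# allowing free neutrons to decay during an assumed ~200 s wait before
# they are locked into nuclei.

NEUTRON_LIFETIME_S = 879.4  # mean free-neutron lifetime, seconds

def decayed_ratio(n_over_p: float, wait_s: float) -> float:
    """n/p ratio after free-neutron decay over wait_s seconds."""
    n, p = n_over_p, 1.0
    decayed = n * (1.0 - math.exp(-wait_s / NEUTRON_LIFETIME_S))
    return (n - decayed) / (p + decayed)  # each decayed neutron becomes a proton

def helium_mass_fraction(n_over_p: float) -> float:
    """Mass fraction of helium-4 if essentially all remaining neutrons end up in He-4."""
    return 2.0 * n_over_p / (1.0 + n_over_p)

if __name__ == "__main__":
    r = decayed_ratio(1.0 / 5.0, 200.0)  # roughly 1:7 after decay
    print(f"n/p after decay: {r:.3f}")
    print(f"helium mass fraction: {helium_mass_fraction(r):.2f}")  # near the observed ~25%
```

Getting roughly a quarter of all baryonic mass into helium from these few inputs is one of the cleanest quantitative successes of the Big Bang picture.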
The vast majority of the baryonic matter wound up in the form of hydrogen and helium nuclei. The percentages of deuterium and tritium were much smaller. There were just traces of beryllium and lithium, and the remaining free neutrons decayed over time into protons. The best way to measure the primordial percentages of these elements is to look at the spectra of distant quasars like this one. The actual amounts of hydrogen, helium, beryllium, and lithium in the early universe do match the predicted levels. This represents significant evidence that the Big Bang nucleosynthesis process did occur. The inputs to nucleosynthesis were baryons: protons and neutrons. Baryogenesis is the process that created these baryons. Quantum field theory shows how photons can create matter-antimatter pairs, and how matter-antimatter pairs can create photons. If we go back in time to when the universe was only 20 trillionths of a second old, its temperature was over 1,000 trillion degrees, and photon energy was around 100 gigaelectron volts. At that level, no baryonic matter could survive, and all of space would have been filled with pure radiation. Then, when the temperature of the universe cooled to around 1.74 trillion degrees, 33 microseconds into its expansion, created quarks and antiquarks could survive. These quarks were not confined within baryons as they are today. Instead, they formed a sea of free quarks, a quark soup, a quark plasma. During this period of quark-antiquark production, the numbers of quarks, antiquarks, and photons were in thermal equilibrium, and therefore present in equal numbers. But the universe today has almost no antimatter at all. Now, suppose nature had a very tiny tilt in favor of quarks over antiquarks. For example, let's say that for every 800 million and three quarks, there were only 800 million antiquarks.
Then, when the universe cooled to the point that quark-antiquark pairs were no longer being produced, all the existing quarks and antiquarks would annihilate each other. In our example, only three quarks would remain, and these three quarks would be surrounded by 1.6 billion photons, the product of the annihilations. As the expansion continued and temperatures continued to cool, the free quarks were bound into protons and neutrons, with the resulting baryon-to-photon ratio equal to our familiar 6 times 10 to the minus 10. Baryogenesis was over. Before the 1960s, elementary particle theory held that the laws of physics were exactly the same for matter and antimatter. The idea of a process that selects one over the other to create a small imbalance was at odds with this equality premise. But in the 1960s, small differences in kaon decay showed that nature does treat them differently. Still, so far no good explanation for how things happened during baryogenesis has been developed. We have now taken the history of the universe back to the first few microseconds of its expansion. At each stage, from baryogenesis through nucleosynthesis to recombination and decoupling, we had the universe in thermal equilibrium. At this point, there are two notable issues with the flat lambda cold dark matter theory. One is that nothing we have covered so far gives us the anisotropy we see in the CMB radiation. And two, the universe appears to have been too large to be in thermal equilibrium during recombination. This one is called the horizon problem, and it's a showstopper. You'll recall that the horizon distance is the furthest distance that light could have traveled in the time available. Assuming a matter and radiation dominated universe appropriate for the time, we get a horizon distance that is 2% of the distance across the region. In other words, there is no way for the vast majority of the particles in recombination to be in causal contact, as required for thermal equilibrium.
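The annihilation arithmetic is easy to check. Taking an illustrative tilt of three extra quarks per 800 million quark-antiquark pairs, with two photons per annihilation and three quarks per baryon (numbers chosen to reproduce the familiar ratio, not measured values):

```python
# Check that a three-in-800-million tilt reproduces the familiar
# baryon-to-photon ratio of about 6e-10.

PAIRS = 800_000_000   # annihilating quark-antiquark pairs (illustrative)
EXTRA_QUARKS = 3      # the tiny surplus nature left behind (illustrative)

photons = 2 * PAIRS          # two photons per annihilation
baryons = EXTRA_QUARKS // 3  # three quarks bind into one baryon
eta = baryons / photons      # baryon-to-photon ratio

print(f"photons per baryon: {photons // baryons:,}")  # 1,600,000,000
print(f"eta = {eta:.2e}")
```

The result, one baryon for every 1.6 billion photons, is the same photons-per-baryon figure quoted at the start of this section.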
To solve this problem, a theory of cosmic inflation was suggested by cosmologist Alan Guth in 1981. We know from quantum field theory that the universe is permeated with any number of matter and force fields. Just as Higgs proposed a new field to explain W and Z boson behavior, Guth proposed that the universe contains a scalar energy field with a very large vacuum energy, much larger than today's cosmological constant. The field is called the inflaton field, and its force particle is called an inflaton. If we assume that this large potential energy isn't changing appreciably, the large Hubble parameter would be constant, and as we have seen before, the universe would expand exponentially. Here's a graph that relates the potential energy density of space to the changes in the energy of the field. A large enough vacuum energy density would cause the universe to double in size once every 10 billion-trillion-trillionths of a second. At the end of the plateau, there is a sharp fall in the potential energy. We don't know what triggered the start of inflation, and we don't know what triggered this end, but during this split-second fall, inflation theory has it that all the potential energy and all the inflatons were converted into all the heat, matter, and energy in the universe. This is the fiery Big Bang we had entering the baryogenesis epoch. The horizon distance at the start of inflation would have been submicroscopic. The horizon distance at the end of inflation, a tiny fraction of a second later, would have been the size of a whale. And the horizon distance at the time of last scattering would have been 652 million light years, 800 times larger than without inflation. This puts every particle in the decoupling process that created the CMB into causal contact with every other particle, easily enabling thermal equilibrium. The zero point energy quantum fluctuations are short, small, and create localized deviations in energy density.
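The submicroscopic-to-whale stretch shows just how fast exponential doubling runs away. In this sketch both sizes are assumed round figures (1e-27 m and 10 m), and the doubling time is taken as 1e-34 s, matching the "10 billion-trillion-trillionths of a second" above:

```python
import math

# Count the doublings needed to grow from an assumed submicroscopic
# patch (~1e-27 m) to whale size (~10 m), and the time that takes at
# one doubling per 1e-34 s.

DOUBLING_TIME_S = 1e-34  # assumed doubling time during inflation

def doublings_needed(start_m: float, end_m: float) -> float:
    """Number of size doublings to grow from start_m to end_m."""
    return math.log2(end_m / start_m)

if __name__ == "__main__":
    n = doublings_needed(1e-27, 10.0)
    print(f"doublings: {n:.0f}")                         # ~93
    print(f"elapsed time: {n * DOUBLING_TIME_S:.1e} s")  # ~9e-33 s
```

About 93 doublings, a factor of 10 to the 28 in size, all completed in a sliver of time far smaller than anything that follows in the timeline.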
Under normal circumstances, the restorative force returns the deviation to normal almost instantly. But an exponentially expanding space weakens the restorative force. Each wave stretches with the expansion and freezes once it reaches the size of the horizon. So we have a large ambient population of waves of every wavelength undergoing this expansion. In this way, large numbers of small, short-lived, localized energy deviations become small, long-lived, extended deviations. In fact, these quantum fluctuations persist long after inflation ends. We can tweak the variables to produce an energy density deviation on the order of 10 to the minus 5, the size of the temperature deviation we found in the CMB. These tiny quantum fluctuations are the origin of the anisotropy in the CMB that eventually led to the large galaxy superclusters and great voids we see around the universe today. The ability to solve Big Bang problems like the horizon problem and the CMB anisotropy has given the theory of inflation a great deal of credibility among cosmologists. Here's a summary of the Big Bang timeline. Quantum mechanics has a notion of a smallest length, called the Planck length. It's theoretically impossible to distinguish two locations less than one Planck length apart. This gives us a smallest possible time interval, called the Planck time. It's the time it would take light to cover the Planck length. And it gives us a highest possible temperature, called the Planck temperature. We'll start here. Cosmic inflation created a radiation-only universe that cooled into a quark plasma following an exponential expansion. Baryogenesis turned the quark plasma into baryons, eliminating antimatter. Neutrino decoupling freed neutrinos to travel across the universe. Nucleosynthesis filled the universe with hydrogen and helium. The photon epoch was the long time period that followed nucleosynthesis.
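Before continuing along the timeline, the Planck scales just mentioned can be computed directly from the fundamental constants:

```python
import math

# The Planck scales, from hbar, G, c, and k:
#   l_P = sqrt(hbar*G/c^3),  t_P = l_P/c,  T_P = sqrt(hbar*c^5/G)/k

HBAR = 1.054572e-34  # reduced Planck constant, J*s
G = 6.67430e-11      # gravitational constant, m^3/(kg*s^2)
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

planck_length = math.sqrt(HBAR * G / C**3)
planck_time = planck_length / C
planck_temperature = math.sqrt(HBAR * C**5 / G) / K_B

print(f"Planck length:      {planck_length:.3e} m")       # ~1.6e-35
print(f"Planck time:        {planck_time:.3e} s")         # ~5.4e-44
print(f"Planck temperature: {planck_temperature:.3e} K")  # ~1.4e32
```

These are the hard edges of the timeline: no length, time, or temperature in the story above can be pushed past them.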
The universe was filled with a hot opaque plasma of photons, atomic nuclei, electrons, and dark matter. Recombination united electrons with nuclei, creating a sea of hydrogen and helium atoms filling the universe. Photon decoupling freed photons to travel across the universe. At this time, the entire sky was as bright as the surface of our sun. The sky darkened as the expanding universe stretched the bright surface-of-last-scattering radiation into the infrared range. With no stars yet formed to give off light, the universe literally went dark. During this time, the caustic process worked the dark matter into filaments, with the baryonic matter tagging along. It was also during this period that the universe left thermal equilibrium. Eventually, the dense clouds of cosmic gas in the filaments started to collapse under their own gravity, becoming hot enough to trigger nuclear fusion reactions between hydrogen nuclei, creating the very first stars. Galaxies of stars formed, and gravitational attraction pulled these galaxies toward each other to form groups, clusters, and galaxy superclusters. Our sun is a late-generation star, incorporating the debris from many generations of earlier stars. It and its solar system formed roughly 8.5 to 9 billion years after the Big Bang. Today, the universe is dominated by vacuum energy and is expanding at an accelerating rate. Eventually, everything gravitationally bound will be in a galaxy, and all other galaxies will be beyond the visible horizon. CMB radiation will disappear. No intelligent species will be able to detect that it exists in a universe bigger than its own galaxy. In addition, all stellar fuel will eventually run out, and the universe will go dark forever. When it comes to the cosmos as a whole, there is far more that we don't know than the little bit we do. Based on new information and discoveries, the benchmark model is changing all the time.
Some of the key unknowns are being investigated by intensive efforts across a spectrum of scientific projects. Here are just a few. There's the hunt for the nature of dark matter. For example, the LHC at CERN now reaches 13 trillion electron volts. That's enough energy to find support for, limit, or eliminate various dark matter theories. There's the hunt for the behavior of the hot quark plasma that existed during baryogenesis. For example, thousands of times a second, Brookhaven National Laboratory's Relativistic Heavy Ion Collider creates a quark-gluon plasma. This research can reveal subtle details about the quark-gluon plasma and, by extension, the origin of matter. There's the hunt for neutrinos with new detectors like Daya Bay in southern China. Researchers there hope to detect neutrinos from the neutrino-decoupling epoch and eventually solve the riddle of why so little antimatter survived baryogenesis. There's the hunt for early gravitational waves with projects like the Laser Interferometer Space Antenna, LISA. Its laser arms will be millions of kilometers long, making it sensitive enough to register weak gravitational waves like the remnants of waves created in the earliest moments of the Big Bang. And the hunt for more Type Ia supernovae continues, along with deeper analysis of each and every one of them. This will provide a more exact measure of the scale factor dynamics since the end of the dark ages. And to further our knowledge, a new space telescope called the Wide Field Infrared Survey Telescope is in design and scheduled for launch in the early 2020s. This telescope is specifically designed to further our understanding of these key standard candles. As these researchers uncover more of nature's secrets, we can expect significant changes to the benchmark model. So stay tuned.