So this will be my last lecture and a bit of a hodgepodge: I was trying to include all the topics I had not yet managed to cover, but I'll try to be coherent. What I want to discuss today, to start with, is the cosmic microwave background, which, as you have all become aware in the last 10 or 20 years, is perhaps the most important repository of cosmological information. And the first thing that we notice about it is that it has an extremely accurate blackbody spectrum. This is perhaps the only example I know of so-called precision cosmology: it fits the spectrum of a 2.7255 K blackbody with uncertainties which are usually smaller than the thickness of that line. In particular, the measurements made by the COBE satellite around the Wien peak, where most of the energy is, were decisive in determining this. And what that tells us, in the context of what we were discussing in the last lecture, is that everything we assumed for simplicity — namely that the expansion is adiabatic, that nothing funny is happening, no vorticity, no dissipation, no antimatter annihilation — is actually true. I mean, we are very fortunate that nature has given us such a pristine blackbody spectrum as testimony to the fact that our past was extremely simple. It need not have been like this. When this was measured, a lot of people didn't believe it, and there had been earlier measurements from a rocket experiment that showed features in the spectrum here; they have all gone away. So this is very fortunate for us because it makes our life very simple. Everything in cosmology that we find makes our life simple: the fact that photons outnumber baryons, the fact that everything is thermalized nicely. If it were not so, we would have a much harder time doing cosmology. So I'm going to take you very briefly through how a spectrum like that can develop in the early universe. But first I want to say something about the measurement, because sometimes people say this is the most accurate blackbody spectrum in the universe. Well, that can't be right. If you learn to think like an experimentalist, the best blackbody in the universe must be the internal calibrator of the experiment which measured this, because otherwise of course you can't measure the spectrum. So what they actually did, in this satellite which I think was the most revolutionary instrument for cosmology ever: in the first eight minutes of data taking by the Cosmic Background Explorer, they collected the microwave radiation in this horn antenna — which by the way is a design due to Bob Dicke, who also invented the scalar-tensor theory, and which was still the basic technique used for microwave astronomy until recently. You collect that radiation and then you compare it with some internal blackbody. So you can measure something to be a blackbody only as well as your internal calibrator. And in fact, the measurement was not that precise when it was first done. It took another four or five years before they understood their internal calibrator well enough to quote the very precise number I just gave you. And John Mather, who designed this interferometer which compares the two radiation intensities, was quite rightly awarded the Nobel Prize for this. And this is basically the technique, which I'm showing again because there is now a proposal for a new satellite which might improve the measurements that COBE made by another factor of 100 to 1000.
And that is going to provide us with new insights into the early universe, as you'll see as we go on, because there have to be tiny, tiny spectral distortions, since we know that things happened. We know that there was matter and antimatter that annihilated; we know that electrons and positrons certainly annihilated at half an MeV. There have to be slight traces of the fact that the photons are coupled to massive particles, namely electrons, which are in turn coupled to the ions through Coulomb scattering. And although photons have an invariant distribution as the universe expands — because the energy redshifts at exactly the same rate as the temperature, both going as one over the scale factor — for massive particles this is not true. If you think of the distribution of a massive particle, it's the Boltzmann factor: it goes as e to the minus E over T, and E is the square root of p squared plus m squared. The p redshifts; the m does not redshift. So in fact a massive particle cannot have an equilibrium distribution in an expanding universe. This is actually a theorem; it has to do with the absence of a timelike Killing vector for a Friedmann-Robertson-Walker metric. But fortunately, in our universe the massive particles, namely electrons and protons, are very strongly coupled to the photons. As long as they are so coupled, because the photons outnumber them by 10 to the 9 to 1, every time they try to go out of equilibrium the photons drag them back into equilibrium. But that in turn must distort the photon spectrum a tiny, tiny bit, and the next generation of experiments will actually find that. So here is a plot showing what you'd expect in the form of possible signatures on the microwave background spectrum. This is the temperature plotted as a function of the frequency; if it were a blackbody it would be a horizontal line. Currently there are measurements, as you see here, from COBE — this was the instrument on COBE that made the measurement — and they're very, very precise at high frequencies, which is short wavelengths: these are around the Wien peak. The measurements are less precise — they're not shown — in the Rayleigh-Jeans part, which is where Penzias and Wilson first made their original measurement. And you can have distortions, indicated here by these green lines and purple lines, which are due for example to decaying particles in the early universe; we don't currently have the data to determine whether these things are actually there. I'll explain what this mu and y are in a second, but basically they are a measure of the fractional energy release into the microwave background. So even if you inject one extra photon for every 10 to the 5 photons that there are, you will be able to see it. Especially down here — that is where ARCADE comes in, an experiment that was ground-based and is meant to be doing this; but there is also a proposed satellite experiment. So how do you study this? I'm only going to sketch it out, because it is worth at least one or two lectures and I don't have time for that. But it's kind of interesting, because this is the problem that was originally solved by Kompaneets, who was a student of Landau, and this work was classified all through the war, for reasons that I leave you to guess, because he was studying the interaction of photons through Thomson scattering with electrons in a fireball.
So basically we only got to know about this work much, much later, not until 1957. The Kompaneets equation basically tells you that when photons scatter off electrons — the scattering of course is elastic, it's just Thomson scattering, the number of photons does not change — what happens is that the photons diffuse in momentum space. And here, if you define x as h nu over k times the electron temperature, then the equation is written in terms of x; this n is of course the usual occupation number for bosons, and this y is defined as this quantity here. So y is a measure of the electron temperature — it measures how relativistic the electrons are; the whole description I'm going to give you is for non-relativistic electrons — times the electron density times the Thomson cross section, which sets the rate of scatterings. In those terms you get this equation, which conserves photon number, as you can easily check by calculating the rate of change of the number with respect to y. And in this equation you can recognize a term which looks like diffusion; it looks like the heat equation, and the heat equation, you know, has a simple solution: the Green function is that a delta function just spreads out into a Gaussian with increasing time, and essentially that's what the photons are doing — they're random walking in momentum space. I'm not going to give you the derivation of this equation; I want to encourage you to have a look at Peebles's very nice book. In fact an even nicer book is the first version of this book, from 1981, which has never been reprinted, where he gives a very nice discussion of this classical problem of electrons and photons interacting. Why does this interest us? Because the solution that we are interested in is the Planck spectrum. In general, however, the solution of the Kompaneets equation is not the Planck spectrum. The solution of this equation in general is what's called a Bose-Einstein distribution, which has a possible chemical potential for the photons. Now some of you might be thinking: yesterday this guy said there is no chemical potential for photons, so why am I talking about a possible chemical potential now? Well, that's because then I was talking about an era when photons can be created and destroyed, for example through e+ e− annihilation or whatever. I'm now talking about when the temperature is low enough that you are below all mass thresholds; there are no particles annihilating, and the number of photons is conserved in Compton scattering — in Thomson scattering the number of photons does not change. Whereas for a blackbody there is a precise relationship between the number of photons and their total energy density: you know that one goes as some constant times T to the fourth and the other goes as T cubed — it's 2 zeta(3) over pi squared times T cubed for the number density, and pi squared over 15 times T to the fourth for the energy density. There's a precise relationship, and an arbitrary distribution will not have that relation. So in order to make an arbitrary spectrum of photons into a Planck spectrum you have to create or destroy photons; you need radiative processes, and that's what makes it slow. So the general solution, as I said, has this chemical potential, because photon number is conserved in Thomson scattering.
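For reference, here is the equation being described, in its standard form (a sketch; conventions differ slightly between books, so treat the factors as indicative):

$$ \frac{\partial n}{\partial y} = \frac{1}{x^2}\,\frac{\partial}{\partial x}\left[ x^4\left( \frac{\partial n}{\partial x} + n + n^2 \right)\right], \qquad x \equiv \frac{h\nu}{k T_e}, \qquad y \equiv \int \frac{k T_e}{m_e c^2}\, n_e\, \sigma_T\, c\, \mathrm{d}t . $$

The first term in the bracket is the diffusion piece, and the $n$ and $n^2$ terms describe recoil and stimulated scattering. Multiplying by $x^2$ and integrating over $x$, the right-hand side is a total derivative, so the photon number, proportional to $\int x^2 n\, \mathrm{d}x$, is conserved, exactly as stated above; and the stationary solutions, obtained by setting the bracket to zero, are the Bose-Einstein distributions $n = 1/(e^{x+\mu}-1)$ that we come to next.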
So if you start with too few photons — I'll show you a plot of what that looks like in a second to make it clear — but first I just want to show you that if you integrate that, normally for a blackbody spectrum you would get 2 zeta(3) over pi squared; here you get an extra term, okay? And the coefficient of that extra term is the chemical potential, and similarly if you calculate the energy density you get an extra term. So the energy density and the number density of a Bose-Einstein spectrum are slightly different from those of a Planck spectrum, which is the limit when mu goes to zero. So what you want to do is the following. Suppose some process injects some power into the microwave background — maybe matter annihilation or, I don't know, black hole evaporation, think of whatever exotic process you like. That is not injecting photons with a precise Planck spectrum; it's injecting them with some random spectrum — not random, it will be decided by the process; if it's black hole evaporation it'll be some cascade of high energy particles, which you can model with HERWIG or whatever, right? But to convert that into a Planck spectrum we first have to work out how much energy density we put into the microwave background and then ask what the equivalent number of blackbody photons is. Then I see that I have injected too few photons and need to make a lot more, and that is going to take some time. So the excess energy density at constant number density is proportional to mu — it's of order mu — and that is why I said this is like the fractional energy input into the microwave background. And the COBE experiment, the FIRAS far-infrared spectrophotometer, gave a bound on mu of about 10 to the minus 4. What that means is that, given a certain amount of energy in the microwave background, I'm not allowed to inject energy into it at more than about six parts in 10 to the 5 — a very, very tight constraint, okay? And that is what assures us that the expansion was more or less adiabatic, because whatever happened, the photons would have carried a memory of it. So, as I said, here is what the thing actually looks like — let me just put this whole thing up. A Bose-Einstein spectrum looks like that, okay? This is the Planck spectrum that you want and that you actually see on the sky. The general solution of the Kompaneets equation, however, is some spectrum that looks like that: the effect of adding that mu in the exponent is to put in this characteristic dip, okay? So what's going to happen is that if I inject a lot of high energy photons — generally they will be high energy photons — I have to break them up into lower energy photons. How do I do that? Well, I have to look to radiative processes. Radiative processes like bremsstrahlung generate an extra photon; Compton scattering can radiate off an extra photon. They are of course suppressed by one power of the coupling, the electromagnetic coupling. And those photons will typically be soft — the rate of radiative processes always has an infrared divergence — so those soft photons will be created at very low frequency and will then diffuse up according to the Kompaneets equation and slowly fill in this gap, so that the spectrum evolves towards the Planck spectrum. If it is bremsstrahlung, the evolution looks like that; if it is double Compton scattering, the evolution looks like that. So you can see that mu is gradually reduced; ultimately, you have a Planck spectrum.
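To put a formula behind that, here is a minimal sketch in the usual notation (the 1.4 factor is the standard leading-order relation for small $\mu$, quoted here only for orientation):

$$ n(x) = \frac{1}{e^{\,x+\mu}-1}, \qquad n_\gamma^{\rm Planck} = \frac{2\zeta(3)}{\pi^2}\,T^3, \qquad \rho_\gamma^{\rm Planck} = \frac{\pi^2}{15}\,T^4, \qquad \mu \simeq 1.4\,\frac{\delta\rho_\gamma}{\rho_\gamma}\bigg|_{N_\gamma\ {\rm fixed}} . $$

So a bound of $|\mu| \lesssim 10^{-4}$ indeed corresponds to a fractional energy injection at constant photon number of no more than several parts in $10^5$, which is the number quoted above.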
And so what I can ask is that this process should go on for long enough to give me a Planck spectrum as good as the one that is seen. And that then gives me a time scale for the problem. It allows me to say that the Planck spectrum was created no later than a certain time, because later than that the density of the universe is too low, and the time scale too long, to generate the necessary photons to give me my Planck spectrum. So you can see that this is a very powerful constraint on the thermal history, and it allows you to put a very model-independent constraint on any energy injection into the microwave background. So, okay. I don't have time to discuss this in detail, but here is a paper written in fact by colleagues at SISSA — Burigana, Danese and De Zotti — who worked this problem out in some detail, and they showed that basically, in order for some arbitrary injection of photons to be thermalized perfectly, you need that energy injection to happen no later than a redshift of about 10 to the 6, which is about the first day after the Big Bang. So, you have heard it before that somebody made light on the third day; well, the microwave background spectrum was decided about a day after the Big Bang — no later than a day. It could of course have been decided at the Planck time for all we know, but it could not have been later than about a day, otherwise there would be a spectral distortion in it. So just as nucleosynthesis tells us about things that happened at one second, the microwave background spectrum allows us to put a strong constraint on anything that happened up to about a day — a day to actually several days — after the Big Bang. So I can turn that around and work out the corresponding bound on the energy release, and this is what the COBE team did in order to draw this plot. They show the fractional energy release that is permitted into the microwave background as a function of the redshift. I didn't discuss the y distortion, which is slightly different — that is like a superposition of lots of blackbodies slightly shifted in temperature — and that was first discussed by Zeldovich and Rashid Sunyaev, as was the mu distortion for that matter. The Soviet cosmologists discussed all this in the 1970s and gave a full formulation, and that formulation allows us now, using modern instruments and data, to put this kind of constraint on the fractional energy release which can be thermalized. It can be no more than one part in 10 to the 4 for most of the history of the universe, and then this curve goes up asymptotically because you can put in an arbitrary amount at higher redshifts; but up to about 10 to the 6 or 10 to the 7 there's a very strong constraint on how much you can put in. Up to here, there's a very tight constraint. So what this means, unfortunately, is that most of our imaginings about cosmology and what could possibly have happened have to be restricted to earlier epochs. You can't mess around here, because if you did anything there then you would have seen it. And what Sunyaev and his colleagues are now proposing is that you can actually improve these measurements by another two or three orders of magnitude to see the small residual distortions that must be there. And it turns out that the technology of measuring the spectrum has got good enough that you can possibly do that.
So there's a proposed space mission called PIXIE which is aimed at doing this. Now let me tell you something about the decoupling of the radiation. I have so far discussed the formation of the blackbody spectrum, and if you think about it intuitively, to make something into a blackbody you need a black body: you need something which is entirely enclosed and where scattering happens often enough. Here we don't have an enclosure — the whole universe is the enclosure — but we want to make sure that the scatterings happen often enough; in other words, that the scattering rate is much higher than the Hubble rate, and in particular that the rate of photon creation is higher than the Hubble rate. Thomson scattering is a very efficient process, but bremsstrahlung and double Compton scattering are inefficient because they are damped by an extra power of the coupling, and therefore you have to wait longer for those processes to operate, or you have to have a higher density. So they can only work at high redshifts; they cannot work at low redshifts. That is why we have the constraint I showed you. So the universe, we worked out, must have developed the Planck spectrum by a redshift of 10 to the 6 or 10 to the 7, but the photons are still coupled to the electrons — it's still a plasma. What happens then? Well, this is the process, and the interaction rate is, as we have been discussing, the number density times the cross section averaged over the velocity. The number density goes as T cubed, the Thomson cross section has no energy dependence, and X_e is the ionization fraction, which is a measure of the number of electrons. Let's write it like that, because then we can compare it with the Hubble rate, which in the matter-dominated era goes as T to the 3/2 — that's the era in which this is going to happen. And as before — we keep coming back to this again and again — any rate in the early universe must be compared with the Hubble rate. The Hubble rate of course is changing with time; it is much faster in the early universe, the corresponding time scale is shorter and shorter, so whatever process you're thinking about has to be more and more efficient to keep up with it, and at some point it might fall behind. That's why we saw yesterday that no two-body scattering process can maintain equilibrium above 10 to the 14 GeV. But now we are talking about much lower temperatures — something of the order of an electron volt, as you'll see — when Thomson scattering will finally be unable to keep pace with the expansion of the universe. And once the Thomson scattering rate falls behind the Hubble rate, the photons and matter will decouple. Now at this point the ionization fraction drops very rapidly, because the protons and the electrons combine. People call it recombination, but they had never been combined before, so I don't know why they call it recombination — it is combination; you start with an ionized plasma. And at that point the Thomson scattering rate drops very, very sharply, because the number of ionized electrons — the ionization fraction — is also dropping very sharply. And this defines a last scattering surface, which is just like the photons coming to us from the sun. They random walk out from the sun; they take typically a few million years — I don't know if you know that. The sunlight we see coming from the sun today is actually a few million years old.
It's only the neutrinos you measure from the sun that come to us in seven minutes or whatever, right? So in fact, if the sun switched off today, you wouldn't realize it for a few million years. But what happens is that the photon random walks to the surface of the sun — so let me just draw the sun and the photosphere. Here is a photon random walking inside the sun, and of course, like a drunk, it gets further and further away from the centre. And at some point — because the density of the sun is decreasing very sharply, right? — it reaches the point, which is the photosphere, from which the photon basically comes straight to us, where we are looking at it. And this is the photosphere of the sun: basically, the photon makes at most one more scattering before it comes to us. So we call that the last scattering surface of the sun, okay? Now imagine this whole thing inside out. The sun is there, we are here, but now we are looking at the cosmic photosphere which surrounds us. It is back in the past, back on that hypersphere somewhere near the South Pole, close to the Big Bang on the kind of scale we were talking about. It's only at a redshift of a thousand, but still it's a long way away from us — almost the entire scale of the universe away from us. And we can't look beyond it, because the universe becomes ionized at that point. And that is what we see today as the cosmic microwave background. In order to calculate when this happens, we have to again look at this chemical equilibrium business that I was talking about yesterday, and we see that the chemical equilibrium here is determined by the combination of protons and electrons into hydrogen, and this 13.6 electron volts is the binding energy. So you just have to solve the Saha equation, which tells you what the equilibrium ionization is for a plasma which is recombining, right? And that, as you'd expect, essentially goes as T to the 3/2 times e to the minus 13.6 eV over the temperature — this is the Boltzmann factor — and I have written it parametrized in terms of the baryon-to-photon ratio, okay? By solving this equation we are going to get the answer as to how the recombination actually proceeds. So basically, this is a calculation from a code — in fact, I'll put the whole thing down there. Recombination is happening; this is the redshift; you can see the ionization fraction drops very sharply, from fully ionized to almost neutral, very, very rapidly, around a redshift of 1100 or 1200. And the decoupling of the photons and baryons, according to that criterion Gamma of order H, happens at about the same time, okay? And this is extremely sharp. This is important, because if that photosphere were extended, then essentially what you'd see is a fuzzy image — it would go out of focus because it would be smeared out by the thickness of the photosphere. It's because it's a very sharp surface that we see a sharp image of the cosmic microwave background fluctuations; otherwise something called Silk damping would totally wipe it out. So we're very lucky that that didn't happen. And if you want to do this calculation more carefully, you have to take into account the fact that there is helium and so on and so on, and there are standard codes to do that, okay? The interesting thing is that this does not actually alter the spectrum of the CMB at all, because again there are 10 to the 9 photons per electron.
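Here is a minimal sketch of that Saha estimate in code — hydrogen only, pure equilibrium, rough values for the cosmological parameters — so purely illustrative; the real calculation is done with codes like RECFAST that include helium and the non-equilibrium physics:

    # Saha ionization fraction and Thomson rate vs Hubble rate (illustrative sketch only).
    import numpy as np

    T0      = 2.349e-4    # present photon temperature in eV (2.7255 K)
    ETA     = 6.1e-10     # baryon-to-photon ratio
    B_H     = 13.6        # hydrogen binding energy in eV
    M_E     = 5.11e5      # electron mass in eV
    SIGMA_T = 6.65e-25    # Thomson cross section in cm^2
    HBARC   = 1.97e-5     # hbar*c in eV*cm

    def saha_xe(z):
        """Equilibrium ionization fraction from X_e^2/(1-X_e) = S(T)."""
        T = T0 * (1 + z)
        n_gamma = (2 * 1.202 / np.pi**2) * T**3              # photon density in eV^3
        S = (M_E * T / (2 * np.pi))**1.5 * np.exp(-B_H / T) / (ETA * n_gamma)
        return 0.5 * (np.sqrt(S**2 + 4 * S) - S)

    def gamma_over_H(z):
        """Thomson scattering rate n_e * sigma_T * c over the matter-era Hubble rate."""
        T = T0 * (1 + z)
        n_gamma_cm3 = (2 * 1.202 / np.pi**2) * (T / HBARC)**3  # photons per cm^3
        n_e = saha_xe(z) * ETA * n_gamma_cm3
        gamma = n_e * SIGMA_T * 3e10                           # scatterings per second
        H = 2.27e-18 * np.sqrt(0.14 / 0.49) * (1 + z)**1.5     # s^-1 (h = 0.7, Omega_m h^2 = 0.14)
        return gamma / H

    for z in (1600, 1400, 1300, 1200, 1100, 1000):
        print(f"z = {z}:  X_e = {saha_xe(z):.3e}   Gamma/H = {gamma_over_H(z):.2f}")

Run as-is, the ionization fraction falls from near unity to below a percent between a redshift of about 1500 and 1100, and Gamma over H drops through one at about the same place — which is the sharp last scattering surface just described.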
Going back to that point about the 10 to the 9 photons per electron: the fact that the electrons are non-relativistic, that they're going out of equilibrium, that left to themselves they can no longer have an equilibrium distribution — none of this matters, okay? The spectrum of the photons is unaltered, except that Sunyaev has pointed out recently that there should be a little residual distortion at the level of about one part in — I can't remember exactly — 10 to the 9, I think. And that could in principle be measured. So — I'm not going to talk about the cosmic microwave background fluctuations, that is worth several lectures — but I want to show you another way in which nature has been very kind to us in giving us a relic of the past. What we see on that last scattering surface is the fluctuating gravitational potential. And when photons come from that surface, obviously they're redshifted or blueshifted according to whether they're coming from the peaks or the troughs, okay? So for example, this is a trough in the gravitational potential — there is a lot of matter there — and that therefore causes the photon to be redshifted: it loses energy as it climbs out of the well. However, if there is a potential well there, it means there's a lot of matter there, and if the fluctuations are adiabatic — which means that the baryon-to-photon ratio is constant, which is what you expect if the excess matter has been made by some particle physics process — then that is also where the matter is more compressed. And when you compress the matter, it heats up the photons. So the photons which are going to lose energy climbing out are hotter to start with, and these two effects more or less cancel, okay? But they don't quite cancel — thank God for that, right? — because if they cancelled, you would see nothing. In fact, they fail to cancel at the level of about one third of the total effect. There is also a Doppler shift effect, because these things are not static, they're moving — there are sound waves on the last scattering surface — so you see a small Doppler shift. These used to be incorrectly called Doppler peaks; they're actually acoustic peaks. And because these effects don't cancel, the microwave background has a memory of the past, and that is of course the subject of a rich vein of inquiry which has resulted in great advances in the study of the cosmic microwave background. But as I said, there is no effect on the CMB spectrum, at least not at a level above about one part in 10 to the 9. I want to show you another effect that you can have. This is very topical, because there has been a lot of talk recently of dark matter annihilation giving the positron anomaly which PAMELA and then AMS detected. The thing is, if you had particles annihilating in the past, what would happen? What this is showing you is the last scattering surface — this is the optical depth as a function of redshift — and you can see how it is decreasing very, very rapidly. If you had some particles, or some particle annihilations, that were injecting ionizing photons, then you would mess up what is going on: the ionization fraction would not fall as sharply as it does in the absence of any exotic physics. And the result of that is that, because you are smearing out the last scattering surface, the acoustic peaks would actually be seriously damped. So this is an old paper which we wrote when the acoustic peaks had still not been detected.
This was the first data from preliminary experiments, and they were already showing some fluctuations, but they had not clearly detected the peaks. So we just said: look, if you see any structure at all, then that rules out the idea that something is injecting ionizing photons. And in fact, the same argument has now been generalized and extended to annihilating particles. This is the latest result from Planck, which puts a bound on the annihilation cross section of any particle as a function of its mass, and the value of the annihilation cross section and mass that you require to explain the positron anomaly seen by PAMELA and AMS in terms of dark matter is in that box there. And because Planck makes a very precise measurement of the acoustic peaks, they can improve our argument by a factor of 100 right away; then you include the polarization information, and that improves it by another factor of eight. So overall, you can now actually constrain that whole region — you can say it is ruled out. It doesn't matter how you get such a large cross section: if that cross section is there, if those particles are annihilating, you can rule it out simply from observing that the acoustic peaks of the CMB are not damped. You see what a powerful probe this is, right? And very, very model independent. Yes? No, no — this is all happening at a redshift of a thousand, which is 100,000 years after the Big Bang, so all the dark matter freeze-out happened way, way back; this is very late times. Oh, sorry — the question was: does this not depend on the time at which the dark matter froze out? And my answer was that the freeze-out happened at much less than a second, typically around the quark-hadron phase transition and so on, whereas this is 100,000 years after the Big Bang. This is the beauty of it: if the dark matter is annihilating, it's not going to annihilate only in our galaxy — it has been annihilating all along. Of course, the rate of annihilation was quite small, because the particles were not clumped together; they were uniformly distributed. But what we are constraining here is just the rate of injection of photons of energy more than 13.6 electron volts; I don't care where they come from, right? And therefore I'm presenting a completely model-independent bound here. You translate this into your favourite model — you might have some favourite particle, a black hole, God knows what, which does all kinds of crazy things — and this will put a constraint on it. Okay, so I'm not actually going to talk about dark matter much, because that will be covered in three lectures by Kathryn Zurek. But I'm going to touch upon it, because of the constraints on dark matter properties — unstable dark matter. We of course generically refer to dark matter as the stable particle, but there could be many other particles in physics beyond the standard model which are unstable, or which can annihilate with each other — they might have disappeared by now — and all these relics from the past allow us to constrain them. So what we are really doing is using the standard cosmology which I've set up as a laboratory for particle physics, right? It's as good a laboratory as the reliability with which we know the numbers concerned, which is why I've gone to some care to point out that we actually do know some things.
We can measure nuclear abundances, we can measure the spectrum of the CMB, and we interpret these numbers with very simple physics — all the physics I've talked about is classical physics, okay, there is no ambiguity there. So now I'll talk about what we can infer about the early universe from all these things that we have set up, and I'll give some examples of constraints on new physics. First of all, what do we have in our universe? If you ask the astronomers, they'll tell you this is the breakup of the universe: we have a few percent of the stuff that we are made of, we have about six times more dark matter, and the rest is supposed to be dark energy, right? Baryons we have already discussed — you of course see them out there glowing with light. Most of the baryons are actually not in stars or dust; they're in the hot X-ray emitting plasma in clusters of galaxies. And even that makes up only about a third of the baryons, but we can get a total audit by doing the Big Bang nucleosynthesis calculation, as I showed you yesterday. That gives us a measure of the baryon-to-photon ratio, which tells us that there are at least three times more invisible baryons than visible ones — but that it's not enough to make up this dark matter. Of course, there are no anti-baryons; that is the important question that comes up, as to why. Then we have all this information from the Hubble diagram of Type Ia supernovae, from the analysis of this pattern in the microwave background, from the correlations of the large-scale structure of galaxies. All that is telling us that about two-thirds of the universe is dark energy. It's actually inferred from a sum rule: you measure omega_k to be close to zero from the size of the spots — that's the first acoustic peak — and you measure omega_matter to be about 0.3 from the distribution of galaxies, assuming that the power spectrum of primordial fluctuations is a power law, okay? In that case, you can say this is 0.3, this is zero, therefore this must be 1 minus 0.3, i.e. 0.7. That's how you determine the dark energy; you don't actually measure dark energy directly, okay? The supernovae give you the difference between omega_lambda and omega_matter, not omega_lambda directly. Dark matter, however, has both geometrical and dynamical evidence. That is to say, if you understand the dynamics — if general relativity is correct — then you know about rotation curves (this will be discussed, I'm sure, in the dark matter lectures), you know about gravitational lensing, and the most powerful argument comes from the formation of that structure of galaxies: if I take the power spectrum of that and plot it, the data points are those error bars there, and if we had only baryons, then it would look like this — you would see strong acoustic oscillations in that spectrum, just as you see in the microwave background spectrum. These are the so-called Sakharov oscillations. We don't see them; we see very, very faint ones, the so-called baryon acoustic oscillations, but overall the spectrum looks like that, not like this. And this is the most powerful evidence there is that there must be some particles that don't interact with photons, which we call dark matter, in which the fluctuations grow through gravity, and which dominate over the baryons by a large factor, at least six. So, as we have already noted several times, to explain these things requires new physics, right? Dark energy is another issue.
No new physics that I know of explains dark energy, because the scale of the dark energy is the scale of the present Hubble radius, which is not a scale in physics that we know about — it's an infrared scale, the scale of the universe as a whole. So let us ask, from the point of view of particle physics, what should the universe be made of? We think that we have a complete description of all fundamental forces and particles; surely we should be able to say something about what cosmology should look like. We do, after all, know about the particles you are made of. You are made out of baryons, and we have a complete understanding of the force that binds them together and makes them stable — they're stable because they carry baryon number. It's a global quantum number, but nonetheless, experimentally, we know it's a pretty good one: there's no proton decay seen, and if baryon number is violated, it's violated much more weakly than that. And we know everything about baryons. We know their couplings; we know this asymptotic freedom, which, as I told you yesterday, is the sole reason why you can do cosmology at all, why the universe does not become a strongly coupled QCD plasma — fortunately, otherwise you would not be able to do cosmology at high temperatures. We know the spectrum of states in QCD: we can calculate it on the lattice now, and the agreement between the expectations and the measurements is good enough that we can say we know everything about QCD, both at the perturbative and the non-perturbative level. So we should be able to predict how many baryons — think of baryons now as dark matter; I already told you that at least three quarters of the baryons are dark — can we say something about how many baryons should be left over from the Big Bang? Well, we can do that, but in fact the standard expectation is that essentially nothing should be left over from the Big Bang: the value of omega in baryons should be about 10 to the minus 10, and what we observe is bigger by at least eight orders of magnitude, right? How did that happen? Well, this calculation was first done by Yakov Zeldovich, and this equation, which is an averaged form of the fundamental Boltzmann equation, basically just expresses detailed balance. The particles sit in some comoving volume; they are being diluted by the expansion as 3 a-dot over a, because the number is conserved in a volume which is increasing as a cubed, so the differential of that gives you 3 a-dot over a. And the number is determined by the balance between annihilations, which come with a negative sign — the particles are disappearing at a rate which is n squared times sigma v — and creation. At what rate are particles being created? Well, Zeldovich has a simple answer: stop the expansion, okay? So I take out that term, I wait for the system to come to equilibrium, so n-dot goes to zero; then the rate of creation must be the same as the rate of annihilation. So the rate of creation is exactly the same term, except that I put a subscript for the thermal, equilibrium abundance of the particles, okay? It's very simple — see how you can take shortcuts to solving a very complicated Boltzmann equation by doing this simple trick.
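In symbols, the equation being described is (with $n_{\rm eq}$ the equilibrium number density; this is just a sketch of the standard form):

$$ \dot n + 3\,\frac{\dot a}{a}\,n = -\langle\sigma v\rangle\left(n^2 - n_{\rm eq}^2\right). $$

Zeldovich's trick is simply the observation that the creation term must be $+\langle\sigma v\rangle\, n_{\rm eq}^2$, because with the expansion switched off that is the only choice for which $\dot n = 0$ in equilibrium.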
And that equation therefore describes what happens: particles start out at high temperature, when the mass is much smaller than the temperature, so they are relativistic and the abundance — given by Fermi or Bose statistics, whatever — is of order one in units of the photon number, right? But then the mass becomes comparable to the temperature — the universe is cooling down this way, so m over T is increasing and e to the minus m over T falls like that — and the particles decrease their abundance extremely rapidly according to that Boltzmann factor, okay? And because the number of particles is decreasing exponentially, the rate of annihilation is decreasing exponentially, and at some point it drops below the Hubble rate, which as before is T squared over the Planck scale; that point is called freeze-out. The particles decouple, and then their number density is constant relative to photons — constant in a comoving volume — so the lines become horizontal, okay? If I increase the annihilation cross section, they can stay in equilibrium longer and deplete themselves more. So the more strongly interacting a particle, the less of it is left over, right? If I had electrons with electromagnetic coupling, you might be here; if I had nucleons with strong coupling, you'd be even further down — in fact, nucleons are down there. This is the expected abundance of nucleons: it should be about 10 to the minus 19, okay? And the argument is very straightforward. It just comes from equating these two and working out what the abundance is at the freeze-out temperature, which for nucleons is roughly the mass of the nucleon divided by 45; for weakly interacting particles that factor is about 20, because it goes as the log of the cross section, as you can easily work out from here. So you see: this is what we expect, this is what we actually observe, and there are eight orders of magnitude in between. In other words, we have failed miserably at the simplest calculation we can do in cosmology — working out how many particles such as the ones you are made of, about which we know everything, should be left over from the Big Bang, right? So you should keep that in mind when you discuss dark matter, about which we know almost nothing: we don't know its interactions, we don't know its spectrum, and yet we talk glibly about dark matter and the freeze-out argument, which is exactly the argument I've given here, okay? This is, however, a bonus. It means that something has gone badly wrong, so there must be new physics to explain this eight-orders-of-magnitude discrepancy, and Professor Buchmüller will discuss how this initial asymmetry might have come about, such that for every 10 to the 9 pairs of baryons and anti-baryons, there was one extra baryon, okay? That's all you need. In fact, another interesting fact is that dark matter is six times more abundant than baryons. If dark matter had the same asymmetry in number as the baryons, then a particle weighing about 6 GeV is just what the doctor ordered for dark matter, right? And it is attractive to think that; I'm sure Zurek will discuss it. So, Giovanni asked me to say something about the freeze-out argument, and that's why I thought I would present it in a slightly different context from the one in which it is normally discussed. It's normally presented as the WIMP miracle; what I've shown you is the baryon disaster — the same argument, okay?
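Here is a minimal numerical sketch of that estimate, using the standard non-relativistic freeze-out approximation you will find in textbooks such as Kolb and Turner; the coefficients and the g_* values are rough, and the only point is the contrast between a weak-interaction cross section and a strong-interaction one:

    # Rough thermal freeze-out estimate: x_f = m/T_f and the relic Omega h^2.
    # Textbook approximation only -- not a substitute for integrating the Boltzmann equation.
    import numpy as np

    M_PL = 1.22e19            # Planck mass in GeV
    CM3S_PER_GEV2 = 1.17e-17  # 1 GeV^-2 of <sigma v> expressed in cm^3/s

    def freeze_out(m_gev, sigma_v_cm3s, g=2, g_star=80.0):
        """Freeze-out x_f and relic Omega h^2 for mass m [GeV] and <sigma v> [cm^3/s]."""
        sv = sigma_v_cm3s / CM3S_PER_GEV2            # <sigma v> in GeV^-2
        x_f = 20.0
        for _ in range(50):                          # iterate the implicit relation for x_f
            x_f = np.log(0.038 * g * M_PL * m_gev * sv / np.sqrt(g_star * x_f))
        omega_h2 = 1.07e9 * x_f / (np.sqrt(g_star) * M_PL * sv)
        return x_f, omega_h2

    # Weak-scale particle: m ~ 100 GeV, <sigma v> ~ 3e-26 cm^3/s  ->  Omega h^2 ~ 0.1
    print(freeze_out(100.0, 3e-26, g_star=80.0))

    # A "nucleon": m ~ 1 GeV with a strong-interaction cross section ~ 1/m_pi^2
    # -> x_f ~ 45 and a relic abundance many orders of magnitude below what is observed.
    print(freeze_out(1.0, 51.0 * CM3S_PER_GEV2, g_star=10.0))

The first case lands at an Omega h^2 of order 0.1; the second lands many orders of magnitude below the observed baryon density — the baryon disaster in numbers.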
But the same argument, which fails by eight orders of magnitude for strongly interacting particles, works for weakly interacting particles: the cross section is much smaller, more of them survive, and you get the right amount to be the dark matter. So it's entirely self-serving, okay? That's why I don't believe it. But the argument is basically this: if particles decoupled up there on that plot, while still relativistic, then their number density is comparable to that of photons; if they decoupled when they were non-relativistic — that's called freeze-out — then their abundance is set by a Boltzmann factor. That's all there is to it, okay? For the rest, of course, you can do an elaborate calculation. This was first done by Vysotsky, Dolgov and Zeldovich, and a bit later by Lee and Weinberg, to work out the slight difference between the abundance at the freeze-out temperature and the asymptotic value — you have to allow for the annihilations that still happen after that point — but it's a small correction, okay? So if the particles freeze out here, while they're still relativistic, then the number density is comparable to photons. There are a few hundred microwave background photons per cubic centimetre today; so if you had particles with the same number, their mass would need to be of order a hundred electron volts to give you an omega_nu h squared of one. Actually for dark matter omega h squared is about 0.1, so you need about 10 electron volts, right? However, we know that neutrinos don't have that much mass: the upper bound on the neutrino mass is 2.2 electron volts from tritium beta decay, and since all the neutrinos mix, the summed mass must be less than 6.6. So it's not enough to give you what we need, and structure formation also does not work with relativistic particles like neutrinos. More recently, however, there has been interest in sterile neutrinos. It's exactly the same argument, except that they are not fully thermal relics, because sterile neutrinos would be made through mixing with the left-handed neutrinos, or spin-flip scattering or something like that, which doesn't populate them to the full thermal level — it populates them to some fraction of the thermal level — and therefore they can have a somewhat larger mass, maybe 10 keV, and still give an omega h squared of order 0.1, right? But then the question, in particle physics, is that I'm not aware of any natural way for a keV scale to arise, okay? If you can think of some way to generate it, that's worth thinking about, but it's not something that we know about. The other possibility, however, is far more attractive, which is why it has gained so much currency: if I choose this solution — I don't want to go down there like the baryons did and annihilate to nothing, I want to be around here, which is the value that will give me the dark matter — then I see that the cross section I need for that to work has to be of order a weak cross section. This is written here in centimetres cubed per second; in terms of inverse GeV squared it's around 10 to the minus 10, so it's basically set by the inverse of the Fermi scale, okay? So the Fermi scale — weak-scale masses and weak-scale cross sections — naturally gives you the dark matter, because such particles don't annihilate away very rapidly. So it's kind of paradoxical: the more weakly interacting a particle, the bigger is its relic abundance from thermal equilibrium, okay?
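Two standard rules of thumb summarise these two limits (the numbers are the usual textbook values and are quoted here only for orientation):

$$ \Omega_\nu h^2 \simeq \frac{\sum m_\nu}{94\ {\rm eV}} \quad \text{(hot relic: decoupling while relativistic)}, \qquad \Omega_X h^2 \simeq \frac{3\times 10^{-27}\ {\rm cm^3\,s^{-1}}}{\langle\sigma v\rangle} \quad \text{(cold relic: freeze-out when non-relativistic)}. $$

The first gives the roughly 10 eV figure mentioned above for an $\Omega h^2$ of 0.1; the second is the statement that a weak-interaction annihilation cross section, a few times $10^{-26}\ {\rm cm^3\,s^{-1}}$, lands you at the observed dark matter density, while a smaller cross section overshoots it.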
So the more you fail to find supersymmetry, the more the supersymmetric particles dominate the universe, okay? Which is why you are squeezed into smaller and smaller bits of the parameter space, where the annihilation rate is large enough — because of co-annihilation, or because you're close to a threshold or something — to reduce the abundance sufficiently. Over most of the parameter space of supersymmetry now, the relic abundance of the lightest supersymmetric particle, if it is stable, is too large — it's too much dark matter, right? Anyway, this is all I want to say about the WIMP miracle, and I'll leave it to later lecturers to discuss it in more detail, okay? So now let me take a little aside and sketch in the big picture. You have had, I guess, Michael Peskin take you through the whole field-theory gamut of the standard model; I want to alert you to which aspects of the standard model, and in particular of its extensions, are relevant to cosmology. So you know that we have a Lagrangian, which I've written in a sort of simple notation, describing all the interactions that we are aware of. We know the gauge group; we have written down all the terms that are allowed by the symmetries of the standard model, okay? At the renormalizable level, these are just these, right? This is all notation: these are the fermions, these are the gauge fields, these are the couplings that give masses to the fermions, and this is the potential of the Higgs, okay? And we have seen all of these particles now — there is nothing left, okay? In particular, we have actually now measured the Higgs potential: we know the mass parameter, which is something like 80 GeV squared, and we know what the self-coupling is — 0.75 or whatever it is, in whatever normalization you like. We know everything about this now. So what is left for cosmology? Well, of course, there are some very interesting aspects of the Higgs mass for cosmology — I'm sure you have heard about that — because the present Higgs vacuum is not necessarily stable; it may be metastable, and we'll know soon, when somebody measures the top quark mass more precisely. There is only one remaining ambiguity in the standard model, which is the naturalness problem associated with CP violation in the strong interactions. This is normally solved by making the theta parameter of QCD into a dynamical quantity — inventing a new symmetry, the Peccei-Quinn symmetry, which is then broken — so that the quantity can relax to zero: just as the cosmological constant was a balance between a bare term and something from the right-hand side, the energy-momentum tensor, here you have a balance between a bare theta and the argument of the determinant of the quark mass matrix. So if you can make that quantity dynamical, then it can naturally relax to zero, but that then gives you a pseudo-Nambu-Goldstone boson, the axion, which is a candidate for dark matter. I guess Surjeet Rajendran will discuss that in his three lectures, so I'll not say anything more about it. It's a very interesting possibility, and the only one within the standard model, in the sense that it is motivated by the standard model — of course, the physics of axions is very much not standard-model physics. However, we now know that these renormalizable operators are not the only things there can be.
We can write down an infinite number of higher-dimensional operators to add to the standard model; they can violate the symmetries of the standard model, but we know that they do not do so at any observable level, because they are all damped by powers of the cutoff of the theory. If we view the standard model as an effective field theory, valid up to some scale, then that scale M appears in the denominator: I can write dimension five, dimension six, dimension seven, et cetera, all the way up. And that is very, very attractive, for two reasons. One is that it means that if the scale is high, then what you measure at LEP or the LHC, et cetera, does not depend on what the physics is at high energies — it decouples. I gave you an example in cosmology yesterday: the neutron-to-proton ratio, when it freezes out, is independent of cosmology at the Planck scale. That decoupling was because of thermal equilibrium; here the decoupling is because it's an effective field theory, with powers of the cutoff damping the new physics. So this term, for example, the dimension-five operator, is a possible term to give neutrinos a Majorana mass, and I expect you'll hear about it in the lectures on baryogenesis and leptogenesis. This is an operator that can give proton decay — we have not yet seen that. These can generate flavour-changing currents, and so on. So the name of the game, if you try to construct new physics beyond the standard model, is how to keep these operators in check, because they have coefficients — which I have not shown you — that have to be protected in some way. Otherwise, in supersymmetry for example, there is a very strong problem with flavour-changing currents, as those of you who work on supersymmetry know, and proton decay in supersymmetry happens in a microsecond; you have to stop that from happening, which is why you invent R-parity. But that has an advantage, because once you do that — sorry, before I get to that, let me just show you: if you want to understand the tension between these powers of M being in the numerator or in the denominator, you have to see that I can also add two terms here which are so-called super-renormalizable terms. There, the power of the cutoff is in the numerator, not in the denominator, okay? And you know about this term, at least this one: that is the notorious Higgs mass divergence, because the Higgs is a fundamental scalar. It propagates through the vacuum, it sees everything in the vacuum — in particular, it gets a correction to its own mass, and the heaviest particle it sees in the vacuum is, say, the top quark, so it gets a mass correction which goes quadratically with the cutoff, okay, because it has no chiral symmetry to protect it. The electron sees the same thing, but the electron has a chiral symmetry, so its mass correction goes as the log of M; it doesn't go as M squared, as it does for a scalar, which has no chiral symmetry. This is the famous hierarchy problem, which suggests that the Higgs mass will be pulled up to whatever higher mass scale there is in the new physics that underlies the standard model, okay? The hierarchy problem of course exists only if there is a new scale: if there is no new scale, then one can't meaningfully talk about a hierarchy problem. Of course, you could still worry about the Planck scale, but for practical purposes this was motivated initially by the idea that there was a GUT scale and so on, right? So how do you solve that problem?
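Before the answer, it may help to see the expansion being described schematically (a sketch only; the c's are dimensionless coefficients and M is the cutoff):

$$ \mathcal{L}_{\rm eff} = \mathcal{L}_{\rm SM}^{(d\le 4)} \;+\; \frac{c_5}{M}\,\mathcal{O}_5 \;+\; \sum_i \frac{c_6^i}{M^2}\,\mathcal{O}_6^i \;+\;\cdots\;+\; c_2\, M^2\, |H|^2 \;+\; c_0\, M^4 , $$

where $\mathcal{O}_5 = (LH)(LH)$ is the dimension-five operator that gives neutrinos a Majorana mass, the dimension-six operators include the ones mediating proton decay and flavour-changing processes, and the last two, super-renormalizable, pieces — the Higgs mass term and the vacuum energy — come with positive powers of $M$; those are precisely the hierarchy problem just described and the cosmological constant term we will come back to below.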
Well, one very popular way, as you know, is to say that these corrections are of opposite sign for fermions and bosons. So why don't I just double the spectrum of particles — a fermion for every boson, a boson for every fermion — and then the Higgs has a fermionic partner that is protected by chiral symmetry, and the Higgs mass is correspondingly protected too. And that would imply a very, very rich phenomenology, and of course for the last 25 years this is mostly what people have been discussing; unfortunately, supersymmetry has not shown up yet as it should have. So now we are talking, as we have heard in Wulzer's lectures and so on, about the Higgs being a composite particle — it doesn't have to be elementary. So this is of course where most of the focus is in particle phenomenology. What is the deal for cosmology? We get a lot out of all these speculations about physics beyond the standard model: there lie the ingredients for doing the sort of things that are missing in the standard cosmology. For example, we can get a candidate for dark matter. If the new physics has a conserved quantum number such that the lightest state is stable — it could be the lightest supersymmetric particle, it could be the lightest technibaryon in new strong dynamics invoked to break electroweak symmetry, it could be a Kaluza-Klein state if you are fond of invoking new dimensions at the TeV scale — you naturally get a candidate for dark matter, which naturally has the required dark matter abundance, because the cross section is set by the TeV scale, and you saw in the numerology we did earlier that the WIMP miracle gives you the right cross section. In fact, the funny thing is that, as has been pointed out, there is also a "WIMPless miracle". In other words, the cross section just depends on the square of the coupling divided by the square of the mass, and this combination takes the same value in supersymmetry both for the lightest supersymmetric state and for the so-called spurions — the things that communicate supersymmetry breaking to our sector. So the dark matter doesn't have to be at the TeV scale; it can be anything which has that combination of mass and coupling. I think this was pointed out as the so-called WIMPless miracle. And there are of course all these other possibilities, which you'll hear about later. I now just want to mention that there is this last term here, which people don't like to talk about because it's very embarrassing. It is something we have no understanding of. It is something that ought to couple to gravity: if gravity is described by general relativity, Einstein tells us that all forms of energy density couple to gravity. And from this argument it is at least of order TeV to the fourth, which is about 60 orders of magnitude higher than the energy density of the universe today — the maximum we can tolerate. So we have no idea how the standard model vacuum energy couples to gravity, and if you find that lambda corresponds to a scale of order the Hubble parameter, which is 10 to the minus 42 GeV, rather than 10 to the 3 GeV, then you can see that there's a big problem — there is no theoretical understanding of that at all. Let me now come, in the remaining time, to what we can actually figure out about all the new particles that are thrown up by all these discussions.
I mean, there must be a few thousand papers that have discussed all these possibilities: new spectra, new particles, many of them unstable, decaying and annihilating into something. In the old days, people used to think you could do whatever the hell you wanted in the early universe — who cares, it's all wiped out, redshifted away. That is not so. There are traces left, traces we can look for: in the elemental abundances, in the spectrum of the microwave radiation, in the amplitude of the power spectrum of the fluctuations, and so on and so on. And that has developed into what I described as a laboratory for particle physics. So let me take you through this very, very quickly — I'm running out of time. The first one, which I'm sure you have heard of before, is using nucleosynthesis to constrain the number of effective types of neutrinos, or of radiation of any kind. I showed you yesterday the helium abundance for three neutrinos; if I had two or four or five or six or seven, you just get parallel lines with more and more abundance. And remembering that the measured helium abundance was a large, uncertain box, you can see that you can put a constraint on this, but not a very tight one. However, if you can determine the baryon-to-photon ratio by determining the deuterium abundance, which does not change that much with the number of neutrinos, then you can put a much tighter constraint. Another set of very tight constraints comes from what I already mentioned: that the neutron-to-proton ratio at freeze-out is sensitive to all the interactions in nature that we know about — the mass difference of neutrons and protons is sensitive to the strong and electromagnetic interactions, and the freeze-out temperature itself is determined by the balance between gravity and the weak interactions — and it's exponentially sensitive. Therefore you can get a bound of order a percent on any extra radiation present in any form, or a bound of the order of a few percent on the variation of any coupling — weak, electromagnetic, whatever — between now and nucleosynthesis, which was at a redshift of 10 to the 12. And that is very interesting. Here is a small remark, which shows that a lot of efforts to understand dark energy, or the coincidence problem of dark energy, are simply wrong. People say: to avoid the coincidence problem — why should lambda be of order H naught squared, which is what we need today, and which is the coincidence — what about making lambda of order H squared at all times? Maybe you can interpret that as some kind of Casimir energy or whatever. This does not work, because H squared equals 8 pi G_N rho over 3 plus lambda over 3. So if I make lambda proportional to H squared, I can take it over to the left-hand side and write a factor of one minus one third. What that means is that G just gets renormalized: G goes to G_N divided by one minus one third, i.e. by two thirds. That's all — you're just renormalizing G, you're not actually doing anything else. And that is ruled out by nucleosynthesis: you can't renormalize G like that, because G_N according to nucleosynthesis was the same at a redshift of 10 to the 12 as it is today, to within a few percent. So you can see that even simple arguments can be quite powerful in ruling out whole classes of theories. Now, neutrino counting: here is the latest status. If I just parameterize the number of neutrino species, then this is a game started long, long ago by these people.
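For orientation, the standard rule of thumb behind this counting (the numbers are the usual textbook ones, so treat them as approximate): extra relativistic species raise the expansion rate at a given temperature, so the weak interactions freeze out earlier, the neutron-to-proton ratio is larger, and more helium is made:

$$ H^2 = \frac{8\pi G_N}{3}\,\frac{\pi^2}{30}\,g_*\,T^4, \qquad g_* = 5.5 + \tfrac{7}{4}\,N_\nu \ \ (\text{around } T \sim 1\ {\rm MeV}), \qquad \Delta Y_p \simeq 0.013\,\Delta N_\nu . $$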
Before the CMB, we did not really have very tight constraints from nucleosynthesis alone: you could basically just about say that one extra species is allowed, and that's all. But as I showed you yesterday, we now have a precise determination of the baryon-to-photon ratio from the CMB. If I put that in, and if I assume that this baryon-to-photon ratio did not change in the few hundred thousand years between BBN and CMB decoupling, then I get a somewhat tighter bound. It's still not very good; it still allows an extra species at a couple of sigma. I actually hesitate to use the word sigma, because it doesn't mean the same thing in cosmology as it might in a laboratory experiment. But basically, if you have some extra particles in your theory, don't be scared; you can pretty much put them in. The importance of these constraints is sometimes a bit exaggerated. But it is still true that they are complementary to what you can measure in the lab, and that is very powerful: it doesn't matter how well you measure the Z-zero width, it will not tell you about sterile neutrinos. This is the bound I mentioned on the electromagnetic coupling alpha, for example. This became a very popular exercise about 10 years ago: people trying to work out the impact of changing fundamental constants on cosmological processes. Then you have to worry about what is changing. Is the Higgs VEV changing? Are particle masses changing? And you quickly realize that you don't actually know how the neutron-proton mass difference is generated, because it's non-perturbative. The electromagnetic part is simple; the strong part is not, since it has to do with the violation of flavour symmetry, and you can't calculate it from first principles. Very recently, just a couple of weeks ago, I saw that someone has managed to do this on the lattice. So you can determine how the neutron-proton mass difference varies with a variation in the fundamental constants, say in alpha, and then work out that this change should be very, very small, which is consistent with the data that I showed you earlier at much lower redshifts. Now the same argument is being pushed up to a redshift of 10 to the 12. And this means that, as I mentioned already, if somebody tells you that the physical constants are expectation values of moduli fields, then tell them that they must finish up their business before nucleosynthesis; otherwise you would see it. And here is a nice review by Pospelov and Pradler in Annual Reviews, which basically shows how a change in the neutron-proton mass difference actually affects the nuclear abundances. So if you have a theory in which you might expect a change in these quantities, you can read off from this what constraints there are on your theory. These are in the slides; I'll go through them quickly, because I want to leave 10 minutes for questions.

So, decaying particles. This is very interesting. Suppose there are particles in your theory which are gravitationally interacting and therefore have extremely long lifetimes: gravitinos, for example, decay with a very, very long lifetime, which, as I said, is of order days (3 times 10 to the 7 seconds is a year, a day is about 10 to the 5 seconds). For a weak-scale gravitino the lifetime is very, very long, so these particles will be decaying during and after nucleosynthesis.
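Just to check that "of order days" figure: the lifetime of a gravitationally coupled particle is dimensionally tau ~ M_Pl^2 / m^3, and a quick conversion to seconds (a sketch with all order-one factors dropped, using the reduced Planck mass as my own convention) gives numbers in the range of hours to months for weak-scale masses:

```python
# Quick numerical check that a gravitationally coupled, weak-scale-mass particle
# is very long-lived: dimensionally tau ~ M_Pl^2 / m^3.  Order-one factors and
# the details of the decay channel are ignored; this is an estimate only.

M_PLANCK_GEV = 2.4e18        # reduced Planck mass in GeV (my choice of convention)
GEV_INV_TO_SEC = 6.58e-25    # hbar in GeV*s, so 1 GeV^-1 = 6.58e-25 s
SECONDS_PER_DAY = 8.64e4
SECONDS_PER_YEAR = 3.15e7

def lifetime_seconds(mass_gev):
    """tau ~ M_Pl^2 / m^3, converted from natural units to seconds."""
    return M_PLANCK_GEV**2 / mass_gev**3 * GEV_INV_TO_SEC

for m in (100.0, 1000.0):    # weak-scale masses in GeV
    tau = lifetime_seconds(m)
    print(f"m = {m:6.0f} GeV -> tau ~ {tau:.1e} s "
          f"(~{tau / SECONDS_PER_DAY:.2g} days, ~{tau / SECONDS_PER_YEAR:.2g} yr)")
```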
And what that allows you to do is to put a very tight constraint, because in the gravitino decay, for example into, say, a photon and a photino (sorry, there should be a tilde on the second one), the photons come out at very high energy, carrying half the mass of the gravitino, and they then initiate an electromagnetic shower in the dense plasma of the time. That generates a lot of photons of energy above about 25 MeV, which can photodissociate helium. And if you break up helium, you make deuterium: the fragments, helium-3, protons and neutrons, give you deuterium. Now deuterium is about 10,000 times less abundant than helium, so you can see how sensitive this is: if I start breaking up even a tiny fraction of the helium, I completely mess up the deuterium abundance. So from the upper bound on the deuterium abundance we can put an upper bound on the abundance of these particles; this zeta is just their number density divided by that of photons, and the bound is very, very tight as a function of the lifetime. This was worked out some years ago, and it has proved to be an extremely tight constraint on particles like this. In particular, since these particles are gravitationally interacting and are created in two-body scattering in the thermal plasma, this zeta should be of order the maximum temperature to which the universe reheated after inflation divided by the Planck scale, which determines the coupling. So you can see that the reheat temperature can be at most about 10 to the 6 GeV, because 10 to the 6 divided by 10 to the 18 is about 10 to the minus 12, which is about here. We are basically saying that if these particles are there in your theory, then that puts a constraint on the thermal history of the universe. Of course, these are all hypothetical particles; there may not be a TeV-scale gravitino, and you might find some loophole in the argument, but you have to worry about these things. And again, here is a detailed calculation reproduced in this Annual Reviews article, which I invite you to look at. It shows generically what happens if a TeV-mass particle of whatever kind releases half its rest mass in photons: what the effect on the nuclear abundances would be as a function of time, as the universe cools down. I showed you the plot of the first three minutes; this is a different plot of what it would look like if there were decaying particles present at the time of nucleosynthesis. And again, this is a very powerful test bed for unstable particles.

I want to quickly mention that sometimes people get enthusiastic: if you see an anomaly, you can also try to turn it around and argue that it is evidence for new physics. One such thing that has been going on for a few years is this: whereas the predicted helium abundance is in agreement with the data, and deuterium is in agreement with the data (this plot is slightly old; the new measurement is bang on top), the predicted lithium abundance is very discrepant with the data. You already saw this in the plot I showed earlier. And this can in principle mean new physics, because you can actually make lithium from annihilating or decaying particles. This was lent additional support by the fact that any such annihilation or decay will not just make lithium-7, it will also make lithium-6 in roughly the same amount. And somebody claimed that they saw lithium-6. What is the evidence?
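A one-line version of that reheat-temperature estimate, with all order-one factors dropped (the 10 to the 18 GeV Planck scale is just the round number used above, not a precise value):

```python
# Sketch of the abundance estimate for a gravitationally coupled particle produced
# by scattering in the thermal plasma: zeta = n_X / n_gamma ~ T_reheat / M_Pl,
# because the Planck scale suppresses the production rate.  Order-one factors dropped.

M_PLANCK_GEV = 1e18   # round number used in the talk

def zeta(t_reheat_gev):
    """Particle-to-photon number ratio, to order of magnitude."""
    return t_reheat_gev / M_PLANCK_GEV

for t_reheat in (1e6, 1e9, 1e12):
    print(f"T_reheat = {t_reheat:.0e} GeV -> zeta ~ {zeta(t_reheat):.0e}")

# A reheat temperature of at most ~1e6 GeV keeps zeta at or below ~1e-12,
# roughly the level the deuterium bound allows for these lifetimes.
```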
This is a plot of the observed lithium abundance. Here is the lithium plateau that I showed you earlier, as a function of the metallicity of the star; it is constant, suggesting it is primordial. These authors then suggested that there is also a lithium-6 plateau: they detected lithium-6 in a few of the stars, and that also seems to be roughly constant. Although, you know, don't be too misled: there are lots of upper bounds here as well, and in some of the stars you don't see lithium-6 at all. It's very fragile; it's easily broken up. But if we buy this, then lithium-6 over lithium-7 is, you know, of order one, whereas the expectation from Big Bang nucleosynthesis is 10 to the minus 5. So this is a possible anomaly, and it created a lot of interest in whether it was evidence for new particles. I just want to show you something that particle physicists don't often look at. If you look at the paper that claimed the detection of lithium-6, the evidence is this: when they try to fit the shape of the lithium-7 line, the fit is slightly different if you add about 0.06 or 0.1 of lithium-6; that is the difference between those two lines. So it's a matter of how well you know your baseline whether you need a little admixture of lithium-6 or not. This is rather uncertain stuff: you have to know the astrophysics of the emitting region very, very well in order to do this, and there are concerns about the fact that in the stars in which you detect this, the convection is very pronounced, and so on. In other words, that detection needs further confirmation. But if you are an enthusiastic particle physicist with a decaying particle, and you think it can decay with the right lifetime and abundance, then in fact you can create lithium-6 and lithium-7. This plot shows how it would depend on the lifetime of the particle, and there is this little region here where you are consistent with all the constraints that I showed you and you can actually make lithium-6. You can find candidates for such particles in various extensions of supersymmetry, for example split SUSY: the next-to-lightest supersymmetric particle might be a neutralino, with the LSP being a gravitino, and the neutralino will then decay into the gravitino; these have lifetimes and abundances of the right order of magnitude. So you should keep an eye on this if you are looking at this kind of physics, because it could be true, or it may not be true, of course. We'll see.

Here is one last example before I finish; I'll just go through it rapidly. I've given the reference here. I'm just trying to illustrate that people who construct physics beyond the standard model, especially if it includes decaying or annihilating particles, are now aware that they have to respect the constraints from nucleosynthesis, from the microwave background radiation, et cetera, in order to be consistent with standard cosmology. These are, for example, constraints on a gluino in split SUSY, which would be a long-lived particle. It would bind to ordinary nuclei and create anomalous isotopes of hydrogen that we don't see; that region is ruled out. It could create a diffuse gamma-ray background stronger than the one we measure with Fermi; that is ruled out. It could create spectral distortions in the microwave background; that part is ruled out. BBN would be affected.
This part is ruled out. And you can see how these bounds are complementary to the ones from colliders, which are basically limited in mass reach, whereas these extend the sensitivity up to the SUSY-breaking scale in this case. So this is an illustration of how cosmology complements collider physics in discussing new ideas about physics beyond the standard model. And basically, it's fun: it's fun to do these exercises, because you learn a lot of new physics which you would not have bothered to find out about otherwise. And the bottom line is a pretty interesting one: in this variant of split SUSY, the SUSY scale has to be less than 10 to the 10 GeV. That is something you could not have guessed otherwise. A final example: Pospelov pointed out that if you had new stable charged particles, they would actually bind to nuclei. Then you do a bit of nuclear physics, which he knows, and you work out that you can actually catalyze some reactions. You can change nucleosynthesis directly, not by affecting the neutron-to-proton ratio or changing the rate of expansion, but by directly catalyzing some reaction like that, and you can make lithium-6. So if the lithium-6 excess turns out to be true, who knows, it might be some charged particle; in fact, the particle he had in mind was a stau. And if any of this is true, of course, the bottom line is that you should see it soon at the Large Hadron Collider.

Okay, I'm running out of time, so I'll just show you one example from the microwave background. The same spectral bound that I showed you, which bounds the amount of energy released into the CMB as a function of redshift, can be translated into a bound on the abundance of a particle with some mass and some number density, as a function of the lifetime of the particle. These are the kind of bounds you get, which your particle has to respect. In the early phase it's a mu distortion, in the late phase it's a y distortion; the recombination of the plasma is happening around there. These are all things that you have to respect, and these bounds will only get tighter and tighter with better measurements (a small numerical sketch of the mu and y relations follows after the summary below).

So this is my last summary slide; you have seen it before, this timeline that we have constructed of the standard cosmology. I hope I've convinced you that we know quite a bit about the thermal history of the universe, from today back to about this point, very reliably, thanks to the fact that nothing much happened. If something had happened, then you'd be in trouble. But the perfect blackbody of the microwave background, and the fact that the calculation of nucleosynthesis made in 1953 by Alpher, Follin and Herman seems to actually work pretty well, tell us that we know the thermal history up to there; and trusting the standard model and our lattice friends who calculate these phase transitions, we can extrapolate back to 10 to the minus 11 seconds. That's not bad. We have a laboratory going up to 100 GeV of thermalized plasma in which we can discuss whether any new physics was there, because that would leave its trace on these abundances or on the spectrum, and so on. Beyond that point is just speculation, and that is of course very interesting speculation, because it touches on all the questions we have about the universe. What is the baryon asymmetry? What is the dark matter? Where did these fluctuations come from? What is dark energy, if it actually exists?
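For the small sketch promised above: the usual approximate relations between a fractional energy release into the CMB and the mu and y parameters, compared against the COBE/FIRAS limits, look roughly like this. This is my own illustration of the general idea, not the specific decaying-particle bound plotted on the slide.

```python
# Very rough translation of a fractional energy release into the CMB into the
# standard distortion parameters, using the usual approximate relations
#   mu ~ 1.4  * (dE/E)  for injection in the "mu era" (redshift roughly 3e5 to 2e6)
#   y  ~ 0.25 * (dE/E)  for later injection (the "y era"),
# compared with the COBE/FIRAS 95% limits |mu| < 9e-5 and |y| < 1.5e-5.

FIRAS_MU_LIMIT = 9e-5
FIRAS_Y_LIMIT = 1.5e-5

def mu_distortion(delta_e_over_e):
    return 1.4 * delta_e_over_e

def y_distortion(delta_e_over_e):
    return 0.25 * delta_e_over_e

for frac in (1e-3, 1e-4, 1e-5):
    mu, y = mu_distortion(frac), y_distortion(frac)
    print(f"dE/E = {frac:.0e}: mu ~ {mu:.1e} "
          f"({'excluded' if mu > FIRAS_MU_LIMIT else 'allowed'}), "
          f"y ~ {y:.1e} ({'excluded' if y > FIRAS_Y_LIMIT else 'allowed'})")
```

So an energy release of even a part in 10,000 of the CMB energy is already visible, which is why these decaying-particle bounds are so powerful.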
And then of course, if you're more ambitious, you worry about the fundamental issues of cosmology. What is this initial singularity? Is it evaded in a theory of quantum gravity? The cosmological constant problem, which is generic once you have general relativity. And if you're really ambitious, you worry about where these three-plus-one dimensions came from, and I have left a dot, dot, dot for further questions that you can think of. All this increasingly speculative BSM physics cannot be directly constrained unless it leaves some kind of remnant that survives into this era and can affect the observables we have. So the name of the game is to try to think of such observables, things that we can constrain by direct measurement, by humble quantities like the helium abundance measured by an astronomer, or the temperature of the microwave background measured by COBE, or whatever. We can say something about gravitinos from the early universe. I think Dennis Sciama, who used to work here at SISSA, was very, very struck by this connection. In fact, he coined the term astroparticle physics, because he was struck by the fact that astronomical measurements can say something so fundamental about particle physics, and this connection is the one we celebrate in this subject. Thank you.