lecture on astroparticle physics by Céline Bœhm. Let me remind you of two things. Today we have an event at 11:30, the ICTP Prize ceremony, so we'll finish a few minutes earlier. There will be no question-and-answer session, but you can still ask questions during the talk. I encourage you to signal questions by using the raise-hand option, which is easier for me to spot, but you can also use the chat. And you should have received by email a Zoom link to the ICTP Prize ceremony, so you can follow that as well. Having said that, Céline, if you want, we can start.

Thank you very much. All right, so this is the third lecture, and it's going to be more particle physics this time. One thing, though: I had promised to explain more about nucleosynthesis, and I'm afraid I was so busy today that I had no chance, so I'll try to do that in the next lecture tomorrow. I'll just jump here to say that I had also promised to give the arXiv number, so it is now on this slide. And there was a question about black holes of 10^15 grams: you can make the conversion and see where they sit, but in principle it's all good, they are still surviving. I also used a definition of mass, the Misner-Sharp mass, which goes into this equation for the metric and which I didn't mention last time. It will be written in the notes when you get the slides.

All right, the other thing I wanted to remind you of is that we finished yesterday looking at possible candidates. If you don't want to add anything to the Standard Model, then the only candidate which is essentially stable, so it can still be in galaxies today, and which is neutral, so it would not have been seen, is the neutrino. As I told you yesterday, already in the seventies people realized that there was a problem.
And in fact, for neutrinos to be able to explain the formation of galaxies and so on, the mass needs to have a certain value, so we found a limit. But at the same time, very quickly, in 1977 basically, there was a very important paper which guided everyone to consider instead not neutrinos but very heavy particles. I told you yesterday the reason people immediately jumped to something else: not only were there theoretical developments, basically the creation of a theory called supersymmetry, which I'll mention later, but also people realized that forming structure was really difficult with very light particles. So there were several papers. One I didn't mention yesterday, which I added on this slide, is by Davis, Lecar, Pryor and Witten. You can read it eventually, but you will see that what this paper says is that you cannot form galaxies unless the neutrinos are very massive. So in a sense this paper was consistent with the other development, the paper by Lee and Weinberg in 1977, suggesting that the dark matter should be heavy. Sorry, before I jump: when I mention Lee and Weinberg, I always, and I'm one of the rare people in the community to do this, also mention the paper by Hut. It came out in exactly the same year, he was doing essentially the same calculation, and he arrived at the same conclusion: dark matter has to be heavier than a proton. A lot of people know Lee and Weinberg because they are obviously extremely famous; not so many people know the paper by Hut. I actually don't know if there was some issue between the authors, but I think for fairness it's better to mention Hut as well as Lee and Weinberg. So you will hear me mention the three of them. It's not necessarily what other people do, but I think it's only fair.
So one thing I didn't mention before, or I mentioned only very quickly, and I will come back to it in the fourth or fifth lecture, but I want to mention it already because it's an important criterion for why the neutrino can't be the dark matter: it's called the free-streaming scale. I mentioned that neutrinos decoupled early and then were basically living their own life, essentially free streaming. So what does that mean? It means that at early times the dark matter, the neutrinos in this case, were interacting with electrons, and until a few minutes in, while the temperature of the universe was a few MeV, they continued to interact. So they were coupled, and therefore they couldn't free stream. But eventually, around a temperature of an MeV, they stopped interacting with electrons. That is because the size of the universe was such that they couldn't find electrons anymore: the density of electrons became much smaller, partly because the electrons started to annihilate with positrons, but mostly because the expansion was such that it was hard for neutrinos to see any electron. So they decoupled, they no longer interact with electrons, and after that they are on their own. Because they had no more interactions, and because they were relativistic for most of their life, they wouldn't cluster. They would free stream, which means that a pocket of matter, which we now know had to be small, would expand and expand and expand. Eventually you wouldn't have the very small fluctuations that you need in order to form small objects. Only very big regions would collapse, and as a result you form only very big structures in the universe.
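As a rough sanity check (my addition, not from the lecture slides), the decoupling temperature quoted above can be estimated by comparing the weak interaction rate Γ ~ G_F² T⁵ with the Hubble rate H ~ T²/M_Pl; decoupling happens when they become comparable. A minimal Python sketch, using standard textbook values:

```python
# Order-of-magnitude estimate of neutrino decoupling:
#   weak interaction rate  Gamma ~ G_F^2 * T^5
#   Hubble rate            H     ~ T^2 / M_Pl
# Decoupling when Gamma ~ H  =>  T_dec ~ (M_Pl * G_F^2)^(-1/3)

G_F = 1.166e-5    # Fermi constant in GeV^-2
M_Pl = 1.22e19    # Planck mass in GeV

T_dec = (1.0 / (M_Pl * G_F**2)) ** (1.0 / 3.0)   # in GeV
print(f"T_dec ~ {T_dec * 1e3:.1f} MeV")          # of order 1 MeV
```

This reproduces the "around an MeV" figure quoted in the lecture, up to O(1) factors that a careful calculation would fix.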
So neutrinos give you clusters of galaxies, they can predict those very well, but neutrinos cannot explain the appearance of very small galaxies, certainly not objects like the LMC or the SMC. Now we know that the mechanism is a bit more complicated, but at the time it was clear that it would be very difficult to explain the number of small galaxies using neutrinos. That was established in the 80s, and that's when everybody moved to something else. I will come back to this formula, but this is essentially how you compute the free-streaming scale, and what I want you to realize is that it's very simple. It's an integral over the time during which the particle has decoupled, so is free to propagate, free streaming, in principle until today, except that before today something happens, which is the collapse of the fluctuation. So from the moment of decoupling to the moment of collapse of the fluctuation, the free-streaming scale is given by the integral of the velocity of the particle, which for the neutrino is c for most of its life, divided by the scale factor. And that's it. So you can compute the free-streaming scale of the neutrinos, anyone can do it, and you find the value and realize that the free-streaming scale is very large. It's one of the easiest calculations, and one of the most fundamental, because it tells you that the dark matter cannot be a particle from the Standard Model. Just to remind you: if we look at the Standard Model, we have the quarks and leptons and the mediators and the Higgs, and all of them decay. The only neutral one which doesn't decay is the neutrino. The other particles which do not decay are the electrons, but electrons are charged, and if dark matter were charged you would have seen it. And as I said, we have only seen 5% of baryonic matter in the universe, so forget it. And then protons, but again, that's baryonic matter; we know where it is, we know it's only 5%, it doesn't work.
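The free-streaming integral just described, λ_FS = ∫ v(t) dt / a(t) from decoupling until the collapse of the fluctuation, can be evaluated numerically. Here is a sketch (my illustration, not the lecturer's code) with an illustrative toy cosmology; the values h = 0.7, Ω_r, Ω_m, and the neutrino temperature today are standard ballpark numbers, not taken from the slides:

```python
import numpy as np

# Toy comoving free-streaming length: lambda_FS = integral of v / (a^2 H(a)) da,
# from neutrino decoupling to roughly matter-radiation equality (units: Mpc, c=1).
H0 = 0.7 * 100.0 / 2.998e5             # Hubble rate today in Mpc^-1 (h = 0.7)
Om_r, Om_m = 8.5e-5, 0.3               # ballpark radiation / matter fractions

def hubble(a):
    return H0 * np.sqrt(Om_r / a**4 + Om_m / a**3)

def velocity(a, m_eV):
    # ~c while relativistic, then the thermal velocity ~ 3.15 T_nu / m ~ 1/a
    T_nu0 = 1.68e-4                    # neutrino temperature today in eV
    return np.minimum(1.0, 3.15 * T_nu0 / (m_eV * a))

def lambda_fs(m_eV, a_dec=1e-10, a_end=3e-4, n=4000):
    a = np.logspace(np.log10(a_dec), np.log10(a_end), n)
    f = velocity(a, m_eV) / (a**2 * hubble(a))
    return float(0.5 * np.sum((f[1:] + f[:-1]) * np.diff(a)))   # trapezoid rule

# A ~30 eV neutrino free-streams over tens of comoving Mpc: far larger than a
# galaxy scale, which is why small-scale fluctuations are erased.
print(f"lambda_FS(30 eV) ~ {lambda_fs(30.0):.0f} Mpc")
```

Note the behaviour matches the lecture's argument: a heavier particle becomes non-relativistic earlier, so its free-streaming scale is shorter and smaller structures survive.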
So the only option was the neutrino, and now we have just ruled out the neutrino. So from that moment it's clear that you need to do something else, which is adding a new sector to the Standard Model. The other option, I told you, is that gravity is not what we think and has to be modified. I told you that in order to do that and to predict the CMB, you need a relativistic theory for this modification of gravity. The first to have a go at it, and a good go, let's say, was Bekenstein in 2004. Our study, and many studies after that, showed that it doesn't work. It could have worked, but it really doesn't, because of Silk damping. So again, this is evidence that whatever dark matter is, you need to counteract the effect of the baryons: whatever you do, you need a mechanism to prevent at least the dominance of the baryons in the CMB spectrum at small scales. So just to recap: we want a new particle which has no electromagnetic coupling and no strong interactions. It has to have a lifetime greater than the age of the universe, because it is in galaxies today. We know it's not the neutrino anymore, so there has to be a new particle, if it is a new type of matter. It could also be a type of matter that we have never seen and that is not related to particles, but if you're a particle physicist, the natural explanation is a new particle. And then we know that it should be massive, greater than a GeV, but I will come back to that, because there are exceptions now and we know how to build them. So the last statement on that slide is actually wrong; it's on purpose, just to make a point, and I will keep the statement.

There's a question, Céline. Please. Yes, hello. I would like to ask: this mass of a GeV, does this bound come from experimental evidence, or...?
Well, historically it comes from the simulations: people realized you can't form structure if the particle is too light. But initially, for neutrinos to be the dark matter, the mass had to be in the ballpark of tens of eV to have the right density, and then came the structure-formation arguments. Nowadays there is a lot of development, and we know from cosmology, for example, that there is a limit on the neutrino mass, and the limit is below an eV. I didn't follow the latest numbers exactly, so I'm just giving you an order of magnitude, and I will check this, but it's basically very small. And then there is KATRIN: I think the KATRIN experiment recently announced a measurement, and probably some people here in the room can tell us what it was. I know they announced their results, but I didn't see the value; it's also a very small value. So just to go back: we know that the dark matter has to be heavy, but there are conditions attached to that, and I will spend a whole lecture actually demonstrating that you can find exceptions. That's why I'm saying maybe I shouldn't have written a GeV; I could have written almost any value, to be provocative, to be honest. A keV would be what the free-streaming bound would give you, basically.

Yeah, and I have from the chat, and in fact I confirm, that the limit from KATRIN is about one eV. Oh, thanks. Okay, and I have another question. The dark matter, is it supposed to be just one particle, or maybe several types of particles? Yeah, very good question. It's open, it's open to interpretation. Actually I myself, and many, many people, have considered several components, and I think one of the organizers of this school was also considering many components. You can have many components.
You will see why people tend to think of one particle, one species: it's just more convenient for making predictions, but it doesn't have to be that way. Yes, because ordinary matter consists of many particles. Yeah, that's right. And it could well be; I will try to show you why people thought about just one. It was, as always, a combination of developments: there was a theoretical development which pointed to one particle, and then there was another reason, which is predictability, because everything is very simple if you have one. But in reality you cannot fully rely on this assumption, and this is really what happened in the last maybe 20 years. I will explain that later.

And can the effects of dark matter have various explanations at the same time, like part of it being a new particle and part of it being MACHOs or primordial black holes, or even again modified gravity? Must it be just one of these, or can it be all of these things simultaneously? It could be several, it could be several. Again, you will see that it's hard: if you start to think about a combination of many solutions, it gets hard to make predictions. It could be that there is indeed a population of primordial black holes making up only a small fraction of the halo, and that that plus new particles explains everything, including the CMB and the halo. It could be that it is a modification of gravity plus new particles. Honestly, at this stage I think everything is open, but historically this is not what the community has done. The community was very focused on one particle, and because we can't find it, we obviously now need some revision, so people, in particular in the last 10 years, have revised every assumption. I see. Thanks very much. You're welcome. And there's another question.
Just to say, for the last line on the slide, I should have been more conservative: I should have put a keV, which is what the free-streaming bound gives, so that there is no confusion. All right. And there's another question, by Julian. Yes, I'd like to ask: besides modifying gravity, what are the other non-particle solutions to dark matter? Beyond the ones you mentioned? Yes, besides those, what else is out there? Yeah, I can also give you another reference. Many things have been tried, but mostly people tried gravity. So Bekenstein, in his theory, has two metrics. I will not go into details here, but he uses the fact that there are two possible metrics, and a lot of people after that started to think: what if we extend this, or use this principle? Many people have explored this, but it only gives you some modification of gravity on small scales; it never really explains the CMB. And it's only very recently, I think it was last year, that Costas, who was my collaborator on this paper, proposed a theory which now seems to explain the CMB. But as I mentioned yesterday, the problem is that it doesn't really reproduce the formation of galaxies, the dark matter in galaxies. So it's really hard to make both compatible: the CMB, the primordial evolution for gravity, and what happens today. But we are actually working on it together, I have a PhD student looking at this, so there may be some hope. It's really hard. In fact, yesterday I mentioned the work with Costas on that theory; I can just tell you, it took us a year and a half, a year and a half of full days, working on it morning to night. It's really hard. So maybe it is a solution, but it's not easy. So, if you want to do modified gravity, it's not something you improvise.
I mean, you don't wake up one day and start to work on it, because there are lots of risks, and you can spend a lot of time on a theory which in the end doesn't give you the right result. So it's not an easy game. On the other hand, we will see that new particles work, for whatever reason. To me, and I am being very provocative here, people get annoyed when I say this, what the new particles do is essentially mimic a modification of gravity, but they do it in a scale-dependent way, if you want. Obviously there is a reason, because it's a question of density, but it's essentially akin to modifying gravity. So if you can find a modification of gravity that mimics the predictions of new particles, then you win. I hope that answers your question.

Okay, so there are two more questions. Maybe you first, please, Marouane. Hello, Professor. Please, I have a question. Why do we search for a particle with a lifetime greater than the age of the universe, given that all particles were created in the Big Bang? Yeah, so: we know that the dark matter exists until today, and we know that it was there in the past because, as I showed you, if you go back in time, if you remember the lensing, you see that it was there at large redshift. In fact, we know that there was dark matter at the CMB. So we assume, and you will see why, that it was created at some time, we don't know exactly when, between the Big Bang and the moment of decoupling, which is the moment where we can see the CMB, and it is still here nowadays. Because of that, we know that it needs essentially the same age as the universe. Okay. And the last question by... Yes. Hello, Professor. This is a very quick question. Is the limit on the neutrino mass a guide for the lower limit on the mass of dark matter? Or...
Yes. I mean, I wanted to be provocative, but hopefully I have enough time today to show you what can happen. Okay, okay. Okay, thank you. So in reality there is no hard limit on the dark matter mass, but if you go too low, you will have problems with structure formation, and then you can't really consider a particle per se; you have to consider it more as a field. I'm not sure I have time to say that today, but I'll try. Okay, thank you.

And Julian in the chat specifies the question, which was whether there is anything else beyond particles and modifications of gravity. Not really, no. I mean, apart from MACHOs; but then we saw that primordial black holes are back on track, unless the metric is incorrect. So yeah, apart from MACHOs, new particles per se, and modification of gravity, those are basically the only proposals I've seen so far. All right. Should we continue? I see maybe a very, very last one, from Gautam, and then we can have more questions later on. Actually, my question is: since dark energy is sort of overtaking the amount of matter in the universe, would dark-matter-to-dark-energy mechanisms also place bounds on what kind of particles we're looking for? Or is it just that we do not know enough about either of them to say? That's right. So it could be that they are related; they could be different aspects of the same field. It could well be, that's really possible. But in the end, as you said, we don't know enough. People are trying to build such models, there were predictions, but I didn't intend to speak about that, and this is not the favorite candidate of the community.
So even though they may be related, that's not an explanation which has attracted a lot of attention, though eventually we may discover that this is the way to go. I think you need to take everything I say with a pinch of salt, in the sense that at the moment everything is up for discussion: since we haven't discovered any dark matter particle yet, any assumption that the community has made so far might be wrong. So when I tell you something, take it as the rough consensus of the community today; it might not be tomorrow. Okay, thank you.

Okay, so maybe I'll continue. There's another paper I wanted to highlight, but I will move on. For today, I want to go through some particle physics candidates. For that, I first need to speak about the thermal relic density, then the limits and the exceptions to the density calculation, and then I will move on to the candidates. Relic density means the value of the cosmological density parameter measured today; there was a question about that. Historically, people came up with the idea of thermal WIMPs. I will explain what WIMP means a little later, but the notion of "thermal" is what I'm going to discuss right now. And I wrote "the good old times", because this was something you had to consider: until the 2000s, you couldn't write a paper on dark matter without considering the relic density and the mechanism behind it, usually assuming thermal dark matter. Then the situation changed, and I will explain why. So why thermal? Well, you know the electrons and the photons, and you know they were coupled. The idea was that the particles were produced very early on and then evolved, but when they were produced, they were interacting through cross sections which were very large. So the idea is that they had always been in thermal contact, because they always had an interaction.
And it's only at the moment of last scattering, the surface we see in the CMB, that they stop interacting. So for a long time all those particles had the same temperature. Now think about dark matter, say in the seventies: although you want to introduce a new particle, at the same time you want to use the fact that the Standard Model particles were in thermal equilibrium for a long time. So what you do is assume that the dark matter was produced very early on, like everything else: maybe it was produced just after inflation, for example, maybe it was produced together with the primordial fluctuations. Whatever happened, you assume that the dark matter had essentially the same temperature as the electrons in the early universe. Then eventually something happened to it: maybe it stopped interacting with those particles and then became non-relativistic, or maybe it became non-relativistic and then stopped interacting; you don't know the order, you need to figure it out. But the first assumption is: all the particles were created at the same time with the same temperature. It's important to realize also that for photons, as I mentioned, the scale factor which enters the metric is essentially the inverse of the temperature. A very small scale factor corresponds to a very large temperature, so the universe nowadays, with a very large scale factor, has a very small temperature. We know what it is because we measure it: we measure the CMB at 2.725 Kelvin. So we know it's very cool now, and in the past it was very hot. And so the first papers were basically telling you: okay, the scale factor is set by the temperature of the photons, so if you now want to make any calculation, you need to use the scale factor.
The scale factor will always be defined by the photons which arrive to us. But if you assume that all the particles were produced at more or less the same time and were in equilibrium, then you can use the photon temperature as the temperature of all the other particles. And this means that you can make predictions; it makes your job much easier. So I hope you can follow what I just said: by making the thermal assumption, you can use the fact that you know the temperature of the photons, because that sets the scale factor and you can measure it with the redshift and so on, and therefore you can access the temperature of all the particles, including the dark matter. Now, the evolution, which I'm sure you've seen, so I will pass over the details: the universe was radiation dominated in the early universe, then eventually transitioned to become matter dominated, and then eventually Lambda takes over. So we expect dark matter to become very important during the phase where the universe is matter dominated, but it was probably produced very early, during the phase of radiation domination. So again, you have to follow through the different stages of the universe and the potential evolution of the dark matter temperature, which, if you assume it's thermal, is easy, because it's the same as the photons. I'll just show you this because I thought it was quite interesting, but there is more for you to look at. So now here is the logical step. Assume that you have thermal dark matter particles, and assume that you have only one species. Consider the universe, drawn as a box. In the early universe the box would have been very small, but the densities were enormous, because you have many particles in a very small volume. Then the universe expands, so the box increases, and eventually you arrive today, where the box has a certain size.
So as the universe evolves, the volume changes, but the number of particles doesn't necessarily change. Take the photons: nothing happens to them, they don't disappear. So the only thing which happens to the photons, if you define the number density, is that the number stays the same while the volume increases. The number density decreases, but only because of the expansion of the universe. I put the 3H there to remind you: the expansion of the universe is given by the Hubble rate, and when you look at the equations you will see there is always this 3 in front of the H in most of the important equations. The 3 generally comes from the fact that you have to take the time derivative of a volume, and the volume is a length cubed, so you always end up with 3H. The point here is: if the dark matter behaves like photons, then the number density of dark matter decreases, but the number of dark matter particles stays the same. But it could be that the dark matter behaves like the Standard Model particles. In the Standard Model you have electrons and positrons, for example, and we know they annihilate, we know they produce photons. You don't want the dark matter to annihilate and produce visible photons, but it could in principle annihilate into photons as long as they are invisible to you, at a different wavelength for example, and it could annihilate into electrons and positrons. It could annihilate into anything in principle, but historically people didn't really consider the photon final state; they assumed that the dark matter could annihilate into, say, quarks and anti-quarks, but you could also have leptons and anti-leptons. If you have this annihilation, then the story changes with respect to what I showed you on the previous slide, because now it's not only the box which changes: the number of particles in the box is also decreasing.
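The origin of the 3H term mentioned above can be written out in one line. For particles that are neither created nor destroyed, the number in a comoving volume is conserved, so:

```latex
% Number conservation in an expanding box of side a(t):
n \, a^3 = \text{const}
\;\Rightarrow\;
\frac{d}{dt}\!\left(n a^3\right) = \dot{n}\, a^3 + 3 n a^2 \dot{a} = 0
\;\Rightarrow\;
\dot{n} + 3 H n = 0, \qquad H \equiv \frac{\dot{a}}{a}.
```

The cube of the length is where the 3 comes from, exactly as stated in the lecture; annihilation adds an extra term to the right-hand side, which is the subject of the next slides.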
Every time you have an annihilation, if it's efficient, if it doesn't go back, that is, if the particles the dark matter annihilates into don't annihilate back and reproduce the dark matter, then, with only one way for the dark matter to annihilate into something, the dark matter disappears. So you started with a certain number of dark matter particles and you are left with a smaller number. Eventually you arrive at the point where, if you assume that dark matter was produced thermally, you have an idea of the evolution of the number density in terms of volume. You know how the volume changes, because that's basically given by the photons, but then you have to describe whether the dark matter has some interaction and whether it annihilates, because if it annihilates, that changes the number density that you will see today. I hope that's clear; it's not necessarily obvious to explain with slides, but eventually you can describe this problem using an equation which is called the Boltzmann equation, and this was really the work of Hut and of Lee and Weinberg. It goes as follows. It's a very simple argument, and it's basically thermodynamics in an expanding universe. At this stage I don't really need to speak about particle physics; I just need to assume that there are particles, and that's it. So the argument is: the evolution with time of the number density, the number per volume, is given by the evolution of the volume, since the expansion of the universe changes its size, and then the other term is the annihilation cross section. You need two dark matter particles to disappear together, to self-annihilate; that's why you see the number density n appearing squared.
And as I said, you have annihilation in one direction, but potentially you could also have annihilation in the other direction, where the particles the dark matter has produced annihilate back into dark matter. So you have a certain equilibrium, which is actually given by the equilibrium density n_eq in this formula. So it's a very simple equation. If you didn't have sigma-v, the annihilation cross section, nothing would happen to the number of particles; it's just the volume which changes, and the equation would simply describe how the number density evolves with the scale factor. But if there is an annihilation cross section, then you have two terms: the expansion, which is decreasing the number density, and the cross section, which changes the number of particles. So the question is, which one is dominant? And obviously, because you don't know the dark matter particle, you don't know the answer. So what Lee and Weinberg did, which is very clever, and it's just one solution, is to look at the problem this way. At first the dark matter was in thermal equilibrium and actually relativistic. I'm sorry, I just realized there is a typo on my slide: it should be "relativistic" instead of "non-relativistic". If it was relativistic, then it would produce particles with a lot of energy, which had enough energy to produce dark matter back. So the number of particles doesn't change, it stays constant, and nothing happens. But then, as the expansion proceeds and the universe becomes bigger and bigger, the temperature of the universe decreases, the temperature of the photons decreases, and if the dark matter is still in thermal equilibrium with the other species, its temperature eventually becomes lower than the mass of the dark matter.
In this case the dark matter becomes non-relativistic, and from that moment it can produce particles, but those particles don't have enough energy to produce dark matter back. When this happens, the number density of dark matter decreases exponentially; it goes down. And the question is: does this process go on forever or not? The answer is no, it stops, because eventually the expansion is so efficient that dark matter particles can't see each other anymore, and they can't annihilate with each other. When the volume is so big that the number densities are extremely small, the annihilations are no longer efficient. They can still happen, but they no longer change the number density of dark matter, and the number is constant again. So you really have three phases: the moment where the dark matter becomes non-relativistic, then the exponential drop of the number density, and then eventually what we call the freeze-out, where the expansion wins and the number of particles becomes constant. Now, this is extremely important, because you don't know the moment of freeze-out, but you know the number density today: it has been measured, Planck, as I told you, and the CMB experiments gave you a number. So you have access to the number today, and because you know the number was conserved from that moment until nowadays, you actually have access to the number density of particles at the moment of freeze-out. So you know where the freeze-out happened, and then the question is where along the exponential you are going to land. The slope, which determines where you end up, is essentially fixed by the cross section.
So you can see, for example, if the cross section is weak, you have your exponential drop, but because the cross section is weak, you're going to end up with still quite a lot of dark matter, because the annihilations are not efficient. So you will end up with an amount of dark matter which may be too high compared to the observed value; it basically translates into a high number of particles today. If the cross section is very large, then you will annihilate so much that there will be almost no dark matter left. So you can see, for example, if the dark matter has electromagnetic interactions, there will be too little left; if it has strong interactions, it will be the same, not enough dark matter nowadays. So you can see that there must be only a small range of values which explains the density that you observe today. It is, in essence, very finely tuned. I gave you my way of doing it, which is an analytical way, and I give you the numerical solution if you want to have a go yourself. I won't explain this — I might explain it maybe in another session — but for now what I want to insist on is the physics. So just to repeat what I said on the previous slide: you have an equation, the Boltzmann equation, which describes essentially what happens to particles in an expanding universe. The number density evolves with the expansion and with a potential annihilation cross section. If the dark matter was decaying, I would need to add a decay term to this, and you will see that it can be relevant, but if the decay is too fast, then there would be no dark matter left, so it can't be the dominating term. Now, this equation, the Boltzmann equation, doesn't involve any mass, it doesn't involve any particle physics at all. I'm not telling you it's a neutral particle, I'm not telling you how it behaves. I just tell you it has an annihilation cross section. So I can solve this equation.
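Since the Boltzmann equation involves no particle physics, the three phases just described — equilibrium, exponential drop, freeze-out — can be reproduced with a toy numerical integration. This is only a sketch under simplified assumptions (the function name `relic_abundance`, the 0.145 prefactor in the equilibrium abundance, and the `lam` values are illustrative, not the lecturer's actual numerical solution):

```python
import math

# Toy freeze-out: evolve the comoving abundance Y = n/s against x = m/T,
#   dY/dx = -(lam / x^2) * (Y^2 - Yeq^2),
# where Yeq ~ 0.145 x^(3/2) exp(-x) is the non-relativistic equilibrium value
# and 'lam' bundles up the annihilation cross section (bigger lam = stronger
# annihilation). A semi-implicit step keeps this stiff equation stable.

def relic_abundance(lam, x_end=300.0, dx=1e-3):
    x = 1.0
    Y = 0.145 * x**1.5 * math.exp(-x)      # start in equilibrium
    while x < x_end:
        Yeq = 0.145 * x**1.5 * math.exp(-x)
        a = lam * dx / x**2
        # backward-Euler solve of  a*Y_new^2 + Y_new - (Y + a*Yeq^2) = 0
        Y = (-1.0 + math.sqrt(1.0 + 4.0 * a * (Y + a * Yeq**2))) / (2.0 * a)
        x += dx
    return Y

# Weaker cross section -> annihilation stops earlier -> more dark matter left:
Y_weak   = relic_abundance(lam=1e7)
Y_strong = relic_abundance(lam=1e9)
```

Running it reproduces the behaviour described above: the weaker cross section freezes out earlier and leaves more dark matter behind.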
I usually give it to my year-two undergrad students, so it's always fun to see them solving it, because basically they could have written the Lee and Weinberg paper. But when you solve this equation, what you get is that the cosmological parameter associated with it is simply 3×10^-27 cm³ per second divided by the annihilation cross section. That's it. That's all. That's the solution of that equation. So when you have this, now you have to think: what is the value of the annihilation cross section, and is it reasonable? So as I said, the CMB experiments now give us a very precise value of the cosmological parameter. You've seen it on the slide yesterday, but I recall it here: the cosmological parameter associated with dark matter, multiplied by the square of the Hubble constant divided by a normalization fixed to 100 km/s/Mpc — so Omega h squared — is 0.12. If you want the value of Omega for dark matter itself, you need to take 0.12 divided by the square of 0.7, which is more or less the value of h today. And that will give you the ballpark of 30% of dark matter. So if you do this, you have Omega h squared, you know it's 0.12, and you immediately deduce that the annihilation cross section which gives you the right relic density — if dark matter annihilates, if it was thermal — is 3×10^-26 cm³ per second. If you don't like cm³ per second, divide by c, because it is really sigma times v, and then you get, let's say, roughly 10^-36 cm². That is the weak interaction. It's the same order of magnitude as the weak interactions in the standard model. So at the time people said, oh wow, we immediately found a solution, because we knew it was not having electromagnetic interactions, and we knew it didn't have strong interactions — so it was not described by U(1), it was not described by SU(3).
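The arithmetic on this slide fits in a few lines. A sketch, taking the lecture's quoted prefactor 3×10^-27 cm³/s and the Planck value Omega h² = 0.12 at face value, with h rounded to 0.7:

```python
# Back-of-the-envelope check of the numbers quoted in the lecture.

PREFACTOR = 3e-27      # cm^3/s, the quoted solution of the Boltzmann equation
omega_h2 = 0.12        # Planck measurement of Omega_DM * h^2
c = 3e10               # speed of light in cm/s

# Invert Omega h^2 = PREFACTOR / <sigma v> for the thermal cross section:
sigma_v = PREFACTOR / omega_h2      # ~3e-26 cm^3/s

# Divide out the velocity to get an effective area (weak-interaction scale):
sigma = sigma_v / c                 # ~1e-36 cm^2

# And the dark matter fraction itself, with h ~ 0.7:
omega_dm = omega_h2 / 0.7**2        # ~0.24, the "30% ballpark"
```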
Then you think maybe it could be described by SU(2), and this calculation tells you, well, actually yes, it could make sense. And so a picture emerged: you have particles which, as I said, live very long, so they're more or less stable; they're massive, so they could be cold, as you would need, though I didn't really explain why; and then, as I said, they're weakly interacting. So you have a weakly interacting massive particle, and that became what we call the WIMP. Everything seemed to work very well. Then once you have this, you can start to make predictions; you can go back to particle physics, because the question is — you didn't specify the dark matter model, you didn't specify the particles here — which particle can explain this cross section, which model? And this is what Hut and Lee and Weinberg did at the time, in 1977. You know that the weak interactions are mediated by the W boson, so it's very tempting to think of neutrinos; light neutrinos didn't work, but maybe heavier neutrinos would. So you put in heavy neutrinos, and then, well, in order to annihilate into, say, electrons and positrons, they need a W boson. So very quickly you have heavy leptons — heavy neutrinos if you want — annihilating into electrons and positrons, for example, via the W. You can compute this cross section, and you obtain that the cross section is proportional to the dark matter mass squared divided by the mass of the W to the fourth. And now you know the value this cross section has to have: 3×10^-26. When I say it has to be — well, in reality I'm forgetting that there are some error bars; I actually cut them, sorry, I should have put them back. Planck is not making a perfect measurement, so it's not just one value, but it's a very narrow range.
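The heavy-neutrino mass estimate just described can be checked numerically. This is only an order-of-magnitude sketch: all O(1) factors are dropped, the conversion constant is rounded, and the scaling sigma·v ~ G_F² m² stands in for the full W-exchange cross section:

```python
import math

# Rough Lee-Weinberg-style estimate. For a heavy neutrino annihilating via
# the weak interaction, sigma*v ~ G_F^2 * m^2; requiring the thermal value
# 3e-26 cm^3/s then fixes the dark matter mass scale.

G_F = 1.166e-5            # Fermi constant in GeV^-2
GEV2_TO_CM3S = 1.17e-17   # (hbar*c)^2 * c: converts GeV^-2 to cm^3/s
TARGET = 3e-26            # thermal annihilation cross section, cm^3/s

m_dm = math.sqrt(TARGET / (G_F**2 * GEV2_TO_CM3S))   # mass in GeV
```

The result comes out at a few GeV — heavier than a proton, which is the Lee-Weinberg lower bound quoted in the lecture.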
And now, you have the mass of the W. They didn't have Planck, but they could assume that the relic density wouldn't exceed 100% and would be more than zero. So, given that the mass of the W is fixed, you very quickly obtain a range of values for the dark matter mass. And this is what is in this paper. You can see — this is a figure from the Lee and Weinberg paper, which might explain also why it was so popular — the range: they show you the 17 keV for neutrinos, but never mind. Basically, the red range is where they think a heavy neutrino or heavy lepton could lie, where the relic density could be appropriate. The density could in principle be close to zero, and it cannot be more than one, because then it would dominate the universe, so it has to be in between — a few GeV and, eventually, a few TeV. And that argument dominated the field for a good 20 years: everybody was thinking the Lee-Weinberg argument tells you dark matter has to be heavier than a proton, and it can't be really heavy either. All right, so this was the historical argument. Now, maybe we can have a little break, and then I can go into the exceptions. Yes, exactly, perfect. And perhaps before the break there's a question by a visitor, please. Hello Ms. Boehm, it's a nice presentation. Actually, I have a question about the weakly interacting massive particle. So, massive means — in what sense is it massive? Because currently I'm working on weak decays of heavy hadrons, including both baryons and mesons. So normally heavy, in the sense I understand it, is heavy baryons, like the 10 GeV to 14 GeV masses that we know. Is it that kind of particle, or is it something else? So, you will see — I will spend all the rest of the presentation questioning the notion of massive. I mean, the dark matter needs to feel gravity.
But if you speak about particles with a mass, the question is: what is the mass? Lee and Weinberg were telling you it has to be heavier than a few GeV, but in reality it could be smaller than that, and I will show you that it could have been MeV, could be keV. Okay, so it's not an ordinary matter particle like the ones we know about. It has to be something else; it cannot be ordinary matter. It could be, as some people were thinking, a composite particle. So, one thing I didn't mention: I kind of have in mind a fundamental particle. You could think about composite particles, and then people were thinking it could be something like a pion, so you could have some equivalence to some extent. There's always the question of the lifetime. But essentially, because you're introducing a new sector at that stage, it doesn't have to be at the mass of the standard model particles. Okay, can you roughly tell me the mass range compared to the heaviest baryon we know to date? For dark matter, the range can actually go down to 10^-20. 20? Minus 20, so extremely small, but then you don't speak about a particle per se. It's called fuzzy dark matter, if you want, and maybe you're familiar with axions — axions would be 10^-60. And fuzzy dark matter is basically a version of axions, but taking the mass, instead of the usual range, really, really low. But then after that, you can also go very heavy. So Lee and Weinberg were making some assumptions; with those assumptions they found this range between GeV and TeV, but in reality now we know that we can remove those assumptions, and you can go very heavy, you can go very low. Okay, thank you. Okay, there's another question from Lalith. Hi, my question is about where you have the decoupling and the relativistic regime. So you have first the relativistic regime and then the decoupling. So what if it is decoupled first?
Say you have a dark matter of 50 GeV and it is coupling to the W or the top, in that case. So, I didn't fully hear, but you're asking whether it could decouple very early on, taking into account the mass. All right. So, with the mass, it could decouple very early on, but it will still decouple after becoming non-relativistic. If it decouples very close to inflation, for whatever reason, it would mean the mass is such that it was decoupling just after inflation. Okay, then the last part will remain the same; only the number density will change? So — I'm having trouble hearing you. Okay, I'm saying that in between the two green columns, it is written that the decoupling starts happening. I'm saying only this part will change. Yeah, that's right, that's right. But this is only true, remember, for thermal dark matter; if you know the dark matter mass, that's it — you use this formula and you know the number density you should get. The number density needs to go there, but because you have a universe in expansion, as I said, remember, the expansion wins, and eventually you reach the regime where the number density is so low that there are no more annihilations, and then the number density just follows the expansion. All right, and there's another question from Chandana. Hi ma'am, I would like to ask you about the equation dn/dt. Yes. So here, are we considering the number density to be constant, like, after the annihilation? Oh yes, after the freeze-out — maybe I should go back here. After the chemical decoupling, so the freeze-out, the number density is constant. So you see on the slide the equation again, in the left corner. At the stage when the density is constant, this means that sigma v n² is much smaller than Hn.
So it's a regime where it is basically negligible. I see. I don't understand this sigma v — I mean, sigma is the cross section, but what is sigma v actually? Oh, we call it sigma v: it is sigma times the velocity of the particle. So if you compute a cross section — I don't know if you have ever done that — the way to compute it is basically to look at the interaction, but you need a flux of particles coming in so that they can interact. And this flux contains the notion of velocity. But in the universe, you don't know at which velocity the dark matter particles come together to annihilate, and so, because you don't know, you just multiply by the velocity and then you get rid of it. So it's like saying: it's a parameter I don't know, and I don't need to know it, because anyway it comes in like this in the equation. So it's a very convenient thing that what counts in the Boltzmann equation is sigma v, and since the definition of the cross section basically has one over v in it, sigma v actually doesn't have any v anymore. I hope what I said is clear. So we are considering the average of sigma v, right? Yeah, that's right, it should be the average. And would that average of sigma v be constant — I mean, after we consider the number density to be constant, would the average of sigma v be constant? So, you have to look at sigma v n². What happens after the chemical decoupling is that you have n² minus n0², which basically becomes, if you want, sigma v n².
So n0 is given by the formula with the exponential drop, and n is really the actual value, the value that you measure. At some stage n exceeds n0, so you can neglect n0 in this equation, and at the moment of decoupling you basically just have sigma v n². But then you have to compare it to the minus 3Hn, or Hn, term, and you can see that at decoupling you basically have Hn equal to sigma v n²; you can suppress one power of the density, so you have the expansion rate H equal to sigma v n. So when you do this, essentially you can see that afterwards sigma v n becomes negligible with respect to H. So it's not that sigma v becomes constant; it's that the annihilation term becomes negligible. The number doesn't change anymore, but the number density, which is in this equation, still evolves, because the volume evolves — maybe that's what you were asking. Thank you. Thank you, and the very last one, maybe quick, from Shivam, before we go into the break. Hello, I just had this question: you said the number density will remain the same even after the annihilation, but the expansion of the universe still goes on. The number is what stays the same, but the number density drops. Okay, yeah — the number stays the same, but the number density decreases. Okay, thank you. Okay, very good. All right, so let's have a break — five minutes, is it okay, Celine? Five minutes, so we resume shortly. See you in a bit. Right, are you ready to resume? Okay. Yes, please, Celine. And now maybe we should ask if the students have managed to follow; I think there was a bit of a drop in numbers, so maybe, yeah, if you want to comment — if you don't like the lecture and I need to do something better, please let me know.
So we arrived at the point where we were saying that Lee and Weinberg and Hut were showing that there is one mass range, which is very important. But then the question is: is this range valid forever or not? And I want to show you that this range actually assumes so many things that you can alleviate it. So there are plenty of exceptions, and I will show you this. I use 1998 because there was a bit of a turning point then, when the community changed and people started to consider many different models. Before that, the main assumptions, as I said, for many years, for 20 years, made by most people in the community, were the following. First of all, dark matter annihilates — it doesn't have to, but people were assuming this. Then, people assumed that the dark matter and anti-dark-matter number densities were equal — doesn't have to be; for ordinary matter we know that there is an asymmetry, and it could be that there is an asymmetry in the dark sector too. The other thing is, we assumed that the dark matter was annihilating into standard model particles — again, doesn't have to be, but that was the logical choice, because you know the ordinary matter; if you don't do this, you likely end up with a new sector, but you can't see it, so it doesn't really help you. You can do it, and I will show you that some people are doing this now, but you really want to make predictions, so you assume dark matter annihilates into known particles. Now of course, nature has chosen something, we don't know what it is, so it may be that this is irrelevant. The other assumption, which stemmed from Lee and Weinberg and Hut, was that the mediator of the annihilation is heavy — doesn't have to be — and it was somehow assumed that the dark matter was a fermion and the mediator more like a gauge boson — doesn't have to be either. So for 20 years people were making this kind of assumption.
The assumption about the mediator was not strictly universal, because people did consider other options, and the last one maybe neither, because with the advent of supersymmetry people started to consider other possibilities. But in 1998 people really started to question the Lee and Weinberg and Hut argument and to think about alternatives. I will show you something else after, but the first thing I would like to show you is what I have done personally — not because it's my work, but because I think it marked a strong transition in the community. So I told you, everybody was saying: well, if Hut and Lee and Weinberg are correct, then the dark matter has to be heavier than a proton. And what I realized in 2003 — that was my PhD thesis, to some extent — is that it doesn't have to be, and I'll show you the argument. The argument stemmed from calculations I was doing in supersymmetry, but it is fairly simple. First of all, let's assume dark matter is not a fermion but a scalar. In this case it can annihilate, and I assumed too that it would annihilate into standard model fermions, so you have dark matter annihilating into fermion anti-fermion pairs. If you assume that dark matter is a scalar, the mediator has to be a fermion. So then you can do the calculation in a very generic way, and you find that the cross section has in the denominator the mass of the mediator fermion to the fourth, if the mediator is heavier than the standard model fermion, and in the numerator you see two parts: one part which is proportional to m_f, the mass of the fermion in the final state, so the standard model mass, and one part proportional to the mass of the mediator itself. Then you have the terms c_l and c_r, which are the couplings, left and right.
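The consequence of this structure can be made concrete with a deliberately minimal sketch. The couplings, masses, and the two functions below are hypothetical illustrations of the scalings only, not the full expressions: for scalar dark matter exchanging a heavy fermion F with c_l·c_r nonzero, the dominant piece goes like (c_l·c_r/m_F)², with no dark matter mass anywhere, in contrast with the Lee-Weinberg scaling:

```python
# Scalar dark matter with a heavy fermion mediator: dominant term when
# cl*cr != 0 and the mediator is heavy. All numbers are made up.

def sigma_scalar_dm(cl, cr, m_F, m_dm):
    # m_dm is accepted but deliberately unused: that is the whole point.
    return (cl * cr / m_F)**2

# For contrast, the Lee-Weinberg-type scaling for fermionic dark matter:
def sigma_fermion_dm(g, m_dm, m_W):
    return g**4 * m_dm**2 / m_W**4   # grows with the dark matter mass

s_mev = sigma_scalar_dm(cl=0.1, cr=0.1, m_F=100.0, m_dm=0.005)   # ~MeV DM
s_tev = sigma_scalar_dm(cl=0.1, cr=0.1, m_F=100.0, m_dm=1000.0)  # ~TeV DM
lw_4  = sigma_fermion_dm(g=0.65, m_dm=4.0, m_W=80.0)
lw_40 = sigma_fermion_dm(g=0.65, m_dm=40.0, m_W=80.0)
```

Fixing the cross section to the thermal value therefore pins down the dark matter mass in the second case, but leaves it completely free in the first.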
I'm not sure how familiar you are with this, but it's basically the coupling to the left-handed electron and the right-handed electron, if you want. And what I observed is that when you do this calculation in this framework — there is nothing particular here, just assuming that the dark matter is a scalar coupled to a fermion — you can see immediately that if the coupling c_l is non-zero and c_r is non-zero, so the product c_l c_r is non-zero, and you assume that the mediator is heavier than the fermion, then from this cross section you immediately get that the cross section is proportional to the couplings divided by the mass of the mediator squared. Now, why is this important? Because I showed you before that with the Lee-Weinberg argument you had the mass of the dark matter in the numerator. I can just show you quickly again: this is it — you have the dark matter mass squared divided by the mass of the mediator to the fourth in the Lee-Weinberg argument. So when you impose a value for the cross section, you impose a value for the dark matter mass. But now, in the cross section I am showing you, you don't have the dark matter mass. So when you impose the value of the cross section that gives you the right relic density, you don't have a constraint anymore on the dark matter mass, which means that now you can have dark matter which is much lighter, or much heavier, than the range Lee and Weinberg were finding. So you can have very light dark matter — which at the time meant as light as an electron, because this is the lightest fermion you can think of: if dark matter annihilates into fermions, the lightest would be the electron, so it has to be heavier than a few MeV — or it could be much heavier. And so from there, for me, it was an important moment, because I started to look at the phenomenology of light dark matter, and a lot of work came out, and I mean, basically, I think the community
shifted and started to realize that we could explore light dark matter. That was, I think, the first paper really showing this. And then the other thing I noticed in the same paper — but I had noticed it before — was the fact that you can also have the exchange of a mediator which would be like a gauge boson. In this case you want it to be neutral, so it could be like a Z boson. It can't be the Z boson itself, because then you're back to having a heavy mediator, but you could imagine a mediator similar to the Z boson, just not with the same mass and not necessarily with the same couplings. In our paper we called it a U boson, because that was motivated by supersymmetry, but in reality it is what nowadays you would call a Z prime, or a dark photon. And when you do the calculation, you can see that the cross section is proportional to the dark matter mass, so you're back to the same argument as Hut and Lee and Weinberg: the cross section does depend on the dark matter mass. But now you have flexibility, because it could be that the mediator has a similar mass to the dark matter. If the mass of the Z prime is similar to the dark matter mass, then the cross section is almost independent of the dark matter mass, which means that, again, you can have light dark matter, as long as the mediator is also as light as the dark matter. So those two examples that I showed you here — I'm going to summarize them here — are basically telling you that you can have light dark matter; it can be much lighter than a proton. And I really think this was a shift in the community, which was so used to thinking about heavy dark matter. Later on, five years later, there was a very famous paper by Feng and Kumar, which is called the WIMPless miracle. Before, we found that the annihilation cross section is of the order of the weak interactions, and suddenly it gives you a WIMP — what people call the WIMP miracle. And the moment you realize the cross section doesn't
have to depend on the dark matter mass, and therefore the dark matter mass could be anything you want — including, for example, MeV to TeV, or even heavier than TeV — then you basically have a WIMPless miracle. So this was really, I think, a shift in the community, which suddenly opened the way to thinking differently about dark matter. Here is an example from my paper. I don't want to scare you, but I just want to show you: I was doing all those calculations without assuming a theory. Usually you come from a theory — you assume a theory and then you look at the phenomenology, the consequences of that theory. I was saying: I don't care about the theory, I can compute all those processes without knowing a theory, as long as there is a mass and there are couplings. So I was doing all those calculations in a very generic way. That is now known as simplified models, and it became a very powerful way of computing, because we don't really know what the theory is — in fact, the only real theory we have is supersymmetry, and we don't find it — so this is a way to keep going without having a theory. At the time I knew it was powerful, but I didn't appreciate how much people would want to use it later. So with this you can compute many things, including, for example, the g-2 of the electron, and I mention this because again it is relevant today: you have heard about some anomalies in the g-2 of the muon; the g-2 of the electron is well measured and extremely constraining; you have other anomalies. And again, the U boson is what is now called a Z prime, and you can compute all those contributions in a very generic way and connect them to the dark matter contribution. So really, the moment you start to think about light dark matter, or a light Z prime, you basically start to change the whole phenomenology, and you start to look into processes which nobody had
considered before. So I think this was an important moment — this is more of the same, I don't want to bore you — but the point is that at that moment people started to say: oh, well, actually we can compute all sorts of interactions with these simplified models. And what people have done after that is to even revisit the way dark matter interacts with the standard model, and realize that many of the assumptions that were made in the early 70s and 80s to compute dark matter interactions with the standard model were extremely restrictive. So people after that completely revisited everything — you can think about introducing new operators and so on — and that basically freed up the space for any kind of mass, any kind of interaction, and any kind of model. I think what I'm trying to tell you is, I'm not sure my paper was the instrument of the change — there were many anomalies coming and many people starting to think about new models — but really, I think the 2000s were the years when people started to think differently about dark matter. Instead of coming from an approach which was theory, then eventually a model, and then the phenomenology, the change — which is still currently the case — is that we don't think so much about theories anymore, but about models. So dark matter now is a question of models more than theory, and anything, in principle, can be considered. So, now that you have fully understood that in reality there is no constraint on the mass — there can be one, depending on how you do the calculation, but there doesn't have to be — I'm going to show you another exception to the relic density calculation, still assuming that dark matter is thermal. That was actually in the 1990s, and it was a very important paper. There were actually two sets of papers: one was by a French physicist called Pierre Salati, and I should have put the name, and
another one, which is extremely famous, from 1991, by an American, Kim Griest — whom, personally, I find a hero. And the paper from 1984 was actually by a French group, but it was not well known, so the American paper is again much more famous. In that paper there are three exceptions, and one is called coannihilation. So, I showed you the Boltzmann equation, an equation where the number density — again, not just the number, but the number density — evolves with time because of the expansion and potentially because of annihilation. But now — and this was a question I had at the beginning — if you think that there is not just one dark matter particle but maybe two, maybe they could actually live more or less at the same epoch; maybe in this case they would interact together, annihilate and disappear together. And this is called coannihilation. So when you write the term like this, you can see you have an annihilation cross section and a coannihilation cross section, and now it involves the dark matter species and the other dark species: you have n_i times n_j, and you have two coupled equations, one for species i and one for species j. You have to solve this; I don't think it's really important for me to go through it, but essentially what you find — you know, I didn't put the plot, but eventually — is that coannihilation basically indicates that the dark matter can be much heavier than was previously thought. Maybe I'll add the reference to this work tomorrow. The other thing that was noted is that you can fall on a resonance: if twice the dark matter mass is basically at the mass of the mediator, you can have an enhancement of the cross section, and that also changes the predictions you make. There is another thing, which is actually supersymmetry-related, called the focus point; I'm not going to explain it, but I just wanted to show you that this paper in 1991, or
even in 1984, people were already thinking about how to change the prediction from Hut and Lee and Weinberg. However, the change came only in 1998, because suddenly everybody was looking into those equations and starting to compute; in 1998 there are many papers about coannihilation. And then, as I said, people really progressively started to think about more and more exceptions, more and more models and variations. But the thing I didn't tell you is that there was a theory behind all this for a long time, which was supersymmetry, and I didn't really explain it so far. So maybe I start here. Supersymmetry is a complicated subject — maybe many of you have studied it. The simplest theory, if you want — N equal one supersymmetry — has one operator, and this operator is such that every time you apply it to a standard model particle of spin one-half, so a fermion, it will transform it: it will create another particle of spin either zero or one. So when you apply this operator to the standard model spectrum, you get new particles with spin either zero or one. Now, in effect, what supersymmetry does — in particular N equal one — is to double the standard model spectrum. So you have the standard model fermions plus spin-zero particles, which are called sfermions; and then you have the Higgs and the gauge bosons, which are transformed into spin one-half particles — potentially spin one also — and those spin one-half particles are called higgsinos and gauginos. Now, what is very interesting at this stage — so you have new particles in your spectrum — is that you basically get a new symmetry, which initially was a U(1), but you can break it, and it becomes what is called R-parity. The expression for R-parity is written here; you have to compute it depending on the particle you're considering, but in a sense you don't really need to worry: any supersymmetric particle has R-parity minus
one, and any standard model particle will be plus one. The rule is that R-parity has to be conserved. What I represented here is a decay: if you have a supersymmetric particle which decays, the rule is that you need to conserve the minus one, if you want. If you had two particles with minus one and minus one in the final state, that would give plus one, and that would not actually conserve the supersymmetric charge. So in the final state of a decay of a supersymmetric particle, you always need to have a standard model particle and a supersymmetric particle. Now, having this, it means that if you take the lightest supersymmetric particle, it will not be able to decay into anything, because it would need a supersymmetric particle in the final state, and there is none lighter, because it is the lightest one. So with this symmetry, supersymmetry is telling you: take the lightest particle of the spectrum, and that will basically be a particle which is stable forever; it will never decay. Now, that's great when you're looking for a dark matter candidate: you're looking for a particle which is stable, so that immediately gives you a dark matter candidate, provided that it's massive — at the time, at least, people were thinking it has to be massive — and that it has relatively weak interactions. And that's exactly what you get with a combination of these higgsinos and gauginos: there's a combination which is called the neutralino, and those neutralinos can naturally be the lightest particle of the spectrum; they have weak interactions, they have a mass, and therefore they are the natural candidate for dark matter. So — this is the same here, but maybe less clear. A lot of people before 1998 were using supersymmetry for doing those calculations, and they would present the result in a
very complicated plane which depends on parameters of the theory, so it is not directly readable as a dark matter mass or a mediator mass. That is why I actually use the simplified-models technique: there you have direct access to the dark matter mass and to the mass of the mediator, and you do not need to assume any particular theory to understand whether your parameters give you the right relic density.

The only thing I wanted to show you here is this diagram on the left. You see the green region, and the mass written on the x-axis is m_1/2; the neutralino mass, so the mass of your dark matter candidate, is roughly half of that, and that gives you a dark matter mass of about 200 GeV. So around 1998, people had refined the calculation so well that they were saying dark matter has to be lighter than 200 GeV. And we had LEP (we did not have the LHC yet), and we knew that the limit from LEP was around 30 GeV, but we knew we were getting close to that window. That is why everybody started to think: maybe we need something else; maybe we need to ask how to extend that limit, whether it is possible to extend it, or whether it is really rigid, in which case, if supersymmetry exists, we should find the neutralino at LEP or eventually at the LHC.

So people started to revisit coannihilation, and suddenly you realize that the dark matter can actually be much heavier than 200 GeV. Since then, people have redone those calculations many times, the supersymmetric parameter space has opened up, and (this is the coannihilation plot that I wanted to show you, in fact) we realized that the dark matter could be much heavier, going up to a TeV, almost. So around 1998 there was a change, and as I said, in the 2000s people really investigated everything. This is more of the same about supersymmetry;
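The logic behind these mass bounds and behind coannihilation can be sketched numerically. This is a back-of-the-envelope illustration under standard textbook assumptions, not the lecture's actual computation: it uses the rule of thumb Omega h^2 ~ 3e-27 cm^3 s^-1 / <sigma v> for a thermal relic, and a toy two-state average for the coannihilation-enhanced cross section (function names and numbers are my own):

```python
import math

# Rule-of-thumb thermal relic abundance (standard estimate):
#   Omega h^2 ~ 3e-27 cm^3/s divided by <sigma v> in cm^3/s.
def omega_h2(sigma_v):
    return 3e-27 / sigma_v

# A typical weak-scale annihilation cross section gives roughly the
# observed abundance (~0.12), which is the "WIMP miracle" logic:
print(omega_h2(3e-26))  # ~0.1

# Toy coannihilation: average the annihilation cross sections over the
# dark matter state (1) and a nearly degenerate partner (2), whose
# equilibrium abundance is Boltzmann-suppressed by exp(-delta * x),
# with delta = (m2 - m1)/m1 and x = m1/T ~ 20 at freeze-out.
def sigma_eff(s11, s12, s22, delta, x=20.0):
    w1, w2 = 1.0, math.exp(-delta * x)   # relative equilibrium weights
    r1, r2 = w1 / (w1 + w2), w2 / (w1 + w2)
    return s11 * r1 * r1 + 2 * s12 * r1 * r2 + s22 * r2 * r2

# If the partner is close in mass (small delta) and annihilates much
# more strongly (s22, s12 >> s11), sigma_eff is boosted well above s11,
# the relic density drops, and a heavier neutralino can still give the
# right abundance -- which is how coannihilation relaxed the 200 GeV bound.
print(sigma_eff(s11=1.0, s12=10.0, s22=10.0, delta=0.05))
```

With the larger effective cross section, the same relic density is reached at a larger dark matter mass, which is the qualitative reason the bound moved from ~200 GeV up toward a TeV.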
I don't think I have the time, so I will move on, but I just wanted to show you the type of plots people produced in the community for a long time. You see all sorts of resonances and so on; everybody was hoping that if you fall on a resonance, you get a smaller relic density, and then maybe you can have two dark matter particles, and so on. This was also the moment when people started to think about multiple dark matter components instead of just one. Let's keep that.

Yeah, so we are almost at the end of our time.

Yes, thank you very much. So, just to finish off: by now I think it is safe to say that this scenario is in survival mode. You will see papers from 2016, for example, on "the last refuge" of mixed dark matter, very degenerate spectra, non-standard cosmologies; this is basically what it takes now to survive and to make it work. And then of course there was the Higgs boson discovery, which also started to bring in the idea that maybe we need to think about a different type of interaction. I think I will stop here and start again tomorrow, because otherwise I will just rush.

All right, yes, thank you very much, Seline, also for finishing on time, and for a very nice lecture. I see that there are hands raised, but unfortunately we don't have time; we need to stop now so that whoever wants to can follow the ICTP prize ceremony. But you have an opportunity to ask your questions tomorrow, in the question and answer session, for example. So for the time being, thank you again, Seline, and hopefully see you at the ICTP prize ceremony. See you. Bye.