Alright, so we're going to pick up where we left off yesterday. You'll recall that yesterday we spent the first part of the talk reviewing the evidence for dark matter, in particular the various epochs that constraints on the nature of the dark matter sector come from, and we'll return to that either later today or in tomorrow's lecture. Then we spent the rest of the talk going through the dominant paradigms for setting the density of dark matter, and we said the reason that was important is that it relates a macroscopic quantity, the observable density of dark matter in the universe, to a microscopic quantity: the cross section, or the chemical potential, or in general the interaction rate with the standard model. So we spent quite a bit of time going through the thermal freeze-out paradigm as the dominant one. We talked about using a chemical potential, this is known as asymmetric dark matter, to set the relic density in a similar way to how the baryon density is set, and then we spent a little bit of time going through freeze-out-and-decay and freeze-in. So just to remind you, and to make a little continuity with what we did yesterday: if you were to take freeze-out versus freeze-in, it can be pretty well summarized in these two plots. In the freeze-out paradigm, you have on the y-axis the ratio of the number density to the entropy density, and early on this is constant, which just means that the number density of dark matter scales with the volume, so as the universe expands it just dilutes with the entire volume. Then once this ratio x of the mass to the temperature becomes larger than one, the dark matter is becoming non-relativistic. What happens in that regime? How does it scale, once x becomes bigger than one? The exponential suppression: it goes like e to the minus m over T, or in this variable, e to the minus x. So that's where you start seeing this exponential suppression, and you follow this Boltzmann equilibrium distribution until the dark matter becomes so rare that it has trouble finding another dark matter particle. When that happens, annihilation stops, and you re-enter the regime where the dark matter just dilutes with the volume, and that's this regime here. So you have three separate regimes: one, diluting with the volume; two, exponential decay in the number density, the depletion of the particles due to annihilations; and then you exit that regime and re-enter the regime where you just dilute with the volume. And the place at which you exit this curve just depends on what that annihilation rate is. The larger the annihilation rate, the easier it is to annihilate, the further down you follow this curve until you exit, and that is the freeze-out paradigm in about 30 seconds. Okay. Yes, question? Can you repeat your question? Yeah, sorry. What makes it start colliding? I mean, initially it's just diluting, the universe is expanding. Right, so that's a good question, and the point is that up here, the annihilations are happening, but the inverse annihilations are happening at the same rate. This is what equilibrium means. I have dark matter, dark matter goes to two standard model particles, and it happens at an equal rate coming back. What's happening as the dark matter becomes non-relativistic is that it becomes kinematically disfavored for the standard model to annihilate back into dark matter, because the temperature is dropping below the dark matter mass. And that's the reason why you're getting this exponential suppression, in terms of the microscopics. Good question.
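If you want to see those three regimes for yourself, here is a minimal numerical sketch of the freeze-out Boltzmann equation in the yield variable Y = n/s with x = m/T. The normalization of the equilibrium yield and the constant lambda, which stands in for the annihilation rate, are schematic, illustrative choices, not a fit to anything:

```python
# Minimal sketch of freeze-out: dY/dx = -(lam/x^2)(Y^2 - Yeq^2).
# Annihilation minus inverse annihilation; they balance in equilibrium.
import numpy as np
from scipy.integrate import solve_ivp

def Yeq(x):
    # Non-relativistic equilibrium yield, up to an overall constant:
    # Y_eq ~ x^(3/2) exp(-x)  -- the exponential suppression regime.
    return x**1.5 * np.exp(-x)

def boltzmann(x, Y, lam):
    return -(lam / x**2) * (Y**2 - Yeq(x)**2)

xs = np.logspace(0, 3, 400)           # x = m/T from 1 to 1000
for lam in (1e7, 1e9, 1e11):          # schematic annihilation strengths
    sol = solve_ivp(boltzmann, (xs[0], xs[-1]), [Yeq(xs[0])],
                    t_eval=xs, args=(lam,), method="LSODA",
                    rtol=1e-8, atol=1e-30)
    print(f"lambda = {lam:.0e}: relic yield Y(x=1000) = {sol.y[0, -1]:.2e}")
```

Larger lambda tracks the equilibrium curve longer before departing, which is exactly the "further down the curve before you exit" statement above.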
Okay, so that's the standard freeze-out picture, and I wanted to put this other plot here for freeze-in to emphasize the point that freeze-in happens in the opposite way. Unlike the freeze-out case, where I start out with the thermal number density initially, here I imagine that I start with the dark matter not being populated at all, or almost not at all. It's not in thermal equilibrium with the standard model. It doesn't have strong interactions with the standard model, and so it's just not populated. On the other hand, the standard model sector is thermally populated at some temperature the way it usually is. And what's happening as the temperature drops, so here again is the same axis, time is going this way: as the temperature drops, you have an occasional process that produces a dark matter particle. So for example, you might have the occasional process where a superpartner plus a standard model particle produces a dark matter particle, where that coupling constant is imagined to be very small. If that's the case, the process is very rare, but it actually becomes more important as the universe cools. And the reason is that you just have more time: you can afford to sit there and wait for the occasional dark matter particle to appear. And what happens is that once I produce a dark matter particle, it's not in equilibrium with the standard model, so it doesn't come back. You just produce it and it's there. And so as you occasionally produce these dark matter particles, you get a buildup, and that's what's shown in these curves. Going up here, this is larger coupling; the more often you produce them, obviously the more dark matter particles you're going to have in the end. Okay, so these are sort of two opposite regimes of dark matter production, freeze-out and freeze-in. Are there other questions about that? Yes. So eventually, okay, and I don't want to confuse language, you can say that this process freezes out, in the sense that eventually, so we went through these scalings last time, and we found that the cross section went like one over T squared, and so the production went like one over T. That's this regime. Eventually, though, you expect the temperature to drop below some mass scale, and once that happens, it stops growing like one over temperature and becomes exponentially suppressed. So once the temperature drops below whatever this mass scale is, the process asymptotes and you no longer have this production process happening. An analog for this would be: suppose I had superpartner plus standard model particle going to a dark matter particle. Once the temperature drops below the mass of the superpartner, that process isn't going to happen anymore, and so at that point the process just saturates. Yeah, in that case it would be like that. So you pass a mass threshold, and passing that mass threshold tells you that the process is going to stop.
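Parametrically, and with all order-one factors dropped, the freeze-in yield from those scalings looks like this; a sketch assuming production through a small coupling \(\lambda\), not a precise formula:

\[
Y_\chi(T) \;\sim\; \int_T \frac{n_{\rm eq}\,\langle\sigma v\rangle}{H\,s}\,\frac{dT'}{T'} \;\propto\; \frac{\lambda^2 M_{\rm Pl}}{T}\,, \qquad T \gtrsim m\,,
\]

and once \(T\) drops below the relevant mass scale \(m\), the bath particles sourcing the production are Boltzmann suppressed, so \(Y_\chi\) saturates at roughly \(\lambda^2 M_{\rm Pl}/m\). Larger \(\lambda\) means a larger final yield, which is why the freeze-in curves stack the opposite way from the freeze-out ones.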
The other process that we focused on last time was to use a particle-antiparticle asymmetry to set the number density. This goes by the name of asymmetric dark matter, and we went through a mechanism by which you could use a particle-antiparticle asymmetry to set the density. We gave the example of a u d d operator, the neutron, interacting with the dark matter. So if I have a baryon asymmetry, that interaction will give me an asymmetry for the dark matter as well. And so that was what we saw: we could relate this particle-antiparticle asymmetry. And you guys remember how much more matter than antimatter I have, what that ratio was. It's one part in 10 to the 10, which obviously I can't show to scale here, but it's a tiny, tiny asymmetry, a very small chemical potential. And the idea was that you could relate the chemical potentials in these two sectors. Then what happens as the universe cools is that I have dark matter and anti-dark matter annihilating, and at the end of it I'm only left with dark matter, and with baryons in the standard model. So that was the third paradigm for setting the dark matter density. And then, simply because I didn't spend any time on it, and I have one slide here, Sergeet will go into a lot more detail I imagine, you have the fact that dark matter can actually be a coherent field. This is what the axion is. So it's a completely different idea from thermal dark matter, where the dark matter acts as a particle. The idea is that if you take a field in a potential and you just let it oscillate, its energy density is the kinetic term plus the potential term, and the pressure is the kinetic term minus the potential term. And a field oscillating in here, as Sergeet said yesterday, just has a cosine, it just oscillates. Now I take the time average of this guy, and cosine squared averaged over time is a half. So I plug that in here, and that gives me one half m squared a squared for the energy density, but lo and behold, the pressure is zero. If the pressure is zero, what is that the equation of state for? Cold dark matter. So that's the entire idea behind any type of field that's oscillating in a potential: a coherent field configuration that oscillates has zero pressure and some energy density, and it's just going to dilute with the volume. So he asked, wouldn't it lose coherence just like everything else that we know? In the early universe, it retains its coherence. Now, as structure forms in the universe, you would expect some winding of this phase-space sheet, and I'm going to let Sergeet spend more time talking about that. So why don't you hold those questions for the axion lecture, because they're exactly pertinent there. That's the only thing I'm going to say about coherent field configurations, okay? One slide. And I'm going to spend the remainder of these lectures talking about the case where dark matter behaves as a particle and not as a coherent field configuration. So we went through freeze-out, freeze-in, asymmetric dark matter, freeze-out-and-decay, misalignment. I didn't actually talk about compact object formation, for example Q-balls. I'm sure many of you have heard of Q-balls. These are examples of oscillating scalar fields, where you have these coherent field configurations setting the dark matter density. And there are hybrids of these things. You guys have heard about semi-annihilation, for example, but if you look a little bit closer at these mechanisms, you'll see that all of them have elements of these basic ingredients built into them. And that's why I want to emphasize these kinds of overarching paradigms for ways that you can set the dark matter density.
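To spell out that time-averaging argument in equations, this is just the one-slide computation written out:

\[
\rho = \tfrac12\dot\phi^2 + \tfrac12 m^2\phi^2, \qquad p = \tfrac12\dot\phi^2 - \tfrac12 m^2\phi^2,
\]

and for \(\phi(t) = a\cos(mt)\) we have \(\dot\phi = -a\,m\sin(mt)\), so using \(\langle\sin^2\rangle = \langle\cos^2\rangle = \tfrac12\),

\[
\langle\rho\rangle = \tfrac12 m^2 a^2, \qquad \langle p\rangle = 0 \quad\Longrightarrow\quad w = \langle p\rangle/\langle\rho\rangle = 0,
\]

which is exactly the equation of state of cold, pressureless matter: zero pressure, some energy density, diluting with the volume.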
Now I want to spend the larger chunk of this lecture going through the classic case, the one that has received by far the most attention in the literature, and that is supersymmetric dark matter. I want to talk in particular about how direct and indirect detection experiments have been designed to look for these particles. And then I'm going to say that what's happening now is that the standard supersymmetric dark matter candidate, in particular the minimal one, is being squeezed. So I want to talk about other types of dark matter candidates and the types of constraints that we can put on them from cosmology and astrophysics. But let's back up and talk about the standard SUSY dark matter, because this is the most studied case and there are several nice reviews in the literature on it. So supersymmetric dark matter is a classic. Okay, I guess I'm betraying my nationality here: when I think of a classic, somehow I think of a classic car. And the thing that's nice about it is that it has all the right ingredients that we need for dark matter, and they're there for other reasons. In particular, there's a neutral particle there that has all the properties we need to make a nice dark matter candidate, and it's stabilized by a symmetry which is there for other reasons besides just making a dark matter candidate. So there's some sense in which you get dark matter out of this theory for free. And I think one of the reasons why this paradigm of a really simple dark matter sector has had sway for so long, namely that the dark matter is a single stable weakly interacting massive particle that we can go and look for at the LHC and in direct and indirect detection experiments, is precisely because that is exactly what this paradigm predicts. The dark matter sector is actually very boring within supersymmetry. You have a dark matter particle, it freezes out, and once it freezes out it doesn't really do much, until perhaps the occasional interaction in one of our direct detection experiments, or maybe the occasional annihilation in the galactic center, but nothing else really happens. You're not going to see any deviations from ordinary cold dark matter structure formation. You're not going to see other types of dynamics happening in the dark matter sector. It's actually very boring. And I think this is one of the reasons why we as a dark matter community have gotten so focused on this very simple paradigm: it's what SUSY predicts, at least minimal SUSY. So let's back up and go through the very basic things that I think you already know at some level. In order to make a dark matter particle absolutely stable, you need some type of symmetry in the theory. Now, in the standard model, these are global symmetries, global symmetries plus electric charge, that stabilize things. So we actually have three stable particles: the proton, the electron, and the neutrino. The proton is stable by baryon number, the electron is stable by electric charge, and the neutrinos are stable by lepton number. Those are what we have in the standard model. In supersymmetry, the symmetry that does this for you, as many of you know, is R-parity. Now, you've already had lectures on supersymmetry, but bear with me for two slides. Every particle has a superpartner that differs from it in spin by a half. And this seems extremely minimal and nice until you start to look at it just slightly more closely.
And you find that when you look at it slightly more closely, the fact that you have not just the fermions but also their superpartners, which are scalars, means that you can write down interactions, consistent with all the gauge symmetries, that do bad things to the standard model. And those are these interactions here. Actually, we used these interactions yesterday: we had, for example, an up and two down quarks, this is a neutron, and we had all of these interactions, but we had them talking to the dark matter. If I don't have them talking to the dark matter, what happens? Well, now each one of these fields has not just the fermion but also the scalar superpartner. If I have only fermions, I can't write down three fermions as an interaction. On the other hand, with the superpartners I can write down an interaction that has three scalars, or an interaction that has two fermions and one scalar, from each of these terms in the superpotential. And what do those additional interactions do for you? Well, they wreak havoc. And the reason they wreak havoc observationally is that, for example, I can take two quarks and a scalar superpartner, the superpartner of the strange quark in this instance, and then I can take another one of these interactions, in this case with a lepton and two quarks, so this is coming from this term, and then I can just have the up quark come along for the ride. And this induces proton decay: proton goes to positron plus pi zero. Now, if you're comfortable just forbidding these couplings from being order one, then you can suppress that sufficiently that you wouldn't see the proton decay. But an experiment like Super-K puts very strong constraints on this kind of process, and our belief is that, in general, if you generate these, some of these couplings are going to end up being order one. So it's sort of a mystery why these terms appear to be small. Now, you can imagine that instead of having ordinary R-parity, you just add an interaction with the dark matter particle; that's what asymmetric dark matter does. But the other thing you can do is just write down a symmetry to forbid those terms. This is what particle physicists do: if you don't like something, you rationalize the result by putting in a symmetry. And this symmetry is not so bad, in the sense that it does come out of some UV-complete theories, and so you might imagine that when we understand better how to UV-complete the low-energy supersymmetric standard model, we would understand why the symmetry is there. But for the moment, we can just posit its existence and see what happens. So let's posit the existence of this R-parity. The R-parity of a particle is defined to be minus one to the power 3(B − L) + 2s, with B the baryon number, L the lepton number, and s the spin. So let's plug this in. Any and all standard model particles carry R-parity plus one: you have the leptons, the quarks, and the gauge bosons, and you can just plug in their spins and their lepton or baryon numbers, and you find that the R-parity is one. Whereas all the superpartners carry R-parity minus one, and that's simply because they have the same baryon and lepton numbers but differ in spin by a half, so that flips the sign of the R-parity.
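As a quick check of that definition:

\[
R_p = (-1)^{3(B-L)+2s}:\qquad e^-\ \big(B=0,\ L=1,\ s=\tfrac12\big)\ \Rightarrow\ (-1)^{-3+1} = +1,
\]
\[
\tilde e^-\ \big(B=0,\ L=1,\ s=0\big)\ \Rightarrow\ (-1)^{-3} = -1,
\]

and likewise every standard model state comes out \(+1\) while its superpartner comes out \(-1\).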
And so the consequence of this is that if R-parity is conserved, the lightest superpartner has to be stable. It's just the analog of the baryon and lepton numbers in the standard model that stabilize the proton and the neutrino; I just have this new symmetry. So there you go. It forbids those terms, LLE for example, because that operator has one superpartner, the slepton or the squark, and two standard model particles, the fermions. All of those interactions that we wrote down on the previous slide carry R-parity minus one, and therefore those interactions are forbidden. Problem solved. And the result is that the lightest supersymmetric particle, the LSP, is now stable. So if we further assume that the lightest one is neutral, then we have a good dark matter candidate. Now, these particles, you don't just have one, you actually have four that mix with each other. You know that within the standard model the gauge bosons mix, and the way we describe that is the Weinberg mixing angle. Here we have the superpartners of these guys: the bino, the wino, the superpartner of the neutral part of the W gauge bosons, and the up-type and down-type neutral Higgsinos. Now, in supersymmetry you need two types of Higgses, and I'm sure you guys know why. Yes, the superpotential must be holomorphic. Yep. In the standard model, what you normally do to write down mass terms for both the up-type and the down-type quarks is to use the Higgs and the Higgs star. In SUSY you're not allowed to write down phi and phi star in the superpotential, and so the solution is just to write down two Higgses. They're both SU(2) doublets, but they carry opposite hypercharge. And that means that you don't have just one type of superpartner of the Higgs, you have two. Each of these is a two-component fermion, not a four-component fermion. And they have a mixing matrix now that is set by some soft parameters, by a mu parameter, and then some parameters that are set by the standard model, and lastly this tan beta parameter, which is the ratio of the up-type to down-type VEVs of these Higgses, subject to the constraint that the sum of the squares of the VEVs is (246 GeV) squared. Okay. Yes, question? Why are the zeros there? So you're wondering why they don't mix in the same way as the standard model gauge bosons. The mixing of the standard model gauge bosons you can write down from the kinetic terms, and the kinetic terms for scalars are different from the kinetic terms for fermions. The kinetic terms for the scalars allow you to write down mixing terms there, whereas the kinetic terms for the fermions do not. That's the reason. Any other questions? Well, you also needed two Higgses just to be able to write down mass terms for both the up-type and the down-type quarks, and it also cancels the anomalies from the fermions. Other questions? All right. So I've said that this really is the mass matrix now for the things that are potentially a dark matter candidate. What's the thing that I left out as a potential dark matter candidate? Gravitino, yes. That's not the one I was thinking of, but yes, absolutely. It's the sneutrino. So you might expect that this sneutrino would be a most amazing dark matter candidate, and a priori it is. However, the supersymmetry transformation turns a neutrino into a sneutrino.
And therefore, since we have a neutrino-neutrino-Z interaction, we also have a sneutrino-sneutrino-Z interaction. And this interaction is fixed by the gauge couplings. So this interaction rate is completely fixed for the sneutrino, through the fact that neutrinos have weak interactions. Now what happens in that case? Well, I can have a sneutrino scattering through the Z off of quarks, and this you could potentially see in a direct detection experiment. So let me spend some time going through direct detection basics, and then I'll be able to tell you why the active sneutrino is no longer a good dark matter candidate. In dark matter direct detection, there are two basic types of interactions: spin-dependent and spin-independent. The spin-independent one is distinguished by the fact that it's coherent, in the sense that you couple to the entire charge of the nucleus. The best example of this is the Higgs boson: if I have a dark matter particle interacting with the nucleus through the Higgs, you end up coupling to the entire charge of the nucleus. And spin-independent experiments are, I think, in many ways coming into their prime. The reason I'm showing this plot, and even now it's slightly outdated because it doesn't show the latest LUX results, is that the solid lines show the constraint on the scattering cross section off of a nucleon, note, not the nucleus, an individual nucleon, as a function of the dark matter mass. It starts up here at 10 to the minus 39 centimeters squared, keep that number in your mind, and goes all the way down to this neutrino floor. I shouldn't call it a floor, it's a background. Life becomes much more difficult once we reach this background, at cross sections around 10 to the minus 48 centimeters squared, and what the experiments are currently probing is on the order of 10 to the minus 45 centimeters squared. You can see the planned experiments: LZ is an experiment that is funded and currently being built, and you can see that they're going to push almost all the way down to the level of this coherent neutrino background. And we're going to talk about what this orange blob is right here. This is actually a prediction, and I say prediction in quotes because it's really a target; let's say it that way. You can also see that these experiments lose their sensitivity right around 10 GeV. So those are the couple of things I'd like you to pay attention to on this plot. Why is it that they lose their sensitivity around 10 GeV? The reason is just kinematics. This is really trivial, non-relativistic scattering: I quite literally have a dark matter particle, which is represented as one sphere here, and a nucleus which remains coherent, I never pull it apart. So this is completely elastic scattering in the standard picture. I just write down the fact that the nucleus is imagined to be sitting still, the dark matter particle comes in, there's some final momenta for the nucleus and the dark matter, and I impose energy and momentum conservation. And when I do that, just plugging things in, what you find is that the momentum transfer is set by the reduced mass of the dark matter with the nucleus, not the nucleon, the nucleus, times the velocity of the dark matter.
So the velocity of the dark matter in our Milky Way galaxy is typically on the order of 300 kilometers per second, which is about 10 to the minus 3 times the speed of light. So it's highly non-relativistic, but still moving around at a pretty good speed. Now, if I just plug in for a 50 GeV target, and imagine that the reduced mass is dominated by the nucleus mass, what I get is an energy deposited in one of these interactions on the order of 100 keV. And what you can imagine, just by scaling this, is that if instead I have 10 GeV dark matter coming in and interacting with the nucleus, this recoil energy drops to more like 10 keV or below. So the effect you're seeing here at low masses is just the fact that these experiments, as currently designed and built, have energy thresholds around 10 keV, and that's the reason they don't have sensitivity to dark matter lighter than about 10 GeV. Now, why did they design the experiments that way? I've heard people say for a long time that it's hard to build experiments with lower thresholds than that. But the reason they're looking for dark matter there is because that's where theorists told them to look for dark matter. So if you have a dark matter candidate that's below 10 GeV, at the moment it's tough luck. We'll come back to that later, because the panorama is changing right now in terms of what experiments are being built. Now let's apply this to scattering through the Z boson, because we want to find out what happens to this sneutrino dark matter. If you compute the scattering cross section through a Z boson off of a nucleus, this is the coherent object, you get the coherence as promised: the amplitude goes like Z, the proton number, times the coupling to the proton, plus A minus Z, the neutron number, times the coupling to the neutron, and you take that quantity squared. That's not what's reported on the y-axis of that plot, though; what's reported is sigma p, and you basically divide through by this factor in order to get out sigma p. Then you put in a velocity distribution for the dark matter: you take this cross section, which is a differential cross section with respect to the recoil energy, weight it with this Maxwell-Boltzmann velocity distribution, and do an integral over phase space. There's a factor of v, which is just the flux, and then you weight by the number of targets in your experiment and the number density of dark matter. When you do that, you get a result that depends on the dark matter mass: you get some rate as a function of the energy. Here this is something called electron-equivalent energy; for our purposes it doesn't really matter. The point is you tend to get recoil energies around 10 keV, depending on the mass of the dark matter. Now let's just plug in and compare. Take a typical scattering cross section through the Z boson off of a nucleon, because that's what's on the vertical axis. You just put in, back of the envelope, g weak to the fourth, times the reduced mass of the dark matter with the nucleon squared, over 4 pi times m Z to the fourth. Pretty simple. Plug that in. What do you get? 10 to the minus 39 centimeters squared. You can check that number for your own edification, and the point I want to make here is that 10 to the minus 39 centimeters squared is way up here.
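If you do want to check those numbers, here is a minimal back-of-the-envelope script. The xenon-like nucleus mass, the factor of two for the maximum recoil, and the effective coupling that folds in the order-0.1 Z-charge factors are my assumptions for illustration, so expect agreement with the quoted numbers only at the order-of-magnitude level:

```python
import numpy as np

GEV_TO_KEV = 1e6          # 1 GeV = 1e6 keV
GEV2_TO_CM2 = 3.894e-28   # 1 GeV^-2 = 3.894e-28 cm^2

def max_recoil_keV(m_chi, m_nuc=122.0, v=1e-3):
    """Maximum elastic recoil E_R = 2 (mu v)^2 / m_nuc, with mu the
    dark-matter/nucleus reduced mass (all masses in GeV, v in units of c)."""
    mu = m_chi * m_nuc / (m_chi + m_nuc)
    return 2.0 * (mu * v) ** 2 / m_nuc * GEV_TO_KEV

for m in (50.0, 10.0):
    print(f"m_chi = {m:>4.0f} GeV: E_R up to ~{max_recoil_keV(m):.0f} keV")

# Unsuppressed Z exchange off a nucleon: sigma ~ g^4 mu_n^2 / (4 pi m_Z^4).
g_eff = 0.3               # weak coupling times typical Z-charge factors (assumed)
m_Z, m_n = 91.2, 0.94     # GeV
mu_n = m_n                # reduced mass ~ nucleon mass for heavy dark matter
sigma = g_eff**4 * mu_n**2 / (4 * np.pi * m_Z**4) * GEV2_TO_CM2
print(f"sigma_Z ~ {sigma:.0e} cm^2")  # lands near the quoted 1e-39 cm^2
```

The 10 GeV case comes out around a keV, well below a 10 keV threshold, which is the loss of sensitivity at low mass described above.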
So if your expectation is for weak-scale dark matter, which is usually what we expect within supersymmetry, so dark matter that's 10 GeV or heavier, this sneutrino is excluded by something like six orders of magnitude. The thing I want to emphasize is that, strictly speaking, the weakly interacting massive particle has been dead for a long time: if you scatter through the Z boson unsuppressed, you are dead by six orders of magnitude. That is not actually what we are going after in direct detection experiments right now. What we're going after in direct detection experiments is this blob here, and that's what we're going to talk about next. Now, people still talk about sneutrino dark matter, but when they do, what they generally do is mix this active sneutrino with another particle that the MSSM has been extended by. So you add a sterile neutrino, you put in its superpartner, and you mix with that superpartner. If you make the dark matter predominantly sterile sneutrino, with a little bit of active sneutrino, then you suppress that scattering cross section by that mixing angle squared, and you can push the scattering cross section from this 10 to the minus 39 down really as far as you please. So just keep that in mind: when people talk about sneutrino dark matter, they're not talking about the active one. They have to do this substantial suppression in order to make it consistent with direct detection experiments. Questions about that? Yes. So in general, it would depend on your UV completion. There are a couple of things. One thing that you do run into with a sneutrino is the fact that it couples to the Z, and the Z will go to a sterile, sorry, an active sneutrino pair, and that invisible width of the Z is pretty strongly constrained. So you can't just make the active sneutrino arbitrarily light; that's what's preventing you from dialing the thing arbitrarily low, purely from a phenomenological point of view. There's also the other point that, if you want to write it down from a UV-complete point of view, you might wonder: all the other superpartners are pretty heavy, why does this one just happen to be designer-low? But okay, I'm pretty sure you can write down a model without working too hard where that happens. The phenomenological constraint is just the invisible width of the Z. Between 10 and 100 GeV, sorry, what was your question, between 10 and 100? Yeah, we're focused on this region up here, and the reason we're focused on that region is really because theorists have said this is where you should look for dark matter. I mean, we've been focused on the weak scale; we're finally starting to pull ourselves away from that. But this blob should not be taken too literally. As a matter of fact, I will go through qualitatively where that blob comes from. The fact is that these experiments have been focused from 10 GeV on up, so in my mind there's sort of a dividing line in the experimental constraints below 10 GeV and above 10 GeV. Wait, wait, so later. All these plots assume that the local density of dark matter takes the standard halo value. If we live in a bubble with a very small amount of dark matter, are there any other constraints on dark matter from direct detection?
So let's say that the local density of dark matter for some reason is very small; then all those curves just move up. Well, you can't make the local density arbitrarily small, unless you happen to live in a void, and that's very hard. We do have local constraints on the density of the dark matter, just because we do know some things about the gravitational potential locally. You can't just send it to zero. But you can definitely have at least an order of magnitude variation in the local density of dark matter from what we take to be the smooth background of dark matter in our galaxy. But in the very wicked case that we're sitting in a local underdensity of dark matter? Yeah, but just look at how many orders of magnitude this plot spans; it's a log plot with many orders of magnitude. But certainly people do play that game. So you should always keep the caveats in the back of your mind about what theoretical assumptions are going into making this plot, and how hard you can push on them to try to evade these constraints. Those are definitely among the right questions to be asking. Okay, so this is the reason why active sneutrino dark matter doesn't make a good dark matter candidate, even though mixing in a sterile sneutrino can improve the situation. So what about the neutralino? The neutralino, as we said, also couples to the Z boson. So why isn't the neutralino dead by this same constraint? Well, it's a technical reason, really. The neutralinos are two-component Majorana fermions, and you can write down all the interactions with the Z boson which are consistent with the symmetries. For scattering through the Z boson, you have the pure vector interaction of the dark matter, and that turns out to be identically zero, you can show. There is the interaction which is the pure axial-vector part, and this interaction is purely spin-dependent: it doesn't couple to the charge of the nucleus, it couples to the spin, and that's just what a gamma mu gamma five interaction does. And then you can mix the vector with the axial vector, and if you do that, you'll find that this interaction goes like v squared. Since v is 10 to the minus three, that means this interaction gets suppressed by a 10 to the minus six factor. So the one that you would be worried about vanishes identically, and the others are suppressed. And this turns out to save neutralino dark matter from already being dead. Yeah, so there are cases where people have tried to construct Dirac neutralino dark matter, by adding in an additional component to marry up with the two-component guy. You can construct Dirac neutralino dark matter, but if you do that, then you have to worry about these constraints again. Other questions about this? In general, though, you expect that when you diagonalize that mass matrix, you're going to end up with four different states with four different masses. If you want to actually make a pair degenerate, to make the dark matter Dirac, you have to do a little bit of model-building work. You can do it, but it's not what happens in the minimal case. Other questions?
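Written as operators, the Z-exchange story above is, schematically:

\[
\underbrace{\bar\chi\gamma^\mu\chi}_{=\,0\ \text{(Majorana)}}\,,\qquad \underbrace{\bar\chi\gamma^\mu\gamma^5\chi\ \bar q\gamma_\mu\gamma^5 q}_{\text{spin-dependent}}\,,\qquad \underbrace{\bar\chi\gamma^\mu\gamma^5\chi\ \bar q\gamma_\mu q}_{\sigma\ \propto\ v^2\ \sim\ 10^{-6}}\,,
\]

so the coherent, spin-independent piece that killed the active sneutrino simply isn't there at leading order for a Majorana neutralino.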
So for the neutralino, the other type of interaction that's there generically is scattering through the Higgs boson. If you look at scattering through the Higgs boson, generically you get an order-one, or not too small, coupling of the neutralino to the Higgs boson. It looks like the lower part of the quarks got cut off here on the slide. But the coupling of the Higgs boson to the quarks is suppressed by the fact that the Yukawa couplings of the light quarks, which are what predominantly make up the nucleus in terms of quark content, are small. So the dominant coupling of the Higgs to the nucleus is actually coming through a loop of heavy quarks coupling to the gluon content of the nucleon. You compute the anomaly diagram to extract what the effective coupling of the Higgs to the nucleon is, and there's a formula that describes it; I would actually recommend that you go and read the classic paper on this, it's a really nice paper. The end point is that you can just plug in this coupling of the Higgs to the nucleons. Now, it's going to depend on tan beta, this ratio of up-type to down-type VEVs. The larger you make tan beta, the lower the down-type VEV, which means the larger the coupling you get to bottom quarks, and that tends to boost the coupling to the nucleon. So when you plug these things in, the cross section you get out goes like tan beta squared, because the coupling to the nucleon is proportional to tan beta. You get something that depends on the mass of the Higgs, and then there's the size of the coupling of the neutralino to what's imagined here to be a dominantly down-type Higgs. Put that all together, plug it in, and the numbers that come out are typically on the order of 10 to the minus 42 centimeters squared for large tan beta. If instead you imagine a standard-model-like Higgs, tan beta of 1, you put in 125 GeV and leave everything else about where it is, you tend to get down to more like the 10 to the minus 45 centimeters squared number. So what the direct detection experiments are probing right now is actually the neutralino coupling to the nucleon through a sort of standard-model-like Higgs, with a tan beta which is not too large. That's what the direct detection experiments are getting at right now. If you just put this on the plot, with tan beta of 1 and a 125 GeV Higgs, you land somewhere right around here, which is what the experiments are probing. Now you might say, well, why haven't they discovered anything? Or does this mean they're necessarily going to discover something? What about this blob that's leaking all the way down to the 10 to the minus 48, 10 to the minus 49 centimeters squared level? Well, that's because I haven't yet put in all the gory details of all the ways that the neutralino can couple to the Higgs. Again, all of these couplings are constrained by supersymmetry: every time I write down a coupling of the Higgs to a standard model particle, that interaction is constrained, and that tells me about the interactions of the superpartners which can occur. For the case we're interested in, what that means in particular is that the neutralino, we said before, is a mixture of bino, wino, and up- and down-type Higgsino. Now I take the standard model interactions, I supersymmetrize them, and then I look at what kinds of interactions can occur.
Well, what happens is, if I want to look at the interaction of the neutralino with the Higgs, either the up-type or the down-type Higgs, then what needs to happen is the wino goes to an up-type Higgsino and talks to the Higgs, or the wino talks to a down-type Higgsino through a down-type Higgs, or a bino talks to an up-type Higgsino through the up-type Higgs, or a bino talks to a down-type Higgsino through a Higgs. The point I want to make is that if I have a pure bino, a pure wino, or a pure Higgsino, it does not couple to the Higgs at tree level. At tree level, the coupling to the Higgs vanishes identically. And in that case, I will not get a direct detection observation through the tree-level scattering process. Now, people have started computing one-loop cross sections for these pure states, and in some cases you can generate one-loop cross sections which are at the 10 to the minus 48 to 10 to the minus 49 centimeters squared level. Those are awfully small, but the point I want to make is that even though generically you would expect the neutralino to scatter through the Higgs at tree level, it does not always happen. You can tune away those couplings. So what if the coupling to the Higgs does vanish at tree level? Are we just sunk? The point I'm making with these graphs shown here is that even if the neutralino does not scatter off of a nucleon through the Higgs boson, you can have annihilation processes that happen, and you can look for these rare annihilations of dark matter particles. You can have, for example, a neutralino annihilate through the superpartner of one of the fermions to pairs of those fermions, or you can go through the Z boson or through the pseudoscalar Higgs. Or, and these bottom processes tend to dominate for the case of pure wino or pure Higgsino dark matter, you can have dark matter annihilating to W plus W minus or to ZZ. So if these bottom processes are dominating, what do you see? Well, where's a place that I should look for dark matter annihilating? Where's a particularly opportune place to look? Anyone? What? Yeah, the galactic center, because there's a particularly large amount of dark matter there. What's the disadvantage of looking towards the galactic center? It's the galactic center. The galactic center is a complicated place: there's a black hole there, there are a lot of baryons, big backgrounds, in other words, which we often don't understand very well. Where else might we look for dark matter? What? Dwarf spheroidals. So if I look in our galactic halo, there are subhalos, the most massive of which host dwarf spheroidal galaxies, and I can look for dark matter there. Those are nice because they tend not to have much background; they tend to be pretty baryon-poor. So from that point of view, I don't have to understand my backgrounds so well in order to get a constraint from them. Now, what happens when I have dark matter particles annihilating to W or Z gauge bosons? The bottom line is that at the end of it I get basically a hadronic mess. Imagine what happens at the LHC when you produce a pair of W or Z bosons: you end up with a mess of pions and nucleons and photons and leptons, you get these jets of stuff. And the same thing happens when dark matter annihilates in the galactic center or in a dwarf spheroidal galaxy and produces W and Z bosons.
And at the end of the decay chain, I end up with photons from the pion decays, with protons, and also with photons radiated off the charged leptons. These are all things I can look for, and so you can, in principle, put very strong indirect detection constraints. And in fact we have pretty good data on this. So this is the annihilation cross section, and this plot is for a line in particular: I have dark matter annihilating, in this case through a loop of charged particles, to a photon line, and you can put constraints on this annihilation cross section. Do you remember what we said yesterday the thermal annihilation cross section was? Just to normalize your sense of how good these constraints are. Three times 10 to the minus 26 centimeters cubed per second; that's one benchmark you should always keep in your mind, the thermal annihilation cross section. And three times 10 to the minus 26 centimeters cubed per second is all the way up here. So does this kill thermal dark matter? The answer is no. There's a loophole. What is the loophole? It's not really a loophole, it's a giant archway that you can walk through: this process here is specifically for a line. This is a subdominant annihilation process; it's not the one that's setting the relic density. It tends to be a small fraction of the total annihilation branching fraction. But nevertheless, pure wino and pure Higgsino still have a substantial annihilation rate through this process, and in fact you can go and calculate it, which people have done. It turns out that you need to resum large logs; we're not going to go through what that is. Theoretically, dark matter is becoming a more and more interesting subject just in terms of what you can do calculationally. The thing I want to point out to you here is this blue area. Now look at the dark matter mass: we're zooming in on the one TeV sort of mass range. This data here in red, that's the HESS data, is being shaded in here with blue, and everything above it is now ruled out. What they've done is take the pure wino case and put in the corrections to it, and the answer they get is this dashed line here. The other piece of information I want to give you is that the thermal wino has a preferred mass that sits right around here. So if you just take where that's at, you can see that it's in tension with the data, with the constraint. Now, what might be my way around this constraint, or what is the theoretical uncertainty that I should keep in mind? This is data taken from the galactic center, so it depends on the dark matter profile in the galactic center. People simulate these things, and the profile is highly peaked towards the galactic center, but because those simulations are dark-matter-only simulations, we're not exactly sure what the profile is very near the galactic center. So there's some uncertainty in the overall normalization of this dashed line, and you can argue about how much that uncertainty is; it could be anywhere from a factor of a few up to an order of magnitude or more. However, you can see that this projected line here is going to cover the prediction by an order of magnitude, and at that point this starts to look increasingly disfavored.
So this projection is actually what CTA, the Cherenkov Telescope Array, will be able to do in the next several years in terms of constraining a line, so we'll actually be able to put a fairly substantial constraint on wino dark matter. Higgsino annihilation cross sections are smaller and are going to be harder to constrain. Yes? I didn't understand what you said. How reliable is this data? I mean, when two dark matter particles annihilate, a whole bunch of particles come out. At the LHC, we have dedicated detectors to detect them; what about here? Right. So this constraint is coming from photons: it's dark matter annihilating to gamma gamma plus Z gamma. And photons have the fortunate property that they point. So if we look towards the galactic center, those photons are coming from the galactic center, and we can put a constraint on the flux from dark matter annihilating there just by pointing our telescope at the galactic center. Now, the things you have to contend with are the fact that the dark matter density profile is uncertain there, and the fact that there are huge backgrounds from ordinary baryonic processes that produce photons. So this is a challenging region to look at. But nevertheless, if all you're interested in doing is putting an upper bound, and furthermore this is a line process, it's harder for ordinary baryonic processes to produce something that looks like a peak. So none of these things are without caveats: you have to think about what exactly you're constraining and how strong that constraint actually is on the fundamental physics process, not unlike what happens at the LHC, even though there you can get a little bit better handle on your systematics. So that was dark matter annihilation to a photon line. As I said before, there's also annihilation to Ws or Zs. If you compute this, as I indicated before, you get this three times 10 to the minus 26 centimeters cubed per second, parametrically, for dark matter which is around the two to three TeV mass range. It depends on whether it's wino or Higgsino, because different factors enter in those vertices, but parametrically that's what enters. Now, the result is that pure wino and pure Higgsino are constrained because of their interactions with the W and Z bosons: the fact that they annihilate to Ws and Zs, the fact that they annihilate to pairs of photons, and, the other thing that I didn't actually go through, they have one-loop scattering cross sections with nucleons even though the tree-level process disappears. On the other hand, a pure bino, at least at tree level, escapes, because it does not have the annihilation process to the W and the Z, and it doesn't have the same annihilation to photons. So a pure bino is harder to constrain for this reason. On the other hand, this does not come for free either, and the reason is that the bino has a Higgsino component set by mu. To decouple the Higgsino component, you make the mu parameter large in comparison to the mass parameter that's setting the bino mass, and that's M1. So you push mu well above the weak scale while keeping M1 around the weak scale, in order to get weak-scale dark matter. You can do that, but you should be aware that the same mu parameter enters into the Z boson mass. So this is the formula for the Z boson mass in the MSSM.
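In standard notation, this is the usual tree-level electroweak-symmetry-breaking relation, reconstructed here since the slide itself isn't reproduced:

\[
\frac{m_Z^2}{2} \;=\; \frac{m_{H_d}^2 - m_{H_u}^2 \tan^2\beta}{\tan^2\beta - 1} \;-\; \mu^2\,.
\]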
And you can see that it depends on the mu parameter as well as on the soft masses for the up-type and the down-type Higgs. So what that means is that if I push mu well above the weak scale in order to decrease the Higgsino component of the bino, I simultaneously pay a fine-tuning price in the Z boson mass. You can do it, but theoretically it's maybe not that pleasant. Questions about this argument? Yes. Sorry, what? In the neutralino annihilation process, neutrinos are also produced, right? So there would be highly energetic neutrinos. Yes, that's right. So you can also use neutrino telescopes. If the annihilation cross section is large enough, you can actually use a neutrino telescope. For example, people talked about PINGU, back when people were discussing dark matter annihilating leptonically with cross sections on the order of 10 to the minus 24 centimeters cubed per second, being able to put a constraint on that process. So you're absolutely right that you do get leptons, and you do get very energetic neutrinos out of these annihilations. The constraints, because neutrinos are hard to detect, are not as strong as what you can get out of photons, but there are models for which it's relevant. So how does this look at the end of all this? Well, I didn't actually talk about the fact that not only does pure bino, wino, or Higgsino escape, at least at tree level, but there's also the fact that I can tune the amounts of the wino, bino, and Higgsino components of the neutralino and get a cancellation in the coupling to the Higgs. These are called blind spots. I don't want to go through this plot in detail, but the point I want to make here is that this is tan beta versus the mu parameter, for mixed bino-Higgsino dark matter, and the shaded region is what they expect the XENON1T experiment to be able to do. And this is the region that would escape, essentially because you're tuning away the coupling to the Higgs boson. So what I want you to take away from this is that detecting neutralino dark matter with direct and indirect detection experiments is not a home run. You don't kill all the parameter space. Even if you get all the way down to the neutrino background, you still have parameter space, you still have ways to escape. On the other hand, this paradigm is becoming increasingly tuned, and we as a community are going to have to make a decision about how far and how long we're going to put our focus almost solely on neutralino dark matter as the candidate in direct detection experiments, and also in indirect detection experiments. At what point do you stop just saying, okay, I'm going to build bigger and bigger experiments, and instead start saying, maybe I'm going to start looking elsewhere, at alternative models? Tomorrow we're going to spend the majority of the lecture on alternative models. SUSY is not the only model, by far. It is very nice from the point of view that it's highly predictive and tells us where to look: it sets the relic abundance in the right ballpark of what we need to match observations, and you have a symmetry, which is already there to prevent proton decay, that stabilizes the dark matter. But there are other weak-scale dark matter candidates out there.
I'm not going to spend time on them, simply because I had to make choices about what to talk about, and supersymmetry is by far the one that's most talked about. People have spent some time getting dark matter out of a Kaluza-Klein symmetry. You can also have sterile neutrino dark matter where you just put in a chi phi L H term, where the chi and the phi are charged under a Z2 symmetry, and that stabilizes the dark matter. So L and H would both carry charge plus one under this Z2, and chi and phi would carry charge minus one, and that Z2 symmetry by itself can stabilize the dark matter. Really, I think this is a time when you should just feel free to invent your own model. I don't think we should feel too constrained by the theoretical confines of a minimal model; what we really need to be doing is writing down and thinking about different types of dark matter candidates that give qualitatively different types of signatures. One other constraint that I wanted to mention before I stop, or at least before I take a few questions: what do we mean by massive dark matter? Well, massive dark matter, at least for the case where the dark matter is thermalized and behaves like a particle, typically means that the dark matter particle has to be heavier than about a keV. And the reason it has to be heavier than a keV is essentially that relativistic and non-relativistic matter behave differently when they form structure. Relativistic matter, like neutrinos, free-streams. It's hard to make it clump, because if you try to drag it into a gravitational potential well, its natural tendency is just to flow right back out. And in order to get structure that forms the way we see it in the universe, we actually need dark matter to clump, which is to say we need it to be pretty highly non-relativistic. You can make that statement quantitative, which people have done quite a lot of, by looking at the clumpiness of the dark matter. Remember, yesterday we looked at the power spectrum; it was normalized a little differently than this one is, but it was the clumpiness of dark matter, and we said that if MACHOs were heavier than about a thousand solar masses, you would be able to see them in terms of their clumpiness. Well, this is the exact opposite: instead of causing the dark matter to clump more, the hotter, which is to say the lighter, the dark matter, the more you suppress the clumpiness of the dark matter on small scales. So here again is the wavenumber, with smaller scales going this way. This is how the power spectrum of cold dark matter behaves, from the Lyman-alpha forest; we have measurements of this power spectrum out to about 10 inverse megaparsecs. And you can see that you get this suppression, and you get more of a suppression, on larger scales, the lighter the dark matter particle is, which is to say the lighter the dark matter is, the harder and harder it is for it to clump. So this tells us that, in general, dark matter that behaves like a particle needs to be heavier than about a keV. Now, why do axions escape? Or why do really light scalars escape? Well, they're not individual thermalized particles in the way these are; they're coherent field configurations, and those coherent field configurations are much colder than what you get for an ordinary particle dark matter candidate.
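As a rough parametric version of that bound, an order-of-magnitude sketch assuming a thermal relic, not the number you'd get from a careful Lyman-alpha analysis: a thermal relic of mass \(m\) stays relativistic until \(T \sim m\), and free-streams a comoving distance

\[
\lambda_{\rm fs} \;\sim\; \mathcal{O}(1)\,{\rm Mpc}\times\frac{\rm keV}{m}\,,
\]

erasing power on scales below \(\lambda_{\rm fs}\). Since the Lyman-alpha forest measures the power spectrum out to \(k \sim 10\) inverse megaparsecs, i.e. down to scales of order a tenth of a megaparsec, a mass well below a keV would wipe out structure we actually observe, which is the statement that thermal particle dark matter must satisfy \(m \gtrsim {\rm keV}\).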
So that is, I think, where I'm going to stop for right now and maybe have a few more questions.