Thank you, and thank you very much to the organizers for the invitation to be here today. It's really a great pleasure to be able to come out here to visit. I'm opening here with an image of the gamma-ray sky from Fermi, and the purpose of the talk today is going to be one area of this image, specifically right near the galactic center, where for some time now there's been an observation of an excess of GeV gamma rays. It's caused a considerable amount of excitement, because it could possibly be interpreted as a dark matter annihilation signal. What I'm going to be discussing today is a new analysis of this excess that's essentially trying to characterize how clumpy the photons in the excess look. The general formalism and philosophy for the analysis was laid out in this paper from last December with Samuel Lee and Ben Safdi, and the data analysis came out just a few weeks ago in this paper with Samuel Lee, Ben Safdi, Tracy Slatyer, and Wei Xue.

So indirect detection provides a very promising avenue for dark matter discovery. For these types of searches, we're looking for dark matter that's annihilating to produce standard model final states. If the annihilation goes directly into photons, such as in this case here, then we expect to see a line in the energy spectrum of all of the photons that we're observing, where the location of the line, the energy of the line, is given by the mass of the dark matter. In the case where the dark matter annihilates to other standard model states, like the W, the Z, or some quark, these states will shower.
They'll hadronize and produce photons in that shower, so you lose some information in that showering process, and essentially the excess of photons that we would observe would be a continuum, so no longer a line, but you would still see an excess above the expected background.

Now, the Fermi Large Area Telescope is at the moment one of the best probes for high-energy gamma rays from dark matter annihilation. It was launched in June of 2008. It scans roughly the whole sky every three hours and is sensitive to a very wide range of energies, from down to 20 MeV all the way up to even a little bit greater than 300 GeV.

So this here is an image showing the photons observed by Fermi in the inner galaxy. Each panel here spans plus or minus 20 degrees on each axis around the center of the galaxy. The galactic plane is masked, so everything that's a dark block is just the mask. And each panel here is showing photons in a different energy range: the first one here is 0.5 to 1 GeV, then 1 to 3 GeV, this is 3 to 10, and 10 to 50. In all the panels you see that there is a bright spot as you get closer to the galactic center. However, you should note that the scales are different in each of the panels. In particular, here the scale goes up to 20, whereas here it's just above four, and here it's just above 1.5. So the excess that has been observed is dominated mainly by photons in the 1 to 3 GeV range. You can see that it extends fairly high above the plane; in particular, it goes up roughly ten degrees above the galactic center. The number of photons observed is about ten percent of the total flux in this region of the sky. This is not a statistical fluctuation.
There are a lot of photons there. There's been considerable study of this excess, and convergence on the fact that there certainly are more photons here than we know how to explain. The question has really come down to how large the systematic uncertainties are. This area of the sky has a lot of astrophysics going on; trying to characterize the systematic uncertainties from those astrophysical backgrounds is a big task, and is really the main thing that we need to get under control before we can get at the heart of what's going on here.

So this is showing the energy spectrum for the photons in this excess. You can see that it peaks roughly around, slightly below, 2 GeV or so. The data points are shown here, and the solid line is the fit, assuming dark matter annihilation and an NFW profile for the dark matter density distribution. In general, the morphology of the signal is actually consistent with what you would expect for dark matter annihilation. It's very symmetric.
It's centered on Sagittarius A*, and, like I said, the intensity fall-off of the excess is consistent with what we expect from N-body dark matter simulations for the dark matter case.

Now, the biggest issue is whether or not this might be coming from diffuse emission, from, let's say, cosmic rays in the center of the galaxy. This paper, which came out last year, was looking at what happens when you vary over a wide range of possible models for the diffuse emission, and in all cases they ended up finding that there was still evidence for an excess. Essentially the models spanned anywhere between the dotted lines here; the dotted lines are an envelope of all of the spectra that they picked out with these different diffuse models. So it does seem like the signal is fairly robust to the modeling of the diffuse background.

And this is just showing what the best-fit dark matter candidate is. If you have dark matter annihilation to b b-bar, the signal is best fit by roughly a 30 GeV dark matter mass and a close-to-thermal annihilation cross section, and all of the different studies that have been done tend to find values that are very close to this. So that's fairly consistent. And then this is the fit for the slope of the inner profile of the dark matter density, again consistent with results from N-body simulations.

Now, because this signal is so intriguing and has potentially extremely important implications for dark matter, it's very important to consider all possible explanations for it. One alternative explanation, which is really going to form the heart of my talk today, is whether or not the excess can arise from a population of unresolved point sources in this region of the sky. In particular, what I want to do is present a model-independent way of testing this hypothesis, which takes advantage of photon count statistics.
So to give you some intuition for what I mean by photon count statistics, I want to start with a toy example. Shown here is just some Monte Carlo of what the best-fit dark matter signal would look like. This is our 30 GeV candidate; it annihilates to b b-bar, and it's shown here 20 degrees by 20 degrees. Here's the galactic center. I haven't superimposed any backgrounds or anything here; this is just the dark matter signal.

And this is what a collection of point sources might look like, where we've chosen the spatial distribution of the point sources to fall off in intensity in the same way that you would expect for an NFW density. There are a few things you can probably see just by eye here that are similarities, and a few things that are clearly different. For example, in both cases the distribution is fairly spherically symmetric, and the fall-off with radius from the galactic center is the same. But then there are obvious differences: for example, the point sources are clearly more clumped together in the sky; there are more hot pixels and cold pixels relative to the dark matter image.

So let's focus on a particular area in both of these. In the dark matter image, what I want to do now is just go into the image, count the number of photons that I see in every pixel, and make a histogram of that. So that's here: number of pixels versus photon count. When I do that for a diffuse source like dark matter, there isn't a huge amount of variation from pixel to pixel. There will be some pixels with one photon, some with none, but, like I said, not very much variation; it might look something like that. If I do the same exercise now, but assuming that the source is these point sources, there's going to be more variation, and in particular I'm going to hit pixels that have a lot more photons in them, and I'm also going to hit more cold pixels, right?
So if I make this distribution, I'm going to expect to have a tail here at high photon counts, and also more pixels that have zero photon counts. So the distribution of photon counts looks different for sources that are diffuse, like dark matter, versus sources that are not diffuse. And you can work this out fully analytically: what you'd find is that the photon count distribution is Poissonian for diffuse sources, and you get the exact non-Poissonian form for this case here with the point sources. So the goal of our analysis is to take advantage of these differences to distinguish whether the photons in the galactic center look more diffuse or not diffuse.

Let me take a step back and explain the approach that the standard dark matter searches at the galactic center take. This is often referred to as the template analysis, and what's done is that you start off with spatial templates that describe just the spatial distribution of the photons you expect from a given source. For example, here I'm showing two different templates: one is showing the distribution of photons you'd expect for the diffuse background, and the other is showing the distribution of photons you'd expect for dark matter distributed according to an NFW profile. What you do next is you choose a pixel p, and you just count up the number of expected photons for each of these components. So I'd count up the number of photons in this pixel here coming from the diffuse background, that gives me this number; count up the number here in the NFW dark matter template, that gives me this number; and then summing the two gives me the expected number of photons in a given pixel. The probability of observing these photons in that pixel is just given by a Poisson distribution. Once I can write down this probability, I can do a likelihood analysis, do best fits, and pull out how much dark matter is there, et cetera.

What we're doing that's different is that we're adding additional templates that spatially look the same but that are described by different photon count statistics. For example, here what I've done is added an additional template that would describe NFW point sources. Spatially it looks exactly the same as the NFW dark matter template, but the statistics of the photons in this template are going to be different: they're not Poissonian, versus the statistics in this template.

Now, writing down the probability for this is going to be more complicated than just writing down the Poisson distribution. We've described this in a lot of detail in our first paper, which builds on formalism that was first developed by Malyshev and Hogg a few years back. I'm only going to outline it very briefly. You can write down a total generating function for the number of photons in a given pixel, where this total generating function is a product of the generating function describing the Poisson fluctuations and the generating function describing the non-Poissonian fluctuations. From this, following standard procedure, you take this derivative, and that allows you to recover the probability of observing k photons in a given pixel. So in the limit where I do not include this template, I do not have this term here in the generating function, and when I go to do this derivative, I recover completely the Poisson distribution. Adding in this additional term complicates it, and then I get a different distribution that can properly account for the non-Poissonian fluctuations.
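To make the toy example concrete, here is a minimal Monte Carlo sketch (not the analysis code from the paper, and with illustrative numbers): a diffuse map gives Poisson-distributed pixel counts, while a population of unresolved point sources with the same mean gives a compound-Poisson distribution, with more empty pixels and a long tail of bright ones.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix = 100_000
mean_counts = 5.0  # expected photons per pixel, matched between the two maps

# Diffuse emission: each pixel's count is an independent Poisson draw.
diffuse = rng.poisson(mean_counts, n_pix)

# Unresolved point sources: a Poisson number of sources per pixel, each
# contributing a Poisson number of photons (a compound Poisson process).
sources_per_pixel = rng.poisson(0.5, n_pix)
photons_per_source = 10.0  # chosen so the means match: 0.5 * 10 = 5
point_src = np.array([rng.poisson(photons_per_source, k).sum()
                      for k in sources_per_pixel])

# Same mean, very different one-point statistics: the point-source map
# has many more zero-count pixels and a much larger variance.
print(diffuse.mean(), point_src.mean())
print((diffuse == 0).mean(), (point_src == 0).mean())
print(diffuse.var(), point_src.var())
```

Counting how often each photon number occurs across pixels, as in the histograms on the slide, would show the Poisson curve for the diffuse map and the broadened, heavy-tailed curve for the point-source map.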
All right, so once we can write down that probability, we can do a Bayesian analysis to find the posterior distributions for the free parameters in the model. There are a few parameters that are going to be of particular interest to us today: the first is the normalization of the dark matter component, and the second is the normalization of the point source component.

Additionally, one thing I do want to emphasize is that we are not making any assumptions as to the nature of the sources. What we call a point source is any sub-pixel structure in the map. It can be a pulsar, it could be a gas clump, it could be a cloud, it could be dark matter substructure. We're agnostic to it; we don't care. The only assumption that we are making about the point sources is that their source count function is described by a double power law.

Now, I'm going to be showing you a lot of plots that look like this in the next few minutes, so I'm going to start by just showing you cartoons so you can gain some intuition for them. When I say source count function, what I mean is just the number of sources in a given pixel that have a flux in a certain range. And we're assuming that it's a double power law, so the parameters are just going to be the slopes above and below the break, the break itself, which is given by F_b, and then the overall normalization of this source count function.

Okay, so these are the templates that get fed into our analysis. We have a template for the diffuse background, a template for the Fermi bubbles, one for isotropic diffuse emission, and one for NFW dark matter. These four templates are the ones that have typically been used in the past; the statistics for these four templates are just Poisson statistics, and each one just carries its own normalization.
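For concreteness, the double power law source count function described above could be sketched like this (the parameter names are mine, for illustration; the slopes, break, and normalization correspond to the four free parameters of the fit):

```python
import numpy as np

def dn_df(F, A, F_b, n_above, n_below):
    """Double power law source count function dN/dF:
    slope n_above for fluxes above the break F_b, n_below beneath it,
    with overall normalization A fixed at the break."""
    F = np.asarray(F, dtype=float)
    return np.where(F >= F_b,
                    A * (F / F_b) ** (-n_above),
                    A * (F / F_b) ** (-n_below))
```

The two power laws meet continuously at the break, where dN/dF takes the value A; above the break the counts fall steeply, below it the slope flattens toward the faint, unresolved end.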
That's the free parameter that we're varying. What's different about our analysis is that we're adding two more templates: one here for isotropically distributed point sources, and one for NFW-distributed point sources. Each of these templates carries four free parameters, coming from the four free parameters of the source count function. The reason we're considering these NFW-distributed point sources is because we ultimately want to be able to test whether we could be mistaking a dark matter signal at the galactic center that's distributed like NFW for point sources that are also distributed like NFW.

All right, so we start off by doing the analysis at high latitudes. The reason we do this is because in this area of the sky there's going to be less contamination from the diffuse background, so we can run here and just make sure that our analysis procedure is giving us results that make sense. We mask plus or minus 30 degrees around the plane, so we don't look at any of the stuff that's within the gray region. And each little gray dot here is masking the known location of a Fermi point source. The current Fermi catalog listing all of their known point sources is called 3FGL, so each one of these is a 3FGL point source as identified in the sky. We do the analysis both masking these point sources and not masking them.

And this is what we find. There's a lot going on in this plot, so let's just take it one step at a time. This is the source count function, so dN/dF here, and flux on the horizontal axis. What's shown in the black dots is the source count function of the 3FGL sources that are observed by Fermi in this region of the sky. You can see that as you go to lower flux, the number of sources increases up to a certain point, and then appears to fall off.
This fall-off indicates roughly where the threshold is for point source identification. I should mention that everything I'm plotting here focuses only on photons in the energy range roughly from 1.9 to 11.9 GeV, which is where we expect the excess to be dominant.

Okay, so the first thing we do is run our template analysis without masking the 3FGL sources. The purpose of doing this is to see whether we can just recover all of the point sources that Fermi has already identified. What we find from our best fit is given by the green band here, and you can see that we do a very good job of recovering the right slope of the observed population in this region of the sky. The next thing we do is mask all of the 3FGL sources. The purpose of doing that is to see whether we can recover additional sources that may not be in the Fermi catalog. When we do that, we find this orange band. So, as we would like, the orange band only shows a contribution below the Fermi threshold. What we're picking up is that at these high latitudes there's likely a population of sources that haven't yet been identified by Fermi, so they're unresolved, and we think that they fall roughly in this region here.

Now, we can use these results to obtain an estimate of the intensity of the isotropic gamma-ray background; our results are consistent with those that have been presented by Fermi in the past. We can also obtain the fraction of the extragalactic background due to resolved and unresolved point sources; again, our results are consistent with those that have previously been considered in the literature. So this gives us confidence that the method is working. We also think we can push the method even further in this region of the sky to get some really good measurements of the isotropic gamma-ray background, but that's a whole separate talk in itself.
So I won't have any time to get into that. Instead, I'm going to forge ahead to the area of the sky that I promised I would talk about, which is the inner galaxy. We do the analysis in this region here, within 30 degrees of the galactic center. We mask out plus or minus two degrees, this gray band here; we don't look at anything close to the plane because it's just way too complicated. And the gray dots here again show the identified 3FGL sources in this region of the sky. Notice that when we do mask out these 3FGL sources, we effectively end up masking within five degrees of the galactic center, so we're really looking mainly in this inner galaxy region.

Okay, so we run our modified template fitting procedure here, and this is the result that we find. Again, the source count function as a function of flux; the black points are the identified 3FGL sources in this part of the sky. When we do the analysis without masking these 3FGL sources, we recover the green band, so our analysis does a nice job of picking up the identified sources that are already there. And then we repeat the analysis masking all of these sources, and what we find is the orange band. So this is telling us that there appears to be evidence for a population of unresolved point sources just below the threshold of the current 3FGL catalog from Fermi. The evidence for this is quite strong. This is roughly 68 percent confidence.
You know, it's definitely not zero. And so the next question is: in our fit, what fraction of flux is accounted for by this point source population, relative to the fraction of flux accounted for by the dark matter component? Because we've included both in our fitting procedure. And this is the result. This is showing the posterior distribution of the fraction of flux for these two different components: the blue line is the fraction of flux absorbed by the NFW point source template, and the red line is the fraction of flux absorbed by the NFW dark matter template. What you can see is that pretty much no flux is absorbed by the dark matter template, and all the flux is absorbed by the point source template.

We can repeat the analysis and just remove the point source template. What we find in that case is that the flux gets completely reabsorbed by the dark matter component. So essentially, if you put in an NFW template, it wants to absorb this flux; but if you allow the fit to distinguish between non-Poissonian point source photon count statistics and Poissonian photon count statistics, it prefers the non-Poissonian photon count statistics much more.

How much more? Well, we can look at the Bayes factor, which compares the evidence for a model that includes both the dark matter and point source components relative to a model that includes only the dark matter component. The Bayes factor is 10 to the 7. Any number greater than 10 would have indicated a strong preference for the point source plus dark matter model. In terms of numbers, what we predict is that half of the excess can be explained by roughly 60 sources.
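As a reminder of how to read that number: the Bayes factor is just the ratio of the two models' evidences, usually computed from log-evidences. A minimal sketch with made-up numbers (not the actual fit outputs) showing how a factor of ten to the seven would arise:

```python
import math

# Hypothetical log-evidences, as they might come out of a nested-sampling
# fit; the values are invented so that the ratio is 10^7.
log_Z_dm_plus_ps = -4000.0                    # DM + point-source templates
log_Z_dm_only = -4000.0 - math.log(1e7)       # DM template alone

bayes_factor = math.exp(log_Z_dm_plus_ps - log_Z_dm_only)
print(bayes_factor)

# On the usual Jeffreys-style scale, anything much above ~10 counts as a
# strong preference for the model with the point-source component.
```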
What that would mean is that in this bin here, the one just below the threshold, we expect the sources to be here; roughly 60 sources would put you up here. To account for the entirety of the excess, you would need more like 200 point sources, but that number should be taken with a huge grain of salt, because it requires integrating down to much lower fluxes here, where the errors on the source count function start becoming quite large.

Okay, we've done numerous cross-checks, and unfortunately I don't have the time to go through most of them, so I'm just going to quickly flip through a few, and then if you have any questions I'd be more than happy to discuss them in more detail afterwards.

Like I said, the biggest concern here is the diffuse background. It's the thing that's kept me up at night, wondering whether, for some reason, this new approach is picking up random fluctuations in the diffuse background rather than a genuinely new point source population. We've tried a few things here. The first test was just putting in different diffuse models. The analysis I showed before was using the Fermi p6v11 model; we've also done the analysis with the Fermi p7v6 model.
We recover a consistent source count function. We've also run the analysis with about 13 additional diffuse models that vary a whole bunch of different parameters. This is just showing the full spread of what we find over those 13 models; again, everything is fairly consistent.

Second test: we've looked at other regions of the sky where excesses have been observed, because one thing you might be worried about is that the analysis might just be picking up any fluctuation in the diffuse background and calling it point sources. So what you want to do is look someplace else where there is an excess and see what you pick up. We looked at this region on the plane, centered at l equals 30 degrees, where people had observed an excess in the past, and what we find in this region of the sky is shown here. The orange band shows the source count function when all of the 3FGL sources are masked. So in this region of the sky, where another excess has been observed, we don't find any evidence for point sources over the dark matter interpretation. And actually, when you look at the comparison between the fraction of flux absorbed by NFW point sources versus dark matter in this case, they're totally consistent with one another; we can't differentiate.

Okay, I don't have much time left. The analysis leaves open many questions, and in particular the source count function that we're recovering has several unexpected features. One argument that had been made for a while about why point sources couldn't explain the excess was that if you look at the luminosity function of the observed millisecond pulsars in the Milky Way,
they tended to over-predict the number of sources you would get in the galactic center, and this is illustrated here. The red line is showing what you would have expected to see if the luminosity function of these sources was the same as that of the observed millisecond pulsars. The solid red and the dashed red are the same thing, except with different cutoffs in luminosity. The important point is that they tend to give you predictions for sources that are above the threshold. So this was the usual argument: this luminosity function would tend to over-predict the sources that we would have seen. If you use the same luminosity function but with a lower cutoff, you get this dotted red line. This would predict that you would expect to see something like a thousand point sources in this region of the sky, which seemed like a lot. What we're saying is that on the order of a hundred sources can probably account for the excess, and the reason our result is different from what people have been saying in the past is essentially because the source count function is different: we're finding that most of these sources should actually be living right below the threshold.

All right, so to conclude, I hope I've been able to illustrate that the use of photon count statistics provides a concrete way of determining whether the photon distribution is non-Poissonian. We've taken advantage of this and developed a template fit procedure that can include non-Poissonian components. We verified that this procedure works at high latitudes, and then, applying the same procedure in the inner galaxy, we find evidence for sub-pixel structure, which would not be indicative of a dark matter signal. Little asterisk here: on the same day that our paper came out, an accompanying paper came out by this group here, which used a very different type of analysis method but came up with a similar conclusion for this additional structure.

So I just want to end with this slide here, which I think summarizes the next steps that we're hoping to pursue. What this is showing is the inner galaxy region, with each of these being the individual pixels. The color of each pixel tells us the probability that the pixel contains a point source: the brighter the pixel, the more likely it is to contain one of these point sources. You can see that most of the red pixels here have a white circle around them. The white circles are the identified 3FGL sources, so we do a good job of picking those up. What we hope to do in our next analysis is to provide the locations in the sky that are most likely to contain the unresolved point sources that we're finding, as well as the energy spectra for each of those sources. The hope is that we should be able to combine this with information from analyses at other wavelengths to really get to the heart of what exactly the nature of these sources is. Thank you.

Time for a couple of questions.

Thanks. It was probably on your list of things you checked, but I was just curious: how sensitive are the results to the form of the source count function? You chose a double power law.

Yeah, so, here we go. This is a first attempt at doing this; we're hoping to do it more carefully in our follow-up analysis, but we could try binning, as opposed to just making the assumption of a double power law. I can talk to you about this in more detail later, but essentially the results are consistent, and we're hoping that in the next paper that we put out we'll be able to decrease the error bars on this and properly take into account the correlations between the bins. But with this first pass, it does look like the results are consistent with the source count function.

So, very naively, why do you need all these templates for the point sources?
Why don't you just strongly rule out, statistically, the Poisson distribution of the gamma rays, of the photons? I mean, you simply have many photons, and you can check whether they are consistent with a smooth distribution, like smooth dark matter. And according to what you said, I would have thought that you would find a many-sigma exclusion of that hypothesis.

Sorry, I guess what I don't understand about your question is what the statistic is that you want. You want to just look by eye to see whether or not the photons are clumpy?

No, I thought, if you have, whatever, 10,000 photons in a certain region, and you have a hypothesis that they are Poisson distributed, and you look, for example, at the correlation function of two photons, of pairs of photons...

Oh, you mean to...

No, I mean you just exclude the hypothesis of a smooth distribution. Isn't that a statistically well-defined question, independently of the distribution that is actually there?

Yeah, so you can do a two-point correlation; essentially what we're doing is a one-point correlation. You can do a two-point correlation, and that should work. What we found is that the template analysis helped give us a better handle on properly subtracting out the diffuse background, because the diffuse background is sort of the challenge in everything. By including the spatial information, it helped us to reduce, to eliminate, the background in a way that we felt was robust, so that we could increase the sensitivity to the point sources. But yeah, the point about doing two-point correlations is an interesting one, and one that we definitely want to explore in more detail, because there's probably more information coming from that. Yeah.