Okay. Hello everyone, and welcome to the Latin American Webinars in Physics. My name is Joel Jones from PUCP in Peru, and I will be your host today. This is webinar number 52, and our speaker is Nancy Wandkowsky. She is a postdoc at the University of Wisconsin-Madison, more specifically at WIPAC, the Wisconsin IceCube Particle Astrophysics Center. Nancy finished her PhD at the Karlsruhe Institute of Technology, where she worked on the KATRIN experiment, and she has been a postdoc on IceCube since 2014, working on astrophysical neutrinos. So today Nancy will be talking about high-energy astrophysical neutrinos, and we're very glad to have her as our speaker. Before we begin, let me remind you that you can also be part of the discussion by writing questions and comments in the YouTube live chat, which should be on the right side of your screen. So now I'll hand you over to Nancy. Please go ahead whenever you're ready.

Thank you very much for the nice introduction and for giving me the opportunity to talk to you about astrophysical neutrinos. [A few minutes of screen-sharing difficulties follow, resolved by sharing the full desktop rather than the PowerPoint window.] Okay, so after these issues, let me give a short overview.
I will first introduce astrophysical neutrinos and IceCube a little, before coming to the main topic, which is the high-energy starting events, the selection that gave us the first detection of astrophysical neutrinos. Then I will talk a bit about our latest hot topic, which is our multi-messenger approach to actually finding the sources of astrophysical neutrinos. Okay, so why do we expect astrophysical neutrinos to exist? Well, we know that cosmic rays exist. They consist mainly of protons, and if those protons interact with the interstellar medium, you expect a copious production of pions, and from those you expect to get neutrinos. So you expect a tight coupling between the cosmic-ray protons and the neutrinos. We have measured the cosmic-ray spectrum very well; we have mapped some of its features, like the knee and the ankle, and we know that at some point there is a switch from a galactic to an extragalactic contribution. But although we already know so much about the cosmic rays, we don't really know the sources in detail, or the mechanisms that are driving them. So we want to use the neutrinos to learn more about those mechanisms. And what I forgot to say is that the nice thing about neutrinos is that they are not affected by the magnetic fields in the universe, so they point straight back to their source; that's a nice feature. The two most prominent candidate sources of neutrinos are active galactic nuclei, where a relativistic jet coming from the central engine, most likely a black hole, produces a continuous emission of neutrinos, and gamma-ray bursts, where the current model is a relativistically expanding fireball or jet, and which are neutrino sources of transient nature.
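The pion production and decay chain mentioned at the start of this overview can be written out explicitly (this is the standard textbook chain, added here for reference, not a formula from the talk itself):

```latex
p + p \;\to\; \pi^{\pm} + X, \qquad
\pi^{+} \;\to\; \mu^{+} + \nu_{\mu}, \qquad
\mu^{+} \;\to\; e^{+} + \nu_{e} + \bar{\nu}_{\mu}
```

Counting the neutrinos in the chain gives an expected flavor ratio at the source of roughly $\nu_e : \nu_\mu : \nu_\tau \approx 1 : 2 : 0$, which standard oscillations over astrophysical baselines transform to approximately $1 : 1 : 1$ at Earth.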
Both of these mechanisms can lead to neutrino production, and these are the models that we are currently testing. In order to detect neutrinos, because of their low interaction rate, you need a really huge detector. In the case of IceCube, this is a one-cubic-kilometer volume of clear Antarctic ice, instrumented with about 5,000 digital optical modules (DOMs) on a total of 86 strings, which record the light produced by neutrino interactions; I'll come to that in a second. The DOMs on these strings have a vertical spacing of about 17 meters, and the inter-string spacing is about 125 meters. The full detector was completed in 2010, so we have now been taking data in the full detector configuration for seven years, but even before that we were already taking data with the partial detector. So let me see if this works. Here's a nice video of what happens in the detector: a neutrino comes in, interacts with one of the ice molecules, and produces secondary particles. In this case a muon is produced, and this muon travels faster than the speed of light in the ice and hence leads to the emission of Cherenkov light. This blue light is what we actually detect with our sensors in the ice. Each of these colored blobs is one sensor that recorded a hit; the color scale gives you timing information, which you can use to get a sense of the direction the event came from, and the size of the colored blobs gives you a sense of how much charge was recorded, which is proportional to the energy deposited at that position in the detector. That is the information we can use to learn something about the neutrino that produced the event. There are three types of neutrino signatures that we expect in the detector.
The one we just saw was a charged-current muon-neutrino interaction producing a secondary muon, which leaves a long track in the detector. The energy losses are stochastic, so the emission can be nearly homogeneous or sometimes very inhomogeneous, with stretches without deposited light in between, but this one is a nice muon track. Because the muon can leave the detector, or the interaction can happen before the detector with only the muon then entering it, you don't have such good energy information: our average energy resolution is only about a factor of two. But thanks to the long lever arm of the track you have a very good angular resolution, so we can pinpoint the event very well, which is great for finding point sources. The second type of event, which is the main one for today's talk, is what we call a cascade: either any neutral-current interaction, or a charged-current interaction of an electron neutrino in the detector. Here, because electrons don't travel very far, or because the neutral-current interaction produces local hadronic processes, you get a very localized energy deposition. That means that ideally all of the energy of your event is contained within the detector, so you have a very good energy resolution, about 15%. Unfortunately, since in this case we are missing the lever arm, the angular resolution is relatively bad; it is still decent, 10 to 15 degrees depending on the energy, but by far not as good as for the tracks.
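The trade-offs just described can be collected in a small lookup; all numbers are the ones quoted in the talk, while the structure and the helper function are purely illustrative:

```python
# Properties of the two main event signatures, as quoted in the talk.
SIGNATURES = {
    "track": {
        "interaction": "charged-current nu_mu",
        "energy_resolution": "factor of ~2 (muon may enter/leave the detector)",
        "angular_resolution": "very good (long lever arm)",
        "best_for": "point-source searches",
    },
    "cascade": {
        "interaction": "any neutral current, or charged-current nu_e",
        "energy_resolution": "~15% (energy contained in the detector)",
        "angular_resolution": "10-15 degrees, energy dependent",
        "best_for": "spectrum and flux measurements",
    },
}

def best_signature_for(goal):
    """Pick the signature whose listed strength matches the analysis goal."""
    return next(name for name, props in SIGNATURES.items()
                if props["best_for"].startswith(goal))
```

For example, `best_signature_for("point-source")` returns `"track"`, matching the talk's point that tracks are what you want for pinpointing sources, while cascades drive the spectral measurement.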
The third signature, which we actually still need to find, is that of a charged-current tau-neutrino interaction, where you first have the interaction of the neutrino producing the tau lepton, and then subsequently the decay of this tau lepton. These two processes each produce a cascade-like signature, but to be able to separate them you need a very high-energy incident neutrino: the average tau decay length is about 50 meters per PeV, and since we have the 125-meter string spacing, we need pretty high-energy tau neutrinos to produce a signature that you can actually distinguish from a single cascade. We are still searching very hard for that, but so far we haven't found it. So how do we actually isolate those neutrino events from the vast background of atmospheric muons that are incident on our detector all the time? There are two main strategies. The first, shown on the left, uses up-going tracks: you remove all the muon background by looking only at neutrinos that travel through the Earth, since any muon attempting that path would be absorbed, so you don't have to deal with the atmospheric muon background. The effective volume for this is pretty large because, as I said, the neutrino can interact outside the detector and the muon can still make it in. However, this selection is only sensitive to muon neutrinos and to the northern sky. So to get neutrinos from the whole sky we implemented a second method, which uses a veto to get rid of the atmospheric muons; in this case you can also look at the southern sky, but since you require that your event start deep inside the detector, you have a smaller effective volume. So let me talk a little more about this active veto technique, which is what we use in the high-energy starting event analysis.
In the left sketch, the orange region is the veto region that we define: we look for hints of an incoming muon in this region. A muon that comes from outside and has decently high energy has a very high chance of depositing energy there, while a neutrino interaction that happens inside the detector leaves no trace of the event in this outer layer. At the same time, we can use this technique to get a better handle on our remaining atmospheric muon background. For that, we define a second veto layer inside the outer one, and look for events that deposit light in the outer veto layer, hinting that this is an incoming muon, but leave no hits in the inner veto layer. Because of the stochastic nature of the muon energy losses this can actually happen, so from observing this type of event, which skips the second layer, you can infer how probable it is that an incoming muon sneaks through your real veto layer. Another very nice thing that nature gave us here is what we call the atmospheric neutrino self-veto. Atmospheric neutrinos are produced in cosmic-ray air showers, and in those same air showers a lot of atmospheric muons are produced, so these muons accompany your neutrino: in the case of a muon neutrino you have a very tight coupling with a correlated muon, while for electron neutrinos you only have the uncorrelated muons. Either way, there is a high chance that a muon deposits light in your outer veto layer and thereby removes the atmospheric neutrino from your sample. The way we deal with that is that we have simulations of atmospheric neutrinos for our detector, and we correct their detection probability according to this process.
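Two pieces of bookkeeping described above can be sketched in a few lines. The event representation and the toy passing fraction below are invented for illustration only; the real analysis works on deposited charges and hit times, and on a tabulated energy/zenith parameterization rather than this ad-hoc formula:

```python
import math

def veto_leakage_fraction(events):
    """Estimate how often a muon slips through one veto layer unseen.

    Each event records whether it left light in the outer and inner veto
    layers. Muons tagged by the outer layer that leave no light in the
    inner layer measure the per-layer 'sneak-through' probability, which
    is then applied to the real (outer) veto.
    """
    tagged = [e for e in events if e["outer"]]
    sneaks = [e for e in tagged if not e["inner"]]
    return len(sneaks) / len(tagged) if tagged else 0.0

def toy_passing_fraction(energy_gev, cos_zenith):
    """Illustrative stand-in for the atmospheric self-veto: up-going
    neutrinos are unaffected (any accompanying muon is absorbed by the
    Earth), while vertical, high-energy atmospheric neutrinos are
    strongly suppressed by their accompanying shower muons."""
    if cos_zenith <= 0.0:
        return 1.0
    return math.exp(-cos_zenith * math.log10(max(energy_gev, 10.0)) / 3.0)

def apply_self_veto(sim_weight, energy_gev, cos_zenith):
    """Correct a simulated atmospheric-neutrino weight for the self-veto."""
    return sim_weight * toy_passing_fraction(energy_gev, cos_zenith)
```

The design point is simply that the self-veto enters the analysis as a per-event reweighting of the simulated atmospheric neutrinos, stronger in the southern (down-going) sky, which is exactly the suppression visible in the zenith distributions discussed next.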
We performed Monte Carlo simulations, for which you can find the references here, to obtain a parameterization of the energy and zenith dependence, and with that we correct the neutrino detection probability by what we call the self-veto probability. On the right you can see the number of neutrinos that you would expect as a function of cosine zenith, or cosine declination, and you see that this self-veto results in a reduction of atmospheric neutrinos in the southern sky, so there you would expect to find fewer atmospheric neutrinos, with the astrophysical neutrinos clearly sticking out. Here is how we parameterize it: the crosses are the Monte Carlo simulation, and the solid lines are the parameterizations that we actually use in our correction. We are currently working on an updated version of this: the original used SIBYLL 2.1, which had no charm component, but the new version, SIBYLL 2.3, includes charm, so we are updating that now. Okay. With two years of data we were able to show that astrophysical neutrinos exist. By now we have collected seven years, but the latest analysis covered about six years of data, in which we observed 80 events in total. In this full sample you expect about 15.6 atmospheric neutrinos and about 25 atmospheric muons, and in the plot you can see all these events in our observables, which are the deposited energy in the detector and the direction the event came from. We still can't find any sign of a prompt component in our data. One thing that we changed, which is important if people are trying to reproduce our results, is that we now use a more correct cross-section model, which changes the predicted flux by about 15%. Since the atmospheric muons mainly populate very low energies, we only fit in the region above 60 TeV deposited energy, so that
leaves us with about 50 events, and you can see that in this region the background gets drastically reduced: you only have an expectation of about one atmospheric muon, or a little more. So what does the spectrum look like? On the left you can see the distribution in energy: we get rid of most of the muons, in red, and we have a clear excess above the conventional atmospheric neutrino expectation. On the right you can nicely see the suppression in the northern sky due to absorption in the Earth, and in the southern sky the suppression of atmospheric neutrinos due to the self-veto, with a clear excess above that expectation, which we attribute to astrophysical neutrinos. By now the background-only hypothesis can be rejected at 8 sigma, so it is pretty significant. One interesting thing, however, is that with each additional data sample that we added, the spectral index got softer and softer: we started off with something like 2.3, and this latest iteration gave us 2.9, which is a really soft spectrum. So how does this fit in with the other analyses? I told you before that we have this other method, where we are looking for up-going muon neutrinos.
Their latest analysis uses several years of data, including some of the incomplete detector configurations, and they obtain a relatively hard spectral index of 2.19. If you look at this in a combined plot, where the pink is the up-going muon neutrinos and the black is our six-year starting-event analysis, you can see that there is a tension of about 2.3 sigma. In the end we think we are observing the same neutrinos, so the question is where this ever-increasing tension could come from. Of course it is a little hard to say, because we are not looking at the same events: one analysis uses up-going neutrinos, the other all-sky neutrinos dominated by electron-neutrino interactions. What we tried in our six-year analysis is looking into more complicated models than a single power law. The next extension is a two-component, or dual, power law. If we fit a dual power law with just the starting-event data, the fit clearly prefers a single power law, so what we did instead was introduce a prior on one of the components, namely the up-going muon-neutrino spectrum. The orange contours here show the results of this dual-power-law fit, where we used the pink contour as a prior for the orange contour with the downward triangle. You can see that the fit pulls towards softer spectra, which is somewhat expected, and the contour of the second, much softer component is very large, so with the small number of events in the starting-event sample it is absolutely not conclusive. But since we have this small tension, we have to look at the possibility of more complicated models. As I said, there are some inherent differences between the data sets: our starting events are mostly cascades, while the other analysis is tracks only, so nu-e versus nu-mu, and the starting events are mostly dominated by the
down-going, southern-sky portion, where we actually have the galactic center, while the other analysis looks at up-going events only. So we tried to compare up-going versus down-going within the starting-event sample, or cascades versus tracks, but unfortunately we cannot do the nicest combination, up-going plus tracks, because we have basically no events left with that selection. That would be the fairest comparison to the up-going muon neutrinos, but our statistics in the starting-event sample are too small for it. If we look at the other splits, we can't really resolve the tension. We are currently working on a new analysis that combines the different data samples in one likelihood analysis, which will give us a much better handle on this. Okay, so if we assume that what we measure with the starting events defines the astrophysical neutrinos, an E^-2.9 spectrum, then first of all you would not expect a Glashow-resonance event with such a soft spectrum, so the fact that we haven't seen one in the starting-event sample would not be surprising. You can also say a few things about the production mechanism of these neutrinos. For the ultra-high-energy cosmic-ray accelerators, we know from measurements by Auger and others that we find energies above 10^20 electron volts, and if you think that Fermi shock acceleration is the mechanism producing these high-energy cosmic rays, this implies a spectral index of about 2, and not softer than 2.2. If your neutrinos are mainly produced by pp interactions, there is a strong coupling between the neutrino spectrum and the proton spectrum, so you would expect a neutrino spectrum not softer than E^-2.2, which is clearly not what we see with the starting events; it is still compatible with what we see with the muon
neutrinos. However, if you allow proton-gamma interactions, then the neutrino spectrum is no longer coupled to the proton spectrum, because the photons that your protons interact with matter here as well, and in that case you can produce much softer spectra. So we can make some restrictions here with our data. We have looked at six years of data now, but as you know, data is coming in every minute, and we are currently working on a 7.5-year data set. Obviously we also want to improve our methods, and one area of improvement is the reconstruction of these events, because especially if you look for point sources, you want to make sure that you are pinpointing your events reliably. The reconstructions rely on correct modelling of the South Pole ice, and in the model we used so far, we had not considered the fact that the South Pole ice actually shows an anisotropy and a tilt. What does that mean? The plot on the right is a sketch of the detector, where the numbers correspond to the different strings installed in the ice. Due to the shear of the ice layers, in one direction you have enhanced scattering of the photons, while in the orthogonal direction you have decreased scattering, and in the end that leads to an anisotropy in the observed light pattern. If you do not take this into account, you are obviously making a wrong assumption about what you would actually see. This became especially important for our recent search for tau neutrinos, because here, as I told you, we are looking for the signature of two cascades, the double bang, and if you neglect the anisotropy, then a single cascade can be misidentified as a double cascade, because you are effectively
separating the single cascade into two events if you do not account for the enhanced scattering in one direction. For this reason we have implemented the anisotropy in our reconstructions. Here is a likelihood map from our old reconstruction, which does not use the anisotropy information. What we do for the reconstruction of the HESE events is scan the entire sky: we assume a HEALPix map over the sky with a very fine grid, fit each pixel, that is, each direction hypothesis, and record the likelihood that the event could have come from that direction. That is the map you see here, with all the different likelihoods, and you get a corresponding best-fit position and an uncertainty. If you now switch from the reconstruction without the anisotropy to the new reconstruction that includes it, you can get a very significant shift of your best fit, even outside the old error contours. We are currently verifying that all of these improvements do what we think they are doing, but this could produce a significant shift of some of our previously released events, which obviously has implications, especially for the point sources we are trying to find with these events. Okay, so as I said, we are currently working on an approximately 7.5-year data set, and we obviously want to use the improved reconstructions. In the last year we have also undertaken a large effort to recalibrate our historic data: we know by now that our calibration was not entirely correct previously, and effectively this means we have to correct all the charges, and hence also the energies, downward by about four percent. It is not a big effect, but it will affect the energies that we previously released. And as I said, there is a slight tension with the up-going muon-neutrino analysis, which of course raises the question of whether the single power
law is still the best-fit description of the data set. So we want to test different models: obviously a dual power law as the next extension, and we are also looking into models like the log-parabola, which is basically many power laws with a continuously varying spectral index. There is also the possibility that we have something else besides astrophysical neutrinos in our data, for example dark matter from decay or annihilation in the galactic center, and there is always the exciting possibility of finding new physics in the data, which we will handle via effective operators. Since the high-energy starting event sample was in the beginning not aimed at yielding a detailed understanding of the astrophysical spectrum, but just at discovering astrophysical neutrinos, we neglected some systematics that did not matter with the very low statistics we had at the time. By now we have collected so much more data that we also have to take additional systematics into account, for example the pion-to-kaon ratio: previously we used only one atmospheric flux in our fits and did not give it much variability beyond the normalization. If we now add systematics and also look at models more complicated than the dual power law, we are introducing a larger number of parameters, and to be able to handle that we need a Bayesian approach beyond what we did before, because Wilks' theorem does not necessarily hold anymore. For this we are planning a publication for spring 2018. Okay, in the last few minutes let me talk a little about our latest hot topic in IceCube, which is multi-messenger physics. Coming back to the picture we had at the beginning: we expect the neutrinos and the cosmic rays to have a common source, so our best chance of finding these common sources is to look at several signals at
the same time, the neutrinos together with either the photons or the cosmic rays. What we have set up is the AMON network, the Astrophysical Multimessenger Observatory Network, which contains several observatories: triggering observatories like IceCube, where you get an event and send it out to the participating observatories, and follow-up observatories, which can then follow the event up. On the lower left you can see a map of what the network looked like half a year ago; the main effort here is concentrated at Penn State University. I want to give two examples. The first is IceCube as a follow-up observatory: on August 17, 2017, the LIGO-Virgo consortium observed a binary neutron star merger, the first one ever, and the most exciting part is that this event was also seen in photons by the Fermi Gamma-ray Burst Monitor, so that was basically the first multi-messenger observation of a gravitational-wave event. This triggered us to look in our data for coincident neutrino emission. Unfortunately we haven't found any, but this is a nice example of how we can work together, and there will be more of these events coming from the LIGO-Virgo collaboration, so we are currently setting up an automated follow-up for this type of event. Then you can also have IceCube as a triggering observatory, which we have been running for about one and a half years now. What we do is identify high-energy track-like events, because tracks give good pointing resolution, in real time at the Pole: we run our selections at the Pole, and if an interesting event is found, one with a high probability of being of astrophysical origin, we send a notice to the GCN list. One can subscribe to that to get this information in real time, and it includes things like our current
reconstructed direction, which allows other observatories to look in that direction to see if they can find a coincident signal, for example in gamma rays. Usually it takes about a day for the data to arrive from the Pole in our data hub, but these events are sent with high priority via satellite, so there is only a small delay until they arrive. Then an automated process refines the position of the event as it was reconstructed at the Pole, since we can't run computationally intensive reconstructions there. Within about one hour we get a well-resolved position, which we send as a revision to the GCN list; this typically results in an uncertainty region of less than one square degree, so most telescopes are then able to observe the 90% uncertainty region. With some observatories, like Swift, we have special MOUs, so if we think an event is very interesting, we can request that they follow it up specifically. There has been a lot of activity recently: you can look at the Astronomer's Telegrams, where the follow-up observatories report their results. The most interesting one is on the upper left, where we reported a high-energy through-going track that was followed up by the Fermi-LAT observatory, and they actually detected a flaring source in our error region. We are still crunching the numbers on this event, and you can expect a publication pretty soon. But if you look through those lists, you can see that a lot of observatories make use of these alerts and follow up the signals, which will hopefully lead us to the detection of the sources of astrophysical neutrinos. So let me summarize. As I said, we detected the first astrophysical neutrinos already in November 2013, with two years of data. In summer 2017 we released our six-
year results, which were shown at ICRC, and in this iteration we got a very soft spectrum, in slight tension with the up-going muon-neutrino analysis, which of course leads you to consider more complicated models beyond the single power law. Upcoming, we have a 7.5-year analysis, which uses improved reconstructions, other improvements, and a more detailed list of physics models that we can test now that we have much more statistics than in the first iteration of this analysis. And finally, the latest hot topic, as I said, is finding the point sources via a multi-messenger approach, for which we are using the AMON network. So let me finish with this video; thank you very much. You can see here the South Pole Telescope and our ICL, the IceCube Lab, in the center, and a nice aurora, so the people who are at the Pole during the winter can enjoy this nice view.

What's happening now? I'm getting an infinite... whoa. You have to stop sharing your screen, probably. There you go. Thank you very much for the very nice webinar; I think it's the first time a speaker has shown us a video. So I think it's time for the question round. I don't know if anybody in the audience has a question; let's see, hands up. Let me ask a basic question. First of all, we didn't see any neutrinos coming from this neutron star merger, right? So what was the chance of actually seeing one? Was it expected not to see any neutrinos?

Actually, I have to be more specific there. We didn't see any neutrinos within a 500-second time window around the time of the merger, but from the astrophysical background you would not expect to find anything in that region anyway. However, if you look at the possible models for neutrino emission from these mergers, you could have the possibility of finding neutrinos for a period of up to two weeks after the event. So we looked
in that data sample, and there are some neutrinos in there, but in this case you obviously have to consider the background. We will release a more detailed paper on this in the near future, so you can see the numbers there; the paper I quoted here only talks about the 500-second time window, so for the more detailed analysis you will have to be a little more patient.

Okay, some people here are asking permission for questions. People, you don't have to ask permission. Okay, maybe Mauricio can ask his question.

I had muted my microphone; can you hear me? Hi Nancy, thanks for the talk. My question is: will the recalibration of the DOMs in IceCube also affect precision measurements like sterile-neutrino searches?

It will affect the overall energy spectrum of all the events, but since it's an overall average shift of 4%, it doesn't actually influence spectral features. It shifts everything very slightly towards lower energies, but it could not have hidden a spectral feature.

Okay, but your resonance would be at a different energy, right, for sterile-neutrino searches?

Right, but then you have to consider that 4% is probably well below the resolution that was used.

Yes, probably; I was just curious about it. It definitely has to be checked, of course. Thank you.

Okay, I have a question for Nancy. First of all, very nice talk. I wanted to ask: how many neutrinos are above 1 PeV, and are there more observations coming to understand that part of the spectrum?

You mean in our data sample so far? From the starting events we have three events above 1 PeV of deposited energy. You always have to be careful about whether you are talking about PeV neutrino energy or PeV energy in detector quantities, because the starting-event analysis actually measures deposited energy, and there we have three events above a PeV. But for the muon neutrinos, since we have this large
uncertainty on the actual neutrino energy, because the connection between the observed energy of the muon and the neutrino isn't as tight as it is for the starting events — for a start, the event doesn't necessarily have to begin in the detector, and so on — there it's much harder to say exactly how many events we observed with neutrino energies above a PeV. With reconstructed energies, we have this one very high energy through-going track, and just recently we had another track which might be above PeV energies, but we haven't finished the analysis on that one either; that was also one of the alerts that we sent via the GCN through the AMON network. And then we have a third analysis which is actually aiming at finding GZK neutrinos, so the very, very highest energies, and that analysis has found a partially contained event whose most probable energy is about 6 PeV. So there are different samples, and it's a handful of events, five or six.

So in principle there is not going to be any surprise in the future in the number of events, unless there is an exceptional astrophysical event that can give you a whole bunch of PeV neutrinos? With the current selections that we are using to get these neutrinos, if things continue as they are now, we don't expect to find a whole bunch of neutrinos. But if there is, for example, an event in the near vicinity that produces high energy neutrinos, like a transient event, of course it might happen. And we have not unblinded the data in some cases, so we actually don't know ourselves.

And one more question... I mean, maybe Camilo also asked permission for a question, so maybe Camilo can go ahead. Are you there, Camilo? Oh, he's here, but apparently he's having issues with the microphone, he cannot unmute. Let me see if I can... yeah, I cannot unmute him either. Let's see. Okay, maybe while you write down your question, Roberto can finish with his previous question. Yeah, regarding not the PeV neutrinos but in
general, what is the status of the determination of the neutrino flavors? I mean, knowing whether the neutrinos are more muon neutrino, tau neutrino, or electron neutrino. As I said, we had this one analysis that was looking to find tau neutrinos within this starting event sample; that's where we developed these new reconstruction techniques, and it was not able to find a clear signature of a nu tau in the six-year sample. We have other methods that utilize the waveform information, but most of those analyses have not unblinded beyond two or three years of data, and in those we haven't found anything. We are working on new iterations that utilize the recalibrated data and the six- to seven-year data samples, so more analyses of this type should be coming in the near future, I'd say within the next half year. But yeah, the latest result is the six-year starting event search for nu tau, and that hasn't found any signature of a nu tau, unfortunately. Okay, thanks.

Let me go ahead with Camilo's question. He asks: what would be the physical interpretation of a model that goes beyond a power-law energy spectrum? You mentioned this log-parabola scenario; what would produce that? That is a model motivated by the blazar spectrum, so unfortunately I'm not really an expert on these types of models. But for the double power law you could imagine basically two types of sources: one that produces a very hard spectrum, like a pp-interaction spectrum, and another type of source that is dominated by photohadronic interactions, where you can have a softer spectrum of maybe E to the minus three. If you have enough of those sources, then both neutrino populations appear isotropic, so you can have these two overlapping contributions to your diffuse astrophysical neutrino flux. And since this up-going muon neutrino analysis is only sensitive at very high energies, maybe
therefore it's only seeing this part of the sources, so it can't actually see the lower energy part that is maybe produced, for example, by the p-gamma interactions. That's why we are looking at these combined data sets, also to get a better handle on the different systematics, because all these different analyses handle their systematics slightly differently, and things like that. Camilo says thank you. Alright, I don't know if there's any other question; I've seen that on YouTube there's a bunch of questions, so maybe we can... if nobody has a question, I have one. Really nice talk, by the way, Nancy. You were talking about these new ice models that include the anisotropy because of the ice shear; does that affect any of the point source searches that have been done? Oh yeah, definitely. As I showed in this one example, this was one of the previously released starting events. It always depends: if basically everything is just shifted and it still stays as diffuse as it is now, then the effect, at least for the diffuse searches, is marginal, because you're just looking for a diffuse emission. But we have an analysis that actually uses the cascades, where you have the biggest effect of this anisotropy on the directional reconstruction, and if you're looking for extended point sources with those cascades, for example, and you're doing a catalog search, but your event suddenly gets shifted out of the previously considered pointing uncertainty, of course that has a significant effect on this type of point source search. So it depends a little on what kind of search you're doing. And for the ones that already used all the data: we are also planning to do a point source analysis, we just haven't defined in what detail. We will definitely redo the diffuse part, but there, as I said, I don't expect too much of a
change, unless it turns out that there is a systematic shift — that everything shifts towards the horizon or towards the galactic plane or something. But if it just shifts within the diffuse regime, I don't expect many changes there. Okay, any other question from the audience? Alright, let me have a look at the YouTube chat. So Hermann has a question; he refers to slide 5 and asks: what do the color and size of the spheres mean? The color gives you the timing information: red means early, green means later hits in time, so you can use that to infer the direction. And the size is proportional to the detected amount of photons, which relates to the deposited energy. So in this cascade case you have this very big blob in the center, very close to the interaction vertex, where most of your energy is deposited and most of the photons are produced. Super, I'm sure he'll be okay with the answer.

Let's see, then there's a question by Nicolás... oh, but he's asking the same thing I did before, and he says that anyway he would like to know how many neutrinos were expected. Sorry? From the neutron star merger. He was asking how many neutrinos were expected. Oh yeah, now I would need to recall the numbers exactly; unfortunately I don't. As I said, in the 500-second window we actually did not expect to see any neutrinos. But if you're talking about how many you would expect from the merger itself, that strongly depends on the model, on the exact size, the distance... there are so many model parameters, and we don't actually know all of them yet to make a precise statement of how many neutrinos we would expect from an event like that. I see, okay. Let's see, Hermann is thanking you for your answer. Then Roberto is asking a question — but I think you just did, right? Yeah, I was the one that asked, yes. Okay, and... that's about it. Farinaldo is sending his regards — Farinaldo Queiroz is sending his
regards specifically to Hermann, Roberto, and Nicolás, so everybody else should be offended. Okay, any other question from the audience here? Yeah, I have one more question. Nancy, what is the future of IceCube? I mean, is there a plan for an upgrade, to make an even more compact array of detectors, or to combine with other techniques? Oh yes, there are several things planned; none has been approved yet, so we don't have money for any of those yet, but we are going in all directions. We are planning a high energy extension, making IceCube bigger, which is what we call IceCube-Gen2. Basically, it would just mean that we put in more strings; IceCube would basically be the infill array of a much larger array, and with that we hope to find many, many more beyond-PeV neutrinos, probing the GZK regime as well. But then we also have — which I didn't talk about at all — a plan for a low energy extension. There we are talking about the regime of oscillations, the neutrino mass hierarchy, sterile neutrinos, and things like that, so we are planning a much denser array looking at much, much lower energies, going down to a few GeV, whereas here we are talking about tens to hundreds of TeV. So there are these two frontiers, and basically as a compromise for the intermediate future we are planning what is called Gen2 Phase 1, which is a few additional strings carrying several calibration devices. Because now we are in a regime where we are not statistically limited anymore in many of our analyses, but actually systematics limited — above all by our knowledge of the ice properties. For that we want to deploy a plethora of calibration devices that help us understand the properties of the ice. That would of course benefit the upcoming extensions, but it would also benefit our old analyses, because we could basically redo them using the additional information about the ice and improve on them a bit. So that is basically the
intermediate plan: to demonstrate the feasibility of the upcoming improvements and at the same time improve on the already existing data. Okay, so in principle, with all these upgrades that are planned, IceCube could last three more decades. It's actually pretty amazing. We have these roughly 5,200 modules in there, and we expected to have 1% or 2% dying per year, but we only really lost sensors in the beginning, during the freeze-in, when everything was still moving in the ice, so some connectors broke and things like that. We lost only a very small number, and basically since then we have not lost any of the modules, or only a handful, so this gives us hope that the detector can actually keep running for many, many more years. That was a very nice surprise. It's a long-lasting experiment. Everything is static, so as long as nothing internally breaks, it can be very stable. Thank you.

Okay, super. Any last questions before we close the webinar? Okay, I think we're good. Before saying goodbye... oh, hang on a second, let me check if there's any other question in the YouTube chat. That's empty. Alright, so before we finish, let me remind you that we have an upcoming webinar: we're going to have Anirban Das talking about dark matter scattering in webinar number 53. I hope to see you all there. Thank you again, Nancy, for the very nice talk, and we'll see you next time. Thank you very much. See you around, everybody.