Don't be afraid. We are live now. So welcome, everybody. Welcome to the Latin American physics webinars. Today is a special day because we are going to do Dark Matter Day. So it's going to be a whole episode dedicated only to dark matter, and especially to dark matter in Latin America. Today we're going to have a very interesting program, because we have physicists and astrophysicists working in different areas across Latin America whose work is somehow related to dark matter. That is going to be our main thread throughout this broadcast. So we're going to have, for instance, starting the series, Farinaldo Queiroz from Brazil, then Andreas Reisenegger from the Pontificia Universidad Católica de Chile, José Bazo from the Pontificia Universidad Católica del Perú, Sergey Kovalenko from the Universidad Técnica Federico Santa María, Rogério Rosenfeld from UNESP in Brazil, and Juan Carlos Muñoz from the Universidad de Antioquia in Colombia. This is going to be very nice; it's the first time that we're trying this kind of connection, because we're going to be in direct connection with three universities: the Universidad de Antioquia in Medellín, which is now broadcasting this transmission in Aula 03 of the Museo Universitario; in Lima, Peru, at the Pontificia Universidad Católica del Perú, where the transmission is in Aulario A409; and in Santiago de Chile, at the Pontificia Universidad Católica de Chile, in their auditorium. So today we are going to have these presentations, and we can already go to the topics of each of them. One reminder for the people following this transmission: if you are watching via YouTube, you can start typing any questions you have in the chat. YouTube has a chat in which you can write all this.
And then after each presentation, we are going to have a few minutes for questions that can help us develop this whole quest for the dark matter. So if Fernando wants to start — we have here our first speaker. Hi, Fernando. Hey. Hi. So whenever you want, you can start. All right. Yeah. Let me share this slide. Can you all see this slide? No, now we can see you. You have to share your screen. Yeah, if you can make it full screen. All right. Okay. So first, I thank all the organizers for this opportunity. I'm quite proud of being a Latin American physicist working on dark matter in this epic moment, I would say. So having an event on dark matter by Latin Americans is something beyond average, I would say. This talk, as emphasized on the web page, is targeted at bachelor students. So I'll not go very deep into the subject, but I believe that every one of you will be able to understand what I'm trying to say. First — I always start my talks by addressing my take-home messages, which are very, very simple, super simple. The first one is sort of an advertisement, which is why I believe this is the right moment to become an astroparticle physicist. And the second is that I think you'll all see, and agree with me, that dark matter research is a multidisciplinary endeavor. And I must warn you, it is very addictive, because I take a dark matter pill every single day, I should say. So I would like to start by discussing where the evidence for dark matter is. And one piece of evidence for dark matter, which was the very first one — and I think all students are fascinated by the early universe, cosmology, the physics of the universe, the fundamental laws of nature, so that's one of the first things that comes up in textbooks — is that dark matter is key to the evolution of the universe. Why is that?
So for those who haven't taken a course on cosmology, you should be aware that the temperature of the universe, or the temperature of the photons in our universe, is inversely proportional to the size of the universe. So the bigger the universe, the smaller the temperature. That means that in the early universe, when its size was very, very small, we had very, very high temperatures. In other words, what I'm trying to say is that an early universe means a hot universe. Why am I saying this? Because of the following sentence. In the early universe, the universe was hot, and the matter, the constituents of the early universe, were tightly coupled to each other. And we need something that decoupled from that early enough and started to form clumps of matter through gravity, which then evolved with time and formed the galaxies we observe today. So it's not anybody's opinion, it's not what this or that person says: the data tells us that we need something that decoupled early enough and, through gravity, formed the galaxies and the structures we observe today. And what is that? Dark matter. So that's one of the most important and compelling pieces of evidence for dark matter, which is the presence of dark matter in the evolution of our universe. And we have assessed and tested this hypothesis by using cosmological simulations. What these are, simply, is that you pick lots of particles, put them in a huge volume, and let them evolve through gravity. For those who are in the second year of a bachelor's degree, you've already seen Newton's law of gravitation. So mass attracts mass, right? And the force is inversely proportional to r squared.
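The cosmological-simulation idea just described — put lots of particles in a big volume and let them evolve under Newton's gravity — can be caricatured in a few lines. This is a hedged sketch, not any real simulation code: the units, softening length, particle number and step size are all invented for illustration.

```python
import numpy as np

G = 1.0          # gravitational constant in arbitrary code units
SOFTENING = 0.05 # keeps the force finite when two particles get very close

def accelerations(pos, mass):
    """Pairwise Newtonian accelerations for particles at positions pos (N, 3)."""
    diff = pos[None, :, :] - pos[:, None, :]          # r_j - r_i for every pair
    dist2 = (diff ** 2).sum(-1) + SOFTENING ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                     # no self-force
    return G * (diff * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

def evolve(pos, vel, mass, dt=0.01, steps=100):
    """Leapfrog (kick-drift-kick) time evolution under gravity alone."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos, vel

# Start from a uniform cloud at rest; gravity alone pulls it into clumps.
rng = np.random.default_rng(0)
pos = rng.uniform(-1.0, 1.0, size=(50, 3))
vel = np.zeros((50, 3))
mass = np.ones(50)
pos, vel = evolve(pos, vel, mass)
```

Real cosmological codes add expansion, periodic boxes and tree or mesh force solvers, but the core ingredient is exactly this pairwise Newtonian attraction.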
What I'm trying to say is, wherever you have masses, they attract each other, form clumps and clumps of matter, and evolve with time. And this is what this is: just an ensemble of particles which you put in a large volume and let evolve through Newton's law, basically. And what you get are those filaments of matter at very, very large scales. And these very, very bright points right here, or here, or there throughout this picture, are simply galaxies and clusters of galaxies. And the nice thing about this picture, beyond everything I've just said, is that at very large scales these cosmological simulations agree with the observations. So if the universe is comprised of dark matter, dark matter is sufficient to explain the formation of galaxies and clusters of galaxies. Okay? That's very, very important. A second piece of evidence for dark matter, again for those who are at least in the second year of a bachelor's degree, goes like those YouTube videos that people are posting every single day: three steps to the dark matter discovery. So these three steps to the dark matter discovery go as follows. Imagine you have a galaxy of mass M, capital M, and then, orbiting the galaxy, you have a star of mass m, small m. And the distance between the center of this galaxy of mass capital M and this star of mass small m is r. If this forms a stable system, that means that the gravitational force is exactly balanced by the centrifugal force. And what that means is you just solve this equation, which is step two: centrifugal force equal to gravitational force. When you solve this, if capital M is constant, that is, if it does not depend on the distance — what I'm trying to say is, there is no more mass beyond this radius r.
Therefore, if you assume that — and that is true — then v, the velocity of that star moving around the galaxy, should be proportional to one over the square root of r. So that's what you get just from basic physics. However, what you observe is something entirely different. What you observe is these data points here in this figure below. This curve here just fits the data points, and as you see it is nothing close to one over the square root of r, which is this yellow plot here at the bottom. So you would expect something like this, and you observe something like this. What I'm trying to say is, you need one of two things. One: my reasoning, the laws I'm applying, are wrong, are not valid at this scale. Or two: this mass does depend on r. Throughout this talk, I'll be focused on the latter. So if that mass depends on r, it means that in this galaxy, in the step-one case I was just mentioning, the mass extends far beyond this distance r. So you actually have a halo of mass around that star. And we call it a dark matter halo. Why dark? Because we don't see it; we don't see visible light coming from that mass beyond r. Therefore we call it dark. This is one of the major pieces of evidence for dark matter, and it's one of the proofs that we have dark matter halos in galaxies — not all galaxies, but some galaxies. Based on that, we have a variety of experiments today searching for dark matter, okay? The third piece of evidence: for those of the students in the audience who haven't played with this, I strongly advise you to do it. You just click this link here in red to access the WMAP website — it's actually WMAP, not Planck — where you can do the following: you can play and find out how much dark matter you need in your universe. What kind of universe do you want? A universe with dark matter, without dark matter, with whatever you want, okay?
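The three-step rotation-curve argument above can be checked numerically. This is a hedged sketch in arbitrary units (G, masses and radii are illustrative, not fitted to any galaxy): balancing the centrifugal and gravitational forces gives v(r) = sqrt(G M(r) / r), which falls as 1/sqrt(r) if all the mass is inside r, but stays flat if the enclosed mass grows as M(r) ∝ r, as in a dark matter halo.

```python
import numpy as np

G = 1.0  # arbitrary units

def v_point_mass(r, M=1.0):
    """Orbital speed if all mass M sits inside radius r (Keplerian decline)."""
    return np.sqrt(G * M / r)

def v_halo(r, k=1.0):
    """Orbital speed if the enclosed mass grows linearly, M(r) = k * r."""
    return np.sqrt(G * k * r / r)  # = sqrt(G*k): independent of r -> flat curve

r = np.array([1.0, 4.0, 9.0])
print(v_point_mass(r))  # decreasing: 1, 0.5, 0.33...
print(v_halo(r))        # flat: 1, 1, 1
```

The observed flat curves look like the second case, which is the step from "v should fall like 1/sqrt(r)" to "the enclosed mass must keep growing with r".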
So just look at these two pictures. These two pictures are the following — look how amazing this is, I get goosebumps just talking about this. Looking at the photons that come from distant, different points in your universe, the differences in temperature of these photons are tied to the universe's observables, such as the abundance of dark matter in our universe. Therefore, you can use this tool from WMAP and play with it. So what you do is, let's assume that I don't want dark matter — maybe dark matter is just a crazy guy's idea, so let's see if what he's saying is really true. These black points here in this curve are the data points. This is data; you cannot mess with it. Okay, so I want to fit this curve to the data points, and I'm going to assume the universe does not have dark matter at all. What you get from the predictions is this green curve. So you don't fit the data whatsoever. In order to fit the data, you need the universe to be comprised of 27% dark matter. It's not 20 and it's not 40. Why do I emphasize this? Because it's a precise assessment of the dark matter abundance in our universe. Using data — it's not someone's opinion or taste, it's data. Data tells us that the universe needs dark matter, and that abundance has to be 27%, okay? So, overall, we have a need for dark matter across different time scales of the universe's evolution and different distance scales of the universe today. That said, what can we learn from this, from a particle physics point of view? I just remind you that everything around us is ruled by fundamental laws of nature, right? And in the same way, these laws of nature are described in terms of elementary particles. Therefore, it's quite plausible and natural to assume that this dark matter is also described by elementary particles.
And what not just I, but many physicists throughout the world have done so far, is to start trying to search for signals of those dark matter particles. As you can see in this slide, this is how many experiments there are, okay? These are not all of them, just some of them. Many experiments — collider, direct detection, indirect detection, which I'm going to explain later — that are somehow sensitive to dark matter signals or are devoted, at least partially, to the search for dark matter. There are many of them. The ones in boldface are the ones with Latin American members involved, such as CMS and others — I won't go through all of them, there are so many. Just to give you a snapshot of how important dark matter is: in some of these collaborations, there are two to three thousand people involved. So it's not just this crazy guy on the other side of the screen trying to convince you, to advocate for the presence of dark matter in your universe. Actually, there are thousands of not just crazy, but amazing scientists trying to search for signs of these little things, okay? And hopefully in the future, you'll be talking about a dark matter Latin American experiment. So how do we search for these dark matter particles? There are three ways, and I mentioned the first one. The first one is called collider searches. What do you do at a collider? You take the LHC, which is the largest hadron collider ever built by mankind, and what you do is crash proton beams together. By crashing these beams together at very high energies, you probe the fundamental laws of nature. And by probing the fundamental laws of nature, you also might produce dark matter particles, and the LHC has been intensely searching for signs of these dark matter particles. You can see in the picture, Diego is already there with a microphone, ready to give the good news as soon as we have it, okay? Another way of searching for dark matter is through direct detection.
What does direct detection stand for? Direct detection stands for this: you have a dark matter particle, in red here, which just scatters off a nucleus and deposits some energy. Depending on how this energy is deposited in a given target, you can discriminate signal from background, and by doing so, you can find signals of dark matter. There are many Latin American researchers involved in these searches. Before I go to the third tool, just to give you an idea: in this top picture, there was a guy who, out of a cave, made a hotel. And some physicists, out of a cave, do direct detection experiments, okay? And this little thing might just unveil one of the most exciting, puzzling open questions in nature. The last one is called indirect detection. What does indirect detection mean? Remember, as we've seen before, that dark matter is in our galaxies, and there is a dark matter halo in galaxies. In concentrated regions, where the dark matter density is expected to be very, very high, one dark matter particle might just find its anti-partner, or another dark matter particle, and they might annihilate, like matter and anti-matter. It's not quite the same, but it's similar to matter and anti-matter annihilating, which we've seen even in Star Wars movies. So when two particles find each other, they might self-annihilate and produce all kinds of signals, okay? Gamma rays, neutrinos, anti-matter, and many others. And you can search for those byproducts of dark matter annihilation — gamma rays, neutrinos and anti-matter — using ground-based telescopes such as H.E.S.S. and CTA, shown at the bottom of the slide; Fermi, the satellite, is on the left.
You can also search for neutrinos with underground detectors such as IceCube, and with satellites such as AMS, which searches for anti-matter. So we go from satellites to underground and ground-based detectors trying to search for signals of dark matter, okay? So, just to conclude: I've talked about particle physics, the early universe, galaxy formation, the laws of nature, and how you can search for gamma rays, neutrinos and anti-matter. And there are even particle physics models involved, okay? There is so much physics involved, which is just fascinating and addictive. That's why I conclude with this: why is it the right moment to become an astroparticle physicist? Because there's so much physics involved and there's so much data involved. Every single year we have new data and new excitement. I mean, I'm passionate about what I do, and I think if you engage with this topic, you will be, too. Thank you. Thank you very much, Farinaldo. So I guess we can have some questions, if there are people that want to address them to Farinaldo. Okay, I'll just stop presenting this. I don't know if there are some very short questions; I'm going to check on YouTube. For the moment, no. And I guess one of the best things Farinaldo was saying is that here we are trying to provide a sample of physicists and astrophysicists working in Latin America whose work is related to dark matter. But Farinaldo was right: there are a lot of Latin American physicists around the world, not only in Latin America, participating in these collaborations, and they are making a very big impact in the field. So, for the moment, I don't see any questions. Remember that you can ask all your questions via YouTube, if you are not in one of the rooms in which we are broadcasting this.
Otherwise, at the end, we are going to have the round table, where we will be able to talk in a more relaxed way about everything we have seen during the day. So maybe we can pass now to the next speaker, who is going to be Andreas Reisenegger from the Pontificia Universidad Católica de Chile, and he will talk about the Cherenkov Telescope Array. So, you can start. Hi, can you hear us? Yes, I can hear you. Okay, good. And what about the... Well, we will put up the presentation. Yeah. Just one second. I can see the auditorium at the Catholic University. Yeah, it's almost full. That's good. Okay, it should be in full screen now, right? Yeah, we can see it perfectly. So, whenever you want to start, Andreas. Okay, thank you very much to the organizers and everyone else for putting this together and for inviting me to speak about the Cherenkov Telescope Array, which is a very exciting laboratory that will be here in Chile in the next few years. As was announced, I'm here at the Catholic University in Santiago, and I am the Chilean representative on the CTA Consortium Board. Well, a broad outline of what CTA is: CTA is an international project to build by far the largest array of gamma-ray telescopes in the world. This will consist of two observatories, actually: one in the south, which means in the Atacama Desert at the Paranal site in northern Chile, which will have about 100 telescopes, and a northern array of about 20 telescopes on the Canary Islands, at La Palma, Spain. The CTA Science Consortium, which has been working on the science case for this and the key science projects to be carried out, now consists of 32 different countries with more than 1,000 scientists, and at least three Latin American countries are in the Consortium: Brazil, which is very active, Argentina and Chile. The center of mass of the CTA Science Consortium is in Europe.
Now, the main offices are in Bologna, which is the central administration of CTA, and in southern Germany, where the data and computing centers are. CTA will cover energies from about 20 GeV to 300 TeV. So that's the highest photon energies that we can observe. And this, of course, as Farinaldo pointed out, is interesting for astrophysics, for fundamental physics, and for the intersection, which we call astroparticle physics. Well, if we look at the electromagnetic spectrum, at the lowest energies, the longest wavelengths, we have the radio waves, and then the whole spectrum that we know, spanning many orders of magnitude; at the opposite end, at the highest energies and the shortest wavelengths, are the gamma rays. And the upper end of the gamma rays is where CTA will operate. That is, of course, the most interesting range for particle physics and for the most energetic processes in the universe. Among these processes, we have gamma-ray bursts, supernova remnants, active galactic nuclei and pulsars. All of these phenomena accelerate particles: they accelerate charged particles to very high energies, and these charged particles then radiate gamma rays. At the moment, the gamma-ray observatories operating at the higher energies are, on the one hand, Fermi, which operates in space — so it has very little background, but it has a small collecting area, and it can observe mostly the gamma rays at not-so-high energies, basically around a GeV. And then there are the ground-based telescopes — VERITAS, MAGIC and H.E.S.S., in North America, the Canary Islands and Namibia — which are observing the highest-energy gamma rays. These are observatories which have somewhere between two and five telescopes each, and they are the state of the art at this moment.
Now, these ground-based telescopes and the future Cherenkov Telescope Array operate by the Cherenkov technique, which is basically this: we have a very high energy gamma ray coming in from space into the Earth's atmosphere, where it interacts with the particles in the atmosphere and generates a shower of charged particles which move at velocities higher than the speed of light in the medium — in the atmosphere — and therefore they generate Cherenkov radiation in the direction in which they are moving. This Cherenkov radiation is optical light which can be detected by ground-based telescopes. So these telescopes record images like the one we see here, and if we have several of those telescopes, we will get several images and can basically triangulate where the gamma ray came from. From the size of the signals we can get the energy of the gamma ray, and from the shape we can distinguish a gamma ray from, for example, cosmic rays, which are charged particles of higher energy and which generate a somewhat different signature, a different image. The idea with CTA is to have these ground-based optical telescopes to detect the gamma rays, but many more than in the current observatories. There will be telescopes of three different sizes. At the southern site, there will be telescopes of all three sizes; at the northern site there will be only the large and the medium-sized telescopes. The performance, of course, will be much better than that of the current observatories. Here are the two sites, located in the Canary Islands and in the Atacama Desert in Chile. And here's a comparison of the sensitivity of CTA with the current observatories — here we have MAGIC, H.E.S.S. and VERITAS — and we see that CTA has a similar energy coverage but about an order of magnitude better sensitivity. In addition, and not shown here, it will also have much better angular resolution.
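The Cherenkov condition behind this technique can be made quantitative. A hedged sketch (the refractive index n = 1.0003 is a typical value for air near sea level, an assumption on my part, not a number from the talk): light is emitted when the particle's speed exceeds c/n, i.e. βn > 1, at an angle cos θ_c = 1/(nβ).

```python
import math

N_AIR = 1.0003  # assumed refractive index of air near sea level

def cherenkov_angle_deg(beta, n=N_AIR):
    """Cherenkov emission angle in degrees; None if below threshold (beta*n <= 1)."""
    if beta * n <= 1.0:
        return None
    return math.degrees(math.acos(1.0 / (n * beta)))

def threshold_energy_mev(mass_mev, n=N_AIR):
    """Minimum total energy for Cherenkov emission: E = m / sqrt(1 - 1/n^2)."""
    return mass_mev / math.sqrt(1.0 - 1.0 / n**2)

print(cherenkov_angle_deg(1.0))    # ~1.4 degrees: a narrow, forward-pointing cone
print(threshold_energy_mev(0.511)) # electrons need roughly ~21 MeV in air
```

The narrow ~1-degree cone is what makes the triangulation work: each telescope sees a small ellipse on the sky whose long axis points back toward the shower direction.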
So it will produce better images than the present-day observatories. The key science drivers for CTA are to understand the origin of cosmic rays and their role in the universe — basically, this process of particles being accelerated in very violent processes in the universe, with these charged particles then producing the gamma rays. The advantage of the gamma rays over the charged particles is that the charged particles are affected by the galactic and intergalactic magnetic fields, and therefore do not travel on straight lines, so we don't really know where they come from; whereas the gamma rays do travel on straight lines, and therefore we can produce images of sources. We can also study particle acceleration around black holes, which is in a sense a particular case of the above. And then there are the ways of studying the nature of matter — in particular, dark matter — and physics beyond the standard model, for example the possibility of Lorentz invariance violation. So this experiment, or observatory, will be of interest to astronomers and particle physicists. On the other hand, it will be very important to observe the same sources at other wavelengths, which here in Chile we are very well equipped to do, because we have many other observatories operating in Chile between the optical and the radio range. The main target for CTA South is the Galactic Center, which is also a particularly important subject of study here at the Catholic University. The Galactic Center is the strongest concentration of dark matter in our cosmic neighborhood. So it will be the source from which we would expect the strongest signal from, for example, the annihilation of dark matter particles, which would produce, or could produce, high energy gamma rays.
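Why charged cosmic rays lose their directions can be estimated with the gyroradius formula r = E / (Z e c B) for an ultra-relativistic particle of energy E and charge Ze. This is a hedged illustration; the ~3 microgauss field strength is a typical interstellar value I am assuming, not a number from the talk.

```python
import math

E_CHARGE = 1.602e-19   # elementary charge, C
C_LIGHT = 3.0e8        # speed of light, m/s
PARSEC_M = 3.086e16    # one parsec in meters

def gyroradius_pc(energy_ev, b_tesla, z=1):
    """Gyroradius in parsecs for an ultra-relativistic particle (p ~ E/c)."""
    energy_j = energy_ev * E_CHARGE
    r_m = energy_j / (z * E_CHARGE * C_LIGHT * b_tesla)
    return r_m / PARSEC_M

# A 1 PeV proton in a ~3 microgauss (3e-10 T) interstellar field circles on a
# sub-parsec scale, while its source may be kiloparsecs away, so its arrival
# direction carries essentially no memory of where it started.
print(round(gyroradius_pc(1e15, 3e-10), 2))  # ~0.36 pc
```

Gamma rays, being neutral, are immune to this scrambling, which is the point made above about producing actual images of sources.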
But on the other hand, in the Galactic Center region there are many other sources of very high energy gamma rays, such as supernova remnants, pulsars, cosmic rays accelerated by different sources which hit molecular clouds and produce gamma rays, and a PeVatron — an accelerator of charged particles to PeV, 10 to the 15 electron volt, energies — which is suspected to exist near the Galactic Center. Here you see images of the Galactic Center region produced by Fermi and by H.E.S.S., on different scales and of course in different energy ranges. You see that there is always strong emission from the Galactic Center, which will be much better studied by CTA. Of course, since there are so many sources together, it will be important to have good models to disentangle all these signals. So here's a timeline for CTA. You see that over the next few years CTA will be built up progressively, and it should start actual science operations some time around 2022 or so. And, well, CTA will function as an observatory to which astronomers or physicists can submit their proposals and get time to observe their favorite sources. However, there are also key science programs which are designed by the CTA consortium, and during the first years the key science programs will take up most of the observing time; the open time will be a smaller amount at the beginning. So it is particularly important for us here to know that we will have 10% of the observing time throughout — so at the beginning, we will have a large fraction of the open time. It's very important to be prepared to use CTA effectively at the beginning, because then we will be able to produce very interesting results. So far, the Chilean participation in the CTA consortium is about 30 scientists and engineers at seven different universities, which are listed here. And we are working, or trying to work, on different subjects.
We are actively involved in the astrophysics modeling of the Galactic Center, dark matter, and tidal disruption events. We are also involved in software, particularly at the Universidad Técnica Federico Santa María in Valparaíso, in the array control and pipeline software. And there is interest in doing some atmosphere modeling and atmosphere monitoring, as well as, in the future, in the observations relevant to CTA. So we expect to become more and more involved in this, and we are interested in forming a larger group here in Santiago, in Valparaíso and in other places in Chile to work towards CTA. So, to conclude, I agree with Farinaldo: exciting times are ahead in astroparticle physics, so it's a good time to get involved. People are welcome to contact me if you want more information about opportunities to work in Chile or collaborations with us. And you can find more information about CTA on the CTA website and in the recent Science with CTA document, which is on the internet. Thank you very much. Thank you very much, Andreas. It's very impressive what the vision for CTA is, for all of gamma-ray astronomy. We have some questions; maybe I can ask some now, or later in the round table. One of the questions, left by Nicolás Mésar, is: what is the expected angular resolution for CTA? I just want to add that it is energy dependent — but do you remember the numbers? Well, I think it's around the arcminute or something like that; I remember at least that it's better than H.E.S.S., but I'm not quite sure. And it gets better at higher energies, and worse at lower energies. I think it's 0.1 degree. Yeah. Sorry? 0.1 degree. 0.1 degree. Let's say 0.1 degree. Yeah. Another question, also related to what you were showing, which is very challenging at the computational level: Nicolás Mésar is also asking, what is the speed at which CTA processes data?
I mean, this is probably very challenging worldwide. Maybe Chile will participate in the data processing. The rate at which it produces data — that's also a number which I don't have in my head, so I don't know. Yes, CTA is one of the largest observatories in terms of the amount of data it will produce, no? Yes, yes, it will be producing a huge amount of data. And so it is being discussed how much of those data will be processed on site, here in Chile, and how much of the processing will be done abroad, in the countries that are putting most of the money into CTA. But no, I don't really know the number. So I don't know whether Farinaldo knows, or Germán. Well, there is one thing — the question was, what is the speed of CTA processing data? At the moment, there is no automatic pipeline for processing the data in CTA; it's something that is under construction. But I guess the relevant question is at what rate CTA will be producing data, because you have to process everything you produce. So maybe we will get there somehow. Yeah. But also, one of the things Farinaldo's talk was saying, and you were also confirming, is that with CTA and the quest for dark matter in general, there are going to be a lot of opportunities also for data scientists, or people working with big data, related to all this — precisely because of how much data is going to be produced. Yeah. So maybe we can continue. Thank you, Andreas; we will come back to the rest of the questions in the round table. Now we can go to the next speaker, who is going to be José Bazo from the Pontificia Universidad Católica del Perú, and he will be talking about the Alpha Magnetic Spectrometer. So José, whenever you want, you can share your screen. Hi. Okay, hi.
So, thank you for the invitation. Just to mention that I used to work on AMS while I was a postdoc in Italy, but there is one group in Brazil, led by Manuela Vecchi, that is also involved in AMS. So I'm going to talk about this; just let me share the screen. Okay, I hope you can see it. Yeah, you can see it. Well, in general, AMS is looking at cosmic rays — charged cosmic rays; there is also some work on gamma rays, but mainly charged cosmic rays — at energies between the GeV and TeV scales, as you can see in this plot of the fluxes of the different particles. So it's mainly protons that one sees, but then, if you remove these protons, if you consider them as background, you will get electrons, positrons, and also antiprotons, and we are looking for other particles as well. The advantage of being in space is that you don't have the atmosphere, and it's easier to identify particles or antiparticles: you get a precise measurement of the energy and a good proton separation. However, since you are on the space station, the detector has to be smaller, so you have a smaller effective area, and, well, the lifetime will depend on the state of the detector. In general, there are right now 14 countries involved and 46 institutes. As I was mentioning, there is one in Brazil; we here in Peru are not directly involved — I was, while I was there. It was launched with the Space Shuttle Endeavour in May 2011, so now it's been more than six years on the space station. Since it's on the space station, it's orbiting Earth every 93 minutes at an altitude of 400 kilometers. So far, it has collected more than 107 billion events. And in the lower plot, you see the Payload Operations Control Center at CERN, where the detector is monitored and the communication with NASA is 24/7. So it's not one big detector; it has several sub-detectors, like a very small particle physics experiment.
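The orbital period at a 400 km altitude can be cross-checked with Kepler's third law for a circular orbit, T = 2π√((R+h)³ / GM), using standard values for Earth's radius and gravitational parameter — a hedged sanity check, not a number from the talk's slides.

```python
import math

GM_EARTH = 3.986e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # Earth's mean radius, m

def orbital_period_minutes(altitude_km):
    """Circular-orbit period in minutes at the given altitude above Earth."""
    a = R_EARTH + altitude_km * 1e3
    return 2.0 * math.pi * math.sqrt(a**3 / GM_EARTH) / 60.0

print(round(orbital_period_minutes(400), 1))  # ~92-93 minutes at ISS altitude
```

So the space station, and AMS with it, completes roughly 15 to 16 orbits per day, which matters for the exposure and for the time variations mentioned later.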
Here you see all of them. At the top of the detector there is one layer of the tracker, but then you have one of the main sub-detectors that helps to distinguish protons from electrons and positrons: the TRD, the transition radiation detector. Another thing that also helps us to separate positrons or electrons from protons is the electromagnetic calorimeter, the ECAL, which is at the bottom of the detector. And in the middle you have the TOF, the time of flight, which serves the triggering system. And then several layers of the tracker, which, together with the magnet, give us the direction, the sign of the charge, and the momentum. And there is also the RICH, which helps us to measure the charge and the energy of the particles. So as I was saying, one of the main things that we have to do is to separate the background of protons from what we want to measure, for example positrons and electrons. This is done with mainly two detectors, the TRD and the ECAL. Here you can see both of them. In the case of the TRD, you have positrons or electrons, which are lighter, and the transition radiation probability is inversely proportional to the mass. So that's why you get more radiation from positrons. And if you gather all this information, you can build a likelihood for the proton background and for the signal in this case, and you can separate them nicely. And then we use the electromagnetic calorimeter. So you also have different signals in the ECAL, the shower shapes of positrons or electrons in this three-dimensional configuration. So if you take into account all these parameters and you combine them in a multivariate analysis, like a BDT, a boosted decision tree, you can also separate electrons and positrons from protons. And here you can see one event, an example of one event.
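Before the event display, the likelihood separation just described can be sketched as a toy. Everything below (Gaussian detector responses, their means and widths, the sample sizes) is invented for illustration; the real AMS likelihood combines the signals of many TRD layers.

```python
import math
import random

# Toy sketch of a TRD-style likelihood separation: positrons radiate
# more transition radiation than protons, so model their responses as
# two Gaussians and classify with the log-likelihood ratio.
# All numbers here are hypothetical, chosen only for illustration.

random.seed(1)

SIG_MU, SIG_SIGMA = 2.0, 0.5   # assumed positron-like response
BKG_MU, BKG_SIGMA = 1.0, 0.5   # assumed proton-like response

def log_gauss(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def llr(x):
    """Log-likelihood ratio: positive values favour the positron hypothesis."""
    return log_gauss(x, SIG_MU, SIG_SIGMA) - log_gauss(x, BKG_MU, BKG_SIGMA)

positrons = [random.gauss(SIG_MU, SIG_SIGMA) for _ in range(5000)]
protons = [random.gauss(BKG_MU, BKG_SIGMA) for _ in range(5000)]

# Cut at llr = 0 (equal priors): fraction of signal kept and background rejected.
eff = sum(llr(x) > 0 for x in positrons) / len(positrons)
rej = sum(llr(x) <= 0 for x in protons) / len(protons)
print(f"signal efficiency {eff:.2f}, background rejection {rej:.2f}")
```

With the two toy populations one sigma on either side of the cut, both numbers come out around 0.84; the real analysis trades efficiency against a much harsher rejection, since protons outnumber positrons by orders of magnitude.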
This is a one TeV electron: it starts at the top, in one of the layers of the tracker, then it leaves a signal in the TRD, then the TOF, more layers of the tracker, also in the RICH, and in the electromagnetic calorimeter, which measures the energy, in this case one TeV. Just to give an example of the data that we have used. AMS has had different publications in PRL, starting with the positron fraction. The positron fraction is just the number of positrons divided by the sum of positrons plus electrons. The first one was up to 350 GeV, then we went to 500 GeV. There were also publications about the electron and positron fluxes, together with protons, helium, antiprotons, boron and carbon, and there are others coming in the next months, about the anisotropies or time variations. So now, for what concerns us, dark matter, what do we see? For example, in this case, you can see a plot of the sum of the positrons and electrons, so the sum of the particle and antiparticle fluxes, and the data from AMS are the blue dots. Now you want to fit these dots with some theories, so you start with what we know basically from astrophysics: you have electrons from supernova remnants, and then you will have some positrons coming from interactions in the interstellar medium. So then you will have, for example, the light blue curve for electrons, which is the highest one, and then a smaller contribution from positrons. But still, if you add those two things, you are missing something that we are seeing in the data. So that would be the green curve. We call this, in this case, a common source. Now the question is: is this common source coming from dark matter, from pulsars, from other sources? That's the question that we have to address. You can also see here the separate fluxes for electrons. You see the contribution from this common source is smaller compared to the diffuse flux.
And in the case of positrons, it's a much larger contribution, especially at higher energies. Now, as Farinaldo was mentioning in the first talk, you can have dark matter annihilation, and among the products of this annihilation you will have positrons, antiprotons, and other particles, which is what we are measuring. So now, if you want to explain the data that we are seeing with AMS with dark matter, you can postulate dark matter candidates. For example, in this case, with one TeV as the mass of the dark matter candidate, you can fit the positron flux and also the positron fraction. As I mentioned, the positron fraction is the number of positrons divided by the sum of positrons plus electrons. And you see that it fits nicely. Another thing that would be necessary in the case of dark matter is an isotropic distribution, and also you have to fit the antiproton spectrum. Here you see the ratio of antiprotons to protons, and using a similar explanation with dark matter, you see that it fits. You have the normal collisions producing antiprotons, but still there is something extra that we are seeing that could be due to dark matter annihilation. However, in this case, the antiproton contribution is not so easily explained by pulsars, but you can postulate different alternative models to explain this extra excess in the case of positrons and antiprotons. So for example, you can say that there are pulsars, these rotating magnetized neutron stars, and you can fit the results of AMS with this too. Or you can say, for example, that there are supernova remnants also producing antiprotons, or fitting the positron fraction; there you can also find some papers. At the time of the first publication of AMS, there were more than 200 papers trying to explain these results with different theories. In this case, for example, it's a modified cosmic ray propagation: just by secondary production, you can also get the positron fraction.
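The "diffuse plus common source" decomposition can be sketched numerically. The functional form below is in the spirit of a minimal model (a soft power law plus a harder power law with an exponential cutoff); every coefficient is invented for illustration and is not a fitted AMS value.

```python
import math

# Toy decomposition of the positron flux into a diffuse (secondary)
# term and a harder "common source" term with a spectral cutoff.
# All normalisations and indices are hypothetical.

def positron_flux(E, C_d=1.0, g_d=3.6, C_s=0.08, g_s=2.5, E_s=800.0):
    """Toy positron flux (arbitrary units) vs energy E in GeV."""
    diffuse = C_d * E ** (-g_d)
    source = C_s * E ** (-g_s) * math.exp(-E / E_s)
    return diffuse + source

def electron_flux(E, C=20.0, g=3.2):
    """Toy electron flux, dominated by the diffuse component."""
    return C * E ** (-g)

def positron_fraction(E):
    """N(e+) / (N(e+) + N(e-)), as defined in the talk."""
    ep = positron_flux(E)
    return ep / (ep + electron_flux(E))

for E in (10, 100, 500):
    print(E, round(positron_fraction(E), 3))
```

Because the source term is harder than the diffuse ones, the fraction rises with energy, which is the qualitative feature of the AMS measurement that both the dark matter and pulsar interpretations try to explain.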
Now, the thing is that with the data that we have so far, it's not so easy to distinguish between the different models. So we have to wait some more time to gather more data. This study was done some time ago, but by 2024 we will have enough data, for example, to separate between models based on pulsars and those based on dark matter. You can see here the samples explaining the positron fraction and the positron flux, and then you see the points, so this is the expected number of events given some of the models, with the error bars given the statistical and systematic errors. And you see the models produced by pulsars are different from those that are due to dark matter. So you could tell them apart in some more time, gathering more data. Then there's also the anisotropy, which is still something they are working on and will be published soon. The current value of the anisotropy is compatible with both pulsars and isotropy. So again, we have to wait for some time to see if there is a real anisotropy or not. So, in conclusion: if you want to know more details about other results from AMS, you can see this link; and we have to wait for some more data to be able to distinguish, but there is a dark matter explanation for what we have observed with AMS. Thanks. Thank you very much, José. So one of the first questions that we have here, from Farinaldo in fact, is the question that everybody wants to know: how long is AMS still going to be running? Because, I mean, it has been in space for like five years at least, or more maybe. Six and a half years. Well, it's still running, and we hope that it will run for a longer time, at least the lifetime of the International Space Station. Of course, there have been some problems.
I think I cannot reveal much because, well, I'm no longer part of the collaboration, but I still have some insights. It's still working, still working, and will be gathering more data. There might be some more EVAs, extravehicular activities, to help it go farther. So yeah. I just have one question. I just heard a rumor that AMS has been already funded for 10 more years, so I wanted to know if that's true or not. Well, about the funding, I'm not involved anymore, but yeah, the detector is still working and I'm sure that they will get the money. There's even been a new film from NASA about Professor Ting getting the detector to the International Space Station, the fight for flight, if you want to see it. So I think there is no problem with getting funding; it's just that they should keep going on. But as I said, we'll have gathered more data by 2024, something like that; at least, from the models, we could tell apart if it's dark matter or pulsars or something else. So, I mean, we have another question, but I don't know how long it could take. But anyway, I'm gonna ask it to you. Jorge Diaz is asking: how much does the Earth's magnetic field affect the measurements on board the space station, and how is this corrected in AMS? If the Earth's magnetic field has an impact on AMS itself? Well, for example, when it goes through the South Atlantic Anomaly, we get lots of background, so we just don't use the data when it's in the South Atlantic Anomaly. And then there are some coarse corrections for the lower-energy events, but at higher energies, which are the ones important for us, it's not such a big issue. Of course, if you want to go to solar physics with lower energies, then you have to take into account the geomagnetic cutoffs and everything, the effects of the solar activity and so on, below 10 GeV; well, some say below 100 GeV, but actually you only see those effects below 10 GeV. So, yeah. In fact, this is one of the nice things about AMS.
It's also very interdisciplinary, inter-field, let's say. People that want to do solar physics, cosmic ray physics, dark matter, or other types of astrophysics, everything can be, I mean, not everything, but there are a lot of fields that have a lot of interest in the observations that AMS can do. Yeah. Yeah, it's very interesting. So maybe we can continue with the questions in the round table. So thank you very much, José. So now we are gonna pass to Sergey Kovalenko from the Universidad Técnica Federico Santa María. Sergey. Yeah, thank you. Yeah, can you see the slides? Yeah. Yeah, we can see you and the slides. Okay, so thank you very much to the organizers for this opportunity to present the ANDES lab project. It is still a project. In this context, ANDES is short for Agua Negra Deep Experiment Site. The idea is to construct an underground laboratory, the first of its kind in the southern hemisphere and the third deepest in the world, inside the Agua Negra road tunnel, which is going to connect Chile and Argentina in around 10 years, optimistically. This project was proposed by an international team of scientists from Argentina, Brazil, Chile, and Mexico. Xavier Bertou from Bariloche is the chief coordinator, and there are national coordinators from each country. The Agua Negra tunnel, first of all the tunnel: this is a strategic, very well economically motivated project, due to the growing trade of Argentina and Brazil with the Asia-Pacific region. Currently, the transport connection between the countries is realized by several high mountain passes; the Agua Negra pass is one of them. All of these passes are heavily dependent on the weather conditions and suffer severe cuts in winter due to snow. The overall solution, of course, would be to drill a tunnel through the mountain range and construct a corridor between the west and east of the continent, so that the goods would be transported through the tunnel towards Chilean ports and then shipped further.
The total estimated cost is more than one billion dollars; this is an estimate with a tendency to rise, of course. This is the geographic location of the Agua Negra pass and the future tunnel: it will connect the Coquimbo region on the Chilean side and San Juan province on the Argentinian side. The tunnel actually consists of two tunnels, one-directional two-lane road tunnels, 14 kilometers long, 12 meters in diameter, separated by a distance of 60 meters. The laboratory is planned for installation in the Chile-Argentina branch at the deepest point of the tunnel, whose depth amounts to four and a half kilometers of water equivalent of rock overburden, which will make, or would make, the laboratory the third deepest underground laboratory in the world. This deepest point, and the laboratory, is at a distance of three and a half kilometers from the Chilean entrance and 10 kilometers from the Argentinian exit. This is a schematic layout of the laboratory; it is typical for underground laboratories of this scale. It is going to contain two halls, the main hall and the secondary hall, with these dimensions: 50 meters and 40 meters long, 21 and 16 meters wide, and so on. This is a cross-section of the halls, with a human figure for scale. There will also be two pits for the installation of large detectors. This is the large pit, 38 meters deep and 30 meters in diameter, and another, smaller, ultra-low-radiation pit for some very low background experiments. And this big pit, I forgot to comment, is conceived for hosting a big liquid scintillator neutrino detector, akin to the KamLAND or Borexino detectors. So this is the layout. There are several places for smaller-size experiments, for example for applied physics.
There are two support labs envisaged in this project, one on each side, in La Serena, Chile, and in Rodeo, Argentina. The idea, or the hope, is that these labs and the nearby universities will mutually benefit from collaboration with this laboratory. Now the question: why do we need to go deep underground? Mainly because of the cosmic-ray muons, which are produced in the atmosphere by cosmic rays. The hadronic component of the shower can easily be screened with a few meters of concrete or overburden, while muons, and of course neutrinos, are capable of penetrating very deep underground. At sea level, the muon flux is about 100 muons per square meter per second. Why are these muons so unwelcome? Because of their nuclear reactions with the environment, with the nuclei of the surrounding matter, in which neutrons appear abundantly, and these neutrons are the principal background for direct dark matter detection via elastic scattering on nuclei. This is the reaction of direct dark matter detection, and neutrons can easily mimic it in elastic scattering, producing a similar nuclear recoil; the nuclear recoil is what is detected in this sort of experiment. That's why the underground labs searching for such very rare processes should be installed deep underground. The muon flux is an exponentially decreasing function of depth. This is a curve of the muon flux versus depth, in terms of kilometers of water equivalent of rock overburden. So the deepest lab in the world is CJPL in China, then comes SNOLAB in Canada, and ANDES with its four and a half kilometers of water equivalent. This is a list of running, existing underground labs; there are also smaller underground labs. And the ANDES lab is going to sit in between the Sanford Underground Research Facility and SNOLAB.
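The exponential suppression of the muon flux with depth can be sketched with a commonly used two-exponential fit (in the style of Mei and Hime's flat-overburden parameterization); the constants below are quoted from memory and should be treated as indicative only.

```python
import math

# Sketch of the muon flux versus depth curve described above.
# Two-exponential flat-overburden fit; the constants are indicative,
# not authoritative values.

I1, L1 = 67.97e-6, 0.285   # cm^-2 s^-1 and km.w.e. (assumed fit constants)
I2, L2 = 2.071e-6, 0.698

def muon_flux(h):
    """Approximate total muon flux (cm^-2 s^-1) at depth h in km.w.e."""
    return I1 * math.exp(-h / L1) + I2 * math.exp(-h / L2)

for h in (1.0, 2.0, 4.5):   # 4.5 km.w.e. is the quoted ANDES depth
    print(f"{h} km.w.e.: {muon_flux(h):.2e} muons/cm^2/s")
```

The point of the exercise is the steepness: each extra kilometer of water equivalent buys orders of magnitude in muon suppression, which is why the deepest point of the tunnel is the interesting one.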
What is special about this lab, the ANDES lab? Its location. As you see from this map, all of the existing underground labs are located in the northern hemisphere. So the first advantage of the ANDES lab is its location in the southern hemisphere, where any seasonally induced modulation of the signal would have the opposite phase. And this is important for the claim of the DAMA/LIBRA experiment on the observation of an annually modulated dark matter signal. Another advantage is that it would be among the deepest in the world. There are other advantages, mainly for neutrino experiments: it is distant from nuclear reactors, and so has a low reactor neutrino background, and so on. As for the scientific program of the lab, it is too early to discuss anything definite, because this field is evolving very dynamically; in a couple of years, or in the near future, its status may change dramatically, of course. But nevertheless there are a bunch of possibilities discussed by the collaboration, among them: to host a large Latin American neutrino detector, a large-scale liquid scintillator neutrino detector, in the large pit which I mentioned before; neutrinoless double beta decay experiments, which, together with dark matter searches, are found at nearly every existing underground lab; some experiments for applied measurements; and of course dark matter search experiments. Currently, among the most attractive possibilities there is the idea to host a clone of the DAMA/LIBRA experiment in this laboratory, in order to disentangle a dark matter annually modulated signal from seasonally induced modulation, and ultimately confirm or refute the DAMA claim of an annual modulation of the dark matter signal.
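The hemisphere argument can be illustrated with a toy model: a dark matter modulation that peaks around June 2 in both hemispheres (set by the Earth's velocity through the halo), versus a seasonal systematic that follows the local summer and so flips by half a year in the South. The amplitudes and the seasonal phase below are assumed for illustration.

```python
import math

DAYS = 365.25
T0_DM = 152                       # ~June 2: expected dark matter maximum
T0_NORTH = 182                    # assumed peak of a northern seasonal effect
T0_SOUTH = T0_NORTH - DAYS / 2    # southern summer, half a year away

def cosine(t, t0):
    return math.cos(2 * math.pi * (t - t0) / DAYS)

def dm_signal(t):
    return 1.0 + 0.02 * cosine(t, T0_DM)   # toy 2% modulation amplitude

def seasonal(t, south):
    return 1.0 + 0.02 * cosine(t, T0_SOUTH if south else T0_NORTH)

def corr(a, b):
    """Pearson correlation of two equal-length series."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

ts = list(range(366))
dm = [dm_signal(t) for t in ts]
north = [seasonal(t, south=False) for t in ts]
south = [seasonal(t, south=True) for t in ts]

# A seasonal fake mimics the DM phase in the North but is
# anti-correlated with it in the South.
print(round(corr(dm, north), 2), round(corr(dm, south), 2))
```

This is exactly why a DAMA-like detector in the South is decisive: a true dark matter signal keeps the June phase, while an environmental effect flips sign.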
Organizationally, this laboratory will be managed by a Latin American consortium for underground studies, CLES, which currently includes four participants, formed by four countries: Argentina, Brazil, Chile and Mexico. And as far as I know, Colombia has also expressed its interest in participating in this collaboration. Summary: ANDES is a unique opportunity for a deep and large underground laboratory in the southern hemisphere, with various scientific and educational advantages. It has received strong international support and interest from many international experimental collaborations. The consortium opens additional opportunities for the integration of Latin American science. And the expected commissioning of the lab is in 10 years, optimistically, of course. More information can be found on this site. So, thank you. Thank you very much. Thank you, thank you. So, I guess we have a question from Fernando; maybe, Fernando, you want to ask him? Yeah, so one of the things that I wonder is, why haven't we pushed for a scientific paper regarding ANDES? Because I think this is missing in the literature. We have done this in the past for Fermilab, for SNOLAB, and many others, and even experiments like DUNE pushed for a paper before they even built it and got the experiment approved. Shouldn't we do the same? Do you mean ANDES lab? Yes. Yeah, I didn't get your question, actually. What should we do? It is still on the way. It is unfortunately not yet completely approved by the Chilean government. It is on the way, so we should wait. But there were four workshops organized in the last years on the subject, on the scientific program of this lab. Yeah, but I mean, not even the workshops. I mean, for instance, if you do an INSPIRE search with the title ANDES... Ah, I mean. You find 11 papers, and out of these 11, there are like two which are real papers.
So maybe if we had a paper describing what kind of physics we could do, I'm sure that it would trigger lots of follow-up studies on this. Do you mean on the scientific program of this lab? Yeah, maybe, but we should probably discuss this. There are actually proceedings of the workshops which are published online, but nothing on INSPIRE. Okay. Okay, no more comments; it's a good idea, but I cannot comment more. Yeah, maybe later in the round table we can expand a little bit. Yeah, thank you very much. Okay, we are gonna continue with the program, because the time is passing fast. So the next speaker is gonna be Rogério Rosenfeld from UNESP in Brazil, and he will talk about the Dark Energy Survey. Please, Rogério. Okay, can you hear me? Yes, we can hear you. Can you see my screen? Yes, we can see your screen. Okay, so thanks for the invitation. I'm going to talk about the Dark Energy Survey for the dark matter day. To tell you what the Dark Energy Survey is: it's a large collaboration of about 400 scientists around the world, and we have a Brazilian consortium that is participating in the collaboration. It's led by Fermilab, and the director now is Josh Frieman from Fermilab. So what we want to do in terms of science: we want to use four different observational probes of dark energy. We want to study large galaxy clusters and the statistics of galaxy clusters; we expect to have tens of thousands of clusters up to redshift of one. We also use weak lensing, these distortions of the shape and magnification of galaxies, which has information about the matter distribution. We use baryon acoustic oscillations, or, if you want, the correlation function of galaxies, which has this feature; for this we're going to use 200 million galaxies, also with redshift up to one. And also supernovae. And one thing that we want to do is to constrain, or get more information on, the dark energy equation of state.
So you can use a very simple parameterization like this one, with two parameters w0 and wa, where w(a) is the ratio between the energy density, sorry, the pressure and the energy density of this fluid which we call dark energy, which is responsible for the accelerated expansion of the universe. This is the type of forecast that we have for these two parameters. So this is just a forecast, and one thing that's important to notice is how much you gain by combining these different probes here. So, just to tell you, the project is based in Chile, at CTIO, actually not too far from the place where the tunnel is supposed to be built; it's close to La Serena. Actually, it's on the same road that leads to the tunnel, the Agua Negra tunnel. I forgot to mention that the next workshop on the Agua Negra project is going to be here in São Paulo in August next year. So this was just a parenthesis. So we use this old telescope here, which is called the Blanco Telescope, a four-meter telescope. It has been refurbished in order to be able to hold this camera here, which we call the Dark Energy Camera. This is the Blanco Telescope, and all the support was built in order to hold the camera, which is a four-ton camera. And this camera here is a CCD camera; it has 570 megapixels and is able to see light from more than 100,000 galaxies, up to 8 billion light-years away, in each snapshot. So this is just the array of the CCDs, and just to give an idea of the resolution, it's really good resolution. I couldn't refrain myself from showing this: this is the Dark Energy Camera picture of the famous gravitational wave event of the neutron star merger. This is the optical counterpart, the first optical counterpart of a gravitational wave event. And with this, one can even estimate, with one event, the Hubble constant, because you know all the parameters of this thing.
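The two-parameter w0, wa equation of state mentioned at the start of this talk (often written w(a) = w0 + wa(1 - a), the CPL form) can be sketched as follows. The density evolution comes from integrating the continuity equation for this w(a); the example parameter values are arbitrary.

```python
import math

# Sketch of the CPL two-parameter dark energy equation of state,
#   w(a) = w0 + wa * (1 - a),
# and the resulting density evolution
#   rho(a)/rho0 = a^(-3(1 + w0 + wa)) * exp(-3 wa (1 - a)).

def w(a, w0=-1.0, wa=0.0):
    """Equation of state as a function of the scale factor a (a=1 today)."""
    return w0 + wa * (1.0 - a)

def rho_ratio(a, w0=-1.0, wa=0.0):
    """Dark energy density relative to its value today."""
    return a ** (-3.0 * (1.0 + w0 + wa)) * math.exp(-3.0 * wa * (1.0 - a))

# For a cosmological constant (w0=-1, wa=0) the density stays flat:
print(rho_ratio(0.5))   # -> 1.0
# An arbitrary quintessence-like case evolves with the expansion:
print(round(rho_ratio(0.5, w0=-0.9, wa=0.2), 3))
```

Constraining how far (w0, wa) can stray from (-1, 0), where the fluid reduces to a cosmological constant, is precisely what the combined DES probes forecast.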
So in terms of dark matter, I think one of the major contributions of the Dark Energy Survey was to identify new dwarf galaxies. We discovered 17 new dwarf galaxies, and 27 were known before the Dark Energy Survey. And there was a fundamental contribution from our team, especially from Santiago's team in the south of Brazil. And there was a joint paper with Fermi-LAT, because Fermi-LAT was targeting these dwarf galaxies, since the dwarf galaxies are dark matter rich, and hence the idea was to get bounds from indirect dark matter searches. And one of the conclusions is that thermal WIMPs with masses less than, say, 100 GeV are excluded, but of course there are some assumptions, that they are thermally produced and annihilate in the b b-bar channel, I think... oh, sorry. This just shows the cumulative number of dwarf galaxies, so you can see how much DES has contributed to this. And this is a plot of recent results, not so recent anymore, but last year's results of Fermi-LAT and DES using 45 dwarf galaxies, Milky Way satellite galaxies, which are rich in dark matter. What you see here is the annihilation cross section on the y-axis and the dark matter mass on the x-axis, and this is the b b-bar channel; what you can see is that if you want the dark matter to be a thermal relic, you exclude dark matter masses below, say, more or less 100 GeV. So this is something important. One thing that's also nice is that when you study the galaxy power spectrum, the distribution of galaxies in the universe through a Fourier transform of that, it's sensitive to new physics. In particular, it could be sensitive to the mass of dark matter. If you have, instead of cold dark matter, warm dark matter, what happens is that you have a suppression in the power spectrum, just because of the free streaming of this warm dark matter. However, don't get too excited about this, because this is the linear power spectrum, and at these scales nonlinear physics is very important.
And if you take into account the nonlinearities here, this effect is almost washed out. I have a student, Jessica, who is working on this, and unfortunately you lose much information in this observable; the effect is almost zero. So this is bad news: I think the Dark Energy Survey will not be very sensitive to the mass of dark matter. And now I'll finish by showing some cosmological results from the year one data. We have an area of approximately 1,300 square degrees, which is approximately one fifth of the survey area. And we have combined two cosmological probes, the galaxy clustering and the weak gravitational lensing, with almost 26 million source galaxies and 650,000 lens galaxies. So we combine observations from this weak lensing probe, so the distortions of these galaxies, the two-point correlation function of these galaxies, and also the cross observations. This was really the first time this was done in the same experiment. I forgot to mention that the Dark Energy Survey does not measure redshifts using spectra, but measures redshifts using filters. So the redshift information is not very precise, but we gain in terms of how many galaxies we can observe. So I'll show you some of the results that were made public this year, not so long ago, a couple of months ago. First of all, the cosmological parameters that we analyze are the amount of matter in the universe, the amplitude of the scalar fluctuations, the scalar index of the power spectrum, the amount of baryons, the Hubble constant, the contribution of neutrinos, and the equation of state. So these are the cosmological parameters, but there are also other parameters which are related to the modeling in the framework of DES. So we need to model bias and redshift uncertainties. So there are many parameters, which we call nuisance parameters, and the covariance matrix is also not easy to estimate, but these are the results that we have.
So this is amazing, I think. This is Omega matter versus S8, a parameter related to the amplitude of fluctuations at the scale of eight megaparsecs. And what you can see here are the results from DES in blue and from Planck in green. And the amazing thing is that the areas are not very different. It's the first time that a survey of galaxies is giving results which are comparable to Planck results. And it turns out you can combine the two datasets, and you get the most precise values for Omega matter and this parameter S8. Now, this is in the context of Lambda-CDM. We can go one step forward and also constrain the equation of state, for a constant equation of state. So there's one extra parameter, and that's what we get. What you see here is the constant equation of state parameter, and this is the result for DES only, but you have to look at DES plus Planck plus BAO plus supernovae, which is this red curve. This is the PDF for the equation of state combining all possible data, and the result that we get is minus 1.00 plus or minus 0.05, basically. So the sentence that goes with this: when we first saw it, Scott Dodelson wrote, "there goes the Nobel Prize," because we are just confirming Lambda-CDM. So, I think this is all I had to say. DES is finishing its five-year data taking; next February it's going to be finished, and maybe there will be a six-month extension. And this was all based on year one; we are starting the analysis of years one, two, and three. So this is all, thank you. Thank you very much, Rogério. So, I mean, I don't know if there are questions, but one of the nice things about DES, just for the public that is watching us now, is that DES is giving us other information that was not discussed in the other talks, that is, how the structure formation is going, no?
And the history of the universe. That's right, that's right. And the importance of dark matter in all this, and of dark energy as well. Right, right. So we are sensitive to the total amount of matter in the universe, this Omega matter, and we're sensitive to the fluctuations of the matter in the universe, the sigma eight or S8. And we're unfortunately not very sensitive to the properties of dark matter at this point. Also, I think the main contribution, as I said, of the Dark Energy Survey to dark matter is the discovery of dwarf galaxies, satellite galaxies of the Milky Way, which were then targeted for the indirect detection of dark matter. I don't know if there is another question. Just because we were talking about dark matter and dark energy: in fact, here behind me I have these two puppets, I mean, plush toys. This is dark energy, and I also have dark matter, because today is the day of dark matter. So today we're talking about this guy. So, I mean, just a break, you know, okay. So, I mean, let's keep the questions for the round table, but let's go now to the last talk. Juan Carlos is gonna talk about N-body simulations and the role of dark matter in them. So Juan Carlos, if you are there. Yes, I am. Yeah, hi. Hi. Can you give me one, please? Yeah. Sure. This time. This one. This one. So, I'm sorry to give you my back, because the microphone is just here. Thanks to the organizers for giving me the opportunity to talk here about this issue, concerning one of the key ingredients we use to study dark matter in the universe, which in principle is this technique of N-body simulations that we actually use for many different things, from modeling detections of the dark matter signal to studying the large-scale structure.
So, in principle, observations suggest that around 25% of the energy density content of the universe is in the form of mass, and around 80% of that mass is in the form of dark matter. This means that, in general, the density field in the universe is dominated by the presence of dark matter. The way it distributes and behaves, like we see here in what we know as the cosmic web, where we have the distribution of mass in filaments, halos and voids, which is the common or regular pattern in which non-linear structures are organized because of the presence of gravity, is the common thing we see when we go to the telescopes and study the distribution of mass. Actually, what we do when we go to the telescopes is to study the distribution of galaxies. Galaxies, we believe, form in the very deep regions of the potential wells provided by the dark matter halos, which we are going to assume are just the peaks of this mass density field. So for us, talking about dark matter and the distribution of dark matter in the universe will be very closely related to the way galaxies are actually distributed in the universe. And we already saw it in the previous talk, when they were talking about the Dark Energy Survey. So the question is: if we want to really understand the nature and the properties of dark matter, we also need to know the way it is distributed in space and in time. We need to be able to answer the question of how the dark matter is distributed in the universe, in any place, at any time, in general. In order to answer this question, you have, of course, to write down some equations. And one of the, in principle, easiest things is to use a linear approximation. What we see here is the linear growth equation, what we get when we use perturbation theory to study the growth of the density contrast, of density fluctuations in the mass density field.
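The linear growth equation just mentioned can be illustrated numerically in the simplest possible background, a matter-dominated (Einstein-de Sitter) universe, where it reads d2D/dt2 + 2H dD/dt = (3/2) Omega_m H^2 D with H = 2/(3t) and Omega_m = 1, so the growing mode is D proportional to t^(2/3), i.e. to the scale factor a. The plain Euler integration below is a toy choice, not what real Boltzmann codes do.

```python
# Integrate the linear growth equation in an Einstein-de Sitter
# background:
#   d2D/dt2 + (4 / 3t) dD/dt - (2 / 3t^2) D = 0,
# starting on the growing mode D = t^(2/3) at t = 1.

def growth_eds(t_end, n=100000):
    """Return D(t_end), integrated with a simple Euler scheme."""
    t = 1.0
    dt = (t_end - 1.0) / n
    D, dDdt = 1.0, 2.0 / 3.0          # D = t^(2/3) and its derivative at t=1
    for _ in range(n):
        d2D = -(4.0 / (3.0 * t)) * dDdt + (2.0 / (3.0 * t * t)) * D
        D += dDdt * dt
        dDdt += d2D * dt
        t += dt
    return D

# From t=1 to t=8 the scale factor a ~ t^(2/3) grows by a factor of 4,
# so the density contrast should also grow by about a factor of 4:
print(round(growth_eds(8.0), 3))
```

This linear growth is exactly what stops being valid once the density contrast approaches unity, which is the regime the N-body simulations discussed next are built to handle.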
In general, if we want to study the density distribution we can focus on the regions of high density contrast, because those are the places in the universe where the interesting stuff is going on. The point is that this is just an approximation that doesn't work in general: we cannot use it to explain, for example, the formation and growth of the dark matter halos and galaxies that we see today in the universe. And it is in any case difficult to solve, even analytically, because we always have to consider the behavior and interaction of matter within the expanding cosmological background. So in general, what we do when we want to solve this problem from the very beginning, for any general case, is to use N-body cosmological simulations. What we actually do is map this continuous density field to phase space: we cast the problem as one for the distribution function of particles in phase space, where we can use a Monte Carlo approach — we sample the distribution function with particles and solve the equation of motion for each of these particles, intending to provide a solution for the motion of the whole set of particles in the volume. In this way we turn the problem of the continuous mass density field into what we know as the N-body problem. That's what we have here: a common image of the evolution of a piece of the universe. For all of us the universe is cubic — it's the easiest way to implement, for example, the cosmological principle, through periodic boundary conditions. And we start from very homogeneous initial conditions, with small perturbations provided by linear theory and by the density contrast observed in the temperature fluctuations of the CMB.
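As a toy illustration of the scheme just described — sample the density field with particles, then integrate each particle's equation of motion in a periodic box — here is a direct-summation sketch. Real cosmological codes work in comoving coordinates with tree or particle-mesh gravity; the static box, G = 1 units, and softening length here are assumptions for illustration only.

```python
import numpy as np

# Toy direct-summation N-body step in a periodic box: "sample the
# distribution function with particles, then integrate each particle's
# equation of motion".  G = 1, static box, Plummer softening (all
# illustrative assumptions; real codes use comoving coordinates).
L = 1.0          # box size (periodic)
EPS = 0.01       # softening length, suppresses two-body scattering

def accelerations(pos, mass):
    # pairwise forces with nearest-image periodic wrapping
    d = pos[None, :, :] - pos[:, None, :]          # (N, N, 3) separations
    d -= L * np.round(d / L)                       # minimum-image convention
    r2 = (d**2).sum(-1) + EPS**2
    np.fill_diagonal(r2, np.inf)                   # no self-force
    return (mass[None, :, None] * d / r2[..., None]**1.5).sum(axis=1)

def leapfrog(pos, vel, mass, dt, steps):
    acc = accelerations(pos, mass)
    for _ in range(steps):                         # kick-drift-kick
        vel += 0.5 * dt * acc
        pos = (pos + dt * vel) % L                 # periodic boundaries
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos, vel

rng = np.random.default_rng(0)
N = 64
pos = rng.random((N, 3))              # near-uniform initial conditions
vel = np.zeros((N, 3))
mass = np.full(N, 1.0 / N)
pos, vel = leapfrog(pos, vel, mass, 1e-3, 50)
print("positions stay inside the box:", pos.min() >= 0 and pos.max() < L)
```

The kick-drift-kick leapfrog is the standard symplectic integrator used by production N-body codes, which is why it appears here rather than a generic ODE solver.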
And then we let the particles evolve in the expanding background, with the gravitational interaction among them. Gravity does the work, and the structures appear — the filaments and the halos and the voids — and in those halos we hope to be able to form galaxies. We are all familiar with these kinds of nice slices of huge N-body simulations, where we actually try to mimic the distribution of mass on large scales, and we end up with features like the dark matter halos, where inside one halo you have more dark matter halos that in principle should be hosting galaxies. That's what we would see in the universe if we were able to look at it through the eyes of dark matter. One of the interesting and at the same time difficult things when you are doing simulations are the issues associated with the limitations of the numerical approximation, the N-body approximation, and the technical limitations you face when you want to run the simulations. One limitation, technical and at the same time theoretical, is that we need to constrain the simulations to a given resolution. We would like to have as much resolution as we want in the simulations, but the resolution — which has to do with the number of particles you use to trace the density field — is not unlimited: the larger the number of particles you use, the larger the computing power you have to invest in order to solve the problem. The point is that if you use a larger number of particles tracing the density field, you will recover in better detail the properties of the density field — the distribution function in phase space — but you will also have the opportunity to resolve in much more detail the physics happening in the piece of the universe you are simulating.
Also, if you don't only focus on the behavior of the dark matter but also try to model the more complex physics happening there — related basically to the way dark matter interacts with baryons — you will need to make things a bit more difficult, introducing the physics of the baryons, which, as we will see, plays an important role when we want to predict the observable things, which is what we compare against when we do our experiments. Another issue is that we also have to be able to run large simulations, not only in the number of particles but in the size of the piece of the universe you simulate: the larger the box, the better the statistics and the better the information about the cosmological context you are trying to capture. Here we see the trend in the evolution of simulations over the last few years, where we are trying to go in the direction of increasing the resolution of the simulations without losing the statistics we get from larger box sizes. The ideal would actually be to grow the curve in the opposite direction, but the larger the box, the larger the number of sampling particles you need, and you start to lose resolution if you don't play with the numbers in the right way. So the trend in the last few years has been to increase the number of particles while reducing the size of the box, in order to recover the more complicated parts of the physics behind the phenomena. As far as we know, dark matter cannot be seen; all we can see, as I already mentioned, are the galaxies. So when we run the simulations we build these kinds of maps of the distribution of mass on very large scales, where we see the halos, the clusters of halos — or clusters of galaxies — and the filaments connecting those structures.
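The resolution versus box-size trade-off can be made concrete with a back-of-the-envelope calculation: at fixed box size, the mass carried by each simulation particle is m_p = Ω_m ρ_crit L³ / N. The cosmological parameters below are assumed fiducial values, not numbers quoted in the talk.

```python
# Back-of-the-envelope mass resolution of an N-body run: each particle
# carries m_p = rho_mean * L^3 / N.  Omega_m and H0 below are assumed
# fiducials, not numbers from the talk.
H0 = 67.7                               # km/s/Mpc
OMEGA_M = 0.31
RHO_CRIT = 2.775e11 * (H0 / 100.0)**2   # Msun / Mpc^3 (standard value)

def particle_mass(box_mpc, n_particles):
    """Mass per simulation particle in solar masses."""
    return OMEGA_M * RHO_CRIT * box_mpc**3 / n_particles

# Doubling resolution per dimension costs 8x the particles at fixed box:
for n_side in (512, 1024, 2048):
    mp = particle_mass(box_mpc=100.0, n_particles=n_side**3)
    print(f"{n_side}^3 particles in a 100 Mpc box -> m_p ~ {mp:.2e} Msun")
```

This is the trade-off in the slide: shrinking the box at fixed particle count improves m_p (better small-scale physics) at the cost of large-scale statistics.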
But what we actually see is the stuff forming inside, which is this thing, the galaxies — as in this result from the recent IllustrisTNG simulations. So once you run the simulation, you have to try to pair the results of the simulations with the stuff you can see, with the observables you have, which in this case are the galaxies. And things don't work that well in that direction. For example, this is an old result, from one of the early simulations, of the kind of thing you can get — and can actually get wrong — when you compare the results of the simulations with the results of the observations. What we see here is a comparison of the abundance of subhalos hosted in a dark matter halo similar to the halos of the Local Group galaxies, like the Milky Way. What we see is the abundance of substructure — sub dark matter halos — hosted in that dark matter halo, each of which in principle should host a satellite galaxy, and the points are the results of the observations. The curve is the result of a dark-matter-only simulation: no baryons, no gas at all has been included in these simulations. And the result is the problem we already know, which is called the missing satellites problem: dark matter simulations produce, or predict, an overabundance of satellite galaxies, of the order of a factor of 10 compared with what we get from the observations. One has to stress that this is not only a problem of the substructure; we see the same problem, for example, with the field galaxies in the Local Group, not only with the satellite galaxies of the Milky Way. Solutions, possible solutions to this problem? There are many.
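The factor-of-ten mismatch can be illustrated with a toy cumulative subhalo velocity function. The power-law slope of roughly −3 is a standard CDM-only result, but the normalization and the observed count used below are illustrative assumptions, not the data shown on the slide.

```python
# Illustrative sketch of the "missing satellites" comparison: CDM-only
# simulations predict a steep cumulative subhalo velocity function,
# roughly N(>v) ~ (v / v_host)^-3.  The normalisation and the observed
# satellite count are assumptions chosen for illustration only.
def n_subhalos_cdm(v_circ, v_host=200.0, norm=0.05):
    """Cumulative number of subhalos with circular velocity > v_circ [km/s]."""
    return norm * (v_circ / v_host) ** -3.0

predicted = n_subhalos_cdm(10.0)   # subhalos that "should" host dwarf galaxies
observed = 50                      # order of the known Milky Way satellites
print(f"predicted N(>10 km/s) ~ {predicted:.0f}, observed ~ {observed}")
print(f"overabundance factor ~ {predicted / observed:.0f}")
```

With these illustrative numbers the mismatch comes out at roughly the order-of-ten factor mentioned in the talk.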
Actually, this problem is very interesting because it opens the door to many different scenarios, where different physics could come into the game and provide an interesting way to solve the problem. For example, this is the result of a very recent simulation in which the formation of dark matter halos with structure and mass similar to the Milky Way halo has been performed, but in this case including the effect of the formation of galaxies in those halos. That means baryonic physics has been included in those simulations, and star formation and feedback recipes have been introduced. When you do such a thing, you produce abundance curves like the green and pink ones you see here in the corner of the plot, and comparing with the points, which are the results of the observations, you see that when you turn on the baryonic physics you can actually, in principle, solve the missing satellites problem — if you do it right, if you do it in the proper way. The point is that we don't really know what the proper way is. I mean, we are tuning parameters, doing somewhat tricky things with these simulations: we tune parameters in order to reproduce, for example, the abundance of satellite galaxies in the Milky Way halo. However, when we fix the parameters to fit the data in this region, we miss, for example, the abundance of field galaxies in the Local Group, which are the stars we see here. The solid curve is the abundance of galaxies in dark-matter-only simulations, and in blue we have the abundance of galaxies in those simulations once the baryonic physics has been included, in the same way it was used to solve the previous problem.
So in principle we think that many things still have to be learned concerning the physics we plug into these N-body, or N-body plus hydro, simulations when we try to understand the results of structure formation; this is a point where we are getting close, but we are not there yet, in solving all the problems. Another possible solution would be to change the nature of dark matter: suppose dark matter is not cold dark matter but warm dark matter, hot dark matter, or mixtures of those. An example of such a test is shown in this slide, where we compare a cold dark matter run from the Aquarius simulations with an equivalent warm dark matter simulation. Changing the nature of dark matter — in this case changing the mass of the dark matter particle, which in principle also changes the velocity dispersion of the particles — suppresses the formation of small structures, and this suppresses the abundance of substructure at the low end of the mass function, as we see here in the strong suppression of these small things that look like flies on a piece of cake. So one possible solution is not that we have the physics wrong on the baryonic side, but that we don't know the nature and properties of the dark matter particle — or in the end the solution may be a mixture of both, or of many of these cases. Another issue with the simulations is the distribution of mass in the very, very high density regions of the density field, which are the centers of the dark matter halos. It is important because the galaxies form in roughly the inner 10% of the halo, and in those regions the simulations are not precise: they are strongly affected by the collapse and by the inflow and outflow of material driven by supernova feedback.
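The suppression of small-scale structure by warm dark matter is often summarized with a fitting formula for the linear transfer function; the form below follows the commonly used Viel et al. (2005) parametrization, and the cosmological values are assumed fiducials, not numbers from the talk.

```python
# Small-scale suppression for warm dark matter: the linear power
# spectrum is filtered as P_WDM = T(k)^2 P_CDM with the fitting form
#   T(k) = (1 + (alpha k)^(2 mu))^(-5/mu),  mu = 1.12,
# where the free-streaming scale alpha grows as the WDM particle mass
# m_wdm decreases (Viel et al. 2005 fit; fiducial cosmology assumed).
def alpha_wdm(m_kev, omega_wdm=0.25, h=0.7):
    """Free-streaming scale in Mpc/h for a thermal relic of mass m_kev."""
    return 0.049 * m_kev**-1.11 * (omega_wdm / 0.25)**0.11 * (h / 0.7)**1.22

def transfer_wdm(k, m_kev, mu=1.12):
    """Ratio of WDM to CDM transfer functions at wavenumber k [h/Mpc]."""
    return (1.0 + (alpha_wdm(m_kev) * k) ** (2 * mu)) ** (-5.0 / mu)

# A ~2 keV thermal relic barely touches galaxy scales but erases the
# smallest halos -- the "flies on the cake" in the slide:
for k in (1.0, 10.0, 50.0):
    print(f"k = {k:5.1f} h/Mpc  T^2 = {transfer_wdm(k, m_kev=2.0)**2:.3f}")
```

The lighter the particle, the larger its velocity dispersion and free-streaming length, and the further the suppression reaches into the substructure mass function.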
What we see here in this figure is the result of a cosmological simulation in which baryons have been included, but where the yellow region highlights where the numerical convergence is at risk because of two-body relaxation in the simulation. In these kinds of simulations, two dark matter particles can interact with each other and modify their trajectories, the path of each particle, changing the slope of the density profile in the inner region in an unphysical way. This is not physical — it is numerical noise in the simulation — so we shouldn't believe the results of the simulations in the very center. However, in the regions where we can believe the results of the simulations, we see that the interplay between the gas flowing in and out of the region where the galaxies are forming — the supernova explosions ripping material out of the dark matter halo, and the gas cooling down and falling back into the inner region of the potential well provided by the halo — modifies the orbits of the dark matter particles, in such a way that the density profile of the original dark-matter-only simulation changes from the cusp behavior to the core behavior we see here. This is important when, for example, we want to provide a forecast for the detectability of dark matter annihilation in the Galaxy. Here we see maps of the possible flux of gamma rays originating in the center of the Milky Way halo from the self-annihilation of dark matter particles. The point is that, as we all know, in the inner region of the Milky Way halo the density is largest, and the probability of annihilation is largest where the density is largest — in the very center.
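The cusp-versus-core distinction can be sketched with two toy density profiles: the NFW form, which rises as ρ ∝ 1/r toward the center in dark-matter-only simulations, and a simple cored modification. The cored form below is an illustrative choice, not the specific model from the talk.

```python
import numpy as np

# Cusp vs core: the NFW profile from dark-matter-only simulations rises
# as rho ~ 1/r toward the centre, while baryonic feedback can flatten it
# into a core.  The cored form here (a simple radial shift) is an
# illustrative toy, not the talk's specific model.
def rho_nfw(r, rho_s=1.0, r_s=1.0):
    x = r / r_s
    return rho_s / (x * (1.0 + x) ** 2)

def rho_cored(r, rho_s=1.0, r_s=1.0, r_c=0.2):
    x = (r + r_c) / r_s          # shifting r removes the central divergence
    return rho_s / (x * (1.0 + x) ** 2)

for ri in np.logspace(-2, 0, 5):
    print(f"r/r_s = {ri:.2f}  NFW = {rho_nfw(ri):8.1f}  cored = {rho_cored(ri):7.2f}")
# Inner logarithmic slope: NFW -> -1 (cusp), cored -> ~0 as r -> 0.
```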
However, since we don't know the effect of baryons on the distribution of matter in the inner region of the halo, we don't really know what we should expect for the signal of a possible annihilation of dark matter in the center of the Milky Way, for example, or from the full sky. In principle, we can make maps like the one presented here, from the Millennium-II simulations, where we could try to forecast the detectability of annihilation from the neighborhood, in the local universe, but we still don't know the impact of the baryons on those density profiles — whether those regions are indeed dense enough, with density contrasts large enough, to provide a detectable flux from dark matter particles. So a lot of work has been done in these directions, but a lot of work remains in order to solve the many questions associated with the numerical issues, and with the physics we have to implement, before we can determine the way dark matter is distributed in the universe — in the regions where we could actually detect any evidence of its presence. So, some take-home messages from these simulations. They provide a lot of information, not only for particle physics but also for astrophysics, galaxies and large-scale structure. But those simulations are expensive, difficult, and still quite inaccurate, and we have many, many things to do in order to solve these problems on the dark matter side, from the side of the simulations. The good thing is that it is fun, and it looks like there is plenty of room to explore physics from the particle physics side, the astrophysical side, and the numerical point of view. And that's all I wanted to say about it. Thank you very much, Juan Carlos. So yeah, sorry, let me just enable this.
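Why the inner profile matters so much for annihilation forecasts can be seen from the line-of-sight integral the flux depends on, J(ψ) ∝ ∫ ρ²(r) dl: squaring the density makes the prediction extremely sensitive to the central slope. The halo parameters and units below are illustrative assumptions.

```python
import numpy as np

# Annihilation "J-factor" sketch: the gamma-ray flux from dark matter
# self-annihilation scales with the line-of-sight integral of rho^2,
# which is why the (uncertain) inner density profile dominates the
# forecast.  Halo parameters and units are illustrative assumptions.
R_SUN = 8.1      # kpc, assumed distance to the Galactic centre

def rho_nfw(r, rho_s=0.4, r_s=16.0):       # rho in GeV/cm^3-like units, r in kpc
    x = r / r_s
    return rho_s / (x * (1.0 + x) ** 2)

def j_factor(psi_deg, l_max=100.0, n=4000):
    """Integral of rho^2 dl along a line of sight at angle psi from the GC."""
    psi = np.radians(psi_deg)
    l = np.linspace(1e-3, l_max, n)        # kpc along the line of sight
    r = np.sqrt(R_SUN**2 + l**2 - 2.0 * R_SUN * l * np.cos(psi))
    return np.trapz(rho_nfw(r) ** 2, l)

# The predicted signal is strongly peaked toward the Galactic centre:
for psi in (0.5, 5.0, 45.0):
    print(f"psi = {psi:5.1f} deg  J (relative) = {j_factor(psi) / j_factor(45.0):8.1f}")
```

Replacing the cuspy profile with a cored one in `rho_nfw` changes the small-angle J-factor by orders of magnitude, which is the uncertainty the speaker is describing.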
Okay. So there are some questions for you. One, to kick off: how technical is the simulation itself? It requires a lot of computational resources, no? Yes, yes, indeed. It depends a lot on the resolution you want to recover, but also on the physics you include. If, for example, you want to run a high-resolution dark-matter-only simulation — one of the largest simulations we have seen, which is shown in one of the slides, took something like 1,500 million CPU hours to run. The point is that if you include baryonic physics on top of that, it can take 20 to 50 times longer, and it will require more time on the cluster and more CPUs. These kinds of simulations usually run on thousands of CPUs — 10,000, 15,000 CPUs — for periods of months. So the higher the resolution, or the richer the physics you include in the problem, the more computational resources you need. In the end, it means more money invested in it. Okay — maybe we can pass to the round table. First of all, we thank all the speakers. It was very interesting, and it's amazing to learn about all the different topics people are working on in Latin America. This is just a small sample — there are many more people that unfortunately we cannot have here — but this motivates us to continue with these physics webinars. So let's start the round table, and I'll continue with another question for Juan Carlos: about the infrastructure itself — is there a cluster or something like that in Colombia where you run this stuff, or do you rely on external facilities?
Yeah, with small-scale facilities you can run certain successful simulations — for example, isolated galaxies or isolated dark matter halos. But in Colombia, for example, we don't have the facilities to run such large simulations; we get access to the Jülich Supercomputing Centre, where we have the opportunity to run these kinds of simulations with thousands of CPUs. However, I know that in Brazil, for example, there is this GEMOJA cluster available, and its facilities are actually competitive with the requirements one would need to run this kind of simulation here in Latin America, in the local neighborhood. In Brazil? Yeah, in Brazil I know there is a cluster — I know it mostly because they use it for geophysics and oil exploration; it's called GEMOJA. I don't really know the details of the size of the facility, but as far as I know it's one of the largest clusters in South America. I don't know if Rogério knows more, because I guess the Dark Energy Survey also requires a lot of computational power? Right, so for the Dark Energy Survey, right now we have our own cluster in Rio. There is something called the Laboratório Nacional de Computação Científica, which hosts several clusters, and our cluster is hosted there. I think the largest computer there is called Santos Dumont, but I'm not sure how accessible it is — I think it is accessible if you submit a project. So we are using our own cluster at this moment for our purposes. Maybe Santos Dumont also — I think there are some projects running on that system. You're right, Santos Dumont may be the largest facility, and as far as I know those are the largest facilities in South America for this kind of job.
Yeah, and if I'm not wrong, also at the Catholic University in Chile there is a group doing N-body simulations, no? Yes, yes — Nelson Padilla, and they work very hard on these kinds of things. Also in Buenos Aires — I forget the name of the university — Patricia Tissera is working on this kind of simulation, and Cecilia Scannapieco is also back there, since recently, working on this kind of stuff too. So let me check some of the questions that are still on YouTube. One that we have had from the very beginning, for Fernando, is probably from Alexander Bonilla — sorry. I wanted to answer the questions I was asked before, when I didn't know the numbers. Okay. So I was asked what the angular resolution of CTA is. I looked it up in the Science with CTA document: above one TeV, the angular resolution is 0.05 degrees — that's three arcminutes — or better. It gets better towards larger energies, but at one TeV it's about three arcminutes. And the other question was the data rate of CTA, which will be about 10 petabytes per year. Petabytes? Petabytes — so 10 to the 16 bytes per year. Yeah, that's a lot. Yes, it is a lot. Maybe it was Mesa Retamal who asked this question — so thank you, Andreas. There was a question at the beginning that we didn't get to ask Fernando: Alexander Bonilla was asking if he can explain a little bit a possible relationship between neutrinos and dark matter, even though there was some discussion in the chat. Yeah — let me see if my microphone is on. Yeah. So he was actually asking about relic neutrinos, which are neutrinos from before the CMB, from the early universe. So the relation between dark matter and neutrinos is that, depending on how dark matter is produced, it can actually mimic the effect of neutrinos on the CMB power spectrum.
So, you see, neutrinos — for instance, as relativistic degrees of freedom — have an impact on the CMB. And depending on how dark matter is produced — say, if a fraction of the dark matter is produced relativistically in the early universe — it can actually mimic the relic neutrinos that we all know. There are also other connections: for instance, if dark matter interacts with neutrinos in the very early universe, that might also determine when the dark matter decouples from the plasma. So that's another connection with the relic neutrinos. Those are the two main connections between dark matter and relic neutrinos that I know of. Okay, I guess for Alexander that's okay; if he has more questions he can also comment here in the chat. So another question, this one for Sergey, is about ANDES. Jorge Diaz was asking, first, if you know about this ANDES very-long-baseline neutrino experiment proposal. And his question is: pointing a neutrino beam down through one side of the Earth to another is very challenging — have any of the relevant labs, like Fermilab or J-PARC, officially commented on this proposal? You have to unmute yourself, Sergey. Hello? Yes. Now we can hear you. Okay. Well, this is not a proposal as such, but one of the options for the physics program of the ANDES lab. It is very, very difficult to realize technically. Of course, CERN and Fermilab are the accelerator laboratories which could produce a neutrino beam, in beam-dump experiments, towards the ANDES lab. I don't remember precisely the distances, but the Andes is a very appropriate place for this sort of experiment — though imagine how to launch the neutrino beam in a certain direction. Technically it is very difficult to do, but okay, there is the idea. No one has discussed it in detail and seriously, especially now that the ANDES project has not yet been completely approved. But well, this idea exists. Yes.
Fermilab and CERN are possible laboratories which could generate these neutrino beams. And the type of physics in this case is going to be matter effects, no? I mean, matter affecting neutrino oscillations. Of course, of course. Yeah, good. Because this is another point: there are many models of dark matter that require a strong connection, let's say, with neutrino physics — one is essential for the other. So any experiment that could help us see whether there are deviations from the standard picture of neutrinos, to which dark matter could be related, would be very, very nice. So I have another question — I don't know who asked this, but it's also for Sergey: isn't Australia also supposed to construct an underground lab? Yes, I heard about it. There is even an underground lab in the ice: at the South Pole there is a laboratory, with a dark matter experiment already installed, DM-Ice17, at the bottom of the IceCube experiment, at a depth of about 2.2 kilometers water equivalent, as far as I remember. Well, it is still in progress, in the data-taking stage. They have already reported some data on the annual modulation — data consistent with no annual modulation — but they are still taking data. So in principle there is a laboratory, but at the South Pole. But to repeat — to confirm or refute, in a definite way, as I said, the claim of DAMA/LIBRA — one needs to install a clone of that experiment, which is quite a large experiment, somewhere in the southern hemisphere, like in an underground laboratory, and repeat the measurement. Well, as for Australia, I know about this project, but I don't know its present status; I haven't heard anything definite, only that I have heard about it. Yes. So I cannot give more comments on this project. Okay, so there is another question.
I don't know if the people here in the hangout want to start to debate — please feel free to unmute yourselves and ask questions to the speakers. So, for instance, one from Nicolas. I might answer that one. Hello? Yeah, yeah. Okay — the general question of whether there is an actual model that has been falsified by the direct detection of dark matter. Do you want to answer that question? Yeah, I can answer that one. Please. So, there is not just one — there are several models — and also the statement has to be quantitative. So, models which use the Higgs of the Standard Model as a mediator between the dark matter particle and ordinary matter — quarks and leptons — have been excluded up to roughly the TeV scale. The same happens when the dark matter talks to the Standard Model particles through the Z boson — the Z belongs to the Standard Model spectrum — this has also been ruled out up to the TeV scale. Combinations of those models have also been ruled out, based on the latest results from direct detection experiments. Those are the two main ones, but there are also models where you have, for instance, a new exotic scalar as a mediator; if this scalar has a mass around the weak scale, that has also been ruled out. And there are many, many other models with a vanilla-like interaction — vanilla meaning the simplest setup. I just saw: is SUSY ruled out? No, SUSY goes beyond the MSSM. But the neutralino dark matter in the MSSM — just answering the question that popped up — neutralino dark matter in the MSSM is basically ruled out. If you want to explain dark matter through thermal freeze-out — if dark matter is thermally produced — the MSSM is not a solution for it.
There is just one case in particular: in the MSSM you can have Higgsino dark matter with a mass of 1 TeV — it's not 1.2, it's not 0.9, it has to be 1 TeV — and it's very, very, very fine-tuned, but then you can still obey all the data. That's it. Beyond the MSSM, SUSY is not ruled out, and it will not be ruled out anytime soon. I'm not a SUSY advocate, but that's the truth. I'm seeing there is a follow-up: going to more general dark matter, what are your expectations? So my expectation is the following — my statement is a little bit strong. WIMPs, which are the most popular dark matter candidates in the literature because you can explain the relic density through thermal freeze-out, are very predictive: they produce signals either at current or at future experiments. So, ten years from now — I say ten years because I'm thinking of the ultimate dark matter direct detection experiments, which will potentially be DarkSide or DARWIN, or maybe even XENONnT, depending on how things go — if by then we haven't seen any signal of dark matter (big if), then basically all WIMP models with masses below one TeV will be excluded, under the assumption of thermal production. This assumption is crucial to everything I'm saying: with thermal production, all WIMP models below one TeV will be ruled out — almost all WIMP models will be ruled out — if we don't see anything in the next ten years. Yeah, but that would be the WIMP models in the simplest setups, because nowadays there is leptophilic dark matter and things like that, which could... Yeah, then you have to really go into some sort of fine-tuning of the models — it's not something you would naturally predict. Would you naturally predict a UV-complete model that is leptophilic? Probably you would say that's your first
answer, no. If things go wrong, then I can come up with something else — but naively, that's what you expect, obviously. And I said below one TeV: if you go to masses heavier than one TeV you can still circumvent the limits, but below one TeV there will be few models left. That's why I said most WIMP models will be ruled out — not all of them, but most. And if most models are ruled out, it will be like SUSY is today: is SUSY ruled out? No, it is not, but the main SUSY model for dark matter is ruled out, so people lose interest. That's what I think will happen to WIMPs if we do not see anything in the next decade. So, I don't know if Nicolas is still asking — I think, from the joint gravitational wave plus electromagnetic observations, the "dark matter emulator" theories are being ruled out. Yeah — I don't know if someone wants to comment about modified gravity, or alternatives to dark matter? That is not what we're talking about now, but there are plenty of possible scenarios if you want to consider options apart from the standard dark matter scenario. However, the observational evidence, in principle, very much supports the standard scenario. You can find evidence for dark matter even if you don't believe in Newtonian dynamics: you can see gravitational lensing effects, which are completely relativistic, and they agree very well with the other results of the observations. So I guess you can indeed explore a lot of physics playing with alternative models or modifications of gravity, but the simplest observations are very well fitted by the standard scenario. Yeah. So one thing I'd like to add — I completely agree with him: what MOND does is explain one data set; you just forget everything else exists, and you can explain that one data set. But the same law you're using to explain this data set, I can use to predict something else, and there it fails. Yeah —
it fails in a way that, in some MONDian theories, you have to introduce something called "phantom dark matter" in order to fit the data — and then, what are we doing? We are introducing something we don't see; it's basically the same idea. So, going to the simplest scenario that best fits the observations, I guess the standard model of dark matter is the best bet you can make. I have a question about AMS, concerning the results you were presenting: what are the properties of the dark matter particle candidate that would explain the results from AMS? Well, the only thing that I've shown is the one-TeV mass, which would be... I guess... I don't know — I was saying there were several models trying to explain the data, and I'm not an expert on each one of these models. Maybe I can answer, because I have worked on this recently. The model from Joachim — the plot he showed in his slide — is a one-TeV dark matter model, and he uses one very, very particular background model to explain the AMS excess; in his case the dark matter annihilates through a very weird cascade. So I would say it is very, very unlikely that the excess he showed could be addressed only by that argument. And another thing I might add is that the same dark matter he showed, that would explain the AMS excess, is in very, very strong tension with gamma-ray observations. Yeah, that's true. Very interesting, though, is that recently — I say recently, in March of this year — there is a new AMS excess which has been observed in antiprotons, but at lower energies. Yeah, there is an excess at lower energies, and there is also a spectral break in the cosmic-ray spectrum. So yeah — when I was working on the topic of the AMS cosmic rays, one of the nice things about dark matter and cosmic rays in that period, at the beginning, when the HEAT excess — that was
the previous experiment that started to see this excess, was how much the dark matter community helped to develop better models for cosmic-ray propagation and improved cosmic-ray physics a lot. At that moment there were only a few people working on it, and then an avalanche of people came trying to explain this signal with dark matter; but in order to explain the signal with dark matter, they had to improve the propagation models and all the astrophysics related to cosmic rays a lot. So from my point of view, a very positive aspect of the AMS and HEAT excesses was this very nice feedback between the two communities. And of course not every signal has to be dark matter: nowadays we know, thanks to the gamma-ray observations and everything else that started to be in strong tension, that maybe the most plausible explanation for the excess is just pulsar emission, from Geminga, which is the closest one we have.

Just a quick question for Jose: I know that on the International Space Station there are other experiments that are also cosmic-ray oriented, but is there a proposal for what comes after AMS? Do you have some idea, or is there some news? Well, they are working on another project, more on the Chinese side, still something bigger than AMS, but it is probably not so feasible right now; I'm not sure of any new experiment. So for now it's AMS, and then nothing? I remember that five years ago it was called the golden era of astrophysics, for indirect detection of everything; there were many experiments. Of course, after Fermi there is at least going to be CTA, and that could improve all the gamma-ray astronomy in the TeV range.

There was also a point, now that we are talking about CTA and cosmic rays: I had a question for Andreas, if he is still there, because we saw it when we
were looking at the data from Fermi, and in general at the background: there are plenty of astrophysical sources emitting radiation from the center of the galaxy, and the neighborhood is just amazing. The question is: would it be possible to use CTA, for example, to start to filter out all of this information from astrophysical sources, in order to be able, someday, somehow, to recover the actual signal of a possible dark matter annihilation from the halo? Is CTA somehow designed for this kind of detection; does it somehow help to recover the signal from the astrophysical sources?

Well, Andreas is not here, but I can take the question. The problem is that CTA will reach higher energies than Fermi, and most of the sources that Fermi sees die off, let's say, at higher energies, so it is difficult to reconcile these different bands: most of the sources emit at the GeV scale or so, in Fermi's range, while with CTA you will be sensitive at the TeV scale, so there can be different fittings. Maybe in the far future CTA will be extended to reach lower energies, but not in the near future.

Okay. I don't know how things are going in one of the rooms; I don't know if they have questions there. He is asking for questions there, but they have a problem with the microphone; for that reason we haven't heard anything from them, but anyway, there are people there watching the transmission.

Also, one of the things we haven't discussed: since we all work in Latin America, or are from Latin America, it is very interesting to know that there are some common topics on which all these scientists could, in the future, help to develop the region. One of the ideas we have with these Law Physics webinars is in fact to try to integrate the region, to improve the communication between the different
institutions that do research on common topics. Most of us know what is happening in the US or in Europe, where, since the universities are so close in distance and there is access to much more funding, it is very easy to go to different centers and interact with people. That is the kind of thing still missing in Latin America, but we can get past the restrictions of finances and distance with technology like the internet.

Now Hoel is asking, maybe for Juan Carlos: a student here asks, is there any information about the shape of the universe, such as its curvature, to be obtained from dark matter? Maybe for Juan Carlos or Rogerio.

I can try to answer this. The shape of the universe is basically determined by the total energy density: if the energy density is the same as the so-called critical energy density, the universe is spatially flat. And it is not only dark matter that contributes to the total energy density of the universe; mostly it is dark energy, actually. In fact, we have a good indication that the universe is in the critical-density regime, so that the energy density of the universe is the critical density; if there is any curvature, it is very, very small, so the universe is spatially flat as far as we know.

Yes, in fact, in one of the plots you saw, the total density parameter minus one was consistent with zero very precisely; that value is flat ΛCDM. Most analyses now are performed in the context of flat ΛCDM; we often don't even consider curvature in the analysis anymore. That is what we get from CMB, BAO, and other data together; the numerical results are actually very consistent.

I don't know if I remember correctly, but wasn't there some controversy, some tension, between this recent gravitational-wave observation and some estimations of the Hubble constant and so on? Or was it completely consistent? You are right: there is a tension between
measurements of the Hubble constant from local measurements, say local supernovae, which depend on the distance ladder, and the indirect measurement from the CMB. And now there is this new gravitational-wave event, the neutron-star merger, which already tells us a little bit about the Hubble constant; the best value is actually in the middle. But you are right, there is some tension, and DES was not able to put tight constraints on this, because we are not very sensitive to the Hubble constant.

I have to jump from YouTube to the Hangout from time to time, just to check: there were some questions about dark matter candidates, but Fernando is answering those questions in the chat, about axions and so on. Also, one thing that was not discussed: Fernando mentioned the thermal freeze-out for dark matter, but in fact there are a lot of different models with other kinds of production mechanisms for dark matter, for instance the freeze-in scenarios, or the case of extremely light dark matter. The imagination in the community is very large, and that is nice, because even though we don't have an observation, we are trying to explore all the possibilities, and they seem very reasonable in the end.

They are telling me that we have to finish this transmission, this Dark Matter Day. Sorry, we would like to continue talking here, but we have to stop; in my case I have to have dinner, other people have to have lunch or breakfast, or whatever, for all the people watching us around the world. First of all, I want to thank all the speakers: Juan Carlos, Fernando, Andreas, José, Rogerio, and Sergei. It was very, very nice to get to know all the activity going on in Latin America in different areas, in this case related to dark matter. It could be very interesting to keep doing this type of event focused on a specific topic: let's say next time we make a Neutrino Day
and we talk about neutrinos, or a Gamma-Ray Day, and so on, using the same idea: trying to show that in fact in Latin America there is a community working in different areas, and it is super interesting to see the work they are doing on similar topics. So thank you to everybody, including the people watching us on YouTube in the future, not live. If you like this idea of continuing with this type of event, please leave a comment, propose your topics, and we will try to organize it.

And we will see you next week: next Wednesday we are going to have Aurelio Carnero-Rosel in a regular webinar, and he will continue a little bit along the lines of Rogerio's talk; he is going to talk about his research. I would like to continue, but thank you all for watching, thank you to all the speakers, it was very interesting, and let's hope that in the future we will also have a Dark Matter Day about the discovery of dark matter. All right, bye!