Perfect. Okay. Welcome, everyone. Thank you for joining us for today's webinar. My name is Alejandro Cardenas Abendaño, and I'm going to be your host today. Today we're presenting "Cosmology in the Multimessenger Era" by Macarena Lagos, who has been a postdoctoral fellow at the Kavli Institute for Cosmological Physics at the University of Chicago since October 2017. Macarena holds a Bachelor's in Physics and a Master's from the Pontificia Universidad Católica de Chile, and a PhD in Physics from Imperial College London. During her PhD, before moving to the University of Chicago, she was also an academic visitor at the Department of Physics at the University of Oxford, also in the UK. This fall she will be moving to Columbia University for another postdoc position; in fact, I believe she is already in New York, so good luck with that. Macarena's research focuses on understanding the origin and evolution of the universe and the behavior of gravity in extreme regimes. Remember that you can ask questions over email, through our YouTube channel, or on Twitter, and the questions will be read at the end of the talk. Now, without further ado, we will turn the time over to Macarena. Thanks for joining us. Thank you. Do you see my screen? Yes. Great. Well, thank you for having me, and good morning, good afternoon, or good evening to everyone watching. In this talk, I would like to tell you how we can test cosmology, and in particular the properties of dark energy, with direct detections of gravitational waves in situations in which we also get electromagnetic counterparts. This is what we call the multi-messenger era. Let me start with a very short introduction, by telling you why I think it is important that we test cosmology. Currently, the concordance cosmological model describing the behavior of the universe on large scales is the Lambda CDM model.
These are the assumed constituents of the universe, where you can see that 95% of the total energy content is made up of dark matter and dark energy. These components are crucial for the evolution of the universe, but at the same time we don't yet understand what they are or exactly how they behave. So certainly, from a theoretical point of view, there is a lot of work to do here to understand these components better. Now, from an observational point of view, in this table you see the six basic cosmological parameters of the Lambda CDM model together with some of the latest observational constraints on them. You can see that many of them are constrained to better than 1% precision. But at the same time, as we get more data and more precise constraints, small tensions have recently started to show up: tensions between the constraints on specific cosmological parameters that you get from different data sets. The most famous case is that of H0, the current expansion rate of the universe. The cosmological constraint on H0 is estimated to be in tension at the four to five sigma level with the constraints coming from local observations, for example from Type Ia supernovae. So from this observational point of view, we now have an additional motivation for testing cosmology: it could be that the assumptions we make about the universe and its constituents are biasing the constraints that we find. So the objective is to test cosmology on the largest scales of the universe. And before getting into the specific details, I would like to mention what I think the outlook is for achieving this task.
In general, I would say it looks very good, basically because a bunch of telescopes will come online in the next few years that will give us a lot of observational data, and therefore constraining power, to test the Lambda CDM model and possible alternatives. Here you see some examples of telescopes for galaxy surveys or the cosmic microwave background. All of these collaborations and observations concern the electromagnetic spectrum; they focus on getting electromagnetic signals. But in this talk, I would like to focus on the fact that we can also use gravitational waves to test cosmology, not only to test the strong-field regime of gravity, as you might first think. We already have a number of detections from LIGO and Virgo, KAGRA has now joined the network, and hopefully in the near future we will get detections from other ground-based and space-based detectors. Together, they will probe a wide range of gravitational wave frequencies, and they will detect gravitational waves from sources at different distances from us, which will allow us to test cosmology at different redshifts as well. So how do we use gravitational waves to test cosmology? The answer is: by looking at the way they propagate. Gravitational waves are emitted in very dense environments. Let's imagine that we have a binary system of compact objects, which could be black holes or neutron stars, for example. Because they are very heavy and dense, they emit a considerable amount of gravitational waves; they lose energy in that form and therefore start inspiraling, coalescing, and eventually merging. The gravitational waves emitted during this entire process travel towards us over cosmological distances until they are eventually detected on Earth.
It is during this propagation that we are going to probe cosmology, because it turns out that if the universe expands in a different way, or if dark energy behaves in a different way, that will change the way gravitational waves propagate. I want to focus mainly on dark energy in this talk. We don't know exactly what dark energy is, but it is responsible for the observed late-time accelerated expansion of the universe. In the Lambda CDM model, dark energy is assumed to be a component with a perfectly constant energy density, this lambda. However, we don't know if that's an accurate description. It could be that its energy density evolves in time, that it is dynamical. That's one of the questions we would like to answer. And if it is dynamical, if it is evolving in time, then what is it? Is it, for example, a new fundamental scalar field, similar to what we assume happened in the early universe during the period called inflation, where we also believe there was a new fundamental field responsible for an accelerated expansion of the universe? We want to answer these questions by looking at gravitational waves. In particular, I will be focusing on multi-messenger observations, and I want to explain to you now, with a very simple cartoon, why you can use multi-messenger observations to test dark energy. Let's start with the Lambda CDM model. In that case, we have the spacetime metric g_mu_nu, which is coupled in a specific, minimal way to the cosmological constant lambda and to the rest of the matter components; here I include everything, meaning photons and even dark matter. Now, if dark energy is, for example, dynamical and evolves in time, then a simple modification of the Lambda CDM model is to assume that dark energy is promoted to a new dynamical field. We don't know exactly what kind of field it is. Is it a scalar or a vector?
We also don't know how it interacts with gravity. So that remains to be tested. But one important thing is that I will not be changing the way matter behaves, the way the rest of the matter components of the universe are coupled to gravity. The only things I change are the way dark energy behaves and how it could potentially be coupled to gravity. In this kind of situation, it turns out that if you have multi-messenger signals, meaning gravitational and electromagnetic waves, they travel through a universe that, as a first approximation, we can describe as a perfectly homogeneous and isotropic expanding universe, which here we represent with this black circle. So we have both of these waves propagating towards us. But it turns out that if dark energy is dynamical and has non-trivial interactions with gravity, then the gravitational waves do feel an additional effect coming from the evolution of dark energy, which here we represent with this field chi, which is supposed to be my dark energy field. Gravitational waves get an explicit effect from the cosmological evolution of my dark energy field. But electromagnetic waves don't, at least in the cases I am imagining, where I do not change the way photons and the standard matter components behave in the presence of gravity. So in this cartoon you immediately see this rather obvious difference, and the idea is that we want to exploit these differences between the two signals to learn about dark energy. Mathematically, this is the action that describes how gravitational waves propagate in this expanding universe. Here you will see two specific modifications that arise compared to Lambda CDM when dark energy is dynamical, specifically when it is, for example, a scalar field or a vector field. These are the simplest scenarios; you could have more complicated things and more modifications, but this is the simplest situation.
In this action, you first see the scale factor a, which describes the expansion of the universe. Then we have h, which corresponds to the gravitational wave amplitude; this is the quantity we want to study and see how it evolves, and primes are time derivatives. Now, the first modification that you see here with respect to Lambda CDM is the presence of this gravitational coupling, which in Lambda CDM corresponds simply to Newton's constant, a constant with a specific value. But if dark energy is dynamical, it could be that this gravitational coupling now evolves in time; it is different, and it evolves depending on how dark energy evolves in time. The second modification that you generically expect is the presence of this parameter alpha_T, which we call the tensor speed excess. In Lambda CDM, alpha_T is equal to zero, which means gravitational waves propagate at the speed of light. Again, if dark energy is dynamical, there could be a non-zero alpha_T, and that means gravitational waves propagate at a speed different from that of light. And we want to test for that. So these are the two simplest modifications that you could expect in the propagation of gravitational waves beyond Lambda CDM, and in the rest of the talk I will discuss how we can observationally constrain these two quantities. The simplest one to constrain is this alpha_T parameter. So let's now imagine that we have a binary system of neutron stars. When neutron stars merge, they emit gravitational waves and electromagnetic waves, and if we are lucky, we can detect both. This is what happened for the only confirmed binary neutron star event detected by the LIGO and Virgo collaborations, the GW170817 event. Here you see the data. At the bottom, you have the gravitational wave signal: the frequency as a function of time.
Here, time equal to zero by definition marks what we identify as the moment of the merger. So this is the signal: the gravitational wave frequency grows in time as the stars merge, until they eventually merge at this time here. On top, we see one of the detected electromagnetic counterparts, which corresponds to a gamma-ray burst. Here you see the gamma-ray signal as a function of time as well, with this peak that we identify as coming from the same binary system at the moment of the merger. One simple comparison you can make between these two signals is the time delay: as you can see, the electromagnetic signal arrived about two seconds later. You may wonder why that happens, and one of the reasons could be that they propagate at different speeds, so they arrive at different times at the observer even if they are emitted at the same time. From this, you can impose a bound on the parameter alpha_T that I showed you before, the tensor speed excess, which tells you the difference between the two speeds. From the GW170817 event, actually putting it together with other observations, we get this constraint on alpha_T: it has to be smaller than 10 to the minus 15. This is a very tight constraint, and because of it, nowadays many people assume that gravitational waves probably do propagate at the same speed as light, and that the observed delay, in which the electromagnetic signal arrived after the gravitational waves, was simply due to the astrophysical processes that happened during the merger: basically, the gamma-ray burst took a while to form, or took a while to get out of this dense environment and start traveling towards us. For this reason, from now on I will indeed assume that alpha_T is exactly equal to zero.
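As a rough sanity check on where a bound of that size comes from, here is a back-of-the-envelope sketch in Python. The 1.7 second delay and 40 Mpc distance are the commonly quoted values for GW170817; the published bound involves a more careful treatment of possible emission delays, so this only reproduces the order of magnitude:

```python
# Rough estimate of the bound on the GW propagation speed from the
# observed time delay between GW170817 and its gamma-ray counterpart.
# Numbers are illustrative: a ~1.7 s delay over a ~40 Mpc distance.

MPC_IN_M = 3.086e22      # meters per megaparsec
C = 2.998e8              # speed of light, m/s

distance_m = 40 * MPC_IN_M          # distance to the source
travel_time_s = distance_m / C      # light travel time over that distance
delay_s = 1.7                       # observed delay, here attributed entirely to a speed difference

# Fractional speed difference |c_gw - c| / c if the whole delay were
# due to propagation: an upper bound, since astrophysical emission
# delays could also contribute.
fractional_speed_diff = delay_s / travel_time_s
print(f"{fractional_speed_diff:.1e}")   # order 1e-16 to 1e-15
```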
I will instead focus on testing the other modification, the one related to the gravitational coupling. But that was the simple comparison: time delays allow you to constrain the propagation speed of gravitational waves. So how do we test this gravitational coupling? Well, the easiest way to get an intuition is to look at the equation of motion that you get from the action I showed you before. This is the equation of motion for h, the gravitational wave amplitude; the primes are time derivatives, remember, and k corresponds to the wave number of the wave. Here you see a term proportional to the first time derivative of h. This corresponds to a kind of friction or damping term that damps the amplitude of gravitational waves as they propagate towards us. You will always have the part proportional to the Hubble rate of the universe, this capital H, which simply tells you that the amplitude of gravitational waves decays as they propagate towards us because of the expansion of the universe. But if the gravitational coupling has a time evolution, there will be an additional non-zero alpha_M contribution. Alpha_M, by definition, is proportional to the time derivative of the gravitational coupling. Recall that in Lambda CDM the gravitational coupling is Newton's constant; it is a constant, so alpha_M in Lambda CDM is by definition equal to zero. We would like to see whether our observational data prefers a zero value for alpha_M, or maybe a non-zero value, which would be a signal of a time evolution of the gravitational coupling that could be due to dark energy also being dynamical. That's the objective. Now, in particular, the only difference between Lambda CDM and this alternative dynamical dark energy universe is alpha_M.
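To see the friction term at work, here is a toy numerical sketch, not the talk's actual computation: it integrates h'' + (2 + alpha_M) H h' + k^2 h = 0 in conformal time, on an assumed matter-dominated background with conformal Hubble rate H(tau) = 2/tau, with illustrative values of k and the integration range, and checks that a nonzero alpha_M damps the amplitude beyond the ordinary expansion damping:

```python
# Toy check of the friction term: integrate the GW propagation equation
#   h'' + (2 + alpha_M) * H(tau) * h' + k^2 h = 0
# with a simple RK4 stepper on a matter-dominated toy background,
# where the conformal Hubble rate is H(tau) = 2/tau (an assumption
# made only for illustration).

def evolve(alpha_m, k=50.0, tau0=1.0, tau1=3.0, n=100_000):
    """Return the peak |h| over the final stretch of the evolution."""
    def deriv(tau, h, hp):
        H = 2.0 / tau
        return hp, -(2.0 + alpha_m) * H * hp - k * k * h

    dt = (tau1 - tau0) / n
    tau, h, hp = tau0, 1.0, 0.0
    peak = 0.0
    for i in range(n):
        k1h, k1p = deriv(tau, h, hp)
        k2h, k2p = deriv(tau + dt / 2, h + dt / 2 * k1h, hp + dt / 2 * k1p)
        k3h, k3p = deriv(tau + dt / 2, h + dt / 2 * k2h, hp + dt / 2 * k2p)
        k4h, k4p = deriv(tau + dt, h + dt * k3h, hp + dt * k3p)
        h += dt / 6 * (k1h + 2 * k2h + 2 * k3h + k4h)
        hp += dt / 6 * (k1p + 2 * k2p + 2 * k3p + k4p)
        tau += dt
        if i >= n - n // 10:          # track the oscillation envelope near the end
            peak = max(peak, abs(h))
    return peak

amp_lcdm = evolve(alpha_m=0.0)   # standard damping from expansion only
amp_dark = evolve(alpha_m=1.0)   # extra damping from an evolving coupling
print(amp_dark < amp_lcdm)       # True: nonzero alpha_M damps h further
```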
So we can actually write a specific relation between the detected amplitude I would expect in Lambda CDM and the one I would expect if I had a non-zero alpha_M. The difference is only due to the fact that G is evolving in time, and it is given by this ratio. On top, we have G evaluated at the moment of detection, that is, at redshift equal to zero. On the bottom, we have G evaluated at the moment of emission, that is, at the redshift of the source. Here I am assuming that the difference in the detected amplitude is only due to the difference in propagation caused by this alpha_M, and that the amplitude originally emitted by the source does not differ between these cosmological models. Now we can use this relation and rewrite it in a more useful way. We know exactly how much the amplitude of a gravitational wave decays in Lambda CDM from emission to detection: it decays as one over the luminosity distance. This distance tells you how far away the source is, including the effects of cosmological expansion, and it depends, of course, on the redshift of the source as well as on all the other cosmological parameters that affect the expansion history; here I explicitly show the case of H0, the current expansion rate of the universe. So in this alternative cosmological model, we define a similar, analogous quantity that we call the gravitational wave distance: the amplitude of gravitational waves will decay from emission to detection as one over this distance. That's its meaning; by definition, it measures how much the amplitude decays. We call it the gravitational wave distance, and it differs from the luminosity distance only by the ratio I mentioned before, which I now rewrite in a more useful way. Just to clarify, so you don't get confused: I chose units in which G today is equal to 1.
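For concreteness, the luminosity distance just described can be sketched numerically. This is the generic textbook computation for a flat Lambda CDM background, not code from the talk, and the parameter values (H0 = 70 km/s/Mpc, Omega_m = 0.3) are illustrative:

```python
# Luminosity distance in flat Lambda CDM, via simple trapezoidal
# integration of the Friedmann equation:
#   d_L(z) = (1 + z) * (c / H0) * integral_0^z dz' / E(z'),
# where E(z) = sqrt(Omega_m (1+z)^3 + Omega_Lambda).

C_KM_S = 2.998e5   # speed of light, km/s

def luminosity_distance(z, h0=70.0, omega_m=0.3, n=10_000):
    """d_L(z) in Mpc for a flat universe with the given H0 and Omega_m."""
    def E(zp):  # dimensionless expansion rate
        return (omega_m * (1 + zp) ** 3 + (1 - omega_m)) ** 0.5

    dz = z / n
    integral = sum(dz * 0.5 * (1 / E(i * dz) + 1 / E((i + 1) * dz)) for i in range(n))
    return (1 + z) * (C_KM_S / h0) * integral

d = luminosity_distance(0.01)
print(f"{d:.1f} Mpc")   # roughly 43 Mpc, comparable to GW170817's distance
```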
For G at the moment of emission, I write explicitly an additional dependence on one constant parameter, alpha_M0, which corresponds to the value of alpha_M today. This is the simplest assumption you can make: G is evolving in time, we are trying to parametrize that time evolution, and the simplest situation is when there is only one free parameter, corresponding to the amount of friction today. Here I am explicitly showing that G will depend on that one single parameter. You could of course consider more complicated situations, but this is the simplest scenario. Now we are going to use multi-messenger signals to obtain constraints on this alpha_M0, which, I recall again, is equal to zero in Lambda CDM and non-zero in these dynamical dark energy models. So how do we use these multi-messenger signals? First, let's imagine that we receive the gravitational wave signal, which comes from the entire inspiraling and coalescing process. We understand from first principles, from general relativity, what the emission from these compact objects looks like. Because of that, we can reconstruct, from the waveform, basically from the evolution of the frequency of the gravitational wave signal, the intrinsic parameters of the sources, and thus reconstruct the intrinsic emitted amplitude. Then we compare that to the detected amplitude, and by definition we get how much the amplitude decayed, which is this distance. It is because of this reconstruction process that we do to get intrinsic parameters that these sources are called standard sirens; they are standardizable in that sense. Now, on the right-hand side, we have no a priori information coming from the gravitational wave signal: the redshift is unknown, and the cosmological parameters are not known either.
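One common way to make the ratio explicit is to hold alpha_M constant at its present value alpha_M0, in which case the ratio of the two distances integrates to a simple power of (1 + z). The talk's actual parametrization ties the time evolution of G to dark energy's evolution, so this constant-alpha_M version is only an illustrative sketch:

```python
# Ratio of the gravitational wave distance to the luminosity distance
# in the toy case where the friction parameter alpha_M is held constant
# at its present value alpha_M0. Then
#   d_GW / d_L = exp( (alpha_M0 / 2) * integral_0^z dz' / (1 + z') )
#              = (1 + z) ** (alpha_M0 / 2).
# (Only illustrative; the talk ties alpha_M's evolution to dark energy.)

def dgw_over_dl(z, alpha_m0):
    return (1.0 + z) ** (alpha_m0 / 2.0)

# In Lambda CDM (alpha_m0 = 0) the two distances coincide; a positive
# alpha_m0 (extra friction) makes the source appear farther away in GWs.
print(dgw_over_dl(0.5, 0.0))   # 1.0
print(dgw_over_dl(0.5, 1.0))   # ~1.22
```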
That is why we need an additional piece of information, and in the case of these multi-messenger signals it comes from the electromagnetic counterpart: from that, you can get the redshift. This is, again, what happened for the GW170817 event. From the electromagnetic counterpart, people were able to identify exactly which galaxy was the host and what its redshift was. So now we have enough information and we can start imposing constraints on alpha_M0, and actually on H0 as well. We can mix and match: we could use these multi-messenger signals to set joint constraints on both alpha_M0 and H0. Or we could fix H0 by bringing in some external cosmological information, for example from the Planck collaboration, and then get constraints on alpha_M0. Or you could do the reverse: assume something about alpha_M0, for example that it is equal to zero because we are in Lambda CDM, and then get constraints on H0. I will show you all of these situations. These are the results we find for GW170817. This is the constraint we find on alpha_M0 fixing H0 to the Planck value; you see that the uncertainties are between about 20 and 30 in this case, so these are rather large. Now for the Hubble rate: this is the constraint you get on H0 assuming Lambda CDM, that is, fixing alpha_M0 to zero and then using the multi-messenger signal to constrain H0. This is how you can use multi-messengers not only to test dark energy, through alpha_M0, but to test the overall cosmological behavior as well. And this constraint on H0, because it comes from just one single event, does not have very large constraining power. Indeed, this constraint is much worse than the current constraints we have on H0 from other observations, so it is not very informative.
And if we now leave both H0 and alpha_M0 as free parameters, it turns out that they are highly correlated in these observations. Just as an example, I show you that if we do not assume Lambda CDM and instead leave alpha_M0 completely free, the uncertainties on H0 get worse by a factor of four to five because of this degeneracy. So certainly one single event cannot tell us that much, and that's why we look to the future: what if we had a hundred events? Let's assume that we have a hundred binary neutron star mergers, that we are lucky enough to get both gravitational and electromagnetic counterparts, and that they are detected by LIGO. In that case, the question is how well we can constrain H0 and alpha_M0. The answer is that the uncertainties on alpha_M0 improve by one order of magnitude. Now they are close to order one, which means you should be able to make a statement of the type: alpha_M0 has to be smaller than this order-one number. So we should be able to distinguish whether alpha_M0 is larger or smaller than one. For the Hubble rate, the uncertainties are also greatly improved; they are close to order one, which means you should be able to get one to two percent precision constraints on H0. These constraints are completely independent of all the other local or cosmological observations, and they start to be competitive if we get them. That's why the number of a hundred events is relevant: it starts giving you informative and competitive constraints on these parameters. Now, recall that I told you that alpha_M0 and H0 are highly degenerate. As an example, we look at a specific situation where we again have a hundred binary neutron stars detected by LIGO.
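The jump from one event to a hundred can be understood with the standard statistical scaling: for N independent, comparable events, uncertainties shrink roughly as one over the square root of N. The single-event number below is a placeholder of the order quoted earlier in the talk, not a value from the actual analysis:

```python
# Why ~100 events matter: for N independent standard sirens, the
# statistical uncertainty on a parameter shrinks roughly as 1/sqrt(N)
# (naive inverse-variance combination of identical events).

import math

def combined_sigma(sigma_single, n_events):
    """Uncertainty after combining N identical, independent events."""
    return sigma_single / math.sqrt(n_events)

sigma_one = 25.0                          # illustrative single-event error on alpha_M0
sigma_100 = combined_sigma(sigma_one, 100)
print(sigma_100)                          # 2.5: one order of magnitude better
```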
Now the difference is that we assume that the real cosmological universe this population comes from has a non-zero alpha_M0 value, close to one. In that case, if you simply assume Lambda CDM and try to fit a model with alpha_M0 equal to zero, it turns out that the constraint you obtain on H0 is more than three sigma away from the true value of your cosmological universe. We do this example to emphasize the relevance that your assumptions about the universe and about dark energy can have in biasing sensitive cosmological parameters like H0. And this emphasizes that if you ever want a robust constraint, a robust value, for H0, you do need to test for these dark energy extensions in your universe, to make sure that you are not making these large errors. Now that I've shown you that, I want to tell you what it means to have an alpha_M0 close to one, for example. Are there any particular models that predict that, or do they predict that alpha_M0 is very large or very small? The simplest specific scenario for dark energy that you could imagine is a scalar field. But even in that situation, there is a lot of freedom, because this dark energy field could interact in different ways with gravity and with itself; its potential energy or kinetic energy could differ widely. So we have a wide range of models in that case. And we show that the simplest viable models actually predict that alpha_M0 has to be very small, smaller than 10 to the minus 6; it could be much smaller than that, actually. What this means is that these kinds of dynamical dark energy models will not be distinguishable from Lambda CDM with these hundred events, because the sensitivity is much worse. So you will not be able to distinguish these models.
However, there are more complicated models, where you start changing all the possible interactions that could be present, that do predict that alpha_M0 could be close to one. These kinds of models should now be falsifiable with these multi-messenger detections from LIGO. Close to one, you may argue, is right at the sensitivity that we are going to get with 100 events. But people have also done forecasts for future-generation gravitational wave detectors, for example the Einstein Telescope or LISA, and there the constraints on alpha_M0 will improve by two orders of magnitude, so you should be able to probe an alpha_M0 down to 10 to the minus 2. So certainly some of these models will be falsifiable, but not all of them. That is the conclusion: some models will be distinguishable from Lambda CDM, but not all of them. Now that I have shown you all these results, some of you may be wondering: this alpha_M0 is related to a time variation of the gravitational coupling G, and don't we already have very tight constraints on time variations of gravitational couplings? The answer is yes. These are the two most famous kinds. One comes from lunar laser ranging: you look at the motion of the moon around the earth, and if the gravitational coupling is changing in time, then the force between the moon and the earth is changing in time, which in turn changes the motion of the moon. So you can test for that by looking at the motion of the moon. That's one observation. The other comes from binary pulsars: you have two neutron stars, for example, bound by gravity in a binary system, and one of them is a pulsar. It emits light regularly, like a lighthouse, and we can detect that light and study the frequency and evolution of the pulses. In this case, there can be two different effects if the gravitational coupling is changing in time.
One is, again, that the force between the two stars changes, similar to lunar laser ranging, which will affect the binary motion, and you will see that in your pulses. The other effect is that if the gravitational coupling is changing in time, the amount of gravitational radiation the binary emits will also change in time, and that again will affect the motion of the binary. So those are the two effects. Now, lunar laser ranging gives you the tightest constraint, which tells you that alpha_M0, which, I recall, measures the time derivative of the gravitational coupling in Hubble units (this H is the Hubble rate), has to be smaller than 10 to the minus 3. So then you may wonder: okay, this is much better, three orders of magnitude better, than the standard siren constraints I mentioned before. So will standard sirens give us anything useful or not? And the answer is yes. The reason is that when dark energy is dynamical, not only could your gravitational couplings evolve in time, but they are actually not universal anymore. In general, in dynamical dark energy models, couplings will depend on the particular environment you are looking at and the particular bodies at play; it is not the same to look at gravitational waves as at solar system observations, because the environments are different. In particular, for solar system observations, when you look at the force between, for example, the moon and the earth, or two stars, you are probing a coupling that I call the matter-matter gravitational interaction, which I denote by this G_N coupling, the Newtonian gravitational coupling. This is what determines the force between two massive objects. But standard sirens and gravitational waves constrain what I call the gravitational self-interaction coupling, which I denote by this G_GW, the gravitational wave coupling.
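To see where the 10 to the minus 3 comes from, one can convert a lunar-laser-ranging-style bound on Gdot/G, usually quoted per year, into Hubble units. The Gdot/G value below is illustrative, of the order reported by LLR analyses, and H0 = 70 km/s/Mpc is an assumption:

```python
# Converting a lunar-laser-ranging-style bound on Gdot/G into Hubble
# units, to see where the "10^-3" quoted in the talk comes from.
# The Gdot/G number is illustrative, of the order reported by LLR.

gdot_over_g_per_yr = 1.0e-13          # illustrative |Gdot/G| bound, per year
h0_km_s_mpc = 70.0                    # assumed Hubble constant, km/s/Mpc
km_per_mpc = 3.086e19
seconds_per_yr = 3.156e7

h0_per_yr = h0_km_s_mpc / km_per_mpc * seconds_per_yr   # H0 in 1/yr
alpha_like = gdot_over_g_per_yr / h0_per_yr             # (Gdot/G) / H0
print(f"{alpha_like:.1e}")            # order 10^-3
```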
And it turns out that this is the one I have been talking about with the standard sirens; this is the one you will be able to probe with LIGO, whereas lunar laser ranging constrains the time evolution of the other one. Generically, these two couplings can differ in particular models. It could be coincidental that they are the same; in Lambda CDM both are constants and they coincide, being Newton's constant. But that's not generic. So if you want to be completely agnostic about what dark energy or your gravitational model is, then you should assume that they are different. How about binary pulsars? Well, we actually showed in this work that binary pulsar observations are technically sensitive to both of these couplings, because of the two effects I showed you: you can have changes in the force between the two stars, but you can also change the amount of gravitational waves emitted. So technically, the observation does probe the time evolutions of both G_N and G_GW. However, we show that the observation is mostly dominated by time variations of G_N. In other words, binary pulsar observations are rather insensitive to time variations of this gravitational wave coupling, the one we are going to test with LIGO. The conclusion, then, is that LIGO and all these direct gravitational wave detectors will give us completely new constraints on this specific coupling. So this is all the information I had for you. This is a summary with the takeaway messages; I will read them. Multi-messengers can test beyond-Lambda-CDM physics. I mentioned two specific effects: one is changes in the propagation speed of gravitational waves, and the other one is friction, this alpha_M. I showed you that with the time delay from GW170817 alone, you already get very tight constraints on the speed.
Then I showed you that for binary neutron stars, again with multi-messengers, you can get completely independent cosmological constraints on both the dark energy parameter, this friction alpha_M, and on H0. I showed you that LIGO should be able to constrain H0 to 1 to 2% precision with 100 events, and that alpha_M0 could highly bias your constraints on H0. Next, future gravitational wave data will falsify some dark energy models, but not all of them, because the sensitivity is not good enough for some dynamical dark energy models that make predictions very similar to Lambda CDM, with a very small alpha_M0. And finally, I showed you that lunar laser ranging and binary pulsar constraints are actually not very restrictive for these standard siren measurements, because they probe a different gravitational coupling evolution, not the one we are concerned with here. So this is all I had to say, and thank you for your attention. Thank you, Macarena, for this nice webinar. We do have lots of questions, so let me start with the YouTube channel. So the first question... Could I start with a quick question? Sure, Joanne. Sorry. Just a quick one. Seeing the conclusions there, you say that H0 would be measured with 1 to 2% precision. Is that in Lambda CDM, or did that include the correlations with the alpha? That's why I say 1 to 2: it could be 1% in Lambda CDM, or 2% if you include the correlations. Okay. Super. Thank you very much. Thank you, Joanne, that's a very nice question. Okay, so let's go to the YouTube chat. The first one is by Yvonne Nova: can you give more details on the scalar and vector simple models you are testing, and which field gives which contribution? Yes. Can I go back? Yeah. I guess this is a person that probably knows more about this, so I'm going to get more technical.
So the ones that I call simple models versus complex models are actually models that have different screening mechanisms. All these models with dynamical dark energy are constructed such that dark energy has very little effect on solar system scales, and this is the so-called screening mechanism. The simple models that I mentioned are models that have the chameleon screening mechanism. That includes, for example, the f(R) models that people study in the literature, whereas these other complex models have the so-called Vainshtein screening. I don't know if that answers the question. There's a little delay with the YouTube viewers, so we can jump into the next question and then we'll see if that person has a follow-up. So thank you. Let me just pick another one. There is one by Lautaro Vergara. The first question is: can a scale-dependent Newton constant be consistent with general covariance? A scale-dependent... scale dependence with general covariance. I think it could. So in my case, I assumed that the gravitational coupling and alpha t were only time dependent instead of scale dependent, and that is because that is what you usually get from theories that have second-order equations of motion. But you could have more complicated cases, theories with higher derivatives, higher spatial derivatives, for example, that could lead to a scale dependence in these gravitational couplings or in this alpha t parameter. And they could certainly be invariant under diffeomorphisms. I think mainly the difference is that they have higher derivatives. Thank you. Let me keep going. So if that person wants to ask a follow-up question, you can write it down. Then the next one is by Juan García Vergido: how likely is it that we get 100 binary neutron star events within a few years, given that we haven't seen any in the last run, O3? Yes. So okay, there are two big uncertainties. One is the number of binary neutron star events that you will see.
And from that side, people say there are tens per year, so you could get 100 in maybe five or six years if we are lucky. The most uncertain part is whether we do get these electromagnetic counterparts, because that certainly hasn't happened for any of the potential events that were seen by LIGO in O3. So it could be that it is very unlikely. And I think that's a good question, because all these exercises of getting these constraints can also be done even if you do not get the electromagnetic counterpart. People have developed different methods for estimating the redshift when you do not have an electromagnetic counterpart, and maybe the most promising way is this one: since you do not need the counterpart, you can also use binary black holes, and then you will certainly get many more events. In that case, you do a statistical analysis, correlating the estimated region in the sky coming from the LIGO observation (they always get some constraint on the localization of the source) with galaxy catalogs, to identify all the possible galaxies that could be present there. For a statistical analysis like that, of course, you will need many, many more events, and I think some estimates say that you could constrain it to maybe 5% or 10% in maybe five years if you do that. Okay. Thank you. Macarena, I'm also copying the questions in the chat, so it might be easier for you if you want to see them. Okay. The next question is by Piri Florey, sorry, and says: thanks for the talk; could you give more details about what you call G underscore GW, and how is it defined? Yes. So my GGW, I technically define it as I said here: it corresponds to the gravitational self-interaction.
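[Editor's note: the statistical "dark siren" method described above can be illustrated with a toy calculation. This is the editor's own minimal sketch on mock data, with invented numbers throughout (event counts, error sizes, redshift ranges); a real analysis weights galaxies, models selection effects, and uses the full cosmology rather than the low-redshift Hubble law d_L = c z / H0.]

```python
import numpy as np

rng = np.random.default_rng(0)
H0_TRUE = 70.0                       # km/s/Mpc, used to generate mock data
H0_GRID = np.linspace(50, 90, 401)   # grid for the H0 posterior

def event_likelihood(dl_obs, sigma_dl, z_candidates):
    """H0 likelihood for one dark siren: sum over candidate host galaxies.

    Each candidate redshift predicts d_L = c z / H0 (low-z approximation);
    the observed GW luminosity distance then prefers the H0 values that
    make some candidate consistent with the measurement.
    """
    c = 3.0e5  # km/s
    like = np.zeros_like(H0_GRID)
    for z in z_candidates:
        dl_pred = c * z / H0_GRID
        like += np.exp(-0.5 * ((dl_obs - dl_pred) / sigma_dl) ** 2)
    return like / like.sum()

# Mock catalog: each event has one true host plus 5 random interlopers
# inside its localization volume, and a 10% distance measurement error.
posterior = np.ones_like(H0_GRID)
for _ in range(50):
    z_true = rng.uniform(0.02, 0.1)
    dl_obs = 3.0e5 * z_true / H0_TRUE * (1 + 0.1 * rng.standard_normal())
    z_cands = np.append(rng.uniform(0.02, 0.1, size=5), z_true)
    posterior *= event_likelihood(dl_obs, 0.1 * dl_obs, z_cands)
    posterior /= posterior.sum()   # renormalize to avoid underflow

print("H0 posterior peak:", H0_GRID[np.argmax(posterior)])
```

The interloper galaxies add incoherent structure to each single-event likelihood, but the true hosts pile up coherently at the same H0, so the combined posterior sharpens as events accumulate; this is why the method needs many more events than the counterpart-based one.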
So when you study gravitational waves, these are technically linear perturbations on a given background. And here I identify this GGW as basically the coefficient that is in front of the kinetic quadratic terms of gravity. So that's what I call the kinetic self-interactions. And in this paper with my student, where we study binary pulsars, we make this distinction between these two Gs, so this person can also have a look at that if they want to. GN, on the other hand, is something different: there you look at what the coupling is between gravity and an external body, an external particle, a test particle if you want. So basically I look at different coefficients in the action; that's how I define them. Thank you. And I think Javier Rubio has a question. More like a comment. So I mean, if I understood properly, the way you define this GGW is through the propagator. So any change in the propagator, you call it GGW, and any change in the coupling to matter, you call it GN, too. But I don't think either of these is well defined. I mean, you can get a GGW in Brans-Dicke; Brans-Dicke gives you a variation of GGW, and coupled quintessence gives you a coupling to matter and so a variation of GN, too. But I can always move from one frame to another and get a theory in which I have gravitational waves propagating with a standard propagator, and move all of this into the couplings. So the only thing that really matters is some dimensionless quantity: it has to be some gravitational coupling times a mass, or things like that. So I don't think it really makes sense to define a dimensionful quantity; one should look at something dimensionless and frame independent. Yeah. So in this case, you're right that there is this ambiguity of the different frames, the Einstein frame versus the Jordan frame.
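[Editor's note: for readers following the definition being described, the "coefficient in front of the kinetic quadratic terms" can be written schematically as the quadratic action for tensor perturbations. This is a common parametrization from the modified-gravity literature, not necessarily the speaker's exact notation:]

```latex
S^{(2)} \;\propto\; \int \mathrm{d}t\,\mathrm{d}^3x\;
\frac{a^3}{G_{\rm GW}(t)}
\left[\, \dot h_{ij}\dot h^{ij}
\;-\; \frac{c_{\rm GW}^2(t)}{a^2}\,\partial_k h_{ij}\,\partial^k h^{ij} \right]
```

Here $G_{\rm GW}$ multiplies the graviton's own kinetic term (the self-interaction controlling emission and propagation), while $G_N$ is instead read off from the coupling between the metric and a test particle in the matter action.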
In this case, I define these quantities in the so-called Jordan frame where, as you say, there are modifications in the self-interactions of gravity. And we prefer that frame because in the Einstein frame you change, for example, how photons would behave. Basically, you change the way matter behaves because you're changing all the couplings to matter. Photons are conformally invariant, so you don't change the photons themselves, but you do change the way photons and gravitational waves interact, for example, with the detectors and things like that. Yeah, but that is what I'm saying: what you really measure is something dimensionless. Yes, I agree that maybe there is a better way to define this so that you don't have the ambiguity of the frames. This is defined in the Jordan frame. Okay. There's another question by Mauricio Gamonal San Martín: greetings from Chile, Macarena. I have a doubt with respect to the behavior of gravitational waves in the context of the lambda CDM model. In flat spacetime, the dispersion relation is equal to one. Is it the same in the lambda CDM model? In lambda CDM, yes, the dispersion relation is the basic one; I guess that's what he means by one. That is because this alpha t parameter that I defined is equal to zero. That alpha t parameter is basically something that changes the dispersion relation. In my case, I assume that it could only be a function of time. You could have this additional modification, but in lambda CDM it's equal to zero. I know that other people have studied what happens if alpha t does depend on scale, and then you have scale-dependent modifications to the dispersion relation, but that's all beyond lambda CDM. Okay. Thank you. Juan García Vergido has another question. He asks: how do you expect to distinguish the host galaxy, and thus obtain its redshift, for events at high redshift, say z less than three, if the 90% area is large and transient sources are common in that huge volume? Yeah.
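[Editor's note: the dispersion relation being discussed can be made explicit. In the standard parametrization (the editor's rendering, not the speaker's slide):]

```latex
\omega^2 \;=\; c^2\,\bigl[1+\alpha_T(t)\bigr]\,k^2 ,
\qquad
c_{\rm GW}^2 \;=\; c^2\,(1+\alpha_T),
\qquad
\alpha_T = 0 \;\text{ in }\; \Lambda\mathrm{CDM}.
```

With $\alpha_T$ a function of time only, all wavelengths still travel at the same (possibly shifted) speed; only a scale-dependent $\alpha_T(t,k)$ would make the propagation genuinely dispersive, which is the beyond-lambda-CDM case mentioned at the end of the answer.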
I think, I guess, that's kind of a follow-up to the statistical analysis: if you have a large localization area, then you cannot really do much, and that's the answer. The estimates that I mentioned of 5% and 10% using black holes, I believe, simply throw away all the events that are badly localized. They put a kind of threshold on how good the localization is, and all the events localized better than a certain number of square degrees are taken into account; otherwise they're simply thrown away because, as the person asking is saying, they really don't add much information. Okay. Is there any other question here from the physics coordinators? Or should I keep reading questions on the YouTube channel? Let me check for another one. There is a comment by Shantanu Desai that says: GW170817 also sets tight constraints on the relative cumulative line-of-sight Shapiro delay between gravitational waves and photons. Okay. Let me go back. Let me put another question here. There's another question, again by Lautaro Vergara, which asks: are these effects measurable, given that the gravitational wave signal is obtained after matched filtering against a template? I mean, they are as measurable as I mentioned here. So these are realistic, or maybe he's asking if they are trustable or... Okay. Yeah. So Lautaro, if you are around, you can comment on that. Okay. Let me pick another one. There's another one by Mauricio Gamonal San Martín: can we interpret the alpha term as a dispersive term as well as a speed excess? So, the alpha term as a dispersive term. I guess this person is talking again about the dispersion relation and... Yes. Okay. So in this case, alpha M does not change the dispersion relation. It's the other one, my alpha t parameter, that changes the dispersion relation. Okay. I think we don't have more questions. Everybody's saying thank you. Thank you. Great talk. Leo Stein is cheering you on also.
And if we don't have more questions, I think that's it for today. Thanks for joining us. You can reach Macarena; I think she put up her website, where you can find her contact info if you have more questions about her papers or the talk. Please stay tuned over Twitter or any other social media for our coming webinars. We will have a very special one, our 100th webinar, and then we will announce why this one is also very special. So stay tuned. Thank you very much. Great. Thank you. Thank you, Macarena. Okay. Let's see, are we...