Awesome. Okay. So let me press go live and then we'll go live. Awesome. I think we are live. So welcome back, everyone. Thank you for joining us for today's physics webinar. My name is Alejandro and I'm going to be your host. Today we're presenting "Data-Driven Exploration of Cosmic Mysteries Using Gravitational Waves" by Suvodip Mukherjee. Suvodip got his bachelor's from Fergusson College, University of Pune, and then his PhD at the Inter-University Centre for Astronomy and Astrophysics in India, where he worked with Professor Tarun Souradeep. He then was a GRAPPA Fellow at the University of Amsterdam, a Lagrange Fellow at the Institut d'Astrophysique de Paris, and then a Flatiron Research Fellow at the CCA, the Center for Computational Astrophysics at the Flatiron Institute. He then moved to Canada to be a postdoctoral fellow at the Perimeter Institute. He's now a reader, equivalent to an assistant professor, at the Tata Institute of Fundamental Research in Mumbai, India, and he's the principal investigator of the research group Data Theory Universe Lab. He works on data-driven astrophysics and cosmology, using multi-frequency gravitational waves and electromagnetic waves to understand the history of the universe and its building blocks. We are delighted to have him with us today. So remember that you can ask questions over email, through our YouTube channel, or on Twitter, and the questions will be read at the end of the talk. So without further ado, we'll turn it over to Suvodip. Thanks for joining us. Thank you very much, Alejandro, and the other organizers, for inviting me for this talk and giving me an opportunity to present my work. I'll start sharing my screen now; let me know if everything is working properly. Perfect. Great. Okay, thank you so much for the nice introduction. As was already said, I work on data-driven science, or data-driven cosmology, and that is basically the first part of the title of my talk.
This says that I'm going to talk today about the data-driven exploration of cosmic mysteries using gravitational waves. When I say I'm going to do a data-driven exploration, the first thing we would like to understand is what we have learned about the cosmos from data so far. This is a very nice image created by our colleagues at Carnegie and MPIA, which very nicely shows, in one picture, how observation has helped us drive the theory of cosmology and understand its evolution. What we have learned so far is that the universe at very early times was radiation dominated, then at some point became matter dominated, and is now what we call dark energy dominated. The universe has also gone through several phase transitions: the universe is ionized now, and at some point it got reionized; in the earlier universe everything was in atoms, and if you go to still earlier times, again you go back to plasma. So this is the story we have. Of course, our low-redshift measurements are pretty good, and they helped us arrive at the cosmological model we call the Lambda cold dark matter (LCDM) model. What is important to understand is: when it comes to really understanding this model, do we still understand the model? We know it fits well, but we have yet to explore several aspects of it. I have listed only a very few questions here, but it goes without saying that there are many more things we do not know about the cosmos. We do not currently know the expansion rate of the universe, the so-called Hubble constant problem. What is cold dark matter? What is dark energy? Is it really the cosmological constant? Is the general theory of relativity the correct theory of gravity? And you have to appreciate that all these questions may not have been there 50 years back.
These questions actually came up with the help of our observations. And there are still several things we have not observed. We have not observed the very high-redshift universe so far. We have measured the cosmic microwave background, which is really high redshift, around redshift 1100, but between that and redshift 1 or 2 we have not had many observations so far. What is really interesting is that this era of astrophysics and cosmology is soon to be driven by observation. We already have several observations coming from the electromagnetic sector, because we have the help of several telescopes and several great scientists working day and night on these observations. We are able to map the sky from radio to gamma-ray, and this effort keeps on improving. That is one side of the story, where we have very good EM observations to map the cosmos. On the other side, we have basically opened up a completely new probe of the cosmos. We call it the beginning of GW astronomy, and it is now an observational probe, thanks to the LIGO-Virgo-KAGRA collaboration and of course several other ongoing efforts. So this is a slide that tries to explain, in a simple picture, that gravitational wave astronomy is not only about what LIGO-Virgo-KAGRA has seen so far; it's much more than that. Similar to electromagnetic observations, you expect gravitational waves across a huge spectrum, from something like 10^-15 or 10^-16 hertz up to thousands of hertz, and we have several ways to detect gravitational waves over this huge range of frequencies. The extremely low frequency gravitational waves we expect to detect from the B-mode polarization of the cosmic microwave background, through the efforts of several ongoing and upcoming detectors. In the nanohertz range, we expect gravitational waves to be detected from radio observations by pulsar timing arrays.
In the millihertz range, we expect to detect gravitational waves with the upcoming space-based detector called LISA. And at the extremely high end of this spectrum, around 10 hertz and above, we are currently able to detect gravitational waves with the help of the LIGO-Virgo-KAGRA collaboration. In the future, this effort will be joined by LIGO-India, Cosmic Explorer, and the Einstein Telescope. As you can probably understand, just like the different bands of other observations, these different frequency bands cover very different kinds of physical phenomena and very different kinds of sources. You can explore anything from very early universe scenarios to the epoch when compact objects like black holes, neutron stars, or neutron star-black hole systems are coalescing. So you are going to see a huge range of physics happening over cosmic time with the help of multi-band gravitational wave observations. And in this sector, we are at the very dawn of gravitational wave astronomy. We currently have about 88 to 90 sources detected so far by the LIGO-Virgo-KAGRA collaboration, with the O1, O2, and O3 observing runs combined. This is a very nice summary plot made by the collaboration, telling us what kinds of sources we have seen and what their typical masses are. There are a lot of details here, but this is our current state of the art with gravitational wave observations. How does this change in the future? If you ask me that question, I will say that the future looks pretty bright for gravitational wave astronomy. I am showing a plot on the left-hand side where you can see the typical range of masses we can probe using ground-based gravitational wave detectors. That mass range, in the source frame, is on the x-axis, and on the y-axis you see the typical redshift range out to which we expect to detect the sources as events, where we can confidently say this is a detection.
So with Advanced LIGO, we will be able to reach typically up to a redshift of about one for a system of total mass around 100 solar masses; that's maybe a ballpark number to keep in mind. But in the future, with Cosmic Explorer and the Einstein Telescope, the next generation of detectors, which are not yet funded but are very much proposed for funding, we would detect gravitational waves up to very high redshift, up to a redshift of 70 or 80. Now remember what I was showing you at the very beginning of the talk: our observations of the cosmos are currently limited to low redshift. We have seen something at very high redshift with the cosmic microwave background, which is great. But now we have a very different kind of source, which can be seen up to very high redshift using a completely different probe, complementary to electromagnetic observations. Okay, that's the story for the high-frequency gravitational waves. What about the millihertz gravitational waves, detectable from sources like supermassive binary black holes or extreme mass-ratio inspirals, with the space-based detector called LISA? Here again is a plot showing on the x-axis the masses of the binary systems and on the y-axis the redshift out to which we can detect the gravitational wave sources with a high signal-to-noise ratio, quoted in the numbers over there. Again, the take-home message is very similar: with the help of LISA, we are going to explore the high-redshift universe, with sources from typically a few thousand solar masses to 10^7 or 10^8 solar masses. Keep in mind that along with probing the cosmos using galaxies, quasars, the cosmic microwave background, or supernovae, we are now entering an epoch where we will be exploring gravitational wave sources across mass ranges out to very high redshift.
Each of these mass ranges corresponds to different kinds of sources, which help us understand the universe. And I am really quite fascinated to see when these two sectors, EM observations and gravitational wave observations, will join hands to tell us much more than what we have learned so far. So the question that keeps me awake at night is: how can we learn about the universe using these probes? What is the best way to combine sources, and what is the best way to understand new things which we have not explored so far? In this paradigm of multi-messenger astrophysics and cosmology, there are two directions I find fascinating. One is: how can we understand the universe using these compact objects as a probe? That's the upper arrow in this particular slide. And the lower arrow is: from our current understanding from electromagnetic observations, from radio to gamma-ray, or of stellar properties, what can we understand about the transient sources across cosmic history, going up to high redshift? This paradigm, which I have tried to show with a simple diagram, is practically possible because gravitational wave astronomy has some very salient features. The way I like to put it in one single slide is by trying to draw a four-dimensional graph in a two-dimensional plane, so please bear with me if there is some confusion in understanding it. What I find amazing about gravitational wave astronomy is these four letters: L, R, S, T, for length, redshift, source, time. Gravitational wave astronomy is going to be really great at exploring the cosmos because it has these four explicit handles on the universe. For the length scale, you can see that we can study the universe from the few-kilometer scale where black holes are coalescing, to something like gigaparsec scales as the waves propagate from a faraway galaxy to us.
So basically, from a few kilometers to cosmological scales, we can probe physics using gravitational wave sources. On kiloparsec scales, the physics at play in the galaxy actually gets imprinted on black hole properties; I'll talk about that in today's talk. About sources: as I already mentioned briefly, we have several kinds of sources with which we can measure gravitational waves, up to high redshift. Binary neutron stars (BNS), neutron star-black hole systems (NSBH), binary black holes, intermediate-mass black holes, extreme mass-ratio inspirals, supermassive black holes. All of these sources are detectable up to quite high redshift, so that's the additional dimension of redshift. And one of the very important things is time. These are transients, and their time scale of variation, or the time scale over which you detect the sources, can be a few hundred milliseconds to seconds, hours, up to a few years, depending on the source you are measuring. So this is an extremely fascinating epoch, when gravitational wave astronomy can help us study the universe, and in my opinion these four dimensions are quite impressive for a single probe, helping us understand quite a lot of physics. So this slide tries to tell you the kind of physics it can do, or more specifically, the kind of physics I explore using gravitational waves and other EM observations. Here I plot just one single plane of that four-dimensional picture: the length scale versus redshift. You can see that you can explore from the black hole scale, a few kilometers, to galactic scales, to cosmological scales with gravitational waves up to high redshift, either using individually detected events or using the stochastic gravitational wave background, which is not individually resolved.
So the range of physics you can do, as I have already mentioned here: on cosmological scales you can study the expansion history of the universe and test GR on cosmological scales. You can search for exotic objects like primordial black holes using gravitational waves. You can learn about galactic physics, and about how black holes are forming in the universe: what are their redshift distributions, what physics goes in. You can learn about gravitational lensing of gravitational waves, which we have not detected so far, as far as our Bayesian analysis tells us, but sooner or later it will happen. And finally, on extremely small scales, we can try to understand completely unexplored physics, sectors for which we have expected signatures but no measurements so far. And it goes without saying that I cannot do justice to all these things in my single talk today, so I will focus on just two topics. One is what we have learned, and what we can learn, about the formation channels of compact objects across redshift using gravitational wave sources. And then I will tell you about something quite new we can explore on very small scales using gravitational wave sources, in a completely new way that was impossible before. So the first part of my talk is going to be about the multi-messenger view of black hole populations. Broadly, there are two ways you can think about understanding black hole populations: from individually detected events, or from the stochastic gravitational wave background. Today I will be talking only about events, and not about the background. So when I talk about the population of black holes, what I want to understand is the formation channels of black holes.
One question to ask is: what do we really measure from observations so far, as of now? What can be neatly measured from the gravitational wave strain data is the mass, or to be more specific, the redshifted mass, which means the black hole masses we see are actually (1 + z) times the true mass. So a black hole of, say, 10 solar masses will appear as 30 solar masses if you put it at a redshift of two. That's what it means. Another thing we can very nicely infer from gravitational wave observations is the merger rate of black holes, or more generally the merger rate of compact objects, which means you can ask how many sources of what masses are coalescing at what redshift or what distance. That's another quite interesting thing we can infer from gravitational wave observations. We can also learn about spin, which is also an extremely interesting topic. I'm not talking about it today, because our current measurements of the spins are not great; they will be great soon. In today's talk I'm going to tell you what can be learned from those first two observables, so I'm primarily focusing on them. What we have now is whatever we have observed so far, around 90 binary sources. This is a simple, very nice cartoon diagram made by the LIGO-Virgo-KAGRA collaboration, trying to show what kinds of sources we have detected, with the mass ranges on the y-axis. In blue are all the black holes seen by LIGO-Virgo-KAGRA. In orange you see the LIGO-Virgo-KAGRA binary neutron stars, classified on the basis of their masses. In red, you see the black holes detected using electromagnetic observations, and you can also see the neutron stars from the EM observations. So you can clearly see we have a few populations of black holes detected, thanks to the collaboration.
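The redshifted-mass relation just described can be sketched in a couple of lines; this is a minimal illustration restating the example from the talk, not any analysis code.

```python
# Minimal sketch of the redshifted-mass effect described above: the
# mass read off from gravitational-wave strain data is the detector-
# frame mass, which is (1 + z) times the true source-frame mass.

def detector_frame_mass(source_mass_msun, z):
    """Detector-frame (redshifted) mass in solar masses."""
    return (1.0 + z) * source_mass_msun

# A 10 solar-mass black hole placed at redshift z = 2 appears as a
# 30 solar-mass black hole in the data.
print(detector_frame_mass(10.0, 2.0))  # -> 30.0
```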
So far, that is, up to the O1, O2, and O3 observations. So if I have to show you a summary plot of what we have learned about black hole masses from this data, let's ask this question. For the very first time, with these observations, we are able to plot the mass spectrum of compact objects; let's talk about black holes here. That's the plot you are seeing. It's a phenomenological model, the power law plus Gaussian peak model. What's really there in the data is that the mass distribution is roughly a power law: more black holes are present at the lighter end in comparison to the heavier end. And there seems to be a slight overdensity in the number of black holes around 30-40 solar masses, which cannot be explained by a simple power law. Remember, five or ten years back, nobody would have imagined such a plot. This is the first time we can say at which masses binary black holes in the universe exist; how they evolve with redshift is still to come. Well, that's one plot, and you can ask: this is a phenomenological model, did you try other models? Yes, we have tried a Gaussian peak model, a broken power law model, several models. More or less, our understanding is very clear: there seems to be a peak around 30-40 solar masses, after which the distribution falls off at the higher end, and the black hole mass distribution seems to follow a power law with a negative index, which means more sources at the lighter end in comparison to the heavier end. What about the merger rate, what have we learned about the merger rate of compact objects from our current observations? We have seen so far that the merger rate can quite nicely be explained by a power law of the form (1 + z)^kappa, with the value of kappa somewhere around, you know, two to three.
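As a rough illustration of the power law plus Gaussian peak idea, here is a small numerical sketch. The parameter values (slope, peak location, mixing fraction) are illustrative placeholders, not the collaboration's fitted values.

```python
import numpy as np

# Toy version of the power law + Gaussian peak mass model: a falling
# power law over [m_min, m_max] plus a Gaussian bump near ~35 Msun.
# All parameter values below are illustrative, NOT fitted values.
alpha, lam, mu, sigma = 3.0, 0.1, 35.0, 4.0
m_min, m_max = 5.0, 90.0

grid = np.linspace(m_min, m_max, 4000)          # masses in Msun

def normalized(y):
    """Normalize a sampled density so it integrates to 1 on `grid`."""
    area = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(grid))
    return y / area

power_law = normalized(grid ** (-alpha))
peak = normalized(np.exp(-0.5 * ((grid - mu) / sigma) ** 2))

# `lam` is the fraction of black holes living in the Gaussian peak.
p_mass = (1.0 - lam) * power_law + lam * peak

# The density falls toward heavy masses overall, but shows a local
# overdensity around 30-40 Msun, as described in the talk.
```

Mixing two individually normalized components keeps `lam` interpretable as the fraction of sources in the peak, which is the usual convention for such mixture models.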
Those who are experts in the star formation history will quickly say that, from electromagnetic observations, we have seen that the star formation rate in the universe typically goes like (1 + z)^2.7, the Madau-Dickinson star formation rate. And when you plot that kind of line, dashed here in the left-hand-side plot, you see there is quite a difference. And you would have expected agreement, because black holes are expected to form from stars, when you consider an astrophysical origin; I'm not talking about primordial black holes here. Great. So that's what we have learned so far from the observations with the phenomenological model. But now we can ask a somewhat deeper question: where are these binary black holes coming from? What's the physics behind them? Can we get something interesting out of these observations? One thing you need to understand is that in the LIGO-Virgo-KAGRA collaboration, with the help of gravitational waves, we do not detect a single black hole. We always detect a pair. That means a star collapses into a black hole, and that black hole needs to coalesce with another black hole that formed. And then they merge. So there is a delay between the formation of the stars and the merger of the black holes. Put simply, black hole mergers are going to be delayed. This is a very nice cartoon diagram by Martin; in her article, she shows, with a simple diagram, that there is going to be a delay between the formation of stars and the mergers of black holes. Depending on which formation channel we are talking about, we are going to have very different kinds of delay time distributions. Again, it goes without saying that we don't know much about this; it is a completely data-driven sector for gravitational wave observations. Now the question is: what drives it?
I mean, if I tell you that there is going to be a delay between the formation of the black holes and the mergers of the black holes, with respect to the star formation rate, then you can ask how the star formation rate evolves in the universe. That's the plot in the left-hand corner, which comes from the Madau & Dickinson 2014 paper, with the help of infrared and UV observations. There is a constraint on the star formation rate density, which peaks around a redshift between two and three and, at lower redshift, evolves as about (1 + z)^2.7. When you combine this with the delay time, meaning the stars form but the black holes take maybe hundreds of megayears to a few gigayears to merge, the corresponding merger rate of black holes can be obtained by integrating over the delay time distribution and the star formation rate. That's the plot on the right-hand side, showing the merger rate of black holes as a function of redshift. The final message from this plot is that, as you can see, when the delay time is longer, two gigayears shown in red, the rate peaks at a lower redshift in comparison to the blue line, for which the delay time is shorter, something like 0.5 gigayears. That's because in the red case the black holes have spent more time finding another black hole to merge with, so the peak shifts to lower redshift, because of the lookback time effect, in comparison to the blue line. That's an observable we can measure. Another thing we have learned from observations is the metallicity. Remember, I am telling you about astrophysical black holes. Astrophysical black holes form from stars. Stars are metal rich or metal poor depending on when they form, and one of the key stellar properties that drives the properties of black holes is their metallicity.
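The integration over delay times just described can be sketched numerically. This is a hedged toy implementation, assuming the Madau & Dickinson (2014) fitting formula for the star formation rate, a p(t_d) proportional to 1/t_d delay time distribution, and illustrative flat-LCDM parameters (H0 = 70 km/s/Mpc, Omega_m = 0.3); it is not the analysis code behind the plots in the talk.

```python
import numpy as np

T_H = 13.97                  # Hubble time in Gyr for H0 = 70 km/s/Mpc
OM, OL = 0.3, 0.7            # illustrative flat-LCDM parameters

def trapz(y, x):
    """Simple trapezoid-rule integral."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Tabulate lookback time t_L(z) once, by a cumulative trapezoid rule.
zg = np.linspace(0.0, 10.0, 2001)
f = 1.0 / ((1.0 + zg) * np.sqrt(OM * (1.0 + zg) ** 3 + OL))
t_lb = T_H * np.concatenate(
    ([0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(zg))))

def sfr(z):
    """Madau & Dickinson (2014) fit, in Msun / yr / Mpc^3."""
    return 0.015 * (1.0 + z) ** 2.7 / (1.0 + ((1.0 + z) / 2.9) ** 5.6)

def merger_rate(z_merge, t_min, t_max=10.0, n=400):
    """Un-normalized merger rate: SFR convolved with p(t_d) ~ 1/t_d."""
    td = np.logspace(np.log10(t_min), np.log10(t_max), n)  # delays, Gyr
    t_form = np.interp(z_merge, zg, t_lb) + td             # formation lookback
    ok = t_form < t_lb[-1]                                 # stay inside the table
    z_form = np.interp(t_form[ok], t_lb, zg)
    return trapz(sfr(z_form) / td[ok], td[ok])

# A longer minimum delay suppresses high-z mergers relative to low-z
# ones, pushing the peak of the merger rate toward lower redshift,
# exactly the red-vs-blue behavior described above.
short_delay = merger_rate(2.0, t_min=0.5) / merger_rate(0.2, t_min=0.5)
long_delay = merger_rate(2.0, t_min=2.0) / merger_rate(0.2, t_min=2.0)
```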
Black holes coming from metal-poor stars are going to be heavier in comparison to black holes forming from metal-rich stars. The question is: well, that's great, but how does that control the evolution of the black holes in the universe as a function of redshift? If you want to understand that, you have to understand how metallicity evolves in the universe. This again comes from the Madau & Dickinson 2014 review article: a data-driven measurement of the metallicity evolution of the universe, relative to solar metallicity, as a function of redshift on the x-axis. As you can see, the data tell us that at high redshift we have much less metal in comparison to low redshift. Now, when you combine this effect with the delay time distribution, as I told you, it gives you some extremely interesting physics. Remember, when you observe black holes, you never know whether these black holes are of astrophysical origin or primordial origin. We know astrophysical black holes form from stars. So if you really want to understand anything about a primordial origin of black holes, we had better first understand the astrophysical black holes. There is no way around that, because otherwise you can never tell whether these black holes are really primordial or astrophysical. Coming back to the point of how this metallicity evolution affects the black hole masses: I already told you that black holes forming in low-metallicity regions are going to be much bigger. That typically comes from a couple of pieces of physics. First, the typical Jeans scale is much larger, so stars which form in metal-poor regions are big, and the black holes are heavy. Also, when these stars form very big, they can undergo pair-instability supernovae, and they can throw away a lot of material through winds, and the amount of material they throw away through winds depends on the metallicity.
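One commonly used parametrization of this falling metallicity trend is the Madau & Fragos (2017) fit for the mean cosmic metallicity. It is quoted here as an assumption, to make the trend concrete; it is not necessarily the exact curve used in the talk.

```python
# Hedged sketch: mean cosmic metallicity relative to solar, using the
# Madau & Fragos (2017) fit log10(<Z>/Zsun) = 0.153 - 0.074 * z**1.34,
# quoted here only to make the qualitative trend concrete.

def mean_metallicity(z):
    """Mean metallicity <Z>/Z_sun as a function of redshift."""
    return 10.0 ** (0.153 - 0.074 * z ** 1.34)

# Metallicity falls steeply with redshift: stars at high z form metal
# poor, which (per the argument above) favors heavier black holes.
for z in (0, 2, 4, 8):
    print(z, round(mean_metallicity(z), 3))
```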
Long story short, it implies that stars forming at high redshift, in metal-poor regions, are going to give black holes which are bigger, or heavier, in comparison to black holes forming at low redshift. That is the very simple picture I am trying to show you in this plot: black holes forming at high redshift are going to be heavier than black holes forming at low redshift. But now, I told you these black holes are not going to merge right after they form; they are going to merge after a delay time. That means the black holes you see merging today must have formed from stars at a higher redshift. And this delay time is not a unique number. It's not that it's a unique number we don't know; no, it's a distribution function, because black holes from different formation channels, with different properties, are going to have very different delay time distributions. So we know neither the minimum time for black holes to merge nor the distribution; there are quite a few unknowns. This implies that when you see black holes at a lower redshift, these black holes are not going to come from one particular cosmic redshift higher up, but rather from a range of redshifts. And if they come from a range of redshifts, you are going to mix black holes of different masses. It's an inevitable effect: you are going to mix black holes of different masses. That means when you observe a black hole population with gravitational wave data, these black holes are observed to merge at a particular redshift, but they did not form from stars at one particular redshift; they may have formed over a range of redshifts. What's the implication of that? The implication is that the probability distribution of the black hole masses can become redshift dependent, depending on the redshift at which they merge and the typical delay time distribution of the black holes.
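A toy Monte Carlo can make this mixing argument concrete. Everything here is a made-up illustration: the linear map from delay time to formation redshift and the "heavier when metal poor" mass scaling are invented placeholders, not the model from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy illustration of the mixing argument: black holes merging at a
# fixed redshift formed over a RANGE of earlier redshifts, so their
# observed mass distribution blends populations. All relations below
# (delay-to-redshift map, mass scaling) are invented placeholders.

def sample_observed_masses(z_merge, n=50000):
    # p(t_d) ~ 1/t_d between 0.5 and 10 Gyr (log-uniform sampling)
    td = np.exp(rng.uniform(np.log(0.5), np.log(10.0), n))
    z_form = z_merge + 0.4 * td          # toy delay-to-redshift map
    m_typ = 10.0 + 4.0 * z_form          # toy "heavier when metal poor"
    return rng.normal(loc=m_typ, scale=3.0)

low_z = sample_observed_masses(0.2)
high_z = sample_observed_masses(0.8)

# The population merging at higher redshift is, on average, heavier,
# and each population is broadened by the spread in formation epochs.
print(low_z.mean(), high_z.mean())
```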
They may be coming from stars formed at very high redshift, when the universe was pretty metal poor; hence the black holes can be heavier in comparison to the low-redshift universe, where the metallicity is high and you start seeing lighter black holes. There's a plot showing that at the lower-mass end there is not much of an impact from this effect, but at the higher-mass end we can get a significantly different distribution. So now the question is: how does this model fit the observations, our current data? Here is the plot of the detected black hole masses, meaning the redshifted masses, on the y-axis, as a function of luminosity distance on the x-axis, with redshift along the top. On the left-hand side you are seeing the first companion masses, and on the right-hand side the second one, with the underlying assumption that M1 is always greater than M2; that's the convention. In this data you can see the black hole distributions we have seen so far. We have not seen very light black holes at very high redshift, because the gravitational wave signal emitted by them is weak and we cannot detect them. So most of the black holes we have seen are typically in the lower mass range, but they still extend up to higher masses. So, can I explain this observation with the stellar physics we know, or do we really need anything exotic? Now I am showing you a plot of how the black hole population seen in gravitational wave data compares with the physics-driven model I just discussed, using this mixing of black hole populations. Of course this plot looks extremely annoying to read, because everything is so small and tiny.
The main take-home from this plot is that, because of the underlying astrophysics, there are a lot of differences and correlations which one has to explore from observations. Right now, as we often say, our best-fit cosmological model is a six-parameter model. If I have to tell you about my best-fit gravitational wave population model, that's not going to be six parameters; rather, we need maybe twelve-odd parameters, and we need to really vary these in the setup of a Bayesian framework. I'm going to show you a couple of summary plots, obtained by marginalizing over a lot of astrophysical parameters. What have we seen so far? For these black holes, I told you we see a typical peak around 30-40 solar masses. We have found that these black holes can be explained by pair-instability supernovae, with a mass scale around 40 to 50 to 60 solar masses, as shown in the top left plot, which agrees with our current understanding of where the lower edge of the black hole mass gap lies. We don't see any strong evidence of redshift evolution, because of the small number of sources, but you can clearly see from the right-hand plot that there is a trend towards redshift evolution of the black hole masses: the black hole masses are heavier as you go to higher redshift, and that is driven by the very fact that the metallicity at higher redshift is lower. One thing I must point out: you can ask, isn't this an effect of the prior, aren't you assuming metallicity decreases? The answer is no. If you look at the parameters alpha_z and gamma_z, right in the corner of this huge, annoying corner plot, those control the metallicity variation, and you can see that the gamma_z parameter is tightly constrained to be negative, not positive.
Not positive means that metallicity increasing with redshift has been completely ruled out by gravitational wave observations alone, which agrees with the EM observations as well. And what's interesting, if you plot the merger rate of black holes as a function of mass and as a function of redshift now, you tend to see a mild evolution with redshift, as you can see in the bottom plot. I'm showing you the merger rates of black holes as a function of mass: in blue, you see black holes merging at a redshift of 0.2, and in green you see the plots for black holes merging at a redshift of 0.8. The main take-home message is that we are seeing heavier black holes at a larger distance in comparison to black holes at a lower distance. That's something we have seen in the data, and it is reproduced by this model and these constraints. And now you can ask: well, now that we have a physics-driven model, can we put some constraints on the delay time distribution of the black holes? Here is a plot of the delay time distribution of the black holes from the current data, GWTC-3, for different kinds of population assumptions. As you can see, right now the constraints are quite weak, but one interesting thing is that delay times above 2.5 or 3 gigayears seem to be strongly constrained for the orange curve, while for the green one and the blue one they are quite unconstrained. What this means is that black holes can come from very, very high redshift and coalesce today. Another way to put what I am saying: the black holes we see merging today in the LIGO-Virgo-KAGRA observation band may be coming from high redshift. That means this is a completely new probe to study the high-redshift universe. Remember, the stellar properties of black hole progenitors at high redshift are not well measured from observations so far.
So gravitational waves are actually opening a completely new window to learn about the stellar physics happening at very high redshift, by studying black holes, which are the end products of stars. On the right-hand side you are seeing the merger-rate constraints from the gravitational-wave observations. Again, it agrees pretty well with the GWTC-3 observations from LIGO, showing that the merger rate is driven by the star formation rate, which peaks at a redshift between about one and two. One of the quite interesting events detected in gravitational waves, maybe known to many of you, is GW190521, a very heavy binary black hole with a total mass around 150 solar masses. After this came up, people started arguing that such a source is unlikely to be of astrophysical origin, because you do not expect such high masses. But we can explain it with this physics-driven model: because the metallicity of the universe evolves as a function of redshift, black holes of these heavy masses can form at high redshift, when the metallicity of the universe is low. And if these black holes form from stars that are extremely metal-poor, you can expect them to merge even at low redshift, because of the long delay times. In other words, GW190521 is not an outlier according to the physics-driven model I am presenting. To summarize, not my whole talk, but just this part of the work: the pair-instability supernova mass scale inferred from gravitational-wave observations agrees with the theoretical range; the model predicts heavy binary black holes detectable at low redshift, like GW190521; and constraints on the delay-time distribution can be obtained with this model.
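The picture just described, mergers today coming from stars formed at high redshift, can be sketched as a convolution of the cosmic star formation rate with a delay-time distribution. This is a minimal toy, not the speaker's code: I assume the standard Madau-Dickinson SFR fit, a p(t_d) proportional to 1/t_d delay-time law, and a deliberately crude matter-dominated lookback-time formula so the example stays self-contained.

```python
import numpy as np

def trapz(y, x):
    """Simple trapezoidal integral (avoids NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def sfr(z):
    """Madau-Dickinson (2014) cosmic star formation rate density fit."""
    return 0.015 * (1.0 + z) ** 2.7 / (1.0 + ((1.0 + z) / 2.9) ** 5.6)

def lookback_time(z, t0=13.8):
    """Toy matter-dominated lookback time in Gyr (illustrative only)."""
    return t0 * (1.0 - (1.0 + z) ** -1.5)

def merger_rate(z_merge, t_min=0.05, t_max=13.0, n=2000):
    """Relative merger rate at z_merge: SFR convolved with p(t_d) ~ 1/t_d."""
    t_d = np.logspace(np.log10(t_min), np.log10(t_max), n)  # delay times, Gyr
    p = 1.0 / t_d
    p = p / trapz(p, t_d)                                   # normalize p(t_d)
    t_form = lookback_time(z_merge) + t_d                   # formation epoch
    valid = t_form < 13.8                                   # within cosmic age
    # Invert the toy lookback-time relation to get the formation redshift.
    z_form = (1.0 - t_form[valid] / 13.8) ** (-2.0 / 3.0) - 1.0
    return trapz(sfr(z_form) * p[valid], t_d[valid])
```

Even this toy shows the qualitative point of the slide: long delay times let mergers observed at low redshift trace star formation at much higher redshift.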
And it is very interesting that a concordant picture of the metallicity evolution, the star formation rate evolution of the universe, and the black hole distribution now seems to emerge. That is very cool, because it means we can learn about the high-redshift universe using the black holes that are merging now and observed with current gravitational-wave detectors. That is one part of the talk. You can ask what more we can do at the frontier of gravitational-wave populations. I told you I can use metallicity to understand black hole masses, and the metallicity evolution and delay times to understand how the population evolves. So the natural question to ask, since I started talking about multi-messenger science at the very beginning of my talk, is how the multi-messenger part actually plays a role. What is interesting is that the black hole properties of the universe can actually tell us about its chemical composition. That is because the kinds of stars black holes form from depend on the metallicity evolution, and how many black holes merge depends on the star formation rate. And the metallicity evolution of the universe is traced not only by gravitational waves, but also by other probes like emission lines. You can see emission lines of oxygen, hydrogen, nitrogen, carbon. You can observe emission-line galaxies, and you can do line-intensity mapping of different emission lines with electromagnetic observations. Here you can see a snapshot from a cosmological simulation showing where you expect to see emission lines: in blue you see extremely bright galaxies, which we call emission-line galaxies, and the diffuse background, not detected galaxy by galaxy, is what you can see with line-intensity-mapping observations in the future. The key fact is that metal lines can be detected using emission-line galaxies or line-intensity mapping.
The same metal lines control the stellar properties and hence the black hole properties. As a result, there is going to be a direct correlation between the observed black hole mass distribution and the emission-line and gas properties. So it is a completely new way to study the chemical evolution of the universe using multi-messenger observations. Here I am showing you a plot of how this can play a very interesting role. It is a busy plot, so please bear with me. In different colors you are seeing the redshift evolution: from blue to red you go from a redshift of zero to 2.5. On one axis you are seeing the number of gravitational-wave sources we expect to detect, and on the other the typical number of emission-line galaxies you will measure. Now, the fact that the metallicity of the universe evolves as a function of redshift means the emission-line distribution is going to follow the metallicity evolution, so you expect a very interesting correlation between the emission-line galaxy properties and the gravitational-wave merger rate. Here I am showing, with large marker sizes, the case of long delay times, meaning black holes take more time to merge, and with small marker sizes the case of short delay times, where black holes merge sooner, for different emission lines: OII, OIII and H-alpha. What is quite interesting is that, because both are driven by the same physics, you expect a correlation between these two sectors. One can measure this from observations. That means that with a good population model of the black hole properties, I can tell you about both the black hole properties and the chemical evolution of the universe, by combining the emission-line signal from upcoming surveys such as SPHEREx with the black holes detected by LIGO-Virgo-KAGRA.
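The claimed correlation can be demonstrated with a toy model. Everything here is illustrative and assumed by me, not taken from the paper: a metallicity that falls with redshift, a heavy-black-hole fraction that grows as metallicity drops, and a crude emission-line emissivity tracer; the point is only that two observables driven by the same underlying metallicity evolution come out strongly correlated.

```python
import numpy as np

z = np.linspace(0.0, 2.5, 26)                 # redshift bins of the plot

# Toy metallicity evolution: metals decrease toward high redshift.
metallicity = 10.0 ** (-0.15 * z)

# Metal-poor epochs make more heavy black holes (toy relation).
heavy_bh_fraction = 1.0 - metallicity / metallicity.max()

# Toy emission-line emissivity: metal lines scale with metallicity,
# modulated by a rising volume/star-formation factor.
line_density = metallicity * (1.0 + z) ** 1.5

# Both observables trace the same metallicity history, so they correlate.
r = float(np.corrcoef(heavy_bh_fraction, line_density)[0, 1])
```

In the real analysis both quantities also depend on the delay-time distribution, which is exactly why measuring this correlation constrains the formation channel.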
This is the paper I have written with my collaborator Azadeh Moradinezhad Dizgah, now published in ApJ Letters, where we have shown how the formation channel, mainly the delay-time distribution, can be constrained from this new correlation we pointed out, using LIGO-Virgo-KAGRA together with the line-intensity-mapping signal from SPHEREx and the emission-line galaxies from DESI. You can clearly see that, from a quantity that is currently measured extremely poorly, we are going to get a 10 to 20% measurement of the delay-time distribution and of its typical overall shape from these observations. I find this quite fascinating, because we are going to explore the extremely high-redshift universe on galactic scales using black holes. Remember my very first plot: black holes give us information over quite a large range of redshift and over a very different landscape. So far I have talked about the landscape we can explore using gravitational waves on galactic scales, driven by metallicity evolution and the star formation rate. The other thing I promised to talk about today is the extremely small-scale physics we can learn from gravitational-wave observations, or basically things we do not know about so far, because who has measured physics near black holes on scales of a few kilometers? So here comes a way to try to understand unexplored physics using gravitational-wave observations: search for deviations from your fiducial model, or your favorite model. Here I am showing you, let's say, binary black holes coalescing. On the left-hand side you are seeing a gravitational-wave strain, with your best-fit model hidden somewhere in the noise. This figure was created by a PhD student at Perimeter Institute, Guillaume Dideron.
Let's say this is the model you are considering, but it is not the true model; the actual signal is something quite different, shown by the green line on the right-hand side, and there is a deviation. You do not know about the green one; you only know about the red. How are you going to measure the actual physics? And ask the question: what can drive this? Well, there are many things. Unmodeled GR effects: GR may not be the true theory of gravity. Unmodeled cosmological effects: as a gravitational wave propagates through spacetime, there can be effects of cosmological origin, such as lensing or other propagation effects, which can distort the gravitational wave. There can be completely unmodeled unknown physics that you do not know about. And there can be unmodeled known astrophysics: the black holes may sit in a thick accretion disk, where the inspiral gets slightly delayed by drag or friction effects. And remember, one more thing is very crucial: you cannot forget about unknown detector noise and glitches. When you really want to make a clean measurement for the first time, you do not know a priori what is signal and what is noise. Well, these are the interesting physical effects that can happen. The question is: how can I explore these unmodeled effects from the data? I have proposed that there are a couple of handles, and again the same physics-driven hammer comes in here, which can help you explore even the unmodeled. What are those? Number one: if there is a genuine common effect, the green line differing from the red line, it means you have a residual of astrophysical origin, not coming from your detector, and it is going to be common across multiple detectors. That is a key feature. Number two: a deviation you see in the gravitational waveform can be modeled as a loss or gain in energy relative to the fiducial model.
Number three: these deviations are likely to depend on the source properties of the gravitational wave, such as its masses, its spins and so on. So what we, myself and my collaborators listed here, including Luis Lehner, have proposed is a completely new technique called SCoRe, a source-characterized correlated-residual search, to look for these effects in observations, by hammering on these three physical features. First: physical effects will be common across multiple detectors. So what we propose to do is take the residual between the best-fit gravitational-wave model and your data, and cross-correlate that residual signal between detectors to find what is common. The second step is a physics-driven way to ask whether there is any coherent deviation from your model. One of the key challenges is to distinguish a physical effect from detector noise. The cross-correlation in the first step helps you beat down the uncorrelated noise, which is great, but it is not sufficient if you are after very tiny deviations. Remember, when you go looking for new physics, it is not going to show up boldly as a loud signal in your data. There are many reasons for that: not just that we do not know the physics that is going on, but also the degeneracy between the source parameters and the new physics you are modeling. It is possible that the new physics you are trying to measure mimics some other source parameter of general relativity. As a result, any tiny deviation needs to be measured very carefully from the observations. So a physics-driven model, or template, to search the data with would be really nice. That is the second step: projecting the correlated signal onto a template that depends on the gravitational-wave source properties, such as its mass and spin, rather than just blindly combining.
And these are the three salient features of our method, SCoRe. So let's take an example of how SCoRe performs. On the right-hand side you are seeing a standard best-fit model in blue, and some beyond-model signature, it does not matter what for now, in green. Let's say I have generated the data; this part of the work has been led by Guillaume Dideron. We have obtained mock data of the gravitational-wave signal, and the residual signal in two different detectors with uncorrelated noise: that is what you are seeing at the top left. Now you cross-correlate these two residuals between the two detectors, and the green line, your beyond-model signature, clearly shows up as a correlated residual. Even after you have correlated, remember, you still have a lot of noise and spurious similarities; it is not so easy. Then what do you do? You take an agnostic approach on the template you choose and project this cross-correlated signal onto that template. When you combine several sources in a physics-driven way, and this particular plot is obtained from around 500 gravitational-wave sources, you start seeing that it infers the injected deviation, the green line, quite nicely, as shown on the left-hand side. The parameter alpha_1 is the strength of the deviation from the best-fit model. Now, something important to understand: I keep talking about a template, but how do we know the template? I don't know the model, I don't know the physics going on, yet I am using a template. Here comes a very important aspect of the method: SCoRe can search for both model-dependent and model-independent deviations from your fiducial model.
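The cross-correlation step just described can be sketched numerically. This is my own toy, not the mock data from the paper: a hypothetical chirplet-shaped deviation is buried in two residual streams with independent Gaussian noise, and the zero-lag cross-correlation pulls out the common part while a pure-noise control averages toward zero.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 4096
t = np.linspace(0.0, 1.0, n)

# Common "beyond-model" deviation hidden in both detectors (the 40 Hz
# Gaussian-windowed chirplet is purely illustrative).
deviation = 0.5 * np.sin(2 * np.pi * 40 * t) * np.exp(-((t - 0.5) ** 2) / 0.05)

# Residual streams in two detectors: same deviation, independent noise.
res1 = deviation + rng.normal(0.0, 0.5, n)
res2 = deviation + rng.normal(0.0, 0.5, n)

# Zero-lag cross-correlation: uncorrelated noise averages toward zero,
# while the common astrophysical residual survives.
common_power = float(np.mean(res1 * res2))

# Control: two pure-noise streams with no common signal.
noise_power = float(np.mean(rng.normal(0.0, 0.5, n) * rng.normal(0.0, 0.5, n)))
```

The full method then projects this correlated residual onto source-dependent templates and stacks many events, which is what suppresses the remaining noise fluctuations.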
A simple way to picture this: let's say you have a beyond-standard-model signature on black hole binaries, or compact object binaries, for given component masses, and the blue curve is the true residual power of the deviation. I have not told the technique at all that there is a deviation of that form. But because I model the deviation in terms of the source parameters, I vary the source parameters and build the template, and you can start seeing that, depending on the source parameters, the blue line tracks the green line. Of course it will not track it perfectly; it cannot, because this is not matched filtering of one known model against another, it is a completely data-driven hammer. There is noise, there is uncertainty. But the cool thing is that it tracks the right regime. So if we combine several sources, we can measure tiny deflections or distortions that exist in gravitational waves; these are very difficult to explore and currently are not explored in their full glory. That is the method SCoRe, and there are ongoing efforts on this. All I would like to say is that this is a new way to explore unmodeled physics. So everybody who loves unmodeled physics, whether it is dark matter signatures on gravitational waves, lensing, beyond-GR effects, effects of accretion disks, effects of waveform systematics, or whatever unmodeled scenario you can come up with: SCoRe is great at exploring those sectors. Certainly SCoRe is a very new frontier, so it will also have multiple shortcomings and challenges to overcome, since it is at a very early stage, and more work needs to be done on this.
And with this I would like to conclude my talk, telling you that data-driven exploration with gravitational waves really is a new frontier. We are not only going to confirm the physics and astrophysical phenomena we have already learned from electromagnetic observations; rather, we will learn something very new from this sector. With the help of multi-band observations, we can do inference and data analysis with robust statistical techniques to tell us about astrophysics, cosmology, and the fundamental physics going on. And using fiducial models, our best understanding so far, we can make predictions which can be tested against current observations. So it is really a circle of inference and prediction, which go together in this data-driven exploration using gravitational waves. And it is possible because of the multiple detectors, operational and upcoming. To give you a snapshot: the two LIGO detectors and Virgo are currently operational; O4 will be starting next year, with KAGRA joining; LIGO-India is approved, and is most likely going to happen by the end of this decade, if not later; LISA is funded and likely to be operational from 2035 or 2036, for about four years or so; and in the future, proposals are ongoing for Cosmic Explorer and the Einstein Telescope in the 2040s. I would like to conclude with some of the points I mentioned today. Gravitational-wave sources can tell us about the high-redshift universe; that is a take-home message you should remember. Not in the future, but now, from the population of sources we have measured, the binary black hole mass distribution can tell us about their formation channels. Soon we are going to show plots, with the help of LVK observations, of how the black hole population evolves as a function of mass and redshift.
And such a plot will tell us not only about astrophysics, but also about whether black holes of non-astrophysical origin exist in the universe or not. That is key, because primordial black holes are still hypothetical objects; to discover them, we need to first understand how astrophysical black holes are distributed. The synergy of emission-line galaxies (ELGs) and line-intensity mapping with gravitational-wave observations can tell us about the delay-time distribution and its dependence on the chemical composition of the universe. And finally, for the extremely small-scale physics, the things happening near the binary compact objects, we have proposed a new method, SCoRe, the source-characterized correlated residual, which can search for unmodeled physics in gravitational-wave observations. With this I will end my talk, and we can take questions. Thank you very much. Awesome, thank you for this comprehensive and nice talk. I see we are a little bit past the hour; let me check the chat. No, no worries. So let me see if there are questions. Let me start with one of my own. If you go back to the mass distribution of the binary black holes: is there a physical explanation, I don't know if you said something about it, for why we have these two peaks? Is there a way to nicely understand it? Yeah, got it, that's a very good point. Try to understand it this way: suppose you have black holes coming from the stellar population. The first question to ask is how the stellar masses are distributed. We have not seen stars of very high masses so far; that end is very poorly measured. At the lower end, measurements seem to say there is a power-law distribution; one of the standard models for it is the Kroupa mass function. So the black holes follow something like a power-law form. On top of that, black holes are forming from stars, including extremely big stars.
Stars above the pair-instability threshold blow away all their material in the supernova, so they do not leave remnants at the extremely high end. That means you are going to form black holes along the power law, up to a mass which is limited by the pair-instability supernova, and that supernova mass scale is metallicity-dependent; this is a key point to remember. So now, if you have a power-law distribution and black holes cannot easily form at the higher end, you start piling up a lot of black holes around the cutoff mass, and you form another peak. That is our explanation for the second peak, which is seen in the black hole distribution between 30 and 40 solar masses. You can ask, why that range, why 30 to 50? Because these black holes are coming from stars whose metallicities are not fixed; stars come, of course, with different metallicities across cosmic time, depending on the galaxy properties. So this peak is going to be slightly smeared around its true value, and possibly there is a hint of that in current observations. But for sure, in the future, with O4, this understanding will improve, or we can even rule out such models. Awesome, thank you. Okay, I received this question: even if we do not have data on the spin, can this modeling of black holes that you are talking about tell us something about the spin? Yes, for sure. Depending on the formation channel of the black holes, you can also start predicting the spin distributions. You can ask whether the black holes are coming from hierarchical mergers or from isolated binaries; those are going to have very different spin distributions. In this work we have not included that at all, and one of my students is currently working on that project. But the simple answer to that question right now is yes.
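The pile-up argument above can be made concrete with a toy mass function. This is my own sketch, not the fitted model from the talk: a power law truncated at an assumed pair-instability cutoff of 40 solar masses, plus a Gaussian pile-up near 35 solar masses whose location, width, and weight are all illustrative.

```python
import numpy as np

def trapz(y, x):
    """Simple trapezoidal integral (avoids NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

m = np.linspace(3.0, 60.0, 600)        # black hole mass grid, solar masses
alpha, m_min, m_cut = 2.3, 5.0, 40.0   # toy slope and PISN cutoff

# Power-law component: dN/dm ~ m^-alpha between m_min and the PISN cutoff.
power = np.where((m >= m_min) & (m <= m_cut), m**-alpha, 0.0)
power /= trapz(power, m)

# Pile-up component: progenitors just below the cutoff stack remnants
# near ~35 Msun (smeared by the spread in progenitor metallicities).
peak = np.exp(-0.5 * ((m - 35.0) / 3.0) ** 2)
peak /= trapz(peak, m)

# Mixture: bulk power law plus a secondary pile-up peak.
pdf = 0.9 * power + 0.1 * peak
```

Even with a tiny 10% pile-up fraction, the mixture produces a clear local excess near the cutoff on top of the steeply falling power law, which is the qualitative shape of the observed second peak.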
Assuming we have a posterior distribution of the spins and their redshift evolution, we can actually start putting constraints of this kind. I must mention one thing here: recently, again with Christos Karathanasis, we have put out a code called GWSim, which is a population-centered model that also incorporates the spin distribution. It is an open-source code to generate black hole populations, including spins and mass distributions, for different cosmological models. So if you want to ask this question yourself, just go ahead and play with it. Okay, let's see if there is a follow-up question. I think you have a question? Yes, I have a question. I was wondering, because for all these analyses, if I understood correctly, you use Bayesian analysis, right? So is there any room for training, for using machine learning techniques, to explore the data-driven part? Totally; machine learning is one of the important things when you have a lot of data and you can train it well. So yes, the answer to your question is yes. One of the key challenges in such problems, as we have been saying, is that Bayesian methods are costly. One of the approaches in the future will be to push this towards machine learning, basically training it to predict the posterior from observations, to get something faster compared to what we are doing now. But there is, as with maybe all machine learning problems, the caveat that everything depends on how well you can train on your data set. If you are really going after new physics, and you are paying billions of dollars to measure something new, you do not want to simply rediscover what you trained on. So even if I measure something with a machine learning technique to get it faster, I will still want to cross-check with a Bayesian technique.
And I will ask: is there any mismatch with the machine learning result? If not, that's great; but if yes, that is something we would have missed. So I would be careful on that front. Okay, thank you. I don't see more questions here, so perhaps I will ask the final one. At some point in the last part of your talk I missed something a little bit. You said you can use SCoRe to search for unmodeled physics, when we don't know the physics of the model. But I think you also mentioned you can search with your method for a specific physical mechanism. Could you please explain that? Yeah, great, this is good because I can say something more about SCoRe. Let me rephrase the question: I made the statement that I can also explore specific physical effects using this unmodeled search technique, SCoRe. How is that possible? Basically, through the step where you project onto a template. Suppose you have gravitational-wave data. You take your favorite model, the one which has maximum support from the posterior, say via the maximum-likelihood estimate and so on. Now you have a residual on the data. Now you ask: beyond your favorite model, whatever it is, let's say my favorite model is to search for deviations from GR via some effective-field-theory approach, does that effect exist in the data or not? If I am a theorist, I possibly know how that signal would look; I can make a template for that particular signal, project it onto the cross-correlated signal, and ask whether there is support for it in the data. What it means is that you can use this technique to search for very specific models as well. And if you don't know the model, you can take a different approach: let's say, agnostically, that any deviation from general relativity can be modeled as an energy loss or gain compared to the fiducial waveform. That's... yeah, go ahead.
But what if I start from this EFT theory and do the whole analysis from the beginning? You're saying, perhaps, that the excess power at a particular location, or whatever it is, is going to disappear if you get that model right, if you are modeling it? Yeah, that's right. What it means is: suppose there is a residual, say slightly more residual power in certain frequency bands compared to the model. Okay, perhaps what I'm missing is why this is different from starting from, let's say, my favorite theory, and then doing the full analysis. Okay, got it: that is because there are plenty of theories. It is very, very difficult to build waveforms and run parameter estimation for all the different waveforms occupying a huge theory space and parameter space. Think about the problem like this: you have 15 parameters to explore for the gravitational-wave source properties, and maybe 10 theories; it is going to be a huge amount of effort, and you may not find anything. So you need SCoRe, because GR is great, it has nice predictions, and unfortunately there is no single alternative theory which is so popular so far. That means you need a technique which is fast, where you can combine hundreds of black holes, with many cycles over a large frequency range, and which quickly tells you whether you see something interesting. If so, go chase it; that is what SCoRe gives you. Okay, awesome, thank you very much. Let me see, I don't see more questions, so people can go to your webpage and look for your email or other ways to contact you. And I believe this is the last webinar of this year; we will reconvene in late January or early February, we still have to check, so stay tuned. So thank you very much. Thank you very much for the invitation, and thank you everyone for attending the talk. Okay, see you soon.