Okay, thank you. Thank you to the organizers for inviting me here. Also, thank you for sparing me the indignity of being chaired by my wife, and probably bossed by my wife, at least on this occasion. I will be telling you something about dynamic scaling. So, I think that biology is too complex for us. When I say us, I mean physicists. Some people could say that it's too difficult for us; it's the same thing. So, I'm part of an institute called the Institute for Complex Systems. But actually, I think that what we do is to try to brutally simplify complex systems. So, sometimes when we simplify systems, in particular biological systems, we just abuse them. So, we do something bad. On other occasions, more rarely, we do things which actually simplify the system and also make us understand new things. So, this is the best-case scenario. It almost never happens. But today I would like to tell you something about one such tool that we use in physics, which is called scaling. A lot of the talk will be about telling you the virtues of this tool. It's a very well-known thing. I didn't work out any of these things; these are very well-known, venerable concepts in statistical physics. But my point is that we may try to apply this kind of analysis also to biological systems. So, it's basically a tool to simplify the system you are studying. So, what are we simplifying? I will focus on something called the correlation function. The correlation function measures to what extent the deviation from the average behavior of the group at one position in space and time influences, and is influenced by, the deviation from the average behavior of the group at another point in space and time. So, imagine that you are monitoring, for example, the trajectories in a swarm, as I will be doing. You have the velocities of each individual. So, then you compute the velocity fluctuation: it is the change in velocity with respect to the mean, for individual i at this position.
And you multiply that with the fluctuation of individual j at this other position. And so you have two natural variables: the distance in space, but also the distance in time. It's a space-time correlation function. Now, there are many reasons why correlation functions are such a big deal in physics. And I think I built most of my career simply by selling correlation functions to other fields, not doing much else. Yes, Francesca, I know. Thank you for reminding me. But it's really something very, very important, and there is one dry but relevant reason. When you have a theory that describes your correlation function correctly, I mean at the quantitative level, well, you are in a very good position. And typically the theory is, you know, quite good, almost correct. Because it's tough to get the correlation function right. So it's a very demanding object. So our experience in physics tells us that if your theory reproduces the correlation function correctly, you are OK. Of course, in biology it could be different, it could be very different. But still, I will use it as a necessary condition to start. If you don't reproduce the correlation function correctly, probably there is something wrong in your theory. It's just a very tough, demanding object that you can use to test your theory or your model. However, there is a problem: apart from space and time, what does the correlation function in general depend on? Well, the answer is: on many things. So in physics we have some understanding of what the external factors affecting the behavior of the system are: first of all the dimension of space, the symmetries, the conservation laws, the temperature, the pressure, still many factors. But more or less we know a priori what they are. But in biology, the complexity really explodes in your face. A quantity like that could depend on pretty much anything.
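A minimal numerical sketch of this kind of estimate, assuming tracked positions and velocities stored as (T, N, 3) arrays; the function name, the distance binning, and the equal-weight averaging are illustrative choices, not the group's actual pipeline:

```python
import numpy as np

def spacetime_corr(pos, vel, r_bins, dt):
    """Estimate C(r, dt): the average dot product of velocity fluctuations
    of pairs (i, j), binned by their distance r, at a time lag of dt frames.

    pos, vel : arrays of shape (T, N, 3), T frames of N individuals.
    r_bins   : array of distance bin edges.
    dt       : time lag in frames.
    """
    nbins = len(r_bins) - 1
    num = np.zeros(nbins)
    cnt = np.zeros(nbins)
    T = pos.shape[0]
    for t in range(T - dt):
        # velocity fluctuations: deviations from the instantaneous group mean
        du = vel[t] - vel[t].mean(axis=0)
        dv = vel[t + dt] - vel[t + dt].mean(axis=0)
        # all pairwise distances, taken at the earlier time
        r = np.linalg.norm(pos[t][:, None, :] - pos[t][None, :, :], axis=-1)
        dots = du @ dv.T  # dots[i, j] = delta_v_i(t) . delta_v_j(t + dt)
        idx = np.digitize(r, r_bins) - 1
        for b in range(nbins):
            mask = idx == b
            num[b] += dots[mask].sum()
            cnt[b] += mask.sum()
    return num / np.maximum(cnt, 1)
```

The two natural variables of the talk appear as the distance bins (space) and the lag dt (time).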
The species of the animal you are investigating, individual heterogeneities, environmental factors, wind, temperature, humidity, season, whether you do your experiment in the lab or in the field, noise (I mean real noise, like children shouting during your experiment, or noise like Langevin noise, which are not the same kind of noise), unknown factors, more unknown factors; so you really don't know. There's no reason to believe that the correlation function should not depend on all of those. So you may say, what's the problem? Well, from our perspective the problem is that then you have to build a theory with all these many parameters. Okay, maybe yes, but that would not be very satisfactory for us. So maybe there's a way to reduce the number of parameters and the complexity of the system. Maybe the underlying theory kind of wraps up all those parameters into something simpler. Well, this is what scaling does. Let me tell you why this is important at a quantitative level. You may say, okay, but cut the crap. Imagine that I measure the correlation function and I plot it as a function of time. And I do two experiments, two different systems, different children around, different temperature and so on, so I get two different curves. But then you can say, well, of course, the time scales are different there. The decay time is different. So do the following: simply rescale your time by the relaxation time and see what happens. Well, you do that and you get something like this. Now these two curves have more or less the same relaxation time, more or less the same time scale, but they are still different. So not only does the range in time depend on all the external factors, but also the shape of the function. And the shape of the function is the one predicted by the theory, if you're lucky. So if you change, you know, the villa in Rome where you do the experiment, the shape of the function changes.
If I change the species, the shape of the function changes; then you need a theory which takes into account all these different factors. And this sounds difficult, perhaps hopeless. So the scaling argument goes through some steps. And the first one is that there is a very important scale in the system, a scale which is more important than any other scale. And that is the correlation length. So the correlation length really measures the spatial span of the correlations in the system. So this is a flock of starlings. Those are the velocity fluctuations in the flock of starlings. And you can clearly see that there are some correlated regions. So the correlation length is the size of those regions. It's an important length scale of the system, actually the most important. And of course it will depend on the world, on the 1,000 unknown parameters that we have. Why shouldn't it? It's a very important characterization of the system. So it certainly depends on temperature, pressure, wind, children, species, group demographics, and so on. So you can be sure that xi has all that dependence. Okay. However, the key idea of scaling is that the correlation length absorbs all that dependence into itself. So that this function, which is in principle a function of, I don't know, 1,000 variables plus 2, is actually a function of space, time, and the correlation length. All the dependence on the external world goes through here. So this alpha is a whole vector of things I don't know. And this is a huge simplification. Because now this is no longer a function of 1,002 variables, but of three variables. And what is most important is that, even if you don't know those variables, you can measure xi. You can actually measure xi in your experiment, which is what we do. So it's something experimentally accessible. Right? But there's actually more in the scaling hypothesis.
So not only is the scaling hypothesis telling you that all the external world goes through the correlation length, but it's also telling you that space and time scale together in some way. But to see that, it's actually easier to go to Fourier space. So I apologize for that. This is very common in physics, but it may look like a weird complication here. I mean that you have to go from a function of space r to a function of momentum k. And k is the inverse of a length scale. For technical reasons, it's much easier to talk about the scaling relation in momentum space. So what you're doing in this case is that you are more or less looking at the correlation on a region of size 1 over k. So 1 over k is really the scale over which you are studying the system. But it's still space; it's 1 over space. So now you have a function of k and t and the correlation length. So let's see what the scaling hypothesis is telling us. So this is it, pretty much. It looks complicated, but it's not. So the first equation is telling us that that function, which is in principle a function of space, time and the correlation length, is actually simpler than that. It's a function of time over some time scale, the relaxation time tau. And every other dependence, both in the correlation function and in the relaxation time, is contained in the product k xi. So the relaxation time grows like a power of the correlation length, times a scaling function of k xi. And k xi, where k has the dimension of 1 over length and xi is a length, is a dimensionless parameter. It's a pure number. So the idea of the scaling hypothesis is that basically all the world comes in through xi, and xi multiplies k everywhere. And the relaxation time grows like a power law of the correlation length. So you see, the idea is that the correlation length really rules everything. It's telling you how the time scale grows. It's telling you how space is ruled. Now, the people who wrote that wrote it out of common sense.
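In formulas, the relations being described in words here read, in the standard dynamic scaling form (a reconstruction from the description, not the actual slide):

```latex
C(k, t;\, \xi) \;=\; C_0(k\xi)\; f\!\left(\frac{t}{\tau};\; k\xi\right),
\qquad
\tau \;\propto\; \xi^{\,z}\, g(k\xi),
```

so that at fixed $k\xi$ both the amplitude and the shape function are frozen, and the correlation function depends on time only through $t/\xi^{z}$.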
I would say not-so-common sense, but, you know, these were phenomenological laws in the 60s. So the idea was basically to say: the only relevant scale must be the correlation length. Okay, so what are we going to do? k, the momentum, is 1 over a length. So it can only appear in this relation as the product k times xi. Fine. Then there is time. Well, time is a different quantity. But still, we say it must depend on the correlation length. So we introduce a new number, which is called the dynamical critical exponent, to say how one grows with the other. So why is this very useful, and why is this a huge simplification? Because now imagine that you keep k xi constant. What does that mean? You do your first experiment, experiment one, and you measure some correlation length xi one. You do experiment two, and you measure another correlation length xi two. Now, in analyzing the two systems, you fix k such that k xi has the same value in these two different cases. You don't know why in one case xi was 100 and in the other case xi was 50; different external parameters. But you can choose k in such a way as to make k xi constant. So if you plug this into that, you see this is now a constant. This is also a constant. So this function here, which looked very complicated, now becomes only a function of t over xi to the z. And the shape of the function now depends on only one parameter. Out of the many, many parameters you had in your theory at the beginning, the only relevant one is the product between k and xi. And what is that? Basically, it is the ratio between the correlation length and the length scale over which you are observing your system. That is the only thing which dictates the shape of the correlation function. So then imagine that we observe many correlation functions at different k's, different temperatures, different whatever. They all look different. But now I do this trick: I keep k xi constant and I plot things as a function of t over xi to the z.
If this is true, they must collapse onto one single curve. Which is a remarkable simplification. And the shape of this function only depends on this little number here, which is telling you what the scale is over which you are observing your system, given xi. Are you observing your system at xi over 2, at the same order of magnitude as xi, at much larger than xi? That is the only relevant parameter in your theory. So if this works, then you only have to find the theory for this object with one parameter. Of course, it is still very difficult. But you can forget about the tuning parameters and the unknown parameters, if this works, of course. And as a byproduct of this dynamical scaling hypothesis, you get this relation I wrote before. If you fix the product k xi, what you get is that the relaxation time of your system grows like the correlation length to some power. This is what is called critical slowing down in physics. And it's a very profound concept. It's telling you that space correlation always goes together with time correlation. It's basically impossible to have a system which is strongly correlated in space but weakly correlated in time. Well, not impossible, but it would be very weird, very strange. And the reason for that is that if you have a large correlation length, so big correlated regions, then in order to mix the system, to lose memory of your past, you have to change larger regions. You have to mix up larger regions. And to do that, you take a longer time. And this number here regulates how time and space scale together in the system. Okay, so this was the preamble. And this is what we want to test in a real biological system. The system will be that of natural swarms, the one described so nicely yesterday in the talk by a young guy, I now forget his name, from Nick Ouellette's lab. So we are studying, in the field, not in the lab, these buggers here: non-biting midges of the Chironomidae family. Not only that family, but mostly that family.
We're in the field, we have several cameras. This lady here is in the audience; she will be giving a talk tomorrow. And after we get the data, the biggest challenge... oh, sorry, this is very weak. Anyway, this is a real swarm. The challenge is to turn the photographs taken in the field into three-dimensional trajectories, which is not easy. Not easy at all; it's called tracking. And Stefania Melillo will talk about that tomorrow morning. Of course, you would like to take as large a system as possible, but you have many constraints. So I think the most we can do is 600, 700 midges, which in three dimensions, I know, is not huge, but it's the best we can do, and I think we will do better in the future. And from that, we compute the correlation functions, exactly the ones I described to you before: how the velocity fluctuations of one guy i here are similar to the velocity fluctuations of another guy at some different position and different time. I will not go through the technical details of how to compute that, but it really is the average of this product here over the many individuals in the swarm. Again, we have to go to Fourier space, to k space, to check the scaling relations, but this simply means monitoring the system on a length scale 1 over k. So it's nothing particularly fancy. Okay, so these are experimental correlation functions, and they look like a mess. These are different k's and different swarms, taken in different conditions, different dates, different parks in Rome, so all different. By the way, sometimes people say, well, this cannot be an experiment, because there are no points. The curves are smooth because we are shooting at almost 200 frames per second, so this is half a second, so we have many points. But apart from that, they really look like a mess. Okay, so here you have all these parameters.
You can start changing them and seeing how it depends on k, how it depends on the noise, how it depends on the species, how it depends on this and that. It is a hugely high-dimensional space; but we are scared by complexity, so we try to simplify it. And here is a reminder of what we saw before; this is really the message of my talk, not what we did. So these relations are really important. According to the dynamic scaling hypothesis, the correlation function is just a function of t over tau. Tau grows like xi to some power, and every other dependence occurs through the product k times xi. So now we keep k xi constant. We can do that because we measure xi in each experiment. Swarm 1, we measure xi: 20 centimeters. Swarm 2, we measure xi: 15 centimeters. Swarm 3, we measure xi: 25 centimeters. We have no idea why xi changes in that way. Well, we have some ideas, but this is not relevant here. We are sampling different swarms; they have different xi. So we select for each swarm a k, which is our variable (we can do whatever we want with k), in such a way that k xi is the same. And for clarity, we fix k equal to 1 over xi, so k xi is 1 in all our experiments. So then, if this is true, we should find that this potentially very complicated function is only a function of t over xi to the z. And then the relaxation time scales like a power law of k and of xi. So these are the data. Look first at the first row, the blue data. Okay? These are real data on natural swarms, about 20 swarms. Each line here (these are not all 20 swarms, otherwise it would be too crowded) is a correlation function, not rescaled, for a different swarm at a different k. Actually, here they are all at the same k, and you see there is no collapse. But now you do exactly what I told you to do: you fix k xi, and you plot this as a function of t over xi to the z. And you do indeed see a remarkable reduction of the complexity of your system.
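The collapse procedure just described can be sketched on synthetic curves. This is a toy demonstration under the assumption that scaling holds exactly, with an exponential shape function and the values of z and xi chosen only for illustration:

```python
import numpy as np

# Assume the dynamic scaling form holds exactly at fixed k*xi:
# C(t; xi) = f(t / tau), with critical slowing down tau = xi**z.
z = 1.0

def corr(t, xi):
    tau = xi ** z                 # relaxation time grows with xi
    return np.exp(-t / tau)       # exponential shape, chosen for the demo

xis = [10.0, 20.0, 40.0]          # "measured" correlation lengths, one per swarm

# Raw curves at the same absolute times look different for each xi...
t = np.linspace(0.0, 40.0, 200)
raw = [corr(t, xi) for xi in xis]
raw_spread = max(np.max(np.abs(c - raw[0])) for c in raw[1:])

# ...but as a function of the rescaled time s = t / xi**z they coincide.
s = np.linspace(0.0, 4.0, 200)
collapsed = [corr(s * xi ** z, xi) for xi in xis]
collapse_spread = max(np.max(np.abs(c - collapsed[0])) for c in collapsed[1:])
```

Here raw_spread is large while collapse_spread vanishes: the curves fall onto the single scaling function once time is measured in units of xi to the z.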
So they almost collapse. I would say that, considering these are real biological data taken in the field, the fact that they collapse, especially here, is quite good. And also, the relaxation time scales, well, with a lot of scatter, of course, with space, with a certain power. We wanted to compare this with something, because the exponent we found, this z exponent, was 1. So we said, where does this come from? It's an anomalous value in statistical physics. So we did simulations of the Vicsek model. And Tamás, I hope you don't mind, we call them Vicsek swarms. You didn't mind with rats, so I hope you don't mind with midges, okay? So we take the Vicsek model slightly above the critical point, the transition point, so in the paramagnetic phase. So this disordered phase is not the ordered Vicsek model. So we are abusing the Vicsek model as well: we are using it in a regime where we're not supposed to, well, I don't know. This is what is nice when you have a strong model: it works also in regimes where you didn't actually think it would work at the beginning. So we run the Vicsek model above the transition point, but still close enough to the transition point that correlations are sizable, large correlations. This is not equilibrium statistical physics; this is an active matter model. So first of all, dynamic scaling works wonderfully. Dynamic scaling was introduced in classic equilibrium statistical physics, and so on; this is a system really far from equilibrium, and dynamic scaling works wonderfully. Well, of course, these are simulations, so the data are very clean. The dynamical critical exponent in Vicsek swarms is 2. And 2 is very much like the standard critical exponents in statistical physics: the Heisenberg model has z very close to 2. There may be small variations, but the standard dissipative system has z close to 2. We find 1. But these are both self-propelled systems.
These are real midges and these are self-propelled particles. So you see, self-propulsion is not the real cause of this strange anomalous exponent. Okay? But there is also something else which is different between the two cases. Look at the kind of relaxation. Now you have the shape function: you have rescaled your data, so you have the shape function. This one is clearly exponential. So nice, easy exponential relaxation. This one is not exponential. Apart from some funny stuff going on here, it looks like you have some oscillation. But the part here, which I will blow up in a moment, is non-exponential. You go to zero with a flat derivative. So where does that come from? If you plot your data in semi-log scale, an exponential is a straight line. So this is what goes on for Vicsek swarms: perfect exponential relaxation. And these are real swarms. Okay? So to understand what is going on, think about a very simple system: a spring oscillating in a viscous fluid, the stochastic harmonic oscillator. In that system, very simple, one particle, the dynamics strongly depends on the ratio between the mass and the viscosity. If you have a large mass compared to the viscosity, you are in the underdamped regime and the correlation function does something like that: you have oscillations and you leave zero with a flat derivative. If you are in the overdamped regime, then you simply have exponential relaxation. So to blow up this feature here, what you can do is define x, which is the ratio between t and your time scale; x is your reduced time. You take the log of the correlation function. If this is an exponential, the log of e to the minus x is minus x. You divide by minus x and you get 1. So this quantity must go to 1 in the case of exponential relaxation. And it does for Vicsek swarms. And it does for the overdamped harmonic oscillator.
But now, if your correlation function goes to zero with an exponent larger than 1, then this is x to some power; you divide by x and it goes to 0, which is what happens in the underdamped harmonic oscillator. And it's also what happens in real natural swarms. So the idea is that in the correlation function, once we have cut away all the details and all the influence of the external parameters which we don't know, we find a clear signature of some inertial dynamics, of some second-order dynamics in the system, some not purely dissipative, overdamped dynamics, which was surprising. And why is that surprising? Well, because this is a disordered system. There is no global order. You are above Tc, above the transition. Now, if you are in a disordered system, so in the paramagnetic phase above the transition, and you observe the system from far enough, so on a length scale which is much larger than the correlation length, then there are very good reasons in physics to believe that the decay of the relaxation function should be exponential. In other words, if I look at the system from very far, then my fluctuations, the correlated regions, are very small. So the system is basically homogeneous. I lose all the nice dynamical structure of the system, the nice static structure of the system. What I have is simple exponential relaxation. And this is what is called the hydrodynamic regime. So if I observe a system on a length scale much larger than the correlation length, I must find exponential relaxation. So the question is: why don't we observe exponential relaxation in swarms? Well, the problem is that we can't. Because we cannot observe a swarm on a length scale which is much larger than the correlation length. Why? Because if we take a larger swarm, the correlation length also will be larger. We have previous evidence that the correlation length in a swarm more or less grows with the size of the swarm.
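The diagnostic described here can be sketched numerically. The damping and frequency values below are arbitrary choices for illustration, and the underdamped curve is the standard normalized position autocorrelation of a damped harmonic oscillator:

```python
import numpy as np

# Diagnostic from the talk: with the reduced time x = t / tau, define
# h(x) = -log C(x) / x. A pure exponential gives h(x) = 1 for all x;
# inertial (underdamped) relaxation leaves x = 0 with a flat derivative,
# so h(x) -> 0 as x -> 0.
def h(x, C):
    return -np.log(C) / x

x = np.linspace(0.01, 0.5, 50)

# Overdamped stochastic oscillator: exponential correlation function.
C_over = np.exp(-x)

# Underdamped oscillator: normalized position autocorrelation,
# C(x) = exp(-gamma x) * (cos(omega x) + (gamma / omega) sin(omega x)),
# with arbitrary illustrative values gamma = 0.3, omega = 2.0.
gamma, omega = 0.3, 2.0
C_under = np.exp(-gamma * x) * (np.cos(omega * x)
                                + (gamma / omega) * np.sin(omega * x))

h_over = h(x, C_over)    # equals 1 everywhere
h_under = h(x, C_under)  # small near x = 0: the flat-derivative departure
```

Note that C_under has zero derivative at x = 0, so -log C_under grows like x squared there, and dividing by x sends h to zero, exactly the signature seen in the natural swarms.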
So then it is never possible to observe the system on a scale which is much larger than the correlation length. It's as if there were some near-critical censorship of this regime: the system is always out of it. And that was very lucky for us, because we could observe this non-exponential, inertial relaxation exactly because of that. You cannot observe the, well, let's say boring, but anyway interesting, exponential relaxation. So the final result of this is that not only does the scaling hold, and not only do we get a critical exponent which is anomalous compared to those of standard statistical physics, but we also find some evidence of inertial dynamics. And I was very happy yesterday to see your talk, because this is also what you seem to find in your data: you cannot reproduce those kinds of oscillations if you don't put in some inertia. Now, of course, we don't know whether it's the same effect or the same thing, but, you know, it's not in contradiction. So I think there are some inertial effects even in the dynamics of insects like this, not only of birds. So, to conclude: why is all this relevant? Okay, probably some of you didn't buy the simplification thing. Why? Because you may say: yes, well, look, I actually want to see all this dependence on the difficult external parameters. So I'm actually opposed to doing that. By losing complexity, you're losing richness. So, sorry, this may be cute, but it's not convincing to me. Well, this is a very good objection, and I'm afraid that this is actually something that may be divisive between physics and biology. But I will try to convince you that this is relevant anyway. So this is a slide which makes all the physicists go, oh, yes. Because these are really the fathers of 20th-century statistical physics: Widom, Kadanoff, Halperin, Hohenberg, Fisher, and Wilson. This is a series of papers over six, seven years, between '65 and '72.
In the first two papers, these two guys, Widom and Kadanoff, wrote the scaling relations in the static case, with no time. Again, out of common sense. They basically said: if the system is strongly correlated, the only relevant scale must be the correlation length; all the microscopic details of the system shouldn't matter. Two, three years later, Halperin and Hohenberg, and other people too, generalized these relations to the dynamic level, which is far richer. Scaling at the dynamic level also tells you that time and space are not independent: if you rescale space, you have to rescale time, with one particular exponent. But it's important to understand that up to this point, all this was a hypothesis, an ansatz. They had no way to prove it. They said: it must be true. Come on, it's just too good. It must be true. But then people started to see that they were right, experimentally, in many systems. The scaling hypothesis was verified in many, many systems. So at that point many in the field started to take seriously the idea that there must be some mathematical structure underlying all this, by which it would be possible to actually prove the scaling relations in some way, not just guess them. And this is exactly what Wilson and co-workers did, with this funny title here, a couple of years later, by introducing a mathematical tool which became very famous in physics, called the renormalization group. So the RG, the renormalization group, is a mathematical theory which is able to prove the scaling relations by using the scale invariance of the system close to a critical point. Of course I will not go through the RG now, but the important point is that in the 60s and the 70s, scaling laws were really the main experimental and phenomenological motivation driving what was the real explosion, the renormalization group. After the RG, everything was clear; the problem was solved, for all practical purposes.
So there was this chain: scaling, then the RG, which is a way to prove scaling, to understand why scaling holds, of course, and why the correlation length is the only relevant length scale in the system. And what was the by-product of the renormalization group? Universality, this word that we use over and over in physics, which is very annoying for people outside physics, because it seems very wishy-washy: what do you mean, universal? Well, the RG is a mathematical way to prove why some systems that look so different in physics actually have the same behavior. And the idea is that they all belong to the same fixed point in some funny space. So the idea of universality, and I want to convey just this fact, is not something that physicists merely hope for. It's not just wishful thinking in physics; maybe it's wishful thinking in biology, but it's not in physics, because it's deeply rooted in the RG, and the RG was deeply rooted in physics. So our idea would be to try at least to go through these same steps in strongly correlated biological systems. We know that in some systems, not all of them but some, correlations are strong. We know that in at least one system, the one I showed you today, scaling works, more or less. So let's gather information about the systems; we have this first step. My hope is that perhaps we can work on something like that in the future. And that would be good, because then there would be a reason why I'm working on birds and midges at the same time. Otherwise, why should I be doing that? Why are we all here together, talking about such apparently different things? If there weren't any hint of universality, this wouldn't make any sense, because, yes, interdisciplinarity is fine, but we are not inviting people from law school here. That would be cross-disciplinarity, but we're not doing that. So we have some idea that there must be some universal features across these different systems. And I think that this is maybe the right path to prove that. Okay.
These are the people working with me. This is not a fake; this student was simply not present when we took the photograph. These are my senior collaborators, Irene Giardina and Tomás Grigera, and the funding agencies that make our lives so miserable. And one last thing: I will have some openings at the postdoc level in the near future, so if you're interested in this, let me know. Thank you very much.