Okay, thank you, Antonio. Hello everyone. I would like to thank Leija and Matthew and colleagues for organizing this joint workshop, and I will take this opportunity to share some of our recent work. Let me see whether I have the proper slide. Can we make the main screen bigger? Okay, it's fine now, right? Yeah. So, you just listened to the talk from Professor Yu, who mentioned that neuronal spiking is actually very, very sparse, and that one important purpose of this may be to save energy. Here I want to show you the dynamical principles by which a neural circuit can organize non-trivial dynamics for efficient information processing using a very low firing rate. Here is the outline. I will give some introduction to the brain as a complex system, and also mention this cost-efficiency trade-off: you want to minimize energy, but you still want to keep efficient processing. Then I will talk about how we can reconcile the irregular, actually very random and sparse, spiking of single neurons with collective dynamics like avalanches, which allow the system to respond to signals very efficiently. Then, from this kind of theoretical model, I hope to show you that the neural system can actually use less wiring cost and less energy to give you more function, compared to, for example, a random network. We know that inside our brain we have, as already mentioned, a huge number of neurons. It is a highly complex system, because each neuron is a highly nonlinear, threshold element, and when so many of these nonlinear elements are coupled, from physical science we know the system is characterized by the emergence of collective activity.
So in the neural system we have irregular spiking of single neurons, but complex collective activity, for example when we are talking and listening here. In recent years, due to the advancement of neuroimaging technology, we can now look at the connectivity: how neurons or neural components are coupled into a very complex network, both structurally and dynamically. So we emphasize this complex connectivity. In this three-dimensional space of complexity, I think we now know that the human brain is perhaps the most complex system we know so far. But as was just mentioned, the whole brain is only about two percent of our body weight yet uses about 20 percent of the resting metabolism, and it is remarkably efficient compared to machines, as the remarkable data you were just shown illustrates. So we emphasize that the brain is a complex system, but a functional complex system, designed by evolution to minimize energy while at the same time optimizing a lot of functional processing, especially multiple functions. So there must be some trade-off, and this trade-off between cost and efficiency must be reflected in the architecture, the activity, and their interaction. Therefore, viewing the brain as a dynamical system, we emphasize this cost-efficiency trade-off. And note that when we talk about brain-inspired intelligence, we perhaps do not have to be constrained by such a very small energy budget, and may not be constrained by the simultaneous optimization of many functions. In this sense, I think AI should be able to do much better than the human brain in certain aspects of intelligence. Of course, as just mentioned, in terms of energy, the majority is used to generate spikes and to propagate them through the network.
So from this constraint you can see that we want to have a low firing rate, and it was already shown that it is basically one hertz on average, and you want to minimize the network fiber length in order to save energy, right? When we come to the connectivity, we know that locally the neural circuit looks like a random circuit, but at a larger scale you see columns and layers, and between the columns are blood vessels, because all the energy, the oxygen and glucose just mentioned, must be supplied through them. At a still larger scale of the brain network, the whole brain can be divided into functional subsystems, like the visual system, auditory system, somatosensory system, and within each functional system you have different functional areas, depending on the parcellation resolution. If you then look at the fibers, the white matter linking different brain areas, you see they are densely connected within each functional area. So a very pronounced feature of the brain is that it is characterized by modular interactions across many scales. Locally we have a very dense network: at about the millimeter scale, roughly 20 percent of the neurons are connected. But globally the overall connectivity is very sparse, because we have about 100 billion neurons and each has only about a thousand or ten thousand connections, so globally it is very, very sparse. If you imagine it as one random network, the whole network is very sparse, yet locally it is very dense; this is the plot showing how connectivity decays with distance. And as just mentioned, we have two types of neurons: excitatory neurons try to activate others, and inhibitory neurons, once activated, try to prevent the activation of other neurons.
So these are the very basic properties, in terms of anatomical but also dynamical interaction. I think this kind of architecture, a hierarchy of modules, is challenging for neuroscience and statistical physics, because at every level we basically have a finite-sized system. Now let me quickly introduce the features of neural activity: at the single-neuron level the firing is very low-rate and irregular, but at the circuit level you have collective, non-trivial dynamical features. This slide shows an experiment from about 20 years ago: people put an electrode into the animal brain while the animal is doing processing, like visual processing, looking at a changing picture, and with the same electrode they measure, over many, many trials, the spiking of the neurons around the electrode. If you average over the trials, you see this kind of pattern: the color changes are related to the input signal, so the firing rate is changing. But if you look at the spiking of a single neuron under similar input, it is very irregular; the inter-spike intervals look very similar to a random Poisson process.
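The irregularity of single-neuron spiking is usually quantified by the coefficient of variation (CV) of the inter-spike intervals: CV near 1 means Poisson-like randomness, CV near 0 means clock-like regularity. A minimal sketch of this measure, on synthetic spike trains rather than real data:

```python
import numpy as np

def isi_cv(spike_times):
    """Coefficient of variation of inter-spike intervals (ISIs).

    CV ~ 1 indicates Poisson-like (irregular) firing;
    CV ~ 0 indicates clock-like regular firing.
    """
    isis = np.diff(np.sort(spike_times))
    return isis.std() / isis.mean()

rng = np.random.default_rng(0)

# Irregular (Poisson) spiking at ~1 Hz: exponentially distributed ISIs.
poisson_train = np.cumsum(rng.exponential(1.0, size=5000))
# Perfectly regular (clock-like) spiking at 1 Hz.
regular_train = np.arange(5000) * 1.0

print(isi_cv(poisson_train))  # close to 1
print(isi_cv(regular_train))  # 0
```

The 1 Hz rates here are chosen to match the average cortical firing rate mentioned in the talk; the spike trains themselves are synthetic.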
So it is amazing: when we are talking, when we are thinking and doing very precise, advanced things, the neurons are spiking very randomly. How can this kind of irregular, random, sparse spiking arise? A statistical physicist proposed the idea of balance. We mentioned that you have two types of neurons, excitatory and inhibitory. So in a local recurrent network, a neuron sitting there gets input from the excitatory population, a positive current, and the inhibitory current coming into the neuron body is negative. If you have balance, if the averages of these positive and negative currents cancel, then this neuron receives something like random noise, because all the neurons are firing randomly and irregularly. A neuron with random input will in turn produce irregular and sparse firing, so you have a self-consistent system: with sparse network connectivity, the whole network has stable, irregular, totally asynchronous firing. This is a really beautiful theory. Later, with advanced techniques like voltage clamp, people could measure these positive and negative currents, and you see they are really correlated when the circuits are activated differently. So at least one important feature of the cortical circuit is that it is very nicely balanced. But more refined measurements show that this does not follow the simple assumption that the current is just noise: the current has a pattern, this kind of oscillation. Excitation and inhibition seem to follow each other almost precisely, but there is a small time window in which excitation starts first, and then inhibition comes to suppress the firing of the neuron. I will show you that our model can recover this property. So this is not completely asynchronous, because if you measure the neurons' spiking
simultaneously in many neurons, as in this paper from Bielik, measuring the retina, or even cultured cortical circuits, you see that the firing of many of the neurons can be clustered. If you measure the probability, there is quite a high rate of observing many neurons spiking together, and if you shuffle the spike intervals this probability is reduced, so this clustering is non-trivial. However, if you measure the pairwise correlations of these neurons, the correlation values are actually very small: most are close to zero, with some pairs having a small correlation around 0.05 or 0.1. That work showed that even though the pairwise correlations are very small, using a maximum-entropy assumption, the Ising model, you can already generate this kind of large-scale clustered spiking. And these correlations can actually scale up: the clusters can propagate as avalanches. This is a measurement from recent years: you look at the size of these events, called avalanches, and it has this fat-tailed, power-law property. The event size, the event duration, and the size-duration relationship follow the exponent relations of the critical state, but in the neural system we do not need to fine-tune the parameters, so people sometimes call this self-organized criticality. Actually, much earlier, people found this kind of avalanche with electrode arrays: you put the array on the surface of a culture or into the animal brain and measure the local field potential. In these potential fluctuations you have some kind of rhythm, like a gamma oscillation, but it is not harmonic, not very regular. If you apply a threshold, suprathreshold activity means the neurons around the electrode
are spiking together. And then, across electrodes, the activity also spreads in space to a larger scale, again as a kind of avalanche. If you measure the size of these avalanches, with a suitable threshold and time window, you see a power-law function, so this is mesoscopic evidence of the brain working at the critical state. Now, going to the whole brain: as just mentioned, the human brain is the largest relative to body mass, and we can measure whole-brain activity with functional MRI, which is actually an indirect measure of the blood-oxygen concentration related to the firing rate over a period of time. You see it is also irregular, and if you analyze this activity, again using thresholded avalanches, you see the activity spreading in the form of clusters, and the cluster sizes have a power-law distribution. Other measures also show signatures of the whole brain working at the critical state. That is the empirical-data aspect. This is not so surprising: from the physical-science point of view, a critical state has many advantages for information processing. For example, a neural circuit with branching parameter kappa near 1, meaning close to the critical state, can respond properly to both small and large perturbations, so the dynamical range is maximized; if you use drugs to make the circuit more inhibited or more activated, you move away from this critical balanced state and reduce the dynamical range. The different activation patterns of the electrodes can in principle be used for information coding and representation, and this capacity, or entropy, is also maximized. When different tissues or neurons are involved in an avalanche, they are temporally coupled, or synchronized, but they will decouple when they
are no longer involved, when the avalanche moves to other places, right? So this switching between synchrony and desynchrony is also maximized. This kind of dynamic core process offers the system very rich patterns and in principle allows the system to respond to the environment. Maybe that is why, as Alexander just mentioned, the resting state consumes about 95% of the energy budget. For what? Perhaps the reason is that we want to maintain a system that is ready for whatever processing comes, and that costs a lot of energy. And if you go to the largest scale of the brain, you can also observe oscillations. The initial assumption that spiking is totally asynchronous cannot generate this kind of oscillation. If you close your eyes, you will see an alpha oscillation, about 10 hertz, in part of your brain, and during processing you have different slow or fast oscillations, which can be coupled. This is due to the different firing of the neural circuits across different scales. So I want to emphasize that typically, in a task, say a monkey looking at a point here and having to move its eyes, if you measure the spiking of the neurons or the local field potential, you basically observe this multi-level activity: single-neuron firing is irregular, but collectively you see oscillations, with a kind of fat tail; it is not really a uniform oscillation. But in neuroscience, different subfields use this data differently. When people see irregular spiking, they mainly use the rate-coding idea: you average over many trials, you see that when the signal comes the firing rate changes, and by comparing firing rates across different areas you can get some idea of how the neural system is involved in the process.
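The avalanche-detection procedure described a moment ago (threshold the population activity, then take contiguous active runs as events and measure their sizes) can be sketched in a few lines. The toy data and the zero threshold here are illustrative assumptions, not the exact settings of the cited experiments:

```python
import numpy as np

def avalanche_sizes(activity, threshold=0.0):
    """Split a time series of population activity (e.g. spike counts per
    time bin) into avalanches: maximal runs of bins above `threshold`.
    The size of an avalanche is the total activity within the run."""
    active = activity > threshold
    sizes, current = [], 0.0
    for is_active, value in zip(active, activity):
        if is_active:
            current += value
        elif current > 0:
            # a subthreshold bin ends the current avalanche
            sizes.append(current)
            current = 0.0
    if current > 0:
        sizes.append(current)
    return np.array(sizes)

# Toy example: spike counts per time bin; silent bins separate avalanches.
counts = np.array([0, 2, 3, 0, 1, 0, 0, 5, 1, 2, 0])
print(avalanche_sizes(counts))  # [5. 1. 8.]
```

On real data one would then plot the histogram of these sizes on log-log axes to check for the power-law tail discussed in the talk; the exponent depends on the bin width and threshold, as noted in the closing Q&A.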
Or in cognitive neuroscience, people measure, for example, EEG outside the brain, like a local field potential; you average over many trials at the onset of the signal, and you see the so-called event-related potential. There are different terms for looking at these dynamics, but not much emphasis on understanding this multi-scale dynamic and how it is organized. So here we are interested in a few questions: how can the irregular spiking, the neural avalanches, and the oscillations be reconciled in the neural circuit dynamics? What are the important neurobiological factors influencing these dynamics? What are the dynamical mechanisms, and how, in principle, can this kind of dynamics realize the cost-efficiency trade-off? We try to address these questions using a biologically plausible network model, together with mean-field and nonlinear dynamical analysis. The model I am going to present is a generic local circuit model that captures a few important biological features. We have two populations, an excitatory population and an inhibitory population; this is a local circuit, and it receives some external input, which we model as Poisson spike trains mimicking the interaction with other circuits. The neurons are coupled through synapses, which, as just mentioned, come in slightly different types. A synapse involves the release of neurotransmitter and then the closing of the channels; it is a transient process, like opening and then closing a door. Typically the fast excitatory synapses close the door within about two or three milliseconds, while the inhibitory synapses are a bit slower. In the model we have thousands or ten thousand neurons, 80% excitatory, with a local connectivity density of 0.2.
But later I will show that the connectivity has an impact on the dynamics. This is the dynamical equation: the voltage of each neuron is changed by the input from other neurons, and this is the current from the spikes, with an alpha function describing the neurotransmitter process. I mentioned that excitatory synapses are fast and inhibitory synapses are typically slower; they use different neurotransmitter types and can be different. The model is integrate-and-fire: when the voltage reaches a threshold, the neuron fires. Now let me show you some numerical results from simulations of the circuit. The parameter I change is the decay time of the synapses in the model. When the two decay times are similar, what we observe in the balanced situation is that the whole network is irregular: there is no correlation, and the neuron firing is very irregular. But if you look at the current coming into a neuron, the excitatory and inhibitory currents cancel each other, so the average is almost zero; sometimes you have small deviations, so the neuron can produce irregular spiking, sometimes bursting. Now, if I change the parameters to a more biologically plausible regime, with excitatory synapses faster and inhibitory synapses slower, you see a very interesting self-organization. Although the network still receives irregular Poisson input, internally the spiking starts to organize into oscillation. The firing rate shows a gamma oscillation, but the amplitude changes, because in each cycle of the oscillation a different number of neurons is involved. And if you look at the balance of the currents, the positive and negative currents follow each other, almost completely cancelling, but there is a little time window: excitation comes first and then inhibition is activated, and when the inhibition is there, the network stops firing.
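The balance mechanism can be illustrated with a single integrate-and-fire neuron driven by excitatory and inhibitory Poisson inputs whose mean currents cancel exactly, so that spiking is driven purely by fluctuations. This is a minimal sketch, not the full two-population network of the talk (no recurrence, exponential pulses instead of alpha-function synapses), and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Single leaky integrate-and-fire neuron with balanced E/I Poisson input.
dt, T = 0.1, 20000.0            # time step and duration (ms)
tau_m, v_th, v_reset = 20.0, 1.0, 0.0
rate_e, w_e = 0.4, 0.25         # excitatory events per ms, weight per event
rate_i, w_i = 0.1, -1.0         # inhibitory events per ms; means cancel:
assert abs(rate_e * w_e + rate_i * w_i) < 1e-12

steps = int(T / dt)
e_counts = rng.poisson(rate_e * dt, steps)
i_counts = rng.poisson(rate_i * dt, steps)

v, spikes = 0.0, []
for t in range(steps):
    drive = w_e * e_counts[t] + w_i * i_counts[t]   # fluctuates around 0
    v += -v * dt / tau_m + drive                    # leak + synaptic kicks
    if v >= v_th:
        spikes.append(t * dt)
        v = v_reset

isis = np.diff(spikes)
cv = isis.std() / isis.mean()
print(len(spikes), "spikes; ISI CV =", round(cv, 2))  # CV near 1: irregular
```

Because the mean drive is zero, only the fluctuations push the voltage over threshold, which is why the output is irregular despite substantial synaptic bombardment, the core of the balanced-state picture.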
So you have a kind of self-organized, delayed inhibitory feedback, and during that time window some neurons can fire. Each individual neuron's firing is still very irregular, because in each period of the oscillation only a small portion of the neurons, a different part of the network each time, is involved. That is why you have irregularity. But if the inhibition becomes very slow, this time window becomes so large that activation from an external signal can spread to almost the whole network; then you have very strong synchronization and very strong oscillation. So in this parameter space, when we change the decay times of the excitatory and inhibitory synapses, you see a transition from asynchronous to synchronized dynamics. And the interesting thing is that around this transition point, the single-neuron firing, if you measure the coefficient of variation, is about one. It is like a random process. But the network can generate oscillation: if you look at the spectrum, it is a gamma oscillation, 30 to 40 hertz. And if you look at the spiking as avalanches, it is a power law. So this model can recover this multi-level dynamic, with irregular single-neuron firing but network-level oscillation and avalanches, in the biologically plausible parameter regime. This is part of the results in a paper published a few years ago. And recently we have continued, trying to understand this better.
Sorry, can I ask a question on this? What is the network that you assume in this model?
Yeah, this is a network model for the local neural cortical circuit. Although in the cortical circuit you have some geometrical structures, here we just assume a random network of two populations. Yeah.
Thanks.
Okay. So recently we went ahead. What I showed so far is numerical simulation.
But then we carried out an analysis using statistical physics. From the microscopic spiking neural network, say 10,000 neurons, we go to a macroscopic field model: you look at the mean voltage of the excitatory and inhibitory populations. There you have input due to the current, but the current goes through the synaptic filter, driven by the spiking rate, and the spiking rate is itself a function of the average potential. Actually, the derivation of this mean field is not straightforward, because handling the spiking correlations is challenging; in our case we make an approximation. It is not completely closed-form; it is a kind of first-order approximation. But very nicely, using this mean-field model we can reduce the large network to a four-dimensional oscillator. Then you go to the fixed point, and we found that the asynchronous state people were talking about earlier is a stable fixed point. The eigenvalues have real and imaginary parts, and when you increase the parameter, the decay time of the inhibitory synapses, the real part transitions from negative to positive values: that is the Hopf bifurcation. The imaginary part actually predicted the oscillation frequency of the large-scale model. After the transition, you see this organization of oscillation. But it is very important to emphasize that even beyond the bifurcation to oscillation, the single-neuron spiking CV is still close to one, still fairly irregular. The reason is that the spiking rate of the neurons is very low, one or two hertz, while the oscillation frequency is much higher, so only about 5 or 10% of the neurons join each cycle of the oscillation.
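The linear-stability calculation behind this Hopf bifurcation can be illustrated on a standard two-dimensional E-I rate model, used here as a simplified stand-in for the talk's four-dimensional mean field; the gain and weight values below are illustrative assumptions, chosen only so that slowing the inhibitory time constant pushes the leading eigenvalue's real part through zero:

```python
import numpy as np

# Linearization of tau_e E' = -E + f(...), tau_i I' = -I + f(...)
# around a fixed point where the gain f' equals g (assumed values).
g = 1.0
w_ee, w_ei, w_ie, w_ii = 2.0, 2.0, 2.0, 0.0
tau_e = 1.0

def jacobian(tau_i):
    """Jacobian of the E-I rate model at the fixed point."""
    return np.array([
        [(w_ee * g - 1.0) / tau_e, -w_ei * g / tau_e],
        [ w_ie * g / tau_i,        -(1.0 + w_ii * g) / tau_i],
    ])

for tau_i in np.arange(0.5, 2.01, 0.25):
    eig = np.linalg.eigvals(jacobian(tau_i))
    leading = eig[np.argmax(eig.real)]
    state = "oscillatory (past Hopf)" if leading.real > 0 else "stable focus"
    print(f"tau_i={tau_i:.2f}  Re={leading.real:+.3f}  "
          f"Im={abs(leading.imag):.3f}  -> {state}")
# As tau_i grows, the real part crosses zero: a Hopf bifurcation.
# The imaginary part at the crossing sets the oscillation frequency.
```

This is exactly the structure described in the talk: eigenvalues with nonzero imaginary part (a focus), whose real part is pushed positive by slower inhibition, with the imaginary part predicting the emergent oscillation frequency.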
So, comparing this field model with the spiking network: when we measure the spiking at the transition point of the mean field, we get the power-law distribution of avalanches, and this is the evidence at the microscopic scale. Yeah. We also reconfirmed our analysis with data sets from neuronal cultures and monkey recordings: you see the irregular spiking, with CVs close to one across the different neural data sets, but you can also measure these neuronal spike avalanches. If you are interested in more detail, we have an article posted on the website. We also found that the connectivity matters. We mentioned that the local circuit is dense, and from the mean field, in the two-dimensional parameter space of synaptic decay time and connectivity density, you see the bifurcation. When you increase the density, interestingly, the firing rate actually becomes smaller, and the dynamics move from asynchronous to oscillatory: the neural firing rate is lower, but the dynamics can generate large-scale oscillation. The reason is that with higher connection density you have topological correlation, meaning two neurons have common neighbors, so the input is not totally asynchronous: the input is correlated. With correlated input in the balanced state, the current has sudden deviations that make the neuron fire, so the neuron can fire much more easily. So overall, the circuit can be sustained at a much lower rate, like in reality, around one hertz.
Now let me use a few minutes to talk about how this kind of neural dynamics can, in principle, give you efficiency of information representation. Here we have the so-called economy of the spike. We consider the spiking of a group of neurons in different time windows. The idea is that each spike consumes energy, and as I just showed, even at the resting state, and I am not talking about the supporting glial cells, a neuron will also consume part of the energy. So the total energy of the circuit will be the number of spikes plus some energy for the resting state. With very sparse and irregular spiking you can generate many patterns, and in principle these patterns can be used for information representation; this capacity can be measured by entropy. So there is a definition of energy efficiency: using maximum entropy, given some firing rate of the neurons, you can calculate the optimal efficiency, and depending on how much of the energy goes into the resting state, there is an optimal firing rate. Through this argument you can see that the optimal neural firing rate is about 1 hertz, using the assumption that about 10% of the energy is used in the resting state. This is the theoretical upper bound of the energy efficiency. Going back to my biologically realistic network: I showed you earlier the parameter space with the co-organization of irregular spiking, oscillation, and avalanches. And the interesting thing we found is that at this critical state the firing rate is minimal, while the information-representation capacity, the very rich patterns generated by the critical avalanches, is maximized.
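The "economy of the spike" argument can be made concrete with a simple binary-coding calculation: in each time bin a neuron spikes with probability p, the information capacity is the Shannon entropy of that binary variable, and the energy is the spike cost plus a resting cost. This is a hedged sketch of the general argument, not the talk's exact formulation; the energy model (resting cost as a fraction r of the spike cost) is an illustrative assumption:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy of a binary (spike / no-spike) variable."""
    q = 1.0 - p
    return -(p * np.log2(p) + q * np.log2(q))

def efficiency(p, r):
    """Bits per unit energy: each spike costs 1 unit, and each time bin
    additionally costs r units at rest (illustrative energy model)."""
    return entropy_bits(p) / (p + r)

p = np.linspace(1e-4, 0.5, 100000)
for r in [0.01, 0.1, 1.0]:
    p_opt = p[np.argmax(efficiency(p, r))]
    print(f"rest/spike cost r={r}: optimal spike probability ~ {p_opt:.3f}")
```

The qualitative outcome matches the talk: the cheaper the resting state relative to a spike, the sparser the optimal code, and for a small resting fraction the optimum sits at a very low spike probability per bin, consistent with ~1 Hz firing.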
So this is very interesting: at this critical state, indeed, you can organize large-scale spiking. As I showed you earlier, people found a large probability of these clustered firing events, and there is more detailed analysis of how this can be realized in our earlier paper. Now, almost the last slide. I showed you that this architecture, with locally dense networks and irregular spiking, can also be understood from a biologically realistic network. So I compare with the earlier model: before, I just had a local random network. Now I consider a kind of cortical sheet where locally I have several groups of neurons, modeling something like mini-columns, that are spatially separated. I start from a random network, where the neurons are randomly connected without the spatial constraint, but very sparsely, and then I start to rewire the network: I cut long-range connections and put them into the local clusters. The connections then become very short-range, and you save a lot of wiring energy, according to what the professor mentioned. When you rewire from the random network to this spatially modular network, the wiring cost is reduced significantly, because most of the connections become very localized. But the most interesting thing here is the dynamics. I only changed the topology, yet the dynamics change: the firing rate decreases from the earlier irregular, asynchronous firing at about 20 hertz to just one or two hertz. At the same time, the dynamics, initially irregular random spiking at about 20 hertz per neuron, become highly variable, and if you measure the neural firing, it forms a kind of avalanche.
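The wiring-cost part of this rewiring argument is easy to quantify: place the neurons on a 2D sheet in spatial clusters, then compare the total Euclidean wire length of a spatially unconstrained random network against a mostly-local modular network with the same number of connections. The geometry, cluster sizes, and 90% localization fraction below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Neurons arranged on a 2D sheet in spatial clusters ("mini-columns").
n_clusters, per_cluster = 16, 50
centers = rng.uniform(0, 10, (n_clusters, 2))
pos = (np.repeat(centers, per_cluster, axis=0)
       + rng.normal(0, 0.1, (n_clusters * per_cluster, 2)))
N = len(pos)
module = np.repeat(np.arange(n_clusters), per_cluster)  # cluster of each neuron

def wiring_cost(edges):
    """Total Euclidean length of all connections."""
    return np.linalg.norm(pos[edges[:, 0]] - pos[edges[:, 1]], axis=1).sum()

n_edges = 8000

# Random network: connect uniformly chosen pairs, ignoring space.
rand_edges = rng.integers(0, N, (n_edges, 2))

# Modular network: 90% of connections rewired to stay within a cluster.
mod_edges = rand_edges.copy()
local = rng.random(n_edges) < 0.9
src_cluster = module[mod_edges[local, 0]]
mod_edges[local, 1] = (src_cluster * per_cluster
                       + rng.integers(0, per_cluster, local.sum()))

print("random wiring cost :", round(wiring_cost(rand_edges)))
print("modular wiring cost:", round(wiring_cost(mod_edges)))
# Same number of connections, far less wire when most links are local.
```

What this sketch cannot show, and what the talk emphasizes, is the second effect: the same topological change also lowers the firing rate and moves the dynamics toward the critical, avalanche-generating regime.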
Now, if you apply a small perturbation to this network: in the initial sparse random network you do not induce much response, but in this critical state you can induce responses of all magnitudes, including large ones. So when you come to something similar to the real network, you use much less energy in the wiring and in the firing, but you actually create much more useful dynamical patterns. So that is all. Basically, what I showed you is this: under the constraints, you want a low firing rate and locally dense connections, while from the functional requirements, you want efficient processing, for example sensitive responses and high representation capacity. What I showed is that a generic neural circuit with interaction between excitatory and inhibitory neurons, with some important biological features, namely excitation-inhibition balance so that neurons fire by fluctuations, a locally dense network giving topological correlation, and synaptic kinetics in which the inhibitory synapses are slower, can dynamically self-organize a delayed inhibitory feedback and induce this Hopf bifurcation, and the system seems to sit close to this critical Hopf bifurcation. So the whole brain can be treated, can be considered, as coupled critical systems. In this way, the neural system, through evolution, seems to have found a way to use much less energy while still keeping efficient dynamical processing. As I just mentioned, this low-rate resting state can be very, very important, although it consumes a lot of energy. So that is the end of my talk. I would like to thank you, and thank my collaborators and the funding, and actually we also have some more work on characterizing brain criticality and signal complexity related to cognition. Thank you. Thank you very much.
I think we have one question from our broader audience. Leigh-Han, can you read it? Yeah.
Well, this person asked: what is the current knowledge on connectivity at the single-neuron level for the human brain, and maybe for the mouse? And is this very important for your conclusions?
Yeah, it depends on the resolution. There is a lot of imaging, for example from Huazhong University of Science and Technology in China; you can measure fiber-like projections, axons, but then you do not have information about which neurons are actually connected, because the synapses and such things are missing at that resolution. And recently I listened to a talk from a Chinese university of science and technology: they can use high-resolution imaging technology to go to a very, very high resolution and map out the whole 3D brain of a mouse and monkey, going toward human, but the data is at petabyte scale. It is huge data, and you have both dynamical activity and structure. There are a lot of fiber-tracing methods, but coming to the single-neuron level is, I think, still not very precise. Yeah.
Is it true that these properties at the network level do not really depend so much on the detailed connectivity?
I think it actually depends a lot, because different brains differ. Now people pay more and more attention to the so-called hierarchy. In the early days, functionally, people said different brain areas do different things. Now, with data on gene expression, connectivity, neuron density, and layer structure, you can see that these different functions are closely related to the local circuit architecture, and the local circuit architecture is also correlated with the long-range projections. So, as I mentioned, there are many levels that are actually correlated. Yeah. Matteo?
Yes, I was wondering.
So during your talk, I saw that the exponent of this power-law distribution for the avalanche size ranged from 1.7 to 2.7. So is there some universality, or is it a question of how you measure this power-law distribution?
Yeah, this is a hard one for me; I am not very strong in statistical physics. People have been debating whether this is really universal, comparing to the classical value of 1.5. But more and more measurements find that it depends on how you measure: on the time window, on how you define the avalanche, and even on the input and different conditions. So now people agree that this exponent can change, but the relationship I showed you between the different exponents, the size exponent, the duration exponent, and the size-duration relation, follows a scaling relationship; this is confirmed, also in our data. And I think in the neural system there is another possible source of the so-called power law: you have a lot of heterogeneity in the connectivity. As I mentioned, the circuits are very non-uniform, and it is still not really well understood how much this heterogeneity already contributes to the so-called power law. Yeah. Thanks.
Thank you. Thank you very much. Are there any other quick questions? If not, we can move on.