Yes, now we can see your presentation; you can start. Can you hear me? Yes, we can hear you.

Okay, thank you very much. First of all, I would like to thank the organizers for inviting me and giving me the opportunity to present. My talk is based on our work titled "What to learn from a few visible transitions' statistics?", authored by Pedro Harunari, myself, Matteo Polettini and Édgar Roldán. I would also like to bring to your notice that there is another work along the same lines, by van der Meer, Ertel and Seifert, which I would recommend looking at for a better understanding.

Moving on to the background of our work. In our day-to-day life we observe systems that are very complex in nature, with many degrees of freedom. It is very difficult to understand such systems in full, but there is a certain class among them, of physical, chemical or biological systems, which can be understood using a Markovian framework. Here we see a network with internal states and the transitions among them. The point is that if we have complete information about the internal states of this network, we can predict the thermodynamic properties of such systems. However, most experimental apparatus is limited in probing these systems: at most, a few visible transitions in the whole network can be detected, while the internal states and the remaining transitions stay completely hidden. For example, in experiments on molecular motors carrying vesicles along tracks, the only accessible information is the translocation of the motors on the track.
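To make this setting concrete: a partially observed Markov jump process of this kind can be simulated with the Gillespie algorithm, recording only a chosen subset of transitions. This is a generic sketch with made-up states and rates, not the model from the talk:

```python
import random

# Hypothetical 4-state ring network (not the talk's model); rates[(i, j)] is the
# rate of the jump i -> j, and only the 4 <-> 1 "hopping" transitions are visible.
rates = {
    (1, 2): 2.0, (2, 1): 1.0,
    (2, 3): 2.0, (3, 2): 1.0,
    (3, 4): 2.0, (4, 3): 1.0,
    (4, 1): 2.0, (1, 4): 1.0,
}
VISIBLE = {(4, 1), (1, 4)}

def gillespie_visible(n_jumps, state=1, seed=0):
    """Simulate the jump process; return only the visible record [(time, transition)]."""
    rng = random.Random(seed)
    t, record = 0.0, []
    for _ in range(n_jumps):
        out = [(j, k) for (i, j), k in rates.items() if i == state]
        total = sum(k for _, k in out)
        t += rng.expovariate(total)          # exponential waiting time in `state`
        r, acc = rng.random() * total, 0.0
        for j, k in out:                     # pick the next jump with probability rate/total
            acc += k
            if r < acc:
                if (state, j) in VISIBLE:
                    record.append((t, (state, j)))
                state = j
                break
    return record

trace = gillespie_visible(10_000)
fwd = sum(1 for _, l in trace if l == (4, 1))
bwd = len(trace) - fwd
print(fwd, bwd)
```

With the forward bias chosen above, the visible record shows more 4→1 than 1→4 events, which is exactly the kind of raw data the observer in the talk works from.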
Those translocations are the few visible transitions we are able to observe. In this scenario, the motivation of our work is the question: how can one use this transition information to make inferences about the dynamical and thermodynamic properties of such systems?

Imagine an observer with a stopwatch who records only these few visible transitions. The black trajectory you see here is the complete trajectory, with all internal states visible, but only the blue transitions are visible to us, and we can record the time it takes to go from one visible transition to the next. From this we obtain, for the particular trajectory, a time series of visible transitions. From such a time series we can compute certain statistical quantities. One is the time between two successive chosen transitions, say ℓ_i and ℓ_{i+1}, given that these two transitions happen. We also have conditional frequencies: given a particular transition, what is the probability that the next observed transition is ℓ_{i+1}? Or we can compute the frequency of a transition itself; in our molecular motor, for example, the frequency of the forward transition. All of this information can be computed directly from the time-series analysis.
We would also like to compute analytical expressions for such quantities. For example, here we have a four-state Markov network where the states and most of the transitions are not visible to us; only the transitions drawn in magenta are visible. The main purpose is to find the time it takes to observe a particular transition. This is a first-transition-time problem that we are trying to solve. We already know about the first-passage-time problem, so we map the first-transition-time problem onto a first-passage-time problem by redirecting the observed physical transitions into sinks. This gives an extended, auxiliary network containing the sinks S1 and S2.

If we had known the whole network, we could simply have used the master equation, solved for the probabilities, and computed any quantities from there. Since in our case that is not so, how do we proceed with only a few visible transitions? We construct an extended stochastic matrix whose blue block is the survival matrix: in the terminology of the first-passage-time problem, as long as you remain in this region you survive and do not fall into the sinks. What we are trying to compute is the first-passage time to reach a sink, which is equivalent to the time at which the system goes through the corresponding visible transition. This methodology is inspired by a paper by Sekimoto. One more thing I would like to mention is that we use a special notation here.
If we have, say, a transition ℓ from state i to state j, then one symbol represents the source of the transition and the other its sink, or rather its target.

Using this framework, we can compute the joint probability that the inter-transition time falls within [t, t + dt] and that the next physical transition is ℓ_{i+1}, given that we have observed the transition ℓ_i. We can also compute the conditional probability of the next observed transition given that ℓ_i has occurred, and, combining these two quantities through Bayes' theorem, the inter-transition time probability density. I am not going into the details of how these quantities are computed, but it amounts to solving the first-passage-time problem for the extended stochastic matrix. We can also compute the probability that a given transition appears in the time series. These are the different statistical properties we can extract by solving the master equation for the extended stochastic matrix.

Moving on, here we have the transition statistics for the ATP-driven motion of a molecular machine. We have a chemical coordinate and a position coordinate. We cannot see the chemical conformational changes during the motion of the machine; the only thing we can see is its forward and backward movement along the position coordinate. These are given precisely by the transitions 4→1 and 1→4, the forward and backward hopping of the molecular machine. From here we can look at the case of repeated transitions.
That is, if you observe a transition 4→1 and ask for the probability density of the event where 4→1 is again followed by 4→1, then the density at vanishing inter-transition time is zero, because the system has to travel through the hidden network before it can return to the 4→1 transition. For the alternated transitions this is not so: right after a 4→1 transition there is a high chance of a 1→4 transition, and therefore a significant probability density at short times. These features can also be predicted from the quantities we have computed, and they match very well.

After these transition statistics, the statistical properties of the system, we now move on to its thermodynamic properties: how can we infer the thermodynamic properties of a system from the visible transition trajectory? We have the visible transition trajectory, and we have to be careful when we perform time reversal: under time reversal, not only is the order of the transitions reversed, the transitions themselves are flipped, so a backward transition becomes a forward transition, and there is also a time translation to take into account. For the forward visible trajectory we can compute its probability from the joint and conditional probabilities together with a boundary term, and we can compute the probability of the reverse trajectory in a similar fashion. Entropy production is the signature of irreversibility in the system.
Therefore the distance between the two, the forward-trajectory probability density and the backward-trajectory probability density, holds this irreversibility signature, and we identify it with the stationary rate of entropy production. If we had observed the whole trajectory, we could have computed the stationary entropy production rate from the Kullback-Leibler divergence between the probabilities of the full forward and backward trajectories; this was done in a paper by Roldán and Parrondo. The point is that the whole trajectory involves more random variables, whereas we are looking at only a subset of the trajectory. In that case Σ_L, the stationary visible entropy production rate, is smaller than the stationary entropy production rate computed from the full information. There is another way of computing the stationary entropy production rate: if we know the whole network, the transition statistics and the probabilities of being in the different states, we can compute the total steady-state entropy production from there as well.

Working out these quantities using the conditional probabilities, we find that Σ_L, the visible entropy production rate, has two contributions: σ_ℓ, the contribution from the sequence of transitions, and σ_t, the contribution from the inter-transition times, which involves the Kullback-Leibler divergence between inter-transition time distributions.
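Written out schematically (my notation, with ℓ̄ the reversal of transition ℓ; the exact frequency prefactors are in the paper and are suppressed here), the decomposition just described reads:

```latex
\Sigma_L = \sigma_\ell + \sigma_t ,
\qquad
\sigma_t \,\sim\, \sum_{\ell,\ell'} p(\ell \to \ell')\,
  D_{\mathrm{KL}}\!\left[ P(t \mid \ell \to \ell') \,\middle\|\, P(t \mid \bar\ell' \to \bar\ell) \right],
\qquad
D_{\mathrm{KL}}[P \,\|\, Q] = \int_0^{\infty} P(t)\, \ln\frac{P(t)}{Q(t)}\, dt .
```

The first term compares how often sequences of visible transitions occur against their time-reversed counterparts; the second compares the waiting-time densities of a transition pair against the reversed pair.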
Okay, now focus on a particular special case where only a single pair of transitions is visible, say only 4→1 and 1→4. In the time-reversed trajectory, not only is the order of the transitions reversed, the transitions themselves are reversed, and as a result certain terms cancel; we find that σ_ℓ and σ_t depend only on the repeated inter-transition time statistics and on the frequencies of successive transitions.

Next, we computed this for a particular example with a single visible transition pair, evaluating the inter-transition times for repeated and for alternated transitions. Here the exact entropy production rate is given by the black line, and the total visible entropy production rate from simulations matches it. There is a contribution from σ_ℓ, and that matches exactly the estimate from the thermodynamic uncertainty relation. This is understood because σ_ℓ is in fact a product of the current and the affinity of the network, which is why the two are correlated. Consequently, when the system is at stall the currents do not predict the entropy production rate, but σ_t, the inter-transition time contribution, retains the signature of entropy production even at stall: as you can see, changing the bias parameter has no visible effect on σ_t, so it still contributes.
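For context, the thermodynamic-uncertainty-relation estimate that σ_ℓ is compared against here can be evaluated from current statistics alone; a minimal plug-in sketch in Python, with k_B = 1 and made-up counts rather than the talk's data:

```python
def tur_bound(current_samples, window):
    """Thermodynamic uncertainty relation, k_B = 1:
    sigma >= 2 <J>^2 / (Var(J) * T), where J is the current
    integrated over an observation window of length T."""
    n = len(current_samples)
    mean = sum(current_samples) / n
    var = sum((x - mean) ** 2 for x in current_samples) / (n - 1)
    return 2.0 * mean ** 2 / (var * window)

# made-up net (forward minus backward) visible hop counts in windows of T = 10
counts = [12, 9, 11, 14, 10, 8, 13, 11]
print(round(tur_bound(counts, window=10.0), 3))  # -> 6.05
```

At stall the mean current vanishes, so this bound collapses to zero; that is precisely why, as said above, only σ_t carries dissipation information there.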
In the case of ring networks, networks with a topology like this one, and again with only one visible transition, the result depends on the repeated-transition statistics. We see that the estimates match exactly, and we can understand why: here σ_t has no contribution, the contribution comes entirely from the sequence of transitions, σ_ℓ, and the estimate from the thermodynamic uncertainty relation gives the same value.

Now let us look at a few molecular machine models to which these studies apply. First a single-cycle motor model: as we increase the concentration of ATP, the entropy production rate increases. As the motor consumes more and more ATP while moving, the system dissipates more, compared with increasing the concentration of ADP.

Moving on to the multicyclic kinesin model, a transport motor with two cycles: in contrast to the previous case, here both the forward and the backward cycles consume ATP to produce mechanical movement. Because of the symmetry between the forward and backward cycles, the inter-transition times for the repeated transitions have exactly the same distribution, but that is not the case in general for multicyclic models. From here we can also compute the Kullback-Leibler distance between the alternated-transition distributions and look at the entropy production. There is no contribution from σ_t, so the entropy production estimate at stall becomes zero.
We also see that, as we increase ATP, the entropy production rate increases, while the ADP concentration has no effect on it.

Now we move on to another part. So far we were mostly concerned with repeated-transition statistics; what about the inter-transition times for alternated transitions? Here we considered molecular machines such as the ribosome and RNA polymerase, whose tracks are heterogeneous in nature, and this heterogeneity also affects the transition rates involved in the translocation. Consider a track built from two kinds of monomer: when the molecular motor sits on a monomer of type A we have one set of rates, and on a monomer of type B a different set of rates.

(I seem to be offline now. Hello? Have I lost you? Yes, we lost you at the slide on the non-repetitive sequence of the stepping motor, and at the moment you are not sharing the slides. Okay, thank you. Yes, it is this one.)

So we were talking about disorder in the track. We can introduce disorder through the rates, by assigning a different set of rates to each kind of monomer, and we can also treat the distribution of the monomers along the track as a disorder parameter. Looking at the results: as we increase the heterogeneity of the rates between the two kinds of monomer, A and B, the Kullback-Leibler distance between the two distributions for the alternated transitions increases, but the repeated transitions show no signature of it.
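The Kullback-Leibler distances between waiting-time distributions used here as a disorder signature can be estimated from recorded inter-transition times with a simple histogram plug-in; a rough sketch with made-up samples (the bin count and the floor value are arbitrary choices, not from the talk):

```python
import math

def kl_histogram(samples_p, samples_q, n_bins=20):
    """Plug-in estimate of D_KL(P || Q) from two sample sets, via shared-bin
    histograms; a tiny floor on each bin avoids log(0) for empty bins."""
    lo = min(min(samples_p), min(samples_q))
    hi = max(max(samples_p), max(samples_q))
    width = (hi - lo) / n_bins or 1.0
    def hist(samples):
        counts = [1e-12] * n_bins
        for x in samples:
            counts[min(int((x - lo) / width), n_bins - 1)] += 1
        total = sum(counts)
        return [c / total for c in counts]
    p, q = hist(samples_p), hist(samples_q)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# made-up inter-transition times: identical samples give zero divergence
a = [0.1 * k for k in range(1, 101)]
print(kl_histogram(a, a))  # -> 0.0
```

Applied to alternated-transition times recorded on tracks of increasing heterogeneity, such an estimator would trace out the growing distance just described.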
As we change the probability of a given monomer on the track, we see a non-monotonic behavior of the Kullback-Leibler distance between the alternated transitions. The two ends of the curve correspond to the homogeneous cases, where the track consists only of monomer A or only of monomer B. From these studies we can say that the alternated transitions carry a signature of the degree of disorder in such systems.

Now I come to the summary. For a broad class of stationary Markov processes we derived exact analytical results for the conditional probabilities of appearances of successive transitions. We can measure the inter-transition times, which are crucial for thermodynamic inference, and we see that repeated-transition frequencies and inter-transition times contain the signature of irreversibility, while the alternated transitions do not contribute to entropy production estimates but their statistics provide a means to identify the presence of disorder in the hidden state space. With that I would like to conclude my talk. Thank you, everyone.

Thank you for your talk, Annwesha. Now I see a question in the chat, the long one: what is the relation between your decomposition of entropy production and the decomposition proposed in the earlier article "Inferring broken detailed balance in the absence of observable currents"? Is the current decomposition a generalization of the decomposition of that article? In that particular study the inter-transition times between repeated transitions were not taken into account, so that is a factor which had not been considered there. That is what I can say regarding repeated and alternated transitions.
Are there any other questions? Yes, please ask. Thank you for your interesting talk. My question is: is it possible that Σ_L vanishes identically although the total system is out of equilibrium? Which sigma are you talking about? Sigma capital L. Okay, and can you repeat the question? Are there cases where Σ_L totally vanishes although the whole system is out of equilibrium? Yes. Σ_L has contributions from σ_ℓ and σ_t. σ_t can be zero in ring-like networks even when the system is out of equilibrium, and σ_ℓ can be zero when there is no current through the observed transition, so there are situations where both quantities tend to zero. But if there is no current in the system, is that not just equilibrium? Yes, let me think about it. If σ_ℓ were zero for the system as a whole, that would indeed mean equilibrium. But if we are looking at one particular visible transition, its current could be zero while another transition carries a nonzero current, and in that case we would miss the nonequilibrium behavior. For a ring network we can rule this out, because the current is the same through every edge: if one transition has zero current, the current through all edges must be zero. But in a complex, multicyclic network, a particular transition having zero current does not mean the whole system is in equilibrium. I see, thank you. Thank you.

Okay, I think there are some comments in the chat. Are there any other questions? I added a comment, but it is a bit long; you can read it later. And there is one more question, maybe.
Consider a state system with no current but far from equilibrium, either because it is in a transient or because its rates change with time: would Σ_t be zero in this case? I cannot comment on this from the work we have done; our results are for stationary processes, so this case is not covered. Okay, thank you.

Okay, I think, thanks again, Annwesha, for your very nice talk, and we go to the next speaker.