but I think there are two fundamental reasons. The first: this is what we are going to talk about today, a set of bounds that goes by the name of thermodynamic uncertainty relations. So the first reason this is close to my heart is that, besides David's work presented in the previous lecture, this is seriously ongoing research. We are going to talk about maybe two or three of them, and you are going to be deriving one of them from the fluctuation theorems. But if I tell you that there are at least 50 of them out there, believe me, it's true. Why do they come in so many different flavors, so many different expressions? Because they are bounds written in terms that capture the conditions you impose on the system and on the system's dynamical evolution. We are going to see this and discuss it. That is the first reason. The second reason is that throughout this course we have always been talking about resources: what is the minimal cost of carrying out some process, and given minimal resources, how do you carry out a task efficiently, and so on. These bounds speak directly to that. Okay, so let's go back to equilibrium statistical physics to get the big picture. In equilibrium statistical physics you have an isolated system, and the fundamental principle you build on is that all the microstates are equally likely. When you write down the Boltzmann-Gibbs distribution, you have a universal principle guiding you to derive the thermodynamic properties of the system, such as temperature or pressure, and so on.
And there is also an expression we call the second law, which looks like a resource constraint because it dictates what kind of physical process is allowed: only those that increase the entropy. Now, when it comes to non-equilibrium, we are considering mesoscopic systems, small fluctuating systems. They are incredibly exposed to fluctuations, and they are often also far from equilibrium. So can we find a universal principle that characterizes the behavior of these systems, as we did in equilibrium statistical physics? The answer seems to be no, because they come in so many different flavors and types. But what we can do, for example, is identify non-equilibrium steady states. How do we identify a non-equilibrium steady state? We know this, right? It's a serious question. [A student mentions detailed balance.] Detailed balance, yes, that is related to what you are mentioning, but basically the way we identify a non-equilibrium steady state is that you have non-vanishing currents and non-zero entropy production. You have dissipation: there is an entropy cost to maintaining the process. That is a non-equilibrium steady state. By the way, this is one of the take-home messages; we should know it by heart. One of the things we know is that if you have a non-equilibrium steady state together with a time-independent driving protocol, and, okay, I am going to abbreviate it as NESS, because "non-equilibrium steady state" is too long and I get tired of saying it.
So when you have a NESS and a time-independent driving protocol, what you have is a constant average entropy production rate. And when the average entropy production rate is constant, we can write down beautiful, lovely, friendly fluctuation theorems, like this one: a detailed fluctuation relation. I think we also saw the integral fluctuation relations yesterday. This is basically a universal formula. Maybe it is not stating a universal principle of the kind we have in equilibrium statistical physics, but we cannot have such a principle here, because entropy production has become a random variable; it is exposed to fluctuations. We need to speak in terms of probability distributions and concentrate on the statistics of thermodynamic quantities such as entropy production. So this is as universal as we can get. It says that it is exponentially more probable to observe an increase in the stochastic entropy than to observe the corresponding decrease. That is one form of a detailed fluctuation relation. But there is another, expanded set of relations that includes not only entropy production but also some currents, which we are going to define formally; for now, think of them as physical quantities that contribute to the change in entropy as your system evolves. So today, using that kind of joint fluctuation theorem, we are going to derive one member of a complementary set of relations called the thermodynamic uncertainty relations.
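For reference, the detailed fluctuation relation being pointed at can be written as follows (my transcription of the board, with s a value of the total stochastic entropy production and k_B Boltzmann's constant):

```latex
\frac{P(\Delta S_{\mathrm{tot}} = +s)}{P(\Delta S_{\mathrm{tot}} = -s)} = e^{\,s/k_B}
```

which says exactly that observing an entropy increase of size s is exponentially more probable than observing the corresponding decrease.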
What these relations provide is an understanding of the minimal energetic cost of maintaining some physical process if you want to achieve some amount of precision, given in terms of an inverse uncertainty. To make this more concrete, because said like that it is just words in the air, let me first put up a verbal statement. The first TUR was derived in 2015; you can go and check it. Is anyone here interested in biology? Okay, perfect. It was derived in the context of biochemical reactions, because the question asked was the following: stochastic thermodynamics is good and cool and you can universally bound some fluctuations, but take, say, an enzymatic reaction. We want to say something about the uncertainty, the precision, of some physical quantity, and relate it to the thermodynamic cost of the process; I am going to say "process", but take it as, for example, the biochemical reaction. Is the main message clear? Because if it is not, we will not be able to go beyond it. And give me feedback; if you don't give me feedback, I feel seriously sad. Okay, thank you. So we are asking this question because it is useful: we think biological systems actually do this, and we think that to design more efficient computers you need to be able to do it too; you need to have as little uncertainty as possible. So let's start with a minimal system. [A student: "Bound precision by dissipation."] Yes, exactly: bound precision by dissipation. We want to provide a bound on the precision, which you can also think of as the uncertainty, in terms of the dissipation.
Bound by dissipation. I will come to the formula, but if you don't get the intuition first, the formula won't mean anything. So let's start with a minimal model. This is the system, in a non-equilibrium steady state. Is it enough to couple it to one heat bath, or do I need multiple heat baths to have a non-equilibrium steady state? Multiple heat baths, yes; I am going to say why. If you don't want a non-equilibrium steady state, you can couple it to a single heat bath and you are good to go. And remember, there is an implicit time dependence; let me write it like that. This is the master equation describing the system dynamics. We know that, perfect. One thing I want to emphasize is the difference between coupling to one heat bath and coupling to multiple heat baths. There is an implicit sum here: when you couple the system to multiple heat baths, this term, which encodes the probability per unit time of making a transition, is the sum of the contributions of each heat bath. Because when the system is coupled to multiple baths, you know it is going to make a jump, but you don't know which jump it will make, or due to which coupling. So there is an implicit sum over baths. This is in Esposito and Van den Broeck, around page three; you should look at it. And indeed, to have a non-equilibrium steady state you need multiple rates from multiple baths, so that's good.
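A minimal numerical sketch of this point (the rates and the 3-state system are invented for illustration, not the system on the board): each bath contributes its own rate matrix, the total rate is their implicit sum, and the master equation dp/dt = W p is evolved until it stops changing.

```python
import numpy as np

# Hypothetical 3-state system coupled to two heat baths (nu = 1, 2).
# Each bath contributes its own rate matrix; the total transition rate
# is the implicit sum  W(x|x') = sum_nu W_nu(x|x').
W1 = np.array([[0.0, 2.0, 0.5],
               [1.0, 0.0, 1.5],
               [0.5, 1.0, 0.0]])           # bath 1: rate column -> row
W2 = np.array([[0.0, 0.3, 2.0],
               [2.0, 0.0, 0.2],
               [0.1, 1.0, 0.0]])           # bath 2

W = W1 + W2                                # implicit sum over baths
W -= np.diag(W.sum(axis=0))                # diagonal = minus escape rates

# Master equation dp/dt = W p, integrated with a small Euler step.
p = np.array([1.0, 0.0, 0.0])
dt = 1e-3
for _ in range(200_000):
    p = p + dt * (W @ p)

assert abs(p.sum() - 1.0) < 1e-9           # probability conserved
assert np.allclose(W @ p, 0.0, atol=1e-8)  # dp/dt = 0: steady state
```

Because the two baths pull in different directions, the resulting stationary distribution carries non-vanishing probability currents, which is exactly the NESS situation described above.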
So we can define a trajectory. We are evolving the system: continuous time, but a discrete set of states or configurations that the system can visit. Let's call the trajectory omega and define it like this. The system starts in the initial state x_0 at time t_0, then jumps to another state, and so on, up to (x_n, t_n, nu_n), where t_n is the final time of the dynamical evolution; the system evolves between time zero and t. What this trajectory encodes is the successive set of states the system visits: initially, at t_0, I am in state x_0, and I jump to x_1 at t_1 under the effect of heat bath nu_1, and so on; the whole history of states is encoded in this trajectory. Now, I need to make some changes because this is going slower than I thought, but okay. Let me ask one thing: Esposito and Van den Broeck, did we read it or not? Be honest, you didn't, right? I know there are too many classes and too many things to do, but this is ongoing, serious research, and we need to be able to answer directly, for example, what it takes to have a non-equilibrium steady state, or how to write down a trajectory, because we are now going to define something called a current. It is a function built from an antisymmetric increment. It is going to be mathematically precise; you are going to understand it, but we also need to understand it physically. Okay, perfect.
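As a sketch of what such a trajectory looks like as data (the states, rates, and bath labels are invented for illustration), here is a minimal Gillespie-style sampler that records (x_j, t_j, nu_j), including which bath caused each jump:

```python
import random

# Invented rates k[nu][(x_from, x_to)] for two baths nu in {1, 2}.
k = {
    1: {(0, 1): 2.0, (1, 0): 1.0, (1, 2): 1.5, (2, 1): 0.5},
    2: {(0, 1): 0.3, (1, 0): 2.0, (1, 2): 0.2, (2, 1): 1.0},
}

def gillespie(x0, T, seed=0):
    """Return [(x0, t0), (x1, t1, nu1), ...]: states, jump times, baths."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    traj = [(x, t)]
    while True:
        # All channels (target state, bath) leaving x, with their rates.
        channels = [(xt, nu, r) for nu, rates in k.items()
                    for (xf, xt), r in rates.items() if xf == x]
        total = sum(r for _, _, r in channels)
        t += rng.expovariate(total)        # waiting time ~ Exp(total rate)
        if t > T:
            return traj
        u = rng.random() * total           # pick which channel fired
        for xt, nu, r in channels:
            u -= r
            if u <= 0:
                x = xt
                traj.append((x, t, nu))
                break

traj = gillespie(x0=0, T=10.0)
```

Each entry after the first records not only the new state and the jump time but also the bath label nu, which is exactly the bookkeeping the trajectory omega above requires.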
So given a trajectory, we can define a new quantity that allows us to count the net transitions. The transition rate encodes the jumps you make from one state to another per unit time; this is the transition rate from the state x' to x, and I am going to use this notation. We are counting, and don't forget the word "counting", because one of the questions in the homework depends on full counting statistics, and we are going to see why it is called counting statistics. Let me label the jumps by j: j = 0 encodes (x_0, t_0), j = n encodes the final point, and the jumps of the trajectory run from j = 1 to j = n. So I am going to write this as a sum from j = 1 to n. How do you count jumps from one state to another? This is something you can encode in terms of Kronecker delta functions: if you make a jump from x' to x, the deltas give you one count only if that particular transition occurred at that step. [Student: "By the influence of nu?"] Exactly, yes: when I use this script nu here, it means the jump happens under the influence of heat bath nu. The heat baths are each fixed at some inverse temperature, and another point is that we want them to be at different temperatures; that is what keeps the system in a non-equilibrium steady state.
And the idea is that basically everything interesting in the world happens because there is conflict. You are coupled to two heat baths; each has its own temperature and wants to impose it, stretching you in different directions. So what do you do? You go out of equilibrium. That is the intuitive explanation. [Student: "What are the two arguments of the delta function? It's the Kronecker delta, so the second delta probably should not be there; it should be a comma."] Let me look at what I wrote. The second delta encodes the fact that you are jumping from some x' to x... ah, I see what you mean: the way I wrote it, the second delta should be a comma inside a single Kronecker delta. You are right: this delta is saying that x'_{j-1} should equal x, so as written the extra delta symbol should not be there. I apologize; it is a notation issue, but you do need to keep track of this. Okay, let me erase this, take it over here, and write it again. Is this big enough, or should I go bigger? Okay, perfect.
Okay, so this is something you use to count the transitions between states, and we understand what it means physically; when I see nods, I get really happy. Now comes the definition of a current, something I think we did not define before in this context. Of course, there is an implicit dependence on the trajectory, so let me not lose that in the notation. Notice that we are now also summing over the baths nu. This is important: this is what we call a generalized accumulated current along a trajectory, and it has a trajectory dependence. So what is it? There is actually only one imposition on how you define a current function. The current function is incredibly general, but there is one condition it must satisfy: the increment must be antisymmetric, picking up a minus sign under exchange of the two states. This component here is what allows us to count the transitions. If there is a question, I can take it. [Student: "Can you repeat what x_j and x'_j are?"] Okay. This is how we write the master equation; maybe I should have used y, I apologize, but this is how I got used to it. These are two different states: the notation says you are jumping from x' to x. We use j to keep track of the steps: j = 0 gives you the initial point of the trajectory, j = 1 the first jump, and the trajectory runs from time t = 0 to some finite time t = T, so the final value of j equals n.
So when you write this kind of thing with Kronecker deltas, what you are keeping track of is the net number of transitions between states along a trajectory; that is why you are summing from j = 1 to n. [An aside: "We should have made it two hours." "No, I have time, I was just making a comment, getting your feedback."] Let me restate it: the current is not defined simply as the net number of state transitions. Rather, at each state transition you have an originating state and a final state, and we want a function of those two things. The only restriction is that the function has to be antisymmetric: if I do the same transition the other way, I get the negative. The current, the way the term is used in this literature, is the sum total of that function, evaluated at each transition, over all the transitions in a trajectory. The trajectories are themselves random variables: the number of transitions, as well as the transitions themselves, will vary from one run of the experiment to another. So everything that is going to be talked about here involves averages of these currents, where you average this accumulated sum of the increment function over all trajectories. So, as Gilder was emphasizing, it is a very, very general beast. A current in the sense you and I normally think of it, like the number of electrons accumulating at the plus terminal minus the ones that left the negative terminal, is a current; water flow is a current; but really very weird things can be currents.
For example, it is a valid current to track the change in the opinion of the people in this room about whether it is dark outside: plus one if you now think it is becoming dark, minus one if you think it is not. That is a current. Perhaps the easiest example: you are flipping a coin; every time you get heads, add one, every time you get tails, subtract one. It has to be the same coin throughout, and it need not be i.i.d.: once you get a head, you might go back the other way, which would give you a negative increment. Okay, and now more thermodynamically: the way to make sense of this is that the building block is what we call an antisymmetric increment. It can come in different forms, and the different forms are how we generate different expressions for currents. For example, suppose E(x') is the energy of the state x' and E(x) the energy of the state x. If you put that difference in as the increment, what you get is something like a net heat flow along that particular trajectory. And of course it is always along a particular trajectory, or we would not be discussing stochasticity in currents; that is why I keep writing the trajectory dependence. Is it clear to people what a current is? Okay. [Student: "I think there is something wrong: for the number of particles hopping from x to x' we have x'_j, but..."] Yes, and you need to keep track of which step of the trajectory you are at, and from where to where you are jumping; that is why you use a label like j.
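To make the antisymmetric increment concrete in code (the energies and the example sequence of states are invented for illustration): with the increment d(x', x) = E(x') - E(x), the accumulated current just sums the increment over the jumps of a trajectory, and traversing the jumps in reverse flips its sign.

```python
# Invented state energies; d(x_from, x_to) = E[x_from] - E[x_to] plays
# the role of the heat released at each jump: an antisymmetric increment,
# since d(a, b) == -d(b, a).
E = {0: 0.0, 1: 1.0, 2: 0.5}

def d(x_from, x_to):
    return E[x_from] - E[x_to]

def accumulated_current(states):
    """Sum the increment over the successive jumps x_{j-1} -> x_j."""
    return sum(d(states[j - 1], states[j]) for j in range(1, len(states)))

states = [0, 1, 2, 1, 0, 1]        # an example sequence of visited states
J = accumulated_current(states)

# Antisymmetry: the reversed trajectory gives minus the current.
assert accumulated_current(states[::-1]) == -J
```

Any antisymmetric d would do; swapping in a different increment (the opinion example, the coin example) changes which physical current you are counting, not the structure of the definition.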
[Student: "But what's x'_j? Because we defined x_1, x_2..."] Okay, if it is not clear, you can just write it that way, using the j labels directly. Fine, let's go with that. So now I need your help, because I thought this would be one hour of class; I assumed we would already know how to deal with non-equilibrium steady states. Maybe you didn't reply out of shyness. So I am going to go really, really slowly, and I need you to give me an incredible amount of feedback, please. What is the difference between an equilibrium and a non-equilibrium steady state? What is the actual definition, not how it is achieved with baths? Do people know? Okay, exactly: the crucial thing is what is zero. Say you have a system with three states. The marginal distribution over those three states can be unchanging in time while probability mass cycles among them: that is a non-equilibrium steady state. In equilibrium, there is no net probability flow between any two particular states. That is the crucial distinction, and the way a NESS is achieved in practice is by having multiple baths. Part of the intuition driving everything here is that entropy production goes up when you have very strong asymmetries in how the non-equilibrium stationary state moves probability from one state to another: that is how you get a lot of current.
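The three-state picture just described can be sketched numerically (the rates are invented): with symmetric rates the stationary probability current between any pair of states vanishes (equilibrium), while with a biased cycle the marginal distribution is still stationary but probability mass circulates.

```python
import numpy as np

def make_generator(rates):
    """Build a 3-state generator; rate i -> j goes in column i, row j."""
    W = np.zeros((3, 3))
    for (i, j), r in rates.items():
        W[j, i] = r
    return W - np.diag(W.sum(axis=0))

def stationary(W):
    """Stationary distribution: eigenvector of the zero eigenvalue."""
    vals, vecs = np.linalg.eig(W)
    p = np.real(vecs[:, np.argmin(np.abs(vals))])
    return p / p.sum()

def prob_current(W, p, i, j):
    """Net stationary probability flow from state i to state j."""
    return W[j, i] * p[i] - W[i, j] * p[j]

# Equilibrium: symmetric rates satisfy detailed balance.
eq = make_generator({(0, 1): 1.0, (1, 0): 1.0, (1, 2): 1.0,
                     (2, 1): 1.0, (2, 0): 1.0, (0, 2): 1.0})
# NESS: the cycle 0 -> 1 -> 2 -> 0 is biased forward.
ness = make_generator({(0, 1): 2.0, (1, 0): 0.5, (1, 2): 2.0,
                       (2, 1): 0.5, (2, 0): 2.0, (0, 2): 0.5})

p_eq, p_ness = stationary(eq), stationary(ness)
```

Both cases have a time-independent stationary distribution (here uniform, by symmetry); only the biased cycle carries a non-vanishing probability current between states, which is what makes it a NESS.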
The relationship between the probability flow in the non-equilibrium steady state and the currents of these observables that Gilder was showing is what drives the thermodynamic uncertainty relation: it couples the energetics of maintaining the non-equilibrium steady state to the actual values of the currents. Okay. Now it is going to go more smoothly, because there is one thing left, based on this. Do you have questions? [Student: "For d_{xx'}, how does it depend on nu_j?"] It must depend on it, right? You need to know that you have a state transition: you are jumping from one state x' to x and there is a difference in energy. If there were only one heat bath, it would be trivial. But say you are coupled to different heat baths, labeled by nu_j, and you want to know which heat bath is causing the transition; so you need to include that label here. That information is in nu, but also in this term. For any given nu_j you don't have to think about it separately: you sum over all the nu_j's when you define your current. That is how we define a generalized accumulated current: you sum up all the contributions. Okay, great. And this is a little j, and this one, okay, let's go with i; but from this point on we are not even going to label these things, so let's just keep it like that: a little j and a big J.
I thought this was obvious, but okay, sorry about that; I will be careful with the labels. What you want to do when computing the total current is this: you have heat bath one and heat bath two, and the system makes a jump from one state to another. You want to keep track not only that the jump happened, but due to which coupling. And when you compute the current over a trajectory, you need to account for all the jumps occurring due to any of the heat bath couplings; that is why you sum over the nu's. Once you sum over the nu's, you can think of little baby currents, one from each heat bath, and the complete generalized accumulated current function is the sum of all those contributions. Is that clear? This is something we have been doing with the master equation, so it should be familiar. Okay, great. [Student: "We label with nu_j the bath that causes the jump between x_j and x_{j+1}?"] Yes, exactly; I said this sentence when I defined the trajectory, but thanks for reminding us, I think it makes things clearer. Exactly: if you make a transition at, say, step j = 3, then you need to account for the contribution of the bath that makes you jump at that step, and this is just an abbreviation for that.
So after that: we have talked about thermodynamic uncertainty relations, we have talked about fluctuation theorems, and we know how to write down fluctuation theorems, right? You use the probability distributions for running the movie forward, which generates an increase in the entropy, and running the movie backward, which gives a corresponding decrease; and we care about entropy production as a random variable and about the probability distribution characterizing it. But thermodynamic uncertainty relations have two components. As I tried to emphasize when I started this lecture, there are joint fluctuation theorems that say something not only about the statistics of the system's entropy production, but also about the currents that contribute to the change in entropy as you evolve along a trajectory. So thermodynamic uncertainty relations include these two terms: one is the average entropy production that you accumulate, and the other is the variance of the current divided by the squared mean of the current. That ratio is the squared relative uncertainty, the inverse of the precision, and you want to keep it as low as possible. So let me put the prototypical bound here; again, the system evolves from time zero to a finite time T. I think it looks like this; did I forget something? No, okay. Now, not for today, but for the exam: we should know, we must know, how to write down things like this: entropy production, average entropy production over a trajectory, currents, and so on.
Because this is the core of stochastic thermodynamics, the foundation. Even if you don't do stochastic thermodynamics, even if you hate stochastic thermodynamics, it gives you an idea of how to model things. So, speaking as a TA now: we really must know how to write these things down. Everyone okay with this? Perfect. Now we take a brief historical detour, and it really is brief, because the first TUR was discovered in 2015. You can now also find this material in a textbook, the introduction to stochastic thermodynamics by Luca Peliti, I guess, as well as in the articles, but I will present it here to give you the idea. So they come in this form. This is the most general form, and this is the first one people derived, again in the realm of biochemical reactions. This term is the precision term, and this is the average entropy production you incur as you evolve the system over the time interval from zero to T. What does this say? Combining it with the verbal statement from the beginning of the lecture, do you have an intuition? Okay, remember the question I posed here. This side is the inverse entropy production, right? So if you want to have large precision, what do you need, or vice versa? Yes, exactly: there is a trade-off between precision and dissipation. You are paying with dissipation if you want to have precision.
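As a numerical sanity check of this trade-off (a sketch under my own assumptions, not the in-lecture derivation): for a continuous-time biased random walk with forward rate k+ and backward rate k-, the exact moments over a time T are mean(J) = (k+ - k-)T, Var(J) = (k+ + k-)T, and average entropy production Sigma = (k+ - k-) ln(k+/k-) T with k_B = 1, so the bound Var(J)/mean(J)^2 >= 2/Sigma can be verified directly.

```python
import math

def tur_sides(kp, km, T=1.0):
    """Return (Var(J)/<J>^2, 2/<Sigma>) for a biased continuous-time
    random walk with forward rate kp and backward rate km (k_B = 1)."""
    mean_J = (kp - km) * T
    var_J = (kp + km) * T
    sigma = (kp - km) * math.log(kp / km) * T   # average entropy production
    return var_J / mean_J**2, 2.0 / sigma

# The bound holds for any choice of rates, and it tightens near
# equilibrium (kp close to km), where precision is cheapest.
for kp, km in [(2.0, 1.0), (10.0, 1.0), (1.5, 1.4), (100.0, 0.1)]:
    uncertainty, bound = tur_sides(kp, km)
    assert uncertainty >= bound
```

Algebraically the check reduces to (k+ + k-) ln(k+/k-) >= 2(k+ - k-), which holds for all positive rates; that is the precision-dissipation trade-off in its simplest setting.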
If someone asks you, even if you remember nothing else from these moments of your lives, think of it like this: what is a thermodynamic uncertainty relation? It is a relation between precision and dissipation. If you want to be more precise, you are going to pay thermodynamically, and the payment is made in entropy production. That is one form of it. I told you that different thermodynamic uncertainty relations apply to different settings: we define different settings by imposing different conditions on how the system evolves dynamically in some finite time interval, and each TUR works only for a specific set of physical scenarios. This one works if you are in a non-equilibrium steady state with time-independent driving. By the way, I keep using this term "driving", "driving protocol". Do you know what it means to drive the system? Great, okay. So what we mean by driving a system in a thermodynamic process: I used to have a problem with this because it was so amorphous to me, until I read Chris Jarzynski's paper on the first derivation of detailed fluctuation theorems from the finite-bath formalism, which we are going to discuss at some point. He defines a thermodynamic process as a composition of three components. First, you have a system and heat reservoirs fixed at some temperatures (inverse temperatures beta_i), each characterized by its own equilibrium distribution. That is the first component, and we are talking about a thermodynamic process.
So if you don't couple the system and the reservoirs, you're not going to have anything like that, right? So establishing and breaking coupling with the thermal reservoirs: that's the second part of the thermodynamic process. And the third is what's usually symbolized by lambda: a control parameter, okay? It's basically an external force, an external field, applying a magnetic field. This is what we call a driving protocol, a set of functions of t, little-t time, okay? Though sometimes some sources say the combination of the two is the driving protocol. So when I say we are driving the system, think about two things: establishing and breaking contact with thermal reservoirs, and varying an external work parameter. This third element basically corresponds to what you said: you have an external agent manipulating the system from outside, okay? I bring this up because we keep saying, oh, non-equilibrium steady state, there's this time-independent driving. But what if you want to do something different? Consider other types of driving, such as periodic driving with various conditions, and TURs that work for those scenarios, because this one is incredibly strict. You can't believe how strict it is. Non-equilibrium steady states are cool, but they comprise a really tiny subset of scenarios, okay? Are we out of time? Where does that appear in the equations, for example, on the left? That's hidden behind the board right now. It's an argument of the W. So when people talk about doing work on the system, the way that manifests itself in this stochastic-thermodynamic formulation is through changes of W, and lambda of t is just a way of encapsulating that. So it's actually W of lambda of t.
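This formula isn't written out in the transcript, but in the standard stochastic-thermodynamics convention (as in, e.g., Van den Broeck and Esposito), the work done through the driving protocol \(\lambda(t)\) on a system with energy \(E(x,\lambda)\) is

```latex
W[\lambda] \;=\; \int_0^{T} \dot{\lambda}(t)\,
\frac{\partial E}{\partial \lambda}\bigl(x_t, \lambda(t)\bigr)\,\mathrm{d}t ,
```

so the protocol enters the energetics only through the explicit time dependence of \(\lambda(t)\), which is the sense in which "it's an argument of the W."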
Something different that you can have, if you make this condition a bit more flexible, but not too flexible, is something like this. It's a weaker bound, but the condition is more flexible. What you want to do is see what kinds of physical scenarios you're allowed to apply these TURs to, okay? The first one is the original TUR, the NESS TUR. This one is something I plan to derive, though probably not today, because we are going to use joint fluctuation theorems: joint fluctuation theorems of entropy production and current. So it's a fluctuation theorem of two random variables, okay? The fluctuation theorems that we saw yesterday only involve one random variable, the entropy production. Now we also want to account for some other random variable or variables. We are going to start with two: the entropy production and one current that contributes to that entropy production. When I say contributes, just think about the definition of the current and the definition of a non-equilibrium steady state. Here is the thing: a NESS is identified by non-zero currents, non-vanishing currents, and by non-zero entropy production, okay? These are all the same thing. When you have non-zero entropy production, there are some non-vanishing currents contributing to that entropy production. This is the idea. We want to keep track of these currents. Okay, this one is called the GTUR or FTUR. It first went by the name of generalized thermodynamic uncertainty relation, but I think it's now called the fluctuation-theorem thermodynamic uncertainty relation, because its derivation uses this joint fluctuation theorem of currents and entropy production.
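The board formula is not in the transcript; one commonly quoted fluctuation-theorem-based bound of this type (the exact form and attribution here are my assumption, not the lecturer's) reads

```latex
\frac{\operatorname{Var}(J_T)}{\langle J_T\rangle^{2}}
\;\ge\;
\frac{2}{e^{\langle \Sigma_T\rangle/k_B} - 1},
```

which is weaker than the NESS bound \(2k_B/\langle\Sigma_T\rangle\) because \(e^{x}-1 \ge x\), but it holds for a broader class of dynamics satisfying the corresponding fluctuation theorem.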
Okay, so depending on the time, I'm stopping here to get feedback from you, because I must know where you're standing and how you're feeling, and because I've also started to worry about the homework. How do you feel about the homework? If we're having a hard time with this kind of thing, then maybe I can do extra recitation hours, we can have discussions; I can do many things for you, just ask me, okay? And if you want, we can slow down the schedule itself, so maybe we don't have to cover, say, thermodynamics of computation in all its own profundity; I think it's better to understand these kinds of principles right now. So I want you to give me feedback. You can tell me how terrible of a lecturer I am, that's also allowed. But please give me feedback. How do you feel? What is the pace for you? We saw applied examples, like ion channels and other things, not the general form; that's part of what confused me. Okay, because you had a course on stochastic thermodynamics, but with the bead examples and so on, more like the continuous, Udo Seifert approach, I guess. But okay, that's good. How can I make it easier for you? Do you want to give me your previous lecture notes so that I can match the notation for the upcoming lectures? Okay, you want to come up here, actually? Yeah, I think your two previous lectures were awesome in the sense that we had already seen those subjects, and it was an overview; it was fast, but because we had already seen the subject, I think it was good. This subject is new, so it is a little harder, because we didn't know it. Just slow the pace, and I think it's fine.
I think I have to watch the lecture again, and then I can tell you if you were good or bad. I mean, I have to see it again, because if I didn't catch it because I was too sleepy, that doesn't mean you were bad. So I can tell you later. That's fair, yeah. But when I came to this class today, I assumed you knew how, because we provided you the Van den Broeck and Esposito paper, this holy book. I say it's a holy book; I think he said something along those lines. Its richness is like a book's. I must do it, yeah, I know. So when I was coming to class today, I thought most of the participants would be able to write down, for example, this master equation, and tell me directly, without being shy, how we characterize non-equilibrium steady states and so on. So it surprised me a bit, but I think this must be totally normal. I'm shutting up now. Okay, so a couple of things. It is crucial: everybody, please go through that paper by Christian Van den Broeck and Massimo Esposito. It is very easy to read, it's very clear, and it does not have this kind of material in it, so that's the first thing; it will talk about driving protocols, the stuff that I was interjecting about. See, part of the issue is that there's an assumed language being used here, which is based on you having read that paper at a minimum. If you haven't, then the language we're using doesn't really make sense to you. So please do go through that article. You can also find things online. There's Peliti and Pigolotti; unfortunately, you're not going to be able to find that textbook online. So this is the holy paper. I actually sent it to the Slack channel as well. Okay, yes. There's also a textbook PDF online by Naoto Shiraishi. No, it's not available yet.
It's the first draft. It's not available. Oh, not the final one, but the draft is available. It's pretty close. Yeah. And I would recommend it also. No, David, it's not quite that: it's a whole book, and this is a paper. But if they need it, please, please, please, let them come to me. I will do my best; I will get it. So if you also have questions about things that are not in that paper, take a look at the PDF as well. Now, something else I want to emphasize: you have a resource. Guja is one of the world's experts in thermodynamic uncertainty relations. She's done some very important, good work in it, very brilliant stuff. So exploit her, use her, take advantage of what she knows about these topics. It really is some very profound material, but it will be most effective if you first, at a minimum, have read Van den Broeck and Esposito, that paper. Okay? So, anyway, thanks, everybody. Thanks. Good job. Thank you very much. How is it with conservation in this process? I mean, I don't know. It's not conserved, no. No, of course it's not conserved. So I'll just tell him. No, I mean, it's not conserved. No, no.