I'm just trying to join. Okay, Matteo, can I bother you for 30 seconds? I lost the Zoom info. Do you want to connect? Yeah, exactly. Okay, good. Oh, yeah, this one. Oh, maybe that was the problem. Okay, cool. Okay, thank you very much. Okay, guys. Let me stop first. First of all, one thing, why am I hearing myself? Because I didn't mute myself. No, did I mute myself? Yes. Perfect. Okay. First of all, great homeworks. I mean, it seems so. It made me super happy. You're really interesting people. I was mostly checking, first of all, the answers to question six, and things like this James George question, and I don't know what happened on math picks. So yeah, it made me super happy, and thank you for all your comments and feedback. Okay, do you want the grading first or the second homework first? Grading. Okay, so the grading today, as I told you, probably around 11.50 p.m. or something like that, and then the second homework distributed tomorrow. Okay, guys. So today we're going to start with something new. Up until this point in the lectures, as we emphasized throughout last week, we only studied infinite baths. So we had these idealized reservoirs that are fixed at their own equilibrium distributions, and there was a system coupled to these reservoirs, and all the thermodynamics of the system was induced by the dynamics of these reservoirs. But the way that we modeled these reservoirs came with some conditions. Depending on how you look at it, this is a strict condition or a mild condition, but there was this condition of, for example, time scale separation. Do you remember what time scale separation was? Okay, can you summarize it briefly? There is no wrong answer. Yes? Isn't it something like the processes at some scales are not important, so we can disregard them because we are working at another scale, something like coarse-graining, but with time? Yeah, kind of, yes. Okay, I wouldn't say directly "not important," but rather the difference between mesoscopic rates and microscopic rates: some processes take place at a much larger time scale than the other ones, and that's why we can disregard some of these degrees of freedom. Okay? So this time scale separation also goes by other names: local equilibrium, or the assumption that allows local equilibrium, and what else? The Markovianity assumption. If we didn't have this time scale separation, we wouldn't be able to discuss something like stochastic thermodynamics from the perspective of Markovian systems. Okay? Now, we are going to break that thing. At this point, we don't want to impose that we have stochastic dynamics, Markovian dynamics specifically. So we are going to be talking about the Hamiltonian formulation of stochastic thermodynamics. Okay? So, again, this is just summarizing what I just said: we previously formulated stochastic thermodynamics by assuming that there is a finite system coupled to infinite reservoirs, and it's evolving under a continuous-time Markov chain that is described by a master equation. Okay? Again, I'm using these terms in an interchangeable manner: time scale separation, the Markovianity assumption, weak coupling to infinite reservoirs.
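Just so we have it written somewhere, the master-equation picture we are leaving behind is something like the following (the rate symbols W and the state label x are my own shorthand here, not necessarily the slide's notation):

\[
\frac{d p_x(t)}{dt} \;=\; \sum_{x' \neq x}\Big[\, W_{x x'}(\lambda_t)\, p_{x'}(t) \;-\; W_{x' x}(\lambda_t)\, p_{x}(t) \,\Big],
\]

where the transition rates encode both the infinite baths and, if present, the driving protocol lambda t.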
So for example, what is weak coupling? Do we know the difference between weak coupling and strong coupling? I think it was mentioned in the previous lectures a few times, but we didn't explicitly say what it is. Okay? So, basically, weak coupling is when you literally do not care about the states of the bath. The states of the bath and the changes in the states of the bath, you're not accounting for them; when you carry out the thermodynamic analysis of your system, you're not caring about the degrees of freedom of the bath. Okay? This is weak coupling. Why did you look at me like that? Okay, I was like, because sometimes I mix up ferromagnetic and paramagnetic, and I was like, did I mix up weak coupling and strong coupling? So, okay, this is true. Okay. So, up until this point, collecting all the assumptions that we introduced, the Markovianity assumption, time-scale separation, weak coupling to infinite reservoirs, now we are asking the question: what if we actually include in our thermodynamic analysis an investigation of the degrees of freedom of the reservoirs as well? So, we are saying: let the reservoirs not be infinite, but be finite classical systems. Okay? So, we are going to learn, or at least get an idea of, how to formulate stochastic thermodynamics with this approach: strong coupling to finite reservoirs. And basically, we start by saying that to construct a Hamiltonian formulation, we consider a collection of systems. Okay, so, look, this is your baby universe. You're taking it, you split it into two. Life is not that easy, but basically, on a schematic level, that's what we are doing. There is a universe, a collection of systems. Okay? It's evolving under some Hamiltonian dynamics. What does Hamiltonian dynamics mean? Basically, you have deterministic dynamics, deterministic transformations, or, if you're using quantum words, you can consider a unitary evolution and so on and so forth. And crucially, your dynamics is also what? Reversible. So, if you want to use ergodic-theory words, you can say that you have a map that takes you from one point in time to another; you can take the inverse of that map, and you know where you're going to go or where you were. Okay? So, that is the crucial difference from the stochastic formulation that we did so far. So, you're dividing this universe into two subsets. One of them is the system. We mostly call it the system of interest, the SOI. And the other one, or the other ones, are the reservoirs. Okay? Now, there is a problem with this drawing, because I'm actually going to say some things about the correlation of the reservoirs. So, don't forget that this is only at the schematic level. Okay? So, the whole state space is divided into two, so we can write it as a Cartesian product of two subspaces. Let's say that this is the phase space. So, the take-home messages, because we are just going to dive into some cool mathematical stuff: take-home messages. Now, we can systematically study strong coupling and finite reservoirs. And this is particularly important for some of the lectures that we are going to have this week regarding the computer science aspects.
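One minimal way to write that splitting down (using the symbols we will introduce in a moment, z for the SOI microstate and y_r for the microstate of bath r; the notation is my own transcription):

\[
\Gamma_U \;=\; \Gamma_{\mathrm{SOI}} \times \Gamma_{B_1} \times \cdots \times \Gamma_{B_N},
\qquad
x \;=\; (z,\, y_1,\, \ldots,\, y_N) \in \Gamma_U .
\]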
I'm also noting this because I realized, given the answers to question six, people are mostly curious, I think, about the computer science aspects. So, we are going to have a formal treatment of computational machines in the Chomsky hierarchy by using an extension of this Hamiltonian formalism, okay? And the two most important points: we have been talking up until this point about things like entropy production and dissipation, right? And we have also been pointing out the relations between information-theoretic costs and energetic costs that come in terms of dissipation. But we actually only had an amorphous idea of how they are related. Can we write down dissipation in terms of the information-theoretic quantities that we saw so far? In a way that is similar to how David derived, for example, in his lecture, the two KL divergence terms, if you remember that formula, how we wrote the entropy production for an arbitrary dependence on initial states. So, we are going to have an expression that looks like that, but, I think, even more profound than that. One more important point is that you can actually start from the microscopic description with this Hamiltonian formalism and get an idea of, or a justification for, why this arrow-of-time issue, irreversibility, arises in classical or quantum systems, in physical systems in general. We can discuss whether it's really a complete justification by itself, that's arguable, but at least it gives you an idea. It dynamically establishes how this irreversibility actually occurs in physical systems evolving under so-called Hamiltonian deterministic dynamics. Because we know that it's been a paradox for centuries: if you have a deterministically evolving system, your phase space volume is conserved by Liouville's theorem, so what happens to the entropy, and so on and so forth? So, by using this Hamiltonian formulation of stochastic thermodynamics, I think one of the nicest, cleanest, and most neat justifications of this problem is given. We are going to be discussing that, touching upon that, and it's going to be one of your homework questions actually, with this homework. So, okay, too much talking. So, we can now talk about the building blocks of the Hamiltonian formulation, or the ingredients of the Hamiltonian formulation. The way that I tried to prepare the slides is that we're going to be talking about the definitions first, okay? We need to be really clear with the definitions, and then we are going to have two levels of a mathematical basis. Level one is sort of formalizing these definitions that we are given, and level two is sort of formalizing the assumptions, the core assumptions that we are going to be using for the Hamiltonian formulation, okay? So, first of all, what do we have in this formulation? We have an SOI, okay? It's a finite system. I wrote finite classical, but it might be finite quantum. In fact, this kind of Hamiltonian formulation, and we can also discuss this after the lecture, is something that is much more frequently used in quantum computation, in the open-system modeling of quantum computers and so on and so forth. And so, you have the SOI. And then, you have a number of baths. It can be one bath or multiple baths.
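As a tiny illustration of that paradox, here is a minimal Python sketch, my own toy example rather than anything from the slides: a deterministic, invertible map on a finite state space just permutes probability mass around, so the Shannon entropy of the distribution cannot change, which is the discrete cousin of the Liouville statement.

import numpy as np

def shannon_entropy(p):
    # Shannon entropy in nats, ignoring zero-probability states.
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
n_states = 8
p = rng.random(n_states)
p /= p.sum()                      # arbitrary initial distribution

perm = rng.permutation(n_states)  # a deterministic, invertible "dynamics"
p_next = p[perm]                  # evolve the distribution one step

print(shannon_entropy(p), shannon_entropy(p_next))  # identical values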
How many just depends on your preferences, on what kind of physical system you're modeling. And then, you can drive the system, right? We discussed this: applying, for example, an external force, be it a magnetic field and so on and so forth. So, we also include a work parameter, okay? What we are saying is that when you assume this Hamiltonian formulation, the universe itself, so the collection of systems, obeys some deterministic evolution, okay? So the change in the entropy of this universe, as this collection of systems evolves, is zero. But can we say something about the SOI, for example, how its entropy changes over time? We will try to do that. So, just to specify some things, we keep talking about the work parameter. We, as external agents, researchers, experimenters, and so on and so forth, have this control parameter that we use to manipulate the system, but not the baths, okay? So, there is an underlying distinction that we make when we use the Hamiltonian formalism, and it is that we see this SOI as something that we can manipulate, something that we can observe or access, okay? These baths, we cannot manipulate them. We do not know what they're doing. We do not know how to manipulate them, and so on and so forth. So, the degrees of freedom of these baths, we call them inaccessible degrees of freedom, okay? When it comes to the reservoirs, we are basically assuming that, because we do not manipulate them, they stay in some idealized form. And we assume that before we start analyzing any kind of thermodynamic process, before this SOI starts its dynamical evolution, these baths are prepared in their equilibrium distributions, okay? At some temperatures, and they are fixed like that. And can we change that? We don't change that. What we can do is establish or break the contact, the coupling, between the system and the baths. Another point that I think is important: the baths are finite systems, but we don't assume that they are all coupled to one another and so on and so forth. We basically think that, okay, there's bath one, there's bath two, they do their own thing, just like that. They are not systems like the SOI, okay? And now, the definition of a thermodynamic process. We actually introduced it last week. We care about three things, mainly. The temperatures, basically the thermodynamic potentials at which you prepare the reservoirs. Again, this point is important: you prepare the reservoirs at the beginning of a process, okay? The specification of the coupling between the SOI and the baths. And the time dependence of the work parameter. We said, for example, again from last week, we have this set of functions of time, lambda t. This is something that symbolizes the time-dependent driving force that you're exerting, for example, the time-dependent magnetic force that you're exerting on your colloidal particles, okay? These two and three, these guys over here, this is actually known as the driving protocol, okay? The three of them together, we define to be the thermodynamic process. Now, we can start with the mathematical basis of the Hamiltonian formulation.
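Just to pin those three ingredients down, here is a minimal sketch of how one could package a "thermodynamic process" as data; the names (ThermodynamicProcess and its fields) are my own illustration, not anything from the slides.

from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class ThermodynamicProcess:
    # (1) inverse temperatures beta_r at which each reservoir is initially prepared
    betas: Sequence[float]
    # (2) which reservoirs the SOI is coupled to (the specification of the coupling)
    coupled_reservoirs: Sequence[int]
    # (3) the driving protocol: the work parameter lambda as a function of time
    work_parameter: Callable[[float], float]

# Example: two baths, SOI coupled to both, a linear ramp of the work parameter.
process = ThermodynamicProcess(
    betas=[1.0, 2.0],
    coupled_reservoirs=[0, 1],
    work_parameter=lambda t: 0.5 * t,
)
print(process.work_parameter(2.0))  # value of lambda at t = 2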
So, because we just said a moment ago that what you do with the reservoirs is prepare them initially at some temperature, in some equilibrium distribution, and fix them like that, when you write down that equilibrium distribution, what you do is basically write down the grand canonical distribution for each reservoir. So basically what you're saying is that you have the Gibbs-Boltzmann distribution initially for each reservoir, okay? And building on that, one thing that you're assuming is that at time t equals zero, you have this product distribution over this collection of systems, which also includes the SOI. This is the initial distribution over the system states, and these p_r equilibrium correspond to the initial equilibrium distributions over each reservoir r. And what you're assuming is that if you would like to describe the initial joint distribution over the SOI and the baths, it is a product distribution: a joint distribution that we can write as a product distribution. Okay? Okay, feedback time. How is it going? Good? Okay, good. These are things that we know, so I'm not talking about them. Yes. This is subtle stuff. So, one thing very, very important: we start in a product distribution, exactly as Guruji was saying, but the crucial thing is that the baths are in the Boltzmann distribution for their own bath Hamiltonians, while the system of interest is in an arbitrary distribution. That's going to actually correspond to the computational machine, as Guruji was saying. But then once you have the coupling term, what's happening physically is you sample the product of the p's on the left-hand side there. You sample that, so you get a joint state of the SOI and all the baths. Because there's an interaction Hamiltonian, as that joint state evolves in time, the state of the baths does change. So you can be applying a time-varying work protocol to the SOI or not, as you prefer. You are not doing anything like that to the baths. But nonetheless, the state of the baths does change, and in fact, they get statistically coupled with the states of the SOI. And that statistical coupling, its consequences, is going to drive all the thermodynamics. Okay. I'm just sort of rephrasing. Okay. Yeah. That's great. And don't forget that, for example, we don't know yet what an interaction Hamiltonian is. How does it go? It's located two slides ahead? Yeah. Take that and insert it two slides ahead. Okay. So we will remember that in two slides. Okay. So this is just a general way of expressing this kind of product distribution and so on and so forth. If you want to do it in quantum physics terminology, what you're working with is density matrices. Today, after I present these slides, David will present just a bit, a baby bit, of quantum thermodynamics, where we will get acquainted with how to use some knowledge of quantum thermodynamics to study the Hamiltonian formulation of finite quantum systems. Okay. It's not going to be super in-depth, so we will all be getting it. I also don't know that much about quantum thermodynamics. But yeah, basically this is a very basic thing that we can do. Okay.
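In symbols, the initial condition being described is something like the following (this is my own transcription of what is on the slide, so take the exact notation with a grain of salt):

\[
p(z, y_1, \ldots, y_N;\, 0) \;=\; p_0(z)\,\prod_{r=1}^{N} \pi_r^{\mathrm{eq}}(y_r),
\qquad
\pi_r^{\mathrm{eq}}(y_r) \;=\; \frac{e^{-\beta_r H_r(y_r)}}{Z_r},
\]

with p_0(z) an arbitrary initial distribution over the SOI and each bath r prepared in equilibrium at its own inverse temperature beta_r.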
So I didn't write down the Hamiltonian yet, because we will be talking about, the question was, can the baths have distributions other than grand canonical? So the thing is that initially you want them to be in the equilibrium distributions. And of course, because they are now finite systems, they will be evolving. And as David just mentioned a moment ago, let's say that this is the equilibrium distribution of reservoir r, expressed as a grand canonical distribution at time t equals zero. Okay. We know that in this Hamiltonian formulation the baths are finite systems. They are going to evolve. So this distribution is going to change, right? So, for example, you can calculate this distribution over the state of the bath, not explicitly, because we still say that we cannot access the degrees of freedom of the bath, but theoretically you can do it. And if you take this KL divergence, it's going to be nonzero, because the bath is going to be evolving. So yes, as an overall answer to your question: the baths can have distributions that are different from grand canonical distributions, but not at the initial time. As they evolve, they can. Okay. You're welcome. Thank you. Okay. So yeah, with this one, I didn't write down the Hamiltonian yet. We are going to come to it in one slide. But okay, just emphasizing one point that is really crucial. When we have this reversible, invertible, Hamiltonian, deterministic, and so on and so forth dynamics, the entropy of the joint system, does it change or not? No, exactly. It doesn't change. Liouville's theorem, done. So what we are going to be interested in, in the rest of this lecture, but also in the Hamiltonian formalism in general, is the thermodynamics of the SOI, the system of interest, due to its coupling to finite classical or quantum heat reservoirs: how does this energetics shape irreversibility for the SOI itself? Okay. So we are still interested in the SOI, but now from a totally different point of view than last week. And even though we know that this delta S over the whole universe, as it evolves, is zero, we are going to be talking about what happens to delta S, and also the components of this delta S, because we have entropy production and so on and so forth; what happens to that for the SOI? Okay. So now, what we are going to do is actually, I think, a jump from what we have done so far. We know a bit about the fluctuation theorems from last week. David introduced them: the integral fluctuation theorem and the detailed fluctuation theorem. I discussed in the last two lectures that I presented how, for example, we can write down an integral fluctuation theorem for a random variable, for our interest mainly entropy production, and from that integral fluctuation theorem you automatically write down the detailed fluctuation theorem. We remember all of this, no? Right? Okay. If you don't remember all of it, you can just go back to YouTube, because I think they're all up there right now. So yeah, you can remind yourself how we did these things. And also, of course, what was it? The holy paper. Not the holy book, but the paper, the one by Van den Broeck and Esposito. They derive these fluctuation theorems. So what we are going to do is to derive a detailed fluctuation theorem. When I say detailed fluctuation theorem, by the way, do you know the form?
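To make that remark concrete in symbols (again my own notation): the deviation of bath r's distribution at time t from its initial equilibrium can be measured by

\[
D\!\left(p_{r,t} \,\middle\|\, \pi_r^{\mathrm{eq}}\right) \;=\; \sum_{y_r} p_{r,t}(y_r)\,\ln\frac{p_{r,t}(y_r)}{\pi_r^{\mathrm{eq}}(y_r)} \;\ge\; 0,
\]

which is zero at t = 0 by construction and generically grows once the interaction with the SOI is switched on.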
Do you remember the form of it? The fluctuation theorem? Exactly. Something like that. Yeah. Exactly. Okay, guys. Okay. I'm going to wait for an answer for 30 seconds because you know it. I think you know it. Okay. Let me give you a hint. We were running the movie forward and we were running the movie backwards. You want to talk about in a universal manner what happens to the probability of observing an entropy increase when you run the movie forward, when you compare it to the probability of observing a decrease in the entropy when you run the movie backward. Exactly. So there was a nice break. Yeah. Okay. So basically the mathematical form was something like this, right? And for examples, we used this, a version of it, a strong version of it, which is a strong fluctuation theorem to actually drive a thermodynamic uncertainty relation, right? Yeah. Okay. Okay. I'm not going to do that, but yeah, basically, yeah. So we don't need this tilde over here because we were assuming we were making some assumptions such as that the initial distribution and the final distribution are the same. And we had some time, symmetric driving protocol, and so on and so forth. Okay. Is it because it's Monday or like, okay. Because everyone seems like how I seemed like an hour ago. And then someone, I don't know. Thank you, Matteo. Yes. Someone got me coffee now. Yeah. Okay. So anyways, so yeah, we're going to be driving a detailed fluctuation theorem, but without making any kind of an assumption, underlying assumption that, oh, our system evolves under the stochastic dynamics, or basically we have this time scale separation and so on and so forth. Okay. So when we introduced these ingredients of the Hamiltonian formalism, what we say was that we're considering an SOI, a finite classical system of interest, okay, coupled to a heat reservoir or a number of heat reservoirs, which are also finite. Okay. And we also potentially consider a work parameter lambda. Okay. Now what we're going to do is to say that, let's, okay. I'm introducing some notation over here that we are going to use. Okay. This is basically this microstates of, that are encode, okay. This is something that encodes your microstates in terms of like the momentum and generalized coordinates in your phase space. This is familiar. We all know that because we did classical mechanics and we know Hamiltonian formulation. So this is basically we are taking the Hamiltonian formulation right now and building thermodynamics on that. Okay. And one thing that we consider is that, okay, you can write this z just like that. Okay. This bold z, vector z corresponding to this like this point in the phase space of the system of interest. But of course these finite bats, now because they are not infinite, they are finite real systems that we are considering even though they are not accessible to us, they also have phase spaces that we can talk about. And they have these microstates in their phase spaces. For example, that's why when we write this kind of a formula over here, we can talk about this KL divergence being nonzero. It makes sense to talk about it. Okay. So I'm going to use this bold y or vector y to basically characterize these points, microstates, okay. And what we can do is to define another like something like a trajectory or like a collection of points at some time t to basically encode all of these points that we are considering. Okay. This is, I'm sorry, this is my mistake. 
I think, okay, this is basically encoding, for example, the microstates of the first reservoir up to the Nth reservoir, and so on and so forth. But okay, this is clear. Okay. So now, building on that, finally, we can write down the Hamiltonian. Okay. So one thing that you realize over here is that we defined this trajectory, which, okay, it is sometimes called a trajectory, but I actually don't like to call it a trajectory. It's basically a collection of points that you are considering in this joint global phase space. Okay. Originally, one of the founders of the field, Chris Jarzynski, actually derived this detailed fluctuation theorem that we're going to discuss now, in 2000. Okay. And in that paper and in the following few papers it was called a trajectory, but this was, I think, 14 years before Massimiliano's paper, the Van den Broeck-Esposito paper, where we actually started to talk about trajectories in the CTMC, continuous-time Markov chain, sense. But I still don't like it. We can just consider it as this collection, this set of points in the joint space. Okay. I'm just trying to emphasize this because I don't want you to get confused between this Hamiltonian-formalism trajectory and the trajectory thermodynamics that we have been dealing with since last week. Okay. So we write down the Hamiltonian over all degrees of freedom that compose this universe. Okay. And it has three terms. One of them is the Hamiltonian that defines the internal dynamics of the SOI itself. That's why in this one we have this bold z, right? And this is just notational, but you can also take this lambda t term inside the Hamiltonian. So if you remember, one of the questions that David asked you last week was, for example, when we were discussing rate matrices: if you want to introduce a driving term, like this lambda, where do we include this driving term? We included it in the rate matrix itself, right, in the master equation as we described it in terms of the transition rates. Okay. So this is basically the driving term. Because you are manipulating the SOI and only the SOI itself, you include it in the term of the Hamiltonian that describes the internal dynamics of the system of interest. And then, I'm jumping this because I'm going to ask a question about it. So we have this interaction Hamiltonian. Okay. Yes. It might be time dependent or time independent, just as you wish. And it's going to include, again, because when you have a system coupled to a heat bath, now you can remember what David said about this interaction Hamiltonian and what happens to the SOI as the bath states evolve dynamically. Okay. Now, it makes sense. A lot of the thermodynamics will also be due to this term. It's going to describe the coupling, this interaction between the SOI and the baths. And the reason, again, for example, if you recall one of the things that we said when I told you that there is something wrong with this schematic, this drawing over here, maybe not wrong, but also not great: when we are summing over r, we are summing over the contributions of each bath-system coupling. So the baths are not interacting with each other. Okay. And this one, this is the term that characterizes the internal dynamics of each reservoir itself. Now, question: does it make sense to have it time independent?
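Written out, the three-term Hamiltonian being described is, in my notation (the slide's symbols may differ slightly):

\[
H(z, y_1, \ldots, y_N;\, \lambda_t) \;=\; H_{\mathrm{SOI}}(z;\, \lambda_t) \;+\; \sum_{r=1}^{N} h_{\mathrm{int}}^{(r)}(z, y_r) \;+\; \sum_{r=1}^{N} H_r(y_r),
\]

with the driving parameter lambda t only inside the SOI term, the interaction terms coupling the SOI to each bath separately (no bath-bath terms), and each bath Hamiltonian H_r time independent.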
I mean, it does, because I wrote it like that, and everybody writes it like that. But what might be the reason that we actually express the Hamiltonian for the baths in a time-independent form? Okay. I really can't hear, but: separation of time scales. Let me see if I can push that here. Not quite, I think, but it was good. Thank you. Yeah. Think about how we defined the thermodynamic process. Where is it? Did I? Okay. I didn't write it, maybe here. Okay. So one of the things that we do when we actually initiate this thermodynamic process is that we prepare the reservoirs themselves at fixed temperatures. Oh, yeah. Go on, please. Probably because, if they were isolated by themselves, without the interaction, they would already be in equilibrium, and so their Hamiltonian wouldn't depend on time. So if we don't count V, which means they are all isolated by themselves, each Hamiltonian doesn't depend on time. Exactly. This is like 99.9 percent of it. Yeah. Just one more point to make about that: when you fix this equilibrium distribution and you say that these guys are not interacting with one another and so on and so forth, you're fixing the temperature, and you don't want to mess with the temperature throughout the thermodynamic process. Once it is fixed, it is fixed. Okay. It's an intensive variable. We don't want to mess with the temperature. So that's why we're basically saying that this is really something that we would like to assume when we define the Hamiltonian over the bath. So it's exactly what you said, with this sort of glimpse at the temperature. Okay. So, yes, Matteo. Okay. Now, I'm jumping this because I'm going to hand it to David; he has an insight about this. But basically, one more point before we walk through this slide is that we are emphasizing that the joint dynamics is deterministic, but the induced dynamics over the SOI itself is stochastic: when you have this universe evolving under deterministic dynamics, if you partition it, stochasticity arises. Okay. In one of your homework questions, I will walk you step by step through how we can formalize this mathematically. If you start from a deterministic description of these systems, how can you actually, under some assumptions, some of them strict, some of them mild, even recover this Chapman-Kolmogorov equation, this master equation? Okay. So what we would like to emphasize is that, because we have this partitioning of the state space, the SOI will evolve stochastically, so what we can do is stochastic thermodynamics. Okay. And one thing to emphasize: why do we have stochasticity? We discussed it, but how do we observe this stochasticity? What you do is basically consider an initial microstate, for example, this bold z_A. Okay. You have some initial momenta and generalized coordinates that are fixed after you have these equilibrium reservoirs fixed at their temperatures and so on and so forth, and you start evolving the system. But because you have this stochasticity in your system, you will still, for example, find yourself after some time t at some different microstate. For example, this is the final microstate, one place where you can end up. This is another one, this is another one, and so on and so forth. So your phase space trajectory, as you know it, is going to be of a stochastic nature.
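A tiny numerical illustration of that last point, a toy model of my own rather than anything from the slides: one SOI oscillator linearly coupled to one "bath" oscillator, with the SOI always starting from the same microstate and the bath microstate sampled from a Gaussian, thermal-like, distribution. The joint evolution is completely deterministic, yet the SOI endpoint differs from run to run.

import numpy as np

rng = np.random.default_rng(1)

def evolve(z0, y0, k_sys=1.0, k_bath=1.5, g=0.4, dt=1e-3, t_final=5.0):
    # Deterministic dynamics for two coupled oscillators with
    # H = p_s^2/2 + k_sys q_s^2/2 + p_b^2/2 + k_bath q_b^2/2 + g q_s q_b,
    # integrated with a simple symplectic Euler scheme.
    q_s, p_s = z0
    q_b, p_b = y0
    for _ in range(int(t_final / dt)):
        p_s -= dt * (k_sys * q_s + g * q_b)
        p_b -= dt * (k_bath * q_b + g * q_s)
        q_s += dt * p_s
        q_b += dt * p_b
    return q_s, p_s

z0 = (1.0, 0.0)  # the SOI always starts from the same microstate
for _ in range(3):
    y0 = rng.normal(scale=1.0, size=2)  # bath microstate sampled "thermally"
    print(evolve(z0, tuple(y0)))        # a different SOI endpoint each run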
Okay. And yeah, the reason is, again, one of the things that David mentioned: this stochasticity is allowed because there are different couplings to different reservoirs, and the reservoirs themselves are sampled from different initial distributions, equilibrium distributions. Okay. So, David, how do we define this delta S term? Controversial thing. Two cups of coffee and still no functional brain, I don't know. Somebody's got to work on my genome. So, if you recall from the earlier lectures on infinite-bath stochastic thermodynamics, entropy production, when you have local detailed balance, is also the dissipated work. In other words, it's the work that you do on your system of interest, your physical system, such that as you change the distribution of the physical system to some ending distribution, that takes some work; the dissipated work is the difference between the work that you put in and the amount of work that you could extract back if you return to your original distribution. So remember things like the Carnot cycle. If you are doing a thermodynamically reversible compression of a piston, you start with one distribution over the microstates, you end with a different distribution over the microstates. That takes some work, but then you can always quasi-statically, slowly, go backwards and recover all that work, so there's no dissipated work. If instead you do it too fast, then some of the work gets dissipated. Basically, there's a change of entropy of the state of the system of interest that does not arise as heat flow into the external baths that you can extract on your return leg. That dissipated work, I think we showed this, is equal to the irreversible entropy production. The entropy flow is the actual heat that does go into the baths. The difference between the work you provide and that entropy flow, that is the entropy production. It's essentially the first law of thermodynamics. Notice what I just said: the dissipated work is the difference between the change of entropy of the system of interest and the heat flow into the baths. Now, what is heat flow, always, for any physical system? I'm going to give you a physical system that is my bath, and I start at one particular time where you've got one distribution over the states of the bath, and I end at another time with another distribution over the states of the bath. The Hamiltonian of the bath does not change during that interval. What is the heat flow into that bath in terms of those distributions and the Hamiltonian? By the first law, by conservation of energy. So what is heat, ultimately? Heat flow is not a state variable. It's energy, a change of energy. What is going to be the total heat flow? If I change the distribution over the states of my bath from, say, P0 to P1, and I've got a fixed, known Hamiltonian over the states of the bath, what is the heat flow into the bath, by conservation of energy? Sorry? Push it. What do you mean by "the Hamiltonian is not changing"? You're almost there, but I want you to use different words. Ninety-five percent there, exactly. Yes, the Hamiltonian doesn't change. So what is it that you mean to say? You're almost there. No, that curve is going to be the dissipated work. So in other words, what I'm saying is: let's say there's a single bath, a Hamiltonian reservoir, states of the bath. We start at time zero with some distribution over the states of the bath.
We end at time one with another distribution, and the Hamiltonian doesn't change. What's the total heat flow into that bath? You are almost there. Here's a way to cheat on final exams: if there are very, very few symbols that the question involves, you know the answer has to involve some way of combining all those symbols. So, given these symbols, what do you think? It's the change in the expected energy of the bath, which, by the first law, is equal to the change of expected energy of the system of interest plus the work done on the system of interest. That change in the expected energy of the bath is the heat flow into the bath. Remember, we just said that the dissipated work, the entropy production, is the change in the entropy of the system of interest minus the total heat flow. That's what it means in infinite-bath thermodynamics. So therefore, right here, because Guruji has multiple baths in this particular form, this term is going to be the total heat flow, and so the entropy production will be the change in the entropy of the system of interest minus this total heat flow. I am not seeing a whole bunch of light bulbs going off above people's heads. So what aspect of this is unclear? Okay. Thank you. Sorry. Silence is assent. Okay. So there was a question from the chat: once we integrate the reservoirs out, what kind of noise will influence the SOI? I'm going to give the mathematical description; David, can I ask you to give the pedagogical one? Is there a way to derive its form? Its form, by the way, if we are talking about noise and its mathematical form, I'm going to assume that you're curious about something like the master equation, a stochastic description. In this formulation, or is it still phenomenological? I mean, you can derive it. As I said, this is going to be one of the homework questions. What you're going to do is: you partition the state space, you write down the initial product distribution, you assume something like, again, the Markovianity assumption, and then you write down the conditional distribution of the system given the bath states, you evaluate it at time t and write it in terms of the initial distribution, and I think there are no more assumptions that you need to make. So yes, you can recover, I think, the master equation from the deterministic description in this way. No, only if you're making the Markovianity assumption; in general, it's not going to be Markovian. Yes, but the question is, okay, yeah, this is why you're going to give the pedagogical one. Okay, so I think the first part of the question. To answer the first question: there is no noise anywhere. Rather, phenomenologically, the system of interest evolves, and if you just look at it, it evolves as though there were noise, because you're just looking at it; you're not seeing the distribution over the states of the bath. So you're integrating out the states of the bath, marginalizing down to the system of interest, so it looks like the system of interest is evolving in a noisy process, but that's because you're ignoring all these other degrees of freedom in the bath. And if you think about it, that picture of where noise comes from actually underlies all physical phenomena around you.
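To pin down the symbols in that exchange (this is my own notation, and the signs depend on convention; I take Q_r to be the heat flowing into bath r): since H_r is time independent and no work is done directly on the bath, the heat into bath r over the interval is just the change in its expected energy,

\[
Q_r \;=\; \Delta\langle H_r\rangle \;=\; \sum_{y_r}\big[p_\tau(y_r) - p_0(y_r)\big]\,H_r(y_r),
\]

and the delta S symbol on the slide collects these bath heat flows weighted by the inverse temperatures at which the baths were prepared, so that the dissipated work, the entropy production, is the change in the SOI entropy combined with this total heat-flow term, exactly as described above:

\[
\sigma \;=\; \Delta S_{\mathrm{SOI}} \;+\; \sum_r \beta_r\, Q_r .
\]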
The reason everything here appears to have noise, appears to be stochastic, is precisely because there are many degrees of freedom that you are not paying attention to. If you were to look at the precise phase-space position of the entire set of gas molecules in this room, all the carbons and hydrogens and oxygens and so on, and all the people, then there would not appear to be any noise. It's deterministic dynamics. The only reason noise appears is because there are some degrees of freedom you're ignoring. And what's so super cool about this finite-bath formulation that Chris Jarzynski and others constructed is that it's really formalizing that. Whereas when you do any kind of thermodynamics with an infinite bath, well, there are a lot of degrees of freedom in this room, but not an infinite number of them. So the more standard approach in all of statistical physics is really a little bit more of a hack, whereas this is really what's going on at the fundamental level. So I assume that answers the question of the person online. So that one there, this is unfortunate. What Guruji is doing, okay, physicists, no matter how smart they are, have a nasty tendency to use really bad notation and to conflict with the notation that other people use. Guruji is here using the same notation as in this 2000 paper by Chris Jarzynski. Unfortunately, it's bad notation. He just defines that thing called delta S, and he's then sloppy in saying what it means. He goes forward and derives a detailed fluctuation theorem involving that delta S, but you should actually just view it as an arbitrary symbol. What it is is the total heat flow into all the baths. It's not a change of entropy of any sort. The symbol that he happened to use stands for the total heat flow, and the dissipated work will be the change of entropy of the system minus that beast there. Does that answer the question? Sorry? Yep. Yep. And if you actually read the paper, he even uses confusing language to describe it, like this trajectory definition, for example. Yeah. But okay, so the follow-up question, but yeah, you see it. Can it be described? Do we just assume it's white? Oh, we most definitely do not assume it's white noise. Can't it be described? Yes. The thing that we're assuming is that these degrees of freedom are accessible and these degrees of freedom are inaccessible. So the best you can do with your description is to say that, as you dynamically evolve this universe, you have information irretrievably lost into the SOI-bath correlations. Yeah. So what you can do to describe the noise, and ultimately the way it's done in these papers, is to actually write down the formulas for things like the mutual information between the state of the bath and the state of the system of interest. Surprise: here we have the entropy production, the dissipation term, written in terms of the mutual information between system and bath, as David describes, plus this KL divergence term, the one I just wrote here, of the bath distribution as the bath evolves. So there are two components: the correlations between the system and the bath, and, as in David's first comment about how the bath state evolves through the interaction Hamiltonian, how this induces some irreversibility over the system. Yeah, but the degrees-of-freedom questions, I totally get the interest actually, because they are incredibly tricky, specifically when you want to talk about systems such as computers.
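In formulas, the decomposition being alluded to looks something like the following (again in my notation; this is the kind of expression derived in the finite-bath literature, with Z_t the SOI microstate and Y_t the collection of bath microstates at time t):

\[
\sigma(t) \;=\; I\big(Z_t ; Y_t\big) \;+\; \sum_r D\!\big(p_{r,t} \,\big\|\, \pi_r^{\mathrm{eq}}\big),
\]

the first term being the system-bath mutual information built up by the interaction, and the second the KL divergence of each bath's actual distribution from the equilibrium distribution in which it was prepared.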
For example, if you want to model a computational system such as a Turing machine, and David is going to introduce them, if you want to model it by using this kind of finite-bath Hamiltonian formulation, how are you going to define the system and the bath? You have a computational machine that describes the overall universe, but you're going to partition it, right? Different partitions can correspond to different physical realizations. And the bath degrees of freedom are going to be inaccessible, while the system degrees of freedom are going to be accessible, the things that you can manipulate. So it's going to get even more interesting when David gives the lectures throughout this week, with all these computational machines. Okay. So I need to ask about the time. It's safe. I mean, it's safe.