On Thursday; but we also have time in class to talk about the exercises, or to address some of the questions you may have. Who told me he had questions? I don't see him. Ah, you have some questions about the exercises. Okay, let me show the exercises here. As you have seen, we prepared a lot of material, but in the end we are not going to cover all of it. Well, for example, in exercise 1.3, in the second part, like in the image at the bottom right: could we try to compute the entropies at each step and see the...? Here? Yeah, exactly. Yesterday I tried to compute the entropy at each step and then obtain the work and the heat from the differences of entropy, but I probably don't really understand what the entropy of the memory is and how to compute it properly. In which step does the entropy change, and how do we see it mathematically? Well, I think in this case the global entropy is constant. Can you write it, then? Because for me, in the first step you have -(1/2 log 1/2 + 1/2 log 1/2), and in the second one, why not -(1/4 log 1/4 + 1/4 log 1/4 + 1/2 log 1/2), which is different from the previous one? The entropy, or where? The entropy at that point. Like, when you insert a barrier; I don't know, if I think of the Gibbs paradox, for example: you start with a box, you put a barrier in between, and you change the volume. You started with an entropy k log(V1 + V2) and you go to k log V1 + k log V2, which is different. So does inserting a barrier change the entropy here? I think here nothing changes, no? But the free energy, can you show it... but exactly, also talking about the entropy of the memory.
Okay, anyway, let me tell you about the philosophy of these exercises, because I think it helps to have the idea. Most of the things we are studying about the Maxwell demon are limited to isothermal processes, so there is always a bath, no? And when we talk about the Szilard engine, for instance: when you expand, the work is the integral of P dV between the initial volume and the final one, and if you use the ideal gas equation, this is just kT log of the ratio of the volumes. Well, since W is the work that I put in, there is a minus sign; the way I remember it is: if I compress, the work is positive, because it's work that I'm doing. We did this the first day, and it was a bit confusing, because what is an irreversible compression of a single-molecule gas, and so on. And it is very limited to just one example, the example of one particle in a box. But with this you can do a lot of things, like the eraser and so on, so you can explore a lot; this is why we like it and why we have the first exercise. The first exercise is to show that with this idea of the Szilard engine and just one formula, you can explore a lot of things: erasers, several compartments, whatever. But it's very limited, because it's just one particle in a box. So the second exercise tries to convince you that the same formula can be applied in more general situations, like the Landauer eraser. For this, notice that so far these are volumes in real space: the volume occupied by the particle in the Szilard engine, for instance.
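As a quick numerical check of that one formula, here is a minimal sketch in Python (units with kT = 1 are my choice, not from the lecture notes):

```python
import math

def work_on_gas(kT, v_initial, v_final):
    """Work done ON a one-particle ideal gas in an isothermal volume
    change: W = -kT * ln(v_final / v_initial).
    Positive for a compression (v_final < v_initial)."""
    return -kT * math.log(v_final / v_initial)

# Compressing the Szilard box to half its volume costs kT ln 2,
# the same kT ln 2 that one bit of information lets you extract.
print(work_on_gas(kT=1.0, v_initial=1.0, v_final=0.5))  # ln 2 ~ 0.693
```

The sign convention matches the lecture: compression costs positive work, expansion gives it back.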
This exercise tries to generalize the formula to general systems, where this calligraphic V is different: it is a volume in phase space. In the end, the volume in phase space is related to the physical volume in real space, but here it is more general. And actually this second exercise is in the spirit of the Landauer argument that I explained on Tuesday last week. Remember that we considered the volume of the system and the volume of the bath. When you erase, you divide the system's volume by two, so, by Liouville's theorem, which tells you that the total volume in phase space is constant, you have to multiply the bath's volume by two. Then, using the Boltzmann entropy, which is k times the log of the volume, you get the change of entropy of the bath, and with the Clausius relation you get the heat. So these are the three things that we use: Liouville's theorem, the Boltzmann definition of entropy, and Clausius. For the Landauer principle, and actually in Landauer's original paper, you use these three things. Of course, you can criticize this: why are you using these? Okay, because they work, at least in equilibrium. For the bath this is fine: a thermal bath is a big thing, in the thermodynamic limit, in equilibrium, so all three hold. And the Boltzmann entropy is equivalent to the Shannon entropy here, because we are in equilibrium. So these are the three things we used in the Landauer principle, and exercise two tries to do the same generically: you can replace the factor of two by a general ratio, an alpha, and that's it; this is essentially exercise two. Well, we don't actually use an alpha, but it is the ratio of the final volume in phase space to the initial one.
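Written out compactly, the three ingredients of that Landauer argument are (this is my summary notation, not the notation of the notes):

```latex
\begin{align*}
 &\text{(Liouville)} & V_{\mathrm{sys}}\,V_{\mathrm{bath}} &= \text{const}
   &&\Rightarrow\ V_{\mathrm{sys}}\to\tfrac{1}{2}V_{\mathrm{sys}}
      \ \text{forces}\ V_{\mathrm{bath}}\to 2V_{\mathrm{bath}},\\
 &\text{(Boltzmann)} & S &= k\ln V
   &&\Rightarrow\ \Delta S_{\mathrm{bath}} = k\ln 2,\\
 &\text{(Clausius)} & Q &= T\,\Delta S_{\mathrm{bath}}
   &&\Rightarrow\ Q = kT\ln 2 .
\end{align*}
```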
So this little toy model of the Szilard engine, which uses a very simple formula, can be extended to any physical system obeying Liouville's theorem; that is the idea of exercise two. And exercise three is just an application of this: once you have the formula, you apply it to this process. Here, the only step where you have some work, and therefore some heat, is this one. So you just have to compute the work needed to do this step, and then check that this work is given by the Shannon entropy of this state minus the Shannon entropy of that one. That is the idea. You don't have to compute the free energy; the free energy comes later. This is chapter one, and at that stage we were not supposed to know the free energy yet, okay? Lesson one is meant to work with the elementary concepts of statistical mechanics, which are these three formulas: basic mechanics (Liouville's theorem), basic statistical mechanics (the Boltzmann entropy), and basic thermodynamics (Clausius). You are supposed to do these exercises with just that. Of course, this is a preparation, because when we study the non-equilibrium free energy and so on, we can use what Leah explained yesterday to give a more modern interpretation, let's say, in terms of differences of free energy and so on. So the exercises of lesson one just explore information processing using the tools of statistical mechanics and thermodynamics, and then, once we have more tools, non-equilibrium free energy, Shannon entropy and all these things, we can reinterpret those exercises. That is the idea: you see, we revisit exercise 1.1, and I think exercise 1.3 as well.
So here you have to reinterpret those exercises using the new tools from the thermodynamics of information, okay? That is just the philosophy of the exercises; then you have specific questions on how to calculate things, but this is the philosophy. Sir, but in the example from before, in the same exercise: if, as you said, the free energy is the same and the entropy changes when you insert the wall, doesn't that mean that the internal energy has to change accordingly? Otherwise, how can the free energy be constant under a change of entropy, at the same temperature, if the internal energy does not change as well? No, you don't need that. In this exercise, for instance... this exercise is the implementation of a Boolean function. You have a Boolean function, you want to implement it, and this is an implementation; actually a reversible implementation, and you have to generalize it to any Boolean function. This is a bit difficult, but it's nice: just using these three ingredients, you can compute the cost of implementing any Boolean function in any system, because the argument is generic. Now, here: inserting this wall or removing it doesn't cost anything. Only the entropy changes, not the energy. Moving this one costs you this energy, which is heat that you have to dissipate, and this is also interesting: you derive it using Liouville's theorem. You say: my volume in phase space reduces, so the bath's volume must grow, and you conclude that there must be some energy going into the bath. Where does this energy come from? It could come from the system, but in this case all the microstates can be iso-energetic, they can all have the same energy. So the energy comes from the external agent that moves the wall: it is work.
So essentially, in these examples, and this is my point of view, of course you can cook up models with different energies and so on, but essentially what you have is a system where the energy doesn't change. It's a bit like the symmetric memories that you have in a computer or in DNA. There is a work and there is a heat, and the heat is minus the work; or if you like, the work is minus the heat, and it's positive. So there is a flow of energy that is immediately dissipated, which is more or less what happens: if you compress here, you put in energy, and it goes straight out as heat, even in an ordinary gas. I was talking about the step where we just insert the wall. Ah, that has no cost. But before you said that the free energy stays the same while the entropy changes, at the beginning, when Matteo asked this. Why do we there consider the free energy of these states and not the internal energy? Okay, when the questions are very specific, it's better to solve them in the afternoon; Leah and I are around, in the room over there, and you can ask us. But this philosophy of the exercises is what I wanted to convey. So shall we move on? We are still in lesson three; sorry for these jumps, but I have to finish the thermodynamics of nano machines, which is what we started yesterday. Lesson three was about discrete systems: we first analyzed the master equation and things like that. And now we have a system that can exchange things with its surroundings: there is a bath, there is a reservoir, there is an exchange of energy and an exchange of particles, let me put it like that, and eventually we can have an external agent as well.
Although, for molecular motors in biology and so on, the final goal, let's say, is to consider autonomous motors. Autonomous in mechanics means that the Hamiltonian is independent of time, so there is no external agent; in an autonomous system the energy is conserved. Okay, and yesterday we saw this example, which is rather generic. We have a jump in energy, a delta E, and we consider the possibility that this jump is mediated by ATP. So we can write this as: state i plus ATP goes to state j plus ADP, and of course, whenever you have a reaction, you can have the reverse reaction. And we discussed that the detailed balance condition in this case is given by the free energy of the reaction, which includes the chemical potential of ADP minus that of ATP. We also write this as delta E minus delta mu, where delta mu is mu_ATP minus mu_ADP. Somebody was confused: why does this delta mean final minus initial? Shouldn't delta mu be mu_ADP minus mu_ATP? It's just a notation problem: in this type of motor, we usually call delta mu the free energy delivered by the fuel, the fuel being ATP, and this is positive; it means that ATP has more chemical potential than ADP. Remember that the chemical potential is E minus TS per particle, the free energy per particle. Actually it's the Gibbs free energy, so we should include a PV term here; but this is only important if there is a change of volume, if one of these species were a solid and another a liquid or a gas or something like that, so we can forget about it. And we discussed yesterday that, for ATP and ADP, the energy is not so important: delta mu is around 14 kT, and what makes it 14 is actually the entropy, which has to do also with the pH of the environment and so on.
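In formulas, the chemically driven detailed balance condition discussed above reads (with my sign conventions, chosen to match the delta E minus delta mu in the lecture):

```latex
\frac{\omega_{i\to j}}{\omega_{j\to i}}
 = \exp\!\left[-\frac{(E_j - E_i) - \Delta\mu}{kT}\right],
\qquad
\Delta\mu \equiv \mu_{\mathrm{ATP}} - \mu_{\mathrm{ADP}} \approx 14\,kT ,
```

so a positive Δμ biases the forward reaction even when the bare energies E_i, E_j alone would not.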
Okay, so when you have this type of system... I've uploaded these notes, and I want to use the same notation as in the notes. In general, this quantity is the change of free energy in the environment, in this case the chemostats, due to this jump. This is minus delta mu in this case, no? Now this delta F has the correct sign that somebody asked me about: it is the change of free energy in the environment due to the transition from i to j. Okay, so what we see is that because of this exchange of particles... this could even be electrons: if somebody is studying quantum dots or any electron transport, you have electrodes, then a device, and the electrons come from Fermi gases acting as thermal baths, and you have the same thing. It can also be photons. In general, there is an exchange of something that changes the free energy. Remember that for thermal baths, which only exchange energy, the free energy of the bath never changes, because the free energy is a function of the temperature and, by definition, a thermal bath has a constant temperature. So the free energy of a thermal bath doesn't change. But for a general environment, if you remember the basic equation of thermodynamics, dS = dE/T + (P/T) dV - (mu/T) dN: when the environment changes its number of particles, its entropy changes as well. So the entropy of the environment changes as follows: minus the sum over all the species, which could be electrons, photons, whatever, of the chemical potential mu of each species divided by T, times the change in the number of particles.
So for instance, in the case of the ATP reaction, in each transition dN is minus one for ATP and plus one for ADP, so the change in free energy is given by delta mu; but this is general. Okay, in thermodynamics, as I've said before, the heat in a process is defined as the energy that modifies the entropy of the environment. So if I put this here, you see that this term is delta F of the environment divided by T, and the heat, multiplying by T, is delta F_env minus delta E_env. This is also kind of trivial, because delta F_env is delta E_env minus T delta S_env, so we could have written this formula immediately, you know? But you see that before, for a thermal bath, delta S is delta E divided by T, the Clausius formula, so delta F is zero. Now, however, we have the exchange of particles, so delta F is no longer zero. This means that the heat is not all the energy that is transmitted from the bath. This is basic thermodynamics, but I think it's good to recall it. The environment introduces an energy, minus delta E_env, which is Q minus the change of free energy of the environment. This is the total flow of energy from the environment to the system, and not all of this energy is heat, only part of it. So what is the other part? It is a transfer of energy from the environment to the system which does not change the entropy of the environment, and this is work; we call work any flow of energy that doesn't change the entropy. This particular one is called chemical work. You will see in a moment; we can erase this.
So now we have defined the chemical work as minus delta F of the environment, with a minus sign. And the energy that the environment introduces into the system, let me use this notation, which is minus delta E_env, is Q plus the chemical work. Then we have a first law: the total change of energy of the system (when there is no sub-index, it refers to the system) is the work done by the external agent plus the energy that comes from the environment. So we have two types of work. The work done by the external agent is the usual mechanical work; the idea is a piston, but it can also be a field, whatever the external agent can change, okay? And then you also have a second law, here for a single temperature; if there is more than one temperature, it's more complicated. I can use that Q is delta E minus W minus W_chem, so T delta S_env equals W plus W_chem minus delta E. Then I compute the total entropy change, which is called the entropy production: the entropy of the system plus the entropy of the environment. Multiplying and dividing the whole thing by T, I get T delta S_tot = T delta S + W + W_chem - delta E. And delta E minus T delta S is the free energy of the system, so T delta S_tot = W + W_chem - delta F, and because of the second law this is positive. So this means that the total work, W plus W_chem, is bigger than delta F. This is the second law when you have a system in contact with a single thermal bath that also exchanges particles, which is the case in most biological models.
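The chain of identities above can be collected in one place (my compact restatement of the derivation just given):

```latex
\begin{align*}
 \Delta E &= Q + W + W_{\mathrm{chem}},\qquad
 Q = -T\,\Delta S_{\mathrm{env}},\qquad
 W_{\mathrm{chem}} = -\Delta F_{\mathrm{env}},\\[4pt]
 T\,\Delta S_{\mathrm{tot}}
  &= T\,\Delta S + T\,\Delta S_{\mathrm{env}}
   = T\,\Delta S + W + W_{\mathrm{chem}} - \Delta E\\
  &= W + W_{\mathrm{chem}} - \Delta F \;\ge\; 0
  \qquad\Longrightarrow\qquad
  W + W_{\mathrm{chem}} \ge \Delta F .
\end{align*}
```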
So the concept of chemical work is important here to analyze all this. Actually, for instance, when people in biology and biophysics define efficiencies: in a thermal motor, the efficiency is the work divided by the heat taken from the hot bath. Although this has a historically important role in thermodynamics, efficiencies are somewhat arbitrary definitions, conventions. And in molecular machines, the efficiency is usually the mechanical work W, which is typically negative, so the absolute value of W, divided by W_chem; and W_chem, the chemical work, is the free energy provided by the fuel, if you like. Okay, this is basic thermodynamics, but I think it's worth recalling, because when you study thermodynamics, sometimes you don't appreciate it. And it is also very important that this is the definition of heat. Why is this the definition of heat? Because this is what allows me to derive the second law in this form. So I think the best definition of heat is: any energy flow that changes the entropy of the environment. And this is why, with this chemical work, the reservoir can introduce 20 joules here, but maybe only 10 joules are real heat, and the other 10 joules are energy that doesn't change the entropy of the environment. Because of this equation, or if you like, because delta S_env, going back to the definition of free energy, is (delta E_env - delta F_env)/T. This is why the heat is just this part: this is the total energy flow from the environment to the system, but you have to subtract the free-energy part.
In the case of a single thermal bath that doesn't exchange particles with the system, this is zero; this is what I said before: the free energy of a thermal bath is a function of the temperature only, it's -kT log Z. A thermal bath is something that gives energy, but precisely the energy that the thermal bath releases is exactly matched by its change of entropy, delta S = delta E / T, so delta F is zero for a thermal bath. But if the bath, which is then no longer just a thermal bath but a reservoir, is putting in energy and also exchanging particles, then this delta F is different from zero. So the Clausius equation, it's a way of speaking, is no longer true for the bath if there is an exchange of particles; or rather, the Clausius equation is valid if you subtract precisely the chemical work from the total energy. I don't know if this is clear: there is a bunch of energy, but only part of it is responsible for the change of entropy in the reservoir. Okay, this is very important, and the concept of chemical work is very important for molecular motors. For instance, when you define the efficiency as the ratio between the mechanical work and the chemical work: in a cycle, delta F is zero, so this one is usually negative, this one usually positive, and minus W must be smaller than W_chem. So the efficiency, the ratio between them, is always smaller than one. Okay, so now we can go back to this, and if we take the logarithm here and multiply by kT, we get E_i minus E_j minus delta F_env. Sorry, Juan, why do you say that minus W is always less than W_chem? Okay, the second law is this one: W plus W_chem is bigger than the change of free energy in the system, and when you have a motor, either in a cycle or in the stationary regime, that change is zero.
Remember, everything without a sub-index refers to the system. So in a steady state, delta F is zero; in the steady state this also holds per unit of time. And then W plus W_chem is bigger than zero. Usually you have an energy that comes in, so W_chem is positive, and W is negative. This is one possible scenario; there are many, but this is the scenario where your system is pushing: you have a molecular motor, the motor is consuming ATP and pushing against a force. This is a typical experiment. And the efficiency is defined as minus W divided by W_chem; this inequality, if you divide by W_chem, tells you that the efficiency is smaller than one. And actually the deficit is the entropy production: T times the entropy production is W plus W_chem, right? Okay, so now we can complete the thermodynamics of discrete systems described by a master equation, because the master equation is compatible with the second law. This is not a proof of the second law; sometimes this is confusing, because sometimes I am proving the second law and sometimes I am using it. What we just did is use the second law: the second law means that delta S_tot is bigger than zero, and we asked what its consequence is for the energetics of an isothermal motor. That is not a proof of the second law; we take the second law as valid. Now we are going to prove, not the second law, but that the detailed balance condition implies the second law, which is actually what we already did without the exchange of particles, so it's very easy. And this, sorry, this is E_i minus E_j; remember how this goes? And now we have some exchange of particles from A to B or whatever.
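In the steady-state version, per unit time, the efficiency bound just stated can be written as (my notation with dots for rates; the transcript only gives the integrated form):

```latex
T\,\dot S_{\mathrm{tot}} = \dot W + \dot W_{\mathrm{chem}}
\quad(\text{steady state: } \dot F = 0)
\qquad\Longrightarrow\qquad
\eta \equiv \frac{-\dot W}{\dot W_{\mathrm{chem}}}
 = 1 - \frac{T\,\dot S_{\mathrm{tot}}}{\dot W_{\mathrm{chem}}} \;\le\; 1 ,
```

with equality only in the reversible limit of zero entropy production.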
And E_i minus E_j, you see, is minus the change of energy in the environment; let me write it for a transition from i to j. If I jump from i to j, the energy of the system increases by E_j minus E_i, so I take that energy from the environment: the energy of the environment decreases by E_j minus E_i. And now I have E minus F: F is E minus TS, so E minus F is TS. So this combination is T times the change of entropy of the environment when I jump from i to j. So in the detailed balance condition, what you have in the exponent is always a change of entropy: kT times the log of the ratio of rates is a change of entropy of the environment. Now remember the proof that we did, I think on Friday. We took a system with a master equation: the derivative of p_i is the sum over all j of the currents J_{j to i}. That is the master equation. Then we looked at how the Shannon entropy changes: the derivative with respect to time of minus k times the sum over i of p_i log p_i. We calculated the derivative; there were two terms, one term cancels because of normalization, and then we rearranged everything using the master equation, and we found that this was k times the sum over all pairs of states of the current, from i to j, times the log of p_i over p_j (from i to j or j to i, it doesn't matter). This is something that we derived, I think on Friday. So now we compute the entropy production per unit of time, which is the change of entropy in the system plus the change of entropy in the environment. Now look what happens.
This is k times the sum over pairs of the current from i to j times the log of p_i over p_j. And now we have the environment: this is the change of entropy in the environment due to the jump, so we add, for each transition, k times the log of omega_ij over omega_ji, the rates, which can depend on time because we have an external agent. It's just the same as what we did on Friday, but now with this extra term, okay? And if you put everything together, you have that the total entropy production rate is k times the sum over pairs of J_{i to j} times the log of p_i omega_ij over p_j omega_ji, and this is bigger than zero, because remember that J_{i to j} is precisely the first number, p_i omega_ij, minus the second, p_j omega_ji. So whenever the current is positive, the first number is bigger than the second and the log is positive; whenever the current is negative, the log is negative too; and only when the current is zero is the ratio one. So every term is non-negative. This is, if you like, a proof of the second law from the master equation; but I don't call it that, it is a proof that the master equation is compatible with the second law. But now we have this extra term, and remember that this term is the chemical work: in each transition, the fuel can introduce this chemical work, which drives the system out of equilibrium. We will see, today if we have time or tomorrow, a specific example of this type of motor. This is the theory of discrete systems in contact with thermal baths and particle reservoirs, with thermostats and chemostats. The modern word for this type of reservoir that provides particles is chemostat. A chemostat is the analogue of a thermostat; and what is a thermostat?
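That non-negativity can be checked numerically for a single pair of states; here is a minimal sketch with invented rates and k = 1 (the numbers are my assumptions, not from the lecture):

```python
import math

def entropy_production_rate(p0, p1, w01, w10, k=1.0):
    """One pair of states: dS_tot/dt = k * J * ln(p0*w01 / (p1*w10)),
    where J = p0*w01 - p1*w10 is the net current from 0 to 1.
    Each pair contributes a non-negative term."""
    J = p0 * w01 - p1 * w10
    if J == 0.0:
        return 0.0
    return k * J * math.log((p0 * w01) / (p1 * w10))

# Driven case: nonzero current, strictly positive entropy production.
print(entropy_production_rate(0.5, 0.5, 2.0, 1.0))
# Stationary case, p0*w01 == p1*w10: zero current, zero production.
print(entropy_production_rate(1/3, 2/3, 2.0, 1.0))
```

Flipping the sign of the current flips the sign of the log as well, so the product stays non-negative either way, exactly as argued above.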
A thermostat is something that releases or absorbs energy without changing its temperature. A chemostat is something that releases or absorbs particles without changing the chemical potential of those particles. You can imagine it as a kind of reservoir of particles that can give and absorb particles without changing mu. Of course, it's an idealization: whenever a system releases particles, its chemical potential changes, but if the system is big enough, this change is very small. It's the same as with a thermostat: whenever you release energy, your temperature changes, you cool down; but we assume a thermal bath is a system whose heat capacity is so big that you can neglect the change of temperature when it absorbs or releases energy. Okay, so now we have all the tools, the framework, let's say, to study molecular motors. And what does this have to do with information and the Maxwell demon and so on? Well, because some of these molecular motors can be interpreted as information motors, and the key concept for that is something called the information flow. This is what we are going to study. So now we finish this lesson. Lessons four and seven are what Leah explained yesterday: the application of free energy to information devices, the explanation of the Maxwell demon, mutual information as a measure of correlations, and so on. And now we jump to lesson eight, which is information flows. The idea is to have systems composed of two subsystems, a system X and a system Y; they can eventually be connected to thermal baths and so on. We know from mutual information that the entropy of the whole system can be decomposed with this formula, and we can do the same with the non-equilibrium free energy if we like.
For the non-equilibrium free energy, we have F of X plus F of Y plus kT times the mutual information. So the mutual information tells us how the correlation between the two systems affects the entropy, the free energy, and so on. For a single temperature, and this is a kind of parenthesis because we are going to use this formula, remember that the free energy of X and Y is the energy minus TS; so if there is no interaction energy, you get this kT term, as we said. The upper formula is actually the same as the lower formula. And if you think of the explanation that Leah gave us yesterday of the Maxwell demon and the Szilard engine, this formula is everything; it covers everything. Once you have it, you have a second law: the second law tells us that the total entropy production per unit of time is the derivative with respect to time of this entropy (now X and Y can depend on time) plus the change of entropy of the environment. This means you have the entropy of system X, plus the entropy of system Y, minus the mutual information between X and Y, plus S of the environment, and the whole thing is greater than or equal to zero. And this formula, this entropy production, explains everything. It explains, for instance, the Landauer principle. You can use this formula or its integrated version; and if you integrate it over a Landauer process... well, in a Landauer process you only have one system, not two, so the Landauer case is more trivial: everything is zero except the system entropy and the environment term, and then you only get that the environment must compensate the change of entropy of the system.
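As a sketch of that decomposition, here is the mutual information computed for a hypothetical correlated joint distribution of a one-bit system X and a one-bit memory Y (the probability values are invented for illustration):

```python
import math

def shannon(probs):
    """Shannon entropy in nats, skipping zero-probability entries."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) after a (noisy) measurement.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals of X and Y.
px = [sum(v for (x, _), v in pxy.items() if x == i) for i in (0, 1)]
py = [sum(v for (_, y), v in pxy.items() if y == j) for j in (0, 1)]

# S(X,Y) = S(X) + S(Y) - I(X;Y)  =>  I = S(X) + S(Y) - S(X,Y) >= 0.
mutual_info = shannon(px) + shannon(py) - shannon(list(pxy.values()))
print(mutual_info)  # positive: the correlation lowers the joint entropy
```

For a product distribution the same computation gives zero, which is the "no correlation, no information" limit.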
So in the Landauer eraser, the entropy of the system decreases, and the environment must compensate this by dissipation. The Szilard engine is what Leah explained yesterday: when you measure, you create a correlation, so this term changes by minus something, which must be compensated by heat here; and in the feedback it's the other way around — the correlation term is negative because you destroy correlations, and this allows you to reduce the entropy of the environment, extracting heat or something like that. How can these two systems depend on each other — how can the mutual information be non-zero — without interaction? No, you need some interaction, but here we neglect the interaction energy; if not, you have to add the interaction term. Of course there is an interaction, and in the case of the Szilard engine, for instance, the assumption is that before the measurement and after the measurement the interaction energy is zero. So when you integrate this — because all these are exact differentials — you get the interaction energy after minus the interaction energy before, and you assume both are zero: the interaction is switched on to measure and then vanishes. So you have to assume that? You have to assume it; otherwise — because to measure, you have to put the two things in contact. Yes, otherwise you have to keep everything coupled, and then this is valid only once you decouple the systems. But what I want to stress here is that everything we have studied is summarized by this formula — or, for isothermal processes, by this one. This is why the whole story is a bit disappointing: we started with very attractive things, intelligent beings and demons and the eraser and mysterious things, and finally everything is explained by this, which is, in the end, just this one formula.
But as I mentioned on Friday, when I wrote this review with Takahiro Sagawa and Jordan Horowitz, the conclusion for them — and for me — is that the thing still has a lot of mystery. And on Thursday I will give my last lecture, because in this course I have also learned things: I have thought a lot about the problem again, with your questions and so on, and I have a new vision of what is going on. But it's true that the Szilard engine, the Landauer principle, everything we have studied so far is summarized by this equation, which is a bit stupid, but okay. Now we are going to apply this equation to the discrete systems and the master equation that we have seen before, because there it tells you something new, and this is interesting. So we are going to have system X and system Y, and we are going to use this for discrete systems. X and Y are discrete systems and they can be in contact with some environment — the environment of Y and the environment of X. Of course, they could share a single environment, but let us keep them separate, and there is a connection between the two. We are going to analyze this type of scenario using information. For that, consider this term first. As I said, the Szilard engine, for instance, can be explained just with this equation. But there is something interesting: whenever you measure, the mutual information increases, and whenever you do feedback, as Leah explained yesterday, the feedback must be something that reduces the mutual information. Let's put it like that: measurement increases I and feedback decreases I, the mutual information. And this is somehow the whole story. The eraser is different: in the eraser you just want, for instance if Y is your demon, to restore the demon to its initial state, and for that you don't need two systems. But this is the idea.
And everything we have studied, even though it was so mysterious at the beginning, is just a question of creating correlations in the measurement and destroying correlations in the feedback. As I said, if this term is positive in the measurement, this one must be positive, so you must dissipate heat; if this term is negative, then you can extract resources — heat, energy — from the thermal bath. Okay, the idea is to study this with the master equation. This is what we are going to do now, and at the end we will introduce the information flow. So we are going to have an equation; I will now write p(x, y; t). All these things depend on time because the probability depends on time. So now we have my master equation, which is all the jumps from a state (x′, y′) to (x, y) — this is the incoming flow; remember, this is the J that I used before, where we had gamma from j to i — minus the outgoing flow. This is the master equation. And we are going to do exactly what I did before with the chemical work and so on: we consider autonomous systems first, and we derive and analyze this equation for those systems. Well, first, to make ideas clearer, suppose that you have your system — this is actually an example that we will see tomorrow in detail — where X is the position of a particle. You have a motor; the motor consumes ATP, releases ADP, and moves in one direction. So X is the position; the space, of course, must be discrete, so the motion proceeds in steps. And actually this is what happens in biological motors.
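The master equation being described — incoming flow minus outgoing flow over all jumps — can be written out as follows; this is a reconstruction in the notation of the lecture:

```latex
\frac{d\,p(x,y;t)}{dt}
 \;=\; \sum_{x',y'} \Big[\, W_{(x,y)\leftarrow(x',y')}\; p(x',y';t)
 \;-\; W_{(x',y')\leftarrow(x,y)}\; p(x,y;t) \,\Big],
```

where W denotes the transition rates (the gamma from j to i used earlier). The first term collects the probability flowing into (x, y); the second collects the probability flowing out.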
Many of these motors work on DNA, for instance, and the steps are discrete, so your motor moves like that. And Y could be some internal state of the motor. So X is the position, and Y is some internal state. Of course, the motor needs something to move — legs, say: if I am a motor with legs, I need to put one leg forward and then the other, and this is a change in conformation. Some proteins don't literally have two legs, but they do have conformations, and that is an internal state. And usually the ATP–ADP reaction changes the internal state; it's not just pushing, because the ATP cannot push. It can change the internal state, but that's it. So usually the environment of Y is the ATP–ADP bath, and the environment of X is diffusion — a random walk, a thermal environment. This is the scenario. We can have an external agent, and if we have an external agent, we can solve this equation and calculate these quantities. For instance, P_x(t) is the marginal — we just sum over y — and then we can calculate S_x as minus k times the sum of p log p, then this derivative, then this one, and also this one, because this is the chemical work — all the quantities that we have. So we can apply this, calculate all these terms, and interpret them as measurement, as feedback, et cetera. Of course, with an external agent there may be a cycle, because my external agent is a modulation — this could also be a chemical motor where the external agent is light that creates a modulation. And within the cycle I can have some measurement and some feedback, exactly as in the Szilard engine. So you can reproduce the Szilard engine within this framework, okay? If I have an external agent, I can do this.
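As a concrete sketch of "solve the master equation and compute the marginal entropies", here is a toy bipartite model with two-state X and Y; the rates are invented for illustration (they are not the motor rates from the lecture), chosen so that Y "senses" X and correlations build up:

```python
import numpy as np

def entropies(P, k=1.0):
    """Given a joint distribution P[x, y], return (S_x, S_y, I) where
    S = -k * sum(p log p) and I is the mutual information (nats for k=1)."""
    def S(q):
        q = q[q > 0]
        return -k * np.sum(q * np.log(q))
    Sx, Sy, Sxy = S(P.sum(axis=1)), S(P.sum(axis=0)), S(P.ravel())
    return Sx, Sy, (Sx + Sy - Sxy) / k

# Euler integration of the master equation. Rates (illustrative):
# x flips at rate 1; y jumps 0 -> 1 at rate 1 + 2x and 1 -> 0 at rate 1.
P = np.array([[1.0, 0.0], [0.0, 0.0]])   # start in (x, y) = (0, 0)
dt = 1e-3
for _ in range(20000):                    # integrate to t = 20 (near steady state)
    dP = np.zeros_like(P)
    for x in range(2):
        for y in range(2):
            ry = 1.0 + 2.0 * x if y == 0 else 1.0   # y-jump rate out of (x, y)
            dP[1 - x, y] += P[x, y]                  # gain from the x flip
            dP[x, 1 - y] += ry * P[x, y]             # gain from the y flip
            dP[x, y] -= (1.0 + ry) * P[x, y]         # total loss
    P = P + dt * dP

Sx, Sy, I = entropies(P)
print(Sx, Sy, I)   # S_x -> log 2; the coupling keeps I > 0
```

Because x flips symmetrically, its marginal relaxes to one half–one half (S_x → log 2), while the x-dependent rate of y leaves a small but nonzero mutual information in the steady state.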
But in the steady state, if I don't have an external agent, I have a problem. The problem is that in the steady state, by definition, everything is constant: S dot is zero, S dot X is zero, S dot Y is zero, I dot is zero. The only thing that is not zero is the change of the entropy of the environment. So the second law is trivial; it just says that this is bigger than zero. For autonomous systems — remember that this is an autonomous system — this equation is meaningless. So we have to be more clever; we have to develop something, because one still suspects that in some of these motors one part acts as a demon and the other as a system. So the question is: how can we develop a mathematical framework to study this from the point of view of information? And this is the information flow, which I introduce now. I just wanted to ask, what could be an example of an internal state? An example? Yes — a conformation. Well, tomorrow we will see an example which is the following. You have a Brownian particle in a potential like that, with infinite barriers and steps, so it's like a staircase. And what you do is the following: you can switch between two potentials — you switch between this potential and this one, shifting the barriers, okay? Imagine that the particle is here. This is uphill motion, but because everything is in contact with a thermal bath, this can happen. And your demon is very smart: when the demon sees the particle here, it changes the potential. This is a ratchet — it's called a ratchet because you are rectifying something. You are not pushing, but the particle moves uphill just by changing the potential at the proper moment. This can be the internal state.
This can be Y equal A, Y equal B — this could be an internal state with which you move uphill. Well, here it is an external potential, but one could imagine — no, this is more difficult to visualize. This is the example we will see tomorrow, and it could probably be implemented by a molecule. But take kinesin: kinesin is a molecule that moves in a kind of potential like that. Kinesin moves along something called a microtubule in the cell. Microtubules are polymers — well, not exactly polymers, but okay, they are like nanotubes. And I think it moves like that: it has two legs, no? Maybe somebody — if Edgar were here, Edgar knows a bit about kinesin. It is attached to the microtubule and it can be released. So you have one internal state, attached, and another internal state, detached. And I think that when it detaches, for some reason it cannot attach again at the same site, so it moves randomly, and the most likely motion is that it moves to the next site. So the two states would be attach and detach — that could be an example. But I'm not an expert in kinesin, so I don't remember exactly; in any case, there are many models using internal states for this, okay? So this is our scenario. And some people — you have the papers in the suggested bibliography — introduced the information flow, which is the following idea. When we write this derivative, it's because the two random variables depend on time; this is the mutual information as a function of time. And in the steady state this is constant, so the derivative is zero. But where do you use the fact that this is an autonomous system when you say it's a steady state? You wrote previously that S dot and I dot are zero.
Where do you use the fact that it's autonomous? Well, when the system is autonomous, it has a steady state, a stationary state, so nothing changes in time. This is not necessarily with global detailed balance, so you can still have currents and things like that. This is the example that we studied with the two temperatures, remember? We had four states, and the system reaches a steady state, and the steady state means that everything is constant. When there is no external agent, there is always a steady state — you can even have several steady states, but at least you have one, unless the system is not confined or something like that. Here the system is confined, and if there is a finite number of states, you always have a steady state. And in this steady state everything is zero. So the idea that Horowitz and Esposito had — following earlier people — is to decompose this derivative into two contributions. There are many ways of writing this; we write it like that. What is this? It is to compute the mutual information of x and y and see how it evolves due just to the evolution of x, and to do the same for y. We will use this notation: this is the change in mutual information due to the evolution of x, and this is the change in mutual information due to the evolution of y. In the steady state the total derivative is zero, so the change due to x is minus the change due to y, and this is called the information flow. Actually, the concept was introduced first in probability theory. One of the applications is when you have two signals, x and y: you know that correlation does not mean causality — there are a lot of jokes about two things that happen together.
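The steady-state identity — the information flow due to x equals minus the one due to y — can be verified directly on a small bipartite model. The rates below are illustrative choices, not from the lecture, picked so that the steady state carries a nonzero flow:

```python
import numpy as np

# Two coupled two-state systems; a state is (x, y), x, y in {0, 1},
# flattened to index 2*x + y. Bipartite: each jump changes x only or y only.
def wx(x, xp, y):                      # x-jump rate x -> xp at fixed y
    R = [[0.0, 1.0 + 2.0 * y],         # 0 -> 1, faster when y = 1
         [1.0, 0.0]]                   # 1 -> 0
    return R[x][xp]

def wy(y, yp, x):                      # y-jump rate y -> yp at fixed x
    R = [[0.0, 2.0],                   # 0 -> 1
         [1.0 + 3.0 * x, 0.0]]         # 1 -> 0, faster when x = 1
    return R[y][yp]

idx = lambda x, y: 2 * x + y

# Generator L of the master equation, dp/dt = L @ p.
L = np.zeros((4, 4))
for x in range(2):
    for y in range(2):
        for xp in range(2):
            if xp != x:
                L[idx(xp, y), idx(x, y)] += wx(x, xp, y)
                L[idx(x, y), idx(x, y)] -= wx(x, xp, y)
        for yp in range(2):
            if yp != y:
                L[idx(x, yp), idx(x, y)] += wy(y, yp, x)
                L[idx(x, y), idx(x, y)] -= wy(y, yp, x)

# Steady state = normalized null vector of L.
vals, vecs = np.linalg.eig(L)
p = np.real(vecs[:, np.argmin(np.abs(vals))])
p /= p.sum()
P = p.reshape(2, 2)                    # P[x, y]
px, py = P.sum(axis=1), P.sum(axis=0)

# Information flows: the x-jump and y-jump parts of dI/dt.
Ix = sum(wx(x, xp, y) * P[x, y]
         * np.log((P[xp, y] / px[xp]) / (P[x, y] / px[x]))
         for x in range(2) for xp in range(2) for y in range(2) if xp != x)
Iy = sum(wy(y, yp, x) * P[x, y]
         * np.log((P[x, yp] / py[yp]) / (P[x, y] / py[y]))
         for x in range(2) for y in range(2) for yp in range(2) if yp != y)

print(Ix, Iy)   # at steady state Ix = -Iy; here Ix < 0: x destroys correlation
```

With these rates the x-flow comes out negative, so in the demon language introduced below, y acts as the measuring part and x as the part that consumes the correlation.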
That doesn't mean that one is the cause of the other. But when you have two time series, x and y, and you want to check whether x can be considered the cause of y, or the other way around, information flow was one of the tools. In physics it was introduced around 2010, I think, by Allahverdyan and coworkers, and the paper that was really most important for this is the one by Jordan Horowitz and Massimiliano Esposito that you have in the outline. Tomorrow we are going to see — first, there are two problems. First: is it possible to decompose this change in mutual information into a contribution of x and a contribution of y? This is possible only in systems called bipartite systems. So tomorrow we will introduce the concept of bipartite systems for discrete systems, and we will see that it is very simple there. Then we will introduce the information flows, and we will see that the information flow allows one to prove a second law — a local second law. There is a second law for the whole thing, which is very trivial: just this one, that the entropy production of the environment is bigger than zero. But with information flows we can derive a second law for system x and another second law for system y. And finally, the information flow also allows us to interpret the bipartite system as an information device — an information motor or an information machine. Why? Because suppose that this flow is positive and this one is negative. Remember what the information flow is: the change in the mutual information due to the evolution of x. So x is increasing the correlation between the two systems — x is measuring. Before, we said that in a measurement, like the one Leah did yesterday, the demon evolves because it becomes correlated with the system, and the system does not evolve. This flow is precisely the change in mutual information considering that y does not evolve and x evolves.
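The two local second laws announced here can be written out; in the convention of Horowitz and Esposito (entropies with k, I in nats — a reconstruction, since the lecture only states them verbally):

```latex
\dot\sigma_X \;=\; \dot S_X + \dot S^{X}_{\mathrm{env}} - k\,\dot I^{X} \;\ge\; 0,
\qquad
\dot\sigma_Y \;=\; \dot S_Y + \dot S^{Y}_{\mathrm{env}} - k\,\dot I^{Y} \;\ge\; 0,
```

and their sum recovers the global second law, since the two flows add up to the full derivative, \(\dot I^{X} + \dot I^{Y} = \dot I\).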
So if this flow is positive, x is a demon: x is the demon and y is the observed system — or the working substance, if you like. And the other way around, of course. So the sign of the information flow allows us to interpret our machine as a demon acting on something else, and it reveals who is the demon and who is the other guy. Tomorrow we will see the two things. First, when we can write this decomposition — so we will introduce bipartite systems — and then we will derive a second law for system x and a second law for system y, and we will apply this to an example. For this part we will not have exercises; we will just have the theory, because all the exercises are very involved. Tomorrow I will solve an example, but it is a bit involved. Okay.