Thank you for the invitation. I should say it's like 5 AM here, so my son is going to be waking up in about an hour. If he comes in here, that might be a two- or three-minute stop; that probably won't happen, but it might. Please ask questions during the lectures. I can probably look at the chat, but it would be hard for me to follow it during the talk, so please just ask your questions out loud, and if someone sees a question in the chat, please read it out loud for me, because it's hard for me to write and read at the same time. OK, so these lectures are going to be about stochastic thermodynamics. There will be no attempt at going through complicated math; I'll try to use very simple models to study the concepts. So I'm going to call these lectures an introduction to stochastic thermodynamics. There will be three lectures. A few words before I really start. It's fair to say the field started in the mid-90s; that's pretty much when people discovered the fluctuation theorems, first with Evans, Cohen, and Morriss, then Gallavotti and Cohen, and then Jarzynski. These are different relations, but they come from the same thing; they are all fluctuation theorems.
They can be expressed in different forms, and when they were being discovered, with a year or so between the works, the authors did not really understand how the works were related; they had an idea that they were related, but they didn't understand exactly how. So the discovery of the fluctuation theorem can be considered the start of the field of stochastic thermodynamics, and after that, with experiments and everything, the field really picked up traction. Now, Edgar was just talking about molecular motors. If you want a picture to represent stochastic thermodynamics, a good one would be this: imagine a train on a track, moving forward. That would be thermodynamics, the science of transforming heat into other forms of energy. For stochastic thermodynamics, you draw the analogous picture: a molecular motor on a track, which is an experiment you can actually do. Now, what's the difference between these two pictures? One difference is the number of degrees of freedom. The train has something like 10 to the power of 23 degrees of freedom, or probably way more than that; a very, very large number of molecules make up the train. The molecular motor is just a single molecule. It's a big, complicated molecule, probably something like 50 times bigger than a water molecule in linear size, but still a single molecule, while the train is made of many, many degrees of freedom. Now, if I watch a movie of the train, I only see the train moving forward; I'm never going to see the train moving backward.
I mean, the train might move backward if you drive it backward; that's possible. But there are no fluctuations that make the train move backward. If I watch a movie of the molecular motor, on average it moves forward, but because the molecular motor is bigger than the water molecules, yet not that much bigger, the collisions between the motor and the water molecules, if the motor is in water, make the motor fluctuate. So a big difference between the two pictures is fluctuations. In the second case, stochastic thermodynamics, fluctuations play a major role. Of course, there are fluctuations in thermodynamics too, but the kind of fluctuations I'm talking about are different: if you watch a movie of the train, you are never going to see the train move backward because of a fluctuation; that's pretty much impossible, while for the molecular motor it's a quite common occurrence. Also, the molecular motor is out of equilibrium. The train is probably out of equilibrium too, but in thermodynamics we can only talk about equilibrium states. The really remarkable thing about equilibrium states is that, even though they are states of 10 to the power of 23 particles, they can be expressed with just a few variables like temperature, pressure, volume, entropy, and so on. That's the remarkable property of equilibrium states, and the whole field of thermodynamics is built on them. Now, most things in nature, or in our bodies, are out of equilibrium, and in stochastic thermodynamics you can really deal with models that are out of equilibrium. The other distinctive feature of stochastic thermodynamics, again, is fluctuations, which come from the fact that there are just a few degrees of freedom.
The system can be out of equilibrium, and another major difference is finite time. Here in the US the undergraduate course is often called thermal physics; when I did my undergrad in Brazil there were separate thermodynamics and statistical physics courses, but here they often have just one. If you remember your thermal physics course, you never see time during the whole course, right? It's probably one of the few physics courses in which time never shows up. The reason is that if you want to think about a process in standard thermodynamics, you really have to think about a quasi-static process: whatever change you make, you have to change things very, very slowly so that the system can always equilibrate and is always in an equilibrium state. If you change a parameter in your system, say the pressure, a little bit, you wait for the system to relax to equilibrium and then you keep changing the parameter. So if you want to treat a model dynamically in standard thermodynamics, you have to consider a quasi-static process that moves very slowly. That means infinite time, or at least time must be very long compared to the relaxation time. In stochastic thermodynamics we don't have this constraint: we can do processes in finite time, and most processes that happen, either man-made or in nature, are finite-time processes. So stochastic thermodynamics is basically a generalization of thermodynamics to systems that can be small and therefore have large fluctuations, systems that can be out of equilibrium, and systems that can operate in finite time; and this generalization of thermodynamics basically starts in the mid-nineties, okay?
Before stochastic thermodynamics, another framework to treat systems out of equilibrium was linear response theory, which is considerably older: the Onsager reciprocity relations, the Kubo formula, and so on, from the 60s and 70s and earlier. That's a precursor, if you will, of stochastic thermodynamics. But when people were doing linear response theory, they didn't know about things like the fluctuation theorem, which again is what really started stochastic thermodynamics. Now, what's the main point, or the main objective, of stochastic thermodynamics? The main thing, in my view, is this: once you build a theory that is able to deal with systems that have large fluctuations and can operate out of equilibrium, a very important question is, what are the universal constraints that these fluctuations must fulfill? These fluctuations do happen, but they are constrained; they cannot be anything. There are relations, inequalities or equalities, that tell you how these fluctuations must behave. In physics we always want to find universal relations, universal laws, and a big question in stochastic thermodynamics is what are the universal relations that fluctuations must fulfill. One big relation that really started the field is the fluctuation theorem, which I'm going to talk about here; it's really easy to prove, as you will see. The other relation is the thermodynamic uncertainty relation, which is also a universal relation about fluctuations, and that's from 2015.
It's much more recent. Let me write the whole name: thermodynamic uncertainty relation, from 2015. That relation was discovered by myself and Udo Seifert in 2015. What these two relations have in common is that they are very universal constraints that fluctuations must fulfill; fluctuations cannot be anything, they must satisfy these relations. OK, so this course is going to be about stochastic thermodynamics, and I'm going to start from the very basics. Again, there will be no attempt at covering all the math; I will try to make the math as simple as possible, and I'll always focus on a very simple model, which will be the model of a single enzyme. I'll send a list of references to the organizers, and they can pass it to you by the end of the course; so I'm not going to write any references during the lectures, but if you are interested, you can use the list to further study the field. Okay, so let's start, and our starting point is going to be the master equation. And please ask questions if you want. So, first thing: what is a stochastic process? A stochastic process is just a sequence of random variables indexed by time. You have X0, X1, and so on, where the subscript represents time. That's a stochastic process. Here the stochasticity comes from the fact that we think about some sort of mesoscopic description. Again, if you think about the molecular motor, it fluctuates because of the collisions between the smaller water molecules and the motor, which make the motor, even if on average it moves forward, sometimes take steps backward.
So that's where the stochasticity comes from here. I should probably say that the kinds of experimental systems to which stochastic thermodynamics applies include molecular motors and other single-enzyme experiments, a lot of things in molecular biology like biochemical reactions, quantum dots if they are in the right regime, like weakly coupled, and colloidal particles. Those are probably the main experimental setups where the kinds of things I'm going to talk about have been, or at least could be, tested. Okay, so that's a stochastic process. In stochastic thermodynamics we deal with Markovian processes most of the time. A Markovian process is such that the probability to observe x_n, given the whole previous trajectory x_{n-1}, x_{n-2}, down to x_0, satisfies P(x_n | x_{n-1}, ..., x_0) = P(x_n | x_{n-1}). That's the Markovian property; these are conditional probabilities. What this equation says is that the probability to be in a state x_n, given the whole trajectory, only depends on the last step of the trajectory; everything else is irrelevant. So Markovian means you do not remember your whole past, you just have to remember the very last state you were in. Now, if the process is Markovian, time is continuous, and the variables are discrete, then it is possible to show, which I'm not going to do here, that the system fulfills the master equation. Let's write down this master equation. It is an equation that gives you the evolution of the probability to be in a certain state, and it is true if your process is Markovian, the states are discrete, and time is continuous. I represent states by i and j, okay?
So what's a state? If you think about a molecular motor or a colloidal particle, the state would be the position of the motor or of the particle. If you think about an enzyme burning ATP, the state would be either the enzyme having ATP bound to it or not. If you think about the Ising model, the state would be the orientation of the spins; if you have n spins in your system, you have 2 to the power of n states. So that's a state; if you did your thermal physics course, whatever you called a microstate there is pretty much what I'm calling a state here. And those states are, in a certain sense, coarse-grained states: if you think about the molecular motor, it's made of many atoms, but when I think about a state I'm not tracking every single atom of the motor, I'm just asking what's the position of the motor on the track, for example. So there is always some coarse-graining when you think about states at this level. Okay, so w_ij will represent the transition rate from i to j; this is a transition probability per unit of time. And the master equation, which I guess most of you have seen already, is dp_i/dt = sum over all states j different from i of [p_j(t) w_ji - p_i(t) w_ij]. The first term is all possibilities of going into state i: I am in some state j and then I go to state i. The second term is all possibilities of leaving state i. That's a pretty simple equation. And again, if you start with something called the Chapman-Kolmogorov equation, and your process is Markovian, time is continuous, and states are discrete, you can show that p_i(t) fulfills this master equation.
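Not from the lecture, but here is a minimal sketch of this equation in code (the three-state rates below are made up): an Euler integration of dp_i/dt = sum over j != i of [p_j w_ji - p_i w_ij]. Because the gain and loss terms cancel in pairs, total probability stays normalized along the whole evolution.

```python
def master_step(p, w, dt):
    """One Euler step of the master equation; w[i][j] is the rate i -> j."""
    n = len(p)
    dp = [sum(p[j] * w[j][i] - p[i] * w[i][j] for j in range(n) if j != i)
          for i in range(n)]
    return [p[i] + dt * dp[i] for i in range(n)]

# hypothetical transition rates for a three-state system (any positive values)
w = [[0.0, 1.0, 0.5],
     [2.0, 0.0, 1.0],
     [0.5, 3.0, 0.0]]

p = [1.0, 0.0, 0.0]          # start in state 1 with certainty
for _ in range(10000):       # integrate up to t = 10
    p = master_step(p, w, 0.001)

print(sum(p))                # stays 1 up to floating-point error
```
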
So p_i(t) is the probability of the system to be in state i at time t. Now, the variables i and j could also be continuous; the master equation is still valid, that's not a problem. There are also Langevin equations, which, at least for the kinds of processes we use in stochastic thermodynamics, can always be obtained as a limit of a master equation. My choice of dealing with discrete processes in this course is just because I think the math is simpler, and I also somehow like discrete states more than continuous ones. Maybe not every Langevin equation can be obtained as such a limit, but I do think that at least the ones we use in stochastic thermodynamics all can. Okay, so that's the master equation. Soon in this course we're going to see a particular model, but for now, if you want to imagine one, the state i could be the position of a colloidal particle, the position of a molecular motor, or the state of an enzyme; for the Ising model it would be the orientation of the spins. And typically we imagine a case with a finite number of states, so i goes from 1 up to Omega: the number of states in this course is going to be Omega. By the way, these are not really a course, right? It's only three lectures. All right, so p_i(t) is the probability to be in state i at time t, and that's the master equation. It's a linear equation, and when I say linear, I mean it's linear in p, the probability.
In principle you can try to solve this equation; in general it's kind of hard, but you could do it. Now, it is frequently convenient to write this equation in matrix form. I can define a vector p(t) = (p1, p2, ..., pOmega) and write the master equation as dp/dt = W p(t), where W is now a matrix. To see what W is, a convenient way of rewriting the master equation is the following: dp_i/dt = sum over j different from i of p_j(t) w_ji, which are the terms that go into state i, minus r_i p_i(t), where r_i = sum over j different from i of w_ij. This r_i is called the escape rate. So in the vector form dp/dt = W p, the entries of the vector p are the p_i's, and the diagonal elements of W are minus r1, minus r2, down to minus rOmega. Then the rest of the first row contains w21, w31, up to wOmega1, and the rest of the first column contains w12 down to w1Omega; you can more or less guess how the whole matrix looks. This W is called a stochastic matrix. I know this is very mathematical, but these are very basic mathematical elements of stochastic thermodynamics, okay?
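As a sketch of how this matrix is assembled (the rates are again made up, not from the lecture): the off-diagonal entry W[i][j] is the rate w_ji into state i, and the diagonal entry is minus the escape rate r_i. The column-sum-zero property comes out automatically.

```python
def stochastic_matrix(w):
    """Build W with W[i][j] = w_ji for i != j and W[i][i] = -r_i."""
    n = len(w)
    W = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                W[i][j] = w[j][i]                               # gain term: rate j -> i
        W[i][i] = -sum(w[i][j] for j in range(n) if j != i)     # minus escape rate r_i
    return W

# hypothetical rates; w[i][j] is the transition rate from i to j
w = [[0.0, 1.0, 0.5],
     [2.0, 0.0, 1.0],
     [0.5, 3.0, 0.0]]
W = stochastic_matrix(w)

col_sums = [sum(W[i][j] for i in range(3)) for j in range(3)]
print(col_sums)              # each column of W sums to zero
```
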
So I'll try to make these lectures as non-mathematical as possible, but these are very basic things about stochastic thermodynamics. Basically, I can write the master equation with this matrix W, which is a stochastic matrix. Now, what are the properties of this matrix? First, the sum of the elements in a column is zero. The diagonal entry of column i is minus r_i, where r_i is the sum over j of w_ij, and the other entries of that column are exactly those rates w_ij, so summing all the elements of the column gives zero; the same is true for every column. That's a very important property of a stochastic matrix. Second, the maximum eigenvalue of the matrix is zero. For this kind of matrix, using the Perron-Frobenius theorem, you can show there is a unique maximum eigenvalue. The definition of a stochastic matrix is that the diagonal elements are negative, the off-diagonal elements are all non-negative (transition rates are always non-negative), and the sum of the elements in each column is zero. If the matrix is stochastic, it has a maximum eigenvalue that is zero, and all other eigenvalues have negative real part; they can also have an imaginary part, but the real part is always negative. And in stochastic thermodynamics there is a further property: the w_ij can be zero; the transition rate from one state to another can be zero. For example, for a molecular motor I cannot jump from position one to position ten; I must jump from position one to position two, so the rates for jumps of ten positions in a single step are zero. The rates can be zero, but if w_ij is different from zero, then w_ji must also be different from zero, okay?
That is not something true for general stochastic processes, but it is something we assume in stochastic thermodynamics: at least most of the time, we deal with processes with this property. If you can make a transition from one state to another, then you can make the backward transition. The equivalent statement in chemistry is that if there is a chemical reaction that takes you from one state to another, then there must be a reverse reaction; even if it's very, very unlikely, there must be some finite, non-zero rate for the reverse reaction. That's an important mathematical property in stochastic thermodynamics; among other things, it will guarantee that you have a unique steady state and so on. I don't want to go into the mathematics too much, but something to remember about the master equation is that there is a stochastic matrix, and the maximum eigenvalue of this matrix is zero. And this reversibility property, again, is not a general mathematical property of stochastic processes, but rather a property we assume to be true in stochastic thermodynamics. Maybe the first postulate of stochastic thermodynamics would be: the dynamics is Markovian and follows a master equation, or a Langevin equation, or something like that. Any questions? No? All right. Okay, so now that we know what the master equation is, we can talk about a stationary state, which is a very important concept in stochastic thermodynamics. So what's a stationary state? In a stationary state, the time derivative of course becomes zero; that's the definition. So, starting from the vector equation:
For the individual probabilities I'm going to write p_i^s; that's the stationary solution of the master equation. In vector form I can write W p^s = 0. So what is the stationary distribution? It's the right eigenvector associated with the eigenvalue zero. Now, a main problem in non-equilibrium statistical mechanics generally is that if you have a non-equilibrium steady state, this probability distribution is not going to be the Boltzmann distribution; it's going to be something else, and in many, many problems you want to find this distribution. For many different problems it's very desirable to find the stationary distribution, and how do you find it? You have this W matrix, and all you have to do is find the eigenvector associated with the eigenvalue zero. That's pretty much all you do. So that's a very important relation: the stationary distribution of your system is the right eigenvector of W associated with the eigenvalue zero. Now, notice that I said right eigenvector. The W matrix is not a symmetric matrix; if you want to compare with quantum mechanics, it's not Hermitian, so the right eigenvectors and the left eigenvectors are in general different. The left eigenvector associated with the eigenvalue zero is just (1, 1, ..., 1), a bunch of ones: if you multiply this row vector by W, you get zero, because the sums of the elements in the columns are zero; that's pretty straightforward to show. But in general, left and right eigenvectors are different, so dealing with this matrix can be a little more complicated than dealing, for example, with a Hermitian matrix in quantum mechanics.
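A sketch of finding the stationary state without diagonalizing anything (hypothetical rates again, my own illustration): since every other eigenvalue has negative real part, simply evolving the master equation for a long time projects onto the right eigenvector with eigenvalue zero. At the end, every net rate of change dp_i/dt should vanish.

```python
def evolve(p, w, dt, steps):
    """Euler-integrate the master equation; w[i][j] is the rate i -> j."""
    n = len(p)
    for _ in range(steps):
        dp = [sum(p[j] * w[j][i] - p[i] * w[i][j] for j in range(n) if j != i)
              for i in range(n)]
        p = [p[i] + dt * dp[i] for i in range(n)]
    return p

# hypothetical rates for a three-state system
w = [[0.0, 1.0, 0.5],
     [2.0, 0.0, 1.0],
     [0.5, 3.0, 0.0]]

ps = evolve([1.0, 0.0, 0.0], w, 0.001, 200000)   # evolve up to t = 200

# check W p^s = 0, i.e. dp_i/dt vanishes for every i in the stationary state
residual = [sum(ps[j] * w[j][i] - ps[i] * w[i][j] for j in range(3) if j != i)
            for i in range(3)]
print(residual)
```
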
Okay. So these matrices have this property that the sums of the elements of a column are zero, which simplifies your life a lot, but they also have the complication that they are not necessarily symmetric. In most cases, if you are out of equilibrium, they are not going to be symmetric; in equilibrium they might not be symmetric either, but they can in principle be symmetrized. All right, those are the very basic properties. Let me say a bit more with an example: take the case Omega = 2, so I have only two states. It could be a single spin that can either be up or down, for example. In that case my master equation is dp1(t)/dt = w21 p2(t) - w12 p1(t). But of course p2(t) is just 1 - p1(t), because I have only two probabilities. I did not write it down before, but p_i is a probability: if I sum p_i(t) from i = 1 to Omega, I must always get one; the probability distribution is normalized. Okay, so that's a two-state model, and this is an equation I can solve. Let's find the stationary distribution: the stationary condition is (1 - p1^s) w21 - w12 p1^s = 0, and solving it gives p1^s = w21 / (w12 + w21). So that's the stationary solution of the master equation for this model. In general, solving the master equation is very hard: if you have a system with many, many states, it becomes practically impossible to find an exact solution. If you have a system with a small number of states, like two states, it's super easy.
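A quick numerical check of the two-state result, with made-up rates: p1^s = w21 / (w12 + w21), and in the stationary state the probability flow into state 1 balances the flow out of it.

```python
w12, w21 = 3.0, 1.0                 # hypothetical rates for 1 -> 2 and 2 -> 1
p1s = w21 / (w12 + w21)             # stationary probability of state 1
p2s = 1.0 - p1s                     # normalization fixes state 2

# stationarity: gain into state 1 equals loss out of state 1
print(p2s * w21 - p1s * w12)        # 0.0
```
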
With three states it's a little harder but still quite easy; with something like Mathematica, for four, five, six, maybe up to ten states, you can still solve these things. But if you have something with many, many degrees of freedom, then an exact solution becomes pretty much impossible. In this case, though, you can solve it exactly; that's the stationary distribution. You can even find p1(t) exactly: doing the calculation, p1(t) = p1^s (1 - e^{-lambda t}) + p1(0) e^{-lambda t}, where p1(0) is the initial condition and lambda = w12 + w21. That's the solution of the master equation. Now, if I were to plot this p1(t) as a function of time, and this is a plot you really want to remember: say I start at some p1(0), with p1^s below p1(0); then p1(t) decays exponentially to the stationary solution. That's what happens. And okay, that's a two-state model, but the same thing happens for any Omega. The difference, of course, is that for a more complicated model with 10 states, 20 states, 1000 states, whatever, there are many different eigenvalues and eigenvectors and things get more complicated. But pretty much what happens is: you start at some initial condition and you decay exponentially to some stationary distribution. Solving the two-state model is very instructive, because that's pretty much what happens all the time; that's pretty much what's going to happen for larger systems, although larger systems will of course have richer finite-time behavior, which is important.
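The decay can be checked directly (the rates and initial condition are my own choice): compare the analytic solution p1(t) = p1^s (1 - e^{-lambda t}) + p1(0) e^{-lambda t}, with lambda = w12 + w21, against a brute-force Euler integration of dp1/dt = w21 (1 - p1) - w12 p1.

```python
import math

w12, w21 = 3.0, 1.0                 # hypothetical rates
lam = w12 + w21                     # decay rate (nonzero eigenvalue, up to sign)
p1s = w21 / (w12 + w21)             # stationary value
p10 = 0.9                           # initial condition p1(0)

def p1_exact(t):
    """Analytic solution of the two-state master equation."""
    return p1s * (1.0 - math.exp(-lam * t)) + p10 * math.exp(-lam * t)

# direct Euler integration of dp1/dt = w21 (1 - p1) - w12 p1 up to t = 2
p1, dt = p10, 1e-4
for _ in range(20000):
    p1 += dt * (w21 * (1.0 - p1) - w12 * p1)

print(p1, p1_exact(2.0))            # the two values agree closely
```
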
For instance, the eigenvalues can be complex; not for two states, for two states they cannot. This lambda here is, up to a sign, an eigenvalue of the matrix: one eigenvalue is zero, and the other eigenvalue gives you the decay rate, how fast you decay to the stationary state, which is an important quantity in stochastic processes; you want to know your relaxation time. So the eigenvalue of the matrix that is not zero gives you a decay time. But in general, these eigenvalues can be complex, and these eigenvalues and eigenvectors are very important for physical, or biological, things. For example, they are quite important in biochemical oscillations. I'm not going to really talk about that; all I'm saying is that these mathematical objects have clear physical implications. The imaginary part of an eigenvalue matters if you have a biochemical oscillation, by which I mean something like a circadian clock: we human beings have this 24-hour clock in our cells. If you have biochemical oscillations and they are stochastic, and you want to analyze them mathematically, then typically you use a Markov process. And if you want to calculate, for example, the period of oscillation, you can do that by calculating the eigenvalues of the matrix: the eigenvalues will have an imaginary part if you have oscillations, and this imaginary part gives you the period of the oscillations, while the real part gives you the decay time of the oscillations; this decay happens because of fluctuations.
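To see the imaginary part at work, here is a sketch of a hypothetical three-state cycle (forward rate 1, small backward rate 0.01; my own choice, not a model from the lecture). For a pure unidirectional cycle with rate k, the nonzero eigenvalues are k(e^{±2 pi i/3} - 1), so the real part -3k/2 sets the decay and the imaginary part ±(sqrt(3)/2)k sets the oscillation frequency; numerically, p1(t) crosses its stationary value 1/3 several times before settling.

```python
kf, kb = 1.0, 0.01                  # forward and (rare) backward rates
n = 3
w = [[0.0] * n for _ in range(n)]
for i in range(n):
    w[i][(i + 1) % n] = kf          # forward jumps around the cycle
    w[i][(i - 1) % n] = kb          # rare backward jumps

p, dt = [1.0, 0.0, 0.0], 1e-3
crossings = 0
prev = p[0] - 1.0 / 3.0
for _ in range(20000):              # integrate up to t = 20
    dp = [sum(p[j] * w[j][i] - p[i] * w[i][j] for j in range(n) if j != i)
          for i in range(n)]
    p = [p[i] + dt * dp[i] for i in range(n)]
    cur = p[0] - 1.0 / 3.0
    if prev * cur < 0:
        crossings += 1              # p_1 crossed its stationary value 1/3
    prev = cur

print(crossings)                    # several crossings: a damped oscillation
```
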
All I want to tell you here is that the eigenvalues and eigenvectors of the matrix can have very important physical implications; they are important mathematical objects. And the other message is: if you solve the two-state model, you pretty much know what's going to happen for more complicated systems, namely, they start at some initial distribution and decay exponentially to the stationary state. Another problem where eigenvalues and eigenvectors matter is the Mpemba effect. Again, I'm not giving references here, but it will be in the list of references; this is something that has received attention. The Mpemba effect, first observed in water, is the following: say I want to cool water down to a certain temperature, and in one case I start at one temperature while in the other case I start at a higher temperature. The Mpemba effect is the observation that when I start at the higher temperature, the cooling down can be faster. So I can cool something down faster by starting at a higher temperature, which is a very counterintuitive thing. There is a paper about that in stochastic thermodynamics, and there is an explanation of this effect in terms of the eigenvalues and eigenvectors of the stochastic matrix. So the two main messages I wanted to give you here were: first, if you solve the two-state model, you pretty much know what's going to happen; the two-state model is very simple, of course, but with more states what happens is still that you start at some initial distribution and decay exponentially to your stationary distribution.
The other message is that these eigenvalues and eigenvectors can, in some situations, be very important physical objects. Okay. Or they translate into physical objects. All right. And, okay. So again, the stationary state is the eigenvector associated with the eigenvalue zero, which is something that can be calculated. And now, let's discuss the idea of detailed balance. So, my master equation in the stationary state, and note I'm using the stationary state already, reads

sum over j different than i of ( p_i^s W_ij - p_j^s W_ji ) = 0,

right? That's the stationary-state condition of the master equation; remember that dp_i^s/dt = 0. Okay, so that's the stationary-state solution. And now one possibility is that each term in this sum is zero. So that would mean that

p_i^s W_ij - p_j^s W_ji = 0

for all pairs i, j. My i's and j's might sometimes look similar on the board, but they're supposed to be different. Okay. So for all pairs i, j, this is equal to zero, and that's what's called detailed balance. Okay. So again, that's a strong condition. It's much stronger than the steady-state condition. The steady state just requires that when you do the whole sum, you get zero. But the detailed balance condition is that every single term in the sum over j must be zero. Okay, that's what's called detailed balance; I'm going to call it DB in these lectures. And if detailed balance holds, then p_i^s is p_i^eq, the equilibrium distribution, or the system is said to be in an equilibrium steady state. Okay. The equilibrium distribution, which you know, is

p_i^eq = e^(-beta E_i) / Z,

with beta = 1/(k_B T). Now I should say that k_B = 1 here. Okay. So k_B is going to be one all the time. Sometimes I might write k_B, just because it's convenient to write it there, but if you don't see k_B, it's because it's equal to one. Okay. Most of the time in this course k_B is one.
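As a small sanity check, here is a sketch with toy rates of my own choosing (not the lecture's): for two states, the stationary distribution always satisfies the pairwise detailed balance condition, which is one way to see that a two-state system cannot sustain a non-equilibrium steady state.

```python
import numpy as np

# Two-state model with made-up rates: w01 = rate 0 -> 1, w10 = rate 1 -> 0.
w01, w10 = 2.0, 1.0

# Stationary distribution: the single pair of flows must balance,
# p0 * w01 = p1 * w10, so p_i is proportional to the rate *into* state i.
p = np.array([w10, w01])
p = p / p.sum()

# Detailed balance holds term by term, not just in the summed master equation.
J = p[0] * w01 - p[1] * w10   # numerically zero: the pair is individually balanced
```

For three or more states the sum over j can vanish while individual terms do not, which is exactly the non-equilibrium case discussed next.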
And there's kind of a reason for that. You know, typically in stat mech we define entropy with a k_B. So if you think about the Shannon entropy: in stat mech we typically write entropy as S = -k_B sum over i of p_i ln p_i, right? But I do think that, at least from the perspective of stochastic thermodynamics, it's better to define entropy without the k_B. Also, when you're going to connect with information theory, that makes your life easier. So the way of getting out of this sort of discussion, whether I should have a k_B in the entropy or not, is to just set k_B equal to one. Okay, so k_B is going to be equal to one, k_B being the Boltzmann constant. Okay. And T being the temperature. So this Z is the partition function, right? It's just the sum over j of e^(-beta E_j), where j goes from one to Omega, because I have Omega states. Okay. Again, if I have detailed balance then I have an equilibrium steady state, and my stationary distribution is just the Boltzmann equilibrium distribution. The E_i could more generally be like a free energy. So, you know, it could also have, for example, a chemical potential, so it could be a bit more complicated. But in general, you just have an equilibrium distribution if detailed balance is fulfilled. Okay. Now, you might have seen detailed balance written in this form: if I write p_i^eq W_ij = p_j^eq W_ji, then I can write

W_ij / W_ji = e^(-beta (E_j - E_i)).

Okay. And I'm guessing if you have seen detailed balance before, you might have seen this relation here as the detailed balance condition, while, again, probably a mathematician would prefer to call the earlier, pairwise condition detailed balance.
And many times a physicist, mainly if you have done, I don't know, the Metropolis algorithm for the Ising model or whatever in your life, has probably seen this detailed balance relation. So, the W_ij's are the transition rates. Okay. And, you know, a particular W_ij also depends on kinetic parameters, which is something that is very model-dependent in a sense, but the ratios of the W_ij's are just determined by the differences of the energies. Okay, which again is going to be true only if you have detailed balance. Now, an important thing about the equilibrium distribution is that it's independent of kinetic parameters. Okay. A kinetic parameter, for example, could say that for a certain pair i, j, the transition is much faster than for some other pair of states. Okay. That would be the role of a kinetic parameter. So, for example, if I multiply W_ij and W_ji by 10, both of these transitions become much faster than maybe other transitions in your network, but their ratio still satisfies detailed balance. Okay. And how fast these transitions are will depend on kinetic parameters. Now, the very important property of the equilibrium distribution is that it's totally independent of kinetic parameters. So, if I make a particular pair of states faster than another pair of states, that's okay, I still keep the same distribution, and my equilibrium distribution is completely independent of this kind of change. So, the equilibrium distribution does not really care about kinetic parameters. Again, equilibrium states are particularly simple to deal with, because if you have a very complicated system with many, many degrees of freedom, then you have a zoo of kinetic parameters, and the fact that the equilibrium distribution does not depend on kinetic parameters makes the theory of equilibrium thermodynamics very, very nice. It's very pleasing. Okay.
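This independence is easy to verify numerically. The sketch below is my own construction with made-up energies; the symmetric prefactors gamma play the role of the kinetic parameters. It builds rates satisfying W_ij / W_ji = e^(-beta (E_j - E_i)), multiplies one pair of rates by 10, and checks that the stationary distribution is still the Boltzmann distribution.

```python
import numpy as np

def stationary(M):
    """Eigenvector of the generator for the eigenvalue nearest zero, normalized."""
    vals, vecs = np.linalg.eig(M)
    v = np.real(vecs[:, np.argmin(np.abs(vals))])
    return v / v.sum()

E = np.array([0.0, 1.0, 2.5])   # hypothetical energy levels
beta = 1.0                      # k_B = 1, as in the lecture

def rate_matrix(gamma):
    # w[i, j] = rate i -> j, built so that w_ij / w_ji = exp(-beta (E_j - E_i))
    # whenever gamma is symmetric, i.e. detailed balance holds.
    n = len(E)
    w = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                w[i, j] = gamma[i, j] * np.exp(-beta * (E[j] - E[i]) / 2)
    return w.T - np.diag(w.sum(axis=1))   # generator: dp/dt = M p

gamma = np.ones((3, 3))
p1 = stationary(rate_matrix(gamma))

gamma2 = gamma.copy()
gamma2[0, 1] = gamma2[1, 0] = 10.0        # one pair of transitions 10x faster
p2 = stationary(rate_matrix(gamma2))

boltz = np.exp(-beta * E)
boltz /= boltz.sum()
# p1, p2 and boltz all agree: the equilibrium state ignores kinetic parameters.
```

The square-root splitting of the Boltzmann factor is just one convenient choice; any symmetric gamma gives the same stationary state.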
So, in the end you just have these few parameters to deal with; you don't have all these kinetic parameters. While when you deal with non-equilibrium steady states, the distribution becomes dependent on kinetic parameters. And, you know, if you deal with a not even so complicated model, say, I don't know, a model with 10 states, then your distribution is going to have an expression that will barely fit the page. If you calculate it in Mathematica, it's just going to be huge. And it depends on all these kinetic parameters, okay, which is kind of a not so nice thing. Okay, so now, what if detailed balance is not fulfilled? Let's define

J_ij = p_i^s W_ij - p_j^s W_ji.

So, here I finished detailed balance. Now, suppose this quantity here is different from zero. Again, let's write it again: the stationary master equation was sum over j different than i of J_ij = 0, okay. So summing over all j, I do get zero, but each individual term is not zero. If that's the case, then p_i^s is said to be a non-equilibrium distribution, okay. And maybe one major difference between this non-equilibrium distribution and the equilibrium one is that it will depend on kinetic parameters, okay. It will depend on the W_ij's, not only on their ratios, but on their particular values. If I multiply a pair by 10, and when I say a pair I mean I multiply W_ij and W_ji by 10, that's probably going to really affect my probability distribution, okay. Now, the J_ij is called a probability current. And the presence of probability currents is really the signature of a non-equilibrium steady state, okay. So, if you have a non-equilibrium steady state, you must have probability currents. Now, the reason you call this a current is really because this equation here can be seen as a Kirchhoff law.
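Here is a small sketch of that, again with toy rates I made up: a driven three-state cycle where each individual current J_ij is nonzero, yet the currents out of every state sum to zero, exactly like Kirchhoff's current law.

```python
import numpy as np

# Driven 3-state cycle (hypothetical rates): clockwise jumps are faster than
# anticlockwise ones, so detailed balance is broken.
w = np.array([[0.0, 2.0, 0.5],
              [0.5, 0.0, 2.0],
              [2.0, 0.5, 0.0]])   # w[i, j] = rate i -> j

M = w.T - np.diag(w.sum(axis=1))       # generator: dp/dt = M p
vals, vecs = np.linalg.eig(M)
p = np.real(vecs[:, np.argmin(np.abs(vals))])
p /= p.sum()                           # stationary distribution

J = p[:, None] * w - p[None, :] * w.T  # J_ij = p_i w_ij - p_j w_ji

kirchhoff = J.sum(axis=1)   # currents out of each state: all (numerically) zero
max_current = np.abs(J).max()   # but individual currents are nonzero: a NESS
```

If the clockwise and anticlockwise rates were equal, every J_ij would vanish and the steady state would be an equilibrium one.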
Now, if you remember Kirchhoff's law from when you did electromagnetism, it's about conservation of current. And, you know, if you imagine the links between states as resistors, okay, and you think about the currents going through them, they are going to fulfill a conservation law similar to the one you would find in an electric circuit, okay. So, that's why you typically call these things currents. And this equation here can be seen as something like Kirchhoff's law, right: if I think about all the currents leaving a state, they must sum up to zero, okay. Which is kind of an important property that I'm not going to discuss in any detail here, but probability currents are very important, and they are the signature of a non-equilibrium steady state. And a physical characteristic of a non-equilibrium steady state is the presence of a thermodynamic flux, okay. And a thermodynamic flux is a linear combination of the J_ij's, okay. So what's a thermodynamic flux? For example, if you think about an enzyme burning ATP, it's the rate at which it burns ATP. If you think about the molecular motor on a track, it could be the speed of the motor, okay. If you think about a quantum dot, it would be the flux of electrons through the quantum dot, right. The quantum dot is just a point that can either have an electron or not have an electron, and this quantum dot might be connected to different reservoirs, and the flux of electrons from one reservoir through the quantum dot to the other reservoir is going to be a thermodynamic flux, okay. So all thermodynamic fluxes, and I would argue that thermodynamic fluxes are the most important physical observables in stochastic thermodynamics, are linear combinations of probability currents, okay, of these J_ij currents. Andre, can you hear me? Yeah, I hear you. Yes, I have two things. One, we have 10 minutes more for you and then there will be 10 minutes of questions.
And the second thing is, okay, I have a suggestion for your talk, because we are trying to follow your handwritten notes. As a suggestion, could you draw the diagram of a Markov process and say there what is the state and what is the current? Because maybe this is helpful. Yeah, sorry, I thought I had a two-hour slot, so that's the next thing I was going to do, but I don't know if I'll have time. Sorry. Yeah, we have to break in the middle because of time constraints; we will explain later. Okay, we can do only one hour today because we are very delayed. Okay, sure, sure, that's fine, because that's the next thing I was going to do. So how much time do I still have? 10 minutes? Okay, that should be okay. I should have time to do that. Okay, so yeah, that was the next thing I was going to do, your suggestion: I was going to do a specific model now. And again, I will continue this next lecture, okay, if I don't have enough time to finish today. So let me just see where I am. Okay, so I discussed detailed balance and stationary states. Now, as an example, I'm going to talk about a three-state model, and that's going to be a model for a single enzyme. Now things are going to be less abstract, and I'm going to talk about a specific physical system. Okay, so the physical situation is that I have a single enzyme. Okay. The experiment would be a single enzyme in a solution, so there's just one enzyme. Okay, and the enzyme is going to be burning some substrate S and producing some product P. Okay. The most common substrate S would probably be ATP; the most common product P would be ADP plus Pi, okay. So the chemical reaction that's going on is: the enzyme takes an S from the solution, and then it becomes ES, then it becomes EP, it transforms the substrate into product, and then it releases the product into the solution. Okay.
Now the reverse chemical reactions are also possible; they are just less likely. Now, I'm going to think about a non-equilibrium steady state. What happens in a non-equilibrium steady state? So the assumption here is about the concentrations. Again, there's just one molecule, okay, one single enzyme in this experiment, and again, that's something you could do experimentally, okay. The concentrations of S and P are fixed. There are two ways you could get that. Either you can assume a regime where the numbers of S and P molecules are very, very large, and whatever the enzyme is doing is just too small on this time scale. Let's say I would have to wait a whole year for the enzyme to change these concentrations in a substantial manner, such that I would see some change in concentration, and I only want to observe this for 10 minutes, okay. Or you could imagine that whenever the enzyme is taking an S and producing a P, I'm always taking P out of the solution and putting S in, okay, so there is an external agent keeping these concentrations fixed, okay. But the idea is that even if the enzyme is more likely to take an S and transform it into a P, it can also take a P and transform it into an S. That's less likely, but the enzyme is doing that, and I'm assuming both concentrations are fixed, okay, so whatever the enzyme is doing is not relevant enough to change what's happening in the solution, okay. And that's the typical situation of a non-equilibrium steady state, okay. Okay, so how do we represent this model in terms of the master equation we had before? How does the diagram look now? I have three states: one state is E, another state is ES, and the other state is EP. Okay, so if I do the transitions there, if I go from here to here in this diagram below, it's the equivalent of doing the cycle: first from E I go to ES, then from ES I go to EP, and then I'm back to E, okay.
Now please note that these things here, the +S and the +P, have to do with the solution, okay. E is the system; the +S and +P are related to the reservoir, okay. So of course there's a change in the external reservoir, which again is irrelevant, because I'm somehow keeping these concentrations of S and P fixed; there are just too many of them, so whatever the enzyme is doing is not that relevant. But if I think about this model in terms of a network of states, I just have three states, and if I go from left to right in this scheme here, it means I do the cycle in the clockwise direction, and if I go from right to left, it means I do the cycle in the anti-clockwise direction, okay. So that's the model, and if I want to write this model down... do I still have time? Yes, five minutes. Five minutes, okay, that's enough. Okay, so now all I want to do, and again I'm going to start the next lecture from this example, all I want to do is to write this in terms of the master equation I had before, okay. So I'm going to call this state, state one; I'm going to call this state, state two; and this is state three, okay. Again, if I think in terms of this diagram here, I will have W12 here. I have a question: I guess the arrows should also be in the other direction, for the counterclockwise transitions, right? Sure, sure, I can also draw those. Yeah, both things are possible; again, the clockwise one is going from left to right in both figures, and going from right to left would be anti-clockwise. Thank you. Okay, so I have three states now, right, and if I want to write down my master equation, I will have my vector p(t), which is going to be p1(t), let me go back to black, sorry, p2(t), p3(t). Okay. And my matrix W is just going to be

W = ( -R1  W12  W13 )
    ( W21  -R2  W23 )
    ( W31  W32  -R3 ),

where Ri is the total escape rate out of state i, okay.
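To connect with the matrix just written, here is a minimal numerical sketch of this three-state enzyme cycle. The rate values are hypothetical, chosen by me only for illustration; it solves dp/dt = W p by diagonalizing W and checks the decay from an initial state to the stationary distribution.

```python
import numpy as np

# Three-state enzyme cycle E(1) -> ES(2) -> EP(3) -> E(1), with made-up rates.
# Convention matching the matrix on the board: w_ij is the rate for the jump j -> i.
kf, kb = 1.0, 0.1                      # forward / backward rate of each step
w21, w32, w13 = kf, kf, kf             # clockwise: 1->2, 2->3, 3->1
w12, w23, w31 = kb, kb, kb             # anticlockwise: 2->1, 3->2, 1->3

R1, R2, R3 = w21 + w31, w12 + w32, w13 + w23   # total escape rate of each state
W = np.array([[-R1, w12, w13],
              [w21, -R2, w23],
              [w31, w32, -R3]])        # columns sum to zero: probability conserved

# Solve dp/dt = W p by diagonalizing the generator.
p0 = np.array([1.0, 0.0, 0.0])         # start surely in state E
vals, vecs = np.linalg.eig(W)
coef = np.linalg.solve(vecs, p0)

def p_of_t(t):
    return np.real(vecs @ (coef * np.exp(vals * t)))

p_stat = p_of_t(50.0)                  # long times: the stationary distribution
```

With these symmetric toy rates the stationary distribution is uniform over the three states; the physically interesting asymmetries will come from the reservoir parameters discussed in the next lecture.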
Okay, so the description until now, before this part, was very abstract: I was just telling you what the master equation is, and I made a few very basic observations about it. But if you think about a physical model, this is probably the simplest model you can think of, a three-state model. And again, the physics here is that there is a single enzyme burning ATP, okay: it takes the ATP and transforms it into ADP plus Pi. And if I want to translate this into the language of the master equation, that gives me these W's, which are the transition rates of these chemical reactions. For the three states, again, the state of the system is the state of the enzyme, okay; whatever happens to the reservoir, whether there is a +S or a +P in this reaction, doesn't really matter. That's a property of the reservoir. And if I write a master equation for this, that would be my master equation, okay, that would be a three-state system. That's how the network would look. Drawing these networks becomes very hard, in fact impossible, if there is a large number of states; for this one it's very easy. And, you know, next lecture I'm going to discuss the concept of a current and detailed balance within this model. And a very important question that we are going to discuss is how the W's are related to thermodynamics. So we saw that for detailed balance we had this relation, okay. This relation tells us how the transition rates W are related to energy differences when detailed balance is fulfilled. But for a general non-equilibrium steady state, the question is how to relate these transition rates W to physical parameters, okay, which are related to this chemical reaction here. Okay, that's, you know, again...
In the next lecture, I will discuss the three-state model; I will discuss these quantities, the current and everything, and how the W's are related to physical parameters in a specific physical context. All right, so I will finish here. And then I guess I continue tomorrow, right? If I'm not wrong. Yeah. So there's time for questions. I can just talk like this. Hello, can you hear me? Yes, I can. Okay, so I have a doubt. When you were presenting the concept of detailed balance, I was kind of following everything, but for me it's weird how you... I mean, okay, you define your process in terms of these reaction rates, and then, can you always define a temperature and an energy landscape? For me it's weird, no? You either define your model in terms of rates or in terms of a Hamiltonian, right? I mean, I should have said that, but in stochastic thermodynamics the system is out of equilibrium, okay. The system is out of equilibrium, but your reservoir is always in equilibrium. Okay, so while my enzyme, which is my system, can be out of equilibrium, the reservoir, which has fixed concentrations of S and P, is in equilibrium, meaning that the solution where the S, the P and the enzyme are has a fixed temperature T. And the chemical potential of S would be the chemical potential of ATP, and the chemical potential of the product would be the chemical potential of ADP plus the chemical potential of Pi. Those are all thermodynamic quantities which are fixed, okay. I define my model in terms of the transition rates, but as I told you, I want to discuss how these transition rates are related to physical parameters, or to thermodynamic parameters, and it turns out that these transition rates are related to, for example, the temperature of the reservoir and the chemical potentials of P and S.
So, you know, I mean, the right way to answer your question would maybe be that I define my model in terms of the transition rates, and these transition rates have to fulfill certain relations; they are somehow constrained by the physical parameters of the reservoir. Okay, thank you. Yeah, that answers my question. Which is what I'm going to do next lecture. I want to explain this; it's a very good question, a very important issue. But yeah, that's what I'm going to do next lecture: show you how the transition rates are related to the physical parameters of the reservoir. If you have basic questions about things you didn't understand, maybe... can you read the chat? There's a question online. Okay. Can you read it for me? Yeah. It says: how does one write the detailed balance condition for continuum processes, which do not have a finite number of states? I mean, so, you know, for a Langevin equation, that would mean that your noise fulfills a standard fluctuation-dissipation relation, so I guess that would be the equivalent. But you could, I mean, if you wanted the continuum you could just discretize it. For example, if it's continuous in space, just work with P(x + delta x) and then take the continuum limit. It wouldn't be much different; it would be more or less the same. But if you are thinking in the language of the Langevin equation, the thing similar to being in equilibrium, to detailed balance, would be that the strength of your noise is proportional to k_B T. So that would be the equivalent thing. You don't really have an exact analogue of it, but that would be the equivalent of saying that your system fulfills detailed balance, I guess. It seems there are no more questions. At the beginning you talked about the fluctuation theorem and the thermodynamic uncertainty relation. Yeah. My question is, is there some relation between the two principles?
So, you know, the fluctuation theorem and the thermodynamic uncertainty relation, those two are different. Okay, they are not the same. A relation between them? Not really. I mean, the thermodynamic uncertainty relation and the fluctuation theorem are different relations, in the sense that one cannot be derived from the other. Yeah, they are different relations, I would say. I mean, I would say the relation between both is that they are both universal constraints on fluctuations: fluctuations happen, but they cannot be anything; they must obey certain constraints, and these are, I would say, two different constraints that fluctuations obey. But there is not a relation between them in the sense of one being derived from the other. No. What I mean is, like, in the interpretation: the fluctuation theorem gives you the meaning of negative entropy production, and in the thermodynamic uncertainty relation you have the entropy production bounding the fluctuations. Yeah. So I was asking if there is a conceptual relation between the two bounds. I would say the conceptual relation is that they are both constraints that fluctuations must fulfill, but apart from that, I wouldn't say anything beyond that. So, again, the fluctuation theorem tells you that the probability distribution of entropy production must obey the constraint that P(s) divided by P(-s) is e to the power of s. And then the thermodynamic uncertainty relation tells you that the fluctuations of any current, including the entropy production, are bounded by the rate of entropy production. Yeah, they are different bounds, but the unifying principle between both is that they tell you what fluctuations cannot be, or that fluctuations can be anything as long as they fulfill these constraints.
So they're just two different constraints. Okay. Thank you. All right, thank you. I have a question about the imaginary eigenvalues of the stochastic matrix. Can you hear me? Yeah, I can hear you. A master equation is a linear system, and shouldn't we have a unique stationary state? So if we have some imaginary eigenvalue we will see some oscillations, but for a master equation my intuition is that it should always have a single stationary state. I mean, yeah, having an imaginary eigenvalue and having a unique stationary state, both things can coexist. Okay, that's not a problem. So basically, the stationary state will be the eigenvector associated with the eigenvalue zero. So, you know, you start at some initial state and you decay to the stationary state; that always happens, even if you have an imaginary part. Having an imaginary part just means that there will be some oscillations during that decay time. Okay? So, you know, the figure I had was like a decay, right, from p(0) to p^s: p(t) going to p^s as a function of t. And that still happens, but maybe during this decay there is some oscillatory behavior. Of course, if my system is very big, if I have many degrees of freedom, then this transient phenomenon can be something that takes a very long time, depending on how big my system is. So, depending on the system, this transient can take a long time to happen. So, even if you have an imaginary part, you do go to a steady state. But in the process of going to the steady state there might be some oscillations, or there will be some oscillations if you have an imaginary part. Okay, but the two things are not mutually exclusive. So you can have imaginary eigenvalues.
But still, you are just going to go to a steady state. You're not going to reach a steady state and then oscillate, okay; you just oscillate before reaching the steady state. But the decay before reaching the steady state can take a very, very long time if your system has many degrees of freedom. Okay. All right. For the sake of time we have to stop here, and thank you for the nice lecture. Okay, Andre, and I think tomorrow we keep the two hours; it's just that today was a bit delayed. That's fine. No, that's totally fine. I just didn't know, but that works fine for me. So I'll be back tomorrow, right? Yes, yes. All right. Okay. Thank you. Tomorrow, tomorrow. Yes. There's no change. Okay. So tomorrow, 3:15.