OK, great. So we are already up and running — thanks a lot for being around early in the morning. Lecture number three: thermodynamics of non-equilibrium quantum processes. We are now familiar with the way we have to infer work in this non-equilibrium framework, when the process that uses work is a genuinely quantum-mechanical evolution. What we do now are two different things. In these 50 minutes we will exploit the knowledge we gathered during the first two discussions to assess two different problems. The first one is related to open-system dynamics, and it will be the focus of, hopefully, the next 25 minutes. We go through open-system dynamics, we extend the formalism I introduced for work to heat, and we see how that can help us understand a fundamental principle of physics, Landauer's principle, which was introduced yesterday by Martin Plenio, and try to get a justification for its validity from a genuinely non-equilibrium framework. The second part of today's discussion will focus on irreversibility. I have already had quite a few questions about topics related to irreversibility, and therefore I would like to spend some words on that. So let's have a look. This is the plan for the first half of this presentation. We go and address Landauer's principle for quantum open-system dynamics, and the goal is to provide a non-equilibrium version of Landauer's principle. We want to use the framework for non-equilibrium dynamics and non-equilibrium thermodynamics that we have introduced so far to justify, so to say, Landauer's principle itself. In passing, we will also see how it is possible to actually improve on Landauer's original formulation of his principle. A byproduct of our discussion will be the identification of the role of a very special feature of open-system dynamics, called unitality, in this non-equilibrium framework for thermodynamics. So, we went through work extensively in the previous two discussions. How about heat? How about trying to gather a stochastic description of a process where a system — again, your favourite quantum system — is in contact with its own environment? Differently from what we did so far, I'm not going to allow this system to be kicked by a time-dependent process. Up until yesterday, we explicitly considered the case of a unitary dynamics — no environment whatsoever, except for the preparation of the initial state of the system — and a time-dependent Hamiltonian, a unitary evolution that drove the dynamics of my work medium. In this framework, on the other hand, I throw my time-dependent Hamiltonian in the bin and replace it with the contact, the interaction, with an environment. Now, in this context there have been quite a few studies of this kind — there is a reference missing here, again from the Augsburg group; it will appear later on. You can reformulate the very same framework built for work in terms of a new stochastic variable, this time heat. What these two parties exchange is energy, but in a different, incoherent form: heat. And you can define a probability distribution for heat that resembles very closely what we have seen for work until yesterday. There are quite a few subtleties, quite a few differences, that we need to go through.
The first one is that we are going to shift our attention from the system to the environment. If you remember, the expression for the work probability distribution that I gave up to yesterday was something like this: the transition probabilities, and the initial probability of finding the system in one of its energy eigenstates. The amount of work we were getting was given by the difference between the initial and the final energy eigenvalues. And all of these referred to the system: I performed measurements on the system, and those were the outcomes of measurements performed upon the system. Here, everything that you see in this expression should be referred to the environment. I'm assuming that I have some form of control over the environment itself. The environment is an object with a given Hamiltonian, a Hamiltonian of which I know the eigenstates and the corresponding eigenvalues, and these are the eigenvalues of the Hamiltonian of my environment. So what I'm asking is how much heat the environment exchanges with the system, or vice versa, and what change in energy the environment undergoes due to its coupling with the system. Makes sense. What are these probabilities? They refer precisely to the environment itself. The environment is, for instance, in a thermal state, a Gibbs state. These p_n are the analogue of my p_n^0: they are the probabilities that, at the beginning of my process, when I perform the first measurement on my environment, I find it in the nth eigenstate of its Hamiltonian. After the interaction — after the system has exchanged heat with the environment for a given time — I measure the state of the environment again, and I ask for the probability that this time the environment is found in the mth energy eigenstate, given that at the beginning I found it in the nth energy eigenstate of its Hamiltonian. Does it make sense, guys? Yes? The difference between the corresponding eigenvalues gives you the value of the stochastic variable q that embodies the heat exchanged between the two systems. Please — I'm deaf, I apologize. Oh, yeah. If you want, you can assume you are attaching and detaching. It's fixed in time; there is nothing dependent on time, not a priori. You can put some intrinsic time dependence into the Hamiltonian of the environment, but nothing that depends on the coupling between system and environment itself. OK. So — and the third arrow is redundant — the analogy with the work probability distribution is quite strong, and I don't need to stress it again. This allows us to set the framework for the study of processes where I allow for an explicit exchange of heat between the environment and my working system. That's it.
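Just to keep the notation straight, here is the comparison being made, written out; the symbols are my own shorthand (epsilon_n for the environmental eigenvalues, Z_E for the environmental partition function), not necessarily the ones on the slide. The work distribution comes from a two-point energy measurement on the system; the heat distribution is the same construction with both measurements performed on the environment:

\[
P(W) = \sum_{n,m} p^{0}_{n}\, p^{\tau}_{m|n}\,\delta\!\big[W - \big(E^{\tau}_{m} - E^{0}_{n}\big)\big],
\qquad
P(q) = \sum_{n,m} p^{E}_{n}\, p^{E}_{m|n}\,\delta\!\big[q - \big(\varepsilon_{m} - \varepsilon_{n}\big)\big],
\qquad
p^{E}_{n} = \frac{e^{-\beta \varepsilon_{n}}}{Z_{E}},
\]

with p^E_{m|n} the probability of finding the environment in the eigenstate of energy epsilon_m after the interaction, given that the first measurement found it in the one of energy epsilon_n.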
OK. What is the link between open-system dynamics and fluctuation theorems? Again, this came up already yesterday from a few of your questions. And the answer to the question — what happens to the fluctuation theorems when my dynamics is explicitly open — is that it is largely an open question; we don't know yet in full generality. What we know is that the standard fluctuation theorems, those two expressions we went through yesterday, Jarzynski's identity and the Tasaki–Crooks one, are unaffected as long as my open process is a unital process. Now, let's go through what a unital process is, very briefly. The definition is very simple. Now, my handwriting, you know, is fantastically awful, so if you can't read what I'm writing, just let me know. For a Hamiltonian process, where I have defined a Hamiltonian, I have a time-evolution operator U which, when the Hamiltonian is an explicitly time-dependent object, is the result of the time-ordered exponentiation of my Hamiltonian. This operator T is the time-ordering operator, the Dyson ordering operator; it orders the terms in the expansion of this exponential in time. Any initial state of my system evolves according to these dynamics, and this gives me the state at time tau. If I have an open process — which means the world doesn't end with my system: there is an environment, this environment is coupled to my system, and they exchange energy and, in general, information — then I have to abandon the unitary description of the evolution. I know this is very well known to everyone; just to reiterate a few concepts. What I should do is replace my Hamiltonian time-evolution operator with a general map. So I define a map Phi that acts upon the initial state rho_i of my system. This map is, in general, a time-dependent object that drives the state to the final state rho_tau. Now, this map transforms density matrices into density matrices, so it's a physical map; it preserves the trace of your density matrix, precisely because it transforms density matrices into density matrices. So it's a completely positive, trace-preserving map. And it is unital when, applied to the identity matrix, it leaves that matrix unaffected, OK? So we define an open map as unital whenever it doesn't alter, doesn't modify, the identity matrix. Now, there are easy examples of maps that don't satisfy this condition, and a very straightforward one is amplitude damping. Do you know amplitude damping? Dissipation of energy, right? Suppose that you have a two-level system, like the one Christof Wunderlich explained yesterday: a zero and a one, embodied by the ground and excited states of a qubit, an ion, an atom, anything into which you want to encode information. And suppose that you have prepared the initial state of the system in the excited state. If you take this system and put it in contact with a dissipative bath — a bath that sucks energy from the system — what happens is that after a long, or relatively long, time you will find the system in its ground state. The energy of the system has been absorbed by the environment; I've lost the excitation and gone all the way down to zero. So you can understand immediately that if I prepare an incoherent mixture — if my initial state rho_i is (|1⟩⟨1| + |0⟩⟨0|)/2, properly normalized, so I'm preparing as initial state the identity matrix, makes sense, yeah? — then, through this dissipative channel, whatever is in |1⟩ will disappear.
So I will not find |1⟩ in my evolved density matrix, in my evolved state: the entire density matrix will be ascribed to the zero state. Eventually, after dissipation, I will end up with the state |0⟩⟨0|. So clearly a map such as that — dissipation — does not preserve the identity matrix. It is not unital. Makes sense? Yeah? OK, examples of unital maps: equally easy. Dephasing. Dephasing, on the other hand, is a different open process, and it does the following. Suppose that your initial state is something like this: I'm writing down my density matrix with A and 1 minus A on the diagonal, for normalization, and B and B* as the off-diagonal elements, OK? A and B are related to each other in such a way that the eigenvalues of this object are all non-negative; the trace equal to 1 I am already enforcing. So assume that A and B satisfy all the right conditions to make this a proper density matrix. What dephasing would do is kill the off-diagonal elements, leaving the diagonal ones unaffected. So the state at a time tau will read A and 1 minus A on the diagonal, with zeros here, when tau is very, very long, OK? What this channel does is leave untouched the probabilities of finding the system in its excited or ground state, but it kills the coherence between such states. If this is the action of your channel, it is extremely simple to see that if I now take one half and one half here, and zero and zero here — if I start from a rho_i with this structure, proportional to the identity matrix — then after the action of dephasing I find exactly the same state. So dephasing is unital. So there are pretty physical examples of unitality, or the lack thereof. And under channels such as dephasing, fluctuation theorems are formally unaffected.
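If it helps to see unitality in practice, here is a minimal numerical sketch — my own illustration, not something from the slides — that builds the textbook Kraus operators of amplitude damping and of pure dephasing for a qubit, and checks which of the two maps leaves the maximally mixed state (the identity, up to normalization) untouched.

```python
import numpy as np

def apply_channel(kraus_ops, rho):
    """Apply a CPTP map, given as a list of Kraus operators, to a density matrix."""
    return sum(K @ rho @ K.conj().T for K in kraus_ops)

def is_unital(kraus_ops, dim=2, tol=1e-12):
    """A channel is unital iff sum_l A_l A_l^dagger equals the identity."""
    return np.allclose(sum(K @ K.conj().T for K in kraus_ops), np.eye(dim), atol=tol)

p = 0.3  # decay / dephasing strength; arbitrary illustrative value

# Amplitude damping: the excited state |1> decays towards |0> (dissipation of energy).
amplitude_damping = [np.array([[1, 0], [0, np.sqrt(1 - p)]]),
                     np.array([[0, np.sqrt(p)], [0, 0]])]

# Pure dephasing: diagonal populations untouched, off-diagonal coherences suppressed.
dephasing = [np.sqrt(1 - p) * np.eye(2),
             np.sqrt(p) * np.diag([1.0, -1.0])]

rho_mixed = np.eye(2) / 2  # maximally mixed state, proportional to the identity

for name, channel in [("amplitude damping", amplitude_damping), ("dephasing", dephasing)]:
    print(name, "unital:", is_unital(channel))
    print(apply_channel(channel, rho_mixed).round(3))
```

Running it, dephasing returns the maximally mixed state unchanged (unital), while amplitude damping shifts weight towards the ground state (non-unital) — exactly the two behaviours just described.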
OK, we now dig into the actual Landauer principle. I'm displaying the statement taken from the 1961 paper by Landauer. What he stated is that any logically irreversible processing of information is accompanied — he says it must be accompanied — by a corresponding entropy increase in the system or the environment, OK? So in the part of my device that does not bear the information, this process must be accompanied by a dissipative process: I need to dissipate heat into an environment because I'm processing information. Now, Landauer was basically an engineer — I mean, a physicist, but he was very much interested in engineering processes. He was interested in understanding the ultimate limitations imposed by physics on the processing of information. And the statement, the principle, is fundamental, but it is a principle: as far as I know, there is no proof of Landauer's principle. A cartoon, as usual, allows us to understand in a better way what the words entail. Suppose that this polygon here is my system, and that I have encoded information in the colour and the shape of this object. So the information I want to process is shape and colour. I now process such information — let me finish, and then you ask me your question. After the interaction with the environment, the colour has been lost, has been changed. So I have implemented the manipulation-of-information step that Landauer envisaged. What Landauer states is that this change in entropy — the entropy of the system has changed, because the information content of the state of the system has changed — must be accompanied by a dissipative process, by some heat that gets dissipated into the environment. And this heat is lower bounded by the amount of information change, quantified by an entropic measure of the change in the state of the system. This is Landauer's principle, really in its bare formulation. Its verification, in my opinion, is a beautiful experiment, a fantastic experiment: I'm referring to an experiment published two or three years ago in Nature, where, using a double well, they assessed Landauer's principle, reaching the lowest possible exchange of heat that the system can achieve. To me, that is a test of Landauer, but it is not a proof of its validity: God knows if there is any other process that will go below that, right? So, to me, there is a strong motivation to look into Landauer's principle and try to make it emerge from microscopic principles. This statement was given a long time ago, has been tested and retested, used and reused, assessed in many ways. But can we make it emerge from fundamental considerations, from a microscopic perspective? I'm not the first person to ask the question; other people did. Esposito did it a few years ago, getting very close to a microscopic derivation of the principle itself. And among the various attempts at proving Landauer microscopically, I really like the approach by these two guys, Reeb and Wolf, in 2014. I really liked it because the framework they set up is very natural, a minimalistic setting that fits perfectly with the way we understand quantum mechanics and the way we describe open-system dynamics. OK, let's go through that before digging into the non-equilibrium formulation of it. What do we have? We have an initial state of my system, rho_S; this is the guy whose information content I want to manipulate. And I have my environment — the picture is taken directly from that paper, so they call it R, for reservoir — and the reservoir is initially in a thermal state at a given inverse temperature beta. The dimensions of the two parties are different: the system lives in a Hilbert space of dimension d_S, the reservoir in a Hilbert space of dimension d_R. They assume an initially factorized state of system and environment, so no correlation whatsoever between the two. They let them interact unitarily, so there is a microscopic process that connects system and environment. This gives you a joint state, rho', of system and environment — in general a correlated state, maybe even an entangled one, depending on the dynamics — from which I can extract the reduced state of the system, rho_S', and the reduced state of the reservoir, rho_R'. And then they proceed to define quantities that are pretty natural. On one hand, there is the change in entropy, or minus the change in entropy, between the initial state of the system and its final one; this is the information content that changes due to the interaction with the environment. There is an analogous quantity, the change in entropy between the final state of the environment and its initial state.
And then there is the average heat that is exchanged between system and environment in light of their coupling, their interaction. Does it make sense? This delta Q is the change in energy of the environment: as I said, in this framework we should look at the environment when we want to quantify the amount of heat exchanged with the system. So this is the change in energy of the environment after versus before the interaction. Makes sense? If the energy of the environment has changed, that is due to the interaction with the system. Makes sense? Yeah? So I have all the ingredients to assess Landauer. And what they get are two results, two very nice results. The first one is this; it's fundamental. It is an attempt at somehow justifying Landauer, and also at showing that there is life beyond Landauer's original statement. What they show is that the average dissipated heat — the delta Q defined here, in units of beta — is given exactly by delta S, the change in entropy of the state of the system (fair enough, this was in Landauer itself), plus this object I, where I is the mutual information. Did Professor Plenio go through that? Yes, so you know what mutual information is. In very layman's terms, it accounts for the total correlations between system and environment, both classical and quantum, with no distinction whatsoever; it gives you a measure of how correlated system and environment are after their mutual interaction. Then there is a third term, which is a relative entropy: the relative entropy between the state of the environment after its interaction with the system and the state of the environment before the interaction. Now, are we familiar with relative entropy? Can I see hands up? Good, great, fantastic, so we know everything here. Relative entropy is a non-negative quantity. Mutual information is a non-negative quantity. And the change in entropy is a non-negative quantity in this context. So, forgetting about the delta S, these other two terms are non-negative, so certainly the left-hand side is larger than or equal to delta S alone. If I take this left-hand side and this right-hand side, I have Landauer. But what they are showing you is that from this minimalistic framework, perfectly adherent to quantum mechanics, they can justify the emergence of Landauer's principle from microscopic, elementary considerations, without having to specify the form of the interaction between system and environment, and only assuming an initially factorized state of system and environment. I know some of you might consider that a very strong assumption — it is a strong assumption — but, as someone once told me, it gives you the smell of things; it gives you the idea of how things work in practice.
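For reference, the equality just described — the first Reeb–Wolf result — can be written compactly as follows; the notation is mine (primes for post-interaction states), but the three terms are exactly the ones just listed:

\[
\beta \langle Q \rangle \;=\; \Delta S \;+\; I(S'\!:\!E') \;+\; D\!\left(\rho'_{E}\,\middle\|\,\rho_{E}\right),
\qquad
\Delta S \equiv S(\rho_{S}) - S(\rho'_{S}),
\]

where S is the von Neumann entropy, I(S':E') the mutual information of the joint state after the interaction, D the quantum relative entropy, and rho_E the initial thermal state of the environment. Since the last two terms are non-negative, beta times the average heat is at least delta S, which is Landauer's bound.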
So this is the first result by Reeb and Wolf. There is a second one, which is equally nice. You have a question? Sure. Yeah, you are right — are we counting things twice? I'm not sure that we are counting twice. What is true is that the way you write things down is not unique: in order to let I and D emerge, they have to add and subtract terms in a smart way. But they are not counting things twice. Yes, you are right that I contains the change in entropy of the system; this is true. But I think this way of writing things highlights two points that are extremely important when you deal with open-system dynamics. One is what happens to the state of the environment. We are always concerned with what happens to the state of the system, but what happens to the state of the environment is equally informative and equally important. It is actually a fundamental building block of measures of non-Markovianity that have been proposed a few years ago, in particular the one by Breuer, Laine, and Piilo. One reason for which non-Markovianity emerges in an open-system dynamics is that you might change the state of the environment. And there you go. No — only at the beginning. There is no assumption that the state stays thermal; in fact the state changes, otherwise this distance would be zero and I would not need that term in the expression. Sure. The temperature of the environment is set at the start; then the state of the environment can change. If you want, there is a second environment with which your environment has been in touch: it thermalized with that, you detach it, and from that point on this is the whole world around your system. So, if you're OK with that, can I go ahead with the second result? The second result is a lot more hands-on. They started playing around with these expressions, writing them out explicitly, and what they found is that you can rewrite this quantity in a way that depends formally on the dimension of your environment — d here is the dimension of your environment, you see it's there. So what they have as a second result is that the new bound they find for Landauer depends explicitly on the dimension of your environment, and it is tighter than delta S in general. So not only can you formulate a sort of justification of Landauer's statement from semi-first principles, you can also aim at improving it, and hope to get a better, tighter bound than the original one formulated by Landauer himself. It depends on the dimension: if the dimension of the environment explodes, grows to infinity — so you go towards the paradigm of a proper infinite-dimensional environment, like the one we usually consider in Markovian dynamics — then what remains is only delta S, and you retrieve Landauer. So the original formulation of Landauer's principle is somehow assuming, without telling you, an asymptotically infinite dimension of your environment. That's log squared, yes, log squared. All the details are in this long but very nicely written paper. And this is the formulation by these two guys, the attempt at justifying Landauer from this microscopic, or semi-microscopic, picture. It is not all I want to show you, though. I want to show you that the non-equilibrium framework that we went through — so, all the pain you have been through until now — is worth it when you aim at justifying Landauer itself; well, the statement, not Landauer himself: Landauer doesn't need any justification. So let's try and see if we can make use of what we learned to understand not only how Landauer emerges, but whether a tighter bound can really exist. Let's take again a formulation that is very close to the one by Reeb and Wolf: you have your system, you have your environment, and they interact through a unitary evolution. Let's set the context in these terms.
So I have a Hamiltonian of my total system-plus-environment device: the Hamiltonian of my system, the Hamiltonian of the environment, and the Hamiltonian that generates the interaction between system and environment — the coupling term. As Reeb and Wolf did, let's assume that initially system and environment are completely uncorrelated, and let's assume that the environment is initially prepared in a thermal state at some inverse temperature, which again works as a reference, nothing else but a reference, at the beginning. OK, now I want to retrieve explicitly the state of the system after the interaction with the environment. So this is my initial state, this factorized state, and U is the time-evolution operator joining together system and environment. Then I'm interested in the state of the environment itself, the state of the environment after the interaction, so what I do is trace over the system. Makes sense? Now, up until the trace — up until a step before the trace — everything is unitary. So far, so good. But the moment I take the trace over the system, I lose unitarity: the effective dynamics of my environment won't be unitary anymore; its purity will change, right? And the description of the effective dynamics that this environment undergoes is through one of those maps, a map that can be decomposed in terms of Kraus operators. People know what Kraus operators are, yeah? OK, I see nodding. Kraus operators are, in general, non-unitary operators — these guys, A_l and A_l dagger — and I need a set of them to account for the open evolution of a given system, of a given information carrier. If the set contains more than one element, the dynamics is for sure non-unitary. And there is one property — where did I leave the chalk? — one property that these Kraus operators have to satisfy: a closure relation, or completeness relation, which says that the sum over l of A_l dagger A_l equals the identity, OK? This must be the case: if you want to work with proper Kraus operators, this should be guaranteed. And this is the definition for this specific case; I wrote down my closure relation explicitly.
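As a concrete illustration of where such Kraus operators come from — a sketch under my own conventions, with a random joint unitary standing in for whatever microscopic coupling is on the slide — if the system starts in a diagonal state with populations q_k, one valid set of Kraus operators for the reduced dynamics of the environment is A_{jk} = sqrt(q_k) ⟨j|_S U |k⟩_S, and the closure relation can be checked numerically:

```python
import numpy as np
from scipy.stats import unitary_group

d_S, d_E = 2, 3                    # dimensions of system and environment (arbitrary)
U = unitary_group.rvs(d_S * d_E)   # random joint unitary, ordering H_S (x) H_E
q = np.array([0.7, 0.3])           # diagonal populations of the system's initial state

def system_block(U, j, k):
    """The d_E x d_E block <j|_S U |k>_S of the joint unitary."""
    return U[j * d_E:(j + 1) * d_E, k * d_E:(k + 1) * d_E]

# Kraus operators A_{jk} = sqrt(q_k) <j|_S U |k>_S acting on the environment alone
kraus = [np.sqrt(q[k]) * system_block(U, j, k) for j in range(d_S) for k in range(d_S)]

# Closure (trace preservation): sum_l A_l^dagger A_l = identity on the environment
print(np.allclose(sum(A.conj().T @ A for A in kraus), np.eye(d_E)))   # True

# Unitality would instead require sum_l A_l A_l^dagger = identity,
# which a generic interaction does not satisfy:
print(np.allclose(sum(A @ A.conj().T for A in kraus), np.eye(d_E)))   # generally False
```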
OK, now what happens at this stage? Well, I'm coupling my system directly to this environment, and I'm interested in heat, because I want to assess Landauer. So why not use what I learned five or six slides ago? Why not use the probability distribution for heat — given that heat is a stochastic variable — which, in this non-equilibrium framework, we now know should be adopted when assessing the thermodynamics of open quantum systems? Yeah, let's make use of that: the probability distribution. I want to use this probability distribution explicitly, so I want to write it in terms of these Kraus operators. And the form that the probability distribution takes is this one: the rho_E,nn are the Gibbs probabilities of my initial environmental state. What did we say? We said that the environmental state, rho_E, is a thermal state at some inverse temperature beta. Yeah? So the probability that I find the environment in its nth eigenstate before it interacts with the system will simply be the corresponding diagonal element of this density matrix, in the energy eigenbasis. So this probability is this stuff here. On the other hand, the probabilities of transition from the nth energy eigenstate of the environment to the mth energy eigenstate of the environment are regulated by the Kraus operators that I have introduced: these guys replace the unitary time-evolution operator U in giving me the transition rates. So this is the explicit form that my probability distribution for heat exchange takes when I assume the framework presented in the previous slide. And now I get curious. Just like what we did yesterday with Jarzynski, where we assessed what happens to the expectation value of e to the minus beta W — the work that I do on the system, or that the system does for me — I now get curious and calculate the expectation value of e to the minus beta q, the new stochastic variable, the heat-related one. So I plug in this expression, I play a bit with the algebra along the same lines as yesterday's calculation, and what matters is the following: the expectation value of e to the minus beta q is given by the trace of the initial density matrix of my environment times this operator A. And this is so important that I want to stress it: the operator A is not, in general, the identity. It is the sum over l of A_l A_l dagger — it is not my closure relation. Makes sense? Yeah? What does that mean? Well, whenever A is the identity — and this can happen, perfectly fine — the map is unital. This condition is the unitality statement in terms of Kraus operators. We can gather a lot of information from this result. The first thing we gather is that there is a link between the degree of non-unitality of a process and the statistics of the exchange of heat in an open-system dynamics: the value this quantity takes depends strongly on how non-unital your process is. Assume that you have a unital process, so that this boldface A is the identity. Then what you have on the right-hand side is the trace of the identity times rho_E, which is one — a statement, a fluctuation-theorem-like statement, along the lines of what we saw yesterday. So, for unital processes — if A is the identity — the expectation value of e to the minus beta q is equal to 1. Makes sense? It's just what we get from this expression. Then you use something called Jensen's inequality, which tells you that the expectation value of e to the x is larger than or equal to e to the expectation value of x. Use Jensen's inequality here: replace the expectation value of e to the minus beta q with e to the minus beta times the expectation value of q, and what you derive immediately is that the average heat exchanged can only be non-negative. So unital processes are associated with a non-negative average flow of heat from the system to the environment. But that, on its own, means very little: it just means you don't have the reverse process.
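Written out, the little calculation behind this statement goes as follows (my notation: epsilon_n are the environmental eigenvalues, Z_E the environmental partition function, A_l the Kraus operators just introduced):

\[
\left\langle e^{-\beta q}\right\rangle
= \sum_{n,m,l} \frac{e^{-\beta \varepsilon_{n}}}{Z_{E}}
\left|\langle \varepsilon_{m}| A_{l} |\varepsilon_{n}\rangle\right|^{2}
e^{-\beta(\varepsilon_{m} - \varepsilon_{n})}
= \sum_{m,l} \frac{e^{-\beta \varepsilon_{m}}}{Z_{E}}
\langle \varepsilon_{m}| A_{l} A_{l}^{\dagger} |\varepsilon_{m}\rangle
= \mathrm{Tr}\!\left[\rho_{E}\,\mathbf{A}\right],
\qquad
\mathbf{A} \equiv \sum_{l} A_{l} A_{l}^{\dagger}.
\]

If the map is unital, A is the identity and the expectation value equals 1; Jensen's inequality, ⟨e^{-βq}⟩ ≥ e^{-β⟨q⟩}, then gives ⟨q⟩ ≥ 0.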
What is important is what happens when the dynamics is non-unital — when the map goes beyond the condition for unitality on the blackboard. When the process is non-unital, using Jensen's inequality again, I have a new bound on the amount of heat that can be exchanged, a bound that depends explicitly on the degree of non-unitality of my map. And this new bound is what the non-equilibrium framework for thermodynamics tells you you should adopt in a Landauer-like approach. OK, so this is the bound that you should compare with Landauer's, that you should use for comparison with Landauer's. We have already seen, through the Reeb and Wolf approach, that tighter bounds than Landauer's can exist. So the question now becomes: is this bound tighter than Reeb and Wolf's one? Where is the chalk? I keep on losing it. So, on one end, we have beta times the expectation value of Q larger than or equal to delta S — this is Landauer's original formulation. Reeb and Wolf are somewhere here, yeah? They say: it depends on the dimension of my environment, and I have a somewhat tighter bound; I can sit between this quantity and that quantity and be tighter than Landauer. So, Reeb and Wolf. Then we have this non-equilibrium formulation of Landauer's principle — this justification of the validity of Landauer's principle — that provides a new bound. Where does it sit? Are we here? Is this the ordering, or shall I swap them? Is Reeb and Wolf tighter than the non-equilibrium bound? Well, providing an answer turns out to be extremely difficult; it is very much a case-dependent answer. We went through more than a couple — quite an extensive set — of tests, and we always found the non-equilibrium bound to be tighter than Reeb and Wolf's one, especially at low temperatures. At low temperature of the environment, the non-equilibrium approach to Landauer's principle almost always beats what the equilibrium, but quantum-mechanical, approach to Landauer would tell you.
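In symbols — again my shorthand, with the operator A as defined above — the non-unital version of the same Jensen step reads:

\[
\beta \langle Q \rangle \;\geq\; -\ln \mathrm{Tr}\!\left[\rho_{E}\,\mathbf{A}\right],
\qquad
\mathbf{A} = \sum_{l} A_{l} A_{l}^{\dagger},
\]

which reduces to a non-negative average heat for a unital map, and otherwise quantifies how the departure of A from the identity — the degree of non-unitality — shifts the minimum dissipated heat. The case-by-case question above is whether this right-hand side sits above the delta S of Landauer and the finite-dimension correction of Reeb and Wolf.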
For these reasons I'm not going through the set of examples. So, if the chairman can confirm — I think my time is up, two minutes. So yeah, I want to use the two minutes simply to reiterate a couple of points that came up through the discussions after the lectures and in the afternoons. This is a very, very new framework, OK? The attempt at putting together the information-theoretical approach and non-equilibrium thermodynamics is something relatively new. It's a field that is growing; there are more open questions than satisfactory answers provided. There is room for fun, room for understanding what happens fundamentally at the level of quantum dynamics from a very general standpoint, which is the one provided by thermodynamics. So if you have a penchant for the foundations of quantum mechanics, and, as I do, you believe that probably the most general framework in physics is the one provided by thermodynamics, have a go at putting the two together: start working in this area. It's an extremely rewarding field. You have fun, you understand a little bit of physics in a new way, and you try to close the gap we highlighted at the beginning, no? On one side, you have the possibility to build machines of a quantum-mechanical nature — you want to build this framework, this paradigm, for quantum technologies. But we already have a very efficient theoretical framework for understanding how such engines, such machines, work, which is thermodynamics. So it's only natural to bridge the gap between the two and try to get advantages from each. If anything, I hope these three lectures have put in you the curiosity to explore this area. And if you have any question, any doubt about what I've presented or what you are going to read, hopefully, feel free to write me an email; I'm more than happy to discuss. With that, I'm done, and I thank the organizers for giving me the opportunity to talk about this stuff, and you for staying with me for three hours. Thank you so much, guys.