It is a special lecture also in the sense that there is very little about scaling; the word scaling will nevertheless appear and play a role, as you will see. Anyway, the subject is the foundations of thermodynamics, and this is joint work with Elliott Lieb, done mostly more than 20 years ago. You see some references, and I am very happy that it still seems to be in demand; people seem to like it, or some do not, but in any case it is still under discussion. Okay, so let me start with a general praise of thermodynamics by Albert Einstein. You can read it yourself, so I don't really have to read it for you: "A theory is the more impressive the greater the simplicity of its premises, the more different kinds of things it relates, and the more extended its area of applicability. Hence the deep impression that classical thermodynamics made upon me. It is the only physical theory of universal content concerning which I am convinced that, within the framework of the applicability of its basic concepts, it will never be overthrown." Now, for the purpose of this talk, I can summarize the core of thermodynamics in the following statement: we want to understand the basic empirical facts. (I see that one half of the screen is not completely visible here; well, we will have to live with that.) Anyway, the basic task of thermodynamics is to understand, and to separate in a quantitative way, the possible from the impossible under adiabatic conditions. One of the first things I have to do is to define precisely, operationally, what I mean by this phrase "under adiabatic conditions". And a little note: for macroscopic systems, where scaling limits so to say come into play, this distinction between the possible and the impossible is unambiguous.
For smaller systems probability might come into play, and it does come into play, but that is not our concern here. So, if you will, we are talking mostly about systems in the scaling limit; at the end we will talk about something a little more general. Now here is my operational definition of the concept of "possible under adiabatic conditions". I will denote states of my system by capital letters, and I say that a state Y is adiabatically accessible from a state X, written X ≺ Y. This is the basic relation that will play a major role here. I think the notation is quite suggestive: it means, so to say, that things get more chaotic as you go from left to right. When I have to talk about the relation, I sometimes say that X precedes Y. The operational definition is that it is possible to change the state X into the state Y in such a way that the only net effect on the surroundings is that a weight may have risen or fallen. Of course this weight is just an example of a source and sink of mechanical energy; in a spacecraft you would perhaps use a spring or a flywheel or whatever. Anyway, the important point I want to stress here is that these processes do not have to be gentle; in fact, they can be arbitrarily violent. It is important to stress this because the word "adiabatic" is sometimes, and in fact often, used in physics for something slow in time. That is not what is meant here; the operational definition is exactly what is written there. Now, the second law of thermodynamics says that, for equilibrium states of macroscopic systems at least, one can characterize this relation of adiabatic accessibility by the increase, or rather non-decrease, of an essentially unique state function, called entropy and denoted by S, which is extensive and also additive on subsystems.
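As a purely illustrative aside (not part of the formal development, where the relation ≺ is defined operationally and entropy only comes later), one can sketch this characterization in code for a monatomic ideal gas, whose standard entropy is known, and check that free expansion is "possible" while its reverse is not:

```python
import math

# Toy model: equilibrium states of a monatomic ideal gas, X = (U, V, N),
# with the standard entropy up to additive constants (units with k_B = 1).
def entropy(U, V, N):
    return N * (1.5 * math.log(U / N) + math.log(V / N))

# Illustrative stand-in for adiabatic accessibility X ≺ Y:
# same amount of matter and non-decreasing entropy.
def accessible(X, Y):
    (U1, V1, N1), (U2, V2, N2) = X, Y
    return N1 == N2 and entropy(U2, V2, N2) >= entropy(U1, V1, N1)

X = (100.0, 1.0, 1.0)    # one unit of matter, energy 100, volume 1
Y = (100.0, 2.0, 1.0)    # same energy, doubled volume (free expansion)

print(accessible(X, Y))  # True: free expansion is possible
print(accessible(Y, X))  # False: the reverse would decrease the entropy
```

The point of the sketch is only the asymmetry of the relation; in the talk this asymmetry is the primitive notion, and the entropy function is what has to be constructed from it.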
A historical note: the word entropy was coined by Rudolf Clausius in the mid-19th century, from what he had been calling the "transformational content" of a body; so already there he was thinking of transformations. Now, here is a caricature of this, from a popular version of our paper which appeared in Physics Today in 2000. This drawing was done by my daughter, more than 20 years ago now; she was around 15 at the time, I guess. Anyway, here is a state X being transformed adiabatically into a state Y, and the source of work here is this weight. But there may also be machinery, and there may also be a clever, intelligent being, like a gorilla, because gorillas can be quite clever. The gorilla is doing its best to work on the system, and it may use part of the machinery to help it. But in the end the gorilla is in the same state as before; to be exactly in the same state, you might have to feed it a couple of bananas to compensate. Anyway, this gorilla is also there to stress that these transformations do not have to be gentle. Why a gorilla? Well, it so happened that when we were writing our paper there were commercials on American TV for indestructible luggage, with a gorilla jumping up and down on a suitcase on a conveyor belt; that is the source of this. Now, the uniqueness of entropy is very important, because it means that all methods of defining entropy for equilibrium states lead to the same result, provided the basic requirements are fulfilled: that the entropy characterizes adiabatic accessibility and that it is additive and extensive. In particular, the famous Boltzmann formula: if one could prove that it really does what is required of entropy, then it is the same entropy. It is always the same entropy. For equilibrium states, I must certainly emphasize this; it is not some sort of idiosyncratic entropy we are defining here.
If you, by some means, have a function which does what it is intended to do, then it is always the same function. It is a little like the story of the blind men and the elephant. You may remember it: the blind men are investigating an elephant, and one of them pats it on the side, another investigates the trunk, another the tail, maybe. So they have different ways of researching the elephant, but it is always the same elephant. It is like that with the entropy; the uniqueness ensures that. The additivity and extensivity are likewise very essential. First, they guarantee the essential uniqueness. And secondly, they greatly simplify the experimental and theoretical determination of entropy. For instance, in order to predict the efficiency of a geothermal power plant, let's say, it suffices to know the properties of one kilogram of water, which you can find in so-called steam tables. I worked with those quite a lot when I was teaching mechanical engineering students; that is great fun. Usually physicists don't learn about that, they just learn about ideal gases, but there is also real stuff out there. Now, the main message of the talk is that there is a simple and direct approach to the existence and uniqueness of entropy for thermodynamic equilibrium states of macroscopic systems, based only on properties of the relation of adiabatic accessibility. In particular, neither heat, temperature, thermal reservoirs, nor assumptions about the microscopic structure of matter, statistical mechanics, or probability are needed for the definition of entropy. Microscopic models may of course be extremely important for determining the entropy of specific systems, but for the definition of the concept they are not needed. And that is the message of the talk. Now, the required properties are necessary and sufficient for the existence of entropy.
And they are all very plausible, except one: the adiabatic comparability of states, which I will make precise in a minute. Now, in the case of equilibrium states this comparability can be derived from some further physical assumptions, but that requires substantially more work mathematically. In fact there is a simple part of our paper and a much more complicated part, and today I am only talking about the simple part; in most of my talks on this I talk only about the simple part, just as a side remark. For non-equilibrium states, on the other hand, and we only seriously started to think about those not so long ago, the conclusion was that an entropy characterizing the relation exists if and only if comparability holds. And to prove comparability in the non-equilibrium situation is much more difficult, and in our opinion in general even impossible, because it holds if and only if every state is adiabatically equivalent to an equilibrium state, which is highly implausible far from equilibrium. Now, yes? [Question from the audience: "adiabatically equivalent" is not the usual physical intuition, but...?] Well, I will make precise this list of concepts I have been using. Two states are adiabatically equivalent if you can go from X to Y and you can go from Y to X; then they are equivalent. Now, the lack of comparability can also lead to non-uniqueness of the entropy for non-scalable or mesoscopic systems. (In fact, where the slide says "extremal entropy" it should be "extensive entropy".) But there entropies can be defined by means of entropy meters; that is a minor point I will mention at the end if I have time for it. Now, here are the main references on this. There is the basic paper from 1999.
It contains a long historical introduction, from which you will see that this way of thinking about entropy, which is very different from what is usually done in textbooks, has many precursors; there are many names one could mention, going back to Landsberg, Falk and Jung, Buchdahl, Robin Giles, and so on. There is a long historical introduction in that paper, but I am not going into detail about it. Then there is the popularized version in Physics Today, from which the picture was taken. And finally there are two papers published in the Proceedings of the Royal Society A, in 2013 and 2014, where we present some thoughts on non-equilibrium entropy and also on the entropy of non-extensive systems. So these are the main references; there are also other versions of this in various proceedings. Now, the basic concepts for equilibrium thermodynamics: I am going to list them, and then I will come to the axioms. Listing them means that I simply give them names, and you have to have some intuition for what they mean. This is, so to say, an axiomatic approach, with similarities to the axiomatic approach to geometry. We have symbols, we give them names, and there are rules for handling these symbols in mathematical arguments. It is like what Hilbert said about the axiomatization of geometry: the Euclidean axioms are true however you interpret the symbols; you could say a point is a table and a line is a chair, or whatever, and if the rules connecting these concepts are fulfilled, then that is all right. It is in this sense that these symbols and the operations with them are to be understood; the rules for working with them, so to say, are what really defines them for practical purposes. And you have, of course, some intuition for what is meant by a thermodynamic system: a lump of matter, maybe in some container, or whatever.
Systems can be simple, which is a technical concept I will describe in a little more detail later, or compound. Compound means that I have several lumps of matter. I am also able to cut them in half, or into one fourth and three fourths, or to take an arbitrary fraction of them, and put these pieces side by side on my table. So then I have different state spaces, and when I look at several systems together I simply use the notation of a Cartesian product of pairs or tuples. Then there is the concept of a scaled copy, which intuitively simply means: if I take one liter of water here, and now I take two liters of water at the same temperature and pressure, then that would be scaling by two; and I can scale by an arbitrary real parameter. The corresponding state spaces are then denoted by the scaling parameter times the state space. In a concrete realization of the state spaces as subsets of R^n this will of course have a definite meaning, but here, just as with the abstract axiomatization of geometry, the symbols and the operations acquire meaning through the axioms. Now come the conditions on the relation, and these are in fact all completely obvious if you think of the interpretation I have given: X precedes Y with respect to the relation if I can go from X to Y by means of some gorilla, machinery, and the weight, and leave everything else unchanged. So there is transitivity. There is consistency: if a system consists of two parts, and I can do something with this part and something with that part, then I have done something with the compound system. And then, very important, there is scaling invariance: I assume that if I can do something with one kilo, then I can do the same with half a kilo in the same internal state, so to say, or with scaling by an arbitrary parameter. So the scaling invariance is important.
That is what comes in here and really makes the formulas we write down in the end rather simple. Then splitting and recombination: if I have some lump of matter, I split it into fractions λ and 1 − λ, take the parts apart, and look at them as a compound system. That is an adiabatic operation, and I can also put them together again, which is again adiabatic; so from this splitting point of view X is adiabatically equivalent to the compound state ((1 − λ)X, λX). And finally there is a certain stability requirement, which replaces all considerations of topology. It says that if I can transform a state X into a state Y with the help of an arbitrarily small amount of another system, then I may just as well say that I can go from X to Y. One could do away with this by simply defining the relation that way, but this is, so to say, what replaces any topological considerations as long as we are not introducing concrete state spaces for these systems. Here is some of the notation I have already used: I say that two states are comparable if I can go either from X to Y or from Y to X. An example of states which are certainly not comparable is one kilo of gold and one kilo of lead, for instance; also half a kilo of water and two kilos of water are not comparable. The states are adiabatically equivalent if both conditions hold, and if I can go one way but not back, I say that X strongly precedes Y and write X ≺≺ Y. And here is some further notation, because it appears in the statement of the second law: I say that two compound states of scaled copies, (λ1 X1, λ2 X2, ...) and another state formed in the same way, at least in the same state space, have the same mass if the sums of the coefficients are the same. This is just notation, in order to be able to state the second law in a neat way.
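Again as a purely illustrative aside, one can check in the same toy ideal-gas model (the standard monatomic entropy, with k_B = 1; this is my illustration, not part of the axiomatic framework) that the extensivity behind scaling invariance and the splitting-and-recombination equivalence come out as identities:

```python
import math

# Same illustrative monatomic-ideal-gas entropy as before (k_B = 1):
def entropy(U, V, N):
    return N * (1.5 * math.log(U / N) + math.log(V / N))

# Scaling a state: t*X = (t*U, t*V, t*N).
def scale(t, X):
    U, V, N = X
    return (t * U, t * V, t * N)

X = (100.0, 3.0, 2.0)
lam = 0.25

# Extensivity (behind the scaling axiom): S(tX) = t * S(X).
assert math.isclose(entropy(*scale(lam, X)), lam * entropy(*X))

# Splitting and recombination: the compound state ((1-λ)X, λX)
# has the same total entropy as X, consistent with the two states
# being adiabatically equivalent.
total = entropy(*scale(1 - lam, X)) + entropy(*scale(lam, X))
assert math.isclose(total, entropy(*X))
print("scaling and splitting identities hold in the toy model")
```

In the talk, of course, the logic runs the other way: these properties are postulated for the relation itself, and entropy, with exactly these identities, is what the theorem then produces.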
Now, these conditions A1 to A6: I hope you agree with me that they are all very plausible if you interpret the relation in the way I have done, and they are clearly necessary. However, they are not sufficient for the existence of an entropy that characterizes the relation on compound systems made of scaled copies of Γ. A further property is needed, which we call the comparison property. We could of course formulate it in a more general or more stringent way, but this is sufficient for the purpose of this lecture: the condition is that any two states in (1 − λ)Γ × λΓ are comparable, for all λ between 0 and 1. So if I take the system, split it, and then do something here and something there, the result is comparable with any other state I have obtained in the same way, possibly with a different λ. Now comes the following statement: the second law again, so to say, but now in a more formal setting, namely as a theorem; I am, in a sense, proving the second law in the form in which I stated it. The following conditions are equivalent. First, the relation satisfies the assumptions A1 to A6, which are obvious, and the comparison property. Secondly, there is an additive and extensive function S, defined on all compounds of scaled copies of Γ, such that whenever X and Y have the same mass, X ≺ Y if and only if S(X) ≤ S(Y). Moreover, the function is unique up to an affine change of scale: I can change a multiplicative constant and I can also change by an additive constant. Now I can even give a sketch of the proof; this is not very deep mathematics. We pick two reference points X0 ≺≺ X1, and let X be an arbitrary state which lies between them (I can do this more generally; I would simply have to shift the reference points). If I have an entropy function, then we know, because X0 strongly precedes
X1, that S(X0) < S(X1); and since X lies between the reference points, these numbers must be ordered: S(X0) ≤ S(X) ≤ S(X1). This means there is a unique λ between 0 and 1 such that I can write S(X) as the combination (1 − λ)S(X0) + λS(X1). But by the required properties of entropy, this is equivalent to X being adiabatically equivalent to the combination ((1 − λ)X0, λX1). If I have another entropy function, then the same holds, but with λ replaced by some λ′; from the assumptions A1 to A6, however, it follows that this can hold for at most one λ, that is to say, λ must equal λ′. This is the proof of uniqueness. The proof of existence is also not very complicated. From the assumptions and the comparison property we conclude that, if X0, X, and X1 stand in the relation stated there, the following two numbers are equal: the supremum of all λ such that ((1 − λ)X0, λX1) ≺ X, and the infimum of all λ with the relation the other way around, X ≺ ((1 − λ)X0, λX1). Moreover, if the supremum and the infimum are attained, these numbers must coincide, and I do have the adiabatic equivalence; so I can conclude what I wanted, namely that this λ(X) will be the entropy of X with respect to the reference points X0 and X1. But note that the comparability of all states in (1 − λ)Γ × λΓ, not only those in Γ, is essential for this. Now we can choose, if you like, the entropy of one of these reference points equal to 0 and the other equal to 1, and then we have this explicit formula for the entropy: S(X) is the supremum of all numbers λ′ such that ((1 − λ′)X0, λ′X1) ≺ X, and it is the same as the infimum for the relation the other way around. You see that this formula for the entropy uses only the relation, nothing else; it is a sort of interpolation between the two numbers, the entropy of X0, which I by definition can take
equal to 0, and that of X1, which I take equal to 1. These choices, so to say, fix the two free parameters I have, and here is an explicit formula. I do not necessarily have to take 0 and 1; I could take some a and some b, but that simply shifts the function by an additive constant and changes the multiplicative constant. This is shown here schematically: here is the X whose entropy I want to define, here are the reference points, and I consider a process which goes from ((1 − λ)X0, λX1) to X, and one going back. S(X) is the supremum of all the λ′ for which the first is possible, and that is the same as the infimum of all the λ″ for which the second is possible. Now, you can imagine that this is not a very deep theorem, but it is not completely obvious either, and the proof really uses all the axioms we have written down. Here is also a caricature of this, which I will maybe partly skip. I am here determining the entropy of one kilogram of water. Here is the machinery; the gorilla is missing, but it could be around. And I compare with fixed reference states, which I simply call steam and ice; there need not be any phase transition involved here, of course, these are just references. The steam and the ice are the two reference states, and if this is possible, then there is a unique λ for which this picture is possible, and that λ is the entropy of the kilogram of water. Now, the special role of this comparison property is not at all obvious. I mentioned at the beginning that there is quite some literature of precursors of this work; I mentioned names like Falk and Jung, Landsberg, Buchdahl, Giles, which have similar ideas. None of them really has the scaling as we have it, but they also have a relation of this kind. But all these works assume that this comparison
property holds; they simply take it as an axiom without any further justification. Now, we were not content with that, and in fact we really started on this subject by looking at comparison, before we did what I have been describing so far. And I can just tell you that this comparison property can be derived from additional assumptions about what we call simple systems. For simple systems we are, for the first time, introducing coordinates; until now everything has been completely abstract. There is a natural physical coordinate, which is the energy (what is behind that is the first law of thermodynamics, of course), and there are one or more work coordinates: there could be magnetic fields, and there can be many work coordinates if I have many coupled systems whose volumes can change individually. Anyway, the state space, which until now was a completely abstract set, now becomes a subset of some R^{n+1}, where the 1 is the dimension of the energy coordinate. It is important that there is only one energy coordinate, but there can be more work coordinates. And at the same time as we derive the comparison property from assumptions about such systems, we establish contact with traditional concepts like pressure. The assumptions for simple systems are these: we assume that convex combinations can be realized adiabatically in this concrete state space; there is a continuity assumption about the pressure, where the pressure is defined as the slope of the boundary of the set of states that can be reached adiabatically from a given state (in fact we assume a Lipschitz continuity of this function); and we assume the existence of at least one irreversible state change starting from any given state. Also, and this is to make the connection with thermodynamics as it is really practiced, we need something like the zeroth law of thermodynamics; we need to
talk about thermal links between systems. This is an operation where we have, let's say, two systems, one with an energy coordinate U1, another with an energy coordinate U2, and we make a link by simply bringing them into contact with a copper thread, or a silver thread, something which conducts heat well. Then, lo and behold, after a while the energies have split up in a definite way, and the systems, in standard parlance, are now in thermal equilibrium with each other. In this way we can start from two simple systems and make one simple system by such thermal links, and by stating an axiom about such operations we can, for instance, exclude bad situations like the one here. This is a picture of some state space with coordinate U and some work coordinates V; the state space is a convex subset of this R^{n+1}, and what I draw here are some states X, Y, Z. These lines are what in common parlance are called the adiabats, namely the boundaries of the sets which are adiabatically accessible from a given state. For instance, if you start with this X, then everything above its adiabat is adiabatically accessible; likewise here, and here. This is, so to say, the good situation which prevails. But a priori, if you do not put in any further assumptions, you could have crazy situations like this one, and one must do something to exclude them; that is what we can do with very plausible assumptions. Now, these assumptions also allow the derivation of the comparison property, but moreover they show that the entropy is a once continuously differentiable function of the energy and the work coordinates, and that the temperature, which is now defined by the famous formula 1/T = (∂S/∂U)_V, characterizes equilibrium at thermal contact between different systems. From this point of view temperature is an afterword rather than a foreword to the foundations of thermodynamics. Usually, or very often, one starts with the zeroth law of
thermodynamics; that is the reason it has this low number, and temperature, or rather empirical temperature, is then something which characterizes thermal equilibrium, and that is the beginning of the story. Here it is completely the other way around: temperature comes in only at the very end. Now a few more remarks on this connection with standard thermodynamics. One consequence of the existence of entropy is a formula, one of my favorite formulas, due to Max Planck already in the early 20th century, which allows one to relate an arbitrary empirical temperature scale to the absolute temperature scale T. Usually, in first courses on thermodynamics, you define the absolute temperature by the efficiency of Carnot machines. But imagine you wanted to do some low-temperature physics and you always had to bring in a pair of Carnot machines and look at their efficiency to measure the temperature; of course you never do that, you measure it in completely different ways. But in order to use the basic formulas of thermodynamics you have to be able to compute the absolute temperature, and that is done by Max Planck with this beautiful formula. It tells you that you simply have to have some empirical temperature scale, and then you have to know some material coefficients, expansion coefficients and pressure coefficients, and then you have the formula. This formula is derived from the fact that the entropy is a state function, so dS is a total differential, and a total differential has special properties; from the integrability condition you derive a differential equation for the absolute temperature as a function of the empirical scale. Also, once you have what is often called the fundamental equation of thermodynamics, which I wrote here even with chemical potentials (I have not said anything about those yet, but you will be familiar with this from your thermodynamics course), once you know this
formula, and you know that dS is a total differential, you can derive (this is usually done in thermodynamics courses as exercises for students) some very remarkable formulas which link quantities that you measure by completely different means. And there is no entropy in these formulas; the entropy is, so to say, sitting in the background, dictating what you can do. Here, for instance, for dilute gases, ideal gases, the velocity of sound is related to the heat capacities. What on earth does the velocity of sound have to do with heat capacities? It is a mystery, except that it follows simply from the fact that dS is a total differential, and you can derive it fairly easily. Here is the Clausius-Clapeyron equation; I am not going to repeat what the symbols mean, this is the latent heat and this is the change in volume. And here is the van 't Hoff equation from chemical thermodynamics: the temperature dependence of the equilibrium constant, linked to the heat of reaction. Okay, so I am not doing that badly; I also want to see how much I can say about the non-equilibrium things. Now, there is yet another application, namely that by using entropy you can say something about the maximal work that you can obtain from a system. There is a concept for that which physicists usually do not learn about, called exergy, or availability. It looks a little like a free energy, but it is not the same, because the temperature appearing in it is the temperature of the environment and not that of the system. So, okay, this is all I have to say about equilibrium thermodynamics, and now, how much time do I have? Almost 15 minutes. Of course I had to go very fast, and I do not know how easy it was to follow, but let me say something about non-equilibrium states. First a general remark: there exist very many variants of non-equilibrium thermodynamics. There is classical irreversible
thermodynamics, which deals with situations close to equilibrium; there is extended irreversible thermodynamics, rational thermodynamics, and many other formalisms. Now, most formalisms consider only states close to equilibrium, or stationary non-equilibrium states; that is one point to note. Another point to note is that the role of entropy is much less prominent than in equilibrium thermodynamics. In equilibrium thermodynamics the entropy really determines everything you would like to know, or want to know, about the system: all its thermodynamic properties. In non-equilibrium thermodynamics there is much more, of course: there are flow equations and all sorts of phenomena, and a single function like the entropy does not have the same power as in equilibrium situations. Now, what we have to say about non-equilibrium entropy is the following. We consider a system with a state space Γ of equilibrium states, and imagine that this is a subset of some larger space Γ̂ of non-equilibrium states. Now, while this Γ is generally a finite-dimensional space with an energy coordinate and work coordinates, the non-equilibrium states cannot simply be characterized by such simple parameters. But anyway, we assume that the relation of adiabatic accessibility is defined on this extended space, with the same operational interpretation as before, and that its restriction to Γ is characterized by an entropy function as discussed previously for the equilibrium situation. Now, the basic question we ask, a natural question, is: what are the possible extensions of the entropy from the equilibrium space Γ to the non-equilibrium states? We have to make some assumptions: the relation should satisfy the assumptions A1 (reflexivity), transitivity, consistency, and stability; but the scaling assumption and the splitting assumption are only required for the equilibrium states, since it is not natural to impose them on general non-equilibrium states. Now here is a very important assumption, namely
we assume that every non-equilibrium state lies between two equilibrium states, in the sense that I can generate any non-equilibrium state I am interested in starting from an equilibrium state, and likewise, if I wait long enough, it will relax to an equilibrium state (maybe it needs some help from some machinery), but anyway, I assume this relation. So if I know this, I can define two entropies, S− and S+, by a sup and an inf, a little like before, except that there is no scaling parameter λ. And now some rather simple properties follow from the axioms: both are monotone with respect to the relation, and any other function on the non-equilibrium states that has these properties lies between S− and S+; so this delimits the possible non-equilibrium entropies. Moreover, S− is superadditive and S+ is subadditive; they will only be additive if they coincide. And here is a picture, of course, to illustrate this: here is the non-equilibrium state space, and here inside it is the equilibrium state space; S−(X) is the supremum of the entropies of all equilibrium states from which I can reach X, and S+(X) is the infimum of the entropies of those reached the other way around. Now, the role of comparability: we can discuss that, namely the following are equivalent. That the two entropies S+ and S− coincide; that there is a unique function Ŝ extending the equilibrium entropy such that X ≺ Y implies Ŝ(X) ≤ Ŝ(Y); that there exists a unique Ŝ extending S such that the relation for the entropies gives the relation for the states; and finally that every non-equilibrium state X is comparable to every Y, that is, the comparability property holds on Γ̂. And since Γ, the space of equilibrium states, is part of the non-equilibrium state space, this means that every non-equilibrium state is adiabatically equivalent to some
among the equilibrium states. This last condition is of course highly implausible in general, except perhaps in some approximate way if you are sufficiently close to equilibrium. So the conclusion, as we see it, is that there is in general no unique non-equilibrium entropy; you have to live with this non-uniqueness. The last thing: the second generalization, to non-extensive entropies. Even if we only consider equilibrium states, the scaling assumption is not always natural: if you have systems with long-range forces, if surface effects are important, or for mesoscopic systems. The entropy for such systems can, however, again be defined by formulae of this kind: using a normal system, for which I know the entropy, as an entropy meter, I define the entropy by simply coupling the two systems, and I again obtain the two entropies S-minus and S-plus. Here I must also make an assumption, similar to that for the non-equilibrium entropies, so that these functions are well defined. They satisfy similar formulae: both functions are monotone with respect to the relation, every other function with this property lies between them, one is superadditive and the other subadditive, and if we have comparability, then the two entropies coincide and I have the entropy principle and the second law in the same form as before. Also, this S is additive, and it is uniquely determined by these properties up to an additive constant. Okay, now I am coming to an end; I think this is a minute longer than I had intended, but I am now about to finish, so I want to summarize. We have shown that an essentially unique entropy characterizing the relation of adiabatic accessibility for equilibrium states can be derived from very natural assumptions and the comparability property. Now, comparability cannot be expected to hold for arbitrary non-equilibrium states; however, one can delimit the range of possible adiabatic state changes by means
of two well-defined non-equilibrium entropy functions, and comparability holds if and only if the two functions coincide; likewise for non-scalable systems. The mathematical reasoning behind all this depends, so to say, only on a few axioms and is independent of any specific concrete realization or interpretation of the state concept and the relation. Therefore the conclusions hold whenever the assumptions are fulfilled, and the framework is applicable in other contexts, for instance in quantum information theory, where this has been done. This last remark, I think, resonates well with the quote from Einstein I brought at the beginning, where he said that a theory is more impressive the greater the simplicity of its premises, the more different kinds of things it relates, and the more extended its area of applicability. So here are some recent examples of papers where a relation of this kind, and even some of the axioms, have been adopted: in particular, a paper by Renato Renner and co-workers on the axiomatic relation between thermodynamic and information-theoretic entropies, and a very new paper in the framework of general resource theories, where our framework, so to say, appears as one example among many. And after this I can just thank you for your attention. Questions, comments, remarks?

Yeah, so I almost don't dare to ask this question, because it is somewhat technical. I was wondering what structure these axioms impose on your state space, because it is a little bit different from the usual intuition one has. When I regard a system and put two systems together, in your theory this lives on the Cartesian product, and since I can repeat this, the state space must contain arbitrary Cartesian products, so it gets extremely large. Well, I can make such states, of course, but I am just wondering about the description: usually when you have a set of states, you would like to characterize this set. Is there a characterization here?
Yeah, well, a large part of what I said is completely independent of any concrete realization, but then I jump to these simple systems, where the state space is concretely R^(n+1), or a convex subset of it, and there it is very important for the analysis to have this convex structure.

Of course the concrete system is clear, but I was just wondering about the abstract framework.

Yeah, the abstract framework is just what it is.

It is what it is, yes. The question is whether you can say anything about how large the state space is. Usually you have a state space, as you say, some convex subset of a fixed dimension, but when you put two systems together you take a Cartesian product and already double the dimension.

Yes, yes.

So it is a very large space in your...

Yes. Well, sometimes, when I am talking about the simple systems, connecting them by a thermal contact of course reduces the number again. But in the abstract setting it can be arbitrary finite products, and that does no harm. In fact, there is one part of the theory which I of course did not discuss here: when you want to calibrate the additive constants of the entropy for different systems, there, if you have arbitrarily many systems, you might have to resort to something like, so to say, abstract Hamel bases and things like that. But that is, I think, abstract nonsense; you should just think of having a finite collection of systems, of which you can of course make as many copies as you like; that does not change anything.

I was confused about this uniqueness proof; maybe I did not understand the notation you had there, rather early on.

Yeah, well, that was very early. I mean, this here is the uniqueness.

Yeah, there you had the argument, but I did not get the step: if the entropy is this linear combination, then X is... Can you explain that step, from this linear combination, where you see that X should be this?

There is a unique... well, this is just a statement about numbers: if I have two
numbers, one less than the other, and a third number lies in between, then I can write that middle number as a combination of the two. Yeah.

Okay, but now the required properties of the entropy imply that this equation here for the entropy is equivalent to this relation here for the states; of course, we require that the entropy should characterize the relation, so this is equivalent to that. And if you take another entropy which satisfies this, then it leads to the same equation, but with another lambda, simply related to lambda prime. But if you go through the assumptions A1 to A6, and in particular the condition that you cannot go back, you can see easily that this can hold for at most one lambda. So that is the uniqueness; and the existence was, so to say, that these two numbers coincide. This is nothing deep.

No, my question was... I think the wiggled A was the thing I was confused about.

I think that was the point.

The last one was clear again, yeah.

So there are some physical systems where you can read physics papers in which people discuss whether, or in which way, the system satisfies the second law of thermodynamics, like the adiabatic piston, for example. I don't know if you...

Oh yes, yes, the adiabatic piston, good point.

Does this axiomatic framework give an easy way to resolve such discussions?
Well, of course, writing down axioms will not resolve such things, but yes, that is our conclusion. We discussed this also with Joel Lebowitz, who was a great fan of the adiabatic piston. We are inclined to think that there is no unique equilibrium state; he would claim there is a unique equilibrium state, but you have to wait very long, maybe to the end of the universe, and then the hammering of the atoms on the two sides of the piston will eventually lead to some definite result. So, strictly speaking, the adiabatic piston does not fit into our framework.

Further questions? Nicolas?

As far as I know, you always state an equivalence: the relation is characterized by the entropy, by saying it is equivalent that you can go from X to Y and that the entropy increases. But can one, in the non-equilibrium case, at least say that there exists an entropy for which the increase is a necessary condition, without saying it is sufficient?

Yes. Well, now I have to go back here... oops... non-equilibrium... yeah. So these functions here are both monotone with respect to the relation, and any other function with this monotonicity property lies between the two. But what you are asking is whether they characterize the relation, and that is only the case if they coincide; these statements are all equivalent here. You see, if you look at the equivalent statements, what you are saying is, I guess, point 3: you are asking whether there exists a necessarily unique S-hat extending S such that X preceding Y implies the entropy inequality. Is that what you are asking, or the other way around: that there is just a unique S-hat such that this implies that?

I am thinking about uniqueness, but I would like to know whether it is possible to remove the fifth condition and still get the...

Well, these are all equivalent; you have a unique... you see...

Is it equivalent because you ask that it is unique? If I
remove uniqueness in the second statement, for example?

But that... I mean, statement 3 is a weaker statement, so to say: if you have this here, and it is then necessarily unique, then you have that. So this is equivalent to that, you see: if there exists an S-hat such that this holds, then there exists a unique one. Okay, we can look at this afterwards.

There was one more question, a very short one, about the very beginning: you had this scaling property, so the state space can be scaled. What does that assume about the state space? I mean, we started with an abstract set; I was wondering what you have in mind.

Well, what we have in mind is of course just that if I have one liter of water here, at this temperature and pressure, then I can also have two liters. So the intuition behind the scaling is that we have, so to say, the double amount but with the same intensive parameters. I am not going to define that concept technically, because we do not need it; this is just for the purpose of interpretation.

So you kind of don't need those state spaces to make it more concrete at that point?

You know, one could do without them; one could put all states together and just talk about states. But for the more sophisticated parts of the analysis it is very convenient to have them neatly organized into state spaces. Some scaling you will need in any case; in fact, in the work of Robin Giles, which I mentioned, he never talks about such continuous scaling, but just about doubling or tripling, not about taking pi times a system or something like that.

Okay, so I propose that any more technical questions we take afterwards, and we will pass it over to you. So let's end the game.