So, before the last lecture of the day, after a series of excellent lectures, we have one more excellent one. It is my great pleasure to introduce Jan Korbel from the Complexity Science Hub Vienna. Jan is, I think, one of the best experts in the field of generalized entropies, with very strong applied research, currently working on stochastic thermodynamics, and in the past also with great experience in complex networks, financial mathematics, and econophysics. I think that we can be very satisfied and excited about his course on the foundations of entropy. Today is the first lecture. I would like to thank you all for coming; we should all be happy to be here and to follow the course. Thank you very much for having me. So, first of all, thank you for inviting me and for the opportunity to speak to you. When Vladimir came to me with the idea that maybe I could talk about the foundations of entropy, I had many ideas in mind, because entropy is on the one hand something kind of controversial, and on the other hand many people use it, so I think it will be very useful to clarify what everybody means by it, what the possible applications and interpretations of entropy are, and to make it a little clearer, when two people are talking about entropy, what they are actually talking about. Before I start, I want to say that all these slides are available at www.slides.com. I will be adding more every day, so you can look back at the slides afterwards. Let's start. First, a little warning. There is a great book called States of Matter by David Goodstein, and he opens his introduction to thermodynamics and statistical mechanics like this: Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906 by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics. You might know this quote; it is so dear to me that it also made it into my profile picture on Twitter. That doesn't mean you have to follow me, but in case you do, I will be posting links to the lectures, and sometimes you'll see interesting things going on, like workshops. A small explanation of why I have this beer here: "korbel" in Czech means tankard or beer mug, so basically that's the reason. And before I start, let me say: no questions are stupid. So please, if you have a question, ask anytime, which means there is no silly question like "what does this l-o-g mean". Basically, ask anything. We are not at school, we are not grading; ask anything. My aim is that at the end of each lecture you will know what it was about. So anytime, just raise your hand. This also applies to the online participants; I will try to keep an eye on the chat, so just unmute yourself if you can and ask anything. I'm here for that. And with that, let's start with a small activity. I brought these post-its, they are relatively small, and I would like everybody to take one and write on it, because you know each other, but I unfortunately don't know you; I came today and would like to get to know you a bit. So if you could write your name, what you study, your general field, and what entropy is for you. What does it mean to you? It can be one formula, a concept, a definition.
And then what we will do, after you write it, is put the post-its up here and see how it evolves. So pass them along, take one, give the rest to the next person, and then you can start writing. And you don't have to think about it very much; there are no wrong answers. If you say it doesn't mean anything to you, that's also fine. If the professors want to take part, they can, but they don't have to. The online participants can write this in the chat; I will have a look at the chat after we finish here. And it doesn't have to be very sophisticated. For the online participants, after we finish I will read out the post-its and put them up; there is a whiteboard where we put all the answers. Just your first name is enough, and you don't have to be very specific about your field, just write physics, mathematics, computer science, finance, biology. And once everybody has finished, it would be nice if everybody quickly comes here, tells their name, what they do, and what entropy means for them, and then puts the post-it up. So whoever is ready, let's start. [The participants introduce themselves in turn; parts of the recording are unintelligible.] I'm Jan Adrecht, I do biophysics, and for me it's some kind of measure of disorder that can only increase. Okay, nice. Next one. My name is Dana, I study physics, and I just think it is a word. Okay, nice. Somebody else? My name is Benza, I do biophysics [...]. Okay, very nice. My name is Miguel, I work on [...] thermodynamics [...], but I think it is information. Okay, nice. My name is James, I study physics [...]. Great, thank you. [...] For me, entropy is a measure of information. Okay. My name is [...], I'm a research associate [...]. Okay. So here we have a few more answers in the chat, and I will read them out loud. There is someone who does physics, and for them entropy is the Boltzmann formula, k log W, or the measure of uncertainty; another gives the same answer. There is ignorance when describing systems; in German, a state variable. Another studies atomic physics, and entropy is the amount of disorder of the system. There is Kirill, who is a data scientist and statistician, and entropy is information gain. For Remarkes, entropy is the underlying physical constraint of the universe, and the quantity that measures information, or the lack of it. So thank you very much for this. I'm really glad that you had such a variety of answers. So my answer is this: for me, entropy is
a constant times the log of the multiplicity. And this is what is written on Boltzmann's gravestone. His gravestone is in the Central Cemetery in Vienna, the Zentralfriedhof, which I would really recommend visiting if you have a little more time in Vienna. You can go there by train. Vienna has this tradition, and it is considered a great honor, that if you are a famous person, they give you what they call an Ehrengrab, a grave of honor. So there is a special group of graves where famous people like Mozart lie. And very close by you will find the grave of Boltzmann, where he has this equation, although some people say he didn't invent it, but we will see that a bit later in this lecture. So this is me and my youngest daughter at the grave. Now, why are there so many definitions of entropy, why so many things? I basically went to Wikipedia, and if you look up entropy, you can read there that entropy is a physical property, used to explain the part of the internal energy of a thermodynamic system that is unavailable for useful work. And then there are other definitions, at least three, I would say: one from classical thermodynamics, one from statistical thermodynamics, and one from information theory, and they are all maybe different, maybe similar. And I think that our aim is to really find out how they are similar and how they are different. I will start with an interesting paper written by my colleagues Stefan Thurner and Rudolf Hanel, called "The three faces of entropy for complex systems". We will see that entropy has more than one face, which means it has more than one origin. In certain cases these faces turn out to be the same, in other cases they are different, and we will see the relations. I also made a word cloud out of the Wikipedia page on entropy, and you will see that many concepts are mentioned there: thermodynamics, temperature, change, letters like S, W, ΔS, k, Q, energy, system, heat, physics, so many things. I tried to collect a few of them. So what might somebody describe as entropy? Is it a measure of randomness or disorder? Maybe maximum data compression, distance from equilibrium, information content, the part of the internal energy that is unavailable for useful work? Is it uncertainty? Heat over temperature, which is also something that many people wrote here? Is it energy dispersion? Or is it just a tool, like in the maximum entropy principle, where you use entropy to infer the maximum entropy distribution, or maximum caliber? Or is it the way Prigogine used it, to define the minimum entropy production principle? Or is it softmax, which is basically the statistical version of the maximum entropy principle, a technique used to estimate a distribution from data? Or is it the principle called maximum entropy production that some people claim holds in certain systems? That's what people can think of, and we will see how these concepts are connected, or whether they are at all.
When I was thinking about entropy, I was thinking about my personal journey: how did I get involved with entropy? And I realized that during my studies, my point of view on entropy changed as I learned new things. So, and this is a warning, this is how a physics course goes. During a typical physics program you meet entropy several times; it comes and goes, and each time, with a different approach or a different point of view, you think about entropy as a different concept, or as a differently powerful concept. So I want to start with my personal opinion about entropy and what it can do in physics. In the summer semester of the first year, when I joined the physics program, I studied classical thermodynamics. And here is what many people wrote here: dS = δQ/T, where this d and δ are different things, namely exact and non-exact differential forms, which at that time I had no idea what they meant. But I saw that it's useful, because with it you can measure specific heat, you can derive the Maxwell relations, and so on. And I said, yeah, kind of good. In the second year, we had a course on statistical physics. Basically the first half was just a repetition of what we already did in the first year. The second half was very interesting, because the teacher introduced us to the maximum entropy principle and to the partition function, which is the sum over states, from the German Zustandssumme, the sum of all the states. And I thought, okay, so now I have a concept with which I can do everything. I just calculate one function, the partition function, and from it the whole of thermodynamics follows. At that time, again, we had the problem that we didn't know any probability theory, but that didn't matter so much. So I thought, yeah, this is really powerful. Wow. In the third year, I also came across quantum mechanics, and now I'm sorry for the people who do quantum mechanics, but we did density matrix theory and entropy, and at that time there was nothing extra for me. It was basically the same thing written in a different language: you exchange sums for traces and p's for ρ's, and that was it. So I wasn't very impressed. However, in my master studies I was lucky to spend one exchange semester in Berlin, maybe somebody is here from Berlin, at the TU, and I took a course on advanced statistical physics, where I understood that it's not so easy to calculate the partition function. We did Fermi-Dirac and Bose-Einstein statistics, the spin Ising model and transfer matrix theory, which towards the end was super complicated, and real gases and the virial expansion. And this was too hard. But then I took a non-mandatory course on non-equilibrium statistical physics, and there I learned about linear response theory, molecular motors, so how the motors work in our cells, how the fluctuation theorems work, and finally what I got was time. You know, thermodynamics is called thermodynamics, but there is no time in it, there is no dynamics; it should rather be called thermostatics or something like that. Here I finally got the time, and that's why I fell in love with statistical mechanics, and entropy in particular.
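As a compact restatement of the "one function gives you everything" remark, here is the standard textbook chain in the usual notation (this is my addition for reference, not from the slides; β = 1/k_B T):

```latex
Z = \sum_i e^{-\beta E_i}, \qquad
F = -k_B T \ln Z, \qquad
S = -\frac{\partial F}{\partial T}, \qquad
\langle E \rangle = -\frac{\partial \ln Z}{\partial \beta}.
```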
So, I think there is somebody who can explain the part of physics that motivated entropy from the macroscopic point of view better than I can. I really like this video, and it's only 15 minutes. [A video is played.] I remember my second video, which was about the entropy of a quantum system; I said in a previous one that entropy is the messiness, the disorder of a system, and I thought: how in the world did anyone come up with an equation for messiness, why in the world would anyone come up with an equation for messiness, and how did scientists come to accept it? As for the German scientist, he was blessed with this amazing ability to make almost every scientist around him think he was wrong for decades before they finally realized he was fundamentally correct. Ready for the story? Let's go. I'd like to start in 1849, when Rudolf Clausius was a 27-year-old high school teacher. 1849 was when Clausius read a paper about the theories of heat of a long-deceased French scientist named Sadi Carnot. Clausius decided that some of Carnot's ideas were correct, but he disagreed with the idea that heat is always conserved. In 1850, Clausius published his theory, which we currently believe is true, that heat can be converted into work and work into heat. Clausius wasn't the first to promote the idea that heat and work are equivalent, but he was the first to say that it did not require one to cast the theory of Carnot overboard, but merely the idea that no heat is lost. In addition, Clausius added a new term, which was a combination of what he called the interior work and the interior heat, that we now call the internal energy. With this term, Clausius became the first person to publish a complete version of the first law of thermodynamics, although he didn't use the word energy for a further 13 years, as the term was just becoming popular in the 1850s and 60s. The equation that Clausius created is still used to describe the first law, with the same letters and sign conventions, to this day. Clausius's publication made his name in science, and soon he earned a job as a professor, but it also earned him some enemies. In Scotland, Clausius's paper initially sparked William Thomson's interest, but Thomson felt that Clausius was just reiterating the work of a disheveled Scottish scientist named William Rankine, and invited Rankine to send a letter to the editors to that effect. In Germany, a scientist named Hermann Helmholtz had just written a paper on the conservation of energy and felt that Clausius was just copying him. In addition, all three men decided that whatever Clausius had not plagiarized was just dead wrong, and for many, many years, Clausius could not publish anything without letters from Thomson, Rankine, or Helmholtz complaining about it. I tend to side with Clausius on this one. In fact, as far as I can tell, Clausius's paper of 1850 is startling in its accuracy and its importance. I'm not alone in this assessment. In 1980, a historian wrote the following: there is no doubt that Clausius, with this paper of 1850, created classical thermodynamics; all preceding it, except Carnot, is of small moment. Also, all three men eventually ended up agreeing with Clausius, although they always kept some disdain for the originality of his argument or the strength of his position.
Years later, Thomson wrote, quote, the memoir of Clausius contained the most satisfactory and nearly complete working out of the theory of the motive power of heat, but his hypothesis was so mixed up with the general conclusions. It probably didn't help that Thomson and Helmholtz were both known for their charm and charisma, while Clausius's personality is mostly lost to time: Clausius's brother saying he was a man of rare modesty; his son writing that the most principal trait in my father's character was, without doubt, the splendid truthfulness of his nature; and a letter from a student years later that described Clausius as, quote, that old grouch. Eventually, both Helmholtz and Thomson were knighted. Helmholtz became Hermann von Helmholtz, and Thomson became Lord Kelvin, with the temperature scale that is named after him. Despite the pushback, Clausius continued to publish his theories on heat. In 1854, Clausius published his fourth paper on heat, and this is the one where he created entropy. Well, sort of. In this paper, Clausius said that there are already two rules. The first one is that heat can be converted into work and work into heat. The second one is Carnot's theorem, that heat engines only work because heat flows from the hot source to the cold sink, and the amount of work you get depends on the temperatures of the two sources. Clausius thought, however, that Carnot's theorem in this form is incomplete, because we cannot recognize therein, with sufficient clearness, the real nature of the theorem and its connection with the first fundamental theorem. What to do? Clausius knew that Carnot had made this hypothetical cycle where, if you ran it one way, heat would create work, and if you ran it the other way, work would create heat. This is currently called the Carnot cycle. Clausius decided there must be some mathematical way to make two heat transformations equivalent, so that if you did them in the opposite order, they would work in the opposite way. He also determined that this equivalence value had to be a function of the heat and the temperature. He also noted that less heat at lower temperatures was equivalent to more heat at higher temperatures, so the temperature must be in the denominator. He therefore decided that the function he was working with was q over t, where q is the heat, and t is, most likely, quote, simply the absolute temperature. Clausius also defined the equivalence value of heat going from temperature one to temperature two as q times one over t two minus one over t one, and gave the letter N to the sum of the equivalence values, which he generalized as the integral of the element of heat over temperature. Note from the present: this function, the heat over the absolute temperature, is an equation for entropy, although Clausius didn't call it entropy yet. Clausius decided that if the process was reversible, then the sum of these functions must add to zero. Here is his logic. He said, imagine it wasn't true, and the equivalence value was negative. If that was the case, then the value of the heat times one over t two minus one over t one would also have to be less than zero for some part of it, which means that heat would have to flow from the lower temperature to the higher temperature, which is not possible according to Carnot's theory. He then added, if the sum was positive, then you do the process in reverse, and then the sum becomes negative, and once again you get heat flowing from low temperatures to high temperatures, which is a no-no.
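In modern notation, the expressions just described read as follows (my restatement, not from the video; N is Clausius's symbol for the total equivalence value and δQ an element of heat):

```latex
N = Q\left(\frac{1}{T_2} - \frac{1}{T_1}\right),
\qquad
N = \int \frac{\delta Q}{T},
\qquad
\oint \frac{\delta Q}{T} = 0 \quad \text{(reversible cycle)}.
```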
So here it goes: no matter how complicated a cycle is, if it is reversible, then the equivalence values must add to zero. Clausius added that if the process was irreversible, meaning you couldn't do it backwards, you couldn't get a negative equivalence value for the same reason, but you could get a positive one. He therefore concluded with his second law of thermodynamics, quote, the algebraic sum of all transformations occurring in a cyclical process can only be positive. By the way, I was taught that heat can't flow by itself from a cold object to a hot object because it violates the second law of thermodynamics, but in studying Clausius, I found the reverse to be true, meaning Clausius based his idea of entropy, or what he called at the time the equivalence value, on the principle that heat cannot flow from a cold object to a warmer object by itself, and he got that idea from Sadi Carnot. Fast forward eight years to 1862. That's when Clausius wrote a paper about the equivalence value for a system that didn't go in a full circle, but started at one temperature state and ended up at another temperature state. Clausius decided that heat usually increases the mean distance between molecules, which he called the disgregation. If you've never heard that term, it's because we no longer use it, by the way. Clausius also noted that water is strange, and that when ice melts, the molecules actually get closer together instead of further apart. So he added that in that case the disgregation is not accompanied by an increase of the mean distance between the molecules. Therefore, Clausius's disgregation had to do with either the separation of the molecules or their arrangement. Clausius became the first to state that one could determine the entropy from the arrangement of molecules inside a box, even if you don't know how much heat it absorbs. Also, when Clausius looked at a single transformation, he realized that a general property of transformations is that a negative transformation could never occur without a simultaneous positive transformation whose equivalence value is at least as great. On the other hand, positive transformations are not necessarily accompanied by negative transformations of equal value, but may take place in conjunction with smaller negative transformations, or even without any at all. Clausius concluded: the algebraic sum of all transformations occurring during any change of condition whatever can only be positive, or, as an extreme case, equal to nothing. In other words, the entropy of a closed system can only increase. And that's not all. Way back in 1862, some 50 years before Walther Nernst produced it, he came up with the third law of thermodynamics, too, writing, quote, it may be proved to be impossible practically to reach the absolute zero of temperature by any alteration of the condition of a body. Three years later, in 1865, Clausius published his ninth paper on heat. In this one, he said he was motivated by the desire to, quote, bring the second fundamental theorem, which is much more difficult to understand than the first, to its simplest and at the same time most general form. This paper is mostly important for the new terminology, because this is the paper where he renamed the equivalence value to the shorter term entropy, and gave it the letter S, for no reason I can tell. We still use the letter S for entropy because of Clausius. Clausius said that he picked the term entropy from the Greek word for transformation, and that he, quote, intentionally formed the word entropy
so as to be as similar as possible to the word energy. Clausius then concluded with his version of the two laws of thermodynamics: one, the energy of the universe is constant, and two, the entropy of the universe tends to a maximum. Clausius's version of the two laws of thermodynamics that he wrote down in 1865 is still considered correct today. Meanwhile, William Thomson, one of Clausius's big critics, was working on his own version of the second law of thermodynamics, albeit one without an equation. Years before, in 1852, Thomson wrote that there was always a waste of mechanical energy available to man when heat is allowed to pass from one body to another at a lower temperature. By 1862, Thomson declared that the second great law of thermodynamics involves a certain principle of irreversible action in nature. It is thus shown that, although mechanical energy is indestructible, there is a universal tendency to dissipation, which produces a gradual augmentation and diffusion of heat, cessation of motion, and exhaustion of potential energy through the material universe. With entropy, Clausius had given the irreversible action of nature an equation and a name. Now you may say: heat over temperature is not the equation I learned for entropy. That's probably because you and I were taught Boltzmann's entropy equation. You won't be surprised to learn, from its name, that Clausius did not create Boltzmann's entropy equation. But you might be surprised to learn that Boltzmann didn't create it either, even though it is carved on his gravestone. So how did we get from Clausius to Boltzmann's equation? And how did Boltzmann get an equation named after him, and a constant named after him, that he didn't actually create? That's next time. If you're interested, I already have a video about the first law. [End of video.] Okay, so I recommend watching the other video about Boltzmann as well, although I don't think everything in it is correct. I will tell you my version; I will not go too much into the history, but there are some things there that might not be correct. For example, the statement that Boltzmann wasn't using the entropy in this form: that's not true. The truth is that the Boltzmann constant was named by Planck. So Planck basically named it after Boltzmann. There were two reasons: he started using it, and at that time it was the fashion to give constants names, and Planck already had one constant, h, named after himself, so it was a natural choice. But the main reason many people claim the Boltzmann entropy wasn't invented by Boltzmann is that most of his texts are in German, so for many people it is hard to read his original papers. So, as in this video, at the end of the 19th century there were two main motivations for investigating entropy. The first comes from thermodynamics: scientists and physicists were interested in the relation between energy, heat, work, and temperature. This is all the stuff connected to thermodynamics; what we write there is dS = δQ/T. And it was mainly the work of Clausius, Kelvin, Helmholtz, and Carnot. You know these names; if you had courses on statistical mechanics, there are certain things named after these people. And the second motivation is, for us, maybe more interesting, because it's the relation between the microscopic and the macroscopic.
And this has been investigated by Maxwell, Boltzmann, Planck, and Gibbs, who popularized it the most. And I think this is the fundamental way, because thermodynamics, while very powerful, is still a phenomenological theory that basically just observes the macroscopic variables and doesn't go to the truly microscopic foundations. What we will do now in this lecture is try to follow the path of Boltzmann and Maxwell and these people, to think about how we can connect the microscopic motion of particles with the macroscopic description. And the question is: okay, why statistical physics? If you had a course on classical mechanics, then you know that the motion of particles is given by the Lagrangian, Hamiltonian, or Newtonian formulation of mechanics. You have the equations of motion, and you can write down the Hamiltonian for 10 to the 24 particles if you want. But the problem is, first, that it is not the case that you can solve it, even numerically, because it is such a large amount of data. Second, and most important, you are typically not interested in solving these equations, because they don't give you much interesting information. You don't need to know the velocity and position of every particle here in the room to know whether the heat, or the energy, flows from your body or to your body. So that's information that is not useful. And third: if you have a one-body problem, so one particle, then it's typically solvable in the sense that you can write down the differential equation and solve it in so-called quadratures, meaning you can write an integral for the solution. If you have a two-body problem, you can do the transformation to the center of mass, and then you have one coordinate for the center of mass and a second for the distance between the particles. For the three-body problem, a general solution is, in general, not known. And what do we do if we have an N-body problem with 10 to the 24 particles? The more important question is: do we need to solve it? In most cases, probably not. So let's start with a brief recap of what we know from classical mechanics, namely the so-called Liouville theorem. You have canonical coordinates, the conjugate positions and momenta, and an initial probability distribution over the positions and momenta. What the Liouville theorem tells us is that the probability distribution in this phase space is conserved along the flow. It means that if you take any volume in the phase space and transport it according to the equations of motion, the volume remains constant. A trivial consequence of this is that the change of the entropy is zero. Basically, by plugging into the formula, you find that the entropy is constant when your only uncertainty is the initial probability distribution of the system; otherwise, you know everything about the positions and momenta. And this is the picture showing that if you have this small rectangle, it gets deformed in time, but the volume stays constant. So does it mean that the entropy is not useful? Not really, because typically we either do not have the whole information about the trajectories, or we don't want it, or it's too much and we are not so interested in it.
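A compact restatement of the two claims just made, in standard notation (my addition; H is the Hamiltonian, ρ the phase-space density, and S[ρ] the differential entropy, the continuous analogue of the −Σ p log p used later):

```latex
\frac{d\rho}{dt} = \frac{\partial \rho}{\partial t}
 + \sum_i \left( \frac{\partial \rho}{\partial q_i}\,\dot{q}_i
 + \frac{\partial \rho}{\partial p_i}\,\dot{p}_i \right) = 0
\quad \Longrightarrow \quad
\frac{d}{dt}\, S[\rho] = -\frac{d}{dt} \int \rho \ln \rho \, d\Gamma = 0 .
```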
Here I mention two interesting, or rather useful, results from statistics. The first is the law of large numbers, which says that if you repeat an experiment many times and calculate the average, you will eventually arrive at the mean value. That's what this graph shows: by repeating many experiments, you eventually arrive at the average value. And we also know how fast you get there, because the standard deviation of the average goes down as one over the square root of n. Of course, you can say this has some assumptions, namely independent and identically distributed random variables. Yes; it doesn't have to be identically distributed, there are generalizations of these theorems for non-identically distributed variables, but the independence is of course very important, and sometimes you will see that the corrections change the game. But let's not discuss that at this moment. The consequence is that when you have a large number of systems, they can typically be described by very few parameters, like a box with one mole of gas particles. And now I jump a little bit forward: you know that the average velocity is determined by the temperature, because the temperature somehow plays a role in these statistical theories. Also useful, something that we will need later, is the so-called stars and bars theorem. When counting states, we will need to calculate the following: if we have, let's say, k coins, and you want to put them into n boxes, where a box can contain no coin, or one, two, three, et cetera, then the number of ways is this combination number; it's called combinations with repetition. People prove it with these pictures of bars and stars in the books, as shown in the sketch below. I just remind you of it, because we will need this formula a few times in this lecture; I will refer to it as the stars and bars theorem. And now, why we need statistical physics: the concept called coarse-graining. Everything here, all the particles, can be described by quantum mechanics, or if you want, you can go further down to quantum field theory. Let's say you have the particles, they have some interactions, and maybe you would be able to describe them by the wave function. Can you solve it? For a few particles, maybe yes. For more particles, no. And also you will not see the interesting properties of the material, which are hidden, or emerging, at higher scales. So what people typically do is that, depending on their scale of interest or their phenomenon of interest, they coarse-grain the system up to a certain scale: you have atoms, you have molecules, you have colloidal particles, you have cells, you have individual people, you have societies, you have planets, you have galaxies, et cetera, and you describe the phenomena there. Because you cannot carry the whole machinery of all the other scales with you, although it is somehow encoded in your effective theory; that is typically not possible. So each theory, although not explicitly said, works on a specific scale, a time and space scale, where you can solve things and where you see its characteristic phenomena. If you go to another scale, typically it's either too difficult, or it doesn't bring much new information, or you cannot do much with it. And the important point is that when you do this coarse-graining, you start with some states.
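A minimal sketch of the stars and bars count: the number of ways to distribute k indistinguishable items into n boxes is C(n + k − 1, k). The helper name `multisets` is mine, not from the lecture.

```python
from math import comb

def multisets(n_boxes: int, k_items: int) -> int:
    """Stars and bars: ways to put k indistinguishable items into n boxes,
    i.e. combinations with repetition, C(n_boxes + k_items - 1, k_items)."""
    return comb(n_boxes + k_items - 1, k_items)

# Number of possible histograms of five throws of a six-sided die
# (this count reappears below): 252
print(multisets(6, 5))
```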
There are many, many states. Then by coarse-graining you end up with fewer states. What happens is that one coarse-grained state corresponds to many microscopic states, while another corresponds to a different number of these microscopic states, and these numbers are not the same. And now we are back to the fact that you have many, many dice, and you roll them many times; and while you can see different numbers on the dice, if you calculate the average value, you always get a very similar number. Here I mention one thing: some people will talk not about coarse-graining but about ignorance, which means it's about how much you know about the system, or about a preparation procedure, in the sense that you cannot prepare a system perfectly. Maybe coarse-graining is a special type of that. I think this can be seen as an alternative approach; it goes more into the philosophical point of view, and I see the two as more or less equivalent. When we do the modeling, and not the foundational part, I think they are almost identical. But I mention it here because some people will say, no, coarse-graining is not what causes the entropy. The calculations and the formulas are the same, though; so if you prefer the other interpretation, go ahead. And I also specifically mention what this means in thermodynamics. In thermodynamics, you typically start with microscopic systems described by classical mechanics. Then what statistical mechanics does is connect this microscopic theory with the macroscopic theory of thermodynamics, where you have only a few functions. This is what statistical mechanics does: it calculates the probability distribution, plugs it back in, and then you end up with thermodynamics. Then, what you heard about last week is stochastic thermodynamics, where you have an intermediate approach with some coarse-graining, but you still know something about the system, not only the macroscopic quantities: you know, for example, the probabilities of trajectories and these things. So it is somewhere in between, which is typically very useful for many applications. And now I start explaining the key concept that will be useful for our definition of entropy: the distinction between microstates, mesostates, and macrostates, illustrated in the code sketch below. We can continue with the example of the dice. We know that a die has six states, if it's the basic cubic one, and we can throw it a few times, let's say five times. Say I throw the sequence four, two, six, two, five. The microstate is the exact sequence of these dice; you can think about it as a time series, if you want: the order in which you throw them. The number of microstates here is six to the five, because each time I can choose from six possible values, so six times six times six times six times six, which is 7776. Now, the mesostate is obtained when I just record how many times each number occurred. In this case I say the number two occurred two times; numbers one and three, zero times; and four, five, six, each one time. It's what people in statistics call a histogram, because it's really the frequency of the throws.
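A small brute-force check of these counts, under the same setup (five throws of a six-sided die); the coarse-graining map from sequence to histogram is exactly what `Counter` computes:

```python
from itertools import product
from collections import Counter

# Microstates: all ordered sequences of five throws of a six-sided die
microstates = list(product(range(1, 7), repeat=5))
print(len(microstates))  # 6**5 = 7776

# Coarse-grain each sequence to its mesostate (the histogram of outcomes)
mesostates = {tuple(sorted(Counter(seq).items())) for seq in microstates}
print(len(mesostates))   # 252, matching the stars and bars count C(10, 5)

# Coarse-grain further to a macrostate (the average of the five throws)
macrostates = {sum(seq) / 5 for seq in microstates}
print(len(macrostates))  # 26 possible averages (sums run from 5 to 30)
```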
If you calculate the number of these mesostates, here again comes the stars and bars theorem; you can easily do it, and I think you did it in your combinatorics classes. It's this combination number, and for this case it's 252, so much less than 7776. So when you do the coarse-graining, there are many microstates behind each histogram: many sequences map to the same histogram, which is clear, because you can, for example, permute the throws. And then you can do even more coarse-graining, as we discussed before: if I'm interested only in the average value, here 3.8, then the number of macrostates is just 26. So it's yet another coarse-graining of the 7776 sequences. This is what is crucial for coarse-graining and the distinction between micro, meso, and macro. And now imagine that I had not 5 dice but one mole of dice, so 10 to the 24. Then the difference between these numbers becomes even more pronounced, it's extreme: the set of microstates behind one mesostate, one histogram, will be extremely large. And this counting is basically what creates the entropy, and I will explain why. Because now, if we want to define the multiplicity, this W that Boltzmann used, it is basically the number of microstates in the same mesostate or macrostate. So now we get back to the photo of Boltzmann's grave, and the question is: how do we calculate this multiplicity for this case? The short answer is: see a combinatorics lecture. The full answer is that we take all permutations, but we have to take care of overcounting, which means that in this sequence where there are two twos, if I exchange those two dice, it's the same sequence. So I have to exclude the permutations where I permute two dice with the same value. So what I do here is: the number of permutations of all five throws is 5 factorial, which is 120, and the overcounting is the permutation of just the two twos, so 2 factorial; then the multiplicity of this histogram is 5 factorial over 2 factorial, which is 60. Good. The general formula for the histogram where I have n1 occurrences of state 1, up to nk occurrences of state k, where the capital N, the sum of all the ni, is the total number of throws, is this: N factorial over the product of the ni factorials, as in the sketch below. Another question at stake is: why do I take the log? One reason is that the log transforms products into sums, which are much easier to handle. Is it working now? I think so. Good. But there is also a physical reason: if you have two systems that do not interact with each other, you can easily see that the multiplicity of the joint system is the product of the multiplicities. The log of this product is then a sum, which gives rise to the distinction between extensive and intensive thermodynamic variables. We will see that entropy and energy are extensive while the temperature is intensive, which means that if you join two systems with the same temperature, the temperature stays the same while the entropy, energy, and number of particles add up. But at this point, we can simply say: it's because it's easier to calculate. Good. So then we do what Boltzmann and the others did.
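A small sketch of this multiplicity formula, W = N!/(n1!…nk!), applied to the example sequence; the function name `multiplicity` is mine:

```python
from math import factorial
from collections import Counter

def multiplicity(histogram) -> int:
    """W = N! / (n_1! * ... * n_k!): the number of ordered sequences
    (microstates) that coarse-grain to the given histogram (mesostate)."""
    n_total = sum(histogram.values())
    w = factorial(n_total)
    for n_i in histogram.values():
        w //= factorial(n_i)  # exact at every step: each n_i! divides N!
    return w

print(multiplicity(Counter([4, 2, 6, 2, 5])))  # 5!/2! = 60
```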
So basically he considered the large-N approximation, the Stirling approximation: replace log N factorial by N log N minus N. Doing this for N and for each ni, and writing ni over N as the letter pi, we see that log W is approximately minus N times the sum of pi log pi, as spelled out below. So now, some people call this sum, minus the sum of pi log pi, entropy; in this case it is the Boltzmann entropy, and in information theory it is called the Shannon entropy. And there are two questions. The first is: what actually is pi? The second question is: does it mean that Boltzmann entropy and Gibbs entropy are the same? And I'm not posing this because I would be the first one who asked. Now I want to say something about probability and the two approaches to it. The statisticians here know that there are two approaches to probability. One is the frequentist approach, where the probability is the limiting value of the relative frequency in a repeated experiment, which is really a consequence of the law of large numbers. Here, what pi would be is really this limiting value from repeating the experiment, so really the frequency. It is quite characteristic of frequentists not to like parametric models, because from this point of view it doesn't make much sense to consider a parametric distribution. In the Bayesian approach, the probability quantifies the uncertainty about the experiment. And here we are again back to the discussion of what the entropy is, but let's keep it simple here: the Bayesian approach tells us that probability is knowledge about the system. We have Bayes' rule: you have the prior distribution, the likelihood, and the posterior, where the posterior is the updated probability distribution after we observe something about the system. And of course, saying that ni over N is pi, that the probability is the frequency, is the frequentist approach. And what people were thinking about is the thermodynamic limit: at that time people were thinking about moles of molecules, so a large number of molecules. This is what is called the thermodynamic limit: in theory you send N to infinity; in practical situations you say N is much, much larger than 1. So there are a few natural questions. Does it mean that entropy can be used only in the thermodynamic limit? And does the entropy measure the uncertainty of a single particle in a large system, as some kind of average probability over many particles? That is: do we define the probability for one particle or for all particles? It's a good question, but the answer is: it depends on how we define our state. And here I mention that this is something people were really interested in in the 50s and 60s. This is a paper by Jaynes, who distinguishes the Gibbs entropy, which is the entropy of the full N-particle distribution, from N times the one-particle entropy. And here he writes that they are not the same: one is bigger than or equal to the other. And the question is, and I hope you had a course on information theory: does somebody recognize what this difference means in information theory? Not exactly. Ah, yes, it's the mutual information. It's the mutual information, if you think about it. But you were on the right track with the Kullback-Leibler divergence, because the mutual information is a special type of Kullback-Leibler divergence: you take one distribution to be the joint one and the other to be the product of the marginals. So you were right.
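Writing out the step just described, with p_i = n_i/N, Σ n_i = N, and Stirling's approximation log n! ≈ n log n − n:

```latex
\log W = \log \frac{N!}{\prod_i n_i!}
\approx \left(N \log N - N\right) - \sum_i \left(n_i \log n_i - n_i\right)
= -\sum_i n_i \log \frac{n_i}{N}
= -N \sum_i p_i \log p_i .
```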
But it's a special case; in the case of more than two events it's called multi-information, right? So yeah, that's it. [Question:] I have a question. I don't see the difference between the Gibbs entropy and the Boltzmann entropy right now. [Answer:] Yes. So basically, this W is the multiplicity, and maybe it will be clearer in the next slide: this W should be the multiplicity of the N-particle ensemble, while W1 should be the multiplicity of one particle over its states. I will explain in the next slide what is meant by that. The resolution is that you have to really think about what your state is. [Another question, partly inaudible, about whether the difference is a constant or positive.] The partial answer is that some theories, like thermodynamics or quantum mechanics, were independently discovered by several people, who came up with very similar yet slightly different ideas. And at that point they all called entropy something that looked, at first sight, very similar. So everything that has this form, p log p or n log n or x log x, is nowadays called entropy. But it might not always be the same thing, and this is the reason: here, for example, you see such a case. This is the Boltzmann entropy I showed in the previous slide. The other one I didn't show, because here it's not about the formula, it's about the definition of your state: what did we define as a state? Here we had the state of a single die, and then we repeated the experiment. What if we threw two dice at once? Then our state would be a pair: first die, second die. You can do it with three dice, or N dice. Then the state space grows much faster. And here, because the dice are independent, and I think they write it here, the two entropies are the same if the joint distribution is the product of the marginals. So if the events are independent, you end up with the same formula; these two have the same value. But if you introduce correlations, as I will also show later, so that the throws are correlated, maybe throwing the same numbers is more likely than by chance, then this marginal approach will not capture the whole picture; a small numerical illustration follows below. So basically, if we talk about entropy as a feature of the whole system, we either have to be pretty sure that all these small particles are more or less independent, or we have to use the approach where there are many, many more states and it's much more difficult to calculate. And besides that, you will see in other lectures that there were other people, like Shannon, who found the same formula in information theory. And there is this funny story that von Neumann replied that he should call it entropy, because the same quantity already exists under that name, and nobody knows what entropy really is. Because at the time they were not sure whether this was the same quantity or not. And that's why it still remains somehow buried under this theory, and why many people, when they talk about entropy, can mean different things. This is related to a thing called the Gibbs paradox. I don't want to go into too much detail. Basically, you have two boxes with a barrier between them; you remove the barrier, let the gases diffuse, and put the barrier back. By calculating the entropy before and after, you get some paradoxical behavior. There is a resolution to it.
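Here is a small numerical illustration of this point, with a hypothetical joint distribution for two correlated dice (doubles made more likely than chance; the distribution is my invention for illustration). The gap between the sum of the marginal entropies and the joint entropy is exactly the mutual information:

```python
import numpy as np

def shannon(p: np.ndarray) -> float:
    """Shannon entropy -sum p log p over the nonzero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Hypothetical correlated throws: doubles are more likely than by chance
joint = np.full((6, 6), 1 / 72)   # off-diagonal pairs
np.fill_diagonal(joint, 1 / 12)   # doubles
joint /= joint.sum()              # normalize

p1, p2 = joint.sum(axis=1), joint.sum(axis=0)   # marginals (both uniform)
mi = shannon(p1) + shannon(p2) - shannon(joint.ravel())
print(mi)  # > 0: two times the one-die entropy overshoots the joint entropy
```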
But I think that this is also partially due to the proper counting of states, which is a bit more difficult if you have, for example, continuous states: with a continuous state space, it's a bit more difficult to calculate such an entropy. Here I mention two other things that I will need later, because they are important. First, if we have two independent systems, the entropy is additive; this is what we saw: for two independent systems, the log of the product is the sum of the logs. And second, it is extensive: if I have k times as many particles, there should be k times as much entropy. And with that I'm done with this talk. After each talk, I would like to have a very short summary: if you just raise your hand and tell me one thing that was either surprising to you, or useful, or maybe something you didn't like, it would be great to have something also for the others. Was there something in this talk that was useful or surprising for you? Yeah? [Participant:] I don't think I was aware that this form was from Gibbs. [Speaker:] Yes, and I will probably talk about that a bit later. Gibbs was also interested in chemical reactions; what Gibbs did was introduce the grand canonical ensemble, where you can exchange particles. So sometimes people call this the Gibbs entropy, but this also changes a bit from author to author. Anything else, yeah? [Participant:] I liked the illustration of the microstates and mesostates with the dice. [Speaker:] Yes, because I think this is something that is typically overlooked, or done very quickly, at least in physics courses, and I think it is really the crucial part of understanding what entropy is. [Participant:] My comment was also related to the same slide, very nice. [Speaker:] Is there anything else? If not, then I will tease you: tomorrow you will really be calculating entropies, and I will be calculating entropies, Boltzmann entropies that are not Gibbs entropies. And I will do it not by doing abstract calculations; I will take real practical systems that you can imagine quite easily and show you how these entropies look. You might know them already. So thank you for today and have a good evening. There is also Lyle in the chat, who didn't know the fact that Planck was involved in the Boltzmann entropy. Yeah, I might tell you that funny story tomorrow, or post a video; there is also a nice history about Boltzmann. Good. Okay. Thank you.