So, let me remind you briefly what happened in the previous lecture. It was about entropy, and the mental picture, the ideology behind it, was that there is a big physical system, like a crystal, which you want to observe. Being infinite, it has infinite entropy, and eventually, today, we have to define its entropy per unit site, properly normalized, so that it becomes finite. But what we have at the very beginning is a kind of abstraction: experimentally, we observe our physical systems through some apparatus, some device, and we see something flash, and nothing else. So one thing we can do is count these flashes; and secondly, we can say that certain observations are the same. What it all eventually means, we don't know; there is no meaning in it. Out of this we define entropy and give it some interpretation, but the only things at your disposal are this blinking, which is systematic, so that you can count how many flashes happen per unit time (so time is at your disposal), and secondly, if you have this crystal, you can move it, the whole thing, and you know it stays the same. That is sufficient to define entropy. And actually, everywhere in physics you don't have more; everything else you construct. You have this boom, boom, boom, boom, and you know that certain situations are postulated to be the same; they are postulated to be the same, we don't know it. Of course there are other kinds of postulates: you can say that for two things in different parts of the universe, what happens there is mathematically independent; but, as we know, one must be careful, since this is not quite true in quantum mechanics, where the intuition breaks down. And now, entropy. These finite systems are modeled by what is called a finite probability space, which we defined before: just a finite set of atoms. There was some justification of the categorical language, but let me recall it, because I'll be elaborating on it today. There are finitely many atoms, finitely many drops of water; these are called finite measure spaces, or probability spaces. I will call them atoms, but you can think of drops of water, and the total mass is one, it is normalized to be one. So there are numbers involved here, and we should be careful, which is of course tricky, because real numbers have no physical reality. As I said, what has reality is only these events which you can count and differentiate, plus a notion of identity between certain observations, which means there is some symmetry present. And so this is the notation, which I keep; just look at it: it is a set with weights, and if you forget about the weights, I just say "set". So there are two different worlds, plain sets and sets with weights, and I think the way I set up the notation is rather efficient. Then there is a little point which looks completely trivial: atoms are allowed to have mass zero. When you speak about infinite measure spaces, sets of measure zero are thrown away; here these are just atoms which don't participate in anything. However, we shall see, not today, maybe next lecture, that in quantum probability these zero atoms are kind of essential; they change the mathematical framework. And so this is the normalization, and this is my notation; I am rather proud of this notation, since I don't need
subindices, you see. Usually in mathematics people use subindices, which drags in artificial index sets that are not in the game; there are no i's here, because the atoms are not ordered, they are just sets. And then there is the concept of reduction, which in my view is physical terminology already. (Oh, why, what happened? It doesn't want to move now; it was moving all along, and now it refuses. You have to move the arrow... ah, now I can do it by hand; okay, it's even better.) So this is a key concept: there are probability spaces and there are reductions. A reduction has a very simple interpretation: two drops of water come together, or several come together, and the masses, of course, add. These are the arrows, and they look extremely primitive; the number of arrows is about the same as for plain sets, even smaller. But as we shall see, this language of arrows is extremely convenient. And there is the operation of product, where you take all pairs of atoms with the products of the weights; here you see the multiplicativity of the numbers involved. So numbers are there, yes, but these numbers will be just frequencies of events; they come from counting, not from measurement. That is the point: you don't measure sizes, which would be a complicated process; all you can do is count events, and compare times. It is as in relativity, where the only thing you have is a way to measure time; you have clocks, you don't have any meter stick. Then there is some justification of why the categorical notation gives a much more flexible concept: even apart from generalizations, it is already better. And then there is this motto, this thesis: entropy is a number equal to the logarithm of the number of states. So first of all, entropy is a number, a real number. Very many things in physics are real numbers, and this is one of the things we shall challenge: many things are not numbers, despite what we like to say. Then, what are the properties of this entropy? Let me just enumerate them; they make sense whether or not entropy is a number. The first is very simple, but quite essential: if you have two non-interacting systems, then entropy is additive. Of course you could dispute whether to take the log or not, but we shall see at some moment that the log is not so stupid. Symbolically, system one and system two are very far away, and the entropies add; and this is the picture, and it is essential: you have a big thing like a crystal, and you make a measurement here and a measurement very, very far away. Entropy is assigned to the results of measurements; the infinite system per se has no intrinsic entropy, but you count how many blinkings you have here per unit time, how many there, and somehow you measure entropy, whatever it is. And these must add, because if the systems are far away, they don't interact. Of course it is a postulate; they may interact, but you believe they don't. After all, this is the point of distance: if they do interact, it means they are not far away; this is how you define distance, after all.
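Here is a minimal sketch of these definitions in code, assuming the simplest possible representation (a finite probability space as a list of positive weights summing to one; the particular weights are made up):

```python
import math
from itertools import product

def entropy(p):
    """Shannon entropy -sum(w log w) of a finite probability space,
    given as a list of positive weights summing to one."""
    return -sum(w * math.log(w) for w in p if w > 0)

def reduce_space(p, blocks):
    """A reduction: drops of water come together, masses add.
    'blocks' is a partition of the atom indices."""
    return [sum(p[i] for i in block) for block in blocks]

def product_space(p, q):
    """Product of two spaces: all pairs of atoms, weights multiply."""
    return [a * b for a, b in product(p, q)]

P = [0.5, 0.25, 0.25]
Q = [0.1, 0.9]

# Additivity for non-interacting systems: ent(P x Q) = ent(P) + ent(Q),
# which is exactly why taking the logarithm "is not so stupid".
assert abs(entropy(product_space(P, Q)) - (entropy(P) + entropy(Q))) < 1e-12

# A reduction merges atoms, and entropy can only go down.
R = reduce_space(P, [[0, 1], [2]])   # merge the first two atoms
assert entropy(R) <= entropy(P)
```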
And next; that first property is simple, but the second is more subtle. This symbol means join: two systems joined together, and they may interact, not necessarily, but they may. So what is this join, how is it defined? It is not truly an operation. Again, this is symbolically how you write it down, and this is how you imagine the picture: you have two systems next to each other, so close that they may interact. Intuitively, because they interact, one state of one system may prevent a state of the other (that is why I draw these little wheels): only certain mutual positions are acceptable, and the number of states goes down. But what is this join? It is not really an operation; it is a relation between the systems. Two systems may be joined, but it is not a canonical operation: knowing this probability space and that probability space does not determine their join, unless both are being used for measurements of the same system; then the join always makes sense, since they have a physical implementation in the same world, so to speak. If not, it makes no sense; you need the world in which they are implemented. And here is one possible definition, enough for applications, which I had not explained last time; it is very simple. If the systems don't interact, the product is the definition; but in general the symbol must depend on how they interact, and one possible definition is this: the set of states of the join Q is a subset of the product. There may be many different subsets, since the systems may be positioned in many different ways, but whichever one you take, you call it a join provided the coordinate projections are reductions; I'll put a small numerical sketch of this below. There is some terminology mathematicians use for this reduction, "marginal" or some such, which I find very strange; that language is much less flexible, less categorical, than the physical language. Interestingly enough, it is the physicists who here use an almost categorical language. And so this property is intuitively clear: whatever entropy is, if it is intuitively the log of the number of states, it must be subadditive under joins, though this is not the full strength of what is true. So this is what it says in one notation, and this is the more compact notation. The key point is that you estimate the entropy of the join by the sum of the two entropies; and the subtlety is that there is a minus sign around, some kind of cancellation. And if you use these joins, it means: the set is a subset of the product, and the coordinate projections are reductions. That is the rigorous version of what I said, because the join was somewhat vague, and this is how we understand it. Okay, and that is the picture; because I couldn't fit it in, the join of one and two is not in the picture, and one of the correlations doesn't quite fit. And then there is the property that if something is reduced, entropy goes down. This looks like the simplest property of all, and it must always be so; but it is not so in the quantum case, which is quite remarkable, and many think this is what quantum theory comes down to. This is what is called, you know, entanglement, words which are impossible to understand; but essentially it amounts to this: when you look at a single object through a quantum observational apparatus, you can see many of them, you can see more states than are actually there, so to speak, because you cannot separate the apparatus from the object you observe. That is the philosophy; mathematically there is certain linear algebra involved, and we shall come back to it.
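The promised sketch of a join as a weighted subset of the product whose coordinate projections are reductions (the subset and its weights are invented for illustration):

```python
import math

def entropy(weights):
    return -sum(w * math.log(w) for w in weights if w > 0)

# A join of two three-state systems: a probability measure supported on
# a subset of the 3 x 3 product (the interaction forbids some pairs).
join = {(0, 0): 0.3, (0, 1): 0.2, (1, 1): 0.1, (2, 2): 0.4}

# The coordinate projections are reductions: the masses of fibers add.
P, Q = {}, {}
for (i, j), w in join.items():
    P[i] = P.get(i, 0) + w
    Q[j] = Q.get(j, 0) + w

# Subadditivity: ent(join) <= ent(P) + ent(Q).
assert entropy(join.values()) <= entropy(P.values()) + entropy(Q.values())
```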
Then there is another property which I regard almost as an axiom, but which again you have to have, and which is not completely apparent from the intuitive definition: the entropy should be at most the log of the number of atoms, with equality exactly when all atoms have equal weight. So that's it. Again, with entropy there is no true definition; I think you cannot give a genuine definition, but this is roughly what you expect may hold, and people modify definitions, find new definitions, playing with these properties. And then the point is that the very existence of such a function is quite remarkable. It looks like nothing: "the log of the number of states", you state it correctly and you are finished. But the example I brought up, which for me was a kind of revelation when I read it, is the following inequality. The classical isoperimetric inequality says that for a subset of Euclidean n-space, the volume is bounded in terms of the volume of the boundary, the (n-1)-dimensional volume (the n-1 is what I failed to write down here). And there is an inequality which is significantly more precise: for such a subset, the (n-1)-st power of its volume is bounded by the product of the (n-1)-volumes of its projections to the coordinate hyperplanes. There are n such projections; say in 3-space there are three projections to the coordinate planes, so three subsets of the plane, and these bound the volume. Equality holds for rectangular solids, and for whatever is obtained from them by measure-preserving transformations preserving the coordinates. And this is stronger, because when you project something, say in 3-space, onto a plane, the area of the projection is at most the area of the projection of the boundary, since only points of the boundary enter the projection: any line which meets a set must also meet its boundary, simple topology. But amazingly enough, the inequality is non-trivial; if you have ever thought about the isoperimetric inequality, it is not trivial to prove even in this weak, non-sharp form.
Actually, there are simple proofs, but none of them is trivial. And as I said before, all the isoperimetric inequalities are kind of trivial corollaries of it (trivial meaning modulo linear, kind of stupid computations), and it is sharper: I call it the log-isoperimetric inequality. In the first non-trivial case, three-dimensional space, the notation is: a three-dimensional body in 3-space, its projections to the three coordinate planes, and the inequality, which I wrote explicitly, involves a product. You see, this is another point: it is not a sum. The area of the boundary is related to the sum of the projection areas, but here we have the product, and after normalization the product bound is a priori smaller, therefore sharper; and I say "log" because taking logarithms turns the product into a sum. The proof is derived as follows. First you approximate: you divide everything into tiny little pieces and concentrate the mass in atoms, so the statement becomes finite, and the finite statement is this. You have a finite subset S in the product of three sets, purely combinatorial, and you consider its projections to the binary products, the pairwise products; the statement is that the cardinalities behave this way:

|S|^2 <= |S_12| |S_13| |S_23|,

where the vertical bars denote cardinality. Why this notation with pairs of indices: there are six faces to the cube and three pairs of parallel faces, three squares, so to speak. But these are sets, not measures; for measures we have the corresponding entropy inequality, as I observed before, and the one implies the other because you take the probability space in which all atoms have equal weights. That is the very simple case where entropy indeed equals the log of the cardinality, by the definition of entropy, or rather by the desirable definition of entropy, which we haven't given yet but which this can serve as. So for the set itself you use this equality, and for the projections you use the inequality, and if you just rewrite, you immediately see the combinatorial inequality. And you see that you really lose quite a bit of information, because the entropy of a projection is smaller: you don't get equality unless the projection has constant density. Projecting sets is a very bad operation, but pushing forward measures is good, for some reason. And then, however simple all this is, here is a question. The Loomis–Whitney inequality is symmetric under permutations of the coordinates. The question is: are there such inequalities for other symmetries? In particular, the usual isoperimetric inequality in its sharp form is invariant under all isometries of Euclidean space; is there a similar inequality with this degree of precision? You can write down conjectural inequalities, and the answer is unknown, though there are closely related inequalities for functions, for densities, which I may mention at some point. So, however simple this is, you immediately see that it is by no means the end of the story. And this perspective on entropy has, in my view, the advantage that it immediately brings up questions which completely disappear when you look at the standard textbooks, where it seems there is nothing left to do; this shows you it is a very open field, where almost nothing is known, actually, when you look carefully from a certain perspective.
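A quick computational check of the combinatorial statement (the random test and the box example are my own illustration):

```python
import itertools, random

def loomis_whitney_holds(S):
    """Check |S|^2 <= |S_12| * |S_13| * |S_23| for a finite subset S
    of a triple product, with S_ij the pairwise coordinate projections."""
    S12 = {(x, y) for x, y, z in S}
    S13 = {(x, z) for x, y, z in S}
    S23 = {(y, z) for x, y, z in S}
    return len(S) ** 2 <= len(S12) * len(S13) * len(S23)

random.seed(0)
grid = list(itertools.product(range(5), repeat=3))
for _ in range(1000):
    S = set(random.sample(grid, random.randrange(1, 60)))
    assert loomis_whitney_holds(S)

# Equality for a combinatorial box, the analogue of a rectangular solid:
box = set(itertools.product(range(2), range(3), range(4)))
assert len(box) ** 2 == (2 * 3) * (2 * 4) * (3 * 4)   # 576 = 576
```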
And then there is an exercise which I suggest, related to this. It is, again, a very elementary statement, in linear algebra. You have a 4-linear form, in four groups of variables, over any field; you can divide the variables into pairs in various ways, and each division gives a bilinear form, which comes with its rank. And then you have the same inequality, with log of the rank in place of entropy. It is not clear why it is related to the Loomis–Whitney inequality: it is stronger, and implies it in a kind of trivial way, because for very special forms you recover that inequality; but the other direction is not at all obvious, and again, I don't know. And there is a question which begs itself: what relations does one have in general? Given a multilinear form, you divide the variables into groups; there are lots of ways to do this, two to the n divisions for n variables, so there are up to 2^n numbers, the ranks. What are the relations between these numbers, what are the universal relations? This, of course, I don't know; such relations are there, and this connects to many things. And then let me make a little digression concerning genetics, because it is a beautiful instance of how probabilistic reasoning and a little arithmetic completely changed the perspective of a science. Among the applications of mathematics to biology this was probably the most significant: it was really about understanding a basic property of evolution. Of course it was known to Mendel; for Mendel everything here was obvious. But my understanding of the story is that Mendel didn't explain it: it was so apparent to him that he couldn't imagine somebody not understanding it. On the other hand, the biologists of the time hardly understood the multiplication table; they didn't understand numbers, and in particular they were confounded by the identity in question, which for them was impenetrable. I mean, it is obvious, yes? But they discussed the issue, of course not in this language, and the question was to see it in mathematical terms; this was the point of Hardy and Weinberg. They reformulated the phenomenon, and the phenomenon is that the next-generation map is idempotent. So we have the following map (and Hardy and Weinberg treated it in correct terms), where you consider the matrices describing the distribution of certain biological genotypes, seen secretly via phenotypes, and there is a very simple operation on matrices, related to the Segre map, with this property; and if the matrices are normalized, then it is as written. And entropy enters in this way, so it is an entropically very interesting map, closely related to entropy. But what is remarkable about the map is that its square equals itself. This is amazing: it is a polynomial map (on matrices normalized to total mass one) whose square is itself, and this is very rare. When you compose polynomial maps, the degrees multiply: for a composition of two polynomial maps, typically, almost always, the degree is the product of the two degrees, right? But here it cancels, and this cancellation is kind of strange; and, only partly as a joke, I would say this was exactly the problem biology itself had: it had not been understood.
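Here is a sketch of that next-generation map in its simplest form (random mating; the matrix is random and the setup deliberately minimal): a genotype distribution is a normalized matrix, its marginals are the allele frequencies, and the next generation is the outer product of the marginals, a point on the Segre variety. Applying the map twice gives the same thing as applying it once.

```python
import numpy as np

def next_generation(p):
    """Hardy-Weinberg next-generation map: replace a normalized
    genotype matrix by the outer product of its two marginals."""
    u = p.sum(axis=1)            # marginal: allele frequencies
    v = p.sum(axis=0)            # the other marginal
    return np.outer(u, v)        # rank-one matrix: the Segre image

rng = np.random.default_rng(1)
p = rng.random((3, 3))
p /= p.sum()                     # normalize total mass to one

g1 = next_generation(p)
g2 = next_generation(g1)
assert np.allclose(g1, g2)       # idempotent: T composed with T is T

# T is quadratic, yet deg(T o T) = deg(T) rather than 4:
# the generic multiplication of degrees cancels, which is the rarity.
```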
Actually, Hardy himself wrote the formula down, but for him it was an isolated observation, an exceptional situation. From that moment on, Mendel's theory was accepted; this had been the last obstacle. Though from the very beginning the first obstacle, in my view, was a failure of understanding: Mendel had a mathematical education, and the biologists of his day were infinitely far from that; for them it was a completely mysterious thing. And now we come to the definition of entropy, and the point I want to make is that you can define entropy not as a number. On one hand you need the various properties; but in fact entropy is a kind of functor. Actually, just two days ago I received a message from Daniel Bennequin, who with a collaborator has developed an even more sophisticated concept of entropy, as some kind of cohomology: one-dimensional cohomology, and he thinks about higher cohomology. Because I haven't read it, I cannot tell you what he says; but apparently it is not surprising, and what I am presenting is probably only the first word of a longer story: this entropy (and I believe this applies to most basic physical concepts) is actually a functorial entity. It is not about numbers; numbers are an interpretation of the nineteenth century, and now we are in the twenty-first, and for four hundred years mathematicians, and especially physicists, have been speaking in numbers. Not that something is wrong with numbers, but they are only a little piece of the story. And now I want to define all this. First, there are homogeneous objects, where we postulate homogeneous to mean that all atoms have the same weight (again a categorical notion), and there are the reductions. And then we introduce, and this is kind of essential, a topology: you can compare different probability spaces even when they do not sit in one universe, right? This is a typical problem faced by physicists: we want a description of something which makes sense even if you cannot directly compare two objects. To say that one equals another, you would have to have an actual physical transformation of space moving one to the other, and then postulate this equality, so to speak; finding identical invariants is another story. So we introduce metrics, using extra arrows. These are not reductions; they are just set maps between the underlying sets, not even everywhere defined, but correspondences, and the idea is very simple. What is a small perturbation of a measure space? There are two kinds. First, you may erase some atoms of total mass less than epsilon, and add other atoms of total mass at most epsilon; the bulk of the atoms does not change, and this gives the additive part of the distance. It is an operation depending on which atoms you match and which you don't, encoded in this correspondence pi. Secondly, you may multiply the weights of the atoms by a function, without changing the atoms, such that the multiplication is not very big; and here it is tricky how one normalizes, because you divide by the number of atoms. That is tricky, and it is essential to how we do it; and this is again typical: when you try to apply something categorical to geometry, physics, whatever, you must have a conception of distance, of something comparable. Then you add the two parts together, and this gives you a distance between two spaces, even when they are not both before your eyes.
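A loose toy version of this two-part distance, for illustration only: I fix one plausible reading of the normalization (the additive part as the unmatched mass, the multiplicative part as the average absolute log-ratio over the matched atoms, divided by their number); the precise normalization is exactly the delicate point in the lecture, so treat this as an assumption-laden sketch, not the definition.

```python
import math

def toy_distance(p, q, matching):
    """Toy two-part distance between finite probability spaces p and q
    (dicts atom -> weight), along a partial matching (list of pairs).

    Additive part: total mass of atoms left out of the matching.
    Multiplicative part: average |log(p/q)| over matched atoms
    (one plausible normalization by the number of atoms)."""
    matched_p = {a for a, _ in matching}
    matched_q = {b for _, b in matching}
    additive = (sum(w for a, w in p.items() if a not in matched_p)
                + sum(w for b, w in q.items() if b not in matched_q))
    multiplicative = (sum(abs(math.log(p[a] / q[b])) for a, b in matching)
                      / len(matching)) if matching else 0.0
    return additive + multiplicative

P = {'a': 0.50, 'b': 0.49, 'c': 0.01}
Q = {'x': 0.52, 'y': 0.48}
# Match the two bulk atoms; the mass-0.01 atom is erased. The distance
# is small although the spaces live in different "worlds".
print(toy_distance(P, Q, [('a', 'x'), ('b', 'y')]))
```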
Yes, they may be far away, and the comparison can be communicated by telephone, so to speak. And then you minimize over all such correspondences. It is quite clear: two spaces are close if you can make these two operations work. Then we consider asymptotic sequences of spaces, and this is again physical: the meaning is that you make more and more precise measurements. You have a big system, and P_n means you measure with growing precision, with bigger and bigger machinery. Two such sequences are equivalent if the distance between them goes to zero. Once you have this in mind (and one day I will explain how much of the essential results of Kolmogorov's ergodic theory can be derived from it), you define what I call the Bernoulli semigroup: simply the asymptotic classes of these sequences. So two spaces P and Q are equivalent if, taking high powers, n going to infinity, they become equivalent. The original spaces may be far from equivalent, but in the limit they become so, and here the law of large numbers enters: this is just a reformulation of the law of large numbers. It is completely simple; if you know the law of large numbers, it is obvious: you apply it formally to the function log on the probability space, since when you multiply spaces, log adds. And actually this form, from my point of view, is much better than the usual law of large numbers, because there are no numbers in it. In the usual law of large numbers you have what is called a random variable. A random variable: I just cannot understand it. I don't know what a random variable is, because it contains two items, the probability space on which it lives, and then the function on that space. But in reality you don't know the probability space; you only observe something arriving at random, and it is a tremendous mess. In this form there is no random variable, only spaces; there are no functions, and that is the whole point. For the law of large numbers you don't need any variable; you need only the measure space itself, and everything depends on it. This is, in my view, the only right formulation. And of course it has a very simple interpretation; the normalization, of course, was hidden in the definition of the metric (the multiplicative part was normalized in a certain way), and then it appears here. So this is what the equivalence means: at high powers, when you observe the same event, the same machine, the same apparatus many, many times, all probabilities become essentially equal; equal up to the degree of precision needed for our purpose of defining entropy. And then you find this entropy, and this is exactly how you can measure it. Why the log of the number of states? You see, it comes from the number of states asymptotically: it is the log of the number of states after you repeat the experiment many, many times. And every experiment in physics must be repeatable many times, otherwise I don't trust it; this is built into the whole philosophy of physics. On the other hand, you can take the Boltzmann definition, of course; what I am saying is that you don't even use the law of large numbers, you just take the equivalence class. Then, a posteriori, by the law of large numbers, the two are equal. But the first definition is categorical.
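This reformulation (a high power of a space becomes nearly homogeneous, every atom of the power having weight close to e^(-nH)) is the asymptotic equipartition property, and it is easy to watch numerically. A quick simulation, with arbitrary weights: sample long strings from a finite space and observe -(1/n) log(weight of the observed atom of P^n) concentrating at the entropy.

```python
import math, random

p = [0.5, 0.3, 0.2]
H = -sum(w * math.log(w) for w in p)

random.seed(0)
n = 10_000
for _ in range(5):
    xs = random.choices(range(len(p)), weights=p, k=n)
    # log-weight of the observed atom of the product space P^n
    log_weight = sum(math.log(p[x]) for x in xs)
    print(-log_weight / n, "vs entropy", H)
    # each observed atom has weight close to exp(-n * H): equipartition
```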
You don't know a priori that it is a number; a posteriori it is a number, and in many cases, I guess, it will not be a number, while in some cases, for good reasons, it should be one. So again, under this equivalence, when the spaces go to the limit, the semigroup becomes just the real numbers; there is nothing there but numbers. And then there is the Boltzmann formula; again, it is trivially derived. Once you invoke the law of large numbers, everything about entropy follows from the case of homogeneous spaces, where all atoms are equal: if a statement is true when all atoms are equal, it is always true. One must be slightly careful about how things pass to the limit, but essentially you don't have to think; it remains kind of tautological, right? For spaces with all atoms equal, entropy is just the log of the number of states; therefore the formula always holds. However, this formula is often taken as the definition, and this, in my view, is, to say it politely, incorrect or naive; I would say it more strongly: privately, it is extremely stupid. Because it hides everything, and it causes difficulties along the way. I think it happens because mathematicians ape this formula of Boltzmann, removing the constant k. But that is not merely a formula; it is a very profound formula, and it has its significance exactly because of the constant. But then, this is a pathway. Now let us make it still more formal. If you really go categorical, you appeal to arrows rather than to objects. So you have a category whose arrows are the reductions, and then a Grothendieck group; now I say group, though it was a semigroup: our semigroup is the semigroup part of a group. Maybe I didn't have to do this, but here it is admissible, and it is sometimes very pleasant to pass to the group first. So I add two relations: there are the arrows, and there are two relations on them. The first is typical for a Grothendieck group; the second is a little touchy, but essential for our purpose. The moment you impose them, you observe the following. As usual, for objects it is defined like this: you consider the collapse to one point, this arrow, and so on; this is standard practice in categorical language, it just makes the language flexible, nothing more. Then you observe that the distance we defined makes sense for arrows: one space maps to another, so there must be a comparison here and a comparison there, and they must be consistent; again, the categorical language tells you what it should be, and it does not cheat you. So the metric passes to the Grothendieck group: you take the arrows and divide by the two relations. And then you introduce an extra equivalence relation, the Bernoulli one; but here it is much, much easier, since no powers are involved. You just have a group and a metric, a very elementary situation, far removed from the abstract setting: a group with a metric, and the metric is not invariant. You declare two elements of the group equivalent if, when you multiply them by a large integer, they come close together; the group kind of compresses things at infinity, and you require the normalized distance to go to zero. Yes, I didn't put it properly; my page was visible, but, ah, here I am.
And then there is a little point about why this agrees with the previous discussion: you can decompose a Cartesian product, and the product agrees with composition of maps, right? And then this fourth group is actually equal to the whole group. And the law of large numbers (here you have to look slightly more carefully at the law of large numbers and at how things go) shows not only that powers of spaces eventually become nearly constant, but that this can be done consistently: the approximation can be carried out consistently when there is a reduction from one space to another. This is kind of trivial, absolutely trivial, requiring no extra effort, but it is essential here. So the law of large numbers is functorial, right? If you have several systems, you can do it consistently, though only up to some limit. And in particular you have the following. Remember, we had this property, which looks like this; the point is that, because it is forced by the language, it comes out as an equality here. But I think that is accidental, and not true in general. There are examples, even in measure theory: take an infinite measure space, not finite, but with countably many atoms. Everything I said so far makes sense, but the entropy may now be infinite, since there are infinitely many atoms. However, the entropy of arrows may still be finite. And that is typical in physics: you have infinite quantities, but their differences are finite. To make sense of this, it must be implemented by some mathematical structure, and entropy is exactly like that: entropy is not just a number, it is something assigned to an arrow. If you set the definition up incorrectly, then for two infinite spaces you cannot say what the difference is; without categorical language you always have this problem, whereas in category theory relativizing is the typical move. And this happens in physics all the time: we live in a world of differences. If the full energy of the universe were right before us, we would vaporize immediately; not just vaporize, we wouldn't even be plasma, we would be bare quarks. Even the comparatively mild energy inside the Earth would destroy us instantaneously. We live on the edge where only differences are relevant; and of course the same is true of our psychology, we live by differences. And then I want to formalize a little this concept of join in categorical language. I gave one definition, which was set-theoretic; but in fact, when I speak about a join, it means I have, sorry, not two objects going to one, but one object going to two: a diverging fan. And there is an order on such fans, because one may factor through another, and the minimal one is what I call the join. It is not canonical in this category, but it is canonical up to isomorphism. And the sense of the minimal fan, if you interpret the maps set-theoretically: you see, in this category all maps are onto, all surjective; there are no injective maps. However, the basic inequalities, the basic properties I mentioned today, depend on a kind of injectivity. And the injectivity appears when you have two arrows from one space to two other spaces, and the resulting map to the product is injective set-theoretically, and also a monomorphism in the categorical language. For any category can be amplified by considering a category of diagrams in it, and these are very simple diagrams; this is what is written here.
But then there arises the following question: to what extent is the law of large numbers still functorial? As I said, for individual arrows it is functorial; but for diagrams it is not so clear, and probably it is not so. This is a kind of combinatorial question; I thought about it a little without being able to settle it. You want to approximate powers of all the arrows at once: there are many objects with arrows between them, the diagram is a graph, and you want to pass to powers and replace everything by homogeneous spaces in such a way that all the natural diagrams commute. For some diagrams this is true, but in general I don't know whether it is so. And it is a kind of basic thing in probability theory, important to know. I think the answer is no; but to find the extent to which it is true would be interesting. More essentially, of course, you have to preserve the property of being a fan in the passage; but there are weaker properties which suffice. So that is the point. And then there are the Shannon inequalities, which I have already written down, and they remain true; and now the join is rigorously defined categorically. It was set-theoretic, now it is categorical; they are the same, but again, when you want to generalize, you go categorical. The point is that now we can also speak about arrows. Remember the complicated Shannon inequality for overlapping measurements, the more complicated formula, right? It was not for a direct sum; it was something else. But now it takes a much nicer form: you see, it is the same as we had for spaces (we had it for two, now for many), but stated for arrows. If you write it out for F, spell it out, you arrive exactly at what we had before, and therefore you get that inequality. You see, there are minus signs in the spelled-out version; in the version for F there is no minus, because F itself is a minus: an arrow is a difference. And this is very typical: with arrows you always see cleanly the difference between two spaces, and this again is one of the basic tenets of categorical language. It is not a theory; categories, the way I use them, are not really a theory, but a language, extremely flexible and simple, much simpler than set theory or analysis or whatever: the most primitive language available in mathematics, and yet of a very high level. When you unfold it, it yields rather complicated things, and we shall see that it has implications which are not quite obvious. But, I say, it is much easier to remember the formula for arrows than its corollary; the corollary is rather heavy to remember. I cannot remember it, actually; it is a great effort: who is plus, who is minus, et cetera. In fact there are more corollaries, and this particular one amounts exactly to the Loomis–Whitney theorem in high dimensions, with n in place of 3. I must admit I was confused here at some point; this is how the things relate to one another.
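One spelled-out corollary of the arrow form, with its hard-to-remember pluses and minuses, is the strong subadditivity of Shannon entropy, H(XYZ) + H(Y) <= H(XY) + H(YZ). A quick numerical check on random joint distributions (the distributions themselves are arbitrary):

```python
import math, random

def H(weights):
    return -sum(w * math.log(w) for w in weights if w > 0)

random.seed(0)
for _ in range(1000):
    # random joint distribution on a 3 x 3 x 3 cube
    p = {(x, y, z): random.random()
         for x in range(3) for y in range(3) for z in range(3)}
    total = sum(p.values())
    p = {k: v / total for k, v in p.items()}

    def marginal(keep):
        m = {}
        for k, v in p.items():
            kk = tuple(k[i] for i in keep)
            m[kk] = m.get(kk, 0) + v
        return m.values()

    # who is plus, who is minus: H(XYZ) + H(Y) <= H(XY) + H(YZ)
    assert (H(p.values()) + H(marginal([1]))
            <= H(marginal([0, 1])) + H(marginal([1, 2])) + 1e-12)
```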
Okay, so this was more or less a recollection of what came before; and now I want to pass to infinite spaces, infinite measure spaces. The fact is that, categorically, whenever you have a category, there are standard operations for enlarging it and introducing new objects. So, given our category P, we can introduce a bigger category, whose objects you may call spaces over this category; one uses this kind of language. The language may frighten you, but it is an extremely simple thing: it means you have a new object with which you can operate, up to some extent, as if it belonged to your category. Here it means, for example: if you have an ordinary measure space (I hate the usual definition, but on the other hand it is familiar), you consider the measure-preserving maps from the abstract measure space, for example the interval, to finite spaces. How do you produce such a map? You cut the interval into pieces of weights equal to your atoms, and each piece goes to the corresponding atom. And this gives a set, the set of such maps, and basically this carries all the information about the measure space. You don't have to know anything else; knowing this formal correspondence is enough, because it agrees with the arrows between finite spaces, right? So everything you have to know about infinite spaces is encoded by finite spaces. This, of course, is how you do analysis anyway; but in analysis you have no language to express it without introducing this ugly infinite space, which is really very unpleasant, whereas categorically it is very easy. The new object you introduce is a functor: to each finite space P it assigns a set, the set of maps from X to that object of the category, and that's it. Sometimes such functors are covariant, sometimes contravariant, depending on the direction; again, an extremely simple thing. So exactly: the point is that in measure theory you don't need anything else. You don't need the set of points, you don't need points at all; all you need is this relation. As an exercise, you may, for instance, see what happens to Lebesgue integration theory: it comes out instantaneously, and you never pass through the Riemann integral, which is simply inconceivable in this language; all the properties of the integral appear in a second. Then see what happens to Lebesgue's convergence theorems. There is one little point here which I had overlooked and which somebody pointed out to me: when you do this, you take many objects of one category, and there is a logical point; with such categories you must be slightly careful with the language, and I am not really good at that, it is easy to make a mistake. The language is very convenient, and everything goes exactly as you want, except that you must sometimes be careful. And then there is another little point: we had this operation, the join, and now we want it up here. In measure-theoretic terms it says: if you have two finite partitions of your measure space, you can intersect them, and you obtain again a finite measure space. A finite partition of your measure space gives you a finite measure space (collapse all pieces to atoms); you take another partition, intersect the two, and that is where the arrows between the finite spaces come from. And this is the essential feature: it is not just a covariant functor, but a functor with this property. And that is all you need, right? The basic operation still makes sense.
And when you pass over to this object, the operation becomes unambiguous. So you have the category of finite spaces with this additional operation, and this encodes the measure space. Again, it is everything you do in measure theory, except that you change the notation and make things disappear from it: instead of carrying a weight of unneeded notation, you throw it away, and you already know what to do, because of category theory. And this is what we call a measure space. These measure spaces, so defined, are not exactly the usual measure spaces; they are somewhat more general, but they have all the properties of the usual ones; well, I don't want to say which properties they don't have, right? So again, everything is categorical, everything works in the usual way; all the operations we had in our category carry over, in particular transformations: in this category all operations become functorial and transform along. And again, you don't have to think about it; it simply allows you to be completely flexible with the notation and not to introduce what you don't want or don't need. And here the join operation now becomes unambiguous. Before, it was somewhat ambiguous, a manual operation; but when you add this X, and everything is governed by this X (secretly it means a partition of your space, but you don't have to say so), all these operations become canonical. You can use them systematically, without declaring each time that they exist and are good; the language simply becomes more flexible. And so, read this: the operation, minimal fans and so on. And now comes some novelty. How do we think about this physically? Again, physically it is extremely simple. You have this big system, behind which, secretly, there is a measure space. All you know about it is that you can make measurements: you apply a measurement and it gives you a set of observations, something ticking, and that is my finite set. So physically, this is exactly what it is: a physical system is a functor from my category of machines by which I measure. I have machines for measurements, and they are related to one another by arrows: one machine may be used for observing through another machine, right? And consistently so. The system itself is unknown; the machines we know, they are our apparatus, we built them, but there is some real, big object in the world, and we have these consistent observations, which means we have this covariant functor. And the question is how much you can see, whether you can see everything. Of course, as you know, when you speak about the entropy of a physical system, there is a little bit of illusion in it. Entropy depends on things: what is the number of states, right? If you have atoms at some temperature, then in quantum mechanics certain states are never reached, or reached so rarely that you can ignore them; so as the temperature goes down, the number of accessible states goes down. This was very confusing in classical physics, and it was one of the routes by which quantum mechanics came up. The entropy of a gas depends on the temperature, and not in the way you would naively think. So the entropy of an infinite system, even assuming you know what the infinite system means, is not well-defined; it is more than infinite, so to speak. It is infinite because, whoosh, it stretches.
It stretches in the direction of space; but it is also infinite in the other parameter, energy: the more energy, the more entropy. So there are several ways in which you must normalize. However, this problem kind of disappears if you fix a definite class of measurements. So again, we do not speak about the abstract reality of the system; we only try to organize what we observe. And then we can put it this way: we have a sequence of observations, and we call it full if, within our means, we cannot see more. Our machines for observation are described by finite measure spaces, and only about them can we speak; so we can say what it means to resolve the system. It means: if you add a machine q, nothing happens, right? Then entropy gives you full information about the system; everything you observe about the system is encoded. Entropy is the number of states; we do not know what the states are, but if, on adding a new machine, you see the same number of states, it means you cannot see anything new. Of course the states might have acquired a different color, but then there would be new states; if you see the same number, so be it. From this point of view, entropy gives you full information about the physical world: there is nothing but entropy. This is of course an overstatement, but it is a question of interpretation. And then the example, and this example will be essential for us: infinite products. But maybe I'll come to them after the break, because they are essential for what follows, and today I am actually planning to finish a little early. But let me repeat, I want to emphasize this concept. We make observations with certain P's, meaning we have an infinite system of observations; then we couple them with other machines and see what we observe, and the point is that the entropy does not change under all these finite additions, right? It means that for every q there is some p which, when added, shows you as much as you would see with q, in terms of entropy. This is how it is encoded in this language. And again, this is what is interesting, in my view. Physically it is an obvious thing; you don't think about it, it is so natural. Mathematically, I don't know how to say it in the traditional language: pages of formulas, you start writing formulas, completely immaterial; you would have to develop and exploit an unrelated language. This way is better. Again, I haven't read what Bennequin wrote; maybe there is an even better language, with better meaning, suggesting more structure behind it. OK, so let's make a little break. So, in probability theory the fundamental operation is taking products, which corresponds to considering independent variables; and even with dependence you have something like products. In this language we can consider infinite products, defined by their finite subproducts, profinite products of probability spaces, so to speak. Such an infinite product may be something which is seen only through its finite products: you only see the coordinate projections, and by these everything is defined. And there is one kind of minimal example, where the corresponding object is essentially the topological Cantor set with a measure: you take a finite measure space.
You multiply infinitely many copies of it, and you have both a topology and a measure, and you can say: this is my measure space, and my measurable maps, so to speak, will be the continuous measure-preserving maps. From a high point of view it is, again, a perfect measure space, and everything we prove will be true for it. Then there is a maximal one; this is the completion as in Kolmogorov's theorem, a kind of completion of that, and it has some extra properties which are unneeded for most applications. One does it for the sake of universality. The same kind of thing happens in the Weil formalization of algebraic geometry, with its universal field: when you look at the points of an algebraic variety, you don't need this universal field, because if you consider all fields simultaneously, the universality is built in; you don't need to construct the universal object, which is artificial anyway. The same in probability theory: the Kolmogorov completion is unneeded, at least in ergodic theory. Sometimes you may use it; if you study convergence almost everywhere, it is very convenient to have. But in many cases you don't need it. And then there is this one lemma, entropically again a simple combinatorial statement, the only one needed to obtain full-fledged measure theory; I don't want to repeat what is written. It is, again, a statement about arrows in this category, completely trivial once you state it this way, a kind of exercise; and what it says, essentially, is that these products are like the usual interval. Just for your satisfaction: if you want to say that your theory applies to the interval with the usual Lebesgue measure, you have to state this; if you don't want that, you don't have to, it is purely philosophical, you don't need it. And in between I give this formula: when one space maps onto another, the entropy of the arrow is given by this kind of formula. This is one of the typical things people usually call relative entropy; and it is a minor point precisely because it does not stay true in more general contexts. So this is a little discussion saying that the basic properties of measure spaces, which people like very much, can be expressed in this finite language. Of course one passes to the limit, but you don't need those properties for most of your work in measure theory, unless you start identifying abstract measure spaces with products of other measure spaces. It is the trivial part of the activity, but it takes the most space; you see, it is a very heavy lemma, yet completely trivial; you don't have to do it again, it is purely a question of comparing languages.
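For the entropy of an arrow, here is one concrete reading of the formula just mentioned, for a reduction f from P to Q; the fiberwise identity is the standard chain rule, and the particular spaces are made up:

```python
import math

def H(ws):
    return -sum(w * math.log(w) for w in ws if w > 0)

# A reduction f: P -> Q given by merging atoms of P into atoms of Q.
P = [0.2, 0.3, 0.1, 0.4]
fibers = [[0, 1], [2, 3]]      # atoms of P sitting over each atom of Q
Q = [sum(P[i] for i in f) for f in fibers]

# Entropy of the arrow as a difference of (here finite) entropies...
ent_arrow = H(P) - H(Q)

# ...equals the average entropy of the normalized fibers (chain rule):
ent_fibers = sum(q * H([P[i] / q for i in f]) for q, f in zip(Q, fibers))

assert abs(ent_arrow - ent_fibers) < 1e-12
```

The difference form on the left is the one that survives when P and Q themselves have infinite entropy but the arrow between them is still finite, as discussed above.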
And now we come to a more interesting issue. When you have an infinite space, like a crystal (in my picture it is a crystal), each atom may vary its state, and the atoms interact, so there are infinitely many possibilities; the entropy must of course be infinite, and you have to normalize it properly. And what you do is this: you measure. We have machinery of size n; we make measurements, finite but big; we take the entropy, divide by the size, normalize, and go to the limit. The question is how to do that, and there is no general recipe, it will all be ambiguous, unless there is a symmetry. The crystal is symmetric; so what you know is: if you take some piece of the crystal and move along, you assume, and you observe, that nothing changes; you see all the same. That is your axiom. And then, what do we have? A measure space; in our terms it is this functor on the whole category, this extension of our category, so it is something rather structural. Instead of having sets and points, the measure space has inside it finite objects, the P's, which approximate it, with arrows between them; but there are no points, and the only sets present are finite. It is very similar to the language of sigma-algebras, but somewhat different, because everything is served by finite sets only; there are no infinite sets, all sets are finite. Now imagine there are transformations acting on these P's: you have the crystal, and your machine attached to it; when you transform, the machine goes somewhere, so a transformation can be applied to the P's. So we have a measure space and a transformation, and the transformation applies to every P in this category. And then we make the following arrangement: we normalize by the number, by the size of the set of transformations applied. Notice that my sets are not sets of anything here: we have no concept of the size of the apparatus, only of how many transformations we applied, and that is an unambiguous number; you take one apparatus, move it into different positions, and count how many moves you made. And the point, eventually, as we shall see, is that it is immaterial which P's you use; that is the whole point. Of course, how you move may change the result; but what I want to show, and this is the essence, is that it is immaterial which P's I take. If I have a sequence of measurements, my P's, which resolve my system (in principle I can see everything), and another system of measurements, other Q's, with the same property, then the results will be the same. And there is a little computation with the Shannon inequality, and this is the core of the proof of the existence of entropy. And this is again... oh, excuse me. Hello? No, I'm sorry. Oh, no, not now, not now. I'm sorry, I am expecting a more important call, but not this one. So this is what I want to show; this is the key place where you use the Shannon inequality. This, I find, was certainly in Shannon; then it was in Kolmogorov, in a certain setup with two systems of measurements, and there was a paper; there were even mistakes in that paper, corrected by Sinai. However, there is nothing in it, as I explain, once you use this formalism. So we want to show that if you replace the P's by the Q's, by some other system, nothing happens, provided both resolve the system. Again, the P's are a system of measurements giving a full picture of your X, and our definition of resolution is that within finite systems there is no new observation to make: everything is defined by what you see through these finite observational means. And here it is: this was the definition, and this is how the approximation works; and when we renormalize, taking things bigger and bigger, we arrive at this formula.
So when you add q: you make your observations with the extra apparatus q attached, and you look at the difference. Of course you see more when you add something; but the point is that in the limit this effect disappears. So here is my notation, and what I use, of course, is the relative Shannon entropy formula: it says that when you add this q, nothing happens in the limit, because of this inequality; there is a little computation which you have to make. And this answers exactly the point of view of equivalent measurement protocols: in one lab and in another lab you make measurements and obtain the same results. They are different measurements, but they are equivalent as measurements; you can reduce one to the other, and anything you see in one, you can see in the other. Then you may postulate that you are observing the same object: what you observe, the real world from this point of view, is the equivalence class of the rules by which you observe. Anything else is a kind of philosophy, not accessible to us. And this fits amazingly well into the mathematics. So we arrive at this fact, and it is completely abstract and general; you can do it with group actions or whatever: it is immaterial which p's and q's you take, as long as they resolve your space. And now we apply it to the simplest example, and this is what Kolmogorov did. It is purely conceptual, modulo the Shannon theorem; the Shannon inequality was also proven conceptually, but it does not quite fit here, I must say: it is not within this framework, you still use numbers and inequalities, and I don't like it, but I don't know how to eliminate it fully. So the theorem is applied to these infinite product spaces, to sequences: I have infinite sequences, powers of one finite probability space. And then there is the theorem: if there is a measure-preserving map between two such which is an isomorphism, then the entropies of p and q are equal. The simplest example you may consider: one space with weights one half and one half, whose entropy is log 2; and then another, maybe what I write is quite stupid, say 1/10, 2/3, and whatever remains. This is a measure space, and for the shifts to be isomorphic the entropies must be equal: this one and that one must be the same. Here are my p_i's, here are my q_j's; I hate the notation i, j, but when you write the weights on the blackboard they become numbered, 1, 2, 3, right? Before you write them down this way, they are not; there are no numbers. So the sum of p_i log p_i must equal the sum of q_j log q_j. You take such a space, with atoms of various weights, to the power Z, the integers, and take the other space likewise. I have these two spaces, and imagine there is a bijective correspondence, bijective up to measure zero as usual, measure-preserving, and commuting with the shift. The way I sketch it, they may then be regarded as equal. And the point is: if this happens, the entropy of the one finite space equals the entropy of the other. This was a quite remarkable theorem; it had been a big problem to decide when two Bernoulli shifts are the same. Here the space is infinite, but this is compensated by the fact that the maps are equivariant; so, secretly, the entropy of the big space is an infinite entropy divided by an infinite group, giving you a finite number, and you come back to what you started with.
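Kolmogorov's theorem makes the base entropy a computable invariant of the shift. A tiny illustration, with the standard textbook weights rather than the ones improvised in the lecture:

```python
import math

def base_entropy(p):
    """Entropy of the base of a Bernoulli shift B(p); by Kolmogorov's
    theorem it is an invariant of the shift up to isomorphism."""
    return -sum(w * math.log(w) for w in p if w > 0)

half = [0.5, 0.5]
quarter = [0.25] * 4

assert abs(base_entropy(half) - math.log(2)) < 1e-12
assert abs(base_entropy(quarter) - 2 * math.log(2)) < 1e-12

# Different entropies: the shifts B(1/2,1/2) and B(1/4,...,1/4) cannot
# be isomorphic. (By Ornstein's later theorem, equal entropy is in fact
# also sufficient for Bernoulli shifts.)
assert base_entropy(half) != base_entropy(quarter)
```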
And you come back to what you started with. And this, of course, is a very general, very fundamental issue: you have some physical system which you observe not in itself but only when you repeat the observation infinitely many times, with n going to infinity. You don't have individual observations, only repeated observations, for a very large number n. How much can you say from this? And the answer is: you can reconstruct the entropy, and nothing else. If two systems have equal entropies but are otherwise quite different, you cannot tell them apart. But the entropy is an invariant, and the proof, after all is said, is rather instantaneous. The only property of the integers which it uses is this bottom formula, where the specificity of the situation enters: the ratio (n + i)/n converges to 1 for every fixed i, and so the correction cancels off. That is all that is in there, apart from the generalities which I described.

And this was proven by Kolmogorov a long time ago, in 1958. And as I said before, it is unclear to me, since I don't know the original physics papers well enough, how much of this was understood by physicists. In a way they might have had this idea, in a different language, of course, and Shannon might have understood it too. I am very unclear about what was understood and what was not, because at that time, I don't think, there was any proper communication between the different communities, between people close to Shannon on one hand and physicists, like Van Hove, on the other. You can imagine that this was obvious for physicists, in a way: that's entropy. Of course they understood that if you have a physical system like a crystal, you measure entropy exactly by this process of exhaustion, as in Van Hove's theorem. And again, I guess it was known before Van Hove, even if it was not written down, that entropy doesn't depend on how you measure things, as long as the measurement is sufficiently faithful, as long as you don't miss something. This is kind of obvious; it depends only on the dynamics. But this point, that it depends only on the dynamics, maybe was not quite clear. I don't know. But anyway, it was done.

And then, in my view, nothing happened for a very long time, because this argument applies to what are called amenable groups. And amenable groups are exactly the class you can imagine in physics. So you have these transformations, and I prefer to speak, actually, about two-dimensional things rather than one-dimensional. The essential feature is that if you take big, properly chosen domains, then the number of points on the boundary is significantly less than the number of points inside. So the boundary effects become negligibly small: you can make measurements using these transformations, and the error coming from the boundary cancels out, because the proportion of boundary points is smaller. Such groups are called amenable, or more generally such arrays of points are amenable: you can exhaust them so that the number of boundary elements, in the limit, goes to zero in proportion to the number of points inside. And a typical example of a group which is not like that, where none of this applies, is the following. This was the narrow neck, the bottleneck of the proof.
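As a toy illustration of this boundary-to-interior ratio (my own numbers, just n-by-n boxes in the two-dimensional lattice):

```python
# Boundary of an n-by-n box in Z^2 versus the whole box: the ratio
# tends to 0, which is the amenability (Folner) property used above.
for n in [10, 100, 1000, 10000]:
    total = n * n                  # points in the box
    boundary = 4 * n - 4           # points on the edge of the box
    print(n, boundary / total)     # 0.36, 0.0396, 0.003996, ...
```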
Again, everything else is just general principles. It is the free group with two generators, whose graph is a tree branching, at every vertex, into 3 new directions, so to speak. It branches everywhere, and at every step you have a geometric progression; the sum of a geometric progression is only about twice the last term. So it grows pretty fast. And it was known, there were examples showing that this Kolmogorov theory breaks down there completely. It breaks down completely.

And then, quite amusingly, recently, a few years ago, it was discovered that it is still true for the free group. So for the free group on two generators, or any number of generators: if you take, again, two finite probability spaces P and Q, raise them to the power Gamma, and they are isomorphic, then the entropy of P equals the entropy of Q. However, something else is not true which was true for amenable groups. If you have an amenable group and you have a reduction, reduction meaning now an equivariant map in this category (again, a measure-preserving map, to be understood in this abstract language, of course, but you can also understand it in the traditional way: an arrow which pushes the measure forward to the measure and is Gamma-equivariant), then the entropy goes down, which is not surprising: you have a smaller system, the entropy goes down. This is not true for the free group; an example was constructed quite a while prior to that by Benjamin Weiss, with a coauthor whose name I keep forgetting. And then it was certainly clear that the whole theory, as it was, cannot work. And then it was proven by Lewis Bowen, nevertheless, that this is true: if there is an isomorphism, the entropy is preserved.

And what is interesting is the philosophy behind it. It is very close to the original one of Boltzmann, and even to the way it was presented by Kolmogorov; later on it was refined by Sinai, but in this refinement an essential point was lost, and it was partly recovered by Bowen. And the point is that if you read what Boltzmann says and how he thinks, he certainly operates in terms of non-standard analysis. For him, obviously, there are infinitesimals; there are infinitely big times and infinitely small times. And then, of course, much of statistical mechanics was formalized in a different language, not exactly how he meant it, and many things he said may look self-contradictory, because if you don't accept infinitesimals, of course, you cannot say what he was saying. And what is interesting about this theorem is that it again uses non-standard analysis; "analysis" is just a word, it uses a non-standard kind of logic, where you allow infinitesimals.

And so the groups for which this is true, the groups with this property of the free group and other groups, are called sofic. So there is a class of groups which includes free groups; and in fact it is not proven that all groups are sofic, but there is no group which is known not to be sofic. Probably the majority of groups are not; but there is no counterexample, and all known groups which you can decide about are sofic. And the idea of soficity is very simple, and it is as follows. What is true when you have an amenable group, as I said, is that you can exhaust it by arrays so that the boundary is very small compared to the interior.
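A quick numerical check of that growth (my own sketch; in the free group on k generators the sphere of radius n has 2k(2k-1)^(n-1) elements):

```python
def ball_and_sphere(k, n):
    # sizes[r] = number of group elements at word-distance exactly r
    sizes = [1]
    for r in range(1, n + 1):
        sizes.append(2 * k if r == 1 else sizes[-1] * (2 * k - 1))
    return sum(sizes), sizes[-1]

for n in [5, 10, 20]:
    ball, sphere = ball_and_sphere(2, n)
    print(n, sphere / ball)  # stays near 2/3, never goes to 0: not amenable
```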
Which means that if I take any transformation there, it is defined everywhere except on a very small proportion of elements. So the group approximately acts on a finite set. There is another way a group may act on a finite set. Take for example the group Z: you can make it act on an interval; the ends go outside, but the proportion of points going outside is negligible as the interval grows. So with any degree of precision you can make it act on a finite set, and there will always be some proportion of situations where the action has a problem, but this goes to zero: for any property, any formula you write down, the error disappears in the limit. And there is an opposite, different way: you can make the group Z, the integers, act on a finite group, on a finite set: you can make Z act on Z/NZ. So you have this Z/NZ, and you have rotation, and here the action is perfect, there is no error. However, these are quite different kinds of actions, and there are groups which act in one way but not in the other. The free group admits this second, exact kind of action on finite sets with any degree of precision; such groups are called residually finite.

And then you can say: well, let us make a definition out of the first picture. You formalize what you have: an approximate action on a finite set, and then bigger and bigger sets with better and better actions. And so you can say: when you let N become infinitely large, you have this finite set, but N is an infinitely large number, and the group acts there. In more technical terms, you can say that sofic groups are groups which act by symmetries on non-standard compact metric spaces. If you know this language, it is very easy to say; naively, it is as I described. And then for these groups this theorem holds, and the proof is not as simple as in the Kolmogorov theorem: you really have to go into the computation in a much more profound way, and it depends on all this kind of injectivity in there. So I don't want to dwell on this too much, because I want to arrive at something else, and I want to finish today at the moment where our next topic will be quantum entropy.

So this is one direction. And the problem which I want to bring forth is: can you define the entropy of these objects, of dynamical systems, category-theoretically? I formulated it somewhere. With our definition of entropy, I haven't defined entropy for dynamical systems; it can be done by a kind of approximation argument, in a rather traditional way, except without mentioning measure theory. But the question is: what is the right category in which to define entropy for dynamical systems? And certainly, again, you know some properties you expect; and some of them, by the way, are satisfied and some are not for Bowen's entropy. One must be somewhat careful; again, there are kind of flaws. But the difficulty which I see, though maybe it is not so difficult to resolve, is this. First, there is the natural category: we have spaces X acted on by the group Gamma, and morphisms which commute with the actions. But this category may not have enough arrows to define a proper entropy. When we were working with finite spaces, we had additional arrows, these partial correspondences, and it is unclear what they should be here. They serve as a kind of coordinate system, and then you cancel them away. And it is unclear whether you can define this entropy; and if you define it, you have to compute it.
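To make the two kinds of finite approximation of Z concrete, a small sketch (my own; N = 12 is an arbitrary choice):

```python
N = 12

# Exact action: the generator of Z acting on Z/NZ by rotation, no error at all.
rotate = lambda x: (x + 1) % N

# Approximate action: Z shifting the interval {0, ..., N-1}; only the
# right endpoint has nowhere to go, an error fraction 1/N that vanishes
# as the interval grows.
shift = lambda x: x + 1 if x + 1 < N else None
errors = sum(1 for x in range(N) if shift(x) is None)
print(errors / N)  # 1/12 here; goes to 0 for larger and larger intervals
```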
And again, in these examples of Bowen, with this sofic entropy, it is unclear whether the entropy is unique, whether it is a number or something much more elaborate: it depends on choosing a particular non-standard model, and things are not coherent. So on the one hand there are these remarkable results, something invariant; on the other hand the understanding is missing. And there may be a fundamental reason for that: maybe we have to change the concept of entropy more radically. So this is a question. And then we start with... oh, by the way, I have a misprint. So today I'll finish earlier, and we start with this subject next time.