The starting point is an idea you can trace back to Mendel: the constituents of what we see phenotypically are something we do not see, rather as ordinary matter is a manifestation of atoms and subatomic particles which we never observe directly. Here the observable features are associated to genes, but each gene is composed of two (or in general several) units, say A and B, and what you observe is a function of these two variables. You do not concern yourself with what this function is; you concern yourself only with the components, A, B, maybe C, and with their distribution in the population, so you have a weighted sum in these formal variables, a polynomial in A, B, C. Then you think about what happens in a randomly mating population: organisms do not exchange whole genes, only gametes, and when you formalize this, as was done implicitly by Mendel and more explicitly by Hardy in a special case, you arrive at a class of interesting dynamical systems. Then you can forget where they came from, so let me summarize what these systems are.
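Before that summary, the "population as a polynomial in formal variables" picture can be made concrete. A minimal sketch (my own illustration, not from the lecture): random mating is literally multiplication of the gamete polynomial by itself, with like terms collected; coefficients are kept exact with `Fraction`.

```python
from fractions import Fraction
from itertools import product

def mate(gametes):
    """Random mating: the genotype distribution is the 'square' of the
    gamete polynomial -- multiply it by itself and collect like terms."""
    genotypes = {}
    for (a, p), (b, q) in product(gametes.items(), repeat=2):
        key = tuple(sorted((a, b)))          # Aa and aA are the same genotype
        genotypes[key] = genotypes.get(key, Fraction(0)) + p * q
    return genotypes

# gamete polynomial (7/10)*A + (3/10)*a; coefficients = population frequencies
gametes = {"A": Fraction(7, 10), "a": Fraction(3, 10)}
offspring = mate(gametes)

# (p*A + q*a)^2 = p^2*AA + 2pq*Aa + q^2*aa
assert offspring[("A", "A")] == Fraction(49, 100)
assert offspring[("A", "a")] == Fraction(42, 100)
assert offspring[("a", "a")] == Fraction(9, 100)
```

The specific frequencies 7/10 and 3/10 are arbitrary; any gamete distribution works the same way.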
So these systems live in some topological algebra, and in the case of genetics it is a truncated polynomial algebra: you consider polynomials in many variables modulo the ideal generated by certain monomials, so that the degree in each variable is less than something, say one in this case, but it could be anything. This is a really very well-shaped algebra, and one property of it, implicit in the computations of genetics, is that it has an exponential map from the algebra to itself which is essentially surjective: every polynomial starting with a positive free term is in the image, so you have a logarithm. If you write this property of the exponential explicitly it is not at all obvious, but it is a quite remarkable feature of this algebra that the additive and multiplicative groups are the same, and this is implicit in genetics. But now, what are these transformations?
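As an aside, the exponential claim is easy to verify by hand in a small truncated algebra. A sketch (my own construction; the truncation order N and the coefficient-list representation are choices for illustration): in R[x]/(x^N), every element with zero free term is nilpotent, so the exponential and logarithm series terminate and invert each other exactly.

```python
from fractions import Fraction as F
from math import factorial

N = 5  # work in R[x]/(x^N): truncated polynomials of degree < N

def mul(a, b):
    """Multiply two truncated polynomials (coefficient lists mod x^N)."""
    c = [F(0)] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

def power(a, k):
    r = [F(1)] + [F(0)] * (N - 1)
    for _ in range(k):
        r = mul(r, a)
    return r

def exp(a):
    """Exponential; a must have zero free term, so a is nilpotent (a^N = 0)
    and the series is a finite, exact sum in the algebra."""
    return [sum(power(a, k)[i] / factorial(k) for k in range(N)) for i in range(N)]

def log(b):
    """Logarithm; b must have free term 1, so b = 1 + u with u nilpotent."""
    u = [b[0] - 1] + b[1:]
    out = [F(0)] * N
    for k in range(1, N):
        pk = power(u, k)
        for i in range(N):
            out[i] += F((-1) ** (k + 1), k) * pk[i]
    return out

a = [F(0), F(2), F(-1), F(3), F(0)]      # 2x - x^2 + 3x^3, zero free term
assert log(exp(a)) == a                   # exp and log are mutually inverse
```

The identity exp(a)·exp(b) = exp(a + b) then identifies the additive group of nilpotents with the multiplicative group of elements with free term 1, which is the "additive and multiplicative groups are the same" phenomenon.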
So, the transformations go in two stages: first there are endomorphisms of the algebra, A to A, and then you take products of them. How do these endomorphisms come to life in examples? This algebra is typically an algebra of functions on some space, so it acts on a space, and endomorphisms come from maps of the space. In the case of Mendelian dynamics you have a Euclidean space with coordinates, and the projections to coordinate subspaces give you endomorphisms of your algebra. These endomorphisms have a nice and important feature, which also has historical consequences: the whole story started when biologists realized that the Mendelian logic, implicit in Mendel, tells you that variation stops after the first round of reproduction. You have a mixed population, you get a new population with a different distribution of phenotypes, but when you mix again, the distribution of phenotypes does not change. So the slogan of survival of the fittest does not apply: the change happens only once, and then nobody dies anymore. The people around Darwin were deeply unhappy, because this completely destroyed what they had been saying, and justifiably so, because what they had been saying was not correct. Behind all this, on the first level, each projection to the coordinates has the property that its square equals itself: p² = p.
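The "variation stops after one round" statement is the Hardy–Weinberg law, and the idempotence of the random-mating map is a one-line check. A small sketch (the starting genotype frequencies are arbitrary, for illustration):

```python
def next_generation(f_AA, f_Aa, f_aa):
    """One round of random mating for a single two-allele locus."""
    p = f_AA + f_Aa / 2          # frequency of gamete A in the pool
    q = f_aa + f_Aa / 2          # frequency of gamete a
    return (p * p, 2 * p * q, q * q)

g0 = (0.5, 0.1, 0.4)             # an arbitrary starting population
g1 = next_generation(*g0)        # the distribution changes once...
g2 = next_generation(*g1)        # ...and then never again
assert all(abs(x - y) < 1e-12 for x, y in zip(g1, g2))
```

In operator language: if M is the map on genotype distributions, then M ∘ M = M, exactly the projection property p² = p.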
Second, on the second level, you multiply these endomorphisms, and then you get a new kind of map: multiplicative endomorphisms, not of the algebra but of its multiplicative group. As I said, these are quite non-trivial and quite amusing maps, and the typical map of this type, which I want to bring to light because it is quite remarkable, is the Veronese map. The first one is a map from the projective space Pⁿ to a sphere in the space of quadratic forms. It is obtained by taking a linear form on a linear space and squaring it, so it becomes a quadratic form: you map the space of linear forms to the space of quadratic forms, and since the squaring factors through the projective space, the image is the projective space. In the simplest example, S² goes to the projective plane P², which then goes into the sphere S⁴, and if you think about it this is quite remarkable: a symmetric surface sitting in the sphere, an orbit of the orthogonal group, even of the linear group. And this is the point where all the symmetry enters: we assume independence of events, and independence is an assumption of symmetry. This is not just a philosophical point; mathematically, independence means you assume things are as symmetric as they can be, and this is why all the symmetry immediately comes in. So far this concerns a single gene, a single locus, and the maps we get this way, these multiplicative endomorphisms, also have the Mendelian property which Mendel emphasized: m² = m. So if we look at a fixed phenotypic feature associated
with one gene, its distribution in the population changes on the first round of random mating, and this has nothing to do with selection: the variation you observe in the population has nothing to do with selection, only with the mixing of the present gametes. Then, on the next level, when you have these m's, you consider convex combinations of them, and at this moment some analysis enters, because the coefficients of the convex combination must be positive; otherwise you do not get anything manageable. Such a combination corresponds to having several genes, and now the dynamics does not stabilize immediately, but it converges exponentially. The reason, again, lies in the kind of maps you see in this example: your fundamental endomorphisms are projections to linear subspaces, out of them you build other things, and they are invariant under a very big symmetry group — exactly what physicists call the renormalization group — and because of the symmetry you can control them pretty well. The basic result here is that these maps (one has to check which conditions they satisfy, and the condition is very simple) have a unique fixed point, the equilibrium position, and this fixed point is exponentially attractive. You see this because once you know it is locally attractive, the big renormalization group makes it attractive in all directions transversal to the space of equilibria, so everything converges to the space of equilibrium maps. And the equilibrium is the one which maximizes the entropy — this is where entropy enters. Just a couple of words should be said about the biology: this fundamental property of sexually reproducing organisms, that it is not the features which are inherited but the hidden units carried by the gametes
inside, which are invisible and which are inherited — this ensures the stability of populations. If not for that, we would all be dead pretty soon; we would degenerate. Elementary mathematics shows that we would die out, and it is very easy mathematics: there would be an accumulation of deleterious mutations, and eventually any population with no horizontal gene exchange, with no sexual reproduction, tends to degenerate — the time scale depends on the situation. What makes a population comparably more stable is that it keeps units which do not manifest themselves, so that if some become somewhat deleterious, others compensate. That theory is quite different, and I have not looked carefully into its mathematics. In any case, in genetics we have two time scales: the one I described, coming from Mendel, and then the long scale corresponding to mutations, because these A's and B's may be subject to certain modifications, which are not at all trivial — they are not just random noise. In traditional evolutionary theory people were confused about this; some clarity has been achieved relatively recently, but it may be deceptive, because the same thing has happened many times. When Darwin came with his theory, those who supported him were very happy and said: now we understand evolution. Then the next generation came with Mendel, and then Haldane, Fisher and Wright developed the mathematical formalism and said: no, that was complete nonsense, now we understand it — this was called the modern evolutionary synthesis, in the twenties or thirties. Then in the seventies molecular biology took over, and Monod, one of the great biologists of that
time, said that now we understand evolution at the molecular level, and he was extremely sarcastic about everybody else speaking about evolution and claiming to understand something there, because in his opinion — justifiably — they understood nothing. But now we would say that Monod understood just as little, because he had very few data to stand on. Today people say: now we understand, and the reason is that the molecular data run to petabytes — the total length of the sequences now being analyzed is, I think, of the order of 10¹⁵ units. From such data you can make non-trivial statements about the evolution, about what actually happened; with something like 10⁵, which is what Monod had, you can conclude nothing — it is incomparable. So now people say: well, now we understand something; one can reconstruct the common ancestor of all living organisms, which lived about 3.5 billion years ago. But on the other hand it is explicitly stated that we understand nothing: this is a tiny little piece of a situation infinitely more complicated than people believed 200 years ago, or 100 years ago.
So much for that; now I want to return to entropy, because it enters here. Well, maybe one last point: these were very special endomorphisms, and mathematically one immediately asks what transformations of this kind are available when you have an algebra with this kind of dynamics in it, and what the other examples are. Another pronounced example is of course the normal law. The normal law comes from exactly the same scheme: you have the algebra of functions under convolution, you take the convolution square, and the result gets rescaled — you must rescale correctly, and the rescaling is essential here; on the polynomial level it rarely matters, but in this setting it does. Then you have a fixed point with a huge basin of attraction, and this is the normal law. The question is what happens for other fixed points of such transformations which are sufficiently attractive, and what the similar transformations are; I do not know how much has been done on that. Of course many other algebras and examples immediately come to mind, but I have not looked at them. Even here, as I said, there is a lot of other mathematics, but the point is that if you want to make mathematics you really have to separate yourself from reality, because of course this ideal genetics is never satisfied; on the other hand, this is exactly why the mathematics is useful — because you can understand what it is and use it, roughly. Typically in population dynamics one considers a more general kind of mating, and this mating, if you impose nothing, is more or less a general quadratic map from a Euclidean space to itself. And then what?
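Before moving on: the normal-law fixed point of "convolution square plus rescaling" can be watched numerically. A rough sketch (the sample size, seed, and number of iterations are arbitrary choices of mine): start from a coin flip, repeatedly add two independent copies and divide by √2; the variance stays fixed while the excess kurtosis, which vanishes exactly for the normal law, decays toward 0.

```python
import random
import statistics

random.seed(0)

def convolve_square_rescale(samples):
    """One step of the 'square under convolution': add two (approximately)
    independent copies and rescale by 1/sqrt(2) to keep the variance fixed."""
    shuffled = samples[:]
    random.shuffle(shuffled)
    return [(x + y) / 2 ** 0.5 for x, y in zip(samples, shuffled)]

# start far from Gaussian: a fair coin taking values -1 and +1
x = [random.choice((-1.0, 1.0)) for _ in range(200_000)]
for _ in range(6):
    x = convolve_square_rescale(x)

# at the fixed point (the normal law) the variance is 1 and the
# excess kurtosis is 0; after 6 steps we are already very close
m = statistics.fmean(x)
var = statistics.fmean([(v - m) ** 2 for v in x])
kurt = statistics.fmean([(v - m) ** 4 for v in x]) / var ** 2 - 3
print(var, kurt)
```

Pairing by shuffling the same pool is only approximately independent sampling, but for a demonstration of the attraction to the Gaussian it is more than enough.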
Nothing: a quadratic map can be just anything. Any polynomial map may be seen as a quadratic map, because all algebraic operations are generated by addition and multiplication, which is of degree two; so everything is quadratic, everything can be there, and then you can say nothing specific, which is not what you want — that is the problem. On the other hand, you can imagine other models, for example evolution with sufficiently separated scales. This is another feature here which is very non-physical, opposite to what we shall see in a second: scale separation. All of biology depends on the fact that there are many separated scales, and the related mathematics is partly developed in what is called tropical geometry — but again I have not looked at this myself carefully, and even what I have looked at I do not want to discuss here. Now I want to turn to entropy and do a similar thing. Our very naive starting point was: we have two groups of organisms, two fields of flowers of different colors; they were separated, they mixed up, and we see the distribution of colors, see how it develops, and see that it stabilizes too fast to be accounted for by selection — it is this mathematical phenomenon. And the mathematics here is amazingly ready-made: it is basically algebra and homomorphisms, something quite nice, with symmetries — the randomization is typically a symmetry group. Now, what happens with physics? Boltzmann and Mendel were almost contemporaries — Boltzmann slightly younger than Mendel, though of course he never heard of Mendel — and they pursued the same idea, that the fundamental world is discrete rather than continuous. Mendel showed that the basic units of inheritance are discrete, which again is opposite to the basic premises of the Darwinian theory — Mendel was right and Darwin was wrong on every point — and Boltzmann was insisting that the world was discrete on
the level of both structure and energy. He promoted the atomic theory — conjectured, of course, long before him — and he also suggested that energy is discrete; this was the suggestion he made to Planck, and Planck then introduced his constant in the discrete definition. But what was the model, what was the mathematics he was trying to pursue? Again, you have a very simple physical or biological scheme, but the mathematics is always the same scheme; only your interpretation depends on your background. So what is Boltzmann's logic? We have a system of particles, but at that time atoms were semi-conjectural — some people still did not believe in atoms, justifiably, because as we know today atoms cannot exist in the classical framework; the notion only makes sense in quantum mechanics, otherwise it is a self-contradictory concept. Still, you pretend, and you want to understand a system consisting of many atoms and assign to it an entropy. Entropy, of course, came into physics on a different plane, but let us stick to Boltzmann's ideology. As a physicist, you can read in a physics text that entropy is just the log of the number of states, and we should try to decipher this: what is the system, what are the states, what does it all mean? You can say: aha — this is how it was interpreted by the mathematicians of that time, the interpretation coming along with Cantor's theory of sets, and indeed Cantor's theory is in the same spirit, everything built out of basic units, elements of sets — so there is a set of states, and entropy is the log of the cardinality of this set, of all possible positions of the atoms. And the immediate objection is that this is an infinite set: it has no finite cardinality, the answer would be infinity, and it just doesn't work
but of course that is because we are trying to read this in a language which is not appropriate here. What I want to explain is that the way people were teaching Boltzmann until recently was completely arbitrary; it was the wrong language. For me, the starting point of the right language, for this particular purpose, should be somewhere around 1960 — pre-Grothendieck and post-Grothendieck, let us say; of course other people were involved, in particular those accompanying Grothendieck in categorical thinking. Another development of about this time is non-standard analysis, which did not exist before. And what Boltzmann was saying can be interpreted completely rigorously in terms of non-standard analysis and categorical language mixed together. All these infinities relate to the concept of the space of states — but there is no space of states; it is an abstraction. A mathematician may posit it, but you may well say: there is no such thing. So what do the physicists actually have, what do they observe, and how do they interpret what they observe? Let me explain that, and then explain how to transform it into the post-Grothendieck language. The physical system for Boltzmann was a gas; I prefer it to be a crystal — just for the purpose of the discussion it is slightly easier. So it is a crystal: secretly you think of atoms positioned in a lattice, in a symmetric pattern in space invariant under the three translations, the group Z³ — but you don't know that; it is just your mental picture. What you do is measure it: you take some machine, put it here, and see what you see. The machine is another physical system with its own states; think of it as a box with windows, with something blinking in these windows — here, and here, and here. And then you take
another such machine, put it somewhere else, and you see the same blinking; you can move it around in the space and see what happens — again they blink together. That is all you see; moreover, you don't have colors or anything, you only have frequencies of blinking. If you can move one machine to another by a translation of the space, you say they are essentially the same, and then you can compare the corresponding windows, and from that you develop the concept. But the only thing the physicists really have is entropy. My description will not be quite complete: I describe it in categorical terms, but probably the right language would be 2-categories, because there is in fact a protocol of how you do all this, and these protocols are morphisms in a 2-category; that is probably the better formalism. So the — not the right, but the simpler, more adequate — formalism from the Grothendieck point of view is the one I am going to describe. What is the entropy of this physical system? I want to insist that there is no set here: there is entropy, there is a number of states, but there is no space of states. This number and this log must be understood in an appropriate way, by translating what Boltzmann says first into categorical language and second into the language of non-standard analysis — where the right object is a certain Grothendieck group of a non-standard completion of a certain category. This is what I want to explain, and it is an extremely simple thing: just words, translating what I said. So again: we do not know what the system is; we have attached to it these little things with windows and we see the blinking; sometimes we can move a machine, sometimes we can put machines together; each of them is a physical system in its own right, and you can attach to it a smaller one and
see what that one shows, and so on: you take this machine, put it here, connect this machine to that machine, and so forth. So what is the category? I am describing a very simple category — a kindergarten category — but if you say things in these terms everything becomes extremely straightforward. The basic category is that of finite measure spaces. It is very simple: the objects are collections of stones with given weights, normalized to have total weight one — the normalization fixes your measurement units — and for the morphisms it is better to say, as I said before, not stones but drops of water: you can bring some of them together, and their masses add. These are the morphisms. An immediate objection might be: why have a category at all? But we shall see in a second that there are advantages, even in this example, to speaking in categorical terms about entropy. Before going to entropy, let us go ahead and then return to it. Our finite measure spaces correspond to these little machines, which have finitely many windows, and the frequencies of blinking are our weights; our morphisms go from one space to a smaller one. But then there is the big one, which is not of this nature: the original physical system is not like that, it is infinite, while these are finite — finitely many windows, the blinking there, the weights being the frequencies. When you look at the system with limited means — for example, you only count the blinkings here and here — you have a morphism from one finite measure space to another. But what about the big one, the one you measure by means of the small ones? You know perfectly well what it is in categorical language: it is a covariant functor from this category of finite measure spaces to the category of sets. I mean, you
don't have to think — you know what it is; if you know this language you do it effortlessly, and then all of measure theory as we know it comes effortlessly from there: all statements, theorems and proofs of measure theory are part of this categorical language. In this language everything flows: you just say the standard sentences, absolutely trivially, without thinking, and you have all of measure theory in your language once you have this category. What people usually do in measure theory instead is use the set-theoretic language, and this creates problems. Historically — and for me this is the justification that the set-theoretic language is bad — when entropy was introduced into dynamical systems by Kolmogorov, in 1958 I believe, his first paper contained a mistake. In this context the matter is so trivially clear that there is no room for mistakes — and the mistake happened exactly because he was using his own language; he had, more than anyone, created the measure-theoretic language, and this language is not adequate here. It only creates problems; it is ballast you carry with you, absolutely unnecessary, and you make mistakes in this ballast. Of course his ideas were completely right — kindergarten ideas, in the good sense — but the language is very awkward, and when you write in it you make mistakes, because it is complete ballast. The mistake was corrected by Sinai, and so it is called Kolmogorov–Sinai entropy; but again, this concerns the ballast, not the core, which is what I am describing. But still, let us now turn to entropy. Eventually the point is to understand what the entropy of this kind of physical system, like a crystal, is in these terms. The physical idea of entropy is: you measure many, many times in a sequence, and then you average, and then you get it. The typical formula which you write for a finite measure space would be −Σ pᵢ log pᵢ. On the other hand, if you
look at this picture: you measure something here, then take an identical thing and measure it again, and again, and look at the coherence of these measurements. This becomes an extremely messy business, and if you want to count the number of states this way, you have to repeat the experiment very many times; on the other hand, the formula is very concise — and that is exactly the point: it is not the definition of entropy, it is a very efficient way to compute it, suggested by Boltzmann and mistaken for the definition by mathematicians. (Actually it was not Boltzmann but Planck who first wrote this formula.) But again, it is a computational formula, not the definition. So what would be the definition? Let me first give you the mold of the definition and then decipher it. I have my category of finite measure spaces, call it FM. To any category you can assign its Grothendieck group, or Grothendieck semigroup — we shall do that, with a certain care which we keep in the back of our mind. This is preferably assigned to morphisms: you take the group, or semigroup, generated formally by the morphisms, with the relation that the composition of f and g gives the element f + g in the Grothendieck group. So it is an abelian group or semigroup — I prefer the semigroup in this context — a very simple abelianization of your category. And in the first approximation, the entropy of my finite measure space is its image in this Grothendieck semigroup — and that's it. But then, a posteriori, this group or semigroup is isomorphic, as a topological object — actually this category also carries a topology, which I have suppressed for the moment; it will come back in a second. Otherwise, if you just say it like that, you get a huge abelian
group or semigroup — uncountable, something horrible. By the way, even if you restrict to special spaces it is still not completely trivial; but it is topological, so you do everything continuously. In any case, I want to bypass this and come closer to the core, to explain how it actually works and how to look at it from a somewhat different perspective. In fact the correct statement involves the Grothendieck group not of this category itself but of a non-standard model of this category — given any category you can form its non-standard model, and we shall return to this; it connects to more modern results in ergodic theory, where you have to work with non-standard models of groups and categories. But for now I want to look at it in a more naive way, through an example which I love very much and which surprised me a lot when I learned it — the Loomis–Whitney inequality, if I pronounce it correctly. I already spoke about it, and I want to repeat it, because here the idea is extremely clear, and it forces you to think about other mathematical problems. Consider a measurable set in Euclidean three-dimensional space — the statement is true in any dimension, but this is the first non-trivial case — and consider its three projections to the coordinate planes: the (x,y)-, (x,z)- and (y,z)-planes. The set has a certain volume V, and the three projections have certain areas A_xy, A_xz and A_yz, and the inequality says, not surprisingly, that V² ≤ A_xy · A_xz · A_yz. This is a weak — I mean sharp, but not sharp in a certain sense — form of the isoperimetric inequality, because if you have a domain and you bound its surface area, the area of each projection is of course smaller than the surface area, so I get the volume as a
correct power of the areas — except that the extremal shape here is not a ball but a cube, which is somewhat amusing. Even in this form it is a very powerful inequality: it implies, for example, the Sobolev inequality, and half of mathematical physics becomes a trivial corollary of it. So how do you prove it? The proof by Loomis and Whitney was a simple computational inequality — very easy, and somewhat strange. The right way, in my view, to think about it is as follows, and it illustrates the Grothendieck philosophy. First you observe that it has nothing to do with Euclidean space: all that is essential is that you have a product of three measure spaces — call them, according to my notation, X, Y and Z — a subset of the product, and its projections to X × Y, Y × Z, and so on; you look at these measures and you have the same inequality. I never use the Euclidean structure; everything is about measurable sets, and measure-theoretically it is all the same. (I am speaking here measure-theoretically, which, as I already said, is a bad language — I said that a measure space is a functor from the category of finite sets to the category of sets, which I have not quite explained yet — but this language is sticky and we still use it. Still, there is a good reason not to use it: it is really not correct; the way the textbooks present measure theory, everything they say is wrong in the sense that a measure space is not a set. On the other hand, you can say "set" as long as you remember what you are really saying.) Once you see this, you observe that if you replace everything by its Cartesian N-th power, nothing changes, and all these numbers go to their N-th powers. Therefore, if you can prove the inequality for large N, even approximately, with some error,
when you take the N-th root back, you get what you want. So you only have to prove it approximately, after taking the power. Now — I am speaking slightly carelessly here, and I will come back to this in a second; what I am saying is not quite true, but almost — when you go to higher and higher powers, by the law of large numbers the set will essentially converge to a cube, and for the cube the inequality is an equality. Because it is not quite a cube — it converges to the cube plus or minus something, with the error in your favor — in the limit you just say: aha, I have the equality, I take the N-th root, and I have the proof. A completely effortless proof. Now let us think a little about what happens at high powers when we apply the law of large numbers — and let me formulate the law of large numbers in this context, because it motivates what I do; the definition with the Grothendieck group does not really need it, it just comes in at the end. The law applies to a finite measure space, which I write as a finite collection of atoms, and it concerns the Cartesian powers of this space. It is quite simple: I take this thing and that thing, and in every pair I put the product of the two — you remember, this was exactly the essence of the most elementary Mendelian type of map; we are not that far from where we were. The Hardy–Weinberg theorem, or Mendelian principle, says: if I take a matrix, normalized so that the sum of all entries equals one, and I replace every entry by the product of its row sum and its column sum, normalizing again so that the total sum is one, then I have a map from the space of matrices to itself, and the square
of this map equals the map itself. This was the kind of paradox resolved by Hardy and Weinberg in a special case, though they never formulated it in these terms — they just wrote explicit formulas; I showed you last time the "stupid" formula, but if you write the formula for a symmetric 2-by-2 matrix, this is exactly what we are doing. So this is the squaring of a space. Now take this to a high power; and the point of the Bernoulli theorem is that when n goes to infinity, this space converges, in some sense, to a space with equal entries, which I would call a homogeneous space. I don't really want to say "equal entries", because that depends on the particular category I am speaking about; but being a homogeneous object of a category makes sense always, for any category — this concept of a homogeneous object of a category. So every finite measure space, when you take a high power of it, becomes essentially homogeneous. And to say it precisely, in a compact way, you have to say: I take P, I take n a non-standard, infinitely large number, and this space, as a non-standard space, is homogeneous up to higher-order infinitesimals. When you pursue this language, you see that this is really how it works. Now let me recall in what precise sense it converges. Moreover — and this is what is essential — it is true not only for individual spaces, it is true for morphisms; it works coherently: given any diagram of maps, when you go to a high power, the whole diagram is approximated by a corresponding diagram in which all the spaces are homogeneous. And morphisms between homogeneous spaces are just decompositions of numbers: you have equal atoms, you project preserving weights, so here you have the number p·q and there the number p — you throw away a factor, you decompose the numbers into products. So the category becomes extremely simple.
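The Hardy–Weinberg map just described fits in a few lines. A sketch in pure Python, with an assumed toy genotype matrix: replace each entry by the product of its row and column sums, renormalize, and observe that applying the map twice gives the same result as applying it once — the "square of the map equals the map" statement.

```python
# Hardy-Weinberg as an idempotent map on normalized matrices: replace
# entry m[i][j] by (i-th row sum) * (j-th column sum) and renormalize.
# Random mating reaches equilibrium after a single generation: T(T(m)) = T(m).

def hw(m):
    rows = [sum(r) for r in m]                         # allele frequencies
    cols = [sum(r[j] for r in m) for j in range(len(m[0]))]
    out = [[p * q for q in cols] for p in rows]
    z = sum(map(sum, out))                             # a no-op if sum(m) = 1
    return [[w / z for w in row] for row in out]

m = [[0.25, 0.15], [0.15, 0.45]]   # assumed symmetric genotype matrix, mass 1
once = hw(m)
twice = hw(once)

flat = lambda a: [w for row in a for w in row]
assert all(abs(x - y) < 1e-12 for x, y in zip(flat(once), flat(twice)))
```

The idempotence is immediate to verify by hand: the row sums of T(m) are again the row sums of m, so applying T a second time changes nothing.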
Just a category of numbers — the multiplicative category of numbers. And I am saying that when n goes to infinity, this category converges to another category. The limit is actually a non-standard limit; we shall see that this formalism becomes essential by the end of this lecture. But now let me say explicitly — because you have to decipher it once, and then you can forget it, speak this language and feel quite comfortable — in what sense it converges. It is non-standard analysis, and in a way it just says: follow your intuition, do it as it was always done; and then there is the justification — which, by the way, is not at all a true justification, because it depends on an axiomatics which may not be accepted; some people would not accept it. Okay. So, the essence of the convergence: there are two topologies involved, two types of equivalence, and they are really different, though eventually one mixes them up. One responds to the additive structure on numbers — probabilities — and the other to the multiplicative one. The additive one is very simple: two finite measure spaces are close if they differ on atoms of total measure epsilon — there may be many, many atoms, but of total mass epsilon. Obviously then you change nothing essential: you keep some bunch of atoms of weight one minus epsilon, and with the others you do whatever you want — spread them, condense them — and the result is additively epsilon-close. Then there is the other one, the multiplicative one, and here we use this normalization — I hope I say it correctly: you multiply everything by a constant such that — I hope I am not confusing it — the log of the constant divided by the log of the number of elements of your space goes to zero; everything applies for an infinitely large n, so this ratio must be infinitesimal. This is what I am saying — it is very inconvenient to speak all the time about sequences.
One works with an n which is infinitely large, which really means working with sequences; but I am always confused — do you take the log of c, or divide by the thing itself — you will see in a second: I will formulate the law of large numbers and then it will probably come out correct. Except there is one little point, because you have two spaces here, and you must divide by the right one — you divide by the small one; but again, this is a convention, because in truth you have a sequence depending on n, and this is your normalizing element — you divide by this n log n. And this, I think, is an essential point; it goes in the direction of large deviations — large, small deviations, I always forget what it is called. So: one equivalence is independent of the additive structure — here I multiply by a number; there I just add or subtract some mass. Probability theory is exactly the play between the additive and the multiplicative structures; they are very different, and in the old proofs of such equivalences one had to separately prove some invariance for the additive structure and some invariance for the multiplicative one. Then, given two spaces X and Y, I take their powers and see whether the two sequences, for infinitely large n, are equivalent; if so, call the spaces Bernoulli equivalent. And the Boltzmann entropy is a class of Bernoulli-equivalent spaces: by the Bernoulli theorem, the n-th power of a space, for large n, is essentially homogeneous, so the number of its atoms is essentially well defined; you take the log of this number, and that is the entropy. Two spaces are Bernoulli equivalent if and only if their entropies are equal. And if you compute this entropy by passing to the limit, you immediately see that it is given by the usual formula — a trivial computation, because both sides behave correctly under products. Now, why is the Bernoulli theorem the law of large numbers? It is the usual law of large numbers, but applied to the random variable log p.
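The claim that P^n concentrates on roughly exp(n·H(P)) near-equal atoms can be tested numerically: the dominant type class of P^n has multinomial size n!/∏(n·p_i)!, and its normalized log converges to the entropy. A sketch in Python (the distribution p is an assumed example, and n is chosen so that the n·p_i are integers):

```python
import math

# Bernoulli's theorem behind Boltzmann entropy: in the n-th Cartesian
# power of P = (p_1, ..., p_k), almost all mass sits on about
# exp(n * H(P)) atoms of near-equal weight, H(P) = -sum p_i log p_i.
# We count the atoms of the dominant type class via log-factorials.

def entropy(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def log_type_class_size(p, n):
    # log of  n! / prod (n*p_i)!   -- assumes each n*p_i is an integer
    s = math.lgamma(n + 1)
    for x in p:
        s -= math.lgamma(round(n * x) + 1)
    return s

p = (0.5, 0.25, 0.25)
H = entropy(p)
errs = [abs(log_type_class_size(p, n) / n - H) for n in (100, 10**4, 10**6)]
assert errs[2] < errs[1] < errs[0]    # normalized log-count converges to H
assert errs[2] < 1e-2
```

The error decays like (log n)/n, which is the Stirling-type correction the lecture alludes to later when discussing why equality cases need more than this crude limit.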
From a certain point of view, though, this is a better law of large numbers than the usual one, because it involves no extra random variable: it is just one measure space; it is a property of the category itself. Of course one must be careful — there is a little bit of cheating, because secretly there are two measure spaces here: the space with the weights p_i, and the space with equal weights on all its atoms. And for Boltzmann there were indeed two measure spaces: there was Euclidean space with its invariant Liouville measure, and the second measure, the distribution of his ensemble; the entropy came up, in fact, as a property of two measures. Now it is a measure space with an extra weight function, which gives you the second measure. This was how Boltzmann was thinking about it, and the one-space version was, I would say, Shannon's interpretation — I don't know exactly, because Boltzmann of course does not say it explicitly in his writings; but from his arguing with the mathematicians you see what he had in mind, and what he had in mind, all the time, was of course non-standard analysis — which was not then available in rigorous form. Essentially it goes back to Leibniz; but then came people — Cauchy, Weierstrass — who said this was no good and changed it to epsilon-delta, and of course everybody thought in those terms. So Boltzmann came at the wrong moment, and nobody understood what he was saying — I mean, mathematicians had a tendency to misunderstand him and argued with him in a rather silly way. Anyway, this is the point. Now I want to explain how we use all this: you establish the basic inequalities. And what is the basic inequality? It is the Shannon inequality. So let me come back to this abstract measure space — although the inequality itself is about finite measure spaces. Why the abstract measure space? An abstract measure space exists in the following sense: it defines a functor on the category of finite measure spaces,
and the functor is extremely simple: to a finite measure space P it assigns just the set of all measurable maps from X to P. That is all; you have this X as an object of some category, and you know the morphisms of this category. There are no points — you see, the point is that there are no points — because in order to define a measure space in the usual way you have to speak about all the sets of measure zero, and there are more of them than continuum; so you need fully developed Zermelo–Fraenkel set theory, which certainly nobody should be required to know — actually it is a very messy and unpleasant theory, and who knows whether it is right or not; it is logic, and it is secretly there. Of course you don't care, because it is below you: you don't need it, you never use it, all this measure machinery. Categorically, all you have to know, for our purposes, is this. In the classical language — which is still a very convenient picture — such an object is a partition into finitely many elements; the point is that it is convenient, categorically, to give it a name like that and have a map into some P; and then it does not have to be a set — it is just a functor. Another way to think about it: it maps to all finite measure spaces — here is the category of finite measure spaces; it is convenient, as we shall see in a second, to take a small subcategory, so that it will be a set and not just a class; and here on top of this pyramid you place your X, which can map to any space below. And what is the relevant basic operation? For sets, for partitions, you can intersect them pairwise, and this operation is what I write as follows: we have two spaces over X, and we can form P times Q. The meaning of it: if you have a physical system, you measure it by one detector, then take another one, and consider them simultaneously; you see which frequencies appear in pairs of windows, and this corresponds to this operation. And again, the whole point: you don't
know what the things you are mixing are — you don't care whether they are sets or not sets; it is absolutely irrelevant. With numbers it is the same sort of abstraction, and we are simply used to it. It is, in a way, a very useful operation to consider the set of everything, whatever that means — somebody forms the set of all of them; but the experience of the last decades seems to say that it is much better to do it categorically: it just gives you a better language than sets. Of course it is just language. So this product corresponds to intersection, and again you have to define it categorically. What does it mean? You have this functor, and you consider maps to objects of the category: you have X going to P and X going to Q, and then there is a kind of minimal object R through which you can split this diagram. And this category — and measure spaces — have this property; this is essential. No, I said it incorrectly: being a measure space means exactly that you have a functor with this property. So you always have this R, and then you have all you need to know about measure spaces: everything can be said in this language in an extremely simple way — everything just translates. Now — and this is, by the way, interesting — how do you push measures forward: this R is actually a coproduct. And only this operation matters; the other operation is silly from this point of view — it exists, but it is set-theoretical, a perversion, so to speak; it enters at some level, but better not to say — it is an artifact of something, forget about it; it belongs to a completely different, much higher level of structure, which is not needed here. So this is what we have: we have this category, and now I want to formulate, prove and explain the basic inequalities,
and the basic inequality here is the Shannon inequality, which says that the entropy of P times Q is less than or equal to the entropy of P plus the entropy of Q. Again, this is physically kind of obvious, because we make two observations and count what we see — count the number of possibilities. If we make them together and they don't interact, they don't interfere: for example, we may have certain colors blinking here and there; if they blink completely independently, then of course the numbers of possibilities multiply, so the logs add; but if they interact, some of them may preclude others, and then you get the inequality. That is "obvious" only once you have this picture. Now, how do we see it in this language? The typical proof mathematicians write: express the entropy as a sum of logs, say that the log is concave, and be quite happy. I personally cannot accept such a proof — why is the log concave? You have to differentiate, you need calculus. You don't have to know that: you can translate the physical reasoning into categorical language, and then it becomes just as obvious as the physical picture — and in fact it then proves that the log is concave: the concavity of the log follows from this physical reasoning; it does not stand behind it. And the proof is as follows. Let me first reformulate the inequality slightly. In terms of the two projections, to P and to Q, it says: we have some weights here, and I project them here and here; here is my P, here is my Q, and here is my R. So there is a measure, one projection here and one projection there — this is R, P, Q — and the inequality says something about these matrices: the projections give you the measure P and the measure Q, and the entropy of P plus the entropy of Q dominates the entropy of R.
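The Shannon inequality H(P·Q) ≤ H(P) + H(Q) is easy to check numerically on a joint measure and its two projections. A sketch (the random joint distribution is an assumed example), including the independent case, where equality holds:

```python
import math
import random

# Shannon's inequality: for a joint measure R projecting to P and Q,
#     H(R) <= H(P) + H(Q),
# with equality exactly when the two observations are independent.

def H(weights):
    z = sum(weights)
    return -sum((w / z) * math.log(w / z) for w in weights if w > 0)

random.seed(2)
joint = [[random.random() for _ in range(4)] for _ in range(3)]
total = sum(map(sum, joint))
joint = [[w / total for w in row] for row in joint]

h_joint = H([w for row in joint for w in row])
h_p = H([sum(row) for row in joint])                        # projection to P
h_q = H([sum(row[j] for row in joint) for j in range(4)])   # projection to Q
assert h_joint <= h_p + h_q + 1e-12

# Independent (product) measure: the logs add exactly.
p, q = (0.5, 0.3, 0.2), (0.1, 0.2, 0.3, 0.4)
prod = [a * b for a in p for b in q]
assert abs(H(prod) - (H(list(p)) + H(list(q)))) < 1e-9
```

This is precisely the "two detectors watched simultaneously" picture: interacting observations can only lose possibilities relative to independent ones.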
Now, if I take Cartesian powers, all of these become homogeneous, and what do I see? I see that this R becomes a subset of the product of P and Q, and cardinality is monotone under injective maps — and that is the proof. And this is indeed somewhat strange, because in measure theory all morphisms are surjective, and here you use injectivity of maps; this will be fundamental — it is a really non-obvious point, and it was understood rather recently, in the work of Lewis Bowen. So how can injectivity be seen categorically — what is injective here? The point is exactly that it appears in this diagram: R going to P and R going to Q; this diagram is kind of injective. So you pass from the original category to the category of such diagrams — the morphisms are now pairs of morphisms — and in this new category, the category of these "flags", as one might call them, this becomes an injective morphism in the sense of category theory. So that is another way to formulate it, and it is suggestive for other categories: entropy is monotone in the category of certain diagrams, because you have to pass from surjective morphisms to injective ones, and the proof goes exactly like that — this is what the Shannon inequality tells you. And there is another Shannon inequality, the one you usually use, which is slightly different: it is a Shannon inequality for morphisms. And here I want to show you, at this moment, the advantage of these notations. Traditionally this is called relative entropy and is written as something like the entropy of P comma Q; for me it is a morphism: f is a morphism from P to Q, and I write the entropy of f. The trouble with the traditional notation: first, you never know which is P and which is Q; and second, you have three symbols, P, comma, Q, instead of the single symbol f — so your formulas become exactly three times longer. You see the difference the moment you write it down: what takes
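The entropy of a morphism f: P → Q, with Q a quotient of P, can be sketched concretely: it equals H(P) − H(Q), and by the chain rule it is also the weighted average of the entropies of the fibres of f. The weights and the fibre structure below are an assumed example.

```python
import math

# Entropy of a morphism f: P -> Q, where Q is obtained from P by
# merging atoms.  Traditionally this is the conditional entropy H(P|Q);
# it equals H(P) - H(Q), and also the weighted sum of fibre entropies.

def H(ws):
    z = sum(ws)
    return -sum((w / z) * math.log(w / z) for w in ws if w > 0)

P = [0.1, 0.2, 0.3, 0.15, 0.25]
fibres = [[0.1, 0.2], [0.3], [0.15, 0.25]]   # f merges each group of atoms
Q = [sum(f) for f in fibres]                 # the pushforward measure

ent_f = H(P) - H(Q)
chain = sum(sum(f) * H(f) for f in fibres)   # chain rule: average over fibres
assert abs(ent_f - chain) < 1e-12
assert ent_f >= 0        # coarse-graining never increases entropy
```

Note that the fibre-average formula only needs the fibres, which is why, as said below, the entropy of a morphism can survive even when the entropies of P and Q individually are infinite.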
a line there takes half a line here. Yes — partially defined maps, partially defined spaces: they are not in this context, though implicitly they are there, because of the measuring devices. A measuring device measures the whole system; however, technically it is convenient to have partially defined maps. But they are not in this category; as usual, since the setting is slightly analytic, the categorical language must be slightly adjusted: you allow partially defined maps only for defining the topology, for passing to the limit, while the category itself is defined by its properties. So you see the difference between the formulas over there and the way you write them here: written in these terms it is easy to remember, while that one is certainly a mess; if you make computations, it makes a lot of difference. Also — and this is what is good about this categorical logic — the relative entropy makes sense even when the entropies of the individual P and Q make no sense. In the simplest case the entropy of the morphism is the difference of the entropies, when they exist; but it may happen that both P and Q are actually infinite spaces, so that the entropies of P and Q are not defined, while the difference is. A question comes: you defined entropy for maps — what if you have a partially defined morphism? No — they are morphisms; I don't know what a "function" is here. And relations? No, that is exactly the wrong language: there are no relations; there are categories and there are morphisms, period. This is exactly the power of this language: you don't need that kind of junk — partial maps, relations; I don't need this junk. A category is a very clean language. But if you have it for relations, you have it for the graph? No, I don't have a graph: it is a purely categorical language, and this is the whole point. I insist that you can express everything in purely
categorical language; you don't need the set-theoretic language. All this business about relations, partial maps, functions — out. You secretly keep it because you are used to it; keep it in the back of your mind, but I prefer to leave it back in the physical system. You say there are no relations in categorical language? Sure, you can bring them in — the junk; but the whole point of the categorical language is that it emphasizes the relevant relations and throws away the junk. Otherwise you carry all this junk with you, you don't know what to do with it, and it accumulates. It is a much cleaner language — not ideal, I guess, not perfect; it has some drawbacks, and we shall see them, maybe today. That is not the end of the world: at one point you have to work in a certain completed category, when you bring in non-standard analysis and have to go to the limit properly. But my point was to show you on this example: I could write some letters for these entropies, which for me is impossible, because I never remember which is which — and they were different in different periods; I think Boltzmann was using S, today people use H. With formulas like these you don't need that — this is the most complicated formula you will ever use here; but once you start elaborating the junk, of course you get long formulas, and to fit them in one line you are forced to use single letters. What about the square root? No, there is no square root here — okay, that is about numbers, another story. So this is just another relative inequality for these kinds of diagrams, which is useful. And coming back to the original example — what is actually proven here? The Shannon inequality, together with another, kind of obvious, inequality which I want to state: the entropy of any measure space is less than or equal to the log of the cardinality of the space, and equality holds if and only if the space is homogeneous. This again is true by the same logic: you go to the limit and see what happens when one space maps to another; you don't have to make any computation.
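This second basic inequality, H(P) ≤ log(number of atoms), with equality exactly in the homogeneous case, is a one-line numerical check. A sketch with an assumed random weight vector:

```python
import math
import random

# For a probability space with n atoms:  H(P) <= log n,
# with equality exactly for the homogeneous (equal-weights) space.

def H(ws):
    z = sum(ws)
    return -sum((w / z) * math.log(w / z) for w in ws if w > 0)

random.seed(3)
n = 8
ws = [random.random() for _ in range(n)]   # arbitrary positive weights

assert H(ws) <= math.log(n) + 1e-12
assert abs(H([1.0] * n) - math.log(n)) < 1e-12   # homogeneous case
```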
But this formalism has a disadvantage: you cannot prove that equality holds only for homogeneous spaces, because you argue by approximation and passage to the limit; for that you need something like the Stirling formula — you need a little analyticity of the log, not much, but you need it to pin down the equality case. So the log has a deeper meaning. And why take the log, by the way? Physically, because if you have two chunks of matter which are separate, you want the entropy to behave like mass or like energy — to be additive; and that is reasonable. But mathematically, why the log? There is no explanation, only phenomena which tell you it is the right thing to do — the Fisher metric, for one — but that is not the kind of deep reason I mean; I have only a guess what the deep reason is, and I will say it in a short while. Now, we have this property, and coming back: in fact a much stronger inequality is true — namely, instead of the measures of the projections you can take their entropies. First, you can think of everything as discrete or continuous; I have not written the formula for the continuous case — you can do it, or you can approximate everything by discrete spaces. Think of this as a subset of a discrete measure space; then cardinality and entropy agree, but the projections you can replace by entropies, which are smaller than the log-cardinalities — and the inequality still holds, so it becomes a stronger inequality. When you come to analysis, this corresponds to the so-called log-Sobolev inequalities, which were proven later and which are a strengthening of the usual Sobolev inequalities; so this contains much, much more. But here is a big mystery for me — the problem of what happens with the ball: there is no similar theorem with the ball as the extremal shape; there is no sharp log-Sobolev inequality whose extremal object is the ball — the best you can do, the extremal thing, is the Gaussian distribution, and not the ball, which also might not be
the real answer. But anyway — if you write the Shannon inequality in this example (actually the relative Shannon inequality), it gives you this inequality; and in this category the limiting argument — of course it is much easier for me to repeat the limiting argument each time: you go to the limit and you just count sets; you don't have to remember formulas, all you have to know is that the cardinality of a subset is at most the cardinality of the set — that is the only formula you need. Now, how do we go from here to the next step? We had an infinite system — and what is the entropy of an infinite system? My infinite system was a crystal: I have this lattice, and at each node a particle, and imagine each particle may be in finitely many states; so I have a finite measure space at each point, and I take the infinite product — this finite measure space raised to the power of a countably infinite set. Unlike a finite power, what kind of object is this product? Again, what do you say in the categorical language: you say what the admissible morphisms to finite measure spaces are, and the basic ones are the projections — this is all you have to know; there may be others. From that I want to define entropy and prove the theorem of Kolmogorov. Let me recall this theorem of Kolmogorov, which was an open problem for quite a while. It is stated in the traditional language of measure theory, but in this way we prove a formally more general theorem — formally; in reality it is the same. It says: given a finite measure space P — Kolmogorov was concerned with the case of the group of integers — take another finite measure space Q, and form P^G and Q^G, the spaces of all sequences; so P is a collection of atoms and Q is another collection of atoms, and I have sequences of weights from P or from Q. The easiest example: the one has two atoms of equal weight, the other three atoms of equal weight. And the question, in
ergodic theory, is whether there is a measure isomorphism between the two spaces which commutes with the action of the group. When you have spaces with an action of a group, you get a new category, whose morphisms are those commuting with the action — again a general categorical construction — and you want to decide this in this particular instance; it was unknown just for this example. And Kolmogorov showed that if such an isomorphism exists, then the entropy of P equals the entropy of Q. This was his theorem. You can ask the same for any group G — substitute any countable group here — and it is unknown; it is still unknown in general, though, as I said, recently great progress was made by Lewis Bowen — the next step in this story. Kolmogorov proved it for Z, and, as I have kind of explained, everything is just the same for Z as for any such group; there is no difference. It is unclear to me whether this was not already known to Gibbs; of course, they were never looking at such things — who cares about a one-dimensional system without interaction? You cannot imagine Gibbs being concerned with a system of particles on a one-dimensional crystal which do not interact; it is not a really exciting system. They were studying three-dimensional systems with interaction, and there was a big literature preceding that theorem of Kolmogorov, proving very deep theorems, much more complicated than this one; whether they implicitly had this invariant is not at all clear to me, because I have not read those papers — you have to read them and understand them, and it is a different language, which is not so obvious. About this I am curious. In any case the gap was short: this was '58, and those papers were from the forties and early fifties. A question: countable or countably generated? A countably generated group is countable — it is the same class of groups; in particular, finitely generated groups are countable.
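Kolmogorov's invariant in this example can be made concrete: for the shift on P^Z, the entropy of a window of length n is n·H(P), so the normalized block entropy is log 2 for two equal atoms and log 3 for three, and the two shifts cannot be isomorphic. A sketch:

```python
import math
from itertools import product

# Per-coordinate entropy of a Bernoulli shift: the entropy of the
# product measure on words of length n is exactly n * H(P), so the
# normalized block entropy is constant in n and distinguishes the
# two-atom shift (log 2) from the three-atom shift (log 3).

def H(ws):
    z = sum(ws)
    return -sum((w / z) * math.log(w / z) for w in ws if w > 0)

def block_entropy(p, n):
    return H([math.prod(word) for word in product(p, repeat=n)])

p2, p3 = [0.5, 0.5], [1 / 3, 1 / 3, 1 / 3]
for n in (1, 2, 5):
    assert abs(block_entropy(p2, n) / n - math.log(2)) < 1e-9
    assert abs(block_entropy(p3, n) / n - math.log(3)) < 1e-9
```

Since log 2 ≠ log 3, an isomorphism commuting with the shift is impossible — which is exactly the answer to the two-atoms-versus-three-atoms question above.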
All I say concerns countable groups. It is unknown even for finitely generated groups — unknown for all kinds of examples; I mean, nobody can show me a group for which this theorem of Bowen's type fails. I guess there are potential candidates of groups where maybe it is not true; but any group you actually show me... And again, it is related to other questions in group theory, exactly about these non-standard models. So why non-standard analysis? Here non-standard analysis is absolutely crucial — but not for the Kolmogorov case. Let me explain the Kolmogorov case; it is very easy, and again we just look at crystals. I think the difficulty is that the group Z is too primitive, too simple, to suggest anything; Z³ is much more fun — actually I prefer Z²: one can make pictures. So we have, in principle, to define what the entropy of such a physical system means. What do you do — actually, I cannot wait: there are some formulas, and let me just explain what they mean. Certainly the system is infinite, and since it is infinite, the entropy must of course be infinite; so you have to normalize it per piece. You have a kind of huge crystal; you take some big chunk of matter, so that it becomes finite; you take the entropy of this piece — just imagine what you observe in this piece, see what you see there — take this entropy and divide it by the total number of atoms, or by the volume, whatever — normalize, and go to the limit. The limit may exist or not; if not, there is no limit — take a sublimit; but in physics the limit should exist, and in these examples it usually does. And if you have a crystal, then of course you are aided by the symmetry: this piece and a shifted piece must have the same entropy, so this quantity will be invariant as you go to the limit. But the problem is how you make this measurement in practice.
As I said, we have this detector — it is a finite measure space — and I can move it along by the symmetry group: move it to a new position and form P times P. And I can do this many, many times, take the measurement, then normalize, and see how many states I see. But the point is, of course, that when I do that — when I cover this chunk of matter by these P's and measure the entropy of whatever I see — I can miss some of the states: my measurement may be blind to some states; I am just missing them. So I can add another, more powerful detector and measure with them together, and then I may see more. And this is one of the points of the thesis: the entropy is what you see, not what is in there. There is no "entropy of a physical system" as such — that is completely absurd, because if you look deeper and deeper it becomes infinite anyway; it depends on how many degrees of freedom you measure — for example, on the energy scale. It is not the states which are there, but the states you can observe — those which exchange energy all the time; if they don't exchange, you don't see them. You only see a state when, in passing from one state to another, energy is emitted or absorbed. And, by the way, this is not so easy to formalize mathematically; the formalization is only partial. But at least you move away from the naive set theory. So entropy, in fact, depends on the class of measurements. And if you allow not only measurements but protocols of measurements, there will be two categories, not one — which I don't exactly know how to formalize, but it may give a somewhat different kind of theory. The essential point is how many of these detectors you need. And you say: aha, I am in good shape if I have a complete system of these detectors — if, whenever I add a new one to the measurements made with them, in the limit I get the same entropy.
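The point that "entropy is what you see" can be sketched concretely: model a detector as a map from states to readings, so that the observed entropy is the entropy of the pushforward measure. A coarse detector sees less; joining detectors can only reveal more; and once a family of detectors separates all the states, adding another one changes nothing. The toy state space and detectors below are assumed for illustration.

```python
import math

# A detector is a map from the hidden state space to a finite set of
# windows; what it "sees" is the entropy of the pushforward measure.

def H(ws):
    z = sum(ws)
    return -sum((w / z) * math.log(w / z) for w in ws if w > 0)

states = {'a': 0.4, 'b': 0.3, 'c': 0.2, 'd': 0.1}   # hidden states

def observe(detector):
    out = {}
    for s, w in states.items():
        out[detector(s)] = out.get(detector(s), 0.0) + w
    return H(list(out.values()))

d1 = lambda s: s in ('a', 'b')        # blind inside {a,b} and inside {c,d}
d2 = lambda s: s in ('a', 'c')
joint = lambda s: (d1(s), d2(s))      # run both detectors simultaneously

assert observe(d1) <= observe(joint) + 1e-12       # joining reveals more
assert abs(observe(joint) - H(list(states.values()))) < 1e-12  # complete

d3 = lambda s: s in ('a', 'd')
finer = lambda s: (joint(s), d3(s))
assert abs(observe(finer) - observe(joint)) < 1e-12  # adds nothing new
```

Here d1 and d2 together already separate all four states, so they form a complete system in the sense just described: the extra detector d3 leaves the observed entropy unchanged.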
Which means: if I take this Q and measure it together with P, many, many times, then my entropy, up to terms of order o(n) — because it grows proportionally to the number of terms — will be the same as if I had no Q at all. If I add this new apparatus, all I see is what I have already seen. Of course, if I start hitting my physical system with much higher energy, I will see more and more; but on a given level, I see the same. Mathematically, it means that my abstract measure space, as a functor, is determined by the morphisms to these spaces alone; I don't need extra spaces. Of course I may change the class; it depends on how I look at the system. And this is exactly what I am saying: there is no set; there are different functors, and different functors on the same physical system give you different objects. And then the point, particularly in the case of product spaces: by their very definition — we have the infinite product space, P to the power of a countable set S — the projections to finite subsets give a full system of measurements. Whatever finite subsets I take, if I adjoin any other measurement, my entropy does not grow. So when I measure entropy this way, it has nothing to do with ergodic theory or anything of that kind — it is a general property, more or less the definition of the product; in classical measure theory it appears as the Lebesgue density theorem, but it is essentially the definition of the product measure. The moment I have it, I can measure my entropy, say, in the case of the group Z: I just measure the entropy on a segment. For a segment of length n, this is the entropy of P^n, which is just n times the entropy of P; and if you have Q, it will be n times the entropy of Q. If there is an isomorphism, these entropies must be equal, up to the added boundary term — because the group is amenable.
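The per-site normalization can be sketched with an assumed two-state Markov chain on Z (chosen instead of an i.i.d. system, so that the block entropy is genuinely n times a rate plus a bounded correction): the entropy of a window of length n, divided by n, converges to the entropy rate, and the bounded boundary term disappears exactly because (n+i)/n → 1.

```python
import math
from itertools import product

# Entropy per site for a stationary 2-state Markov chain: the block
# entropy is H(pi) + (n-1)*rate, so H_n / n -> rate; the bounded
# correction H(pi) - rate is killed by the normalization, which is
# where amenability of Z enters.

def H(ws):
    z = sum(ws)
    return -sum((w / z) * math.log(w / z) for w in ws if w > 0)

P = [[0.9, 0.1], [0.4, 0.6]]     # assumed transition matrix
pi = [0.8, 0.2]                  # its stationary distribution: pi P = pi

def block_entropy(n):
    weights = []
    for word in product((0, 1), repeat=n):
        w = pi[word[0]]
        for a, b in zip(word, word[1:]):
            w *= P[a][b]
        weights.append(w)
    return H(weights)

rate = -sum(pi[a] * P[a][b] * math.log(P[a][b])
            for a in (0, 1) for b in (0, 1))   # entropy rate of the chain

per_site = [block_entropy(n) / n for n in (4, 8, 12)]
assert per_site[0] > per_site[1] > per_site[2]    # decreasing toward rate
assert abs(per_site[2] - rate) < 0.05
```

For the i.i.d. (Bernoulli) case the correction is zero and H_n/n equals H(P) exactly, which is the computation used in the Kolmogorov argument above.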
So the only thing used here is that (n+i)/n goes to 1 as n goes to infinity. And this is the entire mathematical part of the Kolmogorov theorem; everything else is this categorical language. The mathematical structure enters, amazingly, through the amenability of the group Z — and the statement is true for amenable groups; this is the theorem of Kolmogorov. If you look a little closer at what it shows — and this is an interesting story — it shows the following: you have an amenable group, and P and Q, and a morphism like that — sorry, I wrote it in the wrong order: you have the spaces P^Γ and Q^Γ, with Γ an amenable group; then the entropy of P is greater than or equal to the entropy of Q. In particular, if there is an isomorphism, the entropies are equal — provided the group Γ is amenable. And that is how the story stood until relatively recently, because it was observed by Weiss and — I keep forgetting the second name — that this is not true for free groups: for a free group it is easy to construct such morphisms which actually increase the entropy. Therefore it was believed that the Kolmogorov theorem is also not true there. And then, about a couple of years ago, it was proven by Bowen, for free groups and for a larger class of groups which I am going to describe now, that it is nevertheless true. What fails is the monotonicity: this surjectivity statement — entropy being monotone — is not true; but entropy is still an invariant under isomorphism, and for a reason of injectivity of a certain auxiliary map, not of surjectivity. The proof is technical and I will not explain it here. Actually, in his latest papers there is a more transparent proof, which I have not read; the ideas are clear, but they are computational indeed — there is a combinatorial computation one has to make.
But the point is that there is a class of groups where p^Γ isomorphic to q^Γ does imply that the entropies are equal. So now I want to say a few words about these groups. They include free groups, and also linear groups — in general, residually finite groups. Let me define this in a suitable language, to give an idea of where nonstandard analysis enters. First, what is a residually finite group from this perspective? These are groups Γ which act faithfully and isometrically on a compact metric space — if I am not mistaken here; this is not the usual definition, but it is the definition suitable for my purpose. And the logic, the way you have to think about it, is that when a group acts on a compact space, it acts approximately on a finite set: if you cover the space by finitely many small sets, you get an approximate action on a finite set, and with a certain effort you can make it an actual action on a finite set. So residually finite groups are the ones whose actions on finite sets fully disclose the nature of the group. And the groups I need here are called sofic groups — a definition which in a second I shall elaborate and explain in simple terms, but which is a very good one: they are geometric groups over nonstandard compact spaces. So you have to say what a nonstandard compact space is. A nonstandard compact space is a compact space in the category of nonstandard metric spaces, meaning the objects you get when you take a nonstandard model of the language of the real numbers — so the real numbers appear in your statements, the metric is still understood as a real number, but the spaces are nonstandard. The idea behind it is that these are groups which approximately act on finite sets, but in a different way from the one described above. So let me give the definition, due to Weiss.
So these are sofic groups, and soficity is probably a very restrictive condition. All amenable groups are sofic. All residually finite groups are sofic. Projective limits of sofic groups are sofic, and residually amenable groups are sofic. There are more examples, but we don't know whether all groups are sofic; it is very unlikely that they all are. And the definition is quite simple. If you decipher this nonstandard language I was referring to, in some categorical sense it means that, again, it is convenient to have one infinitely large number. This number of course "goes to infinity": it is a huge, huge number, which is finite but large. The boundary between finite and infinite here is not that clear — within mathematical formalism we all think we know what is finite and what is infinite, but let me give a typical example of a number about which you cannot say whether it is finite or infinite. Consider the maximal running time of a Turing machine with a program of at most 2000 bits, or a million bits, whatever — whichever model you take for your computer. You write a program and it may run forever or it may stop. You consider only those which stop, see how long each takes, and take the supremum over all programs of the given length, for lengths sufficiently large; from some moment on, it is immaterial how large. All these numbers are, for all practical purposes, infinite: they are finite in one language and infinite in another. And it is the Gödel theorem which showed that our naive picture of the finite numbers is not adequate. So this is exactly the kind of number you take: you take such a huge number, and a finite set of this cardinality.
And the group acts approximately, meaning that for a given finite subset of the group, these elements are represented by transformations which are partially defined and which partially satisfy the relations, where the number of points at which they fail to give an action, divided by the total number of points, is infinitesimally small. So when you go to the limit, it becomes small: they act up to a certain error. And then, of course, this depends on your nonstandard model, and for every such model Bowen introduced his entropy. But in the case of these Bernoulli shifts they all coincide with the usual entropy. The proof of this is not easy, even for the free group; his proof is computational — unlike for the Kolmogorov theorem, you really have to compute, to see how many combinations there are and how they match, et cetera. OK, now I want to justify the log: why is there a log? But first, again, about this problem: it is unknown whether the statement is true for all groups, it is unknown whether only a non-sofic group could violate this property, and it is unknown whether all groups are sofic. So it is very unclear what happens in general with this question: does isomorphism imply equality of the entropies? This looks like a nice, simple question, because it is so general. So I repeat: we have an infinite product of finite measure spaces indexed by Γ, the acting group; you consider two such spaces; and the question is whether an isomorphism commuting with the group action implies that the entropies are equal. This is wide open for general countable groups. It is a very nice question, because it is so simple, and yet it looks quite difficult. The two major results here are due to Kolmogorov and to Bowen. By the way, for amenable groups, and in particular in the classical case, one knows that the converse is also true.
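A toy illustration of "acting up to a small error" (the setup and the name `failure_fraction` are mine): the generator of Z acts on the finite set {0, ..., n-1} by the partial shift k -> k+1, which is undefined at the single point n-1, so the fraction of failure points is 1/n:

```python
# A toy sofic-style approximation of Z: the generator acts on the finite
# set {0, ..., n-1} by the partial shift k -> k+1, undefined at n-1.
# The fraction of points where the "action" fails is 1/n, which is
# infinitesimally small when n is a nonstandard infinite number.

def failure_fraction(n):
    shift = {k: k + 1 for k in range(n - 1)}   # partially defined map
    bad = sum(1 for k in range(n) if k not in shift)
    return bad / n

fracs = [failure_fraction(n) for n in (10, 100, 1000)]
```

For a genuinely sofic approximation one asks the same of every finite subset of the group, with all relations satisfied off a set of infinitesimal proportion.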
It is a theorem of — what is his name — who proved that if the entropies are equal, then the shifts are isomorphic. I keep forgetting his name; it just came to my mind and then disappeared. And this part is actually the harder one: the original theorem was much harder, and it is not something you see immediately — the proof is much more elaborate than Kolmogorov's. But once it was proven in some generality, it extends to other groups rather effortlessly: whenever a group has, say, a free abelian subgroup, it is already true. It happens to be, as one could expect, harder to prove, there being much less structure available. But in general this remains unknown. Now about the log. This actually also has a certain relation to mathematical biology — what I am going to say originates, as far as I know, in mathematical biology, and there are two names attached; the major player here was Fisher. So this is about the log in the Shannon inequality. If you think a little about the Shannon inequality, it easily transforms into the fact that entropy is — I keep forgetting which — a convex or concave function in the variables p_i. So you have p_1, ..., p_n — I hate this notation — and you think of them as a point in the n-simplex: positive numbers with sum equal to 1. So entropy is a function from the simplex to the positive reals. And the Shannon inequality is essentially equivalent — again, "equivalent" here means a simple reduction; in certain contexts it really is equivalent, sometimes not — to this function being convex or concave. I think: minus entropy is convex, entropy is concave. I prefer the convex function, minus the entropy; it is negative, but that is immaterial.
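The concavity claim is easy to probe numerically. A short sketch (helper names `H`, `rand_simplex` are mine) testing H(t·p + (1-t)·q) >= t·H(p) + (1-t)·H(q) on random points of the simplex:

```python
import math
import random

def H(p):
    """Shannon entropy of a point of the simplex, in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def rand_simplex(n, rng):
    """A random point of the n-simplex (normalized positive weights)."""
    w = [rng.random() + 1e-9 for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

rng = random.Random(0)
concave_ok = True
for _ in range(1000):
    p = rand_simplex(4, rng)
    q = rand_simplex(4, rng)
    t = rng.random()
    mix = [t * a + (1 - t) * b for a, b in zip(p, q)]
    # entropy of the mixture should dominate the mixture of entropies
    if H(mix) < t * H(p) + (1 - t) * H(q) - 1e-12:
        concave_ok = False
```

So entropy is concave, and minus entropy is the convex function whose Hessian is discussed next.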
So whether you take it convex or concave, what is essential is that the Hessian is sign-definite, so we can always make it positive. It thus defines a positive definite quadratic form on the simplex: the Hessian of this entropy. So it is a differential quadratic form which is positive definite, and therefore it is a Riemannian metric. And you may ask: what kind of metric is this? Is it complete or not, of positive or negative curvature — what kind of metric is this? From my own experience again: when I first thought about this, I asked somebody, and he told me this is called the Shahshahani metric. And from that moment on you can start searching the internet. Shahshahani is a mathematical biologist; he came to this metric about 30 years ago. And when I looked into it, there were about ten other names attached to it; people were discovering this metric all the time, in various disguises. I think the first was Fisher, in 1923 or something, and people could not figure out what kind of metric it was. But then I read somebody who had apparently discovered this metric himself and was annoyed: why is it called the Shahshahani metric? It is just the spherical metric. So the simplex with this metric — the Hessian of the function sum of p_i log p_i — is isometric to a piece of the round sphere; the metric has constant curvature and is very symmetric. It is incredible that this can be so. It is remarkable because it is such a high degree of symmetry: the simplex a priori has only the symmetry of the permutation group, while the sphere has the symmetry of the orthogonal group. How could that be? So what is the map, what is the isometry between them? And the isometry is not the radial projection; if you take the radial projection, it is not an isometry.
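One can verify the sphere claim in coordinates: the Hessian form is g_p(v, v) = sum_i v_i^2 / p_i (on vectors with sum v_i = 0), and substituting p_i = x_i^2, v_i = 2 x_i u_i pulls it back to 4 |u|^2, the round metric up to a constant. A numeric check (all variable names mine):

```python
import math
import random

rng = random.Random(1)

# Fisher / Shahshahani form on the simplex: g_p(v, v) = sum v_i^2 / p_i,
# the Hessian of sum_i p_i log p_i restricted to sum v_i = 0.
def fisher(p, v):
    return sum(vi * vi / pi for vi, pi in zip(v, p))

# A point x on the unit sphere and a tangent vector u there (x . u = 0);
# the squaring map sends x to p = (x_i^2) and u to v = (2 x_i u_i).
n = 5
x = [rng.random() + 0.1 for _ in range(n)]
r = math.sqrt(sum(t * t for t in x))
x = [t / r for t in x]
u = [rng.random() - 0.5 for _ in range(n)]
dot = sum(a * b for a, b in zip(x, u))
u = [a - dot * b for a, b in zip(u, x)]      # project onto tangent space

p = [t * t for t in x]                        # point on the simplex
v = [2 * a * b for a, b in zip(x, u)]         # pushed-forward tangent

pulled_back = fisher(p, v)                    # should equal 4 * |u|^2
round_metric = 4 * sum(t * t for t in u)
```

So the entropy Hessian is, up to the factor 4, the Euclidean metric of the sphere in the square-root coordinates, which is why it has constant curvature and full orthogonal symmetry.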
So what is the map? It is what could be called the Archimedes map: the coordinates x_i go to x_i squared. Then the points on the sphere, where the sum of squares is one, go to the simplex, where the sum is one. So this maps the sphere S^n to the simplex: being on the sphere means the sum of x_i squared equals one, and being on the simplex means, writing y_i = x_i squared, that the sum of y_i equals one. And this map — if you make the computation, you immediately see it is an isometry up to a constant, a factor of two or so. And this map is the real part of the moment map. The moment map is a very remarkable map, in a sense discovered by Archimedes, who found that when you project the sphere to an interval, the projection is measure preserving. In more advanced terms, over the complex numbers, the map from z_i to z_i times z̄_i — from complex numbers to positive numbers — is also measure preserving. These are not one-to-one maps; they do not preserve density, but they preserve measure in the category of measure spaces. So this is the map, and the immediate conclusion is that entropy must be invariant under the group of orthogonal transformations. And this entropy was introduced by von Neumann. So how is it defined, and what are the objects on which it is defined? You have to say everything we said about measure spaces, but in a way that is orthogonally invariant. So let's do that. We say it in a slightly fancy way, but it is kind of nice, and it corresponds to the spirit of quantum mechanics. The point I want to make is that the Boltzmann formula alone mathematically implies the existence of quantum mechanics. And this is striking; I don't see a rational explanation for it.
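The measure-preservation is easiest to see in the complex version: push the uniform measure on the unit sphere in C^n forward by z -> (|z_1|^2, ..., |z_n|^2) and you get the uniform (Lebesgue) measure on the simplex. A Monte Carlo sketch (setup mine) comparing moments of the first coordinate with the known uniform-simplex values E[y_1] = 1/n and E[y_1^2] = 2/(n(n+1)):

```python
import random

rng = random.Random(0)
n, N = 3, 200_000
m1 = m2 = 0.0
for _ in range(N):
    # uniform point on the unit sphere in C^n via normalized Gaussians
    re = [rng.gauss(0, 1) for _ in range(n)]
    im = [rng.gauss(0, 1) for _ in range(n)]
    norm2 = sum(a * a + b * b for a, b in zip(re, im))
    y1 = (re[0] ** 2 + im[0] ** 2) / norm2   # |z_1|^2, a simplex coordinate
    m1 += y1
    m2 += y1 * y1
m1 /= N
m2 /= N
```

For n = 3 the uniform-simplex moments are 1/3 and 1/6, and the sample moments land on them; Archimedes' sphere-to-interval theorem is the case n = 2 of this picture.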
Why should statistics, and a formula which came just from the formalism of the law of large numbers — that is the only mathematics in it, the rest is language — why should the law of large numbers imply the existence of orthogonal symmetry in a space? If it were the normal law, we would more or less understand; but why the law of large numbers? That remains unclear to me. However, this being said, how do we define entropy in this Euclidean setting? A measure on a finite set is an assignment of numbers to subsets, with certain properties, invariant under permutations and so on. What will this become when you replace sets by Hilbert spaces? The von Neumann formalism, the formalism of quantum mechanics, is exactly this: instead of finite sets you take finite dimensional Hilbert spaces, replacing infinite sets by infinite dimensional spaces. You may work over the reals or over the complex numbers; mathematically, in this particular case, it makes no difference — if you know it for the reals, you know it for the complex numbers, and vice versa. So instead of a finite set you have a Hilbert space, and you want to introduce something like a measure. But where you would be measuring subsets, you now have subspaces. So to each subspace of the Hilbert space you want to assign a number, and the condition of additivity is that if two subspaces are orthogonal, the numbers add up. And the whole thing must, a posteriori, have a certain degree of symmetry. The way to do it is as follows. The measure must be positive and it must be additive, and if you look at these properties, there is essentially only one solution, as follows: you take a positive definite quadratic form and normalize its trace to be 1, the sum of the eigenvalues equal to 1. Then you measure subspaces using this form, and this will be the measure. These are what physicists call density matrices.
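A concrete instance of this "measure on subspaces" (the matrix and helper names are mine): a trace-1 positive form on R^3, with the mass of a subspace given by summing <e, rho e> over an orthonormal basis of the subspace, additive over orthogonal subspaces:

```python
# A density matrix on R^3 and the measure it assigns to subspaces:
# mass(V) = trace of the form restricted to V.  Orthogonal subspaces
# have additive mass, and the whole space has mass trace(rho) = 1.

rho = [[0.5, 0.1, 0.0],
       [0.1, 0.3, 0.0],
       [0.0, 0.0, 0.2]]          # symmetric, positive definite, trace 1

def quad(rho, e):
    """<e, rho e> for a unit vector e in R^3."""
    return sum(e[i] * rho[i][j] * e[j] for i in range(3) for j in range(3))

def mass(rho, basis):
    """Measure of the subspace spanned by an orthonormal basis."""
    return sum(quad(rho, e) for e in basis)

V = [[1.0, 0.0, 0.0]]                       # the x-axis
W = [[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]      # its orthogonal complement
m_v, m_w = mass(rho, V), mass(rho, W)
total = mass(rho, V + W)
```

The additivity m_v + m_w = total is exactly the Pythagorean mechanism described next.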
Usually these live in a complex Hilbert space, but the real case is as good for us as the complex one. So: a positive definite matrix with trace 1. Given a subspace, you restrict the form and take the trace there, and this is the mass of the subspace. It is additive by the Pythagorean theorem — of course, everything here is the Pythagorean theorem. The point is that the additivity of cardinalities is replaced by the Pythagorean theorem; the whole formalism of Hilbert spaces is just an unfolding of the Pythagorean theorem, and the essential property here, this additivity, is exactly that. Actually, the Archimedes theorem too, this preservation of measures, is obtained by an integration of the Pythagorean theorem. So this is the fundamental extension of additivity to linear algebra. Now, what will the entropy be? The objects replacing measure spaces are positive definite quadratic forms with the sum of the eigenvalues, the trace, equal to 1 — at this point, total mass equal to 1 — called density matrices. In particular, as another manifestation of the Pythagorean theorem, if you take any orthonormal frame and sum the values of the form over it, the result does not depend on the frame: exactly because of orthogonality, everything adds up frame-independently. This additivity is very strong — it implies you can decompose in many different ways and always get the same number; there is much more symmetry here than for ordinary measures. So what would the entropy be? Von Neumann defines the entropy via the eigenvalues: these lambda_i are your weights, and the entropy is minus the sum of lambda_i log lambda_i. So you just diagonalize the matrix, and it becomes a finite measure space.
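Both points — entropy from the spectrum, and frame-independence of the total mass — fit in a few lines for a 2x2 density matrix (the numbers are mine, chosen for illustration):

```python
import math

# Von Neumann entropy of a 2x2 density matrix rho = [[a, c], [c, b]]:
# diagonalize, then take -sum lambda_i log lambda_i of the eigenvalues.
a, b, c = 0.7, 0.3, 0.1                     # trace a + b = 1
tr, det = a + b, a * b - c * c
disc = math.sqrt(tr * tr / 4 - det)
lam = [tr / 2 + disc, tr / 2 - disc]        # eigenvalues, summing to 1
S = -sum(l * math.log(l) for l in lam if l > 0)

# Frame independence: the sum of <e, rho e> over ANY orthonormal frame
# equals the trace; here for a frame rotated by an arbitrary angle t.
t = 0.6
e1 = [math.cos(t), math.sin(t)]
e2 = [-math.sin(t), math.cos(t)]
def quad(e):
    return a * e[0] ** 2 + 2 * c * e[0] * e[1] + b * e[1] ** 2
frame_sum = quad(e1) + quad(e2)
```

Diagonalizing turns the density matrix into a finite measure space with weights lam, and S is just the Shannon entropy of that space.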
You take the entropy of that measure space. And then you want to prove the basic properties, the Shannon-type inequalities, in this von Neumann setting; and this becomes not completely obvious — and it is interesting that it is not completely obvious. So let me show you one way to formulate it. The point is that in quantum mechanics — and you see this also mathematically — there is no concept of passing from a big system to a small system. In classical physics there is reduction: from a big system, you make a measurement and you pass to the smaller system. In quantum mechanics this doesn't quite work. In the simplest classical example, you have a big system A × B, and you can project it to A, the small one; these are the reduction maps. In quantum mechanics the corresponding object is the tensor product of spaces, because one thinks of quantum mechanical objects as spaces of functions on A and on B, and the joint system is their tensor product. And then the projection doesn't exist: you cannot reduce. This has some physical meaning: the quantum world is universal, indivisible — you cannot have an isolated quantum mechanical system; it simply does not exist, whatever the physicists mean by that. This is what they say, and mathematically it means exactly this. Actually, remember, we had this in the Mendelian formalism, because there the spaces came with additional structure: they were spaces of functions, and you could integrate over these spaces; this integral was the particular functional allowing the projection. Here, in principle, that is not available. What you can do, however, is symmetrize the states: if you have a group acting on the space by orthogonal transformations, you can average a state over the group, and you get something.
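The tensor-product picture of composite systems has one immediately checkable consequence: the spectrum of rho ⊗ sigma consists of all products of eigenvalues, so von Neumann entropy is additive over independent subsystems, exactly as for product measure spaces. A sketch (names mine):

```python
import math

def vn_entropy(eigs):
    """Von Neumann entropy computed from a density matrix's spectrum."""
    return -sum(l * math.log(l) for l in eigs if l > 0)

# The composite system lives on the tensor product; the spectrum of
# rho (x) sigma is all pairwise products of eigenvalues.
rho_eigs = [0.6, 0.4]
sigma_eigs = [0.5, 0.3, 0.2]
joint_eigs = [a * b for a in rho_eigs for b in sigma_eigs]

S_joint = vn_entropy(joint_eigs)
S_sum = vn_entropy(rho_eigs) + vn_entropy(sigma_eigs)
```

What has no naive analogue is going the other way, from joint_eigs back to a subsystem; that is the missing "reduction" that the averaging below stands in for.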
And this is systematically what happens in the quantum world: you have this averaging, and it plays the same role as reduction in the classical case. Classically, you have a bunch of numbers and you project them; this projection is prohibited in the quantum mechanical world. However, imagine you have a group acting, permuting the points within the fibers of the projection. If you average, states become constant along the fibers, and that is as good as having the projection: you can go from one to the other, and all the formulas translate instantly. With this picture in mind, we can ask what the Shannon inequality becomes under this averaging. And here is the basic inequality: it was proven by Lanford and Robinson some 40 or 50 years ago, describing how entropy behaves under this averaging. If you look at what it gives in the classical example, it becomes the Shannon inequality. But here there is a condition: commuting is fine, but the action has to be irreducible. And you may ask: why on earth irreducibility? The proof is not terribly difficult, not terribly easy — relatively simple, and it was proven. Then the conjecture was formulated — differently, in a more traditional physical language; in the terms I am using, the irreducibility was obviously an annoying condition. And there were other conjectures, corresponding essentially, in a slightly different language, to the following: one had the absolute Shannon inequality and wanted the relative inequality, for morphisms. Here there are no morphisms; that is what corresponds to irreducibility. The trouble with reducible actions is not that there are many components, but that some components occur with multiplicity, and when they do, the multiples can be completely mixed up. If in a reducible representation one summand appears many, many times — and this is what happens in these examples — then you can have too many symmetries, and the picture becomes really different.
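The simplest classical shadow of "entropy can only grow under averaging" (the example is mine, not Lanford and Robinson's formulation): average a diagonal state over the cyclic group permuting the coordinates, and the entropy rises to its maximum:

```python
import math

def H(p):
    """Shannon entropy of a probability vector, in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

# A diagonal density matrix with spectrum p, averaged over the cyclic
# group permuting the coordinates, becomes the uniform state; the
# entropy can only go up (here to its maximum, log n).
p = [0.7, 0.2, 0.06, 0.04]
n = len(p)
avg = [sum(p) / n] * n          # average over all cyclic shifts
S_before, S_after = H(p), H(avg)
```

Along the fibers of a classical projection the same averaging produces fiberwise-constant states, which is how the quantum inequality specializes to the Shannon inequality.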
And then it was proven, five years later, by Lieb and Ruskai that this is true without the irreducibility condition; this is the basic, basic fact here. And one way to see the proof is exactly along the line I described: you just redefine the entropy the same way we did it for finite sets. You have your Hilbert space with its density matrix Q; you take its n-th tensor power, where n is again a nonstandard, infinite number, and you start arguing everything from scratch in these terms. Again, by the law of large numbers, these matrices become quasi-homogeneous. Well, this "quasi" is a slightly annoying point. It means that in the limit, in the appropriate approximation, the matrix has the following nature: it is a homogeneous object — proportional to your original Hilbert form, all eigenvalues equal — composed with the projection to a subspace. And the presence of a projection means it is not the whole space: there is a kernel, a subspace where it vanishes; the form is semi-definite, not definite. This is the little point which makes the argument slightly tricky. However, if you simply assume that your objects, both before and after averaging, are quasi-homogeneous like that, then the inequality becomes a tautology, the same way as for sets — for different reasons, but if you make the computation, it is just obviously equal. The only thing to really check is that this operation is sufficiently functorial to survive everything: what was important in the classical case was that the Bernoulli approximation was true not only for objects but for morphisms, so you could coherently pass to the homogeneous limits. Here also you have to show that you can pass coherently. And this coherence — well, you have to say exactly what it is, and so it takes a couple of pages, on one hand.
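The Lieb–Ruskai inequality is strong subadditivity: H(ABC) + H(B) <= H(AB) + H(BC). Its classical (diagonal density matrix) case can be tested directly on random joint distributions — a sketch, with all helper names mine; the full quantum statement is of course stronger than this commuting special case:

```python
import math
import random
import itertools

def H(ps):
    """Shannon entropy of an iterable of probabilities, in nats."""
    return -sum(x * math.log(x) for x in ps if x > 0)

def marginal(joint, keep):
    """Marginalize {(a,b,c): prob} onto the coordinate positions in keep."""
    out = {}
    for key, pr in joint.items():
        k = tuple(key[i] for i in keep)
        out[k] = out.get(k, 0.0) + pr
    return out

rng = random.Random(0)
min_gap = float("inf")
for _ in range(200):
    # random joint distribution on {0,1}^3: the diagonal (classical)
    # case of a tripartite density matrix
    w = {k: rng.random() for k in itertools.product((0, 1), repeat=3)}
    z = sum(w.values())
    joint = {k: v / z for k, v in w.items()}
    # strong subadditivity: H(AB) + H(BC) - H(ABC) - H(B) >= 0
    gap = (H(marginal(joint, (0, 1)).values())
           + H(marginal(joint, (1, 2)).values())
           - H(joint.values())
           - H(marginal(joint, (1,)).values()))
    min_gap = min(min_gap, gap)
```

The gap is the conditional mutual information I(A;C|B), which is exactly the quantity the tensor-power, quasi-homogeneous argument shows to be nonnegative.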
On the other hand, the proof as traditionally done uses what is now called matrix convexity — the term appeared later — the fact that the whole theory of convexity, in particular of convex functions, has its counterpart for operators. It is a different kind of convexity, stable under tensor products, about which, I must admit, I don't know too much. But it is an interesting enough subject. OK, so this is the end of the story about entropy.