My talk will be about the object usually called the Hessian matrix. My main interests are probabilistic and analytic problems, so I am not a geometer. I will try to explain how this matrix appears in our problems and why it attracts attention, and hopefully some people from geometry will find something interesting for them here, maybe new problems, maybe something else. The starting point will be the optimal transportation problem, which is now a quickly developing area. This is the problem of finding, and studying the properties of, an object called the optimal transportation map. Let me remind you what it is. We are given two probability measures on R^n (you can simply think that they have densities), and we look for a mapping which pushes measure number one forward onto measure number two and has a particular form: it is the gradient of a convex function. The theorem, sometimes called Brenier's theorem, says that for a wide choice of measures (you ask nothing of measure number two, and for measure number one you can ask for a density or, if you like, the weaker assumption written here in the formulation) you can always find a mapping which sends the first measure onto the second and has this form, the gradient of a convex function. This discovery was made in the works of Brenier, who was working on fluid mechanics. The standard approach to this problem from the point of view of analysis is to work with what is called the Monge–Kantorovich problem, which is an infinite-dimensional linear programming problem: we look not for a function or a mapping but for a measure on the product space which minimizes a functional, called the cost functional, given by the formula written here.
So we integrate this quadratic expression. This quadratic function is called the cost function, and given a cost function and two measures, any of us can formulate the Monge–Kantorovich problem. The somewhat unexpected (and somewhat expected) fact is that for this transportation cost, the solution of the Kantorovich problem has a special structure: it is concentrated on the graph of the gradient of a convex function. This was the discovery of Brenier, and from his works the theory started. On the one hand it is about a very special, particular thing; on the other hand it turns out that the theory has numerous applications: it is related to many things, to problems from analysis, probability, stochastic processes, economics, and of course geometry. That is a rather remarkable situation: working on special things, you learn more about many other things. From the point of view of PDEs, you of course solve an equation of Monge–Ampère type, which is written here; you can see that it appears from the change of variables formula. The approach of Brenier gives you the instruments of linear programming. The standard way to prove the theorem is to solve the dual linear program, whose solution gives you a couple of functions, and these functions are exactly the transportation potentials. Or you can work directly with the measure: remember that the solution is a measure, so you can investigate this measure and verify by hand that there is a convex potential.
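To make the Kantorovich formulation concrete, here is a minimal discrete sketch (the atoms and weights are my own invented data, not from the slides): the linear program over couplings of two discrete measures with quadratic cost, solved with SciPy.

```python
import numpy as np
from scipy.optimize import linprog

# Discrete Kantorovich problem: minimize sum_{ij} c_ij * pi_ij over couplings pi
# whose marginals are mu and nu. Atoms and weights are illustrative only.
x = np.array([0.0, 1.0, 2.0])          # support of mu
y = np.array([0.5, 1.5, 2.5])          # support of nu
mu = np.array([0.2, 0.5, 0.3])         # weights of mu
nu = np.array([0.3, 0.4, 0.3])         # weights of nu

# Quadratic cost c(x, y) = |x - y|^2, flattened row-major.
C = (x[:, None] - y[None, :]) ** 2
c = C.ravel()

# Marginal constraints: row sums of pi equal mu, column sums equal nu.
n, m = len(x), len(y)
A_eq = np.zeros((n + m, n * m))
for i in range(n):
    A_eq[i, i * m:(i + 1) * m] = 1.0   # i-th row sum
for j in range(m):
    A_eq[n + j, j::m] = 1.0            # j-th column sum
b_eq = np.concatenate([mu, nu])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
pi = res.x.reshape(n, m)
print("optimal transport cost:", res.fun)
```

In one dimension with quadratic cost the optimal coupling is the monotone one, which for this data pairs each atom with a point at distance 0.5, so the optimal cost is 0.25.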
There is this property that the solution is concentrated on a graph, and the arguments go back to results from mathematical economics: people in economics use this convex-analysis language, Legendre transforms, convex functions, generalized convexity, for applications to economics. Anyway, you get a solution to the transportation problem in a very wide setting. The probabilistic motivation which attracted my attention was a result of Caffarelli that this mapping is a contraction, meaning that T is a 1-Lipschitz function, provided the potentials satisfy these inequalities. Here D^2 is, of course, the Hessian matrix. The standard situation where you apply the theorem is when the source measure is the Gaussian measure: in that case the constant here is 1, for the target measure you assume the corresponding lower bound on the Hessian of its potential, and you obtain a contraction mapping. From the point of view of, say, people working on the Monge–Ampère equation, Caffarelli's observation was nothing special, just an observation: bounds of this type for solutions of the Monge–Ampère equation were known for years, usually obtained by the maximum principle. You take the Monge–Ampère equation and differentiate it, and techniques known since Calabi, I guess, namely the maximum principle, give you this estimate. But for analysis it was a kind of wonderful discovery, because having this instrument you can immediately transfer many results for Gaussian measures to measures satisfying this estimate. For the Gaussian measure you can establish many sharp results: sharp isoperimetric inequalities, Sobolev-type inequalities, decompositions in Hermite polynomials. Everything is very precise.
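One can see the contraction property numerically in one dimension, where the optimal map is the monotone rearrangement T = F_nu^{-1} composed with F_mu. This is only an illustrative sketch under my own choice of target: the potential V(x) = x^2/2 + x^4 satisfies D^2 V >= 1, so Caffarelli's theorem predicts that the map from the standard Gaussian is 1-Lipschitz.

```python
import numpy as np

# 1D optimal map from mu (standard Gaussian) to nu with density
# proportional to exp(-x^2/2 - x^4): T = F_nu^{-1} o F_mu.
grid = np.linspace(-6.0, 6.0, 20001)
dx = grid[1] - grid[0]

def cdf(logdens):
    w = np.exp(logdens(grid))
    w /= w.sum() * dx                  # normalize to a probability density
    return np.cumsum(w) * dx

F_mu = cdf(lambda t: -t**2 / 2)
F_nu = cdf(lambda t: -t**2 / 2 - t**4)

# Invert F_nu by interpolation, on an inner sub-grid to avoid the flat tails.
inner = slice(2000, -2000)
T = np.interp(F_mu[inner], F_nu, grid)

# Finite-difference Lipschitz constant; should be <= 1 up to discretization.
lip = np.max(np.abs(np.diff(T)) / dx)
print("max |T'| ~", lip)
```

The estimated Lipschitz constant comes out below 1, in line with the theorem; the grid bounds and the inner slice are just numerical hygiene for the cdf inversion.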
This is nice because the Gaussian measure is a kind of precise model for probability, so within this model you can transfer many results, recover many things, and get some new ones. Also, with this Caffarelli contraction theorem in mind, we believed it could be applied to problems which are still open. Yesterday there was a talk of Bo'az Klartag; I think he was talking about problems like the KLS conjecture and related things, and hopefully he explained what is going on in that area, so it is not what I am going to talk about here. The hope was that, having this transport instrument, one can transfer analytic estimates from, say, the Gaussian measure and other nice measures to more difficult objects like uniform distributions on convex sets, and get some new results. At the moment the best results, however, are obtained not by this technique but with the help of what is called stochastic localization; it turns out that martingales, the Itô formula and related tools work very well here. Nevertheless, the contraction theorem still attracts attention. For instance, let me mention a very nice recent result of Fathi, Gozlan and Prod'homme: they established an estimate of Caffarelli's type without applying the maximum principle, using completely different techniques, convex ordering, entropy functionals, things coming from other, quite unrelated areas. Okay. And how does one prove the theorem? The standard proof relies on differentiating the Monge–Ampère equation: having an equation of this type, you differentiate it and get something which, let me write it in this form, involves a kind of elliptic operator.
And I want to stress something which is eventually not very well known: this operator generates a Dirichlet form of the symmetric type written here, and this Dirichlet form you can view as an energy on a Riemannian manifold. So you get an object which is a space equipped with a measure and a metric: not only the Riemannian manifold with this Hessian metric, but what is called a metric measure space, measure plus metric on the same space, and the optimal transportation map respects this structure. I should add that, of course, you can write this for the source measure, but you can write it for the target measure as well: in the optimal transportation problem you have two measures, you transport mu into nu, and for nu you have the same thing, a measure and a Hessian metric, where psi is related to phi by the Legendre transform. This transportation map is, first of all, a measure-preserving diffeomorphism, it sends measure to measure, and it also preserves the metric. So this map T, from the first space onto the second one, preserves both the measure and the metric. This is a very nice picture: in optimal transport theory you have this metric measure space, and it is very naturally related to the transportation problem. Okay, let us go through the proof of the Caffarelli theorem. As I said, you differentiate; let me just quickly say what happens. You differentiate twice and get this identity; you look for a maximum point of phi_ee, and at the maximum point you have this inequality, so the argument is almost immediate.
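To make the maximum-principle step explicit, here is a sketch of the computation in my own notation (source potential V, target potential W, so that the Monge–Ampère equation reads e^{-V} = e^{-W(\nabla\varphi)} \det D^2\varphi), assuming the maximum below is attained at an interior point:

```latex
% Logarithm of the Monge-Ampere equation:
\[
-V(x) \;=\; -W(\nabla\varphi(x)) \;+\; \log\det D^2\varphi(x).
\]
% Differentiate twice in a fixed unit direction e, writing
% \Phi^{ij} for the inverse Hessian of \varphi:
\[
-\partial_{ee}V
\;=\; -\,W_{kl}\,\varphi_{ke}\varphi_{le}
      \;-\; W_{k}\,\varphi_{kee}
      \;+\; \Phi^{ij}\varphi_{ijee}
      \;-\; \Phi^{ik}\Phi^{jl}\varphi_{ije}\varphi_{kle}.
\]
% At a maximum point of \varphi_{ee} we have \varphi_{kee}=0 and
% \Phi^{ij}\varphi_{ijee}\le 0; the last term is always \le 0. Hence, if
% D^2W \ge \mathrm{Id}:
\[
\partial_{ee}V
\;\ge\; W_{kl}\,\varphi_{ke}\varphi_{le}
\;\ge\; |D^2\varphi\, e|^{2}
\;\ge\; (\varphi_{ee})^{2}.
\]
% Combined with D^2V \le \mathrm{Id} this gives \varphi_{ee}\le 1 at the
% maximum point, i.e. T=\nabla\varphi is 1-Lipschitz.
```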
From this you can very easily extract the conclusion of the Caffarelli theorem. Okay, here is just a list of results you can get by applying this contraction theorem: you can prove isoperimetric estimates, logarithmic Sobolev inequalities, and something about the spectrum of the weighted Laplacian. Here you get what in Riemannian geometry are usually called comparison theorems: the Gaussian measure serves as a model, and for the target measure you find that the isoperimetric properties are not worse than those of the Gaussian space. For analysts, the right instrument for working with metric measure spaces is not the Riemannian tensor, and not the Ricci tensor, but the object called the Bakry–Émery tensor. Here we have a combination of the Ricci tensor and the Hessian of the potential: you have a measure with a density with respect to the Riemannian volume, and the resulting curvature is called the Bakry–Émery tensor. The Bakry–Émery tensor is responsible for most of the analytic characterizations of metric measure spaces. Here is just a simple example, the Brascamp–Lieb theorem, which is in fact an almost immediate corollary of the Bochner formula for metric measure spaces. Recall that the Bochner formula relates the action of the generator of the weighted Laplacian of the metric measure space to the Hilbert–Schmidt norm of the Hessian; here we have the Bochner identity plus the Bakry–Émery tensor applied to the gradient. This estimate is quite important for analysts. It was discovered by Brascamp and Lieb, not in the Riemannian setting but in the flat setting, and it can be obtained by different methods: for instance, it is also a corollary of the Brascamp–Lieb inequality, or of Prékopa–Leindler if you like functional forms.
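In the flat setting the Brascamp–Lieb inequality says Var_mu(f) <= int <(D^2 V)^{-1} grad f, grad f> dmu for mu = e^{-V} dx with V strictly convex. Here is a quick one-dimensional numerical check; the potential and the test functions are my own illustrative choices.

```python
import numpy as np

# Check of the 1D Brascamp-Lieb inequality
#   Var_mu(f) <= int (f')^2 / V'' dmu,  mu = e^{-V} dx / Z,  V strictly convex.
x = np.linspace(-5.0, 5.0, 40001)
dx = x[1] - x[0]

V = x**2 / 2 + x**4                  # strictly convex potential
Vp = x + 4 * x**3                    # V'
Vpp = 1 + 12 * x**2                  # V'' >= 1
w = np.exp(-V)
w /= w.sum() * dx                    # normalized density of mu

def both_sides(f, fp):
    """Return (Var_mu(f), int (f')^2 / V'' dmu) by Riemann sums."""
    mean = np.sum(f * w) * dx
    var = np.sum((f - mean) ** 2 * w) * dx
    rhs = np.sum(fp**2 / Vpp * w) * dx
    return var, rhs

# Generic test function: the inequality is strict.
var1, rhs1 = both_sides(x**2, 2 * x)
print(var1, "<=", rhs1)

# f = V' is an equality case: Var(V') = E[V'^2] = E[V''] by integration by parts.
var2, rhs2 = both_sides(Vp, Vpp)
print(var2, "~=", rhs2)
```

The second pair is (numerically) an equality, which matches the fact that gradients of the potential saturate the inequality.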
Just as an illustration of the new results one can get, let us apply this inequality to the metric measure space related to the transportation problem. To this end you have to compute the Bakry–Émery tensor, and it has this quite nice symmetric expression: here you see the second derivatives of the potential, and here a positive tensor depending on the third derivatives. And if you apply this Brascamp–Lieb result to a function which is an eigenfunction of the generator, you get this kind of estimate for the eigenvalues of the generator. The nice thing is that it does not depend on the dimension. This was a result obtained by Klartag and myself, I guess around 2016. Okay, that was just an illustration. Maybe I do not have a lot of time; I have a lot of slides but I am rather slow, I apologize for this. So let me quickly discuss another result, which is eventually more interesting for this audience, about estimates for the solution of what is called the moment measure problem, or, if you like, the real Kähler–Einstein equation. Let me formulate it: you have a probability measure with barycenter zero, and you are looking for a representation of this type. You get an equation for the potential phi, and the gradient of phi gives you the optimal transportation of the measure e^{-phi} dx onto the target measure. So you get an equation of this type, which is known as the Kähler–Einstein equation; it was studied by many people, including Professor Berndtsson, but mostly in the smooth setting, and one can apply a variational approach to it.
You can get a solution under quite wide assumptions: well-posedness of this problem under quite wide assumptions was obtained in the paper of Klartag and Cordero-Erausquin, around 2015, showing that for a wide class of measures you can solve this equation in a weak sense, with phi giving the maximum of the following functional. This functional is related to the Prékopa inequality. Later it was also observed that there is another variational functional which gives you the same solution but can be formulated probabilistically: here you have a measure, the Gaussian entropy of the measure, and also what is called the Wasserstein distance; this W_2 is nothing else but the Kantorovich functional for the quadratic cost. Okay. And I think I will stop... oh, I understand that my time is already over. [Chair: maybe three more minutes.] Okay, that's enough. Together with Klartag we studied a special case of this equation, which is written here: the moment measure problem for the measure which is uniformly distributed on a convex set. You are given an arbitrary convex set and you study this equation, where the moment measure you want is just the uniform distribution on this set. Let me stress that the set here is not a polytope; it is an arbitrary convex set. You solve this equation, and we conjectured that the related Hessian metric satisfies the following bound for its Ricci curvature, and the case where this inequality is an equality is given by the following example: if you take the simplex and solve the corresponding Kähler–Einstein equation,
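To see the moment-measure equation in action, here is a worked one-dimensional case I constructed for illustration (it is not from the slides): for the one-dimensional "simplex", i.e. mu uniform on [-1, 1], the convex potential phi(x) = 2 log cosh(x/2) + log 4 solves mu = (grad phi)_* (e^{-phi} dx), since phi'(x) = tanh(x/2) and e^{-phi(x)} = sech^2(x/2)/4.

```python
import numpy as np

# phi(x) = 2*log(cosh(x/2)) + log(4): convex, e^{-phi} integrates to 1,
# and T(x) = phi'(x) = tanh(x/2) pushes e^{-phi} dx onto Uniform([-1, 1]).
x = np.linspace(-40.0, 40.0, 400001)
dx = x[1] - x[0]

dens = 0.25 / np.cosh(x / 2) ** 2        # e^{-phi(x)}
mass = dens.sum() * dx
print("total mass:", mass)               # should be ~ 1

T = np.tanh(x / 2)                       # the transport map phi'
# Pushforward cdf check: P(T(X) <= t) should equal (t + 1) / 2 on (-1, 1).
for t in (-0.5, 0.0, 0.7):
    p = dens[T <= t].sum() * dx
    print(t, p, (t + 1) / 2)
```

One can verify the equation directly as well: e^{-phi} = rho(phi') * phi'' with rho = 1/2, since phi''(x) = sech^2(x/2)/2.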
you can write down the explicit solution, and in this particular case the Hessian metric turns out to be spherical: for the simplex the Ricci curvature is constant, and it equals exactly this quantity. The conjecture was that this is actually the maximum for the Ricci tensor, meaning that for every other convex body the Ricci tensor is bounded by this quantity. This is a kind of very beautiful geometric fact. I do not know any application of it, but maybe you do. While dealing with this we were thinking about problems from asymptotic analysis like KLS, so there was a hope that a metric of this type could give us some new results. The last time we discussed this with Klartag was just before the COVID epidemic; actually that was the very beginning of this story, and afterwards, for reasons of lockdown and others, we stopped this communication, and since that time there have been no new developments. But as of three years ago we had the result that this conjecture is verified in dimension two. In dimension two we can prove it because in that case the Ricci tensor reduces to a scalar function, so you can work with a function, apply the maximum principle, and do a kind of two-dimensional analysis. We were trying to work with dimension three, where the Ricci tensor is also relatively simple and has many symmetries, but it was still not clear to us how to get the result. We believe the full conjecture is true, and hopefully it will be interesting for geometers for some reason. Okay, I have more slides, but sorry, I have to stop here.