Thank you. And as Victor mentioned, it is actually "reflections" — the word also means thoughts, but here it really is reflections, the left and right side of everything. So I will speak about the most well-known contribution of Victor Kac to mathematics, Kac–Moody algebras. And I will divide my board into two parts: one will be an orthogonal part, and the other a symplectic part. You start with a generalized Cartan matrix: a matrix A = (a_ij), with i, j from 1 to some number r, whose entries are integers, whose diagonal elements are equal to 2, and whose off-diagonal elements are non-positive; and assume it is symmetrizable. That means there exist numbers d_i, strictly positive — in fact one can choose them to be integers — such that d_i a_ij = d_j a_ji. So up to these factors it is a symmetric matrix. Then we have a remarkable object associated to the Cartan matrix, the Kac–Moody algebra. There are several versions; I will speak about the minimal one. It has a direct sum decomposition into lower-triangular, diagonal and upper-triangular parts, n_- ⊕ h ⊕ n_+, with generators e_i, h_i, f_i, and relations which by now I think most mathematicians know: [e_i, f_j] = δ_ij h_i; [h_i, e_j] = a_ij e_j — or maybe I mean a_ji, Victor, you know better; and [h_i, f_j] = −a_ij f_j the other way around. And then you also need the Serre relations: the h_i commute with each other, and ad(e_i)^{1−a_ij} applied to e_j is zero, and similarly for the f's. It is quite a miraculous list of relations, and here is what is nice about it. The algebra consists of three parts: n_- generated by the f's and their commutators, h spanned by the h's, and n_+ generated by the e's and their commutators. The algebra is Z^r-graded, and the roots are the elements of Z^r corresponding to the non-trivial graded components.
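The two defining conditions above — generalized Cartan matrix and symmetrizability — can be checked mechanically. A minimal sketch (the matrix of type B2 below is my own illustrative example, not from the talk):

```python
# Check the generalized Cartan matrix axioms: integer entries, a_ii = 2,
# a_ij <= 0 off the diagonal; and the symmetrizability condition
# d_i * a_ij == d_j * a_ji, i.e. D*A is symmetric for D = diag(d).

def is_generalized_cartan(A):
    r = len(A)
    for i in range(r):
        if A[i][i] != 2:
            return False
        for j in range(r):
            if i != j and (not isinstance(A[i][j], int) or A[i][j] > 0):
                return False
    return True

def symmetrizes(A, d):
    r = len(A)
    return all(d[i] * A[i][j] == d[j] * A[j][i]
               for i in range(r) for j in range(r))

# A rank-2 example (type B2, my assumption for illustration):
# non-symmetric, but symmetrizable with d = (2, 1).
A = [[2, -1],
     [-2, 2]]
assert is_generalized_cartan(A)
assert not symmetrizes(A, [1, 1])   # A itself is not symmetric
assert symmetrizes(A, [2, 1])       # but D*A is, with D = diag(2, 1)
```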
And the set of roots has the following three remarkable properties. First, the set of roots contains all the vectors ±α_i, where the α_i are the basis elements of the coordinate space: α_i is the degree of the element e_i, and −α_i is the degree of f_i. (Maybe I will also put zero in the set of roots.) This first property is a kind of lower bound on the set of roots. Second, the upper bound: the set of roots is contained in the union of two chambers, C_+ ∪ C_-, the positive and the negative octant — so, say with three variables, every root is a vector in the positive octant or in the negative octant. And third, the whole set of roots is invariant under the Weyl group. What is the Weyl group? It is the group generated by reflections s_i, where s_i(α_i) = −α_i and s_i(α_j) = α_j − a_ij α_i. In fact I could write this only for j not equal to i, but for j = i it is the same formula because the diagonal elements are 2. And you see it is really surprising that such a set exists. Take the minimal set satisfying the first property — all basis vectors with ± signs — and start applying reflections from this group. Why do all the resulting elements always lie in the positive or the negative octant? It is a kind of miracle. The only proof I know is by the existence of Kac–Moody algebras; maybe one can give an elementary proof, but it would be very complicated — one would have to treat separately small cases and large cases, with completely different techniques. So I find this a very remarkable geometric corollary of the existence of Kac–Moody algebras.
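For the simplest rank-2 case this miracle can be watched directly. A sketch (my own illustration, using the type A2 matrix and the reflection formula s_i(v) = v − (Σ_j a_ij v_j) α_i in root coordinates): closing {±α_1, ±α_2} under reflections produces exactly six roots, all in the positive or negative octant.

```python
# Close the set {±α1, ±α2} under the simple reflections for the
# type A2 Cartan matrix, and check that every root lies entirely in
# the positive or entirely in the negative octant (property 2).

A = [[2, -1],
     [-1, 2]]

def reflect(i, v):
    # s_i(v) = v - (sum_j a_ij v_j) * alpha_i, in coordinates w.r.t. the alpha's
    c = sum(A[i][j] * v[j] for j in range(2))
    w = list(v)
    w[i] -= c
    return tuple(w)

roots = {(1, 0), (0, 1), (-1, 0), (0, -1)}
frontier = set(roots)
while frontier:                      # breadth-first closure under s_1, s_2
    new = {reflect(i, v) for v in frontier for i in (0, 1)} - roots
    roots |= new
    frontier = new

assert len(roots) == 6               # ±α1, ±α2, ±(α1 + α2)
assert all(all(c >= 0 for c in v) or all(c <= 0 for c in v) for v in roots)
```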
And it works over the real numbers, with signs, or over the integers — otherwise it is just wrong. What I want to say is that the reflected vector s_i(α_j) is α_j shifted by α_i with a non-negative coefficient. Now let me draw some basic examples. When r = 2, you can start with the simplest interesting Cartan matrix, the one of type A2, and you get six roots; you see they sit in the positive and negative octants. Then you can take a slightly more complicated matrix, and you get what is called an affine algebra: the set of roots consists of the integer vectors lying on three parallel lines, and again it splits into a positive and a negative part. And then one can take an even more complicated thing, a genuinely hyperbolic algebra. Hyperbolic algebras we don't really understand much, but the picture is like this. You get, say, the vector (1, 0), then the vector (3, 1), then (8, 3) somewhere further, and so on, and similarly on the other side. These are the vectors obtained from the simple roots, the basis vectors, by reflections. And in between you get two angular sectors with slopes (3 ± √5)/2, and everywhere inside them sit what are called imaginary roots, on which the quadratic form given by this matrix is non-positive, while the real roots are the ones coming from the simple roots. So you get this part which is the mysterious part of the hyperbolic algebra. Finally one can try to draw a three-dimensional example: take, for instance, a symmetric 3×3 matrix. The set of roots is a collection of vectors in three-dimensional space, and you can projectivize it: take the positive roots, and you get a subset of a triangle. And this subset is pretty complicated — I've seen somebody make a picture of it — it is not at all convex, it is really some complicated fractal monster in the projectivization, in RP^2. So that is the general picture of this story.
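The hyperbolic rank-2 picture can be reproduced numerically. A sketch, assuming the Cartan matrix [[2, −3], [−3, 2]] (a standard matrix producing the slopes (3 ± √5)/2; the talk does not name the matrix explicitly): alternately reflecting a simple root yields the chain of real roots whose directions converge to the boundary of the imaginary cone.

```python
# Generate real roots of the hyperbolic algebra with Cartan matrix
# [[2,-3],[-3,2]] by alternately applying s_1 and s_2 to the simple
# root alpha_2, and watch the slope converge to (3 + sqrt(5))/2.

A = [[2, -3],
     [-3, 2]]

def reflect(i, v):
    # s_i(v) = v - (sum_j a_ij v_j) * alpha_i in root coordinates
    c = sum(A[i][j] * v[j] for j in range(2))
    w = list(v)
    w[i] -= c
    return tuple(w)

chain = [(0, 1)]          # start from the simple root alpha_2
i = 0                     # alternate s_1, s_2, s_1, ...
for _ in range(12):
    chain.append(reflect(i, chain[-1]))
    i = 1 - i

# the first few real roots on this side of the cone
assert chain[1:4] == [(3, 1), (3, 8), (21, 8)]

golden_slope = (3 + 5 ** 0.5) / 2
x, y = chain[-1]
assert min(abs(x / y - golden_slope), abs(y / x - golden_slope)) < 1e-6
```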
And now, how can one reformulate properties one to three geometrically? Suppose one has some vector space R^n with a non-degenerate — for simplicity symmetric — bilinear form, where n is some number maybe larger than the rank r of our algebra. Take a collection of linearly independent vectors α_1, …, α_r such that the scalar product (α_i, α_j) is exactly this symmetrized matrix d_i a_ij; one just extends to some larger space to make the form non-degenerate. Then properties one to three are equivalent to the following. There is a subset, which I will call the wall, or walls, sitting in R^n, with three properties. What is this subset? It is the union of the hyperplanes orthogonal to all roots — one just replaces vectors by hyperplanes. The analog of property one is that this set of walls contains the union of the hyperplanes α_i^⊥. The second property says that the complement of the walls contains two strict cones, call them C̃_+ and C̃_-: C̃_+ is the set of all vectors v with (v, α_i) strictly positive for all i, and C̃_- is the opposite cone. So the complement contains two open simplicial cones. And the third property is that the set of walls is invariant under the action of the Weyl group generated by these reflections. Okay. Now let us go to the symplectic version. The symplectic part of the story is related to the cluster algebras of Fomin and Zelevinsky. We start again with a matrix with integer coefficients, but now the diagonal elements a_ii are 0; the matrix is again symmetrizable — there exist such d_i — but there is no condition on the signs of the off-diagonal entries. We don't know what the analog of the Kac–Moody algebra is, but the geometric result will be very similar. First of all, there is no Weyl group in this case, so to formulate what goes on I really have to pass to the analog of this geometric reformulation. And the analog is the following. First I have to define some notion of mutation.
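The second property — walls avoiding the strict cones — follows from positivity of root coefficients, and can be illustrated concretely. A sketch (my own small example, using the symmetric type A2 form, where the bilinear form B coincides with the Cartan matrix): for v in C̃_+ and any positive root α = Σ c_i α_i with c_i ≥ 0, the pairing (v, α) = Σ c_i (v, α_i) is a sum of non-negative terms, so the wall α^⊥ never meets C̃_+.

```python
# Illustrate property 2': for v in the strict cone C~_+ and every positive
# root alpha, the pairing (v, alpha) is strictly positive, so v is not on
# any wall alpha-perp. Example form: the symmetric A2 matrix.

B = [[2, -1],
     [-1, 2]]

def pair(u, w):
    return sum(u[i] * B[i][j] * w[j] for i in range(2) for j in range(2))

pos_roots = [(1, 0), (0, 1), (1, 1)]   # positive roots of type A2

v = (1, 1)                              # (v, alpha_1) = (v, alpha_2) = 1 > 0,
assert pair(v, (1, 0)) > 0              # so v lies in the strict cone C~_+
assert pair(v, (0, 1)) > 0
assert all(pair(v, a) > 0 for a in pos_roots)   # v avoids every wall
```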
Suppose I have a vector space — now it will be even-dimensional, with a non-degenerate skew-symmetric form — and a collection of linearly independent vectors, related to our numbers by ⟨α_i, α_j⟩ = d_i a_ij. And now I want to define the notion of mutation. A mutation in direction i is an operation which makes a new collection of vectors from the old one: α_i' = −α_i, as in the symmetric case, and α_j' = α_j + [⟨α_i, α_j⟩/d_i]_+ α_i. What does the plus mean? Here x_+ is the maximum of x and 0, for a real number x. It is essentially the same formula — one can replace ⟨α_i, α_j⟩/d_i by a_ij. And α_j' is α_j plus a non-negative integer multiple of α_i, because of our integrality agreement. Geometrically the transformation looks like this. Consider the skew-orthogonal hyperplane to α_i; the vector α_i itself belongs to this hyperplane, because the form is symplectic. Some vectors of our collection lie on one side of the hyperplane and some on the other, depending on the sign of their pairing with α_i. The new collection is the following: we replace α_i by −α_i, we keep the vectors on one side unchanged, and the vectors on the other side we tilt in the direction of the original vector α_i. That is another way to look at this "reflection". One can formulate things so that the two cases become completely indistinguishable — well, I'm sorry, I did not manage to make it completely symmetric, I failed. Now, the theorem I want to announce is the following; it is the analog of the properties above. There exists a subset — in fact a very canonical subset — of walls sitting in R^n with the following property. There will be no exact analog of properties one to three, but I will formulate a purely geometric property.
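The one-sided tilt is easy to see in coordinates. A minimal sketch (my own model: symmetric case d_i = 1, the standard symplectic form on R^2, and vectors chosen so that ⟨α_1, α_2⟩ = 2): mutating at α_1 tilts α_2, while mutating at α_2 leaves α_1 alone, because α_1 lies on the other side of the hyperplane.

```python
# Mutation from the talk: alpha_i' = -alpha_i, and for j != i
# alpha_j' = alpha_j + [omega(alpha_i, alpha_j)]_+ * alpha_i,
# where [x]_+ = max(x, 0). Model: standard symplectic form on R^2.

def omega(u, w):
    return u[0] * w[1] - u[1] * w[0]

def mutate(vectors, k):
    ak = vectors[k]
    out = []
    for j, a in enumerate(vectors):
        if j == k:
            out.append(tuple(-c for c in ak))       # flip the mutated vector
        else:
            t = max(omega(ak, a), 0)                # tilt only one side
            out.append(tuple(a[m] + t * ak[m] for m in range(2)))
    return out

alphas = [(1, 0), (0, 2)]                  # omega(alpha1, alpha2) = 2
assert mutate(alphas, 0) == [(-1, 0), (2, 2)]   # alpha2 tilted toward alpha1
assert mutate(alphas, 1) == [(1, 0), (0, -2)]   # alpha1 on the other side: kept
```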
There will be a certain subset which contains the union of the skew-orthogonal hyperplanes α_i^⊥. The complement to the walls contains, again, the cones C̃_+ and C̃_-, exactly as I denoted before — two open cones. And instead of invariance under the Weyl group there will be a different formula. The transformation above — I did not name it before, let me call it T_i, sending α_j to α_j' — is a piecewise linear transformation of the symplectic vector space. And the replacement for Weyl-group invariance is the following: the walls of the mutated collection coincide with T_i applied to the walls of the original collection. So if you mutate, you get the same picture, but now transported not by a linear transformation but by a piecewise linear one. This is a theorem, and one can make a completely parallel list of examples to the usual algebras. Again in rank 2, for the matrix with entries 0, 1, −1, 0 we get five walls — five rays, not six rays. Similarly, for 0, 2, −2, 0 one gets almost the same picture as for the affine Kac–Moody algebra: the same rays, but now passing through the two negative basis vectors in place of the corresponding roots of the Kac–Moody picture. And similarly for 0, 3, −3, 0 one gets essentially the same picture — again one takes the two negative basis vectors, and the same sectors with the same slopes in the positive octant. There is also a three-dimensional version: Fock and Goncharov, for example, computed the picture in such a case and got an extremely complicated decomposition of a certain part of the projective plane into small triangles. So it is a surprisingly similar story, with the same kind of fractality, looking very much the same way. The only proof I have of this result relies on wall-crossing formulas and on a certain factorization of elements in a symplectomorphism group — it has nothing to do with Kac–Moody algebras.
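The five-wall picture for the matrix 0, 1, −1, 0 reflects the famous pentagon periodicity of rank-2 cluster mutations. Here is a sketch using the standard sign-coherent c-vector mutation of Fomin–Zelevinsky and Nakanishi–Zelevinsky — a bookkeeping that differs slightly from the symplectic formula above, so this is my own translation, not the talk's construction: five alternating mutations return the initial data up to swapping the two indices.

```python
# Cluster mutation of an exchange matrix B together with c-vectors C
# (rows of C). Uses the sign-coherent rule c_j -> c_j + [eps*b_kj]_+ c_k,
# eps = sign of the c-vector c_k, plus the standard B-matrix rule.

def mutate(B, C, k):
    n = len(B)
    eps = 1 if any(x > 0 for x in C[k]) else -1    # sign coherence
    C2 = [list(c) for c in C]
    C2[k] = [-x for x in C[k]]
    for j in range(n):
        if j != k:
            t = max(eps * B[k][j], 0)
            C2[j] = [C[j][m] + t * C[k][m] for m in range(n)]
    B2 = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == k or j == k:
                B2[i][j] = -B[i][j]
            else:
                B2[i][j] = B[i][j] + (max(B[i][k], 0) * max(B[k][j], 0)
                                      - max(-B[i][k], 0) * max(-B[k][j], 0))
    return B2, C2

B = [[0, 1], [-1, 0]]
C = [[1, 0], [0, 1]]
for k in (0, 1, 0, 1, 0):            # the pentagon: mu1 mu2 mu1 mu2 mu1
    B, C = mutate(B, C, k)
assert C == [[0, 1], [1, 0]]         # initial c-vectors, indices swapped
assert B == [[0, -1], [1, 0]]        # initial exchange matrix, indices swapped
```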
So I really have no idea how to give a joint, simultaneous proof. There is a special case, when the matrices are symmetric, where there is some category theory behind both, and in the invariants one can see some analogies and relations and put things on the same footing. But in general the two cases are completely different, and it remains quite a big mystery. One should maybe look for some algebraic structure behind it. Thank you.
— Are there questions?
— I have a short comment. In fact there is even a geometric proof in the more general, not necessarily crystallographic, situation — there you cannot use Kac–Moody algebras.
— I have another question. Does this exist when the matrix is not symmetrizable?
— Not symmetrizable — but it should be crystallographic. Other questions?
— Cluster algebras have a connection with integrable systems, the T- and Q-relations, which in turn are related to Kac–Moody algebras.
— No, but here I consider a very wild thing: general hyperbolic Kac–Moody algebras are not related to anything. There are some with automorphic properties, but most of them are completely wild objects. And it is the same story here. Of course there are the A, B, C, D, E cases and so on.