Thanks a lot for inviting me to give this talk. I will speak about the reflection equation algebra. I have spoken about it here two or three times already, but now I want to present some new results, a new application of this algebra. All results were obtained in collaboration with Pavel Saponov and other people, but mainly with Saponov. I want to begin with the classical form of the Frobenius formula, so first I will speak about Schur–Weyl duality. Let V be the basic space on which the algebra U(gl(N)) acts. This action can be extended to the tensor power V^{⊗k} for any positive integer k: using the usual coproduct on the generators, it is possible to extend the action of the algebra to all tensor powers of V. On the space V^{⊗k} the symmetric group S_k also acts, by permuting the tensor factors: the first transposition acts on the first and second positions, and so on. Schur–Weyl duality states that these two actions commute with each other; moreover, the two algebras are centralizers of each other. Symbolically, V^{⊗k} decomposes as a sum, over all partitions λ of the integer k, of V_λ ⊗ M_λ, where V_λ is an irreducible module over U(gl(N)) labelled by λ, and M_λ, also labelled by λ, is an irreducible module of the symmetric group. The question is: what is the quantum analog of this duality? The group algebra of the symmetric group can be deformed into the Hecke algebra H_n(q), but what is the quantum analog of U(gl(N))? Everybody says it is the quantum group, and indeed Jimbo suggested a form of quantum Schur–Weyl duality in which the role of U(gl(N)) is attributed to the quantum group. After that, Arun Ram exhibited a q-analog of the Frobenius formula based on this duality.
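As a quick sanity check on the decomposition V^{⊗k} = ⊕_λ V_λ ⊗ M_λ, the dimensions on both sides must agree: Σ_λ dim V_λ · dim M_λ = n^k. A minimal sketch using the standard hook length and hook content formulas (my toy code, not part of the talk):

```python
from math import factorial

def partitions(n, max_part=None):
    """All partitions of n as weakly decreasing tuples."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for p in range(min(n, max_part), 0, -1):
        for rest in partitions(n - p, p):
            yield (p,) + rest

def hooks(lam):
    """Hook lengths of the Young diagram lam, cell by cell."""
    cols = [sum(1 for r in lam if r > j) for j in range(lam[0])] if lam else []
    return [lam[i] - j + cols[j] - i - 1
            for i in range(len(lam)) for j in range(lam[i])]

def dim_Sk(lam):
    """Number of standard Young tableaux = dim of the S_k-module M_lam."""
    hp = 1
    for h in hooks(lam):
        hp *= h
    return factorial(sum(lam)) // hp

def dim_GLn(lam, n):
    """Weyl dimension of the GL(n)-module V_lam (hook content formula)."""
    cells = [(i, j) for i in range(len(lam)) for j in range(lam[i])]
    num, den = 1, 1
    for (i, j), h in zip(cells, hooks(lam)):
        num *= n + j - i   # content factor
        den *= h
    return num // den

n, k = 3, 4
total = sum(dim_GLn(lam, n) * dim_Sk(lam) for lam in partitions(k))
print(total, n ** k)  # both sides of the Schur-Weyl decomposition: 81 81
```

For n = 3, k = 4 the five partitions of 4 contribute 15·1 + 15·3 + 6·2 + 3·3 + 0·1 = 81 = 3⁴, as the duality requires.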
But now I want to present here another form of quantum Schur–Weyl duality, in which the role of U(gl(N)) is played by the so-called reflection equation algebra; a new Frobenius formula will also be exhibited. First let me remind you of the classical form of the Frobenius formula. Consider commuting indeterminates x_1, ..., x_n. To any partition we associate certain symmetric polynomials, that is, polynomials invariant under the action of the symmetric group. We shall deal with two such families: the power sums and the Schur functions. I do not want to go into detail, you know the Schur functions, but the Frobenius formula is as follows: if we express the power sums via the Schur functions, the coefficients are the characters of the symmetric group in the representations M_λ. What is new here is just the cycle type μ of the element on which we evaluate the character. Now let me recall what the power sums are. The power sum with number k is p_k = x_1^k + ... + x_n^k, which is well known; and p_μ is the product p_{μ_1} p_{μ_2} ⋯ of simple power sums. The Schur polynomials can be defined in different ways, for example by means of the Jacobi–Trudi formula; I do not want to go into detail. I present here the explicit formulas in terms of the complete symmetric polynomials and the elementary symmetric polynomials. Now I want to say a couple of words about the Hecke algebra and its representation theory. First I consider the Artin braid group B_n. This group is generated by generators τ_1, ..., τ_{n−1}, subject to the well-known relations: if the distance between the indices is bigger than one, the generators commute; otherwise we have the relation τ_i τ_{i+1} τ_i = τ_{i+1} τ_i τ_{i+1}, called the braid relation. We can embed the braid group B_n into the bigger group B_{n+1}, so it is possible to consider the direct limit of this sequence. If we impose in addition the condition τ_i² = 1, we get a presentation of the symmetric group.
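The classical Frobenius formula above can be checked by hand for k = 2, where s_(2) = h_2, s_(1,1) = e_2, and the S_2-characters are (1, 1) and (1, −1). A minimal symbolic check (my toy code, using sympy):

```python
import sympy as sp

xs = sp.symbols('x1 x2 x3')

def p(k):
    """Power sum p_k in three variables."""
    return sum(x**k for x in xs)

# Schur polynomials for the two partitions of 2
h2 = sum(xs[i] * xs[j] for i in range(3) for j in range(i, 3))      # s_(2)
e2 = sum(xs[i] * xs[j] for i in range(3) for j in range(i + 1, 3))  # s_(1,1)

# Frobenius: p_mu = sum_lambda chi^lambda(mu) s_lambda, here with
# chi^{(2)} = (1, 1) and chi^{(1,1)} = (1, -1) on the classes (2) and (1,1)
print(sp.expand(p(2) - (h2 - e2)))        # 0 : p_(2)   = s_(2) - s_(1,1)
print(sp.expand(p(1)**2 - (h2 + e2)))     # 0 : p_(1,1) = s_(2) + s_(1,1)
```

Both differences expand to zero, confirming the formula for the smallest nontrivial case.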
But if instead we impose the following relation, (τ_i − q)(τ_i + q^{−1}) = 0, this is just the definition of the Hecke algebra, or Iwahori–Hecke algebra. Throughout I suppose that q is generic; in particular it is not equal to ±1. If we consider the case q = 1, we recover the symmetric group, or rather the group algebra of the symmetric group. It is possible to consider other quotients, called Birman–Murakami–Wenzl algebras and so on; all this material can be found in the lectures by Ogievetsky and Pyatov on the arXiv. Now, I repeat, we consider q generic. Let V be a vector space over the field C, and consider an operator R which maps V⊗V to the same space. We call this operator a braiding if it is subject to the following relation on V^{⊗3}: R_{12} R_{23} R_{12} = R_{23} R_{12} R_{23}. You see that this relation looks like the braid relation on the previous page, and for this reason the operator is called a braiding. The simplest examples are the usual flip, which transposes two elements in the usual sense, and the super-flips, which are similar, but where it is necessary to take into consideration the parity of the elements; the dimension of the even component is m, the dimension of the odd component is n, and this explains my notation. Now I want to draw your attention to the fact that the super-transposition is involutive; involutive means that its square is equal to the identity. In the particular case n = 0 we recover the usual flip. Finally I would like to draw your attention to the following fact: we take a generator of the braid group and assign to this generator the operator R placed at positions k and k+1, with the identity elsewhere. So this is a representation of the Artin braid group, but it is a very special representation. You know that there are the so-called Burau representations, but this is not the case here. Here we have the following property; I call this a matrix representation.
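The simplest example above, the usual flip, can be checked numerically: it satisfies the braid relation on V^{⊗3} and is involutive. A minimal sketch with numpy (my toy code):

```python
import numpy as np

n = 3
# the usual flip P on V (x) V, dim V = n:  P(u (x) v) = v (x) u
P = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        P[j * n + i, i * n + j] = 1.0  # e_i (x) e_j  ->  e_j (x) e_i

I = np.eye(n)
R1 = np.kron(P, I)  # tau_1 acts at positions 1,2 of V^{(x)3}
R2 = np.kron(I, P)  # tau_2 acts at positions 2,3

print(np.allclose(R1 @ R2 @ R1, R2 @ R1 @ R2))  # True: braid relation
print(np.allclose(P @ P, np.eye(n * n)))        # True: involutive
```

The same two checks apply verbatim to any candidate braiding R in matrix form.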
If we consider the matrix image of the element τ_2, it is just the image of the element τ_1 conjugated by the usual flip: on one side we put the usual flip, and on the other side we put the usual flip. So finally we have two operators: one operator is R, a Hecke symmetry or, more generally, a braiding, and the other operator is just the usual flip. I present here two examples. I fix a basis in the space V; consequently I have a corresponding basis of the space V⊗V, and any operator can be presented by a matrix. Here is the matrix corresponding to the operator coming from the quantum group U_q(sl(N)), and here is the operator coming from the quantum group U_q(sl(1|1)) — the super case; the minus signs here are just the signs of the super situation. I skip the details; I would only like to say that the quantum Yang–Baxter equation is another form of the braid relation, and if we have a deformation of the flip, it is possible to write down the classical form of the Yang–Baxter equation, but I will not speak about that. So suppose we have V equipped with R, where R is a Hecke symmetry. It is possible to introduce analogs of the symmetric algebra and the skew-symmetric algebra as quotients of the free tensor algebra T(V), and we can introduce the corresponding Poincaré–Hilbert series as well; this is very natural. Finally I would like to say that there exist a lot of examples of solutions of the braid relation, of Hecke symmetries, constructed by myself many years ago, and these series are very interesting and useful as tools to classify such solutions. For example, if we consider the usual flip in dimension n, we have as P_−(t) the polynomial (1+t)^n. For the super-flip of dimension (m|n), or its deformation, we have the corresponding P_−(t). But, as I mentioned already, I constructed a lot of examples — exotic, if you wish — where P_− is as follows: here n is the dimension of the space, but the degree of the polynomial is 2.
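The U_q(sl(2)) example just mentioned can be written down explicitly and checked: in the basis {v1⊗v1, v1⊗v2, v2⊗v1, v2⊗v2} the braiding is the standard Hecke symmetry, satisfying both the braid relation and the Hecke condition (R − q)(R + q^{−1}) = 0. A minimal numeric sketch (my code, with an arbitrary generic q):

```python
import numpy as np

q = 1.3  # any generic value of q
# Hecke symmetry from U_q(sl(2)), basis {11, 12, 21, 22}
R = np.array([
    [q, 0,         0, 0],
    [0, 0,         1, 0],
    [0, 1, q - 1 / q, 0],
    [0, 0,         0, q],
])

I = np.eye(2)
R1, R2 = np.kron(R, I), np.kron(I, R)

print(np.allclose(R1 @ R2 @ R1, R2 @ R1 @ R2))  # True: braid relation
hecke = (R - q * np.eye(4)) @ (R + (1 / q) * np.eye(4))
print(np.allclose(hecke, 0))                    # True: Hecke condition
```

At q = 1 this matrix degenerates into the usual flip, consistent with the classification by Poincaré series above.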
And if P_− here is just a polynomial, not a rational function as in the super situation, I call the symmetry even. Okay, now I want to mention my theorem, proved many years ago, that these Poincaré series satisfy P_−(t) P_+(−t) = 1, with the appropriate signs of course; and a theorem belonging to the Vietnamese mathematician Phung Ho Hai saying that these Poincaré series are always rational functions. Writing P_−(t) as a ratio of coprime polynomials, the couple (r, s) of their degrees, a couple of positive integers, is very important in this talk: this couple is called the bi-rank. The bi-rank is a very important notion; it is an analog of the super-dimension. The super-dimension of a super-space is just the couple (m|n), and here we have (r|s). I want to say that, for example, in the super case we consider V which is the sum of even and odd parts, with super-dimension (m|n). In this situation the usual dimension of the space is m + n, while the quantum dimension is m − n, and the quantum dimension is the quantity which is much more important in physical applications. Okay, now what about the representation theory of the Hecke algebras? The Hecke algebra is isomorphic to the group algebra of the symmetric group for generic q — I repeat, I consider only generic q. Accordingly, it is well known that this algebra is isomorphic to a direct sum (or direct product, as you wish) of matrix algebras, and the dimension of the simple component labelled by λ is d_λ², where d_λ is the number of standard Young tableaux corresponding to the partition λ. Now, if we fix λ from this set, there is a basis consisting of matrix units, because each component is just a usual matrix algebra. Denote in this way the matrix with all entries zero apart from one position, where we place 1, on the intersection of the i-th row and j-th column. For this set of matrix units the multiplication table is evident, as follows; for a diagonal element we put the index i twice. The primitive orthogonal idempotents define a resolution of unity: namely, their sum is the unit of our algebra.
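Going back to the dimension count in the semisimple decomposition: since H_n(q) for generic q is isomorphic to C[S_n], we must have Σ_λ d_λ² = n!. A quick check via the hook length formula (my toy code):

```python
from math import factorial

def partitions(n, max_part=None):
    """All partitions of n as weakly decreasing tuples."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for p in range(min(n, max_part), 0, -1):
        for rest in partitions(n - p, p):
            yield (p,) + rest

def num_SYT(lam):
    """d_lambda = number of standard Young tableaux (hook length formula)."""
    cols = [sum(1 for r in lam if r > j) for j in range(lam[0])] if lam else []
    hp = 1
    for i in range(len(lam)):
        for j in range(lam[i]):
            hp *= lam[i] - j + cols[j] - i - 1
    return factorial(sum(lam)) // hp

n = 5
total = sum(num_SYT(lam) ** 2 for lam in partitions(n))
print(total, factorial(n))  # 120 120: sum of d_lambda^2 = n!
```

For n = 5 the squares 1, 16, 25, 36, 25, 16, 1 indeed sum to 120 = 5!.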
So the unit is just the sum of the diagonal matrix units, which are orthogonal idempotents. And the product E_{ij}^λ E_{kl}^μ is nonzero only if j = k and λ = μ, in which case we get the element E_{il}^λ; otherwise it is zero. Now we consider the representation of the Hecke algebra on itself, in this basis, by left action: we take any element of our algebra and consider its image applied to the matrix units, and we get the matrix corresponding to this element. I want to say that in principle it is possible to construct this set of matrix units in different ways. The way best for me is presented in the paper by Ogievetsky and Pyatov — these are the lectures on the Hecke algebra on the arXiv — and this method of Ogievetsky and Pyatov employs the so-called Jucys–Murphy elements, defined as follows. The classical version of the Jucys–Murphy elements was considered in the paper by Okounkov and Vershik, "A new approach to the representation theory of the symmetric groups". And now I want to introduce the main actor of my talk, which is the reflection equation algebra. We fix R, an involutive or Hecke symmetry; usually I suppose it to be a Hecke symmetry. L is the matrix composed of the generators, and I consider the following equation: R L_1 R L_1 = L_1 R L_1 R, and so on. What is L_1? I forgot to write it down here: L_1 is L multiplied by the identity, L ⊗ Id. If you write L out entry by entry, you get a system of relations. There exists a theorem that this quadratic algebra — all relations are quadratic — has homogeneous components of the classical dimension for generic q. If we consider a similar relation for a matrix L̂, but now with a nonzero linear term, the algebra is called the modified reflection equation algebra. And there is the following change of the generating matrix, essentially a shift by a multiple of the identity, relating the two generating matrices.
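Returning to the Jucys–Murphy elements just mentioned: their key classical property is that they pairwise commute in C[S_n]. A minimal sketch of that check, with the group algebra modeled as a dictionary (my toy code; J_k is the sum of transpositions (i k), i < k):

```python
from collections import defaultdict

def compose(s, t):
    """(s*t)(i) = s(t(i)); permutations as tuples of images of 0..n-1."""
    return tuple(s[t[i]] for i in range(len(s)))

def mult(a, b):
    """Product in the group algebra: dicts {permutation: coefficient}."""
    out = defaultdict(int)
    for s, cs in a.items():
        for t, ct in b.items():
            out[compose(s, t)] += cs * ct
    return dict(out)

def transposition(i, j, n):
    p = list(range(n))
    p[i], p[j] = p[j], p[i]
    return tuple(p)

def jucys_murphy(k, n):
    """Classical JM element J_k = sum of transpositions (i k), i < k."""
    return {transposition(i, k, n): 1 for i in range(k)}

n = 4
J2, J3 = jucys_murphy(2, n), jucys_murphy(3, n)
print(mult(J2, J3) == mult(J3, J2))  # True: JM elements commute
```

This commutativity is what makes the JM elements usable for building the matrix units.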
If we apply this change of generators, we pass from one algebra to the other. So we are dealing with the same algebra, but in two different bases; nevertheless I prefer to use different names for them: reflection equation algebra and modified reflection equation algebra. Okay. But the following happens as R tends to the flip P when q tends to 1: the reflection equation algebra tends to the symmetric algebra — the entries of L become commutative — while the modified algebra tends to the enveloping algebra, in spite of the fact that for q not equal to 1 the two algebras are isomorphic. Okay. Now, what is the main advantage of the reflection equation algebra, modified or not? It is possible to introduce an analog of the trace. There exists a matrix C such that on this algebra there is a very well-defined analog of the trace, defined as follows: it is just the usual trace, but with the matrix C inserted, Tr_R X = Tr(C X). You will see that this trace, which is well defined on the non-modified reflection equation algebra, plays a very important role in constructing power sums. The power sums, you see here, look as in the classical case: p_k = Tr_R L^k; and the power sums in the modified reflection equation algebra are given by a similar formula, but with the matrix L̂ in place of the matrix L. If we consider the classical case — classical means R = P — we have C = Id, and we get just the classical power sums, nothing more. And now a very important notation, even if it is not immediately clear why: I want to introduce matrices with overlined indices. L_{1̄} needs no modification, it is just the classical L_1. But L_{2̄} is obtained from the previous matrix by conjugation: where one usually puts the usual flip, I now put the Hecke symmetry, L_{2̄} = R L_{1̄} R^{−1}, and so on. After that I introduce the following notation: the product L_{1̄} L_{2̄} ⋯ L_{k̄} of the matrices with overlined indices.
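In the classical case R = P, C = Id just described, the power sums p_k = Tr_R L^k reduce to ordinary traces of matrix powers, hence to power sums of the eigenvalues. A minimal numeric sketch of that degenerate case (my code):

```python
import numpy as np

# classical case: R = P, C = Id, so p_k = tr(L^k)
rng = np.random.default_rng(0)
L = rng.standard_normal((4, 4))
eig = np.linalg.eigvals(L)

for k in range(1, 5):
    p_k = np.trace(np.linalg.matrix_power(L, k))
    # p_k equals the k-th power sum of the eigenvalues of L
    print(k, np.allclose(p_k, np.sum(eig ** k)))  # all True
```

The q-deformed trace Tr_R replaces the identity by the matrix C, but the structure of the formula is exactly the same.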
The product L_{1̄} L_{2̄} ⋯ L_{k̄} will play a very important role in what follows. This result can be found, for example, in a paper by Pyatov, but it can be found in other papers as well. The result is the following one: if we take any element z of the Hecke algebra, consider its image ρ(z) under our representation of the Hecke algebra, multiply by the product indicated on the previous page, and apply the R-trace over all positions, we get a central element of the reflection equation algebra. So we have a map which maps the Hecke algebra into the reflection equation algebra, in fact into its center; this map is called the characteristic map. Okay. Some elements obtained this way are very interesting, central and very special. If as z I take the product τ_1 τ_2 ⋯ τ_{k−1} — this product is sometimes called the Coxeter element of the Hecke algebra — I get just another form of the power sums. So the power sums can be presented, as indicated above, by the R-trace of a power of L, and another presentation is via the Coxeter element. Now, what are the analogs of the Schur functions in our algebra? They are as follows: in place of z we now put the diagonal elements indicated above, which are orthogonal idempotents. This is just the definition of the analogs of the Schur polynomials. I would like to say that the index i enters this definition, but it is possible to show that the resulting s_λ(L) does not depend on i. This is not so easy to see, because the idempotents are not equivalent to each other. Okay. Now, what is the q-analog of the Frobenius formula? The formula is as follows: here we put the q-analogs of the Schur functions, here the new q-analogs of the power sums, and here the character of the Hecke algebra in the representation labelled by λ, evaluated on an element whose cycle type is μ. But one can ask: what is the cycle type in this situation?
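Before answering, let me collect the formulas of this section; this is my reconstruction of the slides' notation, under the conventions introduced above:

```latex
% characteristic map: H_k(q) -> center of the reflection equation algebra
\operatorname{ch}(z) \;=\; \operatorname{Tr}_{R(1,\dots,k)}
  \bigl( \rho(z)\, L_{\bar 1} L_{\bar 2} \cdots L_{\bar k} \bigr),
  \qquad z \in H_k(q),
\qquad L_{\bar 1} = L_1, \quad L_{\overline{i+1}} = R_i\, L_{\bar i}\, R_i^{-1}.

% q-power sums and q-Schur functions
p_k(L) \;=\; \operatorname{ch}\bigl(\tau_1 \tau_2 \cdots \tau_{k-1}\bigr),
\qquad
s_\lambda(L) \;=\; \operatorname{ch}\bigl( E^{\lambda}_{ii} \bigr)
  \quad \text{(independent of } i\text{)}.
```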
In the symmetric group it is well known, but here it is not. So I define the cycle type only for the Coxeter element and for elements close to it. Consider first the Coxeter element and remove some factors from it, so that some factors are eliminated; between the remaining strings of consecutive generators we then have blank spots. I call such a remaining element a Coxeter element with gaps. And what is μ? To each blank spot we assign 1, and to each string we assign the length of the string plus one. So I would only like to say that it is possible to introduce the notion of cycle type for very special elements: the Coxeter elements and the elements obtained from them by cancelling some factors. If we do so, we arrive at the formula on the previous page. I would like to mention again the paper by Arun Ram. As I said above, Arun Ram introduced his formula in the terms of quantum groups. But I would like to say that for quantum groups it is not very clear what the symmetric functions are, whereas in the reflection equation algebra the symmetric functions are defined very naturally. Moreover, it is not possible to generalize Ram's construction to other situations, to other Hecke symmetries or other Hopf algebras: only the quantum groups, the Drinfeld–Jimbo quantum groups, are covered by the construction of Arun Ram. Now I would like to say some words about other properties of the reflection equation algebra. On this algebra there exist analogs of the Newton identities. You see, these Newton identities look like the classical ones; the classical Newton identities are just the case corresponding to q = 1. We have here a similar relation, but where in the classical situation the coefficient is k, in our situation it is the q-analog of k. There is also a Cayley–Hamilton identity, which is as follows.
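For reference, the classical Newton identities at q = 1 — p_k − e_1 p_{k−1} + e_2 p_{k−2} − … + (−1)^k k e_k = 0, where the q-version replaces the coefficient k by its q-analog — can be verified symbolically. A minimal sketch (my toy code, using sympy):

```python
import itertools
import functools
import operator
import sympy as sp

xs = sp.symbols('x1:5')  # four variables x1..x4

def e(k):
    """Elementary symmetric polynomial e_k."""
    if k == 0:
        return sp.Integer(1)
    return sum(functools.reduce(operator.mul, c)
               for c in itertools.combinations(xs, k))

def p(k):
    """Power sum p_k."""
    return sum(x**k for x in xs)

# Newton identity: p_k - e_1 p_{k-1} + ... + (-1)^k k e_k = 0
for k in range(1, 5):
    lhs = (p(k)
           + sum((-1)**i * e(i) * p(k - i) for i in range(1, k))
           + (-1)**k * k * e(k))
    print(k, sp.expand(lhs))  # 1 0, 2 0, 3 0, 4 0
```

In the reflection equation algebra the same identity holds with p_k and e_k replaced by their q-analogs and k by k_q.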
The generating matrix of the reflection equation algebra is subject to the following relation; m here is just this number, the first component of the bi-rank, which here is (m|0). In this situation we have an analog of the usual Cayley–Hamilton identity, and the coefficients here are analogs of the elementary symmetric polynomials introduced above — just the particular case of the Schur polynomials corresponding to one-column diagrams. Now I introduce indeterminates μ_i, in a very natural way: we consider the previous Cayley–Hamilton identity, and the sum of the μ_i is one coefficient, the products two by two give the second coefficient, and so on. These quantities are called the eigenvalues of the matrix. I would like to draw your attention to the fact that this matrix has non-commutative entries, so in this situation it is not evident how the notion of eigenvalues can be introduced; nevertheless it is possible to do, and finally we get the following algebra, where of course we assume these quantities to be central in this extension of the algebra. But all this is valid only if we deal with an even symmetry — even, I repeat, means that here we have zero in the second component of the bi-rank. If the symmetry is not even, we have the general situation (m|n), and in this situation it is also possible to write down an analog of the Cayley–Hamilton identity, but we end up with two families of eigenvalues. I do not want to go into detail, but we have two families: one family consists of the even eigenvalues, the second of the odd eigenvalues. And then we have to define symmetric polynomials in the two families of indeterminates μ and ν. For the power sums we have the following formula. It is not so easy to understand this formula, but I would like to say that if we put q = 1 in these formulas, we get the well-known classical super formula: the power sum of a super-matrix is just the sum of the even eigenvalues in power k minus the sum of the odd eigenvalues in power k.
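In the commutative limit, the Cayley–Hamilton identity with coefficients given by the elementary symmetric polynomials in the eigenvalues is the familiar classical statement, which is easy to test numerically. A minimal sketch (my code):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
m = 3
L = rng.standard_normal((m, m))
mu = np.linalg.eigvals(L)  # the eigenvalues mu_i

def e(k):
    """Elementary symmetric polynomial e_k(mu)."""
    return sum(np.prod(c) for c in combinations(mu, k)) if k else 1.0

# Cayley-Hamilton: L^m - e_1 L^{m-1} + e_2 L^{m-2} - ... + (-1)^m e_m = 0
CH = sum((-1)**k * e(k) * np.linalg.matrix_power(L, m - k)
         for k in range(m + 1))
print(np.allclose(CH, 0))  # True
```

In the q-situation the matrix entries no longer commute, but the identity survives with the q-elementary-symmetric coefficients, and the μ_i are *defined* so as to reproduce those coefficients.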
I would only like to say that in the even situation what appears here are the Hall–Littlewood polynomials, up to a numerical factor and up to the following identification: in all the books on Hall–Littlewood polynomials the parameter t enters, and in our situation it is just q^{−2}. And I would like to ask you: maybe you have seen this formula in the super situation? Because the Hall–Littlewood polynomials in the even situation are well known from many books, but in the super situation I know nothing about that. Okay, now I would like to say that the polynomials I presented on the previous page are supersymmetric. What does "supersymmetric polynomials" mean? By definition, supersymmetric polynomials are symmetric in the first family, symmetric in the second family, and if we put u_i = ν_j = s, the result does not depend on s. Polynomials with this property are by definition called supersymmetric, and this is just our case. Now, I do not want to present the next construction in detail. I would only like to say that on the reflection equation algebra it is possible to introduce the notion of quantum partial derivatives in the generators, as follows. How to extend these partial derivatives to higher polynomials is not evident, but it is possible to do, and the theorem is as follows: if we consider the generating matrix of our reflection equation algebra and compose another matrix from these partial derivatives, then the product of the two matrices is a generating matrix of the modified reflection equation algebra. It is a very surprising fact, and it looks like the classical situation: the matrix composed of products of commuting coordinates and the usual partial derivatives is a generating matrix of the corresponding classical algebra.
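For comparison, the classical fact this theorem deforms can be written as follows (a standard statement, in my notation, not taken from the slides): the polarization operators built from commuting coordinates and partial derivatives satisfy the gl(n) commutation relations, i.e. they generate U(gl(n)), and they obey the Capelli identity.

```latex
% classical counterpart: with E_{ij} = \sum_a x_{ia}\,\partial_{ja} one has
[E_{ij},\, E_{kl}] \;=\; \delta_{jk}\, E_{il} \;-\; \delta_{il}\, E_{kj},
% so E = x\,\partial^{\,t} is a generating matrix of U(gl(n)).

% Capelli identity (cdet = column determinant):
\operatorname{cdet}\bigl( E + \operatorname{diag}(n-1,\, n-2,\, \dots,\, 1,\, 0) \bigr)
  \;=\; \det(x)\,\det(\partial).
```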
So you see that in our situation we have a generalization of well-known classical formulas, for example the Capelli identity of the classical case. What does it mean? It means that if we consider the matrix L̂, which is the product of the two matrices, we have the following formula: a determinant of L times a determinant of D equals a special determinant, a ρ-determinant, of L̂ plus a shift. The ρ-determinant is a determinant twisted by ρ — it does not matter finally, but it is possible to define it — and it is necessary to consider L̂ plus a shift matrix. Finally we have this formula, which is called the Capelli identity; in our q-situation it is an analog of the Capelli identity, well defined on the reflection equation algebra, and we call it the q-Capelli identity. I would only like to say that another form of the q-Capelli identity was discovered by three Japanese mathematicians, Noumi, Umeda, and Wakayama, but of course in their construction just the quantum group appears, so, I repeat, their construction applies only to the one case related to the quantum group, whereas our construction can be applied for any R. To finish, I would like to say the following. You see, I introduced q-analogs of these algebras — the non-modified and modified reflection equation algebras — and the elements which are analogs of the power sums. After that it is possible to define analogs of the Casimir operators, which in the classical situation I define here, and the following problem arises: what are the eigenvalues of these Casimir operators belonging to this algebra?
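For orientation, here is the classical answer in the simplest case: the quadratic Casimir of gl(n) acts in an irreducible representation λ by the scalar Σ_i λ_i(λ_i + n + 1 − 2i), the Perelomov–Popov formula for k = 2. A minimal numeric check in the defining representation (standard facts, my toy code):

```python
import numpy as np

n = 3

def e(i, j):
    """Matrix unit e_ij in the defining representation of gl(n)."""
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# quadratic Casimir C_2 = sum_{ij} E_ij E_ji in the defining rep
C2 = sum(e(i, j) @ e(j, i) for i in range(n) for j in range(n))

# Perelomov-Popov eigenvalue on lambda: sum_i lam_i (lam_i + n + 1 - 2i)
lam = [1] + [0] * (n - 1)  # defining representation
val = sum(l * (l + n + 1 - 2 * (i + 1)) for i, l in enumerate(lam))

print(np.allclose(C2, val * np.eye(n)), val)  # True 3
```

In the q-deformed setting the analogous scalars become deformations of these values, which is exactly the computation described next.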
These eigenvalues were computed by Perelomov and Popov in an old, very well-known paper. In our situation it is possible to define an analog of the Casimir operators, but living in the q-analog of this algebra, that is, in the modified reflection equation algebra. If we do so, it is possible to see that the deformed Casimir operator, acting in an irreducible representation, has eigenvalues which are deformations of the eigenvalues computed by Perelomov and Popov. But we succeeded in computing the eigenvalues in our q-deformed situation only in the lowest cases, for the two lowest Casimir operators, because the calculation is very difficult. The last thing to say: I would like to draw your attention to the fact that since the matrix L̂ is just the product of L and D, it is possible to put normal-ordering points here. What does that mean? It means normal ordering, and normal ordering is a very important thing in mathematical physics, as you know, essentially when we have infinite-dimensional spaces. If we present L̂ in this form, with the points, the resulting operators are what in the classical situation are called cut-and-join operators; so our operators are q-analogs of the cut-and-join operators, up to a normalization — in the classical situation one usually puts a normalizing factor here. I would like to finish by saying that the cut-and-join operators are very important operators in Hurwitz theory, and we are trying to understand what the analog of Hurwitz theory is in the q-situation. For the moment it is not clear, but our main objective is to develop, in this q-situation, some analog of matrix models. That's all.

[Question] What does normal ordering mean exactly in your setting? Not necessarily exactly, but what does it mean in your setting?

[Answer] Just a moment. In the classical situation, if you have, for example, an operator composed of the matrix L and the matrix D as follows, and another similar operator, you can consider the product
of the operators in the usual sense; but in the normal ordering, if you wish, you put all the D's on one side. Okay, if you have the super situation, it is not possible to transpose them simply; it is necessary to take the parity into consideration. In our situation as well: we know what the transposition of the derivatives and the generators is, and this transposition has the form D L − L D = (something) · δ, something like that. It looks like the Heisenberg algebra relations, but if we cancel this extra term, we get the normal ordering.
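To illustrate this answer in the simplest classical case: moving a derivative past a coordinate produces exactly the correction term of the rank-one Heisenberg relation [d/dx, x] = 1, and dropping that term is what normal ordering means. A minimal symbolic sketch (my toy example, using sympy):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')(x)

# Heisenberg relation [d/dx, x] = 1, tested on an arbitrary f(x):
# d/dx (x f) = x df/dx + f, so reordering D past L costs the term f
lhs = sp.diff(x * f, x)
rhs = x * sp.diff(f, x) + f
print(sp.simplify(lhs - rhs))  # 0
```

Normal ordering `: L D :` keeps only the `x df/dx` part; in the q- and super-situations the correction term is deformed (and picks up parity signs), but the mechanism is the same.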