 Thank you for the invitation. I'm very honored to be part of this event. And like Martin, I was mostly interested in SPDEs. It turns out that if you want to deal with the generalized KPZ equation that Martin presented in the previous talks, you need to use algebra, and you have to use this Connes-Kreimer type of Hopf algebra. In this talk, I will try to go through the two renormalizations which happen in SPDEs: the one which is used for recentering monomials, and the other one which is for renormalizing them, which is more or less equivalent to the BPHZ renormalization. And I want to interpret these renormalizations in the framework of Bogoliubov-type recursions, in the sense that I want to give a precise meaning to the claim that we obtain, in the context of SPDEs, some algebraic Birkhoff factorizations. This is joint work with Kurusch Ebrahimi-Fard. Before going to the algebraic setup, I would like to present very briefly two applications, two fields, where you can actually use this tool. First, the one with singular SPDEs. So you start with an equation; you have seen an example with the generalized KPZ equation. I have partial_t u minus the Laplacian of u equal to a nonlinearity depending on u and its derivatives, but also on xi, which is a space-time noise. The whole idea of regularity structures, for solving these equations, is to give a local description of your solutions. Your solution will be locally described by u(y) equal u(x) plus a sum: here you have a sum of Taylor-type expansions, where you sum over some combinatorics, the decorated trees, as Martin presented in the previous lecture. Every tree here you interpret as a monomial in y minus x, where you use some map Pi_x, and you have some coefficient in front of it. The coefficients come from the Picard iteration of the equation above: you produce a perturbative expansion and you can collect the coefficients, depending on the nonlinearity F.
This is a Taylor-type expansion, so I truncate, and I have a remainder, which will be small when y is close to x. What is very important here is that I have a map Pi_x, with a base point x, and this map sends decorated trees, these combinatorial objects, to actual recentered iterated integrals: stochastic integrals recentered around the point x. This is like a character on my decorated trees, your Feynman rule if you want, and this is the map that we will be interested in constructing. So Pi_x is a recentering map, and if you want to have the so-called model that is used in regularity structures, you need to add another map, which depends on two points, x and y, and this map allows you to go from an expansion around the point x to an expansion around the point y. The pair, I mean Pi and Gamma, constitutes what we call a model in this context of singular SPDEs. It happens that in the context of SPDEs, Gamma is strongly determined by the Pi_x: the recentering maps actually determine your Gamma. This was the first, I would say, renormalization, or rather recentering, operation. Then you have a second operation, which is more familiar to you. Actually, these Pi_x, which are stochastic iterated integrals, can be ill-defined, because you have some products which are ill-defined. Of course, these products are there, in the right-hand side of the equation, but the noise is a distribution, so you have problems defining these products. So this term will not converge, and you need to construct it with a suitable renormalization. For that, you act with some renormalization maps, which will be linear maps on your set of decorated trees. And we do it in a way that gives a simple action on this Pi_x: we just apply first the renormalization M, and then we apply this Pi_x, this recentering map.
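To make the recentering concrete on the simplest part of the structure, here is a toy sketch, with purely illustrative names that are mine and not the talk's actual construction, of what such a character does on a polynomial symbol X^k: it sends it to the monomial recentered around the base point x.

```python
# Toy sketch (illustrative names, not the talk's actual construction):
# the recentering character pi_x sends the symbol X^k to the function
# y -> (y - x)^k, which vanishes at the base point x.

def pi_x(k, x):
    """Recentering character on the monomial symbol X^k, base point x."""
    return lambda y: (y - x) ** k

f = pi_x(3, 2.0)        # the symbol X^3 recentered around x = 2
assert f(2.0) == 0.0    # vanishes at the base point
assert f(3.0) == 1.0
```

On the stochastic part of the trees the character produces recentered iterated integrals instead, but the polynomial case already shows the role of the base point.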
Obtaining such a simple expression really comes from the algebraic perspective: when you have the correct co-interactions between the Hopf algebra which gives you the recentering map and the one which renormalizes, these nice co-interactions allow you to write this simple formula. So there are two renormalizations at play in these PDEs, and I would like to show you how you can see them using these algebraic Birkhoff factorizations. Now I move to another application, which is more recent, and this is numerical analysis. Imagine that you have some dispersive PDE of that form. Here we have some differential operator, and here we have P, which is a polynomial nonlinearity. Now the issue doesn't come from the fact that you have a singular noise, but from the fact that you have rough initial data, this v. And the idea is that you want to derive an efficient scheme for this dispersive equation at low regularity for your initial data. It turns out that for the numerical schemes I developed with my coauthors, you can write an approximation of the coefficients of your solution, and it will be given by a sum over trees, like for our SPDEs. You still have the same type of coefficients: this coefficient depends on the initial value, and you have some iterated integrals, but here for a numerical scheme. So it will not be the exact iterated integral that you would construct by doing the Picard iterations, but an approximation of this iterated integral. And here we are taking a sum over decorated trees, but because we are doing this in Fourier space, we have to incorporate the Fourier coefficients inside the trees; that's why this is a different set of decorated trees. As before, we have a character, which maps from the decorated trees and gives you an approximation of iterated integrals. So these are our main characters.
It turns out also that in that context, from that character, you can build another character using some type of algebraic Birkhoff factorization, and this allows you to perform the local error analysis, in the sense that you want to know locally whether your scheme approximates your solution well. So these are the two applications, and now I would like to move to the algebraic framework. We switch a bit from the applications and come to the algebraic setup. Okay, so I start by introducing what I mean by a decorated tree. We have first the set of non-planar rooted trees. Then we have a finite set of labels, and the set of decorations will be this finite set times N^{d+1}. So what is the interpretation of these decorations? This finite set will parameterize the differential operators: if you have a system of equations, the labels will be associated to some propagators, if you want to encode the propagators. And the N^{d+1} here encodes the derivatives that you can put on your propagators; you want to keep track of the propagator and the derivatives separately. So now we consider decorated trees on this set of decorations, and they will be of the following form. I will have a node decoration, which is a map taking values in N^{d+1}, meaning that I am allowed to have some polynomials, let's say monomials, inside my iterated integrals, like x to some power. And I have an edge decoration that will encode the propagators and their derivatives; it is a map from the edges of the tree into the set of decorations. Okay. And I take the linear span of these decorated trees and I call it H. Now I need to provide a product. The product I will use, when I am talking about characters, will be the tree product between decorated trees.
So at the level of the non-planar trees, the product just joins the roots of the two trees, and then you take the sum of the decorations at the root. Just below you see an example of the multiplication. Here I have one decorated tree; I represent the node decorations on the nodes, and the edge decoration here is this capital D. When I take the product, what happens is that you glue the two roots together, and on the way you add the root decorations: you have l, and you add it to m. So this will be our product, the tree product that we use. So these are the decorated trees. Now there is a symbolic notation, which was actually introduced by Martin in his foundational paper on regularity structures: when you have these decorated trees, you want to represent them with symbols. You have a grafting operation, this calligraphic I with subscript a here: you take a decorated tree and you graft it, with an edge decorated by a, onto a new root. This is similar to the B_+ operator that we use for the Connes-Kreimer Hopf algebra. So you have this grafting of trees. And if I have just a dot decorated by k, I identify it with a monomial that I will denote X^k. And now, if I want to represent that tree using these symbols, I get these expressions. Let me go through it slowly. At the root you have m, so this gives you this X^m. Then if I follow the first branch, I have a calligraphic I_a, the decoration on the edge, and on top I have this n, so this is X^n. Then I go on with the other branch: it will be a calligraphic I_b, X^p, and so on. Of course, here the product is commutative, so I could have written it in different ways. So we have decorated trees.
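As a small illustration of the tree product just described, joining the roots and summing the root decorations can be coded as follows; this is a hedged sketch, and the data layout and names are mine, not from the talk.

```python
# Minimal sketch of decorated rooted trees and the tree product:
# the product joins the two roots and sums the root decorations.
# All names and the data layout are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Tree:
    node: tuple           # node decoration at the root, in N^{d+1}
    branches: tuple = ()  # planted subtrees, each (edge_decoration, Tree)

def tree_product(s, t):
    """Join the roots of s and t, summing the root decorations."""
    root = tuple(a + b for a, b in zip(s.node, t.node))
    return Tree(root, s.branches + t.branches)

# X^l at the root of s, X^m at the root of t: the product carries X^{l+m}
s = Tree((2, 0), ((("a", (1, 0)), Tree((1, 0))),))
t = Tree((3, 0))
st = tree_product(s, t)
assert st.node == (5, 0)
assert len(st.branches) == 1
```

The branches are simply concatenated, which matches the picture of gluing the two roots together.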
We have a nice B_+ operator, using this symbolic notation, and now I can give you the expression of a Connes-Kreimer-type coproduct, which will appear on the next slide. But first, before that, I need to say that we are considering trees with some truncation, in the sense that we will not consider all the trees: we may just want to single out the trees which give you ill-defined iterated integrals, so you want a way to truncate. So you define the degree of a tree. This is more related to the analytical aspects of the SPDEs. You have one assignment, which is defined on the propagators: it basically tells you what type of regularity you earn by convolving with, for instance, the heat kernel or other kernels. Then you have a second map, on the polynomials: it depends which weight you put on the polynomials, or which function space you want to use. Often in SPDEs we use the parabolic scaling (2, 1, ..., 1), in the sense that time counts double in comparison to the spatial components. And then for the degree of a tree, you take the sum over all the node decorations, the polynomial part; these actually increase the degree, because they add regularity. And then you add the sum over all the propagators: here I have what I earn by convolving with the kernel, minus the derivatives that sit on this propagator, which I need to subtract. And then, because we will need it for the Birkhoff factorization, there is also the space of positive decorated trees, which are of that form; I put it in brackets because you will see it in a minute. So these are trees of that form, and in fact I am asking that all the branches leaving the root are of positive degree.
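The degree just described can be sketched as follows; the scaling, the kernel degrees and all the names here are illustrative assumptions of mine, with the heat kernel taken to improve regularity by 2 under the parabolic scaling.

```python
# Hedged sketch of the degree of a decorated tree under the parabolic
# scaling s = (2, 1): node decorations add regularity, and each edge
# earns the degree of its kernel minus the scaled derivative count.
SCALING = (2, 1)              # time counts double versus space
KERNEL_DEGREE = {"I": 2.0}    # e.g. the heat kernel improves regularity by 2

def scaled_size(k, scaling=SCALING):
    return sum(ki * si for ki, si in zip(k, scaling))

def degree(nodes, edges):
    """nodes: list of multi-indices; edges: list of (label, derivative)."""
    return (sum(scaled_size(n) for n in nodes)
            + sum(KERNEL_DEGREE[lab] - scaled_size(d) for lab, d in edges))

# a tree with one node decoration X^{(0,1)} and one edge I carrying one
# spatial derivative: degree = 1 + (2 - 1) = 2
assert degree([(0, 1)], [("I", (0, 1))]) == 2.0
```

The noise would enter as another label with a negative degree assignment, which is what produces the trees of negative degree that need renormalization.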
This is useful in the sense that I can associate to this space a nice projection, this pi_+, and this pi_+ projection will be multiplicative for the tree product I presented before. So this is the space that is useful for doing, for instance, the recentering. And now I come to the main map, the deformed coproduct, which is given above. It looks very similar to a Connes-Kreimer coproduct in a way, in the sense that on this calligraphic I_a it gives you the same type of expression that you would obtain with the B_+ operator. But there are some subtleties, actually, which make its study slightly more complicated, in the sense that here I play with my polynomials: these monomials are the primitive elements here, and I play with the decorations. Here I have a sum over l in N^{d+1}, and this sum has to be understood as performing, at the level of the algebra, a Taylor expansion. So I am taking derivatives on my branch, on my edge decoration, and then I have X^l divided by l factorial, which is the classical monomial that you see in Taylor expansions. In fact, and this was not done in the original paper on regularity structures, but we did it later with Martin Hairer and Lorenzo Zambotti, you can consider this coproduct with an infinite sum; you can keep the infinite sum. But if you want to give a meaning to this infinite sum, you use a bigrading. What do you need to measure in this bigrading? Of course, you will have the size of the tree; this will be one component of the bigrading. But you also have to take into account, for instance, the edge decorations, because when you do that sum, you increase the edge decoration by putting these derivatives. So you can actually give a meaning to this coproduct, with this infinite sum, using the bigrading. Okay. So now I am saying that this coproduct is deformed.
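On the polynomial part, the Taylor-expansion sum inside the coproduct reduces to the binomial identity; here is a hedged numerical check of that analytic counterpart, namely that Delta X^k = sum over l of C(k,l) X^l tensor X^{k-l} corresponds to y^k = sum over l of C(k,l) (y - x)^l x^{k-l}. The function name is mine.

```python
# Hedged illustration: the coaction on the polynomial part is binomial,
# and applying the recentering on the left leg just reproduces y^k.
from math import comb

def recentered(k, x, y):
    """Sum over l of C(k,l) (y - x)^l x^(k-l), i.e. the coaction applied."""
    return sum(comb(k, l) * (y - x) ** l * x ** (k - l) for l in range(k + 1))

x, y = 1.5, 4.0
for k in range(6):
    assert abs(recentered(k, x, y) - y ** k) < 1e-9
```

On planted trees the sum is no longer exact and has to be truncated according to the degree, which is precisely the deformation being discussed.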
This is a recent work with Dominique Manchon: you can really identify where the deformation happens, the deformation being the fact that you add all these extra terms, with this X^l tensored with the derivatives on your planted tree here. And this is quite involved, actually. It is not straightforward: you need to use another product, which is not exactly the grafting of one tree onto the other but the plugging of one tree into another, and then you can deform this product using these Taylor expansions. You apply an algebraic procedure, the Guin-Oudom procedure, which gives you an associative product, and the adjoint of this associative product will actually be this coproduct. So it is the same procedure where you start with a pre-Lie product, construct an associative product, and then get your coproduct. This explains how you can see this coproduct as a deformation of the original Connes-Kreimer-type coproduct. Okay, so this was just to comment on this map, mainly on this infinite sum. And what happens when you look at the applications, in SPDEs or numerical analysis, is that you will not use this infinite sum: you will truncate, because there are constraints due to your applications. The first truncation, for the coaction, is basically that you truncate here: I put a projection here with my pi_+. In that way, I don't allow this tree to be of negative degree; by putting these derivatives my degree is going down, so at some point I have to stop, and the sum will be finite. So the truncation is determined by the degree. And if I want a coproduct, because this would otherwise just be a coaction, I need to project also on the trunk: on the trunk I also put this projection pi_+. And this gives you a coproduct.
I will not go into the details, but if you think about the application I gave in numerical analysis, with these expansions over trees, the projections will be different: instead of pi_+, in numerical analysis you are interested in an approximation up to a given order, so you will truncate by removing, for instance, all the trees which are too big, because they will be of order r. That is a different projection. This is just to tell you that depending on the context, you can actually play with this coproduct and adapt the projections to fit your constraints, the constraints that you have for these singular PDEs or for numerical analysis schemes. This object is very central and you can tailor it to your own applications. Okay, so now I will present the main results in SPDEs regarding these coproducts, in the sense that you can construct an antipode for these coproducts. The map presented here, this one, when you project on the right, gives you a right coaction, and this H_+ of trees of positive degree carries a Hopf algebra structure. And it turns out that one of the most important maps in SPDEs, the one which is actually used for recentering your objects, is a map called the twisted antipode. You start in H_+, but then you exit and you go into H. The definition of this twisted antipode also looks quite similar to what would be the definition of an antipode, but then you see that you have a sum, the same type of sum that shows up at the level of the coproduct, and you have some truncation according to the degree, and you apply this on what would normally be the formula for your antipode. And the idea, and this is what we claimed at the time with Martin and Lorenzo, is that this twisted antipode gives you an equivalent of the algebraic Birkhoff factorization. In the next slides I would like to make this statement more precise and really try to match up the two languages.
I mean the one you use with this twisted antipode, and the one you would use if you try to develop a classical algebraic Birkhoff factorization, or Bogoliubov recursion. So here I go through what I will consider as my algebraic Birkhoff factorization, and we will see how one can apply it in the context of this twisted antipode. Let me set up the framework. I start with a connected graded Hopf algebra H with a coproduct Delta. I have a right comodule structure on H-hat given by the coaction of H. I will consider characters over H and also over H-hat, and they will take values in some commutative algebra A. I will assume that I have a Rota-Baxter map Q from A into A, and this Rota-Baxter map needs to satisfy this identity. This is the identity for a Rota-Baxter map, and this map will induce a splitting of A into two subalgebras: one will be A_-, when we apply Q, and the other one will be A_+, when I apply the identity minus Q. This is like, if you have Laurent series in epsilon: you just keep or remove the pole part, and that corresponds to this type of projection with your Rota-Baxter map. So these are the different objects at play, and now I am able to state what I mean by a Bogoliubov-type recursion.
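The Laurent-series example just mentioned can be checked concretely. This is a hedged sketch where series are truncated to Laurent polynomials stored as dictionaries, and Q, minimal subtraction, keeps the pole part; one then verifies the Rota-Baxter identity Q(a)Q(b) = Q(Q(a)b + aQ(b) - ab) on an example.

```python
# Hedged sketch of a Rota-Baxter map of weight -1: minimal subtraction
# on Laurent polynomials in eps, stored as {power: coefficient} dicts.

def mul(a, b):
    out = {}
    for p, x in a.items():
        for q, y in b.items():
            out[p + q] = out.get(p + q, 0) + x * y
    return {p: c for p, c in out.items() if c != 0}

def add(a, b, sign=1):
    out = dict(a)
    for p, c in b.items():
        out[p] = out.get(p, 0) + sign * c
    return {p: c for p, c in out.items() if c != 0}

def Q(a):
    """Projection onto the pole part (strictly negative powers of eps)."""
    return {p: c for p, c in a.items() if p < 0}

a = {-1: 2, 0: 1, 1: 3}
b = {-2: 1, 0: 4}
lhs = mul(Q(a), Q(b))
rhs = Q(add(add(mul(Q(a), b), mul(a, Q(b))), mul(a, b), sign=-1))
assert lhs == rhs   # the Rota-Baxter identity holds
```

The induced splitting is then A_- = Q(A), the pole parts, and A_+ = (id - Q)(A), the parts regular at eps = 0.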
So here we pick a character phi from H-hat into A, and then there will be an algebra morphism phi_minus from H into A_-, which is basically what one calls the counterterm of the recursion, and phi_plus will be the renormalized character from H-hat into A, such that for every element of H; and here, I think I forgot to say, there is an injection from H into H-hat. So if I pick an element tau of H, I apply some preparation map called phi-bar, and then applying Q, up to a sign, I get my phi_minus. My phi-bar is constructed with some recursion, a recursion where I use Sweedler notation for the reduced coaction. This reduced coaction, which is defined here, just removes the primitive part of my coaction. So I do the recursion with phi, and with it I define this phi_minus, and if I want to get the renormalized character, I just take the convolution between phi_minus and phi, where for the convolution I am using the coaction. Basically, why do I call it a Bogoliubov-type recursion? Because here I am using a coaction; if I replace my coaction by a coproduct, I get an algebraic Birkhoff factorization. Okay, then it depends what your definition of an algebraic Birkhoff factorization is. So here the recursion is built on this reduced coaction, and what is so important is to add this Q map, which actually does the projection. So this is what I am calling a Bogoliubov-type recursion, and it turns out that it fits well at least the two applications I presented. For instance, when I was talking about the renormalization of the model, meaning that you want to renormalize your stochastic processes, this can be done; okay, you need to extend things a bit, but you can think of Q as the expectation. And in numerical analysis, Q will project according to the frequency: for instance, if you have zero frequencies, then you want to remove them, so this Q will project according to that.
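Here is a hedged toy implementation of such a Bogoliubov-type recursion on the ladder Hopf algebra, where Delta(l_n) = sum over k of l_k tensor l_{n-k}. The regularized character phi(l_n) = 1/(n eps) + 1, valued in Laurent polynomials in eps, and the choice of Q as minimal subtraction are my illustrative assumptions, not the SPDE characters of the talk.

```python
# Hedged sketch of the Bogoliubov recursion on the ladder Hopf algebra.
# phi_bar is the preparation map (reduced coproduct terms added to phi),
# phi_minus = -Q(phi_bar) is the counterterm character, and
# phi_plus = (id - Q)(phi_bar) is the renormalized character.

def mul(a, b):
    out = {}
    for p, x in a.items():
        for q, y in b.items():
            out[p + q] = out.get(p + q, 0) + x * y
    return out

def add(*terms):
    out = {}
    for t in terms:
        for p, c in t.items():
            out[p] = out.get(p, 0) + c
    return {p: c for p, c in out.items() if abs(c) > 1e-12}

def Q(a):                      # minimal subtraction: keep the pole part
    return {p: c for p, c in a.items() if p < 0}

def phi(n):                    # toy regularized character: 1/(n eps) + 1
    return {-1: 1.0 / n, 0: 1.0}

def phi_bar(n):                # Bogoliubov's preparation map
    crossed = [mul(phi_minus(k), phi(n - k)) for k in range(1, n)]
    return add(phi(n), *crossed)

def phi_minus(n):
    return {p: -c for p, c in Q(phi_bar(n)).items()}

def phi_plus(n):               # renormalized character: no pole left
    return add(phi_bar(n), phi_minus(n))

assert phi_plus(1) == {0: 1.0}
for n in (1, 2, 3):
    assert Q(phi_plus(n)) == {}   # the counterterms removed every pole
```

The recursion terminates because the reduced coproduct only produces ladders of strictly smaller length, which is the role the connectedness assumption plays.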
In that context it works well, because for the renormalization of the model it is an extraction-contraction Hopf algebra which is connected, and also in numerical analysis the Hopf algebra is connected, so you can basically apply, almost directly, the Bogoliubov recursion that I presented before. Okay, now turn to the Hopf algebra that I presented, the one which is H_+, of trees with positive degree. Unfortunately, it is not connected, because I have these X^k, I mean all these monomials, and one needs to find what the reduced coaction should be in that case, because it is not only subtracting the primitive part. One remark, which I will not expand further, is that when you have these co-interactions between the two renormalizations, the one for recentering and the one for renormalization, you see that they correspond to two twisted antipodes, so you have two Bogoliubov recursions, and in a sense you have a co-interaction between the two Bogoliubov recursions: you can start by applying one Bogoliubov recursion, produce a character, and then from this character you can actually apply the second Bogoliubov recursion, and the co-interaction tells you that you can switch the order of the two Bogoliubov recursions. So this is also something which can be a nice interpretation of these co-interaction properties. Now, for the rest of the talk, I will go to the complicated example, the one where you are not connected and you need to deal with this problem. In this application, we need to find a suitable candidate for the reduced coaction, and then we will be able to perform the recursion. It should not just be the primitive one: in our case, because we have these polynomials, we need to subtract a bit more, in the sense that this reduced coaction will be given by subtracting tau tensor one, and then I need to subtract this big part; basically, what
this big part says is that, in fact, I put all the branches, all of my tree, on the right-hand side of the coproduct, and then I have all the play with the decorations, in the sense that I can extract some X^k, which comes from the node decoration, and I put the sum over the l_i, which are the potential derivatives that I can put, and I have some combinatorial factors here. So what I want is to subtract all this part, to remove all these combinatorial terms, which makes sense because these terms are produced by the deformation, and they are the terms you want to remove when you consider a reduced map. Okay, one justification for this choice is that, at least on the terms which were the problem at the beginning, which is why your Hopf algebra is not connected, the reduced map is zero: the reduced map has been designed in such a way that it is zero on X^k, on the dot with a k decoration. So now we are equipped with this reduced coaction, and now, okay, what I want to do is introduce a splitting. For the splitting we work with smooth models, meaning that you replace your singular noise by some approximation, so here we are looking at C-infinity functions. And what we do is fix a base point x in R^{d+1}, and you get the following splitting, where A_+^x contains the functions that vanish at x, and A_-^x will be the polynomials whose coefficients depend on x, which are functions of x. Okay, and then you have the natural Rota-Baxter map given below, where I pick a smooth function, and here I have some Taylor jet in y minus x: I just take the Taylor jet of f, and I have some truncation here, I truncate it at a given order. This truncation actually reflects what happens in the algebra, when we truncate the Taylor expansions according to the degree. So this is this Taylor jet operator, and actually it is well known, I don't know exactly where in the literature, but
you have this Rota-Baxter identity playing with these parameters, when I have alpha and beta. So I have this Rota-Baxter identity, and I think it has been understood in the literature that you can actually extend the result on the Bogoliubov recursion with a Rota-Baxter identity to a family of Rota-Baxter maps. So this gives you a family of Rota-Baxter maps which satisfies these family Rota-Baxter identities. So we have already put the framework in place: we have this splitting of the space, we have the Rota-Baxter identities, we have our reduced coaction; all the tools are in place for formulating the Bogoliubov recursion. Okay, so here we consider a family of characters from H into these C-infinity functions, and these characters will be indexed by a base point, this x-bar. We will see it in the next slides, but you have to understand this x-bar as some a priori recentering. You know that inside these trees I have some polynomials, these monomials X^k, and analytically I can just interpret them as polynomial functions, or I can say that maybe I want an a priori recentering of these polynomials: I can recenter them around the point x-bar, and see whether I can get some invariance in that point x-bar. So here I have a family of characters depending on this point, and then I can formulate the Bogoliubov recursion exactly the same way as in my main definition, where now I am using my Taylor jet operator. So I always have my preparation maps, but now they depend on more points: here I have the x-bar, the a priori recentering, and I have x and y. And so this gives me this recursion in terms of phi and phi_minus, and here this sum is Sweedler notation for my reduced coaction. And actually, if I want to compute my phi_minus, I need to subtract: I apply my family of Rota-Baxter maps, and here the degree, I mean the order of my Taylor jet, is
determined by the degree of the tree tau. So this is where it matches the properties that you see in the coproduct, when you truncate according to the degree. And then I take the convolution product between phi_minus and phi to get my phi_plus. So it looks familiar; I am close to the Bogoliubov recursion. One needs to be cautious, because here I have introduced several base points: one base point was this x-bar, that I hope to get rid of at some point, because I want maybe some invariance property, and there is another base point, which is this x, the point around which I want my renormalized object to be recentered. So this gives you this nice Bogoliubov recursion. So what can you actually do with that? We construct these objects, and then, okay, you have some assumptions; you cannot take just any character depending on x-bar, and this is quite reassuring, because if you look at the assumptions that you have when you try to design a model in regularity structures, you also need assumptions on the characters. So one assumption, which is very natural, is that you want this x-bar to actually be the recentering of your monomials: X_i is sent to y_i minus x-bar_i. Then you have another requirement: you want that when you tweak the decoration on an edge, this actually corresponds to derivatives in the analytical part. So when you add to the decorations, this amounts to taking a derivative of your character. These are very natural assumptions, and actually they give you the interpretation of these decorations. And if you have these two assumptions, then using the Rota-Baxter identity that you have seen before, you can prove what you obtain in the classical Bogoliubov recursion: this phi_minus is a character from H_+ into A_-^x, and phi_plus is a character from H into A. And, a very nice result, we also show that this phi_minus is actually obtained from phi by applying the
twisted antipode. So this is actually what you see in regularity structures; this is what we claim to be the algebraic Birkhoff factorization. This formulation by recursion actually makes appear, actually matches, the recursion that you would do with this algebraic object, the twisted antipode. And you can go a bit further: you can also get a nice expression of phi_plus with the preparation map. Be careful that this map will be evaluated on planted trees, and it will be multiplicative here on my character. So from this recursion, starting from this family of characters, you can show the character property that you expect for phi_minus and phi_plus, and then you recover the expression of the twisted antipode. So now maybe we want to get rid of the x-bar. For getting rid of the x-bar, you ask for some invariance property, and here you need a bit more assumptions. Basically, you need not only to interpret the decorations on the edges as derivatives, but you need to make appear what should be your propagator, your kernel. For instance, here I make appear the two indices of the decoration, because one gives me a kernel and the second one gives me the derivatives on my kernel, and here I encode the convolution of the kernel with the character applied inside. So you need another prescription of what your Feynman rule should be: you need to have the convolution structure when you are on a planted tree. And if you add this assumption on top of the two previous ones, you can actually show some x-bar invariance, in the sense that for phi_plus you can actually remove x-bar: it does not depend on x-bar. Then you have some invariance for the preparation map phi-bar and for phi_minus, but they are weaker: phi-bar will be x-bar invariant on planted trees, and phi_minus will be x-bar invariant on trees with zero node decoration at the root. So there are some restrictions, you do not fully remove the dependence on x-bar, but for phi_plus you get it. So this is something you can show, and
now I would like to finish my talk with a final remark. If you use the language of regularity structures, what we were considering, denoted by this boldface Pi, is this character indexed by the point x-bar. Normally, in the context of the work we did with Lorenzo and Martin, what you do is start with this character, apply the twisted antipode, and then express it as a convolution with your Delta-hat-plus, and you construct the map you saw at the beginning. And we show that in the process the map is actually independent of x-bar, and this will be the model: the Pi_x will determine this map. Finally, you can say that these maps here, which are used for constructing the model, are in one-to-one correspondence with the maps you have seen in the Bogoliubov recursion: this phi_plus will give you the Pi_x, the counterterms are given by this phi_minus, and you get the x-bar invariance for the model. So this really matches the two languages. The conclusion of this work is that this algebraic Birkhoff factorization, or Bogoliubov recursion, shows up, is really there, in the framework of SPDEs, and one can use this language to reinterpret the fundamental objects at play in regularity structures and all the renormalizations. But it is not only limited to stochastic partial differential equations: it can even be seen at the level of dispersive PDEs, when one has to deal with numerical schemes, and this is a very nice application. When I started this research, I was interested mostly in PDEs and SPDEs, and it was really unexpected but very nice for me to be in touch with these tools, and little by little I realized that this was actually a central object both in SPDEs and now in these numerical schemes for PDEs. So thank you for your attention. Thank you, Yann, for your nice talk. Are there questions for Yann from the audience? Yes, I would have a question about these two
Bogoliubov recursions in co-interaction. Do you have a general framework for this? Do you need only, say, a comodule bialgebra and a character with values in a Rota-Baxter algebra, for example? What you need is co-interacting bialgebras, which give you two groups of characters; I can take the semi-direct product of them, and then, using that, you can actually write a Bogoliubov recursion on this semi-direct product of these groups, and you can try first to do re-centering, or you can switch. This has not really been formalized; it was more a general idea, or an observation, that we got in the paper, but maybe one can try to push this formalism, this framework, on the co-interactions. Looks very nice, yes, thank you. I have a question, though I'm surprised that I'm asking this kind of question, going back from combinatorics to analytics. You know, in physics, one of the consequences of the coproduct and these recursions is that they have some influence on the analytic behavior of the functions, right? So there's something like a leading-log expansion, or an anomalous dimension. So by studying these combinatorics you can learn something about features of the actual solutions underlying this procedure that you're describing. So I was just wondering, maybe also to Martin: in these two talks, on one hand there's this analytic side, on the other hand there's a combinatorial side, but is there any feedback? So far we have seen that you use this to define what the well-defined, renormalized solution of these kinds of equations is, but is there anything beyond that? Can you learn something about behavior when things become singular, when arguments of these things come close together, or anything like that? I'm just curious, if that makes sense, whether you can infer from the algebraic part some analytical properties of your solutions. I mean, the simplest example in field theory is this thing called the leading-log expansion: if you have a Green's function,
suppose it just depends on a momentum, and you want to know how it behaves as a function of the momentum. If you order the expansion by powers, then you get corrections in the log of the momentum, and you can, for example, tell that the highest power of the log, the biggest correction at each order, is related to some simple coproduct of some simple graphs that enter your combinatorial structure; there's a way to organize everything in terms of these kinds of corrections and logs. So I'm just curious whether anything like that exists also in these SPDE applications, though I have no idea what the analytic question to be asked would be; I'm just curious if there is anything. I mean, as a practitioner, you have some problems, you have these analytical features, and you want to encode them at the level of the algebra, because you want to organize your computations nicely and be able to derive general results. But I think I've seen recently, maybe in the work of Martin on the support theorem, that he was using some equations on the characters to infer some analytical properties; maybe Martin could comment. This, I think, gave nice perspectives from the algebraic part: he was even able to show that constraints on the renormalisation constants correspond to some ideal of the algebra. But it seems to be a huge effort to try to connect what should normally be analytical properties and to see them at the level of the algebra; it's really a tremendous effort, at least for us, I think for everyone. But the kind of result is that, for example, you can relate the coefficient of a log: it doesn't tell you what the coefficient is, so it doesn't really do the analysis for you, but it does tell you that a certain coefficient of a log is the same as another coefficient of a log somewhere else. So it tells you some information about how information gets redistributed due to the nature of the subtractions
that are underlying the definitions of things. But I'm sorry for the super vague question. Yeah, I actually might have a question: do sub-Hopf algebras play a role for you? For example, rooted trees which have at most k outgoing branches at every vertex. If k is one, these are just linear rooted trees, which never split, so to speak. Well, with linear trees you just have no branching at all; that would be the simplest case. Or say I just take rooted trees which always have at most two outgoing branches. Ah, two branches. Does such a sub-Hopf algebra play a particular role for you? Because this is closed under the coproduct: such trees go into such trees tensor such trees, and maybe there is something to be learned from that. No, for the moment, no, I don't see. I mean, when we construct these regularity structures, we have this notion of rule, right? That essentially gives you... right, because in these regularity structures you naturally don't have all possible trees showing up; the trees here really play the role of the Feynman diagrams. In a Feynman diagram you sort of fix yourself the interaction, so you give yourself a number of types of nodes, and here the analogue of the Feynman diagrams would themselves be trees already, and then the analogue of that is that you give yourself rules about how things can come together. And then these sorts of Taylor expansions, they are the ones that look kind of like the Connes-Kreimer Hopf algebra with these sorts of cuts, and in our case they would also consist only of trees that have a certain structure, right? So they wouldn't be all possible trees, but only the ones depending on the degree of your nonlinearity and these kinds of things. Okay, thanks. Thank you. If there are no further questions, then let's thank Yvain again, and all the speakers of today. Thank you.
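As a side note on the ladder-tree case raised in this last exchange: the closure under the coproduct is easy to check by machine, since an admissible cut of a chain severs at most one edge. Here is a small illustrative sketch, with my own indexing conventions: the ladder tree l_n is represented simply by its vertex count n, and a tensor l_a (x) l_b by the pair (a, b).

```python
from collections import Counter

def coproduct(n):
    """Connes-Kreimer coproduct of the ladder tree l_n (a chain of n
    vertices).  Every admissible cut severs at most one edge, so
    Delta(l_n) = sum over k of l_k (tensor) l_{n-k}, with l_0 = unit."""
    return Counter({(k, n - k): 1 for k in range(n + 1)})

def coassoc_left(n):
    """(Delta tensor id) applied after Delta, as a Counter of triples."""
    out = Counter()
    for (a, b), m in coproduct(n).items():
        for (x, y), m2 in coproduct(a).items():
            out[(x, y, b)] += m * m2
    return out

def coassoc_right(n):
    """(id tensor Delta) applied after Delta, as a Counter of triples."""
    out = Counter()
    for (a, b), m in coproduct(n).items():
        for (x, y), m2 in coproduct(b).items():
            out[(a, x, y)] += m * m2
    return out

# Ladder trees are closed under the coproduct (both tensor legs are
# again ladder trees), and the coproduct is coassociative on them.
for n in range(6):
    assert coassoc_left(n) == coassoc_right(n)
print(coproduct(3))  # the four tensor terms l_0 (x) l_3, ..., l_3 (x) l_0
```

For general branching trees one would have to enumerate admissible cuts explicitly, but the same closure argument goes through for any bound on the outgoing branches, which is the point of the question.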