So, thank you very much, Karen and Eric, for the invitation. It's a real honour for me to be here on this very special occasion. And first of all, let me say happy birthday, Herzlichen Glückwunsch; we should have been in France, so I should rather say bon anniversaire. And since it's early morning, very early morning for some of you, this is going to be a leisurely talk, not too serious. But it's going to take us along winding roads from complementations, which I'll define shortly (you can think of the orthogonal complement for the moment as a guiding concept), to locality, in relation with locality in quantum field theory. And I discovered that, in fact, this takes us to quantum logic, which is not my field of research at all; we had to look into the literature there, and discovered that things we had thought we'd proven were new were in fact known in some parts of quantum logic. My motivation comes from renormalisation, and there I got a lot of inspiration from the work of Dirk Kreimer with Alain Connes, and I'll come back to this. This is joint work with Pierre Clavier, Li Guo and Bin Zhang. So happy birthday, and now let me go to the motivations. So, renormalisation and locality: I'm going to put that in a very concise description, which is of course oversimplifying a very complex picture. I'll tell you later what locality means; we'll come back to that. It's just a binary symmetric relation. So we have an algebra, equipped with this binary symmetric relation I'll come back to later, and a product. You can think of an algebra of Feynman graphs with the concatenation of graphs; of trees, also with the concatenation of trees (well, forests, I should say: concatenating trees builds forests); and of cones, convex polyhedral cones, which were in fact the starting point for our study. And now I'm going to be vague at this stage, just saying there will be another algebra, of meromorphic germs at zero.
I'm not yet specifying whether it is in one variable, as in the algebraic Birkhoff factorisation of Kreimer and Connes, or, as we will opt for very shortly, the algebra of multivariable meromorphic germs at zero with linear poles. So here we'll have several variables. Now this is where we land: this M, which we haven't yet completely specified, but we know it's something like meromorphic germs at zero. It's where the singularities, the divergences, arise. And we have a morphism phi from this algebra of Feynman graphs, forests, cones into this algebra of meromorphic germs. These can be given by regularised Feynman integrals; of course, since I landed on meromorphic germs, I'm implicitly assuming I'm using dimensional regularisation or some other regularisation. I have functions, so they present poles, and I can define these Feynman integrals, or branched zeta functions, or conical zeta functions. I won't say too much about this, because this is not really what we're going to talk about. Today we're going to stay on the right-hand side, where these meromorphic germs are. What we're interested in, of course: we don't want meromorphic germs as measurements, we want numbers. And if we want numbers, we'll have to replace this target algebra by C. Before that, this is where this locality comes in, which I haven't told you very much about, but you can now maybe guess: you see, it reflects locality in quantum field theory insofar as it says that if two a_i's are independent for this relation, then the value of their product, this concatenation, will factorise as the product of phi evaluated on each argument. We'll come back to this as a kind of separating device. It tells you these events are far apart, and hence there will be no extra term coming from whatever could happen between them. What we want to do is, as I said, go from M to C, and that's what everybody wants to do, to have numbers.
And so here it's a black box. All I know is that when the a_i's, these Feynman graphs, trees, cones, are just concatenated without further interaction, then I would like this value here to factorise. So I would like to keep this property when somehow we extract some number from these meromorphic germs. So this is a black box. Okay, so what have I learned from Dirk Kreimer and Alain Connes? I've learned that if you have as target space M in one variable, which is typically the result of working with dimensional regularisation, then you... Yes, is there a question? No? Okay. So you want to separate the poles. You have your function phi; we had, in the previous map, phi of a. You have an a in here, and phi of a is a meromorphic germ. What you do is you want to separate the holomorphic part, which I'll write in blue, from the polar part, which I'll write in red. And if you do that just naively, it won't work; namely, this multiplicativity property we had here down below will be spoiled. So you have to be a little more clever, and very clever indeed was the idea of introducing coproducts, which kind of undo whatever mistakes you might have made in naively taking this holomorphic part and evaluating it at zero to get a number. In several variables... So here this uses a coproduct, so it uses a lot of structure on the first algebra, on A. Here we will focus on the target algebra M, and we will want to separate it. Remember, from our point of view, we'll take several variables, so we'll have lots of variables to deal with, and this separation is not at all straightforward. It's in working out this separation, the mechanisms behind it, in order to be able to build the renormalised map we wanted, that's what we're going to focus on today. So when you look at Laurent expansions in one variable, you know what to do. There's a holomorphic part; I'm expanding at the point zero, so the evaluation will be at zero.
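The one-variable picture can be sketched in a few lines. This is my own minimal illustration with sympy, not code from the talk; the germ `f` is a hypothetical example chosen just to show the splitting into holomorphic and polar parts and the evaluation at zero.

```python
import sympy as sp

z = sp.symbols('z')
f = sp.cos(z) / z**2            # a sample meromorphic germ at 0, pole of order 2

# Laurent expansion around z = 0, truncated for the illustration.
ser = sp.series(f, z, 0, 4).removeO()

# pi_-(f): the polar part, i.e. the terms with negative exponent in z.
polar = sp.Add(*[t for t in ser.as_ordered_terms()
                 if t.as_coeff_exponent(z)[1] < 0])

# pi_+(f): the holomorphic part; minimal subtraction evaluates it at 0.
holo = sp.simplify(ser - polar)
renormalised_value = holo.subs(z, 0)
```

For this `f` the polar part is `1/z**2` and the minimal-subtraction value is `-1/2`; the talk's point is that applying this naive split argument by argument spoils multiplicativity, which is what the coproduct machinery repairs.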
And then you take the holomorphic part and take out all the polar parts, okay? And of course, if you do that for this big phi and just take pi_+ composed with phi as a phi_+, it won't have the multiplicativity property I mentioned. So what we want to do is generalise what looks like a simple artefact, splitting this; it's a minimal subtraction scheme, one could say. But we're going to do it in multiple variables. For that, we have to understand how to separate the polar part from the holomorphic part, and this is what the complementation maps will help us do. So please don't hesitate to ask questions along the way. So a good way of separating is considering orthogonal objects, when you have some orthogonality at your disposal. So I'll take a vector space. I could take it infinite dimensional, but let's not worry about this; let's take it finite dimensional. And this object here, G(V), is the set of all linear subspaces of V. And let's take an inner product Q on V. Very simple facts: what does it define? It is defined as a bilinear form, and so it is a symmetric relation; I'm assuming I'm in the real case here. So it defines for me a binary symmetric relation: U is orthogonal to W if and only if Q(u, w) is zero for all u in U and w in W. A very closely related object is the complement map, which takes U, seen here as an element of what is later going to be a lattice, the lattice of linear subspaces, to another one. What does it do? It takes U to its orthogonal. And of course the two are very much related: U is orthogonal to W if and only if W is in the orthogonal complement. So this is the polar set; the polar set of U, or the polar space, consists of all the w's that satisfy Q(u, w) = 0 for every u in U.
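As a toy illustration of the polar space and the complement map, here is a minimal numpy sketch of my own (the helper names are mine, not from the talk): given a bilinear form Q and a subspace U, the polar space is the null space of U^T Q, and independence of U and W is exactly containment of W in that polar space.

```python
import numpy as np

def polar_space_basis(U, Q):
    """Basis of U^perp = {w : Q(u, w) = 0 for all u in U}.

    U: matrix whose columns span the subspace; Q: symmetric bilinear form.
    """
    A = U.T @ Q
    # Null space via SVD: right singular vectors for (numerically) zero singular values.
    _, s, Vt = np.linalg.svd(A)
    rank = int((s > 1e-10).sum())
    return Vt[rank:].T          # columns span the polar space

def is_independent(U, W, Q):
    """The binary symmetric relation: U orthogonal to W for Q."""
    return bool(np.allclose(U.T @ Q @ W, 0.0))

Q = np.eye(3)                        # canonical inner product on R^3
U = np.array([[1.0], [0.0], [0.0]])  # the x-axis
P = polar_space_basis(U, Q)          # spans the yz-plane
```

The symmetry of the relation shows up as `is_independent(U, P, Q)` and `is_independent(P, U, Q)` agreeing.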
So this looks like a very simple thing, but for us, you see, this U-orthogonal is related to the symmetric binary relation represented by Q. So this is one side. And now how do we get back psi_Q from the polar set? The orthogonal complement will be the maximum of the polar set for the partial ordering given by inclusion. So this is how we can go back and forth between the two. And why are we interested in orthogonal complements on V? It looks like very simple first-year material; why are we interested in this? Because we will use them to separate polar parts from holomorphic parts in our meromorphic germs, and remember, that was what we wanted to do. Relative complement maps are ubiquitous. When you start looking at them closely, you see that they arise in many ways, and not surprisingly in coproducts, and not surprisingly in the ones that were introduced by Dirk and Alain when looking at Feynman graphs or rooted trees. Because how is such a coproduct built? It's built from separating. I don't know whether you can see; I've got things in front of what I'm trying to show you. So you're separating: yes, you're separating Y from the complement of Y in a certain element X. So remember, we have a poset structure on Feynman graphs, to be a subgraph of, or on trees, to be a rooted subtree of. And if you think of the coproducts, the way they're built, they use a complementation: they take a subgraph, a subtree, and separate it from its complement, which I won't describe now; for each of these coproducts you have to give a precise definition of what the complement is. Surprisingly, you also see those complementations as an abstract notion, something which takes an object to a complement, in the sense that together they build, you can rebuild, the space from those two parts. This arises in equivariant geometry, or toric geometry.
So that's where I first learned about it, in a paper by Garoufalidis and Pommersheim about the Euler–Maclaurin formula on convex polytopes, because hidden behind there, there is such a coproduct. So the complement map is the transverse cone to a face of a cone; you look at cones, in fact, rather than polytopes, and your complement map takes a face of a cone to a transverse cone, which I don't want to dwell on here. All we want to remember from here: we need some kind of rigid complementation to separate polar parts from holomorphic parts. So because we want to be systematic, and that is in order, maybe, to hopefully at the end get a better grasp on a kind of overarching renormalisation group, or maybe I should call it Galois group, we want to be systematic and see which symmetric binary relations define a reasonable complement map, and conversely; how do you go back and forth? So we want to generalise this relation. Remember, we had orthogonality, which was seen as a binary symmetric relation, and here a complementation, which sent a subspace to its orthogonal complement: so this is one way, and that is the other way. What we want to do here, since we did it both ways, is forget about the orthogonality and try to find conditions on these symmetric binary relations in order to relate them to some complementation; we have to define what that means. And the natural setup is, surprisingly, lattices. And we keep progressing with this locality all along the way because we want to keep in mind this constraint of a factorisation of measurements over independent events. But then, if you look at the literature, you see that under completely different names this was studied, for other reasons, in relation to quantum logic. So here what we expect is (I can't see what I'm showing, I hope you can, I have things in front of it) that U is independent, as we call it, of W...
...you can think orthogonal, if and only if W is in psi_T of U. So this is a generalisation of what I wrote here: we're just replacing the orthogonality given by Q by this T, and this by U^T, U^T meaning we take all the elements that are independent of the elements of U. And here we do the same thing: we have a psi_T instead of a psi_Q, and here the maximum will make sense because we have a poset structure and a bounded structure, and we'll take here the polar set, just the one I described. So this is what is written here, that's what we expect, and now we have to make sure that this makes sense in this very general locality lattice setup. Just so that I convince you that orthogonality is important in Laurent expansions, let me quickly tell you why it's everywhere. You see, we're looking at this kind of function in several variables: the l_i's here are linear forms in some z_1, ..., z_k, and the big L_i's as well are linear forms, and that's why we're going to talk about meromorphic germs at zero with linear poles, h being holomorphic at zero. So that's what is written here; a very simple example. And now we ask the question: what happens if I set z_1 = z_2 = 0? What do I do at this stage? And we want to address this question directly. So for that, we define the dependence set of the linear forms entering this function f, for this germ, to make sure we'll be able to decide whether the linear forms this f depends on have nothing to do with the linear forms another g, let's say, depends on. So this is what we do here: we say, using abusively the same notation as the usual orthogonality, that f_1 has nothing to do with f_2 if the variables they depend on have nothing to do with each other, in the sense that the spaces generated by these linear forms are orthogonal. So Q here is fixed.
So for example, if I take the canonical inner product on R^2, I would say that z_1 minus z_2 is independent of z_1 plus z_2: you view them as carried by the vectors e_1 − e_2 and e_1 + e_2, where e_1, e_2 is an orthonormal basis for Q. And then we have to say which germs we want to keep and which ones we don't. Now we have a general notion of germ, and among those I want to single some out: those for which the top part has nothing to do with the bottom part, which are in that sense irreducible. So this is just recalling the setup we're in. And now, that's the important condition: the dependence space generated by the top part is independent of that of the bottom part. If this is the case, we'll decide this is polar, and we throw it away. And they generate a subspace. And to be very precise, because we're going to have to go to the Laurent expansion, we'll have to look a bit more closely at all these expressions and keep track of the kind of linear forms arising in the polar part. For that, we use the language of cones; that's where we came from, and in fact it enlightened us on how to deal with such meromorphic germs. And we'll have families of cones because, you see, we have several of these S_j's, as I'll call them soon, several polar parts, and for each of them a family, a set of linear forms. And these linear forms organise into cones: they will form the edges of the cone. And then we don't want overlapping; we don't want these cones to overlap, otherwise we're overcounting. This is the condition of being properly positioned, which is a bit technical. So that's what we require of the polar germs that arise when we look at one of these general meromorphic germs at zero.
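The R^2 example can be checked numerically; a small sketch of my own, identifying each linear form a_1 z_1 + a_2 z_2 with its coefficient vector (a_1, a_2):

```python
import numpy as np

Q = np.eye(2)                    # canonical inner product: e1, e2 orthonormal

f_vec = np.array([1.0, -1.0])    # z1 - z2, carried by e1 - e2
g_vec = np.array([1.0,  1.0])    # z1 + z2, carried by e1 + e2

# The two linear forms, hence the germs built from them, are 'independent'
# precisely when their coefficient vectors are Q-orthogonal.
independent = bool(np.isclose(f_vec @ Q @ g_vec, 0.0))
```

Note that the answer depends on the fixed Q: for a different inner product these same two forms need not be independent, which is exactly why the choice of locality matters later in the talk.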
So this was a result that we proved, inspired a lot by Nicole Berline and Michèle Vergne's work: using an inner product and the complementation, we could separate M into M_+ and M_−^Q. Those are the polar germs, the ones for which the numerator has linear forms orthogonal to those of the denominator. So here are the technical conditions, just so that we have uniqueness; holomorphic up there, so this part will be holomorphic. So what are we doing? We're separating this meromorphic germ into a holomorphic part at zero and a black box. But we can look into this black box: it's like the dustbin, it contains all the polar germs. And to make sure we understand them, we keep track of the supporting cones, and ask that they be simplicial, so with linearly independent edges. Now be careful: this decomposition is not unique, and 1/(L_1 L_2) can be written in two ways. This is why the supporting cones are very important: the first expression has a supporting cone with edges L_1, L_2, whereas the other one has different supporting cones, with edges L_1 and L_1 + L_2. This is very much related to double zeta functions, in the sense that you're splitting the upper quadrant into two octants using the line y = x. Are there any questions? So this was to show you that orthogonality is very important. Where did it come up? Just so that we're on the same line of thought: it was here. It looks kind of harmless, but that's what decides the splitting, and hence what we're going to keep. Because in the end, we would like to say that what we want to keep is h(0): we throw everything else away, so it would be a multivariable minimal subtraction scheme, okay? But knowing the Laurent expansion gives us much more insight, and we will use that, not today, but we're using that.
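The two decompositions of 1/(L_1 L_2) mentioned here can be verified symbolically. A sketch with sympy, where I choose L_1 = z_1 and L_2 = z_2 purely for illustration (any pair of independent linear forms would do):

```python
import sympy as sp

z1, z2 = sp.symbols('z1 z2')
L1, L2 = z1, z2

# One decomposition: a single polar germ with supporting cone of edges L1, L2.
one_over_L1L2 = 1 / (L1 * L2)

# Another decomposition of the same germ, with supporting cones of edges
# L1, L1 + L2 and L2, L1 + L2 respectively (the octant picture).
alt = 1 / (L1 * (L1 + L2)) + 1 / (L2 * (L1 + L2))

difference = sp.simplify(one_over_L1L2 - alt)   # the two agree as germs
```

The germ is the same, but the supporting cones differ, which is exactly the ambiguity the supporting-cone bookkeeping is there to track (and the point of David Broadhurst's question at the end).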
And that's why we're very interested in a generalisation of this theorem, to describe a Galois group of meromorphic germs: transformations of meromorphic germs that are the identity on the holomorphic part. You see, this was proven for Q, this special inner product which gave us a locality; we would now like to prove it in a much more general setup, namely replacing this by a locality relation, and replacing this complement map by, if I call T the locality, the associated complementation psi_T. So that's why I'm looking at orthogonality as a locality relation: because I then want to go one step up, from orthogonality to a general, or relatively general, locality relation. So what is a locality? G(V), I remind you, is the set of finite-dimensional, or closed, linear subspaces of V, and there is a natural partial order: to be a linear subspace of, or a closed one if you need it. So what is a locality relation? A symmetric binary relation. And you see, orthogonality gives you one on G(V), because it tells you whether two linear subspaces are orthogonal or not, and so this gives you this binary symmetric relation. Again, you see, here we used this binary symmetric relation to declare, or to decide, whether two meromorphic germs were independent or not; we even somewhat abused the notation of orthogonality. Okay. So the lattice G(V) is the one we're interested in. What is a lattice? It's a poset, a partially ordered set, with a join and a meet. For our lattice, the join will be the sum of two subspaces, and the meet will be the intersection of two subspaces. Those operations are usually associative, and if we have a nice compatibility with the partial order, they should be compatible: so monotone, or isotone, as one says in the lattice community. So these are very natural conditions when you have a partial order.
Then you might want a biggest element and a smallest element. For example, here V would be the biggest, because any other element of G(V) is a subspace of it, the partial order being 'is a subspace of', and the smallest element would be zero, the space {0}, because it's in every linear subspace. Distributivity is rare; I won't dwell on this. It's interesting and it leads to many other notions, but it is a luxury, being distributive. The power set of a set X with inclusion is distributive, as you can quickly figure out, but the lattice we're interested in is not distributive, as you also can quickly see. So we really need to get around this property, and that's important. There are other partial orders, 'to be a multiple of', for instance, and there is a notation I'd like to introduce which will come back later: it's this A with a downward-pointing arrow, which is the set of all elements smaller than or equal to A, and it will play an important role. Now, a complemented lattice means, for the moment, a lattice with at least a pre-complement, in the sense that for any A there is some complement, but it might not be unique, just as a complement of a linear subspace of V is not unique. And yes, A plus B equals C means that the join is C and the meet is zero; zero, you see, was the smallest element. Now, orthocomplemented, and you see the analogy with the orthogonal complement, is much more restrictive. You want A and its complement to have nothing, or little, to do with each other, namely their meet is zero, the minimal element; you want compatibility with the partial order; and you want an involution. And if you look at the details, you'll see that this is already preparing for a real complement, a rigid complement as it's called in some parts of the literature, because then you have a separation of the maximal element into A and psi(A), and that's what we need, because we want to separate holomorphic from polar.
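The failure of distributivity in G(V) is already visible for three distinct lines in R^2. Here is a minimal numpy sketch of my own (function names are mine), with A = span(e_1), B = span(e_2), C = span(e_1 + e_2):

```python
import numpy as np

def join(U, W):
    """U v W = U + W, spanned by the columns of both."""
    return np.hstack([U, W])

def meet(U, W):
    """U ^ W: x lies in both iff x = U a = W b, i.e. [U | -W](a; b) = 0."""
    A = np.hstack([U, -W])
    _, s, Vt = np.linalg.svd(A)
    null = Vt[int((s > 1e-10).sum()):].T   # null space of [U | -W]
    return U @ null[:U.shape[1]]           # the corresponding x = U a

def dim(U):
    return 0 if U.size == 0 else int(np.linalg.matrix_rank(U))

A = np.array([[1.0], [0.0]])   # span(e1)
B = np.array([[0.0], [1.0]])   # span(e2)
C = np.array([[1.0], [1.0]])   # span(e1 + e2)

lhs = meet(A, join(B, C))              # A ^ (B v C) = A ^ R^2 = A, dimension 1
rhs = join(meet(A, B), meet(A, C))     # (A ^ B) v (A ^ C) = {0}, dimension 0
```

Since the two sides have different dimensions, the subspace lattice of R^2 is not distributive, which is the obstruction the talk says one has to work around.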
Okay, so there's also a notion of relative complement, which is important because you don't always work with the whole space: you might work with subspaces and want to separate them into a smaller subspace and its complement. And so, an example; yes, I won't go into those examples, they belong to folklore knowledge: taking the ordinary complement map in a set, and then taking the orthogonal complement, which gives you all these properties when you have the luxury of an inner product around to help you. Any questions, maybe? Okay, so this is preparing us to see what we need of a complement. So, locality now, because remember we wanted to go from locality, this symmetric binary relation, to a complement. So let's see what a locality is. It is indeed a symmetric binary relation. You might think, oh, it won't separate much if I take, for example, the identity, which is a symmetric binary relation; that's true, and we will rule out that kind of relation very soon. And the polar set, I told you: so here, for the moment, we're on a set, and the polar set of A is all the elements that have nothing to do with A, that are independent of it. We would read this 'A independent of B': so all the B's that are independent of A. I like to call it the polar set, for various reasons; it's like a dual. And note that A is in its double polar set. So recently we came across this statement, which is in fact well known in the lattice literature, but I was not at all aware of this literature; we were informed by the referees. So, a locality on a poset: now we're putting on a poset structure, so where there was no partial order, there now is one, and we just have to make sure everything is compatible. So the polar sets of two elements, one smaller than the other, are included in one another in the reverse order. This is called a Galois connection, which came as a surprise to me.
This is saying that if something is orthogonal to the bigger element, then it's orthogonal to, or independent of, the smaller one. And then, you see, we had that A was in its double polar set; now we're asking that all the elements below A are in the double polar set. And all these conditions are equivalent. We call this a locality poset, but it's called 'weakly degenerate orthogonality', if I'm correct, by some authors; the terminology is not uniform in that literature, it seems. So now we're putting on a little more: a lattice. Now we have two operations, and we have to make sure everything is compatible. We have the partial order, but now we also have the join and the meet. For the meet it's automatic, due to this partial order, the meet lying below both elements; for the join it's not automatic, the join lying above both. And you see, we start seeing this locality constraint, the fact that the phi we had originally should obey some locality, or should be compatible with locality; this is what is coming into the picture: if A is independent of B_1 and B_2, it should be independent of their join. You can view it in a different way; it's equivalent to saying the polar set is a sublattice, or a lattice ideal. So, an example: this very nice, but not really interesting, example of posets; it's too nice. You can equip it with a locality which spoils the poset locality: instead of taking the intersection to be empty... By the way, this locality relation reflects what one might intuitively think of as locality: if you think, for example, of smooth functions, and say two smooth functions are independent if their supports are disjoint; this is very much used when you deal with divergences. Whereas here, it's when their union is X, and that's not compatible with the poset structure. And of course, for the orthogonality, we check all along that we have these properties.
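The Galois-connection property of polar sets can be played with on the power-set example just mentioned. A toy sketch of my own (names mine), with the locality 'disjoint supports' on subsets of a three-element set:

```python
from itertools import combinations

X = {0, 1, 2}
P = [frozenset(c) for r in range(len(X) + 1)
     for c in combinations(sorted(X), r)]      # all subsets of X

def indep(a, b):
    """The locality: two subsets are independent iff they are disjoint."""
    return not (a & b)

def polar(S):
    """Polar set S^T: everything independent of every element of S."""
    return {b for b in P if all(indep(a, b) for a in S)}

A, B = frozenset({0}), frozenset({0, 1})
# A <= B forces polar({B}) <= polar({A}) (reverse inclusion),
# and A always lies in its double polar set.
```

Here `polar({B})` is the set of subsets of {2}, `polar({A})` the set of subsets of {1, 2}, so the reverse inclusion is visible by hand; the double-polar containment is the property the talk then strengthens to an equality.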
Yes, we have them, because that's what we're going to generalise. And now, remember, we wanted to separate things, we wanted to separate holomorphic from polar; so that's where we introduce these conditions, and here they are. You see, if A is independent of B, then their meet should be minimal. You could see it as a non-degeneracy condition; it's a kind of non-reflexivity, in a sense (this has another meaning elsewhere): if A is independent of itself, then it's zero. So this is the very important separating condition, that this locality relation is actually separating. This is just asking that the polar set of zero is the whole thing. And then we have this completeness. So remember that at this stage we had A, and we noticed that A was in the double polar set; because of the compatibility with the partial order, we have that all elements below A are also in the double polar set. Now we're asking that they coincide. This is equivalent to A being the maximum of this set. And if you remember, there was something about a maximum at some point, when we looked at the relation between orthogonality and the orthogonal complement: this is preparing for that. Okay, an example: it's always the same example, just to make sure we have it with us. Yes, everything's okay. Now we're going to the core of our purpose: namely, remember, I wanted to relate orthogonality, as a binary symmetric relation, with the notion of orthogonal complement, and we wanted to generalise this. Now, we realised, thanks to a referee, that this theorem already existed in a very different language. We also proved a corollary which was not there. But essentially this is known, though I don't think everybody in lattice theory is familiar with this reference, because it doesn't seem to be quoted very much; it belongs to folklore knowledge.
We realised that in lattice theory, an orthocomplementation is in one-to-one correspondence with what we call, though that's not the terminology used there and it's not formulated in this way, strongly separating locality relations. So how do we do that? So this plays the role of the orthogonal complement map, and this the role of orthogonality; and strongly separating, remember, is really making sure that things separate. So maybe I should make sure: strongly separating was this, which is more stringent than the ordinary separation, and this is going to ensure that we can go from an orthocomplementation to a strongly separating locality relation. So how does it go? You take a symmetric binary relation T, and you build your psi_T: psi_T will be the max of the polar set. Remember, that's how we built psi for the orthogonality. The other way around: when you've got a complement map, how do you decide that two elements are independent? If the one is inside the... well, I should have put it the other way around; sorry, this is a mistake: B is in the complement of A, or conversely. So, okay, I'll skip this; this is a refinement of what we did. What's important is that when we put in the orthogonality here, we get the orthogonal complement there; we saw that when we discussed this example. Any questions? 'I have a quick question, sorry. So this strictness of the complementation, is it required for the renormalisation application with the linear poles?' Yes, yes, because otherwise you don't have uniqueness. You see, as long as you don't have this strictness requirement, there's a freedom of choice in which complement you're going to take; there's no map, really. You can't define a map saying, this is going to be the complement. Garoufalidis and Pommersheim, when they looked at polytopes and the Euler–Maclaurin formulae for polytopes, called it a rigid complement. It's a good word, maybe, rigid. 'Thank you.' Pleasure.
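How psi_T arises as the maximum of the polar set can be illustrated on the same power-set lattice, where the locality 'disjoint supports' is strongly separating and psi_T comes out as the ordinary set complement. This is a sketch under my own naming, not the paper's construction:

```python
from itertools import combinations

X = {0, 1, 2}
P = [frozenset(c) for r in range(len(X) + 1)
     for c in combinations(sorted(X), r)]   # the lattice: all subsets of X

def indep(a, b):
    return not (a & b)                      # the locality T: disjointness

def psi(a):
    """psi_T(a) = max of the polar set of a; here that maximum is X - a."""
    polar = [b for b in P if indep(a, b)]   # all subsets of X - a
    return max(polar, key=len)              # unique maximum for inclusion

a = frozenset({0, 2})
# a T b holds exactly when b <= psi(a), and psi is an involution.
```

So on this lattice the abstract recipe 'take the max of the polar set' recovers the familiar rigid complement, which is the behaviour the theorem asks for in general.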
So we were working on lattices, and that served for us to be able to find all the abstract locality relations which would serve a separating purpose. But now we have to go back to the vector space. So let's go back to the vector space. We first equip it with a set locality, but now we have linearity, so we require that the polar set of any subset be a linear subspace. If you think of what happens for the orthogonal complement: the orthogonal complement of any subset is a subspace. So that's the kind of thing. This is a difficult condition to deal with in other aspects of this locality framework. Okay, so we shall say that this locality on the vector space is non-degenerate if we have, you see, something that looks very much like what we had before, but now remember we're on V, not on subspaces; this will imply that the polar set of V is zero. And strongly non-degenerate means that we have the converse: if the polar set of U is zero, then U has to be the whole space. And, interestingly, there is a one-to-one correspondence between these set localities, which look very weak, having just some compatibility with linearity (the polar set is linear), and the ones we've been looking at, the lattice locality relations on this lattice G(V). And how do we set up the relation between the two? Two vector subspaces are independent if their elements, their vectors, are independent; this is here. Conversely, two vectors are independent if, projectively, the one-dimensional spaces spanned by these vectors are independent in this G(V). And there is a one-to-one correspondence between what we've been looking at up to now, locality relations on this lattice, and the ones on the vector space.
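The dictionary between a locality on vectors and the induced one on G(V) can be sketched as follows; a minimal illustration of my own with the canonical inner product (names mine):

```python
import numpy as np

Q = np.eye(2)                                    # canonical inner product on R^2

def vec_indep(u, w):
    """Locality on the vectors of V."""
    return bool(np.isclose(u @ Q @ w, 0.0))

def space_indep(U, W):
    """Induced locality on subspaces: every vector of U independent of every vector of W."""
    return bool(np.allclose(U.T @ Q @ W, 0.0))

u, w = np.array([1.0, 0.0]), np.array([0.0, 1.0])
# Two vectors are independent precisely when the lines they span are independent in G(V).
agree = vec_indep(u, w) == space_indep(u.reshape(2, 1), w.reshape(2, 1))
```

Note also that `vec_indep(u, u)` fails for any nonzero u, which is the non-degeneracy phenomenon just discussed.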
And if you want something more stringent, like the strongly separating ones, the ones that we're going to use, they are indeed also in one-to-one correspondence. That's why we went to lattices: that's where things are more understandable. So, as a corollary, since we know that this special class of locality relations is in one-to-one correspondence with orthocomplementations on lattices, it gives you something on vector spaces. And that's a translation of what we had before; that's what we aimed for, because we needed complementations, orthocomplementations, on vector spaces, and we get them through orthocomplementations on lattices. So all this is just translating back to the vector space what we've learned on the lattice. And it generalises what we knew before, namely going from an orthogonality relation to an orthogonal complement. Now, why have we done all this work? I think I have two more minutes. You might say, well, most of it is maybe just orthogonal complements, but I don't think so. And this example, which I don't think I have time to comment on too much, gives you a way of building, in fact, something like a set complement, a vector space complement, if you analyse the way it's done. There's a separation of a set of angles into two disjoint sets of angles, and a bijection which takes one set to the other; and then U_theta, which is there, the line in the direction theta, is sent to U_{psi(theta)}, psi being, in a sense, a complement map on sets. And then if you take psi(theta) to be pi minus theta, you get back the orthogonal complement. This is one way, on R^2, to get other complement maps which are not the orthogonal complementation. Now, what do we want to do from there? So this was like a side road, to understand all the ways we have a separating complement on a vector space.
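A toy version of this angle construction can be sketched numerically. The shift psi(theta) = theta + pi/3 below is my own arbitrary choice, made only to exhibit a complement line that is not the orthogonal one; the talk's construction additionally arranges psi to be a genuine involution on directions, which this simple shift is not.

```python
import numpy as np

def direction(theta):
    """Unit vector spanning the line U_theta."""
    return np.array([np.cos(theta), np.sin(theta)])

def complementary(theta, psi_theta):
    """U_theta and U_{psi(theta)} are complementary in R^2 iff their directions are independent."""
    M = np.column_stack([direction(theta), direction(psi_theta)])
    return bool(abs(np.linalg.det(M)) > 1e-10)

psi = lambda t: t + np.pi / 3                    # hypothetical bijection on directions (mod pi)
thetas = np.linspace(0.0, np.pi, 7, endpoint=False)

all_complements = all(complementary(t, psi(t)) for t in thetas)
orthogonal = all(np.isclose(direction(t) @ direction(psi(t)), 0.0) for t in thetas)
```

Every line does get a complementary line (`all_complements` holds), yet the pairing is nowhere orthogonal, illustrating a complement map on R^2 other than the orthogonal one.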
Now, what we're doing now is generalizing this Laurent expansion construction beyond the orthogonality relation. And for that, we use the language of cones: we translate these Laurent expansions into a conical language, which makes it much more tractable. We're also looking into the Galois group, and that's why we want to go... You see, we don't really need Laurent expansions to have a minimal subtraction scheme, because at that point we don't need to look into what I call the dustbin, which contains all the polar parts. But we do if we want to look in a more refined manner at all linear transformations of multivariable meromorphic germs with linear poles which stabilize holomorphic germs. And then we really need to look at what's in the dustbin, namely at the details of the Laurent expansion. And we want to do that for this locality, but also beyond. And we hope that this can give us some insight on what, in this picture, could be something like a renormalization group. So thank you for your attention, and here are some references, and thank you. All right, thank you very much, Sylvie. Are there questions? Well, we're waiting for other people. I have a question. Yes. So how important is the boundedness? It comes up in the definition of the complement, but in principle you could have a relatively complemented lattice. Yes, that's right. Thank you for this question. Indeed, a relative complement is enough, but when you have a lattice which is not bounded, you can always consider the lattice of elements smaller than a certain element, and then this one is bounded, so you can always reduce your study to the case of a bounded lattice. And this is called a relative complement, the one you mentioned. And indeed, for example, as I mentioned, in...
Well, everywhere. In the coproduct for Feynman graphs, it's a relative complement, because you're taking a graph and a subgraph and a complement with respect to the bigger graph. The same for trees, it's a relative complement, and the same for Garoufalidis and Pommersheim for polytopes. So usually you do need a relative complement, but thanks to that, you can always reduce your investigations to a bounded lattice. I see David Broadhurst has a question. Yes. Thank you, Sylvie, for this very clear account. There was something that intrigued me at an early stage, where you pointed out that there was a sort of ambiguity in what you were doing, by taking partial fractions of your L's. Yeah? Ah, yes. So when I had (z1 − z2) over (z1 + z2)? No, no, no, no, no. 1 over L1 times 1 over L2, I can express as a sum of two terms using different partial fractions. Okay, maybe I'll go back to it. Ah, yes. Yes, I know. So that was for the Laurent expansions. Yes. Here. So can you carry that ambiguity through your work? I mean... Yeah, yeah, that's the point. So thank you for this question. So what we're doing is not unrelated to blow-up procedures, which I'm not at all confident about, where you kind of split the divergence, you kind of blow it up, to understand how it's built. And this is what we do in introducing multivariable regularization schemes for germs of functions. And we keep track of... So when you do a blow-up, you look at things locally. We do it globally, but at the cost of two things: an inner product, which is my locality, which rigidifies things, so we're in the category of sets with locality; and the second thing, as you rightly point out, we have to keep, if we want to, the details of this expansion, if we really want to know what happens in what I call the dustbin, and not only do a minimal subtraction scheme; for a minimal subtraction scheme, we don't need to know what happens there.
If we really want to know that, then we have to look at this family. And what I pointed out here is that we have to fix, if we want, a certain Laurent expansion. And this is kind of replacing the other option, which would be to look at things locally. Giving ourselves a family of supporting cones is a looking glass into the singularities of f. Thank you. Does that answer your question? I have a maybe related question on that side. So when I think of regularizing things with many variables, there are many ways of doing it. Is it the case that the Q, like the dependence on the locality, is how you encode making the choices of how to do the partial fractions? That's right. And you see, that's exactly the point. Q determines who is polar, because the h's should be kind of orthogonal to the L_i's, in a metaphorical manner. So yes, this decides who is polar. And then it all goes in there, and that's why there's a plus with a Q, because here it doesn't show, but we've used the inner product Q in an essential manner. Exactly, and the choice of Q is a choice of renormalization, in a sense. And what we want to do is go beyond Q, because we feel the importance of Q, and that's why we first wanted to make sure we understood the role of Q; that's what these stringent locality structures are for. Sure, but within this very special case, if I change Q, do we have something like a renormalization group formula? That's right. It's part of it. That's what we're doing, you see? And our first attempt is this Galois group. So the Galois group will look into that. So before we go to a more overarching renormalization group, we're looking at the Galois group of transformations of such meromorphic germs which leave this part invariant, I mean, leave each individual germ invariant, which is the identity here. And this relates to transformations of cones, because these germs are described by cones.
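The partial-fraction ambiguity raised in the question can be seen on the simplest example with linear poles; this is a small numerical check I am adding for illustration (not from the talk). The same germ 1/(z1·z2) can also be written using the extra linear pole z1 + z2, so which terms count as "polar" depends on the chosen writing, and an inner product Q is what fixes the decomposition:

```python
# Classic partial-fraction ambiguity for meromorphic germs with
# linear poles: 1/(z1*z2) = 1/(z1*(z1+z2)) + 1/(z2*(z1+z2)),
# so the same germ admits several supporting pole configurations.
def lhs(z1, z2):
    return 1.0 / (z1 * z2)

def rhs(z1, z2):
    return 1.0 / (z1 * (z1 + z2)) + 1.0 / (z2 * (z1 + z2))

# Check the identity at a few generic points away from the poles.
for z1, z2 in [(0.1, 0.2), (1.0, -0.3), (2.5, 4.0)]:
    assert abs(lhs(z1, z2) - rhs(z1, z2)) < 1e-9
```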
Yes, so yes, that's why we're looking at this Galois group, because we feel, it's obvious, that the renormalization group has something to do with this choice. Thank you very much. Thank you. So Sylvie, there's also a question in the Q&A. Jonathan in the Q&A asks about matroids. He says the concept of matroids generalizes linear independence, and he wonders: does your theory also apply to matroids? I've never looked into that. I did look at some point at matroids, but not in this context. I don't know. So I suppose you have something in mind in asking this, because you think it could apply. Maybe I'd be grateful for a reference, so that I could look into it. No, I haven't thought of that. Eric, do you want to unmute Jonathan? He's one of the attendees. I don't seem to have that power today, but Eric should. Then you can tell us more about your question if you'd like. Yeah, Jonathan should be speaking now. Oh, he says he has no mic. Okay, well, thank you, Jonathan. Go ahead, Jonathan. Thank you. Maybe you can send me a reference later. Thank you. Are there any other questions? All right, why don't we all thank Sylvie? Again, you can unmute and clap. Thank you very much. Thank you.