Thank you for having me; it's really nice to be part of this event celebrating Dirk's 60th birthday. I wasn't sure whether I should start with an anecdote, since I was Dirk's PhD student, and I was thinking of something not too personal to tell. In general I would say that being Dirk's PhD student was a very nice experience, in the sense that he gave me a lot of freedom, and a lot of time to pursue the ideas and collaborations I wanted. Nevertheless, there was this experience one or maybe two years into the PhD, when things had not made much progress. He stopped me in the entrance hall at the IHES, asked me where I was with my work, and essentially just said: well, if there's not more, then you just finish with a nice summary of your work. Obviously this made me think, and it was a good push in the right direction. Another, really positive anecdote: before we had written our first paper, he came to my office while I was lost in some computations at the blackboard that simply didn't work out. He came in, asked how things were going, looked at the blackboard, and somehow pointed at exactly the right place, saying I should redo those calculations. Indeed it turned out that something was very wrong there; it worked out, and so we had our first paper. So in total it was a very positive experience with a good feeling. Now, about my work — okay, I'm getting my emails, maybe I should stop this, but I don't know how. Okay, so it's about Wick products and combinatorial Hopf algebras.
I think one of the great impulses that came out of Dirk's work is this topic of combinatorial Hopf algebras — finding and understanding structures in terms of combinatorial Hopf algebras — and jointly with Frédéric Patras, Nicolas Tapia and Lorenzo Zambotti we started to understand Wick products, or Wick polynomials, using combinatorial Hopf algebras. To summarize, or to give an idea of where I want to go with this talk: the aim is to understand relations between different families of Wick polynomials in non-commutative probability. There is of course a lot of information in that sentence. First of all, there is not only one family of Wick polynomials but several. I am not an expert in non-commutative probability theory, so I will not try to really explain it — that would take too much time and it is not really the point; instead I will take a combinatorial, or algebraic, perspective on the issue and try to explain what it means to have different families of Wick polynomials. The setting is that of so-called non-commutative shuffle Hopf algebras; they are also known as dendriform algebras, but I think the former is a better way of talking about this structure, and it is an example of a combinatorial Hopf algebra — a connected graded Hopf algebra with some extra structure. I decided to make it clear already at this point that these different families of Wick polynomials are part of the theory of non-commutative probability, which I think essentially goes back to Voiculescu. These polynomials appeared in work by Michael Anshelevich, around 2004 to 2009, in a series of three papers, but in a very different way: there they are understood in terms of so-called generating-function calculus. One aspect of our work was to find a different, Hopf-algebraic way to deal with these Wick polynomials,
and in fact then to use this to understand them better; one outcome of this work is that these different Wick polynomials are related, and I will try to indicate this at the end. Okay, so let me start with the classical case, with the Wick polynomials that you probably all know from either probability or, say, quantum field theory. I take a very algebraic point of view, so I approach them in a perhaps somewhat unconventional way. I start with a commutative unital associative k-algebra A; this is usually the algebra of my random variables. I take φ to be a linear unital functional on A. Then I look at the augmented tensor algebra, starting in degree one, so I skip the unit; if I want the tensor algebra with unit I put a bar over it. On this algebra, with concatenation as product, I can put a coproduct, the so-called unshuffle coproduct, defined here. I take this way of looking at it, in terms of choosing subsets S of the index set {1, …, n}: with such a subset I associate the word formed by the letters extracted according to S, taken in the natural order, and I put that word on the left; on the right I put what is left over, simply concatenated. So I extract subsequences of letters, put them on the left, and concatenate the rest on the right. Another way to state this: Δ is the algebra morphism for which each letter is a primitive element. With this unshuffle coproduct I can define, on the dual side, a shuffle product — a shuffle, or convolution, product on the linear maps on the tensor algebra, defined in terms of this coproduct as written here. Then I take my linear map φ, which was defined on my original algebra A, and extend it to the tensor algebra with unit by declaring that on a word with n letters it is the little φ evaluated on the product of the entries of this word — if you want, this is the n-th moment. This extension Φ sends the unit of the tensor algebra to 1, and so it is invertible for the convolution product, and we can find another linear map c, sending the unit of the tensor algebra to 0, such that Φ can be written as a shuffle exponential: everything here is seen in the dual of the tensor algebra, with the shuffle product I defined above, and c is essentially characterized by mapping the unit to zero. Now it turns out that if you apply this shuffle exponential — the exponential is of course defined in the usual way — and simply follow the rule of the unshuffle coproduct on the word, then, with a couple of steps in between, you arrive at a very nice and elegant expression for this computation: φ(a_1 ⋯ a_n) equals a sum indexed by all set partitions of {1, …, n}, where for each partition you take the product over its blocks, evaluating the map c on the subword formed by the elements of each block. This is what you find if you do the computation using the unshuffle coproduct structure. When you then think a little about this and give these things names — say for a single letter a, to make it simple — I call m_n the image under Φ of the n-th tensor power of a, and κ_n the image under c of the same tensor power, and I form the exponential generating functions of the m_n and of the κ_n.
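This set-partition formula is easy to check by brute force. Here is a minimal sketch in Python — my own illustration, not from the talk, with ad hoc names — computing the moment m_n from numerical cumulant values:

```python
from itertools import combinations
from math import prod

def set_partitions(elems):
    """Enumerate all set partitions of a list, as lists of blocks."""
    if not elems:
        yield []
        return
    first, rest = elems[0], elems[1:]
    # choose the rest of the block containing `first`
    for r in range(len(rest) + 1):
        for block_rest in combinations(rest, r):
            remaining = [e for e in rest if e not in block_rest]
            for part in set_partitions(remaining):
                yield [[first, *block_rest]] + part

def moment(n, kappa):
    """Classical moment m_n as a sum over all set partitions of {1,..,n}:
    m_n = sum over partitions pi of prod over blocks B of kappa[|B|]."""
    return sum(prod(kappa[len(B)] for B in part)
               for part in set_partitions(list(range(n))))
```

With all cumulants set to 1 this just counts set partitions, so `moment(4, ...)` returns the Bell number 15.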
Requiring the moment series to be the exponential of the cumulant series, I find a way to express my moments in terms of what I would then like to call classical cumulants: the first moment equals the first cumulant; the second moment equals the second cumulant plus the square of the first; the third moment is a polynomial in the cumulants up to order three; and at order four you find the expression given here. Many of you have probably seen this already. For us, to make the connection with the non-commutative side of the picture, the interesting coefficient is this 3 that appears here: as we will see, in the non-commutative world it becomes a 2, because we lose one partition when we go there. You can also look at this in terms of a recursion with binomial coefficients, given here, and in fact this is nicely related to the substructure of the shuffle algebra that I just stated, which I plan to return to on the non-commutative side. Okay, so now let us move forward and define what I would like to call, in this case, the tensor Wick polynomials. They become the classical Wick polynomials if I evaluate them on elements of my original algebra A, but for the moment the definition is simply: I take the identity map on the tensor algebra and convolve it with the inverse of my map Φ; the convolution is written here explicitly using the unshuffle coproduct. What you can show is that this gives a polynomial in the tensor algebra — words with certain coefficients, the coefficients given in terms of the inverse of Φ evaluated on subwords of the word I am looking at — and composing with Φ essentially gives the counit, which means it is zero whenever evaluated on a non-trivial word. And I also have the following
property: if I define the derivation that simply kills the letter a in a word — one has to be a little careful how to define this, but I will return to it later when we have something very similar in the non-commutative setting — then I have an intertwining relation. If I compute my tensor Wick polynomial, as I want to call it, on a word and then kill the letter a, this is essentially the same as first searching for the letter a in the original word, killing it there, and then computing the Wick polynomial. So these are the tensor Wick polynomials, and in fact I can invert them with respect to composition, simply by defining the inverse in this way; and if you look at the constant term of this inverse, it is just Φ of the word. Here are the first few Wick polynomials, up to order three. You can also nicely see that if you apply Φ to this expression, you get zero; this is true for all the other expressions as well. Again, as a hint at what is interesting here, say up to order three: it is again a particular coefficient that will change when we come to the strictly non-commutative setup. As I said, these are tensor Wick polynomials; if I really want the Wick polynomials as they are stated in many books, then I evaluate the tensor polynomial in the algebra A, so I really start to multiply in A. The properties I just stated for the Wick polynomials — in particular the intertwining with this abstract derivation after a letter of the tensor algebra — then just mean that these polynomials have the two properties shown here, which I think makes them what is called Appell polynomials.
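For a single random variable the whole construction collapses to something very concrete: under the unshuffle coproduct, Δ(a^⊗n) is the binomial sum over k of C(n,k) a^⊗k ⊗ a^⊗(n−k), so the convolution inverse Φ⁻¹ and the Wick coefficients can be computed recursively. A sketch, my own illustration; for standard Gaussian moments one expects the Hermite polynomials, as is classical for Wick ordering:

```python
from math import comb

def phi_inverse(moments):
    """Convolution inverse of Phi on powers of a single letter a:
    sum_k C(m,k) * m_k * inv[m-k] = 0 for m >= 1, with inv[0] = 1.
    `moments[k]` is m_k = Phi(a^{tensor k}), moments[0] = 1."""
    n = len(moments) - 1
    inv = [1] + [0] * n
    for m in range(1, n + 1):
        inv[m] = -sum(comb(m, k) * moments[k] * inv[m - k]
                      for k in range(1, m + 1))
    return inv

def wick_coeffs(n, moments):
    """Coefficients (by power of a) of the n-th tensor Wick polynomial
    W_n = sum_k C(n,k) * inv[n-k] * a^{tensor k}."""
    inv = phi_inverse(moments)
    return [comb(n, k) * inv[n - k] for k in range(n + 1)]
```

For the standard Gaussian moment list `[1, 0, 1, 0, 3]` this reproduces the Hermite polynomials x² − 1, x³ − 3x, x⁴ − 6x² + 3, and the centeredness φ(W_n) = 0 can be checked directly.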
As a remark on why starting from the identity convolved with the inverse of Φ was a good idea: if I look at what one would call the normal-ordered, or Wick-ordered, exponential — which is simply the generating function of the Wick polynomials — then one can show that it is given by this expression: the exponential of ta, with t my parameter, minus the generating function of my cumulants in the exponent. That is nothing but saying that I take the exponential of ta divided by my moment series, and the moment series I express as the exponential of the cumulant series. So I find this expression, which is roughly Φ⁻¹ times the exponential, but now with all the Wick polynomials put together into one generating series. As another remark, which I will repeat later since it carries over to the non-commutative setting: since I can invert my Wick polynomials, or tensor Wick polynomials, I can define a new product on the tensor algebra, like this — here is the ordinary concatenation; I take my two words, compute the inverse Wick polynomials, multiply in the tensor algebra, and then take the Wick polynomial of the resulting expression. This new product has the nice property that the Wick polynomial, say in the letter a, of the new product of a^⊗n and a^⊗m is the Wick polynomial of their product in the tensor algebra. Usually the Wick product has awkward properties, but I can define a new product such that I have these nice properties. — We have a question from Ralph. "Just a very quick one: you say that this is the normal-ordered product in your formula; is that obvious? I can take your formula as a definition, but usually one has some time ordering. Can you say one more word on why this is given by dividing out by these cumulants?" — Here, this is the normal-ordered product, right? Yes, for me this would be a normal-ordered product; this result you can deduce from the derivation property up here. "You can really take — it's just formal, I mean, that's what you're saying?" — Yes; if by formal you mean simple, yes; I think that is essentially the statement, with respect to the formal series. — Okay, now the non-commutative case. I will not try to make all of this appear completely rigorous; that is too difficult and there is not enough time. So I start with what a non-commutative probability space is: it is nothing but an associative algebra, not necessarily commutative, together with a linear map that sends the unit of the algebra to 1. That is all I want to say here; of course it is another story to really think about concrete examples, but for the moment all we need is a unital associative algebra with such a linear map — not that different from what we had before. Now, how to make the connection with this vastly different world of non-commutative probability theory? I choose to state, as the central observation for the work we have done, a result due to Roland Speicher; I think it is from around '97, about the same time as Dirk's first paper on the Hopf algebra of renormalization, if I am not mistaken, or maybe a little earlier. Take a multivariate moment, φ evaluated on the product of a_1 to a_n, elements of the algebra A, and try to express it in terms of cumulants — new objects, then called free cumulants. One finds essentially the same statement as before: a sum, but this time not over all set partitions, only over the so-called non-crossing partitions.
Then I have the same phenomenon: I take each block and multiply the cumulants evaluated on the entries specified by the block. This is one of the major insights that opened a whole new world of combinatorics — non-crossing partitions and so on. Non-crossing means, in a very pictorial way, that certain partitions have to be avoided: if you draw the blocks of a partition of 1, 2, 3, 4 as arcs, then the partition whose arcs cross should not be considered; over all set partitions it would be included, but here it is not, and of course many more partitions become unavailable in this picture. This is what defines free cumulants, and free cumulants are to free probability — an example of non-commutative probability — what the classical cumulants are to classical probability: mixed cumulants vanish if the underlying variables are free. Then, I think in the same year, Speicher and Woroudi also considered another example, the so-called Boolean moment-cumulant relations. The moments are on the left-hand side; the new quantities are called Boolean cumulants; and what changes is again the set of partitions: now it is the interval partitions, which are central to Boolean independence, or Boolean probability if one wants. This is how I would like to motivate the non-commutative perspective on moment-cumulant relations in probability. So what do we have to do to adapt what I did in the classical case — Hopf algebra, shuffle algebra and so on — to obtain the moment-cumulant relations and then the corresponding Wick polynomials in this setting? I have to deal with the question of avoiding certain set partitions that came naturally out of the Hopf-algebraic computation I did earlier. As an example, remember the order-four moment-cumulant relation I introduced: there was a 3 there, and now it is only a 2, because one particular set partition is no longer available, so I have only these two at order four. As another remark — interesting, I think, although it will not appear so much in connection with the Wick polynomials — there is also the notion of monotone moment-cumulant relations, for monotone independence: there I still sum over the non-crossing partitions of order n, but with a coefficient which is the inverse of a tree factorial — also something where Dirk made an important contribution early on, in this famous Chen paper. The trees here represent the nesting of my non-crossing partition; to a nested partition like this one you can associate a tree. Okay, so now let me try to do this directly. When you look at the question of how the corresponding generating series are related — here I have to say what I mean by generating series, and be a little more general: I have n non-commuting letters and n elements of my algebra A, and I collect the corresponding multivariate moments into this moment generating series; it is not an exponential generating series now, just words with those coefficients. I can do the same for the free and the Boolean cumulants. But observe that the moment generating series starts with 1, as was also true in the classical case, whereas the free cumulant and Boolean cumulant generating series start at n = 1.
Now, this relation of Speicher, and of Speicher and Woroudi, can actually be derived from a relation between the generating series. In the free case this is a particular fixed-point-type relation between the two series — the moment series on one side, the cumulant series on the other — but you see that I have to substitute a different set of letters: the letter z_i becomes a letter w_i into which the whole moment generating series is multiplied. A priori this is a very complicated operation, but at the end it delivers the moment-cumulant relation we have observed. In the Boolean case it is much simpler: the moment generating series is 1 plus the product of the Boolean cumulant generating series times, again, the moment series. An interesting remark is the following: I think Cvitanović, around 1980, in a short paper on planar field theory — planar quantum field theory — observed that the relation between the generating functions of the full and the connected planar Green's functions (I forgot "connected" here, sorry) has exactly this form. Here are the full Green's functions, with this dotted blob; the relation simply says that if I want to organize my planar Feynman graphs into connected components, I always have to avoid crossings of the legs, so the components can only go into these little spaces here. When you look at this, you can in fact already see the Hopf-algebraic structure present, if you believe me; I will try to show it. We need something a little larger to deal with this phenomenon, and the way we do this is the double tensor algebra: I take the tensor algebra with unit over the augmented tensor algebra over my non-commutative probability space.
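These fixed-point relations for the generating series can be checked on truncated power series in one variable. A sketch, my own illustration with ad hoc names: `free_moments` iterates M(z) = 1 + Σ_k κ_k z^k M(z)^k, and `boolean_moments` solves M(z) = 1 + η(z) M(z) coefficient by coefficient.

```python
def mul(p, q, n):
    """Product of two truncated power series (coefficient lists), mod z^{n+1}."""
    r = [0] * (n + 1)
    for i, a in enumerate(p[: n + 1]):
        for j, b in enumerate(q[: n + 1 - i]):
            r[i + j] += a * b
    return r

def free_moments(kappa, n):
    """Solve M(z) = 1 + sum_k kappa[k] z^k M(z)^k by iteration, mod z^{n+1}.
    After t iterations the coefficients up to z^t are exact."""
    M = [1] + [0] * n
    for _ in range(n):
        new = [1] + [0] * n
        Mk = [1] + [0] * n  # running power M^k
        for k in range(1, n + 1):
            Mk = mul(Mk, M, n)
            for j in range(n + 1 - k):
                new[k + j] += kappa[k] * Mk[j]
        M = new
    return M

def boolean_moments(b, n):
    """Solve M = 1 + eta * M, i.e. m_j = sum_{k=1..j} b[k] * m_{j-k}."""
    m = [1] + [0] * n
    for j in range(1, n + 1):
        m[j] = sum(b[k] * m[j - k] for k in range(1, j + 1))
    return m
```

With the same numerical cumulants as before, the outputs agree with the non-crossing and interval partition sums, e.g. m₄ = κ₄ + 4κ₁κ₃ + 2κ₂² + 6κ₁²κ₂ + κ₁⁴ in the free case.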
(On the slide a subscript is missing; the extracted subsets should read S_1 up to S_k, but the idea is the same.) What we have done here is essentially unshuffling, but, so to say, remembering where the sequences of letters that go to the left-hand side were extracted. I follow essentially the same prescription: take, for instance, this word with six letters and the set S = {2, 4, 5}; extracting these letters gives the subword a_2 a_4 a_5, which is what I have here — I cannot use my pointer, it disappears when I click — and on the other hand, once I have extracted those letters and subwords, I am left with this expression: a_1, then a bar, then a_3, then a bar, then a_6. You can also represent it pictorially: I have extracted this letter together with this sequence of letters. This is the coproduct I define on the double tensor algebra, and in fact you can show that it yields a non-commutative, non-cocommutative connected graded bialgebra, and therefore a Hopf algebra. More interesting is the fact that I can split this coproduct, and the splitting is very simple: I either demand that the element 1 belongs to the extracted set S defining the left word, or I demand explicitly that it does not. This is the splitting I prescribe, and the sum of the two parts is obviously Δ. Now this has some implications, as follows. If I again go to the dual side and define my convolution product — stated here in terms of this new coproduct — and I also transfer the splitting, then I get two extra binary operations, a left and a right operation. So in principle this associative, non-commutative product splits into the sum of those two binary operations, and it turns out that this left and right operation together define what is called a non-commutative shuffle algebra.
These are the shuffle relations; there are three of them. This is also what is called a dendriform algebra, after Loday. As a remark, we had this structure available already in the commutative setting, but there it is a little less interesting: there is only one operation, because of a particular relation between the two half-shuffle coproducts, and so I could already have started the introduction with it. Now, with this extra layer of structure available, we can do the same as before. First we have to extend our map φ one step further, onto the double tensor algebra; this is what I have done here. I have to say how φ behaves on this extra tensor product, and I want it to be a character, so it should respect this algebraic structure. What we find is that, starting from such a group-like element φ, there are three different infinitesimal characters such that I can write φ as a half-shuffle exponential — a left or a right half-shuffle fixed-point equation; it would take a little too long to discuss why I call these exponentials — and of course I can also define the infinitesimal character corresponding to the ordinary exponential, taken with respect to the associative product. There are a couple of remarks I would say are interesting. On the one hand, everything can also be expressed in terms of the proper exponentials; and if I think about the inverses, then inverting those left and right half-shuffle exponentials essentially means flipping from left to right and putting a minus sign on the argument. These are non-trivial relations on the level of the shuffle algebra that are quite useful if you really want to do computations here. And what you can show is that, with these three infinitesimal characters, these fixed-point equations and the ordinary exponential, we can derive the moment-cumulant relations simply as an exercise in evaluating such a series of iterated left or right half-shuffles on the word w. So this is a result that comes out of a computation with those half-shuffle operations: the free case goes with the left half-shuffle, and the Boolean case with the right half-shuffle — sorry, this way around — and, as a remark, the ordinary exponential gives what are called the monotone moment-cumulant relations, but I will not go into this now; I think I should speed up a little. What I wanted to say is: let us continue this program and do the same as in the classical case, looking at the operation of convolving the identity with the inverse of the map φ — and as I said, φ now is the character that goes all the way back to the original non-commutative probability space we were considering. So here I define again a map on the double tensor algebra, and I can invert it simply by convolving with φ from the right: the identity is then written as the product of what I call the free Wick map — or the corresponding free Wick polynomials — convolved with φ. Now a couple of observations. W is multiplicative, so it respects the product of the double tensor algebra, as I have indicated here. It is also centered, which means again that if I take the expectation — if I evaluate my map φ on such a polynomial coming out of this computation — then I get essentially zero on any non-trivial word. And, as already earlier in the classical context, I define this ∂_a simply in terms of an infinitesimal character that is one only if it sees the single letter a, and essentially zero on any other words or bar-products of words, and I define ∂_a accordingly.
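The half-shuffle relations can be made tangible in the standard shuffle algebra of words — my own illustration of the axioms, not the talk's dual construction: for nonempty words u and v, u ≺ v keeps the first letter of u in front, u ≻ v keeps the first letter of v in front, and ≺ + ≻ is the shuffle.

```python
from collections import Counter

def add(p, q):
    """Sum of two linear combinations of words (Counters)."""
    s = Counter(p)
    s.update(q)
    return s

def shuffle(u, v):
    """Shuffle product of two words (tuples of letters)."""
    if not u or not v:
        return Counter({u + v: 1})
    return add(left(u, v), right(u, v))

def left(u, v):
    """u < v : first letter taken from u."""
    return Counter({(u[0],) + w: c for w, c in shuffle(u[1:], v).items()})

def right(u, v):
    """u > v : first letter taken from v."""
    return Counter({(v[0],) + w: c for w, c in shuffle(u, v[1:]).items()})

def lift(op, P, Q):
    """Bilinear extension of a word-level product to linear combinations."""
    out = Counter()
    for u, cu in P.items():
        for v, cv in Q.items():
            for w, c in op(u, v).items():
                out[w] += cu * cv * c
    return out

def W(word):
    """A single word as a linear combination."""
    return Counter({tuple(word): 1})
```

On single letters a, b, c one can verify the three shuffle (dendriform) relations: (a ≺ b) ≺ c = a ≺ (b ⧢ c), (a ≻ b) ≺ c = a ≻ (b ≺ c), and a ≻ (b ≻ c) = (a ⧢ b) ≻ c.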
Again I have my coproduct: I decompose my word, or my element of the double tensor algebra, according to this prescription, and it essentially means I look for the letter a; where a has been extracted it is eliminated, and in its place I put a bar. That is the rule, and that is what I have tried to show with these three examples. If I extract this letter a from the word w_1 a w_2 — whatever prefix w_1, whatever suffix w_2 — then I put a bar where the a used to be; and if I take, for instance, a word with two a's, then I get a sum of two terms. And there is the property of a derivation: applying it to what I call a Wick polynomial is the same as first applying the derivation and then the Wick map; that is what I try to say here. And here are the first three free Wick polynomials. The terminology is essentially due to Anshelevich, who introduced these polynomials; his way of deriving them is rather different, but the resulting polynomials are the same, and we call them free Wick polynomials. Here are the places where they differ, at order three for instance, from the classical Wick polynomials: here you would expect the inverse of φ applied to the subword a_1 a_3, but because of the particularity of the bar that comes in through the coproduct, this is not the case; and similarly, instead of the 6 of the classical case we only have a 5 here. All of this is due to the particular form of the coproduct, which makes sure that we essentially work within the context of non-crossing partitions, while seeing them purely inside the word algebra. And, as I said, you can also compute the inverse; inverting essentially means flipping: if φ is given as a half-shuffle exponential of the free-cumulant character, then for the inverse I flip the half-shuffle exponential and put the minus sign. So if I go back to the definition of this W — this was the identity convolved with φ⁻¹ — and do this computation, then I find the expression here, which is clear from the construction of the coproduct: you extract the word corresponding to S — this is the tensorial part — and then you have those coefficients, which means evaluating this character on the complementary subsequences. Now if you work this out, and really look at what it means to compute with those right half-shuffles of the free cumulants with a minus sign, then you recover a theorem of Anshelevich that gives an explicit expression of the free Wick polynomials in terms of free cumulants; it follows quite straightforwardly from this computation. So now we have these polynomials, following the same program, from an algebraic point of view, as in the classical case. But one could ask: why not look at the same construction in the Boolean setting? That is what I have tried to do here, because if I just redo this computation expressing φ in terms of Boolean cumulants, not much changes: I am merely rewriting the free Wick polynomials in terms of the Boolean cumulants, which is interesting but maybe not quite what we want. What you can show, however, is that the free Wick polynomials can be expressed in such a way that the Boolean cumulants appear like this. The proof is computational: you use a certain recursion satisfied by the free Wick polynomials, and you also need — and this hints at something quite interesting — relations among the cumulants themselves: I can express the free cumulants in terms of the Boolean cumulants through this half-shuffle adjoint action, if you want. So there is a nice Lie-theoretic perspective on the level of these infinitesimal characters; they are Lie, or rather pre-Lie, elements. And this together then allows me
express it my three big polynomials in this form here and it's this part inside here I call it a w prime so it's the identity minus the identity left half shuffle beta that define in fact the Boolean big polynomials okay so it's I call it like this I call it the Boolean big map this gives the the Boolean big polynomial so again I'm sure that you told us how they look like here we deduce them from this from this approach in shuffle hopper algebra and what you can show is there again what's called centered so if I evaluate phi on the image of such a Boolean big map so Boolean big polynomial then this is essentially zero if evaluated on which work is polynomials are related so here I'm saying okay the Boolean big polynomials can be expressed in terms of the three big polynomials by this operation the left half shuffle operation with phi and so what we what we see here is in fact so I get my three big polynomials by taking the identity and operating with the actual convolution product from the right with the inverse of my character phi then I get the Boolean big polynomials by acting on my three big polynomials in terms of my character phi here's the inverse here's phi but and here's also this left half shuffle versus the full shuffle product and so all of this actually can be understood that these big polynomials are result of the right action of the group of characters on the space of linear maps over the tensor algebra right this is this is via the convolution product in this left half shuffle and just as a remark if I would instead now operate on my Boolean big polynomials with another character or state it's the inverse so somehow I imagine I have another state available then I get what what is generally called conditionally free big polynomials so there's there's a nice way of linking those different big polynomials in non-competitive probability something that's not really obvious here at least it's not clear to me is how to to deal with the monotone the 
As for the monotone side of the spectrum: okay, this is the final remark. I believe I can again also define new Wick products in terms of those Wick maps, and then I have this particular property. Okay, that is what I wanted to say. Thank you, and a belated happy birthday.

Thank you, thank you very much, Kurusch. Other questions for Kurusch? Either put your hand up, or speak if you can speak already.

Just one quick question. When you talked about the inverse of the left or right half-shuffle exponential, where the left and right swap and you have to take the negative of the argument, I think it was four or five slides before: what happens if I want to write the inverse as the same kind of exponential of something? Basically you are saying that it is no longer trivial to find the inverse by just negating it?

Yes, I mean, if I really want to say what the inverse of this is with respect to the convolution product, then in fact I find this. In some sense it is nice, but it is not as simple as just putting a minus sign in the argument of the exponential.

Okay, that is just a fact of life. Thank you.

We have a question from David Broadhurst.

Yes, my question is related to work by Predrag Cvitanović and Peter Scharbach back in the 80s, which they referred to as planar field theory. The distinctive message that I remember from that work is that they said: whereas you have exponentials in a normal field theory, in a planar field theory you have continued fractions. When I looked at your work with Frederic I did not see any reference to continued fractions; did they somehow get lost?

Well, at least in the way we look at it, or work with it, yes, somehow. You can solve this fixed-point equation in terms of continued fractions, but I do not know how this would enter the programme that we follow in this work. It is also true that I have not really thought much about it.
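To illustrate the continued-fraction remark (a toy example of my own, not taken from the talk or the paper it discusses): for the semicircle distribution the moment series satisfies the fixed-point equation M(z) = 1/(1 - z^2 M(z)), and iterating this equation builds up precisely a continued fraction, one level per iteration. A sketch with truncated power series, all names my own:

```python
N = 10  # truncation order: all power series are taken mod z^N

def mul(a, b):
    """Product of truncated power series (coefficient lists of length N)."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

def inv(a):
    """Inverse of a truncated series with constant term 1:
    c_0 = 1,  c_n = -sum_{k=1}^{n} a_k c_{n-k}."""
    c = [0] * N
    c[0] = 1
    for n in range(1, N):
        c[n] = -sum(a[k] * c[n - k] for k in range(1, n + 1))
    return c

# Fixed-point equation M(z) = 1 / (1 - z^2 M(z)).  Each iteration adds one
# level to the continued fraction 1/(1 - z^2/(1 - z^2/(1 - ...))).
M = [1] + [0] * (N - 1)
for _ in range(N):
    z2M = [0, 0] + M[:N - 2]              # z^2 * M(z)
    M = inv([1] + [-x for x in z2M[1:]])  # 1 / (1 - z^2 M)

# Sanity check: M indeed solves M * (1 - z^2 M) = 1 mod z^N.
assert mul(M, [1, 0] + [-x for x in M[:N - 2]]) == [1] + [0] * (N - 1)

print(M)  # [1, 0, 1, 0, 2, 0, 5, 0, 14, 0] -- Catalan numbers
```

Each pass through the loop deepens the continued fraction by one level, and the coefficients stabilize to the Catalan numbers, the even moments of the semicircle law.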
It is something that we have simply not picked up; that is true. Thank you.

Thank you. We have one more question, by Yvain Bruned. Yes, speak, Yvain.

Hi. You mentioned that you do not have any formula for the monotone version, just for the Boolean one. Have you made progress there?

No. I mean, I have no good way of thinking about this. [brief connection trouble] In some sense, the fact that this is, so to say, the perfect description of phi in terms of the classical exponential, if you want, somehow makes it very hard to compute with. Some of these fixed-point-type equations, the left and right half-shuffle exponentials that result from this, are very nice and useful for doing computations. But the question would be how to extract from this, so, sorry, I can of course write those free Wick polynomials in this form, so I can bring in the monotone cumulants in some sense, but I do not see how it allows one to extract what would then be the monotone Wick polynomials, so to say. I am not sure whether people have looked at this; maybe Hasebe has, but I am not so sure; it is not so clean.

Thank you very much. I see no further hands at the moment. So, one thing that some people try at conferences is to unmute everyone, if they can, for a second, and then you can do a little clap.