Okay, welcome everyone to the Schubert seminar yet again. Today we have Allen Knutson talking, from Cornell, and the title of his talk is "The commutant of divided difference operators and the Klyachko genus."

Thanks for having me. So thanks for inviting me, Rebecca — one of my co-authors on this; the other is Christian Gaetz. So, I'm going to tell two independent stories and then tie them together. The first story will be about these differential operators that behave nicely on Schubert polynomials, as were discovered in the last few years. The story as it began with them was, as I understand it, wondering: what's the derivative of a Schubert polynomial? Can you take derivatives of Schubert polynomials and get anything reasonable? So I'll remind you about Schubert polynomials and then start taking derivatives of them.

So, let me start. I don't actually think of Schubert polynomials as fundamental, even though that was the viewpoint of the people who were working on these differential operators. So let me define: given a Cartan matrix — I guess A is what I'm going to be using for type A, so given a Cartan matrix C — we get a ring of Schubert symbols, and I'm just going to call it H(C). As a ring, it's going to be isomorphic to the cohomology ring of the associated flag variety. But, as is very common, we're not so interested in rings as we are in rings with bases, and so it'll have this basis corresponding to the Weyl group of that Cartan matrix. And what is C usually going to be? Usually it'll be of type A_{n-1}, so that the Weyl group is the symmetric group S_n; or maybe it'll be of type A_ℕ, so I'm thinking about this type A diagram going on forever in one direction; or sometimes it'll be A_ℤ, so I'm thinking about a type A diagram that goes forever in both directions.
So those are the principal C's that are relevant in today's talk. Anyway, I've got this basis for this mysterious ring, and the basis elements correspond, say, to permutations — or maybe to permutations of the positive integers that only move finitely many of them, or permutations of all the integers that only move finitely many of them. The basis elements I'll call S_π, and these are going to be the Schubert symbols. The multiplication is determined by the Chevalley–Monk formula. What the formula is, is not worth my writing down. It tells you how to multiply one of these guys by another — S_π by S_ρ — but specifically when the thing you're multiplying by is a generator: so ρ is a simple reflection and S_ρ is the Schubert symbol associated to that simple reflection. The Chevalley–Monk formula tells you how to multiply two of these, and then expand in the basis of Schubert symbols again. It's always going to be a positive expansion, and in the type A cases the coefficients of that positive expansion will be zero or one. And it turns out, once you say how that multiplication works, these guys generate the ring, and therefore, if you know how to multiply by a generator, you know by associativity and induction how to multiply by any of the basis elements. So that's the ring with basis.

And there's a particular case of this that Lascoux and Schützenberger considered, which is what I said here, A_ℕ — or maybe it's the positive naturals, so that zero doesn't move. That's not a mathematically interesting point; it's just that traditionally, when you think about S_n, it moves the numbers 1 through n, not 0 through n-1. So this group is going to move positive integers, and the integers from zero down to negative infinity it won't have — or if it has them, it doesn't move them.
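As a concrete illustration of the multiplication rule just described, here is a minimal Python sketch of Monk's rule — the type A case of the Chevalley–Monk formula, in S_n — with permutations written as tuples in one-line notation. The function names are mine, not from the talk.

```python
def length(w):
    """Coxeter length of a permutation = number of inversions."""
    return sum(1 for i in range(len(w)) for j in range(i + 1, len(w)) if w[i] > w[j])

def monk(w, r):
    """Monk's rule in S_n: expand S_w * S_{s_r} (r is 1-based) as the list of
    permutations w * t_{ab} with a <= r < b whose length went up by exactly one.
    As mentioned in the talk, every coefficient in type A is 0 or 1."""
    n = len(w)
    terms = []
    for a in range(1, r + 1):
        for b in range(r + 1, n + 1):
            v = list(w)
            v[a - 1], v[b - 1] = v[b - 1], v[a - 1]  # right-multiply by the transposition t_{ab}
            if length(tuple(v)) == length(w) + 1:
                terms.append(tuple(v))
    return terms
```

For example, `monk((1, 3, 2), 1)` returns `[(3, 1, 2), (2, 3, 1)]`, matching x_1·(x_1 + x_2) = S_{312} + S_{231} on the polynomial side. (Caveat: if w is close to the longest element of S_n, some terms of the true product live in S_{n+1}; embed w in a bigger symmetric group by appending fixed points first.)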
So Lascoux and Schützenberger said: there's an isomorphism of this ring — and let me write it as a map going in this direction — with the polynomial ring in infinitely many variables. And it's better than that: not just a ring-theoretic statement, but these Schubert symbols map to particular polynomials that they introduced, the Schubert polynomials. I don't want to introduce Schubert polynomials again, because I think at least two thirds of the talks in the Schubert seminar do so, and also because for me the polynomials are not fundamental. That I've got this ring with a certain basis — that's really the idea. In fact, I'm going to be thinking about homomorphisms from this ring to other rings, not just to polynomials. And I went scouting a bit to find a name in mathematics for this concept — that you have a complicated ring, or ring with basis, and a homomorphism from it to somewhere else, and you use that ring homomorphism to study it — and the name I ran into is "genus." So, like the Hirzebruch genus or the Witten genus. Traditionally a genus comes from some cobordism ring of a point, which is totally unrelated, as far as I understand, to anything that I'm doing, but good enough for me: there's the notion of "you've got your complicated ring, and a homomorphism you use to study it." And so I'm going to call this one the Lascoux–Schützenberger genus.

Okay. Now I'm going to need some operators on here and on there, which will motivate the differential operators in a second. Up on the ring with basis, let me define ∂_α of S_π. There will be cases. It's about trying to take π and multiply it on the right by r_α. So what's going on is: I've got a permutation, and I have a simple reflection — in these type A cases it's exchanging i and i+1. And I ask: if I multiply π by r_α, does the permutation get shorter?
That is: were the things in positions i and i+1 out of order, so that now they end up in order? That's the case where the length of π r_α is equal to the length of π minus one, and in that case I want to get S_{π r_α}. And if not, I want to get zero.

Okay, so these operators I'm telling you about were actually studied first over here, on the polynomial side. Now, if you know about Schubert polynomials and you know how this isomorphism works, you could carry this rule over there and discover how these ∂_α's work on that side. That's totally ahistorical, but you could do it. And there they're acting by the divided difference operators. So on the polynomial side it's kind of complicated: if you apply one of these operators to a monomial, you get a big sum of monomials. Whereas if you apply one of these operators to a Schubert symbol, you get a Schubert symbol or zero — they act easily here. And the way that Lascoux and Schützenberger define the Schubert polynomials is basically: they say they want this, and they set it up so that their polynomials obey this recursion using the divided difference operators.

All right. So, here was the observation. Let me try to get the order of the authors right: so it's Hamaker — and, you know, Weigandt is obviously going to be last — and David Speyer and Oliver Pechenik. There we go. They observed that there's this operator ∇, which is the sum of all the derivatives ∂/∂x_i. So they make this operator that acts on the polynomial side, and they observe two nice things. One is that it commutes with the divided difference operators. So that's one nice thing. And the other is that when you apply it to a Schubert symbol, you get a positive combination of Schubert symbols.
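The rule just stated — S_{π r_α} if the permutation gets shorter, zero otherwise — is easy to sketch in Python. Here permutations are tuples in one-line notation, zero is represented by `None`, and `alpha` is 1-based; these conventions are mine, not the speaker's.

```python
def length(w):
    """Coxeter length = number of inversions of the permutation w."""
    return sum(1 for i in range(len(w)) for j in range(i + 1, len(w)) if w[i] > w[j])

def partial(alpha, w):
    """∂_alpha on Schubert symbols: S_w -> S_{w r_alpha} if that is shorter,
    and 0 (here: None) otherwise.  Right-multiplying by the simple reflection
    r_alpha swaps the entries in positions alpha and alpha+1, so the length
    drops exactly when w(alpha) > w(alpha+1)."""
    v = list(w)
    v[alpha - 1], v[alpha] = v[alpha], v[alpha - 1]
    v = tuple(v)
    return v if length(v) == length(w) - 1 else None
```

So `partial(1, (2, 1, 3))` gives `(1, 2, 3)` while `partial(1, (1, 2, 3))` gives `None` — on the symbol side the operator really is this simple, in contrast to the big sums of monomials the divided differences produce on the polynomial side.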
So rather than writing down the formula right now, I'm just going to write that this is positive. Okay. So I was interested in the question: from here, what are all the operators that do this thing — that commute with the partials? And I want to give you a very simple answer to that, and then see where the differential operators sit inside it. This guy ∇, of course, is a first-order differential operator, so if you apply it to a product, it satisfies the Leibniz rule. So you could ask, and I will: one, what are all the operators that commute with the partials? And then, once you know those: two, what are the operators of degree negative one that satisfy this Leibniz rule — is ∇ essentially the only one, or are there lots of them?

So, the answer about what the commutant of the partial operators is, is maybe a little bit disappointing — a little bit easy, once you know it. These guys are acting on the right. Now, there's a simpler version of this recursion, where you just say: let's just act like that, dropping the length condition. And if you do that, then what you're really thinking about is the group algebra of your Weyl group acting on the ring of finitely supported functions on the Weyl group — it's just the Cayley representation of your Weyl group. And if you think about a group acting on itself by left multiplication, the only things that commute with that are a second copy of the group acting on the other side, by right multiplication. So here these guys are acting by right multiplication, and so I'm going to introduce ones that act by left multiplication. I'm going to define ∂^L_α of S_π —
— and it's going to be the same formula, except it'll be on the other side: S_{r_α π}, where the length of r_α π is less than the length of π, and zero if it's more.

Okay. So my theorem is that that's everything. If D is an operator acting on the ring of Schubert symbols, then every D that commutes with all of the partials is uniquely of the form: a sum over all ρ of ∂^L_ρ times some coefficient D_ρ — capital D_ρ, because it's a big D. Now, this could be an infinite sum. So you might worry, when you have an infinite sum and you want to apply it to some S_π, why do you get a finite answer? And if you think about it, you'll find out it's okay: only finitely many of the terms here, acting on this S_π, give you something nonzero. So you do get a finite answer. The other thing is, I only defined ∂^L_α for you when α is simple — or ∂^L_{r_α}, I should really say. I haven't defined ∂^L_ρ for general ρ; when ρ is a simple reflection r_α, I defined ∂^L_{r_α} in this way — it's like the partials, but on the left. But for the same reason that there is a notion of ∂_ρ for every ρ — which is to say, if you pick a reduced word for ρ and apply the corresponding ∂_α's one after another, you get a well-defined thing, and it doesn't matter what reduced word you use — exactly the same thing happens here. These give well-defined operators ∂^L_ρ, based on any reduced word for ρ. And so those are the things that commute with the partials: everything that commutes is of that form, and the things of that form are all different.
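The easy half of this theorem — that the left operators really do commute with the right ones — can be checked by brute force on a small symmetric group. A sketch, again with permutations as tuples in one-line notation and zero represented by `None`; since each operator sends a Schubert symbol to a symbol or to zero, commutation can be checked basis element by basis element:

```python
from itertools import permutations

def length(w):
    return sum(1 for i in range(len(w)) for j in range(i + 1, len(w)) if w[i] > w[j])

def right_op(alpha, w):
    """∂_alpha: S_w -> S_{w r_alpha} if shorter, else None (None passes through).
    w r_alpha swaps the entries in positions alpha and alpha+1."""
    if w is None:
        return None
    v = list(w)
    v[alpha - 1], v[alpha] = v[alpha], v[alpha - 1]
    v = tuple(v)
    return v if length(v) < length(w) else None

def left_op(alpha, w):
    """The left analogue: S_w -> S_{r_alpha w} if shorter, else None.
    r_alpha w swaps the *values* alpha and alpha+1 wherever they sit."""
    if w is None:
        return None
    v = tuple(alpha + 1 if x == alpha else alpha if x == alpha + 1 else x for x in w)
    return v if length(v) < length(w) else None

def commute_on(n):
    """Check that every left operator commutes with every right one on S_n."""
    return all(left_op(b, right_op(a, w)) == right_op(a, left_op(b, w))
               for w in permutations(range(1, n + 1))
               for a in range(1, n) for b in range(1, n))
```

`commute_on(4)` comes out `True` (and `commute_on(5)` too, in a few seconds). Of course this is only the easy containment; the content of the theorem is that nothing else commutes.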
What this is really about is that I have an algebra acting on a module. The algebra is defined by: if you multiply π and ρ in the algebra, you should get zero if the lengths don't add up — when the length of π times ρ isn't the length of π plus the length of ρ — and you should get πρ if they do add up. So that's called the nilHecke algebra. And what we have is these two commuting actions of the nilHecke algebra on here. One admission: the nilHecke algebra is usually thought of as being finite sums of these, instead of infinite sums like I kind of want. And it's not a big deal: if you allow infinite sums there, it's still got a multiplication. I want the infinite sums so I can write down a formula for ∇, which is that ∇ is the sum over i greater than zero of i times ∂^L_{r_i}. So that's a shorthand way, or an efficient way, of expressing the formula that HPSW give for their ∇.

So, so far I've told you what the things are that commute with the partials — and the answer is the ∂^L's — and I've told you where this guy sits inside that commutant, the guy that not only commutes with the partials but gives you a derivation, i.e. satisfies the Leibniz rule. So, let's try to answer the question: what are all of the Leibniz differentials that you can make out of the ∂^L's? I'm going to limit myself to the following case: let's take α among the simple roots, and I'm going to limit to only adding up the ∂^L_{r_α}'s. So I'm going to think about operators I can make from the ∂^L's, but they should still be degree negative one operators, like ∇ is. And of course there is a more general question, if you have operators of some degree negative k, or maybe operators that aren't even homogeneous.
And I didn't think about that question — we didn't think about that question; I don't have anything interesting to say there. So I'm going to take that sum, put some coefficients d_α in front, and ask: when is this a derivation? So this is going to be some condition on the d's, and ∇ is the case d_i = i. The ∇ the paper wrote down was in this situation: it was acting on the ring of Schubert symbols for A of the positive naturals. So I wanted to study this more generally and say: on some Dynkin diagram, I take some linear combination like this. And I'm just going to give a partial answer, because I don't have a full answer, but it's certainly a necessary condition. So — partial answer. And let me take C simply laced for this. It's not needed — I could state it for non-simply-laced too — but this is going to be easier.

It's necessary that each d_α is one half the sum of the d's at the neighbors of α. And I'm only saying it's necessary: I don't know that if you take d_α's that satisfy this, you will indeed get the derivation property. Again, the derivation property is that it satisfies this Leibniz rule over here. So I don't know that if you do this, then it will satisfy that; I suspect it's true, but I didn't really care.

How do you prove something like this, though? It's very easy: you apply this guy D to S_{r_α} squared. So if I take both p and q from here to be S_{r_α}, then I know what S_{r_α} times S_{r_α} should be, because of the Chevalley–Monk rule, and I plug it into there, and it gives me a condition on D — and the condition turns out to be this one. Okay. So what would that look like in the three types I'm considering: A_{n-1}, A of the positive naturals, and A of all the integers?
So, A_{n-1} — let's try to do this with, maybe, three nodes. So I want to have a number at each node; I've got a, b, c. I want this number a to be half of that number b — okay, so a is b over two. And I want this number c to be half of that number b also, so c is b over two. And I want this middle number b to be half of the sum of those two — but it's equal to the sum of those two. So I find out everything is actually zero. So this was a really unsatisfying question in this case, and more generally in finite type: there will be no such derivations.

All right, what about if this goes on forever? So there's going to be some vector space of these, right? If I have solutions of these linear equations — they're linear equations, they're defining some vector space. So I'm going to take the first number here and scale it to be one — because if it were zero, well, the next guy is supposed to be twice it, so that would also be zero, and we'd keep going: all zero. So I scale this to be one. Then one is supposed to be half of the next, so that's two. And two is supposed to be half the sum of its two neighbors, so the next is three, and so on. Okay — and we're getting ∇. So this is the only option. Over here, in finite type, there were no options — a zero-dimensional space. Here there's a one-dimensional space of solutions to these equations: multiples of ∇. So those are the only derivations — the only operators of degree negative one that satisfy Leibniz and commute with all the partials. You know, on the Lascoux–Schützenberger ring of polynomials in infinitely many variables, the only options are multiples of the one that these four authors found.

And now I go to A_ℤ. Okay, so these nodes are connected, and they go off forever that way and that way. So, let's start off with — well, I'll just say: there's another solution. It's easy to figure out that there's another obvious solution now, which is, you have ones everywhere.
And then every number is half the sum of its neighbors. And what's pretty quick to figure out is that actually the space of derivations here is two-dimensional. So: zero-dimensional here, one-dimensional here, two-dimensional here, and these two guys are a basis. So there's a new option. This operator was discovered by Nadeau, and he rightly points out that it's only available in this back-stable situation — so that's what's going on here; we're thinking about the back-stable case. And I'd like to tell you a bit more about what he did in the back-stable case, in this language of genera — which is the plural of genus. So, what's going on in the back-stable case? So — back-stable functions —

Allen, might this be a good time for a five-minute break?

Oh, it'd be a great time.

Okay, so let's do that. We'll just pause for five minutes, and people can also stay on and ask questions if they like.
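The three board computations just sketched — finite path, half-infinite path, doubly infinite path — can be redone mechanically. Here is a small Python sketch in my own framing, not anything from the talk: propagate the condition d_α = (sum of neighbors)/2 along a path, tracking each d as a multiple of the first one.

```python
def propagate(first, k):
    """On a path whose first node has only one neighbor, the endpoint equation
    d_1 = d_2 / 2 and the interior equations d_i = (d_{i-1} + d_{i+1}) / 2
    force d_{i+1} = 2*d_i - d_{i-1}.  Starting from d_1 = first, return
    [d_1, ..., d_k]."""
    d = [first, 2 * first]
    while len(d) < k:
        d.append(2 * d[-1] - d[-2])
    return d[:k]

def finite_type_solution_exists(n):
    """Finite type A_{n-1}, i.e. a path with n-1 nodes (n >= 3): the last node
    also has only one neighbor, so we additionally need d_{n-1} = d_{n-2} / 2.
    Try scaling the first coordinate to 1 and check that extra equation."""
    d = propagate(1, n - 1)
    return 2 * d[-1] == d[-2]

# Half-infinite path (A of the naturals): scaling d_1 = 1 gives d_i = i, the
# coefficients of nabla -- a one-dimensional solution space.
# Doubly infinite path (A_Z): there are only interior equations, so d_i = 1
# for all i is a second, independent solution, since 1 == (1 + 1) / 2.
```

Running this, `propagate(1, 9)` gives `[1, 2, ..., 9]`, recovering ∇, while `finite_type_solution_exists(n)` is `False` for every n I tried — matching the zero-, one-, and two-dimensional answers above.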