Yes, you're right. Okay, I also wanted to make a general announcement while we have the recording going: if anyone wants to make a nomination for the early career short talks, now is the time to do that. I don't think we have a hard deadline for nominations, but the time is approaching and decisions are being made soon, so please let us know if you'd like to nominate any students. Okay, so let's continue on to hear about back stable functions. Right, so back stable functions live inside power series in bi-infinitely many variables: I've got a variable x_i for every integer i. Now, when people say power series, they're used to things like 1 + x + x^2 + ..., and that's not what I want; I want things to have finite degree. So it's more like the ring of symmetric functions — and people incorrectly say the ring of symmetric functions is the power series invariant under swapping the variables, but you probably don't want that; certainly you wouldn't want that if you want the Schur functions to be a basis of that thing. So first, I'm going to insist that any particular back stable function has finite degree. (Honestly it's probably homogeneous, but if I want this set to be closed under addition, I'd better allow finite degree instead of insisting on homogeneous.) Secondly, it uses only those x_i where i is less than some bound. The obvious example is x_5 + x_4 + x_3 + x_2 + ..., going on forever toward the negative variables, which doesn't use anything past x_5. So there should exist some m such that it only uses the variables with index less than m. And thirdly, it should be symmetric in the x_i for i less than some other bound m'.
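To summarize the three conditions in symbols — this is my own notation, not necessarily the speaker's:

```latex
% A power series f in the variables x_i, i in Z, is back stable if:
\begin{aligned}
&(1)\quad f \text{ has finite degree (a finite sum of homogeneous pieces);}\\
&(2)\quad \exists\, m:\ f \text{ involves only the } x_i \text{ with } i < m;\\
&(3)\quad \exists\, m':\ f \text{ is symmetric in the } x_i \text{ with } i < m'.
\end{aligned}
```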
So it can mess around with finitely many variables, but once you get to sufficiently negative variables it becomes a symmetric function. You could say it's a sum of symmetric functions in the negative variables times polynomials in all the variables. And so we get two things. One is this ring of back stable functions — these power series as a subring of the full power series ring. And it has a basis, which I'll call bS_pi, where pi runs over S_Z: the permutations from Z to Z that move only finitely many things. For each one of those there's a back stable Schubert function, and the ring has a basis consisting of these. So this is a picture of the ring H of S_Z — it's a genus in that sense. And it's on this ring that the Nenashev operator is acting. One way to think about the relation between this xi and this nabla: there's an extra symmetry that you get when you work back stably that you don't have when you're thinking about usual Schubert polynomials. You can take a permutation of Z and conjugate it by the shift that takes i to i+1. That's an automorphism of the group, hence an automorphism of the ring of Schubert symbols, and it conjugates nabla to nabla plus xi. Why? Remember nabla was the sum over i of i times partial_i; the shifting changes the coefficient i to i+1, but the partial_i stays put. So if you take nabla, conjugate it by the shift, and subtract nabla, you get xi. That's maybe the right way to think of the symmetries: we have these infinitesimal symmetries coming through the derivations, and we have this other one coming through shifts, and that's how they're related. Okay, so one thing I'm going to head for is making use of the fact that these are derivations: we can exponentiate them and get actual automorphisms.
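The conjugation computation can be written out in one line. I'm reconstructing nabla and xi from the description "the sum of i partial_i," following the speaker's convention that the shift bumps the coefficient i while fixing each partial_i, so the sign conventions here are a guess:

```latex
\nabla = \sum_{i \in \mathbb{Z}} i\,\partial_i, \qquad
\xi = \sum_{i \in \mathbb{Z}} \partial_i .
% Conjugating by the shift gamma replaces the coefficient i by i+1
% while fixing each partial_i, as described in the talk:
\gamma\,\nabla\,\gamma^{-1}
  = \sum_{i \in \mathbb{Z}} (i+1)\,\partial_i
  = \nabla + \xi,
\qquad\text{so}\qquad
\gamma\,\nabla\,\gamma^{-1} - \nabla = \xi .
```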
So that's the fun property of having Lie elements: you exponentiate and you get an actual automorphism. Okay, there was a question: is the back stable function ring just a polynomial ring in some infinite set of variables? It is, and that's because there's a map to this ring from the ring of symmetric functions tensor the polynomial ring — ordinary polynomials in all the variables. You can probably guess what this homomorphism does on the polynomial part: it takes a polynomial in the x_i to that same polynomial viewed as a power series; certainly a polynomial satisfies the three conditions. Where I want a symmetric function to go is this — I'm running out of letters for S, so I'll write the Schur function as s_lambda — s_lambda goes to the symmetric function with the negative variables plugged in. Now, let me give an example of what's going on here. Consider the back stable Schubert function coming from the simple reflection s_{-4}. That's equal to x_{-4} + x_{-5} + x_{-6} + ... forever, okay? Now, how do I see this power series is in the image of this map? Well, it almost comes from the Schur function for a single box: when you take that Schur function and plug in the nonpositive variables, you get the sum of all of them, x_0 + x_{-1} + x_{-2} + ..., and I don't want all of them. So I'm going to subtract off x_0 down to x_{-3}. So here's an element in the image of this map. Now, the symmetric-function factor is also known to be a polynomial ring, for example in the elementary symmetric functions — the Schur functions of column shapes. So here's a polynomial ring, here's a polynomial ring tensor a polynomial ring: the whole thing is a polynomial ring in roughly three times naturally many variables.
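The example, written out, with s_{(1)} denoting the single-box Schur function:

```latex
\overleftarrow{\mathfrak{S}}_{s_{-4}}
  = x_{-4} + x_{-5} + x_{-6} + \cdots
  = \underbrace{s_{(1)}(x_0, x_{-1}, x_{-2}, \dots)}_{x_0 + x_{-1} + x_{-2} + \cdots}
    \;-\; \big( x_0 + x_{-1} + x_{-2} + x_{-3} \big),
```

that is, the single-box Schur function in the nonpositive variables, minus the finitely many leading terms we don't want.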
So there are my columns, which can be any natural-number height; then there are naturally many variables on this side, and the negatively indexed variables on that side. That's how big this polynomial ring is, this ring of back stable functions. Was there more to the question, or is that it? Nope, that's it. So Dave was talking about the back stable functions as well, and was thinking of them more in this tensor picture. The thing I find a little weird in that picture is: why did I plug the variables zero and below into the symmetric functions, as opposed to the variables five and below, or negative twelve and below? There's a whole bunch of different isomorphisms of this ring with that one. On the back stable side, the shift operation I mentioned is very clear: it takes x_i to x_{i+1}. On the tensor side it's pretty weird: on the polynomial part it takes x_i to x_{i+1}, but the Schur functions get all mixed up with those. So I prefer the picture where the shift symmetry is manifest, although abstractly, yes, it's a polynomial ring. I want to show you something fun that Nenashev did, but maybe I'll have to get to it at the very end, as part of a refined version that we did. So I'm going to start a completely different story and then relate it. Inside the flag variety of C^n — and there's a more general version of what I'm going to say, due to Klyachko, for arbitrary semisimple complex groups, but I'm going to be specifically thinking about this one — I'm going to take some general point and hit it with the torus T, which of course is the diagonal matrices. I'll get a subset, and then I'm going to take the closure. And the thing I get is called the permutahedral toric variety.
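The construction, in symbols — here F_bullet stands for the general point (a general flag), which is my notation:

```latex
TV_{\mathrm{perm}}
  \;=\; \overline{\,T \cdot F_\bullet\,}
  \;\subseteq\; F\ell(\mathbb{C}^n),
\qquad
T = \text{the diagonal torus in } GL_n(\mathbb{C}).
```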
And since this is being recorded, I'm going to point out that I think there should be an "a" there, not an "o" — and the people who put an "o" there should be forced to talk about tetrahedra and octahedra. So there's the permutahedral toric variety; I'm just going to call this guy TV_perm, okay? So what Klyachko thought about was: let's consider the map from the cohomology of the flag variety to the cohomology of the permutahedral toric variety. And I want to say this toric variety shows up a bunch of places. It's the Hessenberg variety for the regular semisimple case. It's very important in a bunch of June Huh's work. And it's weirdly smooth: if you studied this guy not inside the flag variety but inside, say, Gr(2,4), then the polytope for it would be an octahedron, and the corresponding toric variety wouldn't be smooth. So, strangely, this is a manifold living inside the flag variety. And Klyachko thought about this induced map on cohomology to that manifold. Now, it turns out this map on cohomology is not onto; its image is what I'm going to call the Klyachko ring, and Klyachko gave a nice presentation of it. Well, he only gives it rationally — and the formula I'm about to write down is only rational anyway, so rational is enough to worry about. He says the image is generated by x_1 through x_n modulo the Klyachko ideal, whose relations say that each number is either zero or is half the sum of its neighbors. Those are the relations the x_i should satisfy — maybe I want the indices to go from zero to n, and past the ends we set the variables to zero. So that's a ring; but of course we're interested in rings with bases, so you want to know, when you take a Schubert symbol, where it goes under this map in this presentation. This map I'm going to call the Klyachko genus. Okay, so we've got a homomorphism from the ring I'm interested in to another ring.
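Written out, "each number is either zero or half the sum of its neighbors" becomes a quadric for each variable; the precise indexing conventions here are my guess from the talk:

```latex
\text{(image of Klyachko's map)} \;\cong\;
\mathbb{Q}[x_0, \dots, x_n] \,/\, I_K,
\qquad
I_K = \Big(\, x_i \big( x_{i-1} + x_{i+1} - 2\,x_i \big) \,\Big),
```

with the convention that variables past the ends are zero.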
And if you're thinking ahead — when you impose these equations, wouldn't the x_i all have to be zero? — the answer is not quite: they could square to zero, that sort of thing. So this is defining a ring with nilpotents, but that shouldn't be a surprise, because we're thinking of it as a quotient of the cohomology ring, which of course is some Artinian ring full of nilpotents. All right, so where does the Schubert symbol of pi go under this map? It goes to one over length-of-pi factorial, times the sum over reduced words of pi — I'll use a capital P for a typical reduced word — of the product, over the letters i appearing in P, of the x_i. So that's the map. Now, the thing I don't like about this formula is that it's taking place in a kind of complicated ring. So let's abandon the geometry that got us here and just pass to the S_Z case. In the limit, I'm just going to say I'm dealing with this ring of Schubert symbols for S_Z. Don't worry about it being the cohomology of some bi-infinite flag variety — you probably could make sense of that, but I'm not going to need to work that hard. And I'm going to go to rationals now, in the bi-infinitely many variables, but I still want to impose these conditions. Okay, so this is now a genus on this ring, and I'm going to use the same formula for it. So what does the target ring look like? It's still complicated, but maybe I can take quotients of it. Could I mod out by a larger ideal and then get something comprehensible? Think about taking this ideal and seeing what are the minimal prime ideals that contain it — maybe I could get this ideal as an intersection of other ideals, and then instead of modding out by this complicated thing, I'll mod out by those ideals one by one; each of those quotients will give me partial information, and I'll sort of stitch that information together.
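As a sanity check on the reduced-word formula, here is a small self-contained sketch. The names and data conventions are mine: a permutation is given in one-line notation as a tuple of its values, a reduced word (i_1, ..., i_l) means w = s_{i_1} ... s_{i_l}, and the resulting polynomial is stored as a dict from sorted index tuples (monomials) to rational coefficients.

```python
# Sketch of the Klyachko-genus formula: the Schubert symbol of w maps to
#   (1 / l(w)!) * sum over reduced words P of w of  prod_{i in P} x_i.
from fractions import Fraction
from math import factorial

def reduced_words(w):
    """All reduced words of the permutation w (one-line notation)."""
    if list(w) == sorted(w):
        return [()]
    words = []
    for i in range(len(w) - 1):
        if w[i] > w[i + 1]:                   # descent: w = v * s_{i+1}, v shorter
            v = list(w)
            v[i], v[i + 1] = v[i + 1], v[i]
            for word in reduced_words(tuple(v)):
                words.append(word + (i + 1,))
    return words

def klyachko_genus(w):
    """Image of the Schubert symbol of w, as {sorted index tuple: coefficient}."""
    words = reduced_words(w)
    length = len(words[0])                    # l(w): all reduced words share it
    poly = {}
    for P in words:
        mono = tuple(sorted(P))               # the monomial x_{i_1} ... x_{i_l}
        poly[mono] = poly.get(mono, Fraction(0)) + Fraction(1, factorial(length))
    return poly

# The long element 321 of S_3 has reduced words (1,2,1) and (2,1,2).
print(klyachko_genus((3, 2, 1)))
```

Running it on the long element of S_3 gives coefficient 1/6 on each of x_1^2 x_2 and x_1 x_2^2, matching (1/3!) (x_1 x_2 x_1 + x_2 x_1 x_2).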
So that suggests you want to find ideals that contain this one. And now I'm going to completely misapply the Nullstellensatz, because I'm in infinitely many variables, but I'm going to do it anyway and think about the solution set of these equations. So these equations say that each number is either zero or half the sum of its neighbors. And you get things like this, where you'll have zeros for a while, and then coming off this end you'll have two, four, six, and coming back on the other end you'll have five, three, one, right? So here, say, these entries might be x_1, x_2, x_0. It's a way of solving these equations where you're zero for a while, but once you stop being zero, you have to have the linear behavior we saw before. And it turns out there's a whole bunch of different minimal primes that contain this ideal. Most of them look like this, and there's one other guy that doesn't, which is the one where you don't have any zeros — you just have a single affine linear function instead of these two linear pieces at the ends. So you get this further quotient, which is just that same picture, but without the zeros. Can I just — just to check my understanding: should it be five, three, one? Something seems off. Oh, sorry, you're right — like three, two, one. Yeah, three, two, one, sorry. Three, two, one, okay, thanks. Right, so you want to deal with the bigger ideal that just says each number is half the sum of its neighbors — don't settle for being zero. And that quotient ring is very simple: it's just Q[a, b]. So the map is that x_i maps to a·i + b. So there's a nice isomorphism with this further quotient, and I could argue — but I'm almost out of time — that all the other quotients you get are going to be useless for Schubert calculus. So this is the only one that really matters.
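Checking that x_i = a·i + b kills the bigger ideal is one line of algebra:

```latex
x_{i-1} + x_{i+1} - 2\,x_i
  = \big( a(i-1) + b \big) + \big( a(i+1) + b \big) - 2\,(a i + b)
  = 0,
```

so every affine linear sequence satisfies "each number is half the sum of its neighbors," and conversely such a sequence is determined by the two parameters a and b.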
Okay, and now I want to say what the relation is between parts one and two, and be done. So we have the ring of Schubert symbols, and we have this Klyachko map — or rather, it's not even the Klyachko genus anymore; I'm just going to call it the affine linear genus. It's the composite where we impose not just the Klyachko equations, but actually go all the way down to Q[a, b]. So that's the genus. And I'm going to use the nabla and xi that we had before: I'm going to take a times nabla plus b times xi, and I'm going to exponentiate that. This is now some inhomogeneous operator on this ring, but it's a ring automorphism. It doesn't quite map the ring to itself, because it has this a and b in it, so I need to tensor over Z with Q[a, b], okay? And then there's a map from there to Q[a, b] taking the Schuberts to zero, except for the identity, which goes to one. So that's this dumb ring map that just extracts what's happening on the class 1 — it's kind of an integration, or maybe no, it's better to say it's restriction to a point. And so now I'm going to state the theorem, which is that this square commutes. It's more of an observation than a theorem: it comes from computing both sides independently and saying, isn't it weird that these are equal? Thanks very much.
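In symbols, the commuting square says the following — my notation: tau for the affine linear genus, epsilon for restriction to a point, sending the identity Schubert symbol to 1 and all others to 0:

```latex
\tau\big(\mathfrak{S}_\pi\big)
  \;=\; \varepsilon\Big( e^{\,a\nabla + b\,\xi}\;\mathfrak{S}_\pi \Big)
  \qquad \text{in } \mathbb{Q}[a,b],
```

where exp(a nabla + b xi) acts on the ring of Schubert symbols tensored over Z with Q[a, b].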