Great. Okay. It's a pleasure to introduce Patricia Klein from Minnesota, speaking about bumpless pipe dreams that encode the geometry of Schubert polynomials. Thank you. Thank you for the introduction and for the invitation, and thank you all for being here. What I'll talk about today is joint work with Anna Weigandt, who's currently at MIT. So the first thing I want to talk about is intersections and unions of matrix Schubert varieties: how can we break apart matrix Schubert varieties, and what do we get when we sum the ideals? So let's start with a small example. The small example will really be thinking about the Schubert determinantal ideal of 15324 breaking apart as a sum of these easier Schuberts. The conventions today will be that we fill in our Rothe diagrams like this: the one tells me to put a dot in the first column, the five is giving me the dot in the second column, and so on. And we might notice that this is just a little ladder shape here. So we can either call this a one-sided ladder determinantal ideal, or say it's a vexillary matrix Schubert variety. And what we're taking here are just the size-two minors inside of this little ladder. And we can decompose this as a sum of two bigrassmannian permutations. Bigrassmannian algebraically is just going to mean that we have one essential box and we're taking minors northwest of that one spot: one size of minor, one location of interest. So size-two minors northwest of this one location, and size-two minors northwest of this one location here. And this is a very friendly sum: we sum two Schuberts, we get another one. And it's maybe not so difficult to believe that we can break apart any Schubert determinantal ideal into a sum of these bigrassmannian Schuberts in just this way, where we take each essential cell and form the Schubert determinantal ideal of the bigrassmannian that is given by that one rank condition at that one location. We sum them up, and we get all of our rank conditions.
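The conventions just described can be sketched in code. This is a minimal illustration, not the speaker's notation: the helper names and the 0-indexing inside the functions are my own choices, and cells are reported 0-indexed.

```python
# Rank table r(i,j) = #{k <= i : w(k) <= j}, Rothe diagram, and essential
# boxes for w = 15324 in one-line notation.
def rank_table(w):
    n = len(w)
    return [[sum(1 for k in range(i + 1) if w[k] <= j + 1)
             for j in range(n)] for i in range(n)]

def rothe_diagram(w):
    # box at (i,j) when the dot of row i is strictly east of column j
    # and the dot of column j is strictly south of row i
    n = len(w)
    winv = {value: row for row, value in enumerate(w)}
    return {(i, j) for i in range(n) for j in range(n)
            if w[i] > j + 1 and winv[j + 1] > i}

def essential_boxes(w):
    D = rothe_diagram(w)
    # essential boxes: no diagram box immediately east or immediately south
    return {(i, j) for (i, j) in D
            if (i, j + 1) not in D and (i + 1, j) not in D}

w = [1, 5, 3, 2, 4]
print(sorted(essential_boxes(w)))   # [(1, 3), (2, 1)] -> cells (2,4) and (3,2)
r = rank_table(w)
print(r[1][3], r[2][1])             # 1 1: size-2 minors vanish NW of each essential cell
```

The two essential cells with rank condition one are exactly the two bigrassmannian summands from the talk.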
So let's see how this looks in the rank tables. These rank tables agree in an awful lot of locations, but they disagree, for example, in this location there. And we notice that the value of our rank table in the sum is the lesser of the two possible values coming from the two rank tables of the permutations whose ideals we're adding. So similarly here a two and a one compete and we get the one. And in some sense of course we get the minimum value: as we sum ideals we're getting more relations, so we're insisting on more rank conditions. For example, this ideal here is insisting that we be rank at most one, so the two-by-two minors have to vanish here. We don't get that insistence from this other permutation, and so it's the stronger of the insistences that tells us which number to fill in as a rank condition. So when we're summing ideals and trying to form the appropriate rank table, we're going to take the smaller of the two values coming from the ideals we're summing. So this is a very friendly way to take our favorite Schubert determinantal ideal and instead, if we're willing to understand sums, just study these bigrassmannian permutations. Here's another very friendly-looking sum. This permutation here is not only bigrassmannian but dominant; its ideal is just one variable. And this is just another bigrassmannian permutation: two-by-two minors of a two-by-three matrix. And again we're going to form this rank table by taking the minimum of the two values that we see. So we see a zero there and a one there, we choose the zero, and so on. And so maybe we hope that this is also going to be another Schubert determinantal ideal. So let's take this rank table that we've gotten and try to make a zero-one matrix that gives us these rank conditions.
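The entrywise-minimum rule from the first, friendly sum can be checked directly. The two bigrassmannians below, with essential boxes (2,4) and (3,2) and rank condition one, are my reconstruction of the two summands for 15324, not read off a slide.

```python
# Sketch: the rank table of a sum of Schubert determinantal ideals is the
# entrywise minimum of the summands' rank tables.  The summands u, v are an
# assumed reconstruction of the running example for w = 15324.
def rank_table(w):
    n = len(w)
    return [[sum(1 for k in range(i + 1) if w[k] <= j + 1)
             for j in range(n)] for i in range(n)]

u = [1, 5, 2, 3, 4]   # bigrassmannian: size-2 minors NW of cell (2,4)
v = [1, 3, 4, 2, 5]   # bigrassmannian: size-2 minors NW of cell (3,2)
w = [1, 5, 3, 2, 4]

minimum = [[min(a, b) for a, b in zip(ru, rv)]
           for ru, rv in zip(rank_table(u), rank_table(v))]
print(minimum == rank_table(w))   # True: this sum is again Schubert determinantal
```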
So the rank-zero part has to be just this one cell here, and that's telling us that we have to be rank one there and rank one there. So we have to have those two dots. And now unfortunately we see that we have to be rank one here. In order to get a rank one in our rank table, we would have to stand on this square here, look northwest, and count one dot, but we already have two dots, and so there's no hope. So there is no way to take this rank table and fill in a permutation matrix. So it is not the case that sums of Schubert determinantal ideals are Schubert determinantal. Which is kind of a shame, because we like northwest rank conditions, and some of them sum so easily. Wouldn't it be nice if we could understand all of these northwest rank conditions? So let's record the observations that we've made. One of them is that every Schubert determinantal ideal can be expressed as a sum of Schubert determinantal ideals of bigrassmannians. And the second, more concerning, observation is that it is not the case that if we sum two Schubert determinantal ideals we necessarily get a Schubert determinantal ideal. So let's take an arbitrary sum of Schubert determinantal ideals and call it I_A. There are a lot of things that are good about Schubert determinantal ideals, and we might wonder how many of them are true for these sums. Must this sum be radical? We know that Schubert determinantal ideals are radical; is this sum still going to be radical? Schubert determinantal ideals are Cohen-Macaulay; is this sum going to be Cohen-Macaulay? Is it height unmixed? If it's radical, then unmixed would just be the same as equidimensional, so do we at least have that? More generally, what is the codimension? How do we understand the minimal primes? And is there any combinatorial structure that can help us understand these I_A? These I_A are just going to be any ideals given by northwest rank conditions. And some good news is that yes, these I_A are all radical; that's due to Anna Weigandt. And are they necessarily Cohen-Macaulay?
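The obstruction can also be seen by inclusion-exclusion: there is a unique matrix whose northwest corner sums match a given rank table, and here it is forced to have a negative entry. The two summands below are an assumed reconstruction of the slide's example (the dominant permutation 2134, whose ideal is the single variable in the corner, and the bigrassmannian 1423, giving the 2x2 minors of the northwest 2x3 block).

```python
# Sketch: recover the unique matrix with the prescribed corner sums; a -1
# entry shows no 0-1 (permutation) matrix can realize this rank table.
def rank_table(w):
    n = len(w)
    return [[sum(1 for k in range(i + 1) if w[k] <= j + 1)
             for j in range(n)] for i in range(n)]

def corner_difference(r):
    # inclusion-exclusion: M(i,j) = r(i,j) - r(i-1,j) - r(i,j-1) + r(i-1,j-1)
    n = len(r)
    get = lambda i, j: r[i][j] if i >= 0 and j >= 0 else 0
    return [[get(i, j) - get(i - 1, j) - get(i, j - 1) + get(i - 1, j - 1)
             for j in range(n)] for i in range(n)]

u, v = [2, 1, 3, 4], [1, 4, 2, 3]   # assumed reconstruction of the two summands
minimum = [[min(a, b) for a, b in zip(ru, rv)]
           for ru, rv in zip(rank_table(u), rank_table(v))]
M = corner_difference(minimum)
print(M)   # [[0, 1, 0, 0], [1, -1, 0, 1], [0, 1, 0, 0], [0, 0, 1, 0]]
```

The -1 in position (2,2) is exactly the "two dots where we needed one" in the talk's picture.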
Unfortunately, no. Are they necessarily height unmixed? Also no. And the failure of Cohen-Macaulayness doesn't just come from minimal primes of the wrong height: we can have things that are equidimensional but not Cohen-Macaulay. So how do we understand the codimension? Is there any combinatorial structure? Those are questions we'll consider as we go forward. But before we think more about these, I want to convince you that these are questions worth asking. Maybe the Cohen-Macaulay property is not as near and dear to your heart as it is to mine, and similarly unmixedness. So suppose that we're interested in enumerative geometry; should we care about our varieties being equidimensional? I think that we should. One way to think about this geometrically is that we might care about the degree of a variety. The degree of a variety: take your favorite projective variety, intersect with hyperplanes in general position until we get down to finitely many points of intersection, and count those finitely many points. If I take, for example, a plane with a line through it in 3-space, and I pass another line through this variety, that will get me down to finitely many points of intersection. The line that I started with isn't doing anything: these two lines in 3-space I can pull apart, I can just wiggle them apart, and this gives me no additional points of intersection. So the number of points of intersection comes only from the top-dimensional pieces, and for that reason we should prefer that our varieties be equidimensional. Algebraically, the statement would be that the Hilbert polynomial gets its top-dimensional information from the top-dimensional pieces of the variety, and things like the line through a plane contribute only lower-order terms. So the line isn't telling us about the normalized leading coefficient of the Hilbert polynomial, and the normalized leading coefficient of the Hilbert polynomial is the degree.
So algebraically and geometrically we're really interested in the top-dimensional components, and so we value varieties that are equidimensional. For Cohen-Macaulayness: if you have never seen a definition before, I'd basically not read this slide; if you have seen a definition and want to be reminded, here they are; and if you know the definition well, then you can also skip this slide. But geometrically, how do we picture something that's Cohen-Macaulay? The maximal ideal is not embedded, and we can slice with a generic hyperplane section and the maximal ideal is still not embedded. We can keep doing that, and the maximal ideal never becomes embedded, until eventually we're down to something of dimension zero, and then there's room for the maximal ideal to be associated; it's just that it's the only associated prime, the only prime around. And from the standpoint of enumerative geometry, why do we value Cohen-Macaulayness? Why do we like it that the maximal ideal isn't embedded? One answer to that question comes from Serre's intersection formula. There is a way that we can algebraically compute the intersection number as an alternating sum of lengths of Tors, and alternating sums of lengths of Tors are unpleasant: they're often difficult to compute. In the Cohen-Macaulay setting this intersection number, this computation, just simplifies down to Tor zero, and Tor zero is just S mod the sum of the ideals determining my varieties. So in the Cohen-Macaulay setting we are well equipped to count points of intersection. So if we care about enumerative geometry, we should care about the Cohen-Macaulay property. It is a real feature of matrix Schubert varieties and something that we're looking for as we consider these other ideals determined by northwest rank conditions. So let's return to our example.
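The formula just mentioned, written out in the standard form (over a regular local ring $S$, with the two varieties cut out by ideals $I$ and $J$; this is the textbook statement, not a transcription of the slide):

```latex
% Serre's intersection multiplicity formula:
\chi(\mathcal{O}_V, \mathcal{O}_W)
  \;=\; \sum_{i \ge 0} (-1)^i \,
  \operatorname{length}_S \operatorname{Tor}_i^S\!\left(S/I,\; S/J\right).
% In the Cohen-Macaulay setting the higher Tor modules vanish,
% and the formula collapses to the i = 0 term:
\chi(\mathcal{O}_V, \mathcal{O}_W)
  \;=\; \operatorname{length}_S \operatorname{Tor}_0^S\!\left(S/I,\; S/J\right)
  \;=\; \operatorname{length}_S \bigl( S/(I+J) \bigr).
```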
So this sum of two bigrassmannian Schubert determinantal ideals that we saw: here's the rank table, just the rank table that we produced before, where I looked at a couple of slots and took the smaller value. A fact about this rank table is that it can be determined, in the same way that we get rank tables from permutation matrices, from this matrix A. This matrix A is what's called an alternating sign matrix. Its entries live in zero, one, and negative one, and each row and each column sums to one. Moreover, any row that has a negative one has to start with a one, end in a one, and alternate one, negative one, one, negative one throughout its nonzero entries. Same thing for every column; that's the alternating part of the alternating sign matrix. So how do we get this rank table? We take corner sums. For example, if I want to know what rank goes here, then I carve out this part of my ASM and add up the entries in that northwest corner. I get one plus one plus negative one is one, and that tells me to put a rank of one there. Similarly here, for example, I would say one plus one plus one plus negative one is two, and that tells me to put a rank of two there. And just like when we're forming a Schubert determinantal ideal, when we form an ASM ideal what we do is plunk down a generic matrix and say that we have to be rank at most one in this corner. That tells us that the two-by-two minor there has to vanish; that, it turns out, is this term, because we already have this one-by-one term in this corner. Or here, for example, we have to be rank at most one northwest of that corner, so the two-by-two minor that lives in this location is one of the generators of our ideal. These ASM ideals are really just a generalization of Schubert determinantal ideals, because when we have no negative ones, when we just have ones and zeros, then we have a permutation matrix.
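The corner-sum recipe is easy to mechanize. The specific 4x4 ASM below is an assumed reconstruction of the matrix on the slide (it reproduces the quoted arithmetic: 1 + 1 - 1 = 1 and 1 + 1 + 1 - 1 = 2).

```python
# Sketch: northwest corner sums of an alternating sign matrix give its rank table.
A = [[0, 1, 0, 0],
     [1, -1, 0, 1],
     [0, 1, 0, 0],
     [0, 0, 1, 0]]   # assumed reconstruction of the slide's ASM

def corner_sum_table(M):
    n = len(M)
    return [[sum(M[a][b] for a in range(i + 1) for b in range(j + 1))
             for j in range(n)] for i in range(n)]

r = corner_sum_table(A)
print(r[1][1])   # 1: the corner sum 1 + 1 - 1, so rank at most 1 there
print(r)         # the full rank table from the example
```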
We're interested in these because they arise from whatever northwest rank conditions we want. And a feature is that this ideal decomposes as an intersection of other Schubert determinantal ideals. Here are two other Schubert determinantal ideals; these are both codimension three, because one, two, three and one, two, three: the number of boxes in the Rothe diagram tells us the codimension of the ideal. And so this I_A is height unmixed, and its codimension is three, because here are its two associated primes, and they are both codimension three. So again, we like alternating sign matrix varieties because their ideals tell us about anything described by northwest rank conditions, and they're something that we know how to study, in part because they decompose as these intersections of Schubert determinantal ideals. So alternating sign matrices: did they just come out of nowhere? Do they exist only to describe these northwest rank conditions? The answer is no. Alternating sign matrices have been around for a while: they show up in statistical mechanics, and they're of a lot of interest in enumerative combinatorics, where counting alternating sign matrices was a big deal, and in the 90s Zeilberger and Kuperberg gave competing proofs of the right enumeration. So alternating sign matrices are of interest in their own right, and it turned out that they show up as the right combinatorial object to describe all ideals of northwest rank conditions.
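For a small ASM the decomposition data can be found by brute force: a permutation w lies above A exactly when the rank table of w is entrywise at most the corner-sum table of A, and Bruhat comparability of two permutations is also an entrywise rank-table comparison. The ASM here is the same assumed reconstruction of the running example; the two minimal permutations it produces both have length three, matching the two codimension-three associated primes.

```python
# Sketch: brute-force the minimal permutations above a 4x4 ASM.
from itertools import permutations

A = [[0, 1, 0, 0],
     [1, -1, 0, 1],
     [0, 1, 0, 0],
     [0, 0, 1, 0]]   # assumed reconstruction of the example ASM
n = len(A)

def corner_sums(M):
    return [[sum(M[a][b] for a in range(i + 1) for b in range(j + 1))
             for j in range(n)] for i in range(n)]

def rank_table(w):
    return [[sum(1 for k in range(i + 1) if w[k] <= j + 1)
             for j in range(n)] for i in range(n)]

def bruhat_leq(u, w):
    # u <= w in strong Bruhat order iff r_u >= r_w entrywise
    ru, rw = rank_table(u), rank_table(w)
    return all(ru[i][j] >= rw[i][j] for i in range(n) for j in range(n))

rA = corner_sums(A)
above = [w for w in permutations(range(1, n + 1))
         if all(rank_table(w)[i][j] <= rA[i][j]
                for i in range(n) for j in range(n))]
perm_A = sorted(w for w in above
                if not any(u != w and bruhat_leq(u, w) for u in above))
print(perm_A)   # [(2, 4, 1, 3), (4, 1, 2, 3)]: two minimal permutations, both length 3
```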
So here's the formal description of an ASM; it's just the formalization of the example that we just did. An alternating sign matrix is an n-by-n matrix with entries zero, one, and negative one, and again the rules are that rows and columns each sum to one and the nonzero entries alternate one, negative one, one, negative one. ASMs that don't have negative ones are permutation matrices, and we can form these alternating sign matrix ideals in the same way that we form Schubert determinantal ideals. Some theorems due to Anna Weigandt are that these ideals don't just carve out these northwest rank conditions set-theoretically: they are radical, so they really are the defining ideals of their varieties. When we sum ASM ideals we get other ASM ideals, so if we start from these bigrassmannian permutations we can sum as many or as few of them as we want and still get an ASM, and we can repeat the process. And there is a combinatorial way to break up ASM ideals as intersections of Schubert determinantal ideals. There is a lattice of ASMs, and if we restrict to the permutation matrices, this is just the poset under strong Bruhat order. This perm(A): these are the permutations that are minimal in Bruhat order among those lying above the alternating sign matrix A, and so this decomposition of ideals can be read off directly from the combinatorics. So maybe now is actually a good time to stop and take any questions if there are any. Okay, or if you have questions, let's do our five-minute pause now. I guess the recommendation is that the part after intermission should never be longer than the part before, and yet that's what we're going to do in this case. Okay, thanks very much for the first part, and we will have a, let's say, six-minute break, which means we'll start again at 4:27.
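The formal axioms just stated, plus the claim that summing brings us back into the ASM family, can be packaged as a quick check. The example permutations are assumptions for illustration: summing their ideals (entrywise minimum of rank tables) and recovering the matrix by inclusion-exclusion lands on a genuine ASM that is not a permutation matrix.

```python
# Sketch: verify the ASM axioms on the matrix recovered from a sum of two
# Schubert determinantal ideals (assumed example permutations).
def is_asm(M):
    rows = [list(r) for r in M]
    cols = [list(c) for c in zip(*M)]
    for line in rows + cols:
        if sum(line) != 1:
            return False
        nonzero = [x for x in line if x != 0]
        # nonzero entries must read 1, -1, 1, ..., starting and ending with 1
        if nonzero != [(-1) ** i for i in range(len(nonzero))]:
            return False
    return True

def rank_table(w):
    n = len(w)
    return [[sum(1 for k in range(i + 1) if w[k] <= j + 1)
             for j in range(n)] for i in range(n)]

def corner_difference(r):
    n = len(r)
    get = lambda i, j: r[i][j] if i >= 0 and j >= 0 else 0
    return [[get(i, j) - get(i - 1, j) - get(i, j - 1) + get(i - 1, j - 1)
             for j in range(n)] for i in range(n)]

u, v = [2, 1, 3, 4], [1, 4, 2, 3]
minimum = [[min(a, b) for a, b in zip(ru, rv)]
           for ru, rv in zip(rank_table(u), rank_table(v))]
M = corner_difference(minimum)
print(is_asm(M), any(-1 in row for row in M))   # True True: an ASM, not a permutation
```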