its properties before actually defining it. So these properties, which we call the Lefschetz properties, have been studied in many different settings. Some of them include Kähler manifolds in complex geometry, the Sperner property of posets in combinatorics, some interesting tilings, also in combinatorics, and even in differential geometry people have found applications of Lefschetz properties to the Laplace equation. So now we can actually define the property. The ring is always going to be a polynomial ring, and the field can be any field as long as it is infinite; we do not care about the characteristic yet. So we take an infinite field and a monomial ideal of the shape shown, so among the generators we always have the pure powers of the variables plus the remaining generators, and the quotient is Artinian. And then we say A has the weak Lefschetz property (WLP) in degree i if the multiplication map from A_i to A_{i+1} by the linear form that is the sum of the variables of the ring has full rank, okay? This is not the most general definition: you can define it for essentially any ideal, and then the difference is that instead of this linear form you need a general linear form. But for monomial ideals we can use this one. So here we have an example. Take A to be the polynomial ring in four variables, over any infinite field again, modulo that ideal. Then A can be written as this direct sum, where I am using the brackets for "spanned by". And then we have these two maps, the map from A_1 to A_2 and the map from A_2 to A_3. The way you write down the maps is you simply ask which monomials in the next degree are divisible by the basis element at hand. So if you look at the column indexed by a, the entries equal to one sit exactly in the rows of the monomials that a divides, and the rest are zero.
And then you keep doing this for every element in the basis, and then you compute the rank of that matrix. So, as I said, essentially what it boils down to is that we want to compute the rank of a matrix and we want it to be full. And the main idea we are going to use is that we can think of the rows or columns of the matrix either as exponent vectors of monomials, in which case we get a kind of product structure, or as coefficient vectors of linear forms. The second viewpoint is not the monomial one, but it is the more combinatorial one, and the first one is where we get to positive characteristic. For this talk we will focus on the first one, okay? Our main goal is then: since we have exponent vectors of monomials, we can take the ideal generated by those monomials, and we want to describe the weak Lefschetz property of the algebras in terms of those ideals, okay? To do that we need to introduce a new polynomial ring. So we have our polynomial ring over K, which again can be any infinite field, and I is a monomial ideal such that the quotient is Artinian. And then we define a new polynomial ring that depends on I, whose base field is the complex numbers, with one variable T_m for each nonzero monomial m in the Artinian ring, okay? So the variables T_m are indexed by monomials. For an example, if you like square-free monomial ideals: when you take I to be just the squares of the variables, this new polynomial ring is the polynomial ring over the complex numbers with one variable for each square-free monomial in n variables, okay? So first let us handle the WLP in characteristic zero. If we have a square-free monomial ideal of the polynomial ring and we take the quotient by that ideal plus the squares of the variables, then we have an Artinian ring, and it has a very nice combinatorial structure.
And because of that combinatorial structure, the matrices that we get as the multiplication maps have a very particular shape, and essentially it is this proposition here. When you take a quotient like this, a square-free monomial ideal plus the squares of the variables, the row sums of the matrix are constant: if you take any row and sum its entries, you always get a fixed number in each degree. So here we have an example. If we take this ideal, the quotient by the squares and abd, then the matrix we get when multiplying from A_2 to A_3 by the linear form is this one, and we see that the row sum is always three. In general the row sum is always the degree you are landing in. And then we can define an ideal from this matrix, which is the one I have here: for each row you go to your new polynomial ring and take the product of the variables sitting at the entries equal to one. So here the row of abc gives the generator T_{ab}T_{ac}T_{bc}, and so on; you do it for every row. And from Ehrhart theory, one very nice fact is that for matrices like that, with constant row sums and integer entries, the analytic spread of the ideal we get by taking the rows as exponents of monomials is exactly the rank of the matrix. That is the first step towards realizing that the weak Lefschetz property is closely related to analytic spread. As I said, a corollary we get from this theorem right out of the gate is that when you take a square-free reduction, so a square-free monomial ideal plus the squares of the variables in a polynomial ring, in characteristic zero, having the WLP is equivalent to some monomial ideal having maximal analytic spread.
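As a sanity check, here is a minimal Python sketch of this computation (my own encoding, not from the talk), assuming the running example I = (a^2, b^2, c^2, d^2, abd): it builds the matrix of multiplication by a+b+c+d from A_2 to A_3 and verifies the constant row sum and the full rank.

```python
from itertools import combinations
import numpy as np

vars_ = "abcd"

# Monomial basis of A_i for A = k[a,b,c,d] / (a^2, b^2, c^2, d^2, abd):
# square-free monomials of degree i not divisible by abd, encoded as frozensets.
def basis(deg):
    return [frozenset(c) for c in combinations(vars_, deg)
            if not {"a", "b", "d"} <= set(c)]

def mult_matrix(deg):
    """Matrix of multiplication by l = a+b+c+d from A_deg to A_{deg+1}."""
    rows, cols = basis(deg + 1), basis(deg)
    M = np.zeros((len(rows), len(cols)), dtype=int)
    for j, m in enumerate(cols):
        for v in vars_:
            target = m | {v}       # v * m; unchanged (a square) if v divides m
            if target in rows:     # zero if the product lands in the ideal
                M[rows.index(target), j] = 1
    return M

M = mult_matrix(2)               # the map A_2 -> A_3
print(M.shape)                   # 3 x 6
print(M.sum(axis=1))             # every row sums to 3, the target degree
print(np.linalg.matrix_rank(M))  # 3 = dim A_3, so the map has full rank
```

The same function with `deg = 1` gives the map A_1 to A_2, so one can check the WLP degree by degree.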
So when we go to positive characteristic, the situation is a little bit different. Say you have a matrix like this: 1, 1, -1, 1. The theorem from Ehrhart theory still tells us that the rank of that matrix is the analytic spread of some ideal, but in positive characteristic some maximal minors of your matrix, in this case just the determinant since it is two by two, can become zero. For example, in characteristic two the determinant here becomes zero and you lose rank, even though over Z or over C you still have full rank. So the problem is exactly the characteristics that divide the maximal minors, or rather their GCD. And then what we are going to show is: say we have an algebra that has the WLP in characteristic zero; in which characteristics does the algebra fail the WLP? To do that we need mixed multiplicities, which were studied by Trung and Verma. So we have a polynomial ring, we take an arbitrary ideal J, and m is the homogeneous maximal ideal. Then we can look at the multigraded Rees algebra, in this case of just two ideals, and at the Hilbert polynomial of a quotient of this Rees algebra. Here we have more than one leading coefficient; it is not like the N-graded case, where there is a single one. Since the polynomial is multigraded, there are several highest-degree terms in some sense, and their coefficients are what we call the mixed multiplicities of m and J. They are very interesting, and the reason we need them is the first important result for us, from 2001, where Trung showed that the indices for which the mixed multiplicities are positive depend only on the analytic spread of J.
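To fix notation, here is my own rendering of the standard setup (normalizations as in the literature, to the best of my recollection): one forms the standard bigraded algebra of m and J and expands its Hilbert polynomial.

```latex
R(\mathfrak{m}\mid J)=\bigoplus_{u,v\ge 0}\mathfrak{m}^{u}J^{v}/\mathfrak{m}^{u+1}J^{v},
\qquad
P(u,v)=\sum_{i+j=n-1}\frac{e_{ij}(\mathfrak{m}\mid J)}{i!\,j!}\,u^{i}v^{j}+\text{(lower-degree terms)},
```

where n is the dimension of the polynomial ring and P(u,v) agrees with the bigraded Hilbert function for large u and v. The coefficients e_{ij}(m|J) of the top-degree part are the mixed multiplicities, and Trung's result says that which of them are positive is controlled by the analytic spread of J alone.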
So you have the mixed multiplicities of m and J, and if you want to know for which indices the mixed multiplicity is positive, you only need to know the analytic spread of J. It does not depend on m, it does not depend on anything else, just the analytic spread of J. Then in 2007, Trung and Verma showed that when a matrix has constant row sums, its determinant can be computed by finding one specific mixed multiplicity of equigenerated monomial ideals. That is the next step. Their theorem is much more general, connecting mixed multiplicities and mixed volumes, but we only need this equality. Okay, and using these results, what we can do is describe the failure of the WLP in positive characteristic for square-free reductions, so again a square-free monomial ideal plus the squares of the variables, in terms of mixed multiplicities. This is the theorem that we have: if we have an Artinian algebra A, the quotient of the polynomial ring by a square-free monomial ideal and the squares of the variables, and you assume that the dimension of the second graded component of A is greater than that of the first, then either A has the WLP in degree one in every characteristic except for two, or A fails the WLP in every characteristic. So it is characteristic-free in some sense, if you forget about characteristic two. And the condition that the second graded component has larger dimension than the first is not asking for much: if you think of I as the Stanley-Reisner ideal of some simplicial complex, you are just asking that your simplicial complex has more edges than vertices, which is very reasonable. Some other facts are also known: for example, the WLP in characteristic two of square-free reductions is related to simplicial homology; it is actually exactly simplicial homology in some sense. This is a result by Migliore, Nagel, and Schenck.
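To see the rank drop concretely, here is a small Python sketch (mine, not from the talk) that row-reduces an integer matrix over the prime field F_p, applied to the two-by-two example 1, 1, -1, 1 from before, whose determinant is 2.

```python
def rank_mod_p(M, p):
    """Rank of an integer matrix over the prime field F_p, by Gaussian elimination."""
    A = [[x % p for x in row] for row in M]
    rank, nrows, ncols = 0, len(A), len(A[0])
    for c in range(ncols):
        piv = next((r for r in range(rank, nrows) if A[r][c]), None)
        if piv is None:
            continue  # no pivot in this column
        A[rank], A[piv] = A[piv], A[rank]
        inv = pow(A[rank][c], -1, p)  # inverse of the pivot mod p (Python 3.8+)
        A[rank] = [x * inv % p for x in A[rank]]
        for r in range(nrows):
            if r != rank and A[r][c]:
                A[r] = [(x - A[r][c] * y) % p for x, y in zip(A[r], A[rank])]
        rank += 1
    return rank

M = [[1, 1], [-1, 1]]      # determinant 2
print(rank_mod_p(M, 3))    # 2: full rank in every characteristic away from 2
print(rank_mod_p(M, 2))    # 1: the determinant vanishes mod 2 and the rank drops
```

So the characteristics where rank is lost are exactly the primes dividing the GCD of the maximal minors, here just the prime 2.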
And there is a criterion for the WLP in degree one in characteristic zero in terms of the combinatorics of your ideal. So if you think of I, again, as the Stanley-Reisner ideal of some simplicial complex, then you can look at some combinatorial properties of that complex and determine whether it has the WLP. And so now, by our result, we essentially know how to compute the WLP in degree one for any square-free reduction. Here we have an example. We have this simplicial complex, and we look at the ideal of the minimal non-faces of this complex. The only minimal non-face of this simplicial complex is abd, because every other non-face contains it. So we have this quotient, and if we want to compute the WLP in degree one, we need to look at the rank of this matrix, this one here. Again, to see which matrix you get, you just multiply each vertex by the linear form, because we are in degree one. These are the expressions we get; the first one is ab + ac + ad, so we put a one in the rows in the support of each expression and zeros in the rest, and we fill in the rest of the matrix the same way. In this case, since the matrix is small, we can compute the rank directly, or we can define the associated ideal, which in degree one is always going to be the edge ideal of the 1-skeleton of your simplicial complex. So we get this ideal; again we take the rows, look at the positive entries, and take the product of the corresponding variables in the other polynomial ring. And this ideal has analytic spread four, so in particular the matrix has full rank, and then you have the WLP in degree one in every characteristic except for characteristic two. Yeah. And I think I have about one more minute, so let me just mention the other case, the other viewpoint from the beginning.
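Before moving on, here is a quick Python check of this degree-one example (my own encoding, not from the talk): the matrix of multiplication by a+b+c+d from A_1 to A_2 is the vertex-edge incidence matrix of the 1-skeleton, the complete graph on a, b, c, d, and one can also watch the rank drop in characteristic two.

```python
from itertools import combinations
import numpy as np

verts = list("abcd")
edges = list(combinations(verts, 2))  # the 1-skeleton: all 6 edges on a, b, c, d

# For A = k[a,b,c,d] / ((abd) + (a^2, b^2, c^2, d^2)), the degree-2 basis is all
# six square-free quadratics, so multiplication by a+b+c+d from A_1 to A_2 is the
# incidence matrix: the row of edge {u, v} has ones in the columns of u and v.
M = np.array([[1 if v in e else 0 for v in verts] for e in edges])

print(M.sum(axis=1))             # every row sums to 2, the target degree
print(np.linalg.matrix_rank(M))  # 4: full rank, so WLP holds in degree one

def rank_mod2(mat):
    """Rank over F_2 via row echelon, with rows packed into integer bitmasks."""
    basis = {}  # leading-bit position -> reduced row
    for row in mat:
        r = sum((int(x) % 2) << j for j, x in enumerate(row))
        while r:
            t = r.bit_length() - 1
            if t not in basis:
                basis[t] = r
                break
            r ^= basis[t]
    return len(basis)

print(rank_mod2(M))  # 3: the rank drops in characteristic two

# Reading off the rows gives the edge ideal of the 1-skeleton:
print([f"T_{u}*T_{v}" for (u, v) in edges])
```

The full rank in characteristic zero matches the analytic spread four of the edge ideal, and the rank three over F_2 matches the characteristic-two exception in the theorem.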
So this is what happens when you look at the rows of your matrix as exponents. When you instead look at the columns of your matrix as coefficients of linear forms, then in some sense you have a lot more structure, because what you have is a product of linear forms. And then, using the results of June Huh, where he used, again, the results of Trung and Verma to prove that the characteristic polynomial of a matroid realizable over a field, or of a central hyperplane arrangement, is log-concave, what you essentially get is that the matrix is a representation of some matroid, and then you can do lots of things with the combinatorics of matroids. So yeah, I guess that's it.