So welcome back, everyone. We're now at the last talk of this session. We'll be hearing a talk by Mirai Ikebuchi on homological methods in rewriting. She is currently a PhD student at MIT, advised by Adam Chlipala, I hope I didn't mispronounce it. She will be speaking about work she did completely independently, which was awarded the best paper award for junior researchers at the FSCD conference last year, the conference on Formal Structures for Computation and Deduction. So we're really curious to hear about your work here, please go ahead.

Okay, so hello. My name is Mirai, and in this talk I'm going to talk about homological methods in rewriting. This talk is about equational theories, or term rewriting systems, TRSs for short, so I'll begin by clarifying terminology around them. Both equational theories and TRSs consist of a set of variables, a signature, and a set of rules. A signature is a set of constant or function symbols, like this. If we fix a set of variables and a signature, we can construct terms looking like this. Then a rule is a pair of two terms. For equational theories, we use an equality sign to represent a pair, and for TRSs we use an arrow. The only difference between equational theories and TRSs is that if we don't care about the order of the two terms, it's an equational theory; if we do care about the order, it's a TRS.

So let's move on to the main question of this talk. We are given an equational theory or TRS. Can we tell whether there is any smaller theory or TRS equivalent to the given one? Smaller here means a smaller number of rules. In other words, can we know how many rules are actually needed? This talk will give a lower bound on that number using algebra. Later in this talk, I will give a brief introduction to and history of the algebra we are going to use. Here's an example: the theory of groups, presented by these five axioms.
But it's known that these three axioms are enough: they can derive the other two axioms on the right. And even smaller presentations are known. The theory of groups is equivalent to these two axioms, and if you use a symbol for division instead of multiplication, then there is a single axiom known to be equivalent to the theory of groups. So this is the smallest in terms of the number of axioms. But do we have a single axiom over the same signature, like this one, which has multiplication, inverse, and identity? We don't count the last rule as the smallest one, since it's over a different signature: it doesn't have any information about the identity e. So our question is this: is it possible to give a single axiom equivalent to the theory of groups without changing the signature? Tarski, Neumann, and Kunen showed the answer is no.

Then what about other equational theories or TRSs, not just the theory of groups? If we are given a theory or TRS, can we tell in a generic way how many rules are actually needed? At FSCD 2016, Malbos and Mimram gave a lower bound on the number of rules. Their theorem says that if we have a complete TRS (complete means terminating and confluent), then you can compute a number Mm, and any equivalent TRS has at least Mm rules. The number sign here means the cardinality of the set. But not many TRSs are known to have Mm greater than 1, so for many examples the number Mm just tells you that at least 0 or 1 rule is needed, which is of course trivial. Also, for the equivalent TRS here, the two signatures can differ. For those reasons, we want another lower bound.

Here is my theorem from FSCD 2019. We fix a signature Σ, and let's say R is a complete TRS over Σ. We use a new notion, the degree of R, to describe the precondition here, but I'll explain it later.
The theorem is: if the degree of R is 0 or prime, then we can compute a non-negative integer e(R) such that any equivalent TRS over the same signature has at least #R - e(R) rules. Equivalence here is equivalence between two TRSs over the same signature, and this lower bound is greater than or equal to Malbos and Mimram's lower bound Mm. Using this theorem, we can prove Tarski's theorem from earlier in a very simple way. Let's say R is a complete TRS for the theory of groups. Computing the degree and e(R), I got that the degree is 2, which means our theorem is applicable, and #R - e(R) is 2. So we can conclude that any TRS equivalent to the theory of groups over the same signature, consisting of multiplication, inverse, and identity, always has at least two rules.

Here is the outline of this talk. We are going to see the definitions of the degree and e(R) first, then examples, then a quick overview of my proof. So far this is the same as my FSCD 2019 talk, but since I have more time today, I'm also going to tell you about homology, which is the most important thing in the proof.

So we begin with the definitions. We define the degree of a TRS first. We assume every variable is written as x subscripted by a natural number, for simplicity. #_i of a term means the number of occurrences of the variable x_i in the term. The degree of a TRS is defined as follows: for each rule and for each variable, count the occurrences of the variable in the left term and in the right term and take the difference, then take the gcd of all those numbers. For example, let's think about this TRS. For the first rewrite rule, x1 occurs once on both sides, so we get the difference 0; and x2 occurs twice on the left side but doesn't occur on the right side, so we get 2. For the second rule, in the same manner, we get 3. Taking the gcd of these, we see the degree is 1. As a special case, having degree 0 is easily described.
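The degree computation just described can be sketched in code. This is my own encoding, not the speaker's: rather than parsing terms, I pass in the variable-occurrence counts for each side of each rule directly.

```python
from math import gcd
from functools import reduce

def degree(rules):
    """Degree of a TRS.

    rules: list of (left_counts, right_counts), where each side is a dict
    mapping variable index i -> number of occurrences of x_i on that side.
    The degree is the gcd of all per-variable occurrence differences.
    """
    diffs = []
    for left, right in rules:
        for i in set(left) | set(right):
            diffs.append(abs(left.get(i, 0) - right.get(i, 0)))
    # reduce with initial value 0: gcd(0, n) = n, and an empty list gives 0
    return reduce(gcd, diffs, 0)
```

On the talk's example, the first rule contributes the differences 0 (for x1) and 2 (for x2) and the second rule contributes 3, so the result is gcd(0, 2, 3) = 1.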
The degree of a TRS is 0 if and only if the rewrite relation of the TRS preserves the multiset of variables, as in this example: x1 to x1, x2 to x2, x3 to x2, like that.

To define e(R), we introduce a matrix associated to the TRS, which plays a very important role in connecting rewriting systems and algebra. Let's say R is a complete TRS that has n rules and m critical pairs. In case you don't know: a critical pair is a pair of two terms obtained from a single term in two different ways, satisfying some conditions. I won't state the conditions now, but you can find the definition in rewriting textbooks; it's not something I defined. Then we fix a rewriting strategy, and D(R), which I did define, is the n × m matrix whose (i, j) entry is computed using the i-th rewrite rule and the j-th critical pair. Here's the j-th critical pair: we normalize both sides, count the number of times the i-th rewrite rule, l_i → r_i, is used in each normalizing path, and then take the difference. That is the (i, j) entry.

Here's an example. This TRS preserves variables, so its degree is zero, and it has four critical pairs, so D(R) is a 4 × 4 matrix. For the first critical pair, the first rewrite rule a1 appears once on both sides, so we put 1 - 1 = 0 here; a2 appears twice on the left side, so we put 2; and the other rewrite rules don't appear, so we put 0s. Doing this for the rest, we get the whole matrix.

Then we can give e(R), which defines our lower bound. First, we consider the matrix D(R) over the integers modulo the degree d. That is, we are going to do matrix operations over the ring Z/dZ. Remember that our main theorem has the precondition that d is zero or prime. If d equals zero, Z/dZ is isomorphic to Z itself, and if d is prime, Z/dZ forms a finite field of order d.
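The construction of D(R) just described can be sketched as well. Again this is my own encoding: each critical pair is represented by the two lists of rule indices used along its two normalizing paths, which in the talk come from a fixed rewriting strategy.

```python
def rewriting_matrix(n_rules, critical_pairs):
    """Build the n x m matrix D(R).

    critical_pairs: list of (left_path, right_path), where each path is the
    list of rule indices applied while normalizing that side of the pair.
    Entry (i, j) is the difference of the number of uses of rule i on the
    two normalizing paths of critical pair j.
    """
    D = []
    for i in range(n_rules):
        row = []
        for left_path, right_path in critical_pairs:
            row.append(left_path.count(i) - right_path.count(i))
        D.append(row)
    return D
```

For instance, a critical pair whose left path uses rule 0 once and rule 1 twice, and whose right path uses rule 0 once, contributes the column (0, 2), matching the first column worked out in the talk.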
If d is prime, e(R) is defined as the rank of D(R). If d equals zero, it's something similar to the rank but a little more complicated: e(R) is the number of 1s and -1s in the Smith normal form of D(R). The Smith normal form is a normal form of a matrix over Z, looking like this, obtained by elementary row and column operations: only the diagonal elements can be nonzero, and each nonzero element divides the element to its lower right.

Then let's look at some examples. We already computed the matrix for this example. By row and column operations, we can reduce it into this form. e(R) is the number of 1s and -1s, so in this case it's 1. Applying the main theorem, we see that any TRS equivalent to R has at least three rules. In other words, there's no equivalent TRS with two or fewer rules, and we do have an equivalent TRS with three rules: a1, a2, and a3.

For the theory of groups, a complete TRS has 10 rewrite rules and 48 critical pairs, so we won't compute the matrix here, of course, but I implemented a program to compute the matrix and e(R). I ran the program, and e(R) was 8, which implies that any equivalent TRS has at least 10 - 8 = 2 rules. So the theory of groups cannot be presented by just one rule over the same signature. My program also computes Malbos and Mimram's lower bound, and that was zero.

The last example is a TRS for the average and successor of natural numbers. It has one critical pair, and D(R) is the 5 × 1 zero matrix, since the left path and the right path use the same number of each rule, so e(R) is zero. And e(R) = 0 means the original TRS R is already one of the smallest. This can be generalized: for any TRS whose critical pairs are all of this type, there is no smaller equivalent TRS, where "this type" means the left path and the right path use the same rewrite rules the same number of times. I think this is pretty interesting.
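For prime degree d, e(R) is the rank of D(R) over the field Z/dZ, which can be computed by ordinary Gaussian elimination. A minimal sketch in pure Python (this uses pow(x, -1, p) for modular inverses, which needs Python 3.8+; the function name is my own):

```python
def rank_mod_p(matrix, p):
    """Rank of an integer matrix over the finite field Z/pZ (p prime),
    by Gaussian elimination. For a complete TRS R with prime degree d,
    e(R) = rank_mod_p(D(R), d), and the lower bound is #R - e(R)."""
    A = [[x % p for x in row] for row in matrix]
    rank = 0
    for col in range(len(A[0]) if A else 0):
        # find a pivot row at or below the current rank
        pivot = next((r for r in range(rank, len(A)) if A[r][col]), None)
        if pivot is None:
            continue
        A[rank], A[pivot] = A[pivot], A[rank]
        inv = pow(A[rank][col], -1, p)          # modular inverse
        A[rank] = [x * inv % p for x in A[rank]]
        for r in range(len(A)):                 # clear the pivot column
            if r != rank and A[r][col]:
                c = A[r][col]
                A[r] = [(a - c * b) % p for a, b in zip(A[r], A[rank])]
        rank += 1
    return rank
```

For the group-theory TRS, the talk reports that this kind of computation gives rank 8 for the 10 × 48 matrix over Z/2Z, hence the bound 10 - 8 = 2.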
I don't know if this has been shown another way, but let me know if you do. Now let's see the proof. From now on we assume the degree is prime. Assuming d is prime makes the proof simpler because we can use linear algebra: Z/dZ forms a field, not just a ring, and its n-th Cartesian power forms an n-dimensional vector space.

We also use Malbos and Mimram's result. Even though their lower bound didn't give many interesting examples, the theory behind it is actually important, so we first look into their theory briefly. To define their lower bound, they introduced two linear maps between Z/dZ-vector spaces whose dimensions are... I hear echo, is that okay? (This sounds okay. I think it's okay.) I think that was my fault, sorry. Is it okay now? (I think so.) Okay. So to define their lower bound, they introduced two linear maps between vector spaces whose dimensions are the number of rules, the size of the signature, and the number of critical pairs, and they are named ∂̃₁ and ∂̃₂. In fact, we are already familiar with the second map: the matrix D(R) we've seen is the matrix presentation of the second map when the TRS is complete.

Malbos and Mimram's lower bound Mm is defined as the dimension of the quotient space of the kernel of the first map by the image of the second map. Here the kernel of a linear map is the subspace of vectors that map to zero, and the image is the subspace of images of all vectors under the map. Then Mm is less than or equal to the cardinality of R. This is shown by abstract linear algebra: the dimension of a quotient space is less than or equal to the dimension of the space we are quotienting, and since the kernel is a subspace of (Z/dZ)^#R, we get the inequality Mm ≤ #R. And the core theorem is that their lower bound is invariant under equivalence, where equivalence here is the version in which the signatures can differ.
It's shown using homological algebra, and the quotient vector space, kernel over image, is called the second homology. By showing that Mm is invariant under equivalence, we can change R into any equivalent TRS, which means any equivalent TRS has at least Mm rewrite rules. That is a very rapid introduction to Malbos and Mimram's paper.

So let's move on to our proof. Our lower bound #R - e(R) is equal to the dimension of this quotient vector space, which we name V. We can show that using some basic facts from linear algebra as well: the dimension of a quotient space is the difference between the two dimensions, the first term is just #R, and the second term is the rank of the matrix D(R), which by definition is just e(R). Also, by some theorems of linear algebra, the dimension of V is equal to the sum of the dimensions of these two vector spaces, which is less than or equal to the cardinality of R. The kernel-over-image part here is the second homology, so it's invariant under equivalence. The dimension of the image of ∂̃₁ does depend on the signature, but it's invariant under equivalence between two TRSs over the same signature. In summary, we get these equalities, and #R - e(R) is invariant under equivalence over the same signature. So you can replace R with any R′ equivalent to R, and then we get the desired inequality.

Now remember the statement of the main theorem. We have a precondition saying the degree of R is zero or prime, and we've seen the case where the degree is prime. The proof can be extended to the case d = 0, but not to the case where d is neither zero nor prime. Basically this is because the ring Z/dZ is more complicated in that case; in particular, it has zero divisors (if d is 4, then 2 × 2 is 0 modulo 4), and many useful theorems don't work for rings with zero divisors. For such rings, for instance, the Smith normal form used to define the lower bound is not well defined for matrices.
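As I reconstruct the chain of (in)equalities from the definitions above (here V is the quotient of (Z/dZ)^#R by the image of ∂̃₂, and R′ is any TRS equivalent to R over the same signature; the grouping of terms is my reading of the argument):

```latex
\#R - e(R) \;=\; \dim V
\;=\; \dim\bigl(\ker\tilde{\partial}_1 / \operatorname{im}\tilde{\partial}_2\bigr)
      \;+\; \dim\operatorname{im}\tilde{\partial}_1
\;\le\; \#R'
```

The first summand is the second homology, invariant under equivalence, and the second summand is invariant under equivalence over a fixed signature, which is why the bound transfers from R to R′.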
Okay, so we saw something called homology in the proof. Now I want to talk more about homology and its relation to rewriting. Even before my work and Malbos and Mimram's research, homology groups appeared in rewriting in the 1980s, but for string rewriting. So let's talk about string rewriting. String rewriting systems, SRSs for short, are the same as TRSs, but about strings instead of terms. We have an alphabet, that is, a set of characters, and a set of rules. For instance, we have this kind of SRS here, where ε is the empty string. If we have the string abab, we can rewrite ba here to ab, so we get aabb; and abb rewrites to ε, the empty string, so we get a.

As a first step, let's see how string rewriting relates to algebra. There is a relationship between an SRS and a monoid. Given an SRS, we can define a monoid: the quotient of the strings over the alphabet by the reflexive symmetric transitive closure of the rewrite relation, with string concatenation as the multiplication. In this situation, the SRS is called a presentation of that monoid. For example, say our alphabet has just a single character a and a single rule aa → ε. Then the monoid has just two distinct equivalence classes. We write brackets for equivalence classes; the class of aa is equal to that of ε because aa rewrites to ε. For the next example, we have two characters a, b and the rule ba → ab. Then the monoid looks like this, since the class of ba equals that of ab by the rewrite rule. And not only can we define a monoid from an SRS: any two equivalent SRSs present isomorphic monoids. Conversely, if we are given a monoid, we can always find an SRS that presents the monoid, if we are allowed infinite alphabets and rule sets. So that is the relationship between monoids and SRSs.

Let's move on to homology groups. What are homology groups? Homology groups appear in many contexts.
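As a brief aside, the string-rewriting procedure shown in the example can be sketched in a few lines. This is my own minimal implementation (a naive leftmost strategy; it terminates only when the SRS is terminating), using the example's two rules ba → ab and abb → ε:

```python
def normalize(s, rules):
    """Rewrite string s with the given (lhs, rhs) rules until no rule
    applies, always replacing the leftmost occurrence of the first
    applicable rule. Terminates only for terminating SRSs."""
    changed = True
    while changed:
        changed = False
        for lhs, rhs in rules:
            if lhs in s:
                s = s.replace(lhs, rhs, 1)  # one leftmost replacement
                changed = True
                break
    return s
```

On the talk's example, "abab" first becomes "aabb" via ba → ab, then "a" via abb → ε, matching the normalization described above.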
The most famous is the homology of a topological space, and it may sound somewhat confusing, but there is also a notion of homology groups of a group. And not just groups: the Fields medalist Quillen observed that we can define homology groups of a general algebraic system. One common point in all these settings is that homology groups are abelian groups, and they extract some information from the object we are given.

Here are examples for topological spaces. We have a sequence of homology groups for each topological space, and for any two homeomorphic spaces, like a sphere and a cuboid, the sequences are the same. For a sphere, counting from zero, the zeroth homology is Z, the first homology is the trivial group consisting of just the identity, the second is Z, and the later ones are all trivial. For a torus, the first homology is Z × Z, the product of two copies of Z, and the others are the same as for the sphere. For surfaces like these, the first homology tells us the number of holes. A sphere has no holes, so its first homology is trivial; for a torus we have a hole here, and we can think of the space inside the torus as a hole too, so we have two holes, corresponding to the fact that the first homology has two copies of Z.

For homology groups of groups, I won't give concrete examples because that would get complicated, but basically there is a sequence of homology groups for each group, and any two isomorphic groups give the same sequence. The homology groups of a group, also called group homology, are important for thinking about the homology of a theory or TRS. Similar to monoids, there is a notion of group presentation. A monoid presentation was a pair of an alphabet Σ and a set of rules R, but for a group presentation, R is a set of strings, and strings can contain a formal inverse of a character in the alphabet.
Then, to present a group from Σ and R, we first create a monoid presentation from them. The new alphabet is Σ together with its formal inverses, and the rules are w → ε, where w is a string in R, together with xx⁻¹ → ε and x⁻¹x → ε for each character x. The rules xx⁻¹ → ε and x⁻¹x → ε are the identity rules, so they equip the presented monoid with a group structure, and any group can be presented in this way, just as with monoids.

The reason I'm talking about groups is that group theorists have worked on small presentations of groups for a long time, and since we are interested in small presentations of equational theories or TRSs, that sounds similar. A result by Epstein says that if a group G is presented by finite Σ and R, then the difference between the number of rules and the number of characters is bounded below by a quantity built from the homology groups of G, where s denotes the minimal number of generators and rank the torsion-free rank. If you don't know these notions: both s and rank are something like the dimension of a vector space, even though the homology groups here are not vector spaces. If we move the -#Σ to the right-hand side, we get a lower bound on the number of rules when we fix the alphabet. That sounds much more like our result for TRSs, and indeed the result for TRSs is the consequence of generalizing this inequality to TRSs. This is how I reached the result.

For the generalization we need homology groups for monoids or TRSs, so let's look at that. Since monoids are pretty similar to groups, we can construct homology groups for monoids or SRSs in the same way as for groups. In this case, too, any two equivalent SRSs, or isomorphic monoids, give the same homology groups. But not many people had worked on this homology, and no applications to rewriting were known until 1987.
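Written out, the inequality just mentioned reads as follows. This is my reconstruction of which homology groups are involved (H₁ and H₂ being the first and second homology groups of G), since the talk only names the quantities s and rank:

```latex
\#R \;-\; \#\Sigma \;\ge\; s\bigl(H_2(G)\bigr) \;-\; \operatorname{rank}\bigl(H_1(G)\bigr)
```

Moving #Σ to the right-hand side then gives the lower bound on #R for a fixed alphabet that the talk refers to.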
In 1987, Squier, a group theorist, published an amazing paper in which he solved an open problem of the time. The problem asks whether there exists a monoid with a solvable word problem that cannot be presented by any finite complete SRS. A monoid with a solvable word problem means that checking equality of elements in the monoid is decidable. Notice that if a finite complete SRS presents a monoid, the word problem of that monoid must be solvable: if you want to check whether two elements are equal, you can rewrite them by the SRS until they cannot be rewritten anymore, and then just check whether the resulting strings are equal. So the problem asks whether the converse is true.

What Squier discovered is that if the third homology group of a complete SRS is not finitely generated, then the SRS must be infinite. His theorem is actually stronger than that. And remember that if two SRSs are equivalent, their homology groups are the same. So if we have a complete SRS whose third homology group is not finitely generated, then not just that SRS but all SRSs equivalent to it are infinite, which means we cannot have finiteness and completeness at the same time. This is how he solved the problem, and this was the very first time that homological algebra was applied to rewriting.

I have to mention something about this. I found out about this topic when I was reading the French Wikipedia article on rewriting; the English version doesn't have this. So thank you to the French speaker who wrote that article.
I published a paper, received an award, and was invited here, and it's all because of you, so thank you so much, French speaker. Merci beaucoup.

Remember we saw this diagram here: SRSs are closely related to monoids. Now let's see what happens for TRSs. For strings we can define multiplication as concatenation; then what about terms? Instead of trying to define multiplication of two terms, we define multiplication of two tuples of terms, by substitution. For instance, we have these terms, and their multiplication is obtained by substituting the first element of the tuple for x1 in the left term and the second element for x2, and we get this term. We can multiply these two tuples, with a tuple on the left of the dot as well, in the same way: the first element for x1 in the left, the second element for x2, the third element for x3. So if we multiply an n-tuple with k kinds of variables by a k-tuple with m kinds of variables, we get an n-tuple with m kinds of variables. To multiply them, the k's here must coincide. So multiplication is now defined, but not for arbitrary pairs of tuples: it is typed, and the types must coincide. Tuples of terms form something like a monoid, but with a typed multiplication. What is that algebraic structure called? It's called a category.

More precisely, let's see how the category of terms is defined. The objects are natural numbers, so morphisms are defined between natural numbers: the morphisms from k to n are the n-tuples of terms with variables x1, x2, ..., xk, and composition is the substitution, the multiplication here, as the usual composition. We can see that the types match well, since the k on one side is the number of kinds of variables and the k on the other means we have a k-tuple. Finally, the identity morphism from n to n is the tuple (x1, ..., xn). This category is the term version of the free monoid, the Kleene star of an alphabet with multiplication by
concatenation. We can generalize that category to something called Lawvere theories. A Lawvere theory is a category whose objects are the natural numbers, where each object n is the n-th categorical power of the object 1. The latter part means that any morphism from n to k is a k-tuple of morphisms from n to 1. The relationship between TRSs and Lawvere theories is pretty much the same as the relationship between SRSs and monoids: every Lawvere theory is presented by some TRS, and in the presented Lawvere theory, a term t is identified with a term s if and only if t can reach s by a sequence of rewriting steps in either direction. This is how we relate a TRS to an algebraic structure.

I can't give the definition of the homology concretely, it's complicated, but here is the timeline. Homological algebra on Lawvere theories was first investigated by Jibladze and Pirashvili: they defined cohomology groups of Lawvere theories, and cohomology groups are something dual to homology groups. Malbos and Mimram defined the homology, figured out that the second homology is computable when the given TRS is complete, and showed that the number of rules is bounded below using the homology. And I gave a better lower bound, as I talked about.

In conclusion (sorry, let's see, sorry about that), we have a lower bound on the number of rewrite rules needed to present a TRS over a fixed signature, and we saw a relationship between rewriting and abstract algebra. We have new algebraic tools, and I'm hoping this work gives more research directions for TRSs. So that's all, thank you for listening.

Thank you very much, it was really exciting work and a nice application of interdisciplinary results. It shows us how important open access is in science, see you on Wikipedia, and that we all should learn French. So, are there any open questions? I would have a question. Whoops, okay, continue. After you. Just about the, is there a constructive way to get the smaller
TRS in your method, or is there hope to build it?

Sorry, I didn't hear you well.

Is it constructive, the way you get a smaller system of rules?

Is your question whether, from the lower bound, you can constructively extract a TRS that achieves that lower bound?

Yeah.

We can constructively compute the number, but not the new system of rules from the old one.

And your implementation, your code to compute the number, does it also compute the new system, the smallest system?

Oh sorry, no, we can't compute the system. It can find a lower bound, but it won't construct the system.

Okay, thank you.

Next question: have you thought about this? What I'd like to do is, instead of using simply the number of occurrences of the variables, replace that by an unspecified polynomial depending on it, and then replace the degree, instead of being prime, with an irreducible polynomial. It seems to me everything after that would work just as well, and I'm wondering whether that might extend the lower bounds to other cases, possibly powers of primes. I know it's a very vague question, but the thing is that the algebra, I think, would work equally well on an irreducible polynomial as it would on a prime number.

Yeah, that's interesting. I'm not sure what will happen right now, but thank you for the suggestion.

I would have a question related to bringing this a little closer to lambda calculus. We know, for instance, that there is the S and K combinator calculus, and there is also a single axiomatization of the same with Rosser's X combinator. I don't know if your lower bound can tell us whether we could search for even simpler forms in some of those situations. The other thing is that in intuitionistic propositional logic, which is a bit more complicated than the Boolean one, minimal axiomatizations are known, but I don't think there is a proof that any of them is minimal. So maybe applying the techniques that you have to some of those could tell us about that. It's more of a possible suggestion to reuse the same formalism.

Yeah, that's interesting. This applies only to first-order term rewriting, so it's not applicable right now to lambda calculus or higher-order things. But that would be interesting.

Have you considered extending this to higher-order rewriting systems?

I thought about that, but it's not an easy thing, so I'm not sure how to extend it.

I was wondering one thing: for the simpler case of linear term rewriting systems, which is a special case of what you're doing, do you get something nice in that case from your bound?

Yeah, that case might be much simpler. I'm not sure right now.

Okay. I had lots of questions, it was a very nice talk. One thing I wanted to ask about: looking at the example of monoids, the axiomatization of monoids. You gave the example of groups, but can you extract a lower bound for the axiomatization of monoids, that is, an associative and unital operation?

Yeah, I tried that, but I didn't get any non-trivial inequality.

Okay. Is it known what the minimal axiomatization is, like, do you need three axioms?

I don't know about that.

Okay, but thanks again, it was a very nice talk.

You're welcome.

Perfect. Are there any other questions? Perfect, so thank you very much, it was a really nice talk, thank you again. This ends the first part of this after