Welcome to my talk. The title is somewhat scary: the paper is about using layer systems for proving confluence of first-order term rewriting systems. Let me provide some context. As I said, the setting is first-order term rewrite systems, and I'm lucky that the previous talk was about ground tree rewrite systems, where we had an example that used different letters but looked something like this: we had some f(a, b), and this could be rewritten to f(a'(c), b) using a rule that says a can be rewritten to a'(c). I should say that we talk about term rewrite systems, where we consider these trees as terms. So this left-hand side would be f(a, b), and similarly the right-hand side would be f(a'(c), b). The first-order aspect is that we allow variables in rules. That is, we could have a rule that rewrites f(x, y), where x and y are placeholders for arbitrary subtrees, to, well, to make this example work, f(a'(x), y). This left-hand side has an instance where we replace x by a and y by b, which results in the term f(a, b), and using the same substitution for the right-hand side we obtain f(a'(a), b). So that's what first-order term rewriting is, and that's what we are talking about. The origin of this is actually equational reasoning, where, for example, f could be addition, and we could have a rule saying that x plus y equals some right-hand side; it wouldn't look like this, but we will see an example later. The next concept we use is confluence. Confluence is useful for us because, when we consider a term rewrite system as a model of computation, it tells us that the results are well defined. Confluence will be introduced on the next slide. And confluence is actually useful without termination. I mean, we can ask whether these rewrite sequences are always finite; that's the problem of termination.
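To make the rewriting step from the example concrete, here is a minimal sketch, not from the talk: terms are encoded as nested tuples, variables as plain strings, and the rule f(x, y) → f(a'(x), y) is instantiated with the substitution {x ↦ a, y ↦ b}. The encoding and helper name are my own choices.

```python
# Terms as nested tuples: ("f", arg1, arg2); constants as 1-tuples like
# ("a",); variables as plain strings. This encoding is illustrative.

def substitute(term, sigma):
    """Replace variables in `term` according to the substitution `sigma`."""
    if isinstance(term, str):                      # a variable
        return sigma.get(term, term)
    head, *args = term
    return (head, *(substitute(a, sigma) for a in args))

# Rule from the talk: f(x, y) -> f(a'(x), y)
lhs = ("f", "x", "y")
rhs = ("f", ("a'", "x"), "y")

sigma = {"x": ("a",), "y": ("b",)}                 # x maps to a, y maps to b

print(substitute(lhs, sigma))   # ('f', ('a',), ('b',))        i.e. f(a, b)
print(substitute(rhs, sigma))   # ('f', ("a'", ('a',)), ('b',)) i.e. f(a'(a), b)
```

Applying the same substitution to both sides is exactly what makes the instance f(a, b) rewrite to f(a'(a), b).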
But even when the rewrite system is not terminating, it could be a program that keeps running forever, and we are still interested in whether the computation paths diverge or can always produce the same intermediate results. I skimmed over the "general model of computation" part. The idea here is that functional programming languages can be considered as definitions of term rewrite systems, which are not first-order but higher-order, but that's not so important. And combinatory logic, which is computationally universal, is an instance of a first-order term rewrite system. In our group we are actually interested in automation, that is, deriving decision procedures for subclasses of term rewrite systems, originally for termination. But there is also interest in automating this for confluence. Of course, the underlying problem, like almost all interesting problems in computer science, is undecidable, and the interest lies in finding powerful methods that decide as many problems as possible in practice. So let's move on: what is confluence? Well, confluence just says that if we have a starting term s and rewrite sequences, applying those rules, that result in different terms t and u, then we can find a common reduct, that is, a term v such that there are rewrite sequences from both t and u to v. The origin of this was equational reasoning, and there are methods to decide equational systems that start with a set of equations and try to orient and complete them; that's Knuth and Bendix's contribution. Underlying it is a criterion for confluence, namely that if a rewrite system is terminating and locally confluent, that is, confluent when the rewrite sequences from s to t and from s to u are restricted to one rewrite step, then it is also confluent in general.
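The peak-and-join picture of confluence can be sketched with a tiny ground rewrite system, my own toy example rather than one from the talk: from the peak b ← a → c, both sides rejoin at a common reduct.

```python
# A tiny ground TRS as a graph on constants: a -> b, a -> c, b -> d, c -> d.
# The rule set is illustrative; it exhibits the peak b <- a -> c that
# rejoins at the common reduct d.

RULES = {"a": {"b", "c"}, "b": {"d"}, "c": {"d"}}

def reducts(t):
    """All terms reachable from t in zero or more rewrite steps."""
    seen, todo = {t}, [t]
    while todo:
        s = todo.pop()
        for u in RULES.get(s, ()):
            if u not in seen:
                seen.add(u)
                todo.append(u)
    return seen

def joinable(t, u):
    """The confluence condition for one peak: a common reduct exists."""
    return bool(reducts(t) & reducts(u))

print(joinable("b", "c"))   # True: both rewrite to d
```

For real term rewrite systems the reachable set is generally infinite, which is why confluence needs proof techniques rather than exhaustive search; the sketch only illustrates the definition.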
Another important criterion is orthogonality, which is essentially the justification for functional programming: roughly, if in a local context the rules are always unambiguous, then the system will be confluent. This is vague, not a formal definition; I will have an example later, but it's not so important for this talk. And of course there are many other techniques, which I will not explain now. Today we are interested in modularity, many-sorted persistence, and order-sorted persistence. To explain what this means, we need some definitions, so we now introduce term rewrite systems formally. What is a term rewrite system? Well, we have a signature, which consists of a set of function symbols, each with a given arity, and a set of variables. Then we can construct terms over this signature inductively, in pretty much the obvious way: each variable is a term, and f applied to a number of terms given by its arity is again a term. Then we can define a term rewrite system as a set of rules, each with a left-hand side and a right-hand side, both of which are terms. We require that the variables occurring on the right-hand side also occur on the left-hand side, and that the left-hand side is not a variable, because if the left-hand side were a variable, we could replace any subterm of a given term by the right-hand side, which is a fairly degenerate case. So that's the ordinary TRS. To explain persistence, we will use many-sorted TRSs, where, in addition to the signature, we have a set of sorts: every subterm of our terms has a sort. Sorts could be natural numbers, integers, lists, et cetera; that's the intuition. Furthermore, we have distinct variables for each sort.
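The two restrictions on rules just mentioned are easy to state in code. A minimal sketch, again with my own tuple encoding rather than anything from the talk:

```python
# The two well-formedness conditions on rules from the definition:
# (1) the left-hand side is not a variable, and
# (2) every variable of the right-hand side occurs in the left-hand side.
# Variables are plain strings; function applications are tuples.

def variables(term):
    """The set of variables occurring in a term."""
    if isinstance(term, str):
        return {term}
    _, *args = term
    out = set()
    for a in args:
        out |= variables(a)
    return out

def valid_rule(lhs, rhs):
    return not isinstance(lhs, str) and variables(rhs) <= variables(lhs)

print(valid_rule(("f", "x", "y"), ("f", ("a'", "x"), "y")))  # True
print(valid_rule("x", ("f", "x")))       # False: lhs is a variable
print(valid_rule(("f", "x"), ("f", "y")))  # False: y is not on the left
```

The last two calls show exactly the degenerate cases the definition rules out.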
And instead of an arity, each function symbol has a signature: the sorts of its arguments and the result sort of the function. The definition of terms is then changed accordingly. We have terms for each given sort alpha, that is, terms of that sort, which can be constructed either from variables of that sort or by applying a function with the right result sort to terms of the correct argument sorts; that's where the signature comes in. I abbreviate notation by writing that f with the signature alpha 1 up to alpha n to alpha is included in F. Rules are essentially defined in the same way as before, with the restriction that the sorts of the left-hand side and the right-hand side must be equal. One further tweak to this definition results in order-sorted TRSs, where sorts are ordered by a subsort relation. So, for example, the natural numbers are a subsort of the integers, with the intuition that wherever we require an integer, we can also use a natural number. Accordingly, for the terms, we then allow at argument positions not only terms of the precise argument sort, but terms of any subsort as well. And for rewrite rules, we require that the right-hand side is of the same sort as the left-hand side or of a more specific sort. One final technical definition, in this case for order-sorted TRSs: we say that a variable in a term is strictly bound if its sort exactly matches the required argument sort of the function symbol where it occurs. So these are known results. The first result is about ordinary TRSs: if R1 and R2 are TRSs over disjoint signatures, then there is a result by Toyama from 1987 that says that both of these TRSs are confluent if and only if the union of the two TRSs is confluent.
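Sort-checking a many-sorted term follows the inductive definition directly. Here is a sketch, with an illustrative signature of my own (zero, successor, and minus over a sort "nat", plus a "list" constant to show a mismatch):

```python
# Many-sorted terms: each function symbol has argument sorts and a result
# sort. The signature and sorts below are illustrative, not from the talk.

SIG = {                       # f : alpha_1 x ... x alpha_n -> alpha
    "0":     ([], "nat"),
    "s":     (["nat"], "nat"),
    "minus": (["nat", "nat"], "nat"),
    "nil":   ([], "list"),
}
VARS = {"x": "nat", "y": "nat"}    # distinct variables for each sort

def sort_of(term):
    """The sort of a well-sorted term, or None if it is ill-sorted."""
    if isinstance(term, str):                   # a variable
        return VARS.get(term)
    head, *args = term
    arg_sorts, result = SIG[head]
    return result if [sort_of(a) for a in args] == arg_sorts else None

print(sort_of(("minus", ("s", "x"), ("0",))))  # 'nat': well sorted
print(sort_of(("s", ("nil",))))                # None: a list where nat is required
```

In the order-sorted extension, the equality test on argument sorts would be relaxed to "is a subsort of"; the strict equality here is exactly what "strictly bound" demands of variables.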
Then for many-sorted TRSs, there is the result that when a many-sorted TRS is confluent as a many-sorted TRS, then, when we forget about the sorts, the resulting ordinary TRS is also confluent, and vice versa; this result is by Aoto and Toyama from 1996. In the same year they published a technical report claiming that this extends to the order-sorted case as well, if we require that all variables are strictly bound everywhere. So why is this interesting? Well, consider this TRS, where we define the difference of two numbers by saying that the difference of the successors of two numbers is the difference of the two numbers; the difference of x and zero is x; conversely, the difference of zero and x is zero; and the difference of two equal numbers is also zero. Assume that this is extended with a rule on lists that says we can replicate elements infinitely: repeat(x) is the infinite list consisting of copies of x. The problem with this TRS is that the classical conditions do not apply. The TRS is not terminating, so we cannot use Knuth-Bendix. And the TRS is not left-linear: there is a left-hand side, on the right-hand side of the slide, where a variable occurs more than once. Therefore we cannot use these criteria to conclude confluence, and it turns out that a lot of more advanced criteria also don't work. However, using modularity, we notice that the last rule does not use any function symbols that occur in the other rules, so we can decompose the TRS into two TRSs, namely the first four rules and the last one. We can prove confluence for each of these individually and use modularity to conclude that the whole TRS is confluent. So the claim is that this is actually useful and interesting. On the other hand, for the claim about order-sorted TRSs, we have a counterexample. Here we have a rather large term rewrite system.
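The decomposition step used here, grouping rules so that different groups share no function symbols, can be sketched as follows; the rule encoding is mine, but the rules are the minus/repeat system just described:

```python
# Decomposing a TRS into components with pairwise disjoint signatures,
# as used for the modularity argument. Encoding is illustrative.

def symbols(term):
    """Function symbols occurring in a term (variables are plain strings)."""
    if isinstance(term, str):
        return set()
    head, *args = term
    out = {head}
    for a in args:
        out |= symbols(a)
    return out

RULES = [   # the minus rules plus the repeat rule; x, y are variables
    (("minus", ("s", "x"), ("s", "y")), ("minus", "x", "y")),
    (("minus", "x", ("0",)), "x"),
    (("minus", ("0",), "x"), ("0",)),
    (("minus", "x", "x"), ("0",)),          # not left-linear: x occurs twice
    (("repeat", "x"), ("cons", "x", ("repeat", "x"))),   # not terminating
]

def components(rules):
    """Partition rules into groups whose signatures are pairwise disjoint."""
    groups = []                              # list of (symbol set, rule list)
    for l, r in rules:
        merged = [(symbols(l) | symbols(r), [(l, r)])]
        for gs, gr in groups:
            if gs & merged[0][0]:            # shares a symbol: merge groups
                merged[0] = (merged[0][0] | gs, merged[0][1] + gr)
            else:
                merged.append((gs, gr))
        groups = merged
    return [g for _, g in groups]

print(len(components(RULES)))   # 2: the four minus rules, and the repeat rule
```

Modularity then lets us prove confluence for the two components separately and conclude it for their union.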
There is some symmetry in the rules here: for each rule in the left column, the rule in the right column is obtained by replacing f by g and g by f. It turns out that all requirements are satisfied; in fact, except for the last rule, this is a many-sorted TRS. In the last rule, the left-hand side has sort 1, that is, the result sort of c, while the right-hand side has sort 0, the sort of x. And why is this a counterexample? Well, there are some boring parts in the rewrite graph, but this is the interesting part. We have two components that look very similar: f(x, o) can be rewritten to a, and in a cycle, and g(x, o) can be rewritten in a similar way, and to b. But the point is that in the order-sorted setting, we cannot replace these x's by o, because o has sort 1, while the contexts where the x's occur require sort 0, and 1 is not a subsort of 0; in fact, the relation goes the other way. However, when we pass to the ordinary TRS, then of course we can replace x by o, and then the two components collapse, and we have some term that rewrites to both a and b. And we cannot join a and b, because these are normal forms. This result on modularity, which sums up the first part, has been proven several times. The most important of these proofs is the so-called simplified proof (that's part of the paper title) by Klop and others. There are a number of derived results, like the persistence result we have just seen, that are based on the simplified proof. The technical report was also based on this proof, but turned out to be wrong. So we thought it would be interesting to look at the common core of these proofs, find where the actual mistake is, and try to find common ground so that we can prove similar generalized results more easily. I should speed up, I guess. So what is common in these proofs?
Well, consider this example for modularity, where we have two disjoint signatures. The first TRS is not important; the second TRS has a rule c(x) → x. How do these proofs deal with the fact that there are two disjoint signatures? What they do is take a term and look at the points where a subterm doesn't fit, that is, where the signature changes. In this case, for f(c(f(x))), these breaks in the signature happen between f and c, and then again between c and f. So we have split terms; that's one common concept. In the many-sorted case, such a break would happen when the argument sort of a function does not match the sort of the actual parameter. In this example, we could rewrite f(c(f(x))) to f(f(x)), and this can be done on the split term as well: we would have f, then a break (because there was a break previously), then f(x). But it turns out that these actually have the same signature, so there shouldn't be a break there, and a further step, called fusion, merges the two layers, resulting in f(f(x)) with no breaks at all. The second common concept is that the parts a term can be broken into, in this case f applied to a hole that can be replaced by other parts, and so on, occur in all these proofs, and we call them layers. Rewriting, and that's also common, acts on layers pretty much independently. And, as I've already shown, layers can fuse. So a layer system is essentially a set of terms over an extended signature, where we add a placeholder, a hole, for subterms that didn't fit the signature, such as f applied to a hole. That's what a layer system is; that's half of the title of the paper. And we have the following results. I should say that this is not the complete definition that we use; some conditions are missing. So what are our results?
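The splitting step, cutting a term at each signature break and leaving a hole behind, can be sketched like this; the term f(c(f(x))) and the two one-symbol signatures follow the talk, while the encoding and helper names are mine:

```python
# Splitting a mixed term at signature breaks: wherever the function symbol
# belongs to the other signature, cut and leave a hole "[]". Illustrative
# encoding; a full rank decomposition would split the cut-off parts too.

SIG1 = {"f"}    # symbols of the first TRS
SIG2 = {"c"}    # symbols of the second TRS

def split(term, sig_of):
    """Return (top layer with holes, list of cut-off alien subterms)."""
    def go(t, current):
        if isinstance(t, str):              # variables belong to any layer
            return t, []
        head, *args = t
        if sig_of(head) != current:         # signature break: cut here
            return "[]", [t]
        layer_args, cut = [], []
        for a in args:
            la, c = go(a, current)
            layer_args.append(la)
            cut.extend(c)
        return (head, *layer_args), cut
    return go(term, sig_of(term[0]))

sig_of = lambda h: 1 if h in SIG1 else 2
top, aliens = split(("f", ("c", ("f", "x"))), sig_of)
print(top)      # ('f', '[]'): the top layer with a hole
print(aliens)   # [('c', ('f', 'x'))]: the subterm below the break
```

Rewriting the alien c(f(x)) with c(x) → x and then fusing the now-compatible layers back together is exactly the f(f(x)) scenario described above.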
Well, if R is a left-linear TRS, that is, variables on left-hand sides do not occur more than once, and the layer system is weakly consistent, which I will define on the next slide, then, if the TRS is confluent on the layer system restricted to the actual terms without holes, the TRS is also confluent on all terms, as an ordinary TRS. We have the same result for non-duplicating TRSs; the reason that it is a separate theorem is that the proof is completely different. And for general TRSs, there are additional constraints that have to be satisfied. This slide I will not explain, but this is what these conditions look like; there is a lot of notation here that I haven't introduced. But let's look at how these conditions translate in the order-sorted modularity case. What is the layer system in this case? The layer system is just the set of well-sorted terms, and after we have constructed those, we also add the terms obtained by replacing variables by holes, these small squares. It turns out that the first, third, fifth and eighth conditions are trivially satisfied. The second condition essentially means that we need infinitely many variables of each sort, which is a standard assumption. The fourth condition is satisfied if we require just that, in left-hand sides of rules, variables are strictly bound; the same holds for the sixth. The seventh condition is a bit more complicated, but interestingly, it is the reason that the counterexample actually is a counterexample. If we look at this again: in the split terms, we have a rewrite like this. We have f with a left argument and o in one layer, then the left argument is o in the next layer, and that rewrites as displayed here.
From there, using the first rule, that is, f(x, y) goes to capital F of x, c(x), and y, and then applying the second rule, we would have, instead of this o, a separate layer in the second argument. But it turns out that the sort actually fits, and we can fuse these layers. That is, we applied a rewrite step at the top of the term, and this caused some fusion to happen, and it turns out that the proof fails if this is allowed. The correct condition that we can derive here is that variables in right-hand sides of rules have to be strictly bound, which was also required before, and, for so-called collapsing rules, where the right-hand side is a variable x, the sort of x must be maximal with respect to the given order. So this is the result for order-sorted TRSs; many-sorted persistence and modularity are easy consequences of it. We have some further results that also fit this framework of layer systems. To conclude, we have seen modularity and persistence, and layer systems. We have introduced layer systems as a common concept used in all these proofs, and have presented a counterexample and the corrected result for order-sorted persistence. I have omitted all proof details because they are really very technical, and I think that without context they wouldn't make sense to present. Future work would be to simplify this proof a bit and formalize it in an interactive theorem prover, and, yeah, to find more applications for this framework. Thank you.