String-rewriting systems provide a complete model of computation. They have been studied since the beginning of computability theory, and some classes of string-rewriting systems have been identified as giving decidability results. Among them, we will be interested in monadic string-rewriting systems. We will apply them to families of languages to see what happens in this situation, not just in the usual monadic case, but in a higher-order monadic case, where the languages themselves also come from higher-order families. What is especially interesting for us in this work is the use of a generalized model of automata, which is in fact very simple, as you will see; our point is to avoid implementation details. When we work with the usual automata models, we always have to deal with some kind of memory, a pushdown store, et cetera. Here we view an automaton simply as a structure with binary relations, and these binary relations are the transitions, labeled of course with the usual letters of the alphabet. We also use two further binary symbols almost like unary symbols: we put loops on states to indicate the initial states and the final states. These loops are not used in the paths that determine the language accepted by the automaton; we use them for other constructions on automata. So which constructions do we make on automata, which operations do we need? Quite usual ones. The first is disjoint sum. The second is needed because, when we work with finite automata, we often have to connect them in some way; for this we use inverse path functions, whose details I do not want to go into. They are almost a kind of logical interpretation using monadic second-order logic, but in a restricted fragment of MSO, because if we took the whole power of MSO, some of the results could not be obtained in this way.
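As a rough sketch (the encoding details below are my own, not from the talk), an automaton viewed as a bare relational structure might look like this in Python: transitions are triples (state, label, state), and loops carrying the reserved labels "i" and "f" mark the initial and final states.

```python
def accepts(edges, word):
    """Check whether some path labeled by `word` leads from an
    initial-marked state to a final-marked state.

    `edges` is a set of triples (p, a, q); the loops (p, "i", p) and
    (p, "f", p) mark initial and final states and are never followed
    when reading the word itself."""
    initial = {p for (p, a, q) in edges if a == "i" and p == q}
    final = {p for (p, a, q) in edges if a == "f" and p == q}
    current = initial
    for letter in word:
        current = {q for (p, a, q) in edges if a == letter and p in current}
    return bool(current & final)

# the language a*b over two states: a loop marks 0 initial and 1 final
edges = {(0, "i", 0), (1, "f", 1), (0, "a", 0), (0, "b", 1)}
```

The loops are deliberately ordinary edges of the structure, which is the point made above: they play no role in acceptance, but later constructions can refer to them like any other relation.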
But this is just a technical point. The other operation we need is iteration, which comes in two forms, basic iteration and full iteration. The basic iteration was defined by Stupp in '75, and the full iteration, which works for any kind of relational structure, was introduced by Muchnik as an extension of the basic one, as you will see. I do not want to explain the details of the definition of this iteration; we will look at an example. Let me just say that if we start with a structure over some vocabulary of binary relations, then the iterated structure extends the vocabulary with two predicate symbols: a binary one, the successor, and a unary one, usually called clone. So let us have a look at how it works on a very simple example, a finite structure with three vertices. What happens is that when we iterate the structure, every vertex gets its own copy of the structure itself, linked through the successor relation from the original vertex, and the clone predicate indicates, within the copy, the origin: the vertex the copy was created from. This copy is created for every vertex of the structure, and we iterate the construction again and again, infinitely many times. As I said, the only difference in the basic iteration is that we do not have this clone predicate; it disappears. The construction is very interesting because it preserves the decidability of the monadic second-order theory: whenever we have a formula over the iterated structure, we may construct a formula over the original structure such that the two are equivalent, one holding on the iterated structure if and only if the other holds on the original one.
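To make the definition concrete, here is a small Python sketch of a finite approximation of the full (Muchnik) iteration, truncated at sequences of length at most `depth`; the function name and the encoding of vertices as tuples are my own. Dropping the `clone` set gives the basic (Stupp) iteration.

```python
from itertools import product

def iterate(domain, relations, depth):
    """Finite approximation of the full iteration: vertices are the
    non-empty sequences over `domain` of length <= depth.  Each original
    relation is reproduced inside every copy, `succ` links a sequence w
    to its extensions w + (d,), and `clone` marks the vertices w + (d, d),
    i.e. the image of the origin inside each copy."""
    seqs = {s for n in range(1, depth + 1)
            for s in product(domain, repeat=n)}
    prefixes = {()} | {s for s in seqs if len(s) < depth}
    lifted = {name: {(w + (u,), w + (v,)) for w in prefixes for (u, v) in rel}
              for name, rel in relations.items()}
    succ = {(w, w + (d,)) for w in seqs if len(w) < depth for d in domain}
    clone = {s for s in seqs if len(s) >= 2 and s[-1] == s[-2]}
    return seqs, lifted, succ, clone

# two vertices 0, 1 with a single edge 0 -> 1, iterated one level deep
seqs, lifted, succ, clone = iterate([0, 1], {"E": {(0, 1)}}, 2)
```

Each vertex `(d,)` of the original structure gets its own copy `{(d, 0), (d, 1)}`, reached through `succ`, with the edge reproduced inside it, exactly as described above.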
The proof of this result was first given in '84 in a short conference paper (I think it was at STACS), and the full proof, the longer version, was given by Walukiewicz in '96. Another interesting point about this iteration is that within the full iteration you may interpret the unfolding of a structure from a given vertex, which is not possible with the basic iteration. Now we use these operations to define a hierarchy for our simple model of automata as plain relational structures: at the basic level, level zero, we have the finite structures, and we climb up the hierarchy using the full iteration, closing each level under the inverse path functions, which are, as I said, a kind of restricted monadic second-order interpretation. This hierarchy is infinite and strict, and it may be defined in other ways: instead of the iteration we may use unfolding, and instead of the path functions other, very similar operations; there are many variants. But of course, the question is which kinds of languages we may obtain with this kind of automata. There is a corresponding hierarchy of languages, where the first two levels, zero and one, are the regular and the context-free languages; level two was defined by Aho as the so-called indexed languages; and this idea of indexed grammars was extended by Maslov to every level. He also showed that the level-n indexed languages are accepted by a kind of level-n pushdown automata: at level two, these are pushdown automata with stacks of stacks, at level three stacks of stacks of stacks, and so on. There were also other characterizations of this hierarchy using recursive program schemes.
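For intuition about the level-2 case, here is a hypothetical Python sketch of the "stack of stacks" storage; the operation names `push1`/`pop1`/`push2`/`pop2` are my own shorthand, not standard notation.

```python
class Level2Stack:
    """Level-2 pushdown storage: a stack of ordinary (level-1) stacks."""

    def __init__(self):
        self.stacks = [[]]                  # start with one empty stack

    def push1(self, sym):                   # push a symbol on the topmost stack
        self.stacks[-1].append(sym)

    def pop1(self):                         # pop a symbol from the topmost stack
        return self.stacks[-1].pop()

    def push2(self):                        # level-2 push: duplicate the topmost stack
        self.stacks.append(list(self.stacks[-1]))

    def pop2(self):                         # level-2 pop: discard the topmost stack
        self.stacks.pop()

    def top(self):
        return self.stacks[-1][-1] if self.stacks[-1] else None
```

The level-2 operations copy or discard a whole level-1 stack at once; this duplication is what lets such automata accept indexed languages like a^n b^n c^n that pushdown automata cannot.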
If we go back to the general model of automata we use, it is in fact related to this hierarchy of indexed languages as follows: a language is n-indexed if and only if it is accepted by a level-n automaton. Okay, so we want to apply this to string rewriting. String rewriting is just a way of transforming words into words using rules: we identify within a word the left-hand side of a rule and replace it by the right-hand side. In such a way we get a single step of the string-rewriting relation, which may be iterated, and as its reflexive-transitive closure we get the string-rewriting relation. I will just give you an example, it is quite obvious. The important properties, which we will meet often, are the following. The first is confluence: if a given word may be rewritten in different ways, there is always a possibility to join the two paths again; in this case we say the system is confluent. Also important is being able to stop: there is no infinite chain of rewritings, and such a system is called noetherian, as for relations in general. When we have both properties, we say the system is canonical. Okay, so then the monadic case of string-rewriting systems, the one that interests us: a monadic system is one whose right-hand sides are only single letters or the empty word. Something very simple. Such a system is not necessarily noetherian, since we may have rules rewriting a letter into a letter, but this kind of difficulty is easily overcome by considering equivalence classes, from which we get an equivalent noetherian system by a very easy transformation; I do not want to go into this either. So, which kinds of questions may we ask?
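The single-step relation and its iteration can be sketched directly in Python; this is a naive enumeration just to fix the definitions, and the bound on `descendants` is my own addition so the search terminates even for non-noetherian systems.

```python
def single_steps(word, rules):
    """All words obtained by one application of some rule l -> r,
    at any position where the left-hand side l occurs."""
    out = set()
    for l, r in rules:
        i = word.find(l)
        while i != -1:
            out.add(word[:i] + r + word[i + len(l):])
            i = word.find(l, i + 1)
    return out

def descendants(word, rules, limit=1000):
    """Words reachable by iterated rewriting (the reflexive-transitive
    closure of the single-step relation), with a safety bound."""
    seen, frontier = {word}, {word}
    while frontier and len(seen) < limit:
        frontier = {w for u in frontier for w in single_steps(u, rules)} - seen
        seen |= frontier
    return seen
```

With the monadic rule ab -> b, the word "aab" rewrites to "ab" and then to "b", so its descendants are exactly {"aab", "ab", "b"}.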
We may ask which kinds of languages we get by applying string rewriting to a language either forwards or backwards, in the reverse direction. And of course, the study of monadic string-rewriting systems went very quickly beyond the finite case, by considering regular or context-free monadic systems: for each letter c, we consider the language L_c of all left-hand sides whose rule has right-hand side c. In the regular case all these languages must be regular, and in the context-free case they must all be context-free, for every letter, that is, for every right-hand side. Some old and important results about applying rewriting to languages are the following. If we have a context-free language and a context-free monadic string-rewriting system, then the pre-image under the rewriting relation is again context-free; it is not very surprising. In the deterministic case, when we have a regular language and a finite canonical string-rewriting system, we get a deterministic context-free language. So this is for backward rewriting. Rewriting forwards works very well for regular languages: no matter which monadic string-rewriting system we apply, finite or infinite, we always get a regular language, and when the string-rewriting system is context-free, this regular language may be computed in polynomial time. But of course, we cannot extend this result to the case of context-free languages: for the post-image of a context-free language we may get a non-recursive language. I will perhaps skip this example, and Benois' theorem. What will interest us are the n-indexed monadic string-rewriting systems: those systems where the language of left-hand sides for each letter belongs to the family of n-indexed languages.
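The regularity of the forward image can be made plausible with the classical saturation idea. This is a simplified sketch under my own restrictions: a finite monadic system whose right-hand sides are single letters (so no epsilon-transitions are needed). Whenever the automaton has a path labeled by a left-hand side l from p to q, we add a transition from p to q labeled by the right-hand side, and repeat to a fixpoint.

```python
def saturate(states, delta, rules):
    """delta: set of transitions (p, a, q).  For each monadic rule l -> r
    (with r a single letter), add an r-transition p -> q whenever an
    l-labelled path leads from p to q; iterate until nothing new appears.
    The saturated automaton accepts all forward descendants of the
    original language.  Only finitely many transitions can ever be
    added, so the process terminates."""
    changed = True
    while changed:
        changed = False
        for l, r in rules:
            pairs = {(p, p) for p in states}
            for letter in l:                      # follow an l-labelled path
                pairs = {(p, q2) for (p, q) in pairs
                         for (q1, a, q2) in delta if q1 == q and a == letter}
            for p, q in pairs:
                if (p, r, q) not in delta:
                    delta.add((p, r, q))
                    changed = True
    return delta

# the language {"ab"} and the rule ab -> c: descendants are {"ab", "c"}
delta = saturate({0, 1, 2}, {(0, "a", 1), (1, "b", 2)}, [("ab", "c")])
```

The fixpoint loop matters: newly added transitions can create new l-labelled paths, which is exactly how multi-step rewriting is captured.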
Then we may view all the rules having the same right-hand side as represented by one automaton, a level-n automaton, accepting all the left-hand sides with that right-hand side. Seen this way, it is not difficult to obtain the following corollary: applying n-indexed monadic string rewriting forwards to a regular language again gives a regular language, which can be constructed effectively in this case too. So now let us look at pre-images under n-indexed monadic string rewriting, and let us start with the case where we apply it only once. Remember that the left-hand sides are accepted by some level-n automaton. Let us have a closer look at our language automaton. We have a state with an outgoing transition, and since the automaton is not necessarily deterministic, perhaps another transition outgoing from the same state; there are of course other paths between these states, and some initial states and some final states, perhaps connected by paths with the states we consider. Now we want to rewrite backwards those letters which are right-hand sides of rewrite rules, replacing them by left-hand sides, which are the words accepted by this automaton. The construction is quite natural here: for a transition from a state p to a state q, we go from p into the initial states of the left-hand-side automaton and come back from its final states to q. The same thing has to be done for every such transition, but each needs its own private copy of the automaton: we take a private copy for every state and, of course, for every letter, since each automaton corresponds to a given letter. This is for right-hand sides which are letters of the alphabet; for the right-hand side being the empty word, we need of course to connect a state directly back to itself, again using a private copy for each state: for this state, for that state, and for the others as well.
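Under my reading of this construction, the splicing step can be sketched as follows in Python (for the finite case only); epsilon-moves are written as the empty label `""`, and the tuple-based renaming is just one way to give each transition its private copy.

```python
def preimage_step(delta, lhs_automata):
    """delta: transitions (p, a, q) of the language automaton.
    lhs_automata[a] = (states, transitions, initials, finals) accepts all
    left-hand sides whose rule has right-hand side a.  For every
    a-transition p -> q, splice in a private copy of lhs_automata[a] via
    epsilon ("") moves, keeping the original transition as well."""
    new_delta = set(delta)
    fresh = 0
    for (p, a, q) in delta:
        if a not in lhs_automata:
            continue
        states, trans, inits, finals = lhs_automata[a]
        rename = {s: ("copy", fresh, s) for s in states}   # private copy
        fresh += 1
        new_delta |= {(rename[x], b, rename[y]) for (x, b, y) in trans}
        new_delta |= {(p, "", rename[i]) for i in inits}
        new_delta |= {(rename[f], "", q) for f in finals}
    return new_delta

# rule bb -> a: reading "a" from p to q may also be reading "bb"
lhs = {"a": ({0, 1, 2}, {(0, "b", 1), (1, "b", 2)}, {0}, {2})}
d = preimage_step({("p", "a", "q")}, lhs)
```

Keeping the original transition alongside the spliced copy is what allows the rewriting step to be applied at some positions and not at others, hence "in several independent places".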
What we get at the end: of course, we forget the initial and final states of the spliced copies and use only those of the original automaton. Then we may accept words of the original language by staying on the bottom level, in the original automaton; and whenever we climb into one of the attached automata, we accept words obtained by a single rewriting step, applied possibly in several independent places. So it is easy to see that, in this situation, using an n-indexed string-rewriting system on an m-indexed language, the pre-image under a single step is indexed at level the maximum of n, m, and possibly also one. The one appears because if we take zero for n and zero for m, we already know that we may go to context-free languages; this is why it is like this. Now, if we want to rewrite repeatedly, we have to iterate this construction, and if you look at this iteration, it looks very much like the basic iteration; in fact, we do not need the clone predicate in this case. To apply it here, we need a result about the basic iteration, which is the following: the class of level-n structures is closed under basic iteration. This is very different from the full iteration, which makes us climb in the hierarchy; here we stay at the same level. So, as we see, the clone predicate seems to be essential: the extension given by Muchnik is essential to get more expressive power. Then of course we have the following result, like before, but with the star appearing: for the pre-image under the full rewriting relation, we stay at the maximum of n and m. To put it simply, if we have an indexed language and we take its inverse image under a system which is indexed at most at the same level, then we stay at the same level. So every level of the n-indexed language hierarchy is closed under inverse rewriting by n-indexed systems.
So now we would like to investigate the deterministic case. What happens if the languages we want to rewrite are deterministic, and we use n-indexed deterministic systems? By the former result we know that we get a language of the same level, but it will not necessarily be deterministic; so do we have to climb some levels if we want to stay deterministic? What happens exactly, we do not know. At least we know what happens when the system is regular, namely we stay at level zero of the hierarchy. To get a deterministic result, we need at least two restrictions: the system has to be confluent, because non-confluence is itself a kind of non-determinism, and we only work with languages of irreducible words with respect to the string-rewriting system, because in fact it is easy to see that languages containing reducible words do not make much sense in the deterministic case. Still, however, if we take the following example of a language which is deterministic context-free and apply a single-rule system to it, we get a language which is again context-free, but which is not deterministic context-free; it is not very difficult to show this. So what happens exactly with determinism? To see it, we need to use something called Cayley automata. These are like the Cayley graphs of groups, generalized to string-rewriting systems. They have as vertices all the irreducible words, the words we cannot rewrite, and they have Σ-labeled edges: from the word u there is an a-labeled edge to the word v whenever the word ua may be rewritten into v.
Remember, both u and v are irreducible here. For instance, with the following string-rewriting system, there is an edge from a to b labeled by b, because if we append the letter b to the word a, we may rewrite it three times and end up with b. So this is how it works, and we get a kind of graph, a Cayley graph. A Cayley automaton is just such a Cayley graph with initial and final states: there is exactly one initial state, which is always epsilon, the empty word, and the set of final states is specified explicitly, as a set of irreducible words of course. What we know about Cayley automata: the automaton is deterministic and complete when the system is canonical, and it is not difficult to see that, when the system is canonical and monadic, the language accepted by such an automaton is precisely the pre-image of the language of final states. And what we may establish is that the Cayley automaton is a deterministic level-(n+1) automaton when the language of final states is deterministic n-indexed and the system is canonical and regular. I will skip the proof. This leads us to the following result for the deterministic case: with a canonical and regular string-rewriting system and a deterministic level-n indexed language, the pre-image is a deterministic level-(n+1) indexed language. If we combine the two theorems, the non-deterministic and the deterministic case, then we see that, starting with a deterministic n-indexed language and using a canonical and regular string-rewriting system, we get a language which is n-indexed; so at some level it is n-indexed, but if we want it to be deterministic, it is deterministic only one level higher in the hierarchy.
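Under the definitions above, the Cayley automaton of a small canonical system can be computed directly. Here is a Python sketch; the example system {ab -> ε, ba -> ε} (the presentation of the integers, with b inverse to a) is my own choice, not the one from the slides.

```python
def normal_form(word, rules):
    """Rewrite leftmost-first to an irreducible word; the result is
    unique when the system is canonical (confluent and noetherian)."""
    changed = True
    while changed:
        changed = False
        for l, r in rules:
            i = word.find(l)
            if i != -1:
                word = word[:i] + r + word[i + len(l):]
                changed = True
                break
    return word

def cayley_edges(rules, alphabet, depth):
    """Edges (u, a, v) of the Cayley automaton: v is the normal form of
    u + a.  Explored from the empty word (the unique initial state),
    up to `depth` appended letters."""
    edges, frontier, seen = set(), {""}, {""}
    for _ in range(depth):
        nxt = set()
        for u in frontier:
            for a in alphabet:
                v = normal_form(u + a, rules)
                edges.add((u, a, v))
                if v not in seen:
                    seen.add(v)
                    nxt.add(v)
        frontier = nxt
    return edges
```

Because the system is canonical, every u and a yield exactly one v, so the automaton is deterministic and complete, as stated above.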
So, we believe this work promotes, in some sense, the use of these infinite automata: they are very nice to use, and it is quite easy to establish more general results with them than by studying a particular kind of usual automata, pushdown automata or higher-order pushdown automata. And we would like to continue this work beyond the monadic case. Thank you very much for your attention.