[garbled audio] … So, we have talked about the calculus and the algebra. We have seen that they can do all these nice things, but we have also seen along the way that there are some things they cannot do. And the most striking thing we have seen, in, I don't know, three different impossibility results, is that transitive closure is not first-order definable, so it is not expressible in the algebra or the calculus. In particular, the algebra and the calculus cannot express things that require recursion, such as — yes. Of course. Oh yes, yes, yes. So, the aggregate operator — this is a very good question — the aggregate operator really involves second-order logic in some sense, because what you do is you have an equivalence relation in action and you work with the equivalence classes of this relation, and there are many people who have written papers along these lines; things work much better when you do it that way. The other point I want to make is of a rather different flavor. All this discussion we have had here rests on the assumption that our queries return sets. But in an actual database system, the queries don't return sets, they return multisets, or bags.
In other words, when you do the projection, if you eliminate two columns, you may get identical tuples. The database system will retain these duplicate tuples, unless you explicitly ask it to eliminate them. This makes perfectly good sense precisely because you want to compute averages, aggregate operators. Let's say you want to compute the average of all salaries. Some people may have the same salary in the same department, right? You don't want to throw this out, or you'll mess up the average. So back in 1993, there was a paper by Surajit Chaudhuri and Moshe Vardi in which they said: look, you database theorists got it all wrong. You, Chandra and Merlin, got it all wrong. You studied these problems with set semantics. What happens with bag semantics? Well, the situation is very embarrassing. We don't know if conjunctive-query containment under bag semantics is decidable. We don't know that. We know that conjunctive-query containment with inequalities — not-equal — is undecidable. That's a paper that Erik Vee, T.S. Jayram and myself had in PODS 2006, with a reduction from Hilbert's 10th problem. But the problem that's open is conjunctive-query containment under bag semantics. That's open. Is it decidable? Ben is here. Ben, you've tried this problem, right? With Swastik at some point, right? It's hard, right? Ben says it's hard, it's really hard. Okay. So yeah, there are a lot of things we still don't understand, including this one. So I want to finish — I have 35 minutes — by telling you a little bit of the story of Datalog. It's a declarative language that augments the language of conjunctive queries with a recursion mechanism. So it's very amusing. In 1979, Aho and Ullman — you can't get more distinguished computer scientists than that — published a paper in POPL. This is the programming-languages conference, the top conference. And they showed that no relational algebra expression can define transitive closure. They didn't know that Fraïssé knew that back in 1954, right?
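To see concretely why a projection must keep duplicates when aggregates are involved, here is a toy illustration in Python (the table and numbers are made up, just a sketch):

```python
# Hypothetical relation Employee(name, dept, salary); two people in the same
# department happen to earn the same salary.
employees = [
    ("ann",  "sales", 50000),
    ("bob",  "sales", 50000),   # duplicate salary -- must NOT be thrown away
    ("carl", "hr",    80000),
]

# Projection onto salary: bag semantics keeps duplicates, set semantics drops them.
bag_salaries = [s for (_, _, s) in employees]
set_salaries = set(bag_salaries)

avg_bag = sum(bag_salaries) / len(bag_salaries)   # the intended average
avg_set = sum(set_salaries) / len(set_salaries)   # wrong: a duplicate was eliminated
```

Under set semantics the duplicate 50000 disappears and the average is silently skewed.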
And we saw this here with Ehrenfeucht–Fraïssé games. But they discovered it independently, and they had very good intuition; they essentially came up with one of the proofs that we saw here. They get credit for bringing to the programming-languages community and the database community the fact that SQL, as it was at the time, could not express recursive queries such as transitive closure. And now, from the discussion we had today and over the last two days, we understand the reason. The calculus — that is, first-order logic — can only express local properties. You can't tell apart, using first-order sentences, graphs of this type: a single cycle and a union of two cycles. In particular, this means that if you have a database with information about parent, you cannot write an algebra or calculus expression that defines ancestor. Now it's interesting, of course, that if you think about paths in a graph, or ancestor, this is an infinite union of conjunctive queries. But the result tells you it's not equivalent to any finite union. So this really suggests a severe limitation in the expressive power of the algebra and the calculus. So what could the people in databases do? Well, when we teach the undergraduate course, we tell the students that SQL is very nice, it's declarative, but if you run into trouble you can always bring in C or Java, right? Through embedded SQL you make a call, and you can write any recursive program you want. This is okay, but it's really a dirty solution. It's an inferior solution, because the whole point of going to SQL, the higher-level language, was to separate the design from the implementation; this destroys the high-level character of SQL. The other possibility is to go back to the drawing board and see what we can do to augment the expressive power of the calculus, of first-order logic, with some high-level declarative mechanism for recursion.
In fact, this mechanism has been mentioned here under the name of fixed-point logics. I just want to give you a different perspective on it; such a mechanism would be superior to the previous solution because it maintains the high-level character of the calculus. So, Datalog. Here is the slogan: Datalog is conjunctive queries plus recursion. It's what you get by forcing a marriage between conjunctive queries and recursion. In fact, the language was introduced by Chandra and Harel in another paper — not the paper that Anuj mentioned before — in 1982, and since that time it has been studied by the research community in great depth. I mean, there are literally hundreds of papers, scores of PhD theses. When I entered the database field — I had been trained as a mathematical logician, and I went to my first database conference in 1987 — I knew very little about databases, and I was like a child in a candy store. I could understand half the papers because they were about Datalog, and Datalog I was familiar with, because I had studied it under the name of inductive definitions. By 1995 this had gone out of fashion in databases, and in the late 90s you found relatively few papers. But in the last five or six years there has been an amazing comeback of Datalog, from all sorts of different areas of computer science. People have used versions of Datalog to specify network protocols — this is Joe Hellerstein and his students, Boon Thau Loo and others. People at Microsoft have used Datalog as an access-control language. There is a company in Oxford that is using Datalog for program analysis, and so on and so forth. In fact, there was a conference called Datalog 2.0 at Oxford last year, and there will be another one in Vienna next fall. So the language has found other applications outside databases, and finally the people who design the SQL standard gave in and decided to introduce a restricted version of Datalog, which I'm going to tell you about, called linear Datalog.
So SQL:1999 and subsequent versions of the standard support Datalog in this restricted form, which I will try to explain. So what is a Datalog program? It's basically a finite set of rules, each of which expresses a conjunctive query. The only difference is that before, when we had a single conjunctive query, we had a name for the head that did not appear in the body. Here, some names in the heads may also appear in the bodies, and that's where recursion comes into the picture — I'll show you lots of examples. So a Datalog program is a finite set of rules, each expressing a conjunctive query, but some predicates occur both on the left and on the right. The ones that occur both on the left and on the right are called intensional database predicates (IDBs), or recursive predicates. The ones that occur on the right but not on the left are extensional (EDBs), and the idea is that the extensional ones are the given predicates — it's what we have in the database. The rules express some knowledge, and they are used to define the intensional predicates. That's what recursion is all about. So let's look at two examples. This is, again, the transitive closure, reincarnated now as a Datalog program. Yesterday we saw how to do it in least fixed-point logic; here it is going to be a Datalog program with two rules. T(x,y) says there is a path from x to y. How can we have a path from x to y? If there is an edge from x to y; or if there is a z such that there is an edge from x to z and a path from z to y. Then there is the divide-and-conquer version, so to speak, in which you replace the second rule by T(x,z) and T(z,y): there is a path from x to y if there is a z such that there is a path from x to z and a path from z to y. That's the divide and conquer. So intuitively, this is what these Datalog programs do. Of course, we have to give proper semantics and make sure that they do, in fact, define the transitive closure.
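A minimal sketch in Python of the first (linear) program, read bottom-up (my own encoding, not the talk's notation):

```python
# Naive bottom-up evaluation of the linear transitive-closure program
#   T(x,y) :- E(x,y).
#   T(x,y) :- E(x,z), T(z,y).
def transitive_closure(E):
    T = set()
    while True:
        # Apply both rules to the current value of T.
        new_T = set(E) | {(x, y) for (x, z) in E for (w, y) in T if z == w}
        if new_T == T:          # no change: least fixed point reached
            return T
        T = new_T

edges = {(1, 2), (2, 3), (3, 4)}
tc = transitive_closure(edges)
# tc contains (1, 4): the path 1 -> 2 -> 3 -> 4
```

The divide-and-conquer variant would replace the second rule's join `E ⋈ T` by `T ⋈ T`; both compute the same least fixed point.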
We can allow several recursive predicates; here is a program with two. Odd says there is a path of odd length from x to y, and even says there is a path of even length. So here one is used to define the other. There is a path of odd length if there is an edge, or if there is a z such that there is an edge from x to z and a path of even length from z to y — and even the other way around. So here we have two recursive predicates, two IDBs, even and odd, and one EDB, the edge relation. We can think of the Datalog program as giving us a recursive specification of the IDB predicates, even and odd, in terms of the EDB predicate: the edge relation is given to us, and we use the Datalog program to define even and odd. This is a case of mutual recursion, of course. What is the precise semantics of a Datalog program? Well, you can give two types of semantics: declarative semantics and procedural semantics. And then you can prove that the declarative semantics matches the procedural semantics. That's what I want to do here. The declarative semantics you can think of as denotational semantics: you give an object which is the meaning of the program. For the procedural, or operational, semantics, you give an algorithm for computing that meaning. Now, I could do this through the least fixed-point mechanism, but I want to give you a slightly different description, motivated by the first programming course in which we teach our students recursion. When we teach recursion, we say: look at the factorial function, it has this nice recursive definition. And if I have plus and times, I can define exponentiation. In fact, there was a language called Pascal that didn't have exponentiation explicitly; you had to write a silly program like this one. So what's going on here is that we can write recursive equations that define functions over the integers using plus and times.
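A minimal sketch, in Python, of the mutually recursive odd/even program under the same bottom-up reading (my own encoding):

```python
# odd(x,y)  :- E(x,y).
# odd(x,y)  :- E(x,z), even(z,y).
# even(x,y) :- E(x,z), odd(z,y).
# Naive bottom-up evaluation: start both IDBs at the empty set and iterate.
def odd_even_paths(E):
    odd, even = set(), set()
    while True:
        new_odd = set(E) | {(x, y) for (x, z) in E for (w, y) in even if z == w}
        new_even = {(x, y) for (x, z) in E for (w, y) in odd if z == w}
        if (new_odd, new_even) == (odd, even):   # simultaneous fixed point
            return odd, even
        odd, even = new_odd, new_even

# On the path 1 -> 2 -> 3: the odd paths have length 1, the even path is 1 -> 3.
odd, even = odd_even_paths({(1, 2), (2, 3)})
```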
And what we prove there, of course, in recursive function theory, is that there is only one function that satisfies this equation. We want to do something similar: we want to write recursive specifications. But in our recursive specifications we don't have plus and times; all we have around are the operations of relational algebra. So you can think of a Datalog program as a system of recursive equations where the operations are some of the operations of the algebra. Here is how it goes. I give you a Datalog program; it may have many recursive predicates. For every recursive predicate, you write an equation. What do you do? For every rule, you take the expression on the right-hand side — that's a conjunctive query, so you can write it in the algebra — and you combine the right-hand sides of the different rules for the same predicate using union. So, for instance, in the case of the transitive closure, we have only one recursive predicate, T. The equation is T = E union — if I write the body of the second rule in the algebra, it becomes π_{1,4}(σ_{2=3}(T × T)). So that's the recursive equation associated with the Datalog program. And we can do the same thing if we have two predicates: we get a system of equations, one for every recursive predicate. So that's a system of recursive equations. Now, this is analogous to the situation we had with recursive functions, but unfortunately it is not true that there is only one solution to these recursive equations. In fact, we can have many solutions: every transitive relation containing E satisfies these equations. So we have many solutions, and we cannot say that the semantics is the unique solution to the specification. When we have many solutions, we have to pick a nice solution. And here "small is beautiful" is one approach — that will give you the least fixed-point semantics. Or "big is beautiful" — you get the greatest fixed-point semantics.
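Written out, the equation for the divide-and-conquer transitive-closure program is:

```latex
% T(x,y) :- E(x,y).
% T(x,y) :- T(x,z), T(z,y).
% Translate each rule body into relational algebra and join the bodies for
% the same head predicate with union:
T \;=\; E \,\cup\, \pi_{1,4}\bigl(\sigma_{2=3}(T \times T)\bigr)
```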
So it so happens that people chose the least fixed-point semantics. The theorem is that every system of recursive equations arising from a Datalog program has a smallest solution — smallest with respect to the partial order; if I have two, three, five recursive predicates, it is the extension of the partial order coordinate-wise. So every system of recursive equations arising from a Datalog program has a smallest solution. In the case of this Datalog program, the smallest solution actually is the transitive closure, right? So if we take this Datalog program, view the rules as a recursive specification, and ask for the smallest solution, in that sense the Datalog program gives us the transitive closure: it is the smallest solution of the recursive specification. And this is a very special case of a general theorem called the Knaster–Tarski theorem — which was not proved in a joint paper; these are different papers. And interestingly enough, the Tarski paper that contains this is his most cited paper, although it is one of his most trivial theorems. It has to do with smallest solutions of recursive equations arising from monotone operations. So what was crucial here was that all the operators we have allowed in Datalog programs are monotone: we have allowed union, selection, Cartesian product, and projection. In fact, I'm going to quickly sketch the proof of this. So we have the declarative semantics: the least fixed points, the least solutions of these specifications. Of course I haven't shown you yet that these least solutions exist, but I will in a minute. Let's look at the procedural semantics. We can give a different meaning to the programs through a bottom-up evaluation. What do we do? We have these rules. Remember, we have the given predicates and the intensional predicates, the recursive ones. We start by instantiating all recursive predicates to the empty set. Now each right-hand side is a conjunctive query.
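For reference, the special case of the Knaster–Tarski theorem being used can be stated as follows (the second equality uses the finiteness of the setting at hand):

```latex
% If F : P(D^k) -> P(D^k) is monotone (S \subseteq S' implies F(S) \subseteq F(S')),
% then F has a least fixed point, the intersection of all pre-fixed points.
% Over a finite domain D, as with a database's active domain, it is also the
% limit of the increasing chain of stages:
\mathrm{lfp}(F) \;=\; \bigcap \{\, S \subseteq D^k : F(S) \subseteq S \,\}
              \;=\; \bigcup_{n \ge 0} F^{n}(\emptyset)
```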
So we can take these values, apply the rules, obtain new values for the heads, and update the heads accordingly. With these new values of the heads we plug in again, and we repeat until there is no change in the IDB predicates. When there is no change, we stop and report this as the result of the Datalog program. This is the so-called bottom-up, or naive, evaluation, where you have a while-loop that runs until no change occurs. For instance, if you do this for the transitive closure program, you get a sequence of binary relations: T_0 is the empty set, T_{n+1} is what you get by taking the rules and plugging in for T what you had at the previous stage, and so on. I'm not going to — these are all examples, I trust you can go over them. So here is the result I want to get to: if you have a Datalog program, then the bottom-up evaluation — the procedural semantics — terminates in a number of steps bounded by a polynomial in the size of the database instance, and the declarative semantics coincides with the procedural semantics. The proof is really not difficult at all. For simplicity, let's assume we have just one recursive predicate, and let's assume that its arity is k. By induction we show that the nth iteration is contained in the (n+1)st iteration; this uses the monotonicity of the building blocks that we have. T_0 is empty, then we use monotonicity to get T_0 contained in T_1; we assume that T_n is contained in T_{n+1}, and then the monotonicity of the conjunctive queries gives us the next step. So we have an increasing sequence of k-ary relations; but these are relations on the active domain of the database, which is a finite set. Therefore this sequence has to stop; it cannot keep increasing.
In fact there is an m, at most the size of the active domain raised to the k, at which we get T_m = T_{m+1}. So somewhere before we reach the size of the active domain to the k, this iteration has stopped. So we know the iteration stops at some point. That was the termination, and now we have found a solution, T_m: at this point we have a solution to the recursive specification. Next we prove that it is the smallest solution. We prove by induction that if we have another solution T*, then every stage of the iteration is contained in that solution. Again, T_0 is empty, so it is contained in every solution, and again we use monotonicity. Putting the two together: when we have stopped, T_m is contained in T*. So T_m — remember, T_m is the same as T_{m+1} — is the smallest solution of the recursive equations. So we have achieved both things here: we have proved that the smallest solution exists, so the declarative semantics is well defined, and that it is obtained through this bottom-up evaluation. This is a very special case of the Knaster–Tarski theorem. It is the same argument that you use to give meaning, to give semantics, to least fixed-point logic, but here it comes out very cleanly, as conjunctive queries plus recursion. What you get out of this in the wash is that for every fixed program — this is data complexity — the bottom-up evaluation can be carried out in polynomial time. Why is that? The reason is that the number of iterations is bounded by a polynomial in the size of the database; the degree of the polynomial is just the arity of the recursive predicate, if we have one. And every step of the iteration can be carried out in polynomial time, because we do a relational algebra evaluation of some fixed query from the Datalog program. So polynomial times polynomial, you get polynomial. By the way, since we saw before that we can do the data complexity of the calculus in log space: why can't we do this in log space also?
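The two facts used in this proof — the stages form an increasing chain, and the chain stabilizes within |adom|^k steps — can be checked on a small example (a sketch of my own, with k = 2):

```python
# Record every stage T_0, T_1, ... of the naive evaluation of
#   T(x,y) :- E(x,y).     T(x,y) :- E(x,z), T(z,y).
def stages(E):
    chain = [set()]                      # T_0 = empty set
    while True:
        T = chain[-1]
        nxt = set(E) | {(x, y) for (x, z) in E for (w, y) in T if z == w}
        if nxt == T:                     # T_m = T_{m+1}: fixed point reached
            return chain
        chain.append(nxt)

E = {(1, 2), (2, 3), (3, 4), (4, 5)}     # a path of length 4
ch = stages(E)
adom = {v for e in E for v in e}         # active domain, 5 elements
```

On this input the chain makes 4 strict increases, comfortably below |adom|^2 = 25.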
What's the problem? After all, what we do in every step is a first-order operation, some fixed query. Let me just say it very quickly: the reason is that we have to carry the relation we are building along — we have to store it. As we will see, you cannot do Datalog in log space unless very strange things happen. So the bottom line is that the data complexity of Datalog is in P. It's a very important result, because you wanted to add recursion, but you didn't want the complexity to go outside polynomial time. So the data complexity of Datalog is in P. The combined complexity, however, is high: it's exponential-time complete. I won't show you this. So there is a price that you pay there; this is the price of recursion, if you will, at the level of combined complexity. Remember, for the calculus, for first-order logic, it was PSPACE-complete; for Datalog it jumps to exponential-time complete. But the data complexity is still in P-time. Let me show you two interesting Datalog programs, because all we have seen so far is the transitive closure and this even-and-odd. Non-two-colorability can be expressed by a Datalog program. Why is that? Because non-two-colorability is the same as the existence of a cycle of odd length, right? So what we can do is take the previous program we had for computing odd and even — that's one of the reasons I wanted it. So odd(x,y) and even(x,y) say there is a path of odd length, respectively even length, between x and y, and then we have another predicate Q with no variables — a zero-ary predicate — that is true if and only if there is an x such that there is a cycle of odd length from x to x. So you can do non-two-colorability in Datalog. Okay, that's in some sense a very simple Datalog program, with a somewhat non-trivial proof of correctness, because you need to know the theorem from graph theory, right?
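A sketch of the non-two-colorability program in Python. The odd-cycle theorem is about undirected graphs, so the sketch symmetrizes the edge relation first (that symmetrization is my own reading of how the program is meant to be used):

```python
# odd(x,y)  :- E(x,y).        odd(x,y)  :- E(x,z), even(z,y).
# even(x,y) :- E(x,z), odd(z,y).
# Q         :- odd(x,x).      (an odd-length closed walk exists)
def non_two_colorable(E):
    E = set(E) | {(y, x) for (x, y) in E}   # treat the graph as undirected
    odd, even = set(), set()
    while True:
        n_odd = set(E) | {(x, y) for (x, z) in E for (w, y) in even if z == w}
        n_even = {(x, y) for (x, z) in E for (w, y) in odd if z == w}
        if (n_odd, n_even) == (odd, even):
            break
        odd, even = n_odd, n_even
    return any(x == y for (x, y) in odd)    # Q: some vertex on an odd cycle

triangle = {(1, 2), (2, 3), (3, 1)}          # odd cycle: not 2-colorable
square = {(1, 2), (2, 3), (3, 4), (4, 1)}    # bipartite: 2-colorable
```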
By the way, as a sanity check: can we do three-colorability? Can you write a Datalog program for three-colorability? What would happen is that, yes, we would collapse NP to P. But actually we know better, from the work of Anuj, that we cannot even do it in least fixed-point logic — in fact not even in LFP plus counting, right? So two-colorability, and non-two-colorability, is very special; in fact that's where we draw the line. Here is my absolutely favorite Datalog program: the path systems query. Have you seen this before? Somebody might have mentioned it before. Back in 1974 Steve Cook wrote a four-page paper called "An observation on time-storage trade off". This paper showed that there exist problems which are complete for polynomial time under log-space reductions. So Cook not only gave us NP-completeness, he also gave us P-time completeness, and the problem he used to show P-time completeness is exactly the problem computed by this Datalog program. Cook did not know Datalog — Datalog was not around at the time — but Cook was always thinking in terms of automated theorem proving. To get a feeling for what this program is trying to do, think of a proof system that has axioms A and a ternary rule of inference. So read R(x,y,z) as meaning that x is inferred from y and z using the rule R. For instance, something like modus ponens: if I have phi and phi-implies-psi, I can get psi. I can think of this as a ternary rule of inference that says I get psi from phi and phi-implies-psi; the spirit is the same. So this program, under this interpretation, gives you the theorems of the system: it tells you that x is a theorem if it is an axiom, or if you can get it from two other theorems using the rule of inference. So Cook proved that evaluating this Datalog program is a P-complete problem. What do I mean, as a decision problem? I give you A and R and some value b, and I ask you: is b in T? This is a
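A sketch of the path systems program as a fixpoint computation in Python (the axiom set and rule triples below are invented for illustration):

```python
# T(x) :- A(x).                    (axioms are theorems)
# T(x) :- R(x, y, z), T(y), T(z).  (x follows from theorems y and z)
def theorems(A, R):
    T = set()
    while True:
        new_T = set(A) | {x for (x, y, z) in R if y in T and z in T}
        if new_T == T:
            return T
        T = new_T

# Hypothetical proof system: 'c' follows from 'a' and 'b'; 'd' from 'c' and
# 'a'; 'e' would need 'f', which is never provable.
A = {"a", "b"}
R = {("c", "a", "b"), ("d", "c", "a"), ("e", "d", "f")}
```

The decision problem Cook used — given A, R, and b, is b in T? — is exactly membership in this fixed point.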
P-complete problem. So Datalog can express P-complete problems — that's the bottom line. Datalog can express P-complete problems, okay? In particular, this shows you that Datalog evaluation is not going to be in log space, because it can do P-complete problems. So in some sense even the data complexity is higher: still in P-time, but higher than the log space we had before — which goes back to the remark we had earlier. Very quickly: what is linear Datalog? Linear Datalog is the fragment of Datalog in which every rule has at most one recursive predicate in its body. At most one. So you can write cousin from sibling: you are given parent, you define sibling as having a parent in common, and then you define cousin this way. And there are some very amusing things. When we had the election in the US in 2008, they found out that Barack Obama is an eighth cousin of Dick Cheney. You can't think of two more different people than that. The link is live — I checked it last night; it's correct, it's still there. And if you think that this is not good enough, here is another one: Sarah Palin and Princess Diana are tenth cousins. So you really shouldn't put bounds on it; you should just let Datalog run until you discover all these things. Anyway, the story of linear Datalog is that it's very interesting. A Datalog program is linearizable if it is equivalent to a linear program. For instance, the divide-and-conquer version of transitive closure is linearizable, because it is equivalent to this linear one. On the other hand, the program for the path systems query — Cook's program, if you will — is not linearizable. This requires some proof; it cannot follow from complexity considerations alone — although it is easy to see that every linearizable program is in NC, so you might expect it from that — you actually have to prove it. By the way, telling whether a Datalog program is linearizable is undecidable. SQL:1999 and its subsequent versions
support linear Datalog. A linear Datalog program is a program in which, in every rule, the right-hand side has only one recursive predicate. So here parent is given to us; the recursive predicates are sibling and cousin. The second rule has only one recursive predicate, sibling; the third rule has only one recursive predicate, which is cousin. So that's a linear program. This one is not linear, because the right-hand side has two occurrences of the recursive predicate; on the other hand, it is equivalent to a linear one. But this second one is not linear and is not even linearizable. Is that clear now? So that's what SQL supports: only linear Datalog programs. I think it relates more to the fact that when a program is linear you can implement the recursion on a stack. I don't think the standards people knew anything about the theory, about P-complete problems or anything like that; they were just driven by how easy linear recursion is to implement. So you can write — this is the syntax; I'm running late, I don't want to explain it more, but that's how it is. I'm almost running out of time, so let me try to tie this up. Let's compare Datalog with the calculus. Unions of conjunctive queries are contained in Datalog, but the calculus is not contained in Datalog; Datalog cannot do all of the calculus. The reason is simply that we have only monotone operators: we cannot express the difference operation, we cannot express the quotient. On the other hand, Datalog is not contained in the calculus, because we have transitive closure. So here is Datalog, and here is first-order logic, the calculus. What we certainly have in the intersection are the unions of conjunctive queries. What else is there? And the answer is: nothing else. We owe this to Ben Rossman: as a corollary of Rossman's theorem on preservation under homomorphisms, Datalog intersected with relational
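A sketch of the linear sibling/cousin program evaluated bottom-up in Python (the family tree is invented):

```python
# sibling(x,y) :- parent(z,x), parent(z,y), x != y.
# cousin(x,y)  :- parent(u,x), parent(v,y), sibling(u,v).
# cousin(x,y)  :- parent(u,x), parent(v,y), cousin(u,v).
# Each rule body mentions at most one recursive predicate: a linear program.
def cousins(parent):
    sibling = {(x, y) for (z, x) in parent for (w, y) in parent
               if z == w and x != y}
    cousin = set()
    while True:
        new = {(x, y) for (u, x) in parent for (v, y) in parent
               if (u, v) in sibling or (u, v) in cousin}
        if new == cousin:
            return cousin
        cousin = new

# Hypothetical family: g has children p1 and p2; p1 has c1, p2 has c2.
parent = {("g", "p1"), ("g", "p2"), ("p1", "c1"), ("p2", "c2")}
```

Here c1 and c2 come out as (first) cousins via the sibling rule; children of c1 and c2 would be derived by the recursive rule, and so on without bound.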
calculus is precisely the unions of conjunctive queries. Why is that? Because if you have a Datalog query which is also expressible in relational calculus, then in effect you have a first-order query which is preserved under homomorphisms, so it is equivalent to a union of conjunctive queries. So in terms of expressive power, this intersection is precisely the unions of conjunctive queries. That's exactly what this intersection is. It's a nice fact — something we really didn't know; actually, we knew it another way, which I'll show you now — and it is a very simple corollary of Ben's wonderful theorem. Here is another theorem that we can get very easily from what we saw here, together with Ben's theorem. Back in 1987, Ajtai and Gurevich had a paper in FOCS where they proved this theorem with a highly nontrivial proof — essentially they used bounded tree-width and all sorts of other things — and they showed that the following statements are equivalent for a Datalog program P. First: P is bounded. Bounded means that there is a fixed number k such that on every database you only need to iterate the program k times — in other words, the recursion is not really needed. Second: the query defined by P is expressible in first-order logic. One of the two directions is easy: if the program is bounded, then by unfolding it a fixed number of times you get a union of conjunctive queries, hence a first-order query. So the interesting direction is the other one: if the query is expressible in first-order logic, then the program is bounded. For this direction, which I'll sketch in a minute, we will use Rossman's theorem. It goes like this: if Q is first-order and, being a Datalog query, is preserved under homomorphisms, then it is equivalent to a union of conjunctive queries. Look at what we have now.
On the one hand we have a finite union of conjunctive queries, obtained from Rossman's theorem. On the other hand we have the infinite union of conjunctive queries that we get from unfolding the Datalog program, stage by stage. Now we can use Sagiv and Yannakakis' theorem, which I told you about before: if one union of conjunctive queries is contained in another, then every member of the first is contained in some member of the second. Applying this here, every conjunctive query of the finite union is contained in some finite stage of the unfolding; taking the maximum over these finitely many stages, the infinite union collapses to a finite stage, and therefore the program is bounded. There are also extensions of Datalog with negation, which I will not have time to talk about — Datalog with negation under various semantics. This was the subject of real wars in the database theory community in the 1980s and 90s: stratified semantics, well-founded semantics, stable models, and so on, with no single agreed-upon answer. Chapter 15 of the Abiteboul–Hull–Vianu book has a good exposition. I want to finish with the following. We have seen Datalog, and we understand its expressive power. We have seen, at some point, the containment problem for conjunctive queries. What about the containment problem for Datalog queries? Well, there is bad news here. The bad news is that Oded Shmueli, in 1987, proved that query containment for Datalog programs is undecidable. Indeed, even equivalence of Datalog programs is undecidable.
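Since conjunctive-query containment keeps being invoked, here is a minimal sketch of the classical Chandra–Merlin test that underlies the Sagiv–Yannakakis criterion: Q1 ⊆ Q2 iff evaluating Q2 on the canonical database of Q1 yields the frozen head of Q1. The encoding and the brute-force search are my own, for illustration only:

```python
from itertools import product

# A query is (head_vars, body_atoms); an atom is (predicate, args_tuple).
def contained_in(q1, q2):
    head1, body1 = q1
    head2, body2 = q2
    # Canonical database of q1: freeze its variables into constants.
    db = {(p, args) for (p, args) in body1}
    dom = {a for (_, args) in body1 for a in args}
    # Search for a homomorphism from q2 into db mapping head2 onto head1.
    vars2 = sorted({a for (_, args) in body2 for a in args})
    for vals in product(dom, repeat=len(vars2)):
        h = dict(zip(vars2, vals))
        if tuple(h[v] for v in head2) != tuple(head1):
            continue
        if all((p, tuple(h[a] for a in args)) in db for (p, args) in body2):
            return True
    return False

# q_cycle(x) :- E(x,y), E(y,x)   versus   q_edge(x) :- E(x,y)
q_cycle = (("x",), [("E", ("x", "y")), ("E", ("y", "x"))])
q_edge = (("x",), [("E", ("x", "y"))])
```

Every x on a 2-cycle has an outgoing edge, so q_cycle ⊆ q_edge but not conversely.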
The proof is quite involved: it is by a reduction from the containment problem for context-free grammars, a classical undecidable problem, which gets encoded into the containment problem for Datalog programs. So the picture I want to leave you with — and this is essentially the last slide — is this: we started with the calculus, we dropped down to conjunctive queries, and we can go up to unions of conjunctive queries without losing the good properties. Datalog then adds the power of recursion, and the higher combined complexity and the undecidable static analysis are the price we pay for it. The deeper message is that mathematical logic is the machinery quietly at work behind the relational data model. So I hope that what we have done here over these days has shown you that the interaction between databases and logic is a very fruitful one: the database field benefits from the tools of logic, but also the database provides the concrete scenario of finite model theory. I mentioned yesterday that, to me, it is also a case of logic from computer science — I mean, Datalog is a case of logic from computer science. So thank you very much for your attention.