Welcome to my talk about dependence logic with a majority quantifier. This is joint work with Arnaud Durand, Juha Kontinen and Heribert Vollmer. First I will give an introduction to dependence logic and its team semantics. Then I will introduce the majority quantifier and explain why we investigate it. On the second-order side we will see the so-called most quantifier and the counting hierarchy. The main result we have shown in the paper is that second-order logic with the most quantifier is equivalent to dependence logic with the majority quantifier, and then I will conclude.

The motivation for dependence logic is to characterize dependencies between variables. In first-order logic, dependencies between variables can only be expressed from left to right, so to speak. Here we can see that y2 depends on x2, x1 and everything quantified before it, but we cannot say that y2 is independent of what comes before, so that it depends only on x2; that is not possible in first-order logic. Therefore, in 1961 Henkin introduced his branching quantifier, which is written like a matrix. It means that y1 depends on x1 and y2 depends on x2, but y1 does not depend on x2 and y2 does not depend on x1. In 1989 Hintikka and Sandu introduced independence-friendly logic with the slash quantifier, which says that y2 is independent of x1 and y1; here we explicitly state which variables a variable is independent of. Then in 2007 Väänänen introduced dependence logic, where we have a new atomic formula called the dependence atom. It says that x2 determines y2, so y2 depends only on x2. Here we explicitly state which variables a variable depends on, not which ones it is independent of.

Dependence logic extends ordinary first-order logic. Dependencies have been studied in computer science and have become more and more important, for example in database systems. With dependence logic we can express dependencies explicitly, and this is what we want. We obtain dependence logic just by adding the dependence atom =(t1, ..., tn) to first-order logic; this atom is satisfied if there is a function that maps the values of t1, ..., tn-1 to the value of tn. It is known that dependence logic has the same expressive power as existential second-order logic, so in terms of complexity it captures the class NP.

The semantics of dependence logic is based on the concept of teams. A team over the domain of the structure is a set of assignments, and an assignment is just a function that maps each variable to a value. We then have certain ways to extend teams. For a variable x and a function F we define an extension where we add a column to the team, if you imagine the team as a table, and map every assignment to a new value. In the second kind of extension we blow up the team: we make copies of every assignment and fill the new column with every value from the domain. I will present this in an example to make it clearer. We need these extensions to define the existential and universal quantifiers in team semantics.
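As a compact formal sketch of the two team extensions just described (the bracket notation X[F/x] and X[A/x] is the standard one from the literature; it is my addition here, not read off the slides):

```latex
% Supplementing extension: add a column for x and give each assignment s the value F(s)
\[ X[F/x] = \{\, s(F(s)/x) \mid s \in X \,\}, \qquad F\colon X \to A \]
% Duplicating extension: copy each assignment once for every value a of the domain A
\[ X[A/x] = \{\, s(a/x) \mid s \in X,\ a \in A \,\} \]
```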
Okay, the dependence atom itself is defined as follows: on a structure A and a team X it is satisfied if and only if, for any two assignments s and s' in X, whenever the values of the first n-1 terms are equal in s and s', the values of the nth term are also equal in s and s'.

Here is an example for the dependence atom. This is a team, and s1 and s2 are its assignments. We can say, for example, that x1 determines x2: checking the semantics, the values of x1 in s1 and s2 are not equal, so the values of x2 do not have to be equal; they happen to be, but they do not have to be. We cannot say that x2 determines x3, because the values of x2 in s1 and s2 are equal, so by the semantics the values of x3 would also have to be equal, and they are not. Next we see the extension of the team where we add a column x4 with a function F that maps s1 to 2 and s2 to 1; this is the team meant by X extended by F at x4.

The semantics of the existential and universal quantifiers is then defined as follows. On a structure A and a team X, "there exists x phi" is true if and only if, for some function F, A satisfies phi on the team extended by F at x; this is where the existential quantifier comes in, since we ask for the existence of such a function. The universal quantifier is defined very similarly: A satisfies "for all x phi" on X if and only if it satisfies phi on the blown-up team where every value from the domain is filled in. Here we have a formula which, as you can imagine, is always true, because we can always find values for x4 that make it true. We extend the team, and for example we have to fill in equal values here because the corresponding values were equal before; then the formula is true. That is the existential quantifier. For the universal quantifier, say the domain is {0, 1, 2}. Here we have to mention the structure A explicitly, because we have to try every value from A. We blow up the team: s1' and s1'' are copies of s1 in the values of x1 to x3, and s2' and s2'' are copies of s2 in the first three values, and in the fourth column we fill in every possible value from the domain, 0, 1 and 2. That is the universal quantifier.

Now I introduce dependence logic with a majority quantifier. The semantics of the majority quantifier says that, for a model and a team, "for the majority of x, phi(x)" holds if and only if for at least half of the possible functions from X to A the corresponding extension satisfies phi. The number of all possible functions from X to A is |A| to the power |X|, and we require at least half of them; that is what majority means. With the M quantifier we add the capability of counting to the logic. Counting is very interesting both in logic and in computational complexity. It is known that second-order logic with the most quantifier is equivalent to the counting hierarchy, which I will introduce later, and it is known that dependence logic is the same as existential second-order logic.
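For reference, here is a sketch of the team-semantics clauses just described, written out formally (using the extension notation from above, which is my shorthand rather than a quote from the slides):

```latex
% Dependence atom: agreement on t_1,...,t_{n-1} forces agreement on t_n
\[ \mathcal{A} \models_X {=}(t_1,\dots,t_n) \iff
   \forall s,s' \in X \colon \Big( \bigwedge_{i<n} s(t_i)=s'(t_i) \Big) \Rightarrow s(t_n)=s'(t_n) \]
% Existential quantifier: some supplementing function works
\[ \mathcal{A} \models_X \exists x\,\varphi \iff \mathcal{A} \models_{X[F/x]} \varphi \ \text{for some } F\colon X \to A \]
% Universal quantifier: the duplicated team must satisfy phi
\[ \mathcal{A} \models_X \forall x\,\varphi \iff \mathcal{A} \models_{X[A/x]} \varphi \]
% Majority quantifier: at least half of the |A|^|X| supplementing functions must work
\[ \mathcal{A} \models_X Mx\,\varphi \iff
   \bigl|\{\, F\colon X \to A \mid \mathcal{A} \models_{X[F/x]} \varphi \,\}\bigr| \geq \tfrac{1}{2}\,|A|^{|X|} \]
```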
It is therefore worth investigating whether the majority quantifier raises the expressive power of dependence logic in such a way that it becomes equivalent to second-order logic with the most quantifier, and thus to the counting hierarchy.

Now to the majority quantifier in an example. Here we have a structure with domain {0, 1, 2, 3} and this team, and we want to check whether the formula holds or not. Intuitively one would say it does hold, because one can imagine more functions where x3 is not equal to x2. But let us count. The number of all functions with which we can extend the team is |A| to the power |X|, that is 64 functions. Then we count the functions that extend the team in a way such that x2 ≠ x3 is satisfied. This is only satisfied if none of the values appearing in the new column is equal to 0, so for each assignment there are only three possible values, giving 3 to the power 3, that is 27 functions satisfying x2 ≠ x3. Since 27 is less than half of 64, the formula does not hold. The point is that the converse formula does not hold either, so the law of excluded middle fails here; but it already fails in dependence logic, so why should it hold here?

On the second-order side we define the most-k quantifier, where we have a structure with a domain and the most-k quantifier binds a k-ary relation symbol. It is true if and only if, counting the possible relations, at least half of them satisfy phi. There are 2 to the power n^k possible relations that can be plugged in, and we require that at least half of them satisfy phi.

The counting hierarchy is the analogue of the polynomial hierarchy, but as a building block we do not use NP, we use PP. The class PP consists of all languages L for which there is a polynomial-time nondeterministic Turing machine N such that x is in L if and only if more than half of the computation paths of N accept x; so here again this "more than half" is hidden in it. The counting hierarchy is then defined just like the polynomial hierarchy: C_0P is just P, the (k+1)st level C_{k+1}P is PP with an oracle from the kth level, and the counting hierarchy CH is the union of all these levels.

The main result of our paper is that dependence logic with the majority quantifier is equivalent to second-order logic with the most quantifier, and since it is known that this in turn equals the counting hierarchy, we have also shown that dependence logic with the majority quantifier is the same as the counting hierarchy. The idea behind the proof is an intermediate step: we define a quantifier most-k-f, which is similar to the most-k quantifier but ranges over functions, and we show that second-order logic with most and second-order logic with most-f are equivalent, because the relations can be seen as the graphs of the functions we plug in. Then we show that D(M) is equivalent to second-order logic with most-f; that proof is technical, so I will not go into detail here. The quantifier most-f is defined very analogously to the most-k quantifier: again we have a structure, and "for most functions g, phi(g)" holds if and only if at least half of the functions satisfy phi. There are n to the power n^k functions in total, and at least half of them have to satisfy phi.
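Making the counting in that example explicit (a domain of size 4 and a team of 3 assignments, as stated above):

```latex
% All functions that extend the team of 3 assignments over the 4-element domain
\[ |A|^{|X|} = 4^{3} = 64 \]
% Extensions satisfying x_2 \neq x_3: each assignment may take only the 3 values different from 0
\[ 3^{3} = 27 \]
% A majority would need at least half of all extensions, so the formula fails
\[ 27 < \tfrac{64}{2} = 32 \]
```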
This is the proposition stating that the logics are equivalent, and our main result is split into two parts. In one direction we show that for every formula phi of dependence logic with the majority quantifier there is a sentence psi of second-order logic with most-f such that they are equivalent, where the team is encoded as a relation on the second-order side. In the converse direction we show that for every formula phi of second-order logic with most there is a sentence psi of dependence logic with the majority quantifier such that they are equivalent.

Concluding, we can say that adding the majority quantifier increases the expressive power of dependence logic. Still, there are several open questions. We have seen in the paper that dependence logic with the majority quantifier but without dependence atoms is not flat; flat means that everything that is true on the singleton assignments is also true on the union of those singleton assignments. So this fragment is a kind of first-order majority logic, so to speak, and the question is what its expressive power is. Another question is whether the open formulas of dependence logic with the majority quantifier correspond to the downwards monotone properties of the counting hierarchy. We thought about this because in dependence logic the open formulas correspond to the downwards monotone properties of NP, which D is equivalent to. Finally, and I think this is the most interesting point: we have defined just one specific quantifier M, but there are many ways to define other quantifiers, not only by majority, and such quantifiers can be defined in terms of Lindström quantifiers. So what about extensions of D by other quantifiers? I think that is the most important point. Okay, then that's it.
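To summarize the equivalences discussed in the talk in one line (D(M), SO(Most), SO(Most_f) and CH are my shorthand for dependence logic with the majority quantifier, second-order logic with the most and most-f quantifiers, and the counting hierarchy):

```latex
% Main result (the two translation directions) combined with the known SO(Most)-CH correspondence
\[ \mathrm{D}(M) \;\equiv\; \mathrm{SO}(\mathrm{Most}_f) \;\equiv\; \mathrm{SO}(\mathrm{Most}) \;\equiv\; \mathrm{CH} \]
```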