Okay, welcome back. The next talk is by Pritish Kamath. Pritish is a postdoc at TTI-Chicago. Before that he was a graduate student at MIT, a research fellow at Microsoft Research Bangalore, and a student here at IIT Bombay. He has worked in a number of areas in complexity theory, including algebraic complexity, communication complexity, proof complexity, and so on, and now he's trying to use those skills to uncover the mysteries of machine learning. Today he'll tell us how to use lifting theorems to prove monotone circuit lower bounds.

Thanks for the introduction, and thanks for inviting me here; it's really special to be back at IIT Bombay. This is joint work with Ankit, Mika, Robert, and Dmitry. When I was close to graduating from undergrad, I started to get fascinated by this holy grail of complexity: we want to understand what makes problems computationally hard, so we want hardness results for explicit functions. This is really fascinating, and maybe intimidating, because somehow you have to rule out all possible non-trivial algorithmic paradigms. How could you possibly do that? Complexity theorists moved early on to studying the circuit model, where you have only very simple operations, AND, OR, and NOT gates, and you care about the number of gates, the number of operations involved. Since any algorithm can be simulated by a circuit whose size is roughly the same as its running time, it suffices to prove lower bounds on circuits. Now, circuits don't seem to be the right programming abstraction; you wouldn't try to program this way. Although I must say we are living in a strange time where a lot of ML tasks are being solved by neural networks, which are circuit-like, so it's a really spectacular time to be in.
So anyway, this psychological trick of looking at circuits, in the hope that proving lower bounds would be easier, has not played out very well: we barely know linear lower bounds. So people turned to studying restricted models of circuits. One restriction that was very popular in the 80s is the monotone model of computation, where you have no NOT gates, just AND and OR gates. This model is restricted in that it can only compute monotone functions: a function f is monotone if, whenever x is bitwise less than or equal to y, we have f(x) ≤ f(y). With AND and OR gates you can only compute monotone functions. So people asked: can we at least prove lower bounds for this restricted model of circuits? Razborov, followed by work of Noga Alon and Ravi Boppana, showed that the clique function requires exponential-size monotone circuits. Clique is an NP-hard function, so there was a lot of excitement; my advisor told me a story that, at the time, Mike Sipser made a bet that P versus NP would be resolved within a few years, because all you need to do is reason about NOT gates, and how hard can that be? But shortly after, Razborov showed a superpolynomial monotone circuit lower bound for the problem of perfect matching, which shows that monotone circuits are not that powerful, because matching can be solved in polynomial time. This was extended even further by Tardos, who exhibited a function that can be computed in polynomial time but requires exponential-size monotone circuits. So it's a really strong separation.
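To make the monotonicity definition concrete, here is a minimal brute-force check (the example functions here are my own, not from the talk):

```python
from itertools import product

def is_monotone(f, n):
    # Flipping any single input bit from 0 to 1 must never
    # drop the output of f from 1 to 0.
    for x in product([0, 1], repeat=n):
        for i in range(n):
            if x[i] == 0:
                y = x[:i] + (1,) + x[i + 1:]  # y >= x bitwise
                if f(x) == 1 and f(y) == 0:
                    return False
    return True

AND3 = lambda x: int(all(x))   # computable with AND/OR gates alone
XOR3 = lambda x: sum(x) % 2    # needs NOT gates: not monotone
```

Any function built from AND and OR gates passes this check, while parity fails it.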
Then the excitement dampened a little, and Mike Sipser lost his bet. But you can ask: are monotone circuit lower bounds still interesting today? I want to argue that they are, because there are many connections. This workshop is about communication complexity, and having elegant mathematical connections is probably reason enough to study something, but there are also applications to proof complexity, which we'll talk about, connecting monotone complexity to proof complexity. There is also a very cute connection, which I don't know if the following talks will cover, between monotone complexity and extension complexity; it inspired the lower bound by Mika Göös, Rahul Jain, and Thomas Watson on extended formulations of the independent set polytope. I don't want to say too much about it. And there is an interesting connection, not with monotone circuits but with monotone span programs, which capture linear secret sharing schemes. So a lot of interesting things are happening, and that's why monotone circuit lower bounds are still interesting.
Okay, so coming to the theme of this workshop: we want to understand lifting theorems. What are these? We take a weak model of computation and a strong model of computation, and the goal is to say that if the weak model cannot do some task X, then the strong model also cannot do a task related to X; and we want to show this in a black-box way. The weak model in this talk will be resolution refutations, for which Mark already set the stage, and the strong model will be monotone circuits. Previously there was work on feasible interpolation, which takes lower bounds on monotone circuits and uses them to prove lower bounds on resolution; that can be seen as the easier direction. The goal of a lifting theorem is the converse: take lower bounds on resolution and prove lower bounds on monotone circuits. Let me quickly flash the theorem; I'll come back to it in more detail. What it says is: you give me any unsatisfiable k-CNF F on n variables for which you have a resolution width lower bound (I'll define width later in this talk), and there exists a closely related monotone function f on roughly n^k many variables for which you get a monotone circuit lower bound. I'll get back to this theorem later. Before that, I want to say that this lifting theorem can also be strengthened to the much stronger model of monotone real circuits; this corresponds to what Mark called communication with a greater-than oracle, and that will hopefully become clear during the talk. A monotone real circuit is a model where the wires are allowed to carry arbitrary real values: the inputs are Boolean and the output is Boolean, but the wires can carry arbitrary real values, and every gate has arity 2 and can compute an arbitrary monotone function of its two real inputs.
So it's a much stronger model of circuits. Just as a funny aside: monotone real circuits can simulate all monotone neural networks, that is, neural networks where all weights are non-negative and all activations are monotone, which is true of the popular ones. But the reason monotone real circuits were introduced in complexity theory is that lower bounds for them imply lower bounds on cutting planes refutations. I'll get to all of these definitions during the talk; I just want to give you the high-level picture first. To motivate you before I get into the details, here is a very crisp corollary of this lifting theorem: a lower bound for the 3-XOR-SAT function. The punch line is that monotone circuits cannot perform Gaussian elimination over F2. What is this function? Normally in 3-XOR-SAT you are given a system of 3-XOR constraints and want to know whether it is satisfiable. Here I'll write the input slightly differently: first I write down all possible 3-XOR constraints, roughly 2n^3 of them, and I treat the input as an indicator vector of which constraints are in my system; it's an equivalent representation of the same problem. I say the function is 1 if the system encoded by the indicator vector is unsatisfiable. Unsatisfiable, because I want the function to be monotone: if I flip a 0 to a 1, I'm adding a constraint to the system, which can only make it "more unsatisfiable". So you'll agree this function is monotone. For this function, the lifting machinery implies that even the monotone real circuit complexity is exponential, like 2^(n^ε). Is that clear?

The context here is the earlier monotone versus non-monotone separations. There was the one for matching, which is in randomized NC2, but the lower bound is only superpolynomial; and there was Tardos's function, which is in P but requires solving a semidefinite program, so maybe it's not in NC. This 3-XOR-SAT, though, is a very easy function: it's just Gaussian elimination, it can be solved in NC2, and it reduces to computing determinants. So it's a monotone circuit lower bound for a very easy function. Another interesting point is that all previous lower bounds used Razborov's really fantastic technique, the method of approximations, whereas the lower bound I'm going to show follows from a communication complexity approach, via a lifting theorem; it's a very different approach to proving lower bounds.

So, back to this picture. I want to take lower bounds on resolution and prove lower bounds on monotone circuits, but what really underlies this lifting theorem is query complexity and communication complexity. Mark already alluded to the connection between query complexity and resolution; there is also a very nice connection between communication and monotone circuits, which I'll tell you about, and the lifting theorem is really stated in the language of query and communication. So let's start with communication complexity. It has been introduced many times already, so let me just introduce it with a picture. Arkadev mentioned that you can study decision problems, promise problems, or search problems; in this talk I'm only going to talk about search problems. In a search problem, Alice is given x, Bob is given y, there may be many answers that are valid for (x, y), and the goal is to communicate and find at least one valid answer. I'll give an example of such a communication task in a moment.
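As a quick aside before moving on to protocols: the "easy" algorithm from the 3-XOR-SAT corollary above, Gaussian elimination over F2, can be sketched as follows (the constraint encoding is my own, and for brevity the toy constraints are 2-XORs rather than 3-XORs):

```python
def xor_unsat(constraints):
    # Each constraint is (vars, parity): the XOR of the listed variables
    # must equal parity. Gaussian elimination over F2: the system is
    # unsatisfiable iff some row reduces to 0 = 1.
    basis = {}  # pivot variable -> (set of variables, parity)
    for vs, b in constraints:
        vs = set(vs)
        while vs:
            p = max(vs)
            if p not in basis:
                basis[p] = (vs, b)       # fresh pivot: keep this row
                break
            pv, pb = basis[p]
            vs, b = vs ^ pv, b ^ pb      # add the pivot row over F2
        else:
            if b == 1:                   # empty left side, right side 1
                return True
    return False

# x1+x2 = 0 and x2+x3 = 0 force x1 = x3, contradicting x1+x3 = 1.
sat_sys = [({1, 2}, 0), ({2, 3}, 0)]
unsat_sys = sat_sys + [({1, 3}, 1)]
```

Note the monotonicity from the talk: appending a constraint can only move the answer from satisfiable toward unsatisfiable, never back.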
Communication protocols can be conveniently represented as trees, with Alice nodes and Bob nodes; I suppose everybody is familiar with that by now. The connection between monotone circuits and communication complexity was made in a really beautiful, elegant work of Karchmer and Wigderson in the late 80s. How many people have seen this connection? Okay, hopefully some have not, so let me go over it; it's really elegant. You give me a monotone function f, and from it I construct a communication search problem: give Alice an input x such that f(x) = 1, and give Bob an input y such that f(y) = 0. Clearly x and y are different, and since f is monotone, there must exist a coordinate i where x_i = 1 and y_i = 0; if there weren't, the function wouldn't be monotone. The goal is to find such a coordinate, so it's a search problem. The easy direction here is very instructive: given a formula for f, I can come up with a protocol for this search problem. I start at the top gate; I know it evaluates to 1 for Alice and to 0 for Bob. If it's an OR gate, at least one of its input wires must be 1 for Alice, while both wires are 0 for Bob, so Alice can communicate which wire that is; at an AND gate, symmetrically, Bob speaks. You keep going down the formula, and at a leaf you have a solution. The converse direction also works, so you get an exact equality: the communication complexity of this monotone Karchmer-Wigderson game is exactly the monotone circuit depth of f. (There's also a non-monotone version, let me be upfront about that: a non-monotone search problem that captures non-monotone circuit depth, but I won't talk about it.) Really, though, this connection is not about circuits but about formulas, because formulas can be balanced: from circuit depth d, I can open the circuit up into a formula of size 2^d. So it's really capturing the log of the monotone formula size; the connection is between communication complexity and monotone formulas.

Going back to the picture: I've told you about the connection between communication complexity and formulas, so now let me tell you about query complexity and resolution, which Mark already talked about. In query complexity (again, only search problems) you are given an n-bit binary input, and your goal is to query bits of it and come up with a valid answer for that input. Now the connection to resolution, which Mark already described. Here's an unsatisfiable CNF, and why is it unsatisfiable? Here's a resolution proof: you resolve on z1, you resolve on z2, and you arrive at a contradiction. For any unsatisfiable CNF you can define a search problem: given any assignment z, since the CNF is unsatisfiable, there must be at least one clause that z violates, and the goal of the search problem is to find such a clause. The connection to query complexity is simply that you turn the resolution proof upside down (it's really a proof by picture): the decision tree depth of this search problem is exactly the resolution depth needed to refute the unsatisfiable CNF. That completes the connection between query complexity and resolution.
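The easy direction of the Karchmer-Wigderson connection can be sketched concretely: walk a monotone formula from the root, with Alice speaking at OR gates and Bob at AND gates (the toy formula and inputs are hypothetical examples):

```python
def evl(g, z):
    # Evaluate a formula given as nested tuples:
    # ('VAR', i), ('AND', l, r), ('OR', l, r).
    if g[0] == 'VAR':
        return z[g[1]]
    a, b = evl(g[1], z), evl(g[2], z)
    return a and b if g[0] == 'AND' else a or b

def kw_game(g, x, y):
    # Invariant: the current subformula is 1 on x and 0 on y.
    # At an OR gate Alice names a child that is 1 on x; at an AND
    # gate Bob names a child that is 0 on y. The leaf reached is a
    # coordinate i with x[i] = 1 and y[i] = 0.
    while g[0] != 'VAR':
        if g[0] == 'OR':
            g = g[1] if evl(g[1], x) else g[2]      # Alice speaks
        else:
            g = g[1] if not evl(g[1], y) else g[2]  # Bob speaks
    return g[1]

f = ('OR', ('AND', ('VAR', 0), ('VAR', 1)), ('VAR', 2))
x, y = (1, 1, 0), (0, 1, 0)   # f(x) = 1, f(y) = 0
```

The number of rounds equals the formula depth, matching the equivalence stated above.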
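Similarly, the query-side search problem for an unsatisfiable CNF (given an assignment, find a clause it violates) can be sketched as follows (the clause encoding is my own):

```python
def falsified_clause(clauses, z):
    # Clauses are lists of signed 1-indexed literals:
    # +i means z_i, -i means NOT z_i. Since the CNF is
    # unsatisfiable, every assignment falsifies some clause.
    for c in clauses:
        if not any((z[abs(l) - 1] == 1) == (l > 0) for l in c):
            return c
    return None  # unreachable when the CNF is unsatisfiable

# (z1) AND (NOT z1 OR z2) AND (NOT z2): unsatisfiable.
cnf = [[1], [-1, 2], [-2]]
```

Every one of the four assignments to (z1, z2) falsifies at least one of these three clauses.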
So finally, the lifting theorem, which Arkadev already talked about, first for deterministic decision trees. There's this lifting theorem of Raz and McKenzie, and there are many, many later works that change the gadget and so on (Arkadev's talk was about using different gadgets), but I'll stick to the Raz-McKenzie version because it's closest to what we do. It says: start with a query search problem and compose it with a gadget, replacing each input bit by a communication gadget on two inputs, giving one half of each gadget to Alice and the other half to Bob. Then the communication complexity of the composed search problem is lower bounded by, essentially, the decision tree complexity times log m, where m is the size of the gadget. Raz and McKenzie use a specific gadget, the indexing gadget: Alice has a pointer x in [m], Bob has m bits y, and the goal is to compute the x-th bit of y. Indexing is, in a sense, complete for proving lifting theorems: any lifting theorem you can prove for some gadget, you could also prove with the indexing gadget. This is what Raz and McKenzie showed, and I'll stick to it even though there are lifting theorems with smaller gadget sizes. So that proves this direction of the lifting theorem. Putting all of this together, the result of Raz and McKenzie was that monotone NC^(i+1) is not contained in monotone NC^i, a depth separation for monotone formulas; that was their motivation. And if you apply this machinery starting with a particular query problem, you can actually recover a monotone formula lower bound for the 3-XOR-SAT problem I mentioned in the beginning: an exponential lower bound for monotone formulas.

So what should we do if we want to go from formulas to circuits? The challenge seems to be that decision trees, communication protocols, and formulas are all tree-like objects, and somehow you need to reason about dag-like objects, as Mark already alluded to. As with many things in complexity theory, this was already studied by Razborov in the 90s; it's hidden in one of his papers, and for some reason this connection is not taught in communication complexity courses. It existed in the literature, but somehow it never made it into mainstream courses. Razborov's definition was slightly more complicated, and it was simplified later in the work of Dmitry Sokolov, so I'll present the simplified definition and come back later to what Razborov actually did. To define communication dags, let me start by redefining tree-like communication in a different way, and then generalize it. In a tree-like protocol, every node of the tree corresponds to a rectangle: the root corresponds to the full X × Y rectangle, and the rectangle of any node v is partitioned into the two rectangles corresponding to its children. As you go down the tree, rectangles keep getting partitioned, and each leaf is labeled by an answer, which gives you a partition of the root rectangle into monochromatic rectangles. Note the very slick thing that happened in this definition: there was no explicit reference to Alice and Bob, no mention of two parties, just rectangles being partitioned. Alice and Bob are implicit, because when a rectangle is partitioned into two rectangles, it can only be split horizontally or vertically, and that corresponds to Alice or Bob speaking. So in this definition you got rid of an explicit mention of Alice and Bob. So, to generalize this to the dag-like
communication model, the only thing you need to do is change that one line: instead of a node being partitioned into two rectangles, a node is covered by its two child rectangles. In pictures: if this rectangle corresponds to the node v, I require that it is covered by its two child rectangles. That is the only change needed to get the dag-like communication model, and the dag communication complexity is defined as the log of the number of nodes (log, because that is what I did with formulas: I took the log of the formula size). The connection, due to Razborov with Dmitry's simplification, is that dag communication complexity exactly captures the log of the monotone circuit size. So, going back to the picture, I've now told you about this equivalence between dag communication and monotone circuits. (Audience: Is it still true that a node can only be covered along rows or columns?) Yeah, that's an excellent point. In proving this equivalence, going from a dag communication protocol to a circuit, you have to replace some nodes by AND gates and some by OR gates, and that is determined by the cover: if a rectangle is covered by two rectangles, it must be covered either horizontally or vertically. Going back to this picture, this one is covered vertically, so it's like a Bob node, and you replace it by an AND gate. That is exactly how the equivalence is proved, and it's really along the same lines as the Karchmer-Wigderson equivalence for formulas.

Now let me tell you about the dag query model. Going back to decision trees, I'll redefine them in a different way: you have a tree where every node is labeled by a subcube. The top node is labeled by the entire Boolean cube, and as you go down the tree, each subcube gets partitioned into two subcubes; finally, each leaf is labeled by an answer. This gives you a partition of the entire Boolean cube into monochromatic subcubes, and the decision tree complexity can now be defined as the maximum co-dimension of any subcube anywhere in the tree, that is, the maximum number of bits fixed in any subcube; it's the same quantity, just stated differently. To define decision dags, the only thing you change is, again, that one line: instead of subcubes being partitioned into two subcubes, they are covered by two subcubes. To give an example: say this is the subcube corresponding to v, with these four bits fixed, and it is covered by these two subcubes. The way to see it is that there is this red coordinate, which could be either 0 or 1; if it is 0, the point is covered by the first subcube, and if it is 1, it is covered by the second. (Audience: So the number of fixed bits is at most the depth?) Sure, but it can also be much smaller, because I can forget things along the way; I'm going to clarify that point. Let me just say here that the dag query complexity of the same search problem Mark defined before (given an assignment, find a clause it violates) captures resolution width, instead of resolution depth.
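The covering condition on subcubes, including the "red coordinate" example above, can be checked by brute force; here is a small sketch with made-up coordinates and sizes:

```python
from itertools import product

def points(cube, n):
    # A subcube is given as {coordinate: fixed bit}; enumerate its points.
    free = [i for i in range(n) if i not in cube]
    for bits in product([0, 1], repeat=len(free)):
        p = dict(cube)
        p.update(zip(free, bits))
        yield tuple(p[i] for i in range(n))

def covered(c, a, b, n):
    # Is every point of subcube c contained in subcube a or subcube b?
    inside = lambda p, q: all(p[i] == v for i, v in q.items())
    return all(inside(p, a) or inside(p, b) for p in points(c, n))

n = 4
v = {0: 1, 1: 0}      # node v: two bits fixed
a = {0: 1, 2: 0}      # child handling the "red" coordinate z2 = 0
b = {1: 0, 2: 1}      # child handling z2 = 1
```

Note this is a union cover, not a partition: the two children may overlap and may stick out of v, which is exactly the relaxation that turns trees into dags.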
To clarify that point about forgetting, there is an alternate way to think about query dags, which is maybe nicer. Think of a decision tree as a game between two players, an explorer and an adversary: the explorer queries a coordinate and the adversary replies with its value. In a decision tree, querying is the only move; in a decision dag, in any round you can either query a coordinate and get an answer, or choose a coordinate and forget its value. The important detail is that if you forget a value and later ask for it again, there is no guarantee that the same value will be returned, and you have to be robust to that: you must be correct even if the answer is different the next time. At the end of the game, you only remember having queried some bits, and that must be enough to give a valid answer to the search problem. It maybe takes some time to wrap your head around this if you haven't seen it, but nothing very deep is going on; these are very simple, very nice connections. So that's the connection between the dag query model, decision dags, and resolution width, as opposed to resolution depth.

Finally, I want to tell you about the last piece, the lifting theorem itself, which was the goal of this talk. What we can show is: start with a query search problem and compose it with the indexing gadget; then the dag communication complexity of the composed problem is lower bounded by the decision dag complexity of the original query search problem. It's along the same lines as the Raz-McKenzie simulation, and we again prove it for the indexing gadget; it's an interesting open question to prove it for other gadgets. Since this is the right workshop for it, let me tell you a little about the proof. There is a paper I really like, called "Rectangles are Nonnegative Juntas"; it laid the foundation for this new generation of lifting theorems, and it introduced this notion of blockwise density that was mentioned here before. Let me not go into those details, but talk at a high level. Consider a rectangle in the space of all inputs. I'm looking at the indexing gadget: Alice has n pointers, each in [m], and Bob has n blocks of m bits each, so the input is n copies of the indexing gadget. We say a rectangle is good if, when you apply the indexing gadget to it, it looks like this: some d bits are fixed and the other bits have full support. Intuitively, this means Alice and Bob have communicated information about d coordinates and nothing about the others. We like rectangles like this (not all rectangles are like this) because they have a nice interpretation: Alice and Bob are really simulating the query world, communicating only blockwise rather than doing some non-trivial communication involving all of their inputs. I should say that in the first paper, by Mika Göös, Shachar Lovett, Raghu Meka, Thomas Watson, and David Zuckerman, this was proved for the inner product gadget; it was later strengthened by Mika, Toni, and Tom to the indexing gadget, and we really need the indexing version. Now, the statement is that while not every rectangle is nice, every rectangle can be decomposed into nice pieces. Okay, that's not entirely true; more precisely: give me any rectangle in the space of all inputs, and it can be partitioned into smaller rectangles which are good, meaning that applying the indexing gadget reveals the values of only a small number of bits, with full support on the remaining coordinates.
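To be concrete about the input space these rectangles live in, the n-fold indexing composition can be sketched as follows (m and the sample values here are arbitrary):

```python
M = 4  # gadget size m

def ind(x, y):
    # Indexing gadget IND_m: Alice's pointer x is in {0, ..., m-1},
    # Bob's block y is m bits; the gadget outputs the x-th bit of y.
    return y[x]

def lift(xs, ys):
    # Each original input bit z_i becomes IND_m applied to Alice's
    # i-th pointer and Bob's i-th block.
    return tuple(ind(x, y) for x, y in zip(xs, ys))

xs = (0, 3, 1)                                   # Alice: n = 3 pointers
ys = ((1, 0, 0, 0), (0, 0, 0, 1), (0, 0, 1, 1))  # Bob: 3 blocks of m bits
```

A "good" rectangle, in the sense above, fixes the gadget outputs on d coordinates while leaving the remaining coordinates free to take every value.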
But there may also be some rectangles which are bad, with no guarantees, except that those bad rectangles are covered by a very small number of rows and columns. This is actually the property for which we need indexing; I don't know how to get it for inner product. All the non-error rectangles are structured, nice, with only up to d coordinates fixed; and this d is the same d as before, so you can choose any d you want and get such a decomposition. Something more special that we use, which was maybe not needed in other works, is that you can in fact achieve the full support on a single row of the nice rectangle: I said a rectangle is nice if some coordinates are fixed and the rest have full support, and in fact you can get that full support within a single row. This follows from the theorem of Mika, Toni, and Tom.

Okay, this might get a little technical, so maybe I'll just flash some animation. We want to prove that the dag communication complexity of the composed search problem is lower bounded by the decision dag complexity of the query search problem, and we do it with a simulation theorem: you give me a communication dag for the composed search problem, and I'll extract from it a decision dag. Remember the alternate interpretation of a decision dag as the explorer-versus-adversary game, where you query bits and forget bits. You start by partitioning all the rectangles into structured rectangles, and the proof follows this outline: starting from the root, I go down the communication dag and recreate a query dag out of it, with the invariant that for whatever rectangle I am at, I also maintain a structured rectangle in which I currently am. At the root, I start with the full rectangle, and the structured rectangle is the full rectangle itself. Going from one rectangle to its children is not entirely trivial, but the arguments are simple and I don't want to stress you with them, so I'll just flash a proof by animation; I've worked very hard on these animations, so be prepared. ... Okay, that's it. So: I started at a node, and I want to move to one of its child nodes without having queried too many bits. Going back through the animation: I started out remembering having queried these four bits, I'm at node v, I go through these steps, and now I've reached node w, remembering some queried bits and having forgotten some others, and I'm still remembering only order-d bits. It's actually not very complicated, just hard to present in a talk; the whole argument is only about two pages. I move from this rectangle to a structured rectangle of the child rectangle, and then I forget what I had previously; I can explain it offline if you really want to know, it's not worth going into here. You keep going down the communication dag, and when you finally reach a leaf, you have a valid answer for the search problem. So you can extract a decision dag from a communication dag.

That was a little technical, and I promise not to have too many technical things from here on. So, I told you about the lifting theorem, and the conclusion we get is the theorem I showed in the beginning: you give me any unsatisfiable k-CNF on n variables for which you know a resolution width lower bound of w, and from it I will construct an explicit monotone function f on n^(O(k))
many variables for which this lifting theorem implies a monotone circuit lower bound of of size n to the w so just to go back to this corollary like if in this theorem was maybe too general so if you start with this f being this so called satin contradiction which is something for which we know a resolution width lower bound almost linear lower bound you can from it you get this XOR sat lower bound oh good good actually yeah so the reason why this like this machinery for us stops at only 2 to the n to the epsilon and not 2 to the n is because we need indexing gadget of polynomial size and that's a actually very important open question which is if this simulation could be done with constant sized gadget then you would get an explicit function with 2 to the omega n lower bound and right now actually we don't even know even without lifting we don't know monotone circuit lower bounds of 2 to the omega and we know 2 to the n to the 1 third I think yeah actually so in in these reductions actually even if you do it with any other gadget I would these reductions work by reducing that gadget to indexing and then performing this reduction so maybe in some special cases you might be able to do a clever reduction but not in this generic way so it might there might be something but I'm not I cannot make a general statement out of a small size gadget maybe in some special cases for a special if you start with a special particular unsatisfiable CNF maybe you could come up with a smarter reduction right that's true I would say yeah yeah right because then it would reduce to indexing which is sub polynomial size right so right and the other thing I had mentioned was this extension to monotone real circuits right so the way we get to this is so I told you about this decomposition theorem for rectangles so you can decompose any rectangle into these nice rectangles with errors which are contained in a small number of rows and columns to do this lower bound for monotone real circuits 
For monotone real circuits, you need to talk about triangles instead of rectangles. So now you have triangle communication dags, where every node, instead of being labeled by a rectangle, is labeled by a triangle, and every node is covered by its two child triangles; this triangle communication dag model is equivalent to monotone real circuits. The key to proving the lower bound is that we need to decompose triangles, not just rectangles, into these structured rectangles. I won't go into the details of that, but at a high level that's what's going on.

Good question; actually I think Mark also didn't clarify this. A triangle is the set of ones of a greater-than function: for some permutation of the rows and columns it looks like a staircase, so it's something that can be reduced to a single greater-than call. This is the real communication model, or equivalently the model with an oracle query to greater-than, and it's clear that such a thing can be solved with one single greater-than query.

Finally, I also said that there is this application to cutting-planes refutations. Mark already defined this, but let me quickly go over it: cutting planes is a proof system whose lines are linear threshold functions; you encode your clauses as linear threshold functions and keep deriving new lines until you end in 0 ≥ 1. If you put everything together, the theorem we get is this: give me an unsatisfiable k-CNF on n variables for which you know that the resolution width is at least w; from it I can construct a slightly (polynomially) larger 2k-CNF for which the cutting-planes refutation length is n^{Ω(w)}. So I can take lower bounds on resolution and lift them to lower bounds on cutting planes.
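As a sanity check on the triangle definition above: a 0/1 matrix equals [a_i > b_j] for some reals a_i, b_j exactly when its row supports form a chain under inclusion (that is the staircase picture after permuting rows and columns). A small illustrative sketch of that test, with hypothetical names:

```python
from itertools import combinations

def is_triangle(M):
    # M is [a_i > b_j] for some reals a_i, b_j iff the supports of
    # the rows form a chain under inclusion: a staircase of ones
    # after suitably permuting rows and columns.
    supports = [frozenset(j for j, v in enumerate(row) if v) for row in M]
    return all(s <= t or t <= s for s, t in combinations(supports, 2))

print(is_triangle([[1, 1, 1],
                   [1, 1, 0],
                   [0, 0, 0]]))   # staircase -> True
print(is_triangle([[1, 0],
                   [0, 1]]))      # identity matrix -> False
```

Note that every (combinatorial) rectangle with at most one distinct nonzero row is trivially a triangle, which is one way to see that the triangle model is at least as strong as the rectangle one.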
And actually this is semantic cutting planes, so even the derivation rules don't matter, because this triangle communication dag model cannot distinguish between semantic and syntactic cutting planes. Never mind if you don't know what semantic cutting planes is; the point is that this lower bound holds for the semantic cutting planes model. In particular, Mark mentioned the separation between polynomial calculus and cutting planes. If you want to separate any proof system from cutting planes, all you need to do is separate it from resolution: exhibit a CSP that is easy for your proof system but hard for resolution, apply this lifting theorem, and it will still remain easy for your proof system but will be hard for cutting planes. That's how we recover that statement.

So I want to lighten the mood a little bit. We talked about this lifting theorem; is there a broader context for it, or is it really something niche that stands on its own? I want to talk about this bigger picture, something that I'm very interested in. Let's look at the two search problems I mentioned. In the query world, I talked about the search problem where you have an unsatisfiable CNF, you're given an assignment, and your goal is to find a clause which is violated. In the communication world, we had the monotone Karchmer-Wigderson search problem: Alice has a 1-input, Bob has a 0-input, and they want to find a coordinate where Alice has a one and Bob has a zero.

The observation here is that both of these have small nondeterministic cost. The query search problem has nondeterministic query cost k, because I can just tell you which k bits to query; if you query those, you will see a clause which is violated, so you can be convinced that the answer is valid. In the communication world, I just need to nondeterministically guess log n bits to find this coordinate.
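To illustrate the cutting-planes rules mentioned above (addition, and division with rounding), here is a toy refutation of the unsatisfiable 2-CNF {x∨y, ¬x∨y, x∨¬y, ¬x∨¬y}; the encoding and helper names are my own illustration, not the lifted formulas from the theorem.

```python
from math import ceil

# An inequality "sum_i c_i * x_i >= r" is a pair (coeffs_dict, r).
def add(p, q):
    # Cutting-planes addition rule: add two inequalities.
    coeffs = {v: p[0].get(v, 0) + q[0].get(v, 0)
              for v in sorted(p[0].keys() | q[0].keys())}
    return coeffs, p[1] + q[1]

def divide(p, d):
    # Division rule with rounding: coefficients must be divisible
    # by d; the right-hand side may be rounded up.
    assert all(c % d == 0 for c in p[0].values())
    return {v: c // d for v, c in p[0].items()}, ceil(p[1] / d)

# Clause encodings: x+y>=1, -x+y>=0, x-y>=0, -x-y>=-1.
c1 = ({'x': 1,  'y': 1},  1)
c2 = ({'x': -1, 'y': 1},  0)
c3 = ({'x': 1,  'y': -1}, 0)
c4 = ({'x': -1, 'y': -1}, -1)

y_ge_1 = divide(add(c1, c2), 2)   # 2y >= 1   rounds to  y >= 1
y_le_0 = divide(add(c3, c4), 2)   # -2y >= -1 rounds to -y >= 0
contra = add(y_ge_1, y_le_0)      # 0 >= 1: the refutation ends here
print(contra)                     # -> ({'x': 0, 'y': 0}, 1)
```

The semantic version of the system allows any sound inference between threshold lines, not just these two rules, and the lower bound from the talk covers that stronger model too.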
In other words, I can cover my entire rectangle with n monochromatic rectangles, so log n is the nondeterministic communication cost. So both of these are total search problems, meaning they always have an answer for any input, and they have small nondeterministic cost. In fact, the observation is that search problems of this type are complete for total search problems with small nondeterministic cost: you give me any query search problem with small nondeterministic cost, and I can reduce it to a search problem of this form, where there is an unsatisfiable CSP and you want to find a violated clause. Similarly, in the communication world, any total search problem with small nondeterministic cost can be reduced to a monotone Karchmer-Wigderson search problem.

So these are total search problems with small nondeterministic cost, and I don't know if this reminds you of something, but such things are studied in the Turing machine world. I don't know if many people attended the workshop on algorithmic game theory, but game theorists like this class called PPAD, which contains Nash equilibrium and other problems. In the Turing machine world, if you look at search problems, there are search problems which are solvable in polynomial time, and there are search problems which can be verified in polynomial time. But there are these rogue problems, such as Nash equilibrium or factoring, which are total search problems (a Nash equilibrium always exists) and yet seem hard to solve. How do we characterize the complexity of such problems? To address this, Papadimitriou and friends defined the class TFNP, total search problems in NP, where a solution is always guaranteed to exist, whereas for SAT you cannot guarantee that a satisfying assignment exists.
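The query-world search problem, together with its k-bit nondeterministic certificate, can be sketched like this (the clause encoding is an illustrative choice, not from the talk):

```python
def falsified_clause(cnf, assignment):
    # cnf: list of clauses; a clause is a list of (variable, sign)
    # literals, where a literal is satisfied when
    # assignment[variable] == sign. Returns the index of a violated
    # clause plus the <= k variables whose values certify the
    # violation: the nondeterministic certificate from the talk.
    for idx, clause in enumerate(cnf):
        if all(assignment[v] != s for v, s in clause):
            return idx, [v for v, _ in clause]
    return None  # unreachable when the CNF is unsatisfiable

# Two contradictory unit clauses: (x0) and (not x0).
cnf = [[(0, 1)], [(0, 0)]]
print(falsified_clause(cnf, {0: 1}))   # -> (1, [0])
```

Totality is exactly the point: because the CNF is unsatisfiable, every assignment falsifies some clause, so the search problem always has an answer, and checking the answer needs only the k queried bits.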
To specifically talk about these problems, they defined subclasses of TFNP, each corresponding to some mathematical principle that ensures the totality of the search problem. So PLS corresponds to the principle that every DAG has a sink, PPP corresponds to the pigeonhole principle, and PPA corresponds to the principle that every graph with an odd-degree vertex has another one. For example, Nash is complete for PPAD, which is PPA but on directed graphs. Never mind if you haven't seen these classes before, but there's a really nice connection here which I want to stress.

You can take these Turing machine search problem classes, and, as with any complexity class in the Turing machine world, study them in the query world and in the communication world. This is very classic: going back to the paper of Babai, Frankl, and Simon, they studied Turing machine complexity classes in the communication world, and there is a recent survey of Mika, Toni, and Tom, which I highly encourage you to look at, on the latest developments in the landscape of communication complexity classes. Those are all communication classes in the decision world, whereas here I'm talking about total search problems in the communication world. So you can do the same thing for these TFNP subclasses.

If I go back to the lower bound on monotone formulas I told you about, via the lifting theorem of Raz and McKenzie: that can be seen as a lifting theorem for the class FP, problems which are solvable in polynomial time; it's a lifting theorem from the query version of FP to the communication version of FP. And this brings me back to the question of the broader context of dag communication: the lifting theorem from decision dags to communication dags can be interpreted as a lifting theorem for the class PLS, which embodies the principle that every DAG has a sink.
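The PLS principle that every DAG has a sink is what guarantees termination of the following generic local search; a toy sketch with a hypothetical successor oracle:

```python
def find_sink(succ, start):
    # PLS-style search: succ(v) returns some out-neighbour of v in
    # a DAG, or None if v is a sink. Acyclicity guarantees that
    # following successors terminates, so a sink always exists;
    # that totality principle is what the class PLS embodies.
    v = start
    while succ(v) is not None:
        v = succ(v)
    return v

# A path DAG 0 -> 1 -> 2; node 2 is the sink.
succ = {0: 1, 1: 2, 2: None}.get
print(find_sink(succ, 0))   # -> 2
```

In the actual class the DAG is exponentially large and given implicitly by circuits, so this walk can take exponential time; the search problem is total all the same.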
In fact, when I told you that Razborov defined a communication analog which captures circuits, he defined it as PLS communication complexity: he said that the PLS communication complexity of the monotone Karchmer-Wigderson game captures circuit size. The rectangle dag that I defined is just a canonical version of those arbitrary PLS protocols. So this is the lifting theorem for PLS, from query dags to communication dags.

I want to throw in one more connection, which Mark alluded to: the lifting theorem from Nullstellensatz to span programs. Never mind if you haven't heard of span programs before; Nullstellensatz is a proof system, which Mark talked about, and span programs are a computational model where you test whether the span of given vectors contains a specific target vector. The lifting theorem of Toni and Robert shows that if you give me a lower bound on Nullstellensatz degree, you get a lower bound on span programs. If you restrict to the case of F2, this can be seen as a query-to-communication lifting for the class PPA, the class embodying the principle that every graph with an odd-degree vertex has another one.

So, going back to this picture: I told you that FP corresponds to formulas and tree-like resolution, PLS corresponds to circuits and dag-like resolution (resolution width), and PPA is equivalent to F2 span programs and F2 Nullstellensatz degree. I really like this picture, because it's really begging for more connections to be made: you can take a TFNP subclass and ask what its corresponding proof system or computational model is; you can start with a proof system and ask whether there is a corresponding TFNP subclass; and you can start with a computational model and ask whether there is a corresponding TFNP subclass.
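The acceptance test at the heart of a span program, checking whether a target vector lies in the span of some given vectors, is just Gaussian elimination; here is a minimal F2 sketch with vectors packed as integer bitmasks (the labelling of rows by input literals, which makes it a computational model, is omitted):

```python
def in_f2_span(vectors, target, nbits=32):
    # Gaussian elimination over F2: maintain a basis indexed by the
    # leading bit of each basis vector, then check whether `target`
    # reduces to zero against that basis.
    basis = [0] * nbits
    def reduce(v):
        for i in reversed(range(nbits)):
            if (v >> i) & 1 and basis[i]:
                v ^= basis[i]
        return v
    for v in vectors:
        v = reduce(v)
        if v:
            basis[v.bit_length() - 1] = v
    return reduce(target) == 0

print(in_f2_span([0b101, 0b011], 0b110))   # 101 xor 011 = 110 -> True
print(in_f2_span([0b101, 0b011], 0b100))   # not in the span   -> False
```

A span program accepts an input exactly when the rows whose literals are consistent with that input span the target, so a degree lower bound for F2 Nullstellensatz translating into a size lower bound here is what the PPA lifting statement packages.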
So there's one obvious missing piece on which progress can be made: here I only talked about F2, whereas there is also a lifting theorem for Fp Nullstellensatz and Fp span programs. We came up with a definition of a new class, PPA_p, which captures Nullstellensatz and span programs over Fp, but it turned out that this definition was already given by Papadimitriou in the same paper where he defined PPA; it's hidden in one paragraph and has been overlooked for the most part, I think. This actually motivated us to go off on a tangent and study this class in the Turing machine world and give a natural complete problem for it. I think this class is very interesting; it seems to be the right class to capture some interesting problems. I can tell you about it offline; it's not related to lifting. So I think this picture is interesting in that it really asks for more connections to be made, and it gives a unified view for understanding these lifting theorems.

So that's it; I'll just conclude here. These are the lifting theorems I talked about, and this is the TFNP picture. Let me end with some open questions. One great open question is proving these lifting theorems with constant-size gadgets, which was alluded to earlier; that would imply a 2^{Ω(n)} monotone circuit lower bound. Mark alluded to the question of lifting for dag models with other shapes, for example intersections of triangles; that would have implications for a proof system where lines are clauses over linear threshold functions. Another thing, not related to lifting, is something I was very interested in but eventually gave up on: proving an exponential monotone circuit lower bound for perfect matching. Razborov proved a super-polynomial lower bound, but somehow this lifting machinery doesn't seem to apply; I don't know how to embed a lifted problem into matching. This also resembles what's going on in extension complexity, where Rothvoss proved an exponential lower bound for matching for linear programming extended formulations, but that doesn't seem to fit into this lifting framework either. So maybe some direct proof is needed for matching. And finally, there is this picture, which is begging for more connections to be made.

Two questions about the open questions here. First, what's the best monotone circuit lower bound we have? I think it's 2^{n^{1/3}}, due to Harnik and Raz. And that's not via lifting; it uses the method of approximations, and I think it's maybe from the 90s, I'm not very sure. Second, on this perfect matching question: in the monotone world, it may be that even bipartite perfect matching requires exponential size. Yeah, right: in the formula world, for monotone formulas, we know a lower bound due to Raz and Wigderson which is also for bipartite perfect matching; it's a 2^{sqrt(n)}-type lower bound, where n is the number of variables. So it's conceivable that it could hold already for the bipartite case, whereas for extension complexity it cannot hold for the bipartite case, so it's slightly different. Is that important for perfect matching? I'm not entirely sure. Mika told me at some point in the past that, not for matching but for clique, you could get a better lower bound if you could do lifting with the inner product gadget: you could choose a specific outer search problem for which there is a clever reduction to clique, but I forget the exact details. Okay, so I'll stop here.