Hello everyone, and thanks for joining me. My name is Alejandro Aguirre, and I'm going to present joint work with Shin-ya Katsumata from the National Institute of Informatics in Tokyo.

Our motivation behind this work is studying the notion of weakest precondition in a formal manner. In a very simplified way, in program verification weakest preconditions can be used to check that a program satisfies a Hoare triple. So we have a program F that is specified by a Hoare triple, with P as a precondition and Q as a postcondition, and this Hoare triple is satisfied exactly when P implies the weakest precondition of F and Q. This is a very useful technique that can be applied in a lot of settings, for instance for programs with side effects such as non-termination or probabilities, and it also supports quantitative notions of predicates: not only binary predicates, but predicates that take real values, for instance.

Being a bit more concrete, we can see a program as a map F from memories to memories, and what WP does is compute inverse images of predicates. So if we have a predicate Q, which is a subset of M, then the WP of F and Q is the inverse image of Q along F. Crucially, this WP is composable, meaning, first, that the WP of skip and Q is just Q, and secondly, that if we have programs in composition, F composed with G, then the WP of F composed with G is the composition of the individual WPs.

Now, if we want to add effects to this setting, one common manner of doing so is by using a monad. So we assume that our computational effects are captured by a monad T, and now we want to compute the WP of F and Q. But now we cannot directly apply the inverse image, because Q is a subset of M but not a subset of TM. So we first need some way of mapping Q to a predicate over TM, and this is what Ṫ does, which is what we call a lifting of T. In this case, what WP does is first apply the lifting to Q and then compute the inverse image of Ṫ(Q). Now we also want to ask whether this WP is composable, but the question is a bit different, because skip is no longer the identity program, it is actually the unit of the monad T, and F composed with G is no longer ordinary composition, it is Kleisli composition: instead of G composed with F we have G♯ composed with F, where G♯ is the Kleisli lifting of G. This is the question we care about, and it is what we try to answer.

So in this talk, the contributions we present are: a characterization of WPs in a general effectful setting, using the notion of fibration from category theory, by which we are able to characterize WPs exactly as a special kind of lifting of monads that I will explain later; then we apply these techniques to domain fibrations, which are fibrations that capture usual notions of predicates such as binary predicates or real-valued predicates; we also show how to combine monads to compose effects; and finally we instantiate all this theory to concrete examples to recover transformers from the literature.

We begin by studying weakest preconditions in a general setting, and for this we are going to use fibrations.
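Before turning to the categorical development, here is a concrete illustration (not something shown on the slides) of the plain and effectful WP, in Haskell, using the Maybe monad as the effect and Boolean-valued predicates; the lifting tDot and all other names are illustrative choices, not the paper's notation.

```haskell
import Control.Monad ((>=>))

-- Predicates over a type are Boolean-valued functions.
type Pred a = a -> Bool

-- Pure programs: WP is just the inverse image, i.e. precomposition,
-- so wp0 id q == q and wp0 (g . f) q == wp0 f (wp0 g q).
wp0 :: (a -> b) -> Pred b -> Pred a
wp0 f q = q . f

-- One lifting of the Maybe monad: an error (Nothing) never
-- satisfies the postcondition.
tDot :: Pred a -> Pred (Maybe a)
tDot q = maybe False q

-- Effectful WP: lift the postcondition, then take the inverse image.
wp :: (a -> Maybe b) -> Pred b -> Pred a
wp f q = tDot q . f

-- The composability question of the talk asks when
--   wp (f >=> g) q  ==  wp f (wp g q)
-- where (>=>) is Kleisli composition; for this lifting it holds.
```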
A fibration is a situation in which we can take the inverse image of an object in P along a morphism in C. Being a bit more concrete, for every object X in C we can define its fiber, which is denoted by P_X. This P_X is a poset of objects that get mapped to X and morphisms that get mapped to the identity of X. And if we have a morphism in C, going from, say, X to Y, then we can compute a functor from the fiber over Y to the fiber over X, and this functor computes inverse images. This is summarized by the diagram: if we have F going from X to Y and we have Q above Y, then we can compute the inverse image, which is denoted F*(Q), and the morphism in P going from F*(Q) to Q is denoted by F̄; we also call it the Cartesian lifting determined by F and Q.

Some examples of fibrations: for instance, we can look at the fibration from the category of predicates to the category of sets, where the fibration just maps a predicate to its base set; or we could look at the forgetful functor from topological spaces to Set, or at the forgetful functor from the category of binary relations to Set squared.

Now, if we want to add effects to this setting, we do this using monads. First we have a monad T over C that captures the computational effect of our programs, and then we define a monad Ṫ on P that needs to satisfy commutativity of these three diagrams. What the first diagram expresses is that if we have a predicate Q over X, then Ṫ(Q) is a predicate over TX. The second and the third diagrams require that P maps the unit and the multiplication of the Ṫ monad to the unit and multiplication of the T monad. All this data packed together forms a triple that we call a Dijkstra structure, which is a setting in which we can define weakest preconditions, and we do this as follows: if we have a Dijkstra structure given by P, T and Ṫ, then the weakest precondition WP is exactly the inverse image induced by the fibration, so WP(F, Q) is F*(Ṫ(Q)). And once we have defined the WP, we can also define Hoare triples: a Hoare triple is satisfied if P implies WP(F, Q).

Now of course we want to look into composability. We can think of two programs, F going from X to TY and G going from Y to TZ; in the category C we can compose them using Kleisli composition, and we have a predicate Q over Z in P. Then there are two ways of computing the inverse image: we can just compute the inverse image along the composition, which is the lower path in the diagram, or we can first compute the inverse image along G and then the inverse image along F, which is the upper path in the diagram. By the universality of the inverse image there is always going to be this implication, but in general there is not going to be an equality. So of course we would like to characterize the cases in which we have an exact equality between these two inverse images, and this equality, which is exactly composability, is satisfied if and only if the Dijkstra structure is Cartesian. By this we require that Ṫ is fibered and furthermore that it satisfies two extra conditions: the first condition says that the unit of the Ṫ monad is not only mapped to the unit of the T monad by P, but is also the Cartesian morphism above eta, and the same holds for the multiplication of the Ṫ monad.

In general, these kinds of Cartesian liftings are not very common. For the monads in this table, this is actually an exhaustive list of the Cartesian liftings: this is the maybe monad, this is the powerset and the non-empty powerset monad, this is the distribution monad, and these are monad compositions. You do not need to read the whole table right now; I am going to explain some of the examples later. But the point here is that if we want to be able to exhaustively list the liftings of a monad, then we need techniques that allow us to prove that there are no more liftings, and this is what we are going to do in the second part of the talk for a specific kind of fibration.
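To make the table concrete for its simplest entry, here is a small sketch (with illustrative names) of the two liftings of the maybe monad over binary predicates that the talk calls tot and par, each inducing its own composable WP transformer; liftTot coincides with the lifting sketched earlier.

```haskell
type Pred a = a -> Bool

-- Total lifting: an error (Nothing) fails the postcondition.
liftTot :: Pred a -> Pred (Maybe a)
liftTot q = maybe False q

-- Partial lifting: an error is accepted; only actual results
-- must satisfy the postcondition.
liftPar :: Pred a -> Pred (Maybe a)
liftPar q = maybe True q

-- Each lifting induces a WP transformer in the same way.
wpWith :: (Pred b -> Pred (Maybe b)) -> (a -> Maybe b) -> Pred b -> Pred a
wpWith lift f q = lift q . f

-- Toy program: division that may fail.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

-- wpWith liftTot (safeDiv 10) (> 0) 0  ==> False  (error counts as failure)
-- wpWith liftPar (safeDiv 10) (> 0) 0  ==> True   (error is ignored)
```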
Now we study the concrete setting of domain fibrations coming from a lax slice category. A lax slice category is defined as follows. We first have a category C and we pick an object Omega, and we assume that every hom-set with Omega as its codomain is ordered; we further assume that pre-composition is monotone. An easy way to obtain these conditions is by assuming that Omega is ordered itself and that the order on hom-sets is just the pointwise order on functions. Then we build a lax slice category C/Omega that has as objects morphisms in C with Omega as their codomain, for instance I going from X to Omega or J going from Y to Omega, and as morphisms, morphisms in C between the domains, so F from X to Y, that make this diagram commute laxly, meaning that this path of the diagram is less than or equal to this path of the diagram. The projection to C is known as a domain fibration, because it maps morphisms to their domains: I gets mapped to X and J gets mapped to Y.

Now we want to look at the extra structure in this setting, and to do this we first look at monotone algebras, because there is a tight connection. A monotone algebra is a pair of a functor F and an algebra o of F that preserves the ordering of hom-sets in the following way: if I is less than or equal to J, then the two ways of going from FX to Omega, one through I and one through J, are ordered in the same manner, so this path of the diagram is less than or equal to this path of the diagram. If we have two monotone algebras, o an algebra of F and o' an algebra of G, then a morphism between these two algebras is a natural transformation between the functors that makes this diagram commute exactly. Now, this data, both the monotone algebras and the morphisms of monotone algebras, forms a monoidal category that has as unit the identity functor with the identity algebra and, as operation, algebra composition together with functor composition.

The point of this is that there is a monoidal isomorphism between this category of monotone algebras and the category of fibered functors and Cartesian 2-cells between them. I am not going to give details of what this category is; the main point is that it is a category of endofunctors, and a monoid in this category is exactly a pair of a monad T and a Cartesian lifting of this monad. So now we can look for the monoid objects in the category of monotone algebras and get a bijective correspondence between Cartesian liftings and a special kind of algebras, and such a monoid object is exactly a pair of a monad and a monotone Eilenberg-Moore T-algebra. So the point of this slide is that there is a bijective correspondence between monotone Eilenberg-Moore T-algebras and Cartesian liftings; to construct Cartesian liftings it is enough to find monotone T-algebras, which are easier to find.
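This correspondence has a direct computational reading: an algebra on the object of truth values induces a lifting, and hence a WP. A minimal Haskell sketch of this recipe, with illustrative names and assuming the monad is a Haskell Functor, is the following.

```haskell
-- Omega-valued predicates over a type.
type PredO w a = a -> w

-- An algebra of t on the object of truth values w.
type Algebra t w = t w -> w

-- The algebra o induces a lifting of the monad to predicates.
liftAlg :: Functor t => Algebra t w -> PredO w a -> PredO w (t a)
liftAlg o q = o . fmap q

-- ...and therefore a WP transformer: wp f q = o . fmap q . f.
wpAlg :: Functor t => Algebra t w -> (a -> t b) -> PredO w b -> PredO w a
wpAlg o f q = liftAlg o q . f

-- Composability of wpAlg corresponds to o being an Eilenberg-Moore
-- algebra (o . return = id and o . join = o . fmap o), which is the
-- role the monotone T-algebras play in the talk.  For example, with
-- t = [] and w = Bool, the algebras `or` and `and` give the "may"
-- and "must" readings that appear later.
```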
But sometimes we also want to combine effects: for instance, we may want to combine nondeterminism and probabilities, or non-termination and nondeterminism, and the way this is usually done is by finding a distributive law. A distributive law between two monads T and S is just a natural transformation alpha that allows us to swap the order of the monads, so it goes from ST to TS, and this is enough to define a composite monad on TS, provided alpha satisfies these coherence conditions. Now, ideally, we would like to be able to construct Cartesian liftings for the composite monad from Cartesian liftings of the individual monads, and therefore from algebras of the individual monads. And this is possible as long as the algebras satisfy an extra condition. The extra condition is that if we have a monotone T-algebra and a monotone S-algebra, they induce a Cartesian lifting of the composite monad if and only if these two algebras commute nicely with the distributive law: starting from ST Omega there are two ways of eliminating S and T, one is by first applying the T-algebra and then the S-algebra, and the other is by first swapping T and S and then applying first the S-algebra and then the T-algebra; if these two paths of the pentagon commute, then we can build a Cartesian lifting for the composite monad.

Now we leave behind the more abstract part of the presentation and we look into some examples. We start with examples in the category of slices over 2, where 2 is the two-point set {0, 1} with the expected order; Set/2 can be seen simply as the category of binary predicates, which has as morphisms functions that preserve the truth of the predicates.

The first example we look into is the partiality monad, sometimes called the maybe monad, that can be used to model non-termination or errors. There are exactly two monotone algebras of this monad over 2, which are tot, for totality, and par, for partiality, and these induce two Cartesian liftings and therefore two composable WP transformers. But instead of looking at the WP transformers, we look at the Hoare triples, which are a bit easier to understand. The Hoare triple {P} F {Q} is satisfied in the tot setting if for any input in P the output is not an error and it satisfies Q, so it is like a total interpretation of the Hoare triple; and the partial interpretation of the Hoare triple is that if the input satisfies P and the output is not an error, then the output satisfies Q, but if the output is an error, then it does not matter.

We can also look into the non-empty powerset monad, which is used to model nondeterminism. P⁺ maps X to the non-empty subsets of X, and there are also two monotone algebras over 2: one is must, which models must-nondeterminism, and the other is may, and these induce two Cartesian liftings that are summarized in the table. The Hoare triple {P} F {Q} is satisfied in the may setting if for any input in P there is a possible execution of F such that its output is in Q; and the must interpretation of the Hoare triple is that if the input satisfies P, then any possible execution has an output that satisfies Q.
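In the same sketch style, using lists as a stand-in for the non-empty powerset monad, the may and must readings of the Hoare triple look as follows; the names and the toy program are purely illustrative.

```haskell
type Pred a = a -> Bool

-- May lifting: some possible outcome satisfies the postcondition.
liftMay :: Pred a -> Pred [a]
liftMay q = any q

-- Must lifting: every possible outcome satisfies the postcondition.
liftMust :: Pred a -> Pred [a]
liftMust q = all q

-- Hoare triple {p} f {q} over a finite domain, parameterized by the
-- chosen lifting.
hoareWith :: (Pred b -> Pred [b]) -> [a] -> Pred a -> (a -> [b]) -> Pred b -> Bool
hoareWith lift dom p f q = and [ lift q (f x) | x <- dom, p x ]

-- Toy nondeterministic program: either keep or negate the input.
flipSign :: Int -> [Int]
flipSign x = [x, negate x]

-- hoareWith liftMay  [1..5] (> 0) flipSign (> 0)  ==> True
-- hoareWith liftMust [1..5] (> 0) flipSign (> 0)  ==> False
```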
This example can also be seen, in a similar manner, in the probabilistic setting: we also have two algebras of the distribution monad, probabilistic must and probabilistic may. Probabilistic may interprets the Hoare triple {P} F {Q} as follows: if x satisfies P, then the output of the program on x satisfies Q with non-zero probability. And the probabilistic must interpretation of the Hoare triple is that if the input satisfies P, then the output needs to satisfy Q with probability one.

Now we look into more complex examples, over slices over the positive reals. The category Set/[0, ∞] can be seen as the category that has as objects real-valued predicates and, as morphisms, as always, functions that make this triangle commute laxly. We start by looking again at the distribution monad, but in this case we need to find a real-valued algebra, and one possible example of this is the expected value, which satisfies the monotonicity condition. Here we are not looking exhaustively into the algebras of D, we just consider the expected value; but the point is that the expected value allows us to recover a transformer that is known in the literature as the weakest pre-expectation, as introduced by McIver and Morgan. The weakest pre-expectation of a program F and a predicate I computes the expected value of I over the output distribution of F for some input x.

We can also combine effects, for instance probabilities and nondeterminism; this is a well-known problem. For this we define the monad ID of indexed distributions, which is another way of defining a monad of distributions, but the point of ID is that it has a distributive law over P, which D does not. For ID we can also define the monotone ID-algebra of expected value, and if we combine this expected value with sup, which is an algebra of P, then we obtain the WP defined by this equation: if we have a program F with probabilities and nondeterminism and we have a predicate I, then the WP of F and I computes the supremum of the expected values over any possible output.

Finally, we want to use our techniques to reconstruct the expected-runtime transformer that was introduced by Kaminski and others to reason about runtimes of probabilistic programs. Our setting is a bit simplified compared to theirs, because we do not consider loops, but in any case they have a first-order language with probabilistic assignments, nondeterministic choice, conditionals, and also a tick operator to introduce time into a program. The ERT transformer maps functions from memories to real values to functions from memories to real values, as summarized by this table. I am going to go quickly through this, but the point is that, of course, it is compositional, so the ERT of C1 composed with C2 is the composition of the ERTs, and also they introduce a cost for assignments and for conditionals.

Now, to model this in our setting we need to combine three monads: first a monad of cost, which is just a writer monad that adds a natural-number-valued cost to a program, and also the monads of probabilities and of nondeterminism as in the previous example. We develop a denotational model of their language: we have an object V of values, states (our memories) are maps from variables to values, and the semantics of a program is a map from memories to T of memories. We can reconstruct their transformer from three algebras: for the cost monad we just use addition as an algebra, so given a natural number and a real number we can produce a real number by addition; for ID we use the expected value; and for P we use the supremum. Using these three algebras we can construct a WP, which is summarized by this table, and it almost coincides with the ERT transformer. It just needs an extra transformation: we need to instrument the code with ticks, because the ERT language does not have explicit ticks for assignments and conditionals. But once we do this transformation, we can show that for any program C, our WP of the instrumented version of C coincides exactly with the expected runtime of C.
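Connecting this back to the earlier sketches, the expected-value and supremum algebras that appeared in this part can be written in the same illustrative Haskell style; the finite-distribution representation and all names are simplifications, not the construction from the paper.

```haskell
-- A finite distribution as a list of (outcome, probability) pairs.
type Dist a = [(a, Double)]

-- Real-valued predicates (expectations).
type Exp a = a -> Double

-- Expected-value algebra for the (finite) distribution monad.
expVal :: Dist Double -> Double
expVal d = sum [ p * v | (v, p) <- d ]

-- Weakest pre-expectation of a purely probabilistic program,
-- in the spirit of McIver and Morgan's transformer.
wpe :: (a -> Dist b) -> Exp b -> Exp a
wpe f i x = expVal [ (i y, p) | (y, p) <- f x ]

-- With nondeterminism on top (a non-empty list of possible output
-- distributions), expected value is combined with the supremum
-- algebra, as in the talk's WP for probabilities plus nondeterminism.
wpeSup :: (a -> [Dist b]) -> Exp b -> Exp a
wpeSup f i x = maximum [ wpe (const d) i x | d <- f x ]
```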
In the paper there is more material that I have not been able to cover. We first study how to generalize our setting to K-fibrations, where K is a category of posets, and the idea here is that if the fibers are not just posets but something stronger, then we can get WP transformers that have nicer properties, such as meet preservation or continuity. We also look into how to dualize some of our results to study strongest postconditions, and we show how to compute the WPs induced by generic effects, as introduced by Plotkin and Power. And there are some more examples we have not been able to cover for time reasons; for instance, one example is the study of higher moments of cost in probabilistic programs, which is a transformer presented in a paper by Kura and others, and we also study fibrations that are not domain fibrations.

Some related work that I should mention: perhaps the most relevant one is the work by Hasuo; in his setting, what he studies corresponds in our setting to Cartesian liftings along C/TΩ, so our improvement with respect to this work is that our object of truth values is not TΩ but in general any Ω. Some other work is the work by Goncharov and Schröder, where they study predicates by the use of what they call innocent monads; there is also the work by Hino and others, where they study predicate transformers using relative monads; and finally I should mention the work by Maillard and others on their Dijkstra monads, which has an intersection with our work that we will discuss in the final version of the paper.

So now, to conclude: I have presented a way of characterizing composability of WPs in an effectful setting in terms of Cartesian liftings of monads; I have also shown techniques for constructing these Cartesian liftings from algebras of the individual monads; I have shown that algebras of the individual monads can be composed into algebras of the composite monads, and therefore used to construct Cartesian liftings for the composite monads; and finally I have shown that all these abstract techniques are actually applicable to reconstruct transformers that have appeared in the recent literature. And with this I would like to conclude the talk and thank everyone who has watched.