And he writes the following: "Proof-theoretic semantics is an alternative to truth-condition semantics. It is based on the fundamental assumption that the central notion in terms of which meanings are assigned to certain expressions of our language, in particular to logical constants, is that of proof rather than truth." So the idea is to replace the primitive notion of truth with another notion, that of proof, and to use this notion of proof in order to explain how we assign meaning to certain linguistic expressions, in particular to logical constants. In this talk I will focus on logical constants; even more narrowly, I will focus on logical connectives, the propositional connectives. An easy case; easy, but not so much, as we will see. But proof-theoretic semantics, as Peter Schroeder-Heister says, is also inherently inferential, as it is inferential activity which manifests itself in proofs. So the idea is that when we write down a proof we are carrying out some inferential activity. And in particular, as he says, it belongs to inferentialism. Inferentialism is, let's say, the philosophical doctrine according to which inferences, and in particular rules of inference, establish the meaning of expressions, in contradistinction to denotationalism, according to which denotations are the primary sort of meaning. So again you see the difference that was already more or less settled here: the idea of saying, well, in standard truth-conditional semantics you define the meaning of a connective by explaining the truth conditions of the sentences in which the connective appears, and when you go down, not to the connectives, but to the propositional and sub-propositional terms, you then use the notion of denotation in order to establish the notion of truth.
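The contrast between the two semantic approaches can be made concrete with the two styles of meaning clause for conjunction; this is a schematic sketch in standard notation, not taken from the talk's slides:

```latex
% Truth-conditional clause: the meaning of "and" is given by a truth condition
%   A \wedge B is true  iff  A is true and B is true
\[ \mathcal{M} \models A \wedge B \iff (\mathcal{M} \models A \text{ and } \mathcal{M} \models B) \]

% Proof-based (BHK-style) clause: the meaning is given by what counts as a proof
%   a proof of A \wedge B is a pair \langle p, q \rangle, where
%   p is a proof of A and q is a proof of B
```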
In proof-theoretic semantics, on the other hand, you start from the notion of proof, and then you look inside the notion of proof and you say: well, a proof is a kind of chaining of steps of inference, and so what you take as primitive are the rules of inference. We will not enter here into what a rule of inference is in general; we will see examples of rules of inference in a while. There is another typical feature of proof-theoretic semantics: as Schroeder-Heister says, this general philosophical and semantical perspective merged with constructive views which originated in the philosophy of mathematics, and especially in mathematical intuitionism. So proof-theoretic semantics is usually linked with a constructivist theory of the meaning of the logical constants and of mathematics, because the idea is that, when you are a constructivist, you want to say that something is true because of a proof, in terms of proofs, and not because there is some outside world that is there and that, so to say, establishes the notion of truth. It is us, with our inferential activity, our proving activity, who establish the truth of propositions; and not only the truth: in proof-theoretic semantics, we say, this activity also establishes the meaning.

Well, if we take this last part, the idea that at least the original proof-theoretic semantics goes together with a constructivist approach, then starting from this it is fairly easy to enter a little bit further into proof-theoretic semantics, because we can take inspiration from, ok, I take it for granted that you know what the BHK explanation of the logical connectives is; BHK stands for Brouwer-Heyting-Kolmogorov, and it is a way of explaining the meaning of a logical connective, or what a logical connective is, from a constructivist point of view, that is, precisely through the notion of proof. So for instance you ask: what is a proof of a conjunction, a conjunction of the form A and B? Well, a proof of a conjunction A and B is a proof of A together with a proof of B, taken together, and you are done.

Good; but proofs, this notion of proof, can be formalized within the framework of what logicians know as Gentzen's natural deduction. Then, well, sorry, I did not say it before, so let me say it now: taking inspiration from the BHK explanation of the logical connectives, proof-theoretic semantics rests on the idea that we know the meaning of a compound sentence, I used the example of A and B, when we know what counts as a proof of it, but a canonical proof, the standard way of proving a sentence with that connective. Clearly, from the constructivist point of view there is the idea that you can prove a sentence in many different ways, but the canonical one is the one that establishes the meaning of the connective. Now, if proofs are formalized within the framework of Gentzen's natural deduction, then a canonical proof of a sentence is nothing but a derivation ending with an introduction rule for its main connective. This is because in natural deduction, for those who are not familiar with it, you have two kinds of rules: rules for introducing an expression, a connective in this case, and rules for eliminating it. So, standardly, in Gentzen's natural deduction you do the opposite of what you usually do in axiomatic systems: there you have axioms all the time; here you switch, you have only rules, rules for the sentences, and no axioms. Well now, if we take this point of view of saying that a canonical proof is something that we explain in terms of the introduction rules of Gentzen's natural deduction, then this means that we assign priority to the introduction rules, because they are the ones that establish the meaning of a connective, and these rules are taken for granted as self-justified; and it is on their basis that the corresponding elimination rules should be justified. So the idea is that we take our introduction rules as self-justified, in the
sense that we lay them down in a certain way because this is the way in which we explain the meaning of the connective. Then there are other rules that you can use, and you want to show that these other rules are also correct; and what you do is try to justify them in terms of the introduction rules. How? I will say in a second. But first let me say that this way of proceeding, giving priority to the introduction rules, corresponds to a particular form of inferentialism, the view that the meaning of a connective is given by its inference rules, which is usually called verificationism. Why verificationism? Because the idea is that the introduction rules tell me how I should verify that the assertion of a certain complex, or compound, sentence is done correctly: it is done correctly if the sentence has been asserted in the direct way, that is, through an introduction; otherwise it is asserted in a non-direct way, and we should show that we could eventually assert it in the direct way, by carrying out a process of justification of the non-canonical, or non-direct, way of asserting a certain complex sentence. On the other hand, you could have decided to start from the elimination rules if you wanted, and in that case we speak of another form of inferentialism, called pragmatism, because in this case the idea is that you manifest your understanding of the meaning of a connective by showing that you can draw inferences from the assertion of a sentence governed by that connective. So verificationism tells you which are the conditions for making an assertion, for introducing into the plane of our language, of our discourse, a complex sentence of a certain logical language; in the other case you start from the assertion of this compound sentence and you try to draw the consequences from it. Ok, so keep this in mind for a while, because we will come back to it in a second. Now, the person who developed, and indeed introduced, this terminology of verificationism and pragmatism is Michael Dummett. Inferentialism is today a big
discipline in the theory of meaning, a big area, and there are many approaches; we can mention, for instance, the approach of Robert Brandom. But Dummett was probably the first to start this project, and Dummett considers two conditions that the inference rules of a logical connective must satisfy in order to be fully justified and thus to completely determine the meaning of the connective. So the idea is that, as we were saying before, you can use this inferentialist approach in order to explain the meaning of whatever linguistic expression, or at least this is the general project; for logical connectives, the claim is that the meaning of a connective is completely determined by its rules. In other cases, imagine that we are not working just with logical connectives but within a specific mathematical or scientific theory: it could be that certain, let's say, atomic sentences are harder to justify completely in inferential terms, so that part of the justification comes from the surrounding theory; in that case you do not completely determine the meaning in terms of the inferences. In the case of the logical connectives, yes, and without too much surprise, because logical connectives are topic-neutral: they do not depend on a particular domain, on a particular state of affairs, so we may think that we can justify them just in terms of our inferential use. And here, in this talk, we concentrate on the logical connectives. Now, these two conditions that Dummett asks for, in order to show that the meaning of a certain connective is completely determined by its inference rules, are the conditions of harmony and of stability. Now, harmony is the condition which has received most of the attention: if you look at the literature on proof-theoretic semantics, everyone speaks of harmony, and there are many, many different ways of capturing exactly
what this condition means. But let's say that when the inference rules defining the meaning of a connective are formulated in terms of natural deduction, as we said before, this condition can be formally captured by what is called Prawitz's inversion principle. What does the inversion principle say? It is the following: according to this principle, what can be drawn by the elimination rules for a connective C should be no more than what can already be drawn from the premises allowing the application of the introduction rules for C. Notice that here I am taking the notion of premise in a broad sense: when you have a schematic presentation of a rule in natural deduction, for me a premise is whatever stands above the inference line. So it is not just a formula: it could be the fact that a formula depends on another, as in the case of the implication introduction. Usually people speak of grounds in this case, but I do not want to speak of grounds, because that would open very many discussions, so I just speak of premises, okay? And harmony can thus be seen as a sort of no-more, or no-gain, condition. Notice that this no-more reading also holds if we interpret harmony in another way, not in the sense of the inversion principle à la Prawitz but in the sense of conservativity, which is another way of understanding harmony: it can still be understood as a no-more condition. But in the case of the inversion principle à la Prawitz, it can also be formally captured by an operation of proof transformation, or derivation transformation, which is that of detour reduction. What is a detour reduction? You have an introduction rule which introduces a complex sentence with some connective C as principal connective, and then you immediately eliminate it. It is called a detour because you introduce and then you immediately eliminate, so you make a sort of detour, and
harmony consists in showing that this detour is eliminable: the detour reduction says that we can eliminate it, we can bypass the detour, it is not essential. But the other property is the one that interests me more in this talk, because stability can be understood as the counterpart of harmony, and it can thus be seen as corresponding to a no-less, or no-loss, condition. In this sense we can formulate it as follows: what can be drawn by the elimination rules for a connective C should be no less, this time, and not no more, than what can already be drawn from the premises allowing for the application of the introduction rules. So, according to Dummett, satisfying both harmony and stability is a way of having a perfect balance between introductions and eliminations: if you start from the introduction rules, the elimination rules are no stronger and no weaker than the introduction rules; and if you go the other way around, starting from the elimination rules, then the introduction rules are no stronger and no weaker than the eliminations. And so, in this sense, for Dummett, proving that a certain connective satisfies both harmony and stability is a way of proving that the verificationist and the pragmatist are both right. In this talk I will take the point of view of the verificationist, which is the most common one, so I will always start from the introduction rules and try to justify the eliminations; but you could do it the other way around. Now, several attempts to formally capture the notion of stability have recently been studied in the literature: we have the work by Francez and Dyckhoff, by Francez himself, by Jacinto and Read, and by Tennant, even if Tennant does not really speak of stability but of deductive equilibrium, which is a very similar property. Here, in this talk, I will focus on two of these proposals, those of Jacinto and Read and of Tranchini. Why these? Well, first, these two proposals are prima facie very similar, and so one of the points of the talk is to try to separate them and show in which
sense they are different; and on the other hand, they are the two most general proposals that exist, in the sense that the others can be subsumed under these two. Moreover, they both involve an operation of proof transformation, and in this sense we can compare stability to harmony. Ok, to enter into the core of the talk, let me say that we will analyze these two proposals and compare them with respect to a specific case study, that of quantum disjunction; and this is indeed the connective considered by Dummett in order to discuss the failure of the stability condition. Now, which are the rules for quantum disjunction? They are the following. Let's start from the introductions: we have two introduction rules, which are exactly the same as the standard introduction rules for disjunction. Then you have the elimination rule, which is very, very similar to the elimination rule of standard disjunction, with the difference that there is a proviso: no side assumptions may be present in the subderivations of the two minor premises of this rule. In the sense that here I could have a derivation of C from A, but in order to apply this rule you must not have any side assumptions, no Γ, for those who see these things in a schematic way; and it is the same for the second minor premise: C must be derived just from A, and just from B, and from nothing else. Ok. Now, the first thing to remark is that the rules of quantum disjunction are harmonious. This is something that you can already see in Dummett; I will not enter into the details, I will just give the idea, which is that you can create a detour of this form. Here I choose to create it using the first of the two introduction rules, but you can do the same with the second. You have a detour because an introduction introduces this complex formula, and then this complex formula is immediately eliminated. And how can it be reduced? Well, it can be reduced in this way: you take
the derivation D of A and you plug it in above the derivation of C from A. I use here the square brackets, which are Prawitz's notation, to indicate that there may be many occurrences of the assumption, so there can be multiple places where I plug it in. So it is harmonious. But not only Dummett, also Jacinto and Read, argue that the rules of quantum disjunction are not stable. This is particularly interesting, because quantum disjunction is harmonious, so it has good properties, but not good enough in order to have a meaning that is completely justified by its rules. In fact, Jacinto and Read, in their paper of 2017, which is a very, very rich and refined paper, analyze four alternative formulations of stability; here we will focus only on the one that they call generalized local completeness. We will see later, maybe, if we have time, why they call it like that; here we will call it J&R-stability for short, but what I call J&R-stability is just their generalized local completeness. They have other proposals for formalizing stability, but I do not need to treat them. When J&R-stability is applied to the case of quantum disjunction, which form does it take? It takes the following form. It says: assume, point (1), that a formula C is derivable from the premises of the introduction rules for quantum disjunction, from A and from B respectively. In other words, if S is a certain system of rules containing at least quantum disjunction, but possibly other things as well, then you can derive C from A and possibly a certain context Γ, and derive C from B and possibly a certain context Γ. This means, basically, that we have these derivations of C from A and from B; and here it could be that there are indeed also some side assumptions. I do not write the side assumptions here just because it will make the comparison with Tranchini's proposal easier to read; if you want we can put them, but I think they would just create confusion. Now, then, they ask us to respect two further
conditions. The first, point (2), is that C should be derivable in the system S from A ⊔ B, where I write ⊔ for quantum disjunction; and the other, point (3), is that this derivability result should, or rather must, be established by applying the elimination rule for quantum disjunction to the derivations that you have from point (1). So you assume that you have these derivations, and then you say: what it is, for quantum disjunction, to be stable, is that I should derive the same C that I derived from A and from B, now from A ⊔ B, and that this derivability result must be established in this specific way, through the elimination. Why? Because this shows that the things that follow from A and from B, which are the grounds, let's call them the grounds, that is, the premises A and B of the introduction rules for quantum disjunction, must also follow from A ⊔ B by the elimination rule. And indeed here there should be a Γ: I forgot it. Ok. Now, it can be argued that, since Γ is not necessarily empty, point (3) cannot always be obtained, since the application of the quantum disjunction elimination would not be correct: in particular, there are situations in which point (1) holds but point (3) fails, and so the condition does not work. Ok, good; let's give a concrete counterexample. We can consider a language in which we have quantum disjunction and conjunction, the standard conjunction. Then we take Γ to be just the formula E, so it is a singleton in this case, and we take C to be E ∧ (A ⊔ B). We then have that from A and E we derive E ∧ (A ⊔ B), and we do the same from B and E; in particular, we have these two derivations. Now, what does the J&R-stability condition ask? That I should be able to apply the quantum disjunction elimination rule to these two derivations, because they are special instances of the ones in point (1), and obtain the same conclusion. But if I do that, I am applying the quantum disjunction elimination rule in an incorrect way, so this configuration here is
not, properly speaking, a derivation. Ok, and so, therefore, we must conclude that quantum disjunction is not J&R-stable. Yes, but, you could say, we could also show the failure of J&R-stability for quantum disjunction by showing the failure of the other condition, point (2), the fact that C is derivable from A ⊔ B: in this case you would have point (1) satisfied but point (2) not satisfied, and so again you have the failure of the condition. How does it work? Here we take again Γ to be E; we take C to be a different formula, namely (E ∧ A) ⊔ (E ∧ B). Now we have, on one side, that this formula is derivable from A and E, and it is also derivable from B and E, since we have these two derivations. On the other hand, by adopting the standard interpretation of quantum logic based on orthomodular lattices, which was more or less the standard interpretation of quantum logic at the time, it is possible to show that this formula is not at all a consequence of E and A ⊔ B; and since it is not a consequence, we cannot derive it. Now, the way in which Jacinto and Read argue that this kind of stability fails for quantum disjunction seems to follow this second counterexample, and not the first one. They say the following in their article: the elimination rules for quantum disjunction are not generally locally complete. So, stay with me: (E ∧ A) ⊔ (E ∧ B) is derivable from E, A and from E, B, even though the same formula is not derivable from E and A ⊔ B. So it seems that they are speaking just in terms of derivability, as we established in the second counterexample, and not in terms of the way in which we prove, or try to prove, this particular derivability claim. Now, this is the point I was making: the first counterexample is based on the impossibility of carrying out a specific proof transformation, that of taking two derivations, applying the elimination rule, and obtaining a new
derivation; the second counterexample rests only on a non-derivability, or unprovability, result. Why does this matter? Because proof-theoretic semantics takes proofs as the principal object of study, and not just as mere tools for establishing derivability results. As I said, proof-theoretic semantics is interested in proofs, and not merely in provability. In this sense, it can be claimed that J&R-stability fails to be genuinely proof-theoretic, as it does not always work at the level of proofs: sometimes it works instead at the level of provability. Now, more generally, if you want, we can see the problem of J&R-stability from another angle. Certainly it is based on an operation; the problem is that this operation of proof transformation does not preserve the derivability context. Why? Because it changes the set of assumptions of the derivations involved in the transformation. The idea is that the operation they consider takes these two derivations as input and should give this derivation as output. Now, in passing from (1) to (2), from this configuration to that one, we have the effect that the derivability context changes: in the first case C is derived under the assumption A, and possibly Γ, and under the assumption B, and possibly Γ; but in the second derivation C is derived under the assumption A ⊔ B, and possibly Γ. So it changes the set of my assumptions. Thus, in order to block the transformation, it is sufficient to show that the derivability relation, the sequent if you want, considered in (1) holds, while the derivability relation considered in (2) does not. But in that case it is a question of derivability, of provability, not of proofs. Now let me move to Tranchini's account, and let's see the difference. Tranchini analyzes the stability condition in terms of what he calls generalized expansion; again, if I have time, I will explain why he calls it this way, but here, for short,
we will call it T-stability. When T-stability is applied to the case of quantum disjunction, it takes the following form, and you will see that it is very similar to the previous one, but with two big differences. It assumes, as before, that the formula C is derivable from the premises of the introduction rules of quantum disjunction, that is, from A and from B respectively, as we had before. But it takes another condition as an assumption: Tranchini assumes that A ⊔ B, plus possibly some context Γ, allows us to derive C. So what he did is to move what was before a condition to be satisfied into an assumption: we assume from the start that we have this derivation of C from A ⊔ B. Now, under these two assumptions, what is required for this connective to be stable is to satisfy condition (3), namely that this derivability result should, or better must, be established by applying the following configuration. And here is the second difference: it is not exactly the same configuration as before, because now we ask not only to apply the elimination rule to the derivation that we have, but also to derive A ⊔ B from A and from B. Ok. Now that we have this, we can see the fundamental difference between J&R-stability and T-stability. As we already mentioned, it concerns the place assigned to point (2): in J&R-stability, (2) is part of what one is required to have in order to satisfy the stability condition, while in T-stability it is part of the assumptions which allow one to formulate the stability condition. And this is a crucial difference: since we explicitly assume that C is derivable from A ⊔ B, one can now focus on how C is derived from A ⊔ B. In this way, the stability condition is made to work at the level of proof structure, and not just at the level of mere provability. In particular, what we have is that, from the point of view of T-stability, the second counterexample that we considered is no longer a counterexample
for T-stability. So: we showed two counterexamples before, and for T-stability only the first one works; the second one is not a counterexample, so T-stability is more restrictive. On the other hand, as I was saying, the first counterexample still works; but let's see how it works. Again we take Γ to be E and C to be E ∧ (A ⊔ B). Then we have, first, that from E and A we derive E ∧ (A ⊔ B); we do the same thing if we replace A with B, and also if we put A ⊔ B in place of A. In fact, we have the three following derivations, which are perfectly correct: the one that establishes that E ∧ (A ⊔ B) is derivable from E and A ⊔ B, and the two in which we just add, on top of this premise, an application of an introduction rule, the first one or the second one, and we get the derivations from A and from B. But still, condition (3) cannot be respected. Why? Because now, if we take these derivations, we plug them in as the two minor premises, for A and for B, and then we apply the quantum disjunction elimination; and this is not a correct derivation, because we have the extra assumption E here, and so the quantum disjunction elimination is not correct: this E should not appear. So, in fact, what is at play here is that E ∧ (A ⊔ B) is derivable from E and A ⊔ B in a certain way and not in another: it is a matter of how we derive things, and not merely of what is derivable from what. Now, the operation of proof transformation which is involved in T-stability takes the following input: it takes as input a derivation of C from A ⊔ B, and not from A and from B, and it gives this as output. This proof transformation now preserves the derivability context, because, you see, here I start from A ⊔ B plus possibly some extra assumptions Γ, and in the output configuration they are exactly the same. So this is a proof transformation
that preserves the derivability context, and this makes it analogous to the operation of detour reduction, which is also an operation that preserves the derivability context. In fact, when you have a detour, you take a certain proof D of C from a certain set of assumptions Γ, and the reduction brings you to another derivation D* of C from a set of assumptions Γ* which is a subset of Γ: most of the time it is Γ itself, but sometimes it is a proper subset. This is not a problem, since the derivability relation is monotonic; this means that no real assumption is lost, as whatever is dropped can be recovered by monotonicity. This is completely different from what happened before with Jacinto and Read, because in that case the assumptions genuinely change: it is not that one set is a subset of the other. Now, the two accounts of stability that we presented can be differentiated in the case of quantum disjunction only if the latter is considered in connection with other connectives, namely conjunction: this is how we constructed the two counterexamples. One thing that we argue, because, as you have seen, sorry, I say it only now, this work is joint work with two colleagues and friends, Noriko Vokawa and Mattia Petrolo, is that if quantum disjunction is taken alone, we can prove that the two approaches behave essentially in the same way. Why is that? Because when quantum disjunction is considered alone, so without other connectives, without other rules, it can be shown that it satisfies both J&R-stability and T-stability, and the reason is given by the following result. I will not prove it, but if you want we can discuss afterwards how to prove it. Let's take a language which contains only quantum disjunction, and the system Q of rules constituted just by the rules of quantum disjunction. We can prove that if C is derivable in Q from some assumption, then this assumption is unique; unique not only in the sense that there is just one shape of assumption: the assumption is
really unique, in the sense that I have just one occurrence of this assumption. Now let's show why quantum disjunction satisfies both stability conditions. Suppose that point (1) of J&R-stability is satisfied, so that we have two derivations of this kind. The previous result guarantees that there is no side assumption, neither in D1 nor in D2. We can thus apply the quantum disjunction elimination in a perfectly sound way, and so points (2) and (3) are satisfied. What we just said is also already sufficient to satisfy conditions (1) and (2) of T-stability, because if I have this result I immediately have conditions (1) and (2) of T-stability; and since in the derivation of configuration (2), this one, the only assumption is of the form A ⊔ B, it is possible to transform it into two new derivations by applying the quantum disjunction introduction rules, the first one and the second one, and then to respect also condition (3) of T-stability. What we do is just plug in, on top, an introduction for the disjunction, so that we have these two new derivations, and then apply again the quantum disjunction elimination; and this is an instance of the schema that should be satisfied in point (3) of T-stability. Ok, you see, here I have a detour; ok, but, I mean, the detour is a question of harmony, not of stability. Ok, so T-stability is satisfied. Now, do I have a little bit more time?
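The stability argument just sketched for the pure quantum disjunction system can be displayed schematically; the symbol ⊔ (\sqcup) for quantum disjunction is my notational choice, not fixed by the talk:

```latex
% Assumption (1) of J&R-stability, in the pure system Q:
%
%    [A]        [B]
%    D_1        D_2        by the uniqueness lemma, A is the only open
%     C          C         assumption of D_1, and B the only one of D_2
%
% Hence the elimination applies correctly (no side assumptions in the
% subderivations of the minor premises, so the proviso is respected):
%
%                 [A]   [B]
%                 D_1   D_2
%   A \sqcup B     C     C
%  ----------------------------- \sqcup E
%                C
```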
I think you can use a quarter of an hour. Ok, good. So I will try to explain why Jacinto and Read call their property generalized local completeness and why Tranchini speaks of generalized expansion, and you will see that this is an interesting point, because it will allow us to make again a point about condition (3) of Tranchini's approach. As we said at the beginning, the conditions of harmony and stability are meant to guarantee a perfect balance between the introduction and elimination rules for a given logical connective. In particular, the idea is that if the introduction rules are chosen as primitive, as we said, like it is the case for verificationism, then by satisfying the harmony and stability conditions we should be guaranteed to find elimination rules that are neither stronger nor weaker than the corresponding introductions. Now, a very minimal requirement, much weaker than stability but in the same spirit of capturing such a balance, is the following: if C is an n-ary connective, so that we have a complex formula with C as its main connective, then a minimal requirement is to ask that this complex sentence should be derivable from itself by using exclusively the rules for C. This is a minimal requirement because it means that, if we have been able to introduce it, well, we should also be able, starting from it, to reintroduce it. This property is known in the literature as local completeness. Although it seems to be a very minimal requirement, notice that local completeness is not a trivial condition, and it is not satisfied by all harmonious connectives. Let me give an example: the connective that is called simple implication, a sort of special kind of implication. It is discussed by Dummett, but it was already introduced in the 80s. It is a connective that has, well, as introduction rule the standard introduction rule for implication
No, sorry, sorry: not as introduction rule. As elimination rule it has the standard one; as introduction rule it has a restricted one, because it requires that there be no side assumptions: the derivation of the premise may depend on no assumption other than A itself. So you see, we are making a move similar to the one before: before, we imposed a restriction on the side assumptions of the minor premises of the elimination; here it is imposed on the premise of the introduction. The premise should depend only on A and on nothing else, and if that is the case, then you can introduce the implication. It can be proved that these rules are harmonious; in fact, you can even prove something more, namely that you have cut elimination, and cut elimination is a little bit more than pure harmony. In any case, however, they do not satisfy local completeness. Why? For the following reason: the only rules available for proving local completeness for this simple implication are the rules of this implication itself. We are forced to start with the assumption, and we have to derive A implies B from A implies B as the main principle. So, if you have to start from this, you should start from the elimination, right? Start from the elimination. Now, how can I proceed?
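The rules just described can be sketched as follows; this is my own notation, not the speaker's slides. The introduction is restricted so that the derivation of B may use no assumption other than A, while the elimination is ordinary modus ponens.

```latex
% Simple implication (sketch; requires amsmath).
% The introduction rule demands that B be derived from A alone,
% with no side assumptions; the elimination is standard modus ponens.
\[
\frac{\begin{array}{c}[A]\\ \vdots\\ B\end{array}}
     {A \rightarrow B}\;{\rightarrow}\mathrm{I}\ (\text{no side assumptions})
\qquad
\frac{A \rightarrow B \qquad A}{B}\;{\rightarrow}\mathrm{E}
\]
% Local completeness fails: to re-derive A -> B from A -> B one must start
% with the elimination, obtaining B from the open assumptions A -> B and A;
% but then the introduction cannot be applied, since the assumption A -> B
% is still open as a side assumption.
```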
I cannot apply the implication introduction, because I have a context here, so I am stuck. So this connective is not locally complete, but it is harmonious. Notice, on the other hand, that local completeness is satisfied by quantum disjunction, so quantum disjunction is a little bit better than this connective, because you can derive A or B from A or B. Now, Jacinto and Read agree that their account of stability is a generalization of the local completeness condition, and this is why they speak of generalized local completeness: they require that the eliminations for a certain connective C should allow one to obtain from C the same consequences that can be obtained from the premises of the introductions, without necessarily restricting the conclusion to be of the form C(A1, ..., An). So, what I am saying is that basically this derivation here, the local completeness one, is just a particular case of the schema that they require to be respected. In fact, you have just to instantiate the conclusion as the formula itself and then use an introduction; these instances of the elimination are then perfectly correct, so that it works. As I said, it is just a particular instance of the proof configuration that they point to. However, no explicit reference to this particular configuration is made by Jacinto and Read: it is a particular instance of their schema, but they do not care too much about this particular case. On the contrary, point 3 of Tranquilli's definition of stability can be explained by making explicit reference to this condition, to local completeness. Why? OK, first consider condition 2, which is one of the two assumptions: it asks us to have a derivation of C from A quantum-or B. Now, imagine that we take this and we plug on top of it the derivation that we have seen before. This is usually called an expansion of the tree, because you have the tree and then you expand it at its hypothesis, so that you re-derive A quantum-or B from A quantum-or B. But now, from this configuration, you can obtain the configuration asked for at point 3 of the definition of stability. How?
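For comparison, the local-completeness derivation for disjunction, re-deriving A ∨ B from itself using only the ∨-rules, looks like this; a standard sketch, not taken from the slides.

```latex
% Local completeness (expansion) for disjunction (requires amsmath):
% A v B is re-derived from itself using exclusively the v-rules.
\[
\frac{A \vee B
      \qquad
      \dfrac{[A]}{A \vee B}\;{\vee}\mathrm{I}_1
      \qquad
      \dfrac{[B]}{A \vee B}\;{\vee}\mathrm{I}_2}
     {A \vee B}\;{\vee}\mathrm{E}
\]
```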
By permuting this derivation upwards onto the two minor premises of the elimination, because this is the proof configuration that Tranquilli asks for. But you see now that the configuration asked for by Tranquilli is something that can be explained in terms of two operations: local expansion, and permutation of the derivation onto the minor premises of the elimination. Notice that clearly, in the case of quantum disjunction, this permutation cannot always be done, because if I have a context gamma, then it does not work. Following Tranquilli's account, we can thus analyse this kind of stability as the composition of two operations of proof transformation: one is an operation which expands the hypothesis, and the other is an operation of proof permutation. And this corroborates, for me, the idea that Tranquilli's account captures the proof-theoretic character of the stability condition in a more genuine way than the Jacinto and Read account. Why? Because in a certain sense I can have a finer analysis of this operation in terms of proof-transformation operations, and it also explains better why it is, in a certain sense, a generalized local completeness, or a generalized expansion. Moreover, decomposing stability into two operations allows us to deepen our understanding of this condition. So, if we want to show that the rules of a certain connective are not stable, we have two ways of doing it: either we show that the rules of this connective do not satisfy local completeness, and thus do not allow us to operate the proof expansion, as is the case for the simple implication; or we show that, even if proof expansion is possible, it is the proof permutation that cannot be executed. So, in a certain sense, I can say that both the simple implication and quantum disjunction are not stable, but for different reasons, and I can explain why they are not stable for different reasons: one because it cannot be expanded, the other because it
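If I understand the decomposition correctly, it can be pictured like this; my reconstruction, with D the given derivation of C from A ∨ B.

```latex
% Stability as expansion + permutation (sketch; requires amsmath).
% Step 1: expand the hypothesis A v B (local completeness).
% Step 2: permute the derivation D of C upwards onto the minor premises.
\[
\begin{array}{c}A \vee B\\ \vdots\ \mathcal{D}\\ C\end{array}
\;\rightsquigarrow\;
\frac{A \vee B
      \qquad
      \begin{array}{c}\dfrac{[A]}{A \vee B}\;{\vee}\mathrm{I}_1\\ \vdots\ \mathcal{D}\\ C\end{array}
      \qquad
      \begin{array}{c}\dfrac{[B]}{A \vee B}\;{\vee}\mathrm{I}_2\\ \vdots\ \mathcal{D}\\ C\end{array}}
     {C}\;{\vee}\mathrm{E}
\]
% For quantum disjunction the permutation step fails whenever D depends on
% a non-empty context, since the quantum rules forbid side assumptions.
```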
does not allow for a permutation. Notice also that, as it is formulated, local expansion involves only the rules for one connective at a time, and it is for this reason that the adjective "local" is used to characterize it. This idea that expansion is a local property is something that I developed some years ago with my friend Mattia Petrolo, in a paper on a rather different topic; we called the property "deducibility of identicals", but it is basically the same. On the other hand, the operation of proof permutation involves a derivation D which has to be permuted onto the minor premises of the elimination, and this means two things. The first one is that, in order to apply such an operation, the elimination rule must be in the generalized format. OK, we can develop this later in the discussion if you want, but you can do the permutation only if you have minor premises onto which you can make this permutation, and so the elimination rule should be in the generalized format. Now, Tranquilli has his own way of conceiving what a general elimination is; another one is the account taken from Schroeder-Heister, and the two accounts are not exactly the same; we can discuss this later if you want. The other point is that the derivation D can involve several rules, because it is a derivation, so you can have many things inside, which can be different from the rules of the connective under analysis, and the permutation then consists in a global action operating on all the rules used in D. You are no longer reasoning only on the rules of a particular connective; you can have many other rules of different connectives, as was the case with conjunction. As we have seen, the problem with the rules of quantum disjunction arises in D when the latter are considered together with the rules of another connective, conjunction. Why?
Because taken alone they are safe. So what I am trying to point out is that local completeness is really a local property: it concerns one connective at a time. The other property, the property of permutation, is something more complicated. Now, just to tell you where we are trying to go with my colleague: this is the analysis that we made in natural deduction, but in fact we can go a step further and try to carry out this analysis in sequent calculus. Why? I will explain in just one slide, and then I stop. If we pass to sequent calculus, the operation of permutation can be treated in a more local way, because we can use a special rule for speaking of it, namely the rule of cut, which is a way of composing two proofs; then I can study how I reduce the cut, and the reduction of the cut can involve what corresponds, in natural deduction, to permutation. More precisely, the possibility of operating a proof permutation corresponds to the possibility of reducing a specific type of cut. In this way it is possible to have a homogeneous treatment of the formal operations involved in the cut-reduction process. Harmony would correspond to the operation of reducing a principal cut, that is, a cut in which the two cut formulas are both principal, both coming from a logical rule: for instance, a cut in which one formula comes from a C-right rule and the other from a C-left rule; eliminating this cut is proving harmony. Stability, instead, would correspond to the operation of reducing a particular kind of non-principal cut, namely a cut in which one of the two cut formulas is non-principal, so it does not come from a rule for a connective but is part of the context, and it is the conclusion of an identity derivation. What is an identity derivation? It is a derivation of a sequent of the form A entails A. For instance, a particular identity derivation is this one here: right rule, right rule, and then left rule. So the idea is that I cut here, and the
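The identity derivation mentioned here ("right rule, right rule, and then left rule") would, for disjunction, look as follows; standard sequent-calculus notation, my reconstruction.

```latex
% An identity derivation of A v B |- A v B (requires amsmath).
% Stability corresponds to permuting a cut against this conclusion
% upwards past the final left rule.
\[
\frac{\dfrac{A \vdash A}{A \vdash A \vee B}\;{\vee}\mathrm{R}_1
      \qquad
      \dfrac{B \vdash B}{B \vdash A \vee B}\;{\vee}\mathrm{R}_2}
     {A \vee B \vdash A \vee B}\;{\vee}\mathrm{L}
\]
```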
idea here is that, in order to eliminate this cut, we push it upwards, and this is the permutation. Moreover, sequent calculus would also allow us to define another property, which we call the dual of stability, because instead of cutting on this side we cut on the other side. So, going through sequent calculus, we can be even more refined in our analysis of the property of stability. In fact, you can also analyse the dual of stability in natural deduction, but that is a more complicated affair, because you need a dual inversion principle; we can discuss this later if you want. So that's basically my presentation. To wrap up: the aim was to show that this property of stability is really complicated; you have to look at the details, and according to the way in which you analyse it, you can give rise to two different accounts of stability, very similar but in fact not exactly the same. One of them, at least for me, the one by Tranquilli, is more proof-theoretical in spirit than the other, since it really analyses stability into operations of proof transformation; and if you want to look in detail at what these proof transformations are, maybe it is also worthwhile to go over to the side of sequent calculus. So, we have plenty of time for questions; maybe a short break first. If you are following this presentation online, please ask a question if you want to. [Break.] We can resume now; we can turn to the discussion. Who wants to start? [Questioner:] OK, well, I have a question, a very, very basic question about the two accounts. You've convinced me that they are obviously different accounts, that they have different merits, and that there is one key example on which they diverge. But you could also say that the divergence is just about one reason; I mean, there is just one case, one particular connective, the one that is handled differently by Tranquilli's
account, but both accounts give the same verdict about quantum disjunction. What you've shown is that they differ at least intensionally, in the sense that they give a different reason for why quantum disjunction is unstable, and they have different explanatory properties. I agree with you that the Tranquilli account is more fine-grained in some sense, you go more into the structure of proofs than the other one, and I didn't completely grasp why it was not symmetric, but at least it gives something; that is perhaps for another question. The first question I wanted to ask is: do you have an example of a connective that would be classified as unstable according to one account and stable according to the other? So far you have a result saying that in the special case of quantum disjunction they agree; so my question is whether perhaps it can be proved that they are extensionally equivalent in general, or maybe not. A good way to show that they are not extensionally equivalent would be to give an example of a connective that is stable on one account and not on the other. Should you expect that they are extensionally equivalent or not? [Answer:] Yes, and the problem is a problem of how they show that something is not stable. My guess, although I didn't try to prove in general that they agree on saying "this connective is not stable", is that they do. I only studied this case; I don't have any other case. But what I showed is exactly this: the Jacinto and Read schema is a little bit broader, so you can generate more counterexamples with it than with Tranquilli's. Still, my guess is that they are equivalent. In fact, there is also something at the beginning in which they can maybe be differentiated, namely the kind of general elimination that they use; on this I don't remember exactly, but I think that in the case of Jacinto and Read the general elimination for conjunction is different. But it does not create any problem: it just bears on how you write down a proof, and in the end you do the same thing. So again,
it is a matter of style, but not of the essence of the result. [New question:] I have a question about the formulation of stability: how does it work, and what are the consequences, for multiplicative rules in substructural logics? [Answer:] OK, good. In fact, the difference in the style of the connectives you see when you go to sequent calculus. Let's say that in natural deduction, distinguishing the styles of the connectives is rather difficult, because you don't have the structural rules explicitly: what does it really mean to have a structural rule in natural deduction, rather than explicitly using weakening or something? These are properties of the derivability relation. So let us go to sequent calculus; then I can say something. Give me just one second, let us look at one thing; afterwards I can show you this also in the case of natural deduction, but there it is a little bit more complicated, so let me start from sequent calculus. I believe that we can translate what I said about stability to the case of a connective that is stable: standard disjunction. OK, we take standard disjunction. Standard: what does it mean? When we have it in natural deduction, it is the same thing that we said before. So I start from two identity sequents, and then I apply the same rules that I used before: I use this right rule here, and the other right rule, and then I make my elimination, sorry, the left rule, which in natural deduction corresponds to the elimination. What am I doing? I am making an expansion of an identity: gamma, A or B, entails C; this is like the configuration from which Tranquilli starts. And then, if you compose them, if you take a cut between them, you are kind of expanding this identity with something else. OK, anyway: now, here, I argue that what stability is, is the fact that I can permute this cut upwards with respect to the left rule, in natural deduction the elimination. And this I can do because here I have the standard disjunction, so I don't care about the fact
that this context is empty or not; I can always do it. But wait a second: here I am not using any structural rule, so in fact this is not just any disjunction whatsoever; this is additive disjunction, plus, if I use the linear logic style of writing. Good. Now, what about multiplicative disjunction? Look at how you write down the identity: you wouldn't write it down in this way. The first thing I do is that I immediately write it in the linear logic style: par left, par right. Now, here, if I cut on this side, then I cannot do a permutation, because I cannot move up with respect to this rule; but I can do it the other way around: I cut on the other side, and now, yes, I can move the cut up with respect to the right rule. What happens is that the left rule pairs with the cut on the right and the right rule with the cut on the left, and this, for me, is what is called the dual of stability. The dual of stability is in fact due to the fact that I have different kinds of expansion: according to the order of the rules that I use, they select the connective for me, and then I can permute on one side or on the other depending on this order. So the stable connectives would be plus and tensor, and the dually stable ones would be par and with. In a certain sense, what I am claiming is that, if you look at it in the proper way through sequent calculus, you can even justify, in a certain sense, why in linear logic a separation of the styles of the connectives arises. In fact, you can do the same in natural deduction, but in natural deduction you have to change the way of looking at the rules; I don't have it in mind exactly like that, but in fact, in natural deduction, you can do the same by applying what Sara Negri did in a paper called "Varieties of linear calculi": she introduces what she calls a dual of the inversion principle, and she formulates it
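A sketch of the two identity expansions in linear-logic notation; my reconstruction, with \parr taken from the cmll package. For the additive ⊕ the last rule is the left rule, so a cut against the left-hand occurrence permutes upwards (stability); for the multiplicative ⅋ the last rule is the right rule, so only a cut against the right-hand occurrence permutes (the dual of stability).

```latex
% Identity expansions for additive and multiplicative disjunction.
% Requires amsmath and cmll (for \parr).
\[
\frac{\dfrac{A \vdash A}{A \vdash A \oplus B}\;{\oplus}\mathrm{R}_1
      \qquad
      \dfrac{B \vdash B}{B \vdash A \oplus B}\;{\oplus}\mathrm{R}_2}
     {A \oplus B \vdash A \oplus B}\;{\oplus}\mathrm{L}
\qquad
\frac{\dfrac{A \vdash A \qquad B \vdash B}
            {A \parr B \vdash A, B}\;{\parr}\mathrm{L}}
     {A \parr B \vdash A \parr B}\;{\parr}\mathrm{R}
\]
```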
as follows. Give me one second... yes: whatever follows from a formula must follow from the sufficient grounds for deriving the formula. To say it in another way: if you take the plus, you get this; then the introduction for the dual would be something very, very strange, this introduction here. You introduce, but in fact, if you go and look, you keep the same elimination and you can do the same, and then you form rules of this format for the par. But for the par you must also go multiple-conclusion, and that is one thing that usually I don't like too much, multiple conclusions in natural deduction; but you can do it, and then you show that stability is permuting upwards with respect to the minor premises of the elimination, and the dual of stability is permuting upwards with respect to the major premise of the introduction. So, in a certain sense, you can recast what you have in sequent calculus, but you have to change the shape of natural deduction, and I think that this is a very proper way of dealing with the style of the connectives in natural deduction; but you have this strange rule of introduction. And for me there is only one problem. Sorry, if you give me one second I will show you what the natural deduction introduction rule for the par looks like; she would have something like this... yes. The problem is this: you would have A, B, gamma, and something like that, which means that you have multiple conclusions. Now, my problem with multiple conclusions is that, strictly speaking, from a proof-theoretic point of view, if you accept the idea that the formulas appearing in a rule are formulas that are asserted, then it is rather difficult to say what an assertion involving more than one formula at a time would be. I mean, when you assert something, you assert one thing; what would it mean to assert a set of sentences? Either you use disjunction, or you face the standard problem. While
in sequent calculus you can go for a different justification: you say, well, maybe I abandon the idea that I am also using the rules for explaining meaning; I am just using the rules for defining the constants. Then maybe I put the question aside a little bit. But that's another story, and a more complicated one: the status of sequent calculus. [Questioner:] Actually, I just want to follow up on this, unless someone has a question; mine is very broad. Could you say more about how you think of the status of multiple conclusions in sequent calculus? [Answer:] For me personally, my way of understanding sequent calculus is that sequent calculus is a calculus for studying the relation of derivability, so it is a sort of formal setting. In that case, I am not sure that I can extend the whole proof-theoretic semantics project to sequent calculus; I think it is a hard thing, because in natural deduction it is clear what interpretation you give to a derivation, or at least you can give an interpretation in terms of assertion, which is the interpretation that I would give, because it is linked to use: when you write down something like this, it says that from the assertion of certain sentences you pass to the assertion of another. So it is not just symbols that I play with: the formulas are interpreted things, and I can perform a linguistic act with them. In sequent calculus this reading is much more complicated; there is a lot of literature on it. So if I go over to the sequent calculus side, then maybe I have to change the project a bit: it is no longer exactly proof-theoretic semantics, that is, defining logical constants using proofs rather than truth or denotation or whatever, as a semantic enterprise; it becomes something more formalistic, defining connectives in a proof-theoretic setting, and that's all, which is a kind of different project; maybe the notion of meaning is put aside. I mean, we
can talk about this later, but there are people who think that sequent calculus is actually a good framework for proof-theoretic semantics, like Paoli; so do you have to disagree with Paoli on this? [Answer:] Just to clarify the position, just to understand: my feeling is that when I go over to the side of sequent calculus, but this is maybe my personal way of seeing things, I go over to a different project, and maybe I again find something that goes on the side of a theory of meaning, but with a change: the meaning would not be given by an assertoric reading; it would be given by a computational reading. The meaning is given by the computational operations, and a computational operation is not necessarily assertoric or linguistic. That's how I would really put these things: it is another project. That's completely correct, and that's why here, as I tried to say in the slide, sequent calculus could maybe give an even more genuine reading of all these operations, because they are linked to cut elimination, to cut reduction; but then you should give an interpretation of cut reduction, and that would amount to giving a computational reading of cut reduction, which means execution of the program; and then I would have another project, the definition of a theory of meaning based on computational operations. [New questioner:] Thanks a lot for a great talk. I especially liked the last slide, the suggestion for the homogeneous treatment. I personally haven't worked so much on stability; harmony is a more natural phenomenon, or criterion, I guess, to me at least, and I fail, in fact, to see the intuition behind stability in general, and why it would be so harmful for a theory of meaning if we have unstable connectives. Is there some point where things can go horribly wrong caused by instability only, and not by disharmony? [Answer:] You have the example that speaks to that: if you take, for instance, standard
conjunction and quantum disjunction. If you have standard conjunction together with quantum disjunction, you can prove something in the old language, so with conjunction and quantum disjunction, that was not provable at the beginning; so you have non-conservativeness results, basically. And why? For a reason such that maybe blocking it with stability is asking too much, because you could block it otherwise: the problem is given by the fact that in natural deduction you don't have what is called a permutative reduction. Sometimes you have an introduction and an elimination of the same connective that are separated by an elimination rule of another connective; if you permute, like the permutation that I showed, you can recreate the detour and eliminate it; but if you don't have the permutation, you can have some sort of hidden detour. So the fact is that, in a certain sense, you can see stability in two ways. Let me put it like that. I can ask just for harmony; but then, since I ask for harmony, usually in a system I don't just ask to reduce the local detours, I want full normalization; and in order to have full normalization, you usually also ask to have these permutative reductions, in order to make explicit the detours that are hidden by an elimination. So you can say: OK, I go from harmony to normalizability, and normalizability must go together with permutative reduction. Or you can go another way: you stay just with harmony and generalize it. Or you say: no, I have harmony, and then I ask for stability; stability is this expansion plus permutation, and if I have permutation, then I can also eliminate all the hidden detours that I should. So there are two ways of looking at it. That's a very good question for me, because in a certain sense you could just say: I don't want stability, I just generalize my notion of harmony and push it through to normalization, full normalization. So yes, in a certain sense, yes; and that's why I tried to show you that you can analyse stability into two different things:
expansion plus permutation. Permutation is the complicated part, and permutation makes a lot of things appear or disappear, like producing new cuts; so it is, in a certain sense, more global than one thinks at the beginning. But yes, in fact, let me say it this way: to be more precise, you could ask for harmony, or better, normalizability, and normalizability in the full sense, that is, with permutative reductions, even for systems with general elimination rules that are, in the end, not completely stable. I give you an example. Take the standard introduction rule for implication, so here no restriction on the context, and as elimination rule a general elimination rule, in the version which is usually called the flattened general elimination. Here I could have some derivation, for instance, and push it here, so I can prove what is usually called permutative reduction in a normalization process. But this rule is not stable in the sense we are discussing. Why?
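The rules in question, as I reconstruct them: the standard introduction with unrestricted context, together with the flattened general elimination for implication.

```latex
% Implication with flattened general elimination (sketch; requires amsmath).
\[
\frac{\begin{array}{c}[A]\\ \vdots\\ B\end{array}}
     {A \rightarrow B}\;{\rightarrow}\mathrm{I}
\qquad
\frac{A \rightarrow B
      \qquad A
      \qquad \begin{array}{c}[B]\\ \vdots\\ C\end{array}}
     {C}\;{\rightarrow}\mathrm{E}_{\mathit{gen}}
\]
% Permutative reductions go through for these rules, but the expansion
% produced in the stability test cannot be permuted upwards: the
% subderivation that discharges A blocks the move.
```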
Because imagine that we make an expansion first, and this we can do without any problem; I try to use the same notation. This is an expansion of the proof: I start from this proof and I expand it into an elimination followed by an introduction. Can I push this derivation up? The answer is no. Because where does this formula come from, this one here? Ah, maybe from an introduction without any hypothesis: I discharge this here, and, see, I remain stuck there. So these rules allow even for full normalization, but they do not allow for this kind of operation. And they have full normalization because, in this case, full normalization means that the major premises of the eliminations stand as assumptions: a normal derivation in general-elimination style is a derivation in which the only undischarged hypotheses are major premises of eliminations. So that would be normal, but it would not give this property. So, in a certain sense, stability is even stronger than asking for full normalization, if we understand full normalization as also including permutative reduction. [Questioner:] But we don't want strong criteria; we want criteria as weak as possible. [Answer:] In this case, I think, Dummett and the others want the strongest one, because what they would like to have is that your introduction rules are given, and then the elimination rules should allow you to derive no more and no less than what you can derive from the premises of the introduction rules. This is very strong. And why is it so strong? Because they would like, you see, it is no more and no less: it is like really having a definition of the connective; the connective is exactly captured by its rules. [Questioner:] I see the intuition, but I have the intuition myself that one should get a bit loose from that definition: one could be an inferentialist and say, well, there is something being done with some rules; it is
apparently coherent, it is conservative, let's say; so why not just think that there is maybe some meaning in there, even if it is not a definition? [Answer:] I see, but I think that this is exactly the point; this is my personal understanding of why it is an interesting problem. They want to ask this in the case of logical constants in order to say: logical constants are exactly those expressions for which I can give a definition in terms of inference. Maybe there are other expressions for which I cannot ask this; for those I should ask less, harmony, normalization, and I am fine. That is what happens usually with mathematical systems, where I can ask a mathematical system to be good by using cut elimination: I have reduced the cuts, or I have reduced the detours, but I cannot ask more in some particular cases, like arithmetic. If you write the introduction and elimination rules for the predicate "to be a natural number", and you write them in the style of Martin-Löf's inductive definitions, you can ask whether you get something that is both harmonious and also stable; but stability could then create some problems, being maybe too strong in some cases. So then it depends on the mathematical theory: what you ask is just that the detours be reducible. [Questioner:] So this connective, the implication, is it a local property? [Answer:] Yes; and then I can say one thing, namely why it changes if you use Peter's general elimination. Because in Peter's general elimination it would be this rule that explains what this double arrow is: the double arrow is whatever rule allows me to pass from A to B. So here you discharge not only formulas but rules; in fact, from Peter's generalized point of view, formulas are trivial rules, in a certain sense, so everything is a rule in this case. And then you write your derivation in this way: I use my rule that allows me to pass from A to B, and then, right here, I discharge, sorry, I discharge here the rule; the rule discharges formulas. So
then comes the expansion, and then, if you put the derivation here, you can permute it upwards. Why? Because this starts here and then we discharge this, so that is perfectly fine. This also has a syntax: instead of representing things like that, one discharges a derivation; they would write something like this, discharging, no, sorry, this is OK: they discharge this whole thing. I prefer the other notation, by Peter, but it is the same; you can do exactly the same. So that is the difference: you see that stability also makes a difference with respect to each style of general elimination rule. [Moderator:] Are there any other questions? There can be questions about the general project, about the big picture, if you don't want to dwell too much on the technicalities; this is also an opportunity to discuss these things. [Questioner:] OK, sorry, a very naive one, about the beginning. You were saying that to know the meaning of a compound sentence is to have a notion of canonical proof, and I was surprised about that: why is it important for the meaning to go for the canonical? [Answer:] This, I think, is very much linked to the constructivist standpoint, where you think that things should be explained through the constructions that you make. Now, let me take an example which is not from logic but from geometry. In geometry, too, when you introduce your figures, you have a canonical way of constructing things. For instance, in Euclid I.1 you have a canonical way of constructing an equilateral triangle; even if you can construct it in other ways, you should be able, as a constructivist, to exhibit this standard way of making the construction, because it is the construction that explains what this object is. A canonical equilateral triangle is the thing that I construct in that way, and all the others I explain by saying: I could have constructed them in that way,
Here it is basically the same thing. I want to explain what it means to be a conjunction. Sure, I can use conjunctive sentences in many different ways during a discussion, but I should then be able to show that I could have introduced them in the canonical way, because that is what explains what conjunction really is and what differentiates it, in a certain sense, from another connective. Let me try an example; I don't know if it is the best one. Imagine you have an implication with an elimination, and at a certain point you can let B and C enter into play; but you could let B or C enter into play in just the same way, and you want to distinguish these two cases, because they are not the same. So I must find a way to say: if I introduce it in this way, then I should be able to introduce it in the canonical way, which means passing through the conjunction introduction, and that is what would distinguish the one from the other, because in that case I would introduce it in one single step. It is, in a certain sense, a way of saying how I can really claim that this is conjunction and not something else.

Still, I understand that you want to have this power of distinguishing, but how should I then understand an alternative, non-canonical proof for a certain connective, in a meaning, in a semantics?

I would say that a non-canonical proof is enough to show that something is true, but it does not yet explain why it is like that; and the "why" comes from normalization, from the fact that I can canonize it. I would put it that way, using your "why", which carries plenty of implications. So yes, sometimes a non-canonical proof is, I would say, a sort of incomplete
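The way canonical introductions separate conjunction from disjunction can be sketched as follows (a BHK-flavoured toy of my own; the class names are assumptions, not the talk's notation). A canonical proof of A ∧ B must supply both components in one step, while a canonical proof of A ∨ B supplies a single component plus a tag saying which disjunct it proves:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Conj:
    """Canonical proof of A ∧ B: both components, given in one step."""
    left: Any
    right: Any

@dataclass
class InL:
    """Canonical proof of A ∨ B via the left introduction rule."""
    proof: Any

@dataclass
class InR:
    """Canonical proof of A ∨ B via the right introduction rule."""
    proof: Any

def canonical_for_conjunction(p: Any) -> bool:
    # A conjunction is introduced canonically only as a Conj node.
    return isinstance(p, Conj)

def canonical_for_disjunction(p: Any) -> bool:
    # A disjunction is introduced from one component, plus a tag.
    return isinstance(p, (InL, InR))
```

The shapes of the canonical proofs are what tell the two connectives apart, which is the point being made about introducing the connective "in one moment".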
thing: it tells me something, but I expect it to be canonizable in order to fully justify my assertion. For instance, if you ask me to show that B ∧ C is true, I give you a proof of B ∧ C, whatever it is; but in order to show that it is really a proof, and that it is correct, I expect to be able to normalize it. So normalization is a way of showing that what I gave you is not only sufficient for the truth, but really gives you the why of it.

So I would say it like this: a proof that is not canonical can suffice to show the truth of a certain sentence; normalizing it is being sure that the assertion is also meaningful, that I am not a false witness of its truth.

But in that sense you are moving away from the verificationist notion of truth that you seemed attached to, in a certain sense; because there is no privileged way to verify something: if you verify it, it is true; truth is verification, and here truth is proof, is doing something. So I am still puzzled about this "canonical".

OK, I can say it in another way, though I do like your "why". In a certain sense, you give me something and say it is a proof of this; I want to verify that it is really a proof, and to do that I have to analyze it, otherwise I cannot say that it is a proof of B ∧ C. It may well look like a term of B ∧ C, but if I cannot really transform it into that canonical form, I am not sure that it is really of the form B ∧ C. So if you build a triangle by other means, you are not sure it is a triangle of that type until you transform it. Yes, in a certain sense it is a way to check that it is really an object of the kind that I declare. And in terms of programs one sees this very well.
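The checking role just described can be put in code. A toy normalizer (my own sketch in a Curry-Howard style, with `Pair`, `Fst`, `Snd` as assumed proof-term constructors) reduces detours and lets one confirm that a closed proof claimed for B ∧ C really normalizes to a pair:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Pair:
    """Canonical proof of a conjunction: a pair of proof terms."""
    left: Any
    right: Any

@dataclass
class Fst:
    """Non-canonical step: first projection applied to a proof term."""
    arg: Any

@dataclass
class Snd:
    """Non-canonical step: second projection applied to a proof term."""
    arg: Any

def normalize(t: Any) -> Any:
    """Remove detours: Fst(Pair(a, b)) ⇒ a and Snd(Pair(a, b)) ⇒ b."""
    if isinstance(t, Pair):
        return Pair(normalize(t.left), normalize(t.right))
    if isinstance(t, Fst):
        inner = normalize(t.arg)
        return inner.left if isinstance(inner, Pair) else Fst(inner)
    if isinstance(t, Snd):
        inner = normalize(t.arg)
        return inner.right if isinstance(inner, Pair) else Snd(inner)
    return t  # an atomic witness is already normal

# A non-canonical proof claimed for B ∧ C: it projects out of a larger pair.
claimed = Fst(Pair(Pair("proof of B", "proof of C"), "proof of D"))
# Normalizing exposes the canonical pair, checking the claim.
assert isinstance(normalize(claimed), Pair)
```

If the claimed proof did not normalize to a `Pair`, we would have no assurance that it really is of the form B ∧ C, which is exactly the "false witness" worry above.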
You could say: we have a program with a certain specification, B ∧ C. How can I tell that the program does what it declares? I have to transform my program into something of this form, a pair ⟨p1, p2⟩ where p1 is of type B, and now I am sure that it does what it declares. So yes, let me put it yet another way: having a proof is something that guarantees you the truth; canonizing it is what guarantees the rest. Maybe I am taking this from Martin-Löf; in a certain sense it is a sort of transposition of what Martin-Löf would say, but it is something like that. So bringing a proof to canonical form is a way to know that it is true.

I have a question about the big picture; let me go back to the first slide, to clarify. We have verificationism and pragmatism, these two high-level views, and you said they are very connected, and that with stability we can show that they are not really competitors. I was trying to find very simple examples. If I want to introduce into my calculus a connective that is like conjunction but without one of the eliminations, say with the introduction from A and B but only the elimination giving A, that connective is clearly not stable. Then what about the initial discussion: if I want to have this connective, am I giving arguments to verificationism or to pragmatism? Or is it just that you start from the introduction, and then the pragmatist says the elimination is not correct, try again, this is not enough to define a connective just by means of inference rules? Then stability would be as strong as harmony, in a certain sense, in constraining the connectives. But you could have done the same by taking the point of view on which the eliminations come first: you start from the elimination and you ask whether the introduction that you try is too strong or less
strong, and then in that case you have to check again harmony and stability. So clearly, and this is also why I was speaking of these things at the end, one big problem here is that the discussion often lacks an account of what an introduction rule should be. If you start from introductions and then ask this question of stability, or the generalized one, you are saying something about the eliminations, but you have said very little about the introductions; you just end up saying it is something that may discharge some hypotheses and has the connective in the conclusion. But what happens when you have, for instance, rules for a modality, like the modality of provability logic; try to put them into natural deduction. Or take a modality for belief: again, you don't really have an introduction and an elimination. In that case what do you do, what are you starting from, what are you trying to justify? There are very, very few things telling us where to start, what an introduction is and what an elimination is. He says something about this in a chapter of his book, and there are some works telling us, sure, that you should have the principle of separation of the connectives; so there are properties, but you don't have a general format for what an introduction should be. Some have tried to give a formal format, but it works for some cases, and if you want to extend it to modalities that is very hard. So in general I think that here we have learned something about elimination rules, but something is still lacking about introductions.

I have a question out of curiosity. A couple of times you almost said the word "grounding", and the "why" of the proof also went in that direction, and I have the feeling that you want to say that the ground is not just proof, which is a
kind of semantics that we tried to reach. I see. It is helpful sometimes to use a notion like ground, or to say that something explains why something is true, but I try to use it as little as I can; I use it and then I set it aside, because otherwise I should explain in which sense it is being used, and maybe construct a theory. Clearly Prawitz is someone who is trying to do that nowadays, and he uses the notion of grounds in an explicit way, but it is rather complicated, and I am not in Prawitz's setting. Maybe even Martin-Löf has some of this terminology in a certain sense, but he uses it in a somewhat different way: for him it is more something that founds knowledge, or something like that. As I was saying, the effect of normalizing, of getting the normal form, is that not only is the sentence true, but I know that it is true; I have in a certain sense founded it as knowledge, I have something that tells me that I know this. But that road would be hard, because then we would need not only a theory of meaning but also a theory of knowledge. Martin-Löf has one, but it is rather complicated to explain how it works, so I prefer not to say too much.

The project we try to develop here puts the notion of grounds first, and then tries to develop a semantics and a proof theory afterwards: the full meaning should be in the theory of grounds, and then you can see the links with all of proof-theoretic semantics. The results are promising so far. So there you use the notion all the time: that is where you put the meaning.

I see; that's interesting, because on my side the thing that I do, not alone but with Mattia, Thomas Seiler and other colleagues, is not going for grounds. We made another decision: we stay at a sort of formal level; we want to speak of programs
and we do philosophy of computer science, in the sense that instead of having proofs we have something, how to say, a little more liberal than proof, namely the notion of program; because not all programs are proofs, but certain programs, those that are certified, are proofs. Starting from that, we try to reconstruct the notion of proof, the notion of logical constant, and all that. But we made a different choice: we set aside the theory of knowledge entirely, meaning that we just do a theory of computing, which is maybe easier for us. But yes, I see the similarity: to start from something a little broader and then try to recast the notion of truth, with some properties, out of the starting notion.

I like the program way of thinking about this; that sounds like a very interesting project too. I think we carry less of a burden than trying to capture a notion of knowledge, though maybe we have been too dismissive in a certain sense. That is where we come from: we come from linear logic, so it was natural to go through all the Curry-Howard material in that way.

I think we are actually on time, so let us thank our speaker. Thank you.
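The closing point, that the notion of program is more liberal than that of proof and that only certified programs count as proofs, admits a minimal sketch (my own illustration, with totality and shape-matching standing in for certification; the function names are assumptions):

```python
# Under a Curry-Howard reading, a proof of B → (C → (B ∧ C)) is a program
# taking a proof of B, then a proof of C, and returning a pair. This one
# is total and its shape matches its specification, so it counts as a
# certified program, i.e. a proof.
def conj_intro(b):
    return lambda c: (b, c)

# This program often behaves the same, but it is partial: on some inputs
# it fails. A partial program does not meet the specification, so on the
# view sketched in the talk it is a program without being a proof.
def partial_pairing(b):
    if b is None:
        raise ValueError("undefined input")
    return lambda c: (b, c)
```

Reconstructing proofhood from the broader class of programs then means singling out, among all programs, those that can be certified in this sense.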