Logic: Between Semantics and Proof Theory

Abstracts

Sergei Artemov

Semantics of Proofs

We show how the recently formalized Brouwer-Heyting-Kolmogorov (BHK) semantics of proofs helps fix two well-known problems in the original informal BHK semantics.

First, according to the original BHK, any proof vacuously "proves" the negation of any independent sentence. In particular, in the arithmetical context, the negation of consistency is BHK-proved by any proof whatsoever. This dubious feature violates the fundamental intuitionistic assumption that truths should be supported by meaningful proofs. Moreover, BHK semantics in its naive form validates the law of excluded middle at the meta-level: either a sentence is constructively true or its negation is constructively true, which is counterintuitive and inconsistent with other well-established semantics of intuitionistic logic, e.g., Kripke models. Second, the informal BHK treatment of universal quantifiers is also unsatisfactory, since it yields vacuous constructive "proofs" for all true Π1 sentences of arithmetic, e.g., Fermat's Last Theorem or the consistency statement.
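
For concreteness, recall the informal BHK clause behind the first problem (standard textbook form, not the formalization discussed in the talk):

\[ \text{a proof of } \neg A \;=\; \text{a construction transforming any proof of } A \text{ into a proof of } \bot. \]

If A is independent, there are no proofs of A at all, so every construction satisfies this clause vacuously and counts as a "proof" of ¬A.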

Well-known computational versions of BHK in which proof objects are understood as computational programs (e.g., realizability semantics and its descendants) fail to correct these flaws and merely reproduce them formally. The provability BHK semantics, comprising Gödel's embedding of intuitionistic logic into the modal logic S4 followed by a proof realization of S4-sentences in an appropriate logic of proofs, not only provides a formal BHK-sound semantics of (arithmetical) proofs for intuitionistic logic but also offers plausible fixes for the aforementioned flaws in the original BHK.
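
For orientation, the two stages can be sketched as follows (a standard rendering; details may differ from the talk). One common form of Gödel's translation prefixes □ to each subformula, e.g.

\[ tr(p) = \Box p, \qquad tr(A \wedge B) = \Box(tr(A) \wedge tr(B)), \qquad tr(A \to B) = \Box(tr(A) \to tr(B)), \]

so that Int ⊢ A iff S4 ⊢ tr(A). The realization theorem then replaces each □ by an explicit proof term of the Logic of Proofs: S4 ⊢ B iff B has a realization B^r in which every occurrence of □X becomes t:X ("t is a proof of X") for some proof polynomial t.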

Matthias Baaz

Sequents of relations and first order logic
(joint work with Agata Ciabattoni)

We introduce the first sequent-style calculus for witnessed Gödel logic. Our calculus makes use of the cut rule. We show that this is inescapable by establishing a negative result on the existence of reasonable calculi for a large class of logics. These include witnessed Gödel logic, (fragments of) Łukasiewicz logic, and intuitionistic logic extended with the quantifiers of classical logic.
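
For orientation, "witnessed" means that the infima and suprema of the standard Gödel semantics are required to be attained (Hájek's witnessed models):

\[ \|\forall x\, A(x)\| = \min_{d \in D} \|A(d)\|, \qquad \|\exists x\, A(x)\| = \max_{d \in D} \|A(d)\|. \]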

Diderik Batens

Spoiled for Choice?

This paper first substantiates an old claim. The transition from a theory that turned out trivial to a consistent replacement need not proceed in terms of inconsistencies, which are negation gluts. Logics that tolerate gluts or gaps (or both) with respect to any logical symbol may serve as the lower limit for adaptive logics that assign a minimally abnormal consequence set to a given premise set. The same obtains for logics that tolerate a combination of kinds of gluts and gaps. This result runs counter to the obsession with inconsistency that classical logicians and paraconsistent logicians share.

All such basic logics will be systematically reviewed, some variants will be outlined, and the claim will be argued for. While those logics tolerate gluts and gaps with respect to logical symbols, ambiguity logic tolerates ambiguities in non-logical symbols. Moreover, forms of tolerance may be combined, with zero logic as an extreme.

In the baffling plethora of corrective adaptive logics (roads from trivial theories to consistent replacements), adaptive zero logic turns out to be theoretically interesting as well as practically useful. On the one hand, all meaning becomes contingent, depending on the premise set. On the other hand, precisely adaptive zero logic provides one with an excellent analyzing instrument. For example, it enables one to figure out which corrective adaptive logics lead, for a specific trivial theory, to a suitable and interesting minimally abnormal consequence set.

Libor Behounek and Petr Cintula

The Local, the Global and the Ugly

The consequence relation usually studied in fuzzy and substructural logics transmits full truth from the premises to the conclusion. As objected in [2], this consequence relation (henceforth called the "global" consequence relation) only cares about the designated truth values, neglecting the non-designated values and their ordering. As a remedy for this shortcoming, [2] proposed the "degree-preserving" consequence relation, which ensures that the truth value of the conclusion is at least as large as the meet of the truth values of the premises. Modus ponens and the classical deduction theorem fail for the degree-preserving consequence relation of contraction-free logics.
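
In symbols, over a fixed algebra of truth values with top element 1 taken as the only designated value (a rough rendering of the two notions, notation ours):

\[ \Gamma \models_{g} \varphi \iff \text{for every evaluation } e:\ (\forall \psi \in \Gamma)\ e(\psi) = 1 \text{ implies } e(\varphi) = 1, \]
\[ \Gamma \models_{d} \varphi \iff \text{for every evaluation } e:\ e(\varphi) \geq \textstyle\bigwedge_{\psi \in \Gamma} e(\psi). \]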


In this talk we consider an alternative consequence relation which also takes non-designated truth degrees into account (henceforth called the "local" consequence relation for substructural logics), but combines the premises by fusion rather than by the lattice meet. The classical deduction-detachment theorem is valid for the local consequence relation. In contraction-free logics (with exchange), though, multisets rather than sets of premises have to be considered: the local consequence relation is thus a special case of the multiset consequence relations studied by Avron in the early 1990s [1]. Since multiset consequence relations fall outside the class of Tarski consequence relations, the standard apparatus of Abstract Algebraic Logic cannot be applied to them. In the talk we will sketch some ideas about general and particular methods for handling the local consequence relations of substructural logics and compare the properties of the three consequence relations.
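
The local relation can be rendered analogously (notation ours), with the premises combined by the fusion * instead of the meet:

\[ \psi_1, \ldots, \psi_n \models_{l} \varphi \iff \text{for every evaluation } e:\ e(\varphi) \geq e(\psi_1) * \cdots * e(\psi_n), \]

so that the deduction-detachment theorem takes its classical shape: Γ, ψ ⊨_l φ iff Γ ⊨_l ψ → φ, where → is the residuum of *.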


References:

[1]  Avron A.: Simple consequence relations. Information and Computation 92 (1991): 105-139.

[2]  Bou F., Esteva F., Font J.M., Gil A., Godo L., Torrens A., Verdú V.: Logics preserving degrees of truth from varieties of residuated lattices. Journal of Logic and Computation 19 (2009): 1031-1069.

Jean-Yves Béziau

Linking Proof and Truth through Abstract Logic 

In this talk I will present a general result connecting sequent proof systems to truth-functional and non-truth-functional semantics, based on Gentzen's original interpretation of sequents and the Lindenbaum-Asser extension theorem. I will explain how from this result we can get immediate completeness for thousands of systems of logic, ranging from paranormal logics to pure alethic modal logics.


References:

J.-Y. Béziau, "Paranormal logics and the theory of bivaluations", in Universal Logic: An Anthology – From Paul Hertz to Dov Gabbay, Birkhäuser, Basel, 2012, pp. 361-372.

http://www.springer.com/mathematics/book/978-3-0346-0144-3

Maria Paola Bonacina

Interpolation for resolution and superposition

Given two mutually inconsistent formulae, a (reverse) interpolant is a formula implied by one, inconsistent with the other, and containing only symbols they share. Interpolants are used in verification to compute over-approximations of images, refine abstractions, or generate invariants. We consider interpolation for refutations generated by inference systems for first-order logic with equality, based on resolution and superposition. These inference systems are at the heart of state-of-the-art theorem provers, may yield decision procedures, and can be integrated in SMT-solvers. We present a two-stage approach that computes provisional interpolants, which may contain non-shared symbols, and applies lifting to replace them with quantified variables. The resulting interpolation system is complete, meaning that it extracts an interpolant from any refutation.
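
In symbols, restating this definition: given A and B with A ∧ B unsatisfiable, a reverse interpolant is a formula I such that

\[ A \models I, \qquad I \wedge B \models \bot, \qquad \Sigma(I) \subseteq \Sigma(A) \cap \Sigma(B), \]

where Σ(F) denotes the set of non-logical symbols occurring in F.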


(Joint work with Moa Johansson)

Walter Carnielli

Paraconsistent set theories by predicating on (in)consistency

I want to show how a new kind of paraconsistent set theory can be founded upon the Logics of Formal Inconsistency (LFIs) by defining consistency as a predicate acting on sentences as well as on sets. Paraconsistent set theories under this perspective (such as ZFmbC and ZFCil) can be shown to handle some set-theoretical paradoxes, and can be proved to be non-trivial. Taking into account that Georg Cantor in his seminal work on set theory not only referred to ‘inconsistent sets’ but regarded contradictions as beneficial, I discuss how Cantor’s view of inconsistent collections can be related to this approach (joint work with Marcelo E. Coniglio), and also how the approach can be related to the work of Arnon Avron on LFIs and on predicative set theory.
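
As a reminder of the LFI setting (standard background, not specific to this abstract): the consistency of a sentence A is expressed inside the language by a formula ∘A; explosion fails in general but is recovered for consistent sentences:

\[ A, \neg A \nvdash B, \qquad \text{yet} \qquad \circ A, A, \neg A \vdash B. \]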

Agata Ciabattoni

Standard completeness via proof theory: an automated approach

Standard completeness, that is, completeness of a logic with respect to algebras based on truth values in [0,1], has received increasing attention in the last few years. In a standard complete logic, connectives are interpreted by suitable functions on [0,1], and this makes it a fuzzy logic in the sense of Hájek. Checking or discovering whether a logic is standard complete is sometimes a challenging task which has deserved a paper for each specific logic. We show how to use recent results on the systematic introduction of sequent and hypersequent calculi to automate standard completeness proofs for large classes of logics.
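
A standard example of such an interpretation, recalled for illustration: in Gödel logic, conjunction and implication are interpreted on [0,1] by

\[ x \wedge y = \min(x, y), \qquad x \to y = \begin{cases} 1 & \text{if } x \leq y, \\ y & \text{otherwise.} \end{cases} \]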

Nachum Dershowitz and Gilles Dowek

A Two-Dimensional Programming Language for Two-Dimensional Data

Turing, in his immortal 1936 paper, observed that “[human] computing is normally done by writing … symbols on [two-dimensional] paper”, but noted that use of a second dimension “is always avoidable” and that “the two-dimensional character of paper is no essential of computation”. We propose to exploit the naturalness of two-dimensional representations of data by promoting two-dimensional models of computation. In particular, programs for a two-dimensional Turing machine can be recorded most naturally on its own two-dimensional input-output grid, in such a transparent fashion that schoolchildren would have no difficulty comprehending their behavior. This two-dimensional rendering allows, furthermore, for a most perspicuous presentation of Turing's “universal machine”.
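
To make the model concrete, here is a minimal sketch of a two-dimensional Turing machine in Python (an illustration of the general idea only; the function, the encoding, and the example machine are our assumptions, not the authors' construction):

def run_2d_tm(delta, state, grid, pos=(0, 0), halt="halt", max_steps=10000):
    """Run a two-dimensional Turing machine.

    The tape is an unbounded grid: a dict mapping (x, y) coordinates to
    symbols, with absent cells read as the blank ' '. The program delta maps
    (state, symbol) to (new_state, new_symbol, move), where move is one of
    'L', 'R', 'U', 'D'."""
    moves = {"L": (-1, 0), "R": (1, 0), "U": (0, -1), "D": (0, 1)}
    x, y = pos
    for _ in range(max_steps):
        if state == halt:
            return grid                      # normal termination
        state, symbol, move = delta[(state, grid.get((x, y), " "))]
        grid[(x, y)] = symbol                # write, then move
        dx, dy = moves[move]
        x, y = x + dx, y + dy
    raise RuntimeError("step budget exhausted")

# A toy program: sweep right along row 0, overwriting '.' with 'x', and halt
# (stepping one row down) at the first blank.
delta = {
    ("sweep", "."): ("sweep", "x", "R"),
    ("sweep", " "): ("halt", " ", "D"),
}
tape = {(i, 0): "." for i in range(5)}
run_2d_tm(delta, "sweep", tape)              # tape now holds 'x' at (0,0)-(4,0)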

Michael Dunn

Arrow Logics, Relevance Logics, and Relation Algebras

I explore the relationships between the van Benthem-Venema semantics for arrow logics and the Routley-Meyer semantics for relevance logics, and show that there is a translation between the two. I compare van Benthem's version of the semantics for arrow logic aimed at relation algebras with my own generalization of the Routley-Meyer semantics aimed at the same target.  I use my (1993, 2001) representation of relation algebras based on Routley-Meyer frames to give an equivalent representation of relation algebras based on frames for arrow logic.
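
Recall the key semantic clause of the Routley-Meyer framework (standard form, stated for orientation): implication is evaluated through a ternary accessibility relation R on worlds,

\[ x \Vdash A \to B \iff \text{for all } y, z:\ Rxyz \text{ and } y \Vdash A \text{ imply } z \Vdash B. \]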

References

van Benthem, J. (1991). Language in Action: Categories, Lambdas and Dynamic Logic.  Amsterdam: North Holland.

van Benthem, J. (1994). "A Note on Dynamic Arrow Logic," in J. van Eijck and A. Visser (eds.), Logic and Information Flow, Cambridge MA: The MIT Press.

Dunn, J. M. (2001). "Representation of Relation Algebras Using Routley–Meyer Frames." In C. A. Anderson and Zeleny, M. (eds.), Logic, Meaning and Computation: Essays in Memory of Alonzo Church, Dordrecht: Kluwer, 77–108. Preliminary version in Indiana University Logic Group Preprint Series, IULG-93-28, 1993.

Routley, R., & Meyer, R. K. (1973). "The Semantics of Entailment." In H. Leblanc (ed.), Truth, Syntax and Modality: Proceedings of the Temple University Conference on Alternative Semantics. Amsterdam: North Holland.

Venema, Yde (1996). "A Crash Course in Arrow Logic," in Marx, M., Pólos, L., and Masuch, M. (eds), Arrow Logic and Multi-Modal Logic, Palo Alto CA: CSLI Publications.

Christian Fermüller

On Matrices, Nmatrices, and Games

We generalize Hintikka's semantic game for classical logic to the family of all finite-valued propositional logics. This in turn serves as a springboard to develop game semantics for all propositional logics that can be characterized by finite nondeterministic matrices. In this approach a new concept of nondeterministic valuation emerges, which we call 'liberal valuation'; it augments the usually employed static and dynamic valuations in a natural manner. While winning strategies for unrestricted semantic games correspond to liberal valuations, the characterization of static and dynamic valuations calls for specific pruning procedures resulting in properly restricted versions of the game.

Nissim Francez

Relevant Harmony

After reviewing the basic definitions of harmony and stability, two of the central concepts in Proof-Theoretic Semantics, the paper considers the implicational fragment of the relevant logic R (Anderson & Belnap) under a labelled natural deduction (ND) system, where the labels keep track of the 'use' of assumptions; thereby, no assumption is discharged that has not been used. It is shown that the ND system is not closed under composition of derivations using its standard definition, thereby prohibiting Prawitz's detour-removal reductions, hence failing harmony. A revised definition of derivation composition is proposed, under which the ND system *is* closed, allowing reductions and re-enabling harmony (and stability).
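
For orientation, the implication rules of such a labelled system can be rendered as follows (one standard presentation, in the style of Anderson and Belnap's subscripted deduction; the talk's system may differ in details). Assumptions carry fresh singleton labels, (→E) unions labels, and (→I) may discharge an assumption only if its label occurs in the label of the conclusion:

\[ \frac{A \to B_{a} \qquad A_{b}}{B_{a \cup b}} \ (\to E) \qquad\qquad \frac{[A_{\{k\}}] \;\cdots\; B_{b}}{A \to B_{b \setminus \{k\}}} \ (\to I), \ \text{provided } k \in b. \]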

Dov Gabbay

Logical Foundations for Bipolar Argumentation Networks

Traditional abstract argumentation networks have been studied in two directions:

1.       Investigate their semantics in detail. This has led to the study of extensions, dealing with loops, connections with classical logic, the notions of joint and disjunctive attacks, and, especially what concerns us here, the equational approach and the ASPIC instantiation approach.

2.       Generalize the notion of argumentation to bipolar argumentation and to the addition of the notion of support.

This paper will study bipolar and tripolar argumentation networks, both from the equational point of view and by reducing them to traditional attack networks. The paper will avoid traditional difficulties with bipolar networks and, most importantly, will indicate how to model the ASPIC approach (see M. Caminada and L. Amgoud, On the evaluation of argumentation formalisms) using tripolar equational networks with joint support. The full reduction of ASPIC to tripolar equational networks will be done in a subsequent paper (The equational approach to defeasible argumentation with applications to ASPIC, paper 478, in preparation).
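
To indicate the flavour of the equational approach (one representative equation schema; the paper's actual equations may differ): every argument a receives a value v(a) in [0,1] solving, for instance,

\[ v(a) = 1 - \max\{\, v(b) : b \text{ attacks } a \,\}, \]

with v(a) = 1 when a has no attackers; a is accepted when v(a) = 1, and support is modelled by adding further terms to such equations.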

Furio Honsell

LF_P – A Logical Framework with External Predicates

joint work with: Marina Lenisa, Luigi Liquori, Petar Maksimovic, Ivan Scagnetto


The LF_P framework is an extension of Harper, Honsell, and Plotkin's Edinburgh Logical Framework LF with external predicates. This is accomplished by defining lock type constructors, which are a sort of diamond-modality constructors, releasing their argument under the condition that a possibly external predicate is satisfied on an appropriate typed judgement.

Lock types are defined using the standard pattern of constructive type theory, i.e. via introduction, elimination, and equality rules. Using LF_P, one can factor out the complexity of encoding specific features of logical systems which are awkwardly encoded in LF, e.g. side conditions in the application of rules in modal logics, substructural rules as in non-commutative linear logic, and pre- and post-conditions in Hoare-like programming languages. Once these conditions have been isolated, their verification can be delegated to an external proof engine, in the style of the Poincaré Principle.
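
The introduction and elimination rules for lock types have roughly the following shape (our paraphrase; notation approximate):

\[ \frac{\Gamma \vdash M : \rho \qquad \Gamma \vdash N : \sigma}{\Gamma \vdash \mathcal{L}^{P}_{N,\sigma}[M] : \mathcal{L}^{P}_{N,\sigma}[\rho]} \qquad\qquad \frac{\Gamma \vdash M : \mathcal{L}^{P}_{N,\sigma}[\rho] \qquad P(\Gamma \vdash N : \sigma)}{\Gamma \vdash \mathcal{U}^{P}_{N,\sigma}[M] : \rho} \]

so the lock is released only when the external predicate P holds of the judgement Γ ⊢ N : σ.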

We investigate and characterize the metatheoretical properties of the calculus underpinning LF_P, together with its canonical presentation, based on a suitable extension of the notion of beta-eta-long normal form, allowing for a smooth formulation of adequacy statements.

Beata Konikowska

Modular Construction of Cut-Free Sequent Calculi for Paraconsistent Logics

We provide a general method for a systematic and modular generation of cut-free calculi for thousands of paraconsistent logics known as Logics of Formal (In)consistency. The method relies on the use of non-deterministic semantics for these logics.
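
As an illustration of the semantic tool involved (a standard example from the Nmatrix literature, not drawn from this abstract): in a non-deterministic matrix a connective may return a set of admissible values, e.g., a paraconsistent negation over {t, f} with

\[ \neg f = \{t\}, \qquad \neg t = \{t, f\}, \]

so a valuation may legally assign either value to ¬A when A is true; cut-free rules can then be read off such truth tables.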

Ori Lahav

Semantic Investigation of Canonical Gödel Hypersequent Systems

A major breakthrough in the proof theory of fuzzy logics was the introduction in [1] of HG – a well-behaved calculus for Gödel logic. The idea was to lift LJ, the sequent system for intuitionistic logic, to a hypersequent system, and add a hypersequent structural rule, called communication, to accommodate the linearity property. In this talk we study a general family of "communication-based" hypersequent systems, with logical rules of a general canonical form, of which HG is a particular instance. The obtained family contains, among others, various calculi for extensions of Gödel logic with new connectives. We present a method to obtain (potentially non-deterministic) many-valued semantics for each hypersequent system of this family. Our semantic analysis provides new insights into the semantic effect of the ingredients of HG, and leads to a simple characterization of cut-admissibility in systems of this family.
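
For reference, the communication rule has the following form (in one standard notation, with G a side hypersequent):

\[ \frac{G \mid \Gamma_1, \Delta_1 \Rightarrow A \qquad G \mid \Gamma_2, \Delta_2 \Rightarrow B}{G \mid \Gamma_1, \Gamma_2 \Rightarrow A \mid \Delta_1, \Delta_2 \Rightarrow B} \ (\mathrm{com}) \]

and it is this exchange of contexts between components that captures the linearity of the truth-value order.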

References

[1] Arnon Avron. Hypersequents, logical consequence and intermediate logics for concurrency. Annals of Mathematics and Artificial Intelligence, 4:225–248, 1991.

Johann Makowsky

Characterizing word functions recognized by multiplicity automata

We introduce the class MSOL-EVAL(F) of word functions into a field F definable in monadic second-order logic, and prove a Büchi-Elgot-Trakhtenbrot type theorem: a word function f is recognized by a multiplicity automaton A over a field F iff f ∈ MSOL-EVAL(F). We also discuss extensions where words are replaced by trees and F is replaced by a ring.
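
Recall the standard definition, for orientation: a multiplicity automaton of dimension d over F assigns to each letter a of the alphabet a matrix μ(a) ∈ F^{d×d}, together with vectors λ, γ ∈ F^{d}, and recognizes the word function

\[ f(w_1 \cdots w_n) = \lambda^{T}\, \mu(w_1) \cdots \mu(w_n)\, \gamma. \]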

Joint work with N. Labai

Joke Meheus

Which Style of Reasoning to Choose in the Face of Conflicting Information?

joint work with: Christian Straßer and Peter Verdée (Centre for Logic and Philosophy of Science, University of Ghent, Belgium)

As is well known, in logics for reasoning from (possibly) inconsistent information, different kinds of consequence relations are often defined. This holds true, for instance, for the consequence relations that are defined in terms of maximal consistent subsets. What are called the Inevitable Consequences in [7] are those that follow by Classical Logic CL from every maximal consistent subset of the premises (elsewhere these consequences are called Strong Consequences or Universal Consequences). The Weak Consequences (sometimes also called Existential Consequences) are those that follow by CL from some maximal consistent subset of the premises. The Free Consequences are those formulas that follow from the set of formulas which belong to every maximal consistent subset of the premises. Analogous distinctions can be found in default logics (and in logics that are defined in terms of them, such as some deontic logics, see for instance [4], and the signed systems from [2]). The sceptical consequences are those that follow from every extension, the credulous consequences are those that follow from some extension. (The prudent consequences from [2] are those that are analogous to the Weak Consequences.) In Input/Output logics (see, for instance, [5]), we have the same distinction under the names "full meet constrained output" and "full join constrained output". Finally, we find this approach also in abstract argumentation (see, for instance, [3]), where credulously accepted arguments are in some extensions of a given framework, and sceptically accepted arguments are in all extensions.
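
In the maximal-consistent-subset setting, the three relations can be stated compactly (notation ours): writing MCS(Γ) for the set of maximal consistent subsets of Γ,

\[ \Gamma \vdash_{\mathrm{Strong}} A \iff \Delta \vdash_{\mathrm{CL}} A \text{ for every } \Delta \in \mathrm{MCS}(\Gamma), \]
\[ \Gamma \vdash_{\mathrm{Weak}} A \iff \Delta \vdash_{\mathrm{CL}} A \text{ for some } \Delta \in \mathrm{MCS}(\Gamma), \]
\[ \Gamma \vdash_{\mathrm{Free}} A \iff \textstyle\bigcap \mathrm{MCS}(\Gamma) \vdash_{\mathrm{CL}} A. \]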

Although the same three kinds of consequence relations keep reappearing, little attention has been paid to motivating which kind of consequence relation is best suited for which kind of application context. (One of the exceptions is [6], where it is argued that, in some cases, the sceptical consequences are best suited for epistemic reasoning, whereas the credulous consequences are better suited for practical reasoning.)

For adaptive logics too (which provide a unifying framework for all kinds of defeasible reasoning), different kinds of consequence relations are defined; this is done by varying the so-called adaptive strategy.

In recent years, many of the above consequence relations were characterized as adaptive logics. This not only offers a unified framework for these consequence relations, but also provides them with the same kind of semantics and the same kind of (dynamic) proof theory (and generic proofs for all important metatheoretic properties). What is striking is that the Strong Consequences (and the Sceptical Consequences), as well as the Free Consequences (and the Prudent Consequences), can be reconstructed by means of the two 'standard' adaptive strategies (namely, Minimal Abnormality for Strong/Sceptical and Reliability for Free/Prudent), but that for the Weak Consequences a new kind of adaptive strategy had to be designed (now called the Normal Selections strategy). Up to now, adaptive logicians have paid relatively little attention to the Normal Selections strategy (except for reconstructing and integrating existing consequence relations into the adaptive logic framework). Like some others, adaptive logicians have always been rather sceptical about the usefulness of the Weak/Credulous consequences.

The aim of the present paper is twofold. First, we want to distinguish between different application contexts and argue that some 'styles' of reasoning are better suited for some application contexts than for others. Next, we want to address the concerns that some have expressed with respect to the usefulness of the Weak/Credulous consequences. At the same time, however, we shall argue that, as a 'reasoning style', making Weak/Credulous inferences does make sense (for some application contexts), provided one has a (dynamic) proof theory for the corresponding consequence relation. Such a dynamic proof theory is exactly what adaptive logics can offer. All this will be illustrated by means of a very simple reconstruction of the Free, Strong, and Weak Consequences. (A characterization of these consequences in terms of adaptive logics already exists, see [1], but the latter relies on a paraconsistent logic and a translation of the premises in terms of both a classical and a paraconsistent negation, whereas the present reconstruction proceeds entirely in terms of CL.)

References

[1] Diderik Batens. Towards the unification of inconsistency handling mechanisms. Logic and Logical Philosophy, 8:5–31, 2000. Appeared 2002.

[2] Philippe Besnard and Torsten Schaub. Signed systems for paraconsistent reasoning. Journal of Automated Reasoning, 20:191–213, 1998.

[3] Phan Minh Dung. On the acceptability of arguments and its fundamental role in nonmonotonic reasoning, logic programming and n-person games. Artificial Intelligence, 77:321–358, 1995.

[4] John Horty. Reasons as Defaults. Oxford University Press, 2012.

[5] David Makinson and Leendert van der Torre. Constraints for Input/Output logics. Journal of Philosophical Logic, 30(2):155–185, April 2001.

[6] Henry Prakken. Combining sceptical epistemic reasoning with credulous practical reasoning. In P.E. Dunne and T.J.M. Bench-Capon, editors, Computational Models of Argument. Proceedings of COMMA-06., pages 311-322, Amsterdam, 2006. IOS Press.

[7] Nicholas Rescher and Ruth Manor. On inference from inconsistent premises. Theory and Decision, 1:179–217, 1970.

George Metcalfe

RM Revisited

In this talk I will describe some recent results for the substructural logics RM and RMt and their fragments, paying special attention to the pioneering work of Arnon Avron on this topic. In particular, generalizing earlier results of Meyer and Avron, I will describe a complete characterization of the extensions of RMt admitting Craig interpolation. I will also show that a "modus-ponens-like" rule introduced by Avron provides a basis for the admissible rules of the implication and implication-multiplicative-conjunction fragments, while the full multiplicative fragment requires a supplementary infinite set of rules.
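
For background, RMt is characterized by the Sugihara matrix (one standard presentation, up to details of formulation): the integers with

\[ \neg a = -a, \qquad a \wedge b = \min(a, b), \qquad a \vee b = \max(a, b), \qquad a \to b = \begin{cases} \neg a \vee b & \text{if } a \leq b, \\ \neg a \wedge b & \text{otherwise,} \end{cases} \]

with the non-negative integers designated and t interpreted as 0.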

Sara Negri

From semantics to proofs: Provability logics and modal embeddings

It is well known that intuitionistic propositional logic may be faithfully embedded not just into the modal logic S4 but also into the provability logics GL and Grz of Gödel-Löb and Grzegorczyk, and that there is a similar embedding of Grz into GL. Known proofs of these results are short but model-theoretic and thus nonconstructive. Complete, cut-free sequent systems for these logics have been proposed, among others, by Arnon Avron in 1984. Here a labelled contraction-free sequent system for Grzegorczyk logic is presented and shown to be complete and therefore closed with respect to cut. The completeness proof for the labelled calculus yields at the same time a decision procedure and a direct countermodel construction for underivable sequents. As an application, a constructive proof of the embedding of Int into Grz and a proof-theoretic decision procedure for Int are obtained.
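
For reference, the embeddings in question take the following well-known shapes (standard formulations; details may vary): with T the Gödel translation prefixing □ to every subformula and s the translation replacing every subformula □C by C ∧ □C,

\[ \mathrm{Int} \vdash A \iff \mathrm{S4} \vdash T(A) \iff \mathrm{Grz} \vdash T(A), \qquad \mathrm{Grz} \vdash B \iff \mathrm{GL} \vdash s(B). \]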


(The talk is based on a joint paper with Roy Dyckhoff)

Nicola Olivetti

Conditional logics: from semantics to proof theory

Conditional logics are intended to formalise a kind of non-truth-functional conditional of the sort "if A were the case then B". They have a long history, and they have been used to formalise several types of reasoning, such as counterfactuals, causality, belief change, plausible/non-monotonic inference, and strategic reasoning in games, among others. Semantically, conditional logics enjoy a possible-world semantics similar to that of modal logics. On the other hand, proof systems for conditional logics are relatively underdeveloped in comparison, for instance, to those for modal logics.
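
The possible-world semantics alluded to here is, in one standard formulation, given by a selection function f: a conditional A > B holds at a world w when all the selected A-worlds are B-worlds,

\[ w \Vdash A > B \iff f(w, [A]) \subseteq [B], \]

where [A] is the set of worlds satisfying A and f(w, [A]) picks out the A-worlds most relevant (e.g., closest) to w.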

After recalling the logics themselves, we present the state of the art on proof systems for conditional logics, open problems, and recent developments based on nested sequents.

Itala M. L. D'Ottaviano

Relations between the classical differential calculus and a paraconsistent differential calculus

In da Costa (2000), two special algebraic structures are constructed, the hyper-ring A and the quasi-ring A*, which extend the field of the standard real numbers and whose elements are called hyper-real numbers. From A*, da Costa proposes the construction of a paraconsistent differential calculus, whose language is the language L of his well-known paraconsistent logic C=1, extended to the language of the paraconsistent set theory CHU1 introduced in 1986, in which we deal with the elements of A*.

Carvalho (2004), by using the paraconsistent apparatus, studies and improves the calculus proposed by da Costa, obtaining extensions of several fundamental theorems of the classical differential calculus. In this paper, from the introduction of the concepts of a paraconsistent super-structure χ over a set X of atoms of CHU1 and of a monomorphism between paraconsistent super-structures, we prove a Transference Theorem that “translates” the classical differential calculus into da Costa’s paraconsistent calculus.

Transference Theorem: Let R and S be paraconsistent super-structures, vi and vi' adequate interpretation functions, ♦ an adequate monomorphism from R into S, and α(x1, x2, ..., xn) a formula of L whose free variables are among x1, x2, ..., xn. Under these conditions, α(x1, x2, ..., xn) is valid in R relative to vi if, and only if, ♦(α(x1, x2, ..., xn)) is valid in S relative to vi'.


References:

CARVALHO, T.F. On da Costa’s paraconsistent differential calculus. 200p. Doctoral Thesis. Campinas: State University of Campinas, 2004.

Da COSTA, N.C.A. Paraconsistent mathematics. In: I World Congress on Paraconsistency, 1998, Ghent, Belgium. Frontiers in paraconsistent logic: proceedings. Edited by D. Batens, C. Mortensen, G. Priest, J.P. van Bendegem. London: King’s College Publications, 2000, p. 165-179.

Anatol Slissenko

Introducing Metrics in Computations

Metrics in computations make it possible to describe the diversity of computations, in particular their informational diversity. The goal is to find means to better understand what practical algorithms and problems are (they are intuitively 'smooth' and have strong 'auto-similarity'), and to find new ideas for designing and analyzing algorithms. Similar considerations may work for models of logic formulas that describe computational problems. The talk is about first steps in this direction. It discusses metrics and similarity measures based on syntactic similarity and on the entropy of partitions.
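
One of the underlying notions, recalled for convenience: the entropy of a partition P = {B1, ..., Bk} of a finite set, where pi is the fraction of elements falling in block Bi, is

\[ H(P) = - \sum_{i=1}^{k} p_i \log p_i. \]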

Heinrich Wansing

Falsification, natural deduction, and bi-intuitionistic logic

A kind of bi-intuitionistic propositional logic is introduced that combines verification and its dual, falsification. This logic, 2Int, is motivated by a certain dualization of the natural deduction rules for intuitionistic propositional logic, Int. It is shown that 2Int can be faithfully embedded into Int. In 2Int, from a falsificationist perspective, intuitionistic negation internalizes support of truth, whereas from the verificationist perspective, co-negation internalizes support of falsity.