**July 24–26 and August 2, 2024**

Argentinean Society of Philosophical Analysis (SADAF)

Buenos Aires, Argentina

Miguel Álvarez Lisboa (UBA/IIF-SADAF-CONICET)

Quentin Blomet (IJN-CNRS)

Beatrice Buonaguidi (King’s College, London)

Yll Buzoku (UCL, London)

Alba Cuenca (University of Melbourne)

Bruno Da Ré (UBA/IIF-SADAF-CONICET)

Pablo Dopico (King’s College, London)

Eliana Franceschini (UBA/IIF-SADAF-CONICET)

Camila Gallovich (UBA/IIF-SADAF-CONICET)

Andrea Iacona (Università di Torino)

Carlo Nicolai (King’s College, London)

Francesco Paoli (Università di Cagliari)

Lucas Rosenblatt (UBA/IIF-SADAF-CONICET)

Mariela Rubin (UBA/IIF-SADAF-CONICET)

Damián Szmuc (UBA/IIF-SADAF-CONICET)

Diego Tajer (UBA/IIF-SADAF-CONICET)

Nicolo Zamperlin (Università di Cagliari)

Martina Zirattu (Università di Torino)

**Miguel Álvarez Lisboa: “There is only one logical connective”**

According to some philosophers, such as Došen, the meaning of the logical constants is to be found in the ‘structural’ properties of the consequence relation in which they occur. Following Girard, we may call this thesis *locus solum*: “only the place [matters]”. In this work I explore a consequence of this thesis. I show that there is a connective able to express all the relevant properties of any consequence relation, and which may consequently be seen as the ‘only’ logical connective. The connective in question is Π, the Cartesian product of a family of sets, one of the basic type-forming operators of Martin-Löf’s Intuitionistic Type Theory (ITT). The Π-fragment of this logic corresponds to second-order intuitionistic propositional logic (*Ni* ^{2}) and is equivalent to Girard’s type system **F**. I argue that (1) Π is indeed a logical connective; (2) Gentzen’s sequent calculus can be fully interpreted in the Π-fragment of ITT; (3) this interpretation can easily be extended to cover modal and metainferential logics; and (4) the theory of meaning underlying the formalism serves as a philosophical basis for locus solum; in particular, it delivers an elegant explication of the Adoption Problem along the lines of Finn. My conclusions suggest a *Modus Ponens/Modus Tollens* situation: the result may equally be interpreted as a unifying consequence of the locus solum thesis, or as evidence that locus solum is still too reductionist to serve as a theory of meaning for the logical constants.

**Quentin Blomet: “Matrix Duality”**

I develop the algebraic concept of *matrix duality* to characterize the duality that can exist between many-valued logics defined by matrices based on an ordered algebraic reduct and equipped with two designated subsets. This concept encompasses the duality between pairs of logics such as the strong Kleene logic K3 and the logic of paradox LP, Pietz and Rivieccio’s ETL and Shramko’s NFL, as well as the self-duality of the strict-tolerant logic ST, the tolerant-strict logic TS or the Belnap-Dunn logic FDE. I show that this algebraic notion subsumes several duality concepts related to order reversal, including *negation duality* — where A entails B in the first logic iff not-B entails not-A in the second —, *structural duality* — where A entails B in the first logic iff B anti-entails A in the second —, and *Schröder duality* — where A entails B in the first logic iff B’ entails A’ in the second, with A’ and B’ obtained from A and B by interchanging conjunction and disjunction. We will see, for instance, that any two logics defined by dual matrices, each with the same designated subset for premises and conclusions, are Schröder dual.
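As a concrete illustration of negation duality (my own sketch, not part of the abstract), one can take the Strong Kleene matrices for K3, with designated set {1}, and LP, with designated set {1, 1/2}:

```latex
% Negation duality between K3 and LP (illustrative notation):
\[
A \vDash_{\mathrm{K3}} B
\quad\Longleftrightarrow\quad
\neg B \vDash_{\mathrm{LP}} \neg A
\]
% For instance, p \land \neg p \vDash_{\mathrm{K3}} q holds because no
% Strong Kleene valuation assigns 1 to p \land \neg p; dually,
% \neg q \vDash_{\mathrm{LP}} \neg(p \land \neg p) holds because no
% valuation assigns 0 to \neg(p \land \neg p).
```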

**Beatrice Buonaguidi**

Traditionally, hyperintensional contexts and operators are defined negatively, namely as contexts and operators which do not license unrestricted intersubstitutivity of necessary equivalents, or of logical equivalents (Cresswell 1975, Nolan 2014). I argue that this notion of hyperintensionality is too vague, especially if it is to be used to judge whether a logical consequence relation counts as hyperintensional, i.e. displays hyperintensional contexts and operators. Starting by challenging a criterion of hyperintensionality for a logical consequence relation suggested by Odintsov and Wansing (2021), I will present several different criteria for a logic to count as hyperintensional. This will help me offer a diagnosis of hyperintensionality as, simply, some degree of asymmetry between truth preservation and falsity preservation. Furthermore, I will argue that hyperintensionality can show up at different levels for different consequence relations, licensing a spectrum of hyperintensional behaviour; as a case study, I will consider N-logics and HYPE (Leitgeb 2019).

**Yll Buzoku: “Proof-theoretic Semantics for Intuitionistic Linear Logic”**

Recent advances in our understanding of proof-theoretic semantics have made it possible to begin giving such semantics to substructural logics. Whilst work has been done on the logic of Bunched Implications by Gheorghiu, Gu and Pym, Intuitionistic Linear Logic presented unique challenges unanswered by our understanding of proof-theoretic semantics; in particular, modelling the proof-theoretic nature of the modality “of-course”. In this talk, I give a high-level overview of some of these advances, explain how I was able to capture the proof-theoretic behaviour of “of-course” so as to give a semantics for Intuitionistic Linear Logic, and offer a flavour of what the future of proof-theoretic semantics for Linear Logic might look like.

**Alba Cuenca: “Substructural epistemic logics and logical omniscience”**

In this work, I provide a comprehensive analysis of how substructural logics integrate epistemic operators and address the problem of logical omniscience. I propose a unified framework for understanding substructural approaches to epistemic logic, focusing on arbitrary agents and the general concept of knowledge. Using the philosophical framework of the fundamental problem of logical omniscience as outlined by Hawke, Özgün, and Berto (2020), I examine the effectiveness of substructural logics in resolving this issue.

**Bruno Da Ré: “A semantics for K: a nonmonotonic and nontransitive version of Classical logic”**

In recent years, it has been shown that the derivable rules of LK without Cut but with the invertible rules can be semantically characterized using the Strong Kleene tables, employing the st-notion of logical consequence. Furthermore, it has also been proved that the external logic of this calculus is LP. In this talk, our aim is to study these properties for another very well-known sequent calculus for classical logic: K. Unlike LK, this system is defined in a SET-SET setting and does not have weakening as a derivable rule. First, we will provide a semantics for this calculus, which turns out to be a non-monotonic and non-transitive semantics for classical logic. Next, we will discuss how this semantics could be extended to a multiset setting in order to characterize the derivable rules of G3c. Finally, we will present the external logic of K (which is also paraconsistent) and connect it with LP.

This is joint work with Camillo Fiore.
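For orientation, the st-notion of consequence mentioned in the abstract is standardly formulated over Strong Kleene valuations (values 1, 1/2, 0) along the following lines; the notation is an illustrative gloss, not the authors':

```latex
% SET-SET st-consequence: strictly true premises, tolerantly true conclusions.
\[
\Gamma \vDash_{\mathrm{st}} \Delta
\iff
\text{for every valuation } v:\;
\bigl(\forall \gamma \in \Gamma,\; v(\gamma) = 1\bigr)
\;\Rightarrow\;
\bigl(\exists \delta \in \Delta,\; v(\delta) \geq \tfrac{1}{2}\bigr)
\]
```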

**Pablo Dopico: “Axiomatic theories of supervaluationist truth: completing the picture”**

As is well known, in his seminal ‘Outline of a theory of truth’, Kripke suggested running his fixed-point construction for theories of truth over supervaluationist schemes. Three such schemes stood out: (1) the scheme **vb**, which considers extensions of the truth predicate consistent with the original one; (2) the scheme **vc**, which considers consistent extensions more generally; and (3) the scheme **mc**, which only considers maximally consistent extensions. As is also known, Andrea Cantini proposed an axiomatization of the fixed-point theory constructed over the scheme **vc**, which he called **VF**, and proved that the theory is sound with respect to the fixed-point models generated by that scheme. Moreover, he showed that **VF** is a remarkably strong theory, matching the strength of the impredicative theory **ID**_{1}. In this paper, we complete the picture of axiomatic theories of supervaluationist truth by introducing two new theories that correspond—and are sound with respect—to the schemes **vb** and **mc**. In the case of the former scheme, we advance a theory that we call **VF**^{−} and establish its proof-theoretic strength, which equals that of **VF**. The most substantial part of the paper, however, is dedicated to the theory that axiomatizes the fixed-point semantic theory over **mc**, which we call **VFM**. For the lower bound, we show that it defines Tarskian ramified truth predicates up to *ε*_{0} (**RA**_{ < ε0}). For the upper bound, we provide a cut-elimination argument formalized within the theory **ID**_{1}^{*}, which is known to be proof-theoretically equivalent to **RA**_{ < ε0}. Finally, we introduce the schematic reflective closure of the theory **VFM**, as defined by Feferman. We establish its consistency and carry out its proof-theoretic analysis, which confirms that this schematic reflective closure is as proof-theoretically strong as the theory **RA**_{ < Γ0}.

This is joint work with Daichi Hayashi.

**Eliana Franceschini: “Incompatibility and negation at the metainferential level”**

The aim of this article is to take a step towards a full understanding of the notion of negation in the context of metainferential logics. The main contribution is a sequent calculus for metainferential logics that allows us to show that there is no intrinsic connection between a failure of the rule of Cut and what some authors (such as Barrio, Pailos, and Szmuc, 2019) call metainferential paraconsistency. Roughly speaking, metainferences are inferences between collections of inferences, and thus substructural logics can be regarded as those which have fewer valid metainferences than Classical Logic. In the last few years, much attention has been paid to the study of metainferences and, in particular, to the question of what the valid metainferences of a given logic are. Nevertheless, there is no agreement about how to understand certain phenomena and properties of the metainferential level, such as the notions of metainferential duality and metainferential paraconsistency—both intimately linked with the possibility of somehow negating inferences. One problem at present is that there is no unified perspective on what we could call (following the work of C. Fiore, F. Pailos, and M. Rubin, 2022) inferential constants and, in particular, on the notion of negation for the metainferential level. In this article I offer an analysis of the different negations present in the recent literature on metainferences, in the light of certain philosophical discussions about the meaning of negation. I show how the conception of metainferential negation one adopts influences the results about duality between logics at the metainferential level, as well as the property, for a given logic, of being metainferentially paraconsistent. I then explain how, in my view, the meaning of a negation for the metainferential level is best grounded, and present a sequent calculus that captures this conceptual understanding of metainferential negation. One philosophical outcome of this proof system is that it shows the lack of an intrinsic connection between metainferential paraconsistency and failures of the rule of Cut.

**Camila Gallovich: “Ungroundedness in Yablo’s theory of truth”**

The thought that our attributions of truth and falsity must be grounded in non-semantic states of affairs constitutes an important semantic intuition. According to Stephen Yablo in “Grounding, Dependence, and Paradox”, this intuition is two-sided. Its first aspect—inheritance—draws on the way in which a complex statement inherits its meaning from certain simpler statements. Its second aspect—dependence—shows the way in which the meaning of a complex statement depends on simpler statements. Paradigmatically, the fixed-point construction given by Saul Kripke in his “Outline of a Theory of Truth” provides an inheritance-style characterization of grounding, whereas the dependence-based construction introduced by Yablo provides a dependence-style characterization. Yablo’s article states a further result: “any collection with an inheritance-style characterization admits a canonically related dependence-style characterization” (Yablo, 1982, p.119). That means the statements that are sanctioned as grounded by these approaches are the same. Yablo also shows that the result extends to a fragment of the set of ungrounded statements; to wit, to the set of statements that, according to these theories, are sanctioned as paradoxical. The guiding question of the talk is how far Yablo’s result can be pushed. To settle the question, it will be useful to consider fragments of the ungrounded statements precluded in Yablo’s analysis and also to consider languages enriched by means of additional semantic predicates other than Tr(x). The talk will run as follows. First, I will show how to set up a dependence-based construction for a language extended with a paradoxicality predicate. Then, I will provide characterizations of the semantic behaviours that can be distinguished within this approach. The resulting dependence-style characterization of truth and paradoxicality resembles an inheritance-style characterization recently provided by Lucas Rosenblatt and myself in the context of the fixed-point conception.

**Andrea Iacona: “The Stoic Thesis and its Formalization”**

In this talk I develop an idea that goes back to the Stoics, namely, that an argument is valid when the conditional formed by the conjunction of its premises as antecedent and its conclusion as consequent is true. As I will argue, once some basic features of our informal understanding of validity are properly spelled out, and a suitable account of conditionals is adopted, the equivalence between valid arguments and true conditionals makes perfect sense. I will show how this equivalence can be formalized in a first-order language that contains a naive truth predicate and a suitable conditional connective. The validity predicate that turns out to be definable in this language significantly increases our expressive resources and provides a coherent formal treatment of paradoxical arguments.
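Schematically, and in notation of my own choosing rather than the talk's, the equivalence at issue can be rendered with a naive truth predicate Tr and a conditional connective >:

```latex
% The Stoic thesis: validity as truth of the associated conditional.
\[
A_1, \ldots, A_n \vDash B
\quad\Longleftrightarrow\quad
Tr\bigl(\ulcorner (A_1 \land \cdots \land A_n) > B \urcorner\bigr)
\]
```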

**Carlo Nicolai: “Classical closures”**

I present some observations on the theory of classical determinate truth recently introduced by Fujimoto and Halbach. The observations aim to show that there is a sense in which the primitive determinate predicate of CD+ could be dispensed with without compromising the logical strength and motivation of the theory. In particular, there is a precise sense in which the axioms of CD+ are a notational variant of the classical closure of Kripke-Feferman truth.

This is joint work with Luca Castaldo.

**Francesco Paoli: “Logical Metainferentialism”**

Logical inferentialism is the view that the meaning of the logical constants is implicitly defined by the operational rules that govern their behaviour in proofs – in particular, sequent calculus proofs, according to an increasingly dominant tendency. A tenable articulation of this view presupposes clarifying certain crucial aspects, concerning e.g. harmony criteria for rules or what counts as a normal proof. Sequent calculus inferentialists generally do this in terms of proofs from axioms, not of derivations from assumptions. Our version of logical metainferentialism (which differs, in some respects, from the Buenos Aires Plan) calls this dogma into question, against the backdrop of the idea that meaning determination is relative to the sequent-to-sequent derivability relations of Gentzen systems. We advance a suggestion towards a metainferentially appropriate reformulation of harmony, and explore its potential by focussing on a case study: the calculi for FDE and its extensions.

**Lucas Rosenblatt: “Truth, give me strength”**

Arguably, the most echoed argument against non-classical theories of truth is that they are deductively weaker than the corresponding classical theories; a typical example of this is that the classical theory KF proves more arithmetical sentences than the non-classical theory PKF, even though both can be seen as adequate axiomatizations of the Kripkean conception of truth. In this talk, we provide two different answers to this argument, focusing on the pair of theories KF and PKF as our test-case. Our first answer is that, despite appearances, the difference in deductive power between KF and PKF is not detrimental to the non-classical theory: PKF proves all the sentences that are theorems of PA, so if the conclusion of the argument is that PKF is an arithmetically weak theory, it follows that PA is also an arithmetically weak theory, which seems at least implausible. The second response is based on an argumentative strategy usually known as classical recapture. There is a natural way to improve the proof-theoretical power of PKF by means of additional axioms. In particular, we shall extend PKF with a subset of the grounded instances of the principle of excluded middle and we will show that the resulting theory has the same proof-theoretical power as KF.

This is joint work with Jonathan Erenfryd, Camillo Fiore and Camila Gallovich.

**Mariela Rubin: “Natural Deduction for ST”**

The main goal of this work is to present a Fitch-style natural deduction calculus for the non-transitive logic known as ST. ST is a logic that is inferentially identical to Classical Logic but differs from it at the metainferential level by making transitivity locally invalid; this means, roughly, that there are interpretations where the arguments from *A* to *B* and from *B* to *C* are satisfied, but the argument from *A* to *C* is not (where *A*, *B* and *C* are formulas of the relevant object language). ST is usually characterized semantically by means of the Strong Kleene valuation schema. It can also be characterized by the calculus that results from Gentzen’s LK when we remove the rule of Cut and add the appropriate inverses of the introduction rules for the connectives. But so far there is no Fitch-style natural deduction calculus for this logic. There are good reasons for wanting one. First, it is a technical tool still absent from the literature. Second, natural deduction calculi are philosophically interesting in their own right: following an inferentialist stance, the meaning of logical terms is determined by their inferential role, so an inferentialist semantics is defined in terms of a calculus. Yet, for some philosophers such as Dummett, this is not all. The meaning of a connective is given by the rules that explain how to infer sentences from other sentences, and that is precisely what a Fitch-style natural deduction calculus offers: it tells us when we can infer a sentence with a given logical term as its main connective, and what we can infer from such a sentence.

This is joint work with Camillo Fiore and Paula Teijeiro.
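The local failure of transitivity described in the abstract can be witnessed by a single Strong Kleene valuation; the following counterexample is an illustrative sketch of mine, not taken from the talk:

```latex
% Let v(A) = 1, v(B) = 1/2, v(C) = 0. Under the st-standard, v satisfies
% an argument from X to Y iff it is not the case that v(X) = 1 and v(Y) = 0.
\begin{align*}
& v \text{ satisfies } A \therefore B && (v(B) = \tfrac{1}{2} \neq 0)\\
& v \text{ satisfies } B \therefore C && (v(B) = \tfrac{1}{2} \neq 1)\\
& v \text{ does not satisfy } A \therefore C && (v(A) = 1,\; v(C) = 0)
\end{align*}
```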

**Damián Szmuc**

We study the possibility of having nontrivial sequent calculi equipped with well-behaved introduction and elimination rules alongside the usual structural rules, but possibly lacking the axiom of reflexivity or the cut rule. We examine the possibility of giving three- and four-valued non-deterministic semantic characterizations of the notions of derivability within these calculi.

This is joint work with Bruno Da Ré.

**Diego Tajer: “A realistic normativity for logic?”**

The aim of this paper is to analyze the relevance of the psychology of deductive reasoning for the debate on the normativity of logic. I will present three different psychological theories which, in their own ways, distinguish between “easy” and “hard” logical rules. Then, I will argue that some of the most common difficulties for explaining the normativity of logic can be reframed using these distinctions. In conclusion, a virtuous dialogue between the psychology of reasoning and the normativity of logic is possible.

**Nicolo Zamperlin: “Generalized Epstein semantics for Parry systems”**

I present an algebraically oriented approach to the study of content-sensitive connectives obeying fully compositional behaviour, starting from the recent studies in this field by Thomas Ferguson. In Ferguson (2023), Kripke-style semantics is employed to study subsystems of Parry’s logic of analytic implication (PAI) and its demodalized counterpart, Dunn’s logic. In my approach I extend the set-assignment semantics first developed by Richard Epstein and use it to study the same Parry systems analyzed by Ferguson. While Epstein’s set-assignment semantics (Epstein (1994)) assigns subsets of a reference powerset as contents to formulae, in the generalized semantics presented here contents can range over a wider class of algebraic structures, and the same holds for the set of truth-values, which is no longer limited to the two-element Boolean algebra. A result obtained with these richer mathematical structures is a partial answer to the open problem of a set-assignment semantics for PAI: I prove completeness for the global version of PAI (the global variant of Fine (1986)) with respect to this generalized Epstein semantics, and similarly for the global modal logics corresponding to Ferguson’s subsystems of PAI and their demodalized counterparts.

**Martina Zirattu: “Variable inclusion through substructural means.”**

In this talk, we critically discuss what has become known as logics of variable inclusion: logical systems where validity requires (with certain exceptions) that the propositional variables occurring in the conclusion are included among those appearing in the premises (right variable inclusion), or vice versa (left variable inclusion). One notable example is the case of Paraconsistent Weak Kleene and its Paracomplete counterpart (which we will refer to as ‘WK logics’), which are known to be, respectively, the left and right variable inclusion companions of Classical Logic (CL). The main goal of this talk is to set forth novel refined versions of the variable inclusion companions for CL, which we will call the analytic and synthetic companions of CL. We will characterize them proof-theoretically, via a two-sided sequent calculus. The main feature of the refinements we are proposing is that they are substructural: they both share the same rules for the connectives as the WK logics, but in each case we impose linguistic constraints on the application of Weakening. In exchange for losing monotonicity, each of the companions we present gets to keep all classical anti-theorems (theorems) along with unconditional right (left, resp.) variable inclusion.

This is joint work with Agustina Borzi.
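A standard way of making the left variable inclusion condition precise, stated here as an illustrative gloss rather than the authors' own formulation, is:

```latex
% PWK as the left variable inclusion companion of Classical Logic (CL):
\[
\Gamma \vDash_{\mathrm{PWK}} \varphi
\iff
\text{there is } \Gamma' \subseteq \Gamma \text{ with }
\mathrm{var}(\Gamma') \subseteq \mathrm{var}(\varphi)
\text{ and } \Gamma' \vDash_{\mathrm{CL}} \varphi
\]
% Example: p \land \neg p \vDash_{\mathrm{CL}} q, yet
% p \land \neg p \nvDash_{\mathrm{PWK}} q, since the only subset of the
% premises whose variables are included in those of q is the empty set,
% and \emptyset \nvDash_{\mathrm{CL}} q.
```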

This Workshop aims to analyze different topics in Philosophical Logic, mainly connected with substructural logics, semantic paradoxes, and non-classical logics in general.

We are thankful for the support provided by: **CONICET** (Consejo Nacional de Investigaciones Científicas y Técnicas, Argentina); **MINCYT** (Ministerio de Ciencia, Tecnología e Innovación, Argentina; project no. 01-RC-2022-01-00028); **PLEXUS** (grant agreement no. 101086295, a Marie Skłodowska-Curie action funded by the EU under the Horizon Europe Research and Innovation Programme); and **MOSAIC** (grant agreement no. 101007627, also a Marie Skłodowska-Curie action funded by the EU under the Horizon Europe Research and Innovation Programme).