Semantic A-translation and Super-consistency entail Classical Cut Elimination
We show that if a theory R defined by a rewrite system is super-consistent,
the classical sequent calculus modulo R enjoys the cut elimination property,
which was an open question. For such theories it was already known that proofs
strongly normalize in natural deduction modulo R, and that cut elimination
holds in the intuitionistic sequent calculus modulo R. We first define a
syntactic and a semantic version of Friedman's A-translation, showing that it
preserves the structure of pseudo-Heyting algebra, our semantic framework. Then
we relate the interpretation of a theory in the A-translated algebra to its
A-translation in the original algebra. This allows us to show the stability of
the super-consistency criterion and to derive the cut elimination theorem.
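For orientation, Friedman's A-translation maps ⊥ to a fixed formula A and disjoins A onto every atom, commuting with the other connectives. A toy sketch over an illustrative formula syntax (the class names are ours, not the paper's):

```python
from dataclasses import dataclass

# Minimal propositional skeleton; names are illustrative.
@dataclass(frozen=True)
class Atom:
    name: str

@dataclass(frozen=True)
class Bot:
    pass

@dataclass(frozen=True)
class Or:
    left: object
    right: object

@dataclass(frozen=True)
class And:
    left: object
    right: object

@dataclass(frozen=True)
class Imp:
    left: object
    right: object

def a_translate(phi, A):
    """Friedman's A-translation: bottom maps to A, an atom P maps to
    P v A, and the translation commutes with the connectives."""
    if isinstance(phi, Bot):
        return A
    if isinstance(phi, Atom):
        return Or(phi, A)
    if isinstance(phi, Or):
        return Or(a_translate(phi.left, A), a_translate(phi.right, A))
    if isinstance(phi, And):
        return And(a_translate(phi.left, A), a_translate(phi.right, A))
    if isinstance(phi, Imp):
        return Imp(a_translate(phi.left, A), a_translate(phi.right, A))
    raise TypeError(phi)

# Negation ~P is P -> bottom; its A-translation is (P v A) -> A.
A = Atom("A")
neg_p = Imp(Atom("P"), Bot())
assert a_translate(neg_p, A) == Imp(Or(Atom("P"), A), A)
```

In particular, a refutation of P becomes a proof that P v A entails A, which is the mechanism the translation exploits to extract constructive content.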
On completeness of reducibility candidates as a semantics of strong normalization
This paper defines a sound and complete semantic criterion, based on
reducibility candidates, for strong normalization of theories expressed in
minimal deduction modulo à la Curry. The use of Curry-style proof-terms
allows us to build this criterion on the classic notion of pre-Heyting algebras
and makes the criterion apply to all theories expressed in minimal deduction
modulo. Compared to using Church-style proof-terms, this method provides both a
simpler definition of the criterion and a simpler proof of its completeness.

Comment: 24 pages
Strong normalization property for second order linear logic
The paper contains the first complete proof of strong normalization (SN) for full second order linear logic (LL): Girard's original proof uses a standardization theorem which is not proven. We introduce sliced pure structures (sps), a very general version of Girard's proof-nets, and we apply to sps Gandy's method to infer SN from weak normalization (WN). We prove a standardization theorem for sps: if WN without erasing steps holds for an sps, then it enjoys SN. A key step in our proof of standardization is a confluence theorem for sps, obtained by using only a very weak form of correctness, namely acyclicity slice by slice. We conclude by showing how standardization for sps allows us to prove SN of LL, using as usual Girard's reducibility candidates.
A logical foundation for session-based concurrent computation
Linear logic has long been heralded for its potential of providing a logical basis for concurrency.
While over the years many research attempts were made in this regard, a Curry-Howard correspondence between linear logic and concurrent computation was only found recently, bridging the proof theory of linear logic and session-typed process calculus. Building upon this work, we have
developed a theory of intuitionistic linear logic as a logical foundation for session-based concurrent computation, exploring several concurrency-related phenomena, such as value-dependent session types and polymorphic sessions, within our logical framework in an arguably clean and elegant way. The logical basis lets us establish, with relative ease, strong typing guarantees that ensure the fundamental properties of type preservation and global progress, entailing the absence of deadlocks in communication.
We develop a general-purpose concurrent programming language based on the logical interpretation, combining functional programming with a concurrent, session-based process layer in the form of a contextual monad, preserving our strong typing guarantees of type preservation and deadlock-freedom in the presence of general recursion and higher-order process communication.
We introduce a notion of linear logical relations for session-typed concurrent processes, developing an arguably uniform technique for reasoning about sophisticated properties of session-based concurrent computation, such as termination or equivalence, based on our logical approach, further supporting our goal of establishing intuitionistic linear logic as a logical foundation for session-based concurrency.
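The session types underlying this logical interpretation pair each protocol with a dual protocol followed by the other endpoint, which is what rules out communication mismatches. A minimal sketch over a toy syntax (illustrative, not the dissertation's calculus):

```python
from dataclasses import dataclass

# Toy binary session types; constructor names are illustrative.
@dataclass(frozen=True)
class End:            # closed session
    pass

@dataclass(frozen=True)
class Send:           # !T.S : send a value of type T, continue as S
    payload: str
    cont: object

@dataclass(frozen=True)
class Recv:           # ?T.S : receive a value of type T, continue as S
    payload: str
    cont: object

def dual(s):
    """The dual protocol: sends become receives and vice versa, so the
    two endpoints of a channel always agree on the next action."""
    if isinstance(s, End):
        return End()
    if isinstance(s, Send):
        return Recv(s.payload, dual(s.cont))
    if isinstance(s, Recv):
        return Send(s.payload, dual(s.cont))
    raise TypeError(s)

# A client that sends an int, then receives a bool, then stops:
client = Send("int", Recv("bool", End()))
server = dual(client)            # Recv("int", Send("bool", End()))
assert dual(server) == client    # duality is an involution
```

Typing the two endpoints with dual protocols is the session-typed analogue of the cut rule connecting a proposition with its linear-logic dual.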
Implicit automata in typed λ-calculi II: streaming transducers vs categorical semantics
We characterize regular string transductions as programs in a linear
λ-calculus with additives. One direction of this equivalence is proved
by encoding copyless streaming string transducers (SSTs), which compute regular
functions, into our λ-calculus. For the converse, we consider a
categorical framework for defining automata and transducers over words, which
allows us to relate register updates in SSTs to the semantics of the linear
λ-calculus in a suitable monoidal closed category. To illustrate the
relevance of monoidal closure to automata theory, we also leverage this notion
to give abstract generalizations of the arguments showing that copyless SSTs
may be determinized and that the composition of two regular functions may be
implemented by a copyless SST. Our main result is then generalized from strings
to trees using a similar approach. In doing so, we exhibit a connection between
a feature of streaming tree transducers and the multiplicative/additive
distinction of linear logic.
Keywords: MSO transductions, implicit complexity, Dialectica categories,
Church encodings

Comment: 105 pages, 24 figures
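A minimal illustration of the copyless restriction on SSTs (our example, not taken from the paper): string reversal is a regular function computed by an SST with a single register whose update uses that register exactly once.

```python
def reverse_sst(word):
    """A one-register copyless streaming string transducer for reversal:
    on each input letter a, the register update is x := a + x, in which
    the register x occurs exactly once on the right-hand side (copyless).
    The output function simply emits the final register contents."""
    x = ""              # the single register, initially empty
    for a in word:
        x = a + x       # copyless update: x appears once on the right
    return x            # output: the register's final value

assert reverse_sst("abc") == "cba"
```

Reversal is a useful test case because no one-way finite transducer computes it: the register mechanism, constrained by copylessness, is exactly what separates SSTs from weaker streaming models.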
The Dialectical Virtue of Ideological Reduction
Many would agree that there is something generally appealing and attractive about reduction. By this, I do not mean that reductive theories are accepted across the board, nor do I mean that they should be. All I mean is that there is something recognizably "good" about the reductive method that may be outweighed by other considerations. For instance, it is extremely rare for one to reject the reductionist position of a given domain while conceding that the proposed reductive procedure is successful. Typically, the opposition consists in denying that the subject matter can be reduced. This suggests an unspoken rule of the dialectic: if something can be reduced, then it should be reduced; or, all else being equal, a reductive theory is to be favored over a non-reductive theory. In a similar light, conventional metaphysical wisdom tells us that primitivist positions should only be accepted as a last resort. Again, there may be cases where the situation is dire enough for us to resort to primitivism. Nevertheless, few will oppose the general idea that we ought to reach for primitivism only when all other accounts fail. Let us call these attitudes "UNSPOKEN RULE" and "LAST RESORT", respectively.
In this dissertation, I offer an explanation of how these attitudes apply to ideological reduction by appealing to what I call dialectical virtues and vices. To this end, the first half of the dissertation is dedicated to clarifying what ideology is and what ideological reduction involves. In the second half, I introduce the notion of dialectical virtues and vices and argue for their epistemic relevance. Dialectical virtues are features of a theory that place its advocates at a comparative advantage in the context of a debate; dialectical vices are those that place them at a comparative disadvantage. I argue that there are certain dialectical virtues and vices associated with (ideological) reductionism and primitivism, and that the above attitudes are the results of our being sensitive to them. Furthermore, I put forth this explanation not as a merely psychological or sociological explanation of these attitudes. Instead, I argue that dialectical virtues and vices are epistemically relevant and that this has serious implications for our evaluation of theories.
Logic, Norms and Ontology. Recent Essays in Luso-Brazilian Analytic Philosophy
The present special issue of Disputatio brings together some of the best work recently done in Brazil and Portugal in the tradition of analytic philosophy (broadly conceived). Over the past ten years or
so we have witnessed an impressive growth of analytic philosophy in both countries, both in terms of the quantity and the quality of the philosophy produced. We hope that this volume captures, at least partly, the dynamics and strength of such development. The range of philosophical problems and topics covered by the contributed essays is vast, cutting across several philosophical disciplines. Indeed, one can find therein issues in philosophical logic, meta-philosophy, ethics,
aesthetics, philosophy of science, philosophy of language, philosophy of mathematics and metaphysics. Such variety of subject-matter is also a trait of recent Luso-Brazilian analytic philosophy.

Fundação para a Ciência e a Tecnologia
A defence of predicativism as a philosophy of mathematics
A specification of a mathematical object is impredicative if it essentially involves quantification over a domain which includes the object being specified (or sets which contain that object, or similar). The basic worry is that we have no non-circular way of
understanding such a specification. Predicativism is the view that mathematics should be limited to the study of objects which can be specified predicatively.
There are two parts to predicativism. One is the criticism of the impredicative aspects of classical mathematics. The other is the
positive project, begun by Weyl in Das Kontinuum (1918), to reconstruct as much as possible of classical mathematics on the basis of a predicatively acceptable set theory, which accepts only countably infinite objects. This is a revisionary project, and certain parts of mathematics will not be saved.
Chapter 2 contains an account of the historical background to the predicativist project. The rigorization of analysis led to Dedekind's and Cantor's theories of the real numbers, which relied on the new notion of arbitrary infinite sets; this became a central part of modern classical set theory. Criticism began with Kronecker; continued in the debate about the acceptability of Zermelo's Axiom of Choice; and was somewhat clarified by Poincaré and Russell. In the
light of this, chapter 3 examines the formulation of, and motivations behind the predicativist position.
Chapter 4 begins the critical task by detailing the epistemological problems with the classical account of the continuum. Explanations of classicism which appeal to second-order logic, set theory, and
primitive intuition are examined and are found wanting.
Chapter 5 aims to dispel the worry that predicativism might collapse into mathematical intuitionism. I assess some of the arguments for intuitionism, especially the Dummettian argument from indefinite
extensibility. I argue that the natural numbers are not indefinitely extensible, and that, although the continuum is, we can nonetheless make some sense of classical quantification over it. We need not reject the Law of Excluded Middle.
Chapter 6 begins the positive work by outlining a predicatively acceptable account of mathematical objects which justifies the Vicious Circle Principle. Chapter 7 explores the appropriate shape of formalized predicative mathematics, and the question of just how much mathematics is predicatively acceptable.
My conclusion is that all of the mathematics which we need can be predicativistically justified, and that such mathematics is
particularly transparent to reason. This calls into question one currently prevalent view of the nature of mathematics, on which
mathematics is justified by quasi-empirical means.

Supported by the Arts and Humanities Research Council [grant number 111315]
Asking and Answering
Questions are everywhere, and the ubiquitous activities of asking and answering are, like most human activities, susceptible to failure, at least from time to time. This volume offers several current approaches to the systematic study of questions and the surrounding activities and works toward supporting and improving these activities. The contributors formulate general problems for a formal treatment of questions, investigate specific kinds of questions, compare different frameworks with regard to how they regulate the activities of asking and answering of questions, and situate these activities in a wider framework of cognitive/epistemic discourse. From the perspectives of logic, linguistics, epistemology, and philosophy of language emerges a report on the state of the art of the theory of questions.
Semantic and Mathematical Foundations for Intuitionism
Thesis (Ph.D.) - Indiana University, Philosophy, 2013

My dissertation concerns the proper foundation for the intuitionistic mathematics whose development began with L.E.J. Brouwer's work in the first half of the 20th Century. It is taken for granted by most philosophers, logicians, and mathematicians interested in foundational questions that intuitionistic mathematics presupposes a special, proof-conditional theory of meaning for mathematical statements. I challenge this commonplace. Classical mathematics is very successful as a coherent body of theories and a tool for practical application. Given this success, a view like Dummett's that attributes a systematic unintelligibility to the statements of classical mathematicians fails to save the relevant phenomena. Furthermore, Dummett's program assumes that his proposed semantics for mathematical language validates all and only the logical truths of intuitionistic logic. In fact, it validates some intuitionistically invalid principles, and given the lack of intuitionistic completeness proofs, there is little reason to think that every intuitionistic logical truth is valid according to his semantics.
In light of the failure of Dummett's foundation for intuitionism, I propose and carry out a reexamination of Brouwer's own writings. Brouwer is frequently interpreted as a proto-Dummettian about his own mathematics. This is due to excessive emphasis on some of his more polemical writings and idiosyncratic philosophical views at the expense of his distinctively mathematical work. These polemical writings do not concern mathematical language, and their principal targets are Russell's and Hilbert's foundational programs, not the semantic principle of bivalence. The failures of these foundational programs have diminished the importance of Brouwer's philosophical writings, but his work on reconstructing mathematics itself from intuitionistic principles continues to be worth studying.
When one studies this work relieved of its philosophical burden, it becomes clear that an intuitionistic mathematician can make sense of her mathematical work and activity without relying on special philosophical or linguistic doctrines. Core intuitionistic results, especially the invalidity of the logical principle tertium non datur, can be demonstrated from basic mathematical principles; these principles, in turn, can be defended in ways akin to the basic axioms of other mathematical theories. I discuss three such principles: Brouwer's Continuity Principle, the Principle of Uniformity, and Constructive Church's Thesis
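The kind of demonstration alluded to in the final sentence can be sketched with a standard argument (our reconstruction, not a quotation from the dissertation): the weak continuity principle for numbers, WC-N, refutes a quantified instance of tertium non datur over choice sequences.

```latex
% WC-N: a witness for an existential over choice sequences is always
% determined by a finite initial segment of the sequence.
\[
\forall\alpha\,\exists n\,A(\alpha,n)
\;\to\;
\forall\alpha\,\exists m\,\exists n\,\forall\beta\,
\bigl(\overline{\beta}(m)=\overline{\alpha}(m)\to A(\beta,n)\bigr)
\]
% Suppose tertium non datur held for P(\alpha) :\equiv \forall k\,(\alpha(k)=0).
% Then \forall\alpha\,\exists n\,\bigl((n=0\wedge P(\alpha))\vee(n=1\wedge\neg P(\alpha))\bigr),
% and WC-N applied at the zero sequence yields an m such that every \beta
% agreeing with the zero sequence up to m satisfies P(\beta).  But the
% sequence that is 0 up to m and 1 afterwards agrees up to m and refutes
% P; hence this instance of \forall\alpha\,(P(\alpha)\vee\neg P(\alpha))
% is intuitionistically unacceptable.
```

This illustrates the dissertation's point that the invalidity of excluded middle follows from a mathematical principle about choice sequences, not from a semantic doctrine about meaning.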