    A Galois connection between classical and intuitionistic logics. I: Syntax

    In a 1985 commentary on his collected works, Kolmogorov remarked that his 1932 paper "was written in hope that with time, the logic of solution of problems [i.e., intuitionistic logic] will become a permanent part of a [standard] course of logic. A unified logical apparatus was intended to be created, which would deal with objects of two types - propositions and problems." We construct such a formal system QHC, which is a conservative extension of both the intuitionistic predicate calculus QH and the classical predicate calculus QC. The only new connectives ? and ! of QHC induce a Galois connection (i.e., a pair of adjoint functors) between the Lindenbaum posets (i.e., the underlying posets of the Lindenbaum algebras) of QH and QC. Kolmogorov's double negation translation of propositions into problems extends to a retraction of QHC onto QH, whereas Gödel's provability translation of problems into modal propositions extends to a retraction of QHC onto its QC+(?!) fragment, identified with the modal logic QS4. The QH+(!?) fragment is an intuitionistic modal logic whose modality !? is a strict lax modality in the sense of Aczel, and thus resembles the squash/bracket operation in intuitionistic type theories. The axioms of QHC attempt to give a fuller formalization (with respect to the axioms of intuitionistic logic) of the two best-known contentual interpretations of intuitionistic logic: Kolmogorov's problem interpretation (incorporating standard refinements by Heyting and Kreisel) and the proof interpretation by Orlov and Heyting (as clarified by Gödel). While these two interpretations are often conflated, from the viewpoint of the axioms of QHC neither of them reduces to the other, although they do overlap. (Comment: 47 pages. The paper is rewritten in terms of a formal meta-logic, a simplified version of Isabelle's meta-logic.)
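
    As a reading aid, the adjunction claimed in this abstract can be displayed schematically. The orientation below (with ! sending a proposition p to the problem of proving p, and ? sending a problem to the proposition that it has a solution) is a reconstruction from the abstract's QC+(?!) and QH+(!?) notation, not a quotation of the paper's axioms:

        % Galois connection between the Lindenbaum posets of QH and QC
        % (\alpha a problem of QH, p a proposition of QC):
        \vdash\; ?\alpha \rightarrow p
          \quad\Longleftrightarrow\quad
        \vdash\; \alpha \rightarrow\, !p
        % Unit and counit of the adjunction:
        \alpha \rightarrow\, !?\alpha \qquad\qquad ?!p \rightarrow p
        % Derived modalities: \Box p := ?!p behaves as the S4 box on the
        % classical side, while \nabla\alpha := !?\alpha is the lax modality
        % on the intuitionistic side, satisfying
        \alpha \rightarrow \nabla\alpha, \qquad
        \nabla\nabla\alpha \rightarrow \nabla\alpha, \qquad
        (\alpha \rightarrow \beta) \rightarrow (\nabla\alpha \rightarrow \nabla\beta)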

    The Significance of Evidence-based Reasoning for Mathematics, Mathematics Education, Philosophy and the Natural Sciences

    In this multi-disciplinary investigation we show how an evidence-based perspective of quantification---in terms of algorithmic verifiability and algorithmic computability---admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA---over the structure N of the natural numbers---that are complementary, not contradictory. The first yields the weak, standard, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis: Hilbert's epsilon-calculus; Gödel's omega-consistency; the Law of the Excluded Middle; Hilbert's omega-Rule; an algorithmic omega-Rule; Gentzen's Rule of Infinite Induction; Rosser's Rule C; Markov's Principle; the Church-Turing Thesis; Aristotle's particularisation; Wittgenstein's perspective of constructive mathematics; and an evidence-based perspective of quantification. By showing how these are formally interrelated, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus, and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences for mathematics, mathematics education, philosophy, and the natural sciences of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
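
    The distinction driving the two interpretations admits a compact schematic rendering; the formulations below follow the abstract's terminology as we read it, as a sketch rather than the paper's official definitions:

        % F(x) is algorithmically verifiable over N if every finite initial
        % segment of its instances is decided by some algorithm:
        \forall n\; \exists \mathrm{AL}_n:\ \mathrm{AL}_n
          \text{ decides each of } F(1), \dots, F(n)
        % F(x) is algorithmically computable over N if a single algorithm
        % decides every instance:
        \exists \mathrm{AL}\; \forall n:\ \mathrm{AL} \text{ decides } F(n)
        % Swapping the quantifiers shows every algorithmically computable
        % formula is algorithmically verifiable, but not conversely; the
        % weak and strong interpretations of PA diverge along this gap.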

    Topics in Programming Languages, a Philosophical Analysis through the case of Prolog

    Programming languages seldom find proper anchorage in the philosophy of logic, language, and science. What is more, the philosophy of language seems to be restricted to natural languages and linguistics, and even the philosophy of logic is rarely framed in terms of programming-language topics. The logic programming paradigm and Prolog are thus the most adequate paradigm and programming language with which to work on this subject, combining natural language processing and linguistics, logic programming, and construction methodology for both algorithms and procedures, all within an overall declarative, philosophizing outlook. Not only this, but the dimension of the Fifth Generation Computer Systems project related to strong AI, in which Prolog took a major role, and its historical frame within the crucial dialectic between procedural and declarative paradigms and between structuralist and empiricist biases, serve, in exemplary form, to treat the philosophy of logic, language, and science in the contemporary age as well. In recounting Prolog's philosophical, mechanical, and algorithmic harbingers, the opportunity opens onto various routes, of which we exemplify some here: the mechanical-computational background explored by Pascal, Leibniz, Boole, Jacquard, Babbage, and Konrad Zuse, up to the ACE (Alan Turing) and the EDVAC (von Neumann), which offers the backbone of computer architecture; and, in parallel, the work of Turing, Church, Gödel, Kleene, von Neumann, Shannon, and others on computability, thoroughly studied in detail, which permits us to interpret the evolving realm of programming languages. The line from the lambda calculus to the Algol family, the declarative/procedural split between Prolog and the C language, and the ensuing branching, explosion, and further delimitation of programming languages are thereupon inspected so as to relate them to the proper syntax, semantics, and philosophical élan of logic programming and Prolog.

    Frameworks, models, and case studies

    This thesis focuses on models of conceptual change in science and philosophy. In particular, I developed a new bootstrapping methodology for studying conceptual change, centered around the formalization of several popular models of conceptual change and the collective assessment of their improved formal versions via nine evaluative dimensions. Among the models of conceptual change treated in the thesis are Carnap's explication, Lakatos' concept-stretching, Toulmin's conceptual populations, Waismann's open texture, Mark Wilson's patches and facades, Sneed's structuralism, and Paul Thagard's conceptual revolutions. In order to analyze and compare the conceptions of conceptual change provided by these different models, I rely on several historical reconstructions of episodes of scientific conceptual change. The historical episodes of scientific change that figure in this work include the emergence of the morphological concept of fish in biological taxonomies, the development of scientific conceptions of temperature, the Church-Turing thesis and related axiomatizations of effective calculability, the history of the concept of polyhedron in 17th and 18th century mathematics, Hamilton's invention of the quaternions, the history of the pre-abstract group concepts in 18th and 19th century mathematics, the expansion of Newtonian mechanics to the phenomena of viscous-fluid forces, and the chemical revolution. I will also present five different formal and informal improvements of four specific models of conceptual change. I will first present two different improvements of Carnapian explication, a formal and an informal one. My informal improvement of Carnapian explication will consist of a more fine-grained version of the procedure that adds an intermediate, third step to the two steps of Carnapian explication. I will show how this novel three-step version of explication is better suited than its traditional two-step relative for handling complex cases of explication. My second, formal improvement of Carnapian explication will be a full explication of the concept of explication itself within the theory of conceptual spaces. By virtue of this formal improvement, the whole procedure of explication, together with its application procedures and its pragmatic desiderata, will be reconceptualized as a precise procedure involving topological and geometrical constraints inside the theory of conceptual spaces. My third improved model of conceptual change will consist of a formal explication of Darwinian models of conceptual change that makes extensive use of Godfrey-Smith's population-based Darwinism for targeting mathematical conceptual change in particular. My fourth improvement will be dedicated instead to Wilson's indeterminate model of conceptual change. I will show how Wilson's very informal framework can be explicated within a modified version of the structuralist model-theoretic reconstructions of scientific theories. Finally, the fifth improved model of conceptual change will be a belief-revision-like logical framework that reconstructs Thagard's model of conceptual revolution as specific revision and contraction operations that work on conceptual structures. At the end of this work, a general conception of conceptual change in science and philosophy emerges, thanks to the combined action of the three layers of my methodology. This conception takes conceptual change to be a multi-faceted phenomenon centered around the dynamics of groups of concepts.
    According to this conception, concepts are best reconstructed as plastic and inter-subjective entities equipped with a non-trivial internal structure and subject to a certain degree of localized holism. Furthermore, conceptual dynamics can be judged from a weakly normative perspective, bound to be dependent on shared values and goals. Conceptual change is then best understood, according to this conception, as a ubiquitous phenomenon underlying all of our intellectual activities, from science to ordinary linguistic practices. As such, conceptual change does not pose any particular problem to value-laden notions of scientific progress, objectivity, and realism. At the same time, this conception prompts all our concept-driven intellectual activities, including philosophical and metaphilosophical reflections, to take into serious consideration the phenomenon of conceptual change. An important consequence of this conception, and of the analysis that generated it, is in fact that an adequate understanding of the dynamics of philosophical concepts is a prerequisite for analytic philosophy to develop a realistic and non-idealized depiction of itself and its activities.
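
    For the belief-revision-like reconstruction of Thagard's model mentioned above, the standard AGM-style backdrop can be stated compactly; pairing it with conceptual structures is the thesis's own contribution, and the display below is only the textbook core such frameworks build on:

        % Expansion of a belief set K by a sentence \varphi:
        K + \varphi = \mathrm{Cn}(K \cup \{\varphi\})
        % Revision via the Levi identity: contract by the negation,
        % then expand by the new belief:
        K * \varphi = (K \div \lnot\varphi) + \varphi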

    Categorical Ontology of Complex Systems, Meta-Systems and Theory of Levels: The Emergence of Life, Human Consciousness and Society

    Single cell interactomics in simpler organisms, as well as somatic cell interactomics in multicellular organisms, involve biomolecular interactions in complex signalling pathways that were recently represented in modular terms by quantum automata with ‘reversible behavior’ representing normal cell cycling and division. Other implications of such quantum automata, modular modeling of signaling pathways and cell differentiation during development are in the fields of neural plasticity and brain development leading to quantum-weave dynamic patterns and specific molecular processes underlying extensive memory, learning, anticipation mechanisms and the emergence of human consciousness during the early brain development in children. Cell interactomics is here represented for the first time as a mixture of ‘classical’ states that determine molecular dynamics subject to Boltzmann statistics and ‘steady-state’, metabolic (multi-stable) manifolds, together with ‘configuration’ spaces of metastable quantum states emerging from complex quantum dynamics of interacting networks of biomolecules, such as proteins and nucleic acids, that are now collectively defined as quantum interactomics. On the other hand, the time dependent evolution over several generations of cancer cells --that are generally known to undergo frequent and extensive genetic mutations and, indeed, suffer genomic transformations at the chromosome level (such as extensive chromosomal aberrations found in many colon cancers)-- cannot be correctly represented in the ‘standard’ terms of quantum automaton modules, as the normal somatic cells can. This significant difference at the cancer cell genomic level is therefore reflected in major changes in cancer cell interactomics often from one cancer cell ‘cycle’ to the next, and thus it requires substantial changes in the modeling strategies, mathematical tools and experimental designs aimed at understanding cancer mechanisms. Novel solutions to this important problem in carcinogenesis are proposed and experimental validation procedures are suggested. From a medical research and clinical standpoint, this approach has important consequences for addressing and preventing the development of cancer resistance to medical therapy in ongoing clinical trials involving stage III cancer patients, as well as improving the designs of future clinical trials for cancer treatments.
    KEYWORDS: Emergence of Life and Human Consciousness; Proteomics; Artificial Intelligence; Complex Systems Dynamics; Quantum Automata models and Quantum Interactomics; quantum-weave dynamic patterns underlying human consciousness; specific molecular processes underlying extensive memory, learning, anticipation mechanisms and human consciousness; emergence of human consciousness during the early brain development in children; Cancer cell ‘cycling’; interacting networks of proteins and nucleic acids; genetic mutations and chromosomal aberrations in cancers, such as colon cancer; development of cancer resistance to therapy; ongoing clinical trials involving stage III cancer patients; possible improvements of the designs for future clinical trials and cancer treatments.

    Mathematical Logic: Proof Theory, Constructive Mathematics (hybrid meeting)

    The workshop "Mathematical Logic: Proof Theory, Constructive Mathematics" focused both on proofs as formal derivations in deductive systems and on the extraction of explicit computational content from given proofs in core areas of ordinary mathematics using proof-theoretic methods. The workshop contributed to the following research strands: interactions between foundations and applications; proof mining; constructivity in classical logic; modal logic and provability logic; proof theory and theoretical computer science; structural proof theory.
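
    "Proof mining" here refers to the extraction of effective data from prima facie non-effective proofs. Schematically, and only for suitable formal systems and formula classes, the underlying metatheorems take the following shape (a loose sketch, not a result stated in the workshop report):

        % From a proof, in a suitable system, of a statement
        \forall x\, \exists y\, A_{\mathrm{qf}}(x, y)
        % with A_{qf} quantifier-free, proof-theoretic methods such as
        % functional interpretation extract a computable functional t with
        \forall x\, A_{\mathrm{qf}}(x, t(x))
        % together, in applications, with explicit bounds on t.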

    The Lean mathematical library

    This paper describes mathlib, a community-driven effort to build a unified library of mathematics formalized in the Lean proof assistant. Among proof assistant libraries, it is distinguished by its dependently typed foundations, focus on classical mathematics, extensive hierarchy of structures, use of large- and small-scale automation, and distributed organization. We explain the architecture and design decisions of the library and the social organization that has led us here.
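
    To make the "hierarchy of structures" concrete, here is a toy sketch in Lean 4 syntax of the extends-based class style such hierarchies use; Semigroup' and Monoid' are hypothetical names for illustration, not mathlib's actual declarations:

        -- A tiny, self-contained hierarchy sketch; not mathlib code.
        class Semigroup' (α : Type) where
          mul : α → α → α
          mul_assoc : ∀ a b c : α, mul (mul a b) c = mul a (mul b c)

        -- Extending a class inherits its fields, and instances flow through
        -- the generated coercion to Semigroup'; this is how large
        -- hierarchies stay navigable.
        class Monoid' (α : Type) extends Semigroup' α where
          one : α
          one_mul : ∀ a : α, mul one a = a

        -- A toy instance on Nat, reusing core lemmas.
        instance : Semigroup' Nat where
          mul := Nat.mul
          mul_assoc := Nat.mul_assoc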

    Traditional theory of semantic information without the scandal of deduction: a moderately externalist reassessment of the topic based on urn semantics, and a paraconsistent application

    Advisor: Walter Alexandre Carnielli. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Filosofia e Ciências Humanas. This thesis shows that it is possible to reestablish the traditional theory of semantic information (TSI, originally proposed by Bar-Hillel and Carnap (1952, 1953)) by providing an adequate account of the epistemological conditions of our semantic competence. A classical consequence of TSI is the so-called scandal of deduction (hereafter SoD), according to which logical truths have a null amount of information. SoD is problematic since it does not make room for the ampliative character of formal knowledge. Based on this, recent work on the subject (e.g., Floridi (2004)) rejects TSI despite its good insights into the nature of semantic information. This work instead reconsiders the strategy of taking urn semantics (Rantala, 1979) as a privileged metatheoretic framework for the formalization of TSI without SoD. The thesis is planned in the following way. Chapter 1 introduces the overall plan. In chapter 2, relying heavily on classical works on semantic externalism, I present philosophical support for this strategy by showing that urn semantics correctly characterizes the epistemological conditions of our semantic competence in the use of quantifiers. Chapter 3 offers a precise description of urn semantics, presenting its basic definitions and some of its most fundamental theorems. In chapter 4, turning the focus once again to semantic information, I formalize TSI in urn semantics and show that in this context SoD does not hold. Finally, in chapters 5 and 6 I consider more advanced model-theoretic results on urn semantics and explore a possible paraconsistent application of the present ideas, respectively.
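
    For orientation, the classical TSI measures and the "scandal" they generate can be stated compactly; this is the standard Bar-Hillel and Carnap setup, with the urn-semantics escape route paraphrased from the abstract rather than reproduced from the thesis:

        % Content and information measures over a logical probability m:
        \mathrm{cont}(p) = 1 - m(p), \qquad
        \mathrm{inf}(p) = \log_2 \frac{1}{m(p)}
        % The scandal of deduction: a classical tautology \top has
        % m(\top) = 1, hence
        \mathrm{cont}(\top) = 0, \qquad \mathrm{inf}(\top) = 0
        % In urn semantics, some classically valid sentences fail in
        % non-invariant urn models, so m need not assign them value 1,
        % and deductive truths can carry positive information.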