
    Descriptive Complexity, Computational Tractability, and the Logical and Cognitive Foundations of Mathematics

    In computational complexity theory, decision problems are divided into complexity classes based on the amount of computational resources it takes for algorithms to solve them. In theoretical computer science, it is commonly accepted that only functions for solving problems in the complexity class P, solvable by a deterministic Turing machine in polynomial time, are considered to be tractable. In cognitive science and philosophy, this tractability result has been used to argue that only functions in P can feasibly work as computational models of human cognitive capacities. One interesting area of computational complexity theory is descriptive complexity, which connects the expressive strength of systems of logic with the computational complexity classes. In descriptive complexity theory, it is established that only first-order (classical) systems are connected to P, or one of its subclasses. Consequently, second-order systems of logic are considered to be computationally intractable, and may therefore seem to be unfit to model human cognitive capacities. This would be problematic when we think of the role of logic as the foundations of mathematics. In order to express many important mathematical concepts and systematically prove theorems involving them, we need to have a system of logic stronger than classical first-order logic. But if such a system is considered to be intractable, it means that the logical foundation of mathematics can be prohibitively complex for human cognition. In this paper I will argue, however, that this problem is the result of an unjustified direct use of computational complexity classes in cognitive modelling. Placing my account in the recent literature on the topic, I argue that the problem can be solved by considering computational complexity for humanly relevant problem solving algorithms and input sizes. Peer reviewed.

    Synthetic Philosophy of Mathematics and Natural Sciences: Conceptual analyses from a Grothendieckian Perspective

    ISBN-13: 978-0692593974. Giuseppe Longo. Synthetic Philosophy of Mathematics and Natural Sciences: Conceptual analyses from a Grothendieckian Perspective. Reflections on “Synthetic Philosophy of Contemporary Mathematics” by F. Zalamea, Urbanomic (UK) and Sequence Press (USA), 2012. Invited Paper, in Speculations: Journal of Speculative Realism, published 12/12/2015, followed by an answer by F. Zalamea. Zalamea’s book is as original as it is belated. It is indeed surprising, if we give it a moment’s thought, just how greatly behind schedule philosophical reflection on contemporary mathematics lags, especially considering the momentous changes that took place in the second half of the twentieth century. Zalamea compares this situation with that of the philosophy of physics: he mentions D’Espagnat’s work on quantum mechanics, but we could add several others who, in the last few decades, have elaborated an extremely timely philosophy of contemporary physics (see for example Bitbol 2000; Bitbol et al. 2009). As was the case in biology, philosophy – since Kant’s crucial observations in the Critique of Judgment, at least – has often “run ahead” of the life sciences, exploring and opening up a space for reflections that are not derived from or integrated with its contemporary scientific practice. Some of these reflections are still very much relevant today. And indeed, some philosophers today are saying something truly new about biology.

    Asymptotic elimination of partially continuous aggregation functions in directed graphical models

    In Statistical Relational Artificial Intelligence, a branch of AI and machine learning which combines the logical and statistical schools of AI, one uses the concept of a {\em parametrized probabilistic graphical model (PPGM)} to model (conditional) dependencies between random variables and to make probabilistic inferences about events on a space of ``possible worlds''. The set of possible worlds with underlying domain $D$ (a set of objects) can be represented by the set $\mathbf{W}_D$ of all first-order structures (for a suitable signature) with domain $D$. Using a formal logic we can describe events on $\mathbf{W}_D$. By combining a logic and a PPGM we can also define a probability distribution $\mathbb{P}_D$ on $\mathbf{W}_D$ and use it to compute the probability of an event. We consider a logic, denoted $PLA$, with truth values in the unit interval, which uses aggregation functions, such as arithmetic mean, geometric mean, maximum and minimum, instead of quantifiers. However, we face the problem of computational efficiency, and this problem is an obstacle to the wider use of methods from Statistical Relational AI in practical applications. We address this problem by proving that the described probability will, under certain assumptions on the PPGM and the sentence $\varphi$, converge as the size of $D$ tends to infinity. The convergence result is obtained by showing that every formula $\varphi(x_1, \ldots, x_k)$ which contains only ``admissible'' aggregation functions (e.g. arithmetic and geometric mean, max and min) is asymptotically equivalent to a formula $\psi(x_1, \ldots, x_k)$ without aggregation functions.
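A minimal sketch in Python (hypothetical names, not the paper's code) of the idea that aggregation functions over the domain can stand in for quantifiers when truth values live in the unit interval: minimum generalizes universal quantification, maximum generalizes existential quantification, and the two means give intermediate aggregates.

```python
# Toy [0, 1]-valued semantics with aggregation functions instead of
# quantifiers, in the spirit of the logic PLA described above.
# All names here are illustrative assumptions.

from statistics import mean, geometric_mean

def forall_like(values):
    # Classical "for all" corresponds to the minimum truth value.
    return min(values)

def exists_like(values):
    # Classical "exists" corresponds to the maximum truth value.
    return max(values)

# A toy "possible world" over domain D = {0, 1, 2, 3}: a unary random
# variable R assigning each object a truth value in the unit interval.
D = [0, 1, 2, 3]
R = {0: 1.0, 1: 0.5, 2: 0.8, 3: 0.9}
values = [R[d] for d in D]

# The four admissible aggregation functions named in the abstract:
print(forall_like(values))      # min  -> 0.5
print(exists_like(values))      # max  -> 1.0
print(mean(values))             # arithmetic mean -> 0.8
print(geometric_mean(values))   # geometric mean, roughly 0.775
```

Evaluating a formula then amounts to aggregating the truth values of its instances over the domain, rather than checking a Boolean condition for all or some elements.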

    Topics in Programming Languages, a Philosophical Analysis through the case of Prolog

    Programming languages seldom find proper anchorage in philosophy of logic, language and science. What is more, philosophy of language seems to be restricted to natural languages and linguistics, and even philosophy of logic is rarely framed in programming languages topics. The logic programming paradigm and Prolog are, thus, the most adequate paradigm and programming language to work on this subject, combining natural language processing and linguistics, logic programming and construction methodology on both algorithms and procedures, on an overall philosophizing declarative status. Not only this, but the dimension of the Fifth Generation Computer system related to strong AI, wherein Prolog took a major role, and its historical frame in the very crucial dialectic between procedural and declarative paradigms, structuralist and empiricist biases, serves, in exemplar form, to treat straight ahead philosophy of logic, language and science in the contemporaneous age as well. In recounting Prolog's philosophical, mechanical and algorithmic harbingers, the opportunity is open to various routes. We herein shall exemplify some: the mechanical-computational background explored by Pascal, Leibniz, Boole, Jacquard, Babbage and Konrad Zuse, until reaching the ACE (Alan Turing) and EDVAC (von Neumann), offering the backbone in computer architecture, and the work of Turing, Church, Gödel, Kleene, von Neumann, Shannon, and others on computability, in parallel lines, thoroughly studied in detail, permit us to interpret ahead the evolving realm of programming languages. The proper line from lambda-calculus, to the Algol family, the declarative and procedural split with the C language and Prolog, and the ensuing branching and programming languages explosion and further delimitation, are thereupon inspected so as to relate them with the proper syntax, semantics and philosophical élan of logic programming and Prolog.

    Reason, causation and compatibility with the phenomena

    'Reason, Causation and Compatibility with the Phenomena' strives to give answers to the philosophical problem of the interplay between realism, explanation and experience. This book is a compilation of essays that recollect significant conceptions of rival terms such as determinism and freedom, reason and appearance, power and knowledge. This title discusses the progress made in epistemology and natural philosophy, especially the steps that led from the ancient theory of atomism to the modern quantum theory, and from mathematization to analytic philosophy. Moreover, it provides possible gateways from modern deadlocks of theory either through approaches to consciousness or through historical critique of intellectual authorities. This work will be of interest to those researching or studying in colleges and universities, especially in the departments of philosophy, history of science, philosophy of science, philosophy of physics and quantum mechanics, and history of ideas and culture. Greek and Latin Literature students and instructors may also find this book to be both a fascinating and valuable point of reference.

    Zero-one laws with respect to models of provability logic and two Grzegorczyk logics

    It was shown in the late 1960s that each formula of first-order logic without constants and function symbols obeys a zero-one law: as the number of elements of finite models increases, every formula holds either in almost all or in almost no models of that size. Therefore, many properties of models, such as having an even number of elements, cannot be expressed in the language of first-order logic. Halpern and Kapron proved zero-one laws for classes of models corresponding to the modal logics K, T, S4, and S5 and for frames corresponding to S4 and S5. In this paper, we prove zero-one laws for provability logic and its two siblings, Grzegorczyk logic and weak Grzegorczyk logic, with respect to model validity. Moreover, we axiomatize validity in almost all relevant finite models, leading to three different axiom systems.
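The zero-one behaviour described above can be illustrated empirically (a toy sketch, not from the paper; all names are illustrative): for the first-order sentence "every element is related to some element" over a random binary relation in which each pair is included independently with probability 1/2, the fraction of finite models satisfying the sentence tends to 1 as the domain grows.

```python
# Empirical illustration of a zero-one law for a first-order sentence
# without constants or function symbols: forall x exists y. E(x, y).

import random

def random_model(n, seed=None):
    # Binary relation E on {0, ..., n-1}: each ordered pair is present
    # independently with probability 1/2.
    rng = random.Random(seed)
    return {(a, b) for a in range(n) for b in range(n) if rng.random() < 0.5}

def satisfies(n, E):
    # Evaluate: forall x exists y. E(x, y)
    return all(any((x, y) in E for y in range(n)) for x in range(n))

def fraction(n, trials=200, seed=0):
    # Fraction of sampled models of size n that satisfy the sentence.
    rng = random.Random(seed)
    hits = sum(satisfies(n, random_model(n, rng.random())) for _ in range(trials))
    return hits / trials

# As n grows, the fraction approaches 1 (for small n it is noticeably
# below 1, since a single element with no outgoing edge falsifies it):
for n in (2, 5, 10):
    print(n, fraction(n))
```

By contrast, a property such as "the model has an even number of elements" has no limiting frequency along all sizes, which is one way to see that it is not first-order expressible.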

    The Civilization at a Crossroads: Constructing the Paradigm Shift

    The book addresses the broad issue of the sustainability of our civilization and seeks to contribute to the ongoing discussion of what many see as its systemic crisis. There is a broad agreement that new creative ideas, initiatives, and solutions are essential for dealing with the current problems. However, despite this recognition, we still know very little about the process of creation and how it works. As a result, our civilization fails to harness the enormous creative potential of humanity. This failure, the book argues, is the main source of our current problems—languishing economy, deteriorating environment, continued violence, the deficit of democracy, and the lack of new fundamental breakthroughs in science. It examines some of these problems and demonstrates the connection between them and our failure to embrace the process of creation. The book offers a perspective that sheds light on the process of creation. It pays special attention to the theoretical contributions of Jean Piaget and the ongoing discussions of knowledge production that help us understand better how the process of creation works. The central argument of the book is that in order to solve our current problems and ensure the sustainability of our civilization well into the future, we must embrace the process of creation and make it the central organizing principle of our social practice. Finally, the book provides an outline of the principal changes that the adoption of the new social practice organized around the process of creation will involve.