Some attempts at a direct reduction of the infinite to the (large) finite
I survey several attempts to achieve a direct reduction of the usual notion of countable infinity to some reasonable notion of finiteness, in terms of nonstandard arithmetic, feasibility, pseudo-models of derivations, Ehrenfeucht star-models, and the like. I maintain that although these attempts have yielded many interesting results, they ultimately show that (at least by the means considered here) no satisfactory reduction is possible
The Significance of Evidence-based Reasoning for Mathematics, Mathematics Education, Philosophy and the Natural Sciences
In this multi-disciplinary investigation we show how an evidence-based perspective of quantification---in terms of algorithmic verifiability and algorithmic computability---admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA---over the structure N of the natural numbers---that are complementary, not contradictory. The first yields the weak, standard, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis: * Hilbert's epsilon-calculus * Goedel's omega-consistency * The Law of the Excluded Middle * Hilbert's omega-Rule * An Algorithmic omega-Rule * Gentzen's Rule of Infinite Induction * Rosser's Rule C * Markov's Principle * The Church-Turing Thesis * Aristotle's particularisation * Wittgenstein's perspective of constructive mathematics * An evidence-based perspective of quantification. By showing how these are formally interrelated, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus, and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences for mathematics, mathematics education, philosophy, and the natural sciences of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines
Representation and Reality by Language: How to make a home quantum computer?
A set theory model of reality, representation and language based on the relation of completeness and incompleteness is explored. The problem of completeness of mathematics is linked to its counterpart in quantum mechanics. That model includes two Peano arithmetics or Turing machines independent of each other. The complex Hilbert space underlying quantum mechanics as the base of its mathematical formalism is interpreted as a generalization of Peano arithmetic: It is a doubled infinite set of doubled Peano arithmetics having a remarkable symmetry to the axiom of choice. The quantity of information is interpreted as the number of elementary choices (bits). Quantum information is seen as the generalization of information to infinite sets or series. The equivalence of that model to a quantum computer is demonstrated. The condition for the Turing machines to be independent of each other is reduced to the state of Nash equilibrium between them. Two relative models of language as game in the sense of game theory and as ontology of metaphors (all mappings, which are not one-to-one, i.e. not representations of reality in a formal sense) are deduced
Naturalizing institutions: Evolutionary principles and application on the case of money
In recent extensions of the Darwinian paradigm into economics, the replicator-interactor duality looms large. I propose a strictly naturalistic approach to this duality in the context of the theory of institutions, which means that its use is seen as being always and necessarily dependent on identifying a physical realization. I introduce a general framework for the analysis of institutions, which synthesizes Searle's and Aoki's theories, especially with regard to the role of public representations (signs) in the coordination of actions, and the function of cognitive processes that underlie rule-following as a behavioral disposition. This allows us to conceive of institutions as causal circuits that connect the population-level dynamics of interactions with cognitive phenomena on the individual level. Those cognitive phenomena are ultimately rooted in neuronal structures. So, drawing on a critical restatement of the concept of the meme by Aunger, I propose a new conceptualization of the replicator in the context of institutions: the replicator is a causal conjunction between signs and neuronal structures which undergirds the dispositions that generate rule-following actions. Signs, in turn, are outcomes of population-level interactions. I apply this framework to the case of money, analyzing the emotions that go along with the use of money, and presenting a stylized account of the emergence of money in terms of the naturalized Searle-Aoki model. In this view, money is a neuronally anchored metaphor for emotions relating to social exchange and reciprocity. Money as a meme is physically realized in a replicator which is a causal conjunction of money artefacts and money emotions. Keywords: Generalized Darwinism, institutions, replicator/interactor, Searle, Aoki, naturalism, memes, emotions, money
Tense Logic and Ontology of Time
This work aims to make tense logic a more robust tool for ontologists, philosophers, knowledge engineers, and programmers by outlining a fusion of tense logic and ontology of time. In order to make tense logic easier to understand, the central formal primitives of standard tense logic are derived as theorems from an informal and intuitive ontology of time. In order to make the formulation of temporal propositions easier, temporal operators introduced by Georg Henrik von Wright are developed and mapped to the ontology of time
Parikh and Wittgenstein
A survey of Parikh's philosophical appropriations of Wittgensteinian themes, placed into historical context against the backdrop of Turing's famous paper, "On computable numbers, with an application to the Entscheidungsproblem" (Turing in Proc Lond Math Soc 2(42): 230–265, 1936/1937), and its connections with Wittgenstein and the foundations of mathematics. Characterizing Parikh's contributions to the interaction between logic and philosophy at its foundations, we argue that his work gives the lie to recent presentations of Wittgenstein's so-called metaphilosophy (e.g., Horwich in Wittgenstein's metaphilosophy. Oxford University Press, Oxford, 2012) as a kind of "dead end" quietism. From early work on the idea of feasibility in arithmetic (Parikh in J Symb Log 36(3):494–508, 1971) and vagueness (Parikh in Logic, language and method. Reidel, Boston, pp 241–261, 1983) to his more recent program in social software (Parikh in Advances in modal logic, vol 2. CSLI Publications, Stanford, pp 381–400, 2001a), Parikh's work encompasses and touches upon many foundational issues in epistemology, philosophy of logic, philosophy of language, and value theory. But it expresses a unified philosophical point of view. In his most recent work, questions about public and private languages, opportunity spaces, strategic voting, non-monotonic inference, and knowledge in literature provide a remarkable series of suggestions about how to present issues of fundamental importance in theoretical computer science as serious philosophical issues