6 research outputs found

    The Significance of Evidence-based Reasoning for Mathematics, Mathematics Education, Philosophy and the Natural Sciences

    In this multi-disciplinary investigation we show how an evidence-based perspective of quantification---in terms of algorithmic verifiability and algorithmic computability---admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA---over the structure N of the natural numbers---that are complementary, not contradictory. The first yields the weak, standard, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary, interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis:
    * Hilbert's epsilon-calculus
    * Goedel's omega-consistency
    * The Law of the Excluded Middle
    * Hilbert's omega-Rule
    * An Algorithmic omega-Rule
    * Gentzen's Rule of Infinite Induction
    * Rosser's Rule C
    * Markov's Principle
    * The Church-Turing Thesis
    * Aristotle's particularisation
    * Wittgenstein's perspective of constructive mathematics
    * An evidence-based perspective of quantification.
    By showing how these are formally interrelated, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus, and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences, for mathematics, mathematics education, philosophy, and the natural sciences, of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
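The distinction the abstract draws between algorithmically verifiable and algorithmically computable truth can be glossed informally in code. The following Python sketch is our illustration, not the paper's formalism; the predicate `p` is an arbitrary decidable example chosen for concreteness.

```python
# Informal gloss (not the paper's formalism): for a decidable property
# P(n), "algorithmically verifiable" asks only that each individual
# instance P(n) can be checked by some computation, while
# "algorithmically computable" asks for a single uniform algorithm
# that settles every instance by the same finitary procedure.

def p(n: int) -> bool:
    """A decidable arithmetical property; here simply n**2 >= n."""
    return n * n >= n

# Instance-wise verification: check P(0), P(1), ... one case at a time.
assert all(p(n) for n in range(1000))

# Uniform computation: the same procedure answers any instance directly.
# In general no such uniform decider need exist even when every
# individual instance is checkable.
assert p(10**6)
```

The point of the sketch is the quantifier: instance-wise checking justifies each `p(n)` separately, whereas a uniform decider justifies the universally quantified claim in one finitary step.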

    Visual perception: an information-based approach to understanding biological and artificial vision

    The central issues of this dissertation are (a) what should we be doing — what problems should we be trying to solve — in order to build computer vision systems, and (b) what relevance biological vision has to the solution of these problems. The approach taken to tackle these issues centres mostly on the clarification and use of information-based ideas, and an investigation into the nature of the processes underlying perception. The primary objective is to demonstrate that information theory and extensions of it, together with measurement theory, are powerful tools in helping to find solutions to these problems. The quantitative meaning of information is examined, from its origins in physical theories, through Shannon information theory, Gabor representations and codes, towards semantic interpretations of the term. The application of information theory to the understanding of the developmental and functional properties of biological visual systems is also discussed. This includes a review of the current state of knowledge of the architecture and function of the early visual pathways, particularly the retina, and a discussion of the possible coding functions of cortical neurons. The nature of perception is discussed from a number of points of view: the types and function of explanation of perceptual systems and how these relate to the operation of the system; the role of the observer in describing perceptual functions in other systems or organisms; the status and role of objectivist and representational viewpoints in understanding vision; the philosophical basis of perception; the relationship between pattern recognition and perception; and the interpretation of perception in terms of a theory of measurement. These two threads of research, information theory and measurement theory, are brought together in an overview and reinterpretation of the cortical role in mammalian vision.
Finally, the application of some of the coding and recognition concepts to industrial inspection problems is described. The coding processes used are unusual in that coded images are used as the input to a simple neural network classifier, rather than a heuristic feature set. The relationship between the Karhunen-Loève transform and the singular value decomposition is clarified as background to the coding technique used to code the images. This coding technique has also been used to code long sequences of moving images, to investigate the possibility of recognising people on the basis of their gait or posture, and this application is briefly described.
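The Shannon-theoretic thread running through the abstract rests on one quantity: the entropy of an empirical distribution, which bounds how many bits a code needs per symbol. This standalone Python sketch (not taken from the dissertation; the "image" data is illustrative) computes it for two extreme cases.

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Entropy in bits of the empirical distribution of `values`."""
    counts = Counter(values)
    total = sum(counts.values())
    # Sum of p * log2(1/p) over the observed symbols.
    return sum((c / total) * math.log2(total / c) for c in counts.values())

# A uniform 8-level "image" carries log2(8) = 3 bits per pixel,
flat = [level for level in range(8) for _ in range(100)]
# while a constant image carries no information at all.
const = [0] * 800

print(shannon_entropy(flat))   # 3.0
print(shannon_entropy(const))  # 0.0
```

Efficient codes for natural images sit between these extremes, which is why redundancy-reduction accounts of early vision lean on this measure.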
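The KLT/SVD relationship the abstract says it clarifies can be sketched numerically: the Karhunen-Loève basis (eigenvectors of the sample covariance) coincides with the right singular vectors of the centred data matrix. This NumPy sketch uses synthetic stand-in data; the dimensions and mixing matrix are illustrative, not the dissertation's image patches.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for image patch data: 200 samples, 16 dimensions,
# with correlations induced by a random mixing matrix.
X = rng.normal(size=(200, 16)) @ rng.normal(size=(16, 16))
Xc = X - X.mean(axis=0)  # the KLT is defined on centred data

# Karhunen-Loeve route: eigen-decompose the sample covariance matrix.
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]  # sort by descending variance
eigvals, klt_basis = eigvals[order], eigvecs[:, order]

# SVD route: the right singular vectors of the centred data matrix give
# the same basis, and the squared singular values recover the covariance
# eigenvalues up to the 1/(n-1) normalisation.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

print(np.allclose(s**2 / (len(Xc) - 1), eigvals))  # True
# The bases agree up to the arbitrary sign of each column.
print(np.allclose(np.abs(Vt.T), np.abs(klt_basis), atol=1e-6))
```

Computing the basis via the SVD of the data matrix avoids forming the covariance explicitly, which matters for the high-dimensional image vectors the coding technique operates on.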

    Data bases and data base systems related to NASA's aerospace program. A bibliography with indexes

    This bibliography lists 1778 reports, articles, and other documents introduced into the NASA scientific and technical information system from 1975 through 1980.