
    Speakable in Quantum Mechanics

    At the 1927 Como conference Bohr spoke the now famous words: "It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature." However, if the Copenhagen interpretation really holds on to this motto, why then is there this feeling of conflict when comparing it with realist interpretations? Surely what one can say about nature should in a certain sense be interpretation independent. In this paper I take Bohr's motto seriously and develop a quantum logic that avoids assuming any form of realism as much as possible. To illustrate the non-triviality of this motto, a similar result is first derived for classical mechanics. It turns out that the logic for classical mechanics is a special case of the derived quantum logic. Finally, some hints are provided on how these logics are to be used in practical situations, and I discuss how some realist interpretations relate to these logics.
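    As general background only, and not the logic constructed in this paper, the standard way a quantum logic departs from classical logic is the failure of the distributive law on the lattice of quantum propositions; the sketch below uses the usual spin-1/2 illustration, with the propositions p, q, r chosen purely for exposition.

```latex
% Illustrative background only: the textbook non-distributivity of the
% lattice of quantum propositions (closed subspaces of a Hilbert space).
% This is NOT the logic derived in the paper above.
% For a spin-1/2 system take
%   p = "spin up along x",  q = "spin up along z",  r = "spin down along z".
\[
  q \lor r = \top
  \quad\Longrightarrow\quad
  p \land (q \lor r) = p ,
\]
\[
  p \land q = \bot , \qquad p \land r = \bot
  \quad\Longrightarrow\quad
  (p \land q) \lor (p \land r) = \bot \;\neq\; p .
\]
% Distributivity, which holds in the classical (Boolean) special case,
% thus fails in general for quantum propositions.
```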

    “Fuzzy time”, from paradox to paradox (Does it solve the contradiction between Quantum Mechanics & General Relativity?)

    Although fuzzy logic and fuzzy mathematics are a widespread subject with a vast literature, fuzzy notions such as fuzzy sets and fuzzy numbers have rarely been applied to the concept of time; this can be seen, for example, in fuzzy time series. Some attempts have also been made at fuzzifying Turing machines, yet seemingly no need has been felt to fuzzify time itself. Throughout this article we try to change this picture and show why it is helpful to consider the instants of time as fuzzy numbers. In physics there are revolutionary ideas about the concept of time, such as B-theories in contrast to the A-theory, and central concepts like space, momentum, etc. were revised long ago, but time is still treated classically in all well-known and established physical theories. Seemingly, we stick to the classical concept of time in all fields of science, with a great inertia against changing it. Our goal in this article is to give some reasons why it is rational and reasonable to change and modify this picture. The central point here is the modified version of the "Unexpected Hanging" paradox as described in "Is classical Mathematics appropriate for theory of Computation". That modified version leads to a contradiction, and on that basis it is argued there why some problems in the Theory of Computation are not solved yet. To resolve the difficulties that arise, we have two choices: either "choosing" a new type of logic, such as paraconsistent logic, to tolerate the contradiction, or changing and improving the concept of time and consequently modifying the Turing computational model. Throughout this paper we select the second way, in order to preserve some aspects of classical logic. In chapter 2, by applying Quantum Mechanics and the Schrödinger equation, we compute the fuzzy number associated with time. This provides a new interpretation of Quantum Mechanics; more exactly, what we obtain is a "particle-fuzzy time" interpretation of Quantum Mechanics, in contrast to other interpretations such as the "wave-particle" interpretation. At the end, we pose a question about the possible solution of a paradox in physics: the contradiction between General Relativity and Quantum Mechanics.
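    To make the idea of a fuzzy instant of time concrete, the sketch below is a minimal illustration, not code from the paper: the triangular membership function and all numbers are assumptions chosen for exposition, whereas the paper derives the fuzzy number from the Schrödinger equation.

```python
# Minimal sketch: an instant of time modelled as a triangular fuzzy number.
# The triangular shape and the parameter values are illustrative assumptions,
# not the construction used in the paper.

from dataclasses import dataclass


@dataclass
class FuzzyInstant:
    """A fuzzy instant of time: 'around t0' with half-width `spread`."""
    t0: float      # nominal (crisp) instant
    spread: float  # half-width of the triangular membership function

    def membership(self, t: float) -> float:
        """Degree in [0, 1] to which clock reading t counts as 'this instant'."""
        return max(0.0, 1.0 - abs(t - self.t0) / self.spread)


if __name__ == "__main__":
    now = FuzzyInstant(t0=10.0, spread=0.5)
    for t in (9.4, 9.8, 10.0, 10.3, 10.6):
        print(f"membership({t}) = {now.membership(t):.2f}")
```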

    Non-Classical Approaches to Logic and Quantification as a Means for Analysis of Classroom Argumentation and Proof in Mathematics Education Research

    Background: While it is usually taken for granted that the logic taught in the mathematics classroom should consist of elements of classical propositional or first-order predicate logic, the situation may differ when referring to students’ discursive productions. Objectives: The paper aims to highlight how classical logic cannot grasp some epistemic aspects, such as evolution over time, uncertainty, and quantification over blurred domains, because it is specifically tailored to capture set-theoretic language and to validate arguments rather than to consider epistemic aspects. The aim is to show that adopting classical and non-classical lenses might lead to different results in analysis. Design: Nyaya pragmatic and empiricist logic and Peircean non-standard quantification, linked by the concept of free logic, are used as theoretical lenses in analysing two paradigmatic examples of classroom argumentation. Setting and Participants: Excerpts from a set of data collected by Prof. Paolo Boero of the University of Genoa during research activities in a secondary school mathematics class. Methodology: The examples are discussed by adopting a hermeneutic approach. Results: The analysis shows that different logical lenses can lead to varying interpretations of students’ behaviour in argumentation and proof presentation in mathematics, and that the adopted non-classical lenses expand the range of possible explanations of students’ behaviour. Conclusion: In mathematics education research, the need to consider an epistemic dimension in the analysis of classroom argumentation and proof production leads to the necessity of considering and combining logical tools in ways specific to the discipline, which might differ from those usually required in mathematics.

    On the mechanisation of the logic of partial functions

    PhD thesis. It is well known that partial functions arise frequently in formal reasoning about programs. A partial function may not yield a value for every member of its domain. Terms that apply partial functions thus may not denote, and coping with such terms is problematic in two-valued classical logic. A question is raised: how can reasoning about logical formulae that can contain references to terms that may fail to denote (partial terms) be conducted formally? Over the years a number of approaches to coping with partial terms have been documented. Some of these approaches attempt to stay within the realm of two-valued classical logic, while others are based on non-classical logics. However, as yet there is no consensus on which approach is the best one to use. A comparison of numerous approaches to coping with partial terms is presented, based upon formal semantic definitions. One approach to coping with partial terms that has received attention over the years is the Logic of Partial Functions (LPF), which is the logic underlying the Vienna Development Method. LPF is a non-classical three-valued logic designed to cope with partial terms, where both terms and propositions may fail to denote. As opposed to using concrete undefined values, undefinedness is treated as a "gap", that is, the absence of a defined value. LPF is based upon Strong Kleene logic, where the interpretations of the logical operators are extended to cope with truth-value "gaps". Over the years a large body of research and engineering has gone into the development of proof-based tool support for two-valued classical logic. This has created a major obstacle that affects the adoption of LPF, since such proof support cannot be carried over directly to LPF. Presently, there is a lack of direct proof support for LPF. An aim of this work is to investigate the applicability of mechanised (automated) proof support for reasoning about logical formulae that can contain references to partial terms in LPF. The focus of the investigation is on the basic but fundamental two-valued classical-logic proof procedure of resolution and the associated technique of proof by contradiction. Advanced proof techniques are built on the foundation that is provided by these basic fundamental proof techniques. Looking at the impact of these basic fundamental proof techniques in LPF is thus the essential and obvious starting point for investigating proof support for LPF. The work highlights the issues that arise when applying these basic techniques in LPF, and investigates the extent of the modifications needed to carry them over to LPF. This work provides the essential foundation on which to facilitate research into the modification of advanced proof techniques for LPF.
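    Since the abstract describes LPF as based on Strong Kleene logic with undefinedness treated as a truth-value "gap", the sketch below shows the standard Strong Kleene connectives; it is an illustration of that background logic, not the mechanisation developed in the thesis, and the encoding of the "gap" as Python's None is an assumption made here for exposition.

```python
# Sketch of Kleene's strong three-valued connectives, on which LPF is based
# (per the abstract).  The "gap" (undefined) value is modelled as None;
# this encoding is an illustrative assumption, not code from the thesis.

from typing import Optional

Tri = Optional[bool]  # True, False, or None ("gap" / undefined)


def t_not(a: Tri) -> Tri:
    return None if a is None else (not a)


def t_and(a: Tri, b: Tri) -> Tri:
    # False dominates: False AND anything is False even if the other
    # operand is a gap; otherwise a gap makes the result a gap.
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True


def t_or(a: Tri, b: Tri) -> Tri:
    # Dually, True dominates for disjunction.
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False


if __name__ == "__main__":
    # e.g. "x = 0 or 1/x > 0" when 1/x fails to denote:
    print(t_or(True, None))    # True  -- defined despite the undefined operand
    print(t_and(False, None))  # False
    print(t_or(False, None))   # None  -- the gap propagates
```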

    Unifying Functional Interpretations: Past and Future

    This article surveys work done in the last six years on the unification of various functional interpretations, including Gödel's dialectica interpretation, its Diller-Nahm variant, Kreisel's modified realizability, Stein's family of functional interpretations, functional interpretations "with truth", and bounded functional interpretations. Our goal in the present paper is twofold: (1) to look back and single out the main lessons learnt so far, and (2) to look forward and list several open questions and possible directions for further research.
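    As a pointer to the kind of definition these interpretations share, the clause below is the standard textbook Dialectica interpretation of implication, quoted as background rather than taken from this survey: writing the interpretation of A as "there exist x such that for all y, A_D(x, y)" and similarly for B, implication asks for functionals that turn witnesses for A into witnesses for B and counterexamples for B into counterexamples for A.

```latex
% Standard Dialectica clause for implication (textbook background, not from the survey).
% Given A^D = \exists x \,\forall y\, A_D(x, y) and B^D = \exists u \,\forall v\, B_D(u, v):
\[
  (A \to B)^D \;=\;
  \exists U, Y \;\forall x, v\;
  \bigl( A_D(x,\, Y x v) \;\rightarrow\; B_D(U x,\, v) \bigr)
\]
```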