37,811 research outputs found

    Nonlocal quantum information transfer without superluminal signalling and communication

    Full text link
    It is a frequent assumption that - via superluminal information transfers - superluminal signals capable of enabling communication are necessarily exchanged in any quantum theory that posits hidden superluminal influences. However, does the presence of hidden superluminal influences automatically imply superluminal signalling and communication? The non-signalling theorem mediates the apparent conflict between quantum mechanics and the theory of special relativity. However, as with any 'no-go' theorem, there exist two opposing interpretations of the non-signalling constraint: foundational and operational. Concerning Bell's theorem, we argue that Bell employed both interpretations at different times. Bell finally pursued an explicitly operational position on non-signalling, which is often associated with ontological quantum theory, e.g. de Broglie-Bohm theory. We refer to this position as "effective non-signalling". By contrast, associated with orthodox quantum mechanics is the foundational position referred to here as "axiomatic non-signalling". In search of a decisive communication-theoretic criterion for differentiating between "axiomatic" and "effective" non-signalling, we employ the operational framework offered by Shannon's mathematical theory of communication. We find that an effective non-signalling theorem comprises two sub-theorems, which we call (1) the non-transfer-control (NTC) theorem and (2) the non-signification-control (NSC) theorem. Employing the NTC and NSC theorems, we report that effective, instead of axiomatic, non-signalling is entirely sufficient for prohibiting nonlocal communication. An effective non-signalling theorem allows for nonlocal quantum information transfer yet - at the same time - effectively denies superluminal signalling and communication. Comment: 21 pages, 5 figures; the article is published with open access in Foundations of Physics (2016).
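For orientation, the operational no-signalling condition the abstract discusses is usually written as a constraint on a bipartite outcome distribution P(a,b|x,y): each party's marginal statistics must be independent of the other party's measurement setting. This is the standard textbook formulation, not a formula quoted from the article:

```latex
% No-signalling conditions on a bipartite distribution P(a,b|x,y)
\sum_{b} P(a,b \mid x,y) \;=\; \sum_{b} P(a,b \mid x,y') \;=\; P(a \mid x) \quad \forall\, a,x,y,y' ,
\qquad
\sum_{a} P(a,b \mid x,y) \;=\; \sum_{a} P(a,b \mid x',y) \;=\; P(b \mid y) \quad \forall\, b,y,x,x' .
```

Quantum correlations satisfy these equalities, which is why local measurement statistics alone cannot be used for superluminal communication even in theories that posit hidden superluminal influences.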

    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    Get PDF
    This thesis considers an application of a temporal theory to describe and model the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic but dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare issues. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD), and business process modelling notation (BPMN), are intuitive but imprecise. They cannot fully capture the complexities of the types of activities and the full extent of temporal constraints to an extent where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain. Additionally, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g. the finish-start barrier. It is imperative to be able to specify temporal constraints between the start and/or end of processes, e.g. that the beginning of a process A precedes the start (or end) of a process B. However, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help in uncovering the complexities of a system and assist in modelling it in a consistent way, which is not possible with the existing modelling techniques. The above issues are addressed in this thesis by proposing a framework, based on point interval temporal logic (PITL), which treats points and intervals as primitives, to provide a knowledge base for accurately modelling patient flows. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, the exhaustive temporal constraints derived from the components of the proposed axiomatic system serve as a knowledge base. The proposed methodological framework adopts a model-theoretic approach, in which a theory is developed and considered as a model, while a corresponding instance is considered as its application. This approach assists in identifying the core components of the system and their precise operation when representing a real-life domain relevant to the process modelling issues specified in this thesis. Thus, I have evaluated the modelling standards for their most-used terminologies and constructs to identify their key components; this also assists in generalising the key terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The resulting catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, a resolution theorem proof is used to exhibit the structural features of the theory (model) and to establish that it is sound and complete. Once the theory is established as sound and complete, the next step is to provide an instantiation of the theory. This is achieved by mapping the core components of the theory to their corresponding instances.
Additionally, a formal graphical tool, termed a point graph (PG), is used to visualise the cases of the proposed axiomatic system. The PG facilitates modelling and scheduling patient flows and enables the analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*), based on the semantics presented by the axiomatic system. A real-life case, the trauma patient pathway of the King’s College Hospital accident and emergency (A&E) department, is considered to validate the framework. It is divided into three patient flows depicting the journey of a patient with significant trauma who arrives at A&E, undergoes a procedure, and is subsequently discharged. The department’s staff relied upon UML AD and BPMN to model the patient flows, and an evaluation of their representation is presented to show where the modelling standards fall short in modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
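As a rough, self-contained illustration of the point-based view of activities that PITL takes (the class and function names below are hypothetical, not the thesis's formalism), an activity can be represented by its start and end points, and precedence constraints between such points can then be stated and checked directly, something a plain finish-start dependency cannot express:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Activity:
    """An activity modelled by its start and end points (start < end)."""
    name: str
    start: float
    end: float

def finish_start(a: Activity, b: Activity) -> bool:
    """a must finish before b starts (the only dependency CPM/PERT handles natively)."""
    return a.end <= b.start

def start_start(a: Activity, b: Activity) -> bool:
    """The start of a must precede the start of b (a point-to-point constraint)."""
    return a.start < b.start

# Hypothetical A&E fragment: triage must begin before treatment begins,
# and treatment must finish before discharge starts.
triage = Activity("triage", 0.0, 15.0)
treatment = Activity("treatment", 10.0, 60.0)
discharge = Activity("discharge", 60.0, 70.0)

assert start_start(triage, treatment)      # start-start constraint satisfied
assert finish_start(treatment, discharge)  # finish-start constraint satisfied
```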

    Who Cares about Axiomatization? Representation, Invariance, and Formal Ontologies

    Get PDF
    The philosophy of science of Patrick Suppes is centered on two important notions that are part of the title of his recent book (Suppes 2002): Representation and Invariance. Representation is important because when we embrace a theory we implicitly choose a way to represent the phenomenon we are studying. Invariance is important because invariants are the only things that are constant in a theory, so in a way they give the “objective” meaning of that theory. Every scientific theory gives a representation of a class of structures and studies the invariant properties holding in that class of structures. In Suppes’ view, the best way to define this class of structures is via axiomatization. This is because a class of structures is given by a definition, and this same definition establishes which properties a single structure must possess in order to belong to the class. These properties correspond to the axioms of a logical theory. In Suppes’ view, the best way to characterize a scientific structure is by giving a representation theorem for its models and singling out the invariants in the structure. Thus, we can say that the philosophy of science of Patrick Suppes consists in the application of the axiomatic method to scientific disciplines. What I want to argue in this paper is that this application of the axiomatic method is also at the basis of a new approach that is being increasingly applied to the study of computer science and information systems, namely the approach of formal ontologies. The main task of an ontology is to make explicit the conceptual structure underlying a certain domain. By “making explicit the conceptual structure” we mean singling out the most basic entities populating the domain and writing axioms expressing the main properties of these primitives and the relations holding among them. So, in both cases, axiomatization is the main tool used to characterize the object of inquiry, whether this object is scientific theories (in Suppes’ approach) or information systems (for formal ontologies). In the following section I will present Patrick Suppes’ view of the philosophy of science and the axiomatic method; in section 3 I will survey the theoretical issues underlying the work being done in formal ontologies; and in section 4 I will draw a comparison of these two approaches and explore similarities and differences between them.
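As a generic illustration of what "writing axioms expressing the main properties of these primitives" can look like (a standard textbook example, not one drawn from Suppes 2002 or from any specific ontology discussed in the paper), the parthood primitive P(x,y), read "x is part of y", of core mereology is axiomatised as a partial order:

```latex
% Ground mereology: the parthood primitive P as a partial order
\forall x\; P(x,x)                                                               % reflexivity
\forall x \forall y\; \big(P(x,y) \land P(y,x) \rightarrow x = y\big)            % antisymmetry
\forall x \forall y \forall z\; \big(P(x,y) \land P(y,z) \rightarrow P(x,z)\big) % transitivity
```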

    Do Goedel's incompleteness theorems set absolute limits on the ability of the brain to express and communicate mental concepts verifiably?

    Full text link
    Classical interpretations of Goedel's formal reasoning imply that the truth of some arithmetical propositions of any formal mathematical language, under any interpretation, is essentially unverifiable. However, a language of general scientific discourse cannot allow its mathematical propositions to be interpreted ambiguously. Such a language must, therefore, define mathematical truth verifiably. We consider a constructive interpretation of classical Tarskian truth, and of Goedel's reasoning, under which any formal system of Peano Arithmetic is verifiably complete. We show how some paradoxical concepts of quantum mechanics can be expressed and interpreted naturally under a constructive definition of mathematical truth. Comment: 73 pages; this is an updated version of the NQ essay; an HTML version is available at http://alixcomsi.com/Do_Goedel_incompleteness_theorems.ht
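For context, the classical result the abstract engages with can be stated schematically (a standard modern formulation, not taken from the paper): for any consistent, effectively axiomatised formal system F that interprets Peano Arithmetic, there is an arithmetical sentence G_F such that

```latex
F \nvdash G_F \qquad \text{and} \qquad F \nvdash \lnot G_F ,
```

yet, under the classical Tarskian interpretation, G_F is true in the standard model of arithmetic. It is this notion of true-but-formally-unprovable arithmetic truth that the paper proposes to reinterpret constructively.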

    Quantum dynamics as a physical resource

    Get PDF
    How useful is a quantum dynamical operation for quantum information processing? Motivated by this question, we investigate several strength measures quantifying the resources intrinsic to a quantum operation. We develop a general theory of such strength measures, based on axiomatic considerations independent of state-based resources. The power of this theory is demonstrated with applications to quantum communication complexity, quantum computational complexity, and entanglement generation by unitary operations. Comment: 19 pages, shortened by 3 pages, mainly cosmetic changes
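As a small, self-contained illustration of "entanglement generation by unitary operations" (an ad hoc numerical example, not one of the strength measures developed in the paper), one can compute the entropy of entanglement that a single CNOT creates when applied to the product state |+>|0>:

```python
import numpy as np

# Basis ordering |q1 q0> = |00>, |01>, |10>, |11>, with q1 the control qubit.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+>
zero = np.array([1, 0], dtype=complex)                # |0>
psi_in = np.kron(plus, zero)                          # product state |+>|0>
psi_out = CNOT @ psi_in                               # (|00> + |11>)/sqrt(2), a Bell state

# Entropy of entanglement: von Neumann entropy of the reduced state of the control qubit.
rho = np.outer(psi_out, psi_out.conj()).reshape(2, 2, 2, 2)
rho_control = np.trace(rho, axis1=1, axis2=3)         # partial trace over the target qubit
evals = np.linalg.eigvalsh(rho_control)
entropy = -sum(p * np.log2(p) for p in evals if p > 1e-12)
print(f"Entanglement generated: {entropy:.3f} ebits")  # -> 1.000
```

A strength measure in the spirit the abstract describes would presumably assign the CNOT a nonzero value precisely because such a product-to-entangled transition is possible, whereas purely local (product) unitaries generate no entanglement at all.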