
    A Geometrical Representation of Entanglement as Internal Constraint

    We study a system of two entangled spin-1/2 particles, where each spin is represented by a sphere model developed within the hidden measurement approach: a generalization of the Bloch sphere representation in which the measurements themselves are also represented. We show how an arbitrary tensor product state can be described completely by a specific internal constraint between the ray or density states of the two spin-1/2 particles. We derive a geometrical view of entanglement as a 'rotation' and 'stretching' of the sphere representing the states of the second particle as measurements are performed on the first particle. In the case of the singlet state, entanglement can be represented by a real physical constraint, namely by means of a rigid rod.
    Comment: 10 pages, 3 figures. Submitted to International Journal of Theoretical Physics
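
    As a minimal sketch of the single-spin mechanism this sphere model builds on (our Monte Carlo toy with our own variable names, following the breaking-elastic picture usually used to describe the Aerts model): the state is projected onto the measurement diameter of the Bloch sphere, an 'elastic' spanning that diameter breaks at a uniformly distributed hidden point, and the side of the break fixes the outcome. This already reproduces the Born rule for one spin-1/2.

```python
import numpy as np

def hidden_measurement_spin(theta, n_trials=100_000, rng=None):
    """Hidden-measurement toy for one spin-1/2 (a sketch, not the paper's
    two-spin construction). The state sits at polar angle theta from the
    measurement axis; its projection onto the measurement diameter is
    z = cos(theta). An elastic spanning [-1, 1] breaks at a uniform
    hidden point, and a break below z yields the 'up' outcome.
    """
    rng = rng or np.random.default_rng(0)
    z = np.cos(theta)
    breaks = rng.uniform(-1.0, 1.0, size=n_trials)  # hidden measurement variable
    return np.mean(breaks < z)                      # empirical P(up)

theta = 2 * np.pi / 3
print(hidden_measurement_spin(theta))  # ~0.25, matching the Born rule below
print(np.cos(theta / 2) ** 2)          # cos^2(theta/2) = 0.25
```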

    Hidden measurements, hidden variables and the volume representation of transition probabilities

    We construct, for any finite dimension n, a new hidden measurement model for quantum mechanics based on representing quantum transition probabilities by the volume of regions in projective Hilbert space. For n = 2 our model is equivalent to the Aerts sphere model and serves as a generalization of it for dimensions n ≥ 3. We also show how to construct a hidden variables scheme based on hidden measurements, and we discuss how joint distributions arise in our hidden variables scheme and their relationship with the results of Fine.
    Comment: 23 pages, 1 figure
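
    As a small numerical check of the n = 2 case (our own illustration, not the paper's general construction): on the Bloch sphere, the transition probability |<psi|phi>|^2 = cos^2(theta/2) equals the normalized area of the region of pure states lying farther from psi than phi does, which uniform sampling confirms.

```python
import numpy as np

# n = 2 volume picture (our assumption for illustration): put psi at the
# north pole, phi at Bloch angle theta, and measure the area fraction of
# sphere points whose angle from psi exceeds theta.
rng = np.random.default_rng(1)
theta = 1.1
z = rng.uniform(-1.0, 1.0, size=200_000)   # z-coordinates of uniform points on S^2
area_fraction = np.mean(np.arccos(z) >= theta)

print(area_fraction)           # volume-representation value, ~0.727
print(np.cos(theta / 2) ** 2)  # Born rule for n = 2, ~0.727
```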

    Quantum Particles as Conceptual Entities: A Possible Explanatory Framework for Quantum Theory

    We put forward a possible new interpretation and explanatory framework for quantum theory. The basic hypothesis underlying this new framework is that quantum particles are conceptual entities. More concretely, we propose that quantum particles interact with ordinary matter, nuclei, atoms, molecules, macroscopic material entities, measuring apparatuses, ..., in a similar way to how human concepts interact with memory structures, human minds or artificial memories. We analyze the most characteristic aspects of quantum theory, i.e. entanglement and non-locality, interference and superposition, identity and individuality, in the light of this new interpretation, and we put forward a specific explanation and understanding of these aspects. The basic hypothesis of our framework gives rise in a natural way to a Heisenberg uncertainty principle, which introduces an understanding of the general situation of 'the one and the many' in quantum physics. A view of the macroscopic and the microscopic that differs from the common one follows from the basic hypothesis, and leads to an analysis of Schrödinger's cat paradox and the measurement problem different from the existing ones. We reflect on the influence of this new quantum interpretation and explanatory framework on the global nature and evolutionary aspects of the world and human worldviews, and point out potential explanations for specific situations, such as the generation problem in particle physics, the confinement of quarks and the existence of dark matter.
    Comment: 45 pages, 10 figures

    Cartoon Computation: Quantum-like computing without quantum mechanics

    We present a computational framework based on geometric structures. No quantum mechanics is involved, and yet the algorithms perform tasks analogous to quantum computation. Tensor products and entangled states are not needed -- they are replaced by sets of basic shapes. To test the formalism, we solve in geometric terms the Deutsch-Jozsa problem, historically the first example that demonstrated the potential power of quantum computation. Each step of the algorithm has a clear geometric interpretation and allows for a cartoon representation.
    Comment: version accepted in J. Phys. A (Letter to the Editor)
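
    For comparison with the geometric solution, here is the standard quantum-circuit form of Deutsch-Jozsa as a NumPy simulation (a textbook sketch of ours, not the paper's cartoon formalism): with one call to a phase oracle, the probability of reading |0...0> after the final Hadamard layer is 1 for a constant f and 0 for a balanced f.

```python
import numpy as np

def deutsch_jozsa(f, n):
    """Decide constant vs. balanced f: {0,...,2^n - 1} -> {0,1} with one
    oracle call, in the phase-kickback form of the algorithm."""
    N = 2 ** n
    state = np.full(N, 1 / np.sqrt(N))                          # uniform superposition
    state = np.array([(-1) ** f(x) for x in range(N)]) * state  # phase oracle
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Hn = H
    for _ in range(n - 1):
        Hn = np.kron(Hn, H)                                     # n-fold Hadamard
    state = Hn @ state
    return abs(state[0]) ** 2                                   # P(measure |0...0>)

print(deutsch_jozsa(lambda x: 0, 3))                      # constant -> 1.0
print(deutsch_jozsa(lambda x: bin(x).count("1") % 2, 3))  # balanced -> 0.0
```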

    Automated supervised classification of variable stars I. Methodology

    The fast classification of new variable stars is an important step in making them available for further research. Selection of science targets from large databases is much more efficient if they have been classified first. Defining the classes in terms of physical parameters is also important to get an unbiased statistical view of the variability mechanisms and the borders of instability strips. Our goal is twofold: to provide an overview of the stellar variability classes that are presently known, in terms of some relevant stellar parameters; and to use the class descriptions obtained as the basis for an automated `supervised classification' of large databases. Such automated classification will compare and assign new objects to a set of pre-defined variability training classes. For every variability class, a literature search was performed to find as many well-known member stars as possible, or a considerable subset if too many were present. Next, we searched on-line and private databases for their light curves in the visible band and performed period analysis and harmonic fitting. The derived light-curve parameters are used to describe the classes and define the training classifiers. We compared the performance of different classifiers in terms of the percentage of correct identifications, the confusion among classes, and the computation time. We describe how well the classes can be separated using the proposed set of parameters, and how future improvements can be made based on new large databases such as the light curves to be assembled by the CoRoT and Kepler space missions.
    Comment: This paper has been accepted for publication in Astronomy and Astrophysics (reference AA/2007/7638). Number of pages: 27. Number of figures: 1
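
    A simplified sketch of this kind of light-curve parametrization (our toy feature extractor; the paper's pipeline is more elaborate): locate the dominant frequency by least squares on a trial grid, fit a few harmonics, and use the frequency together with the harmonic amplitudes as classification features.

```python
import numpy as np

def harmonic_features(t, y, freqs, n_harm=3):
    """Return (best frequency, harmonic amplitudes) for a light curve,
    via least-squares harmonic fitting on a trial frequency grid."""
    def design(f):
        cols = [np.ones_like(t)]
        for k in range(1, n_harm + 1):
            cols += [np.cos(2 * np.pi * k * f * t),
                     np.sin(2 * np.pi * k * f * t)]
        return np.column_stack(cols)

    def residual(f):
        X = design(f)
        coef = np.linalg.lstsq(X, y, rcond=None)[0]
        return np.sum((y - X @ coef) ** 2)

    best = min(freqs, key=residual)
    coef = np.linalg.lstsq(design(best), y, rcond=None)[0]
    amps = [np.hypot(coef[2 * k - 1], coef[2 * k]) for k in range(1, n_harm + 1)]
    return np.array([best, *amps])

# Synthetic mono-periodic 'star': f = 1.7 c/d with one overtone plus noise.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 30, 400))
y = (np.sin(2 * np.pi * 1.7 * t) + 0.3 * np.sin(4 * np.pi * 1.7 * t)
     + 0.05 * rng.normal(size=t.size))
print(harmonic_features(t, y, np.linspace(0.1, 5, 2000)))  # ~[1.7, 1.0, 0.3, ~0]
```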

    The delta-quantum machine, the k-model, and the non-ordinary spatiality of quantum entities

    The purpose of this article is threefold. Firstly, it aims to present, in an educational and non-technical fashion, the main ideas at the basis of Aerts' creation-discovery view and hidden measurement approach: a fundamental explanatory framework whose importance, in this author's view, has been seriously underappreciated by the physics community, despite its success in clarifying many conceptual challenges of quantum physics. Secondly, it aims to introduce a new quantum machine - which we call the delta-quantum machine - that is able to reproduce the transmission and reflection probabilities of a one-dimensional quantum scattering process by a Dirac delta-function potential. The machine is used not only to demonstrate the pertinence of the above-mentioned explanatory framework in the general description of physical systems, but also to illustrate (in the spirit of Aerts' epsilon-model) the origin of classical and quantum structures, by revealing the existence of processes which are neither classical nor quantum, but irreducibly intermediate. We do this by explicitly introducing what we call the k-model and by proving that its processes cannot be modelled by a classical or quantum scattering system. The third purpose of this work is to exploit the powerful metaphor provided by our quantum machine to investigate the intimate relation between the concept of potentiality and the notion of non-spatiality, which we characterize in precise terms, introducing for this purpose the new concept of process-actuality.
    Comment: 19 pages, 4 figures. To appear in: Foundations of Science
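
    For reference, the quantum values the delta-quantum machine must reproduce are the textbook transmission and reflection probabilities for scattering off V(x) = g delta(x). In units hbar = m = 1 (our convention, not necessarily the paper's), T = 1/(1 + g^2/(2E)) and R = 1/(1 + 2E/g^2):

```python
import numpy as np

# Textbook 1D scattering off V(x) = g * delta(x), with hbar = m = 1.
def transmission(E, g):
    return 1.0 / (1.0 + g**2 / (2.0 * E))

def reflection(E, g):
    return 1.0 / (1.0 + 2.0 * E / g**2)

E, g = 1.5, 1.0
T, R = transmission(E, g), reflection(E, g)
print(T, R, T + R)   # 0.75 0.25 1.0  (unitarity: T + R = 1)
```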

    Quantum Aspects of Semantic Analysis and Symbolic Artificial Intelligence

    Modern approaches to semantic analysis, if reformulated as Hilbert-space problems, reveal formal structures known from quantum mechanics. A similar situation is found in distributed representations of cognitive structures developed for the purposes of neural networks. We take a closer look at the similarities and differences between the above two fields and quantum information theory.
    Comment: version accepted in J. Phys. A (Letter to the Editor)
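
    A minimal sketch of the Hilbert-space reading (our illustration on a hypothetical toy corpus, not the paper's formalism): latent semantic analysis turns documents into unit vectors via an SVD of the term-document matrix, and squared overlaps between them then play the role of quantum transition probabilities.

```python
import numpy as np

# Hypothetical term-document count matrix (rows: terms, columns: documents).
A = np.array([
    [2, 0, 1],   # "quantum"
    [1, 0, 2],   # "entanglement"
    [0, 3, 0],   # "neural"
    [0, 2, 1],   # "network"
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
docs = Vt[:2].T                                      # documents in a rank-2 semantic space
docs /= np.linalg.norm(docs, axis=1, keepdims=True)  # normalize to unit 'state vectors'

overlap = docs @ docs.T
print(np.round(overlap ** 2, 3))  # quantum-like 'transition probabilities'
```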

    Detection of gravity modes in the massive binary V380 Cyg from Kepler space-based photometry and high-resolution spectroscopy

    We report the discovery of low-amplitude gravity-mode oscillations in the massive binary star V380 Cyg, from 180 d of Kepler custom-aperture space photometry and 5 months of high-resolution, high signal-to-noise spectroscopy. The new data are of unprecedented quality and allowed us to improve the orbital and fundamental parameters for this binary. The orbital solution was subtracted from the photometric data and led to the detection of periodic intrinsic variability with frequencies, some of which are multiples of the orbital frequency and others not. Spectral disentangling allowed the detection of line-profile variability in the primary. With our discovery of intrinsic variability interpreted as gravity-mode oscillations, V380 Cyg becomes an important laboratory for future seismic tuning of the near-core physics in massive B-type stars.
    Comment: 5 pages, 4 figures, 2 tables. Accepted for publication in MNRAS Letters
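
    A rough sketch of the detection logic on synthetic data (our simplification with an illustrative orbital frequency, not the authors' pipeline): after subtracting the orbital solution, compute an amplitude spectrum of the residuals and test whether each significant peak lies at a near-integer multiple of the orbital frequency.

```python
import numpy as np

rng = np.random.default_rng(3)
f_orb = 1.0 / 12.4                 # illustrative orbital frequency, cycles/day
dt = 0.0204                        # ~Kepler long-cadence sampling, days
t = np.arange(0.0, 180.0, dt)
# Synthetic residuals: one mode locked to 5*f_orb, one unrelated g mode.
y = (0.4 * np.sin(2 * np.pi * 5 * f_orb * t)
     + 0.3 * np.sin(2 * np.pi * 0.712 * t)
     + 0.2 * rng.normal(size=t.size))

freqs = np.fft.rfftfreq(t.size, d=dt)
amp = 2.0 * np.abs(np.fft.rfft(y)) / t.size

# Keep the two strongest well-separated peaks, then test each frequency
# ratio against an integer (tolerance of order the resolution 1/T).
peaks = []
for i in np.argsort(amp)[::-1]:
    if all(abs(freqs[i] - p) > 0.05 for p in peaks):
        peaks.append(freqs[i])
    if len(peaks) == 2:
        break
for f in peaks:
    r = f / f_orb
    print(f"{f:.4f} c/d = {r:.2f} * f_orb, orbital multiple: {abs(r - round(r)) < 0.05}")
```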

    On classical models of spin

    The reason for recalling this old paper is the ongoing discussion of attempts to circumvent certain assumptions leading to the Bell theorem (Hess-Philipp, Accardi). If I correctly understand the intentions of these authors, the idea is to make use of the following logical loophole inherent in the proof of the Bell theorem: probabilities of counterfactual events A and A' do not have to coincide with actually measured probabilities if measurements of A and A' disturb each other, or for any other fundamental reason cannot be performed simultaneously. It is generally believed that in the context of classical probability theory (i.e. realistic hidden variables) probabilities of counterfactual events can be identified with those of actually measured events. In the paper I give an explicit counterexample to this belief. The "first variation" on the Aerts model shows that counterfactual and actual problems formulated for the same classical system may be unrelated. In the model, the first probability does not violate any classical inequality whereas the second does. A peculiarity of the Bell inequality is that on the basis of an in-principle unobservable probability one derives probabilities of jointly measurable random variables, a fact that additionally obscures the logical meaning of the construction. The existence of the loophole does not change the fact that I was not able to construct a local model violating the inequality with all the other loopholes eliminated.
    Comment: published as Found. Phys. Lett. 3 (1992) 24
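
    For context, the standard CHSH arithmetic being alluded to (textbook numbers, not the Aerts-model construction above): with singlet correlations E(a, b) = -cos(a - b), the quantum combination reaches 2*sqrt(2), while any assignment of jointly existing counterfactual outcomes is bounded by 2.

```python
import numpy as np

# CHSH for the singlet state at the standard measurement angles.
E = lambda a, b: -np.cos(a - b)                 # quantum correlation
a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S), 2 * np.sqrt(2))  # 2.828... exceeds the classical bound 2
```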