1,450 research outputs found

    Response to Nauenberg's "Critique of Quantum Enigma: Physics Encounters Consciousness"

    Nauenberg's extended critique of Quantum Enigma rests on fundamental misunderstandings. Comment: To be published in Foundations of Physics

    A quantum approach to measurement based on Zurek's triple model

    In a closed form, without referring to a time-dependent Hamiltonian for the total system, a consistent approach to quantum measurement is proposed based on Zurek's triple model of quantum decoherence [W. Zurek, Phys. Rev. D 24, 1516 (1981)]. An exactly solvable model based on an intracavity system is treated in detail to demonstrate the central idea of our approach: by peeling off one collective variable of the measuring apparatus from its many degrees of freedom to serve as the pointer of the apparatus, the collective variable decouples from the internal environment formed by the effective internal variables, but still interacts with the measured system to form a triple entanglement among the measured system, the pointer, and the internal environment. As another mechanism causing decoherence, the uncertainty of the relative phase and its many-particle amplification can be summed up into an ideal entanglement, i.e., a Schmidt decomposition with respect to the preferred basis. Comment: 22 pages, 3 figures
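    The Schmidt decomposition mentioned in this abstract can be computed numerically as a singular value decomposition of the two-party amplitude matrix. A minimal sketch for a two-qubit pure state, with a hypothetical state vector chosen purely for illustration (not the intracavity model of the paper):

    ```python
    import numpy as np

    # Hypothetical two-qubit pure state (illustration only): amplitudes on
    # the basis |00>, |01>, |10>, |11>, normalized below.
    psi = np.array([1.0, 0.0, 0.5, 1.0])
    psi = psi / np.linalg.norm(psi)

    # Reshape into a 2x2 matrix C with psi = sum_ij C[i, j] |i>|j>.
    # The SVD of C gives the Schmidt decomposition: the singular values
    # are the Schmidt coefficients with respect to the rotated local bases.
    C = psi.reshape(2, 2)
    u, s, vh = np.linalg.svd(C)
    print("Schmidt coefficients:", s)
    print("sum of squares:", np.sum(s**2))  # equals 1 for a normalized state
    ```

    A state is entangled exactly when more than one Schmidt coefficient is nonzero, which is the case for the vector above.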

    Typicality vs. probability in trajectory-based formulations of quantum mechanics

    Bohmian mechanics represents the universe as a set of paths with a probability measure defined on it. The way in which a mathematical model of this kind can explain the observed phenomena of the universe is examined in general. It is shown that the explanation does not make use of the full probability measure, but rather of a suitable set function deriving from it, which defines relative typicality between single-time cylinder sets. Such a set function can also be derived directly from the standard quantum formalism, without the need for an underlying probability measure. The key concept for this derivation is the {\it quantum typicality rule}, which can be considered as a generalization of the Born rule. The result is a new formulation of quantum mechanics, in which particles follow definite trajectories, but which is based only on the standard formalism of quantum mechanics. Comment: 24 pages, no figures. To appear in Foundations of Physics

    Quantum measurement as driven phase transition: An exactly solvable model

    A model of quantum measurement is proposed, which aims to describe statistical mechanical aspects of this phenomenon, starting from a purely Hamiltonian formulation. The macroscopic measurement apparatus is modeled as an ideal Bose gas, the order parameter of which, that is, the amplitude of the condensate, is the pointer variable. It is shown that properties of irreversibility and ergodicity breaking, which are inherent in the model apparatus, ensure the appearance of definite results of the measurement, and provide a dynamical realization of wave-function reduction or collapse. The measurement process takes place in two steps: First, the reduction of the state of the tested system occurs over a time of order $\hbar/(TN^{1/4})$, where $T$ is the temperature of the apparatus and $N$ is the number of its degrees of freedom. This decoherence process is governed by the apparatus-system interaction. During the second step, classical correlations are established between the apparatus and the tested system over the much longer time-scale of equilibration of the apparatus. The influence of the parameters of the model on the non-ideality of the measurement is discussed. Schr\"{o}dinger kittens, EPR setups and information transfer are analyzed. Comment: 35 pages, RevTeX

    Decoherence and wave function collapse

    The possibility of consistency between the basic principles of quantum mechanics and wave function collapse is reexamined. A specific interpretation of environment is proposed for this aim and applied to decoherence. When the organization of a measuring apparatus is taken into account, this approach also leads to an interpretation of wave function collapse, which would result in principle from the same interactions with the environment as decoherence. This proposal is shown to be consistent with the non-separable character of quantum mechanics.

    Quantum models of classical mechanics: maximum entropy packets

    In a previous paper, a project of constructing quantum models of classical properties was started. The present paper concludes the project by turning to classical mechanics. The quantum states that maximize entropy for given averages and variances of coordinates and momenta are called ME packets. They generalize the Gaussian wave packets. A non-trivial extension of the partition-function method of probability calculus to quantum mechanics is given. Non-commutativity of quantum variables limits its usefulness. Still, the general form of the state operators of ME packets is obtained with its help. The diagonal representation of the operators is found. A general way of calculating averages that can replace the partition function method is described. Classical mechanics is reinterpreted as a statistical theory. Classical trajectories are replaced by classical ME packets. Quantum states approximate classical ones if the product of the coordinate and momentum variances is much larger than the Planck constant. Thus, ME packets with large variances follow their classical counterparts better than Gaussian wave packets. Comment: 26 pages, no figures. Introduction and the section on the classical limit are extended, new references added. Definitive version accepted by Found. Phys.
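    The role played above by the product of coordinate and momentum variances can be checked numerically for the Gaussian special case of an ME packet. A minimal sketch in natural units ($\hbar = 1$, an assumption for illustration only), verifying that a Gaussian packet of position-space width sigma saturates the uncertainty product at $\hbar/2$:

    ```python
    import numpy as np

    hbar = 1.0   # natural units, an assumption for this illustration
    sigma = 2.0  # position-space width of the packet

    # Normalized Gaussian wave packet on a grid; an ME packet with only
    # variance constraints reduces to this Gaussian case.
    x = np.linspace(-40, 40, 4001)
    dx = x[1] - x[0]
    psi = np.exp(-x**2 / (4 * sigma**2))
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

    # Position variance <x^2> (the mean is zero by symmetry).
    var_x = np.sum(x**2 * np.abs(psi)**2) * dx

    # Momentum variance via <p^2> = hbar^2 * integral |psi'(x)|^2 dx,
    # valid for a real wave function with zero mean momentum.
    dpsi = np.gradient(psi, dx)
    var_p = hbar**2 * np.sum(np.abs(dpsi)**2) * dx

    print(var_x, var_p, np.sqrt(var_x * var_p))  # product tends to hbar/2
    ```

    Larger sigma leaves the product unchanged here; the classicality criterion in the abstract concerns packets whose variance product is instead much larger than this minimum.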

    Stringent Constraints on Cosmological Neutrino-Antineutrino Asymmetries from Synchronized Flavor Transformation

    We assess a mechanism which can transform neutrino-antineutrino asymmetries between flavors in the early universe, and confirm that such transformation is unavoidable in the near bi-maximal framework emerging for the neutrino mixing matrix. We show that the process is a standard Mikheyev-Smirnov-Wolfenstein flavor transformation dictated by a synchronization of momentum states. We also show that flavor ``equilibration'' is a special feature of maximal mixing, and carefully examine new constraints placed on neutrino asymmetries. In particular, the big bang nucleosynthesis limit on electron neutrino degeneracy xi_e < 0.04 does not apply directly to all flavors, yet confirmation of the large-mixing-angle solution to the solar neutrino problem will eliminate the possibility of degenerate big bang nucleosynthesis. Comment: 11 pages, 6 figures; minor changes to match PRD version

    Explaining the unobserved: why quantum mechanics is not only about information

    A remarkable theorem by Clifton, Bub and Halvorson (2003) (CBH) characterizes quantum theory in terms of information-theoretic principles. According to Bub (2004, 2005), the philosophical significance of the theorem is that quantum theory should be regarded as a ``principle'' theory about (quantum) information rather than a ``constructive'' theory about the dynamics of quantum systems. Here we criticize Bub's principle approach, arguing that if the mathematical formalism of quantum mechanics remains intact, then there is no escape route from solving the measurement problem by constructive theories. We further propose a (Wigner-type) thought experiment that, we argue, demonstrates that quantum mechanics on the information-theoretic approach is incomplete. Comment: 34 pages

    Causality - Complexity - Consistency: Can Space-Time Be Based on Logic and Computation?

    The difficulty of explaining non-local correlations in a fixed causal structure sheds new light on the old debate over whether space and time are to be seen as fundamental. Refraining from assuming space-time as given a priori has a number of consequences. First, the usual definitions of randomness depend on a causal structure and become meaningless. So motivated, we propose an intrinsic, physically motivated measure for the randomness of a string of bits: its length minus its normalized work value, a quantity we closely relate to its Kolmogorov complexity (the length of the shortest program making a universal Turing machine output the string). We test this alternative concept of randomness on the example of non-local correlations, and we end up with a line of reasoning that leads to conclusions similar to those of the probabilistic view, but is conceptually more direct, since only the outcomes of measurements that can actually all be carried out together are put into relation with each other. In the same context-free spirit, we connect the logical reversibility of an evolution to the second law of thermodynamics and the arrow of time. Refining this, we end up with a speculation on the emergence of a space-time structure on bit strings in terms of data-compressibility relations. Finally, we show that logical consistency, by which we replace the abandoned causality, is a strictly weaker constraint than the latter in the multi-party case. Comment: 17 pages, 16 figures, small corrections
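    Since the Kolmogorov complexity invoked above is uncomputable, any numerical illustration has to use a computable proxy. A minimal sketch, assuming zlib's compressed size as a crude upper-bound stand-in for complexity (this is a proxy, not the work-value measure proposed in the paper):

    ```python
    import random
    import zlib

    def pack_bits(bits: str) -> bytes:
        """Pack a string of '0'/'1' characters into bytes, so the ASCII
        encoding itself does not make every string compressible."""
        return int(bits, 2).to_bytes((len(bits) + 7) // 8, "big")

    def compressibility_deficiency(bits: str) -> int:
        """Crude stand-in for 'length minus complexity': how many bytes
        zlib saves on the packed bit string. Compressed size is only an
        upper bound on Kolmogorov complexity, which is uncomputable."""
        data = pack_bits(bits)
        return len(data) - len(zlib.compress(data, 9))

    regular = "01" * 500  # highly structured 1000-bit string
    random.seed(0)
    rnd = "".join(random.choice("01") for _ in range(1000))  # pseudo-random bits

    print(compressibility_deficiency(regular))  # large: far from random
    print(compressibility_deficiency(rnd))      # near zero: looks random
    ```

    A string counts as random-looking under this proxy when its deficiency is near zero, i.e. no shorter description is found by the compressor.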

    Output spectrum of a detector measuring quantum oscillations

    We consider a two-level quantum system (qubit) which is continuously measured by a detector and calculate the spectral density of the detector output. In the weakly coupled case the spectrum exhibits a moderate peak at the frequency of quantum oscillations and a Lorentzian-shape increase of the detector noise at low frequency. With increasing coupling the spectrum transforms into a single Lorentzian corresponding to random jumps between two states. We prove that the Bayesian formalism for the selective evolution of the density matrix gives the same spectrum as the conventional master equation approach, despite the significant difference in interpretation. The effects of detector nonideality and the finite-temperature environment are also discussed. Comment: 8 pages, 6 figures
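    The weak-coupling spectrum described above, a peak at the oscillation frequency on top of a low-frequency Lorentzian rise, can be sketched with Lorentzian line shapes. The parameters Omega and Gamma below are hypothetical, chosen for illustration rather than taken from the paper:

    ```python
    import numpy as np

    def lorentzian(omega, center, width):
        """Normalized Lorentzian line shape centered at `center`."""
        return (width / np.pi) / ((omega - center)**2 + width**2)

    # Hypothetical parameters: qubit oscillation frequency Omega and a
    # measurement-induced broadening rate Gamma (weak coupling: Gamma << Omega).
    Omega, Gamma = 1.0, 0.05

    omega = np.linspace(0.0, 2.0, 2001)
    # Peak at the oscillation frequency plus a low-frequency noise rise,
    # mimicking the qualitative weak-coupling shape from the abstract.
    S = lorentzian(omega, Omega, Gamma) + 0.5 * lorentzian(omega, 0.0, Gamma)

    print(omega[np.argmax(S)])  # the spectrum is maximal near Omega
    ```

    As the relative weight and width of the zero-frequency term grow with coupling, the two features merge into the single Lorentzian of the strongly coupled regime.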