2,071 research outputs found

    Probing quantum-classical boundary with compression software

    We experimentally demonstrate that it is impossible to simulate quantum bipartite correlations with a deterministic universal Turing machine. Our approach is based on the Normalized Information Distance (NID), which allows the comparison of two pieces of data without detailed knowledge about their origin. Using NID, we derive an inequality for the outputs of two local deterministic universal Turing machines with correlated inputs. This inequality is violated by correlations generated by a maximally entangled polarization state of two photons. The violation is shown using a freely available lossless compression program. The presented technique may make it possible to complement the common statistical interpretation of quantum physics with an algorithmic one. Comment: 7 pages, 6 figures
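    The distance the abstract relies on can be estimated with any off-the-shelf lossless compressor. As a rough illustration (not the authors' actual setup), the following Python sketch approximates NID by the Normalized Compression Distance, NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), using zlib as the compressor; the names `clen` and `ncd` are mine.

```python
import os
import zlib

def clen(data: bytes) -> int:
    # Compressed length under zlib; any lossless compressor could stand in.
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized Compression Distance, the computable estimate of NID.
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = os.urandom(4096)                        # incompressible random block
print(round(ncd(a, a), 2))                  # near 0: a string is close to itself
print(round(ncd(a, os.urandom(4096)), 2))   # near 1: independent strings
```

    In the spirit of the paper, the roles of x and y would be played by the output strings of the two local machines fed with correlated inputs.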

    Classical computing, quantum computing, and Shor's factoring algorithm

    This is an expository talk written for the Bourbaki Seminar. After a brief introduction, Section 1 discusses, in categorical language, the structure of classical deterministic computations. Basic notions of complexity, including the P/NP problem, are reviewed. Section 2 introduces the notion of quantum parallelism and explains the main issues of quantum computing. Section 3 is devoted to four quantum subroutines: initialization, quantum computation of classical Boolean functions, the quantum Fourier transform, and Grover's search algorithm. The central Section 4 explains Shor's factoring algorithm. Section 5 relates Kolmogorov complexity to the spectral properties of computable functions. The Appendix contributes to the prehistory of quantum computing. Comment: 27 pp., no figures, amstex
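    To make the division of labour concrete, here is a hedged Python sketch of the classical scaffolding around the factoring algorithm the talk covers in Section 4: the order of a modulo N, which Shor's algorithm finds efficiently via the quantum Fourier transform, is brute-forced classically here, and the factors are then extracted with gcds. All names are illustrative.

```python
from math import gcd

def find_period(a, n):
    # Order of a modulo n: smallest r > 0 with a**r % n == 1.
    # This is the step a quantum computer speeds up with the quantum
    # Fourier transform; a classical brute-force loop stands in here.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    # Classical post-processing: an even period r with a**(r//2) != -1 (mod n)
    # yields nontrivial factors gcd(a**(r//2) +/- 1, n).
    g = gcd(a, n)
    if g != 1:
        return g, n // g    # lucky guess: a already shares a factor with n
    r = find_period(a, n)
    if r % 2 == 1:
        return None         # odd period: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None         # a**(r//2) == -1 (mod n): retry
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_factor(15, 7))   # (3, 5): the period of 7 mod 15 is 4
```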

    A Computational Model for Quantum Measurement

    Is the dynamical evolution of physical systems objectively a manifestation of information processing by the universe? We find that an affirmative answer has important consequences for the measurement problem. In particular, we calculate the amount of quantum information processing involved in the evolution of physical systems, assuming a finite degree of fine-graining of Hilbert space. This assumption is shown to imply that there is a finite capacity to sustain the immense entanglement that measurement entails. When this capacity is overwhelmed, the system's unitary evolution becomes computationally unstable and the system undergoes an information transition (`collapse'). Classical behaviour arises from rapid cycles of unitary evolution and information transitions. Thus, the fine-graining of Hilbert space determines the location of the `Heisenberg cut': the mesoscopic threshold separating the microscopic, quantum system from the macroscopic, classical environment. The model can be viewed as a probabilistic complement to decoherence, one that completes the measurement process by turning decohered improper mixtures of states into proper mixtures. It is shown to provide a natural resolution to both the measurement problem and the basis problem. Comment: 24 pages; REVTeX4; published version
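    The abstract's premise, that sustaining entanglement consumes a finite processing capacity, presupposes a way of quantifying entanglement. As a generic illustration only (the paper's actual accounting depends on its fine-graining assumption), the standard measure for a pure bipartite state is the von Neumann entropy of either reduced state; a minimal numpy sketch:

```python
import numpy as np

def entanglement_entropy(psi, dim_a, dim_b):
    # Von Neumann entropy (in bits) of subsystem A for a pure state |psi>
    # on a bipartite Hilbert space of dimension dim_a * dim_b.
    coeffs = np.asarray(psi).reshape(dim_a, dim_b)
    s = np.linalg.svd(coeffs, compute_uv=False)   # Schmidt coefficients
    p = s**2
    p = p[p > 1e-12]                              # drop numerical zeros
    return float(-np.sum(p * np.log2(p)))

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)        # maximally entangled pair
print(entanglement_entropy(bell, 2, 2))           # 1.0 bit
```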