
    The Quantum Frontier

    The success of the abstract model of computation, in terms of bits, logical operations, programming language constructs, and the like, makes it easy to forget that computation is a physical process. Our cherished notions of computation and information are grounded in classical mechanics, but the physics underlying our world is quantum. In the early 1980s, researchers began to ask how computation would change if we adopted a quantum mechanical, instead of a classical mechanical, view of computation. Slowly, a new picture of computation arose, one that gave rise to a variety of faster algorithms, novel cryptographic mechanisms, and alternative methods of communication. Small quantum information processing devices have been built, and efforts are underway to build larger ones. Even apart from the existence of these devices, the quantum view of information processing has provided significant insight into the nature of computation and information, and a deeper understanding of the physics of our universe and its connections with computation. We start by describing aspects of quantum mechanics that are at the heart of a quantum view of information processing. We give our own idiosyncratic view of a number of these topics in the hope of correcting common misconceptions and highlighting aspects that are often overlooked. A number of the phenomena described were initially viewed as oddities of quantum mechanics. It was quantum information processing, first quantum cryptography and then, more dramatically, quantum computing, that turned the tables and showed that these oddities could be put to practical effect. It is these applications we describe next. We conclude with a section describing some of the many questions left for future work, especially the mysteries surrounding where the power of quantum information ultimately comes from.
    Comment: Invited book chapter for Computation for Humanity - Information Technology to Advance Society, to be published by CRC Press. Concepts clarified and style made more uniform in version 2. Many thanks to the referees for their suggestions for improvement.

    Analysis of pivot sampling in dual-pivot Quicksort: A holistic analysis of Yaroslavskiy's partitioning scheme

    The new dual-pivot Quicksort by Vladimir Yaroslavskiy, used in Oracle's Java runtime library since version 7, features intriguing asymmetries. They make a basic variant of this algorithm use fewer comparisons than classic single-pivot Quicksort. In this paper, we extend the analysis to the case where the two pivots are chosen as fixed order statistics of a random sample. Surprisingly, dual-pivot Quicksort then needs more comparisons than a corresponding version of classic Quicksort, so it is clear that counting comparisons is not sufficient to explain the running time advantages observed for Yaroslavskiy's algorithm in practice. Consequently, we take a more holistic approach and also give the precise leading term of the average number of swaps, the number of executed Java Bytecode instructions, and the number of scanned elements, a new simple cost measure that approximates I/O costs in the memory hierarchy. We determine optimal order statistics for each of the cost measures. It turns out that the asymmetries in Yaroslavskiy's algorithm render pivots with a systematic skew more efficient than the symmetric choice. Moreover, we finally have a convincing explanation for the success of Yaroslavskiy's algorithm in practice: compared with corresponding versions of classic single-pivot Quicksort, dual-pivot Quicksort needs significantly fewer I/Os, both with and without pivot sampling.
    The final publication is available at Springer via http://dx.doi.org/10.1007/s00453-015-0041-7. Peer reviewed. Postprint (author's final draft).
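    The three-way partitioning at the heart of this algorithm can be illustrated with a minimal Python sketch of a Yaroslavskiy-style dual-pivot scheme (an illustrative simplification, not the tuned Java library code analyzed in the paper):

```python
def dual_pivot_quicksort(a, lo=0, hi=None):
    # Two pivots p <= q split the array into three regions
    # (< p, between p and q, > q); each region is sorted recursively.
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    if a[lo] > a[hi]:
        a[lo], a[hi] = a[hi], a[lo]
    p, q = a[lo], a[hi]
    l = lo + 1       # right boundary of the "< p" region
    g = hi - 1       # left boundary of the "> q" region
    k = l
    while k <= g:
        if a[k] < p:
            a[k], a[l] = a[l], a[k]
            l += 1
        elif a[k] > q:
            while a[g] > q and k < g:
                g -= 1
            a[k], a[g] = a[g], a[k]
            g -= 1
            if a[k] < p:
                a[k], a[l] = a[l], a[k]
                l += 1
        k += 1
    l -= 1
    g += 1
    a[lo], a[l] = a[l], a[lo]    # move the pivots into place
    a[hi], a[g] = a[g], a[hi]
    dual_pivot_quicksort(a, lo, l - 1)
    dual_pivot_quicksort(a, l + 1, g - 1)
    dual_pivot_quicksort(a, g + 1, hi)
    return a
```

    Here the pivots are simply the first and last elements; the paper's analysis concerns choosing them instead as order statistics of a larger random sample.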

    Quantum violations in the Instrumental scenario and their relations to the Bell scenario

    The causal structure of any experiment implies restrictions on the observable correlations between measurement outcomes, which are different for experiments exploiting classical, quantum, or post-quantum resources. In the study of Bell nonlocality, these differences have been explored in great detail for more and more involved causal structures. Here, we go in the opposite direction and identify the simplest causal structure which exhibits a separation between classical, quantum, and post-quantum correlations. It arises in the so-called Instrumental scenario, known from classical causal models. We derive inequalities for this scenario and show that they are closely related to well-known Bell inequalities, such as the Clauser-Horne-Shimony-Holt inequality, which enables us to easily identify their classical, quantum, and post-quantum bounds as well as strategies violating the first two. The relations that we uncover imply that the quantum or post-quantum advantages witnessed by the violation of our Instrumental inequalities are not fundamentally different from those witnessed by the violations of standard inequalities in the usual Bell scenario. However, non-classical tests in the Instrumental scenario require fewer input choices than their Bell scenario counterpart, which may have potential implications for device-independent protocols.
    Comment: 12 pages, 3 figures. Comments welcome! v4: published version in Quantum journal.
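    As a numeric illustration of the bounds mentioned above (not taken from the paper itself), the CHSH value of a maximally entangled photon pair at standard polarizer angles reaches the quantum Tsirelson bound of 2*sqrt(2), exceeding the classical bound of 2:

```python
import math

def E(alpha, beta):
    # Quantum correlation for a maximally entangled photon pair
    # measured at polarizer angles alpha and beta (radians).
    return math.cos(2 * (alpha - beta))

# Standard optimal settings: Alice at 0 and 45 degrees,
# Bob at 22.5 and 67.5 degrees.
a1, a2 = 0.0, math.pi / 4
b1, b2 = math.pi / 8, 3 * math.pi / 8

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
# Local hidden variable bound: |S| <= 2
# Quantum (Tsirelson) bound:   |S| <= 2*sqrt(2) ~ 2.828
```

    The angle convention here is the photon-polarization one; the same maximal violation arises for spin measurements with the angles halved.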

    Modelling and feedback control design for quantum state preparation

    The goal of this article is to provide a largely self-contained introduction to the modelling of controlled quantum systems under continuous observation, and to the design of feedback controls that prepare particular quantum states. We describe a bottom-up approach, where a field-theoretic model is subjected to statistical inference and is ultimately controlled. As an example, the formalism is applied to a highly idealized interaction of an atomic ensemble with an optical field. Our aim is to provide a unified outline for the modelling, from first principles, of realistic experiments in quantum control.

    Are we bootstrapping the right thing? A new approach to quantify uncertainty of Average Treatment Effect Estimate

    Existing approaches that use the bootstrap method to derive the standard error and confidence interval of an average treatment effect estimate have one potential issue: they are actually bootstrapping the wrong thing, resulting in invalid statistical inference. In this paper, we discuss this important issue and propose a new non-parametric bootstrap method that can more precisely quantify the uncertainty associated with average treatment effect estimates. We demonstrate the validity of this approach through a simulation study and a real-world example, and highlight the importance of deriving standard errors and confidence intervals of average treatment effect estimates that both remove extra undesired noise and are easy to interpret when applied in real-world scenarios.
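    For context, the conventional nonparametric bootstrap of a difference-in-means ATE estimate, the kind of procedure whose validity the abstract questions, can be sketched as follows (an illustrative sketch with hypothetical names, not the paper's proposed method):

```python
import random
import statistics

def bootstrap_ate_se(treated, control, n_boot=2000, seed=0):
    # Conventional nonparametric bootstrap: resample each arm with
    # replacement, recompute the difference-in-means ATE estimate,
    # and report the standard deviation of the resampled estimates
    # as the standard error. (Hypothetical helper, for contrast.)
    rng = random.Random(seed)
    ates = []
    for _ in range(n_boot):
        t = [rng.choice(treated) for _ in treated]
        c = [rng.choice(control) for _ in control]
        ates.append(statistics.mean(t) - statistics.mean(c))
    return statistics.stdev(ates)

# Toy data: a constant treatment effect of 0.5 on top of a shared trend.
treated = [1.0 + 0.1 * i for i in range(50)]
control = [0.5 + 0.1 * i for i in range(50)]
se = bootstrap_ate_se(treated, control)
```

    The paper's point is that resampling in this naive way targets the wrong sampling distribution; the sketch above only shows the baseline being critiqued.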

    Is "the theory of everything" merely the ultimate ensemble theory?

    We discuss some physical consequences of what might be called "the ultimate ensemble theory", where not only worlds corresponding to, say, different sets of initial data or different physical constants are considered equally real, but also worlds ruled by altogether different equations. The only postulate in this theory is that all structures that exist mathematically exist also physically, by which we mean that in those complex enough to contain self-aware substructures (SASs), these SASs will subjectively perceive themselves as existing in a physically "real" world. We find that it is far from clear that this simple theory, which has no free parameters whatsoever, is observationally ruled out. The predictions of the theory take the form of probability distributions for the outcome of experiments, which makes it testable. In addition, it may be possible to rule it out by comparing its a priori predictions for the observable attributes of nature (the particle masses, the dimensionality of spacetime, etc.) with what is observed.
    Comment: 29 pages, revised to match version published in Annals of Physics. The New Scientist article and color figures are available at http://www.sns.ias.edu/~max/toe_frames.html or from [email protected]

    Investigation of light scattering in highly reflecting pigmented coatings. Volume 3 - Monte Carlo and other statistical investigations Final report, 1 May 1963 - 30 Sep. 1966

    Monte Carlo methods, Mie theory, and random-walk and screen models for predicting the reflective properties of paint films.
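    A toy version of such a Monte Carlo random-walk calculation might look like the following (a hypothetical 1-D simplification with illustrative parameters, not the model from the report):

```python
import random

def reflectance_mc(albedo=0.95, thickness=10, n_photons=20000, seed=1):
    # Toy 1-D random walk of photons through a scattering film:
    # at each step a photon is absorbed with probability 1 - albedo,
    # otherwise it scatters one layer up or down with equal chance.
    # Photons escaping through the top surface count as reflected.
    rng = random.Random(seed)
    reflected = 0
    for _ in range(n_photons):
        depth = 0                    # photon enters at the surface
        while True:
            if rng.random() > albedo:
                break                # absorbed inside the film
            depth += 1 if rng.random() < 0.5 else -1
            if depth < 0:
                reflected += 1       # escaped through the top
                break
            if depth > thickness:
                break                # transmitted out the bottom
    return reflected / n_photons
```

    As one would expect for a highly reflecting white coating, raising the single-scattering albedo toward 1 drives the simulated reflectance up.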