
    Quantum probabilities as Bayesian probabilities

    In the Bayesian approach to probability theory, probability quantifies a degree of belief for a single trial, without any a priori connection to limiting frequencies. In this paper we show that, despite being prescribed by a fundamental law, probabilities for individual quantum systems can be understood within the Bayesian approach. We argue that the distinction between classical and quantum probabilities lies not in their definition, but in the nature of the information they encode. In the classical world, maximal information about a physical system is complete in the sense of providing definite answers for all possible questions that can be asked of the system. In the quantum world, maximal information is not complete and cannot be completed. Using this distinction, we show that any Bayesian probability assignment in quantum mechanics must have the form of the quantum probability rule, that maximal information about a quantum system leads to a unique quantum-state assignment, and that quantum theory provides a stronger connection between probability and measured frequency than can be justified classically. Finally we give a Bayesian formulation of quantum-state tomography. Comment: 6 pages, LaTeX, final version.
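
    The "quantum probability rule" mentioned above is the Born rule. For orientation only (this is the standard textbook form, not a quotation from the paper), with the state of knowledge encoded in a density operator rho and a measurement described by a POVM {E_k}:

```latex
% Born rule: probability of outcome k, given density operator rho and
% measurement elements E_k (positive operators that sum to the identity).
p(k) = \mathrm{Tr}\!\left(\rho\, E_k\right), \qquad E_k \ge 0, \quad \sum_k E_k = \mathbb{1}.
```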

    Unknown Quantum States: The Quantum de Finetti Representation

    We present an elementary proof of the quantum de Finetti representation theorem, a quantum analogue of de Finetti's classical theorem on exchangeable probability assignments. This contrasts with the original proof of Hudson and Moody [Z. Wahrschein. verw. Geb. 33, 343 (1976)], which relies on advanced mathematics and does not share the same potential for generalization. The classical de Finetti theorem provides an operational definition of the concept of an unknown probability in Bayesian probability theory, where probabilities are taken to be degrees of belief instead of objective states of nature. The quantum de Finetti theorem, in a closely analogous fashion, deals with exchangeable density-operator assignments and provides an operational definition of the concept of an "unknown quantum state" in quantum-state tomography. This result is especially important for information-based interpretations of quantum mechanics, where quantum states, like probabilities, are taken to be states of knowledge rather than states of nature. We further demonstrate that the theorem fails for real Hilbert spaces and discuss the significance of this point. Comment: 30 pages, 2 figures.
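
    For orientation, the standard statement of the representation (as usually written, not quoted from the paper): an exchangeable sequence of density-operator assignments admits a unique decomposition as a mixture of independent, identically prepared systems.

```latex
% Quantum de Finetti representation: rho^(N) is the joint state assigned to
% N systems drawn from an exchangeable sequence; P(rho) is a probability
% density over single-system density operators -- the operational content
% of an "unknown quantum state".
\rho^{(N)} = \int P(\rho)\, \rho^{\otimes N}\, \mathrm{d}\rho .
```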

    Designing and Implementing a Competency-Based Training Program for Anesthesiology Residents at the University of Ottawa

    Competency-based medical education is gaining traction as a solution to address the challenges associated with the current time-based models of physician training. Competency-based medical education is an outcomes-based approach that involves identifying the abilities required of physicians and then designing the curriculum to support the achievement and assessment of these competencies. This paradigm defies the assumption that competence is achieved based on time spent on rotations and instead requires residents to demonstrate competence. The Royal College of Physicians and Surgeons of Canada (RCPSC) has launched Competence by Design (CBD), a competency-based approach for residency training and specialty practice. The first residents to be trained within this model will be those in medical oncology and otolaryngology-head and neck surgery in July 2016. However, with approval from the RCPSC, the Department of Anesthesiology, University of Ottawa, launched an innovative competency-based residency training program on July 1, 2015. The purpose of this paper is to provide an overview of the program and offer a blueprint for other programs planning similar curricular reform. The program is structured according to the RCPSC CBD stages and addresses all CanMEDS roles. While our program retains some aspects of the traditional design, we have made many transformational changes.

    Using large-scale genomics data to identify driver mutations in lung cancer: methods and challenges

    Lung cancer is the commonest cause of cancer death in the world and carries a poor prognosis for most patients. While precision targeting of mutated proteins has given some successes for never- and light-smoking patients, there are no proven targeted therapies for the majority of smokers with the disease. Despite the sequencing of hundreds of lung cancers, known driver mutations are lacking for the majority of tumors. Distinguishing driver mutations from inconsequential passenger mutations in a given lung tumor is extremely challenging due to the high mutational burden of smoking-related cancers. Here we discuss the methods employed to identify driver mutations from these large datasets. We examine different approaches based on bioinformatics, in silico structural modeling and biological dependency screens, and discuss the limitations of these approaches.
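
    One of the simplest bioinformatic strategies alluded to above is recurrence testing: flag genes that are mutated in more tumours than a background mutation rate would predict. A minimal sketch of that idea follows; the gene names, lengths, counts, cohort size and the uniform background rate are illustrative assumptions, not data from the review.

```python
# Sketch of a recurrence-based driver test (illustrative values only):
# genes mutated more often than the background rate predicts are flagged
# as candidate drivers via a one-sided binomial test.
from scipy.stats import binom

background_rate = 1e-6   # assumed mutations per base per tumour
n_tumours = 500          # assumed cohort size

genes = {
    # gene: (coding length in bases, tumours carrying a non-silent mutation)
    "TP53": (1182, 310),
    "KRAS": (567, 140),
    "GENE_X": (3000, 4),
}

for gene, (length, mutated) in genes.items():
    # Chance that a given tumour hits this gene at least once by accident.
    p_bg = 1.0 - (1.0 - background_rate) ** length
    # P(observing this many mutated tumours or more under the background model).
    p_value = binom.sf(mutated - 1, n_tumours, p_bg)
    print(f"{gene}: mutated in {mutated}/{n_tumours} tumours, p = {p_value:.2e}")
```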

    Predicting human interruptibility with sensors, in

    A person seeking someone else’s attention is normally able to quickly assess how interruptible they are. This assessment allows for behavior we perceive as natural, socially appropriate, or simply polite. On the other hand, today’s computer systems are almost entirely oblivious to the human world they operate in, and typically have no way to take into account the interruptibility of the user. This paper presents a Wizard of Oz study exploring whether, and how, robust sensor-based predictions of interruptibility might be constructed, which sensors might be most useful to such predictions, and how simple such sensors might be. The study simulates a range of possible sensors through human coding of audio and video recordings. Experience sampling is used to simultaneously collect randomly distributed self-reports of interruptibility. Based on these simulated sensors, we construct statistical models predicting human interruptibility and compare their predictions with the collected self-report data. The results of these models, although covering a demographically limited sample, are very promising, with the overall accuracy of several models reaching about 78%. Additionally, a model tuned to avoiding unwanted interruptions does so for 90% of its predictions, while retaining 75% overall accuracy.
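
    As a rough illustration of the modelling step (not the authors' models, sensors or data), a simple classifier over a few binary sensor cues can be cross-validated against self-reported interruptibility; the feature set and synthetic labels below are assumptions.

```python
# Illustrative sketch only: predict a binary "do not interrupt" label from
# a few simulated binary sensor cues. Features and labels are synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 600
X = np.column_stack([
    rng.integers(0, 2, n),   # is anyone in the office talking?
    rng.integers(0, 2, n),   # is the phone off-hook?
    rng.integers(0, 2, n),   # is a guest present?
    rng.integers(0, 2, n),   # recent keyboard/mouse activity?
])
# Synthetic "ground truth": driven by the talking/phone cues, with 10% label noise.
signal = X[:, 0] | X[:, 1]
y = np.where(rng.random(n) < 0.9, signal, 1 - signal)

model = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(model, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```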

    Impact of electrostatic crosstalk on spin qubits in dense CMOS quantum dot arrays

    Quantum processors based on integrated nanoscale silicon spin qubits are a promising platform for highly scalable quantum computation. Current CMOS spin qubit processors consist of dense gate arrays to define the quantum dots, making them susceptible to crosstalk from capacitive coupling between a dot and its neighbouring gates. Small but sizeable spin-orbit interactions can transfer this electrostatic crosstalk to the spin g-factors, creating a dependence of the Larmor frequency on the electric field created by gate electrodes positioned even tens of nanometers apart. By studying the Stark shift from tens of spin qubits measured in nine different CMOS devices, we developed a theoretical framework that explains how electric fields couple to the spin of the electrons in increasingly complex arrays, including those electric fluctuations that limit qubit dephasing times T2*. The results will aid in the design of robust strategies to scale CMOS quantum technology. Comment: 9 pages, 4 figures.
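
    To make the crosstalk mechanism concrete, a back-of-the-envelope estimate (all numbers below are assumptions chosen for illustration, not values from the paper): a voltage-dependent change in the electron g-factor shifts the Larmor frequency in proportion to the applied magnetic field.

```python
# Rough estimate of how a gate-voltage-induced g-factor shift (Stark shift)
# moves a spin qubit's Larmor frequency. All parameter values are assumed.
MU_B = 9.274e-24   # Bohr magneton, J/T
H = 6.626e-34      # Planck constant, J*s

B = 1.0            # applied magnetic field, T (assumed)
g = 1.998          # approximate electron g-factor in silicon
dg_dV = 1e-4       # assumed g-factor sensitivity to a neighbouring gate, 1/V
dV = 1e-3          # assumed 1 mV fluctuation on that gate

f_larmor = g * MU_B * B / H        # Larmor frequency, Hz
df = dg_dV * dV * MU_B * B / H     # resulting frequency shift, Hz

print(f"Larmor frequency: {f_larmor / 1e9:.2f} GHz")
print(f"Shift from a 1 mV crosstalk fluctuation: {df / 1e3:.1f} kHz")
```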

    Bounds to electron spin qubit variability for scalable CMOS architectures

    Spins of electrons in CMOS quantum dots combine exquisite quantum properties and scalable fabrication. In the age of quantum technology, however, the metrics that crowned Si/SiO2 as the microelectronics standard need to be reassessed with respect to their impact upon qubit performance. We chart the spin qubit variability due to the unavoidable atomic-scale roughness of the Si/SiO2 interface, compiling experiments in 12 devices, and developing theoretical tools to analyse these results. Atomistic tight binding and path integral Monte Carlo methods are adapted for describing fluctuations in devices with millions of atoms by directly analysing their wavefunctions and electron paths instead of their energy spectra. We correlate the effect of roughness with the variability in qubit position, deformation, valley splitting, valley phase, spin-orbit coupling and exchange coupling. These variabilities are found to be bounded and lie within the tolerances for scalable architectures for quantum computing as long as robust control methods are incorporated. Comment: 20 pages, 8 figures.
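
    As a toy-scale illustration of the tight-binding viewpoint (the paper's atomistic calculations involve millions of atoms; the one-dimensional chain and all parameters below are assumed stand-ins), random on-site disorder playing the role of interface roughness shifts where the ground-state wavefunction sits and how far it spreads:

```python
# Toy 1D tight-binding chain with random on-site disorder standing in for
# interface roughness. Inspects the ground-state wavefunction directly
# (position and spread) rather than just the energy spectrum.
import numpy as np

rng = np.random.default_rng(1)
n_sites = 200
t = 1.0                                         # hopping energy (arbitrary units)
onsite = 0.05 * rng.standard_normal(n_sites)    # disorder ("roughness")

# Nearest-neighbour tight-binding Hamiltonian.
H = np.diag(onsite) - t * (np.eye(n_sites, k=1) + np.eye(n_sites, k=-1))
energies, states = np.linalg.eigh(H)

psi0 = states[:, 0]                             # ground-state wavefunction
sites = np.arange(n_sites)
centre = np.sum(sites * psi0**2)
spread = np.sqrt(np.sum((sites - centre) ** 2 * psi0**2))
print(f"ground-state energy {energies[0]:.3f}, centre site {centre:.1f}, spread {spread:.1f} sites")
```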

    Precision tomography of a three-qubit electron-nuclear quantum processor in silicon

    Nuclear spins were among the first physical platforms to be considered for quantum information processing, because of their exceptional quantum coherence and atomic-scale footprint. However, their full potential for quantum computing has not yet been realized, due to the lack of methods to link nuclear qubits within a scalable device combined with multi-qubit operations with sufficient fidelity to sustain fault-tolerant quantum computation. Here we demonstrate universal quantum logic operations using a pair of ion-implanted 31P nuclei in a silicon nanoelectronic device. A nuclear two-qubit controlled-Z gate is obtained by imparting a geometric phase to a shared electron spin, and used to prepare entangled Bell states with fidelities up to 94.2(2.7)%. The quantum operations are precisely characterised using gate set tomography (GST), yielding one-qubit gate fidelities up to 99.93(3)%, a two-qubit gate fidelity of 99.21(14)% and two-qubit preparation/measurement fidelities of 98.95(4)%. These three metrics indicate that nuclear spins in silicon are approaching the performance demanded in fault-tolerant quantum processors. We then demonstrate entanglement between the two nuclei and the shared electron by producing a Greenberger-Horne-Zeilinger three-qubit state with 92.5(1.0)% fidelity. Since electron spin qubits in semiconductors can be further coupled to other electrons or physically shuttled across different locations, these results establish a viable route for scalable quantum information processing using nuclear spins. Comment: 27 pages, 14 figures, plus 20 pages of supplementary information. v2 includes new and updated references, and minor text changes.
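
    For readers unfamiliar with the state-fidelity figures quoted above, here is a minimal numerical sketch; the depolarising error rate is an assumption chosen only to land near a ~94% value, not the paper's error model.

```python
# Fidelity of an (assumed) noisy two-qubit state with the Bell state
# |Phi+> = (|00> + |11>)/sqrt(2). For a pure target, F = <psi| rho |psi>.
import numpy as np

bell = np.zeros(4, dtype=complex)
bell[0] = bell[3] = 1 / np.sqrt(2)              # (|00> + |11>)/sqrt(2)
rho_ideal = np.outer(bell, bell.conj())

p = 0.08                                        # assumed depolarising error
rho_noisy = (1 - p) * rho_ideal + p * np.eye(4) / 4

fidelity = np.real(bell.conj() @ rho_noisy @ bell)
print(f"Bell-state fidelity: {fidelity:.3f}")
```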

    Use of stochastic simulation to evaluate the reduction in methane emissions and improvement in reproductive efficiency from routine hormonal interventions in dairy herds

    This study predicts the magnitude and between-herd variation in changes of methane emissions and production efficiency associated with interventions to improve reproductive efficiency in dairy cows. Data for 10,000 herds of 200 cows were simulated. Probability of conception was predicted daily from the start of the study (parturition) for each cow up to day 300 of lactation. Four scenarios of differing first insemination management were simulated for each herd using the same theoretical cows: a baseline scenario based on breeding from observed oestrus only, synchronisation of oestrus for pre-set first insemination using 2 methods, and a regime using prostaglandin treatments followed by first insemination to observed oestrus. Cows that did not conceive to first insemination were re-inseminated following detection of oestrus. For cows that conceived, gestation length was 280 days with cessation of milking 60 days before calving. Those cows not pregnant after 300 days of lactation were culled and replaced by a heifer. Daily milk yield was calculated for 730 days from the start of the study for each cow. Changes in mean reproductive and economic outputs were summarised for each herd following the 3 interventions. For each scenario, methane emissions were determined by daily forage dry matter intake, forage quality, and cow replacement risk. Linear regression was used to summarise relationships. In some circumstances, improvement in reproductive efficiency using the programmes investigated was associated with reduced cost and methane emissions compared to reliance on detection of oestrus. Efficiency of oestrus detection and the time to commencement of breeding after calving influenced variability in changes in cost and methane emissions. For an average UK herd this was a saving of at least £50 per cow and a 3.6% reduction in methane emissions per L of milk when timing of first insemination was pre-set.
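
    A heavily simplified sketch of the daily-conception simulation idea follows (not the paper's model; the voluntary waiting period, daily conception probability and herd size are illustrative assumptions).

```python
# Simplified herd simulation: each cow gets a daily chance of conception
# after an assumed waiting period; cows still open at day 300 of lactation
# are culled. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_cows = 200
waiting_period = 42           # days after calving before breeding starts (assumed)
daily_p_conception = 0.015    # assumed daily probability once breeding starts

days_to_conception = np.full(n_cows, 300)   # 300 means "not pregnant, culled"
for cow in range(n_cows):
    for day in range(waiting_period, 300):
        if rng.random() < daily_p_conception:
            days_to_conception[cow] = day
            break

conceived = days_to_conception < 300
print(f"mean days to conception: {days_to_conception[conceived].mean():.0f}")
print(f"cows culled for failure to conceive: {np.sum(~conceived)}/{n_cows}")
```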