
    On proving the robustness of algorithms for early fault-tolerant quantum computers

    Full text link
    The hope of the quantum computing field is that quantum architectures are able to scale up and realize fault-tolerant quantum computing. Due to engineering challenges, such "cheap" error correction may be decades away. In the meantime, we anticipate an era of "costly" error correction, or early fault-tolerant quantum computing. Costly error correction might warrant settling for error-prone quantum computations. This motivates the development of quantum algorithms that are robust to some degree of error, as well as methods to analyze their performance in the presence of error. We introduce a randomized algorithm for the task of phase estimation and give an analysis of its performance under two simple noise models. In both cases the analysis leads to a noise threshold, below which arbitrarily high accuracy can be achieved by increasing the number of samples used in the algorithm. As an application of this general analysis, we compute the maximum ratio of the largest circuit depth to the dephasing scale such that the performance guarantees hold. We calculate that the randomized algorithm can succeed with arbitrarily high probability as long as the required circuit depth is less than 0.916 times the dephasing scale.
    Comment: 27 pages, 3 figures, 1 table, 1 algorithm. To be submitted to QIP 202
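    As a rough illustration of why such a threshold matters (a generic toy model, not the paper's analysis), the sketch below checks the quoted depth condition and shows how the sampling cost of a signal measured at that depth would grow under simple exponential dephasing; the dephasing scale and target accuracy are placeholder values.

```python
# Toy illustration, NOT the paper's analysis: under a simple exponential
# dephasing model, a depth-d circuit retains signal contrast exp(-d / T), and
# a Hoeffding-style argument suggests the number of samples needed to resolve
# that signal to accuracy eps grows roughly like exp(2*d/T) / eps**2.
# The 0.916 threshold is quoted from the abstract; T and eps are placeholders.
import math

def max_depth(t_dephase: float, threshold: float = 0.916) -> float:
    """Largest circuit depth satisfying depth < threshold * dephasing scale."""
    return threshold * t_dephase

def toy_sample_cost(depth: float, t_dephase: float, eps: float) -> float:
    """Hypothetical sampling-cost estimate under the toy dephasing model."""
    contrast = math.exp(-depth / t_dephase)
    return 1.0 / (contrast ** 2 * eps ** 2)

T = 1_000.0                        # dephasing scale in gate layers (made up)
d = max_depth(T)
print(f"max useful depth ~ {d:.0f} layers")
print(f"toy sample cost at eps = 0.01: {toy_sample_cost(d, T, 0.01):.2e}")
```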

    Early Fault-Tolerant Quantum Computing

    Full text link
    Over the past decade, research in quantum computing has tended to fall into one of two camps: near-term intermediate scale quantum (NISQ) and fault-tolerant quantum computing (FTQC). Yet, a growing body of work has been investigating how to use quantum computers in the transition between these two eras. This transitional era envisions devices with tens of thousands to millions of physical qubits, able to support fault-tolerant protocols while operating close to the fault-tolerance threshold. Two challenges emerge from this picture: how to model the performance of devices that are continually improving, and how to design algorithms that make the best use of these devices. In this work we develop a model for the performance of early fault-tolerant quantum computing (EFTQC) architectures and use this model to elucidate the regimes in which algorithms suited to such architectures are advantageous. As a concrete example, we show that, for the canonical task of phase estimation, in a regime of moderate scalability and using just over one million physical qubits, the "reach" of the quantum computer can be extended (compared to the standard approach) from 90-qubit instances to over 130-qubit instances using a simple early fault-tolerant quantum algorithm, which reduces the number of operations per circuit by a factor of 100 and increases the number of circuit repetitions by a factor of 10,000. This clarifies the role that such algorithms might play in the era of limited-scalability quantum computing.
    Comment: 20 pages, 8 figures with desmos links, plus appendi
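    Taken at face value, the quoted factors trade total work for circuit depth; the short bookkeeping sketch below (with hypothetical baseline figures) makes that arithmetic explicit.

```python
# Bookkeeping sketch only, using hypothetical baseline figures: the abstract's
# quoted factors (100x fewer operations per circuit, 10,000x more repetitions)
# imply roughly 100x more total operations in exchange for circuits that are
# 100x shallower -- the relevant trade when noise limits usable circuit depth.
baseline_ops_per_circuit = 1e9     # hypothetical standard phase-estimation circuit
baseline_repetitions = 1           # hypothetical

eftqc_ops_per_circuit = baseline_ops_per_circuit / 100
eftqc_repetitions = baseline_repetitions * 10_000

total_standard = baseline_ops_per_circuit * baseline_repetitions
total_eftqc = eftqc_ops_per_circuit * eftqc_repetitions
print(f"total ops, standard: {total_standard:.1e}")
print(f"total ops, EFTQC   : {total_eftqc:.1e}")
print(f"depth reduction: 100x, total-work overhead: {total_eftqc / total_standard:.0f}x")
```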

    Policy mapping : women’s economic empowerment in Rwanda

    Get PDF
    This detailed mapping paper provides context on women’s economic empowerment (WEE) in Rwanda and assesses existing gaps in research, with specific focus on three priority themes: unpaid care work (UCW), gender segregation in labour markets, and women’s collective action. Women continue to lag behind in terms of employment opportunities and have low involvement in entrepreneurship, business development, and cooperatives due to unpaid care work and a lack of specific skills and capacities. The impacts on WEE domains are limited by the small and scattered nature of most programs and projects. The paper identifies a number of policy entry points.
    William and Flora Hewlett Foundation; Bill & Melinda Gates Foundation

    Generation of High-Resolution Handwritten Digits with an Ion-Trap Quantum Computer

    Full text link
    Generating high-quality data (e.g. images or video) is one of the most exciting and challenging frontiers in unsupervised machine learning. Utilizing quantum computers in such tasks to potentially enhance conventional machine learning algorithms has emerged as a promising application, but poses big challenges due to the limited number of qubits and the level of gate noise in available devices. In this work, we provide the first practical and experimental implementation of a quantum-classical generative algorithm capable of generating high-resolution images of handwritten digits with state-of-the-art gate-based quantum computers. In our quantum-assisted machine learning framework, we implement a quantum-circuit based generative model to learn and sample the prior distribution of a Generative Adversarial Network. We introduce a multi-basis technique that leverages the unique possibility of measuring quantum states in different bases, hence enhancing the expressivity of the prior distribution. We train this hybrid algorithm on an ion-trap device based on ^{171}Yb^{+} ion qubits to generate high-quality images and quantitatively outperform comparable classical Generative Adversarial Networks trained on the popular MNIST data set for handwritten digits.
    Comment: 10 pages, 8 figures (more details and discussion in main text for clarity)
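    A minimal numpy sketch of the multi-basis idea is given below (an illustrative stand-in, not the authors' implementation): a single n-qubit state, here a random placeholder for the trained circuit's output, is sampled in the computational (Z) basis and again after rotating every qubit into the X basis, and the two bitstrings are concatenated into a latent vector that a classical GAN generator could consume. All sizes and the state itself are assumptions for illustration.

```python
# Multi-basis sampling sketch (NOT the paper's implementation). A random
# statevector stands in for the output of a trained parameterized circuit;
# measuring it in two bases and concatenating the bits enlarges the latent
# space a downstream GAN generator sees. All sizes are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n, shots = 4, 8
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard (Z -> X basis)

# Placeholder "prior" state (the paper trains a quantum circuit instead)
psi = rng.normal(size=2 ** n) + 1j * rng.normal(size=2 ** n)
psi /= np.linalg.norm(psi)

def apply_1q(state, gate, qubit):
    """Apply a single-qubit gate to one qubit of an n-qubit statevector."""
    view = state.reshape(2 ** qubit, 2, -1)
    return np.einsum("ab,ibj->iaj", gate, view).reshape(-1)

def sample_bits(state, shots):
    """Sample computational-basis bitstrings according to |amplitude|^2."""
    probs = np.abs(state) ** 2
    outcomes = rng.choice(len(state), size=shots, p=probs / probs.sum())
    return (outcomes[:, None] >> np.arange(n - 1, -1, -1)) & 1

psi_x = psi
for q in range(n):                 # rotate into the X basis, qubit by qubit
    psi_x = apply_1q(psi_x, H, q)

latent = np.concatenate([sample_bits(psi, shots), sample_bits(psi_x, shots)], axis=1)
print(latent.shape)                # (shots, 2n): one multi-basis latent per row
```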

    Analyzing the Performance of Variational Quantum Factoring on a Superconducting Quantum Processor

    Full text link
    In the near term, hybrid quantum-classical algorithms hold great potential for outperforming classical approaches. Understanding how these two computing paradigms work in tandem is critical for identifying areas where such hybrid algorithms could provide a quantum advantage. In this work, we study a QAOA-based quantum optimization algorithm by implementing the Variational Quantum Factoring (VQF) algorithm. We execute experimental demonstrations using a superconducting quantum processor and investigate the trade-off between quantum resources (number of qubits and circuit depth) and the probability that a given biprime is successfully factored. In our experiments, the integers 1099551473989, 3127, and 6557 are factored with 3, 4, and 5 qubits, respectively, using a QAOA ansatz with up to 8 layers, and we are able to identify the optimal number of circuit layers for a given instance to maximize the success probability. Furthermore, we demonstrate the impact of different noise sources on the performance of QAOA and reveal the coherent error caused by the residual ZZ-coupling between qubits as a dominant source of error in the superconducting quantum processor.
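    The sketch below is a generic depth-p QAOA statevector evaluation (not the paper's VQF code): it computes the QAOA energy of a small, made-up Ising-type cost, standing in for the clause Hamiltonian that VQF's classical preprocessing would produce; couplings, fields, and angles are placeholders, and in VQF the angles would be tuned by a classical optimizer.

```python
# Generic depth-p QAOA statevector sketch (NOT the paper's VQF implementation).
# The cost below is an illustrative Ising-type stand-in for the clause
# Hamiltonian produced by VQF's classical preprocessing; all couplings,
# fields, and angles are made-up placeholders.
import numpy as np
from itertools import product

n = 3
J = {(0, 1): 1.0, (1, 2): -0.5}        # hypothetical ZZ couplings
h = {0: 0.3, 1: -0.2, 2: 0.1}          # hypothetical Z fields

z = lambda bit: 1 - 2 * bit            # map {0, 1} -> {+1, -1}
costs = np.array([
    sum(J[i, j] * z(b[i]) * z(b[j]) for (i, j) in J)
    + sum(h[i] * z(b[i]) for i in h)
    for b in product((0, 1), repeat=n)
])                                      # diagonal of the cost Hamiltonian

def mix(state, beta):
    """Apply the QAOA mixer exp(-i*beta*X) to every qubit (in place)."""
    for q in range(n):
        v = state.reshape(2 ** q, 2, -1)
        a, b = v[:, 0, :].copy(), v[:, 1, :].copy()
        v[:, 0, :] = np.cos(beta) * a - 1j * np.sin(beta) * b
        v[:, 1, :] = np.cos(beta) * b - 1j * np.sin(beta) * a
    return state

def qaoa_energy(gammas, betas):
    """Energy <C> of the depth-p QAOA state with angles (gammas, betas)."""
    state = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)    # |+...+>
    for gamma, beta in zip(gammas, betas):
        state = np.exp(-1j * gamma * costs) * state           # cost layer
        state = mix(state, beta)                               # mixer layer
    return float(np.real(np.vdot(state, costs * state)))

print(qaoa_energy(gammas=[0.4, 0.7], betas=[0.9, 0.3]))       # p = 2 layers
```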

    Modelling the elimination of river blindness using long-term epidemiological and programmatic data from Mali and Senegal

    Get PDF
    The onchocerciasis transmission models EPIONCHO and ONCHOSIM have been independently developed and used to explore the feasibility of eliminating onchocerciasis from Africa with mass (annual or biannual) distribution of ivermectin within the timeframes proposed by the World Health Organization (WHO) and endorsed by the 2012 London Declaration on Neglected Tropical Diseases (i.e. by 2020/2025). Based on the findings of our previous model comparison, we implemented technical refinements and tested the projections of EPIONCHO and ONCHOSIM against long-term epidemiological data from two West African transmission foci in Mali and Senegal, where the observed prevalence of infection was brought to zero circa 2007–2009 after 15–17 years of mass ivermectin treatment. We simulated these interventions using programmatic information on the frequency and coverage of mass treatments and trained the model projections using longitudinal parasitological data from 27 communities, evaluating the projected outcome of elimination (local parasite extinction) or resurgence. We found that EPIONCHO and ONCHOSIM adequately captured the epidemiological trends during mass treatment, but that resurgence, while never predicted by ONCHOSIM, was predicted by EPIONCHO in some communities with the highest (inferred) vector biting rates and associated pre-intervention endemicities. Resurgence can be extremely protracted, such that a low (microfilarial) prevalence of between 1% and 5% can be maintained for 3–5 years before manifesting more prominently. We highlight that post-treatment and post-elimination surveillance protocols must be implemented for long enough and with high enough sensitivity to detect possible residual latent infections potentially indicative of resurgence. We also discuss uncertainty and differences between EPIONCHO and ONCHOSIM projections, the potential importance of vector control in high-transmission settings as a complementary intervention strategy, and the short remaining timeline for African countries to be ready to stop treatment safely and begin surveillance in order to meet the impending 2020/2025 elimination targets.

    The value of monitoring data in a process evaluation of hygiene behaviour change in Community Health Clubs to explain findings from a cluster-randomised controlled trial in Rwanda.

    Get PDF
    BACKGROUND: A cluster-randomised controlled trial (cRCT) evaluation of the impact of the Community Health Clubs (CHCs) in the Community Based Environmental Health Promotion Programme in Rwanda in 2015 appeared to find little uptake of 7 hygiene indicators 1 year after the end of the intervention, and low impact on prevention of diarrhoea and stunting. METHODS: Monitoring data were revisited through detailed community records, with all the expected inputs, outputs and external determinants analysed for fidelity to the research protocol. Five household inventory observations were taken over a 40-month period, including 2 years after the end of the cRCT, in a random selection of the 50 intervention CHCs, and the data were compared with those of the trial. A Focus Group Discussion with all Environmental Health Officers of the Ministry of Health provided context to understand the long-term community dynamics of hygiene behaviour change. RESULTS: It was found that the intervention had been jeopardised by external determinants, with only 54% fidelity to protocol. By the end of the designated intervention period in June 2014, the treatment had reached only 58% of households, with 41% average attendance at training sessions among the 4056 registered members and a 51% mean completion rate of 20+ sessions. Therefore, only 10% of the 50 CHCs provided the full so-called 'Classic' training per protocol. However, sustainability of the CHCs was high: all 50 were still active 2 years after the end of the cRCT, and by 2017 uptake of the recommended practices for the same 7 key indicators used in the trial exceeded 80%. CONCLUSIONS: The cRCT conclusion that the case study of Rusizi District does not encourage the use of the CHC model for scaling up raises concerns over the possible misrepresentation of the potential of the holistic CHC model to achieve health impact in a more realistic time frame. It also questions the appropriateness of apparently rigorous quantitative research, such as the cluster-randomised controlled trial as conducted in Rusizi District, to adequately assess community dynamics in complex interventions.