

    Parallel window decoding enables scalable fault tolerant quantum computation

    Quantum Error Correction (QEC) continuously generates a stream of syndrome data that contains information about the errors in the system. Useful fault-tolerant quantum computation requires online decoders that are capable of processing this syndrome data at the rate it is received. Otherwise, a data backlog is created that grows exponentially with the T-gate depth of the computation. Superconducting quantum devices can perform QEC rounds in sub-1 μs time, setting a stringent requirement on the speed of the decoders. All current decoder proposals have a maximum code size beyond which the processing of syndromes becomes too slow to keep up with the data acquisition, thereby making the fault-tolerant computation not scalable. Here we present a methodology that parallelizes the decoding problem and achieves almost arbitrary syndrome processing speed. Our parallelization requires some classical feedback decisions to be delayed, leading to a slow-down of the logical clock speed. However, the slow-down is now polynomial in code size, and so an exponential backlog is averted. Furthermore, using known auto-teleportation gadgets, the slow-down can be eliminated altogether in exchange for increased qubit overhead, all polynomially scaling. We demonstrate our parallelization speed-up using a Python implementation, combining it with both union-find and minimum-weight perfect matching. Furthermore, we show that the algorithm imposes no noticeable reduction in logical fidelity compared to the original global decoder. Finally, we discuss how the same methodology can be implemented in online hardware decoders. Comment: 12 pages, 7 figures
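    The backlog argument can be made concrete with a toy throughput model (our sketch, assuming a simple stall-and-drain schedule; not the paper's implementation): before each layer of T gates, the computation stalls until all pending syndrome rounds are decoded, while fresh rounds keep arriving during the stall.

```python
def backlog_after_t_layers(t_layers, rounds_per_layer, decode_time, round_time):
    """Toy model of serial (non-parallel) decoding: before each logical T gate
    the computation stalls until every pending syndrome round is decoded, and
    fresh rounds keep arriving, one per `round_time`, during the stall."""
    pending = 0.0
    for _ in range(t_layers):
        pending += rounds_per_layer      # rounds generated during this layer
        stall = pending * decode_time    # time spent draining the backlog
        pending = stall / round_time     # rounds that arrived while draining
    return pending
```

    When `decode_time > round_time` the backlog grows geometrically with T-gate depth; when `decode_time < round_time` it stays bounded, which is the regime parallel window decoding restores by splitting the syndrome stream across decoders.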

    Reversible adsorption on a random site surface

    We examine the reversible adsorption of hard spheres on a random site surface in which the adsorption sites are uniformly and randomly distributed on a plane. Each site can be occupied by one solute provided that the nearest occupied site is at least one diameter away. We use a numerical method to obtain the adsorption isotherm, i.e. the number of adsorbed particles as a function of the bulk activity. The maximum coverage is obtained in the limit of infinite activity and is known exactly in the limits of low and high site density. An approximate theory for the adsorption isotherms, valid at low site density, is developed by using a cluster expansion of the grand canonical partition function. This requires as input the number of clusters of adsorption sites of a given size. The theory is accurate for the entire range of activity as long as the site density is less than about 0.3 sites per particle area. We also discuss a connection between this model and the vertex cover problem. Comment: 16 pages, 10 figures
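    The model setup can be illustrated with a short grand-canonical Monte Carlo run (a hedged sketch under our own assumptions about box geometry and move rules; not the authors' numerical method): coverage rises with activity toward a jammed maximum set by the random site pattern.

```python
import math
import random

def adsorption_isotherm(n_sites, site_density, activity, steps, seed=0):
    """Grand-canonical Monte Carlo sketch of reversible adsorption on a random
    site surface: sites are uniform in a periodic square box, particles have
    diameter 1, and a site may be occupied only if every other occupied site
    is at least one diameter away. Returns the fraction of occupied sites."""
    rng = random.Random(seed)
    box = math.sqrt(n_sites / site_density)  # box side giving the requested density
    sites = [(rng.uniform(0, box), rng.uniform(0, box)) for _ in range(n_sites)]
    occupied = set()

    def dist2(a, b):
        # minimum-image squared distance under periodic boundaries
        dx = abs(sites[a][0] - sites[b][0]); dx = min(dx, box - dx)
        dy = abs(sites[a][1] - sites[b][1]); dy = min(dy, box - dy)
        return dx * dx + dy * dy

    for _ in range(steps):
        i = rng.randrange(n_sites)
        if i in occupied:
            # desorption attempt: Metropolis acceptance min(1, 1/z)
            if rng.random() < min(1.0, 1.0 / activity):
                occupied.remove(i)
        elif all(dist2(i, j) >= 1.0 for j in occupied):
            # adsorption attempt: hard-core check plus acceptance min(1, z)
            if rng.random() < min(1.0, activity):
                occupied.add(i)
    return len(occupied) / n_sites
```

    For a single isolated site this scheme equilibrates to occupancy z/(1+z), and pushing the activity toward infinity approaches the jammed maximum coverage discussed in the abstract.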

    A real-time, scalable, fast and highly resource efficient decoder for a quantum computer

    Quantum computers promise to solve computing problems that are currently intractable using traditional approaches. This can only be achieved if the noise inevitably present in quantum computers can be efficiently managed at scale. A key component in this process is a classical decoder, which diagnoses the errors occurring in the system. If the decoder does not operate fast enough, an exponential slowdown in the logical clock rate of the quantum computer occurs. Additionally, the decoder must be resource efficient to enable scaling to larger systems and potentially operate in cryogenic environments. Here we introduce the Collision Clustering decoder, which overcomes both challenges. We implement our decoder on both an FPGA and an ASIC, the latter ultimately being necessary for any cost-effective scalable solution. We simulate a logical memory experiment on large instances of the leading quantum error correction scheme, the surface code, assuming a circuit-level noise model. The FPGA decoding frequency is above a megahertz, a stringent requirement on decoders needed for e.g. superconducting quantum computers. To decode an 881-qubit surface code it uses only 4.5% of the available logical computation elements. The ASIC decoding frequency is also above a megahertz on a 1057-qubit surface code, and it occupies a 0.06 mm² area and consumes 8 mW of power. Our decoder is optimised to be both highly performant and resource efficient, while its implementation on hardware constitutes a viable path to practically realising fault-tolerant quantum computers. Comment: 11 pages, 4 figures

    Vitamins and minerals for women: recent programs and intervention trials

    Women's nutrition has received little attention in nutrition programming, even though clinical trials and intervention trials have suggested that dietary improvement or supplementation with several nutrients may improve their health, especially in low-income settings, the main focus of this paper. Most attention so far has focused on how improvements in maternal nutrition can improve health outcomes for infants and young children. Adequate vitamin D and calcium nutrition throughout life may reduce the risk of osteoporosis, and calcium supplementation during pregnancy may reduce preeclampsia and low birth weight. To reduce neural tube defects, additional folic acid and possibly vitamin B12 need to be provided to non-deficient women before they know they are pregnant. This is best achieved by fortifying a staple food. It is unclear whether maternal vitamin A supplementation will lead to improved health outcomes for mother or child. Iron, iodine and zinc supplementation are widely needed for deficient women. Multimicronutrient supplementation (MMS) in place of the more common iron-folate supplements given in pregnancy in low-income countries may slightly increase birth weight, but its impact on neonatal mortality and other outcomes is unclear. More sustainable alternative approaches deserve greater research attention.

    Executive summary: heart disease and stroke statistics--2013 update: a report from the American Heart Association.

    Each year, the American Heart Association (AHA), in conjunction with the Centers for Disease Control and Prevention, the National Institutes of Health, and other government agencies, brings together the most up-to-date statistics on heart disease, stroke, other vascular diseases, and their risk factors and presents them in its Heart Disease and Stroke Statistical Update. The Statistical Update is a valuable resource for researchers, clinicians, healthcare policy makers, media professionals, the lay public, and many others who seek the best national data available on heart disease, stroke, and other cardiovascular disease-related morbidity and mortality and the risks, quality of care, medical procedures and operations, and costs associated with the management of these diseases in a single document. Indeed, since 1999, the Statistical Update has been cited >10 500 times in the literature, based on citations of all annual versions. In 2011 alone, the various Statistical Updates were cited ≈1500 times (data from ISI Web of Science). In recent years, the Statistical Update has undergone some major changes with the addition of new chapters and major updates across multiple areas, as well as increasing the number of ways to access and use the information assembled. For this year's edition, the Statistics Committee, which produces the document for the AHA, updated all of the current chapters with the most recent nationally representative data and inclusion of relevant articles from the literature over the past year. This year's edition also implements a new chapter organization to reflect the spectrum of cardiovascular health behaviors and health factors and risks, as well as subsequent complicating conditions, disease states, and outcomes. Also, the 2013 Statistical Update contains new data on the monitoring and benefits of cardiovascular health in the population, with additional new focus on evidence-based approaches to changing behaviors, implementation strategies, and implications of the AHA's 2020 Impact Goals. Below are a few highlights from this year's Update. © 2013 American Heart Association, Inc.

    Executive summary: heart disease and stroke statistics--2014 update: a report from the American Heart Association.

    Each year, the American Heart Association (AHA), in conjunction with the Centers for Disease Control and Prevention, the National Institutes of Health, and other government agencies, brings together the most up-to-date statistics on heart disease, stroke, other vascular diseases, and their risk factors and presents them in its Heart Disease and Stroke Statistical Update. The Statistical Update is a critical resource for researchers, clinicians, healthcare policy makers, media professionals, the lay public, and many others who seek the best available national data on heart disease, stroke, and other cardiovascular disease-related morbidity and mortality and the risks, quality of care, use of medical procedures and operations, and costs associated with the management of these diseases in a single document. Indeed, since 1999, the Statistical Update has been cited >10 500 times in the literature, based on citations of all annual versions. In 2012 alone, the various Statistical Updates were cited ≈3500 times (data from Google Scholar). In recent years, the Statistical Update has undergone some major changes with the addition of new chapters and major updates across multiple areas, as well as increasing the number of ways to access and use the information assembled. For this year's edition, the Statistics Committee, which produces the document for the AHA, updated all of the current chapters with the most recent nationally representative data and inclusion of relevant articles from the literature over the past year. This year's edition includes a new chapter on peripheral artery disease, as well as new data on the monitoring and benefits of cardiovascular health in the population, with additional new focus on evidence-based approaches to changing behaviors, implementation strategies, and implications of the AHA's 2020 Impact Goals. Below are a few highlights from this year's Update. © 2013 American Heart Association, Inc.

    Variable Mutation Rates as an Adaptive Strategy in Replicator Populations

    Get PDF
    For evolving populations of replicators, there is much evidence that the effect of mutations on fitness depends on the degree of adaptation to the selective pressures at play. In optimized populations, most mutations have deleterious effects, such that low mutation rates are favoured. In contrast, in populations thriving in changing environments a larger fraction of mutations have beneficial effects, providing the diversity necessary to adapt to new conditions. What is more, non-adapted populations occasionally benefit from an increase in the mutation rate. Therefore, there is no universal optimal value of the mutation rate, and species adjust it to their momentary adaptive needs. In this work we have used stationary populations of RNA molecules evolving in silico to investigate the relationship between the degree of adaptation of an optimized population and the value of the mutation rate promoting maximal adaptation in a short time to a new selective pressure. Our results show that this value can significantly differ from the optimal value at mutation-selection equilibrium, being strongly influenced by the structure of the population when the adaptive process begins. In the short term, highly optimized populations containing little variability respond better to environmental changes upon an increase of the mutation rate, whereas populations with a lower degree of optimization but higher variability benefit from reducing the mutation rate to adapt rapidly. These findings show a good agreement with the behaviour exhibited by actual organisms that replicate their genomes under broadly different mutation rates.
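    The short-term advantage of raising the mutation rate in a fully optimized, low-variability population can be seen in a minimal deterministic two-type model (our illustration, with assumed fitness values; not the paper's RNA simulation):

```python
def generations_to_adapt(mu, w=2.0, threshold=0.5, max_gen=10_000):
    """Minimal deterministic two-type model: after an environmental change a
    fraction x of replicators carries the newly favoured genotype (fitness w,
    versus 1 for the rest). Each generation, mutation converts non-adapted
    replicators at rate mu, then selection re-weights the two types.
    Returns the number of generations until x reaches `threshold`."""
    x = 0.0  # fully optimized population: no standing variability
    for gen in range(1, max_gen + 1):
        x = x + mu * (1.0 - x)           # mutation supplies the adapted type
        x = w * x / (w * x + (1.0 - x))  # selection amplifies it
        if x >= threshold:
            return gen
    return max_gen
```

    In this regime a higher mutation rate shortens the adaptation time, consistent with the abstract's claim for highly optimized populations; the opposite regime, where high standing variability favours a lower rate, is not captured by this two-type caricature.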