    Efficient fault-tolerant quantum computing

    Fault-tolerant quantum computing methods that work with efficient quantum error correcting codes are discussed. Several new techniques are introduced to restrict the accumulation of errors before or during recovery. Classes of eligible quantum codes are obtained, and good candidates exhibited. This permits a new analysis of the permissible error rates and minimum overheads for robust quantum computing. It is found that, under the standard noise model of ubiquitous stochastic, uncorrelated errors, a quantum computer need be only an order of magnitude larger than the logical machine contained within it in order to be reliable. For example, a scale-up by a factor of 22, with a gate error rate of order 10^{-5}, is sufficient to permit large quantum algorithms such as factorization of thousand-digit numbers. (Comment: 21 pages plus 5 figures.)

    Effects of phylogenetic reconstruction method on the robustness of species delimitation using single-locus data

    1. Coalescent-based species delimitation methods combine population genetic and phylogenetic theory to provide an objective means for delineating evolutionarily significant units of diversity. The Generalized Mixed Yule Coalescent (GMYC) and the Poisson Tree Process (PTP) are methods that use ultrametric (GMYC or PTP) or non-ultrametric (PTP) gene trees as input, intended for use mostly with single-locus data such as DNA barcodes.
    2. Here we assess how robust the GMYC and PTP are to different phylogenetic reconstruction and branch smoothing methods. We reconstruct over 400 ultrametric trees using up to 30 different combinations of phylogenetic and smoothing methods and perform over 2,000 separate species delimitation analyses across 16 empirical datasets. We then assess how variable diversity estimates are, in terms of richness and identity, with respect to species delimitation, phylogenetic and smoothing methods.
    3. The PTP method generally generates diversity estimates that are more robust to different phylogenetic methods. The GMYC is more sensitive, but provides consistent estimates for BEAST trees. The lower consistency of GMYC estimates is likely a result of differences among gene trees introduced by the smoothing step. Unresolved nodes (real anomalies or methodological artefacts) affect both GMYC and PTP estimates, but have a greater effect on GMYC estimates. Branch smoothing is a difficult step and perhaps an underappreciated source of bias that may be widespread among studies of diversity and diversification.
    4. Nevertheless, careful choice of phylogenetic method does produce equivalent PTP and GMYC diversity estimates. We recommend simultaneous use of the PTP model with any model-based gene tree (e.g. RAxML) and GMYC approaches with BEAST trees for obtaining species hypotheses.

    Quantum Teleportation is a Universal Computational Primitive

    We present a method to create a variety of interesting gates by teleporting quantum bits through special entangled states. This allows, for instance, the construction of a quantum computer based on just single qubit operations, Bell measurements, and GHZ states. We also present straightforward constructions of a wide variety of fault-tolerant quantum gates. (Comment: 6 pages, REVTeX, 6 epsf figures.)
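The teleportation primitive the paper builds on can be checked with a few lines of linear algebra. The numpy sketch below is an illustrative simulation (not from the paper) of standard one-qubit teleportation through a Bell pair: for each of the four Bell-measurement outcomes, applying the matching Pauli correction recovers the input qubit. Teleporting through a suitably modified entangled state instead would apply a gate to it, which is the construction the abstract describes.

```python
import numpy as np

# Pauli matrices used as corrections after the Bell measurement.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Input qubit (arbitrary amplitudes) and a Bell pair on qubits 1 and 2.
psi = np.array([0.6, 0.8j])
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)          # 3-qubit state; qubit 0 is the input

# Bell basis on qubits 0 and 1, paired with the correction that each
# measurement outcome requires on qubit 2.
outcomes = [
    (np.array([1, 0, 0, 1]) / np.sqrt(2), I2),      # Phi+ -> I
    (np.array([0, 1, 1, 0]) / np.sqrt(2), X),       # Psi+ -> X
    (np.array([1, 0, 0, -1]) / np.sqrt(2), Z),      # Phi- -> Z
    (np.array([0, 1, -1, 0]) / np.sqrt(2), Z @ X),  # Psi- -> ZX
]

for b, correction in outcomes:
    # Project qubits 0 and 1 onto this Bell state; what remains is the
    # (unnormalised) post-measurement state of qubit 2.
    out = correction @ (np.kron(b.conj().reshape(1, 4), I2) @ state)
    out /= np.linalg.norm(out)
    # Up to a global phase, the output matches the input qubit.
    assert abs(abs(np.vdot(out, psi)) - 1) < 1e-10

print("all four Bell outcomes reproduce the input qubit")
```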

    In situ nanocompression testing of irradiated copper.

    Increasing demand for energy and reduction of carbon dioxide emissions has revived interest in nuclear energy. Designing materials for radiation environments necessitates a fundamental understanding of how radiation-induced defects alter mechanical properties. Ion beams create radiation damage efficiently without material activation, but their limited penetration depth requires small-scale testing. However, strength measurements of nanoscale irradiated specimens have not been previously performed. Here we show that yield strengths approaching macroscopic values are measured from irradiated ~400 nm-diameter copper specimens. Quantitative in situ nanocompression testing in a transmission electron microscope reveals that the strength of larger samples is controlled by dislocation-irradiation defect interactions, yielding size-independent strengths. Below ~400 nm, size-dependent strength results from dislocation source limitation. This transition length-scale should be universal, although its value depends on material and irradiation conditions. We conclude that for irradiated copper, and presumably related materials, nanoscale in situ testing can determine bulk-like yield strengths and simultaneously identify deformation mechanisms.

    Model predictive control for power system frequency control taking into account imbalance uncertainty

    © IFAC. Model predictive control (MPC) is investigated as a control method for frequency control of power systems which are exposed to increasing wind power penetration. For such power systems, the unpredicted power imbalance can be assumed to be dominated by the fluctuations in produced wind power. An MPC is designed for controlling the frequency of wind-penetrated power systems, which uses the knowledge of the estimated worst-case power imbalance to make the MPC more robust. This is done by considering three different disturbances in the MPC: one towards the positive worst-case, one towards the negative worst-case, and one neutral in the middle. The robustified MPC is designed so that it finds an input which ensures that the constraints of the system are fulfilled under all three disturbances. Through simulations on a network with concentrated wind power, it is shown that in certain cases where the state-of-the-art frequency control (PI control) and nominal MPC violate the system constraints, the robustified MPC fulfills them due to the inclusion of the worst-case estimates of the power imbalance.
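The three-scenario robustification can be illustrated on a toy one-state model of frequency deviation. The sketch below is a deliberately simplified stand-in for the paper's controller: all numbers are invented, the input is held constant over the horizon, and the optimisation is a coarse grid search rather than a QP. It keeps only the core idea, namely that a candidate input is accepted only if the constraint holds under the positive worst-case, negative worst-case, and neutral imbalance scenarios.

```python
import numpy as np

# Toy one-state model of frequency deviation x under power imbalance d:
#     x[k+1] = a*x[k] + b*u[k] + d[k]
# All numbers below are illustrative assumptions, not from the paper.
a, b = 0.9, 0.5
N = 10                       # prediction horizon
x0 = 0.4                     # initial frequency deviation
x_max = 0.5                  # frequency constraint |x| <= x_max
d_worst = 0.05               # estimated worst-case imbalance magnitude

# The three disturbance scenarios of the robustified MPC: positive
# worst case, negative worst case, and a neutral one in the middle.
scenarios = [np.full(N, d_worst), np.full(N, -d_worst), np.zeros(N)]

def rollout(u, d):
    """Simulate the model over the horizon for one disturbance profile."""
    xs, x = [], x0
    for k in range(N):
        x = a * x + b * u[k] + d[k]
        xs.append(x)
    return np.array(xs)

# Scenario-based robust step: scan a grid of constant input moves, keep
# those whose trajectories respect the constraint in *all three*
# scenarios, and take the cheapest (quadratic state + input cost,
# evaluated on the neutral scenario).
best_u, best_cost = None, np.inf
for u0 in np.linspace(-1.0, 1.0, 201):
    u = np.full(N, u0)
    if all(np.all(np.abs(rollout(u, d)) <= x_max) for d in scenarios):
        xs = rollout(u, scenarios[2])
        cost = float(np.sum(xs**2) + 0.1 * np.sum(u**2))
        if cost < best_cost:
            best_u, best_cost = u0, cost

print(f"robust constant input u = {best_u:.3f}")
```

With these numbers a feasible input exists; a nominal controller that checked only the neutral scenario could pick a more aggressive input that violates the bound under the worst-case imbalance, which is exactly the failure mode the robustified MPC avoids.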

    Fast Scalable Construction of (Minimal Perfect Hash) Functions

    Recent advances in random linear systems on finite fields have paved the way for the construction of constant-time data structures representing static functions and minimal perfect hash functions using less space than existing techniques. The main obstruction to any practical application of these results is the cubic-time Gaussian elimination required to solve these linear systems: although they can be made very small, the computation is still too slow to be feasible. In this paper we describe in detail a number of heuristics and programming techniques to speed up the resolution of these systems by several orders of magnitude, making the overall construction competitive with the standard and widely used MWHC technique, which is based on hypergraph peeling. In particular, we introduce broadword programming techniques for fast equation manipulation and a lazy Gaussian elimination algorithm. We also describe a number of technical improvements to the data structure which further reduce space usage and improve lookup speed. Our implementation of these techniques yields a minimal perfect hash function data structure occupying 2.24 bits per element, compared to 2.68 for MWHC-based ones, and a static function data structure which reduces the multiplicative overhead from 1.23 to 1.03.
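The broadword idea — packing a whole row of GF(2) coefficients into machine words so that one XOR updates many coefficients at once — can be sketched with Python integers as bitmasks. This is an illustrative plain Gaussian elimination over GF(2), not the paper's lazy variant or its engineered implementation:

```python
def solve_gf2(rows, rhs, n):
    """Gaussian elimination over GF(2). Each equation is a Python int
    whose bit i is the coefficient of variable x_i, so xor-ing two rows
    updates a whole word of coefficients at once (the broadword idea).
    Returns one solution as a 0/1 list, or None if inconsistent."""
    rows, rhs = rows[:], rhs[:]
    pivot_of = {}                        # pivot column -> row index
    for r in range(len(rows)):
        # Reduce this row by every pivot found so far.
        for col, pr in pivot_of.items():
            if rows[r] >> col & 1:
                rows[r] ^= rows[pr]
                rhs[r] ^= rhs[pr]
        if rows[r] == 0:
            if rhs[r]:
                return None              # reduced to 0 = 1: no solution
            continue                     # redundant equation
        pivot_of[rows[r].bit_length() - 1] = r
    # Back-substitution with free variables set to 0.  A row's non-pivot
    # bits all lie below its pivot bit, so ascending column order works.
    x = [0] * n
    for col in sorted(pivot_of):
        r = pivot_of[col]
        acc = rhs[r]
        for c in range(col):
            if rows[r] >> c & 1:
                acc ^= x[c]
        x[col] = acc
    return x

# Tiny demo: three parity equations over variables x0..x3.
rows = [0b0011, 0b0110, 0b1100]          # x0^x1, x1^x2, x2^x3
rhs = [1, 0, 1]
x = solve_gf2(rows, rhs, 4)
print(x)                                 # -> [0, 1, 1, 0]
for row, b in zip(rows, rhs):
    assert sum((row >> i & 1) * x[i] for i in range(4)) % 2 == b
```

In a static-function construction, each key hashes to a few variable positions, giving one such sparse equation per key; solving the system yields the table the structure stores.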

    Reliable H∞ filtering for stochastic spatial–temporal systems with sensor saturations and failures

    This study is concerned with the reliable H∞ filtering problem for a class of stochastic spatial–temporal systems with sensor saturations and failures. Different from the continuous spatial–temporal systems, the dynamic behaviour of the system under consideration evolves in a discrete rectangular region. The aim of this study is to estimate the system states through the measurements received from a set of sensors located at some specified points. In order to cater for a more realistic signal transmission process, the phenomena of sensor saturations and sensor failures are taken into account. By using the vector reorganisation approach, the spatial–temporal system is first transformed into an equivalent ordinary differential dynamic system. Then, a filter is constructed and a sufficient condition is obtained under which the filtering error dynamics is asymptotically stable in probability and the H∞ performance requirement is met. On the basis of the analysis results, the desired reliable H∞ filter is designed. Finally, an illustrative example is given to show the effectiveness of the proposed filtering scheme. This work was supported by the Deanship of Scientific Research (DSR) at King Abdulaziz University in Saudi Arabia under Grant 16-135-35-HiCi, the National Natural Science Foundation of China under Grants 61329301, 61134009 and 61473076, the Shanghai Rising-Star Program of China under Grant 13QA1400100, the Shu Guang project of Shanghai Municipal Education Commission and Shanghai Education Development Foundation under Grant 13SG34, the Program for Professor of Special Appointment (Eastern Scholar) at Shanghai Institutions of Higher Learning, the Fundamental Research Funds for the Central Universities, the DHU Distinguished Young Professor Program, and the Alexander von Humboldt Foundation of Germany.

    Frontoparietal representations of task context support the flexible control of goal-directed cognition.

    Cognitive control allows stimulus-response processing to be aligned with internal goals and is thus central to intelligent, purposeful behavior. Control is thought to depend in part on the active representation of task information in prefrontal cortex (PFC), which provides a source of contextual bias on perception, decision making, and action. In the present study, we investigated the organization, influences, and consequences of context representation as human subjects performed a cued sorting task that required them to flexibly judge the relationship between pairs of multivalent stimuli. Using a connectivity-based parcellation of PFC and multivariate decoding analyses, we determined that context is specifically and transiently represented in a region spanning the inferior frontal sulcus during context-dependent decision making. We also found strong evidence that decision context is represented within the intraparietal sulcus, an area previously shown to be functionally networked with the inferior frontal sulcus at rest and during task performance. Rule-guided allocation of attention to different stimulus dimensions produced discriminable patterns of activation in visual cortex, providing a signature of top-down bias over perception. Furthermore, demands on cognitive control arising from the task structure modulated context representation, which was found to be strongest after a shift in task rules. When context representation in frontoparietal areas increased in strength, as measured by the discriminability of high-dimensional activation patterns, the bias on attended stimulus features was enhanced. These results provide novel evidence that illuminates the mechanisms by which humans flexibly guide behavior in complex environments.

    Aquatic macroinvertebrate responses to native and non-native predators

    Non-native species can profoundly affect native ecosystems through trophic interactions with native species. Native prey may respond differently to non-native versus native predators since they lack prior experience of them. Here we investigate antipredator responses of two common freshwater macroinvertebrates, Gammarus pulex and Potamopyrgus jenkinsi, to olfactory cues from three predators: sympatric native fish (Gasterosteus aculeatus), sympatric native crayfish (Austropotamobius pallipes), and novel invasive crayfish (Pacifastacus leniusculus). G. pulex responded differently to fish and crayfish, showing enhanced locomotion in response to fish but a preference for the dark over the light in response to the crayfish. P. jenkinsi showed increased vertical migration in response to all three predator cues relative to controls. These different responses to fish and crayfish are hypothesised to reflect the predators' differing predation types: benthic for crayfish and pelagic for fish. However, we found no difference in response to native versus invasive crayfish, indicating that prey naiveté is unlikely to drive the impacts of invasive crayfish. The Predator Recognition Continuum Hypothesis proposes that the benefits of generalisable predator recognition outweigh the costs when predators are diverse. Generalised responses of prey as observed here will be adaptive in the presence of an invader, and may reduce novel predators' potential impacts.