
    Experimental quantum verification in the presence of temporally correlated noise

    Growth in the complexity and capabilities of quantum information hardware mandates access to practical techniques for performance verification that function under realistic laboratory conditions. Here we experimentally characterise the impact of common temporally correlated noise processes on both randomised benchmarking (RB) and gate-set tomography (GST). We study these using an analytic toolkit based on a formalism mapping noise to errors for arbitrary sequences of unitary operations. This analysis highlights the role of sequence structure in enhancing or suppressing the sensitivity of quantum verification protocols to either slowly or rapidly varying noise, which we treat in the limiting cases of quasi-DC miscalibration and white noise power spectra. We perform experiments with a single trapped $^{171}$Yb$^{+}$ ion as a qubit and inject engineered noise ($\propto \sigma^z$) to probe protocol performance. Experiments on RB validate predictions that the distribution of measured fidelities over sequences is described by a gamma distribution varying between approximately Gaussian for rapidly varying noise, and a broad, highly skewed distribution for the slowly varying case. Similarly we find a strong gate-set dependence of GST in the presence of correlated errors, leading to significant deviations between estimated and calculated diamond distances in the presence of correlated $\sigma^z$ errors. Numerical simulations demonstrate that expansion of the gate set to include negative rotations can suppress these discrepancies and increase reported diamond distances by orders of magnitude for the same error processes. Similar effects do not occur for correlated $\sigma^x$ or $\sigma^y$ errors or rapidly varying noise processes, highlighting the critical interplay of the selected gate set and the gauge optimisation process on the meaning of the reported diamond norm in correlated noise environments.
    Comment: Expanded and updated analysis of GST, including detailed examination of the role of gauge optimization in GST. Full GST data sets and supplementary information available on request from the authors. Related results available from http://www.physics.usyd.edu.au/~mbiercuk/Publications.htm
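    A minimal numerical sketch (not the authors' analysis, and with invented parameters) of the claimed distribution behaviour: when the $\sigma^z$ error is frozen over a sequence (quasi-DC) the shot-averaged sequence infidelities come out broad and highly skewed, while white noise averages out within each sequence and gives a narrow, approximately Gaussian spread. The random ±1 array is a crude stand-in for how a random gate sequence maps noise to error.

```python
import numpy as np

rng = np.random.default_rng(1)
J, K, SHOTS, SIGMA = 100, 2000, 200, 0.02   # gates/seq, sequences, shots/seq, rms error angle

def measured_infidelities(slow):
    r = np.empty(K)
    for k in range(K):
        # crude stand-in for how a random Clifford sequence maps sigma_z noise to error
        signs = rng.choice([-1.0, 1.0], size=J)
        if slow:
            # quasi-DC: one error angle, frozen across every gate and shot of the sequence
            delta = SIGMA * rng.normal() * np.ones((SHOTS, J))
        else:
            # white noise: a fresh error angle on every gate of every shot
            delta = SIGMA * rng.normal(size=(SHOTS, J))
        phi = delta @ signs                    # accumulated error phase, one entry per shot
        r[k] = np.mean(np.sin(phi / 2) ** 2)   # shot-averaged sequence infidelity
    return r

for slow in (False, True):
    r = measured_infidelities(slow)
    skew = np.mean((r - r.mean()) ** 3) / r.std() ** 3
    print(f"slow={slow}:  mean infidelity {r.mean():.2e},  skewness {skew:+.2f}")
```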

    Does the motor system need intermittent control?

    Explanation of motor control is dominated by continuous neurophysiological pathways (e.g. trans-cortical, spinal) and the continuous control paradigm. Using new theoretical development, methodology and evidence, we propose that intermittent control, which incorporates a serial ballistic process within the main feedback loop, provides a more general and more accurate paradigm, one necessary to explain attributes highly advantageous for competitive survival and performance.
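    As a loose illustration of the distinction (not the authors' model), the sketch below contrasts continuous feedback with an intermittent controller that samples the error and then runs its correction open-loop for a fixed interval, a zero-order hold standing in for the serial ballistic process; the plant, gain and interval are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
DT, STEPS = 0.01, 500                        # 5 s of a toy first-order plant x' = u

def run(intermittent, interval=0.3, gain=2.0, target=1.0):
    x, u, next_sample = 0.0, 0.0, 0.0
    for i in range(STEPS):
        t = i * DT
        y = x + 0.01 * rng.normal()          # noisy observation of the state
        if not intermittent:
            u = gain * (target - y)          # continuous paradigm: update every instant
        elif t >= next_sample:
            u = gain * (target - y)          # intermittent paradigm: sample the error,
            next_sample = t + interval       # then run the correction open-loop (ballistic)
        x += u * DT                          # integrate the plant
    return x

print("continuous   final state:", round(run(False), 3))
print("intermittent final state:", round(run(True), 3))
```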

    Sparse experimental design: an effective and efficient way of discovering better genetic algorithm structures

    The focus of this paper is the demonstration that sparse experimental design is a useful strategy for developing Genetic Algorithms. It is increasingly apparent from a number of reports and papers within a variety of different problem domains that the 'best' structure for a GA may be dependent upon the application. The GA structure is defined as both the types of operators and the parameter settings used during operation. The differences observed may be linked to the nature of the problem, the type of fitness function, or the depth or breadth of the problem under investigation. This paper demonstrates that advanced experimental design may be adopted to increase the understanding of the relationships between the GA structure and the problem domain, facilitating the selection of improved structures with a minimum of effort.
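    A hedged sketch of the idea: a half-fraction two-level factorial design over hypothetical GA structure factors, which screens main effects with half the experiments of the full factorial. The factor names, levels and the `run_ga` hook are invented for illustration.

```python
import itertools

# Two-level GA "structure" factors; names and levels are invented for illustration.
factors = {
    "crossover": ["one-point", "uniform"],
    "mutation_rate": [0.01, 0.10],
    "pop_size": [50, 200],
    "selection": ["tournament", "roulette"],
}

# Half-fraction 2^(4-1) design (defining relation I = ABCD): keep the 8 of the 16
# full-factorial runs whose +/-1 codings multiply to +1, halving the experiments
# while keeping main effects estimable.
design = [row for row in itertools.product((-1, +1), repeat=4)
          if row[0] * row[1] * row[2] * row[3] == +1]

def decode(row):
    return {name: levels[(code + 1) // 2]
            for (name, levels), code in zip(factors.items(), row)}

for row in design:
    config = decode(row)
    # score = run_ga(config)   # hypothetical hook: evaluate this GA structure
    print(config)
```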

    Results of Evolution Supervised by Genetic Algorithms

    A series of results of evolution supervised by genetic algorithms, of interest to the agricultural and horticultural fields, is reviewed. Newly obtained original results from the use of genetic algorithms on structure-activity relationships are reported.
    Comment: 6 pages, 1 table, 2 figures

    Towards Understanding the Origin of Genetic Languages

    Molecular biology is a nanotechnology that works--it has worked for billions of years and in an amazing variety of circumstances. At its core is a system for acquiring, processing and communicating information that is universal, from viruses and bacteria to human beings. Advances in genetics and experience in designing computers have taken us to a stage where we can understand the optimisation principles at the root of this system, from the availability of basic building blocks to the execution of tasks. The languages of DNA and proteins are argued to be the optimal solutions to the information processing tasks they carry out. The analysis also suggests simpler predecessors to these languages, and provides fascinating clues about their origin. Obviously, a comprehensive unraveling of the puzzle of life would have a lot to say about what we may design or convert ourselves into.
    Comment: (v1) 33 pages, contributed chapter to "Quantum Aspects of Life", edited by D. Abbott, P. Davies and A. Pati; (v2) published version with some editing

    A hybrid genetic algorithm for solving a layout problem in the fashion industry.

    Many success stories already exist of powerful genetic algorithms (GAs) in the field of constraint optimisation. In this paper, a hybrid, intelligent genetic algorithm is developed for solving a cutting layout problem in the Belgian fashion industry. In an initial section, an existing LP formulation of the cutting problem is briefly summarised; it is used in later paragraphs as the core design of our GA. In an initial attempt to render the algorithm as universal as possible, it became clear that a threefold genetic enhancement had to be carried out to reduce the size of the active solution space. The GA is therefore rebuilt using intelligent genetic operators, carrying out a local optimisation and applying a heuristic feasibility operator. Powerful computational results are achieved for a variety of problem cases, outperforming any existing LP model yet developed.
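    A toy sketch of the general recipe (not the paper's algorithm or its LP core): a GA on a one-dimensional cutting toy problem in which a heuristic feasibility operator repairs infeasible layouts and a greedy local optimisation tops them up. All data and parameters are invented.

```python
import random

random.seed(0)
LENGTHS = [random.randint(5, 40) for _ in range(30)]   # piece lengths (toy data)
CAP = 300                                              # usable strip length

def used(bits):
    return sum(l for b, l in zip(bits, LENGTHS) if b)

def repair(bits):
    # heuristic feasibility operator: drop random pieces until the layout fits ...
    bits = bits[:]
    chosen = [i for i, b in enumerate(bits) if b]
    random.shuffle(chosen)
    while used(bits) > CAP:
        bits[chosen.pop()] = 0
    # ... then a local optimisation: greedily add any piece that still fits
    for i in sorted(range(len(bits)), key=lambda i: LENGTHS[i]):
        if not bits[i] and used(bits) + LENGTHS[i] <= CAP:
            bits[i] = 1
    return bits

def child_of(a, b, p_mut=0.02):
    cut = random.randrange(1, len(a))                  # one-point crossover
    child = a[:cut] + b[cut:]
    return [1 - g if random.random() < p_mut else g for g in child]

pop = [repair([random.randint(0, 1) for _ in LENGTHS]) for _ in range(40)]
for _ in range(100):
    pop.sort(key=used, reverse=True)
    parents = pop[:20]
    pop = parents + [repair(child_of(random.choice(parents), random.choice(parents)))
                     for _ in range(20)]
print("best used length:", used(max(pop, key=used)), "of", CAP)
```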

    Digital Ecosystems: Ecosystem-Oriented Architectures

    We view Digital Ecosystems as the digital counterparts of biological ecosystems. Here, we are concerned with the creation of these Digital Ecosystems, exploiting the self-organising properties of biological ecosystems to evolve high-level software applications. Therefore, we created the Digital Ecosystem, a novel optimisation technique inspired by biological ecosystems, where the optimisation works at two levels: a first optimisation, the migration of agents distributed in a decentralised peer-to-peer network, operates continuously in time; this process feeds a second optimisation based on evolutionary computing that operates locally on single peers and is aimed at finding solutions to satisfy locally relevant constraints. The Digital Ecosystem was then measured experimentally through simulations, with measures originating from theoretical ecology, evaluating its likeness to biological ecosystems. This included its responsiveness to requests for applications from the user base, as a measure of the ecological succession (ecosystem maturity). Overall, we have advanced the understanding of Digital Ecosystems, creating Ecosystem-Oriented Architectures where the word ecosystem is more than just a metaphor.
    Comment: 39 pages, 26 figures, journal
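    A toy sketch of the two-level scheme under stated assumptions (it is not the authors' architecture): each "peer" runs a local GA against its own locally relevant target, and the best agents periodically migrate to a neighbouring peer, a crude stand-in for continuous migration through a decentralised network.

```python
import random

random.seed(0)
PEERS, POP, BITS = 4, 20, 24

# Each peer gets its own target bit-string, a toy stand-in for "locally relevant
# constraints" (user requests differing from peer to peer).
targets = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(PEERS)]

def fitness(agent, peer):
    return sum(a == t for a, t in zip(agent, targets[peer]))

def local_evolution(pop, peer):
    # second-level optimisation: an ordinary generational GA running on one peer
    pop.sort(key=lambda a: fitness(a, peer), reverse=True)
    keep = pop[:POP // 2]
    children = []
    while len(keep) + len(children) < POP:
        a, b = random.sample(keep, 2)
        cut = random.randrange(1, BITS)
        children.append([1 - g if random.random() < 0.05 else g
                         for g in a[:cut] + b[cut:]])
    return keep + children

populations = [[[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
               for _ in range(PEERS)]

for gen in range(60):
    populations = [local_evolution(pop, p) for p, pop in enumerate(populations)]
    if gen % 10 == 0:
        # first-level optimisation: agents migrate around the peer-to-peer network
        for p in range(PEERS):
            best = max(populations[p], key=lambda a: fitness(a, p))
            populations[(p + 1) % PEERS][-1] = best[:]   # migrant joins the neighbour

print("per-peer best fitness:",
      [max(fitness(a, p) for a in pop) for p, pop in enumerate(populations)])
```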

    Evolutionary algorithm-based analysis of gravitational microlensing lightcurves

    A new algorithm developed to perform autonomous fitting of gravitational microlensing lightcurves is presented. The new algorithm is conceptually simple, versatile and robust, and parallelises trivially; it combines features of extant evolutionary algorithms with some novel ones, and fares well on the problem of fitting binary-lens microlensing lightcurves, as well as on a number of other difficult optimisation problems. Success rates in excess of 90% are achieved when fitting synthetic though noisy binary-lens lightcurves, allowing no more than 20 minutes per fit on a desktop computer; this success rate is shown to compare very favourably with that of both a conventional (iterated simplex) algorithm, and a more state-of-the-art, artificial neural network-based approach. As such, this work provides proof of concept for the use of an evolutionary algorithm as the basis for real-time, autonomous modelling of microlensing events. Further work is required to investigate how the algorithm will fare when faced with more complex and realistic microlensing modelling problems; it is, however, argued here that the use of parallel computing platforms, such as inexpensive graphics processing units, should allow fitting times to be constrained to under an hour, even when dealing with complicated microlensing models. In any event, it is hoped that this work might stimulate some interest in evolutionary algorithms, and that the algorithm described here might prove useful for solving microlensing and/or more general model-fitting problems.
    Comment: 14 pages, 3 figures; accepted for publication in MNRAS
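    As a hedged illustration of the approach (a simple evolution strategy on the easier single-lens case, not the paper's algorithm or the binary-lens problem), the sketch below fits the three Paczyński parameters of a synthetic noisy lightcurve:

```python
import numpy as np

rng = np.random.default_rng(0)

def paczynski(t, t0, u0, tE):
    # standard single-lens magnification; the paper targets the harder binary-lens case
    u = np.sqrt(u0 ** 2 + ((t - t0) / tE) ** 2)
    return (u ** 2 + 2) / (u * np.sqrt(u ** 2 + 4))

t = np.linspace(-40.0, 40.0, 200)
true_params = (3.0, 0.2, 12.0)                          # t0, u0, tE
data = paczynski(t, *true_params) + 0.05 * rng.normal(size=t.size)

def loss(p):
    t0, u0, tE = p
    if u0 <= 0 or tE <= 0:                              # reject unphysical parameters
        return np.inf
    return np.mean((paczynski(t, t0, u0, tE) - data) ** 2)

# (mu + lambda) evolution strategy: mutate every parent, keep the best of parents+children
pop = rng.uniform([-20.0, 0.05, 2.0], [20.0, 1.0, 30.0], size=(40, 3))
for _ in range(200):
    children = pop + rng.normal(scale=[1.0, 0.02, 0.5], size=pop.shape)
    everyone = np.vstack([pop, children])
    pop = everyone[np.argsort([loss(p) for p in everyone])][:40]

print("best fit (t0, u0, tE):", np.round(pop[0], 2), " true:", true_params)
```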