
    Theoretical results on bet-and-run as an initialisation strategy

    Get PDF
    Bet-and-run initialisation strategies have been experimentally shown to be beneficial on classical NP-complete problems such as the travelling salesperson problem and minimum vertex cover. We analyse the performance of a bet-and-run restart strategy, where k independent islands run in parallel for t1 iterations, after which the optimisation process continues for a further t2 iterations on only the best-performing island. We define a family of pseudo-Boolean functions, consisting of a plateau and a slope, as an abstraction of real fitness landscapes with promising and deceptive regions. The plateau has a high fitness but does not allow for further progression, whereas the slope has a low fitness initially but does lead to the global optimum. We show that bet-and-run strategies with non-trivial k and t1 are necessary to find the global optimum efficiently, and that the choice of t1 is linked to properties of the function. Finally, we provide a fixed-budget analysis to guide selection of the bet-and-run parameters so as to maximise the expected fitness after t = k · t1 + t2 fitness evaluations.
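    A minimal sketch of the bet-and-run scheme described above, assuming a simple (1+1)-style hill climber on bit strings; the helper names and the OneMax objective are illustrative stand-ins, not the plateau-and-slope functions analysed in the paper.

```python
import random

def bet_and_run(fitness, n, k, t1, t2, seed=0):
    """Bet-and-run: run k independent (1+1)-style islands for t1 iterations,
    then continue only the best-performing island for t2 more iterations."""
    rng = random.Random(seed)

    def mutate(x):
        # Standard bit mutation: flip each bit independently with probability 1/n.
        return [b ^ (rng.random() < 1.0 / n) for b in x]

    def climb(x, iters):
        fx = fitness(x)
        for _ in range(iters):
            y = mutate(x)
            fy = fitness(y)
            if fy >= fx:            # accept offspring that are not worse
                x, fx = y, fy
        return x, fx

    # Phase 1: k independent islands, t1 iterations each.
    islands = [climb([rng.randint(0, 1) for _ in range(n)], t1) for _ in range(k)]
    best, _ = max(islands, key=lambda pair: pair[1])
    # Phase 2: continue the single best island for the remaining t2 iterations.
    return climb(best, t2)

# Illustrative run on OneMax with a total budget of roughly t = k * t1 + t2 evaluations.
x, f = bet_and_run(fitness=sum, n=50, k=4, t1=200, t2=2000)
print(f)
```

    Setting k = 1 recovers a single long run over the whole budget, while a large k with a small t2 spends most of the budget on the initial bets; choosing between these extremes is what the fixed-budget analysis above is intended to guide.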

    Learning-based quantum error mitigation

    Full text link
    If NISQ-era quantum computers are to perform useful tasks, they will need to employ powerful error mitigation techniques. Quasi-probability methods can permit perfect error compensation at the cost of additional circuit executions, provided that the nature of the error model is fully understood and sufficiently local both spatially and temporally. Unfortunately, these conditions are challenging to satisfy. Here we present a method by which the proper compensation strategy can instead be learned ab initio. Our training process uses multiple variants of the primary circuit in which all non-Clifford gates are substituted with gates that are efficient to simulate classically. The process yields a configuration that is near-optimal versus noise in the real system with its non-Clifford gate set. Having presented a range of learning strategies, we demonstrate the power of the technique both with real quantum hardware (IBM devices) and with exactly emulated imperfect quantum computers. The systems suffer a range of noise severities and types, including spatially and temporally correlated variants. In all cases the protocol successfully adapts to the noise and mitigates it to a high degree.
    Comment: 28 pages, 19 figures.
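    A hypothetical, heavily simplified sketch of the learning loop outlined above: mitigation coefficients are fitted on training circuits that are cheap to simulate exactly, then reused on the primary circuit. The toy noise model, the helper functions, and the reduction of each circuit to a single expectation value are invented for illustration; they are not the paper's protocol and not any real hardware or emulator API.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# --- Hypothetical toy stand-ins (not the paper's model) -------------------
def exact_expectation(ideal):
    # Clifford-like training circuits can be simulated classically, so the
    # ideal expectation value is assumed to be available exactly.
    return ideal

def noisy_expectation(ideal, variant):
    # Variant 0 is the bare circuit; variants 1..m-1 append correction gates.
    # The device is modelled as a fixed linear distortion plus a per-variant shift.
    distorted = 0.7 * ideal + 0.1
    return distorted + 0.05 * variant

train_ideals = rng.uniform(-1.0, 1.0, size=20)  # training "circuits"
m = 3                                           # number of circuit variants

def mitigated(q, ideal):
    # Learned quasi-probability-style linear combination of circuit variants.
    return sum(q[v] * noisy_expectation(ideal, v) for v in range(m))

def loss(q):
    # Squared error between mitigated noisy results and exact training values.
    return sum((mitigated(q, e) - exact_expectation(e)) ** 2 for e in train_ideals)

result = minimize(loss, x0=np.ones(m) / m, method="Nelder-Mead")
q_learned = result.x
# q_learned would now be applied, unchanged, to runs of the primary circuit.
print(q_learned, loss(q_learned))
```

    In this toy setting some of the learned coefficients can come out negative, reflecting the quasi-probability character of the compensation.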

    A Graph-Based Semantics Workbench for Concurrent Asynchronous Programs

    Get PDF
    A number of novel programming languages and libraries have been proposed that offer simpler-to-use models of concurrency than threads. It is challenging, however, to devise execution models that successfully realise their abstractions without forfeiting performance or introducing unintended behaviours. This is exemplified by SCOOP---a concurrent object-oriented message-passing language---which has seen multiple semantics proposed and implemented over its evolution. We propose a "semantics workbench" with fully and semi-automatic tools for SCOOP that can be used to analyse and compare programs with respect to different execution models. We demonstrate its use in checking the consistency of semantics by applying it to a set of representative programs and highlighting a deadlock-related discrepancy between the principal execution models of the language. Our workbench is based on a modular and parameterisable graph transformation semantics implemented in the GROOVE tool. We discuss how graph transformations are leveraged to atomically model intricate language abstractions, and how the visual yet algebraic nature of the model can be used to ascertain soundness.
    Comment: Accepted for publication in the proceedings of FASE 2016 (to appear).

    Kinetic parameter estimation from TGA: Optimal design of TGA experiments

    Get PDF
    This work presents a general methodology to determine kinetic models of solid thermal decomposition with thermogravimetric analysis (TGA) instruments. The goal is to determine a simple and robust kinetic model for a given solid with a minimum number of TGA experiments. In this respect, the work can be seen as an attempt to find the optimal design of TGA experiments for kinetic modelling. Two computational tools were developed. The first is a nonlinear parameter estimation procedure for identifying parameters in nonlinear dynamical models. The second tool computes the thermogravimetric experiment (here, the programmed temperature profile applied to the thermobalance) required to identify the best kinetic parameters, i.e. parameters with higher statistical reliability. The combination of the two tools can be integrated into an iterative approach generally called a sequential strategy. The application concerns the thermal degradation of cardboard in a Setaram TGA instrument, and the results presented demonstrate the improvements in the kinetic parameter estimation process.
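    A minimal sketch of the kind of nonlinear parameter estimation described above, assuming a single-step first-order Arrhenius decomposition model, a linear heating ramp, and synthetic noisy conversion data; the model form, parameter values and noise level are illustrative assumptions, not the cardboard/Setaram case studied in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

R = 8.314                             # gas constant [J/(mol K)]
T0, beta = 300.0, 10.0 / 60.0         # start temperature [K], heating rate [K/s] (10 K/min)
t_obs = np.linspace(0.0, 6000.0, 60)  # observation times [s]

def conversion(params, t):
    """Integrate dalpha/dt = A * exp(-E / (R * T(t))) * (1 - alpha) on a linear ramp."""
    logA, E = params
    A = np.exp(logA)                  # estimating log(A) improves conditioning

    def rhs(time, alpha):
        T = T0 + beta * time
        return A * np.exp(-E / (R * T)) * (1.0 - alpha)

    sol = solve_ivp(rhs, (t[0], t[-1]), [0.0], t_eval=t, rtol=1e-8)
    return sol.y[0]

# Synthetic "TGA experiment": a known parameter set plus measurement noise.
true_params = (np.log(1.0e7), 1.0e5)
rng = np.random.default_rng(0)
alpha_obs = conversion(true_params, t_obs) + rng.normal(0.0, 0.005, t_obs.size)

# Nonlinear least-squares estimation of (log A, E) from the noisy conversion curve.
fit = least_squares(lambda p: conversion(p, t_obs) - alpha_obs,
                    x0=(np.log(1.0e6), 8.0e4))
print("estimated A =", np.exp(fit.x[0]), "estimated E =", fit.x[1])
```

    In a sequential strategy of the kind described in the abstract, the local sensitivity information returned by such a fit (for example the Jacobian in fit.jac) would then feed the design of the next temperature programme, so that each new experiment reduces the parameter uncertainty as much as possible.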

    Cluster validity in clustering methods

    Get PDF

    Further exploration of the Océ copier

    Get PDF