2 research outputs found

    An extensive experimental evaluation of automated machine learning methods for recommending classification algorithms

    This paper presents an experimental comparison among four automated machine learning (AutoML) methods for recommending the best classification algorithm for a given input dataset. Three of these methods are based on evolutionary algorithms (EAs), and the other is Auto-WEKA, a well-known AutoML method based on the combined algorithm selection and hyper-parameter optimisation (CASH) approach. The EA-based methods build classification algorithms from a single machine learning paradigm: either decision-tree induction, rule induction, or Bayesian network classification. Auto-WEKA combines algorithm selection and hyper-parameter optimisation to recommend classification algorithms from multiple paradigms. We performed controlled experiments in which all four AutoML methods were given the same runtime limit, for several values of this limit. In general, the difference in predictive accuracy among the three best AutoML methods was not statistically significant. However, the EA that evolves decision-tree induction algorithms has the advantage of producing algorithms that generate interpretable classification models and that scale better to large datasets than many algorithms from other learning paradigms that can be recommended by Auto-WEKA. We also observed that Auto-WEKA exhibited meta-overfitting, a form of overfitting at the meta-learning level rather than at the base-learning level.
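
    The sketch below is only an illustration of the CASH idea described in the abstract: a joint search over candidate classification algorithms and their hyper-parameters under a fixed runtime budget. It uses random search with scikit-learn estimators (GaussianNB, DecisionTreeClassifier) as stand-ins; it is not Auto-WEKA, nor the EA-based methods compared in the paper, and the search space and time limit are arbitrary choices for the example.

```python
# Minimal sketch of the CASH idea: random search over a joint space of
# classifiers and their hyper-parameters under a fixed runtime budget.
# Illustration only; not Auto-WEKA or the paper's EA-based methods.
import random
import time

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Joint search space: (algorithm, hyper-parameter sampler).
SEARCH_SPACE = [
    (DecisionTreeClassifier, lambda: {"max_depth": random.randint(2, 12)}),
    (GaussianNB, lambda: {"var_smoothing": 10 ** random.uniform(-12, -6)}),
]

def cash_random_search(X, y, time_limit_s=30.0, cv=5):
    """Return the best (estimator class, params, CV score) found within the budget."""
    best = (None, None, -1.0)
    deadline = time.time() + time_limit_s
    while time.time() < deadline:
        cls, sampler = random.choice(SEARCH_SPACE)
        params = sampler()
        score = cross_val_score(cls(**params), X, y, cv=cv).mean()
        if score > best[2]:
            best = (cls, params, score)
    return best

if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    cls, params, score = cash_random_search(X, y, time_limit_s=10.0)
    print(cls.__name__, params, round(score, 3))
```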

    Algorithms and protocols for quantum information technologies

    Quantum technologies harness the properties of controlled quantum mechanical systems for applications in computation, communication and metrology, and enable us to further our understanding of fundamental physics. Quantum hardware is designed to manipulate complex and fragile many-particle quantum states, which requires exquisite control machinery, advanced software and near-complete isolation from the surrounding environment. With sufficient capabilities, quantum computers with large registers of quantum bits will perform classically intractable calculations such as quantum chemistry simulations. Quantum key distribution will provide completely secure communications across the world, and quantum metrology will enable measurement precision beyond today's capabilities. While quantum technologies are far from surpassing classical hardware today, the vision and potential impact have sparked a world-wide research effort, both theoretical and experimental, to develop a quantum computer. The most sophisticated current quantum technologies have control over small numbers of quantum bits and are limited by environmental decoherence processes. While progress in developing quantum hardware is ongoing, designing and demonstrating new algorithms and protocols for quantum information is a thriving research field. Photonics provides an ideal platform for small-scale proof-of-concept quantum experiments, which are the focus of this thesis. In the coming decades, we will see the realisation of quantum hardware capable of applications that outperform any classical computer. In this thesis, I present several protocols and algorithms for quantum information science and technology, which are implemented in quantum photonic experiments. The applications of these works include robust quantum tomography, quantum state relocation, quantum-enhanced data recovery and probing fundamental causality in quantum mechanics. The works presented here are based on two-photon experiments from a prototypical spontaneous parametric down-conversion source, which provides an excellent test-bed for quantum information experiments. Photons can be used to encode quantum information in a range of different degrees of freedom. This versatility, along with compatibility with existing technology, gives photonics a great advantage in the development of quantum hardware.
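
    As a concrete, hedged illustration of the quantum tomography task mentioned in the abstract, the sketch below reconstructs a single-qubit density matrix by textbook linear inversion, rho = (I + <X>X + <Y>Y + <Z>Z)/2, from simulated Pauli measurements. It is not the robust tomography protocol or the two-photon experiments of the thesis; the state, shot count and measurement model are assumptions made for the example.

```python
# Minimal sketch of single-qubit state tomography by linear inversion:
# reconstruct rho = (I + <X>X + <Y>Y + <Z>Z) / 2 from simulated Pauli
# measurement outcomes. Textbook illustration only; not the thesis's
# robust tomography protocol or its photonic implementation.
import numpy as np

# Pauli matrices.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def expectation(rho, op, shots=10_000, rng=None):
    """Estimate <op> from a finite number of simulated +/-1 outcomes."""
    rng = rng or np.random.default_rng()
    p_plus = (1 + np.real(np.trace(rho @ op))) / 2  # probability of the +1 outcome
    outcomes = rng.choice([1, -1], size=shots, p=[p_plus, 1 - p_plus])
    return outcomes.mean()

def linear_inversion(ex, ey, ez):
    """Reconstruct the density matrix from the three Pauli expectation values."""
    return (I2 + ex * X + ey * Y + ez * Z) / 2

if __name__ == "__main__":
    # True state: |+> = (|0> + |1>) / sqrt(2).
    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
    rho_true = np.outer(plus, plus.conj())
    rho_est = linear_inversion(*(expectation(rho_true, P) for P in (X, Y, Z)))
    print("fidelity:", np.real(plus.conj() @ rho_est @ plus))
```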