    Sequential Parameter Optimization

    We provide a comprehensive, effective and very efficient methodology for the design and experimental analysis of algorithms, relying on modern statistical techniques for tuning and understanding algorithms from an experimental perspective. To this end, we use the sequential parameter optimization (SPO) method, which has been successfully applied as a tuning procedure to numerous heuristics for practical and theoretical optimization problems. Two case studies, which illustrate the applicability of SPO to algorithm tuning and model selection, are presented.
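The sequential loop at the heart of SPO — evaluate an initial design, fit a surrogate model, evaluate the surrogate's most promising point, and repeat — can be sketched in a few lines. This is a minimal illustration, not the SPO toolbox itself: the one-dimensional objective and the quadratic surrogate are stand-in assumptions for the black-box algorithm and statistical model an actual SPO run would use.

```python
import numpy as np

# Hypothetical stand-in for an algorithm's performance (lower is better)
# as a function of a single tuning parameter; SPO treats this as a black box.
def algorithm_performance(x):
    return (x - 0.7) ** 2 + 0.1 * np.sin(10 * x)

rng = np.random.default_rng(0)

# Initial design: a handful of random parameter settings.
X = list(rng.uniform(0, 1, size=5))
y = [algorithm_performance(x) for x in X]

# Sequential phase: fit a quadratic surrogate to all observations,
# evaluate its minimiser, and add the new observation to the design.
for _ in range(10):
    coeffs = np.polyfit(X, y, deg=2)
    candidates = np.linspace(0, 1, 201)
    x_next = candidates[int(np.argmin(np.polyval(coeffs, candidates)))]
    X.append(float(x_next))
    y.append(algorithm_performance(x_next))

best = X[int(np.argmin(y))]
print(f"best parameter found: {best:.3f}")
```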

    Hyper-parameter optimization of Deep Convolutional Networks for object recognition

    Recently, sequential model-based optimization (SMBO) has emerged as a promising hyper-parameter optimization strategy in machine learning. In this work, we investigate SMBO to identify architecture hyper-parameters of deep convolutional networks (DCNs) for object recognition. We propose a simple SMBO strategy that starts from a set of random initial DCN architectures and generates new architectures which, once trained, perform well on a given dataset. Using the proposed SMBO strategy, we are able to identify a number of DCN architectures that produce results comparable to state-of-the-art results on object recognition benchmarks. Comment: 4 pages, 1 figure, 3 tables, submitted to ICIP 201
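The strategy described — random initial architectures, then a surrogate model that ranks fresh candidates before the expensive evaluation — can be sketched as follows. Everything here is a toy assumption: `validation_error` stands in for actually training a DCN, the two hyper-parameters (depth, filters) are illustrative, and the nearest-neighbour surrogate replaces whatever regression model a real SMBO implementation would fit.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for validation error after training a DCN with the
# given architecture hyper-parameters (network depth, filters per layer).
def validation_error(depth, filters):
    return abs(depth - 4) * 0.05 + abs(filters - 64) * 0.001 + rng.normal(0, 0.01)

def sample_architecture():
    return (int(rng.integers(1, 9)), int(rng.integers(16, 129)))

# Random initial architectures, as in the strategy described above.
history = []
for _ in range(5):
    arch = sample_architecture()
    history.append((arch, validation_error(*arch)))

# Cheap surrogate: bound a candidate's error by its distance-penalised
# nearest evaluated neighbour.
def predict(arch):
    d, f = arch
    return min(err + abs(d - hd) * 0.05 + abs(f - hf) * 0.001
               for (hd, hf), err in history)

# Sequential phase: rank many random candidates with the surrogate,
# then spend the expensive evaluation only on the most promising one.
for _ in range(15):
    candidates = [sample_architecture() for _ in range(50)]
    best_cand = min(candidates, key=predict)
    history.append((best_cand, validation_error(*best_cand)))

best_arch, best_err = min(history, key=lambda t: t[1])
print("best architecture:", best_arch)
```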

    Comparison of classical and sequential design of experiments in note onset detection

    Design of experiments is an established approach to parameter optimization of industrial processes. In many computer applications, however, it is usual to optimize the parameters via genetic algorithms. The main idea of this work is to apply design-of-experiments techniques to the optimization of computer processes. The major problem here is finding a compromise between model validity and costs, which increase with the number of experiments. The second relevant problem is choosing an appropriate model that describes the relationship between parameters and target values. One of the recent approaches here is model combination, which can be used in sequential designs in order to improve automatic prediction of the next trial point. In this paper, a musical note onset detection algorithm is optimized using sequential parameter optimization with model combination. It is shown that parameter optimization via design of experiments leads to better values of the target variable than the usual parameter optimization via grid search or genetic optimization algorithms. Furthermore, the results of this application study reveal whether the combination of many models brings improvements in finding the optimal parameter setting.
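The model-combination idea in a sequential design — average the predictions of several surrogate models before choosing the next trial point — can be sketched briefly. This is an assumed toy setup: `detection_quality` is a hypothetical stand-in for an onset detector's quality curve, and the combined models are simply linear and quadratic polynomial fits rather than the paper's actual model ensemble.

```python
import numpy as np

# Hypothetical stand-in for onset-detection quality (higher is better)
# as a function of one algorithm parameter.
def detection_quality(x):
    return -((x - 0.4) ** 2) + 0.02 * np.cos(20 * x)

rng = np.random.default_rng(2)

# Initial design of 5 random trial points.
X = list(rng.uniform(0, 1, size=5))
y = [detection_quality(x) for x in X]

grid = np.linspace(0, 1, 201)
for _ in range(10):
    # Model combination: average the predictions of two surrogates
    # (linear and quadratic fits) before picking the next trial point.
    pred = np.zeros_like(grid)
    for deg in (1, 2):
        pred += np.polyval(np.polyfit(X, y, deg), grid)
    pred /= 2
    # Small jitter keeps the sequential design from stalling on one point.
    x_next = float(np.clip(grid[int(np.argmax(pred))] + rng.normal(0, 0.02), 0, 1))
    X.append(x_next)
    y.append(detection_quality(x_next))

best = X[int(np.argmax(y))]
print(f"best parameter: {best:.3f}")
```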

    Robust Design Optimization of a High-Temperature Superconducting Linear Synchronous Motor Based on Taguchi Method

    This paper investigates the efficient robust design and optimization of a high-temperature superconducting (HTS) linear synchronous motor by using the Taguchi parameter design approach. The manufacturing tolerances of the HTS magnets, the primary iron core and the air gap are considered in the robust design to ensure that the optimal design is less sensitive to these uncertainties. To overcome the disadvantages of the conventional Taguchi parameter design approach, a sequential Taguchi robust optimization method is presented to improve the motor performance and manufacturing quality. The proposed method is efficient because it combines the advantages of the Taguchi method and a sequential optimization strategy. It can significantly increase the average thrust and decrease the thrust ripple of the investigated HTS linear synchronous motor.
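The mechanics of a Taguchi parameter design — run an orthogonal array over control factors, expose each run to the noise (tolerance) extremes, and pick the factor levels with the best signal-to-noise ratio — can be sketched as below. The thrust model and the single noise factor are hypothetical stand-ins for the motor simulation and manufacturing tolerances studied in the paper; only the L4 array and the larger-is-better S/N formula are standard Taguchi machinery.

```python
import numpy as np

# L4 orthogonal array: 3 two-level control factors covered in 4 runs.
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

# Hypothetical thrust model: three control factors plus one noise factor
# representing a manufacturing tolerance; factor a also damps noise sensitivity.
def thrust(factors, noise):
    a, b, c = factors
    return 100 + 10 * a + 5 * b - 3 * c + noise * (2 - 3 * a)

noise_levels = [-1.0, 1.0]  # tolerance extremes

# Larger-is-better signal-to-noise ratio, evaluated across the noise levels.
def sn_ratio(factors):
    ys = np.array([thrust(factors, n) for n in noise_levels])
    return -10 * np.log10(np.mean(1.0 / ys ** 2))

ratios = [sn_ratio(run) for run in L4]
best_run = L4[int(np.argmax(ratios))]
print("most robust factor levels:", best_run)
```

A sequential variant, as proposed in the paper, would shrink the factor ranges around `best_run` and repeat the array on the refined region.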