
    Bayesian Optimization with Dimension Scheduling: Application to Biological Systems

    Bayesian Optimization (BO) is a data-efficient method for global black-box optimization of an expensive-to-evaluate fitness function. BO typically assumes that its own computational cost is negligible while experiments are time-consuming or costly; in practice, this allows us to optimize ten or fewer critical parameters in up to 1,000 experiments. But experiments may be less expensive than BO methods assume: in some simulation models, we can run many thousands of experiments in a few hours, and the computational burden of BO is then no longer negligible compared to the experimentation time. To address this challenge we introduce a new Dimension Scheduling Algorithm (DSA), which reduces the computational burden of BO when many experiments are possible. The key idea is that DSA optimizes the fitness function only along a small set of dimensions at each iteration. This strategy (1) reduces the necessary computation time, (2) finds good solutions faster than traditional BO, and (3) can be parallelized straightforwardly. We evaluate DSA on the task of optimizing parameters of dynamic models of microalgae metabolism and show faster convergence than traditional BO.
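    The abstract only sketches the scheduling idea, so here is a minimal, hypothetical illustration of it: at each iteration the search is restricted to a small random subset of dimensions, with the remaining coordinates pinned at the incumbent best point. Plain random search stands in for the GP-based acquisition step of real BO, and all names (`dsa_optimize`, `n_active`, and so on) are illustrative, not from the paper.

    ```python
    import numpy as np

    def dsa_optimize(f, bounds, n_iters=100, n_active=2, n_candidates=50, seed=0):
        """Sketch of dimension-scheduled optimization: each iteration searches
        only a random subset of dimensions, holding the rest at the best point."""
        rng = np.random.default_rng(seed)
        dim = len(bounds)
        lo = np.array([b[0] for b in bounds])
        hi = np.array([b[1] for b in bounds])

        best_x = rng.uniform(lo, hi)  # random initial point
        best_y = f(best_x)

        for _ in range(n_iters):
            # Schedule: pick a small random subset of dimensions to optimize.
            active = rng.choice(dim, size=min(n_active, dim), replace=False)

            # Search only along the active dimensions; the rest stay fixed.
            cand = np.tile(best_x, (n_candidates, 1))
            cand[:, active] = rng.uniform(lo[active], hi[active],
                                          size=(n_candidates, len(active)))
            ys = np.array([f(x) for x in cand])

            i = ys.argmin()
            if ys[i] < best_y:
                best_x, best_y = cand[i], ys[i]
        return best_x, best_y

    # Example: minimize a 10-dimensional quadratic.
    if __name__ == "__main__":
        f = lambda x: float(np.sum((x - 0.5) ** 2))
        x, y = dsa_optimize(f, [(0.0, 1.0)] * 10)
        print(x, y)
    ```

    Because each iteration touches only a few dimensions, the inner model-fitting and acquisition steps stay cheap even as the number of experiments grows, which is the computational saving the abstract describes.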

    Bloch oscillations of cold atoms in optical lattices

    This work is devoted to Bloch oscillations (BO) of cold neutral atoms in optical lattices. After a general introduction to the phenomenon of BO and its realization in optical lattices, we study different extensions of this problem, which account for recent developments in the field: two-dimensional BO, decoherence of BO, and BO in correlated systems. Although these problems are discussed in relation to the system of cold atoms in optical lattices, many of the results are of general validity and apply equally to other systems showing the phenomenon of BO.
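    For readers new to the phenomenon, the standard textbook relations (background knowledge, not taken from this abstract) connect the lattice period and the applied force to the oscillation:

    ```latex
    % An atom in a lattice of period $d$ subject to a constant force $F$
    % performs Bloch oscillations with period and angular frequency
    \[
      T_B \;=\; \frac{2\pi\hbar}{F\,d},
      \qquad
      \omega_B \;=\; \frac{F\,d}{\hbar}.
    \]
    ```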

    Fiber Strong Shape Theory for Topological Spaces

    In the paper we construct and develop a fiber strong shape theory for arbitrary spaces over a fixed metrizable space $B_0$. Our approach is based on the method of Mardešić-Lisica; instead of resolutions, introduced by Mardešić, their fiber-preserving analogues are used. The fiber strong shape theory yields a classification of spaces over $B_0$ which is coarser than the classification induced by fiber homotopy theory, but finer than the classification given by the usual fiber shape theory.
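    Restated schematically (the symbols below are illustrative shorthand, not the paper's notation): for spaces $X$, $Y$ over $B_0$,

    ```latex
    % Fiber homotopy equivalence is the finest relation, usual fiber shape
    % equivalence the coarsest, with fiber strong shape in between:
    \[
      X \simeq_{B_0} Y
      \;\Longrightarrow\;
      X \simeq^{\mathrm{ssh}}_{B_0} Y
      \;\Longrightarrow\;
      X \simeq^{\mathrm{sh}}_{B_0} Y.
    \]
    ```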

    Bloch oscillations in complex crystals with PT symmetry

    Bloch oscillations (BO) in complex lattices with PT symmetry are theoretically investigated, with specific reference to optical BO in photonic lattices with gain/loss regions. Novel dynamical phenomena with no counterpart in ordinary lattices are highlighted, such as non-reciprocal BO related to the violation of Friedel's law of Bragg scattering in complex potentials.
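    As background (a standard relation, not quoted from the abstract): Friedel's law holds because a real potential has conjugate-symmetric Fourier coefficients, and it is exactly this symmetry that a complex PT-symmetric potential can break:

    ```latex
    % For a real potential V(x), the Fourier coefficients satisfy
    % V_{-G} = V_G^{*}, so Bragg peaks at G and -G are equally strong:
    \[
      V \ \text{real} \;\Rightarrow\; V_{-G} = V_G^{*}
      \;\Rightarrow\; |V_{G}|^{2} = |V_{-G}|^{2}
      \quad \text{(Friedel's law)}.
    \]
    % A complex potential generally gives |V_G| \neq |V_{-G}|, making Bragg
    % scattering, and hence the BO dynamics, non-reciprocal.
    ```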

    Basic Enhancement Strategies When Using Bayesian Optimization for Hyperparameter Tuning of Deep Neural Networks

    Compared to traditional machine learning models, deep neural networks (DNNs) are known to be highly sensitive to the choice of hyperparameters. While the time and effort required for manual tuning has been decreasing rapidly for well-developed and commonly used DNN architectures, hyperparameter optimization will remain a major burden whenever a new DNN architecture must be designed, a new task solved, a new dataset addressed, or an existing DNN improved further. For hyperparameter optimization of general machine learning problems, numerous automated solutions have been developed, some of the most popular of which are based on Bayesian Optimization (BO). In this work, we analyze four fundamental strategies for enhancing BO when it is used for DNN hyperparameter optimization: diversification, early termination, parallelization, and cost function transformation. Based on the analysis, we provide a simple yet robust algorithm for DNN hyperparameter optimization, DEEP-BO (Diversified, Early-termination-Enabled, and Parallel Bayesian Optimization). When evaluated over six DNN benchmarks, DEEP-BO mostly outperformed well-known solutions including GP-Hedge, BOHB, and the speed-up variants that use the Median Stopping Rule or Learning Curve Extrapolation. In fact, DEEP-BO consistently provided the top performance, or close to it, over all the benchmark types we tested, indicating that DEEP-BO is robust compared to existing solutions. The DEEP-BO code is publicly available at https://github.com/snu-adsl/DEEP-BO.
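    To make the early-termination strategy concrete, below is a minimal sketch of a median-stopping check of the kind DEEP-BO is compared against (the Median Stopping Rule). The function name and curve format are hypothetical, not taken from the DEEP-BO code base: a run is stopped when its best metric so far falls below the median of what completed runs had achieved by the same epoch.

    ```python
    import numpy as np

    def should_stop(partial_curve, completed_curves, min_epochs=3):
        """Median Stopping Rule sketch: terminate the current training run if
        its best validation accuracy so far is below the median of the
        running-best accuracies completed runs had reached at the same epoch."""
        epoch = len(partial_curve)
        if epoch < min_epochs or not completed_curves:
            return False  # too early to judge, or nothing to compare against
        peers = [max(c[:epoch]) for c in completed_curves if len(c) >= epoch]
        if not peers:
            return False
        return max(partial_curve) < np.median(peers)

    # Example: a run stuck near 0.60 accuracy versus peers that reached
    # 0.6-0.8 by epoch 3 -- the rule says to stop it early.
    done = [[0.5, 0.7, 0.8, 0.85],
            [0.4, 0.6, 0.75, 0.8],
            [0.3, 0.5, 0.6, 0.7]]
    print(should_stop([0.55, 0.58, 0.60], done))  # True
    ```

    Early termination of this kind frees evaluation budget for more promising hyperparameter configurations, which is the motivation the abstract gives for combining it with diversification and parallelization.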