
    GLOBAL OPTIMIZATION METHODS

    Training a neural network is a difficult optimization problem because of its numerous local minima. Many global search algorithms have been used to train neural networks. However, local search algorithms use computational resources more efficiently, so numerous random restarts of a local algorithm may be more effective than a single run of a global algorithm. This study uses Monte Carlo simulations to compare the efficiency of a local search algorithm with that of nine stochastic global algorithms. The computational requirements of the global algorithms are several times higher than those of the local algorithm, and there is little gain from using the global algorithms to train neural networks.
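
    As a rough illustration of the restart strategy this abstract favours, the sketch below trains a tiny one-hidden-layer network from many random initial weight vectors using a local optimizer. The network size, the synthetic data, and the choice of SciPy's L-BFGS-B are assumptions made for illustration, not the study's actual experimental setup.

    # Sketch: repeated random restarts of a local optimizer for a tiny neural network.
    # Illustrative assumptions: synthetic 1-D regression data, a 5-unit tanh network,
    # and SciPy's L-BFGS-B as the local search method.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Synthetic regression task (assumed for illustration).
    X = np.linspace(-1.0, 1.0, 50)[:, None]
    y = np.sin(3.0 * X).ravel()

    N_HIDDEN = 5  # hidden units in a single tanh layer

    def unpack(w):
        # Split the flat parameter vector into layer weights and biases.
        w1 = w[:N_HIDDEN].reshape(1, N_HIDDEN)
        b1 = w[N_HIDDEN:2 * N_HIDDEN]
        w2 = w[2 * N_HIDDEN:3 * N_HIDDEN].reshape(N_HIDDEN, 1)
        b2 = w[3 * N_HIDDEN]
        return w1, b1, w2, b2

    def loss(w):
        # Mean squared training error of the network defined by parameters w.
        w1, b1, w2, b2 = unpack(w)
        hidden = np.tanh(X @ w1 + b1)
        pred = (hidden @ w2).ravel() + b2
        return np.mean((pred - y) ** 2)

    n_params = 3 * N_HIDDEN + 1
    best = None
    for restart in range(20):  # many cheap local searches from random starting points
        w0 = rng.normal(size=n_params)
        result = minimize(loss, w0, method="L-BFGS-B")
        if best is None or result.fun < best.fun:
            best = result

    print(f"best training MSE over 20 restarts: {best.fun:.4f}")

    Because each restart is cheap, the total cost of such a loop can be compared directly against a single run of a stochastic global method over the same parameter space, which is the kind of comparison the study describes.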

    On the Thermodynamics of Global Optimization

    Theoretical design of global optimization algorithms can profitably utilize recent statistical mechanical treatments of potential energy surfaces (PESs). Here we analyze a particular method to explain its success in locating global minima on surfaces with a multiple-funnel structure, where trapping in local minima with different morphologies is expected. We find that a key factor in overcoming trapping is the transformation applied to the PES, which broadens the thermodynamic transitions. The global minimum then has a significant probability of occupation at temperatures where the free energy barriers between funnels are surmountable.
    Comment: 4 pages, 3 figures, revtex
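
    One way to picture a PES transformation of the kind the abstract describes is the basin-hopping-style "staircase" transformation, in which each configuration's energy is replaced by the energy of the local minimum reached from it; whether this is exactly the method analyzed is an assumption here, and the one-dimensional double-well potential below is purely illustrative.

    # Sketch of a staircase (basin-hopping-style) transformation of a potential
    # energy surface: E_tilde(x) = E(local_minimize(x)). Barriers between points
    # that drain to the same minimum disappear, which is one way a transformed
    # surface can broaden thermodynamic transitions. The double-well potential
    # and its parameters are illustrative assumptions.
    import numpy as np
    from scipy.optimize import minimize

    def energy(x):
        # Asymmetric one-dimensional double well: two basins separated by a barrier.
        return (x ** 2 - 1.0) ** 2 + 0.3 * x

    def transformed_energy(x0):
        # Energy of the local minimum reached from starting point x0.
        result = minimize(lambda v: energy(v[0]), x0=[x0], method="L-BFGS-B")
        return result.fun

    # On the original surface the energy varies smoothly; on the transformed
    # surface every point in a basin takes that basin's minimum energy.
    for x in np.linspace(-1.8, 1.8, 9):
        print(f"x = {x:+.2f}   E = {energy(x):7.3f}   E_tilde = {transformed_energy(x):7.3f}")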