
    Optimization by thermal cycling

    Thermal cycling is a heuristic optimization algorithm that consists of cyclically heating and quenching a solution by Metropolis and local search procedures, respectively, where the heating amplitude slowly decreases. In recent years it has been successfully applied to two combinatorial optimization tasks, the traveling salesman problem and the search for low-energy states of the Coulomb glass; in these cases the algorithm is far more efficient than the usual simulated annealing. In its original form the algorithm was designed only for the case of discrete variables, but its basic ideas are also applicable to a problem with continuous variables, the search for low-energy states of Lennard-Jones clusters.
    Comment: Submitted to Proceedings of the Workshop "Complexity, Metastability and Nonextensivity", held in Erice 20-26 July 2004. LaTeX, 7 pages, 3 figures
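    The heat-then-quench cycle described above can be sketched as follows. This is a minimal illustration on a toy one-dimensional objective; the parameter choices, the Gaussian proposal, and the greedy quench are assumptions for the sketch, not the authors' implementation:

```python
import math
import random

def thermal_cycling(f, x0, cycles=200, t0=2.0, decay=0.98, step=0.5, seed=0):
    """Thermal-cycling sketch: heat the archived best solution with a short
    Metropolis walk at temperature t, then quench it with a greedy local
    search; the heating amplitude t slowly decreases between cycles."""
    rng = random.Random(seed)
    best_x, best_f = x0, f(x0)
    t = t0
    for _ in range(cycles):
        # --- heating: short Metropolis walk started from the archived best ---
        x, fx = best_x, best_f
        for _ in range(20):
            cand = x + rng.gauss(0.0, step)
            fc = f(cand)
            if fc <= fx or rng.random() < math.exp(-(fc - fx) / t):
                x, fx = cand, fc
        # --- quenching: greedy descent with shrinking move size ---
        s = step
        while s > 1e-4:
            moved = False
            for d in (-s, s):
                if f(x + d) < fx:
                    x, fx = x + d, f(x + d)
                    moved = True
            if not moved:
                s *= 0.5
        # archive the quenched state only if it improves on the best so far
        if fx < best_f:
            best_x, best_f = x, fx
        t *= decay  # slowly reduce the heating amplitude
    return best_x, best_f

# toy multimodal objective (an assumption for illustration only)
def rugged(x):
    return x * x + 3.0 * math.sin(5.0 * x)
```

    On this toy landscape the quench settles into the nearest local minimum, while the Metropolis heating lets the search hop between basins; shrinking the amplitude gradually freezes it into a deep one.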

    Theory of Randomized Search Heuristics in Combinatorial Optimization


    06061 Abstracts Collection -- Theory of Evolutionary Algorithms

    From 05.02.06 to 10.02.06, the Dagstuhl Seminar 06061 ``Theory of Evolutionary Algorithms'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided where available.

    Genetic algorithms and simulated annealing for robustness analysis

    Genetic algorithms (GAs) and simulated annealing (SA) have been promoted as useful, general tools for nonlinear optimization. This paper explores their use in robustness analysis with real parameter variations, a known NP-hard problem which would appear to be ideally suited to demonstrating the power of GAs and SA. Numerical experiments show convincingly that they perform worse than existing branch and bound (B&B) approaches. While this may appear to cast doubt on some of the hype surrounding these stochastic optimization techniques, we find that they do have attractive features, which are also demonstrated in this study. For example, both GAs and SA are almost trivial to understand and program, so they require essentially no expertise, in sharp contrast to the B&B methods. They may be suitable for problems where programming effort is much more important than running time or the quality of the answer. Robustness analysis for engineering problems is not the best candidate in this respect, but it does provide an interesting test case for the evaluation of GAs and SA. A simple hill climbing algorithm is also studied for comparison.
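    The simple hill climbing baseline mentioned above can be sketched as follows. This is an illustrative assumption, not the paper's exact variant: a random-restart climber maximizing an objective over a parameter box, mimicking a worst-case parameter search in robustness analysis:

```python
import random

def hill_climb(f, lo, hi, dim, iters=500, step=0.1, restarts=5, seed=1):
    """Random-restart hill climbing sketch: maximize f over the box
    [lo, hi]^dim by accepting only improving Gaussian perturbations."""
    rng = random.Random(seed)
    best_x, best_f = None, float("-inf")
    for _ in range(restarts):
        # fresh random starting point for each restart
        x = [rng.uniform(lo, hi) for _ in range(dim)]
        fx = f(x)
        for _ in range(iters):
            # perturb every coordinate, clipping back into the box
            cand = [min(hi, max(lo, xi + rng.gauss(0.0, step))) for xi in x]
            fc = f(cand)
            if fc > fx:          # greedy: accept improvements only
                x, fx = cand, fc
        if fx > best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# toy objective (assumption): worst case sits at a corner of the box
def worst_case(x):
    return sum(v * v for v in x)
```

    On a smooth objective such a climber converges quickly, which is why the paper can use it as a cheap yardstick against GAs and SA.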

    Imbalanced data classification using support vector machine based on simulated annealing for enhancing penalty parameter

    For pattern classification and regression problems, the support vector machine (SVM) is a prominent and computationally powerful machine learning method. It has been applied effectively to concrete problems across a wide range of domains. SVM has a key parameter, the penalty factor C. The choice of this parameter has a substantial impact on the classification accuracy of SVM, as unsuitable settings can produce substandard classification outcomes. The penalty factor C must strike an adequate trade-off between classification errors and generalisation performance. Hence, building an SVM model with good performance requires parameter optimisation. The simulated annealing (SA) algorithm is employed to form a hybrid method for selecting the SVM parameter, with the intent of improving system efficacy by obtaining the optimal penalty parameter while balancing classification performance. Our experiments on several UCI datasets indicate that the recommended technique attains improved classification accuracy.
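    A hybrid of this kind can be sketched as follows. Hedged assumptions: `cv_error` stands in for any user-supplied validation-error estimate of an SVM at penalty C (e.g. k-fold cross-validation), `fake_cv_error` is a synthetic stand-in so the sketch is self-contained, and the cooling schedule is a generic geometric one, not necessarily the paper's:

```python
import math
import random

def sa_tune_penalty(cv_error, c_lo=1e-3, c_hi=1e3,
                    iters=300, t0=1.0, alpha=0.98, seed=0):
    """Simulated-annealing search for the SVM penalty factor C.
    Searches in log10-space, since C typically spans several orders
    of magnitude, and minimizes the supplied validation error."""
    rng = random.Random(seed)
    lo, hi = math.log10(c_lo), math.log10(c_hi)
    x = rng.uniform(lo, hi)          # current log10(C)
    fx = cv_error(10 ** x)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iters):
        cand = min(hi, max(lo, x + rng.gauss(0.0, 0.5)))
        fc = cv_error(10 ** cand)
        # accept improvements always, worsenings with Boltzmann probability
        if fc <= fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= alpha                   # geometric cooling
    return 10 ** best_x, best_f

# synthetic stand-in for a cross-validation error curve, with its
# minimum near C = 10 (pure assumption, used only to keep this runnable)
def fake_cv_error(C):
    return 0.1 * (math.log10(C) - 1.0) ** 2 + 0.05
```

    In practice `cv_error` would wrap an actual SVM training/validation loop; the SA wrapper itself is agnostic to how the error is estimated.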

    Simulated Annealing is a Polynomial-Time Approximation Scheme for the Minimum Spanning Tree Problem

    We prove that simulated annealing with an appropriate cooling schedule computes arbitrarily tight constant-factor approximations to the minimum spanning tree problem in polynomial time. This result was conjectured by Wegener (2005). More precisely, denoting by $n$, $m$, $w_{\max}$, and $w_{\min}$ the number of vertices and edges as well as the maximum and minimum edge weight of the MST instance, we prove that simulated annealing with initial temperature $T_0 \ge w_{\max}$ and multiplicative cooling schedule with factor $1 - 1/\ell$, where $\ell = \omega(mn\ln(m))$, with probability at least $1 - 1/m$ computes in time $O(\ell(\ln\ln(\ell) + \ln(T_0/w_{\min})))$ a spanning tree with weight at most $1+\kappa$ times the optimum weight, where $1+\kappa = \frac{(1+o(1))\ln(\ell m)}{\ln(\ell) - \ln(mn\ln(m))}$. Consequently, for any $\epsilon > 0$, we can choose $\ell$ in such a way that a $(1+\epsilon)$-approximation is found in time $O((mn\ln(n))^{1+1/\epsilon+o(1)}(\ln\ln n + \ln(T_0/w_{\min})))$ with probability at least $1 - 1/m$. In the special case of so-called $(1+\epsilon)$-separated weights, this algorithm computes an optimal solution (again in time $O((mn\ln(n))^{1+1/\epsilon+o(1)}(\ln\ln n + \ln(T_0/w_{\min})))$), which is a significant speed-up over Wegener's runtime guarantee of $O(m^{8+8/\epsilon})$.
    Comment: 19 pages. Full version of a paper appearing at GECCO 202
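    The analyzed process can be sketched on a small graph. The edge-swap neighbourhood, the initial temperature T_0 >= w_max, and the multiplicative cooling T <- T * (1 - 1/ell) follow the abstract; the default value of ell, the step count, and the move-generation details are illustrative assumptions, not the paper's bounds:

```python
import math
import random

def sa_mst(n, edges, ell=None, t0=None, steps=3000, seed=1):
    """Simulated annealing for MST (sketch). State: a spanning tree of a
    connected graph. Move: insert a random non-tree edge and delete a random
    edge of the cycle it closes; accept by the Metropolis rule. The
    temperature cools multiplicatively by the factor 1 - 1/ell each step.
    edges: list of (u, v, weight) triples."""
    rng = random.Random(seed)
    w = {frozenset((u, v)): wt for u, v, wt in edges}
    m = len(edges)
    ell = ell or 10 * m * n                    # illustrative default
    t = t0 or max(wt for _, _, wt in edges)    # T_0 >= w_max, as in the paper

    # arbitrary starting spanning tree via union-find over the edge list
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    tree = set()
    for u, v, _ in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.add(frozenset((u, v)))

    def tree_path(a, b):
        # BFS in the current tree: the unique a-b path, as a list of edges
        adj = {}
        for e in tree:
            x, y = tuple(e)
            adj.setdefault(x, []).append(y)
            adj.setdefault(y, []).append(x)
        prev, queue = {a: None}, [a]
        while queue:
            x = queue.pop(0)
            for y in adj.get(x, []):
                if y not in prev:
                    prev[y] = x
                    queue.append(y)
        path, x = [], b
        while prev[x] is not None:
            path.append(frozenset((x, prev[x])))
            x = prev[x]
        return path

    non_tree = [e for e in w if e not in tree]
    for _ in range(steps):
        if not non_tree:        # graph is already a tree: nothing to swap
            break
        e_in = rng.choice(non_tree)
        u, v = tuple(e_in)
        e_out = rng.choice(tree_path(u, v))
        delta = w[e_in] - w[e_out]
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            tree.remove(e_out); tree.add(e_in)
            non_tree.remove(e_in); non_tree.append(e_out)
        t *= 1.0 - 1.0 / ell    # multiplicative cooling schedule
    return tree, sum(w[e] for e in tree)
```

    A useful sanity check on why this works: in the single-swap neighbourhood, every locally optimal spanning tree is already a minimum spanning tree, so once the temperature is low the process behaves like randomized local search and cannot get stuck at a suboptimal tree.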