
    Meta-heuristic algorithms in car engine design: a literature survey

    Meta-heuristic algorithms are often inspired by natural phenomena, including the evolution of species in Darwinian natural selection theory, ant behaviors in biology, the flocking behavior of birds, and annealing in metallurgy. Due to their great potential in solving difficult optimization problems, meta-heuristic algorithms have found their way into automobile engine design. Different optimization problems arise in different areas of car engine management, including calibration, control systems, fault diagnosis, and modeling. In this paper we review the state-of-the-art applications of different meta-heuristic algorithms in engine management systems. The review covers a wide range of research, including the application of meta-heuristic algorithms to engine calibration, the optimization of engine control systems, engine fault diagnosis, and the optimization and modeling of different engine components. The meta-heuristic algorithms reviewed in this paper include evolutionary algorithms, evolution strategies, evolutionary programming, genetic programming, differential evolution, estimation of distribution algorithms, ant colony optimization, particle swarm optimization, memetic algorithms, and artificial immune systems.
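    As an illustration of the kind of meta-heuristic discussed in the survey, the following is a minimal particle swarm optimisation sketch applied to a hypothetical two-parameter engine calibration objective. The objective function, parameter bounds, and hyper-parameters are illustrative assumptions, not taken from the paper.

    import numpy as np

    def pso(objective, bounds, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Minimal particle swarm optimisation: minimise `objective` within `bounds`."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds[:, 0], bounds[:, 1]
        dim = len(lo)
        x = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
        v = np.zeros_like(x)                               # particle velocities
        pbest = x.copy()                                    # personal best positions
        pbest_f = np.array([objective(p) for p in x])       # personal best values
        gbest = pbest[np.argmin(pbest_f)].copy()            # global best position
        for _ in range(n_iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([objective(p) for p in x])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            gbest = pbest[np.argmin(pbest_f)].copy()
        return gbest, pbest_f.min()

    # Hypothetical calibration objective: trade-off governed by two tuning
    # parameters (spark advance and air-fuel ratio); purely illustrative.
    def engine_cost(p):
        spark_advance, afr = p
        return (spark_advance - 12.0) ** 2 + 2.0 * (afr - 14.7) ** 2

    best, best_cost = pso(engine_cost, np.array([[0.0, 40.0], [10.0, 20.0]]))
    print(best, best_cost)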

    Estimation and uncertainty of reversible Markov models

    Reversibility is a key concept in Markov models and master-equation models of molecular kinetics. The analysis and interpretation of the transition matrix encoding the kinetic properties of the model rely heavily on the reversibility property. The estimation of a reversible transition matrix from simulation data is therefore crucial to the successful application of the previously developed theory. In this work we discuss methods for the maximum likelihood estimation of transition matrices from finite simulation data and present a new algorithm for the case in which reversibility with respect to a given stationary vector is desired. We also develop new methods for the Bayesian posterior inference of reversible transition matrices, with and without a given stationary vector, taking into account the need for a suitable prior distribution that preserves the metastable features of the observed process during posterior inference. All algorithms described here are implemented in the PyEMMA software (http://pyemma.org) as of version 2.0.
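    Below is a minimal sketch of one standard fixed-point iteration for the maximum likelihood estimation of a reversible transition matrix from a count matrix. It illustrates the kind of estimator discussed, not the specific algorithms or PyEMMA code of the paper; the toy count matrix and convergence settings are assumptions.

    import numpy as np

    def reversible_mle(C, n_iter=1000, tol=1e-10):
        """Fixed-point iteration for a reversible maximum likelihood transition matrix.

        C : (n, n) matrix of observed transition counts.
        Returns the reversible transition matrix T and its stationary distribution pi.
        """
        C = np.asarray(C, dtype=float)
        c_row = C.sum(axis=1)                    # outgoing counts per state
        X = (C + C.T) / 2.0                      # symmetric initial guess for pi_i * T_ij
        for _ in range(n_iter):
            x_row = X.sum(axis=1)
            X_new = (C + C.T) / (c_row[:, None] / x_row[:, None] + c_row[None, :] / x_row[None, :])
            if np.max(np.abs(X_new - X)) < tol:
                X = X_new
                break
            X = X_new
        x_row = X.sum(axis=1)
        T = X / x_row[:, None]                   # row-normalise to obtain the transition matrix
        pi = x_row / x_row.sum()                 # stationary distribution
        return T, pi

    # Toy two-state count matrix (illustrative only).
    C = np.array([[90, 10],
                  [20, 80]])
    T, pi = reversible_mle(C)
    print(T, pi)

    Because the symmetric matrix X is row-normalised at the end, the resulting T satisfies detailed balance with respect to pi by construction.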

    Towards large scale continuous EDA: a random matrix theory perspective

    Estimation of distribution algorithms (EDAs) are a major branch of evolutionary algorithms (EAs) with some unique advantages in principle. They are able to exploit correlation structure to drive the search more efficiently, and they are able to provide insights about the structure of the search space. However, model building in high dimensions is extremely challenging, and as a result existing EDAs lose their strengths in large-scale problems. Large-scale continuous global optimisation is key to many modern real-world problems, and scaling up EAs to large-scale problems has become one of the biggest challenges of the field. This paper pins down some fundamental roots of the problem and makes a start at developing a new and generic framework to yield effective EDA-type algorithms for large-scale continuous global optimisation problems. Our concept is to introduce an ensemble of random projections of the set of fittest search points to low dimensions as a basis for developing a new and generic divide-and-conquer methodology. This is rooted in the theory of random projections developed in theoretical computer science, and exploits recent advances in non-asymptotic random matrix theory.
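    The following is a rough sketch of the random-projection idea as described in the abstract: the fittest points are projected down with an ensemble of random Gaussian matrices, a Gaussian search distribution is fitted and sampled in each low-dimensional subspace, and the samples are mapped back and averaged. The dimensions, population sizes, scaling, and averaging step are illustrative assumptions, not the authors' exact algorithm.

    import numpy as np

    def rp_eda_step(fittest, n_offspring, k=5, n_proj=10, rng=None):
        """One generation of a random-projection ensemble EDA step (illustrative sketch).

        fittest     : (m, D) array of the selected fittest search points.
        n_offspring : number of new candidate points to generate.
        k           : dimension of each random subspace (k << D).
        n_proj      : number of random projections in the ensemble.
        """
        rng = np.random.default_rng() if rng is None else rng
        m, D = fittest.shape
        mean = fittest.mean(axis=0)
        centred = fittest - mean
        offspring = np.zeros((n_offspring, D))
        for _ in range(n_proj):
            R = rng.normal(size=(k, D)) / np.sqrt(k)     # random projection matrix
            low = centred @ R.T                          # project fittest points to k dims
            cov = np.cov(low, rowvar=False) + 1e-8 * np.eye(k)
            z = rng.multivariate_normal(np.zeros(k), cov, size=n_offspring)
            offspring += z @ R                           # map subspace samples back to D dims
        offspring = mean + offspring / n_proj            # average over the ensemble
        return offspring

    # Toy usage on a 100-dimensional sphere function (illustrative only).
    rng = np.random.default_rng(0)
    pop = rng.normal(size=(200, 100))
    fitness = np.sum(pop ** 2, axis=1)
    fittest = pop[np.argsort(fitness)[:50]]
    new_pop = rp_eda_step(fittest, n_offspring=200, rng=rng)
    print(new_pop.shape)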

    Calculation of aggregate loss distributions

    Estimation of the operational risk capital under the Loss Distribution Approach requires evaluation of aggregate (compound) loss distributions, which is one of the classic problems in risk theory. Closed-form solutions are not available for the distributions typically used in operational risk. However, with modern computer processing power, these distributions can be calculated virtually exactly using numerical methods. This paper reviews numerical algorithms that can be successfully used to calculate the aggregate loss distributions. In particular, Monte Carlo, Panjer recursion and Fourier transform methods are presented and compared. Several closed-form approximations based on moment matching and asymptotic results for heavy-tailed distributions are also reviewed.
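    As a concrete illustration of one of the methods named here, the following sketches the Panjer recursion for a compound Poisson aggregate loss with a discretised severity distribution. The frequency parameter, severity grid and discretisation are illustrative assumptions, not figures from the paper.

    import numpy as np

    def panjer_poisson(lam, severity_pmf, n_max):
        """Panjer recursion for the compound Poisson aggregate loss distribution.

        lam          : Poisson frequency parameter.
        severity_pmf : discretised severity probabilities f[0], f[1], ..., on a unit grid.
        n_max        : number of aggregate-loss grid points to compute.
        Returns g[0..n_max-1], the pmf of the aggregate loss on the same grid.
        """
        f = np.zeros(n_max)
        f[:min(len(severity_pmf), n_max)] = severity_pmf[:n_max]
        g = np.zeros(n_max)
        g[0] = np.exp(lam * (f[0] - 1.0))        # P(aggregate loss = 0)
        for n in range(1, n_max):
            j = np.arange(1, n + 1)
            g[n] = (lam / n) * np.sum(j * f[j] * g[n - j])
        return g

    # Illustrative example: Poisson(3) frequency, severity uniform on {1, 2, 3, 4}.
    severity = np.array([0.0, 0.25, 0.25, 0.25, 0.25])
    g = panjer_poisson(lam=3.0, severity_pmf=severity, n_max=50)
    print(g.sum(), g[:10])                        # probability mass captured and first few values

    The recursion exploits the fact that the Poisson frequency belongs to the Panjer (a, b, 0) class with a = 0 and b = lam, which gives the simple update g_n = (lam / n) * sum_j j * f_j * g_{n-j}.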

    Quantum query complexity of entropy estimation

    Estimation of Shannon and Rényi entropies of unknown discrete distributions is a fundamental problem in statistical property testing and an active research topic in both theoretical computer science and information theory. Tight bounds on the number of samples needed to estimate these entropies have been established in the classical setting, while little is known about their quantum counterparts. In this paper, we give the first quantum algorithms for estimating α-Rényi entropies (Shannon entropy being the 1-Rényi entropy). In particular, we demonstrate a quadratic quantum speedup for Shannon entropy estimation and a generic quantum speedup for α-Rényi entropy estimation for all α ≥ 0, including a tight bound for the collision entropy (2-Rényi entropy). We also provide quantum upper bounds for extreme cases such as the Hartley entropy (i.e., the logarithm of the support size of a distribution, corresponding to α = 0) and the min-entropy (i.e., α = +∞), as well as for the Kullback-Leibler divergence between two distributions. Moreover, we complement our results with quantum lower bounds on α-Rényi entropy estimation for all α ≥ 0.
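    For reference, the quantities being estimated are the Shannon entropy H(p) = -Σ p_i log p_i and the α-Rényi entropy H_α(p) = (1/(1-α)) log Σ p_i^α. The sketch below is a naive classical plug-in estimator from samples, shown only to make the estimation targets concrete; it says nothing about the quantum query model studied in the paper, and the sample sizes and distribution are illustrative assumptions.

    import numpy as np
    from collections import Counter

    def renyi_entropy_plugin(samples, alpha):
        """Naive classical plug-in estimate of the alpha-Renyi entropy (in nats).

        alpha = 1 gives the Shannon entropy, alpha = 0 the Hartley entropy
        (log of the empirical support size), and alpha = np.inf the min-entropy.
        """
        counts = np.array(list(Counter(samples).values()), dtype=float)
        p = counts / counts.sum()                 # empirical distribution
        if alpha == 1:
            return -np.sum(p * np.log(p))         # Shannon entropy
        if np.isinf(alpha):
            return -np.log(p.max())               # min-entropy
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    # Illustrative example: samples from a biased three-outcome distribution.
    rng = np.random.default_rng(0)
    samples = rng.choice([0, 1, 2], size=10_000, p=[0.5, 0.3, 0.2])
    for a in (0, 0.5, 1, 2, np.inf):
        print(a, renyi_entropy_plugin(samples, a))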