3,174 research outputs found

    Tools for Landscape Analysis of Optimisation Problems in Procedural Content Generation for Games

    The term Procedural Content Generation (PCG) refers to the (semi-)automatic generation of game content by algorithmic means, and its methods are becoming increasingly popular in game-oriented research and industry. A special class of these methods, commonly known as search-based PCG, treats the given task as an optimisation problem. Such problems are predominantly tackled by evolutionary algorithms. We demonstrate in this paper that obtaining more information about the defined optimisation problem can substantially improve our understanding of how to approach the generation of content. To do so, we present and discuss three efficient analysis tools, namely diagonal walks, the estimation of high-level properties, and problem similarity measures. We discuss the purpose of each of these methods in the context of PCG and provide guidelines for interpreting the results obtained. In this way we aim to provide methods for comparing PCG approaches and, eventually, to increase the quality and practicality of generated content in industry.
    Comment: 30 pages, 8 figures, accepted for publication in Applied Soft Computing
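    The diagonal walk named above is one of the cheapest landscape-analysis probes: fitness is sampled along the main diagonal of the search domain to get a quick profile of the landscape. A minimal sketch, assuming a continuous box-constrained search space; the function names and the sphere example are illustrative, not taken from the paper:

    ```python
    import numpy as np

    def diagonal_walk(fitness, dim, lower, upper, steps=100):
        """Sample `fitness` at equally spaced points along the main diagonal
        of the box [lower, upper]^dim and return the fitness profile."""
        ts = np.linspace(0.0, 1.0, steps)[:, None]            # walk positions in [0, 1]
        points = lower + ts * (upper - lower) * np.ones(dim)  # (steps, dim) diagonal points
        return np.array([fitness(p) for p in points])

    # Illustrative probe of a sphere function: a smooth, unimodal profile
    # hints at an easy landscape; a rugged one hints at many local optima.
    profile = diagonal_walk(lambda x: float(np.sum(x * x)), dim=10, lower=-5.0, upper=5.0)
    ```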

    Benchmarking for Metaheuristic Black-Box Optimization: Perspectives and Open Challenges

    Research on new optimization algorithms is often funded on the basis that such algorithms might improve our ability to deal with real-world and industrially relevant optimization challenges. Besides a huge variety of evolutionary and metaheuristic optimization algorithms, a large number of test problems and benchmark suites have also been developed and used for comparative assessments of algorithms in the context of global, continuous, and black-box optimization. For many of the commonly used synthetic benchmark problems and artificial fitness landscapes, however, no methods are available to relate the resulting algorithm performance assessments to technologically relevant real-world optimization problems, or vice versa. From a theoretical perspective, too, many of the commonly used benchmark problems and approaches have little to no generalization value. Based on a mini-review of publications with critical comments, advice, and new approaches, this communication aims to give a constructive perspective on several open challenges and prospective research directions related to systematic and generalizable benchmarking for black-box optimization.

    The application of modified adaptive landscapes to heuristic modelling of engine concept designs using sparse data

    The automotive internal combustion engine industry operates in a sector that relies on high production volumes for economies of scale, and on dedicated production equipment for efficiency of operations and control of quality, yet it is subject to the vagaries of a dynamic marketplace and the need for constant change. These circumstances place pressure on engine designs to be optimised at launch, so as to be competitive and meet market needs, yet remain adaptable to uncertain requirements for change over their production life. Engine designers therefore need concept-configuration evaluation tools that can assess architectures for resilience to geometric change over the production life of the product. The problem of being resource efficient whilst having the capacity to adapt to changing environments is one that has been addressed in nature: natural systems have evolved strategies for satisficing conflicting requirements whilst remaining resource efficient. The theory of adaptive landscapes helps us to visualise the adaptive capacity of potential morphological forms. A concept attribute analysis methodology based on satisficing and adaptive landscapes has been developed and tested for application to engine concept design. The Plateau, Flooded Adaptive Landscape (PFAL) technique has been evaluated against exemplar engine life histories and shows merit in aiding the decision-making process for concept designers working with sparse data. The process lets the designer visualise the attribute map, enabling them to make better trade-off decisions and to share these with non-expert stakeholders to gain their input on final concept choices, as sketched below.
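    The flooding idea behind a plateau-based satisficing analysis can be pictured as thresholding an attribute landscape at a satisficing level, so that all acceptable designs form connected plateaus; larger plateaus suggest more room to adapt the design later. A hedged sketch on a discretised 2-D attribute map; the function names, grid representation, and 4-neighbour connectivity are illustrative assumptions, not the published PFAL technique itself:

    ```python
    import numpy as np

    def flooded_plateaus(attribute_map, flood_level):
        """Threshold a 2-D attribute map at `flood_level` (the satisficing
        bound) and label the connected plateaus of acceptable designs."""
        acceptable = attribute_map >= flood_level
        labels = np.zeros(attribute_map.shape, dtype=int)
        current = 0
        for start in zip(*np.nonzero(acceptable)):
            if labels[start]:
                continue                      # already part of a labelled plateau
            current += 1
            stack = [start]
            while stack:                      # simple 4-neighbour flood fill
                r, c = stack.pop()
                if not (0 <= r < labels.shape[0] and 0 <= c < labels.shape[1]):
                    continue
                if labels[r, c] or not acceptable[r, c]:
                    continue
                labels[r, c] = current
                stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
        return labels

    # Toy attribute map: designs scoring at least 0.7 count as "good enough".
    scores = np.random.default_rng(0).random((20, 20))
    plateaus = flooded_plateaus(scores, flood_level=0.7)
    ```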

    Using Automated Algorithm Configuration for Parameter Control

    Dynamic Algorithm Configuration (DAC) tackles the question of how to automatically learn policies for controlling parameters of algorithms in a data-driven fashion. This question has received considerable attention from the evolutionary community in recent years. A good benchmark collection for gaining structural understanding of the effectiveness and limitations of different solution methods for DAC is therefore strongly desirable. Following recent work proposing DAC benchmarks with well-understood theoretical properties and ground-truth information, in this work we suggest as a new DAC benchmark the control of the key parameter λ in the (1+(λ,λ)) Genetic Algorithm for solving OneMax problems. We conduct a study on how to solve the DAC problem via (static) automated algorithm configuration on the benchmark, and propose techniques to significantly improve the performance of the approach. Our approach consistently outperforms the default parameter control policy of the benchmark, derived from previous theoretical work, on sufficiently large problem sizes. We also present new findings on the landscape of the parameter-control search policies and propose methods to compute stronger baselines for the benchmark via numerical approximations of the true optimal policies.
    Comment: To appear in the Proc. of the ACM/SIGEVO Conference on Foundations of Genetic Algorithms (FOGA XVII)
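    As a rough illustration of the benchmark setting, the sketch below runs a textbook (1+(λ,λ)) Genetic Algorithm on OneMax with a pluggable fitness-dependent λ-policy, using the theory-derived form λ = sqrt(n / (n − f(x))) as the example baseline. This is a plain-vanilla sketch under those assumptions, not the authors' tuned implementation, and all identifiers are illustrative:

    ```python
    import random

    def onemax(x):
        return sum(x)

    def one_plus_lambda_lambda_ga(n, policy, max_evals=100_000):
        """(1+(λ,λ)) GA on OneMax; `policy(fitness, n)` returns λ."""
        x = [random.randint(0, 1) for _ in range(n)]
        fx, evals = onemax(x), 0
        while fx < n and evals < max_evals:
            lam = max(1, round(policy(fx, n)))
            p, c = lam / n, 1.0 / lam
            # Mutation phase: flip ell random bits in each of lam offspring, keep the best.
            ell = sum(random.random() < p for _ in range(n))
            best_mut, best_mut_f = None, -1
            for _ in range(lam):
                y = x[:]
                for i in random.sample(range(n), ell):
                    y[i] ^= 1
                fy = onemax(y); evals += 1
                if fy > best_mut_f:
                    best_mut, best_mut_f = y, fy
            # Crossover phase: biased uniform crossover between parent and best mutant.
            best_cross, best_cross_f = x, fx
            for _ in range(lam):
                y = [b if random.random() < c else a for a, b in zip(x, best_mut)]
                fy = onemax(y); evals += 1
                if fy >= best_cross_f:
                    best_cross, best_cross_f = y, fy
            x, fx = best_cross, best_cross_f
        return evals

    # Theory-derived fitness-dependent baseline: λ = sqrt(n / (n - f(x))).
    evals = one_plus_lambda_lambda_ga(100, lambda f, n: (n / (n - f)) ** 0.5)
    ```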

    On the entropy of protein families

    Proteins are essential components of living systems, capable of performing a huge variety of tasks at the molecular level, such as recognition, signalling, copying, and transport. The protein sequences realising a given function may vary greatly across organisms, giving rise to a protein family. Here, we estimate the entropy of such families based on different approaches, including Hidden Markov Models used for protein databases and inferred statistical models reproducing the low-order (1- and 2-point) statistics of multi-sequence alignments. We also compute the entropic cost, that is, the loss in entropy resulting from a constraint acting on the protein, such as the fixation of one particular amino acid at a specific site, and relate this notion to the escape probability of the HIV virus. The case of lattice proteins, for which the entropy can be computed exactly, allows us to provide another illustration of the concept of cost, arising from the competition between different folds. The relevance of the entropy to directed evolution experiments is stressed.
    Comment: to appear in Journal of Statistical Physics
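    As a worked example of the simplest estimate mentioned above, the sketch below fits an independent-site (1-point) model to a toy multiple-sequence alignment and sums the per-column entropies. The pseudocount regularisation and all names are illustrative assumptions; the paper's inferred models additionally match the 2-point statistics, which this sketch ignores:

    ```python
    import math
    from collections import Counter

    def independent_site_entropy(msa, pseudocount=0.5):
        """Entropy (in bits) of the independent-site model fitted to a
        multiple-sequence alignment: the sum over columns of the column
        entropy, using the 1-point statistics only."""
        n_seq = len(msa)
        total = 0.0
        for col in zip(*msa):                 # iterate over alignment columns
            counts = Counter(col)
            norm = n_seq + pseudocount * len(counts)
            for _, c in counts.items():
                p = (c + pseudocount) / norm  # regularised column frequency
                total -= p * math.log2(p)
        return total

    # Toy alignment of four sequences over four sites.
    toy_msa = ["ACDA", "ACDG", "ACEA", "GCDA"]
    print(independent_site_entropy(toy_msa))
    ```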