577 research outputs found

    A hyper-heuristic inspired by pearl hunting

    Get PDF

    Pearl hunter: an inspired hyper-heuristic

    Get PDF

    A modified choice function hyper-heuristic controlling unary and binary operators

    Get PDF
    Hyper-heuristics are a class of high-level search methodologies which operate on a search space of low-level heuristics or components, rather than on solutions directly. Traditional iterative selection hyper-heuristics rely on two key components: a heuristic selection method and a move acceptance criterion. Choice Function heuristic selection scores heuristics based on a combination of three measures, selecting the heuristic with the highest score. Modified Choice Function heuristic selection is a variant of the Choice Function which emphasises intensification over diversification within the heuristic search process. Previous work has shown that improved results are possible in some problem domains when using Modified Choice Function heuristic selection over the classic Choice Function; however, in most of these cases crossover low-level heuristics (operators) are omitted. In this paper, we introduce crossover low-level heuristics into a Modified Choice Function selection hyper-heuristic and present results over six problem domains. It is observed that although on average there is an increase in performance when using crossover low-level heuristics, the benefit of using crossover can vary on a per-domain or per-instance basis.
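
    As a point of reference, the sketch below illustrates how Modified Choice Function scoring is commonly described in the hyper-heuristic literature: a single weight shifts the score towards intensification (recent heuristic performance) after an improving move and back towards diversification (time since last use) otherwise. The update constants and bookkeeping here are illustrative assumptions, not details taken from the paper above.

    class ModifiedChoiceFunction:
        def __init__(self, heuristics):
            self.heuristics = list(heuristics)
            self.phi = 0.5                                 # intensification weight phi_t
            self.f1 = {h: 0.0 for h in self.heuristics}    # recent performance of h alone
            self.f2 = {}                                   # performance of h applied after h_prev
            self.last_used = {h: 0.0 for h in self.heuristics}
            self.clock = 0.0

        def select(self, h_prev):
            delta = 1.0 - self.phi                         # delta_t weights diversification (f3)
            def score(h):
                f2 = self.f2.get((h_prev, h), 0.0)
                f3 = self.clock - self.last_used[h]        # time since h was last selected
                return self.phi * self.f1[h] + self.phi * f2 + delta * f3
            return max(self.heuristics, key=score)

        def update(self, h, h_prev, improvement, elapsed):
            self.clock += elapsed
            self.last_used[h] = self.clock
            self.f1[h] = improvement + 0.5 * self.f1[h]
            self.f2[(h_prev, h)] = improvement + 0.5 * self.f2.get((h_prev, h), 0.0)
            # Intensify sharply after an improving move, otherwise drift towards diversification.
            self.phi = 0.99 if improvement > 0 else max(0.01, self.phi - 0.01)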

    A comparison of crossover control mechanisms within single-point selection hyper-heuristics using HyFlex

    Get PDF
    Hyper-heuristics are search methodologies which operate at a higher level of abstraction than traditional search and optimisation techniques. Rather than operating on a search space of solutions directly, a hyper-heuristic searches a space of low-level heuristics or heuristic components. An iterative selection hyper-heuristic operates on a single solution, selecting and applying a low-level heuristic at each step before deciding whether to accept the resulting solution. Crossover low-level heuristics are often included in modern selection hyper-heuristic frameworks; however, as they require multiple solutions to operate, a strategy is required to manage the potential solutions used as input. In this paper, we investigate the use of crossover control schemes within two existing selection hyper-heuristics and observe the difference in performance when the method for managing potential solutions for crossover is modified. Firstly, we use the crossover control scheme of AdapHH, the winner of an international competition in heuristic search, in a Modified Choice Function - All Moves selection hyper-heuristic. Secondly, we replace the crossover control scheme within AdapHH with another method taken from the literature. We observe that the performance of selection hyper-heuristics using crossover low-level heuristics is not independent of the choice of strategy for managing the input solutions to these operators.
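
    A minimal sketch of the setting studied here, under illustrative assumptions: a single-point selection hyper-heuristic applies one low-level heuristic per step, and crossover heuristics draw their second input from a small memory of previously seen solutions. The memory policy below (keep the best accepted solutions) merely stands in for the crossover control schemes being compared; all names and parameters are placeholders.

    import random

    def run_hyper_heuristic(initial, evaluate, unary_heuristics, crossover_heuristics,
                            select, accept, iterations=10000, memory_size=5):
        # unary_heuristics and crossover_heuristics are lists of callables.
        current = initial
        current_cost = evaluate(current)
        memory = [(current_cost, current)]             # candidate second parents for crossover

        for _ in range(iterations):
            heuristic = select(unary_heuristics + crossover_heuristics)
            if heuristic in crossover_heuristics:
                _, partner = random.choice(memory)     # crossover needs a second solution
                candidate = heuristic(current, partner)
            else:
                candidate = heuristic(current)

            candidate_cost = evaluate(candidate)
            if accept(current_cost, candidate_cost):
                current, current_cost = candidate, candidate_cost
                # Simple control scheme: remember accepted solutions as future crossover parents.
                memory.append((candidate_cost, candidate))
                memory.sort(key=lambda pair: pair[0])
                del memory[memory_size:]
        return current, current_cost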

    Improving performance of a hyper-heuristic using a multilayer perceptron for vehicle routing

    Get PDF
    A hyper-heuristic is a heuristic optimisation method which generates or selects heuristics (move operators) based on a set of components while solving a computationally difficult problem. Apprenticeship learning arises from observing the behavior of an expert in action. In this study, we use a multilayer perceptron (MLP) as an apprenticeship learning algorithm to improve upon the performance of a state-of-the-art selection hyper-heuristic used as an expert, which was the winner of a cross-domain heuristic search challenge (CHeSC 2011). We collect data based on the relevant actions of the expert while solving selected vehicle routing problem instances from CHeSC 2011. An MLP is then trained on this data to build a selection hyper-heuristic consisting of a number of classifiers for heuristic selection, parameter control, and move acceptance. The generated selection hyper-heuristic is tested on unseen vehicle routing problem instances. The empirical results indicate the success of the MLP-based hyper-heuristic, which achieves better performance than the expert and some previously proposed algorithms.
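
    The core apprenticeship-learning step can be sketched as follows: log the expert hyper-heuristic's decisions as (feature vector, chosen heuristic) pairs while it solves training instances, then fit an MLP that imitates those decisions. The features and hyperparameters below are illustrative assumptions; the paper additionally learns separate classifiers for parameter control and move acceptance, which are not shown here.

    from sklearn.neural_network import MLPClassifier

    def train_selector(expert_trace):
        # expert_trace: list of (feature_vector, chosen_heuristic_index) pairs collected
        # while the expert solves training vehicle routing instances. Features might be,
        # for example, recent cost deltas and time since the last improvement (an assumption).
        X = [features for features, _ in expert_trace]
        y = [action for _, action in expert_trace]
        model = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000)
        model.fit(X, y)
        return model

    def select_heuristic(model, features):
        # At solving time, the learned classifier replaces the expert's selection rule.
        return int(model.predict([features])[0])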

    Human brain evolution and the "Neuroevolutionary Time-depth Principle:" Implications for the Reclassification of fear-circuitry-related traits in DSM-V and for studying resilience to warzone-related posttraumatic stress disorder.

    Get PDF
    The DSM-III, DSM-IV, DSM-IV-TR and ICD-10 have judiciously minimized discussion of etiologies to distance clinical psychiatry from Freudian psychoanalysis. With this goal mostly achieved, discussion of etiological factors should be reintroduced into the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-V). A research agenda for the DSM-V advocated the "development of a pathophysiologically based classification system". The author critically reviews the neuroevolutionary literature on stress-induced and fear circuitry disorders and related amygdala-driven, species-atypical fear behaviors of clinical severity in adult humans. Over 30 empirically testable/falsifiable predictions are presented. It is noted that in DSM-IV-TR and ICD-10, the classification of stress and fear circuitry disorders is neither mode-of-acquisition-based nor brain-evolution-based. For example, snake phobia (innate) and dog phobia (overconsolidational) are clustered together. Similarly, research on blood-injection-injury-type-specific phobia clusters two fears different in their innateness: 1) an arguably ontogenetic memory-trace-overconsolidation-based fear (hospital phobia) and 2) a hardwired (innate) fear of the sight of one's blood or a sharp object penetrating one's skin. Genetic architecture-charting of fear-circuitry-related traits has been challenging. Various non-phenotype-based architectures can serve as targets for research. In this article, the author will propose one such alternative genetic architecture. This article was inspired by the following: A) Nesse's "Smoke-Detector Principle", B) the increasing suspicion that the "smooth" rather than "lumpy" distribution of complex psychiatric phenotypes (including fear-circuitry disorders) may in some cases be accounted for by oligogenic (and not necessarily polygenic) transmission, and C) insights from the initial sequence of the chimpanzee genome and comparison with the human genome by the Chimpanzee Sequencing and Analysis Consortium published in late 2005. Neuroevolutionary insights relevant to fear circuitry symptoms that primarily emerge overconsolidationally (especially Combat-related Posttraumatic Stress Disorder) are presented. Also introduced is a human-evolution-based principle for clustering innate fear traits. The "Neuroevolutionary Time-depth Principle" of innate fears proposed in this article may be useful in the development of a neuroevolution-based taxonomic re-clustering of stress-triggered and fear-circuitry disorders in DSM-V. Four broad clusters of evolved fear circuits are proposed based on their time-depths: 1) Mesozoic (mammalian-wide) circuits hardwired by wild-type alleles driven to fixation by Mesozoic selective sweeps; 2) Cenozoic (simian-wide) circuits relevant to many specific phobias; 3) mid Paleolithic and upper Paleolithic (Homo sapiens-specific) circuits (arguably resulting mostly from mate-choice-driven stabilizing selection); 4) Neolithic circuits (arguably mostly related to stabilizing selection driven by gene-culture co-evolution). More importantly, the author presents evolutionary perspectives on warzone-related PTSD, Combat-Stress Reaction, Combat-related Stress, Operational-Stress, and other deployment-stress-induced symptoms. The Neuroevolutionary Time-depth Principle presented in this article may help explain the dissimilar stress-resilience levels following different types of acute threat to survival of oneself or one's progeny (aka DSM-III and DSM-V PTSD Criterion-A events).
PTSD rates following exposure to lethal inter-group violence (combat, warzone exposure or intentionally caused disasters such as terrorism) are usually 5-10 times higher than rates following large-scale natural disasters such as forest fires, floods, hurricanes, volcanic eruptions, and earthquakes. The author predicts that both intentionally caused large-scale bioevent disasters and natural bioevents such as SARS and avian flu pandemics will be an exception and are likely to be followed by PTSD rates approaching those that follow warzone exposure. During bioevents, amygdala-driven and locus-coeruleus-driven epidemic pseudosomatic symptoms may be an order of magnitude more common than infection-caused cytokine-driven symptoms. Implications for the Red Cross and FEMA are discussed. It is also argued that hospital phobia as well as dog phobia, bird phobia and bat phobia require re-taxonomization in DSM-V in a new "overconsolidational disorders" category anchored around PTSD. The overconsolidational spectrum category may be conceptualized as straddling the fear circuitry spectrum disorders and the affective spectrum disorders categories, and may be a category for which Pitman's secondary prevention propranolol regimen may be specifically indicated as a "morning after pill" intervention. Predictions are presented regarding obsessive-compulsive disorder (OCD) (e.g., female-pattern hoarding vs. male-pattern hoarding) and "culture-bound" acute anxiety symptoms (taijin-kyofusho, koro, shuk yang, shook yong, suo yang, rok-joo, jinjinia-bemar, karoshi, gwarosa, Voodoo death). Also discussed are insights relevant to pseudoneurological symptoms and to the forthcoming Dissociative-Conversive disorders category in DSM-V, including what the author terms fright-triggered acute pseudo-localized symptoms (i.e., pseudoparalysis, pseudocerebellar imbalance, psychogenic blindness, pseudoseizures, and epidemic sociogenic illness). Speculations based on studies of the human abnormal-spindle-like, microcephaly-associated (ASPM) gene, the microcephaly primary autosomal recessive (MCPH) gene, and the forkhead box P2 (FOXP2) gene are made and incorporated into what is termed "The pre-FOXP2 Hypothesis of Blood-Injection-Injury Phobia." Finally, the author argues for a non-reductionistic fusion of "distal (evolutionary) neurobiology" with clinical "proximal neurobiology," utilizing neurological heuristics. It is noted that the value of re-clustering fear traits based on behavioral ethology, human-phylogenomics-derived endophenotypes and on ontogenomics (gene-environment interactions) can be confirmed or disconfirmed using epidemiological or twin studies and psychiatric genomics.

    Assessing hyper-heuristic performance

    Get PDF
    Limited attention has been paid to assessing the generality performance of hyper-heuristics. The performance of hyper-heuristics has been predominantly assessed in terms of optimality, which is not ideal: the aim of hyper-heuristics is not to be competitive with state-of-the-art approaches but rather to raise the level of generality, i.e. the ability of a technique to produce good results for different problem instances or problems rather than the best results for some instances and poor results for others. Furthermore, the existing literature in this area makes it evident that different hyper-heuristics aim to achieve different levels of generality and need to be assessed as such. To cater for this, the paper first presents a new taxonomy of four different levels of generality that can be attained by a hyper-heuristic, based on a survey of the literature. The paper then proposes a performance measure to assess different types of hyper-heuristics at these four levels in terms of generality rather than optimality. Three case studies from the literature are used to demonstrate the application of the generality performance measure. The paper concludes by examining how the generality measure can be combined with measures of other performance criteria, such as optimality, to assess hyper-heuristic performance on more than one criterion.
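
    The paper's own generality measure is not reproduced here. Purely as an illustration of assessing generality rather than optimality, one could count how often a hyper-heuristic lands within a fixed tolerance of the best known result across instances drawn from several domains; the sketch below uses that assumed metric, not the measure proposed in the paper.

    def generality_score(results, best_known, tolerance=0.05):
        # results, best_known: dicts mapping (domain, instance) -> objective value,
        # with minimisation and non-negative objectives assumed. Returns the fraction
        # of instances for which the method is within `tolerance` of the best known value.
        hits = sum(1 for key, value in results.items()
                   if value <= best_known[key] * (1.0 + tolerance))
        return hits / len(results)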

    Comprehensive Taxonomies of Nature- and Bio-inspired Optimization: Inspiration versus Algorithmic Behavior, Critical Analysis and Recommendations

    Full text link
    In recent years, a great variety of nature- and bio-inspired algorithms has been reported in the literature. This algorithmic family simulates different biological processes observed in Nature in order to efficiently address complex optimization problems. The number of bio-inspired optimization approaches in the literature has grown considerably in the last few years, reaching unprecedented levels that cloud the future prospects of this field of research. This paper addresses this problem by proposing two comprehensive, principle-based taxonomies that allow researchers to organize existing and future algorithmic developments into well-defined categories, considering two different criteria: the source of inspiration and the behavior of each algorithm. Using these taxonomies, we review more than three hundred publications dealing with nature-inspired and bio-inspired algorithms, and proposals falling within each of these categories are examined, leading to a critical summary of design trends and similarities between them, and to the identification of the most similar classical algorithm for each reviewed paper. From our analysis we conclude that a poor relationship is often found between the natural inspiration of an algorithm and its behavior. Furthermore, similarities in terms of behavior between different algorithms are greater than what is claimed in their public disclosure: specifically, we show that more than one-third of the reviewed bio-inspired solvers are versions of classical algorithms. Grounded in the conclusions of our critical analysis, we give several recommendations and points of improvement for better methodological practices in this active and growing research field. (76 pages, 6 figures)

    Learning Counterfactually Invariant Predictors

    Full text link
    Notions of counterfactual invariance (CI) have proven essential for predictors that are fair, robust, and generalizable in the real world. We propose graphical criteria that yield a sufficient condition, expressed as a conditional independence in the observational distribution, for a predictor to be counterfactually invariant. In order to learn such predictors, we propose a model-agnostic framework, called Counterfactually Invariant Prediction (CIP), building on the Hilbert-Schmidt Conditional Independence Criterion (HSCIC), a kernel-based conditional dependence measure. Our experimental results demonstrate the effectiveness of CIP in enforcing counterfactual invariance across various simulated and real-world datasets, including scalar and multivariate settings.
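
    The kernel-based penalty at the heart of CIP can be sketched in simplified form. The code below computes a biased empirical estimate of the unconditional HSIC with Gaussian kernels; the paper's regulariser is the conditional HSCIC, which additionally involves kernel ridge regressions and is not reproduced here, so this is an assumption-laden stand-in rather than the paper's estimator.

    import numpy as np

    def gaussian_kernel(x, bandwidth=1.0):
        # x: array of shape (n, d); returns the (n, n) Gaussian kernel matrix.
        sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
        return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

    def hsic(x, y, bandwidth=1.0):
        # Biased empirical HSIC estimate: trace(K H L H) / (n - 1)^2.
        n = x.shape[0]
        K = gaussian_kernel(x, bandwidth)
        L = gaussian_kernel(y, bandwidth)
        H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
        return np.trace(K @ H @ L @ H) / (n - 1) ** 2

    # A training objective could then take the form
    #     task_loss + lam * dependence_penalty(predictions, z)
    # where z collects the variables the predictions should be (conditionally) independent of.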

    Design of vehicle routing problem domains for a hyper-heuristic framework

    Get PDF
    The branch of algorithms that uses adaptive methods to select or tune heuristics, known as hyper-heuristics, has seen a large amount of interest and development in recent years. With the aim of developing techniques that can deliver results on multiple problem domains and multiple instances, this work is getting ever closer to mirroring the complex situations that arise in the corporate world. However, the capability of a hyper-heuristic is closely tied to the representation of the problem it is trying to solve and the tools that are available to do so. This thesis considers the design of such problem domains for hyper-heuristics. In particular, this work proposes that improved results can be achieved through the provision of high-quality data and tools to a hyper-heuristic. A definition is given which describes the components of a problem domain for hyper-heuristics. Building on this definition, a domain for the Vehicle Routing Problem with Time Windows is presented. Through this domain, examples are given of how a hyper-heuristic can be provided with extra information with which to make intelligent search decisions. One of these pieces of information is a measure of distance between solutions which, when used to aid the selection of mutation heuristics, is shown to improve the results of an Iterative Local Search hyper-heuristic. A further example of the advantages of providing extra information is given in the form of a set of tools for the Vehicle Routing Problem domain to promote and measure 'fairness' between routes. By offering these extra features at the domain level, it is shown how a hyper-heuristic can drive toward a fairer solution while maintaining a high level of performance.
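
    The two kinds of extra domain-level information mentioned above can be sketched as follows, with both definitions chosen for illustration rather than taken from the thesis: a distance between two solutions measured as the fraction of customer-to-customer edges they do not share, and a 'fairness' measure given by the spread of route lengths relative to their mean.

    def solution_distance(solution_a, solution_b):
        # Each solution is a list of routes; each route is an ordered list of customer ids.
        # Distance = fraction of directed customer-to-customer edges not shared (assumed metric).
        def edges(solution):
            return {(route[i], route[i + 1]) for route in solution for i in range(len(route) - 1)}
        ea, eb = edges(solution_a), edges(solution_b)
        union = ea | eb
        return 1.0 - len(ea & eb) / len(union) if union else 0.0

    def route_fairness(route_lengths):
        # Smaller is fairer: spread between the longest and shortest route,
        # normalised by the mean route length (assumed definition of 'fairness').
        mean_length = sum(route_lengths) / len(route_lengths)
        return (max(route_lengths) - min(route_lengths)) / mean_length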