    The role of Walsh structure and ordinal linkage in the optimisation of pseudo-Boolean functions under monotonicity invariance.

    Optimisation heuristics rely on implicit or explicit assumptions about the structure of the black-box fitness function they optimise. A review of the literature shows that an understanding of structure and linkage is helpful to the design and analysis of heuristics. The aim of this thesis is to investigate the role that problem structure plays in heuristic optimisation. Many heuristics use ordinal operators, which are invariant under monotonic transformations of the fitness function. In this thesis we develop a classification of pseudo-Boolean functions based on rank-invariance. This approach classifies functions which are monotonic transformations of one another as equivalent, and so partitions an infinite set of functions into a finite set of classes. Reasoning about heuristics composed of ordinal operators is, by construction, invariant over these classes. We perform a complete analysis of 2-bit and 3-bit pseudo-Boolean functions. We use Walsh analysis to define concepts of necessary, unnecessary, and conditionally necessary interactions, and of Walsh families. This makes precise some existing ideas in the literature, such as benign interactions. Many algorithms are invariant under the classes we define, which allows us to examine the difficulty of pseudo-Boolean functions in terms of function classes. We analyse a range of ordinal selection operators for an EDA. Using a concept of directed ordinal linkage, we define precedence networks and precedence profiles to represent key algorithmic steps and their interdependency in terms of problem structure. The precedence profiles provide a measure of problem difficulty that corresponds to the algorithmic steps required for optimisation. This work develops insight into the relationship between function structure and problem difficulty for optimisation, which may be used to direct the development of novel algorithms. Concepts of structure are also used to construct problems that are easy and hard for a hill-climber.
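
    As a minimal illustration of the rank-invariance idea (a sketch, not code from the thesis), the snippet below computes a rank signature for a pseudo-Boolean function over all 2^n bit strings; two functions related by a monotonic transformation share the same signature and so fall into the same class:

        from itertools import product

        def rank_signature(f, n):
            """Dense ranks of f over all 2^n bit strings; invariant
            under monotonic transformations of f."""
            points = list(product((0, 1), repeat=n))
            values = [f(x) for x in points]
            rank = {v: r for r, v in enumerate(sorted(set(values)))}
            return tuple(rank[v] for v in values)

        # Two 2-bit functions related by the monotonic map t -> t**3
        f = lambda x: x[0] + 2 * x[1]
        g = lambda x: (x[0] + 2 * x[1]) ** 3
        assert rank_signature(f, 2) == rank_signature(g, 2)  # same class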

    Non-deterministic solvers and explainable AI through trajectory mining.

    Traditional methods of creating explanations from complex AI systems have produced a wide variety of tools that generate explanations of algorithm and network designs. These have, however, traditionally been aimed at systems that mimic the structure of human thought, such as neural networks. The growing adoption of AI systems in industry has led to research and roundtables on extracting explanations from other systems, such as non-deterministic algorithms. This family of algorithms can be analysed, but the explanations of their behaviour are often difficult for non-experts to understand. We describe a potential path to the generation of explanations that would not require expert-level knowledge to be correctly understood.

    Towards explainable metaheuristics: PCA for trajectory mining in evolutionary algorithms.

    The generation of explanations for decisions made by population-based metaheuristics is often a difficult task due to the nature of the mechanisms these approaches employ. With the increased use of these methods for optimisation in industries that require end-user confirmation, the need for explanations has also grown. We present a novel approach to extracting features capable of supporting an explanation through trajectory mining: extracting key features from the populations of non-deterministic algorithms (NDAs). We apply Principal Component Analysis (PCA) to identify new methods of tracking population diversity post-runtime, after projection into a lower-dimensional space. These methods are applied to a set of benchmark problems solved by a Genetic Algorithm and a univariate Estimation of Distribution Algorithm. We show that the new sub-space metrics can capture key learning steps in the algorithm run, and that solution-variable patterns that explain the fitness function may be captured in the principal component coefficients.
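
    As a rough sketch of the trajectory-mining step (the populations here are synthetic stand-ins for a logged GA run, and the diversity metric is one plausible choice), the snippet below fits PCA to the stacked per-generation populations and tracks spread in the resulting sub-space; pca.components_ holds the variable coefficients in which solution patterns may show up:

        import numpy as np
        from sklearn.decomposition import PCA

        # Hypothetical logged run: one (pop_size, n_vars) binary array per generation
        rng = np.random.default_rng(0)
        populations = [rng.integers(0, 2, size=(50, 20)) for _ in range(30)]

        # Fit PCA on the whole trajectory, then measure per-generation spread
        pca = PCA(n_components=2).fit(np.vstack(populations))
        for gen, pop in enumerate(populations):
            diversity = pca.transform(pop).var(axis=0).sum()
            print(f"generation {gen}: sub-space diversity {diversity:.3f}")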

    Partial structure learning by subset Walsh transform.

    Estimation of distribution algorithms (EDAs) use structure learning to build a statistical model of the good solutions discovered so far, in an effort to discover better solutions. The non-zero coefficients of the Walsh transform produce a hypergraph representation of the structure of a binary fitness function; however, computing all Walsh coefficients requires exhaustive evaluation of the search space. In this paper, we propose a stochastic method for determining the Walsh coefficients of hyperedges contained within a selected subset of the variables (complete local structure). The method also detects parts of hyperedges which cut the boundary of the selected variable set (partial structure), which may be used to incrementally build an approximation of the problem hypergraph.
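
    For context (this is the exhaustive baseline the paper avoids, not its stochastic subset method), a plain Walsh transform over n variables makes the hypergraph explicit: each non-zero coefficient with a non-empty mask names a hyperedge of interacting variables:

        from itertools import product

        def walsh_coefficients(f, n):
            """Exhaustive Walsh transform: exponential in n, which is
            exactly the cost the subset method is designed to avoid."""
            points = list(product((0, 1), repeat=n))
            return {
                mask: sum(f(x) * (-1) ** sum(xi * mi for xi, mi in zip(x, mask))
                          for x in points) / 2 ** n
                for mask in points  # the bit tuples double as variable subsets
            }

        # f interacts only on {x0, x1}; x2 contributes linearly
        f = lambda x: x[0] * x[1] + x[2]
        edges = {m for m, c in walsh_coefficients(f, 3).items()
                 if abs(c) > 1e-12 and any(m)}
        print(edges)  # singletons plus the pairwise hyperedge (1, 1, 0)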

    Investigating benchmark correlations when comparing algorithms with parameter tuning: detailed experiments and results.

    Benchmarks are important to demonstrate the utility of optimisation algorithms, but there is controversy about the practice of benchmarking: we could select instances that present our algorithm favourably, and dismiss those on which it underperforms. Several papers highlight the pitfalls of benchmarking, some in the context of the automated design of algorithms, where a set of problem instances (benchmarks) is used to train the algorithm. As in machine learning, if the training set does not reflect the test set, the algorithm will not generalise. This raises open questions concerning the use of test instances to automatically design algorithms. We use differential evolution and sweep its parameter settings to investigate the practice of benchmarking on the BBOB benchmarks. We make three key findings. Firstly, several benchmark functions are highly correlated. This may lead to the false conclusion that an algorithm performs well in general when it performs poorly on a few key instances, possibly introducing unwanted bias into a resulting automatically designed algorithm. Secondly, the number of evaluations can have a large effect on the conclusions drawn. Finally, a systematic sweep of the parameters shows how performance varies with time across the space of algorithm configurations. The datasets, including all computed features, the evolved policies and their performances, and the visualisations for all feature sets, are available from the University of Stirling Data Repository (http://hdl.handle.net/11667/109).
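
    The correlation analysis is straightforward to reproduce in outline. The sketch below uses a synthetic performance table standing in for the repository's real data (rows are DE configurations from a parameter sweep, columns are benchmark functions) and computes pairwise Spearman rank correlations between functions:

        import numpy as np
        from scipy.stats import spearmanr

        # Synthetic stand-in: best fitness per (configuration, function) pair
        rng = np.random.default_rng(1)
        perf = rng.random((100, 24))                      # 24 functions, as in BBOB
        perf[:, 1] = perf[:, 0] + 0.05 * rng.random(100)  # force f1 and f2 to correlate

        rho, _ = spearmanr(perf)                  # columns treated as variables
        print(f"corr(f1, f2) = {rho[0, 1]:.2f}")  # a highly correlated pair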

    Investigating benchmark correlations when comparing algorithms with parameter tuning.

    Benchmarks are important for comparing the performance of optimisation algorithms, but we can select instances that present our algorithm favourably and dismiss those on which it under-performs. Also relevant is the automated design of algorithms, which uses problem instances (benchmarks) to train an algorithm: careful choice of instances is needed for the algorithm to generalise. We sweep the parameter settings of differential evolution applied to the BBOB benchmarks. Several benchmark functions are highly correlated, which may lead to the false conclusion that an algorithm performs well in general when it performs poorly on a few key instances. These correlations vary with the number of evaluations.

    Optimising the introduction of connected and autonomous vehicles in a public transport system using macro-level mobility simulations and evolutionary algorithms.

    The past five years have seen rapid development of plans and test pilots aimed at introducing connected and autonomous vehicles (CAVs) in public transport systems around the world. Using a real-world scenario from the Leeds Metropolitan Area as a case study, we demonstrate an effective way to combine macro-level mobility simulations based on open data (i.e., geographic information system information and transit timetables) with evolutionary optimisation techniques to discover realistic optimised integration routes for CAVs. The macro-level mobility simulations are used to assess the quality (i.e., fitness) of a potential CAV route by quantifying geographic accessibility improvements, using an extended version of Dijkstra's algorithm on an abstract multi-modal transport network.
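
    To make the accessibility computation concrete (a toy sketch, not the papers' extended algorithm), plain Dijkstra on a small multi-modal graph is shown below; the mode labels mark where an extended version would add transfer penalties or timetable waits:

        import heapq

        # Toy multi-modal network: edges are (neighbour, minutes, mode)
        graph = {
            "home":   [("stop_a", 5, "walk")],
            "stop_a": [("stop_b", 12, "bus"), ("stop_c", 7, "cav")],
            "stop_b": [("work", 4, "walk")],
            "stop_c": [("work", 6, "walk")],
            "work":   [],
        }

        def earliest_arrivals(source):
            """Plain Dijkstra: shortest travel time from source to every node."""
            dist, heap = {source: 0}, [(0, source)]
            while heap:
                d, u = heapq.heappop(heap)
                if d > dist.get(u, float("inf")):
                    continue
                for v, w, _mode in graph[u]:
                    if d + w < dist.get(v, float("inf")):
                        dist[v] = d + w
                        heapq.heappush(heap, (d + w, v))
            return dist

        # Accessibility gains from a candidate CAV route would compare these times
        print(earliest_arrivals("home"))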

    Exploring representations for optimising connected autonomous vehicle routes in multi-modal transport networks using evolutionary algorithms.

    The past five years have seen rapid development of plans and test pilots aimed at introducing connected and autonomous vehicles (CAVs) in public transport systems around the world. While self-driving technology is still being perfected, public transport authorities are increasingly interested in the ability to model and optimise the benefits of adding CAVs to existing multi-modal transport systems. Using a real-world scenario from the Leeds Metropolitan Area as a case study, we demonstrate an effective way of combining macro-level mobility simulations based on open data with global optimisation techniques to discover realistic optimal deployment strategies for CAVs. The macro-level mobility simulations are used to assess the quality of a potential multi-route CAV service by quantifying geographic accessibility improvements using an extended version of Dijkstra's algorithm on an abstract multi-modal transport network. The optimisations were carried out using several popular population-based optimisation algorithms combined with several routing strategies aimed at constructing the best routes by ordering stops in a realistic sequence.
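
    One plausible routing strategy of the kind compared here (a hypothetical sketch; the stop names and coordinates are invented) is nearest-neighbour ordering, which turns a selected set of stops into a realistic visiting sequence:

        import math

        # Hypothetical stop coordinates for a candidate CAV service
        stops = {"A": (0, 0), "B": (4, 1), "C": (1, 3), "D": (5, 4)}

        def nearest_neighbour_route(selected, start):
            """Order stops by repeatedly visiting the closest unvisited one."""
            route, remaining = [start], set(selected) - {start}
            while remaining:
                here = stops[route[-1]]
                nxt = min(remaining, key=lambda s: math.dist(here, stops[s]))
                route.append(nxt)
                remaining.remove(nxt)
            return route

        print(nearest_neighbour_route({"A", "B", "C", "D"}, "A"))  # ['A', 'C', 'B', 'D']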

    Sustained running in rats administered corticosterone prevents the development of depressive behaviors and enhances hippocampal neurogenesis and synaptic plasticity without increasing neurotrophic factor levels

    We have previously shown that voluntary running acts as an anxiolytic and ameliorates deficits in hippocampal neurogenesis and spatial learning. It also reduces depression-like behaviors that are normally observed in rats administered either low (30 mg/kg) or moderate (40 mg/kg) doses of corticosterone (CORT). However, the protective effects of running were absent in rats treated with a high (50 mg/kg) dose of CORT. We examined whether allowing animals to exercise for 2 weeks prior to and/or concurrently with the administration of 50 mg/kg CORT could have similar protective effects. We examined hippocampal neurogenesis using immunohistochemical staining of proliferating and surviving cells with the thymidine analogs BrdU, CldU, and IdU. In addition, we monitored synaptic protein expression and quantified the levels of neurotrophic factors in these animals, as well as performing behavioral analyses (forced swim test and sucrose preference test). Our results indicate that the depressive phenotype and reductions in neurogenesis that normally accompany high CORT administration could only be prevented by allowing animals to exercise both prior to and concurrently with the CORT administration period. These animals also showed increases in both synaptophysin and PSD-95 protein levels, but surprisingly, neither brain-derived neurotrophic factor (BDNF) nor insulin-like growth factor 1 (IGF-1) levels were increased. The results suggest that persistent exercise can strengthen resilience to stress by promoting hippocampal neurogenesis and increasing synaptic protein levels, thereby reducing the deleterious effects of stress.