An investigation of genetic operators for continuous parameter space
The success of a genetic optimization algorithm in continuous parameter space depends on the recombination (crossover) operators that it uses. In this paper we consider a wide spectrum of such operators within a unified framework and study their relative importance in the search process. We consider four basic types of recombination operators which cover the relevant exploration potential of a continuous space: Interpolation, Extrapolation, Exchange and Mutation. Each of these basic types may have several variants. We characterize the various operators and their variants by their spatial sampling properties and examine their contributions to the search by applying different mixtures of the operators to several benchmark problems. The results suggest that the optimal mixture of operators may be problem-dependent, but that, in general, all basic types are needed for efficient optimization.
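The four basic operator types can be sketched as follows. This is an illustrative rendering only; the names, sampling distributions and parameters (e.g. the extrapolation scale and mutation sigma) are assumptions, not the paper's exact definitions.

```python
import random

def interpolate(a, b):
    """Sample a child on the line segment between two parents (contracts the search)."""
    t = random.random()
    return [t * x + (1 - t) * y for x, y in zip(a, b)]

def extrapolate(a, b, scale=0.5):
    """Step beyond parent a, away from parent b (expands the search)."""
    return [x + scale * (x - y) for x, y in zip(a, b)]

def exchange(a, b):
    """Swap coordinates between parents (classic discrete crossover)."""
    return [x if random.random() < 0.5 else y for x, y in zip(a, b)]

def mutate(a, sigma=0.1):
    """Perturb each coordinate with Gaussian noise."""
    return [x + random.gauss(0, sigma) for x in a]
```

Interpolation and exchange keep offspring inside the region spanned by the parents, while extrapolation and mutation can leave it, which is one way to see why a mixture of all four is needed.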
Structure Discovery in Mixed Order Hyper Networks
Background: Mixed Order Hyper Networks (MOHNs) are a type of neural network in which the interactions between inputs are modelled explicitly by weights that can connect any number of neurons. Such networks have a human readability that networks with hidden units lack. They can be used for regression, classification or as content addressable memories, and have been shown to be useful as fitness function models in constraint satisfaction tasks. They are fast to train and, when their structure is fixed, do not suffer from local minima in the cost function during training. However, their main drawback is that the correct structure (which neurons to connect with weights) must be discovered from data, and an exhaustive search is not possible for networks with more than around 30 inputs. Results: This paper presents an algorithm designed to discover a set of weights that satisfy the joint constraints of low training error and a parsimonious model. The combined structure discovery and weight learning process was found to be faster, more accurate and to have less variance than training an MLP. Conclusions: There are a number of advantages to using higher order weights rather than hidden units in a neural network, but discovering the correct structure for those weights can be challenging. With the method proposed in this paper, the use of high order networks becomes tractable.
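The key idea of a weight connecting "any number of neurons" can be sketched as evaluating a sum over weighted products of the connected inputs. This is a minimal illustration of that output function under assumed conventions (weights keyed by neuron-index tuples, an empty tuple for the bias); it is not the paper's implementation or its structure discovery algorithm.

```python
def mohn_output(x, weights):
    """Evaluate a Mixed Order Hyper Network output.

    x       -- list of input values (e.g. +/-1)
    weights -- dict mapping a tuple of neuron indices to a weight value;
               each weight multiplies the product of the inputs it connects.
    """
    total = 0.0
    for indices, w in weights.items():
        prod = 1.0
        for i in indices:  # empty tuple -> product is 1, i.e. a bias term
            prod *= x[i]
        total += w * prod
    return total
```

For example, with weights {(): 0.5, (0,): 1.0, (0, 1): -2.0} and inputs [1, -1], the output is 0.5 + 1.0 + 2.0 = 3.5. The readability claim in the abstract follows from this form: each weight is directly interpretable as the strength of one interaction between a named set of inputs.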
An analysis of the local optima storage capacity of Hopfield network based fitness function models
A Hopfield Neural Network (HNN) with a new weight update rule can be treated as a second order Estimation of Distribution Algorithm (EDA) or Fitness Function Model (FFM) for solving optimisation problems. The HNN models promising solutions and has a capacity for storing a certain number of local optima as low energy attractors. Solutions are generated by sampling the patterns stored in the attractors. The number of attractors a network can store (its capacity) has an impact on solution diversity and, consequently, on solution quality. This paper introduces two new HNN learning rules and presents the Hopfield EDA (HEDA), which learns weight values from samples of the fitness function. It investigates the attractor storage capacity of the HEDA and shows it to be equal to that known in the literature for a standard HNN. The relationship between HEDA capacity and linkage order is also investigated.
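The attractor mechanism the abstract refers to can be seen in a standard HNN: patterns are stored with the classic Hebbian rule, and sampling a stored attractor amounts to settling a state via asynchronous updates. This is a sketch of the standard network only, not the paper's new learning rules or the HEDA itself; the function names and the fixed random seed are assumptions for illustration.

```python
import numpy as np

def hebbian_weights(patterns):
    """Standard Hebbian rule: W = (1/N) * sum of outer products, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def settle(W, x, steps=100):
    """Asynchronous threshold updates that descend the energy toward an attractor."""
    x = x.copy()
    rng = np.random.default_rng(0)  # fixed seed for reproducibility
    for _ in range(steps):
        i = rng.integers(len(x))
        x[i] = 1 if W[i] @ x >= 0 else -1
    return x
```

Starting from a corrupted copy of a stored pattern, the dynamics fall into the nearest attractor and recover the pattern; the capacity question studied in the paper is how many such attractors a network of a given size can hold.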
Multimodal Optimisation with Structured Populations and Local Environments
Spatially-structured evolutionary algorithms are frequently implemented using a homogeneous environment throughout space. Such a configuration does not promote local adaptation of individuals in space. This paper introduces an evolutionary algorithm using space and localised environments to promote speciation. Surprisingly, a randomly generated 'rugged' landscape appears to best support speciation by encouraging crossover between niches, while maintaining locally distinct species.
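The idea of space combined with localised environments can be sketched as a grid in which each cell carries its own environment and selection acts locally against that environment. Everything here is illustrative: the fitness function, the neighbourhood scheme and the mutation scale are assumptions, not the paper's algorithm.

```python
import random

def local_fitness(individual, env):
    """Illustrative local fitness: negative squared distance to the
    cell's own environmental target vector."""
    return -sum((g - t) ** 2 for g, t in zip(individual, env))

def step(grid, envs, size):
    """One generation: each cell competes with a random neighbour, judged
    under the cell's OWN environment; the winner stays, with mutation."""
    new = {}
    for (r, c), ind in grid.items():
        nr = (r + random.choice([-1, 0, 1])) % size
        nc = (c + random.choice([-1, 0, 1])) % size
        rival = grid[(nr, nc)]
        env = envs[(r, c)]
        best = ind if local_fitness(ind, env) >= local_fitness(rival, env) else rival
        new[(r, c)] = [g + random.gauss(0, 0.05) for g in best]
    return new
```

Because each cell evaluates candidates against its own environment, distinct species can persist in different regions even though neighbouring cells exchange genetic material, which is the speciation mechanism the abstract describes.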