    Deconstructing the Big Valley Search Space Hypothesis

    The big valley hypothesis suggests that, in combinatorial optimisation, local optima of good quality are clustered and surround the global optimum. We show here that the idea of a single valley does not always hold. Instead, the big valley seems to deconstruct into several valleys, also called ‘funnels’ in theoretical chemistry. We use the local optima networks model and propose an effective procedure for extracting the network data. We conduct a detailed study on four selected TSP instances of moderate size and observe that the big valley decomposes into a number of sub-valleys of different sizes and fitness distributions. Sometimes the global optimum is located in the largest valley, which suggests an easy-to-search landscape, but this is not generally the case. The global optimum might be located in a small valley, which offers a clear and visual explanation of the increased search difficulty in these cases. Our study opens up new possibilities for analysing and visualising combinatorial landscapes as complex networks.
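    A local optima network represents local optima as nodes and search transitions as weighted edges. As a rough illustration only, the sketch below builds such a network by repeated perturbation and re-optimisation; local_search, perturb, and fitness are hypothetical problem-specific helpers (for TSP they might be 2-opt, a random kick, and tour length), and this is not the extraction procedure proposed in the paper.

        # Rough sketch only: build a local optima network by repeatedly perturbing
        # and re-optimising. Solutions must be hashable (e.g. tuples of city indices).
        # `local_search`, `perturb`, and `fitness` are hypothetical helpers.
        import networkx as nx

        def sample_lon(start_solutions, local_search, perturb, fitness, n_steps=100):
            lon = nx.DiGraph()
            for s in start_solutions:
                current = local_search(s)                       # map start point to a local optimum
                lon.add_node(current, fitness=fitness(current))
                for _ in range(n_steps):
                    candidate = local_search(perturb(current))  # escape attempt + re-optimisation
                    lon.add_node(candidate, fitness=fitness(candidate))
                    w = lon.get_edge_data(current, candidate, {}).get("weight", 0)
                    lon.add_edge(current, candidate, weight=w + 1)  # count the transition
                    if fitness(candidate) <= fitness(current):      # accept non-worsening escapes
                        current = candidate
            return lon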

    The Multi-Funnel Structure of TSP Fitness Landscapes: A Visual Exploration

    We use the Local Optima Network model to study the structure of symmetric TSP fitness landscapes. The ‘big-valley’ hypothesis holds that, for TSP and other combinatorial problems, local optima are not randomly distributed; instead, they tend to be clustered around the global optimum. However, a recent study has observed that, for solutions close in evaluation to the global optimum, this structure breaks down into multiple valleys, forming what has been called ‘multiple funnels’. The multiple funnel concept implies that local optima are organised into clusters, so that a particular local optimum largely belongs to a particular funnel. Our study is the first to extract and visualise local optima networks for TSP and is based on a sampling methodology relying on the Chained Lin-Kernighan algorithm. We confirm the existence of multiple funnels on two selected TSP instances, finding additional funnels in a previously studied instance. Our results suggest that transitions among funnels are possible using operators such as ‘double-bridge’. However, for consistently escaping sub-optimal funnels, more robust escape mechanisms are required.
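    The ‘double-bridge’ move referred to above is the standard kick used in Chained Lin-Kernighan: the tour is cut into four segments and reconnected in a different order, a change that 2-opt/3-opt style local search cannot easily undo. A minimal sketch, assuming a tour of at least four cities represented as a list of city indices:

        import random

        def double_bridge(tour):
            # Cut the tour into four segments A|B|C|D and reconnect them as A+C+B+D.
            # Requires at least four cities so that every segment is non-empty.
            n = len(tour)
            i, j, k = sorted(random.sample(range(1, n), 3))
            a, b, c, d = tour[:i], tour[i:j], tour[j:k], tour[k:]
            return a + c + b + d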

    Understanding Phase Transitions with Local Optima Networks: Number Partitioning as a Case Study

    Phase transitions play an important role in understanding search difficulty in combinatorial optimisation. However, previous attempts have not revealed a clear link between fitness landscape properties and the phase transition. We explore whether the global landscape structure of the number partitioning problem changes with the phase transition. Using the local optima network model, we analyse a number of instances before, during, and after the phase transition. We compute relevant network and neutrality metrics and, importantly, identify and visualise the funnel structure with an approach (monotonic sequences) inspired by theoretical chemistry. While most metrics remain oblivious to the phase transition, our results reveal that the funnel structure clearly changes. Easy instances feature a single funnel or a small number of dominant funnels leading to global optima; hard instances have a large number of suboptimal funnels attracting the search. Our study brings new insights and tools to the study of phase transitions in combinatorial optimisation.
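    In the energy landscape literature, a monotonic sequence is a path through local optima whose fitness never worsens, and the optimum it terminates at marks the funnel bottom. A simplified sketch of this idea on a local optima network (assuming a networkx DiGraph whose nodes carry a 'fitness' attribute, minimisation), not the exact procedure used in the paper:

        import networkx as nx

        def funnel_of(lon, node):
            # Follow strictly improving edges until no better successor exists;
            # the terminating optimum is taken as the funnel bottom for `node`.
            current = node
            while True:
                better = [v for v in lon.successors(current)
                          if lon.nodes[v]["fitness"] < lon.nodes[current]["fitness"]]
                if not better:
                    return current
                current = min(better, key=lambda v: lon.nodes[v]["fitness"])

        def funnel_membership(lon):
            # Assign every local optimum to the funnel bottom it drains into.
            return {n: funnel_of(lon, n) for n in lon.nodes}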

    Coarse-Grained Barrier Trees of Fitness Landscapes

    Recent literature suggests that local optima in fitness landscapes are clustered, which offers an explanation of why perturbation-based metaheuristics often fail to find the global optimum: they become trapped in a sub-optimal cluster. We introduce a method to extract and visualise the global organisation of these clusters in the form of a barrier tree. Barrier trees have been used to visualise the barriers between local optima basins in fitness landscapes. Our method computes a more coarsely grained tree to reveal the barriers between clusters of local optima. The core element is a new variant of the flooding algorithm, applicable to local optima networks, a compressed representation of fitness landscapes. To identify the clusters, we apply a community detection algorithm. A sample of 200 NK fitness landscapes suggests that the depth of their coarse-grained barrier tree is related to their search difficulty.
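    The clustering step can be illustrated with a generic modularity-based community detection pass over the network. The sketch below uses networkx's greedy modularity method as a stand-in; the specific community detection algorithm and the coarse-grained flooding variant introduced in the paper are not reproduced here.

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        def optima_clusters(lon):
            # Collapse the LON to an undirected graph and detect communities of
            # local optima; each node is mapped to the index of its community.
            undirected = nx.Graph(lon)
            communities = greedy_modularity_communities(undirected)
            return {node: idx for idx, comm in enumerate(communities) for node in comm}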

    Inferring Future Landscapes: Sampling the Local Optima Level

    Connection patterns among Local Optima Networks (LONs) can inform heuristic design for optimisation. LON research has predominantly required complete enumeration of a fitness landscape, restricting analysis to problems far smaller than real-life instances. LON sampling algorithms are therefore important. In this paper, we study LON construction algorithms for the Quadratic Assignment Problem (QAP). Using machine learning, we predict search performance for competitive heuristics in the QAP domain from estimated LON features. The results show that, with random forest regression, the LON construction algorithms produce fitness landscape features that can explain almost all search variance. We find that LON samples relate better to search than enumerated LONs do, and the importance of the fitness levels of sampled LONs in search prediction becomes clear. Features from LONs produced by different algorithms are combined in predictions for the first time, with promising results for this ‘super-sampling’: a model to predict tabu search success explained 99% of the variance. Arguments are made for the use case of each LON algorithm and for combining the exploitative process of one with the exploratory optimisation of the other.
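    As a rough illustration of the prediction step, a random forest can be regressed on per-instance LON features to explain a search-performance measure. The feature set and data below are placeholders, not the paper's:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        # Placeholder data: one row of sampled-LON features per instance
        # (e.g. number of sinks, mean out-degree, sink fitness), and one
        # search-performance value per instance.
        X = np.random.rand(50, 5)
        y = np.random.rand(50)

        model = RandomForestRegressor(n_estimators=500, random_state=0)
        scores = cross_val_score(model, X, y, cv=5, scoring="r2")
        print("mean R^2 across folds:", scores.mean())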

    Comparing Communities of Optima with Funnels in Combinatorial Fitness Landscapes

    The existence of sub-optimal funnels in combinatorial fitness landscapes has been linked to search difficulty. The exact nature of these structures, and how commonly they appear, is not yet fully understood. Improving our understanding of funnels could help with designing effective diversification mechanisms that have a ‘smoothing’ effect, making optimisation easier. We model fitness landscapes as local optima networks and explore the relationship between communities of local optima found by network clustering algorithms and funnels. Funnels are identified using the notion of monotonic sequences from the study of energy landscapes in theoretical chemistry. NK landscapes and the Quadratic Assignment Problem are used as case studies. Our results show that communities are linked to funnels. The analysis also reveals relationships between these landscape structures and the performance of trajectory-based metaheuristics such as Simulated Annealing (SA) and Iterated Local Search (ILS). In particular, ILS gets trapped in funnels, and modular communities of optima slow it down; the funnels also contribute to lower success for SA. We show that increasing the strength of the ILS perturbation helps to ‘smooth’ the funnels and improves performance in multi-funnel landscapes.
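    The perturbation-strength result above can be illustrated with a generic iterated local search loop in which the kick is applied several times before re-optimising; stronger kicks make it easier to leave a sub-optimal funnel. local_search, perturb, and fitness are hypothetical problem-specific helpers, and this is not the exact ILS configuration studied in the paper.

        def iterated_local_search(start, local_search, perturb, fitness,
                                  strength=1, n_iterations=1000):
            # `strength` controls how many times the kick is applied before
            # re-optimising; larger values make escaping a funnel more likely.
            best = current = local_search(start)
            for _ in range(n_iterations):
                candidate = current
                for _ in range(strength):
                    candidate = perturb(candidate)
                candidate = local_search(candidate)
                if fitness(candidate) <= fitness(current):   # accept non-worsening moves
                    current = candidate
                if fitness(current) < fitness(best):
                    best = current
            return best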

    The Effect of Landscape Funnels in QAPLIB Instances

    The effectiveness of common metaheuristics on combinatorial optimisation problems can be limited by certain characteristics of the fitness landscape. We use the local optima network model to compress the ‘inherent structure’ of a problem space into a network whose structure relates to the empirical hardness of the underlying landscape. Monotonic sequences are computed on the local optima networks of a benchmark set of QAP instances (QAPLIB) to expose landscape funnels. The results suggest links between features of these structures and lowered metaheuristic performance.
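    For reference, the fitness underlying these QAPLIB landscapes is the standard quadratic assignment cost: a permutation assigns facilities to locations, and the cost sums flow times distance over all facility pairs. An illustrative evaluation, assuming numpy flow and distance matrices (not necessarily how the paper loads QAPLIB data):

        import numpy as np

        def qap_cost(perm, flow, dist):
            # Standard QAP objective: facility i is placed at location perm[i];
            # cost is the sum of flow[i, j] * dist[perm[i], perm[j]] over all pairs.
            n = len(perm)
            return sum(flow[i, j] * dist[perm[i], perm[j]]
                       for i in range(n) for j in range(n))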

    Combining Exploration and Exploitation in Active Learning

    This thesis investigates active learning in the presence of model bias. State-of-the-art approaches advocate combining exploration and exploitation in active learning; however, they suffer from premature exploitation or unnecessary exploration in the later stages of learning. We propose to combine exploration and exploitation in active learning by discarding instances outside a sampling window centred on the estimated decision boundary and drawing samples uniformly from within this window. Initially the window spans the entire feature space, and it is then gradually constricted, with the rate of constriction modelling the exploration-exploitation tradeoff. The desired effect of this approach (CExp) is an increasing sampling density in informative regions as active learning progresses, resulting in a continuous and natural transition from exploration to exploitation that limits both premature exploitation and unnecessary exploration. We show that our approach outperforms the state of the art on real-world multiclass datasets. We also extend our approach to spatial mapping problems, where the standard active learning assumption of uniform costs is violated. We show that we can take advantage of ‘spatial continuity’ in the data by geographically partitioning the instances in the sampling window and choosing a single partition (region) for sampling, as opposed to taking a random sample from the entire window, resulting in a novel spatial active learning algorithm that combines exploration and exploitation. We demonstrate that our approach (CExp-Spatial) can generate more cost-effective sampling trajectories than baseline sampling methods. Finally, we present the real-world problem of mapping benthic habitats, where bathymetry-derived features are typically not strong enough to discriminate the fine details between classes identified from high-resolution imagery, increasing the possibility of model bias in active learning. We demonstrate, under such conditions, that CExp outperforms the state of the art and that CExp-Spatial can generate more cost-effective sampling trajectories for an Autonomous Underwater Vehicle than baseline sampling strategies.
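    A minimal sketch of the windowed sampling idea described above, assuming a fitted binary classifier that exposes decision_function and using an illustrative geometric shrink schedule; the thesis's exact window definition and constriction schedule are not reproduced here.

        import numpy as np

        def cexp_query(model, pool, round_idx, shrink=0.9, seed=0):
            # `model` is any fitted binary classifier exposing decision_function;
            # `pool` is the unlabelled feature matrix. The window around the
            # decision boundary is narrowed geometrically with each round.
            rng = np.random.default_rng(seed + round_idx)
            margins = np.abs(model.decision_function(pool))   # distance-to-boundary proxy
            width = margins.max() * (shrink ** round_idx)     # constricted window width
            window = np.where(margins <= width)[0]
            if len(window) == 0:                              # window collapsed: take closest point
                window = np.array([margins.argmin()])
            return int(rng.choice(window))                    # uniform draw from within the window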