    Free search in multidimensional space

    One of the challenges for modern search methods is resolving multidimensional tasks in which the optimization parameters number in the hundreds, thousands and more. Many evolutionary, swarm and adaptive methods that perform well on numerical tests with up to 10 dimensions suffer insuperable stagnation when applied to the same tests extended to 50, 100 and more dimensions. This article presents an original investigation of Free Search, Differential Evolution and Particle Swarm Optimization applied to multidimensional versions of several heterogeneous real-value numerical tests. The aim is to identify how dimensionality affects search space complexity, in particular to evaluate the relation between the number of task dimensions and the number of iterations the methods require to reach an acceptable solution with non-zero probability. Experimental results are presented and analyzed.
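
    The specific test functions, optimiser settings and acceptance thresholds are not given in this abstract, so the sketch below is only an assumption-laden illustration of how the iterations-to-acceptable-solution relation can be measured across dimensions, using a generic placeholder search and the scalable sphere function as a stand-in test.

        # Illustrative sketch only: the test function, search internals and the
        # acceptance threshold are assumptions, not taken from the paper.
        import random

        def sphere(x):
            # Scalable stand-in objective; its minimum is 0 at the origin.
            return sum(v * v for v in x)

        def placeholder_search(f, dim, bounds, acceptable, max_iters):
            # Stands in for FS/DE/PSO: counts iterations until an acceptable value appears.
            best = float("inf")
            for it in range(1, max_iters + 1):
                x = [random.uniform(*bounds) for _ in range(dim)]
                best = min(best, f(x))
                if best <= acceptable:
                    return it      # iterations needed for an acceptable solution
            return None            # stagnation: no acceptable solution within the budget

        for dim in (10, 50, 100):
            result = placeholder_search(sphere, dim, (-5.0, 5.0),
                                        acceptable=3.0 * dim, max_iters=10_000)
            print(dim, result)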

    Free search in multidimensional space (presentation)

    Evaluation of Free Search, Differential Evolution and Particle Swarm Optimization on multidimensional tests. Study of their abilities to avoid stagnation and trapping in locally suboptimal solutions. Identification of the minimal number of iterations and the time required to resolve multidimensional tasks with acceptable precision.

    Free Search – comparative analysis 100

    Abstract: Adapting search methods to various multidimensional tasks in which the optimisation parameters number in the hundreds, thousands and more, without retuning the algorithms' parameters, is a great challenge for modern computational intelligence. Many evolutionary, swarm and adaptive methods that perform well on numerical tests with up to ten dimensions suffer insuperable stagnation when applied to tests with 100 and more dimensions. This article presents a comparison between particle swarm optimisation and differential evolution, both with enhanced adaptivity, and Free Search applied to 100-dimensional heterogeneous real-value numerical tests. The aim is to extend the knowledge of how high dimensionality affects search space complexity, in particular to identify the minimal time and the minimal number of objective function evaluations the methods require to reach an acceptable solution with non-zero probability on tasks with a high number of dimensions. The experimental results are summarised and analysed. A brief discussion of the concepts that support search method effectiveness concludes the article.
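
    Since this abstract measures cost in objective function evaluations rather than iterations, a simple counting wrapper around the objective is one way to instrument such a benchmark. The sketch below is illustrative only; the actual tests and optimiser implementations are not named in the abstract, and Rastrigin is used merely as a common scalable stand-in.

        # Hypothetical instrumentation: wrap an objective so every evaluation is
        # counted, whichever optimiser (PSO, DE, Free Search, ...) calls it.
        import math

        class CountingObjective:
            def __init__(self, f):
                self.f = f
                self.evaluations = 0

            def __call__(self, x):
                self.evaluations += 1
                return self.f(x)

        def rastrigin(x):
            # A common scalable real-value test function (an assumption, not
            # necessarily one of the tests used in the paper).
            return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

        objective = CountingObjective(rastrigin)
        print(objective([0.0] * 100), objective.evaluations)   # 0.0 at the optimum, 1 evaluation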

    Visualising the structure of document search results: A comparison of graph theoretic approaches

    This is the post-print of the article. Copyright @ 2010 Sage Publications. Previous work has shown that distance-similarity visualisation or ‘spatialisation’ can provide a potentially useful context in which to browse the results of a query search, enabling the user to adopt a simple local foraging or ‘cluster growing’ strategy to navigate through the retrieved document set. However, faithfully mapping feature-space models to visual space can be problematic owing to their inherent high dimensionality and non-linearity. Conventional linear approaches to dimension reduction tend to fail at this kind of task, sacrificing local structural detail in order to preserve a globally optimal mapping. In this paper the clustering performance of a recently proposed algorithm called isometric feature mapping (Isomap), which deals with non-linearity by transforming dissimilarities into geodesic distances, is compared to that of non-metric multidimensional scaling (MDS). Various graph pruning methods for geodesic distance estimation are also compared. Results show that Isomap is significantly better at preserving local structural detail than MDS, suggesting it is better suited to cluster growing and other semantic navigation tasks. Moreover, it is shown that applying a minimum-cost graph pruning criterion can provide a parameter-free alternative to the traditional K-neighbour method, resulting in spatial clustering that is equivalent to or better than that achieved using an optimal-K criterion.
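
    As a rough illustration of the geodesic-distance idea behind Isomap (not the authors' implementation; the neighbourhood size, stand-in data and 2-D output below are arbitrary assumptions), a K-neighbour graph can be built over the document feature vectors, all-pairs shortest paths taken as geodesic dissimilarities, and the result embedded for display:

        # Sketch of an Isomap-style pipeline with common scientific-Python tools;
        # k=10, the random stand-in data and the 2-D output are assumptions.
        import numpy as np
        from sklearn.neighbors import kneighbors_graph
        from scipy.sparse.csgraph import shortest_path
        from sklearn.manifold import MDS

        docs = np.random.rand(200, 5)                    # stand-in for document feature vectors

        knn = kneighbors_graph(docs, n_neighbors=10, mode="distance")   # K-neighbour graph
        geodesic = shortest_path(knn, method="D", directed=False)       # geodesic distance estimates
        # (assumes the neighbourhood graph is connected, so no distance is infinite)

        # Embed the geodesic dissimilarities for plotting; running MDS on the raw
        # pairwise distances instead gives the baseline the paper compares against.
        embedding = MDS(n_components=2, dissimilarity="precomputed").fit_transform(geodesic)
        print(embedding.shape)                           # (200, 2)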

    Free Search Towards Multidimensional Optimisation Problems

    The article presents experimental results achieved with a novel heuristic algorithm for real-value search and optimisation called Free Search (FS). The aim is to clarify this method's ability to return optimal solutions from multidimensional search spaces that are currently resistant to other search techniques.

    Performance evaluation on optimisation of 200 dimensional numerical tests - results and issues

    Abstract: Many tasks in science and technology require optimisation. Resolving such tasks could bring great benefits to the community. Multidimensional problems, in which the optimisation parameters number in the hundreds and more, face unusual computational limitations. Algorithms that perform well on a low number of dimensions suffer insuperable difficulties when applied to high-dimensional spaces. This article presents an investigation of 200-dimensional scalable, heterogeneous, real-value numerical tests. For some of these tests the optimal values depend on the number of dimensions and are virtually unknown for a variety of dimensions. Dependence on initialisation for successful identification of optimal values is analysed by comparing experiments started from random initial locations with experiments started from a single location. The aim is to: (1) assess dependence on initialisation in optimisation of 200-dimensional tests; (2) evaluate the tests' complexity and the time required to resolve them; (3) analyse adaptation to tasks with unknown solutions; (4) identify specific peculiarities which could support performance on high dimensions; (5) identify computational limitations which numerical methods could face on high dimensions. The presented and analysed experimental results can be used for further comparison and evaluation of real-value methods.
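
    The initialisation comparison described above can be reproduced in outline as follows; the objective, search procedure and budgets are placeholders, since the abstract does not specify them.

        # Illustrative only: runs started from random locations versus runs started
        # from one fixed location, for a hypothetical 200-dimensional test.
        import random

        DIM, RUNS, BUDGET = 200, 10, 5_000
        BOUNDS = (-100.0, 100.0)

        def test_function(x):
            # Placeholder objective; the paper's 200-dimensional tests are not reproduced here.
            return sum(v * v for v in x)

        def optimise(start, budget, step=1.0):
            # Trivial stand-in local search; the real experiments use dedicated methods.
            best_x, best_f = list(start), test_function(start)
            for _ in range(budget):
                cand = [v + random.uniform(-step, step) for v in best_x]
                f = test_function(cand)
                if f < best_f:
                    best_x, best_f = cand, f
            return best_f

        random_starts = [[random.uniform(*BOUNDS) for _ in range(DIM)] for _ in range(RUNS)]
        fixed_start = [BOUNDS[1]] * DIM        # every run begins from the same corner

        print("random initialisation:", min(optimise(s, BUDGET) for s in random_starts))
        print("single initialisation:", min(optimise(fixed_start, BUDGET) for _ in range(RUNS)))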

    Environmental inversion using high-resolution matched-field processing

    This paper considers the inversion of experimental field data collected with light receiving systems designed to meet operational requirements. Such operational requirements include system deployment in free-drifting configurations and a limited number of acoustic receivers. A well-known consequence of reduced spatial coverage is poor sampling of the vertical structure of the acoustic field, leading to severe ill-conditioning of the inverse problem and a data-to-model cost function with a massive sidelobe structure having many local extrema. This makes it difficult for meta-heuristic global search methods, such as genetic algorithms, to converge to the true model parameters. In order to cope with this difficulty, broadband high-resolution processors are proposed for their ability to significantly attenuate sidelobes, as a contribution to improving convergence. A comparative study on simulated data shows that high-resolution methods did not outperform the conventional Bartlett processor for pinpointing the true environmental parameter when using exhaustive search. However, when a meta-heuristic technique is applied to explore a large multidimensional search space, high-resolution methods clearly improved convergence, thereby reducing the inherent uncertainty of the final estimate. These findings are supported by the results obtained on experimental field data collected during the Maritime Rapid Environmental Assessment 2003 sea trial. (c) 2007 Acoustical Society of America.
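
    The conventional Bartlett processor mentioned above correlates the measured field d with a replica field w(m) predicted for a candidate environment m, and is commonly written as B(m) = |w(m)^H d|^2 / (||w(m)||^2 ||d||^2); the inversion then searches for the m that maximises this ambiguity surface. The sketch below uses a toy one-parameter replica model (an assumption, not the paper's propagation code) purely to show such a scan.

        # Toy Bartlett matched-field sketch; the replica model and parameter grid are
        # placeholders, not the processors or environment of the paper.
        import numpy as np

        def replica(param, n_hydrophones=4):
            # Hypothetical replica field for a single scalar environmental parameter.
            depths = np.arange(n_hydrophones)
            return np.exp(1j * param * depths)

        def bartlett(data, param):
            w = replica(param, len(data))
            return (np.abs(np.vdot(w, data)) ** 2
                    / (np.vdot(w, w).real * np.vdot(data, data).real))

        true_param = 0.7
        rng = np.random.default_rng(0)
        data = replica(true_param) + 0.05 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))

        grid = np.linspace(0.0, 2.0, 201)
        surface = np.array([bartlett(data, p) for p in grid])
        print("estimated parameter:", grid[surface.argmax()])   # close to 0.7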

    Optimisation of NMR dynamic models I. Minimisation algorithms and their performance within the model-free and Brownian rotational diffusion spaces

    The key to obtaining the model-free description of the dynamics of a macromolecule is the optimisation of the model-free and Brownian rotational diffusion parameters using the collected R1, R2 and steady-state NOE relaxation data. The problem of optimising the chi-squared value is often assumed to be trivial; however, the long chain of dependencies required for its calculation complicates the model-free chi-squared space. Convolutions are induced by the Lorentzian form of the spectral density functions, the linear recombinations of certain spectral density values to obtain the relaxation rates, the calculation of the NOE using the ratio of two of these rates, and finally the quadratic form of the chi-squared equation itself. Two major topological features of the model-free space complicate optimisation. The first is a long, shallow valley which commences at infinite correlation times and gradually approaches the minimum. The most severe convolution occurs for motions on two timescales, in which case the minimum is often located at the end of a long, deep, curved tunnel or multidimensional valley through the space. A large number of optimisation algorithms are investigated and their performance compared to determine which techniques are suitable for use in model-free analysis. Local optimisation algorithms are shown to be sufficient for minimisation not only within the model-free space but also for the minimisation of the Brownian rotational diffusion tensor. In addition, the performance of the programs Modelfree and Dasha is investigated. A number of model-free optimisation failures were identified: the inability to slide along the limits, the singular matrix failure of the Levenberg–Marquardt minimisation algorithm, the low precision of both programs, and a bug in Modelfree. Significantly, the singular matrix failure of the Levenberg–Marquardt algorithm occurs when internal correlation times are undefined and is greatly amplified in model-free analysis by both the grid search and constraint algorithms. The program relax (http://www.nmr-relax.com) is also presented as a new software package designed for the analysis of macromolecular dynamics through the use of NMR relaxation data, and which alleviates all of the problems inherent within model-free analysis.
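
    The chain of dependencies described above can be made concrete with the standard model-free expressions (quoted here in their textbook Lipari–Szabo form for orientation, not reproduced from the paper): a Lorentzian spectral density, relaxation rates built from linear combinations of its values, the NOE as a ratio involving those rates, and the quadratic chi-squared target that is actually optimised.

        % Standard model-free expressions (for orientation; not taken from the paper).
        % Lipari-Szabo spectral density with global correlation time \tau_m,
        % order parameter S^2 and effective internal correlation time \tau_e:
        J(\omega) = \frac{2}{5}\left[\frac{S^2 \tau_m}{1+(\omega\tau_m)^2}
                  + \frac{(1-S^2)\,\tau}{1+(\omega\tau)^2}\right],
        \qquad \frac{1}{\tau} = \frac{1}{\tau_m} + \frac{1}{\tau_e}.

        % The relaxation rates are linear combinations of J(\omega) at fixed frequencies
        % (d and c collect the dipolar and CSA constants), and the NOE is built from a
        % ratio of rates:
        R_1 = \frac{d^2}{4}\bigl[J(\omega_H-\omega_N) + 3J(\omega_N) + 6J(\omega_H+\omega_N)\bigr]
              + c^2 J(\omega_N),
        \qquad
        \mathrm{NOE} = 1 + \frac{d^2}{4}\,\frac{\gamma_H}{\gamma_N}\,
            \bigl[6J(\omega_H+\omega_N) - J(\omega_H-\omega_N)\bigr]\,\frac{1}{R_1}.

        % The optimisation target is the quadratic chi-squared statistic over the
        % measured R_1, R_2 and NOE values:
        \chi^2(\theta) = \sum_i \frac{\bigl(R_i^{\mathrm{obs}} - R_i^{\mathrm{calc}}(\theta)\bigr)^2}{\sigma_i^2}.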