
    Evolvability signatures of generative encodings: beyond standard performance benchmarks

    Evolutionary robotics is a promising approach to autonomously synthesize machines with abilities that resemble those of animals, but the field suffers from a lack of strong foundations. In particular, evolutionary systems are currently assessed solely by the fitness score their evolved artifacts can achieve for a specific task, whereas such fitness-based comparisons provide limited insight into how the same system would perform on different tasks, and into its capacity to adapt to changes in fitness (e.g., after damage to the machine, or in new situations). To counter these limitations, we introduce the concept of "evolvability signatures", which picture the post-mutation statistical distribution of both behavior diversity (how different are the robot behaviors after a mutation?) and fitness values (how different is the fitness after a mutation?). We tested the relevance of this concept by evolving controllers for hexapod robot locomotion using five different genotype-to-phenotype mappings (direct encoding, generative encoding of open-loop and closed-loop central pattern generators, generative encoding of neural networks, and single-unit pattern generators (SUPG)). We observed a predictive relationship between the evolvability signature of each encoding and the number of generations required by hexapods to adapt to incurred damage. Our study also reveals that, across the five investigated encodings, the SUPG scheme achieved the best evolvability signature, and was always foremost in recovering an effective gait following robot damage. Overall, our evolvability signatures neatly complement existing task-performance benchmarks, and pave the way for stronger foundations for research in evolutionary robotics. Comment: 24 pages with 12 figures in the main text, and 4 supplementary figures. Accepted at the Information Sciences journal (in press). Supplemental videos are available online at http://goo.gl/uyY1R.
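
    As a rough, hypothetical illustration of the idea (not the authors' implementation), an evolvability signature can be estimated by sampling many mutations of a parent genotype and recording, for each mutant, its behavioral distance from the parent and its change in fitness; the resulting joint distribution is the signature. The function names and the toy direct encoding below are assumptions made for the sketch.

```python
import random


def euclidean(a, b):
    """Euclidean distance between two behavior descriptors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5


def evolvability_signature(parent, mutate, behavior, fitness, n_samples=1000):
    """Estimate an evolvability signature for one genotype.

    Returns a list of (behavior_distance, fitness_change) pairs, i.e. the
    post-mutation joint distribution of behavior diversity and fitness.
    The callables are placeholders for an actual encoding and simulator.
    """
    base_behavior = behavior(parent)
    base_fitness = fitness(parent)
    signature = []
    for _ in range(n_samples):
        mutant = mutate(parent)
        signature.append((euclidean(behavior(mutant), base_behavior),
                          fitness(mutant) - base_fitness))
    return signature


if __name__ == "__main__":
    # Toy direct encoding: the genotype is a list of joint offsets, the
    # "behavior" is the genotype itself, and fitness rewards small offsets.
    parent = [0.0] * 6
    sig = evolvability_signature(
        parent,
        mutate=lambda g: [x + random.gauss(0, 0.1) for x in g],
        behavior=lambda g: list(g),
        fitness=lambda g: -sum(abs(x) for x in g),
        n_samples=200,
    )
    print(sig[:3])
```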

    Worst-input mutation approach to web services vulnerability testing based on SOAP messages

    The growing popularity and application of Web services have led to increased attention to the vulnerability of software based on these services. Vulnerability testing examines the trustworthiness of software systems and reduces their security risks; however, such testing of Web services has become increasingly challenging due to the cross-platform and heterogeneous characteristics of their deployment. This paper proposes a worst-input mutation approach for testing Web service vulnerability based on SOAP (Simple Object Access Protocol) messages. Based on characteristics of the SOAP messages, the proposed approach uses the farthest-neighbor concept to guide generation of the test suite. The test case generation algorithm is presented, and a prototype Web service vulnerability testing tool is described. The tool was applied to the testing of Web services on the Internet, with experimental results indicating that the proposed approach, which found more vulnerability faults than other related approaches, is both practical and effective.
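
    The abstract does not detail the algorithm, but the farthest-neighbor idea can be sketched, under assumed message representations and an assumed distance metric, as greedily adding to the suite the candidate input most dissimilar to the tests already selected, so the suite spreads over the input space.

```python
def farthest_neighbor_suite(candidates, distance, k):
    """Greedily build a test suite of k inputs from candidate messages.

    At each step the candidate whose minimum distance to the already
    selected tests is largest (its farthest-neighbor score) is added.
    Both the candidate pool and the distance function are placeholders.
    """
    if not candidates:
        return []
    suite = [candidates[0]]
    pool = list(candidates[1:])
    while pool and len(suite) < k:
        best = max(pool, key=lambda c: min(distance(c, s) for s in suite))
        suite.append(best)
        pool.remove(best)
    return suite


def char_set_distance(a, b):
    """Illustrative dissimilarity: size of the symmetric difference of character sets."""
    return len(set(a) ^ set(b))


if __name__ == "__main__":
    # Hypothetical mutated values for one SOAP parameter.
    pool = ["<id>1</id>", "<id>-1</id>", "<id>' OR '1'='1</id>",
            "<id>%n%n%n</id>", "<id>" + "A" * 64 + "</id>"]
    print(farthest_neighbor_suite(pool, char_set_distance, 3))
```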

    Connectionist Theory Refinement: Genetically Searching the Space of Network Topologies

    An algorithm that learns from a set of examples should ideally be able to exploit the available resources of (a) abundant computing power and (b) domain-specific knowledge to improve its ability to generalize. Connectionist theory-refinement systems, which use background knowledge to select a neural network's topology and initial weights, have proven to be effective at exploiting domain-specific knowledge; however, most do not exploit available computing power. This weakness occurs because they lack the ability to refine the topology of the neural networks they produce, thereby limiting generalization, especially when given impoverished domain theories. We present the REGENT algorithm, which uses (a) domain-specific knowledge to help create an initial population of knowledge-based neural networks and (b) genetic operators of crossover and mutation (specifically designed for knowledge-based networks) to continually search for better network topologies. Experiments on three real-world domains indicate that our new algorithm is able to significantly increase generalization compared to a standard connectionist theory-refinement system, as well as to our previous algorithm for growing knowledge-based networks. Comment: See http://www.jair.org/ for any accompanying file.
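
    The abstract does not spell out REGENT's operators; as a minimal sketch of the general pattern of genetically searching topologies, the code below seeds a population from a knowledge-derived starting topology and applies illustrative crossover and mutation over hidden-layer sizes. The representation and scoring function are stand-ins, not REGENT's knowledge-based operators.

```python
import random


def crossover(a, b):
    """Swap hidden-layer sizes between two parents at a random cut point."""
    cut = random.randint(0, min(len(a), len(b)))
    return a[:cut] + b[cut:]


def mutate(topology):
    """Randomly grow, shrink, or resize one hidden layer."""
    t = list(topology)
    op = random.choice(["add", "remove", "resize"])
    if op == "add" or not t:
        t.insert(random.randint(0, len(t)), random.randint(2, 16))
    elif op == "remove" and len(t) > 1:
        t.pop(random.randrange(len(t)))
    else:
        i = random.randrange(len(t))
        t[i] = max(1, t[i] + random.choice([-2, -1, 1, 2]))
    return t


def genetic_topology_search(seed_topology, score, generations=20, pop_size=10):
    """Toy genetic search over network topologies.

    A topology is a list of hidden-layer sizes seeded from a knowledge-based
    starting network; `score` stands in for training the corresponding
    network and measuring its generalization on held-out data.
    """
    population = [mutate(seed_topology) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=score, reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            children.append(mutate(crossover(a, b)))
        population = survivors + children
    return max(population, key=score)


if __name__ == "__main__":
    # Hypothetical score: prefer about three hidden layers totalling ~24 units.
    score = lambda t: -abs(len(t) - 3) - abs(sum(t) - 24) / 10
    print(genetic_topology_search([8, 8], score))
```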

    The relevance of outsourcing and leagile strategies in performance optimization of an integrated process planning and scheduling

    Over the past few years, growing global competition has forced the manufacturing industries to upgrade their old production strategies with modern-day approaches. As a result, interest has developed in finding an appropriate policy that could enable them to compete with others and to emerge as market winners. Keeping these facts in mind, the authors propose in this paper an integrated process planning and scheduling model inheriting the salient features of outsourcing and leagile principles to compete in the existing market scenario. The paper also proposes a model based on leagile principles, in which integrated planning management is practiced. In the present work, a scheduling problem is considered with the aim of minimizing the overall makespan. The paper shows the relevance of both strategies to the performance enhancement of the industries, in terms of reduced makespan. The authors also propose a new hybrid Enhanced Swift Converging Simulated Annealing (ESCSA) algorithm to solve complex real-time scheduling problems. The proposed algorithm inherits the prominent features of the Genetic Algorithm (GA), Simulated Annealing (SA), and the Fuzzy Logic Controller (FLC). The ESCSA algorithm reduces the makespan significantly, in less computational time and fewer iterations. The efficacy of the proposed algorithm is shown by comparing its results with GA, SA, Tabu, and hybrid Tabu-SA optimization methods.
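
    The ESCSA algorithm itself is not described in the abstract; as a loose, hypothetical sketch of a GA/SA hybrid of this kind (with the fuzzy logic controller, which would adapt parameters such as the cooling rate, omitted), the loop below anneals a job permutation to reduce makespan on a two-machine flow shop and occasionally applies a GA-style order crossover between the current and best-known sequences.

```python
import math
import random


def makespan(sequence, times):
    """Completion time of the last job on a two-machine flow shop.

    times[j] = (t1, t2) gives the processing times of job j on machines 1 and 2.
    """
    end1 = end2 = 0
    for j in sequence:
        end1 += times[j][0]
        end2 = max(end1, end2) + times[j][1]
    return end2


def order_crossover(a, b):
    """Keep a random slice of `a` and fill the rest in the order it appears in `b`."""
    i, j = sorted(random.sample(range(len(a) + 1), 2))
    middle = a[i:j]
    rest = [g for g in b if g not in middle]
    return rest[:i] + middle + rest[i:]


def escsa_like_search(times, iters=5000, temp=50.0, cooling=0.999):
    """Hypothetical SA/GA hybrid minimizing makespan over job permutations."""
    current = list(range(len(times)))
    random.shuffle(current)
    best = current[:]
    for _ in range(iters):
        if random.random() < 0.1:
            # GA-flavoured move: order crossover between current and best-so-far.
            candidate = order_crossover(current, best)
        else:
            # SA move: swap two randomly chosen jobs.
            candidate = current[:]
            i, j = random.sample(range(len(candidate)), 2)
            candidate[i], candidate[j] = candidate[j], candidate[i]
        delta = makespan(candidate, times) - makespan(current, times)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            if makespan(current, times) < makespan(best, times):
                best = current[:]
        temp *= cooling
    return best, makespan(best, times)


if __name__ == "__main__":
    jobs = [(random.randint(1, 9), random.randint(1, 9)) for _ in range(8)]
    print(escsa_like_search(jobs))
```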

    Automated Generation of Cross-Domain Analogies via Evolutionary Computation

    Analogy plays an important role in creativity, and is extensively used in science as well as art. In this paper we introduce a technique for the automated generation of cross-domain analogies based on a novel evolutionary algorithm (EA). Unlike existing work in computational analogy-making, which is restricted to creating analogies between two given cases, our approach is capable, for a given case, of creating an analogy along with the novel analogous case itself. Our algorithm is based on the concept of "memes", which are units of culture, or knowledge, undergoing variation and selection under a fitness measure, and represents evolving pieces of knowledge as semantic networks. Using a fitness function based on Gentner's structure mapping theory of analogies, we demonstrate the feasibility of spontaneously generating semantic networks that are analogous to a given base network. Comment: Conference submission, International Conference on Computational Creativity 2012 (8 pages, 6 figures).
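
    As a loose, hypothetical illustration of the overall loop (not the authors' memetic representation, and with only a crude stand-in for a structure-mapping score), the sketch below evolves small semantic networks of (source, relation, target) triples, rewarding candidates that share relation labels and chained relation pairs with a fixed base network.

```python
import random
from collections import Counter

# A semantic network is represented here as a set of (source, relation, target) triples.
BASE = {("sun", "attracts", "planet"), ("planet", "orbits", "sun"),
        ("sun", "hotter_than", "planet")}

RELATIONS = ["attracts", "orbits", "hotter_than", "part_of", "causes"]
NODES = ["nucleus", "electron", "atom", "charge"]


def random_network(n_edges=3):
    return {(random.choice(NODES), random.choice(RELATIONS), random.choice(NODES))
            for _ in range(n_edges)}


def mutate(net):
    """Replace one random triple with a freshly generated one."""
    net = set(net)
    if net:
        net.discard(random.choice(sorted(net)))
    net.add(next(iter(random_network(1))))
    return net


def analogy_fitness(candidate, base=BASE):
    """Crude stand-in for a structure-mapping score.

    Rewards sharing relation labels with the base network (first-order match)
    and sharing chained relation pairs that connect through a common node
    (a rough proxy for higher-order, systematic structure).
    """
    base_rels = Counter(r for _, r, _ in base)
    cand_rels = Counter(r for _, r, _ in candidate)
    first_order = sum((base_rels & cand_rels).values())
    chains = lambda net: {(r1, r2) for a, r1, b in net for c, r2, d in net if b == c}
    systematic = len(chains(base) & chains(candidate))
    return first_order + 2 * systematic


def evolve(pop_size=30, generations=50):
    population = [random_network() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=analogy_fitness, reverse=True)
        parents = population[: pop_size // 2]
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(population, key=analogy_fitness)


if __name__ == "__main__":
    best = evolve()
    print(analogy_fitness(best), best)
```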

    Development of an automated aircraft subsystem architecture generation and analysis tool

    Purpose – The purpose of this paper is to present a new computational framework to address future preliminary design needs for aircraft subsystems. The ability to investigate multiple candidate technologies forming subsystem architectures is enabled through automated architecture generation, analysis and optimization. The main focus lies on demonstrating the framework's workings, as well as the optimizer's performance on a typical form of application problem. Design/methodology/approach – The core aspects involve a functional decomposition, coupled with a synergistic mission performance analysis at the aircraft, architecture and component levels. This may be followed by a complete enumeration of architectures, combined with a user-defined technology filtering and concept ranking procedure. In addition, a hybrid heuristic optimizer, based on ant systems optimization and a genetic algorithm, is employed to produce optimal architectures in terms of both component composition and design parameters. The optimizer is tested on a generic architecture design problem combined with modified Griewank and parabolic functions for the continuous space. Findings – Insights from the generalized application problem show consistent rediscovery of the optimal architectures by the optimizer, as compared to a full problem enumeration. In addition, multi-objective optimization reveals a Pareto front with differences in component composition as well as in continuous parameters. Research limitations/implications – This paper demonstrates the framework's application on a generalized test problem only; future publications will consider real engineering design problems. Originality/value – The paper addresses the need for future conceptual design methods of complex systems to consider a mixed concept space of both discrete and continuous nature via automated methods.
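
    The paper's hybrid ant-system/GA optimizer is not described in enough detail here to reproduce; as a small illustration of a mixed discrete-continuous test problem of the kind mentioned, the sketch below fully enumerates candidate component combinations and scores each with the standard Griewank function of its continuous design parameters (the "modified" variants used in the paper are not specified and are not reproduced). All component names and penalties are hypothetical.

```python
import itertools
import math
import random


def griewank(x):
    """Standard Griewank function; its global minimum is 0 at the origin."""
    s = sum(xi ** 2 for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x))
    return s - p + 1.0


def evaluate(architecture, params):
    """Toy mixed objective: a discrete penalty per chosen component plus the
    Griewank value of the continuous design parameters. Purely illustrative."""
    component_penalty = {"electric": 0.0, "hydraulic": 0.5, "pneumatic": 1.0}
    return sum(component_penalty[c] for c in architecture) + griewank(params)


def enumerate_and_optimize(options, n_params=2, restarts=200):
    """Full enumeration of discrete architectures with random-restart search
    over each one's continuous parameters (a simple stand-in for the hybrid
    ant-system / genetic optimizer described in the paper)."""
    best = None
    for architecture in itertools.product(*options):
        for _ in range(restarts):
            params = [random.uniform(-5, 5) for _ in range(n_params)]
            score = evaluate(architecture, params)
            if best is None or score < best[0]:
                best = (score, architecture, params)
    return best


if __name__ == "__main__":
    # Two subsystem slots, each with two hypothetical candidate technologies.
    subsystem_choices = [["electric", "hydraulic"], ["electric", "pneumatic"]]
    print(enumerate_and_optimize(subsystem_choices))
```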