
    Evolutionary cellular configurations for designing feed-forward neural networks architectures

    Proceedings of the 6th International Work-Conference on Artificial and Natural Neural Networks (IWANN 2001), Granada, Spain, June 13–15, 2001. In recent years, interest in automatic methods for determining appropriate architectures of feed-forward neural networks has grown. Most of these methods are based on evolutionary computation paradigms. Some rely on direct representations of the network's parameters; such representations do not scale, since very large structures are required to represent large architectures. A more interesting alternative is indirect schemes, which codify a compact representation of the neural network. In this work, an indirect constructive encoding scheme is presented. The scheme is based on cellular automata representations in order to improve the scalability of the method.
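
    As a rough illustration of the idea described above (not the paper's actual rule or decoding), the following sketch grows a binary grid from a handful of seed cells with a simple cellular-automaton step and reads the resulting grid as a connectivity mask for a feed-forward layer; the grid size, neighbourhood and growth rule are assumptions chosen for the example.

        import numpy as np

        def grow_connectivity(seeds, shape=(8, 8), steps=3):
            # The compact genotype is just a handful of seed coordinates.
            grid = np.zeros(shape, dtype=int)
            for r, c in seeds:
                grid[r, c] = 1
            for _ in range(steps):
                # Count active von Neumann neighbours on a toroidal grid.
                nbrs = (np.roll(grid, 1, 0) + np.roll(grid, -1, 0)
                        + np.roll(grid, 1, 1) + np.roll(grid, -1, 1))
                # Hypothetical growth rule: a cell switches on next to an active cell.
                grid = ((grid == 1) | (nbrs >= 1)).astype(int)
            # Read the grid as a mask: grid[i, j] == 1 links input i to hidden unit j.
            return grid

        mask = grow_connectivity(seeds=[(1, 2), (5, 5)])
        print(mask.sum(), "connections grown from a genotype of only 2 seeds")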

    Grammars and cellular automata for evolving neural networks architectures

    IEEE International Conference on Systems, Man, and Cybernetics, Nashville, TN, 8-11 October 2000. The class of feedforward neural networks trained with back-propagation admits a large variety of specific architectures applicable to pattern approximation tasks. Unfortunately, architecture design is still a human expert's job. In recent years, interest in automatic methods for determining the architecture of a feedforward neural network has increased, most of them based on the evolutionary computation paradigm. Within this approach, several perspectives can be considered. At one extreme, every connection and node of the architecture can be specified in the chromosome representation using binary bits; this kind of representation is called a direct encoding scheme. To reduce the length of the genotype and the search space, and to make the problem more scalable, indirect encoding schemes have been introduced. An indirect scheme under a constructive algorithm, in contrast, starts with a minimal architecture, and new levels, neurons and connections are added step by step via sets of rules. The rules and/or some initial conditions are codified into the chromosome of a genetic algorithm. In this work, two indirect constructive encoding schemes, based on grammars and cellular automata respectively, are proposed to find the optimal architecture of a feedforward neural network.
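
    For contrast with the indirect schemes proposed in the paper, the sketch below decodes a direct-encoding chromosome in the sense described above, one bit per possible connection of a single-hidden-layer network; the layer sizes and bit layout are assumptions made for the example, and they show how the genotype length grows multiplicatively with layer sizes.

        def decode_direct(bits, n_in, n_hid, n_out):
            # One bit per possible connection: length = n_in*n_hid + n_hid*n_out.
            assert len(bits) == n_in * n_hid + n_hid * n_out
            in_hid = [bits[i * n_hid:(i + 1) * n_hid] for i in range(n_in)]
            off = n_in * n_hid
            hid_out = [bits[off + h * n_out: off + (h + 1) * n_out] for h in range(n_hid)]
            return in_hid, hid_out

        # A 4-3-2 network already needs 18 bits; a 1000-1000-10 network would need 1,010,000.
        genotype = [1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1]
        in_to_hidden, hidden_to_out = decode_direct(genotype, n_in=4, n_hid=3, n_out=2)
        print(in_to_hidden)
        print(hidden_to_out)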

    Neural Network architectures design by Cellular Automata evolution

    4th Conference of Systemics, Cybernetics and Informatics, Orlando, 23-26 July 2000. The design of the architecture is a crucial step in the successful application of a neural network. In most cases, however, architecture design is essentially a human expert's job: it depends heavily on both the expert's experience and a tedious trial-and-error process. The development of automatic methods to determine the architecture of feedforward neural networks is therefore a field of interest in the neural network community. These methods are generally based on search techniques such as genetic algorithms, simulated annealing or evolution strategies. Most of them rely on a direct representation of the network's parameters; this representation does not scale, since very large structures are required to represent large architectures. In this work, an indirect constructive encoding scheme is proposed to find optimal architectures of feed-forward neural networks. The scheme is based on cellular automata representations in order to improve the scalability of the method.
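
    The abstract mentions search techniques such as genetic algorithms; purely as a sketch of that outer loop (not the paper's method), the code below evolves bit-string genotypes with truncation selection, one-point crossover and bit-flip mutation, using a stand-in fitness function where a real system would decode each genotype into an architecture, train it and measure validation error.

        import random

        def fitness(genotype):
            # Stand-in objective: pretend an architecture with 6 active units is ideal.
            # A real system would decode, train and validate the corresponding network.
            return -abs(sum(genotype) - 6)

        def evolve(pop_size=20, length=12, generations=30, p_mut=0.05):
            pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:pop_size // 2]                 # truncation selection
                children = []
                while len(parents) + len(children) < pop_size:
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, length)         # one-point crossover
                    child = a[:cut] + b[cut:]
                    child = [1 - g if random.random() < p_mut else g for g in child]
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        print(evolve())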

    Limited Evaluation Cooperative Co-evolutionary Differential Evolution for Large-scale Neuroevolution

    Many real-world control and classification tasks involve a large number of features. When artificial neural networks (ANNs) are used for modeling these tasks, the network architectures tend to be large. Neuroevolution is an effective approach for optimizing ANNs; however, two bottlenecks make its application challenging in the case of high-dimensional networks using direct encoding. First, classic evolutionary algorithms tend not to scale well when searching large parameter spaces; second, evaluating the network over a large number of training instances is generally time-consuming. In this work, we propose an approach called the Limited Evaluation Cooperative Co-evolutionary Differential Evolution algorithm (LECCDE) to optimize high-dimensional ANNs. The proposed method optimizes the pre-synaptic weights of each post-synaptic neuron in separate subpopulations using a Cooperative Co-evolutionary Differential Evolution algorithm, and employs a limited evaluation scheme in which fitness is evaluated on a relatively small number of training instances, relying on fitness inheritance. We test LECCDE on three datasets of various sizes, and our results show that cooperative co-evolution significantly improves the test error compared to standard Differential Evolution, while the limited evaluation scheme yields a significant reduction in computing time.
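
    As a much-simplified sketch of the two ideas named above (not the authors' implementation), the code below keeps one differential-evolution subpopulation per post-synaptic neuron's incoming weights and estimates fitness on a small random mini-batch each generation; the toy network, dataset, DE parameters and read-out are assumptions, and fitness inheritance is omitted.

        import numpy as np

        rng = np.random.default_rng(0)
        n_in, n_hid = 4, 3
        X = rng.normal(size=(200, n_in))
        y = (X.sum(axis=1) > 0).astype(float)            # toy binary target

        def forward(weight_sets, x):
            # One incoming weight vector per hidden (post-synaptic) neuron.
            h = np.tanh(x @ np.stack(weight_sets, axis=1))
            return 1.0 / (1.0 + np.exp(-h.sum(axis=1)))  # crude read-out, illustration only

        def partial_fitness(weight_sets, batch_idx):
            # "Limited evaluation": negative error on a mini-batch, not the full set.
            p = forward(weight_sets, X[batch_idx])
            return -np.mean((p - y[batch_idx]) ** 2)

        pops = [rng.normal(size=(10, n_in)) for _ in range(n_hid)]  # one subpopulation per neuron
        best = [pop[0].copy() for pop in pops]
        F, CR, batch_size = 0.5, 0.9, 16

        for generation in range(50):
            idx = rng.choice(len(X), size=batch_size, replace=False)
            for j, pop in enumerate(pops):
                for i in range(len(pop)):
                    a, b, c = pop[rng.choice(len(pop), size=3, replace=False)]
                    mutant = a + F * (b - c)                       # DE/rand/1 mutation
                    trial = np.where(rng.random(n_in) < CR, mutant, pop[i])
                    ctx_trial = [trial if k == j else best[k] for k in range(n_hid)]
                    ctx_target = [pop[i] if k == j else best[k] for k in range(n_hid)]
                    if partial_fitness(ctx_trial, idx) > partial_fitness(ctx_target, idx):
                        pop[i] = trial
                # Cooperative step: the neuron's representative is its best current member.
                scores = [partial_fitness([w if k == j else best[k] for k in range(n_hid)], idx)
                          for w in pop]
                best[j] = pop[int(np.argmax(scores))].copy()

        print("approximate training error:", -partial_fitness(best, np.arange(len(X))))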

    Non-Direct Encoding Method Based on Cellular Automata to Design Neural Network Architectures

    Architecture design is a fundamental step in the successful application of feed-forward neural networks. In most cases a large number of architectures are suitable for solving a given problem, and the design is, unfortunately, still a human expert's job: it depends heavily on the expert and on a tedious trial-and-error process. In recent years, many works have addressed the automatic design of neural network architectures. Most of the methods are based on evolutionary computation paradigms. Some rely on direct representations of the network's parameters; these representations do not scale, since very large structures are required to represent large architectures. More interesting alternatives are indirect schemes, which codify a compact representation of the neural network. In this work, an indirect constructive encoding scheme is proposed. The scheme is based on cellular automata representations and is inspired by the idea that just a few seeds in the initial configuration of a cellular automaton can produce a wide variety of feed-forward neural network architectures. The cellular approach is experimentally validated in different domains and compared with a direct codification scheme.
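
    The scalability argument above can be made concrete with some back-of-the-envelope arithmetic: a direct encoding needs one gene per possible connection, while a seed-based cellular-automaton genotype only stores a few seed coordinates; the seed count and grid sizes below are illustrative assumptions, not the paper's figures.

        import math

        def direct_length(n_in, n_hid, n_out):
            # One bit per possible connection in a single-hidden-layer network.
            return n_in * n_hid + n_hid * n_out

        def seed_length(n_seeds, rows, cols):
            # A few seed coordinates, each stored as a (row, column) pair of bit fields.
            return n_seeds * (math.ceil(math.log2(rows)) + math.ceil(math.log2(cols)))

        for width in (10, 100, 1000):
            print(f"layer width {width}: direct {direct_length(width, width, 10)} bits, "
                  f"seed-based {seed_length(5, width, width)} bits")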

    Guiding for Associative Learning: How to Shape Artificial Dynamic Cognition?

    (Author's note: I have no news yet about the publication date of these proceedings; the conference has taken place and the organizers have confirmed that the papers will appear in two LNAI issues. In the meantime, I wanted to put our work online so that it can circulate.) This paper describes an evolutionary robotics experiment which aims to show the possibility of learning by guidance from a dynamic cognition perspective. Our model relies on Continuous Time Recurrent Neural Networks and Hebbian plasticity. The agents can be guided by stimuli, and we study the influence of guidance on their external behavior and internal dynamics when they face other stimuli. The article develops the experiment and presents some results on the dynamics of the systems.
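
    The model relies on continuous-time recurrent neural networks and Hebbian plasticity; as a minimal sketch only (standard textbook forms, not the authors' exact equations, parameters or guidance protocol), the code below Euler-integrates a small CTRNN while applying a plain Hebbian weight update under a constant stimulus.

        import numpy as np

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        def ctrnn_step(y, W, tau, theta, inputs, dt=0.01, eta=0.001):
            act = sigmoid(y + theta)                 # neuron outputs
            dy = (-y + W @ act + inputs) / tau       # standard CTRNN dynamics
            y_new = y + dt * dy                      # Euler integration step
            post = sigmoid(y_new + theta)
            W_new = W + eta * np.outer(post, act)    # plain Hebbian update (post x pre)
            return y_new, W_new

        rng = np.random.default_rng(1)
        n = 5
        y = np.zeros(n)
        W = rng.normal(scale=0.5, size=(n, n))
        tau, theta = np.ones(n), np.zeros(n)
        stimulus = np.zeros(n)
        stimulus[0] = 1.0                            # a guidance-like stimulus on neuron 0
        for step in range(1000):
            y, W = ctrnn_step(y, W, tau, theta, stimulus)
        print("neuron states after guidance:", np.round(y, 3))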

    Self-adaptive exploration in evolutionary search

    We address a primary question of computational as well as biological research on evolution: how can an exploration strategy adapt so as to exploit the information gained about the problem at hand? We first introduce an integrated formalism of evolutionary search which provides a unified view of different specific approaches. On this basis we discuss the implications of indirect modeling (via a "genotype-phenotype mapping") for the exploration strategy. Notions such as modularity, pleiotropy and functional phenotypic complexes are discussed as implications. Then, rigorously reflecting the notion of self-adaptability, we introduce a new definition that captures self-adaptability of exploration: different genotypes that map to the same phenotype may represent (also topologically) different exploration strategies; self-adaptability requires a variation of exploration strategies along such a "neutral space". By this definition, the concept of neutrality becomes a central concern of this paper. Finally, we present examples of these concepts: for a specific grammar-type encoding, we observe a large variability of exploration strategies for a fixed phenotype, and a self-adaptive drift towards short representations with a highly structured exploration strategy that matches the "problem's structure".
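
    A toy rendering of the central idea (the mapping and mutation operator are invented, not the paper's encoding): two genotypes decode to the same phenotype, yet a phenotypically neutral gene controls the mutation rate of the rest, so they realize very different exploration strategies, and drift along that neutral dimension is what self-adaptation can exploit.

        import random

        def phenotype(genotype):
            values, _rate_gene = genotype
            return tuple(values)                      # the rate gene is phenotypically neutral

        def mutate(genotype):
            values, rate_gene = genotype
            p = 0.05 if rate_gene == 0 else 0.4       # the neutral gene sets exploration width
            new_values = [1 - v if random.random() < p else v for v in values]
            new_rate = 1 - rate_gene if random.random() < 0.05 else rate_gene
            return (new_values, new_rate)

        g_conservative = ([1, 0, 1], 0)               # same phenotype (1, 0, 1) ...
        g_explorative = ([1, 0, 1], 1)                # ... but a much broader offspring spread

        for g in (g_conservative, g_explorative):
            changed = sum(phenotype(mutate(g)) != phenotype(g) for _ in range(10000))
            print("P(offspring phenotype differs) ~", changed / 10000)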

    The evolutionary origins of hierarchy

    Hierarchical organization, the recursive composition of sub-modules, is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open and important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of those modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. With a connection cost, however, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force, the cost of connections, promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.
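
    The mechanism at the heart of the study is a cost on network connections entering selection; the original work treats performance and connection cost as separate objectives, but as a minimal sketch the snippet below folds them into one weighted fitness with an arbitrary penalty coefficient.

        def fitness(task_performance, n_connections, cost_per_connection=0.01):
            # Selection now trades raw performance against wiring cost.
            return task_performance - cost_per_connection * n_connections

        # Two networks with identical raw performance: the sparser (and, in the paper's
        # experiments, more modular and hierarchical) one is now preferred.
        print(fitness(task_performance=0.95, n_connections=40))   # sparse network
        print(fitness(task_performance=0.95, n_connections=90))   # dense network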

    Generative capacities of grammars codification for evolution of NN architectures

    Proceedings of the 2002 Congress on Evolutionary Computation (CEC'02), May 12-17, 2002, Honolulu, Hawaii, USA. Designing an optimal architecture can be formulated as a search problem in the space of architectures, where each point represents one architecture. The space of all possible architectures is very large, and finding the simplest one may be an arduous and largely random task. Methods based on indirect encoding have been used to reduce the chromosome length. In this work a new indirect encoding method is proposed, and an analysis of the generative capacity of the method is presented.
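
    As a toy illustration of what "generative capacity" means here (the grammar below is invented for the example and is not the encoding analysed in the paper), the sketch enumerates the hidden-layer sequences derivable from a tiny rewriting grammar within a depth limit, showing how a compact rule set spans many distinct architectures.

        def derivations(symbols, rules, depth):
            # Enumerate terminal strings reachable within `depth` leftmost rewrites.
            if all(s not in rules for s in symbols):
                yield tuple(symbols)
                return
            if depth == 0:
                return                                # incomplete derivation, discard
            i = next(i for i, s in enumerate(symbols) if s in rules)
            for rhs in rules[symbols[i]]:
                yield from derivations(symbols[:i] + rhs + symbols[i + 1:], rules, depth - 1)

        # A -> a hidden layer of width 2 or 4 followed by A, or stop.
        rules = {"A": [["2", "A"], ["4", "A"], []]}
        architectures = set(derivations(["A"], rules, depth=4))
        print(len(architectures), "distinct layer sequences, e.g.", sorted(architectures)[:4])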