14,391 research outputs found

    Designing automatically a representation for grammatical evolution

    A long-standing problem in Evolutionary Computation is how to choose an appropriate representation for the solutions. In this work we investigate the feasibility of synthesizing a representation automatically, for the large class of problems whose solution spaces can be defined by a context-free grammar. We propose a framework based on a form of meta-evolution in which individuals are candidate representations expressed in an ad hoc language that we developed for this purpose. Individuals compete and evolve according to an evolutionary search aimed at optimizing representation properties such as redundancy, uniformity of redundancy, and locality. We experimentally assessed three variants of our framework on established benchmark problems and compared the evolved representations to commonly used human-designed representations (e.g., classical Grammatical Evolution). The results are promising: the evolved representations indeed exhibit better properties than the human-designed ones and also compare favorably with the human-designed baselines in search effectiveness. Specifically, we select the best evolved representation as the one with the best search effectiveness on a set of learning problems and assess its effectiveness on a separate set of challenging validation problems. For each of the three proposed variants of our framework, the best evolved representation achieves an average fitness rank on the validation problems that is better than the average fitness rank of the human-designed baselines on the same problems.
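
    As a minimal illustration of the representation properties named above, the sketch below estimates redundancy and locality of a candidate genotype-to-phenotype mapping by sampling. The `decode` and `phenotype_distance` callables, the gene range, and the sampling scheme are illustrative assumptions rather than the paper's actual definitions; `decode` is assumed to return a hashable phenotype such as a derived string.

    ```python
    import random

    def estimate_redundancy(genotypes, decode):
        """Redundancy: fraction of sampled genotypes that map to an
        already-seen phenotype (degree of many-to-one mapping)."""
        phenotypes = [decode(g) for g in genotypes]
        return 1.0 - len(set(phenotypes)) / len(phenotypes)

    def estimate_locality(genotypes, decode, phenotype_distance, max_gene=255):
        """Locality: average phenotype distance caused by a single-gene
        mutation; low values mean small genotype changes stay small."""
        total = 0.0
        for g in genotypes:
            mutant = list(g)
            i = random.randrange(len(mutant))
            mutant[i] = random.randrange(max_gene + 1)
            total += phenotype_distance(decode(g), decode(tuple(mutant)))
        return total / len(genotypes)
    ```

    Uniformity of redundancy could be estimated in the same spirit by comparing how many sampled genotypes fall on each distinct phenotype.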

    Grammars and cellular automata for evolving neural networks architectures

    IEEE International Conference on Systems, Man, and Cybernetics, Nashville, TN, 8-11 October 2000. The class of feedforward neural networks trained with back-propagation admits a large variety of specific architectures applicable to approximation and pattern tasks. Unfortunately, architecture design is still a human expert's job. In recent years, interest in developing automatic methods to determine the architecture of a feedforward neural network has increased; most of these methods are based on the evolutionary computation paradigm. Within this approach, several perspectives can be considered. At one extreme, every connection and node of the architecture can be specified in the chromosome using binary bits; this kind of representation is called a direct encoding scheme. In order to reduce the length of the genotype and the search space, and to make the problem more scalable, indirect encoding schemes have been introduced. An indirect scheme combined with a constructive algorithm, on the other hand, starts with a minimal architecture and adds new layers, neurons, and connections step by step via a set of rules; the rules and/or some initial conditions are encoded in the chromosome of a genetic algorithm. In this work, two indirect constructive encoding schemes, based on grammars and on cellular automata respectively, are proposed to find the optimal architecture of a feedforward neural network.
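
    To make the direct/indirect distinction concrete, here is a small sketch under illustrative assumptions (the names, the one-bit-per-connection layout, and the toy rule codes are not taken from the paper). The direct decoder spends one chromosome bit per possible connection, so its length grows quadratically with the number of nodes; a constructive indirect scheme instead encodes a short list of growth rules.

    ```python
    import numpy as np

    def decode_direct(bits, n_nodes):
        """Direct encoding: one bit per possible feedforward connection
        i -> j with i < j, so the chromosome has n_nodes*(n_nodes-1)/2 bits."""
        adj = np.zeros((n_nodes, n_nodes), dtype=int)
        k = 0
        for i in range(n_nodes):
            for j in range(i + 1, n_nodes):
                adj[i, j] = bits[k]
                k += 1
        return adj

    def grow_constructive(rule_codes, n_inputs=2, n_outputs=1):
        """Indirect constructive sketch: start from a minimal architecture
        and add hidden neurons or extra connections as the encoded rules dictate."""
        hidden, extra_links = 1, 0
        for code in rule_codes:        # rule codes come from the chromosome
            if code % 2 == 0:
                hidden += 1            # toy rule: add a hidden neuron
            else:
                extra_links += 1       # toy rule: add a connection
        return {"inputs": n_inputs, "hidden": hidden,
                "outputs": n_outputs, "extra_links": extra_links}
    ```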

    Neural Network architectures design by Cellular Automata evolution

    4th Conference on Systemics, Cybernetics and Informatics, Orlando, 23-26 July 2000. The design of the architecture is a crucial step in the successful application of a neural network. However, in most cases architecture design is still a human expert's job: it depends heavily both on the expert's experience and on a tedious trial-and-error process. Therefore, the development of automatic methods to determine the architecture of feedforward neural networks is a field of interest in the neural network community. These methods are generally based on search techniques such as genetic algorithms, simulated annealing, or evolution strategies. Most of them use a direct representation of the parameters of the network. Such a representation does not scale: representing large architectures requires very large structures. In this work, an indirect constructive encoding scheme is proposed to find optimal architectures of feedforward neural networks. The scheme is based on cellular automata representations in order to increase the scalability of the method.
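
    The sketch below conveys the general flavour of a cellular-automaton-based indirect encoding: a handful of seed cells (the compact genotype) are expanded by a simple local rule into a binary grid that can then be read as a connectivity pattern. The grid size, the growth rule, and the interpretation of the grid are illustrative assumptions, not the scheme actually proposed in the paper.

    ```python
    import numpy as np

    def neighbor_counts(grid):
        """8-neighbour counts with a zero boundary."""
        h, w = grid.shape
        padded = np.pad(grid, 1)
        return sum(padded[1 + dr:1 + dr + h, 1 + dc:1 + dc + w]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))

    def grow_from_seeds(seeds, shape=(8, 8), steps=4):
        """Expand a few seed cells with a simple growth rule: a cell stays
        active, or becomes active when at least two neighbours are active."""
        grid = np.zeros(shape, dtype=int)
        for r, c in seeds:
            grid[r, c] = 1
        for _ in range(steps):
            grid = ((grid == 1) | (neighbor_counts(grid) >= 2)).astype(int)
        return grid   # e.g. cell (i, j) active -> connection between units i and j
    ```

    With this kind of scheme the genotype is just the list of seed coordinates, so its length does not grow with the size of the final architecture.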

    Grammatical evolution to design fractal curves with a given dimension

    Original paper at http://ieeexplore.ieee.org/. Lindenmayer grammars have frequently been applied to represent fractal curves. In this work, the ideas behind grammatical evolution are used to automatically generate and evolve Lindenmayer grammars that represent fractal curves whose fractal dimension approximates a predefined target value. For many dimensions this is a nontrivial task to perform manually. The procedure we propose closely parallels biological evolution because it acts through three different levels: a genotype (a vector of integers), a protein-like intermediate level (the Lindenmayer grammar), and a phenotype (the fractal curve). Variation acts at the genotype level, while selection is performed at the phenotype level, by comparing the dimensions of the fractal curves to the desired value. This paper has been sponsored by the Spanish Ministry of Science and Technology (MCYT), project numbers TIC2002-01948 and TIC2001-0685-C02-01.
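
    A compact sketch of the phenotype-evaluation side of such a procedure: rewrite the axiom with the evolved production rules, interpret the resulting string as a 2-D curve (the turtle interpretation is omitted here), and score it by how far its box-counting dimension lies from the target. The helper names, box sizes, and normalization are illustrative assumptions.

    ```python
    import numpy as np

    def expand(axiom, rules, iterations):
        """Apply the Lindenmayer production rules to the axiom."""
        s = axiom
        for _ in range(iterations):
            s = "".join(rules.get(ch, ch) for ch in s)
        return s

    def box_counting_dimension(points, sizes=2.0 ** -np.arange(1, 7)):
        """Estimate the fractal dimension of a 2-D point cloud by box counting."""
        pts = np.asarray(points, dtype=float)
        pts = (pts - pts.min(axis=0)) / (np.ptp(pts, axis=0).max() + 1e-12)
        counts = [len({tuple(b) for b in np.floor(pts / s).astype(int)})
                  for s in sizes]
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        return slope

    def fitness(points, target_dimension):
        """Selection at the phenotype level: distance to the desired dimension."""
        return abs(box_counting_dimension(points) - target_dimension)
    ```

    An outer evolutionary loop would then mutate the integer genotype, re-derive the grammar and the curve, and prefer individuals with the smallest fitness.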

    Non-Direct Encoding Method Based on Cellular Automata to Design Neural Network Architectures

    Architecture design is a fundamental step in the successful application of feedforward neural networks. In most cases a large number of neural network architectures suitable for solving a problem exist, and architecture design is, unfortunately, still a human expert's job. It depends heavily on the expert and on a tedious trial-and-error process. In recent years, many works have focused on the automatic design of neural network architectures, most of them based on evolutionary computation paradigms. Some of these methods rely on direct representations of the parameters of the network. Such representations do not scale: representing large architectures requires very large structures. Indirect schemes, which encode a compact representation of the neural network, are a more interesting alternative. In this work, an indirect constructive encoding scheme is proposed. The scheme is based on cellular automata representations and is inspired by the idea that only a few seeds in the initial configuration of a cellular automaton can produce a wide variety of feedforward neural network architectures. The cellular approach is experimentally validated in different domains and compared with a direct codification scheme.
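
    Complementing the growth step sketched earlier, this fragment shows one possible, purely illustrative way to read a grown cellular-automaton configuration as a feedforward architecture: each non-empty row of the grid becomes a hidden layer whose size is the number of active cells in that row. The mapping, the example grid, and the function name are assumptions, not the paper's decoding.

    ```python
    import numpy as np

    def grid_to_architecture(grid, n_inputs, n_outputs):
        """Interpret a binary CA configuration as feedforward layer sizes
        (illustrative mapping: one hidden layer per non-empty row)."""
        hidden = [int(row.sum()) for row in grid if row.any()]
        return [n_inputs] + hidden + [n_outputs]

    # A small grown grid read as [n_inputs, h1, h2, ..., n_outputs]
    grid = np.array([[0, 1, 1, 0],
                     [1, 1, 0, 0],
                     [0, 0, 0, 0],
                     [0, 1, 1, 1]])
    print(grid_to_architecture(grid, n_inputs=4, n_outputs=2))  # [4, 2, 2, 3, 2]
    ```

    A direct codification of the same network would instead require one gene per connection, which is the scalability gap the cellular scheme is meant to close.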