73,417 research outputs found

    Evolution of neural networks for classification and regression

    Although Artificial Neural Networks (ANNs) are important Data Mining techniques, the search for the optimal ANN is a challenging task: the ANN should learn the input-output mapping without overfitting the data, and training algorithms may get trapped in local minima. The use of Evolutionary Computation (EC) is a promising alternative for ANN optimization. This work presents two hybrid EC/ANN algorithms: the first evolves neural topologies, while the second performs simultaneous optimization of architectures and weights. Sixteen real-world tasks were used to test these strategies. Competitive results were achieved when compared with a heuristic model selection and other Data Mining algorithms. Funded by Fundação para a Ciência e a Tecnologia (FCT), project POSI/EIA/59899/2004.
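    As a rough illustration of the evolutionary side of such hybrids, the sketch below evolves the weights of a fixed 1-8-1 MLP with a simple (mu + lambda) evolution strategy on a toy regression task. The topology, mutation scheme, and hyperparameters are assumptions for brevity; the paper's actual algorithms also evolve the architecture itself.

```python
# Minimal sketch: evolving the weights of a fixed-topology MLP with a
# (mu + lambda) evolution strategy. Illustrative only, not the paper's
# exact hybrid EC/ANN algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: y = sin(x), one input, one output.
X = np.linspace(-3, 3, 64).reshape(-1, 1)
y = np.sin(X)

N_HIDDEN = 8
N_WEIGHTS = N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # weights and biases

def forward(w, x):
    """Fixed-topology 1-8-1 MLP with tanh hidden units."""
    w1 = w[:N_HIDDEN].reshape(1, N_HIDDEN)
    b1 = w[N_HIDDEN:2 * N_HIDDEN]
    w2 = w[2 * N_HIDDEN:3 * N_HIDDEN].reshape(N_HIDDEN, 1)
    b2 = w[-1]
    return np.tanh(x @ w1 + b1) @ w2 + b2

def fitness(w):
    """Negative mean squared error: higher is better."""
    return -np.mean((forward(w, X) - y) ** 2)

# Keep the best mu parents each generation, mutate to get lambda children.
mu, lam, sigma = 10, 40, 0.1
pop = rng.normal(0.0, 1.0, size=(mu, N_WEIGHTS))
for gen in range(200):
    children = np.repeat(pop, lam // mu, axis=0)
    children += rng.normal(0.0, sigma, size=children.shape)
    both = np.vstack([pop, children])
    scores = np.array([fitness(w) for w in both])
    pop = both[np.argsort(scores)[-mu:]]  # survivors, best last

print("final MSE:", -fitness(pop[-1]))
```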

    Morphological granulometry for classification of evolving and ordered texture images

    In this work we investigate the use of morphological granulometric moments as texture descriptors to predict the time or class of texture images which evolve over time or follow an intrinsic ordering of textures. A cubic polynomial regression was used to model each of several granulometric moments as a function of time or class; these models are then combined and used to predict time or class. The methodology was developed on synthetic images of evolving textures and then successfully applied to place a sequence of corrosion images on an evolution time scale. Classification performance of the new regression approach is compared to that of linear discriminant analysis, neural networks and support vector machines. We also apply our method to images of black tea leaves, which are ordered according to granule size, attaining very high classification accuracy compared to existing published results for these images. It was also found that granulometric moments provide much improved classification compared to grey-level co-occurrence features for shape-based texture images.
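    To make the pipeline concrete, here is a minimal sketch of granulometric moments followed by the cubic-regression step, using morphological openings from scipy. The structuring-element sizes, choice of moments, and synthetic textures are assumptions for brevity, not the paper's exact settings.

```python
# Illustrative sketch: granulometric moments as texture descriptors, with
# a cubic polynomial fit of one moment against time. Assumed settings.
import numpy as np
from scipy.ndimage import grey_opening, grey_dilation

def granulometric_moments(image, max_size=10, n_moments=3):
    """Pattern spectrum from openings of increasing size, then its moments."""
    sums = [image.sum()]
    for s in range(1, max_size + 1):
        sums.append(grey_opening(image, size=(2 * s + 1, 2 * s + 1)).sum())
    spectrum = -np.diff(sums)             # image "mass" removed at each scale
    spectrum = spectrum / spectrum.sum()  # normalise to a size distribution
    sizes = np.arange(1, max_size + 1)
    mean = float(np.sum(sizes * spectrum))
    moments = [mean] + [float(np.sum((sizes - mean) ** k * spectrum))
                        for k in range(2, n_moments + 1)]
    return np.array(moments)

# Synthetic "evolving" textures: granules grow with time t.
rng = np.random.default_rng(0)
base = rng.random((64, 64))
times = np.arange(10)
images = [grey_dilation(base, size=(t + 1, t + 1)) for t in times]

# Cubic polynomial model of the first moment as a function of time.
m1 = np.array([granulometric_moments(im)[0] for im in images])
coeffs = np.polyfit(times, m1, deg=3)

# Predict time: pick t whose modelled moment best matches the observed one.
observed = granulometric_moments(images[4])[0]
grid = np.linspace(times.min(), times.max(), 200)
t_hat = grid[np.argmin(np.abs(np.polyval(coeffs, grid) - observed))]
print("predicted time:", round(float(t_hat), 2))
```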

    Bibliometric Mapping of the Computational Intelligence Field

    In this paper, a bibliometric study of the computational intelligence field is presented. Bibliometric maps showing the associations between the main concepts in the field are provided for the periods 1996–2000 and 2001–2005. Both the current structure of the field and the evolution of the field over the last decade are analyzed. In addition, a number of emerging areas in the field are identified. It turns out that computational intelligence can best be seen as a field that is structured around four important types of problems, namely control problems, classification problems, regression problems, and optimization problems. Within the computational intelligence field, the neural networks and fuzzy systems subfields are fairly intertwined, whereas the evolutionary computation subfield has a relatively independent position.
    Keywords: neural networks; bibliometric mapping; fuzzy systems; bibliometrics; computational intelligence; evolutionary computation
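    Bibliometric maps of this kind are typically derived from co-occurrence counts of terms across documents. The sketch below shows only that counting step, on invented placeholder term sets rather than the study's corpus.

```python
# Minimal sketch of the co-occurrence counting behind a bibliometric map;
# the documents and terms below are placeholders, not the study's data.
from itertools import combinations
from collections import Counter

docs = [
    {"neural networks", "classification", "fuzzy systems"},
    {"evolutionary computation", "optimization"},
    {"neural networks", "fuzzy systems", "control"},
]

cooc = Counter()
for terms in docs:
    for a, b in combinations(sorted(terms), 2):
        cooc[(a, b)] += 1

# Term pairs with high counts end up close together on the map.
for pair, n in cooc.most_common(3):
    print(pair, n)
```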

    Evolino for recurrent support vector machines

    Traditional Support Vector Machines (SVMs) need pre-wired finite time windows to predict and classify time series. They lack the internal state necessary to deal with sequences involving arbitrarily long-term dependencies. Here we introduce a new class of recurrent, truly sequential SVM-like devices with internal adaptive states, trained by a novel method called EVOlution of systems with KErnel-based outputs (Evoke), an instance of the recent Evolino class of methods. Evoke evolves recurrent neural networks to detect and represent temporal dependencies while using quadratic programming/support vector regression to produce precise outputs. Evoke is the first SVM-based mechanism to learn to classify a context-sensitive language. It also outperforms recent state-of-the-art gradient-based recurrent neural networks (RNNs) on various time series prediction tasks.
    Comment: 10 pages, 2 figures
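    The core idea, sketched below under simplifying assumptions, is a recurrent network supplying temporal features to a support vector regressor. The recurrent weights here are left random for brevity, whereas Evoke evolves them.

```python
# Sketch of the Evoke idea: recurrent hidden states as temporal features,
# support vector regression as the output layer. The random (unevolved)
# recurrent weights are a simplification.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
N_HIDDEN = 20

W_in = rng.normal(0, 0.5, (1, N_HIDDEN))
W_rec = rng.normal(0, 0.5, (N_HIDDEN, N_HIDDEN))

def hidden_states(series):
    """Run the RNN over a 1-D series, returning one state per step."""
    h = np.zeros(N_HIDDEN)
    states = []
    for x in series:
        h = np.tanh(x * W_in[0] + W_rec.T @ h)
        states.append(h.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a noisy sine wave.
t = np.linspace(0, 8 * np.pi, 400)
series = np.sin(t) + 0.05 * rng.normal(size=t.size)

H = hidden_states(series[:-1])   # features at each step
y = series[1:]                   # next value to predict

svr = SVR(kernel="rbf", C=10.0).fit(H[:300], y[:300])
pred = svr.predict(H[300:])
print("test MSE:", np.mean((pred - y[300:]) ** 2))
```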

    Small-variance asymptotics for Bayesian neural networks

    Bayesian neural networks (BNNs) are a rich and flexible class of models with several advantages over standard feedforward networks, but they are typically expensive to train on large-scale data. In this thesis, we explore the use of small-variance asymptotics, an approach to deriving fast algorithms from probabilistic models, on various Bayesian neural network models. We first demonstrate how small-variance asymptotics reveals precise connections between standard neural networks and BNNs; for example, particular sampling algorithms for BNNs reduce to standard backpropagation in the small-variance limit. We then explore a more complex BNN where the number of hidden units is additionally treated as a random variable in the model. While standard sampling schemes would be too slow to be practical, our asymptotic approach yields a simple method for extending standard backpropagation to the case where the number of hidden units is not fixed. We show on several data sets that the resulting algorithm has benefits over backpropagation on networks with a fixed architecture.
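    A minimal illustration of the limit being exploited, under assumed simplifications (a one-parameter model, quadratic loss, and a tempered Langevin sampler): as the noise temperature goes to zero, the sampling update reduces to plain gradient descent, i.e. standard backpropagation.

```python
# Minimal illustration of the small-variance limit: a tempered Langevin
# update on a loss reduces to ordinary gradient descent as the temperature
# goes to zero. The one-parameter "network" is an assumption for brevity.
import numpy as np

rng = np.random.default_rng(0)

def loss_grad(theta, x, y):
    """Gradient of the squared error for the model y_hat = theta * x."""
    return 2.0 * x * (theta * x - y)

x, y = 1.5, 3.0   # single data point; the loss is minimised at theta = 2
eps = 0.01        # step size

def langevin_run(temperature, steps=2000):
    """Gradient step plus injected noise scaled by the temperature."""
    theta = 0.0
    for _ in range(steps):
        noise = rng.normal(0.0, np.sqrt(2.0 * eps * temperature))
        theta = theta - eps * loss_grad(theta, x, y) + noise
    return theta

# At T = 0 the noise vanishes and the update is deterministic descent.
for T in (1.0, 0.01, 0.0):
    print(f"T={T}: theta = {langevin_run(T):.3f}")
```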