
    Fuzzy heterogeneous neural networks for signal forecasting

    Fuzzy heterogeneous neural networks are recently introduced models based on neurons that accept heterogeneous inputs (i.e. mixtures of numerical and non-numerical information, possibly with missing data) of either crisp or imprecise character, and that can be coupled with classical neurons. This paper compares the effectiveness of this kind of network with time-delay and recurrent architectures that use classical neuron models and training algorithms on a signal forecasting problem, in the context of finding models of central nervous system controllers. Peer reviewed. Postprint (author's final draft).
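    The heterogeneous neuron described above can be sketched in a few lines. This is a minimal illustration only: the particular similarity measures, the logistic squashing, and the convention of skipping missing values are assumptions for the sketch, not the paper's exact formulation.

    ```python
    import math

    def partial_similarity(x, w):
        """Similarity in [0, 1] between one input and one weight, each of
        which may be numeric or categorical; callers skip missing values."""
        if isinstance(x, (int, float)) and isinstance(w, (int, float)):
            return 1.0 / (1.0 + abs(x - w))   # numeric: closer means more similar
        return 1.0 if x == w else 0.0         # categorical: exact match

    def heterogeneous_neuron(inputs, weights, k=4.0):
        """Average the partial similarities over the observed (non-missing)
        inputs, then squash the mean with a logistic function."""
        sims = [partial_similarity(x, w)
                for x, w in zip(inputs, weights) if x is not None]
        mean_sim = sum(sims) / len(sims)
        return 1.0 / (1.0 + math.exp(-k * (mean_sim - 0.5)))

    # A mixed input vector with one missing value (None):
    out = heterogeneous_neuron([0.7, "red", None], [0.5, "red", 1.0])
    ```

    Because each variable contributes through its own partial similarity, numeric, categorical and missing entries can coexist in one input vector without any ad hoc encoding.
    
    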

    Clustering outdoor soundscapes using fuzzy ants

    A classification algorithm for environmental sound recordings or "soundscapes" is outlined. An ant clustering approach is proposed, in which the behavior of the ants is governed by fuzzy rules. These rules are optimized by a genetic algorithm specially designed to achieve an optimal set of homogeneous clusters. Soundscape similarity is expressed as fuzzy resemblance of the shape of the sound pressure level histogram, the frequency spectrum and the spectrum of temporal fluctuations, which represent the loudness, spectral content and temporal content of the soundscapes. Compared to traditional clustering methods, the advantages of this approach are that no a priori information, such as the desired number of clusters, is needed, and that a flexible set of soundscape measures can be used. The clustering algorithm was applied to a set of 1116 acoustic measurements in 16 urban parks of Stockholm. The resulting clusters were validated against visitors' perceptual measurements of soundscape quality.
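    The notion of fuzzy resemblance between histogram shapes can be sketched with a simple intersection-over-union of normalized histograms. The measure and the example bin values are illustrative assumptions; the paper's actual resemblance definition may differ.

    ```python
    def fuzzy_resemblance(hist_a, hist_b):
        """Fuzzy similarity of two normalized histograms: the sum of
        bin-wise minima divided by the sum of bin-wise maxima, i.e. the
        fuzzy intersection over the fuzzy union of the two shapes."""
        num = sum(min(a, b) for a, b in zip(hist_a, hist_b))
        den = sum(max(a, b) for a, b in zip(hist_a, hist_b))
        return num / den if den else 1.0

    # Two hypothetical sound pressure level histograms (normalized to sum to 1):
    park_a = [0.1, 0.4, 0.3, 0.2]
    park_b = [0.2, 0.3, 0.3, 0.2]
    r = fuzzy_resemblance(park_a, park_b)
    ```

    A measure of this form returns 1 for identical shapes and decreases as the histograms diverge, which is the kind of graded similarity an ant-based clustering can act on without fixing the number of clusters in advance.
    
    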

    Similarity networks for classification: a case study in the Horse Colic problem

    This paper develops a two-layer neural network in which the neuron model computes a user-defined similarity function between inputs and weights. The neuron transfer function is formed by composing an adapted logistic function with the mean of the partial input-weight similarities. The resulting neuron model can deal directly with variables of potentially different nature (continuous, fuzzy, ordinal, categorical), and provision is made for missing values. The network is trained using a two-stage procedure very similar to that used to train a radial basis function (RBF) neural network. The network is compared to two types of RBF networks on a non-trivial dataset, the Horse Colic problem, taken as a case study and analyzed in detail. Postprint (published version).
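    The two-layer forward pass described above can be sketched as a hidden layer of similarity neurons feeding a linear output layer, in analogy with an RBF network. The partial similarity functions, the logistic parameters and the example prototypes are assumptions for illustration, not the paper's exact choices.

    ```python
    import math

    def sim(x, w):
        # Partial similarity for one variable (continuous or categorical).
        if isinstance(x, (int, float)) and isinstance(w, (int, float)):
            return max(0.0, 1.0 - abs(x - w))
        return 1.0 if x == w else 0.0

    def hidden_unit(x, prototype, k=2.0):
        """Adapted logistic composed with the mean partial similarity
        between the input vector and a stored prototype."""
        s = sum(sim(a, b) for a, b in zip(x, prototype)) / len(prototype)
        return 1.0 / (1.0 + math.exp(-k * (s - 0.5)))

    def forward(x, prototypes, out_weights, bias=0.0):
        # Linear output layer over the hidden similarity activations,
        # as in the second stage of RBF-style training.
        h = [hidden_unit(x, p) for p in prototypes]
        return bias + sum(w * hi for w, hi in zip(out_weights, h))

    # Two prototypes over a mixed (numeric, categorical) input space:
    y = forward([0.3, "yes"], [[0.2, "yes"], [0.9, "no"]], [1.0, -1.0])
    ```

    In an RBF-like two-stage scheme, the prototypes would first be placed (e.g. by clustering the training data) and the output weights then fitted linearly; the sketch only shows the resulting forward computation.
    
    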