4,138 research outputs found

    Self-Organizing Time Map: An Abstraction of Temporal Multivariate Patterns

    This paper adopts and adapts Kohonen's standard Self-Organizing Map (SOM) for exploratory temporal structure analysis. The Self-Organizing Time Map (SOTM) applies SOM-type learning to one-dimensional arrays for individual time units, preserves the orientation with short-term memory, and arranges the arrays in ascending order of time. The two-dimensional representation of the SOTM thus attempts a twofold topology preservation: the horizontal direction preserves time topology and the vertical direction preserves data topology. This enables discovering the occurrence and exploring the properties of temporal structural changes in data. For representing qualities and properties of SOTMs, we adapt measures and visualizations from the standard SOM paradigm and introduce a measure of temporal structural changes. The functioning of the SOTM, its visualizations, and its quality and property measures are illustrated on artificial toy data. The usefulness of the SOTM in a real-world setting is shown on poverty, welfare and development indicators.
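
    The scheme above lends itself to a compact sketch. The following Python fragment is only a minimal illustration of the idea, not the authors' implementation: the function names, the fixed learning rate and neighborhood width, and the random initialization of the first time unit are assumptions, and the paper's quality measures and visualizations are omitted.

```python
import numpy as np

def train_1d_som(data, codebook, n_epochs=10, lr=0.5, sigma=1.0):
    """Plain SOM-type learning on a one-dimensional array of units."""
    positions = np.arange(codebook.shape[0])
    for _ in range(n_epochs):
        for x in data:
            bmu = np.argmin(np.linalg.norm(codebook - x, axis=1))
            h = np.exp(-(positions - bmu) ** 2 / (2 * sigma ** 2))
            codebook += lr * h[:, None] * (x - codebook)
    return codebook

def sotm(data_by_time, n_units, dim, rng=None):
    """Self-Organizing Time Map sketch: one 1-D array per time unit, each
    initialized from the previous unit's codebook (short-term memory) and
    arranged in ascending order of time."""
    if rng is None:
        rng = np.random.default_rng(0)
    codebook = rng.normal(size=(n_units, dim))   # initialization for the first time unit
    maps = []
    for data_t in data_by_time:                  # ascending order of time
        codebook = train_1d_som(np.asarray(data_t, dtype=float), codebook.copy())
        maps.append(codebook.copy())
    return np.stack(maps)                        # (time units, vertical units, variables)
```

    Comparing the arrays of adjacent time units in the result is then one way to flag temporal structural changes.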

    Auto-SOM: recursive parameter estimation for guidance of self-organizing feature maps

    An important technique for exploratory data analysis is to form a mapping from the high-dimensional data space to a low-dimensional representation space such that neighborhoods are preserved. A popular method for achieving this is Kohonen's self-organizing map (SOM) algorithm. However, in its original form, it requires the user to choose the values of several parameters heuristically to achieve good performance. Here we present the Auto-SOM, an algorithm that estimates the learning parameters automatically during the training of SOMs. The application of Auto-SOM makes it possible to avoid neighborhood violations up to a user-defined degree in either mapping direction. Auto-SOM consists of a Kalman filter implementation of the SOM coupled with a recursive parameter estimation method. The Kalman filter trains the neurons' weights with estimated learning coefficients so as to minimize the variance of the estimation error. The recursive parameter estimation method estimates the width of the neighborhood function by minimizing the prediction error variance of the Kalman filter. In addition, the "topographic function" is incorporated to measure neighborhood violations and to prevent the map from converging to configurations with neighborhood violations. It is demonstrated that neighborhoods can be preserved in both mapping directions, as desired for dimension-reducing applications. The development of neighborhood-preserving maps and their convergence behavior is demonstrated by three examples accounting for the basic applications of self-organizing feature maps.
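
    A rough sketch of the filtered weight update is given below, under explicit assumptions: each neuron carries a scalar error variance, the Kalman gain is modulated by the neighborhood function, and the neighborhood width sigma is passed in as a fixed value. The recursive estimator of sigma (minimizing the filter's prediction-error variance) and the topographic-function check described above are omitted, and all names are illustrative.

```python
import numpy as np

def auto_som_step(x, W, P, sigma, grid, Q=1e-4, R=1.0):
    """One Kalman-filtered SOM update for a single input vector x.

    W    : (n_units, dim) weight estimates
    P    : (n_units,) scalar estimation-error variances
    grid : (n_units, grid_dim) unit positions on the map lattice
    """
    bmu = np.argmin(np.linalg.norm(W - x, axis=1))
    d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
    h = np.exp(-d2 / (2 * sigma ** 2))        # neighborhood function
    innovation = x - W                        # per-neuron prediction errors
    K = P / (P + R)                           # scalar Kalman gains
    W += (h * K)[:, None] * innovation        # filtered weight update
    P = (1.0 - h * K) * P + Q                 # variance update with process noise Q
    return W, P, bmu, innovation
```

    In the full Auto-SOM the prediction-error variance of the filter drives the recursive re-estimation of sigma; here that loop is left out.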

    Image inpainting based on self-organizing maps by using multi-agent implementation

    Image inpainting is a well-known visual editing task. However, its efficiency strongly depends on the size and the textural neighborhood of the “missing” area. Various methods of image inpainting exist, among which the Kohonen Self-Organizing Map (SOM) network, as a means of unsupervised learning, is widely used. The weaknesses of the Kohonen SOM network, such as the need to tune the algorithm parameters and its low computational speed, motivated the application of a multi-agent system with a multi-mapping possibility and parallel processing by identical agents. Experiments showed that preliminary image segmentation and the creation of a SOM for each type of homogeneous texture provide better results than the classical SOM application. The optimal number of inpainting agents was also determined. The quality of inpainting was estimated by several metrics, and good results were obtained on complex images.
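
    As a rough illustration of the multi-agent filling step, the sketch below assumes that image patches are flattened vectors with NaN marking missing pixels, that preliminary segmentation has assigned each patch a texture class, and that one SOM codebook per texture has already been trained. The function names and the process-pool "agents" are illustrative, not the paper's implementation.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def inpaint_patch(job):
    """Agent task: fill a patch from the best-matching codebook vector
    of the SOM trained for that patch's texture class."""
    patch, texture_id, codebooks = job
    known = ~np.isnan(patch)                           # observed pixels
    book = codebooks[texture_id]                       # (n_units, patch_len) codebook
    d = np.sum((book[:, known] - patch[known]) ** 2, axis=1)
    best = book[np.argmin(d)]                          # best-matching unit
    filled = patch.copy()
    filled[~known] = best[~known]                      # copy in the missing pixels
    return filled

def multi_agent_inpaint(patches, texture_ids, codebooks, n_agents=4):
    """Distribute the patches over identical agents running in parallel."""
    jobs = [(p, t, codebooks) for p, t in zip(patches, texture_ids)]
    with ProcessPoolExecutor(max_workers=n_agents) as pool:
        return list(pool.map(inpaint_patch, jobs))
```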

    SARDSRN: A NEURAL NETWORK SHIFT-REDUCE PARSER

    Simple Recurrent Networks (SRNs) have been widely used in natural language tasks. SARDSRN extends the SRN by explicitly representing the input sequence in a SARDNET self-organizing map. The distributed SRN component leads to good generalization and robust cognitive properties, whereas the SARDNET map provides exact representations of the sentence constituents. This combination allows SARDSRN to learn to parse sentences with more complicated structure than the SRN alone can, and suggests that the approach could scale up to realistic natural language.
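
    The SARDNET component can be sketched as follows (the SRN parser and all training are omitted, and the decay factor and names are assumptions). The point is only the mechanism: each constituent activates its best-matching unit, that unit is withdrawn from further competition, and earlier activations decay, so the map encodes both the identity and the order of the constituents.

```python
import numpy as np

def sardnet_encode(sequence, codebook, decay=0.9):
    """SARDNET-style encoding of an input sequence on a trained codebook.
    Assumes the sequence is no longer than the number of map units."""
    n_units = codebook.shape[0]
    activation = np.zeros(n_units)
    available = np.ones(n_units, dtype=bool)
    for x in sequence:
        d = np.linalg.norm(codebook - x, axis=1)
        d[~available] = np.inf           # past winners stay out of the competition
        winner = np.argmin(d)
        activation *= decay              # older constituents fade
        activation[winner] = 1.0         # current constituent gets full activation
        available[winner] = False
    return activation                    # order-sensitive code of the sentence
```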

    Sleep Spindle Detection by Using Merge Neural Gas

    In this paper the Merge Neural Gas (MNG) model is applied to detect sleep spindles in EEG. Features are extracted from windows of the EEG by using the short-time Fourier transform. The total power spectrum is computed in six frequency bands and used as input to the MNG network. The results show that MNG outperforms the simple neural gas in correctly detecting sleep spindles. In addition, the temporal quantization results as well as sleep trajectories are visualized on two-dimensional maps by using the OVING projection method.
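
    The feature extraction described above can be sketched with SciPy's short-time Fourier transform. The band edges, sampling rate, and window length below are placeholders rather than the paper's values, and the MNG network itself is not shown.

```python
import numpy as np
from scipy.signal import stft

# Placeholder band edges in Hz; the paper's six bands are not reproduced here.
BANDS = [(0.5, 4.0), (4.0, 8.0), (8.0, 10.0), (10.0, 12.0), (12.0, 14.0), (14.0, 16.0)]

def band_power_features(eeg, fs=100.0, nperseg=128):
    """Total spectral power per band for each STFT window of an EEG channel."""
    f, _, Z = stft(eeg, fs=fs, nperseg=nperseg)
    power = np.abs(Z) ** 2                      # (n_freqs, n_windows) spectrogram
    feats = [power[(f >= lo) & (f < hi)].sum(axis=0) for lo, hi in BANDS]
    return np.stack(feats, axis=1)              # (n_windows, 6) inputs for the MNG
```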

    Dynamics and topographic organization of recursive self-organizing maps

    Recently there has been an outburst of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, there is no general consensus as to how best to process sequences using topographic maps, and this topic remains an active focus of neurocomputational research. The representational capabilities and internal representations of the models are not well understood. Here, we rigorously analyze a generalization of the self-organizing map (SOM) for processing sequential data, recursive SOM (RecSOM) (Voegtlin, 2002), as a nonautonomous dynamical system consisting of a set of fixed input maps. We argue that contractive fixed-input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map. We derive bounds on parameter β (weighting the importance of importing past information when processing sequences) under which contractiveness of the fixed-input maps is guaranteed. Some generalizations of SOM contain a dynamic module responsible for processing temporal contexts as an integral part of the model. We show that Markovian topographic maps of sequential data can be produced using a simple fixed (nonadaptable) dynamic module externally feeding a standard topographic model designed to process static vectorial data of fixed dimensionality (e.g., SOM). However, by allowing trainable feedback connections, one can obtain Markovian maps with superior memory depth and topography preservation. We elaborate on the importance of non-Markovian organizations in topographic maps of sequential data. © 2006 Massachusetts Institute of Technology
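
    For reference, RecSOM's fixed-input dynamics can be sketched as below, assuming already-trained input weights W and context weights C; the values of alpha and beta and the function names are illustrative, and the training rule is omitted. Sufficiently small beta keeps the fixed-input maps contractive, which is the regime the bounds above address.

```python
import numpy as np

def recsom_activation(x, y_prev, W, C, alpha=1.0, beta=0.5):
    """One step of the RecSOM map: activations depend on the current input x
    and on the previous activation vector y_prev (the temporal context)."""
    d_in = np.sum((W - x) ** 2, axis=1)          # input distances
    d_ctx = np.sum((C - y_prev) ** 2, axis=1)    # context distances
    return np.exp(-alpha * d_in - beta * d_ctx)  # new activation vector y(t)

def process_sequence(seq, W, C, alpha=1.0, beta=0.5):
    """Iterate the fixed-input maps over a sequence, starting from a zero context."""
    y = np.zeros(W.shape[0])
    for x in seq:
        y = recsom_activation(x, y, W, C, alpha, beta)
    return y
```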

    SOME REMARKS ON THE SELF-ORGANIZING FEATURE MAPS

    Joint Research on Environmental Science and Technology for the Earth