
    A self-organising mixture network for density modelling

    A completely unsupervised mixture distribution network, namely the self-organising mixture network, is proposed for learning arbitrary density functions. The algorithm minimises the Kullback-Leibler information by means of stochastic approximation methods. The density functions are modelled as mixtures of parametric distributions such as Gaussian and Cauchy. The first layer of the network is similar to Kohonen's self-organising map (SOM), but with the parameters of the class conditional densities as the learning weights. The winning mechanism is based on maximum posterior probability, and the updating of weights can be limited to a small neighbourhood around the winner. The second layer accumulates the responses of these local nodes, weighted by the learnt mixing parameters. The network has a simple structure and computation, yet yields fast and robust convergence. Experimental results are also presented.
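
    The training loop described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's code: the one-dimensional map, the Gaussian-only components, the learning-rate schedule, and the one-step neighbourhood are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 10                          # number of map nodes / mixture components
mu = np.linspace(-3.0, 3.0, K)  # component means (the "weights")
sigma = np.ones(K)              # component standard deviations
pi = np.full(K, 1.0 / K)        # mixing parameters

def gaussian(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

# toy target density: a mixture of two Gaussians
data = np.concatenate([rng.normal(-2.0, 0.5, 2000), rng.normal(1.5, 0.8, 2000)])

for t, x in enumerate(data):
    eta = 0.5 / (1.0 + 0.01 * t)                 # decaying learning rate
    post = pi * gaussian(x, mu, sigma)
    post /= post.sum()                           # posterior responsibilities
    win = int(np.argmax(post))                   # maximum-posterior winner
    lo, hi = max(0, win - 1), min(K, win + 2)    # small neighbourhood on the map
    for k in range(lo, hi):
        mu[k] += eta * post[k] * (x - mu[k])
        sigma[k] = np.sqrt(sigma[k] ** 2 + eta * post[k] * ((x - mu[k]) ** 2 - sigma[k] ** 2))
    pi += eta * (post - pi)                      # stochastic update of mixing weights
    pi /= pi.sum()

def density(x):
    # second layer: accumulate node responses weighted by the mixing parameters
    return float(np.sum(pi * gaussian(x, mu, sigma)))
```

    At each step the maximum-posterior winner and its immediate map neighbours move toward the sample, while the mixing weights track the posterior responsibilities; the second layer is simply the weighted sum computed in `density`.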

    Towards the optimal Bayes classifier using an extended self-organising map

    In this paper, we propose an extended self-organising learning scheme in which both the distance measure and the neighbourhood function are replaced by the neurons' posterior probabilities. Weights are updated within a limited, fixed-size neighbourhood of the winner. Each unit converges to one component of a mixture distribution of the input samples, so that an optimal pattern classifier can be formed. The proposed learning scheme can also be used to train other forms of unsupervised networks, such as radial-basis-function networks. An application example on textured image segmentation is presented.

    Self-Organising Networks for Classification: Developing Applications to Science Analysis for Astroparticle Physics

    Physics analysis in astroparticle experiments requires the capability of recognising new phenomena; in order to establish what is new, it is important to develop tools for automatic classification that can compare the final result with data from different detectors. A typical example is the problem of Gamma Ray Burst detection, classification, and possible association with known sources: for this task, physicists will in the coming years need tools to associate data from optical databases, from satellite experiments (EGRET, GLAST), and from Cherenkov telescopes (MAGIC, HESS, CANGAROO, VERITAS).

    Neural networks for gamma-hadron separation in MAGIC

    Neural networks have proved to be versatile and robust for particle separation in many experiments related to particle astrophysics. We apply these techniques to separate gamma rays from hadrons for the MAGIC Cerenkov Telescope. Two types of neural network architectures have been used for the classification task: one is the MultiLayer Perceptron (MLP), based on supervised learning, and the other is the Self-Organising Tree Algorithm (SOTA), which is based on unsupervised learning. We propose a new architecture combining these two neural network types to yield better and faster results for our classification problem.
    Comment: 6 pages, 4 figures, to be published in the Proceedings of the 6th International Symposium ''Frontiers of Fundamental and Computational Physics'' (FFP6), Udine (Italy), Sep. 26-29, 200
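
    For the supervised half of such a scheme, a minimal MLP separator trained by gradient descent might look like the sketch below. The two input features and the circular decision boundary are invented stand-ins for real image parameters; nothing here is taken from the MAGIC analysis.

```python
import numpy as np

rng = np.random.default_rng(3)

# synthetic two-feature events: "signal" inside a circle, "background" outside
n = 1000
X = rng.standard_normal((n, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 1.0).astype(float)

W1 = 0.5 * rng.standard_normal((2, 16)); b1 = np.zeros(16)
W2 = 0.5 * rng.standard_normal((16, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta = 1.0
for _ in range(2000):                      # full-batch gradient descent
    h = np.tanh(X @ W1 + b1)               # hidden layer
    p = sigmoid(h @ W2 + b2).ravel()       # predicted signal probability
    g_out = (p - y)[:, None] / n           # cross-entropy output gradient
    g_hid = (g_out @ W2.T) * (1.0 - h ** 2)
    W2 -= eta * h.T @ g_out; b2 -= eta * g_out.sum(axis=0)
    W1 -= eta * X.T @ g_hid; b1 -= eta * g_hid.sum(axis=0)

acc = float(((p > 0.5) == (y > 0.5)).mean())
```

    In practice, the unsupervised SOTA stage would pre-cluster the events and the MLP would then refine the gamma/hadron decision on the clustered representation.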

    ASPECT: A spectra clustering tool for exploration of large spectral surveys

    We present the novel, semi-automated clustering tool ASPECT for analysing voluminous archives of spectra. The heart of the program is a neural network in the form of Kohonen's self-organizing map. The resulting map is designed as an icon map suitable for inspection by eye. The visual analysis is supported by the option to blend in individual object properties such as redshift, apparent magnitude, or signal-to-noise ratio. In addition, the package provides several tools for the selection of special spectral types, e.g. local difference maps which reflect the deviations of all spectra from one given input spectrum (real or artificial). ASPECT is able to produce a two-dimensional topological map of a huge number of spectra. The software package enables the user to browse and navigate through a huge data pool, helps them to gain insight into underlying relationships between the spectra and other physical properties, and conveys the big picture of the entire data set. We demonstrate the capability of ASPECT by clustering the entire data pool of 0.6 million spectra from Data Release 4 of the Sloan Digital Sky Survey (SDSS). To illustrate the results regarding quality and completeness, we track objects from existing catalogues of quasars and carbon stars, respectively, and connect the SDSS spectra with morphological information from the GalaxyZoo project.
    Comment: 15 pages, 14 figures; accepted for publication in Astronomy and Astrophysics
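
    The core of such a tool, a plain Kohonen SOM arranging spectra on a two-dimensional grid, can be illustrated as follows. This is not ASPECT's code: the toy spectra, the 8x8 map size, and the decay schedules are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy "spectra": a single emission line at a random position plus noise
n_spec, n_bins = 500, 64
wave = np.linspace(0.0, 1.0, n_bins)
centres = rng.uniform(0.2, 0.8, n_spec)
spectra = np.exp(-0.5 * ((wave - centres[:, None]) / 0.03) ** 2)
spectra += 0.05 * rng.standard_normal((n_spec, n_bins))

side = 8                                       # 8x8 map
grid = np.stack(np.meshgrid(np.arange(side), np.arange(side)), -1).reshape(-1, 2)
W = 0.1 * rng.standard_normal((side * side, n_bins))

n_iter = 4000
for t in range(n_iter):
    x = spectra[rng.integers(n_spec)]
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))        # best-matching unit
    frac = t / n_iter
    radius = side / 2 * (1 - frac) + 0.5               # shrinking neighbourhood
    eta = 0.5 * (1 - frac) + 0.01                      # decaying learning rate
    d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
    h = np.exp(-d2 / (2 * radius ** 2))                # neighbourhood kernel
    W += eta * h[:, None] * (x - W)

# map each spectrum onto its best-matching unit on the 2-D grid
assignments = np.argmin(((spectra[:, None, :] - W[None]) ** 2).sum(-1), axis=1)
```

    Spectra with similar line positions end up on nearby units, which is what makes the resulting icon map browsable by eye.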

    How Many Dissimilarity/Kernel Self Organizing Map Variants Do We Need?

    In numerous application contexts, data are too rich and too complex to be represented by numerical vectors. A general approach to extending machine learning and data mining techniques to such data is to rely on a dissimilarity or on a kernel that measures how different or similar two objects are. This approach has been used to define several variants of the Self Organizing Map (SOM). This paper reviews those variants using a common set of notations in order to outline the differences and similarities between them. It discusses the advantages and drawbacks of the variants, as well as the actual relevance of the dissimilarity/kernel SOM for practical applications.
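
    One family of variants covered by such reviews, often called the median SOM, restricts each prototype to be an observed data point, so training needs only a pairwise dissimilarity matrix. The sketch below is a hedged illustration of that idea; the batch update, radius schedule, and toy data are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# toy one-dimensional observations forming three clusters
data = np.concatenate([rng.normal(-3.0, 0.5, 40),
                       rng.normal(0.0, 0.5, 40),
                       rng.normal(3.0, 0.5, 40)])
n, K = data.size, 9
D = np.abs(data[:, None] - data[None, :])     # pairwise dissimilarity matrix
grid = np.arange(K)                           # 1-D map of K units
proto = rng.choice(n, size=K, replace=False)  # prototype indices into the data

for it in range(10):
    radius = max(0.5, 1.0 - it / 10.0)          # shrinking neighbourhood radius
    assign = np.argmin(D[:, proto], axis=1)     # best-matching unit per observation
    h = np.exp(-((grid[:, None] - grid[None, :]) ** 2) / (2.0 * radius ** 2))
    # generalised median step: each unit picks the data point minimising the
    # neighbourhood-weighted sum of dissimilarities to all observations
    cost = D @ h[assign]                        # cost[j, u] for candidate j, unit u
    proto = np.argmin(cost, axis=0)
```

    Because only `D` is ever consulted, the same loop applies unchanged to strings, graphs, or any objects for which a dissimilarity can be computed.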