Magnification Control in Winner Relaxing Neural Gas
An important goal in neural map learning, which can conveniently be
accomplished by magnification control, is to achieve information optimal coding
in the sense of information theory. In the present contribution we consider the
winner relaxing approach for the neural gas network. Originally, winner
relaxing learning is a slight modification of the self-organizing map learning
rule that allows for adjustment of the magnification behavior by an a priori
chosen control parameter. We transfer this approach to the neural gas
algorithm. The magnification exponent can be calculated analytically for
arbitrary dimension from a continuum theory, and the entropy of the resulting
map is studied numerically, confirming the theoretical prediction. The
influence of a diagonal term, which can be added without impacting the
magnification, is studied numerically. This approach to maps of maximal mutual
information is interesting for applications as the winner relaxing term only
adds computational cost of the same order and is easy to implement. In particular,
it is not necessary to estimate the generally unknown data probability density
as in other magnification control approaches. Comment: 14 pages, 2 figures
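The rank-based update at the heart of the method can be sketched as follows. The specific form of the winner relaxing correction below, and all parameter names, are schematic illustrations under assumed conventions, not the paper's exact rule:

```python
import numpy as np

def wrng_step(W, x, eps=0.05, lam=1.0, mu=0.0):
    """One (winner relaxing) neural gas update, applied in place.

    W   : (N, d) prototype matrix
    x   : (d,) input sample
    lam : range of the rank-based neighbourhood kernel
    mu  : winner relaxing control parameter (mu = 0 gives plain neural gas)
    """
    d2 = np.sum((W - x) ** 2, axis=1)
    ranks = np.argsort(np.argsort(d2))        # rank 0 = winner
    h = np.exp(-ranks / lam)                  # rank-based neighbourhood weights
    delta = eps * h[:, None] * (x - W)        # standard neural gas step
    winner = np.argmin(d2)
    # Schematic winner relaxing term: the winner additionally receives a
    # correction built from the weighted updates of the other prototypes,
    # scaled by the a priori control parameter mu (illustrative form only).
    others = np.arange(len(W)) != winner
    delta[winner] -= eps * mu * np.sum(h[others, None] * (x - W[others]), axis=0)
    W += delta
    return W
```

Setting mu = 0 recovers the plain neural gas rule, so the extra cost is indeed of the same order as the base algorithm.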
Adaptive Resonance Theory and Diffusion Maps for Clustering Applications in Pattern Analysis
Adaptive Resonance is primarily a theory that learning is regulated by resonance phenomena in neural circuits. Diffusion maps are a class of kernel methods on edge-weighted graphs. While each of these approaches has demonstrated success in image analysis on its own, their combination is particularly effective. These techniques are reviewed and some example applications are given.
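A minimal sketch of the diffusion map construction on a Gaussian-kernel graph; the kernel bandwidth, normalization, and parameter names below are illustrative assumptions:

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_components=2, t=1):
    """Minimal diffusion map embedding from a Gaussian-kernel graph.

    X : (n, d) data; returns an (n, n_components) embedding whose
    coordinates are eigenvectors of the graph's random-walk matrix,
    scaled by the t-th power of the eigenvalues (diffusion time t).
    """
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D2 / eps)                    # edge weights of the graph
    P = K / K.sum(axis=1, keepdims=True)     # row-normalised Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Drop the trivial constant eigenvector (eigenvalue 1), scale the rest.
    return (vals[1:n_components + 1] ** t) * vecs[:, 1:n_components + 1]
```

For well-separated clusters the leading nontrivial coordinate behaves like a cluster indicator, which is what makes the embedding useful for clustering.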
Dynamic Model to Assess Organisational Readiness during Information System Implementation
This paper presents a methodology for assessing an organisation’s readiness to implement an information system (IS). We use the technique of fuzzy cognitive maps, which draws on the theory of Neural Networks. The techniques comprising the model have been used in many applications, but their use in the planning of change management projects is relatively new. The paper explains the theory, presents a numerical example, and suggests practical uses of the model.
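A fuzzy cognitive map iterates concept activations through a signed causal weight matrix until they settle. A minimal sketch, assuming the common additive update with a sigmoid squashing function (not necessarily the paper's exact formulation):

```python
import numpy as np

def fcm_simulate(W, a0, steps=50):
    """Iterate a fuzzy cognitive map to a (near) steady state.

    W  : (n, n) causal weight matrix in [-1, 1], W[i, j] = influence of
         concept i on concept j
    a0 : (n,) initial concept activations in [0, 1]
    """
    f = lambda z: 1.0 / (1.0 + np.exp(-z))   # sigmoid squashing function
    a = np.asarray(a0, dtype=float)
    for _ in range(steps):
        a = f(a + W.T @ a)                   # common additive FCM update
    return a
```

For modest weights the update is a contraction, so the steady state, which is what a readiness assessment would read off, does not depend on the initial activations.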
OnionNet: Sharing Features in Cascaded Deep Classifiers
The focus of our work is speeding up evaluation of deep neural networks in
retrieval scenarios, where conventional architectures may spend too much time
on negative examples. We propose to replace a monolithic network with our novel
cascade of feature-sharing deep classifiers, called OnionNet, where subsequent
stages may add both new layers as well as new feature channels to the previous
ones. Importantly, intermediate feature maps are shared among classifiers,
so they do not need to be recomputed. To accomplish this, the
model is trained end-to-end in a principled way under a joint loss. We validate
our approach in theory and on a synthetic benchmark. As demonstrated in
three applications (patch matching, object detection, and image retrieval),
our cascade can operate significantly faster than both monolithic networks
and traditional cascades without sharing, at the cost of a marginal decrease
in precision. Comment: Accepted to BMVC 201
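The feature-sharing idea can be sketched with a toy two-stage cascade. All weights, shapes, and the rejection threshold below are hypothetical stand-ins; the actual OnionNet uses trained convolutional stages under a joint loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights for a two-stage cascade with shared features.
W1 = rng.normal(size=(8, 16))   # stage-1 feature extractor
w1 = rng.normal(size=16)        # stage-1 (cheap) classifier head
W2 = rng.normal(size=(16, 8))   # stage-2 adds new feature channels
w2 = rng.normal(size=16 + 8)    # stage-2 head sees shared + new channels

def cascade_score(x, threshold=0.0):
    """Score input x, rejecting easy negatives early.  Stage 2 reuses the
    stage-1 feature map instead of recomputing it, only appending new
    channels (the schematic 'onion' idea)."""
    f1 = np.maximum(0.0, x @ W1)             # shared intermediate features
    s1 = f1 @ w1
    if s1 < threshold:                       # cheap early rejection
        return s1, 1
    f2 = np.maximum(0.0, f1 @ W2)            # new channels built on shared f1
    s2 = np.concatenate([f1, f2]) @ w2       # head over shared + new channels
    return s2, 2
```

In retrieval scenarios most candidates exit at stage 1, so the expensive deeper computation runs only on the few promising ones.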
Kernel method for nonlinear Granger causality
Important information on the structure of complex systems, consisting of more
than one component, can be obtained by measuring to which extent the individual
components exchange information among each other. Such knowledge is needed to
reach a deeper comprehension of phenomena ranging from turbulent fluids to
neural networks, as well as complex physiological signals. The linear Granger
approach to detecting cause-effect relationships between time series has
emerged in recent years as a leading statistical technique to accomplish this task.
Here we generalize Granger causality to the nonlinear case using the theory of
reproducing kernel Hilbert spaces. Our method performs linear Granger causality
in the feature space of suitable kernel functions, assuming arbitrary degree of
nonlinearity. We develop a new strategy to cope with the problem of
overfitting, based on the geometry of reproducing kernel Hilbert spaces.
Applications to coupled chaotic maps and physiological data sets are presented. Comment: Revised version, accepted for publication in Physical Review Letters
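The idea of performing linear Granger causality in a kernel feature space can be sketched with kernel ridge regression as a stand-in predictor. The index below is a simplified illustration, not the authors' estimator, which adds a dedicated overfitting control based on RKHS geometry:

```python
import numpy as np

def kernel_granger(x, y, p=1, gamma=1.0, alpha=1e-3):
    """Does the past of x help predict y?  Fits kernel ridge regression of
    y[t] on lagged y alone and on lagged (y, x), and returns the relative
    drop in in-sample squared error (larger = more evidence that x drives y).
    """
    T = len(y)
    Zy = np.column_stack([y[i:T - p + i] for i in range(p)])           # past of y
    Zyx = np.column_stack([Zy] + [x[i:T - p + i] for i in range(p)])   # plus past of x
    target = y[p:]

    def krr_error(Z):
        D2 = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
        K = np.exp(-gamma * D2)                    # RBF kernel Gram matrix
        coef = np.linalg.solve(K + alpha * np.eye(len(K)), target)
        return np.mean((K @ coef - target) ** 2)

    e_y, e_yx = krr_error(Zy), krr_error(Zyx)
    return (e_y - e_yx) / e_y
```

Because the kernel is nonlinear, a purely nonlinear coupling such as y[t] = x[t-1]^2, invisible to linear Granger analysis, is detected.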
Geometric deep learning and equivariant neural networks
We survey the mathematical foundations of geometric deep learning, focusing on group equivariant and gauge equivariant neural networks. We develop gauge equivariant convolutional neural networks on arbitrary manifolds M using principal bundles with structure group K and equivariant maps between sections of associated vector bundles. We also discuss group equivariant neural networks for homogeneous spaces M = G/K, which are instead equivariant with respect to the global symmetry G on M. Group equivariant layers can be interpreted as intertwiners between induced representations of G, and we show their relation to gauge equivariant convolutional layers. We analyze several applications of this formalism, including semantic segmentation and object detection networks. We also discuss the case of spherical networks in great detail, corresponding to the case M = S^2 = SO(3)/SO(2). Here we emphasize the use of Fourier analysis involving Wigner matrices, spherical harmonics and Clebsch–Gordan coefficients for G = SO(3), illustrating the power of representation theory for deep learning.
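The defining equivariance property is easy to verify numerically for the cyclic rotation group C4. The global inner products below are a deliberately minimal stand-in for a full lifting convolution, chosen so the symmetry identity holds exactly:

```python
import numpy as np

def lift(x, psi):
    """C4 'lifting' layer: respond to all four 90-degree rotations of the
    filter psi, giving one response per group element.  Global inner
    products keep this toy example exact and dependency-free; a real
    layer would also correlate over translations."""
    return np.array([np.sum(x * np.rot90(psi, k)) for k in range(4)])

rng = np.random.default_rng(0)
x, psi = rng.normal(size=(6, 6)), rng.normal(size=(6, 6))

# Equivariance: rotating the input by 90 degrees cyclically permutes the
# group-indexed responses instead of scrambling them.
responses = lift(x, psi)
responses_rotated = lift(np.rot90(x, 1), psi)
```

This permutation-of-outputs behaviour is exactly the intertwiner property that group equivariant layers generalize to arbitrary G.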
Application of artificial neural network in market segmentation: A review on recent trends
Despite the significance of the Artificial Neural Network (ANN) algorithm to
market segmentation, there is a need for a comprehensive literature review and
a classification system to identify future trends in market segmentation
research. The present work is the first identifiable academic literature
review of the application of neural network based techniques to segmentation.
Our study provides an academic database of literature from the period
2000-2010 and proposes a classification scheme for the articles. One thousand
(1000) articles were identified, and around 100 relevant selected articles
were subsequently reviewed and classified based on the major focus of each
paper. Findings of this study indicate that ANN based applications are
receiving the most research attention, with self-organizing map based
applications second in use for segmentation. The models commonly combined
with ANN for market segmentation include data mining and intelligent systems.
Our analysis furnishes a roadmap to guide future research and to aid the
accumulation of knowledge on the application of ANN based techniques in
market segmentation. The present work will thus contribute to both industry
and academic research in business and marketing as a valuable knowledge
source on market segmentation and the future trend of ANN application in
segmentation. Comment: 24 pages, 7 figures, 3 tables
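Since self-organizing maps are highlighted above as a leading segmentation technique, a minimal SOM sketch may be useful. The grid size, learning rate, and lattice neighbourhood below are illustrative choices, not drawn from any reviewed article:

```python
import numpy as np

def train_som(X, grid=(3, 3), epochs=10, eps=0.5, sigma=1.0, seed=0):
    """Minimal self-organizing map for segmentation sketches.

    X : (n, d) feature matrix (e.g. customer attributes).  Returns
    (grid[0]*grid[1], d) prototype vectors; assigning each row of X to
    its nearest prototype yields the segments."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(grid[0] * grid[1], X.shape[1]))
    # 2-D coordinates of each unit on the map lattice.
    coords = np.array([(i, j) for i in range(grid[0])
                       for j in range(grid[1])], dtype=float)
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            bmu = np.argmin(np.sum((W - x) ** 2, axis=1))    # best matching unit
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))               # lattice neighbourhood
            W += eps * h[:, None] * (x - W)                  # pull units toward x
    return W
```

Distinct groups in the data end up claiming distinct map units, which is the segmentation read-out.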
Model Reduction and Neural Networks for Parametric PDEs
We develop a general framework for data-driven approximation of input-output maps between infinite-dimensional spaces. The proposed approach is motivated by the recent successes of neural networks and deep learning, in combination with ideas from model reduction. This combination results in a neural network approximation which, in principle, is defined on infinite-dimensional spaces and, in practice, is robust to the dimension of finite-dimensional approximations of these spaces required for computation. For a class of input-output maps, and suitably chosen probability measures on the inputs, we prove convergence of the proposed approximation methodology. Numerically we demonstrate the effectiveness of the method on a class of parametric elliptic PDE problems, showing convergence and robustness of the approximation scheme with respect to the size of the discretization, and compare our method with existing algorithms from the literature.
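The reduce-then-learn pattern described above can be sketched as follows. Here PCA provides the finite-dimensional reductions of both function spaces, and a random-feature network with closed-form readout stands in for the trained neural network; all names and parameters are illustrative assumptions:

```python
import numpy as np

def fit_pca_network(X, Y, r=3, width=32, lam=1e-6, seed=0):
    """Sketch of model reduction + learning between discretised functions.

    X : (n, dx) discretised input functions
    Y : (n, dy) corresponding discretised PDE solutions
    Returns predict(x) mapping new inputs to output fields, by composing
    PCA projection -> random-feature map -> PCA reconstruction."""
    rng = np.random.default_rng(seed)
    Ux = np.linalg.svd(X - X.mean(0), full_matrices=False)[2][:r]  # input basis
    Uy = np.linalg.svd(Y - Y.mean(0), full_matrices=False)[2][:r]  # output basis
    Zx = (X - X.mean(0)) @ Ux.T              # reduced input coefficients
    Zy = (Y - Y.mean(0)) @ Uy.T              # reduced output coefficients
    W1 = 0.1 * rng.normal(size=(r, width))   # fixed random hidden layer
    b = rng.normal(size=width)
    H = np.tanh(Zx @ W1 + b)                 # hidden features on reduced inputs
    # Closed-form ridge readout between the reduced coordinates.
    W2 = np.linalg.solve(H.T @ H + lam * np.eye(width), H.T @ Zy)

    def predict(x):
        z = (x - X.mean(0)) @ Ux.T
        return np.tanh(z @ W1 + b) @ W2 @ Uy + Y.mean(0)

    return predict
```

Because the learned map acts only on the r reduced coordinates, its size is independent of the discretisation dimensions dx and dy, which is the robustness property the abstract emphasizes.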