Modeling the Flux-Charge Relation of Memristor with Neural Network of Smooth Hinge Functions
The memristor was proposed to characterize the flux-charge relation. We propose a generalized flux-charge relation model of the memristor based on a neural network of smooth hinge functions, for which an effective identification algorithm exists. The representation capability of this model is theoretically guaranteed: any functional flux-charge relation of a memristor can be approximated by the model. We also give application examples showing that the model can approximate the flux-charge relations of an existing piecewise linear memristor model, a window function memristor model, and a physical memristor device.
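The abstract does not specify the form of the smooth hinge units, so the following is a minimal sketch under assumed definitions: a C^1 smooth hinge (zero on the left, linear on the right, with a quadratic blend near the threshold) and a one-layer weighted sum of such hinges, which can approximate a continuous one-dimensional flux-charge curve on a compact interval. The function names, the blend width `eps`, and the network layout are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def smooth_hinge(x, t=0.0, eps=0.5):
    """Smooth hinge at threshold t: 0 for x <= t - eps, linear (x - t) for
    x >= t + eps, with a C^1 quadratic blend of half-width eps in between.
    (Illustrative form; the paper's exact smooth hinge is not specified here.)"""
    z = x - t
    return np.where(z <= -eps, 0.0,
           np.where(z >= eps, z, (z + eps) ** 2 / (4 * eps)))

def hinge_network(x, thresholds, weights, bias=0.0):
    """One-hidden-layer model: a weighted sum of smooth hinges plus a bias.
    Away from the blend zones this is piecewise linear, the kind of model
    that can fit a functional flux-charge relation on a compact interval."""
    return bias + sum(w * smooth_hinge(x, t)
                      for w, t in zip(weights, thresholds))
```

For example, `hinge_network(phi, thresholds=[0.0], weights=[2.0], bias=1.0)` evaluates a single-hinge model of charge as a function of flux `phi`; in practice the thresholds and weights would come from the identification algorithm the abstract mentions.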
Morphological Network: How Far Can We Go with Morphological Neurons?
In recent years, the idea of using morphological operations as networks has
received much attention. Mathematical morphology provides very efficient and
useful image processing and image analysis tools based on basic operators like
dilation and erosion, defined in terms of kernels. Many other morphological
operations are built up using the dilation and erosion operations. Although the
learning of structuring elements for operations such as dilation and erosion
via the backpropagation algorithm is not new, the order and manner in which
these morphological operations are composed are not standard. In this paper,
we have theoretically
analyzed the use of morphological operations for processing 1D feature vectors
and shown that this gets extended to the 2D case in a simple manner. Our
theoretical results show that a morphological block represents a sum of hinge
functions. Hinge functions are used in many places for classification and
regression tasks (Breiman (1993)). We have also proved a universal
approximation theorem -- a stack of two morphological blocks can approximate
any continuous function over arbitrary compact sets. To experimentally validate
the efficacy of this network in real-life applications, we have evaluated its
performance on satellite image classification datasets since morphological
operations are very sensitive to geometrical shapes and structures. We have
also shown results on a few tasks such as segmentation of blood vessels from
fundus images, segmentation of lungs from chest x-rays, and image dehazing.
The results are encouraging and further establish the potential of
morphological networks.
Comment: 35 pages, 19 figures, 7 tables
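The basic operators the abstract builds on can be sketched for the 1D case it analyzes. Below, grayscale dilation is a max-plus correlation of a signal with a structuring element and erosion is the min-plus counterpart; each output is a maximum (or minimum) of affine functions of the input, which is why a block built from them reduces to a sum of hinge functions. The padding choice and element length are assumptions for the sketch, and the paper's exact block layout is not reproduced here.

```python
import numpy as np

def dilation_1d(x, s):
    """Grayscale dilation of 1-D signal x by structuring element s:
    y[i] = max_j (x[i + j - c] + s[j]), a max-plus correlation.
    Borders are padded with -inf so they never win the max."""
    k, c = len(s), len(s) // 2
    xp = np.pad(x.astype(float), c, constant_values=-np.inf)
    return np.array([np.max(xp[i:i + k] + s) for i in range(len(x))])

def erosion_1d(x, s):
    """Grayscale erosion of 1-D signal x by structuring element s:
    y[i] = min_j (x[i + j - c] - s[j]), a min-plus correlation.
    Borders are padded with +inf so they never win the min."""
    k, c = len(s), len(s) // 2
    xp = np.pad(x.astype(float), c, constant_values=np.inf)
    return np.array([np.min(xp[i:i + k] - s) for i in range(len(x))])
```

With a flat (all-zero) structuring element, dilation reduces to a sliding local maximum and erosion to a sliding local minimum; a learnable morphological layer would instead treat the entries of `s` as trainable parameters updated by backpropagation.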
A jamming transition from under- to over-parametrization affects loss landscape and generalization
We argue that in fully-connected networks a phase transition delimits the
over- and under-parametrized regimes where fitting can or cannot be achieved.
Under some general conditions, we show that this transition is sharp for the
hinge loss. In the whole over-parametrized regime, poor minima of the loss are
not encountered during training since the number of constraints to satisfy is
too small to hamper minimization. Our findings support a link between this
transition and the generalization properties of the network: as we increase the
number of parameters of a given model, starting from an under-parametrized
network, we observe that the generalization error displays three phases: (i)
initial decay, (ii) an increase up to the transition point, where it displays a
cusp, and (iii) slow decay toward a constant for the rest of the
over-parametrized regime. Thereby we identify the region where the classical
phenomenon of over-fitting takes place, and the region where the model keeps
improving, in line with previous empirical observations for modern neural
networks.
Comment: arXiv admin note: text overlap with arXiv:1809.0934
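The quantities driving the jamming picture can be illustrated with a toy classifier. The sketch below uses a linear model in place of the paper's fully-connected networks (an assumption for brevity): the hinge loss sums the margin violations, and fitting is achieved exactly when the count of unsatisfied margin constraints reaches zero, which in the over-parametrized regime happens because there are too few constraints to hamper minimization.

```python
import numpy as np

def hinge_loss(w, X, y, margin=1.0):
    """Total hinge loss of a linear classifier w on data (X, y), y in {-1, +1}:
    sum_i max(0, margin - y_i <x_i, w>)."""
    slack = margin - y * (X @ w)
    return float(np.sum(np.maximum(0.0, slack)))

def n_unsatisfied(w, X, y, margin=1.0):
    """Number of margin constraints y_i <x_i, w> >= margin that are violated.
    In the jamming analogy, the data are 'fitted' when this count hits zero."""
    return int(np.sum(y * (X @ w) < margin))
```

Tracking `n_unsatisfied` as the number of parameters grows is one way to locate the sharp under- to over-parametrized transition the abstract describes: below it the count stays positive at the loss minimum, above it the count drops to zero.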