Contour detection by CORF operator
We propose a contour operator, called CORF, inspired by the properties of simple cells in visual cortex. It combines, by a weighted geometric mean, the blurred responses of difference-of-Gaussians operators that model cells in the lateral geniculate nucleus (LGN). An operator that has gained particular popularity as a computational model of a simple cell is based on a family of Gabor functions (GFs). However, the GF operator short-cuts the LGN, and its effectiveness in contour detection tasks, which is assumed to be the primary biological role of simple cells, has never been compared with the effectiveness of alternative operators. We compare the performance of the CORF and GF operators on the RuG and Berkeley data sets of natural scenes with associated ground truths. The proposed CORF operator outperforms the GF operator (RuG: t(39) = 4.39, p < 10⁻⁴; Berkeley: t(499) = 4.95, p < 10⁻⁶).
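The core computation the abstract describes, combining blurred difference-of-Gaussians (DoG) responses by a weighted geometric mean, can be sketched as follows. All parameter values (sigmas, subunit offsets, weights) are illustrative assumptions, not the authors' tuned configuration:

```python
# A minimal sketch of the CORF idea: a weighted geometric mean of blurred,
# shifted DoG responses that model LGN cells. Offsets and weights below are
# hypothetical placeholders for the simple-cell receptive-field layout.
import numpy as np
from scipy.ndimage import gaussian_filter, shift

def dog_response(image, sigma, ratio=0.5):
    """Center-surround DoG response, as used to model LGN cells."""
    center = gaussian_filter(image, sigma * ratio)
    surround = gaussian_filter(image, sigma)
    return np.maximum(center - surround, 0.0)  # half-wave rectification

def corf_like_response(image, offsets, sigma=2.0, blur=1.0):
    """Weighted geometric mean of blurred, shifted DoG responses.

    `offsets` is a list of (dy, dx, weight) tuples giving each model LGN
    subunit's position relative to the operator center (an assumption made
    here for illustration).
    """
    total_weight = sum(w for _, _, w in offsets)
    log_sum = np.zeros_like(image, dtype=float)
    for dy, dx, w in offsets:
        r = dog_response(image, sigma)
        r = gaussian_filter(shift(r, (dy, dx)), blur)  # blur the shifted map
        log_sum += w * np.log(r + 1e-12)  # geometric mean computed in log space
    return np.exp(log_sum / total_weight)
```

The geometric mean makes the operator respond only where all subunits are active, which is what gives a contour operator its orientation selectivity.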
Analog Neural Networks as Decoders
Analog neural networks with feedback can be used to implement k-Winner-Take-All (KWTA) networks. In turn, KWTA networks can be used as decoders of a class of nonlinear error-correcting codes. By interconnecting such KWTA networks, we can construct decoders capable of decoding more powerful codes. We consider several families of interconnected KWTA networks, analyze their performance in terms of coding-theory metrics, and consider the feasibility of embedding such networks in VLSI technologies.
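To make the decoding idea concrete, here is an idealized KWTA sketch. The analog networks in the paper settle to the winners through continuous-time feedback; the selection rule and the constant-weight code below are illustrative assumptions:

```python
# Idealized k-Winner-Take-All: the k units with the largest inputs output 1.
import numpy as np

def kwta(x, k):
    """Return a binary vector with 1s at the k largest entries of x."""
    winners = np.argsort(x)[-k:]
    out = np.zeros_like(x, dtype=int)
    out[winners] = 1
    return out

# Decoding with KWTA: if the valid codewords are exactly the binary vectors
# of weight k (a constant-weight code, assumed here for illustration), then
# KWTA applied to the noisy received vector returns the closest codeword.
received = np.array([0.9, 0.1, 0.8, -0.2, 0.7])  # noisy version of 10101
print(kwta(received, k=3))                        # -> [1 0 1 0 1]
```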
A new neural network technique for the design of multilayered microwave shielded bandpass filters
In this work, we propose a novel neural-network-based technique for the design of microwave filters in shielded printed technology. The technique uses radial basis function neural networks to represent the nonlinear relations between the quality factors and coupling coefficients and the geometrical dimensions of the resonators. Radial basis function neural networks are employed for the first time in the design of shielded printed filters, and they permit fast and precise operation with only a limited set of training data. Thanks to a new cascade configuration, a set of two neural networks provides the dimensions of the complete filter in a fast and accurate way. To improve the calculation of the geometrical dimensions, the neural networks can take as inputs both electrical parameters and physical dimensions computed by other neural networks. The neural network technique is combined with gradient-based optimization methods to further improve the response of the filters. Results are presented to demonstrate the usefulness of the proposed technique for the design of practical microwave printed coupled-line and hairpin filters.
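The mapping the abstract describes can be sketched with a plain Gaussian-RBF interpolator from electrical parameters to a resonator dimension. The training points, kernel width, and the gap-width output below are hypothetical; the paper trains its networks on simulated filter responses:

```python
# A minimal RBF-network sketch: learn (coupling coefficient k, quality
# factor Q) -> resonator gap width from a small training set.
import numpy as np

def rbf_design(train_x, train_y, width=0.1):
    """Fit Gaussian-RBF weights by solving the interpolation system."""
    d2 = ((train_x[:, None, :] - train_x[None, :, :]) ** 2).sum(-1)
    phi = np.exp(-d2 / (2 * width**2))
    weights = np.linalg.solve(phi + 1e-9 * np.eye(len(train_x)), train_y)

    def predict(x):
        d2 = ((x[:, None, :] - train_x[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * width**2)) @ weights
    return predict

# Hypothetical training set: (k, Q) pairs -> gap width in mm.
X = np.array([[0.02, 80], [0.04, 95], [0.06, 110], [0.08, 120]], float)
y = np.array([0.50, 0.38, 0.29, 0.22])
predict_gap = rbf_design(X / X.max(0), y)  # normalize inputs to [0, 1]
print(predict_gap(np.array([[0.05, 100]]) / X.max(0)))
```

A small set of such networks can then be cascaded, with one network's predicted dimensions fed as inputs to the next, which is the configuration the abstract highlights.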
Optimal modularity and memory capacity of neural reservoirs
The neural network is a powerful computing framework that has been exploited by biological evolution and by humans for solving diverse problems. Although the computational capabilities of neural networks are determined by their structure, the current understanding of the relationships between a neural network's architecture and function is still primitive. Here we reveal that a neural network's modular architecture plays a vital role in determining the dynamics and memory performance of networks of threshold neurons. In particular, we demonstrate that there exists an optimal modularity for memory performance, where a balance between local cohesion and global connectivity is established, allowing optimally modular networks to remember longer. Our results suggest that insights from dynamical analysis of neural networks and information-spreading processes can be leveraged to better design neural networks, and may shed light on the brain's modular organization.
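A minimal version of the experiment the abstract describes can be sketched as follows: build a network of binary threshold neurons with tunable modularity and measure how long a perturbation to the input persists. Network size, degree, the mixing parameter mu, and the persistence measure are all illustrative assumptions:

```python
# Sketch: modular threshold network and a simple memory-duration probe.
import numpy as np

rng = np.random.default_rng(0)

def modular_network(n=200, modules=4, degree=10, mu=0.2):
    """Random signed network; a fraction mu of edges cross module borders."""
    labels = np.repeat(np.arange(modules), n // modules)
    W = np.zeros((n, n))
    for i in range(n):
        for _ in range(degree):
            if rng.random() < mu:   # inter-module edge
                j = rng.integers(n)
            else:                   # intra-module edge
                j = rng.choice(np.flatnonzero(labels == labels[i]))
            W[i, j] = rng.choice([-1.0, 1.0])
    return W

def memory_duration(W, theta=0.5, steps=100):
    """Steps until two trajectories differing only in the input coincide."""
    x0 = rng.random(len(W)) < 0.5
    x1 = x0.copy()
    x1[:10] ^= True  # flip the 'input' neurons
    for t in range(steps):
        x0 = (W @ x0) > theta
        x1 = (W @ x1) > theta
        if np.array_equal(x0, x1):
            return t
    return steps
```

Sweeping mu in such a setup is one way to look for the optimum between local cohesion (small mu) and global connectivity (large mu) that the paper reports.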
A Digital Neuromorphic Architecture Efficiently Facilitating Complex Synaptic Response Functions Applied to Liquid State Machines
Information in neural networks is represented as weighted connections, or synapses, between neurons. This poses a problem, as the primary computational bottleneck for neural networks is the vector-matrix multiply when inputs are multiplied by the neural network weights. Conventional processing architectures are not well suited for simulating neural networks, often requiring large amounts of energy and time. Additionally, synapses in biological neural networks are not binary connections, but exhibit a nonlinear response function as neurotransmitters are emitted and diffuse between neurons. Inspired by neuroscience principles, we present a digital neuromorphic architecture, the Spiking Temporal Processing Unit (STPU), capable of modeling arbitrary complex synaptic response functions without requiring additional hardware components. We consider the paradigm of spiking neurons with temporally coded information, as opposed to the non-spiking rate-coded neurons used in most neural networks. In this paradigm we examine liquid state machines applied to speech recognition and show how a liquid state machine with temporal dynamics maps onto the STPU, demonstrating the flexibility and efficiency of the STPU for instantiating neural algorithms.
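The kind of nonlinear synaptic response the STPU is built to model can be illustrated with a standard difference-of-exponentials kernel: each presynaptic spike injects an extended current waveform rather than an instantaneous weighted pulse. The time constants below are illustrative assumptions, not the STPU's configuration:

```python
# Sketch: a double-exponential synaptic response summed over a spike train.
import numpy as np

def synaptic_kernel(t, tau_rise=2.0, tau_decay=10.0):
    """Difference-of-exponentials postsynaptic response (t in ms)."""
    return np.where(t >= 0, np.exp(-t / tau_decay) - np.exp(-t / tau_rise), 0.0)

def synaptic_current(spike_times, t_grid, weight=1.0):
    """Superpose one kernel per spike, scaled by the synaptic weight."""
    return weight * sum(synaptic_kernel(t_grid - ts) for ts in spike_times)

t = np.arange(0, 100, 0.1)                       # 100 ms at 0.1 ms resolution
current = synaptic_current([10.0, 30.0, 32.0], t)  # two spikes close together sum
```

Temporal summation of overlapping kernels, as in the two closely spaced spikes above, is exactly the behavior a liquid state machine exploits and a simple binary weight cannot express.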
