A Binary Neural Shape Matcher using Johnson Counters and Chain Codes
In this paper, we introduce a neural network-based shape matching algorithm that couples Johnson counter codes with chain codes. Shape matching is a fundamental requirement in content-based image retrieval systems. Chain codes describe shapes as sequences of direction numbers; they are simple and flexible. We couple this descriptive power with the efficiency and flexibility of a binary associative-memory neural network. We focus on the implementation details of the algorithm when it is constructed using the neural network, and we demonstrate how the binary associative-memory neural network can index and match chain codes whose elements are represented by Johnson codes.
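A minimal sketch of the encoding step this abstract describes, under stated assumptions: an 8-direction Freeman chain code is mapped element-by-element onto the states of a 4-bit Johnson (twisted-ring) counter. The function and variable names are illustrative, not taken from the paper.

```python
# Hypothetical sketch: encoding an 8-direction chain code with 4-bit
# Johnson counter codes. Names are illustrative, not the paper's.

def johnson_code(state: int, bits: int = 4) -> str:
    """Return the Johnson counter code word for `state` (0 <= state < 2*bits).

    A Johnson counter cycles through 2*bits states: it first shifts in
    ones (1000, 1100, ...) and then shifts in zeros (0111, 0011, ...).
    """
    if not 0 <= state < 2 * bits:
        raise ValueError("state out of range for this counter width")
    if state <= bits:
        return "1" * state + "0" * (bits - state)
    return "0" * (state - bits) + "1" * (2 * bits - state)

# An 8-direction Freeman chain code (directions 0..7) for a small contour.
chain_code = [0, 1, 1, 2, 4, 5, 7, 6]

# Each chain-code element maps to one Johnson code word; adjacent
# directions differ in exactly one bit, which suits binary matching.
encoded = [johnson_code(d) for d in chain_code]
print(encoded)
```

A useful property of this representation, and a plausible reason for choosing it, is that neighbouring directions yield code words at Hamming distance one, so near-matches in direction stay near in code space.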
Neural Avalanches at the Critical Point between Replay and Non-Replay of Spatiotemporal Patterns
We model spontaneous cortical activity with a network of coupled spiking
units, in which multiple spatio-temporal patterns are stored as dynamical
attractors. We introduce an order parameter, which measures the overlap
(similarity) between the activity of the network and the stored patterns. We
find that, depending on the excitability of the network, different working
regimes are possible. For high excitability, the dynamical attractors are
stable, and a collective activity that replays one of the stored patterns
emerges spontaneously, while for low excitability, no replay is induced.
Between these two regimes, there is a critical region in which the dynamical
attractors are unstable, and intermittent short replays are induced by noise.
At the critical spiking threshold, the order parameter goes from zero to one,
and its fluctuations are maximized, as expected for a phase transition (and as
observed in recent experimental results in the brain). Notably, in this
critical region, the avalanche size and duration distributions follow power
laws. Critical exponents are consistent with a scaling relationship observed
recently in neural avalanche measurements. In conclusion, our simple model
suggests that avalanche power laws in cortical spontaneous activity may be the
effect of a network at the critical point between the replay and non-replay of
spatio-temporal patterns.
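A toy sketch of the two measurements this abstract relies on, under assumed standard definitions: the overlap order parameter between binary activity and a stored pattern, and avalanche sizes and durations extracted from a spike-count time series (an avalanche being a run of active time bins bounded by silent bins). The names, definitions, and surrogate data are illustrative assumptions, not the paper's exact model.

```python
# Illustrative sketch, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

def overlap(activity: np.ndarray, pattern: np.ndarray) -> float:
    """Overlap (similarity) between +/-1 activity and a stored +/-1 pattern."""
    return float(np.mean(activity * pattern))

def avalanches(spike_counts: np.ndarray):
    """Split a spike-count time series into avalanches.

    Returns (sizes, durations): size = total spikes in a run of
    consecutive nonzero bins, duration = length of the run.
    """
    sizes, durations = [], []
    size = dur = 0
    for c in spike_counts:
        if c > 0:
            size += int(c)
            dur += 1
        elif dur > 0:            # a silent bin closes the avalanche
            sizes.append(size)
            durations.append(dur)
            size = dur = 0
    if dur > 0:                  # close a trailing avalanche
        sizes.append(size)
        durations.append(dur)
    return sizes, durations

# Example: noisy activity that partially replays one stored pattern.
N = 1000
pattern = rng.choice([-1, 1], size=N)
flips = rng.choice([-1, 1], size=N, p=[0.2, 0.8])   # 80% agreement
activity = pattern * flips
print("overlap m =", overlap(activity, pattern))     # ~0.6

counts = rng.poisson(0.8, size=10_000)               # surrogate spike counts
sizes, durations = avalanches(counts)
print("n avalanches:", len(sizes), "mean size:", np.mean(sizes))
```

At criticality one would histogram `sizes` and `durations` and test for the power-law scaling the abstract reports; the surrogate Poisson counts above will not show it.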
Slowly evolving geometry in recurrent neural networks I: extreme dilution regime
We study extremely diluted spin models of neural networks in which the
connectivity evolves in time, although adiabatically slowly compared to the
neurons, according to stochastic equations which on average aim to reduce
frustration. The (fast) neurons and (slow) connectivity variables equilibrate
separately, but at different temperatures. Our model is exactly solvable in
equilibrium. We obtain phase diagrams upon making the condensed ansatz (i.e.
recall of one pattern). These show that, as the connectivity temperature is
lowered, the volume of the retrieval phase diverges and the fraction of
mis-aligned spins is reduced. Still, one always retains a region in the
retrieval phase where recall states other than the one corresponding to the
`condensed' pattern are locally stable, so the associative memory character of
our model is preserved.
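A hedged sketch of the two-timescale dynamics this abstract describes: fast Glauber updates of Ising spins at temperature T, interleaved with rare Metropolis updates of a diluted, symmetric coupling matrix at a separate temperature T_J, biased toward reducing frustration. All parameters and the specific update rules are illustrative assumptions, not the paper's exactly solvable model.

```python
# Illustrative two-temperature dynamics; not the paper's exact model.
import numpy as np

rng = np.random.default_rng(1)

N, T, T_J = 200, 0.5, 0.2
ratio = 100                      # spin sweeps per coupling move (adiabatic limit)

# Extremely diluted symmetric connectivity: few nonzero couplings per spin.
J = np.zeros((N, N))
links = rng.choice(N * N, size=5 * N, replace=False)
J.flat[links] = rng.normal(size=5 * N)
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)

s = rng.choice([-1, 1], size=N)

def spin_sweep():
    """One Glauber sweep of the fast spin variables at temperature T."""
    for i in rng.permutation(N):
        h = J[i] @ s
        p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))
        s[i] = 1 if rng.random() < p_up else -1

def coupling_update():
    """Metropolis move on one coupling at temperature T_J.

    The move is accepted more readily when it lowers -J_ij * s_i * s_j,
    i.e. when it reduces frustration given the current spin state.
    """
    i, j = rng.integers(N), rng.integers(N)
    if i == j:
        return
    new = J[i, j] + 0.1 * rng.normal()
    dE = -(new - J[i, j]) * s[i] * s[j]
    if dE < 0 or rng.random() < np.exp(-dE / T_J):
        J[i, j] = J[j, i] = new

for sweep in range(2000):
    spin_sweep()
    if sweep % ratio == 0:       # slow connectivity moves adiabatically
        coupling_update()
```

The `ratio` parameter caricatures the separation of timescales: spins equilibrate at T between successive coupling moves, while the couplings feel an effective temperature T_J, which is the regime in which the two sets of variables equilibrate separately.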