Quantum Pattern Retrieval by Qubit Networks with Hebb Interactions
Qubit networks with long-range interactions inspired by the Hebb rule can be
used as quantum associative memories. Starting from a uniform superposition,
the unitary evolution generated by these interactions drives the network
through a quantum phase transition at a critical computation time, after which
ferromagnetic order guarantees that a measurement retrieves the stored memory.
The maximum memory capacity p of these qubit networks is reached at a memory
density p/n = 1.
Comment: To appear in Physical Review Letters
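The long-range Hebb-rule interactions referred to above have a simple classical form. As a hedged sketch (the quantum unitary evolution itself is not reproduced here, and the helper name `hebb_couplings` is chosen for illustration):

```python
import numpy as np

def hebb_couplings(patterns):
    """Hebb-rule interaction matrix J_ij = (1/n) * sum_mu xi_i^mu xi_j^mu,
    with the diagonal zeroed (no self-coupling)."""
    p, n = patterns.shape
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)
    return J

rng = np.random.default_rng(0)
n, p = 8, 8                         # memory density p/n = 1, the stated capacity limit
patterns = rng.choice([-1, 1], size=(p, n))
J = hebb_couplings(patterns)        # symmetric, long-range coupling matrix
```

The symmetry of `J` is what makes ferromagnetic ordering (and hence retrieval by measurement) possible in the network picture.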
Possible Roles of Neural Electron Spin Networks in Memory and Consciousness
Spin is the origin of quantum effects in both Bohm and Hestenes quantum formulism and a fundamental quantum process associated with the structure of space-time. Thus, we have recently theorized that spin is the mind-pixel and developed a qualitative model of consciousness based on nuclear spins inside neural membranes and proteins. In this paper, we explore the possibility of unpaired electron spins being the mind-pixels. Besides free O2 and NO, the main sources of unpaired electron spins in neural membranes and proteins are transition metal ions and O2 and NO bound/absorbed to large molecules, free radicals produced through biochemical reactions and excited molecular triplet states induced by fluctuating internal magnetic fields. We show that unpaired electron spin networks inside neural membranes and proteins are modulated by action potentials through exchange and dipolar coupling tensors and spin-orbital coupling and g-factor tensors and perturbed by microscopically strong and fluctuating internal magnetic fields produced largely by diffusing O2. We argue that these spin networks could be involved in brain functions since said modulation inputs information carried by the neural spike trains into them, said perturbation activates various dynamics within them and the combination of the two likely produce stochastic resonance thus synchronizing said dynamics to the neural firings. Although quantum coherence is desirable, it is not required for these spin networks to serve as the microscopic components for the classical neural networks. 
On the quantum aspect, we speculate that the human brain works as follows with unpaired electron spins being the mind-pixels: through action-potential-modulated electron spin interactions and fluctuating-internal-magnetic-field-driven activations, the neural electron spin networks inside neural membranes and proteins form various entangled quantum states, some of which survive decoherence through quantum Zeno effects or in decoherence-free subspaces and then collapse contextually via irreversible and non-computable means, producing consciousness; in turn, the collective spin dynamics associated with said collapses act through spin chemistry on classical neural activities, thus influencing the neural networks of the brain. Thus, according to this alternative model, the unpaired electron spin networks are the "mind-screen," the neural membranes and proteins are the mind-screen and memory matrices, and diffusing O2 and NO are pixel-activating agents. Together, they form the neural substrates of consciousness.
Storage of phase-coded patterns via STDP in fully-connected and sparse networks: a study of the network capacity
We study the storage and retrieval of phase-coded patterns as stable
dynamical attractors in recurrent neural networks, for both an analog and an
integrate-and-fire spiking model. The synaptic strength is determined by a
learning rule based on spike-time-dependent plasticity, with an asymmetric time
window depending on the relative timing between pre- and post-synaptic
activity. We store multiple patterns and study the network capacity.
For the analog model, we find that the network capacity scales linearly with
the network size, and that both capacity and the oscillation frequency of the
retrieval state depend on the asymmetry of the learning time window. In
addition to fully-connected networks, we study sparse networks, where each
neuron is connected only to a small number z << N of other neurons. Connections
can be short range, between neighboring neurons placed on a regular lattice, or
long range, between randomly chosen pairs of neurons. We find that a small
fraction of long range connections is able to amplify the capacity of the
network. This implies that a small-world-network topology is optimal, as a
compromise between the cost of long range connections and the capacity
increase.
In the spiking integrate-and-fire model, too, the crucial result of storage
and retrieval of multiple phase-coded patterns is observed. The capacity of the
fully-connected spiking network is investigated, together with the relation
between the oscillation frequency of the retrieval state and the window asymmetry.
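The asymmetric STDP learning window described above is commonly modeled as a pair of exponentials; the amplitudes and time constants below are illustrative choices, not values from the paper:

```python
import numpy as np

def stdp_window(dt, a_plus=1.0, a_minus=0.6, tau_plus=10.0, tau_minus=20.0):
    """Asymmetric STDP window (illustrative parameters, times in ms):
    potentiation when the pre-synaptic spike precedes the post-synaptic
    one (dt = t_post - t_pre >= 0), depression otherwise."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0.0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

# The imbalance between a_plus*tau_plus and a_minus*tau_minus is the window
# asymmetry that, per the abstract, shifts capacity and retrieval frequency.
values = stdp_window([-5.0, 0.0, 5.0])
```

This is a sketch of the generic exponential STDP kernel, not the exact rule used in the study.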
Dreaming neural networks: forgetting spurious memories and reinforcing pure ones
The standard Hopfield model for associative neural networks accounts for
biological Hebbian learning and acts as the harmonic oscillator for pattern
recognition; however, its maximal storage capacity is α ≈ 0.14 (patterns per
neuron), far from the theoretical bound for symmetric networks, i.e. α = 1. Inspired
by sleeping and dreaming mechanisms in mammal brains, we propose an extension
of this model displaying the standard on-line (awake) learning mechanism (that
allows the storage of external information in terms of patterns) and an
off-line (sleep) unlearning & consolidating mechanism (that allows
spurious-pattern removal and pure-pattern reinforcement): the resulting daily
prescription is able to saturate the theoretical bound α = 1, remaining
also extremely robust against thermal noise. Both neural and synaptic features
are analyzed both analytically and numerically. In particular, beyond obtaining
a phase diagram for neural dynamics, we focus on synaptic plasticity and we
give explicit prescriptions on the temporal evolution of the synaptic matrix.
We analytically prove that our algorithm makes the Hebbian kernel converge with
high probability to the projection matrix built over the pure stored patterns.
Furthermore, we obtain a sharp and explicit estimate for the "sleep rate" in
order to ensure such a convergence. Finally, we run extensive numerical
simulations (mainly Monte Carlo sampling) to check the approximations
underlying the analytical investigations (e.g., we developed the whole theory
at the so called replica-symmetric level, as standard in the
Amit-Gutfreund-Sompolinsky reference framework) and possible finite-size
effects, finding overall full agreement with the theory.
Comment: 31 pages, 12 figures
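The off-line (sleep) step can be illustrated with classical Hebbian unlearning in a plain Hopfield network: relax from a random state into an attractor (often spurious) and weaken it. This is a simplified stand-in for the paper's exact prescription, with hypothetical helper names and illustrative parameters:

```python
import numpy as np

def relax(J, s, sweeps=50):
    """Zero-temperature asynchronous dynamics, run until a fixed point
    (or a sweep limit) is reached."""
    for _ in range(sweeps):
        changed = False
        for i in range(len(s)):
            new = 1 if J[i] @ s >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
        if not changed:
            break
    return s

def dream_step(J, eps=0.01, rng=None):
    """One 'sleep' iteration in the spirit of classical Hebbian unlearning:
    reach an attractor from a random start and subtract a small multiple of
    its outer product from the synaptic matrix."""
    rng = rng or np.random.default_rng()
    n = J.shape[0]
    s = relax(J, rng.choice([-1, 1], size=n))
    J = J - eps * np.outer(s, s) / n
    np.fill_diagonal(J, 0.0)
    return J

rng = np.random.default_rng(1)
n, p = 16, 3
xi = rng.choice([-1, 1], size=(p, n))
J = xi.T @ xi / n                   # Hebbian kernel from p stored patterns
np.fill_diagonal(J, 0.0)
J_new = dream_step(J, eps=0.01, rng=rng)
```

Iterating such a step is what, in the paper's analysis, drives the Hebbian kernel toward the projection matrix over the pure stored patterns; the convergence rate there is controlled by the "sleep rate".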
A Model of an Oscillatory Neural Network with Multilevel Neurons for Pattern Recognition and Computing
The current study uses a novel method of multilevel neurons and high order
synchronization effects described by a family of special metrics, for pattern
recognition in an oscillatory neural network (ONN). The output oscillator
(neuron) of the network has multilevel variations in its synchronization value
with the reference oscillator, and allows classification of an input pattern
into a set of classes. The ONN model is implemented on thermally-coupled
vanadium dioxide oscillators. The ONN is trained by the simulated annealing
algorithm for selection of the network parameters. The results demonstrate that
ONN is capable of classifying 512 visual patterns (a 3 × 3 cell array,
distributed by symmetry into 102 classes) into a set of classes with a maximum
number of elements up to fourteen. The classification capability of the network
depends on the interior noise level and synchronization effectiveness
parameter. The model allows for designing multilevel output cascades of neural
networks with high net data throughput. The presented method can be applied in
ONNs with various coupling mechanisms and oscillator topologies.
Comment: 26 pages, 24 figures
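Classification by multilevel synchronization with a reference oscillator can be sketched as follows; the phase-locking metric used here is an assumed generic form, not the paper's exact family of metrics:

```python
import numpy as np

def sync_levels(theta_out, theta_ref, orders=range(1, 5)):
    """Phase-locking value of the output oscillator against the k-th
    harmonic of the reference oscillator, for several orders k.
    (Assumed metric for illustration.)"""
    return {k: abs(np.mean(np.exp(1j * (theta_out - k * theta_ref))))
            for k in orders}

t = np.linspace(0.0, 10.0, 2000)
theta_ref = 2 * np.pi * 1.0 * t          # reference oscillator phase
theta_out = 2 * np.pi * 3.0 * t + 0.3    # output locked at the 3rd harmonic
levels = sync_levels(theta_out, theta_ref)
best = max(levels, key=levels.get)       # synchronization level = class label
```

The idea mirrors the abstract: the discrete synchronization level of the output neuron, rather than a binary lock/no-lock state, is what indexes the class of the input pattern.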