Time Resolution Dependence of Information Measures for Spiking Neurons: Atoms, Scaling, and Universality
The mutual information between stimulus and spike-train response is commonly
used to monitor neural coding efficiency, but neuronal computation broadly
conceived requires more refined and targeted information measures of
input-output joint processes. A first step towards that larger goal is to
develop information measures for individual output processes, including
information generation (entropy rate), stored information (statistical
complexity), predictable information (excess entropy), and active information
accumulation (bound information rate). We calculate these for spike trains
generated by a variety of noise-driven integrate-and-fire neurons as a function
of time resolution and for alternating renewal processes. We show that their
time-resolution dependence reveals coarse-grained structural properties of
interspike interval statistics; e.g., ε-entropy rates that diverge less
quickly than the firing rate indicate interspike interval correlations. We also
find evidence that the excess entropy and regularized statistical complexity of
different types of integrate-and-fire neurons are universal in the
continuous-time limit in the sense that they do not depend on mechanism
details. This suggests a surprising simplicity in the spike trains generated by
these model neurons. Interestingly, neurons with gamma-distributed ISIs and
neurons whose spike trains are alternating renewal processes do not fall into
the same universality class. These results lead to two conclusions. First, the
dependence of information measures on time resolution reveals mechanistic
details about spike train generation. Second, information measures can be used
as model selection tools for analyzing spike train processes.

Comment: 20 pages, 6 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/trdctim.ht
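The abstract's central quantity, the entropy rate as a function of time resolution, can be sketched numerically: bin a spike train at resolution τ and estimate the entropy rate of the resulting binary sequence with a plug-in block-entropy estimator. The Poisson neuron, firing rate, block length, and resolutions below are illustrative assumptions, not the paper's models or estimators.

```python
import numpy as np

def bin_spikes(spike_times, t_max, tau):
    """Discretize a spike train into binary symbols at time resolution tau."""
    edges = np.arange(0.0, t_max + tau, tau)
    counts, _ = np.histogram(spike_times, bins=edges)
    return (counts > 0).astype(int)

def block_entropy(symbols, L):
    """Plug-in Shannon entropy (bits) of length-L symbol blocks."""
    blocks = np.lib.stride_tricks.sliding_window_view(symbols, L)
    _, counts = np.unique(blocks, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def entropy_rate(symbols, L=4):
    """Entropy-rate estimate h = H(L) - H(L-1), in bits per symbol."""
    return block_entropy(symbols, L) - block_entropy(symbols, L - 1)

rng = np.random.default_rng(0)
isis = rng.exponential(1 / 20.0, size=5000)   # Poisson neuron, 20 Hz
spikes = np.cumsum(isis)

for tau in (0.02, 0.005, 0.001):              # coarse -> fine resolution
    h = entropy_rate(bin_spikes(spikes, spikes[-1], tau))
    print(f"tau = {tau:5.3f} s: h = {h:.3f} bits/bin, {h / tau:.0f} bits/s")
```

For this memoryless process the per-bin entropy shrinks as τ decreases while the rate in bits per second keeps growing: the resolution dependence that, per the abstract, carries structural information about the interspike-interval statistics.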
Closed-loop estimation of retinal network sensitivity reveals signature of efficient coding
According to the theory of efficient coding, sensory systems are adapted to
represent natural scenes with high fidelity and at minimal metabolic cost.
Testing this hypothesis for sensory structures performing non-linear
computations on high dimensional stimuli is still an open challenge. Here we
develop a method to characterize the sensitivity of the retinal network to
perturbations of a stimulus. Using closed-loop experiments, we explore
selectively the space of possible perturbations around a given stimulus. We
then show that the response of the retinal population to these small
perturbations can be described by a local linear model. Using this model, we
computed the sensitivity of the neural response to arbitrary temporal
perturbations of the stimulus, and found a peak in the sensitivity as a
function of the frequency of the perturbations. Based on a minimal theory of
sensory processing, we argue that this peak is set to maximize information
transmission. Our approach is relevant to testing the efficient coding
hypothesis locally in any context where no reliable encoding model is known.
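A minimal numerical illustration of the logic above: if the response to a small stimulus perturbation is locally linear, then the sensitivity to a sinusoidal perturbation reduces to the gain of the linear kernel at that frequency, and a biphasic kernel produces a sensitivity curve with an interior peak. The kernel shape, time constants, and frequency range below are hypothetical choices, not quantities fit to retinal data.

```python
import numpy as np

# Hypothetical biphasic temporal kernel: for a local linear model
# r = r0 + K @ dS, the sensitivity to a sinusoidal perturbation dS is
# set by the kernel's gain at that frequency.
dt = 0.001                                    # 1 ms resolution
t = np.arange(0, 0.3, dt)
filt = t * np.exp(-t / 0.03) - 0.25 * t * np.exp(-t / 0.06)

freqs = np.arange(1, 31)                      # perturbation frequencies (Hz)
sens = []
for f in freqs:
    pert = np.sin(2 * np.pi * f * np.arange(0, 1.0, dt))  # unit perturbation
    resp = np.convolve(pert, filt)[: len(pert)] * dt      # linear response
    sens.append(np.linalg.norm(resp))         # sensitivity = response norm

peak = int(freqs[int(np.argmax(sens))])
print(f"sensitivity peaks near {peak} Hz")
```

The weights are chosen so the kernel integrates to roughly zero, which suppresses the response at both very low and very high frequencies and yields the band-pass shape, qualitatively like the sensitivity peak the abstract reports.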
A survey of visual preprocessing and shape representation techniques
Many recent theories and methods proposed for visual preprocessing and shape representation are summarized. The survey brings together research from the fields of biology, psychology, computer science, electrical engineering, and, most recently, neural networks. It was motivated by the need to preprocess images for a sparse distributed memory (SDM), but the techniques presented may also prove useful for applying other associative memories to visual pattern recognition. The material of this survey is divided into three sections: an overview of biological visual processing; methods of preprocessing (extracting parts of shape, texture, motion, and depth); and shape representation and recognition (form invariance, primitives and structural descriptions, and theories of attention).
Principles of Information Processing in Neuronal Avalanches
How the brain processes information is poorly understood. It has been suggested that the balance of excitation and inhibition (E/I) can significantly affect information processing in the brain. Neuronal avalanches, a recently discovered type of spontaneous activity, have been ubiquitously observed in vitro and in vivo when the cortical network is in the E/I balanced state. In this dissertation, I experimentally demonstrate that several properties related to information processing in the cortex, i.e., the entropy of spontaneous activity, the information transmission between stimulus and response, the diversity of synchronized states, and the discrimination of external stimuli, are optimized when the cortical network is in the E/I balanced state, exhibiting neuronal avalanche dynamics. These experimental studies not only support the hypothesis that the cortex operates in the critical state, but also suggest that criticality is a potential principle of information processing in the cortex. Further, we studied the interaction structure of population neuronal dynamics and discovered a special structure of higher-order interactions inherent in those dynamics.
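Avalanche analyses like those described above typically start from the standard operational definition introduced by Beggs and Plenz: bin population activity in time, and call each maximal run of consecutive non-empty bins an avalanche, with size equal to the total number of events it contains. A sketch of that extraction step on surrogate Poisson data (illustrative only, not the dissertation's recordings):

```python
import numpy as np

def avalanche_sizes(activity):
    """Split binned population activity into avalanches: maximal runs of
    consecutive non-empty bins, bounded by empty bins. Size = total events."""
    sizes, current = [], 0
    for n in activity:
        if n > 0:
            current += n
        elif current > 0:
            sizes.append(current)
            current = 0
    if current > 0:
        sizes.append(current)
    return sizes

rng = np.random.default_rng(1)
# Toy surrogate activity (Poisson event counts per bin) standing in for data
activity = rng.poisson(0.8, size=10000)
sizes = avalanche_sizes(activity)
print(f"{len(sizes)} avalanches, mean size {np.mean(sizes):.2f}")
```

In criticality analyses, the resulting size distribution is then compared against a power law; the surrogate here is subcritical and serves only to show the bookkeeping.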
Spike detection using the continuous wavelet transform
This paper combines wavelet transforms with basic detection theory to develop a new unsupervised method for robustly detecting and localizing spikes in noisy neural recordings. The method does not require the construction of templates or the supervised setting of thresholds. We present extensive Monte Carlo simulations, based on actual extracellular recordings, to show that this technique surpasses other commonly used methods in a wide variety of recording conditions. We further demonstrate that the false positives produced by our method resemble actual spikes more closely than those of other techniques such as amplitude thresholding. Moreover, the simplicity of the method allows for nearly real-time execution.
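A rough sketch of the wavelet-detection idea (not the paper's exact algorithm or parameter choices): correlate the recording with Mexican-hat wavelets at several scales, estimate each coefficient sequence's noise level robustly from its median absolute value, and keep samples whose coefficients exceed the threshold at every scale. The scales, threshold multiplier, and synthetic recording below are illustrative assumptions.

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet of width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def detect_spikes(signal, scales=(2, 3), k=4.0):
    """Keep samples whose wavelet coefficients exceed k robust standard
    deviations at every scale (no templates, no hand-set thresholds)."""
    mask = np.ones(len(signal), dtype=bool)
    for a in scales:
        coef = np.convolve(signal, ricker(10 * a, a), mode="same")
        sigma = np.median(np.abs(coef)) / 0.6745   # robust noise scale
        mask &= np.abs(coef) > k * sigma
    return np.flatnonzero(mask)

rng = np.random.default_rng(2)
x = rng.normal(0, 1, 5000)                     # synthetic background noise
true_spikes = [500, 1500, 3000, 4200]
for s in true_spikes:
    x[s - 3 : s + 4] += 12 * ricker(7, 1.5)    # embed spike-shaped events

hits = detect_spikes(x)
found = [bool(np.any(np.abs(hits - s) <= 5)) for s in true_spikes]
print(found)
```

Requiring the threshold to be exceeded at multiple scales simultaneously is what suppresses noise-driven false positives: broadband noise rarely matches the spike-like wavelet shape at several widths at once.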