34 research outputs found
Modeling and Decoding Motor Cortical Activity Using a Switching Kalman Filter
We present a switching Kalman filter model for the real-time inference of hand kinematics from a population of motor cortical neurons. Firing rates are modeled as a Gaussian mixture where the mean of each Gaussian component is a linear function of hand kinematics. A "hidden state" models the probability of each mixture component and evolves over time in a Markov chain. The model generalizes previous encoding and decoding methods, addresses the non-Gaussian nature of firing rates, and can cope with crudely sorted neural data common in on-line prosthetic applications.
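The observation model sketched in this abstract — firing rates as a Gaussian mixture whose component means are linear in the hand kinematics — can be illustrated as follows. This is a minimal sketch, not the paper's code; the names `mixture_loglik`, `H`, `R`, and `pi` are assumptions for illustration.

```python
import numpy as np

def mixture_loglik(y, x, H, R, pi):
    """Log-likelihood of a firing-rate vector y under a mixture of
    linear-Gaussian components: component k predicts mean H[k] @ x
    from the kinematic state x, with covariance R[k] and weight pi[k].
    (Illustrative sketch of the abstract's observation model.)"""
    d = len(y)
    logps = []
    for k in range(len(pi)):
        diff = y - H[k] @ x                      # residual under component k
        _, logdet = np.linalg.slogdet(R[k])      # stable log-determinant
        quad = diff @ np.linalg.solve(R[k], diff)
        logps.append(np.log(pi[k])
                     - 0.5 * (d * np.log(2 * np.pi) + logdet + quad))
    # log-sum-exp over components gives the mixture log-likelihood
    return np.logaddexp.reduce(logps)

# Toy example: 3 "neurons", a 2-D kinematic state, two mixture components.
rng = np.random.default_rng(0)
H = [rng.standard_normal((3, 2)) for _ in range(2)]
R = [0.5 * np.eye(3) for _ in range(2)]
x = np.array([1.0, -0.5])
y = np.array([0.3, 0.7, 0.1])
print(mixture_loglik(y, x, H, R, [0.6, 0.4]))
```

In the full model the mixture weights are not fixed but are carried by the hidden switching state, which evolves as a Markov chain across time steps.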
Estimating the entropy of binary time series: Methodology, some theory and a simulation study
Partly motivated by entropy-estimation problems in neuroscience, we present a
detailed and extensive comparison between some of the most popular and
effective entropy estimation methods used in practice: The plug-in method, four
different estimators based on the Lempel-Ziv (LZ) family of data compression
algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and
the renewal entropy estimator.
Methodology. Three new entropy estimators are introduced. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters.

Theory. We prove that, unlike their earlier versions, the two new LZ-based estimators are consistent for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state HMM with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator.

Simulation. All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. Some conclusions drawn from these experiments include: (i) for all estimators considered, the main source of error is the bias; (ii) the CTW method is repeatedly and consistently seen to provide the most accurate results; (iii) the performance of the LZ-based estimators is often comparable to that of the plug-in method; (iv) the main drawback of the plug-in method is its computational inefficiency.
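The plug-in method discussed above is the simplest of the compared estimators: compute the empirical distribution of fixed-length blocks and take its entropy, normalized by the block length. A minimal sketch follows; the function name and the choice of block length are illustrative, not the paper's implementation.

```python
import math
import random
from collections import Counter

def plugin_entropy_rate(bits, block_len=4):
    """Plug-in (empirical-frequency) estimate of the entropy rate of a
    binary sequence, in bits per symbol: the entropy of the empirical
    distribution of overlapping length-`block_len` blocks, divided by
    the block length."""
    blocks = [tuple(bits[i:i + block_len])
              for i in range(len(bits) - block_len + 1)]
    n = len(blocks)
    counts = Counter(blocks)
    h_block = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h_block / block_len

# Fair coin flips: the true entropy rate is 1 bit/symbol; the plug-in
# estimate is biased slightly downward, consistent with conclusion (i).
random.seed(0)
coin = [random.randint(0, 1) for _ in range(10000)]
print(round(plugin_entropy_rate(coin), 3))
```

The computational drawback noted in the abstract shows up here as the need to enumerate all observed blocks: capturing long-range dependence requires large block lengths, and the number of possible blocks grows exponentially with the block length.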
Composition
[…] this paper will be the difficulty in distinguishing the inside of an entity from its outside.
Compositionality in Neural Systems
[…]arrangements of symbols that are possible a priori from a mere combinatorial point of view are illegitimate as linguistic constructions. The number of character strings of length 1,000 that make up a proper English text is vanishingly small when compared to the number of all possible strings of such length. Thus, while infinitely productive, language is at the same time severely constrained. When observed from the "surface," the composition mechanism in language appears simple. Individual characters are assembled into syllables, which are themselves assembled into words, further composed into phrases, sentences, etc. One text differs from another text in the same language only by the relative positioning (relations) among the constituents (symbols), and not, for instance, by the frequencies of occurrence of each symbol; these frequencies are about the same for any sufficiently long text. Yet, encoded within this apparently simple […]
Elie Bienenstock