Rejoinder
Rejoinder of "Statistical Inference: The Big Picture" by R. E. Kass
[arXiv:1106.2895]Comment: Published in at http://dx.doi.org/10.1214/11-STS337REJ the
Statistical Science (http://www.imstat.org/sts/) by the Institute of
Mathematical Statistics (http://www.imstat.org
Information In The Non-Stationary Case
Information estimates such as the "direct method" of Strong et al. (1998)
sidestep the difficult problem of estimating the joint distribution of response
and stimulus by instead estimating the difference between the marginal and
conditional entropies of the response. While this is an effective estimation
strategy, it tempts the practitioner to ignore the role of the stimulus and the
meaning of mutual information. We show here that, as the number of trials
increases indefinitely, the direct (or "plug-in") estimate of marginal
entropy converges (with probability 1) to the entropy of the time-averaged
conditional distribution of the response, and the direct estimate of the
conditional entropy converges to the time-averaged entropy of the conditional
distribution of the response. Under joint stationarity and ergodicity of the
response and stimulus, the difference of these quantities converges to the
mutual information. When the stimulus is deterministic or non-stationary, the
direct estimate of information no longer estimates mutual information, which is
then no longer meaningful; it does, however, remain a measure of the variability
of the response distribution across time.
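To make the two plug-in quantities concrete, here is a minimal sketch (not the authors' code; the function names and the trials-by-time layout are illustrative assumptions) for responses already discretized into symbolic "words":

    import numpy as np

    def plugin_entropy(symbols):
        # Empirical ("plug-in") entropy, in bits, of a 1-D array of symbols.
        _, counts = np.unique(symbols, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def direct_estimate(words):
        # words: (n_trials, n_time) array; words[r, t] is the discretized
        # response word on trial r in time bin t.
        # Marginal term: entropy of words pooled over trials and time bins;
        # as trials grow this converges to the entropy of the time-averaged
        # conditional distribution of the response.
        h_marginal = plugin_entropy(words.ravel())
        # Conditional term: entropy across trials at each time bin, averaged
        # over bins; converges to the time-averaged conditional entropy.
        h_conditional = np.mean([plugin_entropy(words[:, t])
                                 for t in range(words.shape[1])])
        return h_marginal - h_conditional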
Statistical Inference: The Big Picture
Statistics has moved beyond the frequentist-Bayesian controversies of the
past. Where does this leave our ability to interpret results? I suggest that a
philosophy compatible with statistical practice, labeled here statistical
pragmatism, serves as a foundation for inference. Statistical pragmatism is
inclusive and emphasizes the assumptions that connect statistical models with
observed data. I argue that introductory courses often mischaracterize the
process of statistical inference and I propose an alternative "big picture"
depiction.
Comment: Published at http://dx.doi.org/10.1214/10-STS337 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org)
An Implementation of Bayesian Adaptive Regression Splines (BARS) in C with S and R Wrappers
BARS (DiMatteo, Genovese, and Kass 2001) uses the powerful reversible-jump MCMC engine to perform spline-based generalized nonparametric regression. It has been shown to achieve small mean-squared error in many examples (smaller than known competitors), while producing visually appealing fits that are smooth (filtering out high-frequency noise) yet adapt to sudden changes (retaining high-frequency signal). However, BARS is computationally intensive. The original implementation in S was too slow to be practical in certain situations, and was found to handle some data sets incorrectly. We have implemented BARS in C for the normal and Poisson cases, the latter being important in neurophysiological and other point-process applications. The C implementation includes all needed subroutines for fitting Poisson regression, manipulating B-splines (using code created by Bates and Venables), and finding starting values for Poisson regression (using code for density estimation created by Kooperberg). The code uses only freely available external libraries (LAPACK and BLAS) and is otherwise self-contained. We have also provided wrappers so that BARS can be used easily within S or R.
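For orientation only, the sketch below fits a fixed-knot cubic B-spline Poisson regression by iteratively reweighted least squares. It is a deliberately simplified stand-in for BARS, which instead treats the number and placement of knots as unknown and samples them by reversible-jump MCMC; the function name, knot scheme, and iteration count are assumptions for illustration.

    import numpy as np
    from scipy.interpolate import BSpline

    def poisson_spline_fit(x, y, n_interior=10, degree=3, n_iter=25):
        # Clamped knot vector with equally spaced interior knots (BARS would
        # place these adaptively via RJ-MCMC).
        lo, hi = float(x.min()), float(x.max())
        interior = np.linspace(lo, hi, n_interior + 2)[1:-1]
        t = np.r_[[lo] * (degree + 1), interior, [hi] * (degree + 1)]
        X = BSpline.design_matrix(x, t, degree).toarray()
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):  # IRLS for the Poisson log-linear model
            eta = X @ beta
            mu = np.exp(eta)
            z = eta + (y - mu) / mu          # working response
            W = mu                           # IRLS weights for the log link
            beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
        return BSpline(t, beta, degree)      # fitted log-intensity spline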
Assessment of synchrony in multiple neural spike trains using loglinear point process models
Neural spike trains, which are sequences of very brief jumps in voltage
across the cell membrane, were one of the motivating applications for the
development of point process methodology. Early work required the assumption of
stationarity, but contemporary experiments often use time-varying stimuli and
produce time-varying neural responses. More recently, many statistical methods
have been developed for nonstationary neural point process data. There has also
been much interest in identifying synchrony, meaning events across two or more
neurons that are nearly simultaneous at the time scale of the recordings. A
natural statistical approach is to discretize time, using short time bins, and
to introduce loglinear models for dependency among neurons, but previous use of
loglinear modeling technology has assumed stationarity. We introduce a succinct
yet powerful class of time-varying loglinear models by (a) allowing
individual-neuron effects (main effects) to involve time-varying intensities;
(b) also allowing the individual-neuron effects to involve autocovariation
effects (history effects) due to past spiking; (c) assuming excess synchrony
effects (interaction effects) do not depend on history; and (d) assuming all
effects vary smoothly across time.
Comment: Published at http://dx.doi.org/10.1214/10-AOAS429 in The Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
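As a pooled, stationary illustration of the interaction term in such models (a simplification: the paper's models let the main effects and history effects vary smoothly with time while the interaction stays constant), the excess-synchrony effect for two binary spike trains is the log odds ratio of the 2x2 table of joint bin occupancy:

    import numpy as np

    def synchrony_interaction(s1, s2):
        # s1, s2: binary arrays (1 = spike) for two neurons in matched time
        # bins. In the saturated 2x2 loglinear model
        #   log p(x1, x2) = mu + a*x1 + b*x2 + g*x1*x2,
        # the interaction g equals the log odds ratio; g > 0 indicates
        # synchrony in excess of what the main effects alone predict.
        n11 = np.sum((s1 == 1) & (s2 == 1))
        n10 = np.sum((s1 == 1) & (s2 == 0))
        n01 = np.sum((s1 == 0) & (s2 == 1))
        n00 = np.sum((s1 == 0) & (s2 == 0))
        # Haldane-Anscombe correction guards against empty cells.
        return np.log(((n11 + 0.5) * (n00 + 0.5)) /
                      ((n10 + 0.5) * (n01 + 0.5)))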
Approximate Methods for State-Space Models
State-space models provide an important body of techniques for analyzing
time-series, but their use requires estimating unobserved states. The optimal
estimate of the state is its conditional expectation given the observation
histories, and computing this expectation is hard when there are
nonlinearities. Existing filtering methods, including sequential Monte Carlo,
tend to be either inaccurate or slow. In this paper, we study a nonlinear
filter for nonlinear/non-Gaussian state-space models, which uses Laplace's
method, an asymptotic series expansion, to approximate the state's conditional
mean and variance, together with a Gaussian conditional distribution. This
Laplace-Gaussian filter (LGF) gives fast, recursive, deterministic state
estimates, with an error which is set by the stochastic characteristics of the
model and is, we show, stable over time. We illustrate the estimation ability
of the LGF by applying it to the problem of neural decoding and compare it to
sequential Monte Carlo both in simulations and with real data. We find that the
LGF can deliver superior results in a small fraction of the computing time.
Comment: 31 pages, 4 figures. Different pagination from journal version due to incompatible style files but same content; the supplemental file for the journal appears here as appendices B--E.
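A minimal scalar-state sketch of one LGF update, under assumptions not stated in the abstract (first-order approximation, linear-Gaussian state dynamics, user-supplied derivatives of the observation log-likelihood; all names are illustrative):

    import numpy as np

    def lgf_update(m_pred, v_pred, dloglik, d2loglik, n_newton=10):
        # Laplace's method: locate the mode of the log posterior
        #   log N(x; m_pred, v_pred) + loglik(y | x)
        # by Newton's method, then approximate the filtering distribution by
        # a Gaussian centered at the mode with variance -1/Hessian.
        x = m_pred
        for _ in range(n_newton):
            g = -(x - m_pred) / v_pred + dloglik(x)   # gradient
            h = -1.0 / v_pred + d2loglik(x)           # Hessian (negative)
            x = x - g / h                             # Newton step
        return x, -1.0 / h

    # Prediction through assumed AR(1) dynamics x_t = a * x_{t-1} + noise:
    #   m_pred = a * m_post
    #   v_pred = a**2 * v_post + q

For example, for a Poisson spike count y with rate exp(x), one would take dloglik(x) = y - exp(x) and d2loglik(x) = -exp(x).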
False discovery rate regression: an application to neural synchrony detection in primary visual cortex
Many approaches for multiple testing begin with the assumption that all tests
in a given study should be combined into a global false-discovery-rate
analysis. But this may be inappropriate for many of today's large-scale
screening problems, where auxiliary information about each test is often
available, and where a combined analysis can lead to poorly calibrated error
rates within different subsets of the experiment. To address this issue, we
introduce an approach called false-discovery-rate regression that directly uses
this auxiliary information to inform the outcome of each test. The method can
be motivated by a two-groups model in which covariates are allowed to influence
the local false discovery rate, or equivalently, the posterior probability that
a given observation is a signal. This poses many subtle issues at the interface
between inference and computation, and we investigate several variations of the
overall approach. Simulation evidence suggests that: (1) when covariate effects
are present, FDR regression improves power for a fixed false-discovery rate;
and (2) when covariate effects are absent, the method is robust, in the sense
that it does not lead to inflated error rates. We apply the method to neural
recordings from primary visual cortex. The goal is to detect pairs of neurons
that exhibit fine-time-scale interactions, in the sense that they fire together
more often than expected due to chance. Our method detects roughly 50% more
synchronous pairs versus a standard FDR-controlling analysis. The companion R
package FDRreg implements all methods described in the paper.
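In two-groups notation, the covariate-modulated local false discovery rate is fdr(z, x) = (1 - pi(x)) f0(z) / f(z, x). A hedged sketch with the null taken as standard normal and the alternative density and regression coefficients treated as given (FDRreg estimates these from the data; the names here are illustrative):

    import numpy as np
    from scipy.stats import norm

    def local_fdr(z, X, beta, f1):
        # pi(x): prior probability that a test statistic is a signal,
        # modeled with a logistic regression on the covariates.
        pi = 1.0 / (1.0 + np.exp(-(X @ beta)))
        f0 = norm.pdf(z)                  # theoretical null density
        f = (1.0 - pi) * f0 + pi * f1(z)  # marginal (mixture) density
        return (1.0 - pi) * f0 / f        # posterior P(null | z, x)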
Channel Openings Are Necessary but not Sufficient for Use-dependent Block of Cardiac Na+ Channels by Flecainide: Evidence from the Analysis of Disease-linked Mutations
Na+ channel blockers such as flecainide have found renewed usefulness in the diagnosis and treatment of two clinical syndromes arising from inherited mutations in SCN5A, the gene encoding the α subunit of the cardiac voltage-gated Na+ channel. The Brugada syndrome (BrS) and the LQT-3 variant of the Long QT syndrome are caused by disease-linked SCN5A mutations that act to change functional and pharmacological properties of the channel. Here we have explored a set of SCN5A mutations linked both to BrS and LQT-3 to determine what disease-modified channel properties underlie distinct responses to the Na+ channel blocker flecainide. We focused on flecainide block that develops with repetitive channel activity, so-called use-dependent block (UDB). Our results indicate that mutation-induced changes in the voltage dependence of channel availability (inactivation) may act as determinants of flecainide block. The data further indicate that UDB by flecainide requires channel opening, but is not likely due to open-channel block. Rather, flecainide appears to interact with inactivation states that follow depolarization-induced channel opening, and mutation-induced changes in channel inactivation will alter flecainide block independent of the disease to which the mutation is linked. Analysis of flecainide block of mutant channels linked to these rare disorders has provided novel insight into the molecular determinants of drug action.
Teaching Computation in Neuroscience: Notes on the 2019 Society for Neuroscience Professional Development Workshop on Teaching
The 2019 Society for Neuroscience Professional Development Workshop on Teaching reviewed current tools, approaches, and examples for teaching computation in neuroscience. Robert Kass described the statistical foundations that students need to properly analyze data. Pascal Wallisch compared MATLAB and Python as programming languages for teaching students. Adrienne Fairhall discussed computational methods, training opportunities, and curricular considerations. Walt Babiec provided a view from the trenches on practical aspects of teaching computational neuroscience. Mathew Abrams concluded the session with an overview of resources for teaching and learning computational modeling in neuroscience.