
    Cross-Fertilizing Strategies for Better EM Mountain Climbing and DA Field Exploration: A Graphical Guide Book

    In recent years, a variety of extensions and refinements have been developed for data augmentation-based model-fitting routines. These developments aim to extend the applicability, improve the speed, and/or simplify the implementation of data augmentation methods, such as the deterministic EM algorithm for mode finding and the stochastic Gibbs sampler and other auxiliary-variable-based methods for posterior sampling. In this overview article we graphically illustrate and compare a number of these extensions, all of which aim to maintain the simplicity and computational stability of their predecessors. We particularly emphasize the usefulness of identifying similarities between the deterministic and stochastic counterparts as we seek more efficient computational strategies. We also demonstrate the applicability of data augmentation methods for handling complex models with highly hierarchical structure, using a high-energy high-resolution spectral imaging model for data from satellite telescopes, such as the Chandra X-ray Observatory.
    Comment: Published at http://dx.doi.org/10.1214/09-STS309 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org)
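    To make the deterministic/stochastic parallel concrete, the sketch below contrasts one EM step and one data augmentation (Gibbs) step for probit regression under the classic latent-normal augmentation of Albert and Chib. This toy example is mine, not the spectral imaging model of the article, and the function names `em_step` and `da_step` are purely illustrative.

```python
# Minimal sketch: the same latent-variable augmentation drives a deterministic
# EM step (mode finding) and a stochastic DA/Gibbs step (posterior sampling)
# for probit regression y_i = 1{x_i' beta + eps_i > 0}, eps_i ~ N(0, 1).
import numpy as np
from scipy.stats import norm, truncnorm

def em_step(beta, X, y):
    """Deterministic EM: the E-step imputes E[z | y, beta], the M-step is OLS of E[z] on X."""
    mu = X @ beta
    # Truncated-normal mean of z_i given y_i (inverse Mills ratio, sign handled by the denominator).
    mills = norm.pdf(mu) / np.where(y == 1, norm.cdf(mu), norm.cdf(mu) - 1.0)
    z_hat = mu + mills
    return np.linalg.solve(X.T @ X, X.T @ z_hat)

def da_step(beta, X, y, rng):
    """Stochastic DA (Gibbs): draw z from its truncated normal, then beta | z under a flat prior."""
    mu = X @ beta
    lower = np.where(y == 1, -mu, -np.inf)   # z > 0 when y = 1
    upper = np.where(y == 1, np.inf, -mu)    # z < 0 when y = 0
    z = truncnorm.rvs(lower, upper, loc=mu, scale=1.0, random_state=rng)
    XtX_inv = np.linalg.inv(X.T @ X)
    return rng.multivariate_normal(XtX_inv @ X.T @ z, XtX_inv)
```

    Iterating `em_step` climbs to the posterior mode, while iterating `da_step` explores the posterior; the two steps share the same augmented-data structure, which is the kind of similarity the article exploits.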

    A new method for the estimation of variance matrix with prescribed zeros in nonlinear mixed effects models

    We propose a new method for computing the Maximum Likelihood Estimator (MLE) of nonlinear mixed effects models when the variance matrix of the Gaussian random effects has a prescribed pattern of zeros (PPZ). The method consists in coupling the recently developed Iterative Conditional Fitting (ICF) algorithm with the Expectation Maximization (EM) algorithm. It provides positive definite estimates for any sample size and does not rely on any structural assumption about the PPZ. It can be easily adapted to many versions of EM.
    Comment: Accepted for publication in Statistics and Computing
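    To illustrate the constrained covariance maximization involved, here is a small numpy sketch of the Iterative Conditional Fitting idea for the MLE of a Gaussian covariance matrix with a prescribed pattern of zeros; in the coupling described above, such a routine would be fed the expected complete-data covariance produced by the E-step. The function `icf` and its details are a simplified reconstruction of the published algorithm, not the authors' implementation.

```python
import numpy as np

def icf(S, mask, n_sweeps=50, tol=1e-8):
    """Iterative Conditional Fitting (Chaudhuri, Drton & Richardson style) for the MLE
    of a covariance matrix with a prescribed pattern of zeros.
    S    : (p, p) sample (or expected complete-data) covariance.
    mask : (p, p) symmetric boolean matrix, True where sigma_ij may be non-zero
           (diagonal True); False entries are constrained to zero."""
    p = S.shape[0]
    Sigma = np.diag(np.diag(S)).astype(float)   # feasible, positive definite start
    for _ in range(n_sweeps):
        Sigma_old = Sigma.copy()
        for i in range(p):
            rest = np.array([j for j in range(p) if j != i])
            sp = np.array([j for j in rest if mask[i, j]])     # allowed non-zeros in row i
            Omega = np.linalg.inv(Sigma[np.ix_(rest, rest)])
            if sp.size > 0:
                idx = np.searchsorted(rest, sp)                # positions of sp within rest
                # Regress y_i on the pseudo-variables Z = Omega y_rest, restricted to sp.
                ZZ = (Omega @ S[np.ix_(rest, rest)] @ Omega)[np.ix_(idx, idx)]
                Zy = (Omega @ S[rest, i])[idx]
                beta = np.linalg.solve(ZZ, Zy)
                lam = S[i, i] - 2 * beta @ Zy + beta @ ZZ @ beta   # conditional variance
                row = np.zeros(p - 1)
                row[idx] = beta
                Sigma[i, rest] = row
                Sigma[rest, i] = row
                Sigma[i, i] = lam + beta @ Omega[np.ix_(idx, idx)] @ beta
            else:
                Sigma[i, rest] = 0.0
                Sigma[rest, i] = 0.0
                Sigma[i, i] = S[i, i]
        if np.max(np.abs(Sigma - Sigma_old)) < tol:
            break
    return Sigma
```

    Each row/column update is a regression on pseudo-variables with the remaining block held fixed, which keeps the iterate positive definite as long as the fitted conditional variance stays positive; this is the property that carries over to the EM coupling.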

    The EM Algorithm

    The Expectation-Maximization (EM) algorithm is a broadly applicable approach to the iterative computation of maximum likelihood (ML) estimates, useful in a variety of incomplete-data problems. Maximum likelihood estimation and likelihood-based inference are of central importance in statistical theory and data analysis. Maximum likelihood estimation is a general-purpose method with attractive properties. It is the most often used estimation technique in the frequentist framework; it is also relevant in the Bayesian framework (Chapter III.11). Often Bayesian solutions are justified with the help of likelihoods and maximum likelihood estimates (MLE), and Bayesian solutions are similar to penalized likelihood estimates. Maximum likelihood estimation is a ubiquitous technique and is used extensively in every area where statistical techniques are used.
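    As a concrete incomplete-data example, the sketch below runs EM for a two-component univariate Gaussian mixture, the textbook illustration of the algorithm, where the missing data are the unobserved component labels. The function name and initialization are illustrative, not taken from the chapter.

```python
import numpy as np
from scipy.stats import norm

def em_gaussian_mixture(x, n_iter=200, tol=1e-8):
    """EM for a two-component univariate Gaussian mixture."""
    # Crude initialization from the empirical distribution.
    pi, mu, sigma = 0.5, np.array([x.min(), x.max()]), np.array([x.std(), x.std()])
    loglik_old = -np.inf
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each observation.
        d1 = pi * norm.pdf(x, mu[0], sigma[0])
        d2 = (1 - pi) * norm.pdf(x, mu[1], sigma[1])
        r = d1 / (d1 + d2)
        # M-step: weighted ML updates given the responsibilities.
        pi = r.mean()
        mu = np.array([np.average(x, weights=r), np.average(x, weights=1 - r)])
        sigma = np.array([np.sqrt(np.average((x - mu[0]) ** 2, weights=r)),
                          np.sqrt(np.average((x - mu[1]) ** 2, weights=1 - r))])
        loglik = np.log(d1 + d2).sum()   # observed-data log-likelihood never decreases
        if loglik - loglik_old < tol:
            break
        loglik_old = loglik
    return pi, mu, sigma

# Example: x = np.concatenate([np.random.normal(0, 1, 500), np.random.normal(4, 1, 500)])
```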

    Estimation in the partially observed stochastic Morris-Lecar neuronal model with particle filter and stochastic approximation methods

    Parameter estimation in multidimensional diffusion models with only one coordinate observed is highly relevant in many biological applications, but a statistically difficult problem. In neuroscience, the membrane potential evolution in single neurons can be measured at high frequency, but biophysically realistic models have to include the unobserved dynamics of ion channels. One such model is the stochastic Morris-Lecar model, defined by a nonlinear two-dimensional stochastic differential equation. The coordinates are coupled, that is, the unobserved coordinate is nonautonomous; the model exhibits oscillations to mimic the spiking behavior, which means it is not of gradient type; and the measurement noise from intracellular recordings is typically negligible. Therefore, the hidden Markov model framework is degenerate, and available methods break down. The main contributions of this paper are an approach to estimation in this ill-posed situation and nonasymptotic convergence results for the method. Specifically, we propose a sequential Monte Carlo particle filter algorithm to impute the unobserved coordinate, and then estimate parameters by maximizing a pseudo-likelihood through a stochastic version of the Expectation-Maximization algorithm. It turns out that even the rate scaling parameter governing the opening and closing of the ion channels of the unobserved coordinate can be reasonably estimated. An experimental data set of intracellular recordings of the membrane potential of a spinal motoneuron of a red-eared turtle is analyzed, and the performance is further evaluated in a simulation study.
    Comment: Published at http://dx.doi.org/10.1214/14-AOAS729 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
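    To indicate the flavor of the imputation step, here is a generic bootstrap particle filter for an Euler-Maruyama discretization of a two-dimensional diffusion observed only through its first coordinate with small Gaussian noise. This is a plain sketch under my own simplifying assumptions (one Euler step per observation, diagonal diffusion, vectorized `drift`/`diffusion` callables), not the degenerate-noise construction or the stochastic EM wrapper developed in the paper.

```python
import numpy as np

def bootstrap_filter(y, drift, diffusion, x0, dt, n_particles=1000, obs_sd=1e-2, rng=None):
    """Bootstrap particle filter for a 2-D diffusion whose first coordinate is observed
    with small Gaussian noise.  `drift(X)` and `diffusion(X)` map an (N, 2) particle
    array to (N, 2) arrays (diagonal noise).  Returns one (N, 2) cloud per observation."""
    rng = np.random.default_rng() if rng is None else rng
    particles = np.tile(np.asarray(x0, float), (n_particles, 1))
    clouds = []
    for obs in y:
        # Propagate all particles one Euler-Maruyama step.
        noise = rng.standard_normal(particles.shape) * np.sqrt(dt)
        particles = particles + drift(particles) * dt + diffusion(particles) * noise
        # Weight by the likelihood of the observed first coordinate.
        logw = -0.5 * ((obs - particles[:, 0]) / obs_sd) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling concentrates the cloud on plausible values
        # of the hidden second coordinate.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        clouds.append(particles.copy())
    return clouds

# In a stochastic EM loop one would impute the hidden coordinate from these clouds and
# then update the model parameters by maximizing the resulting complete-data pseudo-likelihood.
```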
