6,635 research outputs found

    Accelerating delayed-acceptance Markov chain Monte Carlo algorithms

    Delayed-acceptance Markov chain Monte Carlo (DA-MCMC) samples from a probability distribution via a two-stage version of the Metropolis-Hastings algorithm, combining the target distribution with a "surrogate" (i.e., an approximate and computationally cheaper version) of that distribution. DA-MCMC accelerates MCMC sampling in complex applications while still targeting the exact distribution. We design a computationally faster, albeit approximate, DA-MCMC algorithm. We consider parameter inference in a Bayesian setting where a surrogate likelihood function is introduced in the delayed-acceptance scheme. When the evaluation of the likelihood function is computationally intensive, our scheme produces a 2-4 times speed-up compared to standard DA-MCMC, although the acceleration is highly problem dependent. Inference results for the standard delayed-acceptance algorithm and our approximate version are similar, indicating that our algorithm can return reliable Bayesian inference. As a computationally intensive case study, we introduce a novel stochastic differential equation model for protein folding data.
    Comment: 40 pages, 21 figures, 10 tables
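    The two-stage accept/reject logic described above can be sketched generically. The sketch below is illustrative, not the paper's implementation: it assumes a symmetric random-walk proposal, and `log_target` (expensive, exact) and `log_surrogate` (cheap approximation) are placeholder names supplied by the user.

```python
import numpy as np

def da_mcmc(log_target, log_surrogate, theta0, n_iter=5000, step=0.5, rng=None):
    """Delayed-acceptance Metropolis-Hastings with a symmetric random-walk
    proposal. `log_target` is the expensive exact log-density; `log_surrogate`
    is its cheap stand-in, used to screen proposals before paying for the
    exact evaluation."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    lt, ls = log_target(theta), log_surrogate(theta)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)
        ls_prop = log_surrogate(prop)       # cheap: evaluated for every proposal
        # Stage 1: screen using the surrogate only.
        if np.log(rng.uniform()) < ls_prop - ls:
            lt_prop = log_target(prop)      # expensive: only after stage 1 passes
            # Stage 2: correct for the surrogate error, so the overall
            # kernel still targets the exact distribution.
            if np.log(rng.uniform()) < (lt_prop - lt) - (ls_prop - ls):
                theta, lt, ls = prop, lt_prop, ls_prop
        chain[i] = theta
    return chain
```

    The expensive density is only evaluated for proposals that survive the cheap first stage, which is where the speed-up comes from.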

    Efficient data augmentation for fitting stochastic epidemic models to prevalence data

    Stochastic epidemic models describe the dynamics of an epidemic as a disease spreads through a population. Typically, only a fraction of cases are observed at a set of discrete times. The absence of complete information about the time evolution of an epidemic gives rise to a complicated latent variable problem, in which the state space of the epidemic grows large as the population size increases. This makes analytically integrating over the missing data infeasible for populations of even moderate size. We present a data augmentation Markov chain Monte Carlo (MCMC) framework for Bayesian estimation of stochastic epidemic model parameters, in which measurements are augmented with subject-level disease histories. In our MCMC algorithm, we propose each new subject-level path, conditional on the data, using a time-inhomogeneous continuous-time Markov process with rates determined by the infection histories of other individuals. The method is general, and may be applied, with minimal modifications, to a broad class of stochastic epidemic models. We present our algorithm in the context of multiple stochastic epidemic models in which the data are binomially sampled prevalence counts, and apply our method to data from an outbreak of influenza in a British boarding school.
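    To make the data model concrete, here is a minimal sketch (not the authors' data-augmentation algorithm) of two ingredients it builds on: an exact stochastic SIR simulation and the binomial observation likelihood for prevalence counts. All function names and parameters are illustrative.

```python
import math
import random

def gillespie_sir(beta, gamma, s0, i0, t_max, rng):
    """Exact stochastic simulation of an SIR epidemic (Gillespie's direct
    method). Returns the piecewise-constant path as (time, S, I) states."""
    t, s, i = 0.0, s0, i0
    n = s0 + i0
    path = [(t, s, i)]
    while i > 0 and t < t_max:
        rate_inf = beta * s * i / n      # rate of a new infection
        rate_rec = gamma * i             # rate of a recovery
        total = rate_inf + rate_rec
        t += rng.expovariate(total)      # exponential waiting time
        if t >= t_max:
            break
        if rng.random() < rate_inf / total:
            s, i = s - 1, i + 1
        else:
            i -= 1
        path.append((t, s, i))
    return path

def prevalence_at(path, times):
    """Number infected at each observation time (step-function lookup)."""
    out = []
    for t_obs in times:
        i_cur = path[0][2]
        for t, _, i in path:
            if t > t_obs:
                break
            i_cur = i
        out.append(i_cur)
    return out

def log_binom_pmf(k, n, p):
    if not (0 <= k <= n):
        return float("-inf")
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log1p(-p))

def obs_loglik(path, times, counts, rho):
    """Binomial-sampling observation model: each observed count is
    Binomial(I(t_obs), rho), matching the prevalence-data setting."""
    prev = prevalence_at(path, times)
    return sum(log_binom_pmf(y, i, rho) for y, i in zip(counts, prev))
```

    The paper's contribution sits on top of such a likelihood: proposing latent subject-level paths conditional on the observed counts, rather than simulating forward unconditionally as above.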

    A bi-dimensional finite mixture model for longitudinal data subject to dropout

    In longitudinal studies, subjects may be lost to follow-up, or miss some of the planned visits, leading to incomplete response sequences. When the probability of non-response, conditional on the available covariates and the observed responses, still depends on unobserved outcomes, the dropout mechanism is said to be non-ignorable. A common objective is to build a reliable association structure to account for dependence between the longitudinal and the dropout processes. Starting from the existing literature, we introduce a random-coefficient-based dropout model where the association between outcomes is modeled through discrete latent effects. These effects are outcome-specific and account for heterogeneity in the univariate profiles. Dependence between profiles is introduced by using a bi-dimensional representation for the corresponding distribution. In this way, we define a flexible latent class structure which allows us to efficiently describe both dependence within the two margins of interest and dependence between them. Using this representation, we show that, unlike standard (unidimensional) finite mixture models, the non-ignorable dropout model properly nests its ignorable counterpart. We detail the proposed modeling approach by analyzing data from a longitudinal study on the dynamics of cognitive functioning in the elderly. Further, the effects of assumptions about non-ignorability of the dropout process on model parameter estimates are investigated using the index of local sensitivity to non-ignorability.
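    The bi-dimensional latent-class structure described above can be sketched as follows (illustrative notation, not the authors' exact symbols): the joint density of the outcome sequence and the dropout indicators is a finite mixture over outcome classes g and dropout classes h,

```latex
f(\mathbf{y}_i, \mathbf{r}_i)
  \;=\; \sum_{g=1}^{G}\sum_{h=1}^{H} \pi_{gh}\,
        f(\mathbf{y}_i \mid \zeta_g)\, f(\mathbf{r}_i \mid \xi_h),
\qquad
\text{ignorable case: } \pi_{gh} = \pi_{g\cdot}\,\pi_{\cdot h}.
```

    Because the ignorable model corresponds to the constraint that the joint class probabilities factorize into their margins, it is obtained as a special case, which is how the nesting arises.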

    Adoption of augmented reality technology by university students

    In recent times, Augmented Reality has gained relevance in the field of education, enhanced by its ease of use and the availability of the technical devices to students. The present study was conducted with students enrolled in the Pedagogy Degree in the Faculty of Education at the University of Seville. The objective was to understand the degree of technological acceptance of students during their interaction with the AR objects produced, the performance achieved by the students, and whether their gender affected their acquisition of knowledge. Three data collection instruments were used: a multiple-choice test for the analysis of student performance after the interaction, the Technology Acceptance Model (TAM) diagnostic instrument created by Davis (1989), and an "ad hoc" instrument created so that the students could evaluate the class notes enriched with the AR objects created. The study has allowed us to broaden the scientific knowledge of Davis's TAM, to establish that AR objects can be used in university teaching, and to show that student gender does not influence learning.
    Ministry of Economy and Competitiveness of Spain EDU-5746-

    Hierarchical Models for Relational Event Sequences

    Interaction within small groups can often be represented as a sequence of events, where each event involves a sender and a recipient. Recent methods for modeling network data in continuous time describe the rate at which individuals interact, conditioned on the previous history of events as well as actor covariates. We present a hierarchical extension for modeling multiple such sequences, facilitating inferences about event-level dynamics and their variation across sequences. The hierarchical approach allows one to share information across sequences in a principled manner; we illustrate the efficacy of such sharing through a set of prediction experiments. After discussing methods for adequacy checking and model selection for this class of models, the method is illustrated with an analysis of high school classroom dynamics.
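    A relational event model of the kind described conditions each event's rate on the history of prior events. The sketch below is a minimal, illustrative version (not the paper's model) using a single "inertia" statistic, the count of prior events on the same dyad, and a Cox-type partial likelihood; the hierarchical extension would then place a shared prior on `beta` across sequences.

```python
import math
from collections import Counter

def rem_loglik(events, actors, beta):
    """Partial (Cox-type) log-likelihood for a relational event model with
    one 'inertia' covariate: the log-rate of dyad (s, r) is
    beta * #{prior s -> r events}. `events` is a time-ordered list of
    (sender, recipient) pairs; `actors` lists all individuals."""
    past = Counter()
    ll = 0.0
    for s, r in events:
        # Log-rate of the observed event minus the log of the total rate
        # over the risk set of all ordered dyads (a != b).
        log_rates = {(a, b): beta * past[(a, b)]
                     for a in actors for b in actors if a != b}
        denom = math.log(sum(math.exp(v) for v in log_rates.values()))
        ll += log_rates[(s, r)] - denom
        past[(s, r)] += 1   # update the history after each event
    return ll
```

    Recomputing the full risk set at every event is quadratic in the number of actors; it is fine for small groups like a classroom, which is the setting the abstract targets.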

    Estimating the GARCH Diffusion: Simulated Maximum Likelihood in Continuous Time

    A new algorithm is developed to provide simulated maximum likelihood estimation of the GARCH diffusion model of Nelson (1990) based on return data only. The method combines two accurate approximation procedures: the polynomial expansion of Aït-Sahalia (2008), to approximate the transition probability density of return and volatility, and the Efficient Importance Sampler (EIS) of Richard and Zhang (2007), to integrate out the volatility. The first- and second-order terms in the polynomial expansion are used to generate a baseline importance density for an EIS algorithm; the higher-order terms are included when evaluating the importance weights. Monte Carlo experiments show that the new method works well and that the discretization error is well controlled by the polynomial expansion. In the empirical application, we fit the GARCH diffusion to equity data, perform diagnostics on the model fit, and test the finiteness of the importance weights.
    Keywords: efficient importance sampling; GARCH diffusion model; simulated maximum likelihood; stochastic volatility
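    For readers unfamiliar with the model being estimated, a GARCH diffusion can be simulated directly. The sketch below uses one common parameterization (an assumption, not necessarily the paper's exact form) with Euler-Maruyama discretization: d log S_t = (mu - V_t/2) dt + sqrt(V_t) dW1 and dV_t = kappa (theta - V_t) dt + sigma V_t dW2, with independent Brownian motions.

```python
import numpy as np

def simulate_garch_diffusion(mu, kappa, theta, sigma, v0, n_days,
                             steps_per_day=10, rng=None):
    """Euler-Maruyama simulation of a GARCH diffusion. Time is measured
    in days; returns the daily log-returns and the end-of-day variance
    path. Only returns would be observed in the estimation problem the
    abstract describes -- the variance path is latent."""
    rng = np.random.default_rng() if rng is None else rng
    dt = 1.0 / steps_per_day
    v = v0
    returns = np.empty(n_days)
    variances = np.empty(n_days)
    for day in range(n_days):
        r = 0.0
        for _ in range(steps_per_day):
            z1, z2 = rng.standard_normal(2)
            r += (mu - 0.5 * v) * dt + np.sqrt(v * dt) * z1
            v += kappa * (theta - v) * dt + sigma * v * np.sqrt(dt) * z2
            v = max(v, 1e-12)   # keep variance positive after discretization
        returns[day] = r
        variances[day] = v
    return returns, variances
```

    Because the volatility path is never observed, the likelihood of the returns requires integrating over it; that high-dimensional integral is exactly what the EIS step of the abstract's method approximates.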