
    A Hybrid Galerkin–Monte-Carlo Approach to Higher-Dimensional Population Balances in Polymerization Kinetics

    Population balance models describing not only the chain-length distribution of a polymer but also additional properties such as branching or composition are still difficult to solve numerically. For the simulation of such systems, two essentially different approaches are discussed in the literature: deterministic solvers based on rate equations and stochastic Monte-Carlo (MC) strategies based on chemical master equations. This paper presents a novel hybrid approach to polymer reaction kinetics that combines the best of these two worlds. We discuss the theoretical conditions of the algorithm, describe its numerical realization, and show that, where applicable, it is more efficient than full-scale MC approaches and yields more detailed information on additional property indices than deterministic solvers.
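    The stochastic half of such a hybrid can be illustrated with a toy kinetic Monte-Carlo (Gillespie-type) simulation of living polymerization; the model, rate constant, and function names below are illustrative assumptions, not the paper's algorithm:

```python
import random

def gillespie_polymerization(n_chains=200, n_monomer=2000, k_p=1.0, seed=0):
    """Toy kinetic Monte Carlo: living polymerization in which each
    propagation event adds one monomer to a uniformly chosen chain."""
    rng = random.Random(seed)
    lengths = [1] * n_chains               # every chain starts as one unit
    monomer = n_monomer
    t = 0.0
    while monomer > 0:
        rate = k_p * n_chains * monomer    # total propagation propensity
        t += rng.expovariate(rate)         # exponential waiting time
        i = rng.randrange(n_chains)        # which chain propagates
        lengths[i] += 1
        monomer -= 1
    return t, lengths
```

    The returned `lengths` list is one MC realization of the chain-length distribution; a deterministic solver would instead evolve its rate equations (or moments) directly, and the hybrid idea is to reserve MC sampling for properties the rate equations cannot resolve.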

    Fully-Automatic Multiresolution Idealization for Filtered Ion Channel Recordings: Flickering Event Detection

    We propose a new model-free segmentation method, JULES, which combines recent statistical multiresolution techniques with local deconvolution for the idealization of ion channel recordings. The multiresolution criterion takes into account scales down to the sampling rate, enabling the detection of flickering events, i.e., events on small temporal scales, even below the filter frequency. For such small scales the deconvolution step allows for a precise determination of dwell times and, in particular, of amplitude levels, a task which is not possible with common thresholding methods. This is confirmed theoretically and in a comprehensive simulation study. In addition, JULES can be applied as a preprocessing method for a refined hidden Markov analysis. Our new methodology allows us to show that gramicidin A flickering events have the same amplitude as the slow gating events. JULES is available as an R function jules in the package clampSeg.
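    For contrast with the method above, the common thresholding baseline that JULES improves on can be sketched in a few lines; the amplitude levels and noise model here are made up for illustration, and real flickering events shorter than the filter response are exactly what this naive approach misses:

```python
import numpy as np

def threshold_idealization(signal, level_closed, level_open):
    """Naive two-level idealization: assign each sample to the nearer
    of two known amplitude levels via a midpoint threshold."""
    signal = np.asarray(signal, dtype=float)
    midpoint = 0.5 * (level_closed + level_open)
    return (signal > midpoint).astype(int)   # 0 = closed, 1 = open

# synthetic trace: closed at 0 pA, open at 1 pA, Gaussian noise
rng = np.random.default_rng(1)
truth = np.repeat([0, 1, 0, 1, 0], 200)
trace = truth + 0.2 * rng.standard_normal(truth.size)
est = threshold_idealization(trace, 0.0, 1.0)
```

    On long, well-separated events this works; the multiresolution-plus-deconvolution step in the abstract is needed precisely where it fails, namely for events near or below the sampling and filter scales.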

    On Markov State Models for Metastable Processes

    We consider Markov processes on large state spaces and want to find low-dimensional structure-preserving approximations of the process in the sense that the longest timescales of the dynamics of the original process are reproduced well. Recent years have seen the advance of so-called Markov state models (MSMs) for processes on very large state spaces exhibiting metastable dynamics. It has been demonstrated that MSMs are especially useful for modelling the interesting slow dynamics of biomolecules (cf. Noe et al, PNAS(106) 2009) and materials. From the mathematical perspective, MSMs result from Galerkin projection of the transfer operator underlying the original process onto some low-dimensional subspace, which leads to an approximation of the dominant eigenvalues of the transfer operator and thus of the longest timescales of the original dynamics. Until now, most articles on MSMs have been based on full subdivisions of state space, i.e., Galerkin projections onto subspaces spanned by indicator functions. We show how to generalize MSMs to alternative low-dimensional subspaces with superior approximation properties, and how to analyse the approximation quality (dominant eigenvalues, propagation of functions) of the resulting MSMs. To this end, we give an overview of the construction of MSMs, the associated stochastics and functional-analysis background, and its algorithmic consequences. Furthermore, we illustrate the mathematical construction with numerical examples.
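    The standard MSM pipeline implied here — discretize, count transitions at a lag time, row-normalize, diagonalize, read off implied timescales from the dominant eigenvalues — can be sketched as follows. This is generic MSM practice, not code from the paper, and the toy two-state process is an assumption for illustration:

```python
import numpy as np

def count_matrix(dtraj, n_states, lag=1):
    """Transition counts at the given lag time from a discrete trajectory."""
    C = np.zeros((n_states, n_states))
    for a, b in zip(dtraj[:-lag], dtraj[lag:]):
        C[a, b] += 1
    return C

def implied_timescales(T, lag=1):
    """t_i = -lag / ln(lambda_i) for the non-unit eigenvalues of T."""
    lam = np.sort(np.linalg.eigvals(T).real)[::-1]
    return -lag / np.log(lam[1:])

# toy metastable process: two states with rare switching
T_true = np.array([[0.99, 0.01],
                   [0.02, 0.98]])
rng = np.random.default_rng(0)
dtraj = [0]
for _ in range(50000):
    dtraj.append(rng.choice(2, p=T_true[dtraj[-1]]))
C = count_matrix(dtraj, 2)
T_est = C / C.sum(axis=1, keepdims=True)   # row-normalized MSM estimate
```

    Counting transitions against indicator sets is exactly the Galerkin projection onto indicator functions that the abstract generalizes; projecting onto better subspaces changes only how the trajectory is mapped to the low-dimensional basis.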

    On the Approximation Quality of Markov State Models

    We consider a continuous-time Markov process on a large continuous or discrete state space. The process is assumed to have strong enough ergodicity properties and to exhibit a number of metastable sets. Markov state models (MSMs) are designed to represent the effective dynamics of such a process by a Markov chain that jumps between the metastable sets with the transition rates of the original process. MSMs have been used for a number of applications, including molecular dynamics, for more than a decade. Their approximation quality, however, has not yet been fully understood. In particular, it would be desirable to have a sharp error bound for the difference in propagation of probability densities between the MSM and the original process on long timescales. Here, we provide such a bound for a rather general class of Markov processes ranging from diffusions in energy landscapes to Markov jump processes on large discrete spaces. Furthermore, we discuss how this result provides formal support or shows the limitations of algorithmic strategies that have been found to be useful for the construction of MSMs. Our findings are illustrated by numerical experiments.
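    The quantity being bounded — the gap between the true density propagated and then coarse-grained, versus the density propagated by the MSM — can be computed directly on a small example. The four-state chain below is an invented toy, not from the paper; when metastability is strong, the error stays small even over many steps:

```python
import numpy as np

# fine-grained process: 4 states, metastable sets A = {0,1}, B = {2,3};
# mixing inside each set is fast, jumps between sets are rare
T = np.array([[0.90, 0.10, 0.00, 0.00],
              [0.10, 0.88, 0.01, 0.01],
              [0.01, 0.01, 0.88, 0.10],
              [0.00, 0.00, 0.10, 0.90]])
pi = np.full(4, 0.25)                     # stationary since T is symmetric
chi = np.array([[1., 0.], [1., 0.], [0., 1.], [0., 1.]])  # set indicators

# Galerkin projection onto the indicator functions gives the MSM matrix
T_msm = (chi.T * pi) @ T @ chi / (chi.T @ pi)[:, None]

def density_error(p0, n):
    """Max difference, on the coarse sets, between the truly propagated
    density and the MSM-propagated one after n steps."""
    p_n = p0 @ np.linalg.matrix_power(T, n)              # true propagation
    q_n = (p0 @ chi) @ np.linalg.matrix_power(T_msm, n)  # MSM propagation
    return np.abs(p_n @ chi - q_n).max()
```

    Starting from a density concentrated on a single fine state (not locally equilibrated within its set) is exactly the situation where the MSM incurs error, which is what the paper's bound controls.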

    Estimating the eigenvalue error of Markov State Models

    We consider a continuous-time, ergodic Markov process on a large continuous or discrete state space. The process is assumed to exhibit a number of metastable sets. Markov state models (MSMs) are designed to represent the effective dynamics of such a process by a Markov chain that jumps between the metastable sets with the transition rates of the original process. MSMs have been used for a number of applications, including molecular dynamics (cf. Noe et al, PNAS(106) 2009)[1], for more than a decade. The rigorous and fully general (no zero-temperature limit or comparable restrictions) analysis of their approximation quality, however, has only been started recently. Our first article on this topic (Sarich et al, MMS(8) 2010)[2] introduces an error bound for the difference in propagation of probability densities between the MSM and the original process on long timescales. Here we provide upper bounds for the error in the eigenvalues between the MSM and the original process, which means that we analyse how well the longest timescales in the original process are approximated by the MSM. Our findings are illustrated by numerical experiments.

    Observation Uncertainty in Reversible Markov Chains

    In many applications one is interested in finding a simplified model which captures the essential dynamical behavior of a real-life process. If the essential dynamics can be assumed to be (approximately) memoryless, then a reasonable choice of model is a Markov model whose parameters are estimated by means of Bayesian inference from an observed time series. We propose an efficient Markov chain Monte Carlo (MCMC) framework to assess the uncertainty of the Markov model and related observables. The derived Gibbs sampler allows for sampling distributions of transition matrices subject to reversibility and/or sparsity constraints. The performance of the suggested sampling scheme is demonstrated and discussed for a variety of model examples. The uncertainty analysis of functions of the Markov model under investigation is discussed in application to the identification of conformations of the trialanine molecule via Robust Perron Cluster Analysis (PCCA+).
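    The core idea — sampling the posterior over transition matrices while enforcing reversibility structurally — can be sketched with a simple Metropolis scheme rather than the paper's Gibbs sampler. Everything below (parametrization, proposal, step size) is an illustrative assumption: reversibility is guaranteed by writing T_ij = X_ij / Σ_k X_ik with symmetric weights X, so every sample satisfies detailed balance by construction.

```python
import numpy as np

def sample_reversible(C, n_samples=200, step=0.1, seed=0):
    """Metropolis sampling of reversible transition matrices given a
    transition-count matrix C (a simplified stand-in for the Gibbs
    sampler with reversibility/sparsity constraints in the abstract)."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    X = C + C.T + 1.0                 # symmetric, strictly positive start

    def log_like(X):
        T = X / X.sum(axis=1, keepdims=True)
        return float((C * np.log(T)).sum())

    ll = log_like(X)
    samples = []
    for _ in range(n_samples):
        for _ in range(n * n):        # one Metropolis sweep per sample
            i, j = rng.integers(n), rng.integers(n)
            prop = X.copy()
            # multiplicative (log-normal) random walk on one weight pair
            prop[i, j] = prop[j, i] = X[i, j] * np.exp(step * rng.standard_normal())
            new_ll = log_like(prop)
            # Hastings correction log(x'/x) for the asymmetric proposal
            if np.log(rng.random()) < new_ll - ll + np.log(prop[i, j] / X[i, j]):
                X, ll = prop, new_ll
        samples.append(X / X.sum(axis=1, keepdims=True))
    return samples
```

    The spread of the returned samples is the uncertainty estimate; any observable of interest (eigenvalues, committors, PCCA+ memberships) can be evaluated on each sample to propagate that uncertainty, which is the use case the abstract describes.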