19 research outputs found

    Inference for stochastic chemical kinetics using moment equations and system size expansion

    Quantitative mechanistic models are valuable tools for disentangling biochemical pathways and for achieving a comprehensive understanding of biological systems. However, to be quantitative, the parameters of these models have to be estimated from experimental data. In the presence of significant stochastic fluctuations this is a challenging task, as stochastic simulations are usually too time-consuming and a macroscopic description using reaction rate equations (RREs) is no longer accurate. In this manuscript, we therefore consider the moment-closure approximation (MA) and the system size expansion (SSE), which approximate the statistical moments of stochastic processes and tend to be more precise than macroscopic descriptions. We introduce gradient-based parameter optimization and uncertainty analysis methods for MA and SSE. The efficiency and reliability of these methods are assessed using simulation examples as well as an application to data for Epo-induced JAK/STAT signaling. The application revealed that even if merely population-average data are available, MA and SSE improve parameter identifiability in comparison to RREs. Furthermore, the simulation examples revealed that the resulting estimates are more reliable in an intermediate volume regime. In this regime the estimation error is reduced, and we propose methods to determine the regime boundaries. These results illustrate that inference using MA and SSE is feasible and possesses a high sensitivity.
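
    As a rough illustration of this kind of inference (not the manuscript's implementation, which targets Epo-induced JAK/STAT signaling), the sketch below fits the two parameters of a simple birth-death process to noisy mean/variance data using its moment equations, which close exactly for linear propensities, together with SciPy's gradient-based least_squares. The model, synthetic data, and helper names (moment_odes, simulate_moments, residuals) are assumptions made for illustration only.

# Minimal sketch (not the paper's code): estimating the parameters of a
# birth-death process from mean/variance data via its moment equations.
# For linear propensities the first two moment equations close exactly;
# the model, synthetic data, and solver settings below are illustrative
# assumptions, not taken from the manuscript.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def moment_odes(t, y, k, gamma):
    """First two moments of X for birth rate k and death rate gamma*X."""
    m, v = y                      # mean and variance
    dm = k - gamma * m
    dv = k + gamma * m - 2.0 * gamma * v
    return [dm, dv]

def simulate_moments(theta, t_obs):
    k, gamma = theta
    sol = solve_ivp(moment_odes, (0.0, t_obs[-1]), [0.0, 0.0],
                    t_eval=t_obs, args=(k, gamma), rtol=1e-8)
    return sol.y                  # rows: mean, variance

def residuals(theta, t_obs, mean_obs, var_obs):
    m, v = simulate_moments(theta, t_obs)
    return np.concatenate([m - mean_obs, v - var_obs])

# Synthetic "measurements" generated from known parameters plus noise.
rng = np.random.default_rng(0)
t_obs = np.linspace(0.5, 10.0, 20)
m_true, v_true = simulate_moments([10.0, 0.8], t_obs)
mean_obs = m_true + rng.normal(scale=0.2, size=t_obs.size)
var_obs = v_true + rng.normal(scale=0.5, size=t_obs.size)

# Gradient-based estimation (least_squares builds a finite-difference
# Jacobian internally); bounds keep both rates positive.
fit = least_squares(residuals, x0=[1.0, 0.1],
                    bounds=([1e-6, 1e-6], [np.inf, np.inf]),
                    args=(t_obs, mean_obs, var_obs))
print("estimated k, gamma:", fit.x)

    For nonlinear propensities the moment equations do not close, and an MA or SSE truncation would replace the exact two-equation system used in this toy example.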

    A framework for performing data-driven modeling of tumor growth with radiotherapy treatment

    No full text
    Recent technological advances make it possible to collect detailed information about tumors, and yet clinical assessments of treatment response are typically based on sparse datasets. In this work, we propose a workflow for choosing an appropriate model, verifying parameter identifiability, and assessing the amount of data necessary to accurately calibrate model parameters. As a proof of concept, we compare tumor growth models of varying complexity in an effort to determine the level of model complexity needed to accurately predict tumor growth dynamics and response to radiotherapy. We consider a simple, one-compartment ordinary differential equation (ODE) model that tracks tumor volume and a two-compartment model that accounts for tumor volume and the fraction of necrotic cells contained within the tumor. We investigate the structural and practical identifiability of these models and the impact of noise on identifiability. We also generate synthetic data from a more complex, spatially resolved cellular automaton (CA) model that simulates tumor growth and response to radiotherapy. We investigate the fit of the ODE models to tumor volume data generated by the CA in various parameter regimes, and we use sequential model calibration to determine how many data points are required to accurately infer model parameters. Our results suggest that if data on tumor volumes alone are provided, then a tumor with a large necrotic volume is the most challenging case to fit. However, supplementing data on total tumor volume with additional information on the necrotic volume enables the two-compartment ODE model to perform significantly better than the one-compartment model in terms of parameter convergence and predictive power.
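
    The abstract does not list its equations, so purely as a sketch: the snippet below calibrates a hypothetical one-compartment model, logistic growth of tumor volume with an instantaneous linear-quadratic (LQ) kill at each radiotherapy fraction, to noisy synthetic volume data using SciPy's gradient-based least_squares. The growth law, LQ survival fraction, treatment schedule, parameter values, and helper names (simulate_volume, residuals) are illustrative assumptions, not the models or data used in the paper.

# Minimal sketch of the kind of one-compartment calibration described above,
# not the authors' model: logistic tumor growth with an instantaneous volume
# reduction at each radiotherapy fraction given by a linear-quadratic (LQ)
# survival fraction. Growth law, LQ parameters, schedule, and synthetic data
# are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

RT_TIMES = [10.0, 12.0, 14.0]   # assumed treatment days
DOSE = 2.0                      # assumed dose per fraction (Gy)
ALPHA_BETA = 10.0               # assumed alpha/beta ratio (Gy)

def simulate_volume(theta, t_obs, v0=0.1):
    """Integrate dV/dt = r*V*(1 - V/K) between fractions and apply the LQ
    survival fraction exp(-alpha*d - (alpha/ALPHA_BETA)*d**2) at each one."""
    r, K, alpha = theta
    sf = np.exp(-alpha * DOSE - (alpha / ALPHA_BETA) * DOSE ** 2)
    breaks = [0.0] + [t for t in RT_TIMES if t < t_obs[-1]] + [t_obs[-1]]
    v = v0
    pred = np.empty_like(t_obs)
    for t0, t1 in zip(breaks[:-1], breaks[1:]):
        sol = solve_ivp(lambda t, y: r * y * (1.0 - y / K), (t0, t1), [v],
                        dense_output=True, rtol=1e-8)
        mask = (t_obs >= t0) & (t_obs <= t1)
        if mask.any():
            pred[mask] = sol.sol(t_obs[mask])[0]
        v = sol.y[0, -1]
        if t1 in RT_TIMES:          # instantaneous cell kill at the fraction
            v *= sf
    return pred

def residuals(theta, t_obs, v_obs):
    return simulate_volume(theta, t_obs) - v_obs

# Synthetic volume "measurements" from known parameters plus relative noise.
rng = np.random.default_rng(1)
t_obs = np.linspace(0.0, 30.0, 31)
v_true = simulate_volume([0.3, 2.0, 0.15], t_obs)
v_obs = v_true * (1.0 + rng.normal(scale=0.05, size=t_obs.size))

# Gradient-based (finite-difference Jacobian) calibration of r, K, alpha.
fit = least_squares(residuals, x0=[0.1, 1.0, 0.05],
                    bounds=([1e-6] * 3, [np.inf] * 3), args=(t_obs, v_obs))
print("estimated r, K, alpha:", fit.x)

    A two-compartment variant would add a necrotic-fraction state and a second observable, which is the extra information the abstract identifies as necessary for reliable calibration of the more complex model.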

    Neuronal Model Hand-Tuning

    No full text