8 research outputs found
Approximating solutions of the chemical master equation using neural networks
The Chemical Master Equation (CME) provides an accurate description of stochastic biochemical reaction networks in well-mixed conditions, but it cannot be solved analytically for most systems of practical interest. Although Monte Carlo methods provide a principled means to probe system dynamics, the large number of simulations typically required can render the estimation of molecule number distributions and other quantities infeasible. In this article, we aim to leverage the representational power of neural networks to approximate the solutions of the CME and propose a framework for the Neural Estimation of Stochastic Simulations for Inference and Exploration (Nessie). Our approach is based on training neural networks to learn the distributions predicted by the CME from relatively few stochastic simulations. We show on biologically relevant examples that simple neural networks with one hidden layer can capture highly complex distributions across parameter space, thereby accelerating computationally intensive tasks such as parameter exploration and inference.
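The training data for such a network comes from stochastic simulations of the reaction network. As a minimal, self-contained sketch (not Nessie itself), the following generates an empirical molecule-number distribution for a birth-death process with Gillespie's stochastic simulation algorithm; the model and parameter values are illustrative assumptions, chosen because the steady-state distribution is known to be Poisson with mean k_birth / k_death:

```python
import random

def ssa_birth_death(k_birth, k_death, t_end, seed=None):
    """Gillespie SSA for a birth-death process:
    0 -> X at rate k_birth, X -> 0 at rate k_death * n.
    Returns the molecule count n at time t_end."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    while True:
        a_birth = k_birth
        a_death = k_death * n
        a_total = a_birth + a_death
        t += rng.expovariate(a_total)       # exponential waiting time
        if t >= t_end:
            return n
        if rng.random() * a_total < a_birth:
            n += 1                           # birth event
        else:
            n -= 1                           # death event

# Empirical distribution from many runs; at steady state this is
# approximately Poisson with mean k_birth / k_death = 10.
samples = [ssa_birth_death(10.0, 1.0, 50.0, seed=i) for i in range(2000)]
mean = sum(samples) / len(samples)
```

A network of the kind described in the abstract would then be trained to map rate parameters to such empirical distributions, amortizing the cost of repeated simulation across parameter space.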
Parameter estimation for biochemical reaction networks using Wasserstein distances
We present a method for estimating parameters in stochastic models of
biochemical reaction networks by fitting steady-state distributions using
Wasserstein distances. We simulate a reaction network at different parameter
settings and train a Gaussian process to learn the Wasserstein distance between
observations and the simulator output for all parameters. We then use Bayesian
optimization to find parameters minimizing this distance based on the trained
Gaussian process. The effectiveness of our method is demonstrated on the
three-stage model of gene expression and a genetic feedback loop for which
moment-based methods are known to perform poorly. Our method is applicable to
any simulator model of stochastic reaction networks, including Brownian
Dynamics.
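For one-dimensional samples of equal size, the empirical 1-Wasserstein distance reduces to the mean absolute difference between sorted values, which makes the fitting objective cheap to evaluate. A minimal illustrative sketch with toy Gaussian data (not the paper's reaction-network simulator or its Gaussian-process surrogate):

```python
import random

def wasserstein_1d(xs, ys):
    """Empirical 1-Wasserstein distance between two equal-size 1-D samples:
    the mean absolute difference of the order statistics."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

rng = random.Random(0)
obs = [rng.gauss(5.0, 1.0) for _ in range(1000)]        # "observations"
sim_close = [rng.gauss(5.1, 1.0) for _ in range(1000)]  # near-match simulator output
sim_far = [rng.gauss(8.0, 1.0) for _ in range(1000)]    # poor-match simulator output

# The distance grows as the simulated distribution drifts from the data,
# which is what Bayesian optimization minimizes over parameter settings.
d_close = wasserstein_1d(obs, sim_close)
d_far = wasserstein_1d(obs, sim_far)
```

In the paper's setting, this distance (evaluated on simulated steady-state distributions) is the quantity the Gaussian process learns as a function of the parameters.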
Training deep neural density estimators to identify mechanistic models of neural dynamics
Mechanistic modeling in neuroscience aims to explain observed phenomena in terms of underlying causes. However, determining which model parameters agree with complex and stochastic neural data presents a significant challenge. We address this challenge with a machine learning tool which uses deep neural density estimators, trained using model simulations, to carry out Bayesian inference and retrieve the full space of parameters compatible with raw data or selected data features. Our method is scalable in parameters and data features and can rapidly analyze new data after initial training. We demonstrate the power and flexibility of our approach on receptive fields, ion channels, and Hodgkin-Huxley models. We also characterize the space of circuit configurations giving rise to rhythmic activity in the crustacean stomatogastric ganglion, and use these results to derive hypotheses for underlying compensation mechanisms. Our approach will help close the gap between data-driven and theory-driven models of neural dynamics.
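The simulate-then-infer loop underlying this kind of simulation-based inference can be illustrated with the much simpler rejection-ABC algorithm (the paper itself uses deep neural density estimators, not ABC). A toy sketch with a hypothetical one-parameter Gaussian "simulator" standing in for a mechanistic model:

```python
import random

def simulate(theta, rng, n=200):
    """Toy stochastic 'mechanistic model': n noisy observations with mean theta."""
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def summary(data):
    """A single summary statistic, here the sample mean."""
    return sum(data) / len(data)

rng = random.Random(42)
true_theta = 2.0
observed = summary(simulate(true_theta, rng))

# Rejection ABC: draw parameters from the prior, simulate, and keep
# parameters whose simulated summary lands near the observed one.
accepted = []
for _ in range(5000):
    theta = rng.uniform(-5.0, 5.0)  # uniform prior over the parameter
    if abs(summary(simulate(theta, rng)) - observed) < 0.1:
        accepted.append(theta)

posterior_mean = sum(accepted) / len(accepted)
```

A neural density estimator replaces the hard accept/reject rule with a learned conditional density over parameters, which scales far better in the number of parameters and data features, as the abstract notes.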
Inference and Uncertainty Quantification of Stochastic Gene Expression via Synthetic Models
Estimating uncertainty in model predictions is a central task in quantitative biology. Biological models at the single-cell level are intrinsically stochastic and nonlinear, creating formidable challenges for their statistical estimation which inevitably has to rely on approximations that trade accuracy for tractability. Despite intensive interest, a sweet spot in this trade-off has not been found yet. We propose a flexible procedure for uncertainty quantification in a wide class of reaction networks describing stochastic gene expression including those with feedback. The method is based on creating a tractable coarse-graining of the model that is learned from simulations, a synthetic model, to approximate the likelihood function. We demonstrate that synthetic models can substantially outperform state-of-the-art approaches on a number of non-trivial systems and datasets, yielding an accurate and computationally viable solution to uncertainty quantification in stochastic models of gene expression.
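A heavily simplified version of the idea, fitting a tractable surrogate distribution to simulations at each candidate parameter and using it to score the data, can be sketched as follows. The surrogate here is a moment-matched Poisson and the "simulator" is itself a Poisson sampler; both are illustrative stand-ins, since the paper's synthetic models are richer and learned differently:

```python
import math
import random

def simulate_counts(rate, rng, n=1000):
    """Stand-in stochastic simulator: Poisson-distributed molecule counts,
    sampled with Knuth's multiplication method."""
    out = []
    for _ in range(n):
        threshold, k, p = math.exp(-rate), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                break
            k += 1
        out.append(k)
    return out

def poisson_loglik(counts, lam):
    """Exact Poisson log-likelihood of the observed counts."""
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in counts)

rng = random.Random(1)
observed = simulate_counts(4.0, rng, n=300)  # data generated at rate 4

def surrogate_loglik(candidate_rate):
    """Fit a tractable surrogate (Poisson, by moment matching) to fresh
    simulations at the candidate parameter, then score the data with it."""
    sims = simulate_counts(candidate_rate, rng)
    lam = sum(sims) / len(sims)
    return poisson_loglik(observed, lam)

ll_good = surrogate_loglik(4.0)  # candidate equal to the true rate
ll_bad = surrogate_loglik(8.0)   # mismatched candidate
```

The surrogate likelihood correctly prefers the parameter that generated the data, and because it is cheap to evaluate it can be embedded in standard Bayesian machinery for uncertainty quantification.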
A stochastic vs deterministic perspective on the timing of cellular events
Cells are the fundamental units of life, and like all life forms, they change over time. Changes in cell state are driven by molecular processes; many of these are initiated when molecule numbers reach and exceed specific thresholds, a characteristic that can be described as "digital cellular logic". Here we show how molecular and cellular noise profoundly influence the time to cross a critical threshold, the first-passage time, and map out scenarios in which stochastic dynamics result in shorter or longer average first-passage times compared to noise-less dynamics. We illustrate the dependence of the mean first-passage time on noise for a set of exemplar models of gene expression, auto-regulatory feedback control, and enzyme-mediated catalysis. Our theory provides intuitive insight into the origin of these effects and underscores two important points: (i) deterministic predictions for cellular event timing can be highly inaccurate when molecule numbers are within the range known for many cells; (ii) molecular noise can significantly shift mean first-passage times, particularly within auto-regulatory genetic feedback circuits.
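The stochastic-vs-deterministic comparison can be reproduced for the simplest such model: a birth-death process crossing a fixed molecule-number threshold. A minimal sketch with illustrative parameters (not taken from the paper); the deterministic crossing time follows from solving dn/dt = k - d*n with n(0) = 0, while the stochastic mean is estimated over many Gillespie runs:

```python
import math
import random

def fpt_stochastic(k, d, N, rng):
    """First-passage time of a birth-death process (0 -> X at rate k,
    X -> 0 at rate d*n) from n = 0 to the threshold n = N, via Gillespie SSA."""
    t, n = 0.0, 0
    while n < N:
        a_total = k + d * n
        t += rng.expovariate(a_total)
        if rng.random() * a_total < k:
            n += 1
        else:
            n -= 1
    return t

def fpt_deterministic(k, d, N):
    """Crossing time of the noise-less rate equation dn/dt = k - d*n, n(0) = 0,
    i.e. n(t) = (k/d)(1 - exp(-d t)) solved for n = N (requires N < k/d)."""
    return -math.log(1.0 - d * N / k) / d

rng = random.Random(7)
k, d, N = 10.0, 1.0, 8   # steady-state mean k/d = 10, threshold below it
mfpt = sum(fpt_stochastic(k, d, N, rng) for _ in range(2000)) / 2000
det = fpt_deterministic(k, d, N)
```

With these parameters the mean stochastic crossing time (about 1.25) is noticeably shorter than the deterministic one (about 1.61), one concrete instance of the abstract's point that noise can shift mean first-passage times in either direction.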
JuliaGaussianProcesses/KernelFunctions.jl: v0.10.58
<h2>KernelFunctions v0.10.58</h2>
<p><a href="https://github.com/JuliaGaussianProcesses/KernelFunctions.jl/compare/v0.10.57...v0.10.58">Diff since v0.10.57</a></p>
<p><strong>Merged pull requests:</strong></p>
<ul>
<li>Fix Matern AD failures (#528) (@simsurace)</li>
<li>Stop using deprecated signatures from Distances.jl (#529) (@simsurace)</li>
</ul>
TuringLang/AdvancedHMC.jl: v0.6.0
<h2>AdvancedHMC v0.6.0</h2>
<p><a href="https://github.com/TuringLang/AdvancedHMC.jl/compare/v0.5.5...v0.6.0">Diff since v0.5.5</a></p>
<p><strong>Merged pull requests:</strong></p>
<ul>
<li>fix: arg order (#349) (@xukai92)</li>
<li>CompatHelper: bump compat for AbstractMCMC to 5, (keep existing compat) (#352) (@github-actions[bot])</li>
<li>Deprecate <code>init_params</code> which is no longer in AbstractMCMC (#353) (@torfjelde)</li>
<li>CompatHelper: add new compat entry for Statistics at version 1, (keep existing compat) (#354) (@github-actions[bot])</li>
<li>Removed deprecation of init_params + bump minor version (#355) (@torfjelde)</li>
<li>Fix some tests. (#356) (@yebai)</li>
<li>Fix docs CI (#357) (@yebai)</li>
</ul>
<p><strong>Closed issues:</strong></p>
<ul>
<li>Doc string error for NUTS (#346)</li>
</ul>