AReS and MaRS - Adversarial and MMD-Minimizing Regression for SDEs
Stochastic differential equations are an important modeling class in many
disciplines. Consequently, there exist many methods relying on various
discretization and numerical integration schemes. In this paper, we propose a
novel, probabilistic model for estimating the drift and diffusion given noisy
observations of the underlying stochastic system. Using state-of-the-art
adversarial and moment matching inference techniques, we avoid the
discretization schemes of classical approaches. This leads to significant
improvements in parameter accuracy and robustness given random initial guesses.
On four established benchmark systems, we compare the performance of our
algorithms to state-of-the-art solutions based on extended Kalman filtering and
Gaussian processes.
Comment: Published at the Thirty-sixth International Conference on Machine Learning (ICML 2019).
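The moment-matching side of the approach rests on comparing sample distributions via maximum mean discrepancy (MMD). The following is a minimal sketch of a biased squared-MMD estimate with a Gaussian kernel; the bandwidth, sample sizes, and distributions are illustrative choices of mine, not the paper's settings.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """RBF kernel matrix between sample batches x (n, d) and y (m, d)."""
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    """Biased estimate of the squared MMD between the samples x and y."""
    kxx = gaussian_kernel(x, x, sigma).mean()
    kyy = gaussian_kernel(y, y, sigma).mean()
    kxy = gaussian_kernel(x, y, sigma).mean()
    return kxx + kyy - 2 * kxy

rng = np.random.default_rng(0)
# Same distribution: MMD^2 should be near zero; shifted: clearly positive.
same = mmd2(rng.normal(0, 1, (500, 1)), rng.normal(0, 1, (500, 1)))
diff = mmd2(rng.normal(0, 1, (500, 1)), rng.normal(3, 1, (500, 1)))
```

Minimizing such a statistic over simulated trajectories is what lets a moment-matching estimator sidestep explicit discretization of the SDE likelihood.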
Stochastic gradient descent performs variational inference, converges to limit cycles for deep networks
Stochastic gradient descent (SGD) is widely believed to perform implicit
regularization when used to train deep neural networks, but the precise manner
in which this occurs has thus far been elusive. We prove that SGD minimizes an
average potential over the posterior distribution of weights along with an
entropic regularization term. This potential is however not the original loss
function in general. So SGD does perform variational inference, but for a
different loss than the one used to compute the gradients. Even more
surprisingly, SGD does not even converge in the classical sense: we show that
the most likely trajectories of SGD for deep networks do not behave like
Brownian motion around critical points. Instead, they resemble closed loops
with deterministic components. We prove that such "out-of-equilibrium" behavior
is a consequence of highly non-isotropic gradient noise in SGD; the covariance
matrix of mini-batch gradients for deep networks has a rank as small as 1% of
its dimension. We provide extensive empirical validation of these claims,
proven in the appendix.
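The claim about the rank of the mini-batch gradient covariance suggests a simple diagnostic: collect gradients over many mini-batches at a fixed parameter vector and count how many eigenvalues of their covariance carry the variance. The sketch below uses a toy linear-regression loss of my own, not the paper's deep-network experiments.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1024, 50
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)
w = np.zeros(d)  # evaluate gradients at a fixed (untrained) point

# Mini-batch gradients of the mean squared loss.
batch = 32
grads = []
for i in range(0, n, batch):
    xb, yb = X[i:i + batch], y[i:i + batch]
    grads.append(2 * xb.T @ (xb @ w - yb) / batch)
grads = np.array(grads)                    # (num_batches, d)

cov = np.cov(grads, rowvar=False)          # (d, d) gradient covariance
eig = np.clip(np.linalg.eigvalsh(cov)[::-1], 0, None)  # descending, nonneg
frac = np.cumsum(eig) / eig.sum()
k = int(np.searchsorted(frac, 0.99)) + 1   # eigenvalues covering 99% variance
```

Note that with only 32 batches the covariance is rank-deficient by construction (rank at most 31); the paper's point is that for deep networks the effective rank is far smaller still relative to the parameter dimension.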
Stochastic neural network models for gene regulatory networks
Recent advances in gene-expression profiling technologies provide large amounts of gene expression data. This raises the possibility of a functional understanding of genome dynamics by means of mathematical modelling. As gene expression involves intrinsic noise, stochastic models are essential for better descriptions of gene regulatory networks. However, stochastic modelling for large scale gene expression data sets is still at a very early stage of development. In this paper we present some stochastic models by introducing stochastic processes into neural network models that can describe intermediate regulation for large scale gene networks. Poisson random variables are used to represent chance events in the processes of synthesis and degradation. For expression data with normalized concentrations, exponential or normal random variables are used to realize fluctuations. Using a network with three genes, we show how to use stochastic simulations for studying robustness and stability properties of gene expression patterns under the influence of noise, and how to use stochastic models to predict statistical distributions of expression levels in populations of cells. The discussion suggests that stochastic neural network models can give a better description of gene regulatory networks and provide criteria for measuring the reasonableness of mathematical models.
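The modelling recipe described here, Poisson chance events for synthesis layered on a neural-network-style regulation function, can be sketched for a hypothetical three-gene network. The weight matrix, basal rates, and degradation probability below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-gene network: W[i, j] is the regulatory effect of gene j
# on gene i (positive = activation, negative = repression).
W = np.array([[0.0, -0.8,  0.0],
              [0.9,  0.0, -0.5],
              [0.0,  0.7,  0.0]])
basal = np.array([2.0, 1.0, 1.0])   # basal synthesis rates
decay = 0.1                          # per-step degradation probability

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.zeros(3)                      # molecule counts per gene
traj = []
for t in range(500):
    rate = basal * sigmoid(W @ x - 1.0)        # regulated synthesis rate
    synth = rng.poisson(rate)                  # chance events in synthesis
    degr = rng.binomial(x.astype(int), decay)  # chance events in degradation
    x = x + synth - degr
    traj.append(x.copy())
traj = np.array(traj)                # (500, 3) noisy expression trajectories
```

Running many such simulations from the same initial state yields the kind of empirical distribution of expression levels across a cell population that the abstract refers to.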