A variational approach to moment-closure approximations for the kinetics of biomolecular reaction networks
Approximate solutions of the chemical master equation and the chemical
Fokker-Planck equation are an important tool in the analysis of biomolecular
reaction networks. Previous studies have highlighted a number of problems with
the moment-closure approach used to obtain such approximations, calling it an
ad-hoc method. In this article, we give a new variational derivation of
moment-closure equations which provides us with an intuitive understanding of
their properties and failure modes and allows us to correct some of these
problems. We use mixtures of product-Poisson distributions to obtain a flexible
parametric family which solves the commonly observed problem of divergences at
low system sizes. We also extend the recently introduced entropic matching
approach to arbitrary ansatz distributions and Markov processes, demonstrating
that it is a special case of variational moment closure. This provides us with
a particularly principled approximation method. Finally, we extend the above
approaches to cover the approximation of multi-time joint distributions,
resulting in a viable alternative to process-level approximations which are
often intractable.
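The closure step the abstract criticizes and then rederives can be illustrated on a single bimolecular reaction. A minimal sketch of classical Poisson moment closure (not the paper's variational method); the reaction, rate constant, and initial mean are illustrative. For dimerization 2X -> 0 with propensity a(x) = k*x*(x-1), the exact mean equation d<x>/dt = -2k(<x^2> - <x>) is not closed; assuming a Poisson distribution, where <x^2> = <x> + <x>^2, closes it as d<x>/dt = -2k<x>^2.

```python
def closed_mean_trajectory(m0, k, dt, steps):
    """Forward-Euler integration of the Poisson-closed mean equation
    d<x>/dt = -2k <x>^2 for the dimerization reaction 2X -> 0."""
    m = m0
    traj = [m]
    for _ in range(steps):
        m += dt * (-2.0 * k * m * m)  # Poisson closure: <x^2> = <x> + <x>^2
        traj.append(m)
    return traj

# Illustrative parameters: initial mean 50 molecules, rate k = 0.01.
traj = closed_mean_trajectory(m0=50.0, k=0.01, dt=0.01, steps=1000)
```

The closed trajectory tracks the exact solution m0/(1 + 2*k*m0*t) of the closed ODE; the divergences at low system sizes mentioned in the abstract arise when the ansatz family is too rigid, which the mixture-of-product-Poissons family is designed to avoid.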
An Infinitesimal Probabilistic Model for Principal Component Analysis of Manifold Valued Data
We provide a probabilistic and infinitesimal view of how the principal
component analysis procedure (PCA) can be generalized to analysis of nonlinear
manifold valued data. Starting with the probabilistic PCA interpretation of the
Euclidean PCA procedure, we show how PCA can be generalized to manifolds in an
intrinsic way that does not resort to linearization of the data space. The
underlying probability model is constructed by mapping a Euclidean stochastic
process to the manifold using stochastic development of Euclidean
semimartingales. The construction uses a connection and bundles of covariant
tensors to allow global transport of principal eigenvectors, and the model is
thereby an example of how principal fiber bundles can be used to handle the
lack of global coordinate system and orientations that characterizes manifold
valued statistics. We show how curvature implies non-integrability of the
equivalent of Euclidean principal subspaces, and how the stochastic flows
provide an alternative to explicit construction of such subspaces. We describe
estimation procedures for inference of parameters and prediction of principal
components, and we give examples of properties of the model on embedded
surfaces.
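The Euclidean probabilistic PCA interpretation that the abstract takes as its starting point can be sketched in a few lines. This is the classical Tipping–Bishop closed-form estimate, not the manifold construction of the paper; the synthetic dataset and dimensions are illustrative.

```python
import numpy as np

def ppca_fit(X, q):
    """Closed-form ML estimate for probabilistic PCA:
    x = W z + mu + noise, z ~ N(0, I_q), noise ~ N(0, sigma2 * I)."""
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False)
    evals, evecs = np.linalg.eigh(S)
    order = np.argsort(evals)[::-1]            # sort eigenvalues descending
    evals, evecs = evals[order], evecs[:, order]
    sigma2 = evals[q:].mean()                  # discarded variance -> noise level
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return mu, W, sigma2

# Synthetic 1-factor data in R^3 with small isotropic noise.
rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 1))
X = Z @ np.array([[3.0, 1.0, 0.5]]) + 0.1 * rng.normal(size=(500, 3))
mu, W, sigma2 = ppca_fit(X, q=1)
```

The paper's contribution is precisely to replace the linear map W z by stochastic development of a Euclidean semimartingale, so that no tangent-space linearization of the data is needed.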
Past, Present, and Future of Simultaneous Localization And Mapping: Towards the Robust-Perception Age
Simultaneous Localization and Mapping (SLAM) consists of the concurrent
construction of a model of the environment (the map) and the estimation of the
state of the robot moving within it. The SLAM community has made astonishing
progress over the last 30 years, enabling large-scale real-world applications,
and witnessing a steady transition of this technology to industry. We survey
the current state of SLAM. We start by presenting what is now the de-facto
standard formulation for SLAM. We then review related work, covering a broad
set of topics including robustness and scalability in long-term mapping, metric
and semantic representations for mapping, theoretical performance guarantees,
active SLAM and exploration, and other new frontiers. This paper simultaneously
serves as a position paper and a tutorial for users of SLAM. By
looking at the published research with a critical eye, we delineate open
challenges and new research issues that still deserve careful scientific
investigation. The paper also contains the authors' take on two questions that
often animate discussions during robotics conferences: Do robots need SLAM? and
Is SLAM solved?
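The de-facto standard formulation the survey presents is maximum a posteriori inference on a factor graph, i.e. a (nonlinear) least-squares problem over the robot poses. A toy 1-D pose-graph sketch, with made-up odometry and loop-closure measurements; real SLAM back-ends solve the nonlinear analogue iteratively.

```python
import numpy as np

# Unknowns: poses x0, x1, x2 on a line (x0 anchored at 0 by a prior factor).
# Each row of A is one factor (measurement); b holds the measured values.
A = np.array([
    [ 1.0, 0.0, 0.0],   # prior:        x0      = 0.0
    [-1.0, 1.0, 0.0],   # odometry:     x1 - x0 = 1.1 (drifted)
    [ 0.0,-1.0, 1.0],   # odometry:     x2 - x1 = 1.0
    [-1.0, 0.0, 1.0],   # loop closure: x2 - x0 = 2.0
])
b = np.array([0.0, 1.1, 1.0, 2.0])

# MAP estimate = least-squares solution of the (here linear) factor graph.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The loop closure pulls the final pose back from the drifted odometry sum (2.1) toward the measured 2.0, which is exactly the error-correcting role loop closures play in the survey's discussion of robustness.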
Physics-constrained non-Gaussian probabilistic learning on manifolds
An extension of the probabilistic learning on manifolds (PLoM), recently introduced by the authors, is presented: in addition to the initial data set given for performing the probabilistic learning, constraints are given, which correspond to statistics of experiments or of physical models. We consider a non-Gaussian random vector whose unknown probability distribution has to satisfy constraints. The method consists in constructing a generator using the PLoM and the classical Kullback-Leibler minimum cross-entropy principle. The resulting optimization problem is reformulated using Lagrange multipliers associated with the constraints. The optimal values of the Lagrange multipliers are computed using an efficient iterative algorithm. At each iteration, the Markov chain Monte Carlo algorithm developed for the PLoM is used, consisting in solving an Itô stochastic differential equation that is projected on a diffusion-maps basis. The method and the algorithm are efficient and allow the construction of probabilistic models for high-dimensional problems from small initial data sets and for which an arbitrary number of constraints are specified. The first application is sufficiently simple to be easily reproduced. The second one concerns a stochastic elliptic boundary value problem in high dimension.
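The core minimum cross-entropy idea with a Lagrange multiplier can be sketched in one dimension. This is only an illustration of the constrained-reweighting principle, not the paper's PLoM machinery (no diffusion maps, no Itô SDE); the base samples, target statistic, and learning rate are made up. The KL-closest tilting of a base sample that matches a target mean has weights proportional to exp(-lam * x), and lam is found by a simple iterative update on the dual.

```python
import numpy as np

def tilt_to_mean(samples, target_mean, lr=0.1, iters=2000):
    """Find the Lagrange multiplier lam such that the exponentially
    tilted weights w_i ~ exp(-lam * x_i) give a weighted mean equal
    to target_mean (minimum cross-entropy with one moment constraint)."""
    lam = 0.0
    for _ in range(iters):
        w = np.exp(-lam * samples)
        w /= w.sum()                          # normalized tilted weights
        current = float((w * samples).sum())  # constraint value at current lam
        lam += lr * (current - target_mean)   # iterative update on the dual
    return lam, w

rng = np.random.default_rng(1)
samples = rng.normal(size=5000)              # base (unconstrained) sample
lam, w = tilt_to_mean(samples, target_mean=-0.5)
```

For a standard normal base distribution the tilted law is N(-lam, 1), so lam converges near 0.5; in the paper the same multiplier computation is carried out with the PLoM generator in high dimension.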
Perspectives on Multi-Level Dynamics
As physics did in previous centuries, there is currently a common dream of
extracting generic laws of nature in economics, sociology, and neuroscience by
condensing the description of phenomena into a minimal set of variables and
parameters, linked together by causal equations of evolution whose structure
may reveal hidden principles. This requires a huge reduction of dimensionality
(number of degrees of freedom) and a change in the level of description. Beyond
the mere necessity of developing accurate techniques affording this reduction,
there is the question of the correspondence between the initial system and the
reduced one. In this paper, we offer a perspective towards a common framework
for discussing and understanding multi-level systems exhibiting structures at
various spatial and temporal levels. We propose a common foundation and
illustrate it with examples from different fields. We also point out the
difficulties in constructing such a general setting and its limitations.
Barycentric Subspace Analysis on Manifolds
This paper investigates the generalization of Principal Component Analysis
(PCA) to Riemannian manifolds. We first propose a new and general type of
family of subspaces in manifolds that we call barycentric subspaces. They are
implicitly defined as the locus of points which are weighted means of
reference points. As this definition relies on points and not on tangent
vectors, it can also be extended to geodesic spaces which are not Riemannian.
For instance, in stratified spaces, it naturally allows principal subspaces
that span several strata, which is impossible in previous generalizations of
PCA. We show that barycentric subspaces locally define a submanifold of
dimension k which generalizes geodesic subspaces. Second, we rephrase PCA in
Euclidean spaces as an optimization on flags of linear subspaces (a hierarchy
of properly embedded linear subspaces of increasing dimension). We show that
the Euclidean PCA minimizes the Accumulated Unexplained Variances by all the
subspaces of the flag (AUV). Barycentric subspaces are naturally nested,
allowing the construction of hierarchically nested subspaces. Optimizing the
AUV criterion to optimally approximate data points with flags of affine spans
in Riemannian manifolds leads to a particularly appealing generalization of PCA
on manifolds called Barycentric Subspace Analysis (BSA). Comment: To appear in
the Annals of Statistics, Institute of Mathematical Statistics.
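The "weighted mean of reference points" that implicitly defines a barycentric subspace can be made concrete on the sphere. A minimal sketch of a weighted Karcher (Fréchet) mean on S^2 via the standard log/exp fixed-point iteration; the reference points and weights are illustrative, and the paper's barycentric subspaces are the loci traced out as the weights vary.

```python
import numpy as np

def log_map(p, q):
    """Sphere log map: tangent vector at unit point p pointing to q,
    with length equal to the geodesic distance."""
    d = q - np.dot(p, q) * p                 # component of q orthogonal to p
    nd = np.linalg.norm(d)
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    return np.zeros_like(p) if nd < 1e-12 else theta * d / nd

def exp_map(p, v):
    """Sphere exp map: follow the geodesic from p along tangent vector v."""
    nv = np.linalg.norm(v)
    return p if nv < 1e-12 else np.cos(nv) * p + np.sin(nv) * v / nv

def weighted_mean(points, weights, iters=50):
    """Gradient iteration for the weighted Fréchet mean: repeatedly move
    along the weighted sum of log maps until it vanishes."""
    m = points[0] / np.linalg.norm(points[0])
    for _ in range(iters):
        v = sum(w * log_map(m, q) for w, q in zip(weights, points))
        m = exp_map(m, v)
    return m

# Three orthogonal reference points with equal weights.
refs = [np.array([1.0, 0.0, 0.0]),
        np.array([0.0, 1.0, 0.0]),
        np.array([0.0, 0.0, 1.0])]
m = weighted_mean(refs, [1 / 3, 1 / 3, 1 / 3])
```

By symmetry the equal-weight mean is (1,1,1)/sqrt(3); because the definition uses only points and distances, not tangent vectors at a base point, the same recipe extends to the non-Riemannian geodesic spaces mentioned in the abstract.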