ABC random forests for Bayesian parameter inference
This preprint has been reviewed and recommended by Peer Community In
Evolutionary Biology (http://dx.doi.org/10.24072/pci.evolbiol.100036).
Approximate Bayesian computation (ABC) has grown into a standard methodology
that manages Bayesian inference for models associated with intractable
likelihood functions. Most ABC implementations require the preliminary
selection of a vector of informative statistics summarizing raw data.
Furthermore, in almost all existing implementations, the tolerance level that
separates acceptance from rejection of simulated parameter values needs to be
calibrated. We propose to conduct likelihood-free Bayesian inferences about
parameters with no prior selection of the relevant components of the summary
statistics and bypassing the derivation of the associated tolerance level. The
approach relies on the random forest methodology of Breiman (2001) applied in a
(nonparametric) regression setting. We advocate the derivation of a new random
forest for each component of the parameter vector of interest. When compared
with earlier ABC solutions, this method offers significant gains in terms of
robustness to the choice of the summary statistics, does not depend on any type
of tolerance level, and offers a good trade-off between point estimator
precision and credible interval estimation for a given computing
time. We illustrate the performance of our methodological proposal and compare
it with earlier ABC methods on a Normal toy example and a population genetics
example dealing with human population evolution. All methods designed here have
been incorporated in the R package abcrf (version 1.7) available on CRAN.
Comment: Main text: 24 pages, 6 figures. Supplementary Information: 14 pages, 5 figures.
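The core idea above, one regression random forest per parameter trained on a reference table simulated from the prior, can be sketched in a few lines. This is a minimal Python illustration on the Normal toy example, not the abcrf implementation; the prior, summary statistics, and forest settings here are illustrative assumptions.

```python
# Sketch of ABC via random-forest regression (in the spirit of the method
# described above, NOT the abcrf package itself): train one regression
# forest per parameter on (summary statistics, parameter) pairs simulated
# from the prior, with no summary selection and no tolerance level.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    """Normal toy model: data ~ N(theta, 1); return a vector of summaries."""
    y = rng.normal(theta, 1.0, size=n)
    # Deliberately include weakly informative summaries; the forest is
    # expected to rely mostly on the informative ones (mean, median).
    return np.array([y.mean(), np.median(y), y.std(), y.min(), y.max()])

# Reference table: draw parameters from a (here uniform) prior, simulate.
prior_draws = rng.uniform(-10, 10, size=2000)
table = np.stack([simulate(t) for t in prior_draws])

forest = RandomForestRegressor(n_estimators=200, random_state=0)
forest.fit(table, prior_draws)

# "Observed" data generated with theta = 2.0.
s_obs = simulate(2.0).reshape(1, -1)
point_estimate = forest.predict(s_obs)[0]
# Spread of per-tree predictions gives a rough measure of uncertainty
# (a stand-in for the quantile machinery used for credible intervals).
tree_preds = np.array([t.predict(s_obs)[0] for t in forest.estimators_])
```

No acceptance threshold appears anywhere: the forest's averaging over leaves plays the role that the tolerance plays in rejection ABC.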
Entropy-based parametric estimation of spike train statistics
We consider the evolution of a network of neurons, focusing on the asymptotic
behavior of spikes dynamics instead of membrane potential dynamics. The spike
response is not sought as a deterministic response in this context, but as a
conditional probability: "Reading out the code" consists of inferring such a
probability. This probability is computed from empirical raster plots, by using
the framework of thermodynamic formalism in ergodic theory. This gives us a
parametric statistical model where the probability has the form of a Gibbs
distribution. In this respect, this approach generalizes the seminal and
profound work of Schneidman and collaborators. A minimal presentation of the
formalism is reviewed here, while a general algorithmic estimation method is
proposed yielding fast convergent implementations. It is also made explicit how
several spike observables (entropy, rate, synchronizations, correlations) are
given in closed-form from the parametric estimation. This paradigm does not
only allow us to estimate the spike statistics, given a design choice, but also
to compare different models, thus answering comparative questions about the
neural code such as: "are correlations (or time synchrony, or a given set of
spike patterns, ...) significant with respect to rate coding only?" A numerical
validation of the method is proposed and the perspectives regarding spike-train
code analysis are also discussed.
Comment: 37 pages, 8 figures, submitted
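The fitting step described above, estimating a Gibbs-form distribution whose feature averages match empirical spike statistics, can be illustrated on a toy scale. This is a generic maximum-entropy sketch, not the paper's thermodynamic-formalism algorithm; the choice of features (rates and pairwise synchrony) and the plain gradient ascent are assumptions made for illustration.

```python
# Toy maximum-entropy (Gibbs) fit for spike-word statistics: find lambda so
# that P(w) proportional to exp(sum_k lambda_k f_k(w)) over binary words w
# matches empirical feature averages; entropy then follows in closed form.
import itertools
import numpy as np

rng = np.random.default_rng(1)
N = 3  # neurons; 2**N spike words
words = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)

def features(w):
    # Firing indicators (one per neuron) plus pairwise synchrony w_i * w_j.
    pairs = [w[i] * w[j] for i in range(N) for j in range(i + 1, N)]
    return np.concatenate([w, pairs])

F = np.stack([features(w) for w in words])  # (8 words, 6 features)

# Synthetic "empirical raster": i.i.d. words from a hidden distribution.
true_p = rng.dirichlet(np.ones(len(words)))
sample = rng.choice(len(words), size=20000, p=true_p)
emp_mean = F[sample].mean(axis=0)           # empirical feature averages

# Fit lambda by gradient ascent on the log-likelihood: the gradient is
# (empirical feature means) - (model feature means).
lam = np.zeros(F.shape[1])
for _ in range(2000):
    logits = F @ lam
    p = np.exp(logits - logits.max())
    p /= p.sum()                            # current Gibbs distribution
    lam += 0.1 * (emp_mean - F.T @ p)

entropy = -(p * np.log(p)).sum()            # closed-form from the fitted model
```

Comparing models (e.g. rates only vs. rates plus synchrony) then amounts to refitting with a restricted feature set and comparing likelihoods.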
copulaedas: An R Package for Estimation of Distribution Algorithms Based on Copulas
The use of copula-based models in EDAs (estimation of distribution
algorithms) is currently an active area of research. In this context, the
copulaedas package for R provides a platform where EDAs based on copulas can be
implemented and studied. The package offers complete implementations of various
EDAs based on copulas and vines, a group of well-known optimization problems,
and utility functions to study the performance of the algorithms. Newly
developed EDAs can be easily integrated into the package by extending an S4
class with generic functions for their main components. This paper presents
copulaedas by providing an overview of EDAs based on copulas, a description of
the implementation of the package, and an illustration of its use through
examples. The examples include running the EDAs defined in the package,
implementing new algorithms, and performing an empirical study to compare the
behavior of different algorithms on benchmark functions and a real-world
problem.
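To make the EDA-with-copulas idea concrete, here is a toy Python re-implementation of the scheme underlying a Gaussian-copula EDA, not the copulaedas package itself (which is R, with S4 extension points): each generation, select the best individuals, fit normal marginals plus a Gaussian copula (a correlation matrix on the standardized scores), and sample the next population from the fitted model. The benchmark function and population sizes are illustrative.

```python
# Toy Gaussian-copula EDA minimizing the sphere function -- an illustrative
# sketch of the algorithm family, NOT code from the copulaedas package.
import numpy as np

rng = np.random.default_rng(2)

def sphere(x):
    return (x ** 2).sum(axis=1)

dim, pop_size, n_gen = 2, 100, 40
pop = rng.uniform(-5, 5, size=(pop_size, dim))

for _ in range(n_gen):
    fitness = sphere(pop)
    selected = pop[np.argsort(fitness)[: pop_size // 2]]  # truncation selection
    # Marginal model: a normal distribution per variable.
    mu = selected.mean(axis=0)
    sigma = selected.std(axis=0) + 1e-12
    # Dependence model: Gaussian copula, i.e. the correlation matrix of the
    # standardized (normal-score) selected individuals.
    z = (selected - mu) / sigma
    corr = np.corrcoef(z, rowvar=False)
    # Sampling: correlated normal scores mapped back through the marginals.
    z_new = rng.multivariate_normal(np.zeros(dim), corr, size=pop_size)
    pop = mu + sigma * z_new

best = sphere(pop).min()
```

Swapping the dependence model (vines, other copula families) while keeping the select/estimate/sample loop fixed is exactly the kind of variation the package's S4 design is meant to support.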
A sharp interface isogeometric strategy for moving boundary problems
The proposed methodology is first utilized to model stationary and propagating cracks. The crack face is enriched with the Heaviside function, which captures the displacement discontinuity. Meanwhile, the crack tips are enriched with asymptotic displacement functions to reproduce the tip singularity. The enriching degrees of freedom associated with the crack tips are chosen as stress intensity factors (SIFs), such that these quantities can be directly extracted from the solution without an a posteriori integral calculation.
As a second application, the Stefan problem is modeled with a hybrid function/derivative enriched interface. Since the interface geometry is explicitly defined, normals and curvatures can be analytically obtained at any point on the interface, allowing for complex boundary conditions dependent on curvature or normal to be naturally imposed. Thus, the enriched approximation naturally captures the interfacial discontinuity in temperature gradient and enables the imposition of Gibbs-Thomson condition during solidification simulation.
Lastly, shape optimization through the configuration of finite-sized heterogeneities is studied. The optimization relies on the recently derived configurational derivative, which describes the sensitivity of an arbitrary objective with respect to arbitrary design modifications of a heterogeneity inserted into a domain. The THB-splines, which serve as the underlying approximation, produce a sufficiently smooth solution near the boundaries of the heterogeneity for accurate calculation of the configurational derivatives. (Abstract shortened by ProQuest.)
Autonomous search of an airborne release in urban environments using informed tree planning
The use of autonomous vehicles for chemical source localisation is a key
enabling tool for disaster response teams to safely and efficiently deal with
chemical emergencies. Whilst much work has been performed on source
localisation using autonomous systems, most previous works have assumed an open
environment or employed simplistic obstacle avoidance, separate to the
estimation procedure. In this paper, we explore the coupling of the path
planning task for both source term estimation and obstacle avoidance in a
holistic framework. The proposed system intelligently produces potential gas
sampling locations based on the current estimation of the wind field and the
local map. Then a tree search is performed to generate paths toward the
estimated source location that traverse around any obstacles and still allow
for exploration of potentially superior sampling locations. The proposed
informed tree planning algorithm is then tested against the Entrotaxis
technique in a series of high-fidelity simulations. The proposed system is
found to reduce source position error far more efficiently than Entrotaxis in a
feature-rich environment, whilst also exhibiting vastly more consistent and
robust results.
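The obstacle-aware planning component can be illustrated at its simplest: given a current source estimate and an occupancy map, search for a path that reaches the estimated source while routing around obstacles. The sketch below uses a plain breadth-first grid search, not the informed tree planner proposed in the paper, and the grid, start, and goal are invented for illustration.

```python
# Toy obstacle-aware planner: breadth-first search on an occupancy grid
# toward the cell holding the current source estimate. This stands in for
# (and is much simpler than) the informed tree planning described above.
from collections import deque

def plan(grid, start, goal):
    """grid: list of strings, '#' = obstacle. Returns a cell path or None."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                       # reconstruct path to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in parent):
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

# A wall forces the planner to detour through the open bottom row.
grid = ["....#....",
        "....#....",
        "....#....",
        "........."]
path = plan(grid, start=(0, 0), goal=(0, 8))  # goal = estimated source cell
```

In the full system this planning step would be re-run as the source term estimate and wind-field estimate are updated by new gas samples.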
Feedback control of parametrized PDEs via model order reduction and dynamic programming principle
In this paper, we investigate infinite horizon optimal control problems for parametrized partial differential equations. We are interested in feedback control via dynamic programming equations, which are well known to suffer from the curse of dimensionality. Thus, we apply parametric model order reduction techniques to construct low-dimensional subspaces carrying suitable information on the control problem, on which the dynamic programming equations can be approximated. To guarantee a low number of basis functions, we combine recent basis generation methods and parameter partitioning techniques. Furthermore, we present a novel technique to construct non-uniform grids in the reduced domain, based on statistical information. Finally, we discuss numerical examples to illustrate the effectiveness of the proposed methods for PDEs in two space dimensions.
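The dynamic programming principle at the heart of this approach can be shown on a deliberately tiny example: value iteration for an infinite-horizon discounted problem on a one-dimensional state grid, standing in for the reduced domain. The dynamics, cost, grid, and discount factor below are illustrative assumptions; none of the model-order-reduction machinery is reproduced.

```python
# Value iteration for an infinite-horizon discounted control problem on a
# 1-D grid (a stand-in for the reduced domain; the MOR step is not shown).
import numpy as np

xs = np.linspace(-1.0, 1.0, 81)          # state grid (reduced coordinates)
us = np.array([-0.5, 0.0, 0.5])          # finite control set
dt, gamma = 0.1, 0.95                    # time step, discount factor

def dynamics(x, u):
    return x + dt * u                     # simple controlled integrator

def stage_cost(x, u):
    return x ** 2 + 0.1 * u ** 2          # drive the state to the origin

V = np.zeros_like(xs)
for _ in range(500):                      # iterate the Bellman operator
    Q = np.empty((len(xs), len(us)))
    for j, u in enumerate(us):
        x_next = np.clip(dynamics(xs, u), xs[0], xs[-1])
        # Values off the grid are obtained by linear interpolation.
        Q[:, j] = stage_cost(xs, u) + gamma * np.interp(x_next, xs, V)
    V_new = Q.min(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-10:  # fixed point reached
        V = V_new
        break
    V = V_new

# Feedback law: greedy control with respect to the converged value function.
policy = us[Q.argmin(axis=1)]
```

The curse of dimensionality is visible even here: the grid grows exponentially with the state dimension, which is precisely why the value function is approximated on a low-dimensional reduced domain (with non-uniform grids) rather than on the full PDE state space.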