Bounding rare event probabilities in computer experiments
We are interested in bounding probabilities of rare events in the context of
computer experiments. These rare events depend on the output of a physical
model with random input variables. Since the model is only known through an
expensive black box function, standard efficient Monte Carlo methods designed
for rare events cannot be used. We then propose a strategy to deal with this
difficulty based on importance sampling methods. This proposal relies on
Kriging metamodeling and is able to achieve sharp upper confidence bounds on
the rare event probabilities. The variability due to the Kriging metamodeling
step is properly taken into account. The proposed methodology is applied to a
toy example and compared to more standard Bayesian bounds. Finally, a
challenging real case study is analyzed. It consists of finding an upper bound
of the probability that the trajectory of an airborne load will collide with
the aircraft that has released it.
Comment: 21 pages, 6 figures
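The abstract does not spell out the authors' Kriging-based estimator, but the core idea of importance sampling for rare-event probabilities can be illustrated on a toy problem. The sketch below (a minimal illustration, not the paper's method; the N(0,1) model, the mean-shifted proposal, and all names are assumptions) estimates P(X > t) by shifting the sampling distribution toward the rare region and reweighting by the likelihood ratio:

```python
import math
import random

def rare_event_is(threshold, n=100_000, shift=None, seed=0):
    """Estimate p = P(X > threshold) for X ~ N(0, 1) by importance
    sampling with a mean-shifted proposal N(shift, 1)."""
    rng = random.Random(seed)
    if shift is None:
        shift = threshold  # place the proposal mass near the rare region
    total = 0.0
    for _ in range(n):
        y = rng.gauss(shift, 1.0)  # draw from the proposal
        if y > threshold:
            # likelihood ratio phi(y) / phi(y - shift) = exp(-shift*y + shift^2/2)
            total += math.exp(-shift * y + 0.5 * shift * shift)
    return total / n

p_hat = rare_event_is(4.0)  # P(X > 4) is about 3.2e-5
```

Plain Monte Carlo would need on the order of millions of draws to see even a handful of such events; the shifted proposal makes most draws land in the rare region, and the weights correct the bias.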
Maximin design on non hypercube domain and kernel interpolation
In the paradigm of computer experiments, the choice of an experimental design
is an important issue. When no information is available about the black-box
function to be approximated, an exploratory design has to be used. In this
context, two dispersion criteria are usually considered: the minimax and the
maximin ones. In the case of a hypercube domain, a standard strategy consists
of taking the maximin design within the class of Latin hypercube designs.
However, in a non-hypercube context, it does not make sense to use the Latin
hypercube strategy. Moreover, whatever the design is, the black-box function is
typically approximated by kernel interpolation. Here, we first provide a
theoretical justification for the maximin criterion with respect to kernel
interpolation. Then, we propose simulated annealing algorithms to determine
maximin designs in any bounded connected domain. We prove the convergence of
the different schemes.
Comment: 3 figures
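The paper's convergent annealing schemes are not reproduced in the abstract; as a rough sketch of the idea, the toy code below (all parameter values, the unit-disk domain, and the linear cooling schedule are assumptions, not the authors' algorithm) searches for a maximin design in a non-hypercube domain by perturbing one point at a time and occasionally accepting degradations:

```python
import math
import random

def maximin_sa(n_pts=8, n_iter=20_000, seed=1):
    """Toy simulated annealing for a maximin design in the unit disk
    (a non-hypercube domain): maximize the minimal pairwise distance."""
    rng = random.Random(seed)

    def in_disk(p):
        return p[0] ** 2 + p[1] ** 2 <= 1.0

    def min_dist(pts):
        return min(math.dist(a, b)
                   for i, a in enumerate(pts) for b in pts[i + 1:])

    # start from random points inside the domain
    pts = []
    while len(pts) < n_pts:
        p = (rng.uniform(-1, 1), rng.uniform(-1, 1))
        if in_disk(p):
            pts.append(p)

    crit = min_dist(pts)
    for it in range(n_iter):
        temp = 0.1 * (1.0 - it / n_iter) + 1e-6  # linear cooling schedule
        i = rng.randrange(n_pts)
        cand = (pts[i][0] + rng.gauss(0, 0.1), pts[i][1] + rng.gauss(0, 0.1))
        if not in_disk(cand):
            continue  # candidate moves must stay inside the domain
        new_pts = pts[:i] + [cand] + pts[i + 1:]
        new_crit = min_dist(new_pts)
        # accept improvements always, degradations with annealed probability
        if new_crit > crit or rng.random() < math.exp((new_crit - crit) / temp):
            pts, crit = new_pts, new_crit
    return pts, crit

pts, crit = maximin_sa()
```

The domain membership test is the only place the geometry enters, which is what makes this approach usable on arbitrary bounded connected domains where Latin hypercube constructions do not apply.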
Approximate Bayesian Computational methods
Also known as likelihood-free methods, approximate Bayesian computational
(ABC) methods have appeared in the past ten years as the most satisfactory
approach to intractable likelihood problems, first in genetics and then in a
broader spectrum of applications. However, these methods suffer from
calibration difficulties that make their implementation rather volatile and
thus render them suspect to users of more traditional
Monte Carlo methods. In this survey, we study the various improvements and
extensions made to the original ABC algorithm in recent years.
Comment: 7 figures
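The original ABC rejection algorithm that this survey takes as its starting point fits in a few lines. The sketch below is a minimal illustration on an assumed toy model (normal mean with a uniform prior, sample mean as summary statistic; none of these choices come from the survey): simulate from the prior, and keep a parameter whenever its simulated summary lands within a tolerance of the observed one, with no likelihood evaluation anywhere.

```python
import random

def abc_rejection(obs_mean, n_obs=50, n_sims=50_000, eps=0.05, seed=2):
    """Minimal ABC rejection sampler: infer the mean mu of a N(mu, 1)
    model from the observed sample mean, without a likelihood.
    Prior: mu ~ Uniform(-5, 5)."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        mu = rng.uniform(-5, 5)  # draw from the prior
        sim = sum(rng.gauss(mu, 1) for _ in range(n_obs)) / n_obs
        if abs(sim - obs_mean) < eps:  # distance on the summary statistic
            accepted.append(mu)
    return accepted

post = abc_rejection(obs_mean=1.3)  # approximate posterior sample for mu
```

The calibration difficulties the survey mentions are visible even here: the tolerance eps and the choice of summary statistic both change which parameters get accepted, and there is no universal rule for setting them.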
The occurrence of ribonucleic acid in the lutoid fraction (Lysosomal Compartment) from Hevea brasiliensis Künth. (Müll-Arg.) latex
Science and society interacting on the Internet: Elements for a history of electronic publishing in the humanities and social sciences
doi: 10.4074/S0336150009001100
The rise of digital networks is a strategic moment in the complicated history of the relationship between science and society, both in terms of their technical development and of the changes this development produces in the forms of scientific communication. The particular case of the humanities and social sciences clearly highlights the mediating role that information and communication technologies play in the relationship between science and society.
Digital humanities: The state of the art and the position of French research in the international context
If, a few years ago, it might have seemed odd to pair the digital with the humanities, the scientific world today sees computer scientists collaborating with sociologists, and engineers with literature specialists. These unprecedented alliances are profoundly renewing the forms, the rhythms, and the circulation of the humanities and social sciences. Understanding a scientific revolution: taking stock of this ongoing transformation and measuring the part French research can play in it is the twofold aim of the study on the digital humanities now published by the Institut français. Written by Marin Dacos and Pierre Mounier, who run the Centre pour l'édition électronique ouverte (a major French actor in this field through the OpenEdition portal, a set of tools well known to researchers), Humanités numériques – État des lieux et positionnement de la recherche française dans le contexte international maps out a rapidly expanding field. Whether it is publishing vast archival collections online, analyzing in real time the impact of social networks on activist mobilization, or putting a community of Internet users to work on whole swathes of the cultural heritage, the digital turn now concerns every sector of research on society, works, and culture: the study published here sheds precise and accessible light on this global transformation.
Efficient learning in ABC algorithms
Approximate Bayesian Computation has been successfully used in population
genetics to bypass the calculation of the likelihood. These methods provide
accurate estimates of the posterior distribution by comparing the observed
dataset to a sample of datasets simulated from the model. Although
parallelization is easily achieved, computation times for ensuring a suitable
approximation quality of the posterior distribution are still high. To
alleviate the computational burden, we propose an adaptive, sequential
algorithm that runs faster than other ABC algorithms but maintains accuracy of
the approximation. This proposal relies on the sequential Monte Carlo sampler
of Del Moral et al. (2012) but is calibrated to reduce the number of
simulations from the model. The paper concludes with numerical experiments on a
toy example and on a population genetic study of Apis mellifera, where our
algorithm was shown to be faster than traditional ABC schemes.
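The abstract describes a calibrated sequential sampler whose details are in the paper; the sketch below shows only the common population structure such methods share (resample promising particles, perturb, shrink the tolerance adaptively). It is a simplified unweighted illustration on an assumed toy model, not the calibrated SMC sampler of Del Moral et al. that the paper builds on:

```python
import random
import statistics

def abc_smc(obs_mean, n_obs=50, n_particles=500, n_gen=5, seed=3):
    """Toy sequential ABC: each generation keeps particles within the
    current tolerance, resamples and perturbs them, and shrinks the
    tolerance to a quantile of the observed distances."""
    rng = random.Random(seed)

    def simulate(mu):
        return sum(rng.gauss(mu, 1) for _ in range(n_obs)) / n_obs

    # generation 0: sample from the Uniform(-5, 5) prior
    particles = [rng.uniform(-5, 5) for _ in range(n_particles)]
    dists = [abs(simulate(mu) - obs_mean) for mu in particles]

    for _ in range(n_gen):
        eps = statistics.quantiles(dists, n=4)[0]  # shrink to the 25% quantile
        survivors = [mu for mu, d in zip(particles, dists) if d <= eps]
        new_particles, new_dists = [], []
        while len(new_particles) < n_particles:
            mu = rng.choice(survivors) + rng.gauss(0, 0.2)  # resample + perturb
            if not -5 <= mu <= 5:
                continue  # respect the prior support
            new_particles.append(mu)
            new_dists.append(abs(simulate(mu) - obs_mean))
        particles, dists = new_particles, new_dists
    return particles

particles = abc_smc(obs_mean=1.0)
```

The adaptive tolerance is where the computational savings come from: instead of fixing a tight eps up front and rejecting almost everything, each generation only has to improve on the previous one, so far fewer model simulations are wasted.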
Universality of Tip Singularity Formation in Freezing Water Drops
A drop of water deposited on a cold plate freezes into an ice drop with a
pointy tip. While this phenomenon clearly finds its origin in the expansion of
water upon freezing, a quantitative description of the tip singularity has
remained elusive. Here we demonstrate how the geometry of the freezing front,
determined by heat transfer considerations, is crucial for the tip formation.
We perform systematic measurements of the angles of the conical tip, and reveal
the dynamics of the solidification front in a Hele-Shaw geometry. It is found
that the cone angle is independent of substrate temperature and wetting angle,
suggesting a universal, self-similar mechanism that does not depend on the rate
of solidification. We propose a model for the freezing front and derive
resulting tip angles analytically, in good agreement with observations.
Comment: Letter format, 5 pages, 3 figures. Note: authors AGM and ORE contributed equally to the paper