Asteroseismology of Massive Stars: Some Words of Caution
Although massive stars play a key role in our understanding of the supernova
phenomenon, their evolution still suffers from uncertainties in their
structure, even during their "quiet" main sequence phase and later during
their subgiant and helium burning phases. What is the extent of the mixed
central region? In the local mixing length theory (LMLT) frame, are there
structural differences between the Schwarzschild and Ledoux convection
criteria? Where are the convective zone boundaries located? Are there
intermediate convection zones during the MS and post-MS phases, and what are
their extent and location? We discuss these points and show how asteroseismology could shed
some light on these questions.
Comment: 10 pages, 5 figures, IAU Symposium 307, New windows on massive stars: asteroseismology, interferometry, and spectropolarimetry, G. Meynet, C. Georgy, J.H. Groh & Ph. Stee, eds.
Validating Predictions of Unobserved Quantities
The ultimate purpose of most computational models is to make predictions,
commonly in support of some decision-making process (e.g., for design or
operation of some system). The quantities that need to be predicted (the
quantities of interest or QoIs) are generally not experimentally observable
before the prediction, since otherwise no prediction would be needed. Assessing
the validity of such extrapolative predictions, which is critical to informed
decision-making, is challenging. In classical approaches to validation, model
outputs for observed quantities are compared to observations to determine if
they are consistent. By itself, this consistency only ensures that the model
can predict the observed quantities under the conditions of the observations.
This limitation dramatically reduces the utility of the validation effort for
decision making because it implies nothing about predictions of unobserved QoIs
or for scenarios outside of the range of observations. However, there is no
agreement in the scientific community today regarding best practices for
validation of extrapolative predictions made using computational models. The
purpose of this paper is to propose and explore a validation and predictive
assessment process that supports extrapolative predictions for models with
known sources of error. The process includes stochastic modeling, calibration,
validation, and predictive assessment phases where representations of known
sources of uncertainty and error are built, informed, and tested. The proposed
methodology is applied to an illustrative extrapolation problem involving a
misspecified nonlinear oscillator.
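The limitation of classical validation described in this abstract can be seen in a toy sketch. This is not the paper's methodology; the model, data, and numbers below are invented for illustration. A misspecified linear model is calibrated against observations of a weakly nonlinear system, passes a classical consistency check over the observed range, yet badly mispredicts an extrapolative quantity of interest:

```python
def calibrate_slope(xs, ys):
    """Least-squares slope for the (misspecified) linear model y = a*x."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# "Truth": a weakly nonlinear response, standing in for the paper's
# misspecified-oscillator setting (purely illustrative).
truth = lambda x: x + 0.1 * x * x

xs_obs = [0.1 * i for i in range(1, 11)]      # observations on [0.1, 1.0]
a = calibrate_slope(xs_obs, [truth(x) for x in xs_obs])

# Classical validation: the model matches the observed quantities well...
err_obs = max(abs(a * x - truth(x)) for x in xs_obs)
# ...but the extrapolative QoI at x = 10 is badly mispredicted.
err_qoi = abs(a * 10 - truth(10))
```

The small observed-range error says nothing about the extrapolation error, which is the gap the paper's calibration-validation-assessment process is designed to address.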
Detection of genuinely entangled and non-separable N-partite quantum states
We investigate the detection of entanglement in N-partite quantum states.
We obtain practical separability criteria to identify genuinely entangled and
non-separable mixed quantum states. No numerical optimization or eigenvalue
evaluation is needed, and our criteria can be evaluated by simple computations
involving components of the density matrix. We provide examples in which our
criteria perform better than all known separability criteria. Specifically, we
are able to detect genuine N-partite entanglement that has previously not
been identified. In addition, our criteria can be used in today's experiments.
Comment: 8 pages, one figure.
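The idea of certifying entanglement "by simple computations involving components of the density matrix" can be illustrated with a classic two-qubit sufficient condition derived from the partial-transpose (PPT) criterion. This is emphatically not the paper's N-partite criteria, only a minimal sketch of the same style of test; all function names are ours:

```python
def ppt_two_qubit_check(rho):
    """Sufficient entanglement test for a two-qubit density matrix rho
    (4x4 nested list, basis |00>,|01>,|10>,|11>): if
    |rho[0][3]|^2 > rho[1][1]*rho[2][2], a 2x2 principal minor of the
    partial transpose is negative, so the state violates PPT and is
    entangled. Passing the test is inconclusive."""
    return abs(rho[0][3]) ** 2 > rho[1][1].real * rho[2][2].real

def werner_state(p):
    """Werner state p*|Phi+><Phi+| + (1-p)*I/4, used here as a test case."""
    rho = [[(1 - p) / 4 if i == j else 0.0 for j in range(4)] for i in range(4)]
    rho[0][0] += p / 2
    rho[3][3] += p / 2
    rho[0][3] += p / 2
    rho[3][0] += p / 2
    return rho
```

For the Werner family this minor test detects entanglement exactly for p > 1/3, the known separability threshold, using only arithmetic on density-matrix components with no eigenvalue evaluation.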
Sampling design may obscure species–area relationships in landscape-scale field studies
We investigated 1) the role of area per se in explaining anuran species richness on reservoir forest islands, after controlling for several confounding factors. We also assessed 2) how sampling design affects the inferential power of island species–area relationships (ISARs) aiming to 3) provide guidelines to yield reliable estimates of area-induced species losses in patchy systems. We surveyed anurans with autonomous recording units at 151 plots located on 74 islands and four continuous forest sites at the Balbina Hydroelectric Reservoir landscape, central Brazilian Amazonia. We applied semi-log ISAR models to assess the effect of sampling design on the fit and slope of species–area curves. To do so, we subsampled our surveyed islands following both a 1) stratified and 2) non-stratified random selection of 5, 10, 15, 20 and 25 islands covering 1) the full range in island size (0.45–1699 ha) and 2) only islands smaller than 100 ha, respectively. We also compiled 25 datasets from the literature to assess the generality of our findings. Island size explained ca. half of the variation in species richness. The fit and slope of species–area curves were affected mainly by the range in island size considered, and to a very small extent by the number of islands surveyed. In our literature review, all datasets covering a range of patch sizes larger than 300 ha yielded a positive ISAR, whereas the number of patches alone did not affect the detection of ISARs. We conclude that 1) area per se plays a major role in explaining anuran species richness on forest islands within an Amazonian anthropogenic archipelago; 2) the inferential power of island species–area relationships is severely degraded by sub-optimal sampling designs; 3) at least 10 habitat patches spanning three orders of magnitude in size should be surveyed to yield reliable species–area estimates in patchy systems.
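The semi-log ISAR model used in this abstract, S = c + z·log₁₀(A), can be fitted with a few lines of ordinary least squares. A minimal sketch, with invented island areas and richness values (the function name and data are ours, not the paper's):

```python
import math

def fit_semilog_isar(areas_ha, richness):
    """Fit the semi-log species-area model S = c + z*log10(A) by
    ordinary least squares; returns (intercept c, slope z)."""
    x = [math.log10(a) for a in areas_ha]
    y = list(richness)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    z = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    c = my - z * mx
    return c, z

# Hypothetical islands spanning ~3 orders of magnitude in size, per the
# paper's recommendation for reliable species-area estimates.
areas = [0.5, 2, 8, 30, 120, 450, 1700]   # ha
species = [3, 5, 7, 9, 12, 14, 17]        # anuran richness
c, z = fit_semilog_isar(areas, species)
```

Truncating the area range (e.g., dropping the islands above 100 ha) changes both the fit and the slope z, which is the sampling-design effect the study quantifies.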
Using genotype abundance to improve phylogenetic inference
Modern biological techniques enable very dense genetic sampling of unfolding
evolutionary histories, and thus frequently sample some genotypes multiple
times. This motivates strategies to incorporate genotype abundance information
in phylogenetic inference. In this paper, we synthesize a stochastic process
model with standard sequence-based phylogenetic optimality, and show that tree
estimation is substantially improved by doing so. Our method is validated with
extensive simulations and an experimental single-cell lineage tracing study of
germinal center B cell receptor affinity maturation.
Extreme-Point-based Heuristics for the Three-Dimensional Bin Packing problem
One of the main issues in addressing three-dimensional packing problems is finding an efficient and accurate definition of the points at which to place the items inside the bins, because the performance of exact and heuristic solution methods is strongly influenced by the choice of a placement rule. We introduce the extreme point concept and present a new extreme point-based rule for packing items inside a three-dimensional container. The extreme point rule is independent of the particular packing problem addressed and can handle additional constraints, such as fixing the position of the items. The new extreme point rule is also used to derive new constructive heuristics for the three-dimensional bin-packing problem. Extensive computational results show the effectiveness of the new heuristics compared to state-of-the-art results. Moreover, the same heuristics, when applied to the two-dimensional bin-packing problem, outperform those specifically designed for the problem.
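A minimal sketch of the extreme-point idea can show how placement candidates are generated and consumed. This greatly simplifies the paper's rule: the full version projects new points onto neighboring items and bin walls, which is omitted here, and all names below are ours:

```python
def new_extreme_points(placed_at, dims):
    """Simplified extreme-point generation: placing an item of size
    (w, d, h) at (x, y, z) yields three candidate corner points."""
    x, y, z = placed_at
    w, d, h = dims
    return [(x + w, y, z), (x, y + d, z), (x, y, z + h)]

def fits(bin_size, placed, point, dims):
    """True if an axis-aligned box of size dims at point stays inside the
    bin and overlaps no already-placed (point, dims) pair."""
    if any(point[a] + dims[a] > bin_size[a] for a in range(3)):
        return False
    for p, d in placed:
        if all(point[a] < p[a] + d[a] and p[a] < point[a] + dims[a]
               for a in range(3)):
            return False
    return True

def pack_single_bin(bin_size, items):
    """Greedy single-bin packing: try each item (largest volume first) at
    the lexicographically lowest feasible extreme point."""
    placed, eps = [], [(0, 0, 0)]
    for dims in sorted(items, key=lambda d: -d[0] * d[1] * d[2]):
        for ep in sorted(eps):
            if fits(bin_size, placed, ep, dims):
                placed.append((ep, dims))
                eps.remove(ep)
                eps.extend(new_extreme_points(ep, dims))
                break
    return placed
```

Because the candidate points are maintained independently of the objective, the same placement machinery can back different constructive heuristics, which is the portability the abstract claims for the extreme-point rule.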