What is the best risk measure in practice? A comparison of standard measures
Expected Shortfall (ES) has been widely accepted as a risk measure that is
conceptually superior to Value-at-Risk (VaR). At the same time, however, it has
been criticised for issues relating to backtesting. In particular, ES has been
found not to be elicitable, which means that backtesting for ES is less
straightforward than, e.g., backtesting for VaR. Expectiles have been suggested
as potentially better alternatives to both ES and VaR. In this paper, we
revisit commonly accepted desirable properties of risk measures like coherence,
comonotonic additivity, robustness and elicitability. We check VaR, ES and
Expectiles with regard to whether or not they enjoy these properties, with
particular emphasis on Expectiles. We also consider their impact on capital
allocation, an important issue in risk management. We find that, despite the
caveats that apply to the estimation and backtesting of ES, it can be
considered a good risk measure. As a consequence, there is insufficient
evidence to justify an all-inclusive replacement of ES by Expectiles in
applications. For backtesting ES, we propose an empirical approach that
consists in replacing ES by a set of four quantiles, which should allow the
use of backtesting methods developed for VaR.
Keywords: Backtesting; capital allocation; coherence; diversification;
elicitability; expected shortfall; expectile; forecasts; probability integral
transform (PIT); risk measure; risk management; robustness; value-at-risk
Comment: 27 pages, 1 table
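To make the four-quantile idea concrete, here is a minimal Python sketch that approximates ES at the 97.5% level by the average of four quantiles spread across the tail. The specific levels below (a midpoint rule for the tail integral defining ES) are an illustrative assumption, not necessarily the levels proposed in the paper; each quantile can then be backtested with standard VaR machinery.

```python
import numpy as np

def empirical_es(losses, alpha=0.975):
    """Empirical Expected Shortfall: mean loss beyond the alpha-quantile (VaR)."""
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def es_from_four_quantiles(losses, alpha=0.975):
    """Approximate ES = (1/(1-alpha)) * integral of q_u over [alpha, 1]
    by a 4-point midpoint rule, i.e. the average of four quantiles;
    the choice of levels is an illustrative assumption."""
    levels = alpha + (np.arange(4) + 0.5) * (1.0 - alpha) / 4.0
    return np.quantile(losses, levels).mean()

rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=100_000)  # heavy-tailed toy loss sample
print(empirical_es(losses), es_from_four_quantiles(losses))
```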
Modeling of the HIV infection epidemic in the Netherlands: A multi-parameter evidence synthesis approach
Multi-parameter evidence synthesis (MPES) is receiving growing attention from
the epidemiological community as a coherent and flexible analytical framework
to accommodate a disparate body of evidence available to inform disease
incidence and prevalence estimation. MPES is the statistical methodology
adopted by the Health Protection Agency in the UK for its annual national
assessment of the HIV epidemic, and is acknowledged by the World Health
Organization and UNAIDS as a valuable technique for the estimation of adult HIV
prevalence from surveillance data. This paper describes the results of
applying a Bayesian MPES approach to model HIV prevalence in the Netherlands
at the end of 2007, using an array of field data from different study designs
on various population risk subgroups and with varying degrees of regional
coverage. Auxiliary data and expert opinion were additionally incorporated to
resolve issues arising from biased, insufficient or inconsistent evidence. This
case study offers a demonstration of the ability of MPES to naturally integrate
and critically reconcile disparate and heterogeneous sources of evidence, while
producing reliable estimates of HIV prevalence used to support public health
decision-making.
Comment: Published at http://dx.doi.org/10.1214/11-AOAS488 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
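As a hedged illustration of the basic mechanism behind evidence synthesis, the sketch below pools several hypothetical binomial prevalence surveys of the same risk subgroup into a single posterior over a shared prevalence parameter. The survey counts, the flat grid prior, and the one-parameter model are illustrative assumptions; the actual MPES model links many more parameters, data sources, and bias terms.

```python
import numpy as np

# Hypothetical surveillance surveys: (positives, sample size). These numbers
# are illustrative, not the Dutch data used in the paper.
surveys = [(12, 400), (30, 1100), (7, 250)]

theta = np.linspace(1e-4, 0.2, 2000)        # prevalence grid, flat prior
log_post = np.zeros_like(theta)
for k, n in surveys:
    # Each survey contributes a binomial log-likelihood term for the
    # shared prevalence parameter theta.
    log_post += k * np.log(theta) + (n - k) * np.log1p(-theta)

post = np.exp(log_post - log_post.max())
post /= np.trapz(post, theta)               # normalise to a density
print("posterior mean prevalence:", np.trapz(theta * post, theta))
```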
Disease Progression Modeling and Prediction through Random Effect Gaussian Processes and Time Transformation
The development of statistical approaches for the joint modelling of the
temporal changes of imaging, biochemical, and clinical biomarkers is of
paramount importance for improving the understanding of neurodegenerative
disorders, and for providing a reference for the prediction and quantification
of the pathology in unseen individuals. Nonetheless, the use of disease
progression models for probabilistic predictions still requires investigation,
for example in accounting for missing observations in clinical data and in
quantifying uncertainty accurately. We tackle this problem by proposing a
novel Gaussian process-based method for the joint modeling of imaging and
clinical biomarker progressions from time series of individual observations.
The model is formulated to account for individual random effects and time
reparameterization, allowing non-parametric estimates of the biomarker
evolution, as well as high flexibility in specifying correlation structures
and time transformation models. Thanks to the Bayesian formulation, the model
naturally accounts for missing data, and allows for uncertainty quantification
in the estimate of evolutions, as well as for probabilistic prediction of
disease staging in unseen patients. The experimental results show that the
proposed model provides a biologically plausible description of the evolution
of Alzheimer's pathology across the whole disease time-span as well as
remarkable predictive performance when tested on a large clinical cohort with
missing observations.
Comment: 13 pages, 2 figures
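A minimal sketch of the time-reparameterization ingredient: assuming a toy sigmoidal progression and a single per-subject time shift (a drastic simplification of the paper's model, which handles multiple biomarkers, richer random effects, and full posterior inference), a GP marginal likelihood can score candidate shifts and place a new subject on the common disease timeline. All functions and values below are illustrative.

```python
import numpy as np

def rbf(x1, x2, amp=1.0, ls=2.0):
    """Squared-exponential kernel on (possibly time-shifted) inputs."""
    d = x1[:, None] - x2[None, :]
    return amp * np.exp(-0.5 * (d / ls) ** 2)

def gp_log_marginal(t, y, noise=0.1):
    """GP log marginal likelihood with an RBF kernel and iid noise."""
    K = rbf(t, t) + noise * np.eye(len(t))
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ a - np.log(np.diag(L)).sum() - 0.5 * len(t) * np.log(2 * np.pi)

def progression(t):
    """Toy sigmoidal biomarker trajectory on the common disease clock."""
    return 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(1)
ref_t = np.linspace(-6, 6, 30)                   # reference cohort visits
ref_y = progression(ref_t) + 0.05 * rng.standard_normal(ref_t.size)

true_shift = 1.5                                 # subject's latent time shift
subj_t = np.linspace(-2, 2, 8)                   # subject's visit times
subj_y = progression(subj_t + true_shift) + 0.05 * rng.standard_normal(subj_t.size)

# Grid-search the shift that best aligns the subject with the cohort.
shifts = np.linspace(-4, 4, 81)
scores = [gp_log_marginal(np.concatenate([ref_t, subj_t + s]),
                          np.concatenate([ref_y, subj_y])) for s in shifts]
print("estimated shift:", shifts[int(np.argmax(scores))])
```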
Langevin PDF simulation of particle deposition in a turbulent pipe flow
The paper deals with the description of particle deposition on walls from a
turbulent flow over a large range of particle diameters, using a Langevin PDF
model. The first aim of the work is to test how well the present Langevin
model describes this phenomenon and to outline the physical aspects which
play a major role in particle deposition. The general features and
characteristics of the present stochastic model are first recalled. Then,
results obtained with the standard form of the model are presented along with
an analysis carried out to check the sensitivity of the
predictions to different mean fluid quantities. These results show that the
physical representation of the near-wall physics has to be improved and that,
in particular, one possible route is to introduce specific features related to
the near-wall coherent structures. In the following, we propose a simple
phenomenological model that introduces some of the effects due to the presence
of turbulent coherent structures on particles in a thin layer close to the
wall. The results obtained with this phenomenological model are in good
agreement with experimental evidence, which suggests pursuing this
direction towards the development of more general and rigorous stochastic
models that provide a link between a geometrical description of turbulent flow
and a statistical one.
Comment: 40 pages, 8 figures
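To make the stochastic modelling approach concrete, here is a hedged Euler-Maruyama sketch of a one-dimensional Langevin (Ornstein-Uhlenbeck) model for the wall-normal particle velocity, with deposition recorded when a particle reaches the wall. The parameter values and the simple absorbing-wall criterion are illustrative assumptions; the paper's PDF model is considerably richer, especially in the near-wall region.

```python
import numpy as np

rng = np.random.default_rng(42)
n_particles, n_steps, dt = 5000, 20000, 1e-4
tau_p, sigma = 5e-3, 0.5                  # relaxation time, forcing amplitude (toy values)

y = np.full(n_particles, 1e-2)            # initial wall-normal distance
v = np.zeros(n_particles)                 # wall-normal particle velocity
alive = np.ones(n_particles, dtype=bool)  # particles still suspended in the flow

for _ in range(n_steps):
    dW = rng.standard_normal(n_particles) * np.sqrt(dt)   # Wiener increments
    # Langevin step: linear relaxation towards zero plus turbulent forcing.
    v[alive] += -(v[alive] / tau_p) * dt + sigma * dW[alive]
    y[alive] += v[alive] * dt
    alive &= y > 0.0                      # absorbing wall: particle deposits at y = 0

print(f"deposited fraction: {1.0 - alive.mean():.3f}")
```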
Scientific Argumentation as a Foundation for the Design of Inquiry-Based Science Instruction
Despite the attention that inquiry has received in science education research and policy, a coherent means for implementing inquiry in the classroom has been missing [1]. In recent research, scientific argumentation has received increasing attention for its role in science and in science education [2]. In this article, we propose that organizing a unit of instruction around building a scientific argument can bring inquiry practices together in the classroom in a coherent way. We outline a framework for argumentation, focusing on arguments that are central to science: arguments for the best explanation. We then use this framework as the basis for a set of design principles for developing a sequence of inquiry-based learning activities that support students in the construction of a scientific argument. We show that careful analysis of the argument that students are expected to build provides designers with a foundation for selecting resources and designing supports for scientific inquiry. Furthermore, we show that creating multiple opportunities for students to critique and refine their explanations through evidence-based argumentation fosters opportunities for critical thinking, while building science knowledge and knowledge of the nature of science.
Towards a Testbed for Dynamic Vehicle Routing Algorithms
Since modern transport services are becoming more flexible, demand-responsive, and energy/cost efficient, there is a growing demand for large-scale microscopic simulation platforms on which sophisticated routing algorithms can be tested. Such platforms have to simulate in detail not only the dynamically changing demand and supply of the relevant service, but also traffic flow and other relevant transport services. This paper presents the DVRP extension to the open-source MATSim simulator. The extension is designed to be highly general and customizable to simulate a wide range of dynamic rich vehicle routing problems. The extension allows plugging in various algorithms that are responsible for continuous re-optimisation of routes in response to changes in the system. The DVRP extension has been used in many research and commercial projects dealing with the simulation of electric and autonomous taxis, demand-responsive transport, personal rapid transport, free-floating car sharing, and parking search.
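The plug-in idea can be sketched generically: a simulation loop hands each incoming request to an interchangeable optimiser that re-assigns vehicles online. The Python sketch below is an illustrative stand-in (the actual DVRP extension is a Java API inside MATSim; none of these names come from it), using the simplest possible optimiser, a nearest-idle-vehicle rule.

```python
import math

def nearest_vehicle_optimizer(request, vehicles):
    """Pluggable optimiser: assign the closest idle vehicle to a new request."""
    idle = [v for v in vehicles if v["busy_until"] <= request["time"]]
    return min(idle, key=lambda v: math.dist(v["pos"], request["pos"]), default=None)

vehicles = [{"id": i, "pos": (float(i), 0.0), "busy_until": 0.0} for i in range(3)]
requests = [{"time": 0.0, "pos": (2.1, 1.0)}, {"time": 1.0, "pos": (0.2, 0.5)}]

# Event-driven loop: requests arrive over time and are dispatched immediately.
for req in sorted(requests, key=lambda r: r["time"]):
    v = nearest_vehicle_optimizer(req, vehicles)
    if v is not None:
        v["busy_until"] = req["time"] + math.dist(v["pos"], req["pos"])  # unit speed
        v["pos"] = req["pos"]
        print(f"t={req['time']}: request served by vehicle {v['id']}")
```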
Tracing the cosmic velocity field at z ~ 0.1 from galaxy luminosities in the SDSS DR7
Spatial modulations in the distribution of observed luminosities (computed
using redshifts) of ~ 5×10^5 galaxies from the SDSS Data Release 7
probe the cosmic peculiar velocity field out to z ~ 0.1. Allowing for
luminosity evolution, the r-band luminosity function, determined via a
spline-based estimator, is well represented by a Schechter form with
M*(z) - 5 log h = -20.52 - 1.6(z - 0.1) ± 0.05 and
α = -1.1 ± 0.03. Bulk flows and higher velocity moments in
two redshift bins, 0.02 < z < 0.07 and 0.07 < z < 0.22, agree with the
predictions of the ΛCDM model, as obtained from mock galaxy catalogs
designed to match the observations. Assuming a ΛCDM model, we estimate
σ8 = 1.1 ± 0.4 for the amplitude of the linear matter
power spectrum, where the low accuracy is due to the limited number of
galaxies. While the low-z bin is robust against coherent photometric
uncertainties, the bias of results from the second bin is consistent with the
~ 1% magnitude tilt reported by the SDSS collaboration. The systematics are
expected to have a significantly lower impact in future datasets with larger
sky coverage and better photometric calibration.
Comment: 21 pages, 11 figures, accepted version
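For reference, the quoted Schechter fit can be evaluated directly. The sketch below implements the standard magnitude-space Schechter form with the abstract's parameters (taking h = 1); the normalisation φ* is left arbitrary because the abstract does not quote it.

```python
import numpy as np

def schechter_lf(M, M_star, alpha, phi_star=1.0):
    """Schechter luminosity function in magnitudes:
    phi(M) = 0.4 ln(10) phi* x**(alpha + 1) exp(-x), x = 10**(-0.4 (M - M*))."""
    x = 10.0 ** (-0.4 * (M - M_star))
    return 0.4 * np.log(10.0) * phi_star * x ** (alpha + 1.0) * np.exp(-x)

# Quoted r-band fit: M*(z) - 5 log h = -20.52 - 1.6 (z - 0.1), alpha = -1.1 (h = 1)
z = 0.05
M_star = -20.52 - 1.6 * (z - 0.1)
mags = np.linspace(-23.0, -17.0, 7)
print(schechter_lf(mags, M_star, alpha=-1.1))   # relative number densities
```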