An Efficient Monte Carlo-based Probabilistic Time-Dependent Routing Calculation Targeting a Server-Side Car Navigation System
Incorporating speed probability distributions into the route-planning
computation of car navigation systems yields more accurate and precise
responses. In this paper, we propose a novel approach for dynamically selecting
the number of samples used in the Monte Carlo simulation that solves the
Probabilistic Time-Dependent Routing (PTDR) problem, thus improving
computational efficiency. The proposed method proactively determines the number
of simulations to run for extracting the travel-time estimate of each specific
request, while respecting an error threshold on the output quality. The
methodology requires little effort on the application-development side: we
adopted an aspect-oriented programming language (LARA) to instrument the code
and a flexible dynamic autotuning library (mARGOt) to take tuning decisions on
the number of samples, improving execution efficiency. Experimental results
demonstrate that the proposed adaptive approach saves a large fraction of
simulations (between 36% and 81%) with respect to a static approach, across
different traffic situations, paths and error requirements. Given the
negligible runtime overhead of the proposed approach, this translates into an
execution-time speedup between 1.5x and 5.1x. The speedup is reflected at the
infrastructure level as a reduction of around 36% in the computing resources
needed to support the whole navigation pipeline.
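The adaptive sample-count idea can be sketched generically: draw travel-time samples in batches and stop once the estimated relative error of the mean falls below the requested threshold. The Python sketch below only illustrates that stopping rule under assumed log-normal segment speeds; the function names, batch size and error metric are hypothetical and do not reproduce the authors' LARA/mARGOt tooling.

```python
import numpy as np

def estimate_travel_time(segment_speed_samplers, lengths_km,
                         rel_error_target=0.02, batch=200, max_samples=20000):
    """Monte Carlo travel-time estimate with an adaptive sample count.

    segment_speed_samplers: callables returning n random speeds (km/h),
    one per road segment; lengths_km: the matching segment lengths.
    Sampling stops once the relative standard error of the mean travel
    time drops below rel_error_target (a hypothetical stopping rule).
    """
    totals = np.empty(0)
    while totals.size < max_samples:
        # Draw one batch of full-path travel times (in hours).
        times = np.zeros(batch)
        for sampler, length in zip(segment_speed_samplers, lengths_km):
            speeds = np.maximum(sampler(batch), 1e-3)   # guard against zero speeds
            times += length / speeds
        totals = np.concatenate([totals, times])
        mean = totals.mean()
        sem = totals.std(ddof=1) / np.sqrt(totals.size)  # standard error of the mean
        if sem / mean < rel_error_target:
            break
    return mean, totals.size

# Illustrative path: three segments with assumed log-normal speed distributions.
rng = np.random.default_rng(0)
samplers = [lambda n, m=m: rng.lognormal(mean=np.log(m), sigma=0.3, size=n)
            for m in (50.0, 30.0, 70.0)]
t_hours, n_used = estimate_travel_time(samplers, [5.0, 2.0, 12.0])
print(f"estimated travel time ~ {t_hours * 60:.1f} min using {n_used} samples")
```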
A statistical test on the reliability of the non-coevality of stars in binary systems
We develop a statistical test on the expected difference in the age estimates
of two coeval stars in detached double-lined eclipsing binary systems when that
difference is caused only by observational uncertainties. We focus on
main-sequence stars in the mass range [0.8; 1.6] Msun. The ages were
obtained by means of the maximum-likelihood SCEPtER technique. The
observational constraints used in the recovery procedure are stellar mass,
radius, effective temperature, and metallicity [Fe/H]. We defined the statistic
W as the ratio of the absolute difference between the estimated ages of the two
stars to the age of the older one, and we determined the critical values of
this statistic above which coevality can be rejected. The median expected
difference in the reconstructed ages of the coeval stars of a binary system,
caused solely by the observational uncertainties, shows a strong dependence
on the evolutionary stage: it ranges from about 20% for an evolved primary
star to about 75% for a near-ZAMS primary. The median difference also shows an
increase with the mass of the primary star from 20% for 0.8 Msun stars to about
50% for 1.6 Msun stars. The reliability of these results was checked by
repeating the process with a grid of stellar models computed by a different
evolutionary code. We show that the W test is much more sensitive to age
differences between the binary-system components than the alternative approach
of comparing the confidence intervals of the ages of the two stars. We also found
that the distribution of W is, for almost all the examined cases, well
approximated by beta distributions. The proposed method improves upon the
techniques that are commonly adopted for judging the coevality of an observed
system. It also provides a result founded on reliable statistics that
simultaneously accounts for all the observational uncertainties.
Comment: Abstract shortened. Accepted for publication in A&A. One reference fixed.
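The W statistic defined above is straightforward to compute, and the abstract notes that its null distribution is well approximated by beta distributions. The following Python sketch shows how such a test could be applied; the beta shape parameters and the significance level are placeholders, since the actual critical values depend on mass and evolutionary stage and must be calibrated as in the paper.

```python
from scipy import stats

def w_statistic(age1_gyr, age2_gyr):
    """W = |age1 - age2| / age of the older star, as defined in the abstract."""
    return abs(age1_gyr - age2_gyr) / max(age1_gyr, age2_gyr)

def coevality_rejected(age1_gyr, age2_gyr, a, b, alpha=0.05):
    """Reject coevality when W exceeds the (1 - alpha) quantile of its null
    distribution, here modelled as Beta(a, b). The shape parameters a and b
    are placeholders to be calibrated by Monte Carlo over the observational
    uncertainties, as done in the paper."""
    w = w_statistic(age1_gyr, age2_gyr)
    w_crit = stats.beta.ppf(1.0 - alpha, a, b)
    return w > w_crit, w, w_crit

# Illustrative numbers only: a 4.5 Gyr primary and a 2.8 Gyr secondary.
rejected, w, w_crit = coevality_rejected(4.5, 2.8, a=1.8, b=4.0)
print(f"W = {w:.2f}, critical value = {w_crit:.2f}, reject coevality: {rejected}")
```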
On the realized volatility of the ECX CO2 emissions 2008 futures contract: distribution, dynamics and forecasting
The recent implementation of the EU Emissions Trading Scheme (EU ETS) in January 2005 created new financial risks for emitting firms. To deal with these risks, options have been traded since October 2006. Because the EU ETS is a new market, the relevant underlying model for option pricing is still a controversial issue. This article improves our understanding of this issue by characterizing the conditional and unconditional distributions of the realized volatility for the 2008 futures contract on the European Climate Exchange (ECX), which is valid during Phase II (2008-2012) of the EU ETS. Realized volatility measures from naive, kernel-based and subsampling estimators are used to obtain inferences about the distributional and dynamic properties of the ECX emissions futures volatility. The distribution of the daily realized volatility in logarithmic form is shown to be close to normal. The mixture-of-distributions hypothesis is strongly rejected, as the returns standardized using daily measures of volatility clearly depart from normality. A simplified HAR-RV model (Corsi, 2009) with only a weekly component, which reproduces the long-memory properties of the series, is then used to model the volatility dynamics. Finally, the predictive accuracy of the HAR-RV model is tested against GARCH specifications using one-step-ahead forecasts, which confirms the superior forecasting ability of the HAR-RV model. Our conclusions indicate that (i) the standard Brownian motion is not an adequate tool for option pricing in the EU ETS, and (ii) a jump component should be included in the stochastic process to price options, thus providing more efficient tools for risk-management activities.
Keywords: CO2 Price; Realized Volatility; HAR-RV; GARCH; Futures Trading; Emissions Markets; EU ETS; Intraday data; Forecasting
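The simplified HAR-RV specification with only a weekly component can be written as log RV_{t+1} = c + beta_w * mean(log RV_{t-4}, ..., log RV_t) + eps_{t+1} and fitted by ordinary least squares. The Python sketch below is a generic illustration of that regression on a daily log realized-volatility series; it is not the estimation code used in the article, and the variable names are made up.

```python
import numpy as np

def fit_har_rv_weekly(log_rv):
    """Fit log RV_{t+1} = c + beta_w * mean(log RV_{t-4..t}) + eps by OLS.
    log_rv is a 1-D array of daily log realized volatilities."""
    log_rv = np.asarray(log_rv, dtype=float)
    weekly = np.convolve(log_rv, np.ones(5) / 5.0, mode="valid")  # 5-day rolling means
    y = log_rv[5:]          # next-day log RV
    x = weekly[:-1]         # weekly average ending on the previous day
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef             # [intercept, beta_weekly]

def forecast_next(log_rv, coef):
    """One-step-ahead forecast of the next day's log realized volatility."""
    return coef[0] + coef[1] * np.mean(log_rv[-5:])
```

A fitted weekly coefficient close to one would reflect the strong persistence (long-memory-like behaviour) that the weekly component is meant to capture.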
DyMo: Dynamic Monitoring of Large Scale LTE-Multicast Systems
LTE evolved Multimedia Broadcast/Multicast Service (eMBMS) is an attractive
solution for video delivery to very large groups in crowded venues. However,
deployment and management of eMBMS systems are challenging due to the lack of
real-time feedback from the User Equipment (UEs). Therefore, we present the
Dynamic Monitoring (DyMo) system for low-overhead feedback collection. DyMo
leverages eMBMS for broadcasting Stochastic Group Instructions to all UEs.
These instructions indicate the reporting rates as a function of the observed
Quality of Service (QoS). This simple feedback mechanism collects only a very
limited number of QoS reports from the UEs. The reports are used for network optimization,
thereby ensuring high QoS to the UEs. We present the design aspects of DyMo and
evaluate its performance analytically and via extensive simulations.
Specifically, we show that DyMo infers the optimal eMBMS settings with
extremely low overhead, while meeting strict QoS requirements under different
UE mobility patterns and in the presence of network component failures. For
instance, DyMo can detect the eMBMS Signal-to-Noise Ratio (SNR) experienced by
the 0.1st percentile of the UEs with a Root Mean Square Error (RMSE) of 0.05%
using only 5 to 10 reports per second, regardless of the number of UEs.
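The core of the feedback mechanism is that each UE decides locally, and at random, whether to report, with a probability that depends on its observed QoS. The Python sketch below illustrates that idea under an assumed instruction format (a list of SNR thresholds with reporting probabilities); it is a simplification for illustration, not DyMo's actual Stochastic Group Instructions encoding.

```python
import random

def should_report(observed_snr_db, instructions, default_rate=0.001):
    """Local (per-UE) reporting decision.

    instructions is a list of (snr_threshold_db, probability) pairs standing
    in for the broadcast Stochastic Group Instructions: a UE whose SNR falls
    below a threshold reports with the associated probability, so poorly
    served UEs report more often. The format here is an assumption.
    """
    for snr_threshold, prob in sorted(instructions):
        if observed_snr_db <= snr_threshold:
            return random.random() < prob
    return random.random() < default_rate

# Hypothetical instruction: report with prob. 0.5 below 5 dB, 0.05 below 15 dB.
instructions = [(5.0, 0.5), (15.0, 0.05)]
n_reports = sum(should_report(snr, instructions) for snr in (3.2, 8.7, 22.1, 4.9))
print(n_reports, "UEs out of 4 chose to report")
```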
Modelling the distribution of health-related quality of life of advanced melanoma patients in a longitudinal multi-centre clinical trial using M-quantile random effects regression
Health-related quality-of-life assessment is important in the clinical
evaluation of patients with metastatic disease, as it may offer useful
information for understanding the clinical effectiveness of a treatment. To
assess whether a set of explanatory variables affects health-related quality
of life, regression models are routinely adopted. However, researchers may be
interested in modelling other parts (e.g. quantiles) of this
conditional distribution. In this paper, we present an approach based on
quantile and M-quantile regression to achieve this goal. We applied the
methodologies to a prospective, randomized, multi-centre clinical trial. In
order to take into account the hierarchical nature of the data, we extended the
M-quantile regression model to a three-level random-effects specification and
estimated it by maximum likelihood.
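As a rough illustration of the quantile-regression building block (not the paper's three-level M-quantile random-effects model), several conditional quantiles of an outcome can be fitted with statsmodels; the data below are simulated and the variable names are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Simulated, heteroscedastic data: quality-of-life score vs. one covariate.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=200)
y = 50 + 2.0 * x + rng.normal(0, 5 + x, size=200)   # spread grows with x

X = sm.add_constant(x)
for q in (0.25, 0.50, 0.75):
    fit = sm.QuantReg(y, X).fit(q=q)                 # conditional q-th quantile
    print(f"q = {q}: intercept = {fit.params[0]:.2f}, slope = {fit.params[1]:.2f}")
```

Because the noise spread grows with x, the fitted slopes differ across quantiles, which is exactly the kind of effect that mean regression alone would miss.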
Computational Methods for Parameter Estimation in Climate Models
Intensive computational methods have been used by Earth scientists in a wide range of data-inversion and uncertainty-quantification problems, such as earthquake epicenter location and climate projections. To quantify the uncertainties resulting from a range of plausible model configurations, it is necessary to estimate a multidimensional probability distribution. The computational cost of estimating these distributions for geoscience applications is impractical with traditional methods such as Metropolis/Gibbs algorithms, because simulation costs limit the number of experiments that can reasonably be run. Several alternative sampling strategies that could improve sampling efficiency have been proposed, including Multiple Very Fast Simulated Annealing (MVFSA) and Adaptive Metropolis algorithms. The performance of these proposed sampling strategies is evaluated with a surrogate climate model that approximates the noise and response behavior of a realistic atmospheric general circulation model (AGCM). The surrogate model is fast enough that its evaluation can be embedded in these Monte Carlo algorithms. We show that adaptive methods can be superior to MVFSA in approximating the known posterior distribution with fewer forward evaluations. However, the adaptive methods can also be limited by inadequate sample mixing. The Single Component and Delayed Rejection Adaptive Metropolis algorithms were found to resolve these limitations, although challenges remain in approximating multi-modal distributions. The results show that these advanced methods of statistical inference can provide practical solutions to the climate-model calibration problem and to the challenge of quantifying climate-projection uncertainties. The computational methods would also be useful for problems outside climate prediction, particularly those where sampling is limited by the availability of computational resources.
Funding: National Science Foundation OCE-0415251; CONACyT-Mexico 159764; Institute for Geophysics.
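For readers unfamiliar with Adaptive Metropolis, the minimal Haario-style variant adapts the Gaussian proposal covariance from the chain's own history. The Python sketch below illustrates that mechanism on a toy posterior; the tuning constants, burn-in length and target are illustrative and are not the configurations evaluated in this work.

```python
import numpy as np

def adaptive_metropolis(log_post, x0, n_steps=5000, adapt_start=500, rng=None):
    """Minimal Adaptive Metropolis sampler: after adapt_start iterations the
    Gaussian proposal covariance is the rescaled empirical covariance of the
    chain so far (Haario-style). A sketch for illustration, not a tuned sampler."""
    rng = rng or np.random.default_rng()
    d = len(x0)
    scale = 2.38 ** 2 / d                      # classic adaptive-MCMC scaling
    chain = np.empty((n_steps, d))
    x, lp = np.asarray(x0, dtype=float), log_post(x0)
    cov = 0.1 * np.eye(d)                      # initial proposal covariance
    for i in range(n_steps):
        if i >= adapt_start:
            cov = scale * (np.cov(chain[:i].T) + 1e-8 * np.eye(d))
        prop = rng.multivariate_normal(x, cov)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Toy example: sample a correlated 2-D Gaussian "posterior".
prec = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
log_post = lambda z: -0.5 * np.asarray(z) @ prec @ np.asarray(z)
samples = adaptive_metropolis(log_post, x0=[0.0, 0.0], n_steps=3000)
```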