
    Empirical-likelihood-based confidence interval for the mean with a heavy-tailed distribution

    Empirical-likelihood-based confidence intervals for a mean were introduced by Owen [Biometrika 75 (1988) 237-249], where at least a finite second moment is required. This excludes some important distributions, for example, those in the domain of attraction of a stable law with index between 1 and 2. In this article we use a method similar to that of Qin and Wong [Scand. J. Statist. 23 (1996) 209-219] to derive an empirical-likelihood-based confidence interval for the mean when the underlying distribution has heavy tails. Our method can easily be extended to obtain a confidence interval for a moment of any order of a heavy-tailed distribution.
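    Owen's construction, which this abstract extends, can be sketched numerically: the empirical-likelihood ratio at a candidate mean is profiled via a Lagrange multiplier, and the 95% interval collects the candidates below the chi-square(1) cutoff. The sketch below illustrates only the original finite-variance method (not the paper's heavy-tail extension); the t-distributed sample and the search grid are arbitrary choices.

```python
import numpy as np

def el_log_ratio(x, mu, iters=100):
    """-2 log empirical-likelihood ratio for the mean at mu (Owen's construction).
    The Lagrange multiplier lam solves sum((x-mu) / (1 + lam*(x-mu))) = 0."""
    z = x - mu
    lam = 0.0
    for _ in range(iters):
        d = 1.0 + lam * z
        g = np.sum(z / d)            # score in lam
        h = np.sum(z**2 / d**2)      # minus its derivative (always positive)
        step = g / h
        while np.any(1.0 + (lam + step) * z <= 0):
            step *= 0.5              # damp to keep all implied weights positive
        lam += step
        if abs(step) < 1e-12:
            break
    return 2.0 * np.sum(np.log(1.0 + lam * z))

def el_ci(x, grid, cutoff=3.841):
    """Candidate means whose -2 log ratio stays under the 95% chi2(1) cutoff."""
    keep = [mu for mu in grid
            if x.min() < mu < x.max() and el_log_ratio(x, mu) <= cutoff]
    return min(keep), max(keep)

rng = np.random.default_rng(0)
x = rng.standard_t(df=1.8, size=500)   # heavy tails: infinite variance
lo, hi = el_ci(x, x.mean() + np.linspace(-0.5, 0.5, 201))
```

    The ratio is exactly zero at the sample mean (lam = 0 solves the score equation there), so the interval always contains it.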

    Optimal Quantum Measurements of Expectation Values of Observables

    Experimental characterizations of a quantum system involve the measurement of expectation values of observables for a preparable state |psi> of the quantum system. Such expectation values can be measured by repeatedly preparing |psi> and coupling the system to an apparatus. For this method, the precision of the measured value scales as 1/sqrt(N) for N repetitions of the experiment. For the problem of estimating the parameter phi in an evolution exp(-i phi H), it is possible to achieve precision 1/N (the quantum metrology limit) provided that sufficient information about H and its spectrum is available. We consider the more general problem of estimating expectations of operators A with minimal prior knowledge of A. We give explicit algorithms that approach precision 1/N given a bound on the eigenvalues of A or on their tail distribution. These algorithms are particularly useful for simulating quantum systems on quantum computers because they enable efficient measurement of observables and correlation functions. Our algorithms are based on a method for efficiently measuring the complex overlap of |psi> and U|psi>, where U is an implementable unitary operator. We explicitly consider the issue of confidence levels in measuring observables and overlaps and show that, as expected, confidence levels can be improved exponentially with linear overhead. We further show that the algorithms given here can typically be parallelized with minimal increase in resource usage.
    Comment: 22 pages
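    The overlap subroutine the abstract describes is closely related to the Hadamard test: with one ancilla, the probability of measuring the ancilla in |0> after the sequence H, controlled-U, H equals (1 + Re<psi|U|psi>)/2. A minimal NumPy sketch of the exact ancilla statistics (state vectors only; no particular quantum SDK is assumed, and this is a generic illustration rather than the paper's algorithm):

```python
import numpy as np

def hadamard_test_re(psi, U):
    """P(ancilla = 0) in the Hadamard test equals (1 + Re<psi|U|psi>)/2,
    so sampling the ancilla estimates the real part of the overlap."""
    top = psi.astype(complex)        # ancilla |0> branch after the first H
    bot = U @ psi.astype(complex)    # ancilla |1> branch picks up controlled-U
    # Second H on the ancilla: the |0> amplitude becomes (top + bot)/2.
    p0 = np.linalg.norm((top + bot) / 2.0) ** 2
    return 2.0 * p0 - 1.0            # recover Re<psi|U|psi>

plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
re_z = hadamard_test_re(plus, Z)     # <+|Z|+> = 0
re_x = hadamard_test_re(plus, X)     # <+|X|+> = 1
```

    Replacing the exact probability by a frequency over N ancilla measurements reproduces the 1/sqrt(N) baseline that the paper's algorithms improve upon.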

    Monte Carlo-based tail exponent estimator

    In this paper we propose a new approach to estimating the tail exponent in financial stock markets. We begin the study with the finite-sample behavior of the Hill estimator under α-stable distributions. Using large Monte Carlo simulations, we show that the Hill estimator overestimates the true tail exponent and can hardly be used on samples of small length. Building on these results, we introduce a Monte Carlo-based method of estimation for the tail exponent. Our proposed method is not sensitive to the choice of tail size and also works well on small data samples. The new estimator also gives unbiased results with symmetrical confidence intervals. Finally, we demonstrate the power of our estimator on international stock market indices, estimating the tail exponent over the two separate periods 2002-2005 and 2006-2009.
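    The Hill estimator under study has a compact closed form: with the order statistics sorted in descending order, the estimate from the k largest observations is the reciprocal of the mean log-spacing. A minimal sketch on an exact Pareto sample with tail exponent 1.5 (the sample size, seed, and k are arbitrary; the paper's point is precisely that the estimate degrades off this idealized setting):

```python
import numpy as np

def hill(x, k):
    """Hill estimator of the tail exponent from the k largest order statistics."""
    s = np.sort(np.abs(x))[::-1]                  # descending order statistics
    return 1.0 / np.mean(np.log(s[:k] / s[k]))    # s[k] is the (k+1)-th largest

rng = np.random.default_rng(1)
u = rng.uniform(size=100_000)
x = u ** (-1.0 / 1.5)           # inverse-CDF Pareto sample, true exponent 1.5
alpha_hat = hill(x, k=1000)     # close to 1.5 for an exact Pareto tail
```

    On α-stable data the tail is Pareto only asymptotically, which is where the choice of k starts to bias the estimate.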

    Software timing analysis for complex hardware with survivability and risk analysis

    The increasing automation of safety-critical real-time systems, such as those in cars and planes, leads to more complex and performance-demanding on-board software and the subsequent adoption of multicores and accelerators. This increases the dispersion of software execution times due to variable-latency resources such as caches, NoCs, advanced memory controllers and the like. Statistical analysis has been proposed to model the Worst-Case Execution Time (WCET) of software running on such complex systems by providing reliable probabilistic WCET (pWCET) estimates. However, the statistical models used so far, which are based on risk analysis, are overly pessimistic by construction. In this paper we prove that statistical survivability and risk analyses are equivalent in terms of tail analysis and, building upon survivability analysis theory, we show that Weibull tail models can be used to estimate pWCET distributions reliably and tightly. In particular, our methodology proves the correctness-by-construction of the approach, and our evaluation provides evidence of the tightness of the pWCET estimates obtained, which can be reliably decreased by 40% for a railway case study w.r.t. state-of-the-art exponential tails.
    This work is a collaboration between Argonne National Laboratory and the Barcelona Supercomputing Center within the Joint Laboratory for Extreme-Scale Computing. This research is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, under contract number DE-AC02-06CH11357, program manager Laura Biven, and by the Spanish Government (SEV2015-0493), the Spanish Ministry of Science and Innovation (contract TIN2015-65316-P), and the Generalitat de Catalunya (contract 2014-SGR-1051).
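    A Weibull tail model of the kind the abstract argues for can be sketched with a peaks-over-threshold fit to execution-time measurements. Everything below is hypothetical, not the paper's methodology: the gamma-distributed "execution times", the 99% threshold, and the 1e-6 target exceedance probability are all placeholder choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
times = rng.gamma(shape=4.0, scale=25.0, size=50_000)  # synthetic execution times

u = np.quantile(times, 0.99)        # peaks-over-threshold cutoff
exc = times[times > u] - u          # exceedances over the threshold

# Weibull tail model for the exceedances; floc=0 pins the location at zero.
c, _, scale = stats.weibull_min.fit(exc, floc=0)

# pWCET-style bound at a target exceedance probability of 1e-6 per run.
p_u = np.mean(times > u)            # empirical probability of crossing u
pwcet = u + stats.weibull_min.ppf(1.0 - 1e-6 / p_u, c, scale=scale)
```

    An exponential tail is the special case c = 1; the paper's claim is that allowing a Weibull shape yields tighter, still-reliable bounds.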

    Characterization of the frequency of extreme events by the Generalized Pareto Distribution

    Based on recent results in extreme value theory, we use a new technique for the statistical estimation of distribution tails. Specifically, we use the Gnedenko-Pickands-Balkema-de Haan theorem, which gives a natural limit law for peak-over-threshold values in the form of the Generalized Pareto Distribution (GPD). While this approach is widely used in finance, insurance and hydrology, here we investigate the earthquake energy distribution described by the Gutenberg-Richter seismic moment-frequency law and analyze shallow earthquakes (depth h < 70 km) in the Harvard catalog over the period 1977-2000 in 18 seismic zones. The GPD is found to approximate the tails of the seismic moment distributions quite well for moment-magnitudes larger than mW=5.3, and no statistically significant regional difference is found between subduction and transform seismic zones. We confirm with very high statistical confidence that the b-value is very different in mid-ocean ridges compared to other zones (b=1.50±0.09 versus b=1.00±0.05, corresponding to a power-law exponent close to 1 versus 2/3). We propose a physical mechanism for this, contrasting slow healing ruptures in mid-ocean ridges with fast healing ruptures in other zones. Deviations from the GPD at the very end of the tail are detected in the sample containing earthquakes from all major subduction zones (sample size of 4985 events). We propose a new statistical test of the significance of such deviations based on the bootstrap method. The number of events deviating from the tails of the GPD in the studied data sets (15-20 at most) is not sufficient for determining the functional form of those deviations. Thus, it is practically impossible to give preference to one of the previously suggested parametric families describing the ends of tails of seismic moment distributions.
    Comment: pdf document of 21 pages + 2 tables + 20 figures (ps format) + one file giving the regionalization
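    The peak-over-threshold construction is straightforward to sketch with SciPy's generalized Pareto fit. The Lomax sample below stands in for a Pareto-type tail (for this distribution the exceedances are exactly GPD with shape 1/1.5 ≈ 0.67, so the fit has a known target); the numbers are illustrative, not seismic data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = stats.lomax.rvs(c=1.5, size=20_000, random_state=rng)  # Pareto II tail

u = np.quantile(x, 0.95)
exc = x[x > u] - u          # peak-over-threshold values

# Pickands-Balkema-de Haan: exceedances over a high threshold are
# approximately GPD(shape xi, scale sigma); pin the location at zero.
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
```

    A positive fitted shape xi indicates a power-law (Pareto-type) tail, as for seismic moments; xi near zero would indicate an exponential tail.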

    Underlying Dynamics of Typical Fluctuations of an Emerging Market Price Index: The Heston Model from Minutes to Months

    We investigate the Heston model with stochastic volatility and exponential tails as a model for the typical price fluctuations of the Brazilian São Paulo Stock Exchange Index (IBOVESPA). Raw prices are first corrected for inflation, and a period spanning 15 years characterized by memoryless returns is chosen for the analysis. Model parameters are estimated by observing volatility scaling and correlation properties. We show that the Heston model with at least two time scales for the volatility mean-reverting dynamics satisfactorily describes price fluctuations ranging from time scales larger than 20 minutes up to 160 days. At time scales shorter than 20 minutes we observe autocorrelated returns and power-law tails incompatible with the Heston model. Despite the major regulatory changes, hyperinflation and currency crises experienced by the Brazilian market in the period studied, the general success of the description provided may be regarded as evidence for a general underlying dynamics of price fluctuations at intermediate mesoeconomic time scales that is well approximated by the Heston model. We also note that the connection between the Heston model and Ehrenfest urn models could be exploited to bring new insights into the microeconomic market mechanics.
    Comment: 20 pages, 9 figures, to appear in Physica
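    The Heston dynamics can be simulated with a standard Euler scheme (full truncation for the variance process). The parameter values below are generic placeholders, not the IBOVESPA estimates, and the single-time-scale version shown is simpler than the two-time-scale variant the abstract favors.

```python
import numpy as np

def heston_paths(s0, v0, kappa, theta, xi, rho, mu, dt, n_steps, n_paths, rng):
    """Full-truncation Euler scheme for the Heston SDE:
       dS = mu*S dt + sqrt(v)*S dW1,  dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,
       with corr(dW1, dW2) = rho."""
    s = np.full(n_paths, float(s0))
    v = np.full(n_paths, float(v0))
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)                   # truncate negative variance
        s *= np.exp((mu - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * z2
    return s

rng = np.random.default_rng(4)
s_T = heston_paths(100.0, 0.04, 2.0, 0.04, 0.3, -0.5, 0.0,
                   dt=1 / 252, n_steps=252, n_paths=2000, rng=rng)
```

    Returns aggregated over many steps from such paths exhibit the exponential tails the abstract matches against intermediate-time-scale index data.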

    Inference for Extremal Conditional Quantile Models, with an Application to Market and Birthweight Risks

    Quantile regression is an increasingly important empirical tool in economics and other sciences for analyzing the impact of a set of regressors on the conditional distribution of an outcome. Extremal quantile regression, or quantile regression applied to the tails, is of interest in many economic and financial applications, such as conditional value-at-risk, production efficiency, and adjustment bands in (S,s) models. In this paper we provide feasible inference tools for extremal conditional quantile models that rely upon extreme value approximations to the distribution of self-normalized quantile regression statistics. The methods are simple to implement and can be of independent interest even in the non-regression case. We illustrate the results with two empirical examples analyzing extreme fluctuations of a stock return and extremely low percentiles of live infants' birthweights in the range between 250 and 1500 grams.
    Comment: 41 pages, 9 figures
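    The point estimator underlying all of this, linear quantile regression at level tau, reduces to a linear program for the check (pinball) loss. The sketch below uses SciPy's linprog; the simulated data are arbitrary, and the paper's extreme-value inference theory is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def quantile_reg(X, y, tau):
    """Linear tau-quantile regression as an LP: split the residual into
    u_plus - u_minus and minimize tau*sum(u_plus) + (1-tau)*sum(u_minus)."""
    n = len(y)
    Xc = np.column_stack([np.ones(n), X])          # prepend an intercept
    p = Xc.shape[1]
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([Xc, np.eye(n), -np.eye(n)])  # Xc@beta + u_plus - u_minus = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

rng = np.random.default_rng(5)
x1 = rng.uniform(0.0, 10.0, 500)
y = 1.0 + 2.0 * x1 + rng.uniform(0.0, 1.0, 500)  # true 0.05-quantile: 1.05 + 2*x
beta = quantile_reg(x1[:, None], y, tau=0.05)
```

    The paper's contribution concerns inference at extremal tau, where the usual normal approximation for such estimates breaks down and extreme value approximations take over.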